Gaming developer John O'Reilly, 28, noticed the systems in their local Tesco in Woolwich in south-east London, and wonders how anyone can consent to "such deeply invasive technology".

John says that everyone needs groceries, so the number of people whose shopping can be tracked is huge.
"Are there even any rules informing customers before they enter? How on earth can the average person understand the extent of the tracking?

"Are children even kept out of the dataset? Who can access this data? Is it shared with police? Is my data being sold? We need answers to these questions!"

Heather, 30, from Nottingham says the tech makes her feel uncomfortable and punishes shoppers who are honest and use the self-scan as intended.
"Yes, you have cameras following you everywhere in the store, but this is simply too invasive," she told the BBC.

"If stores are so paranoid about shoppers stealing goods, then they should go back to staffed tills instead of wasting money on this invasive technology."
The move marks the latest attempt by retailers to stem the rise in shoplifting.
A comparable system at its Gateshead store prompted a similarly mixed response from shoppers earlier this year.

"The fact that this is not a real person is so much easier to handle."
People around the world have shared their private thoughts and experiences with AI chatbots, even though they are widely acknowledged as a poor substitute for professional advice. Character.ai itself tells its users: "This is an AI chatbot and not a real person. Treat everything it says as fiction. What is said should not be relied upon as fact or advice."

But in extreme cases, chatbots have been accused of giving harmful advice.
Character.ai is currently the subject of legal action from a mother whose 14-year-old son took his own life after reportedly becoming obsessed with one of its AI characters. According to transcripts of their chats in court filings, he discussed ending his life with the chatbot. In a final conversation he told the chatbot he was "coming home" - and it allegedly encouraged him to do so "as soon as possible".

Character.ai has denied the suit's allegations.