Does AI live up to the hype of retailers using it?

Over a month ago, Tesla hosted its second annual AI Day (you'll be forgiven if you've already forgotten, since Tesla's CEO has been making headlines with his latest business venture). While critics called the event over the top, it was a great example of the gap between how artificial intelligence (AI) is marketed (customer-centric AI) and how it is actually used (behind the scenes).

According to research co-authored by Matthew Schneider, PhD, associate professor at Drexel University's LeBow College of Business, the benefits of AI for businesses — especially in retail — may not be as pronounced as has been suggested in the popular press, but AI can still be a valuable tool for retailers, especially in non-customer-facing applications.

Published in the Journal of Retailing in 2021, the study drew on prior research and interviews with senior executives to examine how retail executives should adopt AI and what factors to consider when doing so. In addition to focusing on AI in retail, the research team also compared customer-facing and non-customer-facing AI applications.

For the study, the researchers defined "AI" as "the ability of a system to correctly interpret external data, to learn from such data, and to use those learnings to achieve specific goals and tasks through flexible adaptation."

The customer-facing AI example the authors use is AI-powered "nudge bots" that interact with, influence, and make suggestions to online shoppers. This type of automated guidance, called customer journey management, is important for retailers because it nudges customers toward completing their purchases and has been shown to increase overall sales. But risks such as privacy and bias are more visible to customers in customer-facing applications and can damage customers' perception of the retail brand. The authors therefore predict that retailers are more likely to adopt non-customer-facing AI first.
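The study does not describe how such nudge bots are built, but a minimal rule-based sketch illustrates the idea of customer journey management. The function name, idle threshold, and message below are illustrative assumptions, not details from the paper; a real system would use learned models of shopper behavior rather than a fixed timeout.

```python
# Illustrative "nudge bot" sketch (not the study's implementation): remind
# a shopper whose non-empty cart has been idle past an assumed threshold.

def nudge_message(minutes_idle, cart_items):
    """Return a reminder string for an idle, non-empty cart, else None."""
    IDLE_THRESHOLD = 30  # assumed cutoff, in minutes
    if cart_items and minutes_idle >= IDLE_THRESHOLD:
        return f"You left {len(cart_items)} item(s) in your cart - complete your order?"
    return None
```

A shopper idle for 45 minutes with items in the cart gets a reminder; an empty cart or a recently active shopper does not.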

"Our research seems prescient. SoftBank has abandoned 'Pepper,' a customer-facing robot, but 'Whiz,' which is not customer-facing, is doing fine," said Abhijit Guha, PhD, first author of the study with Schneider and an associate professor at the University of South Carolina.

In the study, examples of non-customer-facing applications include using AI to help customer service representatives respond to service requests, to optimize supply chains, and to analyze enormous quantities of data.

The use of AI is not without risks. Schneider highlights two areas of concern when adopting customer-facing AI applications: privacy and bias. Retailers are less likely to adopt AI applications aimed at in-store customers because those customers interact directly with the technology.

"Retailers will use video cameras that detect emotions in stores, a practice called facial analysis," Schneider said. "These cameras usually record all the data on a server. Then the AI analyzes it and can detect whether someone was happy or not, so that retailers can use it to make decisions within their store, like which products to display."

But there is a risk, because customers don't know what happens to the recorded data next, which goes against the promise of confidentiality the retailer made to its customers.

"The initial goal of data collection can be very different from the second, third, or last goal, especially if the AI is measuring thousands of patterns," Schneider said. "All they have to do is push a button and say, 'now use the detected emotions to see if people are more likely to steal from the store.'"

According to Schneider, bias is tied to privacy when AI is used in a retail environment. He explains that if retailers know a person's age, race, and other demographic characteristics, they can likely determine who that person is.

"With privacy, what you're doing is synthetically modifying those features of [the customers] so that it is impossible to identify them. So now you can't really make a decision based on their age or race, unless other characteristics also correlate with those demographics," Schneider said. "Future research should continue in this direction, to identify different ways to increase privacy while allowing retailers to obtain less biased information."
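Schneider's idea of synthetically modifying demographic features can be sketched in a few lines. One simple approach (an assumption on our part; the researchers' actual algorithm is not published in this article) is to shuffle sensitive attributes across customer records: aggregate distributions survive for analytics, but no individual record keeps its true age or race.

```python
import random

def synthesize_demographics(records, seed=0):
    """Shuffle sensitive attributes across records so that no individual
    keeps their true values, while the marginal distributions of age and
    race are preserved. A simplified sketch, not the researchers' method."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    ages = [r["age"] for r in records]
    races = [r["race"] for r in records]
    rng.shuffle(ages)
    rng.shuffle(races)
    # Rebuild each record with a randomly reassigned age and race.
    return [dict(r, age=a, race=c) for r, a, c in zip(records, ages, races)]
```

As Schneider cautions, this breaks record-level linkage but does not eliminate bias entirely: other features that correlate with the shuffled demographics can still leak information.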

For customer-facing AI applications, there is an ethical concern that bias in the system could lead companies to take discriminatory actions. Researchers have found that concern over this issue can reduce the likelihood that companies will use AI. For example, Walmart was criticized for locking up Black beauty products and accused of racial discrimination. "Although Walmart has since ended this practice, such outcomes are still possible at other retailers if AI is used to recommend which products to lock up," Schneider said.

Another example of risky consumer-facing AI adoption is in-store robots, like Giant’s Marty.

"These robots move around to count soup cans for restocking inventory or to look for spills so people don't slip. But customers are left guessing whether the robot is there to watch them (privacy) or to track whether they are going to steal (bias)," Schneider said.

He further explained that these issues can harm a business: "Even if the AI robot can prevent broken bones and lawsuits by cleaning up a spill, or restock shelves faster than humans could, there could be a net loss of business because fewer customers trust the retailer."

Although there are concerns about data privacy, bias, and ethics when a company considers adopting AI systems, Schneider and his co-authors believe AI applications add value. And despite their caution, the authors are optimistic about AI's impact on retail, believing that "retailers who can appropriately harness the power of AI will thrive."

Looking to future solutions, Schneider said retailers want usable data without privacy and bias concerns. "The questions are not whether AI should be adopted by retailers, but how AI should be adopted and who should oversee the adverse effects of AI."

“We can’t expect the people who created the problems to solve the problems,” Schneider said. “We need people who are primarily knowledgeable about the philosophical and statistical aspects of privacy and bias, with no intention of destroying the commercial value of the data.”

Media interested in speaking with Schneider should contact Annie Korp, Deputy Director, News and Media Relations, at 215-571-4244 or
