Why read a review when AI can do it for you?

Amazon is adopting generative artificial intelligence (AI) in a new feature that summarises human-written product reviews into bite-sized highlights.

Amazon, the e-commerce giant and fifth-largest company in the world, is deploying a generative AI solution to ease the review-scouring process for its shoppers.

The company is well known for normalising public customer feedback, having first launched reviews on its platform in 1995 – a radical idea at the time that has since become a mainstay for contemporary online vendors.

Reviews have long been a focal point on the platform, with the company routinely prioritising review accessibility and accuracy over false embellishment of the products it sells.

“Last year alone, 125 million customers contributed nearly 1.5 billion reviews and ratings to Amazon stores,” said Vaughn Schermerhorn, Amazon’s Director of Community Shopping.

“That’s 45 reviews every second, making reviews at Amazon an incredible resource for customers.”

But as the internet evolves and the average attention span of online shoppers dwindles, the prospect of reading a lengthy review or product description may be enough to squash a potential purchase.

Amazon’s solution? AI.

“We want to make it even easier for customers to understand the common themes across reviews, and with the recent advancements in generative AI, we believe we have the technical means to address this long-standing customer need,” said Schermerhorn.

The new AI-powered feature analyses written reviews for frequently mentioned product features and customer sentiments, then highlights them in a short paragraph on the product detail page “to help customers determine at a glance whether a product is right for them.”

Already available to a subset of mobile shoppers in the US, the AI-powered feature also highlights key product insights – such as ‘picture quality’, ‘ease of use’ and ‘value’ – and allows customers to surface reviews that mention specific product attributes.
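Amazon has not published how the feature works under the hood, but a minimal sketch of the general approach it describes – tallying frequently mentioned product attributes and their sentiment across reviews, then handing those themes to a generative model to summarise – might look something like the following. The attribute lexicon, sentiment word lists and function names here are illustrative assumptions, not Amazon’s implementation.

```python
# Hypothetical sketch only: Amazon has not disclosed its review-highlights pipeline.
# This illustrates the general idea of counting frequently mentioned attributes and
# their sentiment, then building a prompt a generative model could summarise.
from collections import Counter, defaultdict

# Toy attribute lexicon and sentiment cues; a production system would learn these.
ATTRIBUTES = ["picture quality", "ease of use", "value"]
POSITIVE = {"great", "easy", "excellent", "good", "love"}
NEGATIVE = {"poor", "hard", "bad", "disappointing"}

def extract_highlights(reviews):
    mentions = Counter()
    sentiment = defaultdict(lambda: [0, 0])  # attribute -> [positive hits, negative hits]
    for review in reviews:
        text = review.lower()
        words = set(text.replace(".", " ").replace(",", " ").split())
        for attr in ATTRIBUTES:
            if attr in text:
                mentions[attr] += 1
                sentiment[attr][0] += len(words & POSITIVE)
                sentiment[attr][1] += len(words & NEGATIVE)
    return mentions, sentiment

def build_summary_prompt(mentions, sentiment):
    # A generative model would turn these themes into the short paragraph
    # shown on the product detail page.
    lines = []
    for attr, count in mentions.most_common(3):
        pos, neg = sentiment[attr]
        tone = "mostly positive" if pos >= neg else "mixed or negative"
        lines.append(f"- {attr}: mentioned in {count} reviews, sentiment {tone}")
    return "Summarise these review themes in one short paragraph for shoppers:\n" + "\n".join(lines)

reviews = [
    "Great picture quality and easy to set up.",
    "Picture quality is excellent but the remote is disappointing.",
    "Good value for the price.",
]
mentions, sentiment = extract_highlights(reviews)
print(build_summary_prompt(mentions, sentiment))
```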

“We are always testing, learning, and fine-tuning our AI models to improve the customer experience and, based on customer feedback, may expand our review highlights feature to additional categories and customers in the coming months,” said Schermerhorn.

AI models levelled at dodgy reviews

Amazon’s new feature arrives alongside two new lawsuits from the e-commerce giant against fake review brokers.

In 2022, the company proactively blocked more than 200 million suspected fake reviews from its stores, and as of the end of July 2023 it had taken legal action against 120 review fraudsters across the US, China and Europe.

According to Amazon, this influx of fake reviews is driven by the emergence of a “fake review broker” industry – where illicit brokers solicit customers to write fake or favourable reviews in exchange for money, free products or other incentives.

“These fraudsters knowingly conduct illicit activity in an attempt to deceive Amazon customers and harm Amazon selling partners through the facilitation of fake reviews and other fake content,” said Amazon.

In its efforts to halt fake reviews on its platform, Amazon uses machine-learning models that analyse thousands of data points to detect risk.

These data points include sign-in activity, review history, relations to other accounts and “other indications of unusual behaviour”.

“A combination of machine-learning models and expert investigators that use sophisticated fraud-detection tools analyse reviews for fraudulent patterns and suspicious activity prior to publication,” said Amazon.
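Amazon has not disclosed how those models are built, but a minimal sketch of the idea – combining signals such as sign-in activity, review history and links to other accounts into a single risk score before a review is published – could look like the sketch below. The signal names, weights and threshold are purely illustrative assumptions.

```python
# Hypothetical sketch only: Amazon has not published its fraud-detection models.
# This shows a simple logistic-style risk score built from the kinds of signals
# the company describes: sign-in activity, review history and account relations.
import math
from dataclasses import dataclass

@dataclass
class ReviewSignals:
    sign_ins_last_day: int        # sign-in activity
    reviews_last_week: int        # review history
    linked_flagged_accounts: int  # relations to other accounts
    account_age_days: int

# Illustrative weights; a real system would learn these from labelled data.
WEIGHTS = {
    "sign_ins_last_day": 0.05,
    "reviews_last_week": 0.30,
    "linked_flagged_accounts": 1.20,
    "new_account": 0.80,
}
BIAS = -3.0

def risk_score(s: ReviewSignals) -> float:
    """Return a probability-like risk score between 0 and 1."""
    z = BIAS
    z += WEIGHTS["sign_ins_last_day"] * s.sign_ins_last_day
    z += WEIGHTS["reviews_last_week"] * s.reviews_last_week
    z += WEIGHTS["linked_flagged_accounts"] * s.linked_flagged_accounts
    z += WEIGHTS["new_account"] * (1 if s.account_age_days < 30 else 0)
    return 1 / (1 + math.exp(-z))

suspicious = ReviewSignals(sign_ins_last_day=40, reviews_last_week=25,
                           linked_flagged_accounts=3, account_age_days=5)
# A high score would route the review to expert investigators before publication.
print(f"risk: {risk_score(suspicious):.2f}")
```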

While the company’s foray into AI-driven review highlights could help illegitimate reviews escape scrutiny by encouraging customers to forgo reading reviews altogether, Amazon said its AI-generated highlights draw only on its trusted review corpus.

Amazon has further called for collaboration from governments and private organisations to combat review brokers – labelling the prevalence of inauthentic reviews a “global problem”.

In addition to calling for “greater information sharing about known bad actors” and “better controls for services that facilitate fake review solicitation”, the company has voiced support for increased funding and clearer enforcement authority to hold bad actors accountable.

“These fraudsters need to be held accountable for the harms that they perpetrate on consumers, small businesses, and companies like Amazon,” said Dharmesh Mehta, Amazon’s Vice President of Worldwide Selling Partner Services.

“The fact is that this requires government bodies that have the appropriate enforcement authority and funding to pursue these fake review brokers.”