
Willy Wonka-inspired Glasgow experience scam shows danger of AI

An ‘immersive’ Glasgow adventure day based on Charlie and the Chocolate Factory has been described as a scam, after ticket holders complained of a lacklustre experience. It’s an example of how unchecked AI-generated advertising can lead to serious product misrepresentation.

Nearly everyone has seen or read the classic tale of Charlie and the Chocolate Factory.

Written by Roald Dahl in 1964 and later adapted into several films starring Gene Wilder and Johnny Depp, it is considered a literary classic. It follows the experiences of a small boy, Charlie Bucket, as he takes a tour of Willy Wonka’s famous chocolate factory.

The novel describes fanciful lakes of chocolate, towering pipes of liquorice, and magical tunnels that spin Charlie’s sensory perceptions upside down. What it does not depict is an empty, grey warehouse in Glasgow with a few AI-generated banners crudely taped to a concrete wall.

That’s what unfortunate ticket holders for ‘Welcome to Willy’s Chocolate Experience’ were treated to this past weekend.


What was ‘Welcome to Willy’s Chocolate Experience’ supposed to be?

Organised by events company House of Illuminati, ‘Welcome to Willy’s Chocolate Experience’ was advertised as an immersive, in-person experience that could turn ‘chocolate dreams [into] reality.’

The official website boasted an ‘enchanted garden, giant sweets, vibrant blooms, mysterious looking sculptures, and magical surprises.’

Tickets started at £35 and were meant to include a host of decorated spaces and exhibits that would ‘transport you into a realm of creativity’. Live performances were promised alongside a ‘twilight tunnel’ and ‘imagination lab’.

According to reports, hundreds of parents purchased tickets and were sorely disappointed when they arrived at a mostly barren events space offering a single small bouncy castle, a few plastic chairs, and a school cafeteria table.

Proceedings were so poor that the police had to intervene and halt operations halfway through the first day.

Customers described the day as ‘appalling’ and an ‘absolute shambles of an event’.

Attendee Stuart Sinclair told Sky News that his children ‘only received a couple of jelly babies and a quarter of a can of Barr’s Limeade’. That’s a far cry from the life-altering adventure promised by House of Illuminati.

Decorations appeared to be props from a mix of franchises, including Santa’s Grotto candy canes, model mushrooms from an Alice in Wonderland exhibit, a nondescript ‘factory’ archway, and AI-generated posters. They were sparsely scattered across the large events space and did little to offset the overwhelming grey walls and flooring.

House of Illuminati has since been forced to issue full refunds, promising to give all customers their money back despite a ‘no refunds’ policy on its website. This may take up to ten days, it says. A Facebook group has been set up by disgruntled visitors who were let down by the day’s events.

A spokesperson added that the company was ‘devastated’ and that it ‘understood people’s anger and frustration’.


What went wrong here?

While this ordeal is undoubtedly funny, it does demonstrate the dangers AI poses when it comes to advertising, scams, and the misrepresentation of real-world experiences.

Anyone who has generated their own images or experimented with DALL-E could likely deduce that House of Illuminati’s promotional images were made entirely with AI. It’s fair to assume that much of the text was put together using ChatGPT too, as it seldom makes complete sense and is full of errors.

Given how obviously fake the whole thing seemed, is there some onus on customers who weren’t able to tell that something was ‘off’ here?

Perhaps. Consumers need to be more rigorous when researching events they’re paying hefty sums for in this new, AI-generated age. Any scam artist can chuck together impressive images at a moment’s notice. If something seems off, it probably is.

By that same token, however, it’s becoming increasingly easy to present professional imagery, branding, and products as if they were genuine experiences.

With OpenAI’s new video generation model, Sora, it will soon be possible for anyone to create convincing footage with a few simple prompts.

Imagine if House of Illuminati had got its hands on Sora. We’d have seen striking, moving footage of this fictional ‘experience’ that would likely have swindled even more customers.

It’s unreasonable to expect the general public to accurately determine whether a product is real or made with AI, especially older people or those unfamiliar with artificial intelligence.

We already struggle to distinguish real news from fake as it is, let alone with increasingly muddled imagery and video.

Ultimately, ‘Welcome to Willy’s Chocolate Experience’ was a prime (if hilarious) real-world example of AI fooling the general public into a day out that was anything but immersive.

It isn’t the first and won’t be the last.
