
The Oakley Meta smart glasses reek of surveillance capitalism

Billed as a wearable device to optimise physical performance, the Oakley Meta smart glasses may just be a catalyst for 24-7 surveillance and targeted advertising on an unprecedented level.

Meta loves itself a hefty privacy scandal, and this could be another in the making.

By now you’ve probably seen the ads featuring Kylian Mbappé and Patrick Mahomes for the Oakley Meta ‘Performance AI glasses’. Taking a colossal dump on the second-gen Ray-Ban Meta frames that rolled out in 2023, this wearable has had its specs upgraded across the board.

We’re talking an ultra-wide 12MP camera capable of 3K footage, an LED privacy light, POV video calling, livestreaming, and voice-activated query-solving abilities. You could ask it, for instance, for details about the wind direction and speed before shanking your golf shot miles wide anyway.

Battery life is reportedly eight hours on a single charge, with standby stretching to 19 hours before the glasses need topping up. The regular and premium models will drop this month for $399 and $499 respectively, complete with Meta AI capable of analysing and describing what the camera sees.

The marketing spiel pushes the smart glasses as performance enhancers, showing active folk skating, hooping, skydiving, and doing various other activities while utilising live contextual information and other neat features.

At face value, you may see the device as the natural progression of consumer technology and the culmination of a recent boom in biohacking. However, those of us more sceptical – and let’s face it, Meta has given us ample reason to be – are more interested in the company’s grander motivations beyond making a few dollars in an emerging market.

When it comes to privacy scandals, Meta has a rap sheet longer than Pablo Escobar’s in narcotics. Since the humdinger that was Cambridge Analytica in 2018, Zuckerberg and co, undeterred, have been in and out of court regularly for offences related to dodgy data gathering and targeted ads.

It’s no secret that Meta generates the vast majority of its revenue from advertising (over 97%, in fact), meaning understanding consumer behaviour is central to maximising profit and providing return on investment for brands.

What better way to understand a consumer than to be literally attached to their eyeballs?

This isn’t tinfoil hat stuff, either. Given Meta’s reputation for data brokering under the table, are we to believe the company isn’t interested in the locations we’re visiting, what we’re eating, or what subway ads our eyes linger on for a few moments? Its integrated AI doesn’t exactly lie dormant until we lie down on a bench press or hop on a surfboard.

Could this product actually be a front for surveillance capitalism, packaged and sold as performance enhancement?

We know Meta is determined to train its Meta AI app on customer data. The company is reportedly testing camera-roll access to analyse unpublished photos on our phones, and has been criticised for duping people out of sensitive information using dark patterns – default settings that process private data unless manually toggled off.

At Cannes Lions 2025, the team witnessed first-hand WhatsApp offering free ice creams in exchange for people’s data. That’s what Meta is on.

Playing it by the book categorically isn’t Meta’s style, and yet we’re expected to believe it would pass up the opportunity to train its AI on a golden dataset of round-the-clock, live consumer behaviour? It’s not just the wearer, either: family members and friends unaware that your glasses are packed with surveillance equipment are susceptible to being scraped.

Even if you know your cookie preferences like the back of your hand, there’s no explicit guarantee your data won’t be profiled or that you’ll be exempt from targeted ads. Besides, who knows what other types of monitoring could be under the bonnet.

We don’t need a weatherman to tell us it’s raining outside, or yet more privacy scandals to wake us up to Meta’s motivations. Reading the terms and conditions may not save you.
