Tech giants are locked in an arms race to dominate AI-powered ecommerce. At Google I/O, we saw a preview of AI Mode in Search – a search experience where agents recommend products, populate visual panels, and complete purchases.
Next up is Apple’s WWDC, where the company is expected to provide an update on Apple Intelligence, though it's believed to be taking a more incremental approach to AI, in contrast to the rapid-fire rollouts from other companies racing to define the future of ecommerce.
These developer conferences, previously intended for insiders and engineers, are now mainstream moments because they’re shaping the future of shopping in real time.
Beneath the product demos and flashy interfaces, there’s a fundamental problem: today’s internet was built for humans, not machines.
AI Search Is Running on Outdated Infrastructure
The web we use today is a patchwork of skimmable layouts and visual cues meant to guide people, not machines, through a shopping experience. AI agents don’t browse like people do; they need structured metadata with real-time pricing, inventory, and clear product attributes.
When the data is inconsistent, unstructured, or scattered across interactions, AI agents struggle to extract meaning or skip it entirely. For brands and shoppers alike, that means products could become harder to find via AI, even when they are the best choice.
As AI-driven search platforms increasingly mediate product discovery, brands are losing visibility, traffic, and the ability to influence how they show up in the customer journey. The internet isn’t being rebuilt for AI; it’s being retrofitted. Many new interfaces look advanced on the surface but are layered over brittle, outdated infrastructure that machines struggle to understand.
Discovery Is Disappearing
We’re already seeing the early signals. Traffic from generative AI sources increased by 1,200% between July 2024 and February 2025, reflecting increased interest from consumers turning to AI tools for product discovery. The wave is arriving, but most brands aren’t yet positioned to take advantage of it because their websites aren’t designed to continue the AI user journey. Interfaces and product data often aren’t structured for agent interactions or optimized for LLM workflows.
Google’s AI Overviews can siphon off up to 64% of organic traffic, depending on the industry. It’s a dramatic shift in how discovery happens. For brands, it means fewer clicks, fewer opportunities for engagement, and far less control over how they're presented in the shopping journey.
As consumers increasingly use AI agents for shopping and product recommendations, they’ll discover a narrower range of brands and products. Those that are optimized will be more easily found because AI models prefer sources that provide clean, well-structured, commerce-ready data like real-time pricing, inventory, and agentic checkout capabilities.
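One concrete way to make a product "commerce-ready" for machines is schema.org Product markup embedded in a page as JSON-LD. As a rough sketch (the product name, SKU, and price below are invented for illustration, not drawn from any real catalog), here is how a page might generate that markup in Python:

```python
import json

# Hedged sketch: the product name, SKU and price here are invented for
# illustration; a real site would pull these from a live catalog so the
# pricing and availability fields stay current.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Running Shoe",
    "sku": "TRS-001",
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "USD",
        # Machine-readable stock state, a schema.org enumeration value
        "availability": "https://schema.org/InStock",
    },
}

# Serialize to JSON-LD, the structured-data format crawlers and agents parse
markup = json.dumps(product, indent=2)
print(markup)
```

Embedded in a page's `<script type="application/ld+json">` tag, markup like this gives an agent unambiguous price, currency, and availability fields rather than forcing it to scrape meaning out of a visual layout.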
Without that data, AI agents may surface outdated or missing product information, forcing shoppers back into traditional, clunky checkout processes. The brands that proactively become AI-friendly will benefit significantly, making the path forward clear for their shoppers.
Some platforms are starting to recognize the problem. Shopify’s new Catalog API gives agents access to structured product data, making it easier to surface listings in agent-led environments like Perplexity. The API improves visibility, but not interactivity. One-way infrastructure allows agents to access existing product data like descriptions and pricing, but the interaction ends there.
Two-way systems enable brands to proactively influence the experience, perhaps by offering a discount, surfacing related products, or adding free shipping depending on the customer interaction. Without two-way systems, brands will lose the control and context they’re accustomed to having.
What Brands Stand to Lose
AI innovation moves too quickly for brands to rely on incremental website updates. New model capabilities and consumer expectations emerge weekly, and without a flexible foundation built for constant adaptation, brands risk permanently falling behind.
Brands depend on search as the backbone of their visibility strategy to reach shoppers. Organic and paid search drove up to 80% of website traffic until Google’s AI Overviews launched a year ago. Now, with AI Mode, agents are changing how information is retrieved and displayed, threatening not just traffic but the entire infrastructure of how brands reach, understand, and convert consumers.
This amounts to more than a visibility problem. As AI agents handle more of the customer journey, brands are losing the direct connections they’ve spent years building and the rich data that comes along with them: no more behavioral signals, preference data, or owned loyalty loops. When agents become the interface, the relationship gets rewritten.
Without traffic to their own websites, they forfeit first-party analytics, personalized engagement, and long-term insight into customer behavior. Without clear data connections, they can't optimize experiences, measure ROI, or retain relevance. And without direct visibility, even brand affinity is at risk of erosion. In an AI-mediated internet, consumer choice gets collapsed into a single output. Unless a brand is structurally positioned to appear in that output, it might as well not exist.
A Programmatic Commerce Layer
This demands intelligent infrastructure. Brands should already be thinking about how they present their product information to make it legible to two important audiences: people and machines. Structured, real-time data is not optimization. It’s the baseline requirement for visibility, participation, and growth in an AI-first ecosystem.
In the AI internet, new subdomains like ai.brandname.com serve as intelligent storefronts that can serve both human customers and AI agents in one unified experience. Unlike traditional websites, which are updated piecemeal and built for human browsing, AI storefronts are built for speed, natural language, and agent-friendly architecture.
It’s Time to Rebuild Now
Brands know they’re losing clicks, but the big picture is that they’re losing the ability to participate in the next era of commerce. AI agents are rewriting the script for how discovery and conversion happen; brands that aren’t structurally visible won’t be outcompeted, they’ll be invisible. In the AI internet, visibility is engineered. This starts with rebuilding digital storefronts for humans and machines.
This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
- James Gunn has teased why Jason Momoa's Lobo is vital to Supergirl's plot
- His inclusion helped the DC movie's creative team to crack the story they want to tell
- Gunn also confirmed which actor will play the film's primary villain
James Gunn has confirmed who'll play the villain in Supergirl – and opened up on the importance of Jason Momoa's Lobo in the forthcoming DC Universe (DCU) movie.
In a broad-ranging interview on episode 15 of the official DC Studios podcast, Gunn revealed that Belgian actor Matthias Schoenaerts will portray Krem.
For the uninitiated: Krem is the Big Bad in 'Supergirl: Woman of Tomorrow', an eight-part graphic novel series that the second DCU film, which releases on June 26, 2026, is heavily inspired by. In fact, the movie bore the title of its comic book namesake until very recently, with Gunn admitting Supergirl: Woman of Tomorrow was now known by its much simpler and cleaner title Supergirl.
But back to Krem. Last October, Deadline claimed Schoenaerts had been cast as the movie's terrifying antagonist, but it's only now that Gunn has admitted The Regime and Amsterdam star is part of its cast. For more details on everyone else you'll see in Kara Zor-El's first feature film outing in over 40 years (the first, 1984's Supergirl, is available to stream on Max, FYI), check out my dedicated Supergirl guide.
Krem is the central antagonist of Supergirl: Woman of Tomorrow and the DCU movie it's influenced (Image credit: DC Comics)
That's not the only interesting information that Gunn discussed. Indeed, the DC Studios co-chief also provided more details on why Milly Alcock was cast as Supergirl, how director Craig Gillespie positively fought to include certain scenes in the superhero flick, and the initial text that Jason Momoa sent to Gunn to persuade him to let Momoa play Lobo.
It's a continuation of that final conversation that'll pique the interest of DC devotees. Indeed, as co-host/comic book expert Coy Jandreau mentioned during the podcast's latest installment, the original draft for Supergirl: Woman of Tomorrow's eight-part literary series was set to feature the immortal, motorbike-riding bounty hunter. Tom King, who wrote the graphic novel, confirmed this was the case in an interview with ComicBook.com in February 2023.
The invulnerable mercenary known as Lobo will have a small but important role in Supergirl (Image credit: DC Comics)
Given Lobo was due to appear in King and Bilquis Evely's comic book series before he was eventually cut from the story, plus the fact that Momoa will play the last surviving Czarnian in Supergirl, Jandreau asked if the forthcoming DCU Chapter One film would incorporate "some of the [comics'] original draft ideas" concerning how Lobo fits into the story that Supergirl will tell.
"Woman of Tomorrow, in the comics, is a bunch of little stories," Gunn said, "And we needed to create one through-line, one three-act, more traditional story. So, Lobo helps us to do that.
"It's not an amalgamation of him and Krem," Gunn added about rumors that Lobo and Krem would somehow be combined into a single character. "He [Lobo] is a totally separate character. I love Lobo. I always thought he was a great character to adapt and, maybe, in some way, the biggest comic book character that's never been in a film. So, I think it was a cool thing to do [include him in Supergirl], yeah."
Are you happy that Lobo is in Supergirl? And what do you make of Schoenaerts playing its main villain? Let me know in the comments.
- ASAF offers Dolby Atmos-style spatial audio, with more effects
- Available for all Apple platforms bar watchOS
- Focused primarily on Vision Pro
Apple has introduced a new format for head-tracking spatial audio: ASAF. Apple Spatial Audio Format promises "truly immersive audio experiences" and was unveiled quietly at last week's WWDC 2025 event – not in the keynote, but in a session for app developers.
As FlatpanelsHD explains, there are two components here: ASAF, which is used in audio and video production to position audio elements in a three-dimensional space, and APAC (Apple Positional Audio Codec), which is the codec that's used to deliver it.
If you're thinking "not another audio format" you're not alone: Samsung and Google are promoting Eclipsa Audio as a Dolby Atmos rival, too.
However, Apple's both is and isn't a Dolby Atmos rival – FlatpanelsHD reports that Dolby Atmos can be delivered within Apple's new format, which is then able to add some additional spatial audio tricks on top of it. So this appears to be less about replacing Atmos than expanding… though providing an alternative could be a big part of Apple's plan.
(Image credit: Apple)
What does ASAF mean for the future of audio?
That's a very good question, because at the moment ASAF is for Apple devices only: tvOS, iOS, iPadOS, macOS and visionOS. The iPhone 16 can be used to capture ASAF audio, and that ease of capture is probably something to pay attention to.
Initially, according to Apple's presentation, it looks like the focus – pun very much intended – is on the visionOS headset. Apple has mandated the use of APAC with all Immersive Video titles, although the codec can be used as a container for Dolby Atmos data instead of ASAF if the creator is already using that format.
The idea with ASAF's extra 3D skills are that they can alter the spatial sound not just based on your own head tracking and positioning, but also based on the virtual environment you're in, changing elements such as the volume and reverb to make the sound seem like it matches the world you're in. So you can see why it goes beyond Dolby Atmos, which just assume you're sitting still in the center of a virtual theater.
However, a further appeal may be offering smaller creators another simple way to deliver spatial content. Samsung told us, for example, that one of the goals of Eclipsa Audio was to ensure that smaller-scale content creators could create and deliver spatial audio videos easily.
Apple may be able to offer this too for podcasts and more, in the future: where previously it kept its formats proprietary, it's become more open in recent years and its Apple Lossless Audio Codec dropped its royalty scheme back in 2011. ASAF can apparently be created using industry-standard software and plugins.
The APAC codec reportedly works at bitrates as low as 64kbps and maxes out at 768kbps. That may seem low, but 768kbps is the same maximum bitrate that Apple, Netflix and others currently use to stream Dolby Atmos, so it'll match current quality standards.
Samsung's best earbuds, the Galaxy Buds 3 Pro, could be said to be a bit overpriced at launch – but at half price, they're a bargain. And that's what they are over at Woot, where the Galaxy Buds 3 Pro are down from their $249 MSRP to just $119.99.
In our in-depth Galaxy Buds 3 Pro review we praised their "fantastic sound", comfortable fit and excellent ANC, and our only real niggle was the price: at just shy of $250 they were "some of the most expensive earbuds designed for general consumers". We suggested the $219 Sony WF-1000XM5 instead, but at $119 the Galaxy Buds 3 Pro are now $100 cheaper than the Sonys and they deserve your cash: at this price, they're a steal.
Today's best Samsung Galaxy Buds 3 Pro deal
I'm not being dramatic here: at this price the Samsung Galaxy Buds 3 Pro are an absolute steal. By making them half price, Woot has made them $100 cheaper than their closest rival, Sony's WF-1000XM5, and cheaper than many less capable earbuds. At full price the Galaxy Buds 3 Pro get four out of five stars. At half price, they get six out of five.
As we said in our review, "these are top-end earbuds, especially in the audio quality and noise cancellation departments". They are up there with the very best earbuds we've ever tested in terms of sound quality, and they're capable of up to 24-bit/96kHz audio with compatible phones and tablets. Their adaptive EQ is very effective, and the soundstage is exciting and wide. Immersive audio is excellent too.
The ANC is "really great", we said: "The buds throw a blanket over whatever background noise is going on when you’re trying to listen to music." And you can dial down the intensity when you need to be aware of what's going on around you. Battery life is a decent six hours with ANC on, and seven with ANC off. With the charging case you get a total of 26 hours with ANC and 30 without.
I'll be honest. At $249 I don't think the Galaxy Buds 3 Pro are good value: they're great earbuds, but the market for non-Apple earbuds is packed with very good buds that cost considerably less. But at half price I think they offer superb value for money.