Experience Level Objectives (XLOs) represent a fundamental evolution in monitoring philosophy, moving beyond the conventional Service Level Objectives (SLOs) and SLAs that have dominated IT operations for years.
This post examines the key differences between these approaches and explains why XLOs provide a more business-aligned framework for modern digital operations.
User-centric vs. infrastructure-centric measurements
Traditional SLA and SLO monitoring has primarily focused on system availability and IT infrastructure health. This approach centers on technical metrics like uptime percentages, server response times, and infrastructure resource utilization. While these metrics provide valuable insights into system health, they create a significant disconnect between technical indicators and actual business metrics.
In contrast, XLO monitoring prioritizes metrics that directly gauge user experience and satisfaction. This shift reflects a growing recognition that digital service quality cannot be measured solely by whether systems are functioning, but rather by how well they are functioning from the user's perspective. As research increasingly shows, "slow is the new down"—acknowledging that poor performance, even without complete failure, can severely impact user satisfaction and business outcomes.
This philosophical difference addresses a critical blind spot in traditional monitoring approaches. A system can report 100% uptime while delivering a frustratingly slow experience that drives users away. XLOs close this gap by measuring what actually matters to users: the quality and speed of their interactions with digital services.
The importance of monitoring from where it matters
Most monitoring tools rely on cloud-based vantage points for digital experience monitoring – convenient (for the vendor), but disconnected from the actual user experience. These first-mile checks confirm whether the infrastructure is up, but say little about how your application is experienced by users in the real world. Hence, they are primarily useful for QA purposes, especially for new code releases.
XLOs shift the perspective. They depend on insights captured from where users truly are – whether that's a connection inside an office through a regional ISP, a mobile connection through a mobile operator, or even a laptop connected via Starlink. This visibility uncovers the real issues users face: congestion, routing delays, delays from third-party code, and other last-mile failures that cloud monitoring can't see.
If SLOs tell you your system is available, XLOs tell you whether it’s delivering the experience the business expects to real users. This outside-in view is what turns data into real business insight. It closes the visibility gap between infrastructure health and user experience—and that’s where the real value lies.
End-to-End Journey Perspective
Traditional SLOs often focus on individual components or services, creating a fragmented view of performance. XLOs, by contrast, are designed to capture the complete user journey across multiple systems and services. This end-to-end perspective reflects the reality that users experience services holistically, not as isolated components. Modern digital services span multiple providers, platforms, and technologies, making isolated component monitoring inadequate for ensuring overall service quality.
While an SLA may measure the uptime of an S3 storage bucket, or the uptime of your DNS or CDN provider, these are only three of dozens or hundreds of components in an entire system. As a rule of thumb, the quality of the experience a system delivers is only as good as the worst of its components. Thus, while most components could be working perfectly, an issue in a third-party API may render the entire experience unacceptable for your users.
The XLO, by contrast, is less concerned with CPU utilization or database response time, and entirely focused on the resulting experience for a user – whether the user is a customer, an internal user, or an API consumed by an internal or external system.
Business alignment and value demonstration
A critical difference between XLOs and traditional SLOs is their alignment with business outcomes. Traditional SLOs primarily serve technical teams, measuring system health in terms that may not translate directly to business impact, while SLAs establish accountability from vendors that deliver a component of the functionality of a system. This creates challenges in demonstrating IT's value to business stakeholders and securing resources for performance improvements.
XLOs fundamentally change this dynamic by providing metrics that directly correlate with business performance. By moving beyond "Is it up?" to answer "Is it meeting our users’ expectations?", XLOs address what business stakeholders actually care about. This alignment helps prove the value of IT Operations and justify investments in performance improvements by demonstrating clear connections between technical performance and business outcomes.
As more components of our business and personal lives are based on digital experiences or supported by digital processes, delivering on those expectations is a business priority. A recent survey of thousands of users showed that bad digital experiences are the main reason consumers switch to different banking providers.
As a specific example, a team can set specific XLO targets that reflect business priorities, such as ensuring the critical part of loading a page, measured as Largest Contentful Paint (LCP), does not exceed 2.5 seconds 90% of the time in a given month. This specific threshold directly impacts bounce rates and user engagement, providing clear business value.
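To make that concrete, here is a minimal sketch in TypeScript of how such a target could be checked against collected LCP samples; the 2.5-second threshold and 90% target mirror the example above, and the sample data is hypothetical.

```typescript
// Check a monthly XLO of "LCP <= 2.5s for at least 90% of samples".
// `samplesMs` would come from real-user or synthetic monitoring.
function xloAttainment(samplesMs: number[], thresholdMs = 2500): number {
  const good = samplesMs.filter((ms) => ms <= thresholdMs).length;
  return good / samplesMs.length; // fraction of "good" experiences
}

const lcpSamplesMs = [1800, 2100, 3200, 2400, 2600, 1900]; // hypothetical data
const attainment = xloAttainment(lcpSamplesMs);
console.log(`XLO ${attainment >= 0.9 ? "met" : "missed"}: ${(attainment * 100).toFixed(1)}% within 2.5s`);
```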
Accelerating maturity with XLOs
According to the GigaOm Maturity Model for IPM, organizations progress through five stages—from chaotic, reactive operations to optimized, business-driven monitoring. Traditional SLOs keep teams stuck in the early stages, focused on infrastructure uptime and siloed metrics. XLOs act as a catalyst for maturity by:
Aligning with advanced stages: XLOs introduce user-focused metrics that resonate with the 'Quantitative' and 'Optimized' stages, emphasizing business outcomes.
Facilitating proactive issue detection: Tools like burndown charts enable early identification of performance degradations, a hallmark of mature operations.
Fostering cross-functional collaboration: XLOs unify teams around shared objectives, essential for achieving higher maturity levels.
For example, a retail company using XLOs to monitor checkout flow performance (e.g., Time to Interactive across regions) isn’t just fixing errors—they’re optimizing a revenue-critical journey, a hallmark of GigaOm’s value-based observability.
Proactive vs. Reactive Monitoring
Traditional SLO monitoring often creates a reactive posture, where teams respond to issues after they've already impacted users. This approach typically waits for error thresholds to trigger alerts before teams mobilize to address problems. Once these thresholds are crossed, the business is already suffering some impact.
XLO monitoring enables a substantially more proactive approach. By tracking performance trends over time and proactively simulating user experiences from their real-world locations, businesses can detect gradual degradations before they breach critical thresholds – and often before they impact users.
Tracking XLOs over time is where burn-down charts come into play. Burn-down charts track the progress of your performance against your set objectives, showing how much of your performance budget is left as time goes on.
When a team adopts XLOs as a KPI, it influences how the teams make decisions, how they see success, and what risks are acceptable. Operations can evaluate whether to release changes based on their projected impact on experience metrics, maintaining consistently high user satisfaction. In this way, burn down charts offer a clear status of service health over periods of time.
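To illustrate the arithmetic behind a burn-down chart, here is a minimal sketch in TypeScript, assuming a 90% monthly target: the remaining 10% of expected samples forms the "bad experience" budget, and every sample that misses the threshold burns part of it. All names and numbers are illustrative.

```typescript
// Burn-down sketch: a 90% monthly XLO leaves a 10% "bad sample" budget;
// each sample over the threshold consumes part of it. Plotting
// `remaining` after every batch of samples yields the burn-down chart.
function budgetRemaining(samplesMs: number[], thresholdMs: number,
                         target: number, expectedMonthlySamples: number): number {
  const budget = (1 - target) * expectedMonthlySamples; // allowed bad samples
  const burned = samplesMs.filter((ms) => ms > thresholdMs).length;
  return Math.max(0, 1 - burned / budget); // fraction of budget left
}

// Hypothetical early-month check: 3 bad samples against a 1,000-sample budget.
const samples = [2600, 2400, 3100, 2200, 2450, 2700];
const remaining = budgetRemaining(samples, 2500, 0.9, 10_000);
console.log(`${(remaining * 100).toFixed(1)}% of the performance budget left`);
```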
Breaking down organizational silos
A significant practical difference between XLO and traditional SLO approaches lies in their organizational impact. Traditional SLOs often reinforce existing silos between development, operations, and business teams, as each group focuses on their own specialized metrics.
XLOs, by contrast, create a common language and shared objectives across organizational boundaries. By providing metrics that matter to both technical and business stakeholders, XLOs facilitate cross-functional collaboration and shared accountability for user experience. This collaborative approach enables faster problem resolution and more effective performance optimization.
Building a digital operations center (DOC)
For a long time, IT operations teams have built NOCs and SOCs to manage network operations and security. In today's world, where most business interactions are digital, many organizations are formalizing their cross-functional efforts as they mature by building Digital Operations Centers (DOCs).
A DOC brings together teams across IT, engineering, and business functions to monitor experience-centric metrics in real time. With XLOs at the core, a DOC isn’t just a control room—it’s a shared space for aligning around user outcomes, accelerating response times, and making performance a business-wide priority. It’s a sign of maturity and a strategic investment in digital resilience.
A DOC puts digital user experience at the center of the business and provides visibility into how every critical digital operation in the business performs – and into the performance of all the key components that contribute to delivering that experience – from the internet backbone to third-party components, cloud services, APIs, DNS, front-end servers, databases, and microservices, down to application code.
A DOC is a natural evolution of the NOC and SOC as IT operations teams evolve from a systems-uptime focus into a true operational intelligence team that is a critical component of how the business operates, rather than only the team keeping the lights on.
Specific Experience Metrics
XLO monitoring can measure specific performance metrics that directly impact user experience, including:
Wait Time: The duration between the user’s request and the server’s initial response
Response Time: The total time taken for the server to process a request and send back the complete response
First Contentful Paint (FCP): The time it takes for the browser to render the first piece of content on the screen
Largest Contentful Paint (LCP): The time at which the largest content element becomes visible in the browser
Cumulative Layout Shift (CLS): A measure of how much the layout of the page shifts unexpectedly during loading
Time to Interactive: The time it takes for a page to become fully interactive and responsive to user inputs
These metrics create a multidimensional view of the user experience that traditional infrastructure-focused SLOs simply cannot provide.
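As a minimal sketch of how the browser-side metrics above can be collected, the standard PerformanceObserver API exposes paint, largest-contentful-paint, and layout-shift entries; the loose cast is needed because layout-shift fields aren't in TypeScript's default DOM typings, and production code would more likely use a library such as Google's web-vitals.

```typescript
// Capture FCP, LCP, and CLS in the browser via PerformanceObserver.
const metrics: Record<string, number> = {};

new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === "first-contentful-paint") metrics.fcp = entry.startTime;
  }
}).observe({ type: "paint", buffered: true });

new PerformanceObserver((list) => {
  const entries = list.getEntries();
  // The most recent candidate reported is the current LCP value.
  metrics.lcp = entries[entries.length - 1].startTime;
}).observe({ type: "largest-contentful-paint", buffered: true });

let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const shift = entry as unknown as { value: number; hadRecentInput: boolean };
    if (!shift.hadRecentInput) cls += shift.value; // ignore user-driven shifts
  }
  metrics.cls = cls;
}).observe({ type: "layout-shift", buffered: true });

// `metrics` can then be reported to the monitoring backend that
// evaluates the XLO targets described above.
```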
The Strategic Value of XLO Monitoring
SLOs and XLOs aren't just buzzwords; they're guiding principles for ensuring performance indicators align with real customer expectations.
The SRE Report 2025
According to the SRE Report 2025, 40% of businesses are prioritizing the adoption of SLOs and XLOs over the next 12 months. By focusing on user experience rather than just system availability, providing specific experience-focused metrics, aligning with business outcomes, enabling proactive optimization, capturing end-to-end journeys, and breaking down organizational silos, XLOs provide a more comprehensive and business-relevant approach to monitoring.
This evolution reflects changing expectations from both users and businesses.
For organizations seeking to improve digital experience quality while demonstrating clear business value from IT investments, XLOs offer a powerful framework that goes beyond traditional SLO limitations. By implementing XLO monitoring, organizations can align technical performance with business objectives, ultimately delivering superior digital experiences that drive competitive advantage.
This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
- PlayStation's Project Defiant fight stick is officially called FlexStrike
- The fight stick will pack mechanical switch buttons, PS Link support, and instantly swappable stick gates
- It's set to launch sometime in 2026
PlayStation's Project Defiant fight stick finally has an official name, alongside brand new details and a vague release window.
A new PlayStation Blog post has revealed that Project Defiant is officially called the FlexStrike, and it's currently set to arrive sometime in 2026. The news comes right before Sony's own EVO 2025 fighting game tournament event in Las Vegas, where the FlexStrike will be on display (but not playable) for the first time.
FlexStrike will be compatible with both PS5 and PC, and it supports Sony's proprietary PlayStation Link wireless tech. Here, a PlayStation Link USB adapter can be used to hook up a compatible gaming headset - like the Pulse Elite or Pulse Explore earbuds - as well as up to two FlexStrike controllers for local play.
Like many of the best fight sticks, the FlexStrike will also be customizable to a degree. One really cool feature shown in the trailer (above) is a 'toolless' gate swap. By opening the non-slip grip at the bottom, players will be able to swap between square, circular, and octagonal gates on the fly with the joystick. This means you won't have to buy a separate joystick or gate, or use any additional tools to get the job done.
The controller has several amenities you'll find on other top fight sticks, including a stick input swap for menu navigation, and a lock switch that disables certain buttons (like pausing) for tournament play. The eight face buttons are also mechanical, which means they should register clicky, instantaneous inputs.
Lastly, players can use a DualSense Wireless Controller in tandem with the FlexStrike for menu navigation, not unlike what we see with the PlayStation Access controller.
PlayStation appears to be investing quite heavily in fighting game hardware and software. It's likely that the FlexStrike will launch around the same time as Marvel Tokon: Fighting Souls, published by PlayStation Studios and developed by Arc System Works, the team behind Guilty Gear Strive, Granblue Fantasy Versus: Rising, and many more of the best fighting games.
TechRadar Gaming will be very keen to deliver a verdict on the FlexStrike when it launches next year, so stay tuned for a potential review in 2026.
- Spider-Man: Brand New Day's position on the MCU timeline might have been revealed
- Filming will begin in Glasgow, Scotland in August
- A Marvel fan has snapped some set photos that indicate when it could be set
Spider-Man: Brand New Day won't arrive in theaters until July 2026, but some fans think they've already worked out where it'll sit on the Marvel timeline.
With filming due to begin on Spider-Man: Brand New Day in August, preparations have been underway in Glasgow for a number of weeks now. The Scottish city is being used as a stand-in for New York City (NYC), so Glaswegians have seen their hometown receive a US makeover before the cameras start rolling.
One eagle-eyed Marvel fan has wasted no time snapping images of the sets being erected for Spider-Man 4, too. Indeed, X/Twitter user lukec1605 recently uploaded some photographs that indicate what year it might take place in.
Photos from set on #SpiderManBrandNewDay @eavoss @NewRockstars pic.twitter.com/LZICv2Iohf (July 28, 2025)
As the above post reveals, the Marvel Cinematic Universe's (MCU) version of NYC is being renovated, with numerous construction builds in progress. This might have something to do with events that occurred in Thunderbolts*, aka one of three new movies released by Marvel Studios this year. That film is set in the MCU's present, which is believed to be the year 2027. You can read more about what happened in that flick via our Thunderbolts* ending explained piece.
But I'm getting off-track. Two of the images in the aforementioned post reveal that work is due to be completed on these renovations and new builds by December 2027. Cue MCU fans jumping to conclusions and convincing themselves that the next Marvel Phase 6 movie will take place in late 2027.
I'm not convinced this is the case, though. Those pictures only indicate that the buildings will be erected before that year ends. Depending on their size, such builds can take multiple years to complete, too. It's entirely possible, then, that Spider-Man's next outing in the MCU could be set in early or mid-2027, or even sometime in 2026.
Some Marvel fans don't think Spider-Man 4 will be set in late 2027 (Image credit: Reddit)
There's evidence that Brand New Day could take place well before December 2027 as well. Season 1 of Daredevil: Born Again, whose story is thought to play out between late 2026 and early 2027, sees Wilson Fisk become NYC's latest mayor. Throughout the Disney+ show's first installment, Fisk fast-tracks a number of developments in the city, so it's plausible that the ongoing construction work was greenlit by him. If that's the case, events in Spider-Man 4 might run concurrent to Daredevil: Born Again season 1.
That said, Jon Bernthal's Frank Castle/The Punisher will have a supporting role to play in Brand New Day. The last time we saw him, i.e. in Born Again's season 1 finale, he escaped captivity after being incarcerated in a secret prison facility patrolled by Fisk's Anti-Vigilante Task Force. In order to show up in Spider-Man 4, he'll need to have broken out of jail before that film begins. This would mean Brand New Day has to take place from mid-2027 onwards.
Hopefully, we'll get a better idea of when the film is set, plus who Stranger Things' Sadie Sink is playing in Spider-Man 4, when principal photography finally gets underway. In the meantime, find out why Spider-Man: Brand New Day's release was delayed or learn more about how its official title takes its cue from the most controversial moment in Spidey's comic book history.
- New data claims European companies hold 15% of the European cloud market, down from 29% in 2017
- Amazon, Microsoft and Google hold a combined 70% of the European market
- Geopolitical tensions could change things somewhat
New data from Synergy Research has claimed European providers of cloud storage and other services only account for 15% of their own regional market, highlighting the hold that US rivals have even in foreign territories.
Overall market share dropped to around 15% in 2022, remaining steady ever since, but in the five years from 2017 to 2022 European cloud providers lost half of their share, down from 29%.
While European providers were able to triple their revenues between 2017 and 2024, the market grew sixfold in that same period – it's now worth an estimated €61 billion.
Europe's cloud market is dominated by... the US
Amazon, Microsoft and Google now control around 70% of the European cloud market, Synergy found, with SAP and Deutsche Telekom confirmed to be the leading EU providers, but with just 2% of the market each. OVHcloud, Telecom Italia and Orange rounded out the top five.
Synergy described the dominance of US cloud giants as an "impossible hill to climb" for European challengers, with US providers typically investing around €10 billion every single quarter into European infrastructure. On the flip side, European firms typically lack the long-term investment support required by the cloud sector.
"The cloud market is a game of scale where aspiring leaders have to place huge financial bets, must have a long-term view of investments and profitability, must maintain a focused determination to succeed, and must consistently achieve operational excellence," Synergy Chief Analyst John Dinsdale explained.
However, change could be on the horizon with data privacy issues bubbling to the surface under Trump-era US policies - as Microsoft recently admitted it can't guarantee data sovereignty in Europe if the US government demands access.
Still, Dinsdale believes the US cloud dominance could be hard to shake off now that it's embedded in Europe: "While many European cloud providers will continue to grow, they are unlikely to move the needle much in terms of overall European market share."
- Windows 11 24H2 had a strange bug that messed with the mouse
- It made the mouse cursor larger after the PC woke from sleep (or was rebooted)
- Microsoft has seemingly fixed this problem with the July update
Microsoft has reportedly fixed a bug in Windows 11 which caused the mouse cursor to supersize itself in irritating fashion under certain circumstances.
Windows Latest explained the nature of the bug, and provided a video illustrating the odd behavior. It shows the mouse cursor being at its default size (which is '1' in the slider in settings for the mouse), and yet clearly the cursor is far larger than it should be.
When Windows Latest manipulates the slider to make the mouse cursor larger, then returns it to a size of '1', the cursor ends up being corrected and back to normal. Apparently, this issue manifests after resuming from sleep on a Windows 11 PC.
Windows Latest says this bug has been kicking around since Windows 11 24H2 first arrived (in October last year), but the issue hasn't been a constant thorn in its side. Seemingly it has only happened now and again – but nonetheless, it's been a continued annoyance.
Not anymore, though, because apparently with the July update for Windows 11, the problem has been fixed.
Analysis: Mouse matters
Oddly enough, Microsoft never acknowledged this issue, although other Windows 11 users certainly have – Windows Latest hasn't been alone in suffering at the hands of this bug.
I've spotted a few reports on Reddit regarding the issue, and some posters have experienced the supersized cursor after rebooting their machine rather than coming back from sleep mode (and there are similar complaints on Microsoft's own help forums).
Whatever the case, the issue seems to be fairly random in terms of when or whether it occurs, but the commonality is some kind of change of state for the PC in terms of sleeping or restarting.
While the mouse cursor changing size may not sound like that big a deal, it's actually pretty disruptive. As Windows Latest observes, having a supersized cursor can make it fiddlier and more difficult to select smaller menu items in apps or Windows 11 itself.
And if you weren't aware of the mentioned workaround – to head into the Settings app, find the mouse size slider, and adjust it – you might end up rebooting your PC to cure the problem. And that's if a reboot does actually fix things, because, as some others have noted, restarting can cause the issue, too.
This was an irksome glitch, then, so it's good to hear that it's now apparently resolved with the latest update for Windows 11.
- Xbox will now require age verification under the UK's Online Safety Act
- Microsoft says "starting early next year", certain Xbox social features will be limited to friends only in the UK unless age verification is complete
- Players will need to use a government ID, passport, credit card, or other forms of identification to complete the process
Microsoft has announced that it will require age verification for the continued use of Xbox social features, per the UK's Online Safety Act.
In a new Xbox support post, Microsoft said: "As part of our compliance programme for the UK Online Safety Act and our ongoing investments in tools and technologies that help ensure age-appropriate experiences, we're introducing age verification for Microsoft accounts in the UK."
The company explained that players over the age of 18 who don't verify their age between now and the beginning of 2026 can still play their Xbox console, but "starting early next year", certain social features will be limited to friends only unless age verification is complete.
For now, accounts belonging to players 18 and over in the UK are being asked to verify their accounts and will begin seeing notifications encouraging them to verify their age. This is an optional process for now, but it will change come early 2026.
Until an account's age is verified, users will only be able to use voice and text communication, party functionality, game invites, and user-generated content like the Activity Feed with existing friends.
Without age verification, the Looking for Group and custom clubs features won't be accessible.
"If you have an existing account or are setting up a new one, you may be asked to verify your age using Yoti, a trusted and secure third-party identity verification service," the post reads.
There are several ways to verify identity, including with a government-issued photo ID, like a passport, residency card, or any other government-issued identification document with the user's picture on it.
They can also use a live photo, ID verification, a mobile number to verify age through their carrier, and a credit card check.
"Whether a player verifies their age will not affect any previous purchases, entitlements, gameplay history, achievements, or the ability to play and purchase games, however we encourage players to verify their age via this one-time process now to avoid uninterrupted use of social features on Xbox in the future," said Xbox vice president of gaming trust and safety Kim Kunes in a separate Xbox Wire post.
"As this age verification process rolls out across the UK, we’ll continue to evaluate how we can keep players around the world safe and learn from the UK process. We expect to roll out age verification processes to more regions in the future. There is no one-size-fits-all solution to player safety, so these methods may look different across regions and experiences."
Xbox isn't the first platform to be affected by the UK's Online Safety Act. Reddit and Discord have also implemented new age verification systems to access 18+ content; however, gamers are already getting around Discord's tool by using Death Stranding's photo mode.
- Xbox will be showcasing Hollow Knight: Silksong at Gamescom 2025
- The game will be playable on the upcoming Asus ROG Xbox Ally and ROG Xbox Ally X handhelds at the Xbox booth
- Other titles present include Grounded 2 and Ninja Gaiden 4
Xbox has unveiled its plans for Gamescom 2025, which will include the opportunity to play a Hollow Knight: Silksong demo.
The brand will have a strong presence at the European gaming event, which runs from August 20 to 24 in Cologne, Germany. The Xbox booth will show off more than 20 games across a whopping 120 demo stations, alongside offering photo opportunities and unique experiences.
Big highlights include hands-on time with the Asus ROG Xbox Ally and ROG Xbox Ally X, the two recently revealed Xbox PC handhelds. A demo of Hollow Knight: Silksong will be playable on the handhelds, potentially giving us our first substantial look at the long-awaited game in years.
Hollow Knight: Silksong was first announced back in 2019, but we have hardly heard a peep about it aside from a few brief appearances at various showcase events, such as the Nintendo Switch 2 Direct earlier this year. The game also featured prominently in the Asus ROG Xbox Ally and ROG Xbox Ally X reveal, where it was confirmed that it would be available in time for the handhelds' launch.
Could this mean that a Hollow Knight: Silksong release is around the corner? It definitely seems so, especially with the handhelds slated for later this year.
The Xbox booth will also offer visitors the chance to try the likes of Grounded 2, the first public hands-on demo of Ninja Gaiden 4, in addition to some third-party titles like Borderlands 4 and Metal Gear Solid Delta: Snake Eater.
- Google Cloud survey finds even cybersecurity experts are overwhelmed by too many threat notifications
- The security field is suffering from a skills shortage, putting firms at risk
- Perhaps unsurprisingly, researchers say the answer is AI
Security professionals have long reported high levels of stress and burnout, compounded by a skills shortage in the industry. New research claims the sheer volume of threats, as well as the data those threats produce, is putting firms at risk.
Research from Google Cloud found threat notifications aren't the helpful tool they could be, and in fact can overwhelm security teams, with nearly two-thirds (61%) of security practitioners saying they think there are 'too many threat intelligence data feeds', and 60% believing there are too few threat analysts to sift through the data efficiently.
“Rather than aiding efficiency, myriad [threat intelligence] feeds inundate security teams with data, making it hard to extract useful insights or prioritize and respond to threats. Security teams need visibility into relevant threats, AI-powered correlation at scale, and skilled defenders to use actionable insights, enabling a shift from a reactive to a proactive security posture,” the study argued.
Needles in a haystack
Too much data leaves analysts stuck in 'reactive mode', with 86% of respondents saying their organisation has gaps in its understanding of the threat landscape, 85% saying more focus could be put on emerging threats, and 72% saying they are mostly reactive to threats, unable to get ahead of trends.
Adjacent research from SentinelOne shows that a large proportion of cloud security alerts are false positives (not relevant to the organisation). The majority of respondents (53%) say that over half of the alerts they receive are false positives, outlining just how real the 'alert fatigue' is.
This makes securing cloud environments difficult, say 92% of respondents, with too many point solutions leading to management and integration issues, creating more alerts of lower quality, and therefore slower reactions to attacks thanks to the confusion.
Perhaps unsurprisingly, both sets of research have one suggestion to solve this issue - and it’s not investing in better training and support to address the skills shortage. Instead, you guessed it, it’s AI.
AI can help ease the pressure by improving an organisation’s ability to operationalise threat intelligence, generating ‘easy-to-read summaries’ and recommending next-steps to ‘uplevel junior analysts’, Google's research says.
"We believe the key is to embed threat intelligence directly into security workflows and tools, so it can be accessed and analyzed quickly and effectively," noted Jayce Nichols, Google Cloud Director, Intelligence Solutions.
"AI has a vital role in this integration, helping to synthesize the raw data, manage repetitive tasks, and reduce toil to free human analysts to focus their efforts on critical decision-making."
Mainstream media channels were once considered brands' primary destination for digital marketing across inspiration, consideration, and conversion, but that is no longer true today.
With the growing diversification of the media landscape, Retail Media Networks (RMNs), collections of digital channels owned by retailers, have emerged among the fastest-growing digital media channels.
With healthy annual double-digit growth, the global retail media market is expected to reach $179.5 billion by 2025. In the UK alone, retail media ad spending is expected to outdo TV ad spending in 2025 and exceed £7 billion in 2028.
Amazon leads the pack with the lion's share of retail media revenue (~$60bn in 2024). Walmart is a distant second (~$4bn). This gap speaks to the market's growth potential and the intense competition facing other RMNs.
Compared to thin traditional retail margins, RMN margins typically exceed 70%. Considering this additional revenue stream and margin contribution potential, many retailers have entered the fray: over 200 RMNs have been launched in the last few years.
The rise of RMNs
The availability of various social media and online channels means the path to purchase is no longer linear and now spans multiple channels. Post-pandemic, consumer behavior has changed significantly, as seen in the emergence of the Research Online Purchase Offline or 'ROPO' effect.
Both local and large brands are constantly seeking opportunities to create brand awareness across available channels. They want to reach consumers with the right messages, right content, and at the right moment on their path to purchase.
Today's retailers offer a variety of ad units and formats with audience reach across an extended ecosystem, including their own onsite, in-store, and partner networks. Most importantly, retailers with the right shopper loyalty programs have high-quality first-party (1P) data that advertisers want to capitalize on. Therefore, advertisers are more willing to invest in retail media that can deliver incrementality and ROI.
A well-established RMN can create a true flywheel effect for retailers, growing sales, consumer experience, and ad revenue.
Challenges to the effectiveness of RMNs
Despite the opportunity the RMN business presents for retailers, they may not generate the expected revenues from brands and their agencies for various reasons, such as the lack of a suitable operating model and technology capabilities. The retail business requires a buyer mindset, while media requires a seller mindset.
The absence of integrated joint business planning (JBP) hampers collaboration between retailer and brand organizations. Insufficient technology capabilities lead to poor 1P data and limited ad inventory and formats, without a self-service model or supplier insights to verify ROI and incrementality. Organizations often apply the wrong metrics to measure success. RMNs also face intense competition from other retailers.
Ingredients of a successful RMN
Currently, over 80% of RMN spend by brands goes to onsite channels (the retailer's .com site and mobile app) in the form of sponsored products, brands, display ads, and videos - their primary focus is bottom-of-the-funnel marketing.
Retailers have a high-margin revenue stream in monetizing their 1P data across their omnichannel properties – ecommerce sites, mobile apps, in-store ad units, magazines, and themed events – by becoming full-funnel players.
With offsite channels like Meta, Google, TikTok, CTV, and in-store digital screens, RMNs can transform into full-funnel marketing channels. Many have already become omnichannel media owners through strategic partnerships like Tesco Media & Insights + ITVX and Walmart + TikTok.
The following steps will support the success of RMNs:
- The right organization structure, well-defined operating model and strategically aligned retail teams and RMN teams
- High-quality 1P data that can activate various audience segments
- Ad units and formats across onsite, offsite, and in-store channels which can shape in-store themed experiences
- ML-based audience activation and measurement across the channels
- Data clean rooms to combine 1P data with partner data, safely and securely
- Self-service and managed service options for an integrated campaign booking, activation and reporting
- Experimentation platform that can perform A/B testing
- Real-time insights - metrics like Return on Ad Spend (ROAS), Incremental Return on Ad Spend (iROAS), incrementality, brand lift, sales lift (see the sketch after this list)
- Standardization of metrics and definition of measurement
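As a minimal sketch of the two headline metrics under their usual definitions (all figures are illustrative assumptions): ROAS divides attributed revenue by ad spend, while iROAS counts only the revenue lift versus a holdout (control) group, such as one produced by the A/B testing platform above.

```typescript
// ROAS: attributed revenue per unit of ad spend.
function roas(attributedRevenue: number, adSpend: number): number {
  return attributedRevenue / adSpend;
}

// iROAS: only the incremental revenue versus a holdout group counts,
// netting out sales that would have happened anyway.
function iRoas(testRevenue: number, controlRevenue: number, adSpend: number): number {
  return (testRevenue - controlRevenue) / adSpend;
}

// A campaign can look strong on ROAS (8x) yet much weaker on iROAS (2x)
// once organic purchases are netted out.
console.log(roas(80_000, 10_000));          // 8
console.log(iRoas(80_000, 60_000, 10_000)); // 2
```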
When it comes to in-store, the ability to integrate ad servers with the screens delivering ad content, including a feedback loop on aspects like the number of impressions shown and view time, is crucial. By mapping these metrics against in-store purchases, retailers help brands get an accurate view of sales incrementality, iROAS, and other key metrics to close the marketing loop.
RMNs that offer a 360-degree view of customer interactions across retailer touchpoints will help brands achieve micro-segmentation and hyper-personalization.
From media buyer to agency mindset
To compete against the likes of Amazon, Google, and Meta, RMNs must demonstrate how they can provide superior ROI to brand advertisers by leveraging AI and ML technologies that influence consumer behavior. A consulting partner like Infosys can draw from its vast experience in implementing and integrating such technology platforms for global retailers.
Above all, retailers must begin to view RMN earnings as an additional revenue stream derived from a brand's marketing spend. Those able to effectively don an agency's hat in selling ad performance will encourage brands to entrust these precious marketing resources to them.
- PayPal now comes as a built-in part of Wix Payments
- Merchants can now manage all transactions in a single dashboard
- The new feature is currently only available in the US
Top online payments system PayPal and one of the best website builders, Wix, have strengthened their partnership with new integrations, making operations simpler for ecommerce website owners, and checkouts easier for customers.
PayPal now comes as a built-in part of Wix Payments, meaning merchants will be able to connect their PayPal Business accounts, and manage all transactions in a single dashboard, alongside other Wix Payments activity. Previously, if merchants running Wix websites wanted to offer PayPal as a payment gateway, they had to switch between two platforms for all operations, including reports, chargeback alerts, and payouts.
Furthermore, the money from PayPal purchases will now flow directly into the Wix Payments account, giving merchants clearer visibility over their income, and reducing the need to reconcile between two systems.
Wix PayPal
Merchants will also be able to benefit from PayPal's broader suite of features, such as PayPal Pay Later (BNPL) and Venmo, it was said. Finally, PayPal will also now serve as a Payment Service Provider (PSP), processing card purchases within Wix Payments.
“We’re always looking for ways to create more seamless experiences for our users and provide them with the best way to accept payments and manage funds online, in person, and on the go,” said Amit Sagiv and Volodymyr Tsukur, Co-Heads of Wix Payments.
“By bringing PayPal under the Wix Payments umbrella, we gain significantly more control over the user experience and how PayPal’s products are delivered to our merchants. This deeper integration allows us to improve conversion, offer more value, and drive stronger profitability, while giving our users a faster, more unified checkout flow.”
At press time, the new integration is only available to Wix Payments users in the US - however, the company said there are plans to make this feature available in more regions "over time".
- Report finds 73% are aware of Copilot+ PCs, but only 33% see AI as an important purchasing factor
- Many business buyers are more interested in Windows 11 support
- Price, lack of use cases and interoperability are also concerns
Although AI PCs are becoming increasingly available to both consumers and businesses, it seems firms are still not rushing to buy them.
New data from Canalys found around three-quarters (73%) of B2B partners were aware of Copilot+ PCs between March and April 2025, yet only one in three considered AI capabilities important in purchasing decisions.
Despite the huge performance updates, businesses still look to be prioritizing Windows 11 refreshes and battery life over Copilot+ exclusive features, particularly with the Windows 10 end of life on the horizon.
Copilot+ PCs don't seem to be taking off
Initially launched with Qualcomm Snapdragon X chips and later available with Intel Core Ultra 200V and AMD Ryzen AI 300 series chips, Copilot+ PCs are seen as high-end devices with 40+ TOPS NPUs for local AI processing.
Canalys' data shows nearly one in four (23%) PCs sold globally in the final three months of 2024 was an AI PC; however, this is a generalized term that means different things across the industry. For Canalys, it means that the devices include a "chipset or block for dedicated AI workloads such as an NPU."
However, Context Senior Analyst Marie-Christine Pygott explained (via The Register) only 9% of the 1.2 million AI-capable PCs shipped by European distributors in Q2 2025 were classified as Copilot+ PCs, meeting the 40 TOPS requirement.
Pygott blamed the slow uptake on high pricing, a lack of use cases, and low awareness of what a Copilot+ PC is and what it can do. Some enterprise customers have also been reluctant to move to Arm-based Snapdragon chips due to software compatibility issues.
However, things could be on the verge of changing, with a recent Dell survey revealing that around three in five (62%) IT decision-makers would prefer a Copilot+ PC over a regular PC.
Looking ahead, Canalys expects 60% of the PCs shipped in 2027 to be AI-capable, with 2025 potentially seeing them hold a 40% market share.
Today, our world relies on maps – think about how many apps and services you use daily, both personally and professionally, that use a location-based component.
Given how much of the world relies on maps, you’d think there are lots of maps designed to allow businesses and their developers to solve specific problems. Surprisingly, there are few maps for businesses to build with and integrate into their own applications and use cases.
While proprietary maps do come with much-needed quality and reliability, they also come with the huge sacrifice of not being able to combine useful data from other map ecosystems, providers and open sources. They’re not interoperable. As a result, most maps will never be as rich as they could be for their specific use case.
So, what challenges does this pose for organizations and developers innovating with digital maps and location data? And how can they find the right commercial mapping solution to enable new services and products to flourish?
Today's map data integration challenge
In most cases, the digital maps we have today resulted from a single use case. However, when digital maps are built with a single end use, they lose their dynamism and become static and rigid — more akin to paper maps of old than the powerful, data-rich tools they can be.
This has meant that all kinds of organizations across the private and public sector have had to make do with limitations imposed upon them. Companies that build with map data have had to develop and maintain their own map stacks, balancing data from disparate sources that all reference different base maps and somehow making it all work.
They’ve had to invest significant time, money and resources into adapting their maps, fitting their data to its structure and making it work for their use case. Over the years, these maps have been modified and adapted to work for other use cases and have become large and unwieldy.
Ultimately, something that’s adapted to solve a single problem is never going to be as good as something bespoke and purpose built to solve many problems – but what is the solution?
Striving for a standardized, interoperable, open future
Now, organizations and their developers must select from the available mapping providers to determine which solution will meet their unique requirements. What has been missing from the market, however, is a solution in which all companies and devices can collaborate and communicate through a single digital representation of the physical world.
In a fast-paced and competitive landscape, companies shouldn’t be restricted on how to build for their customers, rather they should be empowered to utilize maps in the best way possible. They need a geospatial standardized map; one they can add their own data to and innovate on top of.
Think of it like the Internet – if every tech company created its own Internet and data couldn't be moved between these systems, there would be a huge cost in moving that data around, and the Internet wouldn't have developed into what it is today.
This layered approach, built on an open standard, will ensure that all parts of the digital stack work together, without the need to resolve or conflate data from one platform to another. This level of interoperability saves time, effort and a lot of headaches later down the line when the businesses try to meld data from another source or add additional functionalities.
Most importantly, this will free up resources so developers can focus on creating new services and products that are specific to customers’ needs and wants. With everyone working from the same standard, data becomes much easier to share and work with, acting as a catalyst for innovation.
Putting this into practice – and elevating it with AI
AI and machine learning are turning that traditional approach to maps on its head – allowing businesses to create new services faster, more accurately and with fewer developer and operator hours. With AI and machine learning, developers are better equipped to process data and turn observations into edits, updates and features as quickly and accurately as possible.
Humans are still required to check for errors, continually improve algorithms and ensure the AI is doing what it’s supposed to do. However, machines can now do the heavy, laborious lifting. It’s increasing the accuracy and freshness of maps and making developers far more effective and productive.
What does this look like in practice? For the automotive industry, a standardized AI-enabled base map will allow carmakers to integrate real-time traffic data, vehicle-to-infrastructure communication and even electric vehicle charging stations into a cohesive system that supports the future of mobility.
In the public sector, those developing smart cities will benefit from the privacy, precision and flexibility offered by a standardized, AI-driven base map. With real-time data at their disposal, city planners can create more efficient transport networks, improve infrastructure, and develop smart systems that respond to the changing needs of their citizens. Furthermore, with the ability to add their own data into a private layer, it’s incredibly valuable to applications where data protection is paramount.
Meanwhile, in logistics, the ability to quickly adapt to changes in road conditions, optimize delivery routes, and integrate external data – from fuel consumption to environmental impact – into a map is a game-changer for companies seeking to streamline operations and reduce costs.
In the future, maps will continue to be a core tool in the functioning of global business, navigation and our daily lives. However, maps – specifically, the way they are made – need to adapt to give organizations the flexibility and scalability needed to make everything work well together.
When an orchestra is all playing from the same sheet music, guided by an expert conductor, symphonies are created. In the context of maps, standardization brings enhanced accuracy, freshness and interoperability. Only through this unified, collaborative approach will innovation and end user satisfaction skyrocket.
- Rumors suggest Nvidia's RTX 5000 series Super GPU models are on track for launch later in 2025
- The RTX 5080 Super is speculated to have 24GB of VRAM, matching the RTX 4090's
- It appears to be the ideal opportunity for Nvidia to improve upon its Blackwell launch mishaps
It's been a long time since Nvidia launched its RTX 5000 series GPUs in late January, followed by other configurations in later months, after a CES 2025 keynote that showcased the Blackwell GPUs. However, it seems Nvidia might not be done with new GPU launches in 2025 just yet.
According to TweakTown, Nvidia is set to launch RTX 5000 series Super models later this holiday season, which typically means November or December. The RTX 5080, RTX 5070 Ti, and RTX 5070 are the GPUs reported to receive Super upgrades, with the new 5080 and 5070 Ti reportedly set to use 24GB of VRAM.
Pricing isn't finalized, and there aren't any figures to work from at this point. But considering Team Green's previous move was reducing the RTX 4080 Super price (as a slightly more powerful GPU) compared to the standard model, we could see a similar pattern again.
There's no sugarcoating the level of controversy that shrouded the Blackwell GPUs, with missing specs (ROPs), a lack of availability, and most importantly, inflated prices across multiple online retailers. With the RTX 5000 series Super models, Nvidia and, notably, its board partners, have a chance to right those wrongs.
A combination of improved performance across the board and adjusted price points may work wonders – and that mostly applies to the RTX 5080 potentially closing the gap on the RTX 4090 (supposedly using 24GB of VRAM). It may be even more interesting to see an RTX 5060 Super using more VRAM, but we'll have to wait and see.
Analysis: If prices for these Super GPUs are out of whack, then forget I even mentioned this...
Above all, if these Super GPU model rumors are legitimate, prices will once again determine their success. While I'm aware that Nvidia may have good intentions with more reasonable pricing, all of that work could be undone by board partners and retailers marking up prices significantly.
It's the same issue that botched AMD's Radeon RX 9000 series launch for many; the Radeon RX 9070 XT was seen as the inexpensive and powerful alternative to Blackwell mid to high-tier GPUs, at $599 / £569 (around AU$944), but the market told a different story with prices soaring far above that.
Fortunately, prices have recently fallen back down to original retail pricing, which I'm seeing with more stock and availability for both Team Green and Team Red GPUs than ever before.
I'm hoping that prices can stabilize and stay within reasonable ranges leading up to the eventual launch of Nvidia's new Super GPU models, as it could decrease the chances of ludicrous pricing. Let's just watch this space...
Fortnite is running the Super Showdown event later this week (August 2), and so far we know that it'll involve Superman in a big way. This is the latest in a string of live events that've been airing in Fortnite this year, and I'm expecting it to lead nicely into Season 4.
What's new in Fortnite?
Epic Games just launched the collab for The Fantastic Four: The First Steps, with movie-inspired skins available as rewards as part of a new Tournament. Soon, we'll see a brand new Season of Fortnite, launching Chapter 6 Season 4 for players to dive into. At present, we don't have much info on what to expect, though we'll get news as launch day approaches.
We're currently in Fortnite Chapter 6 Season 3, a superhero-themed affair that adds super-powered items and a completely new ranking system. The next season of Fortnite is just around the corner, however, so the game will be getting a big refresh very soon indeed. It's regular updates like these that have kept Fortnite firmly ranked in our best free games to play in 2025 list.
Here's what you need to know about Fortnite Super Showdown, including the start time and how to watch it on the day. It's a live Story Event, and it's set to be a Superman-led battle against a gigantic foe. Let's dive in.
Fortnite Super Showdown - cut to the chase
- Start time: August 2 at 2:30pm ET / 11:30am PT / 7:30pm BST
- Doors open: August 2 at 2pm ET / 11am PT / 7pm BST
- Summary: Superman Story Event in Battle Royale
Fortnite Super Showdown will start on August 2 at 2:30pm ET / 11:30am PT / 7:30pm BST. Doors will open half an hour prior, and it's recommended that you jump in at the following times to secure your place:
- East Coast US: 2pm ET (August 2)
- West Coast US: 11am PT (August 2)
- UK time: 7pm BST (August 2)
Fortnite Super Showdown is a live event that'll begin at the times specified earlier in this article. If you want to watch it live, you can jump in yourself, and there will likely be a safe zone around Demon's Domain where players won't be able to eliminate each other.
If you can't log in yourself, TechRadar Gaming will be covering the event as part of a live blog (as we did recently with the Fortnite OG rocket launch). I'll be giving my impressions as they happen, and providing up-to-date info on how the event is unfolding. You can also join your Twitch or YouTube streamer of choice, as there'll no doubt be many streaming the event. Note that Epic Games doesn't broadcast these events live on its official channels.
Fortnite Super Showdown teaser trailer
Superman returns to help save the island August 2 in this season's Super Showdown Story Event! pic.twitter.com/Vcr2QmSBQo (July 27, 2025)
The Fortnite X channel tweeted out a teaser trailer for the upcoming Super Showdown event (embedded above). In it, we see the eye of a giant creature, which many believe to be a kraken. Then, the current map is shown with Demon's Domain highlighted as the main location for the event.
Fortnite Super Showdown Story Event - what to expect
Fortnite Super Showdown will feature a giant battle between Superman and an as-yet unrevealed foe. We know that it's a huge enemy with a big white eye, and many fans are predicting it to be a kraken. Other than that, we know that it'll all take place in Demon's Domain and will likely give some teases as to what's coming next in Chapter 6.
Epic Games will probably reveal more closer to launch, and once it does, I'll be sure to update this page.