As personalized and user-centric offerings become a necessity for modern organizations, utilizing data effectively is critical to understanding customer and stakeholder needs. From public sector bodies and healthcare providers to financial institutions and software suppliers, it is now imperative for organizations to collect, store and organize data well.
Yet, unfortunately, many organizations are struggling to maintain clean, actionable data. In fact, a recent survey found that two-fifths (39%) of organizations have little to no data governance framework in place [1]. Years of inconsistent data practices and working in silos have left many departments with ‘dirty’, inadequate data that cannot be actioned.
This ongoing lack of effective data governance has resulted in organizations missing the valuable insights that would otherwise help them become better service providers.
Organizations across sectors, including public sector bodies, urgently need to take decisive action to mitigate any further damage their current data collection practices may be causing. They must also instill values that make data governance a priority, ensuring the information they collect and store is not only clean but also actionable.
How has this happened?
‘Dirty’, disorganized data stems from a multitude of factors. From collecting duplicate and incomplete records to a lack of integration, too many organizations have failed to manage data effectively. According to 2024 research, 44% of financial firms struggle to manage data stored across multiple locations [2]. This has hit their bottom line, with many incurring inflated costs. However, where and how data is stored is not the only problem.
In organizations where data governance remains a concern, data is often fragmented and inconsistent across departments. Instead of having integrated systems that deliver a single, dependable database, teams are working in data silos. For instance, separate sales and marketing teams at a digital bank may want to reach out to the same customers or prospects, but have their own isolated data sets. In a borough council, the social housing and waste collection teams may need to contact the same residents, yet they do not share their citizens’ records.
This disjointed approach produces ‘dirty’ data that is not only difficult to use, because the information is often incorrect, but also challenging to clean and then maintain. What’s more, ‘dirty’ data leads to conflicting insights, impacting decision-making, customer experience and overall business efficiency.
Commercial organizations risk falling behind competitors who can adjust their product lines in accordance with customer and market demands. Meanwhile, public sector bodies may not be delivering crucial services to the right citizens.
Who is responsible for ‘dirty’ data?
Poor data management comes in many forms, but perhaps the most prominent cause of ‘dirty’ data is ownership. While many heads of department perceive data governance as the responsibility of an organization's IT team, it is their departmental colleagues who actually use data on a day-to-day basis. An IT team can offer support by ensuring software and systems are working properly, but they are not the ones using information to interact with customers and stakeholders.
After all, it is the departments, such as finance, sales and marketing, that need customer and stakeholder engagement to succeed and that benefit from clean, actionable data. The same can be said for local authorities. For example, the social care and education teams need clean data to ensure they can identify the residents that qualify for their services. With this in mind, it is then reasonable to suggest that the prime beneficiaries of clean data should be the ones managing it. Fostering a culture of data responsibility, driven by a desire to create a single view of customer or citizen information, while investing in staff training, is the first step to resolving the human aspect of effective data governance.
Keeping data clean
The technical aspect involves adopting appropriate solutions to handle the initial clean-up and then maintain data accuracy. While having the right intentions is fundamental to establishing effective data governance, introducing appropriate technology allows departments to put their drive for change into practice.
The sheer volume of data that organizations need to collect, store and process means legacy, rules-based software is no longer fit for purpose. Instead, artificial intelligence (AI) and machine learning tools have been developed to spot patterns and inconsistencies in data. These newer tools can handle far larger volumes, so they are deployed to eradicate data duplication and are even mature enough to offer predictive data modelling.
These technologies maintain clean data and support the generation of actionable insights, so organizations can accommodate customers’ and citizens’ present and future needs. Successful adoption will happen gradually, but once it is achieved, automated data cleansing will boost productivity. By automating the manual processes that erode people’s time, organizations can empower people to prioritize and fulfill the tasks they do best.
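To make the idea of automated cleansing a little more concrete, here is a minimal sketch of the kind of fuzzy duplicate detection such tools perform at far greater scale. The record fields, threshold and matching logic are purely illustrative assumptions, not how any particular vendor's product works.

```python
# Minimal duplicate-detection sketch. The "name"/"email" fields and the 0.85
# similarity threshold are hypothetical; real data-quality platforms use far
# richer matching models than this simple fuzzy-string comparison.
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.85) -> bool:
    """Treat two strings as the same entity when they are close enough."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def find_duplicates(records: list[dict]) -> list[tuple[int, int]]:
    """Flag pairs of records that likely describe the same customer."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            same_email = records[i]["email"].lower() == records[j]["email"].lower()
            same_name = similar(records[i]["name"], records[j]["name"])
            if same_email or same_name:
                pairs.append((i, j))
    return pairs

records = [
    {"name": "Jane Smith", "email": "jane.smith@example.com"},
    {"name": "J. Smith",   "email": "jane.smith@example.com"},
    {"name": "John Doe",   "email": "jdoe@example.com"},
]
print(find_duplicates(records))  # [(0, 1)] - the first two rows refer to one customer
```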
Benefit from actionable insights
The responsibility for data governance cannot rest solely with IT teams. It must be a shared priority across departments, where those who rely most on data take an active role in ensuring its quality.
The benefits of clean data go beyond having easily accessible information that is always in the right place at the right time. Breaking down data silos allows better cohesion and collaboration, which in turn helps deliver actionable insights. From personalized marketing campaigns and optimized supply chains to issuing council tax bills and allocating social care budgets, clean data allows organizations to run more efficiently.
By investing in both technology, such as AI-powered automation tools, and a more responsible, and proactive, culture, companies can develop robust data management practices. Ultimately, the organizations that thrive will be those that treat data not as a by-product, but as a strategic asset.
This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
There’s no doubt that AI can offer businesses significant opportunities to enhance efficiency, unlock insights and improve their operations. However, making the leap from concept to effective execution remains a complex journey for many. Organizations are often overly optimistic about how easy AI will be to implement, but quickly find that generating real impact through scalable systems relies on more than ambition alone.
Unfortunately, all too often, promising AI initiatives remain stuck in "proof of concept purgatory", failing to move into production due to integration issues, particularly with back-end data. The truth is that AI will not succeed if the underlying processes and data are disorganized. AI thrives in environments where data is structured, connected, and easily navigable - by both machines and people. It must be embedded into workflows, not added as an afterthought. This is particularly crucial in high-stakes sectors, where the success of AI depends entirely on the quality and accessibility of information.
Beyond the basics
As automation and AI adoption accelerate, the challenge is no longer whether to adopt AI, but how to do it well. That means moving beyond the low-hanging fruit and prioritizing strategic implementation supported by data readiness and solutions that enable seamless integration.
Terms such as ‘Generative AI’, ‘Agentic AI’, ‘LLMs’ or, more broadly, ‘intelligent automation’ have certainly created a buzz in recent years, but many implementations are falling short of their true potential. In many cases, what businesses are actually deploying are advanced chatbots or deterministic systems that don’t fully leverage AI’s capabilities. A lot of businesses, for example, are still using AI for simple tasks like content generation, speech-to-text or, at most, the automation of simple processes. While using AI for such tasks is a valuable step to support productivity and free up employees, these straightforward processes only scratch the surface of what AI has to offer.
What does innovative AI look like?
True AI innovation often involves handling probabilistic tasks, where uncertainty and variability in data demand more advanced AI systems to guide decisions. To drive impact from AI, it’s time for organizations to move beyond the basic applications and start thinking about how AI can augment and support human decision-making and improve outcomes across a variety of channels.
This isn’t about replacing human workers, but supporting them with real-time insights. For those in contact center roles, effectively integrated AI can provide next-best-action recommendations and contextualized guidance during customer interactions - a significant shift from traditional rule-based systems to intelligent, adaptive support that empowers teams to make faster, more accurate decisions. Moreover, by automating routine and repetitive tasks, such as identifying intent or retrieving customer history, AI can help reduce friction in the customer journey. This not only improves operational efficiency but also elevates customer satisfaction, eliminating the need for customers to repeat themselves across touchpoints.
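As an illustration only, the sketch below shows the rough shape of a next-best-action helper. The intents, candidate actions and the simple heuristic are hypothetical stand-ins; in a real deployment the recommendation would come from trained models and live CRM data rather than a hard-coded table.

```python
# Illustrative next-best-action sketch. Intent labels, actions and the heuristic
# are invented for this example; production systems score candidates with models
# fed by real customer history.
from dataclasses import dataclass

@dataclass
class CustomerContext:
    intent: str           # e.g. detected from the conversation ("cancel_service")
    open_complaints: int  # pulled automatically from the customer's history
    tenure_years: float

ACTION_RULES = {
    "billing_dispute": ["offer_bill_review", "escalate_to_billing_team"],
    "cancel_service":  ["offer_retention_discount", "schedule_callback"],
}

def next_best_action(ctx: CustomerContext) -> str:
    """Recommend one action from the candidates for the detected intent."""
    candidates = ACTION_RULES.get(ctx.intent, ["route_to_general_queue"])
    # Long-tenured or already-frustrated customers get the higher-touch option.
    if ctx.open_complaints > 0 or ctx.tenure_years >= 5:
        return candidates[0]
    return candidates[-1]

print(next_best_action(CustomerContext("cancel_service", open_complaints=1, tenure_years=6.0)))
# offer_retention_discount
```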
The integration dilemma
Unfortunately, for many sectors, the biggest roadblock to impactful AI adoption is the complexity of integrating it with legacy systems. While using an AI bot to automate content generation or customer service tasks is fairly straightforward, getting that system to access and interact with real customer data - such as CRM systems, product databases or service records - can be a monumental challenge. Many public sector organizations, for example, run hundreds of different systems concurrently, each managing a different aspect of customer service or data collection. The real challenge lies in making sure all these systems talk to each other effectively and that AI can access the relevant data from across the organization securely.
Without seamless integration, AI cannot function optimally, and its promise of transforming business operations becomes much harder to achieve. After all, AI can only be as effective as the data it relies on. If data is disjointed or stored in silos across different systems, AI will struggle to deliver meaningful insights or guide decisions effectively. To overcome this, organizations need to look at their processes and workflows holistically, ensuring data within these systems is well-organized, consistent and accessible.
This may require reorganizing data and making bold decisions about whether the underlying legacy technology is still right for the business’s needs. This is where process mapping is an essential starting point: the practice of creating a detailed map of all the workflows scattered across the business and visualizing them to understand the direct and indirect impact one process may have on another.
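To show why process mapping surfaces hidden dependencies, here is a small sketch that models processes as a directed graph and walks it to find every downstream workflow a change would touch. The process names and the graph itself are hypothetical examples, not a prescribed tool or methodology.

```python
# Process mapping as a directed dependency graph. "A -> B" means process A feeds
# data into process B; walking the graph reveals indirect impacts, e.g. a CRM
# change quietly affecting financial reporting. All names here are hypothetical.
from collections import deque

PROCESS_MAP = {
    "crm_updates":          ["customer_master_data"],
    "customer_master_data": ["invoicing", "service_scheduling"],
    "invoicing":            ["financial_reporting"],
    "service_scheduling":   [],
    "financial_reporting":  [],
}

def downstream_impact(start: str) -> set[str]:
    """Return every process directly or indirectly affected by a change to `start`."""
    seen, queue = set(), deque(PROCESS_MAP.get(start, []))
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.add(node)
            queue.extend(PROCESS_MAP.get(node, []))
    return seen

print(sorted(downstream_impact("crm_updates")))
# ['customer_master_data', 'financial_reporting', 'invoicing', 'service_scheduling']
```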
From concept to impact
Shifting the dial on AI from concept to meaningful impact requires organizations to take a pragmatic, outcome-focused approach. AI should be incorporated intelligently, and is often most successful when it augments existing systems. Platform-based AI tools with low-code capabilities can offer organizations a strong route here, breaking down barriers to development and removing the need to rip and replace existing solutions.
Adopting a more systematic and intelligent approach to implementation is equally important. AI should only be applied where it clearly adds value. Gaining visibility into workflows and identifying process bottlenecks is key to this, helping to ensure AI is targeted at areas that deliver measurable improvements.
By focusing on augmentation over replacement, adopting platform-based AI tools that support integration, and aligning AI initiatives with business needs, organizations can unlock scalable, sustainable AI outcomes that go far beyond the proof-of-concept stage.
- Hackers seen targeting misconfigured JupyterLab instances
- They host malware in polyglot files on image sharing sites
- The Koske malware mines different crypto tokens
Security researchers recently discovered a new Linux malware hiding in pictures of cute animals.
Cybersecurity experts from AquaSec recently found a piece of malware called Koske circulating around the web. It relies on polyglot files - files that are valid in more than one format and are read differently depending on the program that opens them.
The threat actors were apparently targeting JupyterLab instances exposed to the internet, and misconfigured in a way that allows remote command execution. After finding and accessing such endpoints, the attackers would pull .JPEG files from legitimate image hosting services such as OVH images, freeimage, or postimage. The pictures were of AI-generated panda bears, innocuous at first sight.
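AquaSec's write-up does not spell out Koske's exact file layout, but a common polyglot trick is appending executable content after an otherwise valid image. The sketch below shows one simple defensive heuristic: checking whether meaningful data trails the JPEG end-of-image marker. The marker is standard JPEG structure; the demo bytes and the idea of using this as a quick triage check are illustrative assumptions, not a detection tool.

```python
# Rough triage heuristic for image polyglots: report how many bytes follow the
# last JPEG end-of-image (EOI) marker. Embedded thumbnails and other edge cases
# make this imprecise, so treat it as illustrative only.
JPEG_EOI = b"\xff\xd9"  # marker that normally closes a well-formed JPEG

def trailing_payload_size(data: bytes) -> int:
    """Bytes appended after the last EOI marker (0 for a clean-looking JPEG)."""
    eoi = data.rfind(JPEG_EOI)
    if eoi == -1:
        return len(data)  # no EOI at all: not a well-formed JPEG
    return len(data) - (eoi + len(JPEG_EOI))

# Tiny in-memory demo: a fake minimal "image" with a script-like tail appended.
fake_jpeg = b"\xff\xd8" + b"\x00\x00" + b"\xff\xd9" + b"echo hello"
print(trailing_payload_size(fake_jpeg))  # 10 bytes trail the image data
```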
Serbian hackers?
Through a script interpreter, the payloads hidden in the images are executed as CPU- and GPU-optimized cryptocurrency miners, using the server’s resources to mine more than 18 types of crypto tokens.
Cryptocurrency “mining” is essentially a process of supporting a blockchain network. In exchange for lending electricity, internet and computing power to the network, users are rewarded with cryptocurrency tokens whose value depends on factors such as the number of users, the number of tokens in circulation, and the cost of mining.
Mining crypto this way generates relatively little profit for the attackers, some researchers said, while racking up huge costs for the victims - cloud compute power and electricity are often quite expensive.
AquaSec could not attribute the malware to a specific group definitively, but it did say that it found Serbia-based IP addresses used in the attacks, Serbian phrases in the scripts, and Slovak language in the GitHub repository hosting the miners.
In that context, the name of the malware makes some sense, since “Koske” in colloquial or dialectal usage means “bones”.
The researchers believe that, like the images, the malware itself was written with the help of large language models (LLMs) or automation frameworks.
Via BleepingComputer
- OpenAI brings Agent mode to its Mac app
- Agent mode is accessible from the toolbar menu
- Now you can leave long tasks running in the background while you work on something else
OpenAI has added its new Agent mode to its ChatGPT macOS app, and it’s available right now if you’re a Plus subscriber. When you use ChatGPT through the Mac app, you’ll see Agent as one of the options in the toolbar that lives under the prompt window. Select it, and now you’re in Agent mode.
OpenAI launched Agent mode, or ChatGPT Agent, as it is also called, last week, and it’s a way to combine all the power of Deep Research with the agentic properties of its previous agent, called Operator.
During the launch event, Sam Altman and co showed off several uses for Agent, such as planning big events, like a wedding, or producing a presentation based on a wide range of data it has to go off and find.
OpenAI describes ChatGPT Agent as "ChatGPT that can think and act, proactively choosing from a toolbox of agentic skills to complete tasks for you using its own computer."
(Image credit: OpenAI)
Alternative access
OpenAI also makes a version of its app for Windows users, but this hasn’t had Agent mode added yet.
We have noticed that some people don’t see the new Agent icon in their toolbar on the Mac app yet.
Presumably, this will be fixed as the change rolls out, but if you are using the Mac app and you don’t see an Agent button at the bottom of your interface, then don’t worry, you can still access Agent mode.
Simply type a / and a list of options will appear, one of which is 'Agent mode'.
(Image credit: OpenAI)
The desktop app version of ChatGPT is more deeply integrated into the operating system than ChatGPT in a web browser.
For example, you can launch ChatGPT from any screen on your desktop with a keyboard shortcut. Use Option + Space on macOS or Alt + Space on Windows.
The desktop version of the ChatGPT app also features Advanced Voice Mode, so you can chat with ChatGPT in real time using the microphone.
While we welcome the new ChatGPT Agent mode, it does open your Mac up to more potential vulnerabilities. This was something the OpenAI team addressed at the launch, and Sam Altman said that there could be a new wave of threats targeting AI agents that are simply trying to be helpful.
- For the first time ever, the AMD Ryzen AI Max+ 395 will be used in GPD's new handheld gaming PC
- It's currently AMD's most powerful mobile processor
- The GPD Win 5 is expected to be fully unveiled at Chinajoy 2025
Handheld gaming is expanding with new devices like the MSI Claw A8 and ROG Xbox Ally on the horizon, but pricing has been a significant concern for many. However, a handheld that could easily eclipse other handheld PCs has been teased – and it will likely launch with a hefty price tag.
As reported by our friends at Tom's Hardware, GPD has teased a new handheld gaming PC on X, the GPD Win 5, powered by AMD's most powerful mobile 'Strix Halo' processor, the Ryzen AI Max+ 395. It's expected to be unveiled at Chinajoy 2025, which begins on August 1.
In the video (which you can find below), the GPD Win 5 is running Black Myth: Wukong and achieving up to 200 fps. Now, it's too early to make performance comparisons, as we don't know what graphics settings are enabled or whether frame generation is being used (which I would assume it is).
However, we know that the Ryzen AI Max+ 395 gets very close to an RTX 4070 laptop GPU in terms of performance – and that's quite possibly the best feat achieved by any AMD APU or SoC. It can run Cyberpunk 2077 at an average of 77 fps at 1440p on high graphics settings, with FSR 3 quality enabled, as highlighted in this benchmark test by AMD APU Gaming on YouTube.
"Breaking news! The GPD WIN 5 is expected to make its debut at Chinajoy 2025" - GPD on X, July 24, 2025 (pic.twitter.com/G6cwqajspJ)
Analysis: This niche handheld gaming PC is the only one with the right to be priced above $1,000
Don't get it twisted: this is still a handheld device, and I'd personally find it very hard to spend a fortune on one, but the GPD Win 5 is likely the only device that warrants a hefty price tag.
Its Strix Halo APU is the best gift that any portable device, or even a small form factor desktop PC, could ask for. The Cyberpunk 2077 benchmarks alone should be a strong indication of that; it's just a matter of how well the processor is utilized within a handheld chassis.
With the power needed for high-level performance (a TDP that can hit 58W), it raises concerns about battery life and adequate cooling in a small portable device. If it isn't a watered-down version of the Ryzen AI Max+ 395 we know, then the GPD Win 5 would essentially wield the power of a gaming laptop that sits just behind an RTX 4070 model.
One thing is clear, though: this will almost certainly cost a fortune. While I've been critical of mainstream handhelds being too costly, a premium price is a fair fit for a niche device like this - and a purchase I could get behind for anyone who wants a handheld that will last years without needing an upgrade.