Data Privacy Day: What Is The State Of Data Privacy And Where Do We See It Heading In 2026?

Data Privacy Day has always been a moment for reflection, and this is more true today than ever before.

In 2026, it feels less like a symbolic calendar date and more like a necessary pause in an increasingly complex digital reality. As technology continues to develop and progress, the need for more advanced and robust data privacy grows with it.

We are generating more data than ever before, sharing it across more platforms and trusting more systems (many of them powered by AI) to process it on our behalf. From workplaces adopting generative tools to individuals living increasingly hybrid digital lives, the question is no longer whether data privacy matters – we know that it does. Rather, the question has become whether our current approaches are strong enough to keep pace with constantly evolving innovations in tech.

What’s changing isn’t just the volume of data, but rather, the nature of how it’s used altogether.

AI systems are now embedded into workflows, apps collect behavioural signals far beyond what users expect, and the boundaries between personal and professional data are blurring by the day.

At the same time, regulation is evolving around the world (albeit unevenly from country to country), public awareness is growing and organisations are under increasing pressure to prove that they’re handling data responsibly – which is very different from merely claiming that they are.

 

The New Privacy Challenge Is About Complexity, Not Just Compliance

 

For years, data privacy conversations have centred on compliance – ticking regulatory boxes, publishing policies and updating consent banners. While these things remain undeniably important, they no longer reflect the full reality that organisations are facing.

Modern data environments are sprawling. Data flows between SaaS platforms, internal tools, AI systems, analytics dashboards and third-party vendors, often with limited visibility into how information moves once it enters the ecosystem. This complexity makes traditional, static approaches to privacy feel increasingly outdated.

But, at the same time, technologies like agentic AI are introducing a new dynamic entirely.

These systems don’t just store or analyse data; they act on it. They make decisions, trigger workflows and interact with sensitive information at scale. That shift forces organisations to rethink privacy not just as a policy, but as an operational discipline built into identity, access, governance and everyday behaviour.

Privacy can no longer be a mere add-on to business operations; it has to be an intrinsic part of the everyday.

For individuals, the challenge is just as nuanced. Many people are more aware of privacy now than they were a decade ago, yet most still underestimate just how much data is passively collected through everyday digital activity. Images, metadata, browsing behaviour and app interactions all contribute to detailed digital profiles, often without explicit user awareness. If most of us knew exactly how much of our data feeds these profiles, we’d likely be horrified.

The result is a growing gap between how people believe their data is being used and how it actually is.

 

 

Why Does 2026 Feel Like a Turning Point?

 

If the past few years were about waking up to the importance of privacy, 2026 may be the year where accountability truly takes centre stage.

Regulators are becoming more assertive, consumers are becoming more sceptical and employees are starting to ask tougher questions about how their information is handled at work. Because of all this, businesses are realising that trust is no longer just a “nice to have” brand value – it’s a core operational risk.

There’s also a noticeable shift in tone regarding data privacy.

The conversation is moving away from fear-based warnings toward more practical discussions about ownership, transparency and shared responsibility. Privacy is increasingly being framed not just as protection against harm, but as a foundation for sustainable digital systems.

In many ways, this moment presents an opportunity.

Organisations that invest now in governance, visibility and privacy-by-design practices aren’t just reducing risk; they’re positioning themselves as trustworthy actors in an environment where trust has become a competitive differentiator – and isn’t easy to come by.

Let’s take a look at what the experts have to say on the matter.

 

Our Experts

 

  • Jimmy Astle: Director of Machine Learning at Red Canary
  • Dana Simberkoff: Chief Risk, Privacy and Information Security Officer at AvePoint
  • Philip Dutton: CEO and Co-Founder at Solidatus
  • Sundaram Lakshmanan: VP of Development at Fortra

 

Jimmy Astle, Director of Machine Learning at Red Canary

 


 

“Agentic AI is moving out of the lab and into real-world corporate systems – used for scanning documents, augmenting workflows, and taking actions once reserved for humans. That shift has significant ramifications for data privacy, especially if AI tools are deployed without clear governance, strong access controls, and careful oversight.

“The risk stems from the increasing volumes of information that organisations need to grant their agents access to for them to act autonomously. That data is often sensitive or personal, relating to employees and customers – who expect the business to keep it secure. This is why guardrails around data access must come first in any AI initiative.

“Data privacy in the agentic era starts with treating AI like any other user that accesses corporate systems – it must be secured at the identity layer. Organisations should keep their access privileges tight, maintain clear visibility into which data AI agents can retrieve and act on, and control which users are able to prompt them. From there, employees need clear usage policies and security teams should regularly review how their AI systems behave in practice. Privacy checks should also be built directly into user workflows from day one to ensure consistent and widespread compliance. With robust data privacy controls, AI will remain a force for efficiency and insight, rather than a source of unintentional exposure.”
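The identity-layer controls Astle describes – tight access privileges, visibility into what an agent can retrieve, and control over who can prompt it – can be sketched as a simple allow-list check that runs before an agent touches any data. The policy structure and names below are illustrative assumptions for this article, not Red Canary’s implementation.

```python
from dataclasses import dataclass, field

@dataclass
class AgentPolicy:
    """Least-privilege policy for one AI agent, enforced at the identity layer."""
    agent_id: str
    allowed_resources: set = field(default_factory=set)   # data the agent may read
    allowed_prompters: set = field(default_factory=set)   # users who may invoke it

def authorise(policy: AgentPolicy, user: str, resource: str) -> bool:
    """Allow access only if this user may prompt the agent AND
    the agent is permitted to touch the requested resource."""
    return user in policy.allowed_prompters and resource in policy.allowed_resources

# Example: a document-summarising agent restricted to HR staff and HR documents.
policy = AgentPolicy(
    agent_id="doc-summariser",
    allowed_resources={"hr/handbook.pdf", "hr/policies.pdf"},
    allowed_prompters={"alice@corp.example", "bob@corp.example"},
)

print(authorise(policy, "alice@corp.example", "hr/handbook.pdf"))     # True
print(authorise(policy, "alice@corp.example", "finance/payroll.csv")) # False
```

In practice these checks sit in an identity provider or API gateway rather than application code, but the principle is the same: the agent is just another user, and denial is the default.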

 

Dana Simberkoff, Chief Risk, Privacy and Information Security Officer at AvePoint

 


 

“Building on the shared responsibility mindset that’s been widely highlighted for Cybersecurity Awareness Month, Data Privacy Week draws attention to individual data ownership and designing privacy into the way we work and the systems we rely on.

“Personal data ownership and agency are critical both in and outside of the workplace. Personal data is data directly tied to an individual’s identity, whether it concerns their person, health or finances. From the CEO down to every single employee in the company, organizations must make sure that they prioritize data protection, privacy and security by design (and by default) – leading with privacy awareness when building their security practices.

“This ensures a sustainable future, one that respects the rights of individuals as well as protecting the greater good. In practice, this means designing privacy into all workflows across the organization by default, directly into daily systems and teams, so that protecting information becomes a shared responsibility rather than an afterthought.

“Organisations should treat employees’ personal data with the same care as their own, ensuring it is never used or collected without explicit permission. However, there is no such thing as privacy without a strong data and AI governance foundation. Security teams must become privacy-aware and proactive, by using AI defensively to predict breaches before they occur rather than just reacting to them.”

 


 

Philip Dutton, CEO and Co-Founder at Solidatus

 


 

“The real privacy challenge is not just where data is stored, but what data you hold, where it came from and where it goes. Without data lineage, organisations cannot reliably honour access requests, guarantee the right to erasure, or prove that personal data has only been used for its agreed purpose. You cannot take control of your data if you cannot see it.
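Dutton’s point about lineage is concrete: honouring an erasure request means knowing every downstream system a field has flowed into. The minimal sketch below, with invented dataset names, models lineage as a graph of sources and consumers and walks it to find everywhere an erasure must reach – a toy version of what lineage tools automate at scale.

```python
# Minimal lineage record: for each dataset, where its personal-data fields
# came from and which downstream systems consume them. All names are illustrative.
LINEAGE = {
    "crm.contacts":     {"sources": ["signup_form"],  "consumers": ["marketing.emails", "analytics.dash"]},
    "marketing.emails": {"sources": ["crm.contacts"], "consumers": []},
    "analytics.dash":   {"sources": ["crm.contacts"], "consumers": []},
}

def downstream(dataset: str, lineage: dict) -> set:
    """Every system an erasure request must reach, found by walking the consumer graph."""
    seen, stack = set(), [dataset]
    while stack:
        current = stack.pop()
        for consumer in lineage.get(current, {}).get("consumers", []):
            if consumer not in seen:
                seen.add(consumer)
                stack.append(consumer)
    return seen

print(sorted(downstream("crm.contacts", LINEAGE)))
# ['analytics.dash', 'marketing.emails']
```

Without a record like this, “delete my data” can only ever be honoured in the system that received the request – which is exactly the visibility gap Dutton describes.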

“Data Privacy Week is about giving people confidence in how their data is handled. For organisations, taking control means being able to explain, protect and, when needed, remove personal data across the business.”

 

Sundaram Lakshmanan, VP of Development at Fortra

 


 

“1. AI represents a generational leap in technology, and with it come new challenges that society will only fully understand over time. Privacy is one of the biggest concerns, and not all AI providers handle it in the same way. As laws evolve to catch up, users can protect themselves by avoiding the sharing of sensitive personal information such as identity details, financial or health documents, or family photos when seeking advice from AI tools. Digital images often contain hidden data like location and timestamps, and being aware of this is an important part of staying in control of your privacy.
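The hidden location and timestamp data Lakshmanan mentions lives in a JPEG’s EXIF block, carried in an APP1 marker segment. The simplified byte-level sketch below, written for this article rather than taken from Fortra, shows how stripping that segment works; real photos contain many more segment types, so an image library such as Pillow is the practical choice.

```python
import struct

def strip_app1(jpeg: bytes) -> bytes:
    """Remove APP1 (EXIF) segments from a JPEG byte stream.
    Simplified: walks marker segments and copies everything except APP1."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg) and jpeg[i] == 0xFF:
        marker = jpeg[i + 1]
        if marker == 0xD9:                       # EOI: end of image
            out += jpeg[i:i + 2]
            break
        length = struct.unpack(">H", jpeg[i + 2:i + 4])[0]  # includes length bytes
        if marker != 0xE1:                       # keep everything except APP1/EXIF
            out += jpeg[i:i + 2 + length]
        i += 2 + length
        if marker == 0xDA:                       # SOS: entropy-coded data follows
            out += jpeg[i:]
            break
    return bytes(out)

# Toy JPEG: SOI + an APP1 segment carrying "Exif" data + EOI.
exif_payload = b"Exif\x00\x00fake-gps-and-timestamp"
app1 = b"\xff\xe1" + struct.pack(">H", len(exif_payload) + 2) + exif_payload
toy = b"\xff\xd8" + app1 + b"\xff\xd9"
print(b"Exif" in strip_app1(toy))  # False
```

Many messaging platforms strip this segment automatically on upload, but files shared directly – or fed to AI tools – usually keep it intact.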

“2. The always‑on digital world has blurred the boundaries between personal life and work. People often use their personal devices for work tasks and their work devices for personal ones without a second thought. A simple way to protect both privacy and corporate data is to use separate browser profiles, or even different browsers, depending on the situation. This small habit helps maintain personal privacy while keeping organizational information secure.

“3. Most people don’t realize how much information apps and websites collect beyond what they type into forms. Modern web tools track a wide range of online activity including uploaded photos, interactions, chats, hashtags, and mentions, which can all create a much larger data trail than users expect. To limit this, people can adopt simple habits like using separate browser profiles, different browsers for different tasks, or distinct email addresses. These small steps help limit how much of your personal information gets aggregated.”

 

For any questions, comments or features, please contact us directly.
