Health data is increasingly being treated like currency – traded, aggregated and analysed, and it’s happening at scale. Wearables already collect enormous amounts of information about our bodies – data including heart rate, sleep cycles, oxygen levels and even stress indicators.
But now, Oura and Whoop have added blood testing to the mix, and this means we’re talking about biochemical data – arguably the most intimate and sensitive category of all.
This is where the notion of commoditised health data starts to take shape.
In theory, data-driven health insights could revolutionise preventive care. Just imagine a future where wearables automatically adjust your lifestyle plan based on your latest bloodwork or where AI predicts disease before symptoms appear. Sounds like a happily-ever-after sci-fi movie, right?
But, in practice, there’s a fine line between empowerment and exploitation.
What happens when this data is shared, intentionally or not, with third parties like insurers, employers or governments? Could someone’s elevated cholesterol, low testosterone or hormone imbalance be used to make financial or employment decisions? Or worse?
The more we integrate biological data into consumer tech, the more it risks becoming a product rather than a right – which is why many people argue that our biological data needs stronger protection.
The Security Question
In a major step beyond wrist and finger-based tracking, both Oura and Whoop have unveiled plans to offer blood testing as part of their growing suite of health services. The goal, so they say, is simple yet ambitious – to blend everyday wearable data with detailed biological insights, ushering in a new, data-driven era of health management.
Both Oura and Whoop emphasise that they use secure systems and comply with medical data regulations like HIPAA in the US and GDPR in Europe. But, unfortunately, history has shown that no digital system is invulnerable.
Health data is among the most valuable types of information on the black market – even more than financial data, in many cases – because it can’t be “changed” like a credit card number. A single breach could expose millions of users’ genetic predispositions, mental health patterns or hormonal profiles.
And then there’s the issue of consent. Even when data is anonymised, studies have shown that it’s often possible to re-identify individuals through cross-referencing datasets. This means that “anonymous” health data can still be traced back to real people if combined with other information – a massive concern that privacy experts have been raising for years, and rightfully so.
In an age where AI systems thrive on large datasets, these blood-based biometrics could become part of an expanding digital ecosystem – one that users don’t fully control.
Health Data as a Business Model
Let’s not forget, Oura and Whoop aren’t healthcare providers – they’re tech companies. Their business models rely on subscriptions, user engagement and, crucially, data-driven personalisation. This is very different from what healthcare providers do – and from what they are held accountable for.
The addition of blood testing not only deepens the user experience, but it also broadens these companies’ datasets, potentially enhancing product development, partnerships and research opportunities.
While these services promise better insights and health outcomes, they also shift the relationship between individuals and their health from personal ownership to platform dependency. Your health story no longer lives solely with your doctor – it lives in a database.
This trend echoes what we’ve already seen in other sectors. Just as social media turned human attention into a commodity, wellness platforms are now starting to turn biological data into a resource – something that’s incredibly valuable to tech giants, insurers and researchers alike. Does this mean that we’re commoditising parts of ourselves?
The Regulatory Grey Area
Digital health technology has outpaced traditional healthcare regulation. Devices like Oura and Whoop sit in a grey area between consumer electronics and medical equipment.
While clinical labs must adhere to strict standards, the digital services that interpret and visualise your results often fall under looser consumer protection laws. This regulatory gap means that data use, storage and third-party access policies can vary widely, and users may have little recourse if their data is mishandled.
As the UK and EU continue to develop frameworks around digital health and AI-driven diagnostics, platforms like these could become case studies in how innovation collides with privacy law.
So, What Could Go Wrong?
It’s easy to imagine the positive scenarios – personalised medicine, early diagnosis and holistic care, life-changing ways in which the technology could save lives. But what about the darker possibilities?
- Insurance discrimination: Health insurers could, in theory, use detailed biomarker data to adjust premiums or deny coverage.
- Employment screening: Companies could one day request health data for “wellness programs,” blurring the line between care and control.
- Government surveillance: In countries exploring national health databases, aggregated private health data could be folded into public systems, raising serious civil liberties concerns.
- Commercial exploitation: Tech companies could use aggregate biomarker data to develop new products or sell insights to pharmaceutical or advertising partners.
Indeed, when health becomes data, the boundaries between care, commerce, and control begin to dissolve.
Balancing Progress and Privacy
It’s undeniable that Oura and Whoop are pushing the frontier of personal health technology. Blood testing integration could genuinely empower users to make better lifestyle decisions and gain insight into their bodies like never before.
But, as innovation accelerates, we need equal progress in data protection, ethical governance and digital literacy. Consumers should understand what they’re consenting to, who profits from their data and how it might be used in the future. And these safeguards need to be in place at the same time as, if not before, this data starts to be shared and sold – commoditised.
Otherwise, the age of personalised health could quietly become the age of personal health surveillance, where the most intimate parts of our biology become tools for marketing, pricing, and prediction.
So, Where Do We Stand On Oura And Whoop’s Blood Testing?
The move by Oura and Whoop into blood testing marks a bold step in the evolution of wearable tech – one that’s undeniably both exciting and unsettling.
We are indeed entering an era where health data is commoditised, traded not in clinics but in clouds. The technology promises a more proactive, connected kind of healthcare, but the implications for privacy, equity and autonomy are only just beginning to surface.
The question now isn’t whether this shift is happening – it’s whether we’re ready for it.
Our Experts
- Sara Fikrat: CPO at Semble
- Michella Botto: Principal at Exceptional Ventures
- Dave Dowman: COO of MyLocalSurgery
- Bailyn Fields: Registered Nurse, Boomer Benefits
- Dr. Shelby Marquardt: Founder, Blue Sky Scrubs
- Tracy Wood: CEO, The DNA Company
Sara Fikrat, CPO at Semble
“With Oura and Whoop now offering blood testing, we’re seeing wearable technology and clinical data converge in ways that bring personalised health insights to a much wider audience. This is not simply about commoditised health data; it is about giving people the tools to better understand and manage their own wellbeing.
“The real challenge will be how this information is interpreted, protected, and integrated into everyday life. Preventive healthcare is changing, and there is a shared responsibility to ensure this data translates into better outcomes rather than just more numbers. True value comes from insights that lead to meaningful improvements in health and lifestyle choices.”
Michella Botto, Principal at Exceptional Ventures
“Without question, yes. Advances in science and technology in recent decades have made accessibility to data informing what is going on within our bodies infinitely easier, which has created a mass appeal to consumers. Especially since Covid, people want to be proactive rather than reactive about their health. And that’s a good thing.
“But we’re now seeing an explosion of personal data that can quickly become overwhelming or misused. The real challenge is helping consumers interpret this information responsibly; collecting it isn’t enough. As investors, we have a duty to back science-based companies that empower rather than confuse, ensuring innovation is driven by evidence, not hype – and that better data truly means better decisions for people’s health. Ultimately, that’s what moves us closer to our mission: helping people live longer, healthier, and happier lives – maximising Joyspan®.”
Dave Dowman, COO of MyLocalSurgery
“The move by Oura and Whoop into blood testing does mark a significant step toward the commoditisation of personal health data. What began as passive tracking of sleep and activity is now expanding into proactive health diagnostics – effectively merging wearable tech with traditional healthcare services.
“Now this shift could empower users with incredible insights into their health, but it also raises questions about data ownership, privacy, and the commercialisation of biometrics. As more companies enter this space, health data risks becoming just another asset or convenient commodity. The real challenge will be ensuring that individuals – and not the corporations offering this service – retain control over how their biological information is used, shared, and monetised.”
Bailyn Fields, Registered Nurse, Boomer Benefits
“Bringing blood testing into wearables like Oura and Whoop marks a major step toward commoditised health data. It blurs the line between medical diagnostics and lifestyle tracking, giving users access to detailed biomarkers that used to require a clinic visit. On one hand, it empowers people to understand their health more deeply and personalise their routines.
“But it also raises serious questions about who owns that data, how securely it’s stored, and whether users can accurately interpret results without medical oversight. As the tech evolves, the challenge will be balancing convenience and responsibility. It’s important to make sure accessibility doesn’t come at the cost of accuracy or privacy.”
Dr. Shelby Marquardt, Founder at Blue Sky Scrubs
“Pairing lab tests with wearable data can empower people to understand their health trends and act earlier, but it also risks normalising constant self-monitoring and turning personal biomarkers into commodities for profit. People should treat at-home blood panels as screening tools, not diagnostic tests, and interpret results in consultation with a clinician. It’s equally important to ask how data are being stored and shared to avoid contributing to over-commercialised health tracking. Tracking only metrics that support meaningful action helps maintain perspective and reduces unnecessary anxiety.
“Ethically, companies must make their data practices transparent and help users understand what these results can and cannot reveal. Clear communication around interpretation limits prevents false reassurance or misinformed decisions. Above all, strong safeguards are needed to protect health data from third-party exploitation, ensuring that personal information is used responsibly and never commercialised without explicit consent.”
Tracy Wood, CEO of The DNA Company
“Wearables like Whoop and Oura aren’t just gadgets anymore, they’re front-row players in a new economic experiment where human biometrics are the raw material. These devices now deliver “lab-style” insights once reserved for clinics: heart-rate variability, skin temperature, oxygen saturation, recovery readiness, even illness prediction.
“Yet traditional healthcare still looks at these outputs with suspicion. The medical industrial complex – protective, hierarchical, and reimbursement-driven – has been slow to validate data that comes from outside its walls. The ideology of self-tracking remains culturally incompatible with a system that monetizes treatment, not prevention.
“Oura’s Symptom Radar hints at infection patterns before lab results can confirm them, yet it’s careful not to cross the “diagnostic” line. The tension is palpable: wearables generate insights that behave like medical data, but regulators and insurers still treat them as novelty wellness metrics.
“And in that regulatory vacuum, something new is forming – a consumer health data economy, where the body’s signals are increasingly fungible. Because these companies live outside HIPAA, they occupy a gray space: not quite medicine, not quite lifestyle. It’s a cash-pay model now, but make no mistake, the real prize is the insurance reimbursement dollar. That’s when the ecosystem goes mainstream.”
When privacy becomes the new currency
“This emerging market is built on trust, yet trust is precisely what’s eroding. HIPAA doesn’t protect most wearables, and the FTC’s Health Breach Notification Rule now scrambles to fill the gap, bringing health apps under its watch. States like Washington are stepping in with the My Health My Data Act, treating personal biometrics as property, acknowledging their economic value even as federal policy lags behind.
“Consumers rarely realize that the same device promising empowerment is also generating tradable insight. Sleep stages, stress scores, ovulation signals – each data point becomes a commercial asset. When privacy becomes the new currency, users become the product unless transparency becomes non-negotiable.
“The companies that will win in this next phase aren’t just those who build better sensors; they’re the ones who earn ethical credibility through radical transparency and genuine user control.”
Wellness or medicine? The line is fading – by design
“Regulators are still trying to preserve a neat divide between “wellness” and “medical,” but biology doesn’t recognize those categories. Continuous sensors, AI inference engines, and secondary-use data markets are dissolving them.
“This convergence is not a glitch; it’s the inevitable evolution of personalized health. The deeper question isn’t whether it can be stopped, but who will govern it. If policymakers cling to outdated definitions, they risk ceding the future of healthcare to private platforms whose business model depends on opacity, not oversight.
“The opportunity, and the obligation, is to design a framework that protects autonomy while enabling innovation. We need risk-based regulation, not categorical silos; data fiduciary duties, not disclaimers; clarity about inference, not confusion about consent.
“Because if we don’t draw the boundaries ourselves, the market will, and history suggests it will draw them in profit’s favor.”