Are you seduced by OpenAI?

Developing a new solution is easy. You dream up an idea, hand it over to ChatGPT or your favorite LLM and say: “Make this a reality”. The crazy thing is, AI can do it for you: write your code, teach you how to connect an API, set up a database and get you up and running in minutes. What used to take months and a whole team can now be done by one person from their phone.

But as the barrier to entry lowers, how can we make sure that the apps and solutions we create are safe? Who is monitoring where the data goes, how the compute is handled and provisioned, and what data is used for training? The low barrier to entry is amazing for innovation and for getting useful solutions out there fast, but what about healthcare? In the medical field, you cannot just ask ChatGPT to make you an app and hand it your patient data. Well, technically you can, but it would be ludicrous and unethical.

Unfortunately, the people with the medical know-how to create the apps that patients and doctors can rely on often do not know what to do next. They may have ideas, but they are unsure how to build them while protecting patient data and ensuring that the solution is effective. Most people will tell you that you can simply vibe code the solution and post it. However, linking together ‘trusted’ APIs like the one from OpenAI is not enough. We must consider how the model is trained, on what dataset and with whose values. Security, data privacy and explainability must be our top concerns when creating and implementing AI solutions for the healthcare industry. That way, doctors can better advocate for their patients and patients have clear insight into their care.
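
To make that concrete, here is a minimal, illustrative sketch of one such safeguard: stripping obvious identifiers from free text before it ever leaves your infrastructure for an external LLM API. The patterns and the redact_phi function below are assumptions for illustration only, not a production de-identification pipeline; real health data requires validated tooling, a lawful basis for processing and proper data processing agreements.

```python
import re

# Illustrative patterns only; genuine PHI de-identification needs
# validated, audited tooling, not a handful of regexes.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\b\d{3}[-. ]\d{3}[-. ]\d{4}\b"),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "MRN": re.compile(r"\bMRN[:\s]*\d+\b", re.IGNORECASE),
}

def redact_phi(text: str) -> str:
    """Replace obvious identifiers with placeholders before the text
    is sent to any third-party API."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

note = ("Patient (MRN: 483920), reachable at jane@example.com or "
        "555-123-4567, called on 12/03/2024 about side effects.")
print(redact_phi(note))
# -> Patient ([MRN]), reachable at [EMAIL] or [PHONE], called on [DATE] about side effects.
```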

Before you hit publish on that app, ask yourself: How am I handling data? Where is the compute occurring and who has access to it? Do I need that API, and who created it? Because if you are going to create an AI solution that uses sensitive medical data, you must treat it with the respect it deserves and ensure that the people who provided it are fairly represented and protected. Don’t get seduced by an API just because it is the easiest option and everyone else is using it. Think through what you are trying to create and which tools are best to achieve it.

There have been some recent examples of companies that did not think about the impact of their actions. Take Flo Health and their women’s health app, Flo. To use the app, women input their most sensitive data: details about their periods, symptoms and other gynaecological health information. [1, 3] Despite assurances that their data was safe and confidential, it was leaked via SDKs (software development kits) to third parties without consent. [1, 3] Users sued the company for its alleged collection and interception of their sensitive data, leading to a settlement of just under $60 million in September 2025. [2, 3] The breach created deep mistrust among the women who had trusted the app with their most personal information.

Another example is 23andMe. In 2023, hackers used credential stuffing to break into the genetic testing platform, exploiting reused passwords to access customer accounts. [4] Through a data-sharing feature, they stole the personal and genetic information of 6.9 million users, specifically targeting those of Ashkenazi Jewish and Chinese heritage and selling their data on the dark web. The company didn't notice the attack for five months. [5] The fallout? A $30 million settlement, a £2.31 million fine from UK regulators and bankruptcy in March 2025. Now, the question of what happens to all that genetic data sits in the hands of whoever buys the company's assets. [6]
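
Credential stuffing succeeds precisely because people reuse passwords that have already leaked elsewhere. One common defence, sketched below as a general illustration rather than anything 23andMe actually deployed, is to screen passwords against a public breach corpus at signup or login. This minimal Python sketch uses the Have I Been Pwned range API, which only ever sees the first five characters of the password’s SHA-1 hash.

```python
import hashlib
import urllib.request

def password_appears_breached(password: str) -> bool:
    """Check a password against the Have I Been Pwned 'Pwned Passwords'
    range API. Only the first 5 hex characters of the SHA-1 hash are
    sent, so the password itself never leaves your system."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    # Each response line is "<hash suffix>:<breach count>"
    for line in body.splitlines():
        candidate, _, count = line.strip().partition(":")
        if candidate == suffix:
            return int(count) > 0
    return False

if __name__ == "__main__":
    # Reused, previously breached passwords are exactly what credential
    # stuffing exploits; reject them at account creation.
    print(password_appears_breached("password123"))  # almost certainly True
```

Screening passwords is only one layer; rate limiting, anomaly detection and mandatory multi-factor authentication are the controls that actually blunt credential stuffing at scale.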

References

  1. Alder S. Jury Rules Meta Violated California Privacy Law by Collecting Flo App Users’ Sensitive Data [Internet]. The HIPAA Journal. 2025. Available from: https://www.hipaajournal.com/jury-trial-meta-flo-health-consumer-privacy/

  2. Alder S. Flo Health, Google, Flurry to Pay $59.5M to Settle Privacy Lawsuit [Internet]. The HIPAA Journal. 2025. Available from: https://www.hipaajournal.com/flo-health-google-flurry-59-5m-settlement-privacy-lawsuit/

  3. Vanian J. California jury rules Meta violated privacy law in case involving period-tracking app [Internet]. CNBC. 2025. Available from: https://www.cnbc.com/2025/08/07/jury-rules-meta-violated-law-in-period-tracking-app-data-case.html/

  4. McKeon J. What the 23andMe Data Breach Reveals About Credential Stuffing [Internet]. Healthtech Security. TechTarget; 2023. Available from: https://www.techtarget.com/healthtechsecurity/feature/What-the-23andMe-Data-Breach-Reveals-About-Credential-Stuffing

  5. Alder S. 23andMe User Data Stolen in Credential Stuffing Attack [Internet]. The HIPAA Journal. 2023. Available from: https://www.hipaajournal.com/23andme-user-data-stolen-credential-stuffing-campaign/

  6. Rijo L. UK data regulator hits 23andMe with £2.31 million fine for genetic breach [Internet]. PPC Land. 2025. Available from: https://ppc.land/uk-data-regulator-hits-23andme-with-2-31-million-fine-for-genetic-breach/
