
Mental Health Basics: Diagnosis, Treatment, Tech Tools

Michael Holborn (@michael-holborn)

Last year I found I had ADD.

This came as something of a shock. I had always struggled with studying and with repetitive tasks, but I had never hit a problem at school or university that I couldn't work around.

Entering the workforce, I excelled at first, until I was in a role where there were no workarounds — I had to stay focussed.

Long story short, at the age of 28 I was diagnosed.

Ritalin to me felt like sobering up from the longest pub crawl of my life.

Why were diagnostics so poor, and how many people like me are there suffering in silence?

Going through the diagnosis, it struck me how crude the diagnostic process was. And being on Ritalin, and working out the right amount of medication to take, it became obvious how much machine learning could do for people like me.

Are mental health diagnoses from psychiatrists and psychologists generally accurate?

Fundamentally, there are three things doctors do diagnostically:

  • Real-life quizzes (questionnaires)
  • Look, listen, and feel
  • Prod you with stuff and investigate

Now imagine if your phone could look at your texts, your voice, and your app usage. It could easily identify these conditions better than the current system does.

Are medical treatments, like Ritalin or SSRIs, effectively measured for their efficacy in individuals?

To some extent they are. Yet how we use technology could add so much value here.

So, unleashing my newfound focusing powers, I built a tool to measure how Ritalin was affecting my text patterns and my mood. This was not particularly difficult to do:

  1. Scrape my texts.
  2. Apply sentiment analysis.
  3. Build an NLP model based on Reddit comments from people with ADHD.

With this, I could easily see how Ritalin was affecting my mood and my writing in general.
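The three steps above can be sketched in miniature. This is not the author's actual tool: the messages and word lists are invented for illustration, and a crude lexicon scorer stands in for a trained NLP model.

```python
# Sketch of the text-sentiment pipeline: score each message with a tiny
# hand-made lexicon, then compare average sentiment before and after a
# medication start date. The lexicon and messages are invented examples;
# a real version would use a model trained on labelled data.
from datetime import date

POSITIVE = {"great", "focused", "calm", "done", "clear"}
NEGATIVE = {"tired", "stuck", "scattered", "late", "forgot"}

def sentiment(text: str) -> int:
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def average_shift(messages, start):
    """Mean sentiment on/after `start` minus mean sentiment before it."""
    before = [sentiment(t) for d, t in messages if d < start]
    after = [sentiment(t) for d, t in messages if d >= start]
    return sum(after) / len(after) - sum(before) / len(before)

# Hypothetical message history around a (made-up) medication start date.
messages = [
    (date(2019, 1, 3), "so tired and scattered today forgot the meeting"),
    (date(2019, 1, 9), "stuck on this report again"),
    (date(2019, 2, 2), "felt calm and focused got the report done"),
    (date(2019, 2, 8), "clear head all morning"),
]
shift = average_shift(messages, start=date(2019, 2, 1))  # positive = mood improved
```

A positive `shift` suggests the post-medication messages read more positively, which is roughly the signal the tool was looking for.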

Now for the next step, I thought to myself: let's release this to the public! So I posted it on Reddit, and it got a huge amount of interest.

The response I got was very interesting.

Now, this was from one post on Reddit… which, oddly, was taken down by the admins within a day. Even so, five people asked to work on it with me. Feeling validated, I started working out how this could really work.

When pursuing a contract with a research body and doing due diligence on the legal side, I realised that collecting people's data in this way is ridiculously unethical and legally incredibly risky. I had hit a snag. Not before looking into Solid, though, which looks like a genuinely good way of solving the data privacy issue in this space.

https://moodmap.app - This was the landing page for the business :)

So — what does the future look like in this space?

Well, it's painfully clear that technology can make outcomes for mental health SO much better. Yet executing the technology in a way that is legal, and doesn't lead to other negative consequences? That's another question.

Here are some potential issues that became more obvious to me as I was doing this.

Specificity matters. Ideally you would be 100% certain that a positive result is correct, i.e. no false positives. In practice, 100% specificity is very rare, and a screening tool deliberately sacrifices specificity for greater sensitivity so that true cases are not missed; specificity might end up as low as, say, 20%.
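To make that trade-off concrete, sensitivity and specificity come straight from confusion-matrix counts. The numbers below are invented for illustration of a screening tool tuned the way described above.

```python
# Sensitivity: of the people who truly have the condition, what fraction
# the screen catches. Specificity: of the people who don't have it, what
# fraction it correctly clears.
def sensitivity(tp: int, fn: int) -> float:
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    return tn / (tn + fp)

# Hypothetical screening tool tuned to miss almost no true cases:
tp, fn = 95, 5    # catches 95 of 100 true cases  -> high sensitivity
tn, fp = 20, 80   # clears only 20 of 100 healthy -> low specificity

sens = sensitivity(tp, fn)  # 0.95
spec = specificity(tn, fp)  # 0.20
```

This is exactly the regime the paragraph describes: the tool rarely misses a real case, but it floods the system with false positives, which is why a screen can never be the diagnosis itself.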

Implementing a tool like this requires a huge invasion of a person's privacy. Solid, essentially a new way of storing personal data and sharing it, seems like the obvious way around this.

Arguably, looking only at the individual when diagnosing a mental illness is the wrong approach altogether. A graph of the patient, together with the people around the patient, is likely a far better avenue for diagnosis. I can't see how someone can be diagnosed with, say, depression without taking into account the people around them.
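One way to picture that graph view: treat the patient and their contacts as nodes carrying a mood signal, and blend the patient's own signal with their neighbours'. Everything here, including the names, edges, scores, and the 50/50 weighting, is invented purely to illustrate the idea, not a real diagnostic method.

```python
# Toy social-graph view of a patient's mood: a node's "context score" is
# a weighted mix of its own mood signal and the mean of its neighbours'.
# All names, edges, scores, and weights are invented for illustration.
mood = {"patient": -0.6, "partner": -0.4, "colleague": 0.1, "friend": -0.5}
edges = {"patient": ["partner", "colleague", "friend"]}

def context_score(node: str, weight: float = 0.5) -> float:
    """Blend a node's own mood with the mean mood of its neighbours."""
    neighbours = edges.get(node, [])
    if not neighbours:
        return mood[node]
    neighbour_mean = sum(mood[n] for n in neighbours) / len(neighbours)
    return weight * mood[node] + (1 - weight) * neighbour_mean

score = context_score("patient")  # low score reinforced by a low-mood circle
```

The point of the sketch is only that the same individual signal reads differently depending on the surrounding graph: a patient whose whole circle scores low looks different from one whose circle scores high.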

and importantly,

Diagnosing a patient has many cascading effects under our current health system. While it is just data, the way it could be used in medical practice has the potential to create real moral hazards, and incredibly negative outcomes for patients.

MedTech cannot be approached with the "move fast and break things" Silicon Valley mantra of Uber et al. When the stakes involve human life, you move deliberately and with empathy.

Yet it must be approached, for that same reason, because a simple diagnosis can be the difference between someone struggling with ADD and a highly focussed person working towards a better future for mental health.

Thanks for reading. If you have any questions, feel free to reach out to me on LinkedIn: https://www.linkedin.com/in/michaelholborn/.
