From The Medical Futurist: For More Info, Go Here…
ngd- What could go wrong? For a half-century, I have been watching the health professions claim the right to control the lives of people with mental illness and developmental disabilities. It isn’t that AI can’t be useful in supporting people. It’s that the whole tenor of this article uses the argument of “accuracy” to justify taking personal autonomy away from people. After all, people with mental illness are a problem, right? They need to be fixed so we can live in peace, right? The other irritation is the trope that medicalizing mental illness will reduce stigma. This has been tried many, many times in the last half-century, and it only seems to give the bigots more ammunition for forcing people to stop being who they are. Even if you grant that the intention behind such proposals is good, they still feed discrimination and loss of rights.
By making mental health disorders more physical, Chiu hopes to help destigmatize them as well. If they can be diagnosed as objectively and corporeally as heart disease, would depression or bipolar disorder or schizophrenia carry the same shame?
In the future, patients might go to the hospital with a broken arm and leave the facility with a cast and a note mandating a compulsory psychiatry session due to a flagged suicide risk.
For example, structured clinical interviews could in the future be conducted by virtual humans – they would reliably ask the same predetermined questions, and interviewees wouldn’t feel as burdened sharing their secrets with a virtual, anonymous entity as with another, possibly judgmental human.
Saving money, controlling behavior…