We have come a long way since Joseph Weizenbaum’s 1966 program ELIZA began psychoanalysing students and staff at MIT. ELIZA was a “knee-jerk” therapist that asked repetitive questions based almost entirely on the students’ own comments to it. Follow the link above to experiment with ELIZA yourself. You may feel better afterward, if only from laughing.
These days we have “machine learning” and “deep learning” programs that are capable of predicting the onset of psychoses or suicide attempts with roughly 85% accuracy. Other programs can identify “emotional distress” in college students and can actually communicate with students in a therapeutic manner. The following excerpt comes from an article in the August 2019 issue of Current Psychiatry titled “Artificial Intelligence in Psychiatry”:
[Researchers may] categorize AI into 2 types: general or “strong” AI, and narrow or “weak” AI. Strong AI is defined as computers that can think on a level at least equal to humans and are able to experience emotions and even consciousness.7 Weak AI includes adding “thinking-like” features to computers to make them more useful tools. Almost all AI technologies available today are considered to be weak AI.
… In a prospective study, researchers at Cincinnati Children’s Hospital used a machine-learning algorithm to evaluate 379 patients who were categorized into 3 groups: suicidal, mentally ill but not suicidal, or controls. All participants completed a standardized behavioral rating scale and participated in a semi-structured interview. Based on the participants’ linguistic and acoustic characteristics, the algorithm was able to classify them into the 3 groups with 85% accuracy…
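The excerpt does not say which algorithm or features the Cincinnati team actually used, but the general idea of sorting speakers into groups by their linguistic and acoustic characteristics can be sketched with a toy nearest-centroid classifier. The feature names and numbers below are purely illustrative assumptions, not data from the study:

```python
import math

# Hypothetical feature vectors: (avg. pause length in seconds,
# pitch variation, rate of first-person pronouns). These features
# and values are invented for illustration only.
TRAINING = {
    "suicidal": [(1.8, 0.2, 0.30), (1.6, 0.3, 0.28)],
    "ill":      [(1.1, 0.5, 0.18), (1.0, 0.6, 0.20)],
    "control":  [(0.5, 0.9, 0.10), (0.6, 0.8, 0.12)],
}

def centroid(vectors):
    """Mean value of each feature across one group's training samples."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

CENTROIDS = {label: centroid(vs) for label, vs in TRAINING.items()}

def classify(sample):
    """Assign a new speech sample to the group with the nearest centroid."""
    return min(CENTROIDS, key=lambda label: math.dist(sample, CENTROIDS[label]))

print(classify((1.7, 0.25, 0.29)))  # lands nearest the "suicidal" centroid
```

A real system would extract hundreds of features from recorded interviews and use a trained statistical model with cross-validation, but the classification step reduces to the same question: which group does this speaker’s feature profile most resemble?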
… A similar study looked at 34 at-risk youth in an attempt to predict who would develop psychosis based on speech pattern analysis. The participants underwent baseline interviews and were assessed quarterly for 2.5 years. The algorithm was able to predict who would develop psychosis with 100% accuracy.
… A project at the University of Southern California called Simsensei/Multisense uses software to track real-time behavior descriptors such as facial expression, body postures, and acoustic features that can help identify psychological distress. This software is combined with a virtual human platform that communicates with the patient as a therapist would. __ Artificial Intelligence in Psychiatry
Psychiatrists are Not Worried… Yet
Most psychiatrists believe that they cannot be replaced by machines. But some are concerned that lower-level therapists may be replaced — to the detriment of patient care.
About 50% of psychiatrists in a recent survey believe that artificial intelligence (AI) and machine learning will significantly transform the way they work, but only 4% think that future autonomous technology will replace them.
… Among the salient findings of the survey:
• Only 4% of psychiatrists felt that future technology would make their jobs obsolete
• Only 17% believed technology is likely to replace a human’s role in providing empathetic care
• More women psychiatrists (48%) than men (35%) were uncertain that the benefits of AI and machine learning would outweigh the risks
• More US psychiatrists (46%) than those in other countries (32%) were uncertain that the benefits of future autonomous technology would outweigh the risks
The majority of psychiatrists also indicated that future technology would be unlikely to replace physicians for complex tasks such as a mental status examination (67%), assessing the risk of violence (58%), and determining the need for hospitalization (55%).
There were only two tasks that the majority felt technology would likely replace:
• Providing patient documentation, such as updating medical records (75%)
• Synthesizing patient information to reach diagnoses (54%)
The question of whether “AI” will replace most skilled human experts is still an open one for many professions, including psychiatry. But we can be sure that computer scientists and their collaborators will continue trying to develop advanced computing platforms that are capable of doing all that and more.
…therapists base their diagnosis and treatment plan largely on listening to a patient talk—an age-old method that can be subjective and unreliable, notes paper co-author Brita Elvevåg, a cognitive neuroscientist at the University of Tromsø, Norway.
“Humans are not perfect. They can get distracted and sometimes miss out on subtle speech cues and warning signs,” Elvevåg says. “Unfortunately, there is no [objective test] for mental health.”
… Elvevåg and Foltz teamed up to develop machine learning technology able to detect day-to-day changes in speech that hint at mental health decline.
For instance, sentences that don’t follow a logical pattern can be a critical symptom in schizophrenia. Shifts in tone or pace can hint at mania or depression. And memory loss can be a sign of both cognitive and mental health problems.
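One crude proxy for “sentences that don’t follow a logical pattern” is the lexical overlap between adjacent sentences in a transcript. Production systems like Elvevåg and Foltz’s use richer semantic models, but the underlying idea can be sketched in a few lines (the scoring method here is an assumption for illustration, not their published technique):

```python
def words(sentence):
    """Lowercase word set for a sentence -- a crude stand-in for its content."""
    return set(sentence.lower().split())

def coherence(sentences):
    """Average Jaccard overlap between adjacent sentences in a transcript.
    Low scores flag speech that jumps between unrelated topics, a rough
    proxy for the derailment seen in some psychoses. Real systems score
    semantic similarity rather than raw word overlap."""
    if len(sentences) < 2:
        return 1.0
    scores = []
    for a, b in zip(sentences, sentences[1:]):
        wa, wb = words(a), words(b)
        scores.append(len(wa & wb) / len(wa | wb))
    return sum(scores) / len(scores)

connected = ["the dog chased the ball", "the ball rolled under the fence"]
derailed = ["the dog chased the ball", "trains are made of glass music"]
print(coherence(connected), coherence(derailed))  # connected scores higher
```

Tracking a score like this daily on a patient’s phone — alongside tone and pacing measures — is how “subtle changes” in speech could be turned into a monitorable signal.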
“Language is a critical pathway to detecting patient mental states,” says Foltz. “Using mobile devices and AI, we are able to track patients daily and monitor these subtle changes.”
… In one recent study, the team asked human clinicians to listen to and assess speech samples of 225 participants—half with severe psychiatric issues; half healthy volunteers—in rural Louisiana and Northern Norway. They then compared those results to those of the machine learning system.
“We found that the computer’s AI models can be at least as accurate as clinicians,” says Foltz. __ https://medicalxpress.com/news/2019-11-artificial-intelligence-psychiatry.html
Psychiatrist in a Smart Phone?
The idea of carrying our personal psychoanalyst/psychotherapist in our pockets or purses may seem unlikely at this time. But already there are apps for anxiety, addiction, insomnia, and other problems of the mind. As smartphones get smarter and advanced computing platforms become more mobile, you can count on some amazing mind-benders coming along soon.
There might be as many as 1,000 smartphone-based ‘biomarkers’ for depression, said Dr. Thomas Insel, former head of the National Institute of Mental Health and now a leader in the smartphone psychiatry movement. At the moment, researchers are testing experimental apps that use artificial intelligence to try to predict depressive episodes or potential self-harm. Another tool called EARS (Effortless Assessment of Risk States) also uses smartphone data to identify people in psychological distress and may someday help flag individuals at risk of suicide.
… many smartphone-based applications have been developed in recent years that are able to proactively check on patients, be ready to listen and chat anytime, anywhere, and recommend activities that improve the users’ wellbeing. No matter whether it’s 3 a.m., the chatbot is ready to listen to any trouble, and no one has to wait until the next appointment with the therapist. Moreover, these applications are usually more affordable than therapy itself, so people who could otherwise not get any counselling at all might also receive some help. __ https://medicalfuturist.com/artificial-intelligence-in-mental-health-care/
Word of Caution
Some of us remember the days of the Soviet Union, when psychiatry was used by authorities as a bludgeon against political dissidents and free thinkers. Soviet “mental hospitals” were full of persons whose only mental weakness was the inability to conform completely to the whims of the nomenklatura apparatchiks.
In modern Communist Party China, the Social Credit System is poised to take Orwell’s 1984 (and the Soviet Union) into the 21st century — using advanced tools of “artificial intelligence.”
A low social credit score will exclude you from well-paid jobs, make it impossible for you to get a house or a car loan or even book a hotel room. The government will slow down your internet connection, ban your children from attending private schools and even post your profile on a public blacklist for all to see.
… There are reports that those whose social credit score falls too low are preemptively arrested and sent to re-education camps. Not because they have actually committed a crime, but because they are likely to.
… China’s already formidable police state has been upgraded using big data, machine learning, face recognition technology and artificial intelligence into a fearsome cyborg of state control. __ https://nypost.com/2019/05/18/chinas-new-social-credit-system-turns-orwells-1984-into-reality/
There is no question that China’s communist party government will enlist “artificial psychiatry” into its grand crusade to perfect purity of thought and action. Of course it will, once it has tools that work as prescribed — and that can be integrated into the autonomous system.
Meanwhile, outside of the Chicom-controlled sector, we still have some time to prepare for what seems almost certain to come along eventually, from our own “well-intentioned” overlords (currently lording over the US House of Representatives). Be aware and be Dangerous. Begin to make provisions. It is never too late to have a Dangerous Childhood © .