TITLE: Criteria for an AI to write psychotherapy chart notes (or medical chart notes)
Note: Reposting to get it out to a few additional groups.
I am informed that a new product called #Mentalyc has entered the market. Its mission is to write psychotherapy notes for clinicians AND to gather a de-identified dataset for research into clinical best practices.
I have no firm opinion yet on Mentalyc, but it's expensive ($39-$69 per month per clinician) and I'd personally need to know a lot more about what's in that dataset and who is benefiting from it.
**So I'm asking the community for thoughts on what acceptable ethical and practical criteria would be for an AI to write psychotherapy notes or medical notes.**
Here are MY thoughts so far:
1) REQUIRED: The AI either:
1a) Invents NOTHING and takes 100% of the information in the note from the clinician, or
1b) Prompts the clinician for additional symptoms often present in the condition before writing the note, or
1c) Presents a very clear review page before finalizing the note that lets the clinician approve, delete, or modify anything the AI got creative with and was not explicitly told to include. (For example, in an experiment with Bard, a clinician found that Bard added sleep problems as an invented symptom to a SOAP note for a person with depression and anxiety. This is a plausible, non-bizarre addition that makes clinical sense, but it would still have to be approved as valid for the person in question.)
2) OPTIONAL: The AI is on MY computer and NOT reporting anything back to the Internet. This will not be on everyone's list, but I've seen too many #BAA subcontractors playing fast and loose with the definition of #HIPAA (medical privacy), and there is more money to be made in data sales than in clinician subscriptions to an AI.
3) OPTIONAL: Inexpensive (There are several free AI tools emerging.)
4) OPTIONAL: Open Source
5) Inputting data to the AI to write the note is less work than just writing the note personally. (Maybe a complex tablet-based clickable form? But then, a pretty high percentage of a note can be in a clickable form format anyway.)
6) The AI does NOT record the entire session and then write a note based upon what was said. (It might accept dictation of note directions sort of like doctors dictate notes to transcribers today.)
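Criterion 1c above could be enforced mechanically: compare the drafted note against the facts the clinician actually entered and flag anything the AI introduced on its own. Here is a minimal sketch of that idea; the symptom vocabulary, function name, and draft text are all invented for illustration, not part of any real product.

```python
# Sketch of criterion 1c: flag symptom terms that appear in the AI's
# draft but were never supplied by the clinician, so each one can be
# approved, edited, or deleted before the note is finalized.
# The vocabulary below is a tiny illustrative placeholder.

KNOWN_SYMPTOM_TERMS = {
    "depressed mood", "anxiety", "sleep problems", "appetite changes",
    "poor concentration", "irritability",
}

def flag_unapproved_content(draft: str, clinician_facts: set) -> list:
    """Return symptom terms in the draft that the clinician never entered."""
    draft_lower = draft.lower()
    return sorted(
        term for term in KNOWN_SYMPTOM_TERMS
        if term in draft_lower and term not in clinician_facts
    )

clinician_facts = {"depressed mood", "anxiety"}
draft = ("Client reports depressed mood and anxiety. "
         "Client also endorses sleep problems.")

for term in flag_unapproved_content(draft, clinician_facts):
    print(f"REVIEW: AI added '{term}' -- approve, edit, or delete?")
```

A real system would need far more robust matching (negation, synonyms, phrasing variants), but the principle is the same: nothing invented reaches the chart without explicit clinician sign-off.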
I think I may be envisioning a checkbox and drop-down menu form along with a space for a clinician to write a few keywords and phrases, then the AI (on my laptop) takes this and writes a note -- possibly just a paragraph to go along with the already existing form in the official note. I think. It's early days in my thinking.
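The checkbox/keyword workflow described above could be as simple as local string templating — no network calls, no session recording. This is a hedged sketch under that assumption; the form field names are invented for illustration, and a real form would mirror your EHR's own fields.

```python
# Sketch of the envisioned workflow: a structured form (checkboxes,
# drop-downs, a few free-text keywords) is assembled locally into a
# note paragraph. Everything runs on the clinician's own machine.

def note_paragraph(form: dict) -> str:
    """Assemble a draft note paragraph from a filled-in form."""
    symptoms = ", ".join(form["symptoms_checked"])
    sentences = [
        f"Client presented with {symptoms}.",
        f"Interventions this session: {form['intervention']}.",
    ]
    if form.get("keywords"):
        sentences.append("Clinician notes: " + "; ".join(form["keywords"]) + ".")
    return " ".join(sentences)

form = {
    "symptoms_checked": ["depressed mood", "anxiety"],
    "intervention": "CBT, cognitive restructuring",
    "keywords": ["discussed job stress", "homework: thought record"],
}
print(note_paragraph(form))
```

A local language model could replace the templating step to smooth the prose, but even this trivial version satisfies criteria 1a, 2, and 6: nothing is invented, nothing leaves the laptop, and no session audio is recorded.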
--
Michael Reeder, LCPC
@psychology
@socialpsych
@socialwork
@psychiatry
#Bias #Ethics #EthicalAI #AI #CollaborativeHumanAISystems #HumanAwareAI #chatbotgpt #bard #security #dataanalytics #artificialintelligence #CopyAI #Simplified #Writesonic #Rytr #Writecream #CreaitorAI #Quillbot #Grammarly #SmartCopy #TextBlaze #HIPAA #privacy #psychology #counseling #socialwork #psychotherapy #research #SOAP #EHR #mentalhealth #technology #psychiatry #healthcare #medical #doctor
A.I. Bots Can’t Report This Column. But They Can Improve It. - ChatGPT isn’t the only writing assistant that has emerged to replace editors. We tested i... - https://www.nytimes.com/2023/02/01/technology/personaltech/chatgpt-ai-bots-editing.html #readingandwritingskills(education) #computersandtheinternet #artificialintelligence #contenttype:service #wordtunespices #openailabs #chatgpt #rytr
It is hard to write a very long passage in Chinese; English works better. Writing a full A4 page of English content is no trouble at all — you just feed in a few keywords and the AI writes the paragraph for you. Perhaps because #Rytr is cheaper and has fewer features, it frequently becomes unusable, and any text over five hundred characters has to be split into segments before it can be optimized.
How to write a book in 4 steps.
In this article, composed almost entirely with an AI app, I explain why you should write a book and how to go about it.
It took me less than 30 minutes to write this article of nearly 900 words. The article includes a link to try the Rytr program for free. I can heartily recommend it.
Those who write, remain!
#AI #Boekschrijven #Rytr