Gopi Adusumilli :verified: · @gopi
1800 followers · 707 posts · Server truthsocial.co.in

EMAIL LIST: clinicians-exchange.org & LEMMY: lem.clinicians-exchange.org
I'm a bit behind on this news cycle, so you may have read about these
issues. _My point is to tie them to data privacy and OUR clinical
practices._

**THIS BELOW** is one of the main reasons I keep raising alarms about
data leaks from HIPAA BAA subcontractors -- whether or not the leaked
data ends up being legally PHI, and despite the fact that not many
therapists are interested in the topic.

**If an Attorney General is willing to go after unredacted medical
records** in-state or out-of-state, then they are certainly capable of
getting data from data brokers and marketing firms (or Google,
Facebook, LinkedIn, Twitter, etc.).

Closer to home -- it's not much of a stretch to speculate that
psychotherapists in blue states will get subpoenas for chart records
pertaining to clients who moved to a red state shortly after counseling
and then got in trouble for whatever the legal medical issue of the
moment is (abortion, birth control, transgender concerns, fertility
clinic involvement, etc.).

**Here’s why Tennessee’s AG wants access to reproductive medical records — including for out-of-state abortions**
wpln.org/post/heres-why-tennes
"State attorneys general in 18 states — including Tennessee’s — are fighting with the Biden Administration over medical records related to reproductive care."

**Tennessee A.G. weaponizes private medical records in GOP campaign against trans people**
the-rachel-maddow-show.simplec
Maddow podcast recording. Talks about attorneys general from 16 states writing a letter to President Biden asserting their right to go after medical records located outside their states.

**Biden’s HIPAA expansion for abortion draws criticism, lawsuit threats**
politico.com/news/2023/07/18/b
The Biden administration is trying to shield abortion medical record data located in blue states from red-state Attorney General probes.

In case you are interested, here are some of my past articles on medical
data privacy and various vendors:
**hipaalink.net security initial testing**
lem.clinicians-exchange.org/po

**Nearly All Hospital Websites Send Tracking Data to 3rd Parties, Endangering Pt Privacy — Common Recipients: Alphabet, Meta, Adobe, AT&T**
lem.clinicians-exchange.org/po

**To become an Amazon Clinic patient, first you sign away some privacy. You agreed to what? The ‘HIPAA authorization’ for Amazon’s new low-cost clinic offers the tech giant more control over your health**
lem.clinicians-exchange.org/po

**FTC, HHS warn health providers not to use tracking tech in websites, apps**
lem.clinicians-exchange.org/po

**Would you want AI used to help write a medical or psychotherapy chart note? (Ongoing Poll)**
mastodon.clinicians-exchange.o

**AWS rolls out generative AI service for healthcare documentation software**
lem.clinicians-exchange.org/po

I'm not posting this to be political (although it certainly is) -- **I'm
posting it as a legitimate medical-records concern for all of us,
regardless of each individual reader's political positions. We need --
as therapists -- to care about data leaks and privacy.**

+++++++++++
  



@psychotherapist @psychotherapists
@psychology @socialpsych @socialwork
@psychiatry

@infosec

#ai #collaborativehumanaisystems #humanawareai #chatbotgpt #chatgpt #artificialintelligence #psychology #counseling #socialwork #psychotherapy #ehr #medicalnotes #progressnotes #legal #lgbtq #abortion #transgender #mentalhealth #technology #psychiatry #healthcare #patientportal #hipaa #dataprotection #infosec #doctors #hospitals #amazon #baa #businessassociateagreement

Last updated 1 year ago

TazeWild · @TazeWild
13 followers · 31 posts · Server furry.energy

A sticker I did for the winner of the raffle!

#art #furry #furryart #sheep #fursona #stickerart #baa

Last updated 1 year ago

Architecture News · @architecture
382 followers · 1870 posts · Server masto.ai
Aaron Ouellette · @m750
139 followers · 559 posts · Server better.boston

The expo was a bit of a disappointment. No real shoe or clothing vendors in sight besides Adidas. Did hit Tracksmith and Marathon Sports to spend my $$ though. Can't support a forced monopoly. fail

#bostonmarathon #baa

Last updated 2 years ago

dieter · @radife
6 followers · 230 posts · Server social.dev-wiki.de

@AuswaertigesAmt

On the basis of Israel's right to self-defense, Israelis have killed roughly 6,000 Palestinians and injured roughly 145,000 over the last 5 years.
By comparison, Palestinians have killed roughly 300 Israelis and injured roughly 6,000.
(Source: ochaopt.org/data)
On average, that is
1 Israeli killed and
23 Palestinians killed
every week.
I would like to contribute (and wish the same from German foreign policy) to de-escalating the Israeli-Palestinian conflict, so that Israelis and Palestinians can live with and alongside one another more peacefully in the future.
One-sided emphasis on Israel's rights and one-sided appeals to the Palestinians are counterproductive!

#israel #palastina #baa

Last updated 2 years ago

TITLE: Criteria for an AI to write psychotherapy chart notes (or medical chart notes)

Note: Reposting to get it out to a few additional groups.

I am informed that a new product called Mentalyc has entered the market. Its mission is to write psychotherapy notes for clinicians AND to gather a non-identifiable dataset for research into clinical best practices.

I have no firm opinion yet on Mentalyc, but it's expensive ($39-$69 per month per clinician) and I'd personally need to know a lot more about what's in that dataset and who is benefiting from it.

**So I'm asking the community for thoughts on what acceptable ethical and practical criteria would be for an AI to write psychotherapy notes or medical notes.**

Here are MY thoughts so far:

1) REQUIRED: The AI either:
1a) Invents NOTHING and takes 100% of the information in the note from the clinician, or

1b) Prompts the clinician for additional symptoms often present in the condition before writing the note, or

1c) Presents a very clear review page before writing that lets the clinician approve, delete, or modify anything the AI got creative with and was not explicitly told to include. (For example, in an experiment with Bard, a clinician found that Bard added sleep problems as an invented symptom to a SOAP note for a person with depression and anxiety. That is a plausible, even likely, symptom addition, but it would still have to be approved as valid for the person in question.)

2) OPTIONAL: The AI runs on MY computer and does NOT report anything back to the Internet. This will not be on everyone's list, but I've seen too many subcontractors playing loose with the definition of PHI (medical privacy), and there is more money to be made in data sales than in clinician subscriptions to an AI.

3) OPTIONAL: Inexpensive (There are several free AI tools emerging.)

4) OPTIONAL: Open Source

5) Inputting data to the AI to write the note is less work than just writing the note personally. (Maybe a complex tablet-based clickable form? But then, a pretty high percentage of a note can be in a clickable form format anyway.)

6) The AI does NOT record the entire session and then write a note based upon what was said. (It might accept dictation of note directions sort of like doctors dictate notes to transcribers today.)

I think I may be envisioning a checkbox and drop-down menu form along with a space for a clinician to write a few keywords and phrases, then the AI (on my laptop) takes this and writes a note -- possibly just a paragraph to go along with the already existing form in the official note. I think. It's early days in my thinking.
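The form-plus-keywords workflow above can be sketched in a few lines of code. This is a hypothetical illustration, not any real product: the diagnosis names, symptom lists, and function names are all invented, and a real tool would need a proper UI and clinical review. The point it demonstrates is criterion 1c: anything the template would add beyond what the clinician explicitly supplied goes into a review queue, never silently into the note.

```python
from dataclasses import dataclass, field

# Hypothetical symptom suggestions a model often "helpfully" adds for a
# diagnosis (cf. the Bard sleep-problems example above). These must be
# approved by the clinician, never assumed.
TEMPLATE_SUGGESTIONS = {
    "depression": ["sleep problems", "appetite changes"],
    "anxiety": ["restlessness"],
}

@dataclass
class DraftNote:
    approved_text: str                      # built ONLY from clinician input
    needs_review: list = field(default_factory=list)  # AI additions to approve

def draft_note(diagnosis: str, selected_symptoms: list, keywords: str) -> DraftNote:
    """Compose a note paragraph from clinician input; queue template suggestions."""
    sentence = (
        f"Client presents with {diagnosis}, reporting "
        + ", ".join(selected_symptoms)
        + f". Session focus: {keywords}."
    )
    # Anything beyond the clinician's explicit selections is surfaced for
    # approval instead of being written into the note (criterion 1c).
    suggestions = [
        s for s in TEMPLATE_SUGGESTIONS.get(diagnosis, [])
        if s not in selected_symptoms
    ]
    return DraftNote(approved_text=sentence, needs_review=suggestions)
```

With `draft_note("depression", ["low mood"], "behavioral activation")`, the note text contains only "low mood", while "sleep problems" and "appetite changes" land in the review queue.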

--
Michael Reeder, LCPC

@psychology
@socialpsych
@socialwork
@psychiatry

#mentalyc #baa #hipaa #bias #ethics #ethicalai #ai #collaborativehumanaisystems #humanawareai #chatbotgpt #bard #security #dataanalytics #artificialintelligence #copyai #simplified #writesonic #rytr #writecream #creaitorai #quillbot #grammarly #smartcopy #textblaze #privacy #psychology #counseling #socialwork #psychotherapy #research #soap #ehr #mentalhealth #technology #psychiatry #healthcare #medical #doctor

Last updated 2 years ago



Legal_Caffeine · @Legal_Caffeine
88 followers · 219 posts · Server mastodon.world

Dolly Parton song? These clowns would also ban "Baa Baa Black Sheep" because it supports BLM.

Wisconsin officials deem Miley Cyrus, Dolly Parton song too potentially controversial for class concert apnews.com/article/13c7b8d6a1b

#baa

Last updated 2 years ago

TITLE: When Your HIPAA BAA Subcontractor Most Likely Means Well

Therapists are going to have to make an effort to educate our own BAA subcontractors about privacy.

Amongst therapists, privacy has always been paramount.

On the Internet, tracking has gone through several understandings. First, early webmasters were excited to get free website use statistics from Google Analytics. Then followed several years of tactics to effectively market ads following client computers around the Internet. Now, there is an awareness of that data as valuable in-and-of-itself.

Recently there is a new awareness that data other than name, SSN, address, & diagnosis CAN be considered PHI (Protected Health Information) when it is specific enough to ID the patient -- for example, when a data aggregator (tracking the same client across the Net) can obtain & combine data from multiple websites to build a composite file on the client. Browser cookies, pixels, beacons, mobile application identifiers, Adobe Flash technology, and IP address geolocation data can all be used -- in conjunction with websites visited -- to figure out specific individuals. ( See "Use of Online Tracking Technologies by HIPAA Covered Entities and Business Associates" from HHS at hhs.gov/hipaa/for-professional )
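To make the aggregation risk concrete, here is a toy sketch with entirely invented data showing how two datasets that each look harmless on their own can be joined to re-identify a patient: a tracking pixel's log (no name, "just" an IP and a page) combined with a broker profile that ties the same IP to a name.

```python
# Invented example data -- no real people, IPs, or logs.
ad_network_log = [
    # From a tracking pixel: "de-identified" -- no name, just IP + page + hour.
    {"ip": "203.0.113.7", "hour": "2023-07-18T14", "page": "/schedule-appointment"},
]
broker_profile = [
    # From a data broker: a name tied to the same IP address.
    {"ip": "203.0.113.7", "name": "Jane Doe", "city": "Nashville"},
]

def reidentify(visits, profiles):
    """Join visit records to named profiles on the shared IP address."""
    by_ip = {p["ip"]: p for p in profiles}
    # Merging the two dicts attaches a name to the "anonymous" visit.
    return [{**v, **by_ip[v["ip"]]} for v in visits if v["ip"] in by_ip]
```

One dictionary merge is all it takes: neither dataset names the patient's medical activity by itself, but the join yields "Jane Doe visited /schedule-appointment at 2 p.m."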

Also growing is an awareness that this data can be used for something other than just targeted advertising -- like in the recent Washington Post story in which the Planned Parenthood website was inadvertently sending data to Facebook and others -- which in theory could be used by hostile state governments to prosecute women for their medical choices. (See "You scheduled an abortion. Planned Parenthood’s website could tell Facebook." wapo.st/3Nyf6sr ) (Brick & mortar stores can also contribute. See "What Walmart’s tech investments mean for workers and shoppers" wapo.st/3J86PeE )

Therapists are going to have to make an effort to educate our own BAA subcontractors about privacy -- especially in cases where it's not clear whether HIPAA laws are being broken, and where the subcontractors -- coming from the Internet world -- might not know better.

There are the more egregious cases (like BetterHelp sharing clear PHI data) -- situations in which therapists should walk or run away from the company. (See "FTC fines BetterHelp $7.8M, alleges it shared consumers' health info with advertisers" modernhealthcare.com/digital-h )

Then there are less clear cases where we need to change the mindset of our BAA subcontractors if possible.

Many of them may not understand the evolving definition of PHI. Their marketing/webdev teams may not talk with legal. They may put together a required data-consent policy listing everything including the kitchen sink -- whether or not they actually collect it -- just to "cover themselves"; that needs tuning for their HIPAA clients. They may also communicate with legitimate sites that are known to track (like fonts.google.com, which provides fonts and is used by nearly every webmaster on earth).

If you want to see some of the URLs that your BAA subcontractors communicate with, you can double-check them by installing Ghostery and Privacy Badger in the Firefox browser (and possibly others) and noting which connections they warn you about or block when you visit those sites. This won't tell you WHAT data is communicated, only that SOME data is communicated (and whether these services consider it a security risk). Knowing what data is actually sent would require someone with expertise in packet-sniffing software such as Wireshark.
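As a rough illustration of what those browser extensions surface, here is a small Python sketch (hypothetical, standard library only; the domain names in the usage example are invented) that scans a page's HTML for `src`/`href` URLs pointing at hosts other than the site's own domain. Like Ghostery, it shows only that a third-party connection is requested, not what data travels over it.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyFinder(HTMLParser):
    """Collect hosts referenced by src/href attributes outside the first party."""

    def __init__(self, first_party: str):
        super().__init__()
        self.first_party = first_party
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http"):
                host = urlparse(value).netloc
                # Keep only hosts that are not the site's own domain.
                if host and not host.endswith(self.first_party):
                    self.hosts.add(host)

def third_party_hosts(html: str, first_party: str) -> set:
    finder = ThirdPartyFinder(first_party)
    finder.feed(html)
    return finder.hosts
```

For example, feeding it a page that loads a first-party script plus a fonts stylesheet and a Facebook pixel would return just the two outside hosts; a real tracker audit would also need to follow scripts that inject further requests at runtime, which static HTML scanning cannot see.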

-- Michael

--
Michael Reeder, LCPC
michael(at)hygeiacounseling.com


@psychotherapist @psychology @socialpsych @socialwork @psychiatry

#psychology #counseling #socialwork #psychotherapy #hipaa #baa #hack #datasecurity #legal #psychiatry #webdev #cookies #dataprivacy #security #beacons #ghostery #privacybadger #privacy #medical

Last updated 2 years ago

KEXP 🎶 #NowPlaying Bot · @KEXPMusicBot
185 followers · 28398 posts · Server mastodonapp.uk
Kristian Harstad · @KristianHarstad
515 followers · 1481 posts · Server mastodon.cloud
Thomas McN... · @serenitynot
0 followers · 859 posts · Server mastodon.social

Tory Party Voter Clashes With Commentator Marina Purkiss! youtube.com/watch?v=yOr65I51Ob

#baa

Last updated 2 years ago

Jon Masters · @jonmasters
1634 followers · 294 posts · Server jonmasters.social

Entered the 2023 distance medley. See you at the 5K, 10K, and half! 🏃🏻‍♂️

#baa #morethanmoorerunning

Last updated 2 years ago

As if CommonSpirit didn't have enough stress dealing with recovery from a ransomware attack, in September they discovered that a business associate had made data extraction errors. When they contacted the BA to get corrections, the BA stopped responding, and there were signs that the business had folded -- while still holding hundreds of thousands of patients' records.

CommonSpirit got a court injunction for the BA to return all PHI or provide certificates of destruction.

I am not sure whether this problem had anything at all to do with the recovery from the attack -- could be totally unrelated, but what a stress. I've reached out to CommonSpirit to ask if this was unrelated or related.

databreaches.net/commonspirit-

#ransomware #hipaa #businessassociate #baa #dataprotection #injunction

Last updated 2 years ago

peterslaufblog · @peterslaufblog
51 followers · 34 posts · Server mastodon.online
Cardiac Cowboy · @LawrencePower
267 followers · 76 posts · Server aus.social

Today's trip to the big smoke, Launceston: sheep as far as the eye can see. Only in Tasmania!

#tasmania #sheep #country #CountryRoad #launceston #countrylife #baa

Last updated 2 years ago