brian is live on unlonely
@briang
genuinely interesting conversation regarding AI therapists on tonight’s stream of consciousness stream. assuming a high enough level of competence, would you be open to having an AI therapist for yourself? or is a human connection too important?

Rafi
@rafi
Human connection is important, but you can't have it available whenever you need it. Instead of spending an hour weekly in person, you spend a few minutes every day aligning your life in real time.

July
@july
Human connection is not enough on its own; it took me a really long time to find a coach / therapist that I connected with. I spoke with a lot of humans that I didn’t feel a connection with. With that in mind, it’s so hard to imagine finding that level of connection with an AI therapist, let alone with another person.

Chris Johnson
@spiderfood
If robot gf counts as therapy, sure 😆

🐰
@rabbit
I would regardless of competence, cause there's no reason it has to be exclusive. I'd use it to make me a better participant in my sessions with a human therapist, and to investigate perspectives neither I nor my therapist would have gone over otherwise.

Mac Budkowski ᵏ
@macbudkowski
It boils down to the question: would you be open to having an AI friend instead of a real one? Human connection in a patient-therapist relationship is often more critical than in friendships.

wake
@wake.eth
I would prefer it. Not for talk therapy, which is bogus, but for a well-structured cognitive behavioral program. The machine can build and maintain any pseudo-Skinner-box better than its human counterpart, no doubt.

Victor Ma 🧾
@vm
human connection for sure

grace
@grace
I would 100% try this, such a great way to democratize access to therapy when it's currently either cost-prohibitive or supply-constrained. would definitely need strong logic coded in to ensure individuals with more acute mental health issues are escalated to licensed therapists, but (1/2)

Matthew
@matthew
ngl i have already done this and it’s actually pretty helpful. i think with embeddings and tuning it to suggest more normal things to say (it’s way too cordial and devoid of passion) it would be incredible

Noah Bragg 🔥
@nbragg
I would want a human.

Ben - [C/x]
@benersing
Check out https://woebothealth.com/

borodutch
@warpcastadmin.eth
https://people.com/human-interest/man-dies-by-suicide-after-ai-chatbot-became-his-confidante-widow-says/

Maybe Im Wasabi〽️
@maybeimwasabi
This is a fascinating question. @namekeeper wdyt?