Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Ai and robot trucks seem like a great way to move drugs across country, built in…
ytc_Ugx-1OS6c…
wait until a heart broken , brokie , insecure engineers secretly makes it autono…
ytc_UgwOwCIGf…
Just remember that AI ‘art’ is on a screen, but can never be held in you hand❤️❤…
ytc_UgwtLcOVN…
We need about 1000 more Karen Haos. These companies don't want to invent somethi…
ytc_UgxlXfemT…
I like ai art
I have no artistic talent, the talent tools or dedication to learn…
ytc_UgzoG_v-Q…
Oh, I agree with you. But the current boomer talking point seems to be "well the…
rdc_mvbt5ap
Imagine AI taking over just to then become corrupt officials working in there ow…
ytc_UgzUDKqf8…
i automatically know youre a fragile liberal when you compare something loosely …
ytc_UgxVwzdiu…
Comment
It's around 100$ a week to talk to a licensed mental health specialist. How is this Open AI's fault. People talking to AI (as a friend) are not the most well adjusted people around and I am pretty sure that they don't have insurance or a good enough job to pay for a licensed mental health specialist.
The issue here, is not that AI is not doing a great job at saving lives,,, it's that people have nowhere to turn to address their mental health issues.
You should know better as you do segments on healthcare and mental health on your show.
youtube
2025-10-30T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwJP-_W1ASZ7ml2iU54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwjpGiKyrz-pkaj7tN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzdErGIXC35udbMOf94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgyZ2AXH6ZfwIIdSyJV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugzo0AJ3I-oZmdo7BdZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyUTuCzndKeHHa3oH54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzOJOkDUiPNn85RQEp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgwwL05qqBEE-El0O0V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
 {"id":"ytc_UgyzfJSXAkisrnPbmT54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwxWiE7DObqEMprQmN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}]
```
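A batch response like the one above can be turned into per-comment coding results with a small parse-and-validate step. The sketch below is a minimal illustration, not the pipeline's actual code; the allowed values per dimension are inferred only from the responses shown on this page, so the real codebook may include more categories.

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the visible
# responses on this page; the full codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "resignation", "outrage", "mixed", "fear", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Rows missing an id or containing an out-of-codebook value are
    dropped rather than stored, so downstream lookups only ever see
    fully valid codings.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

With the response above, `parse_batch(raw)["ytc_UgwjpGiKyrz-pkaj7tN4AaABAg"]` would yield the same values shown in the Coding Result table (responsibility `user`, reasoning `consequentialist`, policy `none`, emotion `resignation`).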