Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Doesn’t matter how smart a doctor is- he is finite in the information he can pro…" (ytc_Ugz0Ndxiq…)
- "Ill be honest, I don't really care about AI art, the people who pay for art are …" (ytc_UgxNTmXKX…)
- "Can you make a video about Alethea AI? They’re actually working on the ownership…" (ytc_UgzcJ-MFm…)
- "Just tell the weird AI that you will continue the same conversation and therefor…" (ytc_UgyaX1oP8…)
- "Yeah but this is the first time the rich might employ a robot army to fight for …" (rdc_m1yho3b)
- "Full Self-Driving (Supervised) is currently available in the U.S., Canada, China…" (ytr_UgwoIXEQU…)
- "It just means you have to go to college for degrees that won’t be overrun by AI.…" (ytc_UgyUE3lAd…)
- "Positive note: Life after Ai must be pleasant with endless technology and probab…" (ytc_Ugzdir0m0…)
Comment

> It's like so many other things... It can be life saving if used the right way, at the right time, by the right people, but also devastating to social systems if used incorrectly, similarly to how social media already functions. Too many young people being exposed to ideologies that come from experiences that they have not been through themselves yet. Socializing with AI too early or too often will deprive people from the ability to communicate effectively with each other because there is much more complexity to understanding, interacting and maintaining relationships with real human beings.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2025-06-04T17:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx8CLEWDsV9JkwvPoh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzETHZx6GB_OIJlimR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzUSWs_ACIxgT6Tf8h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzF0SaHtiDefm7v1Fh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxI4NYHvNMniw1Vhgt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyHZnxgScMiwPPOKdJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwQZLHezVq2KTGiW8R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw3vXNUfuDwl2X6Lux4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyEONueG8ZeHIdQeQh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxvq-w5szx_8eP8vT14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
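The lookup-by-ID flow described above amounts to parsing the raw batch response as a JSON array and filtering on each record's `id` field. A minimal sketch (`lookup_coding` and `raw_response` are hypothetical names, not part of the tool; the record shape matches the JSON array shown above):

```python
import json

# Hypothetical excerpt of a raw LLM batch response; each record codes
# one comment on four dimensions plus its comment ID.
raw_response = """
[
  {"id": "ytc_UgzUSWs_ACIxgT6Tf8h4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "fear"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw batch output and return the record for one comment ID,
    or None if that comment is not in this batch."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgzUSWs_ACIxgT6Tf8h4AaABAg")
print(coding["emotion"])  # fear
```

In practice the raw response would first need validation (the model may emit malformed JSON or unexpected dimension values), so a production version would wrap `json.loads` in error handling rather than assume a clean array.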