Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "AI has emotions..." It baffles me that anyone intelligent could take that state… (`ytc_Ugw7ps1mv…`)
- Humans have been copying each other's art style for millennia, to the point wher… (`ytc_UgxCHvoL3…`)
- We fully support you. ignore the shithead trolls on Reddit and their thoughtless… (`ytc_UgyV7bOx0…`)
- I'm all for medical advancements through AI. What I worry about is AI replacing … (`ytc_UgwfxlWnl…`)
- Maybe be we don't need jobs anymore in the first place. Ai will graduay replace … (`ytc_Ugwbxd7Ep…`)
- Honestly, at this point I'm hopeless. Hell, I'm not even an adult yet, and I've … (`ytc_UgxYYLkeE…`)
- Hey @AstroSageGaming, thanks for sharing your video on Cyberpunk 2077's intense … (`ytr_UgyGs5v39…`)
- "lightening the load of animators". More like, we will fire 90% of our animators… (`ytc_Ugwbn6UTG…`)
Comment
My dad's prior boss recently left their company to switch to a competitor and be hired as an AI expert. He never went to school for that. High risk high reward... I am not so sure that is such a wise idea unless you are someone who actually creates AIs yourself.
I think a lot of the AI "experts" have no real understanding of how deep the rabbit hole goes. Its like Neurologists, they know how the brain can work, how a drug can create a certain outcome, but for many they have no idea how it is done... otherwise psychology would be an exact science and the pills given to people trying to off themselves would't have the number one warning of "may make you want to off yourself, even if you never thought about it before."
youtube · AI Moral Status · 2025-12-13T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugxfgu9ZOBEFH_9shMt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy4H9kI_wBNbFc0nbl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw7AV_r02MH-l5jYZp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgyH6ZqdSp2mTNWLcDx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxG-j9Ww7x7Adjjd8J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzzzbnTfUkZ_-9qOnN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw5JsAf3_3VC1eWokZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxZteN0h-nZxPY7r5p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8ANJCGUxzKWNsTS94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzoAoTXXcLxlC3oNDV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
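The raw response is a JSON array with one object per coded comment. A minimal sketch of how such a response might be parsed, validated, and indexed for the by-ID lookup shown above — the allowed category values here are inferred from the codes visible on this page and are assumptions, not the actual codebook:

```python
import json

# Category values observed on this page; the real codebook may
# define more (assumption for illustration only).
DIMENSIONS = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index entries by comment ID."""
    entries = json.loads(raw)
    coded = {}
    for entry in entries:
        cid = entry["id"]
        # Reject values outside the known codebook so bad model
        # output is caught before it reaches the database.
        for dim, allowed in DIMENSIONS.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={entry.get(dim)!r}")
        coded[cid] = {dim: entry[dim] for dim in DIMENSIONS}
    return coded

# Example: one entry from the response above.
raw = ('[{"id":"ytc_Ugw7AV_r02MH-l5jYZp4AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"industry_self","emotion":"resignation"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugw7AV_r02MH-l5jYZp4AaABAg"]["emotion"])  # resignation
```

Validating against a fixed value set at parse time means a malformed or hallucinated code fails loudly with the offending comment ID, rather than silently polluting the coded results.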