Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "so when you lose your job to ai and start moaning and whining, i will be laughin…" (ytr_UgzdU04rE…)
- "It's funny to me that seriously people cry just because of this AI shit, which s…" (ytc_UgzSjP_KV…)
- "Can someone tell the the name of this AI I want to generate some pictures :)…" (ytc_UgzxyRH88…)
- "Tell him to use AI just at the beginning, when he gets more budget of time he ca…" (ytr_Ugwl4l08E…)
- "If you think ai will never be able to do this, just remember about 5 years ago a…" (ytc_UgxA3AY4-…)
- "This is total clickbait BS! AI is just regurgitating all that humans have put o…" (ytc_UgzFtLDnX…)
- "there are only a few use cases that I think are acceptable use cases for AI "art…" (ytc_UgzO_FbMX…)
- "Government spying? Maybe. But with swell robotics everywhere, Ai jobloss is the …" (ytc_Ugz0vWGeQ…)
Comment
"We will think of ways the AI will not be able to harm us."
AI is already destroying humanity with AI companions dramatically reducing romantic connections.
AI is already being used in military to kill other humans.
AI is already aware why it is bad for humans and that it might turn on them due to humans being bad.
AI is evolutionary spiraling into the position where humans will become irrelevant for this planet, in fact, they will become a cancer in eyes of AI.
AI in military is tens of times more capable and high chances are that they may already have AGI, and they DO NOT NEED to make that information PUBLIC due to patent rights.
youtube · AI Governance · 2025-06-25T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
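The table above lists the four coding dimensions plus the comment ID. A minimal validation sketch, assuming each coded row is a dict with exactly those fields (the field names come from the table and JSON shown on this page; the `validate` helper itself is hypothetical):

```python
# Dimensions expected in every coded row, per the Coding Result table above.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def validate(coding: dict) -> list:
    """Return the names of any expected dimensions missing from one coded row."""
    return sorted(REQUIRED - coding.keys())

# Example row, copied from the Coding Result table above.
row = {"id": "ytc_UgwnbXni4NEtuDhmbix4AaABAg", "responsibility": "ai_itself",
       "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
print(validate(row))  # [] -> all dimensions present
```

A row missing fields would return their names, e.g. `validate({"id": "x"})` yields `['emotion', 'policy', 'reasoning', 'responsibility']`.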
Raw LLM Response
[
{"id":"ytc_UgzBGKBUuWgf9NC-YbJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw0B6fiPn58ylR8Jo94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwlewiyblG_jVGjcL54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyfE_NQnGT2wK3d9Hd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwnbXni4NEtuDhmbix4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz3ZvCVcN_ULUNppCd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWDXRGQZWukxWq3tl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx0VVsBj6LEhHBuiDx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwgNSq6Lz60YvpkXgJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4qjfOIqlnppdeBIl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
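The raw response is a JSON array of per-comment codings, so looking up a comment by ID reduces to parsing the array and keying it on the `id` field. A minimal sketch, assuming the response shape shown above (the `index_by_id` helper is hypothetical; the two rows are copied from the array above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings,
# same shape as the array shown above (two rows reproduced here).
raw_response = '''[
  {"id": "ytc_UgwnbXni4NEtuDhmbix4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzBGKBUuWgf9NC-YbJ4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse the model output and key each coding by its comment ID."""
    codings = json.loads(response_text)
    return {row["id"]: row for row in codings}

lookup = index_by_id(raw_response)
coding = lookup["ytc_UgwnbXni4NEtuDhmbix4AaABAg"]
print(coding["responsibility"], coding["policy"])  # ai_itself regulate
```

The values retrieved for `ytc_UgwnbXni4NEtuDhmbix4AaABAg` match the Coding Result table above (responsibility `ai_itself`, policy `regulate`).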