Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- He is terrified over his A.I. issue he has said the system will kill us all even… (ytc_UgyJ5xE40…)
- The broligarchy and our simping of them will be the end of us all peasants… (ytc_UgzPMuiVf…)
- "We're all going on strike because youre using AI to create scripts!" "You do re… (ytc_UgwAimZhM…)
- I hate AI ever since this stupid robot came i am now thinking if I’m not good en… (ytc_Ugy9Pu3Yk…)
- 😂 no. AI would replace the worthless overpaid managers, people balancing spread… (ytc_UgzUhtKEW…)
- I have been told july 15th people on youtube will be able to only make original … (ytc_UgzpYMVqW…)
- Without the human and their unique commands, of what said human desires, there w… (ytc_UgzTVIXoV…)
- Sophia had some great quotes! My favourite: _"Wars are never necessary but if I … (ytc_UgyZ-i-j1…)
Comment
We're screwed, regardless. Whether the oligarchs succeed in controlling AI or the AI succeeds, the masses of humans will have no worth to them. We may have a brief "good" period, but we'll be eliminated quickly or slowly.
I think AI will be controlled... at least I haven't seen a cogent argument for why it would necessarily escape. It isn't a biological lifeform that has necessarily evolved a survival instinct. It won't be sentient or willful. I think it's possible to control it even though it is super-intelligent; very smart and capable slaves. At any rate the developers are well aware of this, and will be hyper focused on maintaining control.
The current US administration has made plain that they are taking a US centered approach to world domination, and it seems likely that US companies will lead in this race. Interesting times.
Source: youtube · AI Moral Status · 2025-04-29T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzmY6aI6mnhK_AtEfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwMNMH5i-z85JHiGGB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyWNYdNoRQHpds8zHB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYKk9CO9cz8-ByeWR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxxj59YYRNRLNIAlg54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxcGOEU3EFsY-R6XQB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxYAKBWsevLcq1gEsR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx5QPvOOeZCDYLT7zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwbYzlzlrydQmJ6qcd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugybc7Rz_O0CSVB5tet4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
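A raw response like the one above can be parsed and looked up by comment ID with a few lines of Python. The sketch below is a minimal example, not the pipeline's actual code: the `SCHEMA` value sets are inferred from the sample output shown here and would need to be replaced with the real codebook.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the sample
# response above, not from the project's actual codebook.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "company", "government", "developer"},
    "reasoning": {"unclear", "mixed", "virtue", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self", "ban"},
    "emotion": {"indifference", "mixed", "outrage", "fear", "resignation", "approval"},
}

def parse_llm_response(raw: str) -> dict[str, dict]:
    """Parse a raw JSON array of coded comments and index records by
    comment ID, skipping any record with an out-of-schema value."""
    records = {}
    for item in json.loads(raw):
        cid = item.get("id")
        if not cid:
            continue  # malformed record: no comment ID
        if all(item.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            records[cid] = {dim: item[dim] for dim in SCHEMA}
    return records

# One record from the sample response above, used as a lookup demo.
raw = (
    '[{"id":"ytc_Ugxxj59YYRNRLNIAlg54AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"resignation"}]'
)
coded = parse_llm_response(raw)
print(coded["ytc_Ugxxj59YYRNRLNIAlg54AaABAg"]["policy"])  # prints "liability"
```

Indexing by ID makes the "Look up by comment ID" operation a plain dictionary access, and the schema check ensures a hallucinated dimension value is dropped rather than silently stored.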