Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- Juuso Pylväläinen I completely agree, it seems to be all script coding, for now … (`ytr_Ugy0NsxBp…`)
- The Google and Cisco people are dead wrong. The Internet didn’t make video renta… (`ytc_UgxYxtSfD…`)
- @Dr.farazalam give an x-ray image to GPT4 and ask to read it for you! And that i… (`ytr_UgyUnPEH-…`)
- Decades to come to think like humans, really? 1) Why would they want to think li… (`ytc_UgyK8d5gS…`)
- I’m reasonably sure that you will not know if an AI is conscious. It’s fairly ob… (`ytc_UgxcDToJI…`)
- All the examples you showed look like shit compared to the original. Sounds like… (`ytc_Ugw2lYaI_…`)
- People shouldn't worry about AI taking their jobs, it's not a problem, it's just… (`ytc_UgycykcpC…`)
- I make character ai's… im doing a vote on my channel for what i should make ne… (`ytc_UgyBWehk5…`)
Comment

> We don't need actual AGI to destroy most of our jobs. And the people who are making decisions about the scope and scale of "AI" are, in fact, trying to eliminate most of our jobs without bolstering any kind of safety net. It will be like the apocalypse of the horses, all over again. Your robot maid, in today's world, leads directly to people starving to death.

Source: youtube · AI Moral Status · 2025-08-17T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id": "ytc_Ugwbn7Tls4SKwsKIV2F4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyWwrFOxTcyg72MbyN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxfWvPG-99GFEnOKS94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz7onaB50SlPiC-2jN4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzmmtgaxN_QSW9oy_V4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugwu7s1SpTxrjkN7IpJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx5QHhgBYj9KIl7sMx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyp0VC0XfGlmOjKkVZ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyAhNw2ipoZl5UwHeV4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw7Z91zqJwFx2PP-ZR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
```
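The lookup-by-comment-ID behavior shown above can be sketched as a small parser over the raw model output: load the JSON array and key each coded row by its `id`. This is a minimal illustration only, assuming the response is always a well-formed JSON array with the five fields shown; the `index_by_id` helper and the truncated two-row `raw_response` string are hypothetical, not the pipeline's actual code.

```python
import json

# Raw model output in the format shown above, shortened to two rows
# for illustration; the real response carries one object per coded comment.
raw_response = """
[
  {"id": "ytc_Ugwbn7Tls4SKwsKIV2F4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz7onaB50SlPiC-2jN4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coded comment by its ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_Ugz7onaB50SlPiC-2jN4AaABAg"]["policy"])  # -> ban
```

In practice one would also want to guard the `json.loads` call, since model output is not guaranteed to be valid JSON.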