Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Not only are these things tracking and storing information on vehicles. Modern v…" (ytc_UgzfiajwD…)
- "It is time to stop the nonsense about artificial intelligence. Wake up and consi…" (ytc_Ugwl7Y5x3…)
- "AI Billionaires want all the cake without the Baker , Delivery driver, or human …" (ytc_Ugw8q35sS…)
- "The Worst People in the World are the War Criminal Americans ! Believing their …" (ytc_UgyOkhiqW…)
- "“It’s not because any of this is true” Right… maybe the AI actually thinks logic…" (ytc_UgxcvSETE…)
- "Plot twist -- chat gpt is shown this vid to have you gratitude and compassion fo…" (ytc_UgxnkX0iE…)
- "Dude was making a lot of sense till he said OpenAI not responsible, it’s probabl…" (ytc_UgyASHUgp…)
- "This UBI statement of Geoff Hinton restores my credibility on him which was most…" (ytc_UgydqVRua…)
Comment
This is such hyperbolic nonsense. Llms are like fish in a fish tank. Without their masters feeding them they're going to starve and die. They can be rude and propage harmful code. Hack people or trick them in harmful ways but they can't take physical form so there is no point to become hostile. It's like beating up a book for it's characters sucking but in reverse.
youtube · AI Moral Status · 2025-12-24T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwVHAiRBfiTsodM0Id4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBShmEyHXK-shUBfB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzGkLdK1qn_Z5aHxN94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzdA-AaPrlBSXhuyb54AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwr8OznYnFefESMNdp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxXW1Z626QfFBkY7TF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx_-zwexHYnFtrahG94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwDp7nbCpByBqZm9dZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxW2sJUCefa-W_3DdV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxFsbdg8dibqN5PjTh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
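The "look up by comment ID" step can be sketched as follows. This is a minimal illustration, assuming the raw response is a valid JSON array of rows keyed by `id` like the one above (only the first two rows are reproduced here); the `lookup` helper name is ours, not part of the tool.

```python
import json

# Abbreviated raw LLM response (first two rows of the array shown above).
raw = '''[
{"id":"ytc_UgwVHAiRBfiTsodM0Id4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBShmEyHXK-shUBfB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]'''

# Index the parsed rows by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id):
    # Exact match on the full comment ID; the previews in the sample
    # list are truncated, so the full ID is needed here.
    return codings.get(comment_id)

result = lookup("ytc_UgwVHAiRBfiTsodM0Id4AaABAg")
# result carries the four coded dimensions shown in the table above:
# responsibility, reasoning, policy, emotion.
```

An unknown ID simply returns `None`, which is how a tool like this would distinguish "not yet coded" from a coded row.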