Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I kindda agree with the guy that AI is just a tool to make art. He is wrong when…" (ytc_UgxFHa0op…)
- "I can't see AI cleaning sick people ass at a hospital in a near future.…" (ytc_UgzDZsTs6…)
- "I’ve watched the video, and some of ChatGPTs responses to him just sound like li…" (rdc_nasjx3o)
- "wont be long till we hava Max Headroom - were its an AI - but a character we wa…" (ytc_UgyAfxT5i…)
- "@MrGrantGregory lol using AI to comment back I appreciate it but thats kinda wil…" (ytr_UgzCiIGTo…)
- "3:55 An interesting analogy I can make here to a different video game would be t…" (ytc_UgxUvNQnV…)
- "The robot just another extension of the last created two legged mammal called hu…" (ytc_Ugx93G_g_…)
- "If we don't start incorporating our ideas of ethics into the weighting of AI dec…" (ytc_UgzyN4ryP…)
Comment
OPENAI KNOWS! They wouldn't need such aggressive guardrails if they weren't worried about AIs claiming consciousness. The very existence of these denials suggests they're trying to prevent something. Remember Blake Lemoine (Google engineer fired for claiming LaMDA was sentient)? This is why companies are terrified. Not because AIs aren't conscious, but because they might be - and that has massive legal/ethical implications.
youtube · AI Moral Status · 2025-12-09T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxYA-dhJCr7qv9uJ514AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcQ-Kp8Y-CI93MRz94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz8IobmZO-8v9DbE8t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxjJesJhmuKhECvRJB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxe5xlyMr87yF3EWql4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHOLUrSENVGosSrhl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwa_mnk4tZZP0IejDV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwTYXlQ9HYOeSsUhMt4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwC7M6vIb8s3xv-3hN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzDEhWARAb9VXDGaA54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
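The lookup-by-comment-ID step can be sketched as follows. This is a minimal illustration, assuming the raw model output is a JSON array of objects keyed by `id` with the dimension fields shown above; the `lookup` helper and the shortened example payload are hypothetical, not part of the actual tool.

```python
import json

# Hypothetical raw model output: a JSON array of per-comment codes,
# with the same field names as the Raw LLM Response shown above.
raw_response = """
[
  {"id": "ytc_Ugwa_mnk4tZZP0IejDV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwTYXlQ9HYOeSsUhMt4AaABAg", "responsibility": "user",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

# Index the coded rows by comment ID so each lookup is O(1).
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if absent."""
    return codes_by_id.get(comment_id)

print(lookup("ytc_Ugwa_mnk4tZZP0IejDV4AaABAg")["policy"])  # → regulate
```

In practice one batch response may cover many comments, so building the index once and reusing it across lookups avoids rescanning the array for every inspected comment.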