Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "omg the robot has a mind of its own the robot apocalypse is happening runnnnnnnn…" (ytc_UgiRf2QFe…)
- "I bet the problem will be taken extremely serious all of a sudden and all forces…" (ytc_UgwHfgWuK…)
- "it's not malace I think, poor AI, it's either malice or malaise no? I guess it w…" (ytc_UgyY_0ODu…)
- "Generative AI isn't even real sentient AI, it's a literally unthinking language …" (ytc_UgzcpI8GF…)
- "@zjaeriqsanders1731 so you're denying the fact that ai art steals artwork from t…" (ytr_UgxuwfN_a…)
- "we're not scared of ai in America, if we can't eat it or F it, we'll just shoot …" (ytc_UgzbxEJOq…)
- "just imagine politicians cancelling their voters because AI said so. Humans are …" (ytr_UgxeIADcp…)
- "can you start worrying when driverless trucks move from a hypothetical future an…" (ytc_UgwzwVLGG…)
Comment

> If AI is created by humans I assume it's going to have human nature tendencies misinterpretation of of the world since it's not experiencing the world like a human and without the difficulties of life or just life experiences in general it's going to have a hard time getting smarter and accept intelligence of moving forward this is what I assume i don't know

youtube · AI Moral Status · 2025-04-11T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugz3DNwDbJ3Hvw25S7p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxkYacp5TeY7HiEXeh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzK7DkBvD9HT6a8RvB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwPYhnO4vBkNyPTp1B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzqA70FX9le0VnRGz94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDo0b8uPcUUhqGQ_l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyHCus9geBRZapsM154AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyVj3hPAqSZUYXgw_J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwZ1OO2Vnu9cuA_AjR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugyekf2NV9bau8DqOLR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
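The lookup-by-comment-ID step described at the top can be sketched as a small parser over a raw response like the one above: the model returns a JSON array of per-comment codings, which we index by `id`. This is a minimal illustration, not the tool's actual implementation; the `RAW_RESPONSE` sample below reuses one row from the response shown, and the function name `index_codings` is a hypothetical label.

```python
import json

# One row copied from the raw LLM response above, reduced to a single
# element for illustration. A real batch would contain many such objects.
RAW_RESPONSE = """[
  {"id": "ytc_UgwPYhnO4vBkNyPTp1B4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "resignation"}
]"""


def index_codings(raw: str) -> dict:
    """Parse a raw model response and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}


codings = index_codings(RAW_RESPONSE)
# Look up the coding for a specific comment ID.
print(codings["ytc_UgwPYhnO4vBkNyPTp1B4AaABAg"]["emotion"])  # -> resignation
```

Indexing by `id` is what lets the page resolve a truncated display ID back to the full coding record without rescanning the whole batch.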