Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This may be a little bit of a digression, but I've seen little to no discussion …" (ytc_UgyVGy51A…)
- "Wow. I am so glad Amazon didn't hire me. I would not be able to work there learn…" (ytc_UgxBXFooy…)
- "Most of the negativity doesn't come from a dislike of self driving cars in gener…" (rdc_cpncauf)
- "This isn’t that surprising. I’m not an expert on that type of technology, but …" (ytc_UgxWMtjc_…)
- "We can all make the choice to unplug from the internet annSM….that’s how they’l…" (ytr_Ugy7kUme1…)
- "It's amazing how people don't know what they're talking about and think they kno…" (ytc_UgxlhZeuN…)
- "This is the best ever AI video i have ever seen, is it possible to connect with …" (ytc_Ugz1ADC87…)
- "Its sad that law didnt even adapt to the modern era of the internet completely (…" (ytc_Ugzv59WtD…)
Comment

> Nothing scary. Many have know for years that 1 child policy is correct, and AI says it would implement it by using the intel that it has at its disposal just like anyone would. AI described how it would first avoid the inevitable war. Meaning those who reject the peaceful 1 child approach are the cause of the war.

Source: youtube · AI Moral Status · 2023-05-08T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxSigzBBcvJRX9hCjV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxfa3FMSRYfC21_FKV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwTBaZTB1W9br4Kbop4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy3CMWnftq0G6kS-714AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwJTS81HVElQfUSqPF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxtu-vme2sguQFQoQ94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzH4G30P0ovXUoAFMh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyC4ypoApbvEkDvHT94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxxXHaS22EeM2swfgN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy77FcOX2w-vQHwz5x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
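The raw response above is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such output might be parsed and validated before it reaches the coding table — the allowed category values here are inferred only from the examples shown (the full codebook may include more), and `parse_coding_response` is a hypothetical helper:

```python
import json

# Allowed values per dimension, inferred from the example responses above.
# This is an assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "government"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record must be a dict with an "id" field and a valid value for
    every coding dimension; anything else is silently dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Dropping malformed records (rather than raising) keeps a long batch-coding run alive when the model occasionally emits an off-schema label; the rejected IDs can be re-queued for a second pass.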