Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Do animals have rights? No. So unless some psychopat idiot coder makes a conscious robot program, it won't matter.
youtube
AI Moral Status
2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugg8BCQ6vQeVy3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugh5HuNQx_2RBXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh1IdnbNn4US3gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugh4EvEz2oN5d3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj-_TJm1YcyDHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjJkMb0npbVNHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi0dNvgndhWbXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghbLR35UtWqVXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugho7QY63NVatHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgjNsVEAgYa52ngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
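A batch response like the one above can be parsed into per-comment codings with a few lines of Python. This is a minimal sketch, not the pipeline's actual code: the function name `parse_response` and the `ALLOWED` vocabularies are assumptions, with the allowed values inferred only from the codings visible in this export (the full codebook may define more categories).

```python
import json

# Allowed values per dimension, inferred from the codings visible in this
# export (assumption: the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "developer", "distributed", "ai_itself"},
    "reasoning": {"unclear", "virtue", "mixed", "consequentialist",
                  "deontological", "contractualist"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "fear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding},
    dropping rows that are malformed or use out-of-vocabulary values."""
    codings = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows with no comment ID
        coding = {dim: row.get(dim) for dim in ALLOWED}
        if all(coding[dim] in ALLOWED[dim] for dim in ALLOWED):
            codings[cid] = coding
    return codings

# One row from the response above; it matches the "Coding Result"
# table shown for this comment.
raw = ('[{"id":"ytc_UghbLR35UtWqVXgCoAEC","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none",'
       '"emotion":"indifference"}]')
codings = parse_response(raw)
```

Validating against a fixed vocabulary is the key design choice here: LLM coders occasionally emit labels outside the codebook, and silently storing them would corrupt downstream counts.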