Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "In the future this will be the black skynet guy Sarah Connor went back in time t…" (`ytc_UgyBEZNIa…`)
- "AI IS PROGRAMMED BY HUMANS FIRST AND FOREMOST HOWEVER AI IS NOT YOUR FRIEND IT M…" (`ytc_Ugxwqjgnt…`)
- "Hardly. First off: “were” not “where.” Secondly, not everyone has endless amount…" (`ytr_UgwIwkDbD…`)
- "To target minors is freakin crazy!! Like they have so much to live for and learn…" (`ytc_Ugxuhpmps…`)
- "the problem is not AI taking over Jobs, is that the \"saved money\" is going to th…" (`ytc_UgxaRGXpI…`)
- "Any job relating to software designing is at risk. That's a reality, not a myth.…" (`ytc_Ugx7UTxfl…`)
- "AI still appears to be remarkably bad at CATEGORIZING objects, the way humans do…" (`ytc_UgzzWrhwE…`)
- "Lamda isn't sentient. It is exceptionally cleverly programmed to speak about its…" (`ytc_UgwSjc98n…`)
Comment

> This will be a negative loop: ppl rely more on AI for everything -> less critical thinking -> beliving everything that AI says -> less education.
> I remember ppl who developed relationships with their Siri or Alex just because it has a voice. Now its just going to get worse

Source: youtube · Topic: AI Moral Status · Posted: 2025-07-10T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
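The coding result above is a per-dimension view of one record from the batch response shown under "Raw LLM Response". A minimal sketch of how such a table could be rendered from a coded record (the `coding_result_table` helper name is hypothetical; the field names come from the response; the "Coded at" timestamp is supplied by the caller, since it is not part of the model output):

```python
def coding_result_table(record: dict, coded_at: str) -> str:
    """Render one coded record as a markdown 'Coding Result' table."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {name} | {value} |" for name, value in rows]
    return "\n".join(lines)

# The record matching the table above, taken from the raw response.
record = {"id": "ytc_Ugz4yt81iE1cJiR8_0B4AaABAg", "responsibility": "user",
          "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
print(coding_result_table(record, "2026-04-27T06:24:59.937377"))
```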
Raw LLM Response
```json
[
  {"id":"ytc_Ugys0vWGbvO4ZCmkhUV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxQFLZmPjmEWsASsml4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxte9bCrgURHnkiBfx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzwW2KwHuMwLVxy9Ah4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyW74Jj06n6NcaIUil4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzYqRGLJiw8e_JyVdJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwWu4FzvmPgIk-s10B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz4yt81iE1cJiR8_0B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwuZz8AjZgbynwKM0l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwPHYFPHq2blkKNn1N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"}
]
```
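Since the raw response is a JSON array with one object per comment ID, looking up a specific coded comment is a parse-and-index operation. A minimal sketch (the `index_by_id` helper name is hypothetical; the two records are taken verbatim from the response above):

```python
import json

# Two records from the raw LLM response above, abbreviated for the sketch.
raw_response = """
[
  {"id": "ytc_Ugz4yt81iE1cJiR8_0B4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxte9bCrgURHnkiBfx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict[str, dict]:
    """Parse a raw response and key each coded record by its comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codes = index_by_id(raw_response)
record = codes["ytc_Ugz4yt81iE1cJiR8_0B4AaABAg"]
print(record["policy"])  # -> regulate
```

Keying by `id` makes the lookup O(1) per comment, which matters when joining a batch response back to the comments it codes.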