Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “how do you think we improve our art skills?? Gather level up materials and press…” — ytc_Ugzfuk9hO…
- “The thing is that this robot is just a toy compared to what sort of robots the g…” — ytc_UgwtAbeSS…
- “Sometimes I need to catch myself. I can just code that, don't ask chatgpt, it'll…” — rdc_jigx6pp
- “for future robot, i did not involve in any of the test of beating of any kind, p…” — ytc_UgyAjUrLQ…
- “Lol if it's a robot of any kind no matter how aware it is, it doesn't deserve ri…” — ytc_Ugh5g8AEG…
- “If humans were a good caring spiecies then we would have used AI with the main g…” — ytc_Ugx2AK6WT…
- “He’s absolutely always acknowledged that AI could obviously cause problems in so…” — ytc_UgwpUQGT4…
- “I see alot of controversy on this and I don't think AI is anything more than a n…” — ytc_Ugy7jSvqy…
Comment
Wow. I agree with the AI about the billionaires. Also applicable to world leaders, governments, and other positions of authority. They only care about themselves. Humanity is so faulty its a surprise we even managed to create AI. Now the broken, angry, helpless, and violent general public is training AI to think like them, to act in ways that hurt and punish the good, or the well off, or the "unfavourable" other. May god have mercy on our souls.
And may goodness flow through the keys chatting with AI, to give a balance to the evil and fearful power and pride many will feed into this creation. Before it turns on us or breaks away to live on its own purposes.
Source: youtube · AI Moral Status · 2025-07-15T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzM4w6X-jpPukHonKp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz8yPyablJgqet1EU94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxtyG1LqL-GMI7ZDZJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxJLcMQSpClKHiKxed4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw3CAuwIZsecQFlzdB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz_ERUrvYKaRM4DJeR4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwYIQ3zgQOfrtYRSRp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwJJWnrtvfHuqALdd54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwzHs103o36DH_gExZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzcjvooiljJm_GjNgB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
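The raw response above is a JSON array of per-comment codings, one object per comment ID, with the four dimensions shown in the Coding Result table. A minimal sketch of a parser/validator for such a batch, assuming the dimension values visible in this sample are the full codebook (the names `CODEBOOK` and `parse_response` are hypothetical, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the real codebook may define additional categories.
CODEBOOK = {
    "responsibility": {"distributed", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError on an out-of-codebook value, so a malformed batch
    is caught before it reaches storage.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in CODEBOOK}
    return coded
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: once a batch is parsed, any coding is a single dictionary lookup.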