Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The opinion of the robot is the opinion of the person who wrote their code.…" (ytc_Ugzw9VOoH…)
- "Why would companies use AI to displace jobs if they know that if humans don't ea…" (ytc_UgxlZm-zD…)
- "The thing people aren’t saying out loud is that AI isn’t optional. Yes, it has a…" (ytc_Ugwb7pkHs…)
- "Instead of giving A.I. an algorithm for painting, which is still pretty cool, wh…" (ytc_UgwCsmD4Z…)
- "You're saying that they don't understand how it's working, but you may be giving…" (ytc_Ugx-THrMG…)
- "AI is going to take us down long before global warming does at this rate.…" (ytc_Ugw6azM5o…)
- "No thank you I don't want a robot as a friend I want a real human as a friend du…" (ytc_Ugy22CIDP…)
- "If AI really takes everyone's jobs, then we will simply have a society where lab…" (ytc_Ugxv0QVME…)
Comment

> Maybe it would be the best for humanity, been controlled without to know... maybe the AI would stop the whole bullsht and put us on the right rails

youtube · AI Harm Incident · 2024-07-30T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
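
For working with these records programmatically, here is a minimal sketch (in Python, with hypothetical class and field names) of how one coded comment could be represented. Only the dimension names and the example labels visible on this page are taken from the source; the complete label sets are an assumption.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CodedComment:
    """One coded comment, mirroring the Coding Result table above.
    The example labels in the comments are those visible on this page;
    the full label sets are assumed, not confirmed here."""
    comment_id: str
    responsibility: str             # e.g. "ai_itself", "developer", "none"
    reasoning: str                  # e.g. "consequentialist", "deontological", "virtue", "unclear"
    policy: str                     # e.g. "none", "liability", "unclear"
    emotion: str                    # e.g. "mixed", "fear", "approval", "resignation", "unclear"
    coded_at: Optional[str] = None  # ISO-8601 timestamp recorded when the coding is stored

# Built from the first entry of the raw response shown below.
first = CodedComment(
    comment_id="ytc_UgyCvnPWxlfZn86eKcl4AaABAg",
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="none",
    emotion="resignation",
)
```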
Raw LLM Response
```json
[
{"id":"ytc_UgyCvnPWxlfZn86eKcl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwPCk05YKq5XYJb5hR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyJHuZg7ukuoQ4bjwl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyd9OQhiZx1GFwW03R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy-BRa2rQ5Av_QchFF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6DVh8u3OVjim3LAR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxtjU9EsxMSjmLFzfR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxW7aKJmz6dGXTiZNx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyF9H8it2KgJhz8xa94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzqxfMb2S9B54gc0jN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
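
The raw response is a JSON array with one object per coded comment, each carrying the comment's ID. A minimal sketch of how such a response could be parsed and indexed to support the by-ID lookup offered above; the file path and function name are hypothetical, not part of this page.

```python
import json

def index_raw_response(raw_text: str) -> dict[str, dict]:
    """Parse one raw LLM batch response (a JSON array of per-comment codings)
    and index the entries by comment ID for quick lookup."""
    entries = json.loads(raw_text)
    return {entry["id"]: entry for entry in entries}

# Hypothetical usage: load a stored raw response and inspect one coded comment.
with open("raw_responses/batch_0001.json", encoding="utf-8") as f:
    by_id = index_raw_response(f.read())

coding = by_id.get("ytc_UgyCvnPWxlfZn86eKcl4AaABAg")
if coding is not None:
    print(coding["responsibility"], coding["reasoning"], coding["policy"], coding["emotion"])
```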