Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgwXHkJRX…`: Howdy! Disabled artist here! AI “art” sucks and I genuinely don’t think AI “arti…
- `ytc_UgzuSPhzG…`: A guy telling a AI-Artist to do art, is not a artist. They are a AI-Artist wrang…
- `ytc_Ugy7D7ymZ…`: I wonder if the robots are being sent the answers via some sort of messaging sys…
- `ytr_Ugwja6OEr…`: No, it's people who are rightfully upset about somebody stealing content, mergin…
- `ytc_UgzTwQ8K3…`: I may have a devart account but I'm not against nor do i support AI on dA. if an…
- `ytc_Ugziql6Di…`: Johnny as you said automation bias is one of the biggest and most important aspe…
- `ytc_UgzzTt4SL…`: If you consider yourself an artist while using AI, no. You’re good at writing pr…
- `ytc_Ugxckp07r…`: I don't like AI to be put in these situations but hay AI studies patters…
Comment
> In my opinion, it should always favour the lives of others over the passengers and drivers. If you choose to drive in a self driving car, you are accepting the risks (which, may I mention, are less) and show you are willing to take the hard end of the stick if it does end up in a crash.

youtube · AI Harm Incident · 2020-10-14T10:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugyr0RMpIPjrTwnqobN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx-IgbdLnExAW_EhZF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxN7dBvLWbIzar2GFB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEykcQ7MVTCsx7g7d4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwvKTI5iOHqUc77Qtp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0YcSTWrba2PmZtwp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugylzv1Xaz0MgWpxUfJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw3EQZ9U6NBDWibkPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxJ57uNPwCyWaPdS5p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxsG5XLEWsnnq4EsCR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
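A raw response like the one above is a JSON array of per-comment records, one per dimension set. Below is a minimal sketch of how such a response could be parsed and validated before storing codings. The allowed category vocabularies are inferred from the values visible in this page and are an assumption, not the tool's authoritative schema; `parse_codings` is a hypothetical helper name.

```python
import json

# Allowed categories per dimension, inferred from the values seen in the
# coding results shown above. This vocabulary is an assumption, not the
# tool's authoritative schema.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries
        # Keep the record only if every dimension holds a known category.
        if all(rec.get(dim) in cats for dim, cats in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: the second record uses an unknown category and is dropped.
raw = json.dumps([
    {"id": "ytc_example1", "responsibility": "user",
     "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
    {"id": "ytc_example2", "responsibility": "alien",
     "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
])
print([rec["id"] for rec in parse_codings(raw)])
```

Dropping, rather than repairing, off-vocabulary records keeps the validation step simple; a real pipeline might instead log them for re-prompting.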