Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "One time, the ai got trapped in a statue, and my character lost a hand and an ey…" (ytc_UgyzZhm7F…)
- "I don't think we need AI designed specifically to simulate a romantic partner to…" (ytc_Ugy2OUPbd…)
- "14:53 I don’t like how it says “we” as if it’s not an ai too…" (ytc_Ugy9bqbuv…)
- "They knew what they were doing the whole time. AI needs to be regulated and the …" (ytc_Ugyj9rp4A…)
- "100% this! As someone who takes a lot of photographs, I look at the world in a v…" (ytr_Ugy0tlRyR…)
- "The best way I can think of to separate the powers of AI is to not utilize AI.…" (ytc_UgwwqTukq…)
- "31 Atlas is coming to stop us in our tracks with AI. Think about the timing, if …" (ytc_UgxiiqZ4r…)
- "The issue with ai when they start doing what they want is that there are no cons…" (ytc_Ugx_FbgaI…)
Comment

> You can't control a programmed robot to kill on a given programmed order verified and conclusive evidence Facebook Fact yes thank God Kurie Theo verified and confirmed that Facebook Fact

youtube · AI Harm Incident · 2024-07-31T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyCvnPWxlfZn86eKcl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwPCk05YKq5XYJb5hR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyJHuZg7ukuoQ4bjwl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyd9OQhiZx1GFwW03R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy-BRa2rQ5Av_QchFF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx6DVh8u3OVjim3LAR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxtjU9EsxMSjmLFzfR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxW7aKJmz6dGXTiZNx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyF9H8it2KgJhz8xa94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzqxfMb2S9B54gc0jN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
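Looking up a coding by comment ID amounts to parsing the raw response as a JSON array and scanning it for a matching `id`. The sketch below illustrates this under the schema shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `lookup_coding` helper and the truncated `RAW_RESPONSE` sample are illustrative, not the tool's actual implementation.

```python
import json

# A shortened raw LLM response in the format shown above: a JSON array of
# per-comment codings. Only two entries are reproduced here for brevity.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyCvnPWxlfZn86eKcl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwPCk05YKq5XYJb5hR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding dict for one comment ID,
    or None if the model did not emit a coding for that ID."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(RAW_RESPONSE, "ytc_UgwPCk05YKq5XYJb5hR4AaABAg")
print(coding["responsibility"])  # → none
```

In practice the parse step would also need to handle malformed model output (e.g. prose wrapped around the array), which is why inspecting the exact raw response per comment is useful.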