Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- This was excellent, but it is the easiest thing in the world to get AI to change… (ytc_UgxenOIEr…)
- Problem with this issue is that other superpowers know the race to AI dominance… (ytc_Ugy-kjelZ…)
- Zenyatta from Overwatch! 3:25 even in 3:36 the middle robot says: PASS INTO THE… (ytc_UgxMG6x2m…)
- @group555_ I disagree.. A pencil is a tool for creating something with human in… (ytr_UgzZ7RcUO…)
- "Why has it never written back since they died?" Well, it's a fucking code, it's… (ytc_Ugy8oIaFv…)
- Well it's the amount of progress the AI is achieving every year that is scary. I… (ytr_UgxzDglcJ…)
- the moment someone tries to give a robot human rights i hope that its disassemb… (ytc_UgwoZKwVb…)
- We need to treat AGI as an alien. Because it will be. It's no different than a a… (ytc_UgzJUoaRB…)
Comment
> Instead of working on self-driving cars, we should put our mind and resources to find real solutions to traffic problems like reducing the number of vehicles on the road by building more walkable cities, subways and good old electric tramways.

youtube · AI Harm Incident · 2020-12-23T10:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxzrJFHtwQVGAhSoTR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxU5tCZM8HakEDvsFl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMY4iDTn7n18gY6-14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz4ux_V7SfOPVtCZ9V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxScl_CUE54ZUahNwt4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxSt_7nWVXXZuuXwvV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwalSdv_7oC0WE64eZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzo1Jhy8gfrjzTAU1Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxdohbhxra2IeZqFE54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugxx9whhaDWkfEJOjy14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"industry_self","emotion":"approval"}]
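The raw response above is a JSON array of per-comment coding records, one object per comment ID, with the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload could be parsed and indexed for lookup by comment ID; the field names come from the response itself, but the validation logic is illustrative, not the tool's actual implementation:

```python
import json

# Two records copied verbatim from the raw LLM response above.
RAW = (
    '[{"id":"ytc_UgxzrJFHtwQVGAhSoTR4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_Ugzo1Jhy8gfrjzTAU1Z4AaABAg","responsibility":"government",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]'
)

# Field names observed in the response; any record missing one is rejected.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a raw coding response and index the records by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {missing}")
        coded[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS - {"id"}}
    return coded

coded = parse_codings(RAW)
print(coded["ytc_Ugzo1Jhy8gfrjzTAU1Z4AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "Look up by comment ID" view above possible: each coded comment maps directly to its four-dimension record.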