Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — select one to inspect
They're already doing it. AI doesn't work but they cram it down the throats of a…
ytc_UgyFEUkV2…
Turn off AI, why can we not use our own brains??? Why can't hackers and people …
ytc_UgwezpzYq…
I disagree. Once most people lose their jobs, the concept of "money" will rapidl…
ytc_UgzNQxJJW…
Implement a tax for any company that employs ai and create a safety blanket for …
ytc_UgxYwtKaX…
the figure quoted for blackmail by the AI was 86% of the time they told it they …
ytr_Ugxska7cd…
"AI DRIVR" - tells me everything I need to know about your intelligence and your…
ytc_UgxN8DJNC…
No, because software is complex and you are required to have a mental model of i…
ytr_UgzWqfd_Y…
I think we’ll reach very advanced AI, but not full AGI, because keeping humans i…
ytc_UgxaNXwJP…
Comment
Asimov's Three Laws of Robotics are a set of rules intended to govern the behavior of robots, particularly in science fiction. These laws, as formulated by Isaac Asimov, are: 1) A robot may not injure a human being or, through inaction, allow a human being to come to harm. 2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. 3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube
AI Harm Incident
2025-07-26T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugzwe76NfuYtBb1jtXl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzGEbyz2xO0qlyZ6rx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhYP1mDwT_HfcocOR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwnq9VnYtUCWV4PZCt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxBNclizqKSoskbjNl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzSIo63WFtgnFl8nzh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzV5zJLqzYDlKo1WhJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyL8qtTqFB-3lxmIGN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxk5i3OSoQ8r-P_HDN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz6Z_9yGpuhmEadG2p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
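The raw response above is a JSON array in which each element carries the comment ID plus the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked before loading it into the coding table — the field names come from the response above, but the validation logic itself is an assumption, not part of the original pipeline:

```python
import json
from collections import Counter

# Two rows excerpted verbatim from the raw LLM response above.
raw = '''[
{"id":"ytc_Ugzwe76NfuYtBb1jtXl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzV5zJLqzYDlKo1WhJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]'''

rows = json.loads(raw)

# Every coded row must carry the comment ID plus the four coding dimensions.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
for row in rows:
    missing = [field for field in ("id", *DIMENSIONS) if field not in row]
    if missing:
        raise ValueError(f"row {row.get('id', '?')} is missing fields: {missing}")

# Tally one dimension to spot-check the distribution of codes.
tally = Counter(row["responsibility"] for row in rows)
print(tally)
```

On the full ten-row response this kind of tally gives a quick view of how often each responsibility code was assigned before any per-comment inspection.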