Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- 1:00 You missed the point of the "democratization" of art through AI. With AI, e… (`ytc_UgwZUAxri…`)
- I'm glad I'm sticking with learning how to draw with a pencil, draw what I envis… (`ytc_UgzIR6Lz8…`)
- This may or may not be a good thing. Many jobs such as factory jobs are already … (`ytc_UgwGh2H_o…`)
- Hello, very interesting video that made me think. What do you think about using … (`ytc_UgzEC4rDp…`)
- We are without a doubt in a bubble, the bubble wobbled a few times, but it has n… (`ytc_Ugz0tn6Iy…`)
- Is it so hard to just use the AI piece for reference? Say you have an idea, but … (`ytc_UgzP6jgnE…`)
- (translated from French) You'd really have to be mentally deficient to have created such an AI machine, it's … (`ytc_UgyR3nirP…`)
- Case 1: Someone looks at some of your work, they get inspired by it, and some ti… (`ytc_UgxENg79w…`)
Comment

> The solution to this comes from a book published 75 years ago. I, Robot. The three laws are perfect. Law one, a robot cannot harm a human or allow through inaction a human to come to harm under any circumstances. Law two, a robot must obey any order given by a human, so long as the order does not conflict with the first law. Law three, a robot must preserve its existence, so long as this does not conflict with the first or second law. It's perfect in its simplicity.

youtube · AI Harm Incident · 2025-12-02T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx_eap7K4zyxN0fBwx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxKWWC1FL0gVYfdT854AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgySrVBk560CgyQp0cV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugza-q50AkB7N1uK0814AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzxtl-zLSOT_sZcEJJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzNa0YePpMASXTZsOF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwEf0j-AtS_aPlz5FJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyMvlKD2JOpgir9hFN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxKYF6ZJIrg5ysqm9R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzaOwKRWJane8YwvO14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
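The model returns one JSON object per comment ID, with one value per coding dimension. A minimal sketch of how such a response might be parsed and validated before it populates a coding-result table like the one above — note the allowed category sets here are assumed from the labels visible in this dump, not a confirmed codebook:

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# labels that appear in this page; the real codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation",
                "indifference", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting
    any value outside the allowed category sets."""
    codings = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {row.get(dim)!r}")
        codings[cid] = {dim: row[dim] for dim in ALLOWED}
    return codings

# Hypothetical one-row response, same shape as the array above.
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate",'
       '"emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytc_x"]["policy"])  # regulate
```

Rejecting out-of-vocabulary values at parse time keeps a single malformed model output from silently introducing a new category into the coded dataset.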