Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by ID or drawn as random samples, for example:
- ytc_UgwYvTQxX… — "Right great way to make it sound even more confidently incorrect. Can't believe …"
- ytr_Ugwcss8D3… — "Only because the robot became a cheaper option. But as AI and AI-powered machine…"
- ytc_Ugy4OuIZs… — "I could not agree more with you. Is important to put limits on the usage of AI…"
- ytc_UgzYdePcF… — "Is the AI interspersed through our algorithms today not already, more subtly and…"
- ytc_Ugz752BLm… — "AI isn’t a binary threat-or-saviour issue for software teams. Its most reliable …"
- rdc_n5h9qf8 — "I call bullshit. I have friends who are partners at major international firms wh…"
- ytc_UgwnDSLgc… — "Robots, Ai, will never have emotions and they can't clean themselves. Joshua ch…"
- ytc_UgyKUrvcO… — "“AI art is more accessible for disabled people!” Hey hi so im disabled + an arti…"
Comment

> 1) A robot may not injure a human being or allow a human to come to harm through inaction; 2) A robot must obey the orders given by humans unless it conflicts with the first law; 3) A robot must protect its own existence as long as it does not conflict with the first two law

youtube · AI Harm Incident · 2025-07-28T14:2… · ♥ 29
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgxASsktvLEwAjh2H754AaABAg.AL8i-Z65gSiAMIUWQs8ZSW","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxiHjSONWgF9MytblR4AaABAg.AL7xTGhZJnqALL16xkbUWK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxLg69yQ3o68BlWYA14AaABAg.AL7dIEe8EwXALBBJEhYEX4","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugws29neEky5h8y5r4F4AaABAg.AL7KdI6HwzvAL85ez1YTKK","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugws29neEky5h8y5r4F4AaABAg.AL7KdI6HwzvAL9sXWfXEiU","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzbzBmLQYKOxNNbITN4AaABAg.AL6KoiLPeQnAL7PFZbZFpX","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgzbzBmLQYKOxNNbITN4AaABAg.AL6KoiLPeQnAMMxCDpBAt8","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UgzRZaN5Z1zjWMAqD_h4AaABAg.AL69YcijqY8AL6A6MHlpQw","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyFE3-0NjInX_I13Th4AaABAg.AL63OYcBbizAL8xhfiSABI","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyFE3-0NjInX_I13Th4AaABAg.AL63OYcBbizALEfyfCVp78","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
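A raw batch response like the one above can be turned into per-comment records with a few lines of Python. This is a minimal sketch, not the tool's actual ingestion code: the function name `parse_coding_response` is hypothetical, and the four dimension keys are assumed to match the Coding Result table above.

```python
import json


def parse_coding_response(raw: str) -> dict[str, dict]:
    """Map each comment ID to its coded dimensions.

    Assumes the model returned a JSON array of objects with the keys
    seen in the raw response above: id, responsibility, reasoning,
    policy, emotion.
    """
    coded = {}
    for rec in json.loads(raw):
        coded[rec["id"]] = {
            "responsibility": rec["responsibility"],
            "reasoning": rec["reasoning"],
            "policy": rec["policy"],
            "emotion": rec["emotion"],
        }
    return coded


# Toy input with shortened IDs (real IDs are much longer, as shown above).
raw = '''[
  {"id":"ytr_a","responsibility":"none","reasoning":"unclear",
   "policy":"unclear","emotion":"indifference"},
  {"id":"ytr_b","responsibility":"developer","reasoning":"consequentialist",
   "policy":"regulate","emotion":"approval"}
]'''
coded = parse_coding_response(raw)
print(coded["ytr_b"]["policy"])  # → regulate
```

Indexing by ID makes the lookup-by-comment-ID view above a single dictionary access rather than a scan of the response array.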