Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I’ve always been against AI art because of its flaws and how people can become l… (ytc_UgxtNDzvp…)
- Girlll what imperfections are you talking about !? Lol ai is the one that can't … (ytc_UgxfV5TIN…)
- Slavery didn't end because it was wrong, slavery ultimately ended because it was… (ytc_UggLBNtGH…)
- ok but that narrator is defo an AI voice, and what is 'AI's first kill' exactly?… (ytc_UgyE-z4cX…)
- I hope AI can push this video to more people! The world needs to see this video… (ytc_UgzyagaM5…)
- Waymo was just giving it's passengers a real tour of L.A. Nothing wrong with tha… (ytc_Ugz5JZVTY…)
- I saw one of those driverless trucks on a cross country highway on a road trip o… (ytc_Ugy6BrXKV…)
- Most of "AI bros" are teen dudes that believed the marketing of Ai companies and… (ytc_UgxVZkvNA…)
Comment
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube
AI Harm Incident
2025-07-24T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxzcvtslR6_zz2d4sl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfGwt7oGyvEW4cPOR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwRwvg0nKCESfN3LKB4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzdUkbjNgUjgM8rI4x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz_vXdTO7c5OcWbiU54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw3sSWR15T6ynwLyxt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxkceJPH9qE07lwSrh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzAUdnoUmmp0eiOjzB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzaa-LR-q8CvhzTN9B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyQMOghOF2nxgnzIgV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
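A raw response like the one above can be parsed and validated before its codings are stored. The sketch below is a minimal, hypothetical example: the four dimension names come from the coding result table, but the sets of allowed values are inferred from the sample output and are an assumption, not a definitive codebook.

```python
import json

# Allowed values per coding dimension — inferred from the sample
# response above; treat as an assumption, not the real codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "ban", "industry_self", "liability"},
    "emotion": {"indifference", "fear", "approval", "resignation", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM JSON array and keep only rows whose values
    fall inside the allowed sets for every dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Usage with a single hypothetical row:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"approval"}]')
print(len(parse_codings(raw)))  # 1
```

Rows with out-of-codebook values are dropped rather than repaired, which keeps the stored codings clean at the cost of re-prompting for rejected comments.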