Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "If a doctor has to rely on an AI, i think i’ll just pass away instead 😅…" (ytc_UgwKU1w8O…)
- "STOP handing out HUNDREDS OF BILLIONS to these AI companies. There ya go, fixed …" (ytc_UgwG_uBK-…)
- "I honestly think that these smart speakers are okay. In Germany we do have laws …" (rdc_gyo1gww)
- "This is what people need to understand about automation in general. We've been a…" (rdc_nxrpxsw)
- "49:37 I know what your saying this ties back to slavery and current outcomes tha…" (ytc_Ugz86s2QF…)
- "I have an answer for you, but many people wont like it, or even understand it. A…" (ytc_UgyEMBB5r…)
- "Why would you want “customization” Does she do the dishes and wash clothes? Id…" (ytc_UgxumFhH3…)
- "Back when I was young I dreamed of the day when machines could finally do our wo…" (ytc_UgxcxMmKi…)
Comment (youtube, AI Harm Incident, 2025-07-25T08:2…)

if people in your town suddenly start carrying swords with impunity what you need is not another sword, but a shield.,.., but let's not forget people- AI is still just Boggle with quantised outputs.,., we don't give lethal weapons to infants or leave them running with dice-rolls deciding who gets shot
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxS3PKZqibwoDUEZdt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx8yPs5FVl1kVsq0lN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwp9fmBTIS43PhDyf14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz7k9UVLsRTIKuCMtl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz3cRJW0QgwJBxZYJp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxniNWmyDqA9_TGuL14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxeooDiXDskxJ5EvoR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyDU-hCgxjoDzoCi1p4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxqGg4Lk2rfFwhFhSh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyBq7zJk5Yv5YT1GM14AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"mixed"}
]
```
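A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The Python sketch below is one minimal way to do that; the per-dimension vocabularies are inferred from the samples visible on this page, so the actual codebook may contain additional categories, and `parse_raw_response` is a hypothetical helper, not part of this tool.

```python
import json

# Allowed values per coding dimension, inferred from the samples shown
# above; the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    valid = []
    for rec in json.loads(raw):
        has_id = isinstance(rec.get("id"), str)
        codes_ok = all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
        if has_id and codes_ok:
            valid.append(rec)
    return valid
```

Silently dropping bad records is a design choice for this sketch; a production pipeline would more likely log rejected records (or re-prompt the model) so coding gaps can be audited.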