Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
It's crazy to me that he's no only pro-AI but also apparently anti-human art 😭? …
ytc_Ugz68IWNX…
This is an essential pairing of videos! 🤯 Bravos Research delivers a stark finan…
ytc_UgwKXQQa9…
"Hi Nicolas, you got the right answer. Kudos. The contest is over and winners ha…
ytr_UgxGkK4ZU…
Jacob, I found out there is a, "Association of AI Ethicists" but it is in France…
ytc_UgzsOIJh8…
Elon, I'm 67 & you're talking a foreign language to me. Its taken me 7 yrs to fi…
ytc_Ugy64RoFM…
There just making them just for money and do not take the after affects in to co…
ytc_UgycprBcV…
Tell this to Max: if AI needs to kill humans to achieve ITS goals, them It Is no…
ytc_UgxE3VqHc…
It's not like destroying an AI artist, more like respect. if there are so many r…
ytc_Ugxa2oqm6…
Comment
@findlisa5yes they have caused deaths and your statement is not true because a self driving car has killed a woman named Elaine helizberg so your statement on how a self driving car has never killed or hit someone is wrong and also it has killed a car named KitKat in San Francisco and it has caused problems for 911 emergencies and easily get destroyed during riots and those riding it are unable to exit the car proofing that they are a useless form of transportation that does more harm to the public than good.
youtube
AI Harm Incident
2025-12-06T17:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgwKtRuB9PBJZKCmCyV4AaABAg.AKtmEy7VtTzAM1XRX0fG21","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxN0nNPluOITP_2YEp4AaABAg.AKtfcV7jKIhAKtmofmLIp","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxN0nNPluOITP_2YEp4AaABAg.AKtfcV7jKIhAKw6Gx5Cn3-","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugxv1mFj_eioeZ3MY_F4AaABAg.AKt11_f61xaAKuRb82iaY-","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_Ugx3_qXw8gG61SLXsy14AaABAg.AQ_E75avMcyAVUAsXym7py","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugxpsb6ZiGkGjUQQa2F4AaABAg.AQRfPeAav5YAQSHCRDqhns","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgwgSZhLitAIx0sugFZ4AaABAg.AQQSmX7AqYAAQSHzUVydqb","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxyWg5px6mbCOBGOlh4AaABAg.AQNB2UyzIBGAQSIZtoHPUw","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgzJga58EKJNN-t30iJ4AaABAg.AQN9AVSawJEAQOalADW24J","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzJga58EKJNN-t30iJ4AaABAg.AQN9AVSawJEAQP38TMHGho","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
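The raw response above is a JSON array of records, one per comment, each carrying an `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and sanity-checked before use — note that the allowed value sets below are inferred only from the values visible in this sample, not from the full codebook, and the function name is illustrative:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "resignation", "fear", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept when it is a dict with an "id" field and every
    coded dimension holds a known value.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records missing the comment identifier
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage with a tiny, hypothetical one-record response:
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
print(parse_coding_response(raw))
```

Dropping (rather than repairing) records with unknown values keeps downstream tallies clean; rejected IDs could instead be queued for re-coding.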