Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Meta CEO Zuckerburg - Ghew, Open AI's Sam Altman - Ghew, Google cofounder Larry … (ytc_UgziJ017i…)
- Hallucination is just the technical term for the AI being creative. When it gets… (ytc_UgxyKQ7GL…)
- The CNN host says we should stop the progress of AI because kids have died, but … (ytc_Ugzr8gK7Z…)
- My question is why would AI want to kill us and take over? What would be the inc… (ytc_UgyV3cGre…)
- Pay attention peeps "human made" "hand crafted" will become a thing again and … (ytc_UgwPGSP2W…)
- Everybody's talking about the terminator this and Terminator that- Man, just do… (ytc_UgxCORY7q…)
- LLMs are the closest we’ve ever been to agi, but that doesn’t mean that it’s clo… (ytc_UgwlB4QOx…)
- Waymo is safer than going with a human driver, however how manh people will lose… (ytc_UgxLuTj2R…)
Comment
@anukalgudi6216 Then the mother is problematic for letting him use the phone unsupervised. Along with having access to a gun. How did he get the gun. Oh she doesn't want to talk about that. You know, unless it belonged to the a.i bot.
youtube | AI Harm Incident | 2025-12-11T18:1… | ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwnisDTOstn0A_PTXV4AaABAg.AQTpxI4DiG6AQTr6fABFXT","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxKhDDl0TUgsvc46hd4AaABAg.AQTak8F2RA_AQaYk-Moxk2","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugyjk2SZth4n7PBHmdp4AaABAg.AQTM6h-6g7yAQWYMtF0xL-","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgyTanoZp6qSmSjLcjN4AaABAg.AQTGlgEzi4UAQUXnu7X7vF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgyN6SvXayhKj3CLw7p4AaABAg.AQTM6h-6g7yAQWYMtF0xL0","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzruIe3Vx8S3AXWMJF4AaABAg.AQT-B8bHma_AQTj48BC-2V","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzruIe3Vx8S3AXWMJF4AaABAg.AQT-B8bHma_AQTrizgkI7y","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwHhbxyO0gqe-Yno8x4AaABAg.AQSwWPxswhaAQb0WTD9jPG","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyeK2tmLvhHHFVeGZV4AaABAg.AQSr7JKSEp6AQZ8WC5gXl0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxKsIk360yVSuco85B4AaABAg.AQSfHOxY547AQTOo2eg8R7","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
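A raw batch like the one above can be machine-checked before its rows are written into the coding table. The sketch below parses the JSON and flags any value outside the category sets; note the sets are inferred only from the values visible on this page, so the real codebook may contain more (or differently named) categories:

```python
import json
from collections import Counter

# Category sets inferred from the values shown in this response
# and the coding-result table; the actual codebook is an assumption.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "company", "none", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "indifference", "approval"},
}

def validate_batch(raw: str):
    """Parse a raw LLM response and collect rows whose values fall outside ALLOWED."""
    rows = json.loads(raw)
    errors = []
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                errors.append(f"{row.get('id', '?')}: bad {dim}={row.get(dim)!r}")
    return rows, errors

# Minimal usage with a one-row batch in the same shape as above
raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
rows, errors = validate_batch(raw)
counts = Counter(r["emotion"] for r in rows)  # per-dimension tallies for QA
```

In a real pipeline the same check would run on every batch before coding results are stored, so a model that invents an off-codebook label fails loudly instead of silently polluting the table.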