Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "All fun and games until those same art pieces are stolen to make AI better. You …" (ytc_UgxDQ-qzi…)
- "I appreciate the high-stakes setup! However, as an AI, I don't actually have a l…" (ytc_UgxnnETVc…)
- "The inexplicability of neural networks and LLMs is not the issue. In essence, th…" (ytc_UgxKC3XJz…)
- "Humanity is not a group. You should speak in terms of haves and have nots. AI im…" (ytc_Ugwbq0KRw…)
- "There's no such thing as AI artists. You're just using an AI to generate art for…" (ytc_UgwHnkWnD…)
- "5:52 loved the video! but maybe next time don't use google gemini screenshots as…" (ytc_UgzWLUZ5Y…)
- "AI is still bad, it has no soul, but its papassable. It's like settling for some…" (ytr_Ugz3z1-r9…)
- "Interesting, I got a ChatGPT ad in my email inbox right before this video came u…" (ytr_UgzCbsEJn…)
Comment

> The only segment I think is interesting is why people don't also watch car crash fatalities on record without Tesla autopilot (not even just Teslas). We'll readily see glaring examples of 'if a human was driving X wouldn't have happened'. People are idiots on the road and many fatal accidents, autopilot or not, are from human error. If we looked at that footage it would be just as revolting.

youtube · AI Harm Incident · 2024-12-18T17:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgywcUDsmSHtKmu2EY94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxR7bWE1QNGoGAe4Bx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKHNHGvCF8TCBofQl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwe8me_sXhvf6h1VpN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydPcgYpA9Eo0GpnF54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyjk8hF3wmW4-dVXc54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxccQADSSlRhUGeMLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnZzC2pi7V_nX2JLd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgymIYmuz-HlGnRmCmR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgBGDpZa6uRQhpi-J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
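The lookup-by-ID flow above can be sketched in a few lines: parse the batch JSON the model returned and index each coded row by its comment ID. This is a minimal sketch, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown, and the two inlined rows are copied from it, but the function name and structure are illustrative assumptions.

```python
import json

# Two rows copied verbatim from the raw LLM response above; in practice
# the full batch string would come from the model call or a saved file.
RAW_RESPONSE = """
[
 {"id":"ytc_UgywcUDsmSHtKmu2EY94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxKHNHGvCF8TCBofQl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding batch (a JSON array of per-comment codes)
    and index the rows by comment ID for O(1) lookup."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_UgywcUDsmSHtKmu2EY94AaABAg"]["emotion"])  # resignation
```

Keeping the raw model output alongside the parsed dimensions, as this view does, makes it possible to audit any single coding decision after the fact.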