Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (comment text and IDs truncated as displayed):

- "I'm on the side that AI should fail it takes too much electricity. It's probabl…" (ytc_UgxP1t168…)
- "Haha, if only Sophia could make tea! While she's all about wisdom and learning, …" (ytr_Ugzwu-bOP…)
- "Justice for Suchir Balaji and the others who were silenced in the name of AI…" (ytc_Ugz-LERBb…)
- "This is 1 year old talking points, not focused on real problems with AI that exi…" (ytc_Ugzi1hVq7…)
- "Good news some peeps already experimenting trying to train AI with another AI to…" (ytr_Ugw03tJTA…)
- "Nah. They will give the AI trucker Gorilla strength and strong Armor. The only t…" (ytr_UgzkgLBMn…)
- "You clearly do not know what you are talking about, AI is not at all art, you ar…" (ytr_Ugz4ZLGJ_…)
- "Oh no dear last year AI center is used as much as Brazil the entire country were…" (ytc_Ugzcpxktb…)
Comment

> Mark Belling mentioned this case on his new podcast. This serves as a word of caution to not become too dependent on AI.

Platform: youtube
Category: AI Harm Incident
Posted: 2025-12-13T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[{"id":"ytc_UgyGo_9sFf1PyZqIyQV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzydqs0LuqKtiPmK5Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_fABNq2-E7X-GQFx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxHxzqTM_QWzOCYS_R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyXFM9Kb9UUNn4Tv6R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzT2-TUb2EarG-N5Td4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwweLCcfCiVkEMWaUp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzcMb7PWcAVIRPKyMd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugyj3SP4a3OQcxiUcmd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwaIneIrI6jNvSP8B54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}]