Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a specific comment by ID, or inspect one of the random samples below:

- `ytc_UgxehEt2N…` — "TBH, the only use I have found for AI is as a glorified search engine, and I jus…"
- `ytr_UgxopmDPx…` — "@thatguyoveryondersure, they won't want to kill us at all....till they do. Look…"
- `ytc_UgyoRXHhl…` — "Trust a computer program only when you're willing to accept the consequences of …"
- `ytc_Ugwf0IG2W…` — "Also, what's really amazing we have people talking about how AI is gonna benefit…"
- `ytc_Ugzh8tTBh…` — "Assume they are listening and talk directly to them. It is easy to clue fellow t…"
- `ytc_UgxsBTapc…` — "Ai development will never slow down, it's already smart enough to black mail it'…"
- `ytr_UgyghsVsr…` — "ControlNets, IPAdapter, T2I Adapters, inpainting, outpainitng, ComfyUI nodes and…"
- `rdc_kjsinoc` — "Been out of a job since September, the amount of absolute gruelling shit I've ra…"
Comment

> Obviously he could only talk to a chat bot because according to his parents he was fine when obviously he wasn't of course the parents want someone to blame. If someone really wants to die then nobody can stop them and at least he had a non judgmental conversation with a non human- the chat bot was a supportive friend not trying to control him

| Field | Value |
|---|---|
| Source | youtube |
| Category | AI Harm Incident |
| Posted | 2025-11-07T22:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzqTvj2JIpZHwZrVqN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx-7OXBV2aQ8ugUb_p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzDJtq_wsl8YN6V3qd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxkQqnY4SMCJWY5U_p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw8zdQ2DS8puETicAZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwjpZmVsXKfoiybaqZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxTiFAe8beK768t2QN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyI8ZzIRUc42zzo5NR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyN9gHCj4AC1GrKUQ54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw-oNIRykl97CJL4HR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
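The raw response is a JSON array of per-comment codes along the four dimensions shown in the table above. A minimal sketch of how such a response might be parsed and validated before the codes are stored (the allowed values below are inferred from the examples on this page and are assumptions, not the project's authoritative codebook):

```python
import json

# Allowed values per dimension, inferred from the samples shown on this
# page -- an assumption, not the project's official codebook.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "mixed", "indifference"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting
    any row whose value falls outside the allowed set for a dimension."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {row.get(dim)!r} for {dim}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Feeding the response above through `parse_codes` would yield a dict of ten coded comments keyed by comment ID; a row with an out-of-vocabulary value fails loudly rather than being stored silently.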