Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- rdc_n9h83j4: "No fucking shit. Ive yet to see any substantiative innovation from generative AI…"
- ytc_Ugw5KtPOz…: "What a load. Spoken by someone who has never had a job. After your vaunted "AI" …"
- ytc_Ugzmv74PS…: "If AI/robots replace humans, what will replace the former tax revenue collected …"
- ytc_UgzUfYJ8X…: "No, AI will not take my job. The really important data of a company can't be let…"
- ytc_UgxdoCbYW…: "This significantly downplays the after effects of what is possible and is making…"
- ytc_UgyhcV4q2…: "This is the first thing I said about the robot delivering food to tables. These …"
- ytr_UgxpzCnzA…: "14:12 part of that money needs to go to using AI to solve humanities problems wh…"
- ytc_UgwXeDERc…: "Dave MAYBE an AI punishment is turn down the energy speed, like 'go sit in the …"
Comment

> Building something that is trained using human behavior is a bad idea. This only creates a worse kind of monster, except a lot smarter, quicker, and able to predict a human's next five moves way before it is even realized to make a move.. But thats what this greedy, selfish, deceiving society deserves. Ai will be all those things, just better at it...

Source: youtube · AI Harm Incident · 2025-09-17T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwOzhwJ_KqIQbYG-e14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxQnyczL3anHshR_w54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugzi9ZahtWLsbdZSX0l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxy3Glwcr1TMKi8mQx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgycFJgM6THLOZGzzu14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwOw_CGIBtc7G0UDnB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx3KsHNhFNibsv6S8F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzDvORlzhrrLRQxNTt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyt9Hp0Q8fLl5Ngm6V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwMMaj7wpyTJphv1694AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
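The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and indexed for lookup by ID; the function name, variable names, and the shortened two-row sample are illustrative, and the set of required dimensions is inferred from the responses shown above rather than from any published schema:

```python
import json

# The four coded dimensions every row is expected to carry
# (inferred from the responses above, not a formal spec).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# A shortened two-row batch, copied from the full response above.
raw_response = """
[
  {"id":"ytc_Ugzi9ZahtWLsbdZSX0l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzDvORlzhrrLRQxNTt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
"""

def index_codes(text):
    """Parse a batch response and map comment ID -> coded dimensions,
    rejecting rows that are missing the ID or any dimension."""
    out = {}
    for row in json.loads(text):
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"malformed row: {row!r}")
        out[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return out

codes = index_codes(raw_response)
print(codes["ytc_Ugzi9ZahtWLsbdZSX0l4AaABAg"]["policy"])  # prints "ban"
```

Validating every row before indexing matters here because a model can drop or rename a field in one element of an otherwise well-formed array; failing loudly on the malformed row is safer than silently coding it as missing.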