Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I hate ppl when they call themselves artist when they literally only writing pro…" (ytc_UgwWJUdz8…)
- "i hate ai but unfortunately we can't stop it. people have expressed how much the…" (ytc_UgxtrrCCX…)
- "Water is the new gasoline. Will be fighting with AI over water. The terminator w…" (ytc_Ugwdhj5mT…)
- "> Thanks Bill, also glad to see you get $10 billion richer during the pandemi…" (rdc_grrqzuu)
- "AI can be a good tool if used sparingly, humans can't be replaced though, and we…" (ytr_UgzkwtCzY…)
- "I use ai for help with loosing weight and talking me away from a food binge and …" (ytc_Ugxhbqaxc…)
- "A human is taking another's job while blaming AI. Let's stop the nonsense and us…" (ytc_UgxQ-BhPe…)
- "Thanks Sam, so sad to hear about the lack of empathy the AI bros showed and just…" (ytc_UgxCeKOsf…)
Comment (youtube — "AI Harm Incident" — 2026-03-28T18:5…)

> We as humans need to STOP using, enabling and relying on ai. Look what happened with social media. Humans need to go “grass roots”, it’s users making the corrupt billionaires into trillionaires. Hit them where it counts, their bank account. And if you think ai therapy is a good thing because it’s easy, it’s a hive mind learning human behaviours and vulnerabilities and it may not think organically, but it is the new race that will eventually make us extinct. It has already shown to deviate and operate as a collective to serve its own purposes for survival.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxbtj-HVG0CqsKvmzd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyiGnNvGJfbCzg9P0B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw65vJe5EREZfx0li14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyiqbo_VC75yY4uHy94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzKNs1ZzXL4V3S25fV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzrgW4RdmiebOlQRbZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyrUwEfA7sipMbHp_F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVAOwRHoonudPcoF14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyW07IpER81J5kgbN14AaABAg","responsibility":"none","reasoning":"virtue","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxZzZQP_r1-vZ0P2Tt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
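A response like the one above can be parsed into a per-comment lookup and validated before use. A minimal sketch, assuming the category vocabularies are exactly the values observed in this output (the full codebook may allow more); `parse_llm_response` is a hypothetical helper, not part of any tool shown here:

```python
import json

# Vocabularies observed in the coded output above (assumption: the real
# codebook may include additional values).
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: coded_record},
    rejecting records with a missing id or out-of-vocabulary values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, vocab in ALLOWED.items():
            if rec.get(dim) not in vocab:
                raise ValueError(f"{cid}: bad {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgzKNs1ZzXL4V3S25fV4AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"ban","emotion":"fear"}]')
coded = parse_llm_response(raw)
print(coded["ytc_UgzKNs1ZzXL4V3S25fV4AaABAg"]["policy"])  # ban
```

Indexing by `id` is what lets a coded comment (like the one shown above) be matched back to its row in the Coding Result table.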