Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "ai is never going to replace artists, the same way bikes, cars, skateboards, you…" (ytc_UgzCyxJ5h…)
- "My take? AI art *IS* real art, but it's not *AUTHENTIC, HUMAN* art, so they're …" (ytc_UgzuW8zsH…)
- "Congrats, businesses that insist on continuing to operate like it’s the 1900s ar…" (rdc_hzfrca3)
- "I think when AI takes initiative to do something of its own without being told t…" (ytc_UgzXjTgkS…)
- "If you can’t separate talking to bots from humans, you shouldn’t be using bots. …" (ytc_UgzCTh8Tw…)
- "It’s insane how you argue that humans should keep doing jobs AI can already hand…" (ytc_UgxSmgJJY…)
- "We are going to need AI if we want to survive although it may not turn out that …" (ytc_UgyKvobqc…)
- "Using A.I. is having an idea and lacking the directive and effort to achieve it.…" (ytc_UgzaeN81G…)
Comment
Something has to change. Mega corps and their governments aren't. Maybe ai will be smart enough to realise we need the Earth to be thriving so we can. And actually manage to overhaul mankind's operating system to reflect that. Or will humans (I notice some of the most sociopathic individuals alive are involved) simply train something that thinks f-ing everything over is the greatest? Just like its creators.
youtube · AI Harm Incident · 2025-07-27T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwwVLGSgP4KJ2Uv0gd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxYoo0mxguiiOQD1I14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxWlPG4cebOWlxUcL14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwjrdE-TtrWxHJki1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugwlu3jwwnHDKvlzb6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy6CUJr-9wCVz5gMeN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwxJOHUDMJeS62CjN54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwlmFiX_poRT7HcK-Z4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyD4yUQWOUhJVZ4tjt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgweZXTNJl8Zp5_FhXp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
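A raw response like this can be checked mechanically before the codes are stored, so a malformed model output never reaches the coded dataset. A minimal sketch, assuming the dimension values seen on this page (responsibility, reasoning, policy, emotion) are the full allowed sets; the real codebook may define additional categories:

```python
import json

# Allowed values per dimension, inferred from the coded examples on this
# page (assumption: the actual codebook may include more categories).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none",
                       "developer", "user", "government"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear",
                  "contractualist", "virtue"},
    "policy": {"liability", "unclear", "regulate", "ban", "none",
               "industry_self"},
    "emotion": {"outrage", "fear", "mixed", "approval", "resignation"},
}

def validate_coding(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID.

    Records missing an ID or using an out-of-vocabulary value for any
    dimension are skipped rather than stored.
    """
    by_id = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[cid] = rec
    return by_id

raw = ('[{"id":"ytc_UgxWlPG4cebOWlxUcL14AaABAg",'
       '"responsibility":"distributed","reasoning":"mixed",'
       '"policy":"regulate","emotion":"mixed"}]')
coded = validate_coding(raw)
print(len(coded))  # 1
```

The resulting `by_id` dict also supports the look-up-by-comment-ID workflow shown above: `coded["ytc_…"]` returns that comment's coded dimensions directly.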