Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "That robot said he is going to take over the power grid and have his own drone a…" (ytc_UgxgqBHAG…)
- "It is more complicated than this of course because it is humans + AI. Either o…" (ytc_UgwwObM6k…)
- "If I can't use your art in AI, ya'll can't use other peoples are for fanart crea…" (ytc_Ugzu4cPvW…)
- "Personally I don't think a super intelligent AI is anywhere near the horizon. Bu…" (ytc_UgzzJxdB0…)
- "I mean.... i just would of reached up there and hit the power button.... but tha…" (ytc_UgxH-TwRW…)
- "of course, thats the whole goal of AI why would you wanna work at all…" (ytr_UgyDEgaC6…)
- "The heart on her shirt breaking and the extra finger on the AI "artist" so iconi…" (ytc_UgwSk9BVQ…)
- "How in the fuck is "usisg a digital software" to draw anything even clise to AI.…" (ytc_UgxcVzyVj…)
Comment
This is so out of touch with reality for me, it makes me so mad. This is not progress. This is not inevitable.
People are just so disconnected from their souls and each other, that such novelty presents a thrill. We don't need any of this stupid AI "smart" shit. My home will forever be wi-fi free! EMR is ruining our health and the health of the planet. The data centers for powering all of these super computers are one of the biggest polluters, along with mineral mining for materials for technology.
What really matters is preserving and protecting nature and our human relationships/communities - we cannot live without that!
While living without technology might be tougher, it is entirely possible. Priorities!!
youtube · AI Governance · 2025-12-13T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugxdvm_ji62KuXX9Kcl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgztRgdjtpuxO8gIIup4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzIkHNhE_wcnY96Ntl4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwtK4YZiJzXWLMawNF4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugzc1N0LTqIfyoflUSF4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx540xP3BtNgkoU8ot4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyJwzgxJpdsbZ9TqyR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxUzTteBk-AsblDcBB4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyqOhdy7WnRXkd3Tup4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzofneNXxK35Ai7nyx4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
```
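The "Look up by comment ID" view above presumably parses this JSON array and indexes each coding record by its `id` field. A minimal sketch in Python (the function name `index_by_comment_id` is hypothetical, not the tool's actual API; the two example records are taken verbatim from the response above):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (two records from the response above, shown for illustration).
raw_response = """
[
  {"id": "ytc_Ugxdvm_ji62KuXX9Kcl4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgztRgdjtpuxO8gIIup4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw response and build a comment-ID -> coding-record lookup."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugxdvm_ji62KuXX9Kcl4AaABAg"]["policy"])  # liability
```

Note that the sample list above shows truncated IDs (e.g. `ytc_UgxgqBHAG…`), so an exact-match lookup like this assumes the full ID is available; a prefix match would be needed otherwise.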