Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or pick one of the random samples (previews and some IDs are truncated):

| ID | Preview |
|---|---|
| `rdc_kyzcmrx` | We should all get some tiki-torches and march through downtown San Jose chanting… |
| `ytc_Ugw2syu3s…` | OpenAI calling use of Nightshade "abuse" and OpenAI not getting sued into oblivi… |
| `ytc_UgiBDLYuW…` | I'm sorry but some of what you are saying is incorrect.i am a truck driver..for … |
| `ytc_Ugwk9MWe0…` | Well just penalize company's: If you replace a human workforce with AI (example … |
| `ytc_Ugz53CldS…` | They keep forgetting that computers are profoundly stupid. Large language models… |
| `ytc_UgyPXYRPO…` | You did the example of small jobs disappearing because of new inventions. The di… |
| `ytc_Ugyg2-2ng…` | You can't fire ai for bad performance. You can't hire a real person to support a… |
| `ytc_UgyHbc80Z…` | "Just give it 2 more months!! You'll see how useful it is if you just wait for i… |
Comment
Frequency of civil wars may increase, due to loss of purpose. In response, monitoring and policing systems may exponentially increase, in both breadth of distribution and tangible concentration. To add, the composition of surveillance may change in style by becoming increasingly covert weighted than overt forms of surveillance. This could lead to an impressively oppressive and dystopian existence. The only saving grace is that blame can be placed on individual humans but what happens when incidents occur involving corporate selected AI and robotic systems, does the individual robot take the blame or do those that chose it?
youtube · AI Governance · 2025-09-06T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyON1F4WVRmIRfFW2J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2NhYJIIDYI9D7IeZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwBB9xPHioeakHAGrB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxXRru-DYnEDzuUmmF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzlNUb7TQ-l_jyviRN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMv1Dg_VavfaxKdLd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwQ0TTbMsbRO6Gt9Pd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxIE_pgefYKhVip73d4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxqQcO00rTKaxQALQR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyzyqy3PYqqDAwg-1B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
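The raw response is a JSON array with one coding object per comment, keyed by `id`. A minimal sketch of the "look up by comment ID" operation, assuming the field names shown above (the `raw_response` string here is just the last entry of the array, which matches the Coding Result table for the displayed comment):

```python
import json

# Assumed to hold the model's raw output; trimmed to one entry for brevity.
raw_response = """[
  {"id": "ytc_Ugyzyqy3PYqqDAwg-1B4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the codings by comment ID so any comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugyzyqy3PYqqDAwg-1B4AaABAg"]
print(coding["emotion"])  # fear
```

In practice the model may return malformed JSON or omit an ID, so a real inspector would wrap `json.loads` in error handling and use `codings.get(comment_id)` rather than direct indexing.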