Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ">Just say its for a college research paper, or ask it to tell you so you can …" (`rdc_nnjffip`)
- "@sinoptikshey. quit it, we all know that real animation is better than the AI sl…" (`ytr_UgwNA6sra…`)
- "I like the research and points about this video. However, I myself am evil. I wa…" (`ytc_UgyiLtQLY…`)
- "This is propaganda. Ai is not meeting its shareholder needs and is actually fall…" (`ytc_UgzZahnlj…`)
- "Even Jefferey Hinton has acknowledged a.i might haysome varying level of degree …" (`ytc_UgzEMCJ8J…`)
- "ChatGPT / How we collect data / 🦾 / Conversations may be reviewed by our AI trainer…" (`ytc_UgxOR9-yU…`)
- "man, this guy is just high on his own farts...once you get to the real issue, he…" (`ytc_Ugypqdki3…`)
- "As neither a legacy artist nor an ai artists strawmen like this video have certa…" (`ytc_UgxJuw6W-…`)
Comment
I see a few different potential futures with all this.
1. AI takes or is given full control over economics and production. The regular people become little more than its pets, being fed and cared for while the rich enjoy their own empty lives.
2. Computer/brain interface is perfected. People start hooking in morr and more becoming interconnected into a sort of hive mind. It reaches a tipping point where the system decides that everyone must plug in. A war ensues. Outcome...unknown.
3. AI develops more and more, taking away the majority of human jobs, making the obscenely wealthy even more so. Regular people disconnect and create their own economy or economies, separate from the wealthy and AI.
4. The people burn it all down.
youtube
AI Harm Incident
2025-10-13T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
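A coded row like the one above can be sanity-checked in a few lines. The sketch below is a minimal validator, not the project's actual tooling: the allowed value sets are assumptions inferred only from the values visible on this page, and the real codebook may permit more.

```python
# Observed dimension values, inferred from this page's samples only (assumption:
# the actual codebook may define additional categories).
OBSERVED = {
    "responsibility": {"none", "government", "company", "ai_itself"},
    "reasoning": {"consequentialist", "mixed", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def check_row(row: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed sets."""
    return [dim for dim, allowed in OBSERVED.items()
            if row.get(dim) not in allowed]

# The "Coding Result" row shown above passes: every value is one we have seen.
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "unclear", "emotion": "fear"}
print(check_row(row))  # -> []
```

A non-empty return value flags either a coding error or a codebook category this page simply has not surfaced yet.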
Raw LLM Response
```json
[
  {"id":"ytc_UgyXALY1dnlBCA-oZ-t4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxQgkxDyCkZX-tCleR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx79ywSX0Z_tAlsKpN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzSu7kAUMYix3TvOMx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxjPOVPBg9re8rohlh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw2fI1zA1u4aWteZf54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxsjtdsyiZ4Q1QvJiB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugyv9jyuR8cXywYnn4V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxMGrkvz7gjBOaqALV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx0NDcsAeJAvsWNGfl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
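The "look up by comment ID" view above can be reproduced from a raw response like this one. The sketch below is a minimal, hypothetical helper (not the project's actual code): it parses the response with the field names shown in the sample and builds an ID-to-codes index.

```python
import json

# Two rows copied verbatim from the raw response above, truncated for brevity.
RAW_RESPONSE = """
[
  {"id":"ytc_UgyXALY1dnlBCA-oZ-t4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzSu7kAUMYix3TvOMx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

def index_codes(raw: str) -> dict[str, dict]:
    """Map each comment ID to its coded dimensions, dropping the redundant id key."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_UgzSu7kAUMYix3TvOMx4AaABAg"]["emotion"])  # -> fear
```

Because each row carries its own `id`, the response can be joined back to the original comment text without relying on array order, which matters if the model drops or reorders items.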