## Raw LLM Responses

Inspect the exact model output for any coded comment, looked up by comment ID.

### Random samples
- “all of that is meaningless if there’s no originality behind it.” you talking h… (ytc_UgxwSYma5…)
- If a company worth 100 billion is asking for a government subsidy of a Trillion … (rdc_lp7r9zj)
- You could at least try to get the lighting right on your CGI robot. LOL… (ytc_UgygCoVJm…)
- I asked GPT-4 to write a second paper for my science class and it told me to f*c… (ytc_UgwKb3P-M…)
- “You have to think”. Another stupid AI promotion video offering solutions to non… (ytc_UgzrEPTIP…)
- The difference between an ai chud and a real artist is that a real artist can ma… (ytc_Ugwvg8nXn…)
- Easy, just disclose if you use AI or not? Why is this a problem in 2025? It's no… (ytc_UgxOmmdaM…)
- all those AI-hypers sound more and more like the Jehova’s Witnesses, who have be… (ytc_UgyKrgd3Z…)
### Comment

> As an IT-engineer, started way back in '86, the development of computers was equally intriging as it was scary. Long before any handhelds we're invented, I'd allready had an idea of what people needed, ease in control and everything at hand, preferably with smashing images. It wasn't too difficult to notice how people we're easaly sold tot hese gadgets. I've always stated AI would be the downfall of society as we kow it and would surely be abused by those in powers. In the end we will all live for alms and be happy... Loved the quote " people who really know how horrible the world is, usually escape pretty soon"
youtube · AI Governance · 2025-09-04T15:2…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response
[{"id":"ytc_UgwLck2PpIh1dV3ppFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgytpBC4CfMcwPh2qaB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxrD4sp9g1NeOpDa8t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRpm9K7Fv4qSVoJzt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxWrfvc2j8ZQOmC6i14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwT-2ngswaVQIsrFy94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyEqljw_Ob7dsRQHVJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwX8HN1ZtNXkV4U9K54AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwbhYkmlP8VhkB8Uvl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwXntOW3bQ7LkP-Hzt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}]
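Since the model returns one JSON array covering a whole batch, the per-comment coding shown in the table above has to be recovered by parsing that array and indexing it by comment ID. A minimal sketch of that lookup step, assuming the response is valid JSON in the shape shown (the function name `index_codings` and the two-element sample are illustrative, not part of the tool):

```python
import json

# A shortened example response in the same shape as the raw output above:
# a JSON array of objects, one per comment, keyed by comment ID, with the
# four coding dimensions (responsibility, reasoning, policy, emotion).
raw_response = """[
 {"id":"ytc_UgwLck2PpIh1dV3ppFF4AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgytpBC4CfMcwPh2qaB4AaABAg","responsibility":"developer",
  "reasoning":"mixed","policy":"none","emotion":"mixed"}
]"""

def index_codings(response_text: str) -> dict:
    """Parse a raw model response and index the codings by comment ID."""
    codings = json.loads(response_text)
    return {coding["id"]: coding for coding in codings}

# Look up one comment's coding by its ID.
by_id = index_codings(raw_response)
print(by_id["ytc_UgwLck2PpIh1dV3ppFF4AaABAg"]["emotion"])  # fear
```

In practice the parse can fail when the model wraps the array in prose or emits malformed JSON, so a real pipeline would likely guard `json.loads` and log unparseable responses alongside the raw text, which is what makes keeping the raw response (as this page does) useful for auditing.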