Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (previews and comment IDs truncated):

- ytc_UgzYXbKpQ… : "I don't recommend anyone go into 3D or digital art right now. Ay-Eye is terrible…"
- ytc_UgxtuX9KN… : "It's infuriating to listen to people talk out of one side of their mouth about t…"
- ytc_Ugymvr5J6… : "Hmmmmm what'll happen is that we will be replaced entirely in the workforce. AI …"
- ytc_UgzaCcvi8… : "It’s the same with AI art. It was bad. But slowly over time it will get better, …"
- ytc_Ugz2CxFGW… : "Holy shit an actually measured response to an episode of AI involvement in somet…"
- ytc_Ugx8pdcev… : "I make AI Images, but that shit isn't 'Art'. Anybody who thinks they are an 'Art…"
- ytc_UgxGQUmIp… : "It is pathetic that YouTube stoops to sleazy, tabloid-style clickbait like this,…"
- ytc_UgzRJOJu4… : "I don't believe the hype that AI is causing large scale job losses. It just isn…"
Comment
Computer programmer here. The real danger in AI is as follows.
1. AI will make controlling the populace wide spread and with scientific precision.
2. AI will confuse an already confused population about what is real and what is artificial. It will cause people to become even more delusional than they already are.
My conclusion as a programmer: if people aren't opposed to AI they have no idea that AI is as dangerous as an atomic bomb. AI will allow powerful people to gain Godlike power more so than we have ever seen in the history of mankind.
Source: youtube · Topic: AI Governance · Posted: 2026-01-05T14:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw0KLZHzjKSmcBmzSZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw7PMj-KGEcuXJsBux4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwcXYxhCgF2Ox13q7Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwBGBHIubfJmBpzxSF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw-nl2Y_FC-zsddL6N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwUJZn0uBzRAH1djQR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzvNtFgJuiumP_Gtep4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxaSuR3zMpbhYzVViN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzVRSuS8IRHjbcFAjF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxVgX9sbMRzB1Isva54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
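The "look up by comment ID" view above implies that the tool parses this JSON array and keys each coding record by its comment ID. A minimal sketch of that parse-and-index step, assuming the five field names shown in the response (the function name and validation logic are illustrative, not the tool's actual code):

```python
import json

# Two records copied from the raw LLM response above, as sample input.
raw_response = """
[
  {"id":"ytc_Ugw0KLZHzjKSmcBmzSZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxVgX9sbMRzB1Isva54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
"""

# Field names taken directly from the JSON shown in this section.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM coding response and key each record by comment ID.

    Raises ValueError if any record is missing an expected field, so a
    malformed model output fails loudly instead of being silently stored.
    """
    records = json.loads(raw)
    index = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing fields: {missing}")
        index[rec["id"]] = rec
    return index

index = index_by_comment_id(raw_response)
print(index["ytc_UgxVgX9sbMRzB1Isva54AaABAg"]["emotion"])  # fear
```

Keying on the full comment ID is what lets the truncated IDs in the sample list above resolve to a single record once expanded.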