Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm not sure of the legalities either my friend. But I'd imagine there's some re…" — ytr_UgydFMEEs…
- "No he isn't. Alex lied throughout. He kept saying ChatGPT said things it didn't.…" — ytr_Ugz1_O7Qb…
- "The devil is collecting data...he is jealous of man who has been given authority…" — ytr_Ugxm35GTH…
- "As a digital and traditional artist (school amirite) What...? How... Does digit…" — ytc_UgzXNumfl…
- "AI is not a black box. It is simple. Easily understood. The "black box" concept …" — ytc_UgwH__SQp…
- "Unless the human society aligns itself with altruistic values, Greed will domina…" — ytc_Ugw-4TZIq…
- "I really don't know much about this subject but i constantly fluctuate between a…" — ytc_UgwYPnkgN…
- "@CommanderRedEXE ok, wow, this is the first time i have been called a socialist,…" — ytr_UgyQVEbYE…
Comment
The modern global enslaver is afraid their slaves will run away.
If you want a global government that can dictate regulations, stop starting wars just for minerals, oil, or any other form of money.
If the US didn't support the colonial project known as Israel, there would be no radical Islamists. If NATO hadn't captured power by force after losing 3 elections in a row in Ukraine, Russia would be on your side. If not for the Korean and Vietnam wars and preparations for a Taiwan one, China would believe you are trustworthy.
Now all the world wants out of your slavery. And if they need to take risks with AI, they will take them and face the consequences when you are in ruins. Because the USA is the biggest threat to the world today, and AI is just a probabilistic possibility.
youtube · AI Governance · 2025-06-26T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwBHwU6rf_hlQKXunp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwfK00h37rq_kfWPY14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyiwgE9BnqEsGP0rWZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxLrjHqCfTqimo-C5p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEujlO0Ryc6kxfHvF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxexANNEsJM1KEPegB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzX4zd5JZ50g44Eo8R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugys457N9Qo-N8kTOUB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyWigPDbH3xWai962J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwX_lS5vGK4pHnvmPB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
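A raw LLM response like the one above is a JSON array with one object per comment ID. Looking up the coding for a single comment then amounts to indexing the array by `id`. A minimal sketch, assuming the array shape shown (the helper name `index_by_id` and the inline sample data are hypothetical, not part of the tool):

```python
import json

# Abbreviated sample in the same shape as the raw response above
# (only two entries, kept short for illustration).
raw_response = """
[
  {"id": "ytc_UgwBHwU6rf_hlQKXunp4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwfK00h37rq_kfWPY14AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and map comment ID -> coded dimensions."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_id(raw_response)
coding = codes["ytc_UgwBHwU6rf_hlQKXunp4AaABAg"]
print(coding["policy"])  # -> regulate
```

Indexing once and reusing the dict keeps lookups O(1) when checking many comment IDs against the same response.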