Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "let's think... why would a CEO of an AI company very publicly warn about the use…" (ytc_Ugw7ZwEBR…)
- "I actually stopped using DDG and fully transitioned to searching Kagi when they …" (rdc_n3xxuxe)
- "Yeah you know what? Even if ChatGPT in medicine isn't viable, all doctors absolu…" (ytc_UgwJOeSkg…)
- "I have not read it. I follow Reuben (@robomatica on twitter). He does robotics a…" (ytr_UghV0NZk5…)
- "AI is one of the most horrible creations ever made. The evil people in charge o…" (ytc_UgwGP0USX…)
- "Problem with AI taking jobs. Less ppl less money spent in the economy. Less mone…" (ytc_Ugz55QN2I…)
- "Are we sure that "solving alignment" will help us avoid disaster? If AI is a pow…" (ytc_UgxdNMxsQ…)
- "FIRST THING FIRST! Create an AI town with manufacturing, grocery store, other bu…" (ytc_UgyONINtS…)
Comment
This madman is out of touch with reality. He thinks AI will replace humans and we won't have to work, but he's forgotten that we live in a capitalist-feudal world where the elite simply won't tolerate this and will destroy humanity itself! And who will pay for the upkeep of billions of people?! Tomorrow we'll be subjected to artificial famine, war, and even faster, infection, all created by that same AI. We'll no longer be needed. I think that in the first stage, the rich will kill the poor because they will no longer need them; they will be a burden. They will use AI as a weapon of war. It will only be a matter of time before AI surpasses its rich masters.
Source: youtube · Topic: AI Governance · Posted: 2026-04-25T11:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxTo6_1kciFLCR1fxJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynXPbsdIL_RgNAEDx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQ4l3x_MDCyMu13bB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzVwhgwJIwMoJ5-mBV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz7piCZ54U99HIHlEd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxgfiUP-8YLx4OpgoB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzuwVevs0PBaZmG8U94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwmm7UqMHPVZwbWaFB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugx8qjN7JOfYGjuFc6J4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyYIWW4pMIxHEFfIV94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]
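A minimal sketch of how a raw batch response like the one above can be parsed back into per-comment codes. The allowed-value sets here are inferred only from the values visible in this response, not from the actual codebook, and `parse_codes` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of row objects)
    into {comment_id: codes}, dropping rows whose values fall outside
    the inferred codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if cid and all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

# One row taken verbatim from the response above.
raw = ('[{"id":"ytc_UgzuwVevs0PBaZmG8U94AaABAg",'
       '"responsibility":"distributed","reasoning":"mixed",'
       '"policy":"regulate","emotion":"outrage"}]')
codes = parse_codes(raw)
```

Validating against an explicit value set is what lets the "Coded at" table above be trusted: any row where the model drifts off-schema is rejected rather than stored.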