Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugy5k2ZJm… : Well, artists care about art. The majority of society seems to be happy to consu…
- ytc_Ugyn3UNtq… : So what is their plan for when money is no longer an incentive for government co…
- ytc_UgxzCT67C… : funny thing how the people who can't make an AI are the people who have these co…
- ytc_UgyfqddUN… : Hey, in a country where even don-OLD is AI-generated (see the speech for the dea…
- ytr_UgjqCEreM… : AI slowly fragment bits of data and need defragmenting to continue operating smo…
- ytc_UgwL3SHme… : WHO decided it is it's task to be engaging? I find this very disengaging...... Q…
- ytc_Ugw2e1bAB… : AI will outperform us at literally everything. Don't listen to people who tell y…
- ytc_UgxS4O6yi… : AI IS 100% PARASITIC.......THE CORPORATE LEGAL F I C T I O N "PERSON" IS…
Comment
Goodness. Very nice guest and enjoyable interview, but he is not feeling the AGI. I would suggest that he watch Michael's "Lethal AI Guide - Part 1" primer on existential risk, just to open up the imagination to the wide range of outcomes that may be possible with an uncontrollable superintelligence on board. Doom is doom, but there's something different about the speed at which this is approaching, and the sheer scale and range of possibilities involved. Although I liked this guy, I would respectfully suggest that the Bulletin of the Atomic Scientists (Doomsday Clock folks) are in a position of responsibility to convey this threat with greater clarity than perhaps was accomplished.
youtube · AI Governance · 2026-02-24T22:2… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyguxlSmhlIKh4gZdd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgweDRYen7rHTPUc3lR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgywEqcCcgDCABDNsJt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwWtSy0N1tjtMhxcZd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyVfVxsND3Ua3tNcqV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwrISWJ7hLjSvcP1Zd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwyK5F1g2Q8-W0m5Wl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzGSkrwJrn7aXPYS454AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzlL3VQZqpQHX0KRBN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw0R9HqqG275eEcUxt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
```
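The raw response is a JSON array in which each object carries a comment ID plus the four coding dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and keyed by comment ID for per-comment lookup; the `RAW_RESPONSE` constant and the `index_by_id` helper are illustrative assumptions, not part of the tool itself:

```python
import json

# Two entries copied from the raw response above; a real response holds the full batch.
RAW_RESPONSE = """[
  {"id": "ytc_UgwWtSy0N1tjtMhxcZd4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyVfVxsND3Ua3tNcqV4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# The four coding dimensions reported for every comment.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: {dim: rec[dim] for dim in DIMENSIONS} for rec in records}

coded = index_by_id(RAW_RESPONSE)
print(coded["ytc_UgwWtSy0N1tjtMhxcZd4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup cheap: one parse of the batch, then constant-time access per comment.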