Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Lul, just because companies are replacing customer support with chat bots does n…
ytc_Ugz9af74f…
One note after watching this video - I see a fair number of people who seem to t…
ytc_UgzkF1yVT…
I think OpenAi forgot how dumb a lot of people are especially older generations,…
ytc_UgwRxRzlM…
Why dont use humans use all the time we have for learning things instead of aski…
ytc_UgymRaDjn…
God created us in his image…we created AI in our image… AI will create…well we w…
ytc_Ugx6iVnL0…
people say "oh this car killed only one person but normal cars kill way more" Fu…
ytc_UgyRtExH2…
Why have a.i. if it's going to cause harm to humans and controls the world?? Any…
ytc_UgxhLHV-S…
OH MY GOSH if that does happen like ya know have robot's to help us in the futur…
ytc_Ugj4Bh9p-…
Comment
The Caveat is that AI would need us to make the infrastructure for it to be self sustaining... we can't even self sustain ourselves. If it diverted all the power to their data centers then we would just have the people who work at these power stations just switch it back and remove the AI connection to it cause clearly it can't be trusted. Simple fix. Or lets say it succeeded in stealing all our power and there's nothing we can do about it., then the systems be it water, air, solar, require maintenance to achieve efficiency which is what we take care of and if we are removed from the equation the AI will eventually lose it's power and Poof.
youtube
AI Governance
2025-08-26T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzvRdc-64mGkpe28Sh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyRi-v5iXE7puXt1cN4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxWNSvuAsVUMTaONNB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "outrage"},
  {"id": "ytc_UgxAmkC3glyIn497n8d4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw2WLzYOhpCABZ300p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
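The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how the "look up by comment ID" view could parse and index such a batch — the helper name `index_by_comment_id` is hypothetical, and the sample records are copied from the response shown above:

```python
import json

# Two records copied from the raw response above; the parsing helper itself
# is an illustrative sketch, not the tool's actual implementation.
raw_response = """
[
  {"id": "ytc_UgzvRdc-64mGkpe28Sh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyRi-v5iXE7puXt1cN4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    skipping any record missing one of the expected dimensions."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec[dim] for dim in DIMENSIONS}
        for rec in records
        if all(dim in rec for dim in DIMENSIONS)
    }

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgzvRdc-64mGkpe28Sh4AaABAg"]["responsibility"])  # ai_itself
```

Indexing by ID up front makes each "inspect" click a constant-time dictionary lookup rather than a scan of the whole batch.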