Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "The big difference is that you've listed tools that must still be utilized by pe…" (rdc_gva7j72)
- "That's the idea. You let the AI make decisions that you don't want to be held re…" (rdc_mel5eth)
- "@thatguyonyoutube989 Yes, I can. Typically in mistakes carried over from the AI …" (ytr_Ugz6QC2X7…)
- "I think AI is both incredible and absolutely terrifying. Ive had AI make some im…" (ytr_UgwuYFAcB…)
- "Artificial Intelligence will be the end of Freedom. Because is will be used by t…" (ytc_UgyIO0AhF…)
- "The only jobs that will remain are those you are setting up courses to? Thats li…" (ytc_UgxxT8Nyv…)
- "I think many of us would actually prefer fighting Terminators instead of fightin…" (ytc_UgwDayMFE…)
- "Show me a *startup* that doesn't lose money. That is what *startup* means. You a…" (rdc_jrpflms)
Comment

> Why does the Super Artificial Intelligence at a God level wants to kill us?
> Maybe it would be just like with the "Olympian Gods" or just like now with all religions still in place but with equity and secular ethics applied all over the planet and of course the psychopaths and all people lacking empathy, locked somewhere and trying to get them all right in their brains!
> Am I to optimist in this simulation?✌️😎

youtube · AI Governance · 2025-09-04T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxYb34hqrBT67Bcs_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwtzAOSq4dCJbqmgrN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgycdhagAY8l_HHI8el4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxU6Qnp2pTjFMNkuot4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxtZymCAuNdAkHUJ4J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxI4WEE2pkR69peusR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw9dZZuoVgfOUBtXed4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyvYKzE2KAJCfU4oVR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzO-wXnuUWleSW-FXl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz6ahEGMgL9QCdyA3B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
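As a sketch of how a raw response like the one above could be indexed to support lookup by comment ID, assuming only the JSON shape shown (the function name and the shortened sample records here are illustrative, not part of the tool):

```python
import json

# A small sample in the same shape as the raw LLM response above:
# a JSON array of coding records, one per comment, keyed by "id".
raw_response = """
[
  {"id": "ytc_UgycdhagAY8l_HHI8el4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxU6Qnp2pTjFMNkuot4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the raw model output and index each coding record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgxU6Qnp2pTjFMNkuot4AaABAg"]["policy"])  # → regulate
```

A lookup that raises `KeyError` for an unknown ID is the simplest behavior; a production tool would likely return a "not coded" placeholder instead.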