Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
What about Terminator? The SciFi scenario is there... Hollywood like solution looks useless though.
Or Matrix? Although superinteligence using human as energy source seems even more crazy.
But yes, if you think about it - somewhere in the year 5555+/- it might end, in a war between AI/ robotics and human. The question would be which of them is "ready" for total destruction, including its own existence? None/ AI/ human/ both? Sheet happens.
youtube · AI Governance · 2026-03-10T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz8t3irTrwJm7BOuJd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugym6nTInYBUxaci7o14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw5IDiOr26Dt3QtqxB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzs31pjTfDLgBtxAMN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgziLQ2SVYkeThhg6T54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVcL62KMwShTbUuoh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxo01rVEJ1z6WBT64Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyrh-icSCC53cPYCA14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxQDGkF-yjImgUlOXt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzt4426qNqR9M655JJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
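Because the raw response is a plain JSON array of per-comment codings, recovering the coding for a single comment ID takes only a few lines. A minimal sketch in Python (the `codings` dict and the truncated `raw_response` sample are illustrative, using two entries copied from the array above):

```python
import json

# Raw LLM response: a JSON array of coded comments (two entries shown).
raw_response = '''
[
  {"id": "ytc_Ugz8t3irTrwJm7BOuJd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugym6nTInYBUxaci7o14AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]
'''

# Index the codings by comment ID so a lookup is O(1).
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugz8t3irTrwJm7BOuJd4AaABAg"]
print(coding["emotion"])  # fear
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a reasonable place to flag a response for manual inspection rather than silently dropping it.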