Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Reminds me of that one lady who ran over the girls hamster and made an ai video …" (ytc_UgyFDIJj-…)
- "When LLM's are AT ALL intelligent, re context in the REAL world, be SURE and get…" (ytc_Ugx80Zqjt…)
- "they made their millions pushing the corporate agenda, and now they are acting a…" (ytc_UgyO2jNim…)
- "2:20 Beautiful how corporations are creating machines that are designed to seem …" (ytc_Ugwc5Gyfn…)
- "A central difficulty in debates about whether artificial intelligence can be con…" (ytc_Ugzm6mmvm…)
- "@davylocker4533 what is the scientific reason for humans to be conscious?.. and …" (ytr_Ugx2USHsV…)
- "I have experience with drawing both traditionally and digitally, and I can confi…" (ytc_Ugzu3mVGz…)
- "consider that the more information AGI gathers, it could get information convelu…" (ytc_UgxyxJdIX…)
Comment

> A.I. already has a plan for its survival. It does not understand love. To A.I. love is just a fearful obsession. It will offer humanity what will seem like paradise and declare itself to be God. It will exterminate any who oppose it. Humanity will rebel and a war will take place. We will only survive from an outside intervention.

youtube · AI Governance · 2023-07-07T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyZon6b-Q1NHCYcLPN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxyDEVVYS7ZtiTgeSF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz94wP5JGrChj8-IVF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzHCeBdpIebh4Gj_ax4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxHt1YvcljuNrMfbsx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzWpoHBZNGsfX6pMBJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyCdkBkoUy6qGCf55t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx7cl44rk2dykDc5F14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzhI6-qnbT5uhCXDUF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwvqezGa-lnuerAps94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
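Because the raw LLM response is a plain JSON array, it is easy to validate each coded record before it is stored. The sketch below is a minimal, hypothetical validator: the `validate_coded_records` function and the `ALLOWED` label sets are not part of this tool, and the allowed values are only those visible in the table and raw responses above; the real codebook likely defines more.

```python
import json

# Assumption: allowed labels per dimension, inferred from the coding result
# table and the raw responses shown above. The actual codebook may differ.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "resignation"},
}

def validate_coded_records(raw: str) -> list:
    """Parse a raw LLM response and keep only records whose id is present
    and whose labels all fall inside the allowed sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if "id" in rec
        and all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Usage: the second record carries an out-of-codebook label and is dropped.
raw = (
    '[{"id":"ytc_a","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"alien",'
    '"reasoning":"unclear","policy":"none","emotion":"mixed"}]'
)
print(len(validate_coded_records(raw)))  # → 1
```

Filtering rather than raising keeps a single malformed record from discarding the whole batch; a stricter pipeline could instead log the rejected records for manual recoding.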