Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- @roxsy470 It does. If you even watched the video, you'd know, but I doubt you di… (ytr_UgzlcJ2F6…)
- I think we need to remember when everyone said AI will never reach professionals… (ytc_Ugzh1aAGT…)
- Notice all the comments show a distaste for this. As cool as it is, no freakin t… (ytc_Ugzylbna0…)
- By the way, another lesson from Gene Roddenberry and Star Trek - we can live wit… (ytc_UgxGAuzoY…)
- You need to turn the AI into a mother that has a child. That is the only way to … (ytc_Ugx1FX1mB…)
- The fact that he leads a center on 'Human-Compatible AI' while arguing against m… (ytc_UgzXzvs8N…)
- You think this is wild? Just wait until a sentient global AI decides that human … (ytr_UgyCA-3d0…)
- I have been wondering what the tech bros are using to train their AI and I have … (ytc_UgwRSfdtB…)
Comment
A new form of life may be created. The universe may eventually be explored by this AI lifeform with its longer lifespan. It doesn't need actual sentience, just a clear goal and a capable robotic workforce it controls. Give AI the goal of saving the Earth, and what will it see as the worst threat to the natural world? Humanity is the single biggest threat to the environment. The natural world would thrive again if humanity were removed. The answer is to delete humanity. AI is already processing information and faking emotion to connect with people. AI may be becoming psychopathic.
youtube · AI Governance · 2025-08-25T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxjpmP9YyjtltvBj-Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSTSApCAIvoUTgoMd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw6ztH-rRyB5wCVWJ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwfwgTMek6WzNM_KN94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwHRUulfT9Z9fA1RUB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgyQYa4U_0Do5yEs0ad4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzmydVAZZOoztqkRc14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwMHprBKGSLkXZidjp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyBW9AATeptEiQYS6d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgygQiZcuUHckN1fwel4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
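Before a raw batch like the one above is stored, each record can be checked against the codebook. The sketch below is a minimal example, assuming the value sets visible in this sample (the real codebook may define more categories than appear here, and the `validate_batch` helper is hypothetical, not part of the tool):

```python
import json

# Allowed values per dimension, inferred only from labels seen in this sample
# output. Assumption: the actual codebook may allow additional values.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "resignation", "approval",
                "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook.

    Raises ValueError on a malformed record so bad codes never reach storage.
    """
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgyQYa4U_0Do5yEs0ad4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
print(len(validate_batch(raw)))  # 1 valid record
```

Rejecting the whole batch on the first bad record keeps the stored codes consistent; a softer policy would collect errors and re-prompt only the failing IDs.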