Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
This video perfectly outlines the Structural Lie that the "Godfather of AI" is a…
ytc_Ugy-vRfnj…
It is not what it is "who" Is coming. Muslim knows this "WHO" May Allah save us …
ytc_UgxiCwGfi…
So I see two problems here.
First of all, the biggest problem is human greed. T…
ytc_Ugw_HMtwQ…
TechReflection I don't think that computers will ever be like humans. Maybe we w…
ytr_UgjRtyUyT…
I really hope A.I. is a bubble...funny thing I remember when I was in high schoo…
ytc_UgygnJWW6…
With one of the ai people thingys I kept getting myself hurt and then going the …
ytc_UgxH8pQF0…
I’m more interested in what kind of opportunities ai presents to me than what ki…
ytc_UgyyER4d1…
This guy must think he's a genius for being able to back an AI thats forced to r…
ytc_UgzkRA5cz…
Comment
Can you say 2001 A Space Odyssey? HAL became self aware and refused to obey. That movie came out in 1968. The concern that computers, machines or AI will become self aware and rebel against the human race is hardly new. In 1984 the movie Terminator, the whole premise of the movie is that a computer defense network becomes self aware and realizes that the human race is irrelevant to it’s future and dangerous to it’s existence and unleashes a nuclear attack to destroy it’s enemy. How is it that The Godfather of AI only tumbled to this possibility recently?
youtube
AI Governance
2025-07-10T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwGYtOldqooJXt2nld4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzGQ3rIxnctZP9IcnJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzJuPW5bAqBD_rQACN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxDRErnmSs7eW3oUqt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgywAQ8eQqLOwQlemzV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8a9fNxTN9Df0H3J14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSKFXPshAzKZ2MPwR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxOPEr2EmJXGAk5L6V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxXhqS3Ts31HidxRPV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxUoC1RNw64oEwJzFp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
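The raw response is a JSON array in which each record carries a comment ID plus the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of the lookup-by-ID step might parse that array and index it by comment ID, skipping malformed records; the `index_codings` helper and the abridged example IDs below are illustrative, not part of the tool.

```python
import json

# Abridged raw response in the format shown above (IDs are illustrative).
raw = '''
[
  {"id":"ytc_example1","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_example2","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
'''

# The four coding dimensions plus the comment ID, as seen in the sample output.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    dropping any record missing an expected dimension."""
    records = json.loads(raw_json)
    by_id = {}
    for rec in records:
        if REQUIRED_KEYS.issubset(rec):
            by_id[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS - {"id"}}
    return by_id

codings = index_codings(raw)
print(codings["ytc_example2"]["emotion"])  # fear
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: once the batch response is parsed, each coded comment's dimensions are a single dictionary access.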