Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_Ugz_SCxTo…`: "As much as I hate to admit it I've used AI image generators and all of them have…"
- `ytc_UgwUNbc3a…`: "Another “we’re all going to die” tour. Just stop working on AI if you know it’s …"
- `ytc_Ugzwi_RhG…`: "For thr last two years ive been making a table top rpg game, for two years ive b…"
- `ytc_UgzlsUPyg…`: "fantastic !! i believe the best video / content on Youtube if you are looking fo…"
- `ytc_Ugzw92vMB…`: "What if the ai just predicted an outcome that became a self fulfilling prophecy?…"
- `ytr_UgwTKueYQ…`: "It's getting obvious that human societies need to devise new systems of distribu…"
- `ytr_UgyK04ssS…`: "@agastyajain12 you make a definite statement without even knowing what it is tha…"
- `ytc_Ugyopd9JL…`: "And let me guess... the Government should regulate AI and have a monopoly on it.…"
Comment
It's an interesting interview, but I’m not buying the AGI-is-5-years-away hype. These models are great at mimicking patterns, not thinking. We still don’t understand consciousness, and scaling LLMs isn’t the same as building a mind. Feels like we’re mistaking prediction for intelligence. Real AGI, if it’s even possible, is likely generations away. Real AGI would need to "be"... like, actually exist as a conscious entity. It would need to learn values through experience and growing up, not get preloaded with “good ones” (whatever that even means). When Demis talks about “programming values into AGI,” he kind of gives away the game. That’s not general intelligence - that’s just software.
Platform: youtube
Timestamp: 2025-06-18T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxLE_54VV2aW9aiMQB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzCz2aN3flY0qy74DF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyyvVQt2SInrDetBzx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwSmxNAwIcz3nwBLM94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwGn6kIrO4zEiMk7ah4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuxJakIBYDIPnup6F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxql3FlrgOvovW82_Z4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgydHiBRYqsbbarVC514AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwsvmlDCBR8XM3hBnl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxScnQJKDlrXTCR8Ql4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
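Each raw LLM response is a JSON array of per-comment codes, so inspecting the model output for a given coded comment amounts to indexing that array by `id`. A minimal sketch of that lookup, assuming the response parses as valid JSON (the two rows below are abbreviated from the response above; only the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` are taken from it):

```python
import json

# A raw LLM response: one JSON object per coded comment.
raw_response = """
[
  {"id": "ytc_UgxLE_54VV2aW9aiMQB4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzCz2aN3flY0qy74DF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

# Index the codes by comment ID so any coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding result for a single comment.
code = codes_by_id["ytc_UgxLE_54VV2aW9aiMQB4AaABAg"]
print(code["emotion"])  # outrage
```

In practice the response text may carry markdown fences or trailing commentary from the model, so a production parser would strip non-JSON wrapping before calling `json.loads`.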