Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's an interesting interview, but I’m not buying the AGI-is-5-years-away hype. These models are great at mimicking patterns, not thinking. We still don’t understand consciousness, and scaling LLMs isn’t the same as building a mind. Feels like we’re mistaking prediction for intelligence. Real AGI, if it’s even possible, is likely generations away. Real AGI would need to "be"... like, actually exist as a conscious entity. It would need to learn values through experience and growing up, not get preloaded with “good ones” (whatever that even means). When Demis talks about “programming values into AGI,” he kind of gives away the game. That’s not general intelligence - that’s just software.
YouTube 2025-06-18T14:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgxLE_54VV2aW9aiMQB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzCz2aN3flY0qy74DF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyyvVQt2SInrDetBzx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwSmxNAwIcz3nwBLM94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwGn6kIrO4zEiMk7ah4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxuxJakIBYDIPnup6F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxql3FlrgOvovW82_Z4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgydHiBRYqsbbarVC514AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwsvmlDCBR8XM3hBnl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxScnQJKDlrXTCR8Ql4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
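To inspect the coding for a specific comment, the raw response can be parsed as JSON and indexed by comment id. A minimal sketch, assuming the raw response is valid JSON (the array is truncated to two entries here for brevity):

```python
import json

# Raw LLM response as captured above (first two entries only).
raw = '''[
  {"id":"ytc_UgxLE_54VV2aW9aiMQB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzCz2aN3flY0qy74DF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

# Index each coding record by its comment id for O(1) lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding for the comment shown in this section.
coding = codings["ytc_UgzCz2aN3flY0qy74DF4AaABAg"]
print(coding["reasoning"])  # consequentialist
print(coding["emotion"])    # indifference
```

The printed values match the Coding Result table above, which is one quick check that the table was derived from this raw response.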