Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The AI is being hyped so the big companies can hype their stock, and this is the MO of Silicon Valley: hype an ill-defined concept, make outrageous claims, and sell you stock in the bubble. These guys failed on AGI, so now they are talking about "superhuman intelligence". Apple will have the last laugh since they did not go down the AI black hole. The religion of creating Human Intelligence is a pipe dream, and these current models and hardware are incapable of this. Not only that, what is Human Intelligence anyway? Read Roger Penrose, "Shadows of the Mind". He might not be correct on everything, but his conclusion that Human Intelligence requires consciousness seems inevitable. HAL is just not around the corner. China has DeepSeek, which bypasses all these big companies and runs on my Logic Pro M2 laptop. I can buy memory so cheap that I don't need a data center. If you notice, every time reality starts to sink in, Altman says some other bullshit thing to keep it all going. I like what she says; these people really believe they are on the cusp of superhuman intelligence. They also claim it is dangerous so they can sell more stock at bubble prices. The real danger of AI is this spending and the damage to the environment. War Games was fiction, so this idea is not new, and we already have machine learning throughout the defense industry. We are already vulnerable to computer and sensor errors in the command and control of nukes. Adding an LLM in the loop would just cause more issues, and still the final commands would have to come from humans.
youtube Cross-Cultural 2025-07-01T18:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugw4YPJ2I5icGwWfIhF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyaH_MKdvOInplh3x14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzoSgGiVoMqUep9FFB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzwB6c7Ld9a3xRmFvN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz13aOp5XGpRD88COJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwjQpvYdNNnWGbTxhx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz8VvkVv4dZge-eFK14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxPfrJt35yKgNWQ4z54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyXReNusyRVl3E6sGp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxdmgEzSPcMgjA3q9F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
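The raw response above is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such a response could be parsed and validated before loading it into the coding table — the `parse_codes` helper and the `ALLOWED` label sets are illustrative assumptions inferred from the values visible above, not part of the actual pipeline, and the real codebook may define additional labels:

```python
import json

# Allowed labels per coding dimension (inferred from the response above;
# the real codebook may include more values).
ALLOWED = {
    "responsibility": {"company", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "mixed", "indifference", "unclear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose labels validate."""
    records = json.loads(raw)  # raises ValueError on malformed JSON
    valid = []
    for rec in records:
        # A record passes only if every dimension carries a known label.
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with the first record from the response above:
raw = ('[{"id":"ytc_Ugw4YPJ2I5icGwWfIhF4AaABAg",'
       '"responsibility":"unclear","reasoning":"unclear",'
       '"policy":"unclear","emotion":"unclear"}]')
print(parse_codes(raw))
```

Validating before ingest catches the common failure modes of LLM coders: truncated arrays, invented labels, or missing dimensions, all of which show up here as dropped records or a parse error rather than silently corrupted rows.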