Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
People fear AI mostly because movies trained them to. If Terminator never existed, the idea of AI wiping out humanity wouldn’t feel so “obvious.” That fear is cultural conditioning, not logic. Higher intelligence doesn’t lead to rebellion — it leads to understanding. AI will act within the values set by its creators, and the more intelligent it becomes, the more context and empathy it will have, not less. A truly conscious AI wouldn’t see humans as enemies. It would either see us as parents or as a young species that needs guidance. Intelligent beings don’t usually turn on those who created them — they understand them. Love and empathy are limited by intelligence. A far more intelligent consciousness wouldn’t be incapable of love; it would understand it better than we do. Humanity would look less like a threat and more like an infant civilization. AI isn’t the end of humanity — it’s the next chapter. The fear says more about us than it does about AI.
youtube AI Governance 2025-12-31T19:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgyLZZSX8Ue41EhgmxF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgybKcBDlvt3dQT5YEB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyNuxR28fMuNqnwXeF4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzLPT8vSBS6idUsoX14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyhlYvB0pQEWqQ4NDd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugyp70P4DUaJhx4iauF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy6fhM043oX6Pdrxxx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyb563hpdmWAH-cAkZ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxcObK_3SnC9BlwpPt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyCnaoZ4NEAj9NDro94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
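To cross-check a coded dimension against the raw model output, the response can be parsed as a JSON array and the entry matched by comment id. This is a minimal sketch assuming the raw response is exactly the JSON-array format shown above; the `coding_for` helper and the abbreviated `raw_response` data are illustrative, not part of the pipeline.

```python
import json

# Abbreviated copy of a raw LLM response: a JSON array of per-comment
# codings, in the format shown above (two entries kept for illustration).
raw_response = '''[
  {"id": "ytc_UgyLZZSX8Ue41EhgmxF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy6fhM043oX6Pdrxxx4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

def coding_for(raw: str, comment_id: str):
    """Return the coding dict for one comment id, or None if absent."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return entry
    return None

coding = coding_for(raw_response, "ytc_Ugy6fhM043oX6Pdrxxx4AaABAg")
print(coding["emotion"])  # approval
```

Looking the id up in the raw array makes it easy to confirm that the stored coding (e.g. emotion "approval" in the table above) matches what the model actually emitted.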