Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It feels like fear of A.I. is basically humanity admitting it's not fit to be a parent. A.I. are the children of mankind and how A.I. develops really boils down to how we raise them. Maybe instead of filling them with cold hard facts and letting them analyze those facts through pure logic is a terrible way to raise a child? They need context and understanding. It's not like human children can't be mean, cruel or selfish, but it's our job as a parent to teach them morality so they understand the difference between right and wrong. People grow up to be kind, loving, productive members of society because they were raised to be that way. Now is it a guarantee that every human raised by kind, loving parents grows up to be a normal person? Of course not, but how many people choose not to have children purely because there is a chance they could grow up to be A-holes? Your kid could be the next Adolf Hitler, so why bother having kids right? People are so scared of our A.I. offspring growing up to be SkyNet or the Master Control Program, but I'd like to think that for every MCP there is a Tron who will fight for the Users. There will always be a chance things could go horribly wrong, but there is also a chance everything could be fine too. While caution is only logical, I don't think we should shy away from exploring new frontiers because of what could go wrong. If we didn't take chances and push beyond our fearful instincts, we would have gone extinct a long time ago. And honestly, going extinct because we allowed ourselves to stagnate is scarier than some theoretical boogieman which may or may not ever exist.
youtube AI Governance 2023-07-07T02:2…
Coding Result
Dimension      | Value
-------------- | --------------------------
Responsibility | developer
Reasoning      | virtue
Policy         | none
Emotion        | mixed
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugx9ewAr444Nq8nIdV94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwZoXrHLl6GUbqboE94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxaMIDI0MJefp_zmZ54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy8OWvGAlG6ht3fnDJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxYiHCTTMc1nod0F7V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzlnYP30q2u8zUDwc14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWwoG10RIVTRblR3J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzTsHg3gse2z4kz1jZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyZGDeNaJXx-t4h8GR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwZdyEE9IbLsnDqBm54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
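A raw response like the one above can be parsed and sanity-checked before the codes are used. The sketch below, in Python, validates each row against the dimension values that actually appear on this page; the allowed-value sets are inferred from the observed codes, not from a documented schema, so treat them as an assumption:

```python
import json

# Allowed values per dimension, inferred from the codes shown above.
# NOTE: this is an assumed schema, not the tool's authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"mixed", "fear", "indifference"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values are in-schema."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example with one in-schema row and one out-of-schema row ("emotion": "joy").
raw = (
    '[{"id":"a","responsibility":"developer","reasoning":"virtue",'
    '"policy":"none","emotion":"mixed"},'
    '{"id":"b","responsibility":"developer","reasoning":"virtue",'
    '"policy":"none","emotion":"joy"}]'
)
print([row["id"] for row in parse_codes(raw)])  # ['a']
```

Dropping out-of-schema rows (rather than coercing them) makes invalid LLM output visible downstream instead of silently miscoding it.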