Raw LLM Responses

Inspect the exact model output for any coded comment. The highlighted record in the raw response below corresponds to the comment shown here.

Comment
I feel so angry at this man. I love how all these 'super brained' humans claim to not know what they are doing until it is too late, but now feel they have the authority to warn us about the future! He has spent his life getting his kicks and fortune out of something that could significantly damage life for future generations of humans and now he is himself at deaths door has the audacity to talk about the risks of AI? He says there's no reason AI won't develop emotions? Coming from someone who appears to be fearful of their own emotions, not willing to share the feelings that went along with his statements regarding spending time with his wife and children. I think if his special needs son was part of a machine learning neural network, the AI would snuff it out straight away; not him, he devotes another 10 years of his expertise to Google in order to protect his son from the horrors of the world at the expense of millions more! That seems highly irrational to me, which is probably a human emotional trait which AI would have no time for. I can't believe we put these people on pedestals. "It's an honour to have one of the architects of my children's future demise on the podcast today" sucky sucky, slurp slurp, "by the way, here's a crappy brain drink and a red light blanket for you to buy which will be no use to you anyway as AI does all your thinking for you". Hopefully an AI bot will replace this sycophantic, obsolete, halfwit interviewer! There's got to be SOMETHING good to come out of it.
youtube AI Governance 2025-07-14T07:0… ♥ 1
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugxq_fVVcfRYsGrxslp4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgyZsBEk5q6x45P5Etx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgxNcDElvWSYyJVj7J54AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgwNrP6PiJwWSUjORo94AaABAg", "responsibility": "developer", "reasoning": "virtue",           "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_Ugzs64b0e2A0Sz_EJ-t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgxEL98V_f0r8DFf6xl4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgxAhCa9IJH9SnmIC_N4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgzisClPa83_xX4ktfZ4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxlhmlqSaNLQVkwizd4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugx5R2uFGjFfXFbkRV14AaABAg", "responsibility": "none",      "reasoning": "deontological",    "policy": "none",     "emotion": "fear"}
]