Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yes, gunpowder will supplant the bow and arrow. I'm not sure we should listen to their fear mongering about AI in general. Hawking doesn't just fear military AIs; he wrote a whole essay about how AIs will be able to constantly redesign themselves, evolve quickly to be more intelligent than us, and basically eliminate us. I think he spent some time binge watching the Terminator movies. The flaw in his logic is the assumption that becoming more intelligent than us is necessarily followed by eliminating us. We don't know what being smarter than us even looks like. It seems to me that beings that are smarter than us might come up with some sort of better plan than killing us off. Why would we even be in direct competition with them? We're organic; they would be mechanical/electrical. We aren't even going to be competing for the same resources. It may be the case that coexisting and cooperating will be the more beneficial and intelligent way to deal with the beings we created, and for those beings to deal with us. Being afraid of an inevitable future that is clearly coming soon is dark ages thinking. Besides, intelligence isn't everything.
youtube 2015-07-30T15:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgiG1VbD93Hl9ngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugjmk0vQ39_GpXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgjlP3MMVlkjBHgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UggD7tYfVbQtU3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ughm6vEeTLi9RXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugiw0vwfohKCq3gCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgjrD7whMK2ahXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjXz5wvV6sOe3gCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UggRoh03TKwiPXgCoAEC", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UggS5-aiz9SI73gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"}
]
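A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is an assumption about how such validation might look, not the tool's actual pipeline; the allowed value sets are inferred only from the records shown here, so the real codebook may contain more categories.

```python
import json

# Allowed values per coding dimension — inferred from the records shown above
# (assumption: the full codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"indifference", "mixed", "outrage", "fear",
                "approval", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset start with the "ytc_" prefix.
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with one record copied from the response above.
raw = ('[{"id":"ytc_UgiG1VbD93Hl9ngCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(len(parse_codings(raw)))  # 1
```

Records with an unknown value are dropped rather than repaired, which keeps the stored codes consistent with the codebook at the cost of re-prompting for the rejected comments.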