Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
The only reason AI clever enough to kill humanity would kill humanity, is that they are so human they would find us so much as a threat to them that we think they are to us. If we ever make a fully sentient AI, we must first make sure we are not a threat to them. Then, they will not be a threat to us. Obviously, psychology is more complex than what I can fathom on a Thursday. However, it might be built on rather simple principles, something that can be recreated with code. You can break human psychology down to binary seeing as though the world consist of a set amount of quark. So AI psychology would be just as complex as it`s human counterpart. We don`t need to be killing machines, nor do AI.
youtube 2015-07-30T10:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UghzZ-K8T33kwHgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggDitZp2No-4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggBUWzgbO_wAXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjGKqitLzXsyXgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgirTe4Oz4K71ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugijts-RK_hu6ngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjWmK_f0a2WkngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjBaFAfZPlcj3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghpZV-KGnAD-ngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugi-b3JxHJjEVngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
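A response like the one above can be consumed as a JSON array of per-comment records. Below is a minimal sketch of how such a response might be parsed and validated before the codings are stored. The allowed values per dimension are inferred only from the labels visible in this response; the actual codebook may define additional categories, and `parse_codings` is a hypothetical helper, not part of any real pipeline shown here.

```python
import json

# Allowed values per coding dimension, inferred from the raw response
# above (assumption -- the real codebook may include more categories).
CODEBOOK = {
    "responsibility": {"ai_itself", "none", "developer", "company"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"mixed", "approval", "fear", "outrage"},
}


def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record.

    Raises ValueError if a record is missing its id or uses a value
    outside the assumed codebook, so malformed model output is caught
    before it reaches storage.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id'")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: unexpected value for {dim!r}: {rec.get(dim)!r}"
                )
    return records


# Usage with a one-record response in the same shape as above:
raw = (
    '[{"id":"ytc_example","responsibility":"ai_itself",'
    '"reasoning":"mixed","policy":"none","emotion":"mixed"}]'
)
codings = parse_codings(raw)
print(codings[0]["responsibility"])  # -> ai_itself
```

Validating against a fixed vocabulary is worthwhile here because LLM output is not guaranteed to stay within the prompted label set; rejecting out-of-codebook values early keeps the coded dataset consistent.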