Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If you are not afraid, you did not understand it. Only computer experts know what is coming; physicists have no idea. By the way, his definition of AI was wrong (he actually defined something like computerization). I wonder if he could give a good definition of "software engineering" (I wonder if I could give one). Software engineering was an ill-structured concept to begin with ... but engineering is the only weapon we could think of against monstrous complexity. Now we are moving into a new (actually formerly neglected) paradigm called "machine learning". The problem is that there is no proper engineering here. "Intelligence" is kept in a "black box". We can experiment with it, keep statistics, make good guesses. We can never really know (or design). I am sure Mr Tyson has heard of Gödel's Incompleteness Theorems; morally, they are computational equivalents of physics' second law of thermodynamics. I mean the "If you are not afraid, you did not understand it" type of thing ... Machines will take control too early .. too early ..!
YouTube — AI Moral Status — 2025-10-07T10:0…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyDtupO9bmltIr2M7N4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwR8n1RS7C1QEWDnYx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyNiygHMsonXzIEeuZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxRxdFpGrBn0NrCX6J4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwnlU8EL3XRvzkVP7N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxyF_zMoS82yaMyy694AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx7IWMhFVEWmx_oxDV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzYf5000temkNKkiWB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxz0Uj7CK4Vtqf3rih4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwHeNwep2Zfve0OQ1V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
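For readers inspecting such raw responses programmatically, here is a minimal Python sketch showing one way to parse the model's JSON array and index it by comment id to recover the coded dimensions for a single comment. The two records and their field values are copied from the raw response above; how the actual pipeline performs this lookup is an assumption, not shown in this record.

```python
import json

# A fragment of the raw LLM response: a JSON array, one object per comment,
# coding four dimensions (responsibility, reasoning, policy, emotion).
raw = '''[
  {"id": "ytc_UgyNiygHMsonXzIEeuZ4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxRxdFpGrBn0NrCX6J4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "outrage"}
]'''

# Index the array by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding for the comment shown above.
record = codes["ytc_UgyNiygHMsonXzIEeuZ4AaABAg"]
print(record["policy"])   # regulate
print(record["emotion"])  # fear
```

Validating each record against the expected key set before indexing (e.g. rejecting rows missing any of the four dimensions) would be a sensible addition in a real pipeline, since LLM output is not guaranteed to be well-formed.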