Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I would suggest interviewing mathematician Roger Penrose - he makes a compelling case that AI cannot be intelligent because it cannot transcend itself, as per Gödel's incompleteness theorem.
youtube · Cross-Cultural · 2025-06-29T14:5… · ♥ 64
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgyDIFPstXKa3ha1ywV4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugys0phjno1Zbkh0idh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzCVZnJea73RcVj1DN4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzdbNCE9fYX-3rOdYF4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "discomfort"},
  {"id": "ytc_UgxSU2YDmMs3La1xwqB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgxlcIt5Lak7SCXJDPt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxnxFg4PMC74cJ0qJ54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyuevw8_A-m_ukiqMN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgztScT7ln6OBsAvKH14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgzVZMPvbaPdlMGN7bB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
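Inspecting a raw response like the one above can be automated: the output is a JSON array of per-comment codes, so looking up the codes for a given comment id is a short parsing step. The sketch below is a minimal, hedged example; the `code_for` helper and the `ALLOWED_EMOTIONS` set are illustrative names, not part of the original pipeline, and the sample payload is a two-row excerpt of the response shown above.

```python
import json

# Sample raw LLM response: a JSON array of per-comment codes
# (two rows excerpted from the response above).
raw = '''[
  {"id": "ytc_UgzCVZnJea73RcVj1DN4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxlcIt5Lak7SCXJDPt4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

# Assumed codebook values, collected from the emotions seen in this response;
# the real codebook may differ.
ALLOWED_EMOTIONS = {"indifference", "outrage", "fear", "mixed",
                    "discomfort", "resignation", "approval"}

def code_for(comment_id, payload):
    """Return the coded dimensions for one comment id, or None if absent."""
    for row in json.loads(payload):
        if row["id"] == comment_id:
            return {k: v for k, v in row.items() if k != "id"}
    return None

codes = code_for("ytc_UgzCVZnJea73RcVj1DN4AaABAg", raw)
print(codes["emotion"])  # indifference
```

This mirrors what the "Coding Result" table renders: the row matching the comment's id, minus the id itself.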