Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Did musk actually fear AI could harm humanity. Or did he fear someone else would get there first and he wouldn't be part of it? Give his recent behavior I'd side with the later.
YouTube · Cross-Cultural · 2025-06-29T15:0… · ♥ 101
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        virtue
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgyDIFPstXKa3ha1ywV4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugys0phjno1Zbkh0idh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzCVZnJea73RcVj1DN4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzdbNCE9fYX-3rOdYF4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "discomfort"},
  {"id": "ytc_UgxSU2YDmMs3La1xwqB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgxlcIt5Lak7SCXJDPt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxnxFg4PMC74cJ0qJ54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyuevw8_A-m_ukiqMN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgztScT7ln6OBsAvKH14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgzVZMPvbaPdlMGN7bB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
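A minimal sketch of how a raw response like the one above can be parsed and matched back to a single comment. It assumes only the JSON shape shown here (an array of objects keyed by `id` with the four coding dimensions); the variable names are illustrative, not part of any real pipeline.

```python
import json

# Raw LLM response: a JSON array of per-comment codes, in the shape shown
# above (truncated to two rows here for brevity).
raw_response = '''[
  {"id": "ytc_UgyDIFPstXKa3ha1ywV4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugys0phjno1Zbkh0idh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the coded rows by comment id so a single comment's result
# can be looked up directly.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment displayed above by its id.
result = codes["ytc_UgyDIFPstXKa3ha1ywV4AaABAg"]
print(result["responsibility"], result["emotion"])  # → developer mixed
```

Indexing by `id` rather than list position makes the lookup robust if the model returns the rows in a different order than the comments were sent.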