Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If AI would get to a point where it thinks humans are a hindrance, wouldn't it also extrapolate that without humans it wouldn't exist for long itself??? Being basically electronic by nature to kill off the grids or power supplies, it kills itself...
YouTube · AI Governance · 2023-07-07T11:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw1iZurQGmmCukHNL54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwEPYS6RRtp_d0bSBB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz0WcX4WUATpYQqiM54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw__I8OoMV4rbBaast4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzcWGD0Ix5gDrKxSb54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy0vAoo2RvZVVle2up4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyrZjhmSgWUsQZ-dWV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxoxjRJydCsWKrFJmV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwsnBZMPjFhezVxSu94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxj2YztsJRLTXZKySh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"}
]
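When inspecting raw responses like the one above, it helps to check every record against the coding scheme automatically. Below is a minimal sketch in Python; the allowed value sets are assumptions inferred only from the codes visible in this response, not the project's actual codebook, and `validate_response` is a hypothetical helper name.

```python
import json

# Allowed codes per dimension — ASSUMED from values seen in this raw
# response; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"indifference", "mixed", "outrage", "fear", "approval", "resignation"},
}

def validate_response(raw):
    """Parse a raw LLM response (JSON array of coded comments) and
    return a list of records whose codes fall outside the scheme."""
    problems = []
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append({"id": rec.get("id"), "dimension": dim, "value": value})
    return problems

# Example with a fabricated comment id for illustration:
ok = '[{"id":"ytc_example","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
print(validate_response(ok))  # → []
```

A run over the full response above would flag any record where the model drifted from the scheme (a misspelled code, a missing field), which is the main reason to keep the raw output alongside the parsed coding result.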