Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hmmm... am I scared ? sort of. The big but for me with AI beyond singularity is that we as humans become irrelevant and so not worthy enough to initiate extension. We will survive for the same reason the gorilla survives. At the same time it would still make sense to invest more into the safety side of things.
youtube AI Governance 2025-12-06T17:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxezesmSMPcQF3FEht4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw7qlOKQCaVE-O_jd94AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxlIUmop_aySlwFA714AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwK4qt3eHf9h_-nklp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzMXd1RTA_vVqKkJ7Z4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxwbLVs8sboawNy1s14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxsGik4IBQgvvs8v5p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyGqcaO5jMkWR52cVd4AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw9c-Sc0rU-NJjt7Lx4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgymjetgbBorg5MJO_B4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
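The raw response codes comments in a batch, so finding the coding for one comment means parsing the JSON array and matching on `id`. A minimal sketch of that lookup, assuming the response is a JSON array of flat objects as shown above (`lookup_coding` is an illustrative helper, not part of the original tooling; the excerpt below copies two records verbatim from the response):

```python
import json
from typing import Optional

# Excerpt of the raw LLM response above (two of the ten records).
raw = '''[
  {"id": "ytc_UgxwbLVs8sboawNy1s14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxsGik4IBQgvvs8v5p4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

def lookup_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Return the coded dimensions for one comment id, or None if absent."""
    records = json.loads(raw_response)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(raw, "ytc_UgxsGik4IBQgvvs8v5p4AaABAg")
print(coding["reasoning"], coding["emotion"])  # -> consequentialist fear
```

The returned dict matches the flattened "Coding Result" table for this comment (responsibility none, reasoning consequentialist, policy none, emotion fear).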