Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That second one is exactly why automated emergency takeovers are just so much better than humans. If a human had to react in the same way, it would've hit before the brain even figured out what to do. (On average, it takes a second to translate seeing the danger, thinking what to do, and then actually doing it with your hands and feet)
youtube 2023-12-08T21:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugy038PZhhAC2nPMFCF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwHiunMPgjVfs--nsZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwEOCbDkX3kirHrtaR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxnaUAVKqlm21qdR_N4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzBTbcjQMC6iHo5LvB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwba21taxdiEEeqs-14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwlp5wtSOx3Pb6ODSF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzozUQuMeE96QoPY0V4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugyz4AhovvvVrH_DeWF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwiAHojVjf1sVPAJBx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
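Since the coded dimensions are extracted from a free-form model response, it is worth checking each row against the codebook before storing it. The sketch below is a minimal, hypothetical validator: the field names match the JSON above, but the allowed label sets are only those observed in this batch, so the real codebook may include additional values.

```python
import json

# Label sets observed in this batch; the full codebook may be larger (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any row with an unknown label."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# One entry from the response above, as it would arrive from the model.
raw = ('[{"id":"ytc_Ugy038PZhhAC2nPMFCF4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"approval"}]')
print(len(validate_codings(raw)))
```

Failing loudly on an unknown label catches both model drift and malformed output before it reaches the coded dataset.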