Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Back in the late 00s I was skeptical of driverless cars, not being able to make judgement calls. But people told me I was being silly and that the only threat would be other humans driving badly... Yet here we have a car that cannot make a judgement call when one should be made.
youtube 2025-04-05T00:0… ♥ 11
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          unclear
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugzn2bDX0QSQlcRldw14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgytiQgjfjhof7FHJbZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGI5MtGmnnHVCWT1N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgywK6cjFUAyHmXW4qt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyUvgUApTUb5_tDo2J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwT3HBH8WKiQKenTIF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwWZTMph7mEShMQpQ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzll0g-FMiD_q0gyqZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxajtyzXHSoi7hS7OB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzjEwmPCH-fAoxSxWp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
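The coding result for any comment can be recovered from the raw response by parsing the JSON array and looking up the comment's ID. A minimal sketch in Python, assuming the raw response is a JSON array of records as shown above (only a subset of the records is inlined here for brevity):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (subset of the full response shown above).
raw = """[
  {"id": "ytc_Ugzn2bDX0QSQlcRldw14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyUvgUApTUb5_tDo2J4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"}
]"""

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index coding records by comment ID

# Look up the coding for the comment displayed above.
coded = by_id["ytc_UgyUvgUApTUb5_tDo2J4AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {coded[dimension]}")
```

This reproduces the Coding Result table for the displayed comment (responsibility: developer, reasoning: consequentialist, policy: unclear, emotion: resignation).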