Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What I hate about Neil is he makes up an assumption, treats it as fact, than forms an argument around that. Even the best science and best estimates have no idea what driverless cars would do. It’s even possible we have more casualties. Of course you would never know that because corporations would never let that out.
youtube 2023-07-29T19:0…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxNCFm5yZP6-to6eLZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz2YbH0tpjFa8p3XXV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyZ2iYaWDktdtiDuG94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx9N7c4ZUHp-Z3L7mZ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy9VNWjnhuHRDYOVhV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyBxLMY4ANyQQ3IIUN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyDNtYwaHzVx3NHDId4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyRRpe7Y8QsfhH69Qh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxzQVnbgPoI_p-6NE14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzXL6WEfTuxAMqZI_h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
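Checking that a displayed coding result matches the raw batch response reduces to parsing the JSON and looking up the comment id. A minimal sketch of that check, assuming only the field names visible in the response above (here with a two-record subset of the array):

```python
import json

# Subset of the raw batch response shown above, as returned by the model.
raw_response = """[
  {"id": "ytc_UgyDNtYwaHzVx3NHDId4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxNCFm5yZP6-to6eLZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

records = json.loads(raw_response)

# Find the coded record for the comment being inspected.
target = "ytc_UgyDNtYwaHzVx3NHDId4AaABAg"
coded = next(r for r in records if r["id"] == target)

print(coded["responsibility"], coded["policy"], coded["emotion"])
# → company liability outrage
```

The printed values match the "Coding Result" table for this comment, confirming the display was populated from this record of the batch response.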