Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
They have forgotten the one thing about automation you always ALWAYS need a manual override so some one smart would still have a driver there to take control of the truck if the nav ai starts to mess up as it can and will do that which would not be good for anyone. Never fully trust in automation as you still need a human to make sure it does not error.
youtube AI Jobs 2025-05-28T21:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       deontological
Policy          liability
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzllkrKjA8EK7JYwKh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyjIKji4UQ04nxCeGB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1bT4_aMpDF9Al5W54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzAOnV5PYNamoEg1jp4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxJcwz3Z4n2hxk7F2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxqpLLIkbMZ2aw_Eo94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyOMEbQi4BgYju_6714AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxapbaJzILkwd3TVf14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgywLEITYj4JNNgMDn54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzsCtsUbKzDlCr5T0d4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
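A raw response like the one above can be parsed into per-comment codings before they are stored. The sketch below is a minimal, hypothetical example (function and variable names are illustrative, not from the actual pipeline); it assumes only that the model returns a JSON array of objects, each carrying an `id` plus the four coding dimensions shown in the table.

```python
import json

# Hypothetical raw response; in the real pipeline this string would come
# straight from the LLM API call. Shortened to one record for clarity.
raw = '''[
  {"id": "ytc_Ugy1bT4_aMpDF9Al5W54AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "liability",
   "emotion": "fear"}
]'''

# The four coding dimensions every record must carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw_response: str) -> dict:
    """Parse the model's JSON array into {comment_id: coding},
    rejecting any record that is missing a coding dimension."""
    records = json.loads(raw_response)
    by_id = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing dimensions {missing}")
        by_id[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return by_id

codings = parse_codings(raw)
print(codings["ytc_Ugy1bT4_aMpDF9Al5W54AaABAg"]["policy"])  # liability
```

Validating every record up front makes a truncated or malformed model response fail loudly instead of silently dropping codings.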