Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So what happens if a truck has a blow out or some part fails that a human would normally identify in the pre-trip? The truck can’t automatically repair itself, it isn’t a transformer. What happens if the truck gets into a fatal crash and is found at-fault? Who gets the consequences and what are they? Is the truck able to detect concerning conditions and not go flying 70mph through a school zone or construction zone? There’s so many safety concerns here. I don’t understand this. As a human truck driver, it is already a task to be safe and avoid mistakes so that this 70,000 pound missile doesn’t plow through someone’s minivan and kill their entire family. Now think of an AI having to make those split-second decisions or react in such a way that has the best outcome
youtube AI Jobs 2025-10-15T16:3…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       deontological
Policy          liability
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugz8pY3y4ExYejwUcjl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzeW3v5IYBG2wcmWQd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzN_4-u8AuqqkchBbl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx3PrrBlNtMhjuUtdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugws794Sx0yDuvSoB9p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwteoxVOkahQB1OuiZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxeV2vPf74tRGetDsp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzD5qYg08TmXS3FpbV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgxEJ-LSJIMft0kEmel4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzG1X80ckY6njccmit4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]
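A response like the one above is a JSON array of per-comment coding records, each with an `id` plus the four dimensions (responsibility, reasoning, policy, emotion). Before storing such records, it is worth validating them against the codebook. The sketch below is a minimal, hypothetical validator, not the dashboard's actual code; the allowed-value sets in `SCHEMA` are assumptions inferred from the values visible in this sample, not a definitive codebook. It uses the first two records from the raw response as input.

```python
import json
from collections import Counter

# Two records copied verbatim from the raw LLM response above.
RAW = (
    '[{"id":"ytc_Ugz8pY3y4ExYejwUcjl4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"},'
    '{"id":"ytc_UgzeW3v5IYBG2wcmWQd4AaABAg","responsibility":"none",'
    '"reasoning":"mixed","policy":"none","emotion":"outrage"}]'
)

# Assumed codebook: allowed values per dimension, inferred from the
# values that appear in the sample output, not an official schema.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate(records):
    """Split records into (valid, errors) by checking ids and dimension values."""
    valid, errors = [], []
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            errors.append((rec.get("id"), "unexpected id format"))
            continue
        bad = [dim for dim, allowed in SCHEMA.items() if rec.get(dim) not in allowed]
        if bad:
            errors.append((rec["id"], f"invalid values for {bad}"))
        else:
            valid.append(rec)
    return valid, errors

records = json.loads(RAW)
valid, errors = validate(records)
print(len(valid), len(errors))                    # 2 0
print(Counter(r["emotion"] for r in valid))       # one "fear", one "outrage"
```

A malformed record (e.g. an emotion outside the codebook) would land in `errors` with its id, so bad codings can be re-queued for the model rather than silently stored.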