Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The project won't be viable for one reason. And I only need one to destroy all their plans... Looting! Now that the public knows the trucks are driverless, all it takes is for someone to drive a vehicle in front of one, slow down, and force it to stop: it's programmed to avoid accidents. Then open the back doors and help yourself! LOL! And they can never program them differently, to avoid the risk that the truck identifies a traffic jam as a threat and plows into the pile! The joke with self-driving vehicles is that they don't respond to social disobedience. You remember that scene in The Truman Show, the movie, where he manages to stop traffic just by raising his hand? Well, we're in the same scenario... If we ever get to a point where we only have this kind of vehicle on the highways, you could cross in front of them on foot without any fear... And soon enough, the project will be deemed unviable... Remember this phrase: "social disobedience." We invented it in France ^^
Source: youtube · AI Jobs · 2025-11-21T18:2… · ♥ 1
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           none
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyFgvY9Qj4Qi2nsSeB4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugzw7qwT_tC0yaavZ6t4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgxIY9mmkM5Ft8R2L394AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UgxpxRg070KO2YCp6rd4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugyh44b6MNc-2LlOx1x4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgythmHMSjR7XkwFJg14AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgygA1qscTIxMgwrKhh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgwIgz_NeMLWoJ95HnZ4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzbAflLW5PD7YCMdSV4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugwt3dKkP_itbHhIubp4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"}
]
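A raw batch response like the one above can be checked programmatically before it is accepted into the coded dataset. The sketch below, in Python, parses the JSON array, looks up the record for this comment's id (ytc_UgxpxRg070KO2YCp6rd4AaABAg, which carries the company / consequentialist / none / outrage codes shown in the table), and flags any value outside the codes observed in this batch. The ALLOWED sets are inferred from the response itself, not from an authoritative codebook, and the raw string is abridged to two records for brevity.

```python
import json

# Abridged copy of the raw LLM response above (two of the ten records).
raw = """[
  {"id": "ytc_UgxpxRg070KO2YCp6rd4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugyh44b6MNc-2LlOx1x4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Allowed values per dimension, inferred from codes seen in this batch;
# an assumption, not the project's official codebook.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "fear", "mixed"},
}


def validate(records):
    """Return (id, dimension, value) triples that fall outside ALLOWED."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems


records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}

# The record for the comment shown on this page.
print(by_id["ytc_UgxpxRg070KO2YCp6rd4AaABAg"]["emotion"])  # outrage
print(validate(records))  # [] -- no out-of-vocabulary codes
```

Validating before ingest catches the common failure mode of LLM coders: a value outside the agreed vocabulary (e.g. "anger" instead of "outrage") that would otherwise silently split a category downstream.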