Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I know this has been mentioned before, but this is a dangerous road... for humans. We're silly creatures that sometimes want to push our limits with strange things, and in the case of this road, it's how fast someone can get through it without crashing. Your car on the other hand treated it like a suburban road, barely breaking 30 mph on a two-lane road that's well maintained, has no breaks, and no pedestrians. This is something a lab would love to show off their "full self driving" to gullible people, because this is something that looks hard, but is extremely easy for even simple AI. It just has to follow the road that is on the map at the set speed limit and follow the well kept lines on the road. Also love how you made light of your car literally stopping in the middle of a dangerous road for humans because it thought a car was in the way while said car is parked on the shoulder. That is a dangerous failure right there that you decided to play up for laughs. Tesla has the AI disengage before a crash so that the crash isn't technically Tesla's fault since you were in control at that time. If you were going at high speed and had something similar happen, you probably would have found yourself killing both that driver and yourself.
youtube 2022-12-05T06:4… ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzR38jJE2YJC_2dp9V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwDgA3SdQLDf5sbJnZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzMJHctMlGqNRmlBnF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzXj4cvD2gZ0wmxyj54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzEBeeTvk6Bnj9IxIV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzBJWwRynO7e1JEHLx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwmoCErVbpsF774dAZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzKACy2THfXpvj-EN54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxsBZfVwaxTrF0CsrJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzjBnljdvS3gNr45JJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
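When inspecting a raw batch response like the one above, it helps to check that every record actually stays inside the coding schema before trusting the per-comment lookup. Below is a minimal sketch of such a check; the allowed-value sets are inferred only from the labels visible on this page and may not match the project's actual codebook.

```python
import json

# Abbreviated raw LLM response, copied from the array above.
raw = '''[
  {"id": "ytc_UgwmoCErVbpsF774dAZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

# Allowed codes per dimension -- assumed from values seen in this page's
# output, not an authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "government", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed", "unclear"},
}

def validate(records):
    """Return (comment id, dimension, bad value) for every out-of-schema code."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems

records = json.loads(raw)
print(validate(records))  # an empty list means every code is in the schema
```

An index keyed by comment id (`{r["id"]: r for r in records}`) then makes it easy to pull up the exact coding for the comment shown above.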