Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's all because of MuskRAT's hubris and ego. He alone willingly made the decision to axe LiDAR from his vehicles. Mammals have been on this planet for about 225 million years, resulting in the binocular vision that humans enjoy today. This vision, though, relies on complex regions of the brain that process what the eyeballs see. Trying to emulate human vision by using a couple of CCD cameras and a very limited image processor is the ultimate stupid decision. Especially on the public roads, where everything happens at random and no two scenarios ever play out the same. You want high-speed self-driving cars, you're gonna need a lot more sensory power than just CCD cameras. LiDAR would be the absolute minimum. All other vehicles on the roads would have to have beacons or transponders that would alert other vehicles to their status and their position. There would have to be status points on the roadways to alert autonomous or semi-autonomous vehicles to the presence of dangerous situations ahead. Cars would have to obey speed limits set based upon road conditions. MuskRAT needs to realize that the public streets and highways are not a game of "Out Run" or "Pole Position" where the obstacles are always in the same place at the same time each and every time you play.
youtube AI Harm Incident 2025-04-21T07:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       virtue
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxR9I6PNNLzQleDQGN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwCVe-uLqHzpaQwRyd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgweWosgEdBDbM1J9t54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyVzcrXsyuBHAsJZIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwztLjEZzLc6Zu7oxp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzNVC1ydO0c-soNTr14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw3JtDkh1o42myOp4d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwskBsLDHm3rIbREOF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugwv_5nV_2Z909Ro70h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxyE2r3cdmW6pb2iWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
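The raw response is a JSON array with one record per coded comment, each carrying the four coding dimensions. A minimal Python sketch for parsing and sanity-checking such a response is below; the allowed category values are inferred from the codes that appear on this page, not from a documented codebook, so treat them as assumptions:

```python
import json

# Allowed values per dimension, inferred from the codes visible on this
# page (assumption -- the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"user", "government", "company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "indifference", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records whose values
    fall inside the allowed set for every dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with one record in the same shape as the raw response above.
raw = ('[{"id":"ytc_Ugw3JtDkh1o42myOp4d4AaABAg",'
       '"responsibility":"developer","reasoning":"virtue",'
       '"policy":"regulate","emotion":"outrage"}]')
print(parse_codes(raw)[0]["responsibility"])  # developer
```

Dropping (rather than repairing) out-of-vocabulary records keeps the downstream tallies honest; rejected records can be logged and re-coded in a second pass.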