Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That is the thing about self driving cars, they are unable to think at all so EVERYTHING that can happen out on public roads have to be manually programmed in with an appropriate response. If you for example forgot to program it to recognize a collapsed bridge if will drive off the collapsed bride and kill you...
youtube 2026-01-11T07:0…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugwc8cfdgLdblcYAA814AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwvDKkdbl-aVvgQFj54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxUc9YH7_-EDiYyBqp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugwx-g50BYJlNBJgegB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxFSOnAULeOqO-UCyF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyhuooovKZevf7fpf14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy3zFSdljyhDmr7sJt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyhk5Abry774nmINvl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyVkaKNF41ZElDGEQl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzIvKwqPyDGnDbQ9hp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"disgust"}
]
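A raw batch response like the one above can be parsed and validated before the codings are trusted. A minimal sketch in Python, assuming the dimension vocabularies visible in these examples (the actual codebook may define more categories, and `parse_codings` is a hypothetical helper, not part of the pipeline shown here):

```python
import json

# Allowed values per coding dimension (assumed from the examples above;
# the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "disgust", "approval", "resignation", "indifference"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment id.

    Records with a missing or out-of-schema value for any dimension
    are dropped so downstream analysis only sees clean codings.
    """
    by_id = {}
    for record in json.loads(raw):
        if all(record.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            by_id[record["id"]] = record
    return by_id

# Example: one record from the batch above.
raw = ('[{"id":"ytc_UgyVkaKNF41ZElDGEQl4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_UgyVkaKNF41ZElDGEQl4AaABAg"]["emotion"])  # fear
```

Indexing by comment id makes it straightforward to join each coding back to the comment it describes, as in the "Coding Result" panel above.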