Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Tragedy. But what is wrong with all of you?! The self-driving tech was not really the issue. The safety driver was supposed to intervene even if the tech failed to take action. The tech is in experimental stages, like all driverless cars from everybody from Waymo to GM to Lyft. Instead, the human driver was lazy and constantly looking down probably at the porn on his phone. Or texting/sexting someone. Just as he looked up, after repeatedly long periods looking down at his phone or something on his lap, the collision occurred! If he had his eyes on the road all the time, as he was paid to do, the collision may not have happened!
youtube · AI Harm Incident · 2018-04-24T07:5… · ♥ 1
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[{"id":"ytc_UgwLnbNChiM3OWGiPKB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugza6YVvhuGJQnAcosR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgxxSyAQd6n2q7u3k4d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgytDv0l4ahdzT4Gg0B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugwk-dKLfPGhxsEyIk14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"})
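Note that the raw response above is malformed JSON: the array closes with `)` instead of `]`, so a strict parser rejects the whole batch. A pipeline that defaults every dimension to "unclear" on a parse failure would produce exactly the Coding Result shown. The sketch below illustrates that failure mode; the function name `parse_coding` and the fallback behavior are assumptions for illustration, not the pipeline's actual code.

```python
import json

# Dimensions coded for each comment, per the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding(raw: str, comment_id: str) -> dict:
    """Extract one comment's codes from a raw LLM response.

    Hypothetical fallback logic: any parse failure (or missing id)
    yields "unclear" for every dimension.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        # e.g. a stray ")" where "]" was expected, as in the
        # raw response above, lands here.
        return {dim: "unclear" for dim in DIMENSIONS}
    for row in rows:
        if row.get("id") == comment_id:
            return {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return {dim: "unclear" for dim in DIMENSIONS}

# A truncated/malformed batch degrades to all-unclear:
bad = '[{"id":"ytc_a","responsibility":"user"})'
print(parse_coding(bad, "ytc_a"))
```

With a well-formed array the same function returns the coded values, which is why a single wrong closing bracket can silently turn an otherwise usable batch into a column of "unclear".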