Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@heatshield I’d be lying if I said that I’ve never been in this situation

The car probably should’ve gone during that initial opening as the clip starts, but I’m guessing that it’s designed to **not** slow-roll’n go these kinds of turns…

So it waits for the first few cars on the far left-side to go, then assesses that it has time to go

Then as it’s going determines that the next pair of cars potentially sped up and that it ~might~ not have enough space, so it stays where it was, which unfortunately was blocking that first lane that it had rolled into…

… not sure if it simply wasn’t clear to back up, but either way, you’re never supposed to reverse into oncoming traffic, or even back into that transition “lane” that it was originally in… which is what that car would have had to do *IF* it was even clear to do so (it’s one thing if I/it overshot the line at a stoplight… usually there’s an avenue to back up a little so you aren’t partially blocking cross-traffic, but even I’d never back up in ~this~ situation)

Once that second pair of cars were clear it finished it’s lane crossing into the parking lot… yes that right-side car was approaching, but in a situation like this, ~TYPICALLY~ the other driver/automated car would see the obstruction and slow down/be ready to stop safely

Whether that oncoming car was slowing down or not, the amount of space between it and the Waymo car as it finished its turn wasn’t any closer than most of my turns, and was pleasantly sufficient compared to some of the spaces I’ve seen other drivers leave themselves

I’d still agree that the turn could prolly be put into the “less ideal” category, but this was hardly unsafe. Take away the one person being unsure of the timing and the other person seemingly laughing and this turn wouldn’t be anything newsworthy, especially if done by a human driver
youtube · AI Harm Incident · 2024-10-26T14:3… · ♥ 4
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[ {"id":"ytr_UgyWiNeybJ7PhT5jYZh4AaABAg.AA3d2RHLpscAA7AEg70z-j","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytr_UgynZwmPHbHo10wsiKV4AaABAg.AA3cJlqot0rAETAfCCg9dd","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}, {"id":"ytr_UgzpF90llWjDivI2J0J4AaABAg.AA3MGX2B-VgAA3eR_QdZgr","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"}, {"id":"ytr_UgwTue-ZiD5VviPruSd4AaABAg.AA3IotyLHagAETAvfpC4Sm","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytr_Ugw2VYowlIx2Jon8U8x4AaABAg.AA2ukYgV9eTAA3eDe7GMCH","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytr_UgzMp6Z2XvDIb2U-emF4AaABAg.AA2qD-tL0ChAA3WQ4UXGuW","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytr_UgzL5u55pycu8aCZhHl4AaABAg.AA2VovNr2xTAA3aKt1lnae","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytr_UgzL5u55pycu8aCZhHl4AaABAg.AA2VovNr2xTAA4EwrdeakG","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"}, {"id":"ytr_UgzL5u55pycu8aCZhHl4AaABAg.AA2VovNr2xTAF3thKq4lYt","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytr_UgxxEbNfjqB8u4dKJnF4AaABAg.AA1lvhnuUN5AA3K7Ez04p_","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"} ]