Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Had to stop watching around the 10 minute mark, but this was the first video on this channel I ended up deeply disagreeing with. Fundamentally the only thing that's relevant is whether on average it gets better than human drivers. I also deeply believe there is a big difference between companies that took a "move fast and break things" approach such as Uber and Tesla, vs companies like Waymo who were basically the first (out of the major investments) and took stuff slow and steady. I remember watching the DARPA self driving car challenges sometime before 2010, and Google hired the winning team from one of those. Anyhow, the reason I am deeply excited about self driving cars is because 1) it means that family who is blind will be able to drive 2) self driving cars should want to compete with car ownership itself, which means it should end up cheaper than car ownership, which means that for those of us that don't want to own cars the disadvantages of not owning a car will disappear and 3) I don't see a future where small two person cars and shared minibuses don't become a thing in a self driving world. They both make too much economical sense. Why do people buy big cars? Because they *might* have to fit more people into them occasionally, but if you can hail a bigger car when you need it, then suddenly it's more economical to drive a lighter cheaper car when you are travelling by yourself. And same story to a lesser extent for shared minibuses I hope (less confident about that one). Anyway, we will see how it goes
youtube 2024-11-15T09:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgxdiAqZlgNvsFMUbSh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy0ZhqDxLZbV57uk0Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwqBuq7whSoh8O0PWJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgwdVX-ZE15gyVUHvkt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxVuvCDadE89BL6nDh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzx7qyFhMBc-6sD_Hl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyJxB9zqH5rBG-6g5l4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzMzKiYcGm5o_0xWnR4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy3a8feVJobcrxfjD54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxsvAq7Ctie28oiXu54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
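A response like the one above has to be parsed and sanity-checked before the codings can be stored. The sketch below is one possible way to do that, not the tool's actual pipeline: it parses the JSON array and keeps only rows whose values fall inside the codebook. The allowed category sets are inferred solely from the values visible in this response; the real codebook may define more.

```python
import json

# Inferred from the values seen in the raw response above; the real
# codebook may allow additional categories.
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "approval", "indifference", "fear", "mixed",
                "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse the LLM's JSON array, keeping only rows whose values
    are all within the allowed codebook categories."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in ALLOWED.items())
    ]

# Example: one valid row and one with an out-of-codebook emotion.
raw = ('[{"id":"a","responsibility":"company","reasoning":"mixed",'
       '"policy":"none","emotion":"outrage"},'
       '{"id":"b","responsibility":"company","reasoning":"mixed",'
       '"policy":"none","emotion":"joy"}]')
print(len(parse_codings(raw)))  # → 1
```

Dropping invalid rows (rather than raising) is just one design choice; a production coder would more likely log the offending `id` and re-prompt the model for that comment.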