Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
you get little glimpses into lex’s mind when he talks, for example, he measures the cost of cars to society as deaths caused by accidents, meanwhile we have cooked the planet with the CO2 they produce. Lex really hasn’t internalised the reality that our global environment is in a full on, downward spiral crisis due to industrialisation and capitalism. This is a great example of the unintended consequences of a technology. a few million deaths and cars might be worth the transport, but causing climate change could never have been predicted by the first car makers. you ask anyone that is an expert on this, if you could go back to the invention of combustion engine cars and prevent that, just organise society around efficient electrified rail, ban combustion engines, electric only, they would. AI will undoubtably be the same, we’re worried about a whole range of issues, rightly so. there may be a vast array of other issues we haven’t even considered that are even worse. wisdom would lead us to stop ruining the planet with our tech, stop playing god, be happy with what we have and try to live sustainably, but we’re not going to do that.
youtube 2024-12-31T04:1…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugz--w5v9NLFuI0HLNR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyK9PqiG93vC5Z5qgh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz5PNBsAB935H-5Fh94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwHfOPWI1WN22d0AvR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyxhagj0nv-U8MyCTt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzFrqnZ7sdgoG3bF4R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzwqblHF83JZ5MCWvR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxIiyW4_QdAqW7rdNV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwdEojchf_Bj2bqH9V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzawtYWlWT1Z9p0sYV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
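A minimal sketch of how the raw model output above can be inspected programmatically, assuming Python and the standard `json` module. The comment ids are copied from the response; the helper name `coding_for` is illustrative, not part of any tool shown here:

```python
import json

# Two entries copied verbatim from the raw LLM response above
# (truncated for brevity; the full response has ten entries).
raw = '''[
  {"id":"ytc_Ugyxhagj0nv-U8MyCTt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz--w5v9NLFuI0HLNR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

def coding_for(raw_response: str, comment_id: str):
    """Return the coding dict for a given comment id, or None if absent."""
    for entry in json.loads(raw_response):
        if entry.get("id") == comment_id:
            return entry
    return None

# Look up the coding for the comment shown in this section.
coding = coding_for(raw, "ytc_Ugyxhagj0nv-U8MyCTt4AaABAg")
```

The returned dict for that id matches the Coding Result table above (distributed / consequentialist / regulate / outrage), which is the check this view is meant to support.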