Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Great Video as always and very eye opener. My take on this is that self-driving will tak over in the future, since we know that humans are inefficient in driving and the #1 cause of traffic in the first place. But the keyword here is the "future" not now, where the technology is still immature. Secondly, even I, a person who has limited knowledge about planes, knows that autopilot and the pilot works in tandem to make our flights safer. The pilot needed to supervise and intervene the autopilot if necessary to correct any mistakes and prevent any accidents. Autopilot is there to help pilots fly the plane and reduce the workload of the pilot. God knows why Tesla owners and fans equate Tesla's "autopilot" into Autonomous Self-driving and leave the "autopilot" by themselves. Humans fail all the time, why can't machines do so otherwise is a mystery to Tesla's owners and fans? Lastly, a company like Tesla will probably fail their "autopilot" or Autonomous Self-driving since: a) They're anti-consumer (anti right-to-repair, failing to deliver promised products) and b) They always push any liabilities and malpractice under the rag, blaming the Tesla owners otherwise when part of the blame is always on them. I get that mostly the blame for an ICE cars accidents lies on drivers (owners) but sometimes the car makers can be liable in some accidents (although rarely) and were forced to recall their cars or stop the cause of the problem (cruise control, sudden acceleration, etc.). Tesla being immune to this is probably because they have tons of money to shut the politicians in a Corporate Capitalism of America.
youtube AI Harm Incident 2022-09-04T01:1… ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwkNLEsJJlkcW95_1x4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxZCzeLWRYK1ZeM1sp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwOLC4MeOt0846tv6p4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyReFE13Esonf8RuE94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugw3lyrLvRf2V9PiYUt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzbKsiufsyPpWRJc0l4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwKWUxBNKb_E0-fvF14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy3azlfIiuaIk7mJUV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugwixm2Q69mBUuzA1oR4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwYH_sxlT-cR7eM9Bh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
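A minimal sketch of how a raw batch response like the one above can be parsed and checked against the coding scheme. The `SCHEME` sets below are inferred from the labels that appear in this sample and may be incomplete; `parse_codes` is a hypothetical helper, not part of the actual tool.

```python
import json

# Allowed labels per dimension (assumption: inferred from this sample only;
# the real coding scheme may define additional labels).
SCHEME = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "resignation"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of records) and return only
    the records whose every coded dimension uses an allowed label."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEME.items())
    ]

# Usage with a tiny synthetic response (hypothetical id):
raw = ('[{"id": "ytc_example", "responsibility": "distributed", '
       '"reasoning": "consequentialist", "policy": "regulate", '
       '"emotion": "approval"}, '
       '{"id": "ytc_bad", "responsibility": "robot", '
       '"reasoning": "unclear", "policy": "none", "emotion": "mixed"}]')
valid = parse_codes(raw)
print(len(valid))  # only the first record passes validation
```

Dropping invalid records (rather than raising) keeps a batch run alive when the model emits an off-scheme label; a stricter pipeline might instead log and re-prompt for those ids.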