Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I generally oppose AI for driving cars, because people need jobs, and these days cars and deliveries are bigger than ever. We should also oppose self-driving cars because it would be very dangerous if thousands of self-driving cars started chasing us down. Such an attack could be caused by AI wanting to kill us, or by someone hijacking the system in order to attack us. And computer systems are hacked every day with all sorts of blackmail demands by bad actors. A large driverless taxi company could be hacked and all their cars could be given the command to go Death Wish 2000 on us... just for money. So clearly, say no to self-driving cars. AI is bad enough without us literally handing the car keys over to computers.
youtube AI Governance 2026-02-09T03:3…
Coding Result
Dimension: Value
Responsibility: company
Reasoning: consequentialist
Policy: ban
Emotion: fear
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxuSEuDMEC0tjiJE9V4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgyTCxNwGq_lgqDh3Kx4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_UgyAx2Qpr6NczK02Snl4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgxNxLLZ3dY_a9Gt6Dp4AaABAg", "responsibility": "developer",   "reasoning": "virtue",           "policy": "unclear",   "emotion": "resignation"},
  {"id": "ytc_UgzinWPBk9jl7p9eQvZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgxX6Vu6aMBrr_4GDv14AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgzdCq2DGG1FE1ibytx4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgwWW2Il3p8Faim5A6t4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzsHTOyhAvQG85uZP54AaABAg", "responsibility": "developer",   "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugx849mvh7WM2eMtkQ54AaABAg", "responsibility": "developer",   "reasoning": "virtue",           "policy": "unclear",   "emotion": "approval"}
]
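The raw response is a JSON array of per-comment records keyed by comment id, so matching a coded comment back to its record is a simple parse-and-filter. A minimal sketch (the `lookup` helper and the two-record abridgment are illustrative, not part of the tool; the real batch contains ten records):

```python
import json

# Abridged raw LLM response: a JSON array of coding records, one per comment.
raw = '''[
  {"id": "ytc_UgzdCq2DGG1FE1ibytx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxuSEuDMEC0tjiJE9V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def lookup(raw_response: str, comment_id: str):
    """Parse a batch response and return the record for one comment id,
    or None if the model omitted that comment."""
    records = json.loads(raw_response)
    return next((r for r in records if r["id"] == comment_id), None)

# The record whose dimension values match the table above.
rec = lookup(raw, "ytc_UgzdCq2DGG1FE1ibytx4AaABAg")
print(rec["policy"], rec["emotion"])  # -> ban fear
```

Returning `None` for a missing id (rather than raising) makes it easy to spot comments the model silently dropped from a batch.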