Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
To exaggerate intentionally, maybe don't drop the moon on a house to kill a flea? Sure, it will kill the flea and demolish the house, but what about the rest of the planet around it? If you need to take out a sniper, why are you using a bomb? As you have stated, this is obviously overkill in more than one sense of the word. If AI can figure out what will happen after a bombing, AI can figure out how to kill one man with one bullet without any collateral damage. - It seems that there is some thought that how much money and time it takes to build a weapon isn't even taken into account in regard to the end results. How much should it cost to take out a sniper? What is the cost of one bullet? How much is the cost of a guided bomb? What is the cost of each civilian life lost? What is the cost of each building destroyed by bombing?
youtube 2024-07-29T20:5…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwtufjJRDpIt3XuOxB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwaENeA222Q9v183KZ4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzqXdXgwHlwe8tY7dp4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwaZg1Y8_Dnbo_B-sp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgyMTIcd-SAUkvBekM54AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyuCQIIvVgjDpeKZ5B4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxLDj3KAbOsr1XNcvl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxa7cdZzRKCQqTY07h4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwzMOgGbxsrCAfNAkh4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx2F8cBOM5pxeBESZp4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"}
]
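A response like the one above has to be parsed and validated before the codes can be attached to comments. The sketch below shows one minimal way to do that in Python, assuming the LLM returns a JSON array of records with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys. The `ALLOWED` sets are inferred from the values visible in this response; the project's actual codebook may contain more categories, and `parse_coding_response` is a hypothetical helper name, not part of any existing pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the response above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "government", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Records missing an id or containing a value outside the codebook
    are dropped rather than passed downstream.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # skip records without a comment id
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

# Usage with the first record of the response above:
raw = (
    '[{"id":"ytc_UgwtufjJRDpIt3XuOxB4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)
coded = parse_coding_response(raw)
print(coded["ytc_UgwtufjJRDpIt3XuOxB4AaABAg"]["emotion"])  # fear
```

Dropping invalid records (instead of raising) keeps one malformed LLM output from aborting a whole batch; a stricter pipeline could log or re-prompt on rejected records instead.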