Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI should be a machine first and foremost, and it should not imitate human behaviour. Actually, AI should not happen as we know it, it should more like a Programmed Intelligence, a massive software with millions of actions, variables and conditions working together. If we take a PI for some combat drone it should be like this (Example)

Action 1) Fire weapons
Condition 1.1) Target is hostile
1.2) Target poses a threat to other drones or Humans
1.3) Firing weapons poses no collateral damage to allies
Variables (If we could call it like that)
1.1.2) Condition can be ignored if target is non direct combat vehicle (Jammer)
1.1.3) Weapon can be fired even if it poses a risk to allies if the no action would cause more serious damage
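The commenter's "Programmed Intelligence" is essentially a rule table with exception clauses: fire only if all conditions hold, where two of the conditions can be waived by the listed variables. As a hedged illustration only (every predicate name here is invented, and the mapping of variable 1.1.2 to condition 1.2 is our reading of the comment), the structure could be sketched as:

```python
# Hypothetical sketch of the commenter's drone firing rules.
# All predicate names are invented for illustration; this is not
# any real targeting logic.
def may_fire(target_hostile: bool, target_threatens: bool,
             collateral_risk: bool, target_is_jammer: bool,
             inaction_worse: bool) -> bool:
    # Variable 1.1.2: the threat condition (1.2) is waived for a
    # non-direct combat vehicle such as a jammer.
    threat_ok = target_threatens or target_is_jammer
    # Variable 1.1.3: the collateral condition (1.3) is waived when
    # taking no action would cause more serious damage.
    collateral_ok = (not collateral_risk) or inaction_worse
    # Condition 1.1 (hostile target) always applies.
    return target_hostile and threat_ok and collateral_ok

print(may_fire(True, True, False, False, False))   # True
print(may_fire(True, True, True, False, False))    # False: collateral risk
print(may_fire(True, True, True, False, True))     # True: inaction is worse
```

Note how the "variables" act as guards that relax individual conditions rather than as independent rules, which matches the comment's numbering scheme.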
YouTube · AI Moral Status · 2017-02-23T14:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgjCkbW8HzWknngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiEPKpkQpLBvXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UggS6u_4h0pTJ3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UggR_H-guI1ov3gCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgisRaHAbPZkRHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UghDTSkKguh_eXgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugjb307Mr6aT_XgCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UggBAqOIJtgnCHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghBJWyJQzHrOHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjYadM9MhFjhngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
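The raw response is a JSON array of per-comment records, one object per coded comment, each carrying the four coding dimensions. A minimal sketch of parsing and validating such a batch, assuming the category sets inferred from the values visible on this page (the real codebook may allow more categories):

```python
import json

# Allowed categories per dimension, assumed from the values seen in
# this page's output; the actual codebook may differ.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}

def parse_codings(raw: str) -> list:
    """Parse a raw LLM batch response and check each record's values."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Usage with the first record from the batch above:
raw = ('[{"id":"ytc_UgjCkbW8HzWknngCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(parse_codings(raw)[0]["emotion"])  # indifference
```

Validating against a fixed category set catches the common failure mode of LLM batch coding, where the model invents an off-schema label for an ambiguous comment.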