Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The Three Laws, presented as being from the fictional "Handbook of Robotics, 56th Edition, 2058 A.D.", are:[1]

First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Put that into every AI as the first and most important laws. The Three Laws of Robotics (Asimov's Laws) are a set of rules devised by science fiction author Isaac Asimov, which were to be followed by robots in several of his stories. The rules were introduced in his 1942 short story "Runaround" in "I, Robot". AI is not a puppet nor a toy. This technology is dangerous.
youtube AI Governance 2023-07-08T14:2… ♥ 1
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          regulate
Emotion         unclear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw0QSZYt-JgbZuab8B4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugymkdh7E_gns9FGO914AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw6m68ixkbMvKVZeDJ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugw19jjxcJfTrRF37bh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxosLXJBbXADa6EkmR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgwJD-z4GLRMMEcZslh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxTc506MdwK2KbA5hV4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "ban", "emotion": "mixed"},
  {"id": "ytc_UgxD1912uQ0PodsiVs94AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzL_Eol4j3onkRYMdF4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugz0bYYTiBUC_6Svk6Z4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "unclear"}
]
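A raw response like the one above can be turned back into per-comment codes with a short parsing step. The sketch below assumes only what the sample shows (a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields); the single row used here is copied from the batch above for illustration:

```python
import json

# Minimal sketch: parse a raw LLM coding response (a JSON array of
# per-comment code objects) and index it by comment id.
# Field names and the sample row follow the raw response shown above.
raw = """[
  {"id": "ytc_Ugz0bYYTiBUC_6Svk6Z4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "unclear"}
]"""

# Build an id -> codes lookup so any comment's coding can be inspected.
codes = {row["id"]: row for row in json.loads(raw)}

row = codes["ytc_Ugz0bYYTiBUC_6Svk6Z4AaABAg"]
print(row["responsibility"], row["policy"])  # developer regulate
```

In practice the parse would be wrapped in error handling, since LLM output is not guaranteed to be valid JSON; `json.loads` raises `json.JSONDecodeError` on malformed responses.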