Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Isaac Asimov's Three Laws of Robotics, introduced in his 1942 short story "Runaround," are a fictional framework designed to govern the behavior of robots and androids in science fiction narratives. The laws are: 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm. 2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law. 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. These laws were originally conceived as a narrative device to explore ethical dilemmas and contradictions in robotic behavior, rather than as a practical blueprint for real-world artificial intelligence or robotics. Asimov later introduced a "Zeroth Law" in his novel Foundation and Earth, which states: "A robot may not injure humanity, or, by inaction, allow humanity to come to harm," placing the welfare of humanity above individual humans. Despite their widespread cultural influence, the laws are considered fictional and impractical for real-world implementation due to inherent ambiguities in language, such as defining "human being" or "harm," and the challenges of translating natural language into executable code. The truth is, humanity is so screwed.
youtube 2026-01-06T04:5…
Coding Result
Dimension      | Value
Responsibility | developer
Reasoning      | deontological
Policy         | regulate
Emotion        | approval
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugwsl6VlHxoyrI1J02N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyJvAXllrV_r_wkGMN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyOnG1TEGW9ECtqMT54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgypIriCAjtb5DmnPD94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugym-3TBldTk2Q47BHt4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzrZLlajho4OLSNM1B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz02PD01E00C101p9J4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzIi_DRnwv2rvyAD6l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyGNNuk1nIz0l7mmKR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyFCLLMuwYMFDmUU1d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
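A minimal sketch of how a raw response like the one above could be parsed and validated against the four coding dimensions. This is not part of the original pipeline; the allowed label sets are inferred only from the values visible in this section and may be incomplete.

```python
import json

# Label sets inferred from the coded records shown above (assumption: the
# real codebook may contain additional labels not seen in this sample).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "none"},
    "policy": {"regulate", "none"},
    "emotion": {"approval", "fear", "indifference", "mixed", "outrage", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose labels
    all fall inside the allowed sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items())
    ]

# Example: one record from the response above plus one with a bogus label.
raw = (
    '[{"id":"ytc_UgzIi_DRnwv2rvyAD6l4AaABAg","responsibility":"developer",'
    '"reasoning":"deontological","policy":"regulate","emotion":"approval"},'
    '{"id":"ytc_bad","responsibility":"robot_overlords",'
    '"reasoning":"none","policy":"none","emotion":"none"}]'
)
valid = validate_codes(raw)
print(len(valid))  # only the first record passes validation
```

Filtering rather than raising keeps a batch usable when the model emits an occasional off-schema label, which is a common failure mode for structured LLM output.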