Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is so lame. These devices are "designed to emulate human conversation — and thus, human stories. Breaking under stress is a common narrative arc; this particular aspect of machine behavior, while fascinating, seems less indicative of sentience, and more just another example of exactly how ill-equipped AI guardrails are to handle the tendencies of the underlying tech." Seriously, there is no model or code yet for consciousness that we have identified. You don't get sentience from feeding information into a computer. Does he not get this?
youtube AI Moral Status 2023-05-27T03:2… ♥ 1
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_Ugz4NP00QEJzCVjEfyB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzuroy0LiTE7iwqRsN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwEUX2a6dLnKDa58HV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyrAMbotlJgyWw3xhx4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzAJb3vpWMHg3v3ARl4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "ban", "emotion": "fear"}
]
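A minimal Python sketch of how a raw batch response like the one above can be parsed and matched back to individual comments. This is illustrative only and not the pipeline's actual code; the lookup pattern and variable names are assumptions. The comment ID used below is the one whose codes match the Coding Result shown above.

```python
import json

# Raw LLM response: a JSON array with one code object per comment
# (reproduced from the batch shown above).
raw = '''[
  {"id": "ytc_Ugz4NP00QEJzCVjEfyB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzuroy0LiTE7iwqRsN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwEUX2a6dLnKDa58HV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyrAMbotlJgyWw3xhx4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzAJb3vpWMHg3v3ARl4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "ban", "emotion": "fear"}
]'''

# Index the batch by comment ID so each comment's codes can be looked up.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Retrieve the codes for the comment shown in this section.
code = codes["ytc_UgyrAMbotlJgyWw3xhx4AaABAg"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
# → developer consequentialist regulate outrage
```

Indexing by `id` makes the join between raw model output and the displayed per-comment result explicit, and a missing or duplicate ID surfaces immediately as a `KeyError` or an overwritten entry.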