Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Thinking you can put guardrails on AI is like an early primate thinking they can put guardrails on eventual humans. If early primates wanted to remain kings of the food chain (at least toward the top) they'd have to prevent humans from ever existing. Difference is, unlike some early primates, we, humans, have an actual choice, and the arrogance and naivety in this example shows the choice that will lead to the end of humans on earth.
youtube · 2026-01-10T17:0… · ♥ 1
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgwPQoOzO8nLtvEro2J4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwNSLMRBpztF1DgvWh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwRVZZk_YfX3Vh_5ht4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyrM6efe1aah1qMH154AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyoL6mt32OFtuKAhPJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxUnNRH6r6GVH1W2PV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwdJtpcNMGzgswgKYN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugx1tHwmmD3uPXxgRXN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxkEXKLcVWaGuUcxsd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxixEnkEXerrqlIihp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
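The raw response above is a JSON array with one coding object per comment id, so looking up a single comment's dimensions is a matter of parsing and filtering. A minimal sketch, assuming the response always parses as that array shape; `coding_for` is a hypothetical helper name, not part of the tool:

```python
import json

# Abbreviated raw LLM response (structure as shown above: one object per comment id).
raw = '''[
  {"id": "ytc_UgxUnNRH6r6GVH1W2PV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwdJtpcNMGzgswgKYN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def coding_for(raw_response, comment_id):
    """Return the coding object for one comment id, or None if absent."""
    for entry in json.loads(raw_response):
        if entry.get("id") == comment_id:
            return entry
    return None

coded = coding_for(raw, "ytc_UgxUnNRH6r6GVH1W2PV4AaABAg")
print(coded["emotion"])  # -> fear
```

In production code you would also want to guard the `json.loads` call, since an LLM can return text that is not valid JSON.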