Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It is 10 years in the future. We are sitting in our "homo sapien rest apparatuses" watching the new daily video courtesy of our digital overlords. The video shows this interview, specifically where Elon says "If we wait until a disaster happens before regulating AI, it'll be too late." And then the artificial face of our digital god comes on screen and just laughs and says "You probably should have listened huh?"
youtube AI Governance 2023-04-19T07:4… ♥ 37
Coding Result
Responsibility: ai_itself
Reasoning: consequentialist
Policy: regulate
Emotion: fear
Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_Ugyv5XTkm8cpAsLyOQV4AaABAg.9of9Z30NR-y9ofCgaROAR8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxwEwKvHZWqs9xhTcx4AaABAg.9of8Wd7S7so9ofBqiE5iW_","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxuWxoVAOVFsqeL4IF4AaABAg.9of1cngrccM9ofFL0NRtSi","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxiVymUX2d9jIazTat4AaABAg.9of-OHSTp0G9of-ahfysxs","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgzupSCo7I64eJJyYC94AaABAg.9oey7BcdFqg9ohN2MF13w5","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugzrv00ftucrQfUlew14AaABAg.9oesogIyvS39oguIMXDK0B","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_Ugzrv00ftucrQfUlew14AaABAg.9oesogIyvS39ogzFqAEHmL","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgwXl6sKkIDjWYEuYlB4AaABAg.9oer-aiMFT79ofRPDT50Ww","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwXl6sKkIDjWYEuYlB4AaABAg.9oer-aiMFT79ofTb4ewXsr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugz4M_cWd2tZ0mgqIN14AaABAg.9oepPa1bY8g9ofwW2PcyeS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
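A raw response like the one above can be parsed and sanity-checked before the records are stored. The sketch below is a minimal validator; the allowed category sets are inferred only from the values visible in this response (the full codebook may define more), and the `ytr_` id prefix check is likewise an assumption based on the ids shown.

```python
import json

# Allowed values per coding dimension, inferred from the records shown above;
# this is an assumption, not the project's authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "none", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "approval", "mixed"},
}

def validate_records(raw: str) -> list:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        # ids in this dataset appear to start with "ytr_" (assumption)
        if not str(rec.get("id", "")).startswith("ytr_"):
            raise ValueError("unexpected id: %r" % rec.get("id"))
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: bad %s=%r" % (rec["id"], dim, rec.get(dim)))
    return records

# Hypothetical single-record response for illustration.
sample = ('[{"id":"ytr_example","responsibility":"ai_itself",'
          '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_records(sample)))  # 1
```

Failing loudly on an out-of-vocabulary value catches the common failure mode where the model invents a new label mid-batch.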