Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ive never taken Rokkos' Basilisk seriously. But it feels like openly supporting control AI could be risky the way things are going....
youtube 2025-11-05T17:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwGBEJLWb3eAKRF-RN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzFmbQmvCahBt5P2Vd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyHoVBpAr8DeVxUNVJ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxViYAWXh6fiM2JPeN4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgynK1-eSn7FiHfG1_B4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwshhemtspFL9x-KQp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxgwlIoIXeLwZ4NaFl4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugxy8H8gzZS4zsYKQNp4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxQZ3Z6KSMDSojU9ld4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxmvqcuNINqM-8Ivj14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
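The raw response above is a JSON array of per-comment codings, so a single coded comment can be recovered by parsing the array and indexing by `id`. A minimal sketch, assuming the model output is valid JSON with the dimension keys shown above (`responsibility`, `reasoning`, `policy`, `emotion`); the excerpt below keeps only the entry that corresponds to the coding result table:

```python
import json

# Excerpt of the raw LLM response: one coding object per comment.
raw = '''[
  {"id": "ytc_UgwshhemtspFL9x-KQp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the codings by comment id for O(1) lookup.
codings = {c["id"]: c for c in json.loads(raw)}

# Look up the coding for the comment shown in the table above.
coding = codings["ytc_UgwshhemtspFL9x-KQp4AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
```

If the model returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is the natural place to hook in retry or repair logic.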