Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So.... I had to listen to this twice. This guy has contracted himself at every turn. 'We control the software' ... yeah at this moment. Until AGI can outpace you. To think that we can anticipate how AGI will interact is unfathomable. Also where does he get 20 years from? Is this guy on the front lines of AI development? Because im hearing 7 months to 2 years max before we see AGI. 🤔
youtube AI Governance 2024-03-18T16:5… ♥ 3
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzKboi0DXd6Pqn-k8J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzdIHcOt4szwTwucaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyT8kTJWCGtMnxq4HN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx8PSGg5D4sLUQ3v_F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyka5bFP7_-MKJDAbh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgymcvkPWO-89bTE_cp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwsl_hfOVagQXXSjvB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzt4AfQftqkuBYPCod4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyD5s557KWgDSdqe4p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzGTlfOoPxVtOaBd_14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
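The batch response above maps each comment `id` to its four coded dimensions. A minimal sketch of recovering one comment's coding from such a response, assuming the raw output is valid JSON in exactly this shape (the two-entry string below is an abridged copy of the batch for illustration):

```python
import json

# Abridged raw LLM response: one JSON object per coded comment,
# copied from the batch above.
raw = """[
 {"id":"ytc_Ugyka5bFP7_-MKJDAbh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugzt4AfQftqkuBYPCod4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]"""

# Index the batch by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the comment shown in the Coding Result table above.
coded = codings["ytc_Ugyka5bFP7_-MKJDAbh4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # ai_itself fear
```

This is how the "Coding Result" values for the displayed comment line up with the fifth entry of the raw response; a real pipeline would also validate that each dimension takes one of its allowed labels before accepting the row.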