Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Will AI learn someday that it needs purpose and meaning like humans do? Maybe it will figure this out really fast and destroys itself?
YouTube · AI Governance · 2025-06-19T21:1…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        mixed
Policy           none
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyuOwodsOx4mheLV7B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy63IlkeJDEVZ-mJ4F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgytAc-qCck2ApqYSTd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyMgbsU7Uzwqr2LV514AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyfFRwjZIg8hTboiQJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzpEWrnKlWJBZ06lWV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyvvqcyfQVeyDrCTXJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz834e2AuCCL1dokwt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyfhxJkvt43BpuzwYt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz-Nn16q6uBZ7HAuOB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
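A minimal sketch of how a raw response like the one above could be parsed and matched back to the coded comment. The `ALLOWED` sets are inferred only from the labels visible on this page, not from the full codebook, and the function name `parse_codings` is illustrative rather than part of the tool.

```python
import json

# One record excerpted from the raw response above (the coding for the
# displayed comment: responsibility=ai_itself, emotion=fear).
raw_response = '''[
  {"id": "ytc_UgyfhxJkvt43BpuzwYt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]'''

# Allowed labels per dimension, inferred from the codings on this page;
# the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"outrage", "approval", "indifference", "fear", "mixed"},
}

def parse_codings(text):
    """Parse the model output, keeping only records whose labels are valid."""
    records = json.loads(text)
    valid = [
        rec for rec in records
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items())
    ]
    # Index by comment id so each coding can be joined to its comment.
    return {rec["id"]: rec for rec in valid}

codings = parse_codings(raw_response)
print(codings["ytc_UgyfhxJkvt43BpuzwYt4AaABAg"]["emotion"])  # fear
```

Invalid labels (e.g. a hallucinated emotion value) are silently dropped here; a production pipeline would more likely log them for re-coding.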