Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
These theory is laughable, by 2027 the AI bubble will pop and that's it, maybe in 20 years we'll get an AGI, but we're not even close right now, jesus christ these people...
youtube AI Governance 2025-09-07T19:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzAzCBUtT5rghaChDV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz17VXqCVbI0zmm6XJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwiOLbIHTuq0mMf2lt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz0RNPDd1R13fx2uTR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxH54xdmunBtUyYhFl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw8RIyEDlyIIMXj4W14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_Ffv5Lm0Ebv2ZMMx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw7XqW9ziUIph-kLFd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzUyDQhNj1QZQqg8WV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzpQ87mrPYWWC_79H94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
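A raw response like the one above can be turned into per-comment records with a small parsing step. The sketch below is a minimal, hypothetical illustration (the function name `parse_coding_response` and the `ALLOWED` code sets, inferred from the values seen in this section, are assumptions; the full codebook may define more categories):

```python
import json

# Allowed codes per dimension, inferred from values appearing in this
# section (assumption: the actual codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    objects) into a dict keyed by comment id, rejecting unknown codes."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} code {rec[dim]!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example with the first record from the response above.
raw = ('[{"id":"ytc_UgzAzCBUtT5rghaChDV4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
coded = parse_coding_response(raw)
```

Validating against an explicit code set catches the common failure mode where the model invents a label outside the codebook, so bad records fail loudly instead of silently entering the analysis.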