Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I took acid one night and made the mistake of starting to talk about AI and when I went down the rabbit whole of what the future holds It almost drove me mad for a second. I snapped out of it eventually but acid is what told me all of this…. This type of technology is going to hurt us and reset us as humans, my children are going to grow up in a world we won’t recognize :/
youtube · AI Governance · 2025-09-05T15:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugy1uVE8i3Mnkbh4sel4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxcTdyHbv1Nu3ckgIN4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxvhqruO4lZUoe5AP94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxhsESO9qEwJvDQGn14AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzKPgn8g4_SePMTbJJ4AaABAg", "responsibility": "company",   "reasoning": "virtue",           "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxETUw_HG5lPJhb9Op4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugymx-5FpMSKq7d2ufd4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxJqS9hNV7Lv-yjGDp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwn4WeacK07L_fKBHp4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw8iuKfMp-cstXuPF94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "none", "emotion": "indifference"}
]
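The raw response is a JSON array with one object per comment in the batch, keyed by comment id. A minimal sketch of how a coded-result row can be recovered from it (the helper name `codes_for` is our own; only the data values come from the response above):

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codes.
raw_response = """[
  {"id": "ytc_UgxvhqruO4lZUoe5AP94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzKPgn8g4_SePMTbJJ4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "none", "emotion": "indifference"}
]"""

def codes_for(raw: str, comment_id: str) -> dict:
    """Return the coding dict for comment_id, or raise KeyError if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    raise KeyError(comment_id)

result = codes_for(raw_response, "ytc_UgxvhqruO4lZUoe5AP94AaABAg")
print(result["emotion"])  # fear
```

Matching by id rather than by array position makes the lookup robust if the model drops or reorders comments in its response.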