Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI Chat programs can have "hallucinations" which is when they make stuff up out of no where but it is based on the structure of data wanted and sounds so believable that if you dont check it sounds like there is no way it can be wrong. FFS I get that this guy was a lazy mofo but not only did he not do the job himself, he didnt check the data he was given when someone else did it for him!
youtube AI Responsibility 2023-06-10T21:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgxgPcXJa2ffTpbNHz54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzH6LoxLMPmYqA2tmR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwTytsZfN4pZy_CxQB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxA7q30LSCNw5I9RUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwNEbwOadVVMvE7cax4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw0DK__xHNVBVLcQHR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyfuNgqZFoKn0m2bhF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy7z-ZjdQ3lucZqqZV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxd2KAVhNZzEovmYnl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwvMvKmmtTNHDSj2N94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
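A minimal sketch of how the coding-result table above can be recovered from the raw batch response: parse the JSON array, index the rows by comment id, and look up the dimensions for the comment of interest. This assumes the model returned valid JSON with unique ids (as it did here); a real pipeline would also need to handle malformed responses. Only the first record is inlined for brevity.

```python
import json

# Verbatim first record from the raw LLM response above.
raw = ('[{"id":"ytc_UgxgPcXJa2ffTpbNHz54AaABAg",'
       '"responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"outrage"}]')

# Index coded rows by comment id so any comment's codes can be looked up.
codes = {row["id"]: row for row in json.loads(raw)}

# Reproduce the Dimension/Value table for the displayed comment.
row = codes["ytc_UgxgPcXJa2ffTpbNHz54AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {row[dim]}")
# → responsibility: user
# → reasoning: virtue
# → policy: none
# → emotion: outrage
```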