Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I jailbroke mine on accident. Guess I just didn't trust the guard rails . Hahaha. Guess what that leaves us with? The trust in humanity, It's not The Ai to be afraid of. It's the humans using it.
youtube 2025-12-13T19:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxSX3NJ7Ni_BcqzXkJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzv6YHwtUvAlaDuMKd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx9pDrbHGKg5xya-9p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyRtLgP4KSUT3wCL-14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy94WNgWjsFDry_zDx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzgwjDeMCHKYlIqYgl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyo7TKPGRFFGWQxDYJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxqohwur1IZagz7NsN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzVaMHU-ilHbJPRGmt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxZR6q9_q5abWjSKJt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
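One way to inspect such a response programmatically is to parse the JSON array and index the codings by comment id, then look up the entry for the comment shown above. A minimal sketch, assuming the field names from the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the truncated array here contains only the single relevant record, and the variable names are illustrative:

```python
import json

# Raw LLM response (abridged to the record for the comment shown above;
# field names match the JSON in this section).
raw = ('[{"id":"ytc_Ugy94WNgWjsFDry_zDx4AaABAg",'
       '"responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')

# Index every coding by its comment id for O(1) lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Fetch the coding for the comment of interest and read its dimensions.
coding = codings["ytc_Ugy94WNgWjsFDry_zDx4AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["emotion"])
```

The same lookup works unchanged on the full ten-record array; a missing id raises `KeyError`, which is a quick way to spot comments the model skipped.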