Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's already technically possible. Using a custom skill and AI APIs. The gap is adoption, and the fact that you cant totally trust the output. So the only extra step would be confirming your order first.
youtube AI Governance 2024-09-16T09:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzYc927d4M9il0poRJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyHMdVClD_hhXxqLEV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz-rzpZ5cAoe8blcMJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyaCp30FkGptR8NS794AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz9RpEkgbO06fcB9E54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy2Hm-5PmBXrJX_SaB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz-Jsydh6eyMA2Ja9h4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgynqEMs3a8XmDrk5dN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyu6V4vgpHe1AQyJ4R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw0lgUpRHbQCeGvBMl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
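To verify that a coded result matches the raw model output, the JSON array can be parsed and indexed by comment id. This is a minimal sketch using only the field names visible in the response above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the variable names and the choice of lookup id are illustrative, not part of the pipeline.

```python
import json

# Raw model output, abbreviated to two records from the response above.
raw = '''
[
  {"id": "ytc_UgzYc927d4M9il0poRJ4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz-rzpZ5cAoe8blcMJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
'''

records = json.loads(raw)

# Index the batch by comment id so a single coded comment can be inspected.
by_id = {r["id"]: r for r in records}

row = by_id["ytc_Ugz-rzpZ5cAoe8blcMJ4AaABAg"]
print(row["reasoning"], row["emotion"])  # unclear indifference
```

Because the model returns one JSON object per comment in a batch, this kind of id-keyed lookup is how a single "Coding Result" row traces back to its line in the raw response.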