Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If you're imagining sentient machines you're missing the point.  Using trigonometry, your brain figures out how far away the things you're looking at are.  It does this automatically.  Now imagine a machine that's been taught to learn, or infer, or deduce automatically.  It's literally impossible to imagine what it would "decide" to do.  "Decide", because it's not really conscious or self-aware.
youtube 2015-07-30T14:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugi0NsTNEzCVi3gCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UghKW3U90dEZzngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UggPxLAiNL91S3gCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggukRYKoZHNNngCoAEC", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugj16MbwYPEvoXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggenJTFfEzdeHgCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgigfobkZ5wmnHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugi_4nSGJeFjvngCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugg1ZPolIExGAXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgizJ7I8moWOHXgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
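The raw response is a JSON array of per-comment codings; the coding table for the comment shown corresponds to the entry with id ytc_Ugj16MbwYPEvoXgCoAEC. A minimal sketch of parsing and validating such a response, assuming each entry must carry exactly these four dimensions (the function name and error handling are illustrative, not part of the actual pipeline):

```python
import json

# Excerpt of a raw LLM response as shown above: a JSON array of codings.
raw = """[
  {"id": "ytc_Ugj16MbwYPEvoXgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggenJTFfEzdeHgCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# The four coding dimensions plus the comment id every entry must provide.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_text):
    """Parse the model output and index codings by comment id.

    Raises ValueError if an entry lacks any expected dimension.
    """
    codings = {}
    for entry in json.loads(raw_text):
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')} missing {missing}")
        codings[entry["id"]] = entry
    return codings

codings = parse_codings(raw)
print(codings["ytc_Ugj16MbwYPEvoXgCoAEC"]["emotion"])  # -> indifference
```

Indexing by id makes it easy to join the model's codings back to the original comments, and the key check catches truncated or malformed model output before it reaches the results table.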