Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
They seem to be missing the point of the letter on AI. If we succeed in creating a machine that can program itself, it will and at a speed that is incomprehensible to us. Our ability to predict what it will program breaks down in the first 5 seconds.
youtube 2015-07-31T00:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear

Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UghyligspsInLngCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgiR781rAxGYn3gCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugj5te92XVT4QXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggsHT54JBbk6ngCoAEC", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugju1o0fz003UHgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgjIsz0jcDy5yngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgieaIBYq1XbdngCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgibTCMNgh18engCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugh6qLpEz9mISngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugis0cWW2lpi3ngCoAEC", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
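The coding result shown above is simply the entry in this batch response whose id matches the comment. A minimal sketch of that lookup, assuming the raw response is well-formed JSON (the `extract_codes` helper is illustrative, not part of the pipeline):

```python
import json

# Abbreviated raw batch response: one object per coded comment,
# with the four coding dimensions as fields. The entry below is
# copied verbatim from the response above.
raw_response = """[
  {"id": "ytc_Ugju1o0fz003UHgCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def extract_codes(response_text, comment_id):
    """Return the coded dimensions for one comment id, or None if absent."""
    for item in json.loads(response_text):
        if item.get("id") == comment_id:
            return {dim: item.get(dim) for dim in DIMENSIONS}
    return None

codes = extract_codes(raw_response, "ytc_Ugju1o0fz003UHgCoAEC")
# codes now holds the same values as the Coding Result table:
# {'responsibility': 'ai_itself', 'reasoning': 'consequentialist',
#  'policy': 'unclear', 'emotion': 'fear'}
```

Checking each returned field against a fixed vocabulary (e.g. rejecting values outside the codebook) would catch malformed model output before it reaches the results table.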