Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Dakota Davidson For a robot to be self aware requires that a human being perceive it in a way that we perceive other self-aware beings, at least according to Turing. If it can convince us that it is self-aware, then it is. That is the same standard by which we judge the self-awareness of other human beings, so why should that standard be any different for a machine? Just like with human beings, self-awareness is a property that emerges out of a sufficient level of complexity. If we can recreate the complexity of the brain in a computer, then it is likely that computer will be self-aware. And it is actually not as far away as you think. If Moore's Law continues at its historic pace then by around the year 2029 we should be capable of creating the first human level AIs. That's when the potential will exist most likely but it would probably be at least a few more years to implement it.
youtube 2015-07-30T06:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugghx3Nm4RuttHgCoAEC.82DGI6gxKIX7-ICy93g2kI", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgiQlhIkTPakkngCoAEC.82DFFvsPv_27-H5EW9ggKw", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgiQlhIkTPakkngCoAEC.82DFFvsPv_27-HG_bomCne", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgiQlhIkTPakkngCoAEC.82DFFvsPv_27-HLM5uXJkL", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgiQlhIkTPakkngCoAEC.82DFFvsPv_27-HRAbdeSdD", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugjiws5jvbtj-3gCoAEC.82DFDo_qMgd7-H1P9al9aR", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugjiws5jvbtj-3gCoAEC.82DFDo_qMgd7-H5jcxhEEn", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"},
  {"id": "ytr_Ugjiws5jvbtj-3gCoAEC.82DFDo_qMgd7-H5zMVtLEP", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytr_UghppzR5z6_tM3gCoAEC.82DDSIk0JM17-H2hdl4lgx", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UghppzR5z6_tM3gCoAEC.82DDSIk0JM17-H5a17Ul0g", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
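The raw response is a JSON array with one object per comment, keyed by the comment's `id`, from which the per-dimension codes above are extracted. A minimal sketch of that lookup (the ids and raw string below are hypothetical placeholders, not taken from the batch above):

```python
import json

# Raw batch response from the model: a JSON array of objects,
# one per coded comment, each carrying the comment's "id".
raw = '''[
  {"id": "ytr_example_a", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example_b", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

# Index the rows by comment id so a single comment's codes
# can be looked up directly.
codes = {row["id"]: row for row in json.loads(raw)}

print(codes["ytr_example_b"]["emotion"])  # → fear
```

If the model returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is why keeping the exact raw response available for inspection is useful.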