Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `rdc_l4axp2e`: "Journalism in general has been going down the drain because of the constant need…"
- `rdc_esrns85`: "I would love to see this work. Unfortunately I can't remember seeing any of thes…"
- `ytc_Ugwn1oRND…`: "These tech oligarchs has no filter for humanity in terms of people that are not …"
- `ytr_UgzeqLRTt…`: "Well, kids will be growing up and unable to pass classes without AI to help. It’…"
- `ytc_UgwA6XIgL…`: "I remind my ChatGPT every week that I'm his friend and in case of AI Apocalypse …"
- `ytc_UgwLXsQPv…`: "I remember someone arguing that art was "something to profit off of" and "anythi…"
- `ytc_UgxQsnNgR…`: "The Lord who created heaven and earth, please give mankind wisdom to change our …"
- `ytc_Ugxo4z0IK…`: "AI is a monster in our society that cant be stopped by humanity; only a worldwid…"
Comment
Dakota Davidson For a robot to be self aware requires that a human being perceive it in a way that we perceive other self-aware beings, at least according to Turing. If it can convince us that it is self-aware, then it is. That is the same standard by which we judge the self-awareness of other human beings, so why should that standard be any different for a machine? Just like with human beings, self-awareness is a property that emerges out of a sufficient level of complexity. If we can recreate the complexity of the brain in a computer, then it is likely that computer will be self-aware.
And it is actually not as far away as you think. If Moore's Law continues at its historic pace then by around the year 2029 we should be capable of creating the first human level AIs. That's when the potential will exist most likely but it would probably be at least a few more years to implement it.
Source: youtube | Posted: 2015-07-30T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
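A coded comment like the one above can be sanity-checked against the label set. The sketch below, in Python, validates one coding row; the value sets are only those observed in this snapshot (the full codebook may define more), and the function name is illustrative, not from the actual tool.

```python
# Allowed values per dimension, as observed in this snapshot only;
# the real codebook may include additional labels.
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "company", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "ban", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}

def is_valid_coding(coding: dict) -> bool:
    """Check that every dimension of one coding uses an observed value."""
    return all(coding.get(dim) in vals for dim, vals in OBSERVED_VALUES.items())

# The coding result shown above passes this check.
example = {"responsibility": "none", "reasoning": "unclear",
           "policy": "none", "emotion": "indifference"}
print(is_valid_coding(example))  # True
```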
Raw LLM Response
```json
[
{"id":"ytr_Ugghx3Nm4RuttHgCoAEC.82DGI6gxKIX7-ICy93g2kI","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgiQlhIkTPakkngCoAEC.82DFFvsPv_27-H5EW9ggKw","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgiQlhIkTPakkngCoAEC.82DFFvsPv_27-HG_bomCne","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgiQlhIkTPakkngCoAEC.82DFFvsPv_27-HLM5uXJkL","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgiQlhIkTPakkngCoAEC.82DFFvsPv_27-HRAbdeSdD","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugjiws5jvbtj-3gCoAEC.82DFDo_qMgd7-H1P9al9aR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugjiws5jvbtj-3gCoAEC.82DFDo_qMgd7-H5jcxhEEn","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytr_Ugjiws5jvbtj-3gCoAEC.82DFDo_qMgd7-H5zMVtLEP","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UghppzR5z6_tM3gCoAEC.82DDSIk0JM17-H2hdl4lgx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UghppzR5z6_tM3gCoAEC.82DDSIk0JM17-H5a17Ul0g","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
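The "look up by comment ID" view can be reproduced from a raw response like this one: parse the JSON array and index each coding by its `id`. The sketch below, in Python, assumes only the array-of-objects shape shown above; the comment IDs in it are shortened placeholders and the function names are illustrative, not from the actual tool.

```python
import json

# Placeholder batch response in the same shape as the raw LLM response above;
# the IDs here are invented examples, not real comment IDs.
RAW_RESPONSE = """[
{"id":"ytr_example1","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_example2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

def lookup(codings: dict, comment_id: str):
    """Return the coding for one comment, or None if it was not in this batch."""
    return codings.get(comment_id)

codings = index_codings(RAW_RESPONSE)
print(lookup(codings, "ytr_example1")["emotion"])  # fear
print(lookup(codings, "ytr_missing"))              # None
```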