Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
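For reference, the lookup can be reproduced offline. The sketch below is a minimal illustration, assuming the raw model responses are stored as JSONL with one batch (a JSON array of coding records, as shown under "Raw LLM Response" below) per line; the file name and helper function are hypothetical, not the tool's actual storage or API.

```python
# Minimal lookup sketch. ASSUMPTION: coded batches live in a local JSONL
# file, one raw LLM response (a JSON array of records) per line. The path
# and function name are illustrative.
import json

def find_coding(comment_id: str, path: str = "raw_llm_responses.jsonl"):
    """Return the coding record for comment_id, or None if absent."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            for record in json.loads(line):
                if record["id"] == comment_id:
                    return record
    return None

# An ID taken from the raw response shown at the bottom of this page:
print(find_coding("ytc_Ugjm-y_VlSp7gXgCoAEC"))
# -> {'id': 'ytc_Ugjm-y_VlSp7gXgCoAEC', 'responsibility': 'none',
#     'reasoning': 'mixed', 'policy': 'none', 'emotion': 'outrage'}
```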
Random samples
- rdc_nkapeyq: Unrelated question... Where's the best place to monitor for when the "totally no…
- ytc_UgzVCn7kB…: When they say God- like AI that's when God will punish planet earth, Read Revela…
- ytr_Ugxi0oKm4…: people who keep asking this should educated themselves on how ai works and askin…
- ytc_UgyhASdSo…: Even the Ai is saying it works. Wonder how the 'Artists' are gonna argue about t…
- ytc_Ugxsj1b6O…: Wow! That hay maker he threw hit the robot so hard it went right through it 😂😂😂 …
- ytc_UgwE7HeJa…: AI replacing all the jobs... Anyone who believes that billionaires are working t…
- ytc_UgxxzvxWg…: Why would auto pilot after passing sit there in the trucks blind spot / Thought i…
- ytc_UgysmFCbF…: Its difficult to sit here listening to this and obviously i am very concerned a…
Comment
This over simplies John Searle's argument and cintributes to a common misunderstanding of it. He says that a digital computer can't be conscious or semantic but he doesn't say that a robot couldn't be. I know you specifically said computer and not robot but you left the impression that he did because you were talking about humanoid robots and never limited that to ones with computer brains. Searle is a reductive physicalist. His lecture at Google demonstrates that. He simply believes that the computers we have today lack a necessary physical mechanism for consciousness and sentience. Adding more RAM, faster processing and more elegant programming won't cut it in his opinion. I think he underestimates the power of emergence but I am not certain. And the philisophical question arises: If we developed a robot that could pass the in person version of the Turring test using a computer "brain" (which even Searle admits the possibility of) would it be morally safer to grant it personhood in case Searle is wrong about emergence?
Source: youtube
Posted: 2016-08-12T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

The full model response for the batch that included this comment; the Coding Result above is parsed from the record whose ID matches it.
```json
[
{"id":"ytc_Ugjm-y_VlSp7gXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugi2wDy0pT2alXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg7EsEaqcTWyXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjjLumvdyAMHngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UghgbmETgqh03ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjL299GEEWmWXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UggfuTD6n0HD9ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgipEKhD38TZX3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjvcfv2lTbNwXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggc9eL6NxhYungCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
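A response like this can be sanity-checked before it is stored. The sketch below is a minimal validator; note that the allowed value sets are an assumption seeded only from the values visible on this page (the actual codebook very likely defines more categories), and the function and file names are hypothetical.

```python
# Validation sketch for one raw batch response.
# ASSUMPTION: the allowed values below are only those visible on this page;
# the real coding scheme is expected to be larger.
import json

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
ALLOWED = {
    "responsibility": {"none"},
    "reasoning": {"mixed"},
    "policy": {"none"},
    "emotion": {"indifference", "outrage", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or unknown codings."""
    records = json.loads(raw)
    for rec in records:
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing keys {missing}")
        for dim in DIMENSIONS:
            if rec[dim] not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
    return records

# Usage: validate a saved response (path is hypothetical).
with open("response.json", encoding="utf-8") as f:
    records = validate_batch(f.read())
```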