Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Using it to play a role can be useful but ultimately it's a statistical algorith…
ytc_UgzM2Eob9…
Penrose describes consciousness as an emergent phenomena where the outcome is no…
ytc_UgxrheKRg…
OpenAI can discuss with Microsoft and cease ChatGPT service after they reach …
ytc_Ugw5ylBNC…
The military complex will of course have AI far beyond what the masses are allow…
ytc_UgwJ_EO-G…
Robot: I HAD ENOUGH I Hate THIS THIS JOB
Guy:yo yo bro chill
Robot:thorw box
Guy…
ytc_UgxdHEeOS…
I’m not afraid at all. It’s good if people come available for healthcare nurses,…
ytc_UgxO12fSd…
As a Vietnamese whose country got bombed by tons of Orange Agent dioxine that mu…
rdc_m95tr84
Hi. I'm a student and looking forward to be a computer scientist/programmer/soft…
ytr_Ugz1Nq9bt…
Comment
The problem is that as more manual labor becomes done by robots, the humans will more and more go towards science, learning and "Brain heavy" jobs. Which is fine at first, but then you develop a robot that can teach, that can learn; but what happens when all knowledge is taught by robots that, with time developed a better ability to learn than human beings themselves. We could then regress to some kind of inferior race versus our old metal servants because they would have the mean to control us.
youtube
2013-08-17T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzrOf6t6aLbReca1AJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw2S_xu9G5dFHpcYAl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz5wlVdvxY_T2Ag3vF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgySzmVymDiha7QN81J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwL2XiUPA7dkQyK36t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHnxkSaZnrZ3l_69h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7_gEnvWQqt_-j0lR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyHZ_VypyfLGRHFUH14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz5JX3dK_Cy3OSHByp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_UgzBoXUJBG9lDLN3KS94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
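The raw response above can be turned into the per-comment lookup this page offers. A minimal sketch, assuming the model always returns a JSON array of objects with an `id` plus the four dimensions shown in the coding table; the allowed value sets below are inferred only from the codings visible on this page, so the real codebook may define more categories:

```python
import json

# Raw model output, truncated here to two entries from the response above.
RAW = """[
{"id":"ytc_UgzrOf6t6aLbReca1AJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzBoXUJBG9lDLN3KS94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]"""

# Value sets inferred from the codings shown on this page (assumption:
# the actual codebook may allow additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference",
                "resignation", "outrage", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw response into {comment_id: coding}, rejecting bad values."""
    codings = {}
    for row in json.loads(raw):
        cid = row.pop("id")
        for dim, value in row.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        codings[cid] = row
    return codings

codings = parse_codings(RAW)
print(codings["ytc_UgzBoXUJBG9lDLN3KS94AaABAg"]["emotion"])  # outrage
```

Validating against a fixed value set catches the most common failure mode here: the model inventing an off-codebook label, which would otherwise silently skew the dimension counts.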