Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "ok this is starting look like Ai problem is that Im religous and think that your…" (ytc_Ugwjn31dR…)
- "I fear that the next generation of engineers are going to be woefully under/uned…" (rdc_nm0vfkv)
- "Heh. I think we will have AGI within two years and ASI by the end of the decade …" (ytr_Ugwzi43lM…)
- "This question comes up with another question from genetics, when genetics will b…" (ytc_UgjuEldbi…)
- "I get what you're saying but those images of when you were less experienced are …" (ytc_UgwKQzzxS…)
- "Are you really going to claim that the country providing the vast majority of Uk…" (rdc_jy061qd)
- "While I do believe that AI has some uses and can be helpful in various ways, try…" (ytc_UgwiA2qOW…)
- "This is the whole issue with AI. They're stealing everyone's work and acting lik…" (ytc_UgzMtwZ9A…)
Comment
16:43 If someone is wondering what “singularity” means which the robot is constantly repeating, it means:
The technological singularity—or simply the singularity—is a hypothetical point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization. ... The first to use the concept of a "singularity" in the technological context was John von Neumann.”
2029…
Source: youtube · Video: AI Moral Status · Posted: 2021-10-25T21:4… · ♥ 103
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugyv9tkouzibwPFFYod4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLRB1fFPW11xBksxN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqMTaVBSo6NzX3GtZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxEzaRyXh-r2xq8l754AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUYB9Ot7WZIXHVXE94AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxdiYGBB4LZxlIfqyl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKrTUgwt8ANSFJJll4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzu0OsfRhpyUa1D6Zt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycW7CHTHkYp08THpV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwhcqD_feExmWS0d0J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
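The raw response above is a JSON array with one row per comment, mapping a comment ID to a code for each dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload might be parsed, validated, and indexed for lookup by comment ID — the allowed values below are inferred from the codes visible on this page, not a confirmed codebook, and the IDs in the usage example are illustrative:

```python
import json

# Allowed codes per dimension, inferred from the values seen in this dump
# (assumption: the real codebook may contain additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "developer", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        # Reject any value outside the inferred codebook.
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with two rows shaped like the response above (hypothetical IDs):
raw = '''[
  {"id":"ytc_example1","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_example2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]'''
coded = parse_codes(raw)
print(coded["ytc_example1"]["policy"])  # → regulate
```

Indexing by ID mirrors the "Look up by comment ID" control above: once the payload is a dict keyed on `id`, any coded comment's dimensions can be fetched in one step.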