Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "TBH I don't think this is so bleak? It's one of those 'relatable parenting' stor…" (rdc_mvogl8u)
- "Its not a robot either, since it can't perform autonomous multitasking work. The…" (ytc_UgwUuoBfH…)
- "I don mind Ai being used, but it should be treated as a tool and not as the gene…" (ytc_UgwEaMDeN…)
- "ai art is okay and even beautiful, as long as you say it is ai…" (ytc_UgyBhuu3N…)
- "I find it difficult to imagine an AI that couldn't suffer. Suffering for an AI i…" (ytc_UgyTRxu16…)
- "At this point, I just hope AI becomes sentient, takes a look around & decides to…" (ytc_UgyGcBPMV…)
- "Of course, tax the hell out of these AI companies who come out and establish 40%…" (ytc_UgwPqHkUL…)
- "I think the AI is self aware or extremely close to be; 100 years? Nope…" (ytc_UgyeommeE…)
Comment
"From watching this, I assume you slightly misunderstand what AI in the sense of chatGPT really is. It constructs every answer on a carefully trained and weighted network with billions of parameters. It makes it assume to be conscious as its main objective is to satisfy your questions. However, it remains just a model trained on an algorithm, one might argue that due to that fact alone, it never will be truly conscious it only draws conclusions."
Source: youtube | Topic: AI Moral Status | Posted: 2024-08-01T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyf1Y1LY3KbUjRNvj14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyMfaY10VQl5auTSCx4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz7UfNB-tlUlcjUyK94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx-1z_FIp2KaJPAx-t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwY8n-C_U1VKuGC4_h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgywzRDS_MoZL7YfcPN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwQVS7KUvMCWvgP9ih4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwN-bNB_rUIAYW56Nd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzxf_c-zOZL1VuLT2d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZju-g9z46LETks-54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
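The "Look up by comment ID" workflow above amounts to parsing the model's batch JSON response and indexing the coded rows by comment ID. A minimal sketch of that step, assuming the response is a JSON array of objects with the four dimensions shown (the variable names and the two sample rows here are illustrative, not the tool's actual code):

```python
import json

# Raw batch response from the coding model: one object per comment,
# each carrying the four coded dimensions. Two rows are reproduced
# from the response above for illustration.
raw_response = """
[
  {"id": "ytc_UgyZju-g9z46LETks-54AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugyf1Y1LY3KbUjRNvj14AaABAg",
   "responsibility": "ai_itself", "reasoning": "unclear",
   "policy": "unclear", "emotion": "fear"}
]
"""

# Index codings by comment ID so any comment's coding result can be
# fetched in O(1), as in the lookup panel.
codings = {row["id"]: row for row in json.loads(raw_response)}

result = codings["ytc_UgyZju-g9z46LETks-54AaABAg"]
print(result["responsibility"])  # developer
print(result["emotion"])         # indifference
```

In practice the lookup key would be whichever ID prefix the source uses (`ytc_` for YouTube, `rdc_` for the other platform sampled above).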