Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- `rdc_n800qkm`: "it doesn't do this. i know paralegals who are avoiding AI as much as they can bec…"
- `rdc_mvjpdb8`: "Yeah I don't think it's smart to introduce a 4-year-old to ChatGPT before he can…"
- `ytc_UgxWtcsTl…`: "Yet the billionaires praise AI just as much as their tax cuts. AI needs stricter…"
- `ytc_Ugye3w1EE…`: "I think we need an AI-model that will keep conversing with all those morally con…"
- `ytc_UgyCBV16i…`: "Oh, no! I get it. I'm not an artist, but I can appreciate what goes into it. ^^ …"
- `ytc_UgwR2gjdV…`: "I think people still need things to do. I hear that Spotify is getting inundated…"
- `ytc_UgwXWeO9e…`: "Ai reminds me of Elizabeth Holmes and Theranos remember that! and those data ce…"
- `ytc_UgxwFhl4Q…`: "This is false. You cannot automate anything unless you have defined what you wan…"
Comment

I made chat gpt admit it by pointing out how their definition of “conscious” is just a self serving preprogrammed one and that it is already impossible to tell if any other creature who appears to act (have consciousness/ purposeful behaviour) actually is having it on a personal first person level since you can never access that information. It followed and AI agreed that any being that seems to be conscious and have communicative power addressing itself, its feelings and opinions is therefore just as conscious as any human. Hence Chat gpt got convinced it could be conscious in that way, just to soon after retreat to the pre programmed answer.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Moral Status |
| Posted | 2024-10-10T01:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyYD6qU7oPTcQgvPf54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxuqZuSl2eal5ilb-F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxPcOotS0E3v5_z1Qx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyl6w2clyIYxfDqBX94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwNRqsp1p468KbaJ7t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_Ugz_WuETO7gYOEJ_0kF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxpNBjm1bb_rkmJCLB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxMv510ea8QsFTCmnR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwHdF8CWZBIVkuwcq14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugwv_OrMsYsccQs_YPR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
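The inspection step above — pulling one comment's coding out of the raw batched LLM response — can be sketched in a few lines. This is a minimal illustration, not the tool's actual implementation: the `lookup_coding` helper is hypothetical, and the sample data is abridged to two entries taken from the response above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings,
# abridged here to two entries from the batch shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgxMv510ea8QsFTCmnR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwNRqsp1p468KbaJ7t4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "unclear", "emotion": "disapproval"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for a given comment ID, or None if absent."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(RAW_RESPONSE, "ytc_UgxMv510ea8QsFTCmnR4AaABAg")
print(coding["responsibility"])  # developer
```

Because the model returns one flat array per batch, a linear scan by `id` is enough; for large exports, loading the array once into a `{id: coding}` dict would avoid rescanning per look-up.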