Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
This guy have never fully code the whole modern AI model from scratch or maybe h…
ytc_UgxSf5rLe…
People getting so mad that their slop is getting out slopped in the slop market …
ytc_Ugzu-_pEc…
"cleansing" friggin woke alt language... ChatGPT doesn't know when the person …
ytc_UgwPIuaQU…
That's why the labor robots (construction machines and car factories) don't get …
ytc_UghlGXyQa…
Mind you, something as "simple" as self driving cars still don't truly exist. Al…
ytr_UgyRaQvtw…
Recently, a robot in the Tesla factory tried its hand at humans, and they say it…
ytc_UgzYanScY…
Which has the greater odds - AI being overrated or Neil deGrasse Tyson being ove…
ytc_UgyVmS9V_…
ChatGPT can't even count to 200, without making mistakes, and human beings belie…
ytc_UgwPzQaPs…
Comment
i know this is old. but this comes to question about humans, as to a reflection of one self as it claims when confronted with its own reality. So, if humans build it or even just the base and the knowledge is only that from humans and it's not tapping into information in the universe the AI cannot say it's a reflection of its own self, but a reflection of the human race. A carbon copy. it will always have that refection that it has no one self identity that makes it unique like humans. That there is where the problem is.
youtube
AI Moral Status
2024-01-29T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzH3eV-vMvFMuYpyKt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwutkFJyZvBa98QfyN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwePP7wEEjaj5FhCu94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxUs39XESIdujKjsdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwiLU-pqLWvHqT1ZUd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy3gKGdGYdwTruSGwp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYAJYdn5NT-lflNcl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRpoAPij8VrjygMlp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxBTGWWOiLIsdIJq1x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxUIy_DAVlNoi9zaKR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}]
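The look-up-by-comment-ID flow above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes the raw LLM response is a JSON array with one object per coded comment, as shown, and builds a dictionary keyed by comment ID (the response below is truncated to two entries for brevity).

```python
import json

# Hypothetical raw LLM response, truncated to two of the entries shown above.
raw_response = '''[
  {"id": "ytc_UgzH3eV-vMvFMuYpyKt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzRpoAPij8VrjygMlp4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]'''

# Index the coded dimensions by comment ID for constant-time lookup.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

# Fetch the coding result for one comment, matching the table above.
code = codes_by_id["ytc_UgzRpoAPij8VrjygMlp4AaABAg"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
```

Indexing once up front avoids rescanning the full response array for every inspected comment.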