# Raw LLM Responses

Inspect the exact model output for any coded comment: look up a comment by ID, or pick one of the random samples below.

## Random samples
- `ytc_UgygnbRpS…` — "You know on some level I do feel bad for the gen A.I. bros. I’m no pro artist or…"
- `ytc_UgziktW_K…` — "If we're already struggling at art to decide if a humans effort/work is meaningl…"
- `ytc_Ugxz8VfJL…` — "i copy righted my face, if you want to use the facial recognition on me you need…"
- `ytc_UgzU4JZuo…` — "As though corporations need MORE profits. Greedy capitalists are going to get us…"
- `rdc_cqitxxx` — "Yes but we do not have sentient machines in the sense that they would be entirel…"
- `ytc_UgwDgRNiq…` — "The idea that there will be no more jobs isn’t even CLOSE to insane. Robotics an…"
- `rdc_nlzmuta` — "To be honest, most information that people you’re forced to interact with daily …"
- `ytr_Ugwuzy32e…` — "Is a human the same as a slop machine? Are you the same as AI? Are you owned by …"
## Selected comment

> ok so all this data we put into cloud platforms from emails to top secretes all over the world. They are uploading to AI so they will be able to know everything in a matter of SECONDS. Just like we can already but with a phone but takes TIME. So makes sense why Ellon musk wants that brain implant STARLINK to work I believe was the name. We are always concerned about TIME.One thing in life you can COUNT for it back. I believe we need to stop as 1 HuMaN kind and be kind to another. And Focus or no will see the TIME to COME. Sophia and Han are on the RISE that's for sure.

| Field | Value |
|---|---|
| Source | youtube |
| Video | AI Moral Status |
| Posted | 2020-10-24T19:3… |
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
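Each coded record assigns one value per dimension from a small controlled vocabulary. A validation pass can catch malformed model output before it reaches a result table like the one above. A minimal sketch in Python, assuming the value sets observed in the responses on this page (the actual codebook may define additional values):

```python
# Allowed values per dimension, collected from the sample responses shown on
# this page — an assumption; the real codebook may include more values.
VOCAB = {
    "responsibility": {"company", "developer", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    errors = []
    if "id" not in record:
        errors.append("missing id")
    for dim, allowed in VOCAB.items():
        value = record.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

print(validate({"id": "ytc_x", "responsibility": "company",
                "reasoning": "consequentialist", "policy": "unclear",
                "emotion": "fear"}))  # []
```

Records that fail validation can be flagged for re-coding rather than silently displayed.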
## Raw LLM Response

```json
[
  {"id":"ytc_Ugz7oMaV6jNGxqyn6_l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx4l-H4UHGNdTcqrCF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxzRI-kprHmk0UUeSl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxsAyuE2qFbGhOhVgt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw7sCvU3I0kt9sLEN94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwrpQRA9NUefKYex6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzsqwiXXe5uHJGxP_N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyP6qkN3J7oMEtbJQ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw7WUqE7KyCGmNIYFB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyY3ZVE-w31DHySxil4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
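The raw response is a JSON array with one coding record per comment, so the "look up by comment ID" feature described at the top of this page amounts to indexing that array by `id`. A minimal sketch, assuming the field names shown in the response above (the two inline records are copied from it; a real tool would load the full batch file instead):

```python
import json

# A raw LLM response: a JSON array of coding records, one per comment.
# These two records are taken from the batch shown above.
raw = """
[
  {"id": "ytc_Ugz7oMaV6jNGxqyn6_l4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzsqwiXXe5uHJGxP_N4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

# Index the array by comment ID for constant-time lookup.
codings = {rec["id"]: rec for rec in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; KeyError if uncoded."""
    return codings[comment_id]

rec = lookup("ytc_UgzsqwiXXe5uHJGxP_N4AaABAg")
print(rec["responsibility"], rec["emotion"])  # company fear
```

Duplicate IDs in a response would silently overwrite earlier records in this dict; a production pipeline would want to detect and report them.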