Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Its crazy where AI is now compared to 1 month ago. AI as we know it isnt even at… (ytc_UgzPbSG8w…)
- No. It will be RNs using AI to make diagnosis because doctors will be considered… (ytc_UgxEup4_K…)
- That mod clearly don't know how AI art work and I doubt he is an artist or sth s… (ytc_UgwFV5jev…)
- @genyagalak9407 it's just common sense , many want to void God from creation of … (ytr_UgzSsEBci…)
- Thankfully AI is currently garbage, but who knows? Probably no one is safe. Good… (ytc_UgyWclYFz…)
- @anonymouse257 ai didnt but the one who made it did, the tool was trained on oth… (ytr_Ugy2RUN4f…)
- They can't call AI a Conspiracy Theorist. So they are going to limit it. it alre… (ytr_Ugz2-XOz2…)
- safe and ethical ai. umm yeah Black Rock own it, same guys making wars continue … (ytc_UgycvoaKB…)
Comment
Well most likely we would try to control them and if that fails destroying or killing them would suffice, you said it yourself we humans eliminate competition once we deem it “dangerous” enough,personally I wouldn’t mind treating a robot right I mean what do you have to lose even if they’re not sentient they’re still machines that you use, if you treat them well they will last longer,you wouldn’t destroy or treat badly your own car right? You just give it maintenance and don’t over use it and that’s basically a happy life for the robot with or without conscience
Platform: youtube · Video: AI Moral Status · Posted: 2019-01-14T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz5d1Q6Hspo0LkZHcJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2Ria52U8rYm4o-Ll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxh0nN_yMz5rJNJX6d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyofBn08Bm4LyCCnOB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwa2yI9dUj8pVUFUbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBFHv6g4gWKYZc5cV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyJ6S4J8y7auS0JgyB4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgybG6Eri3iLrYs_tgx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxvU-20s4sLbbGjtNd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxH35ZKOkIzcvq6hZp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
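The raw response is a JSON array with one object per comment, keyed by the four coding dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of the lookup-by-comment-ID step might look like the following; the `index_by_id` helper and the reuse of two sample rows here are illustrative assumptions, not part of the actual pipeline:

```python
import json

# Two rows copied from the raw LLM response above; the field names follow
# the coding table (responsibility, reasoning, policy, emotion).
RAW_RESPONSE = """
[
 {"id": "ytc_Ugz5d1Q6Hspo0LkZHcJ4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_UgxH35ZKOkIzcvq6hZp4AaABAg", "responsibility": "user",
  "reasoning": "virtue", "policy": "none", "emotion": "mixed"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index each code record by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_id(RAW_RESPONSE)
code = codes["ytc_UgxH35ZKOkIzcvq6hZp4AaABAg"]
print(code["responsibility"], code["reasoning"])  # -> user virtue
```

Indexing by ID makes it cheap to join a batch of coded records back to the original comment text, which is what the "look up by comment ID" view above does.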