Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
if AI was sentient, and a humanoid id see giving the AI their own rights considering we basically created life but if its a toaster or some shit, we cant give them rights, they'd use the classic feminist attack on us if we wanted to toast bread and they didnt feel like it but thats if sentient robots decided we were too primitive for society id have to agree there but it would be bad for us
Source: youtube · AI Moral Status · 2017-02-25T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgiWFjYdfZsvkngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UggV_tRsmQN5V3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjFfodza3TsRXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghwGaGgfxZIoXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh5g8AEGuXOE3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggksrPuAePRyngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgiEtmKfy12X4ngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UghaXETXaTKIFXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgiDwj_VH8C9ungCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggD68gYW29EmHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
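The raw response is a JSON array of per-comment codes across the four dimensions shown in the table above. A minimal sketch of parsing and validating such a batch — the allowed code vocabularies below are inferred only from the values visible in this response, so the real codebook may contain more:

```python
import json

# Allowed codes per dimension, inferred from the response shown above;
# the project's actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "distributed", "ai_itself"},
    "reasoning": {"mixed", "deontological", "consequentialist", "unclear"},
    "policy": {"none"},
    "emotion": {"mixed", "approval", "outrage", "indifference", "fear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}, rejecting unknown codes."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} code {rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example: first record from the response above.
raw = ('[{"id":"ytc_UgiWFjYdfZsvkngCoAEC","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"mixed"}]')
codes = parse_batch(raw)
print(codes["ytc_UgiWFjYdfZsvkngCoAEC"]["emotion"])  # mixed
```

Indexing by comment ID makes the lookup-by-ID view above a dictionary access, and the validation step catches any code the model invents outside the schema.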