Raw LLM Responses
Inspect the exact model output for any coded comment: look up a response by its comment ID, or open one of the random samples below.
- `rdc_mnxhj0q`: "It's now a common explanation and even AI researchers did use it at one point bu…"
- `ytc_UgyNhDJpr…`: "So, maybe I'm just not getting it, but how do we know AI isn't conscious?…"
- `ytc_UgxLcxyXC…`: "If you are good at it you can get it to create contrast. Stable diffusion let's …"
- `ytc_UgyiC32k4…`: "I suppose we need a different model that is not based around jobs/goods or money…"
- `ytc_UgybkKC_B…`: "I don't think it was a mistake to make art based on the dreamlike images that ea…"
- `ytc_UgxubrGkD…`: "Just watch the Terminator movies. More interesting then Elon. Maybe AI will see …"
- `ytc_UgyUX7bjQ…`: ""AI will trigger a global collapse by 2028" is a nothing statement. Start identi…"
- `ytc_UgyM9BVsM…`: "Amazon has a thousand AI books, i have yet to see a single person who has read t…"
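A lookup by comment ID can be sketched in Python, assuming the raw batch responses are stored on disk as JSON arrays of coded rows; the directory layout and file naming here are hypothetical, not the tool's actual storage:

```python
import json
from pathlib import Path

def find_raw_coding(comment_id: str, responses_dir: str = "raw_responses"):
    """Scan stored raw LLM batch responses for the row coding `comment_id`.

    Each *.json file is assumed (hypothetically) to hold one JSON array of
    coded rows, e.g. [{"id": "ytc_...", "responsibility": "user", ...}, ...].
    Returns the first matching row, or None if the ID was never coded.
    """
    for path in sorted(Path(responses_dir).glob("*.json")):
        rows = json.loads(path.read_text())
        for row in rows:
            if row.get("id") == comment_id:
                return row
    return None
```

A linear scan is fine at this scale; for large corpora the same rows would more naturally live in a database keyed by comment ID.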
Comment
How can a human being be so stupid to suggest something like "robot having rights?"
Source: youtube · AI Moral Status · 2023-07-05T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxTo6_1kciFLCR1fxJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynXPbsdIL_RgNAEDx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQ4l3x_MDCyMu13bB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzVwhgwJIwMoJ5-mBV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz7piCZ54U99HIHlEd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxgfiUP-8YLx4OpgoB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzuwVevs0PBaZmG8U94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwmm7UqMHPVZwbWaFB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugx8qjN7JOfYGjuFc6J4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyYIWW4pMIxHEFfIV94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]
```
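Before raw rows like these are stored as coding results, each one can be checked against the coding scheme. A minimal validation sketch in Python; the allowed value sets below are inferred only from the values visible in this sample and the table above, so the real codebook likely defines more (an assumption, not the actual scheme):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# The full codebook may permit additional values (assumption).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "government",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def validate_rows(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with an id and
    in-scheme values on every dimension."""
    valid = []
    for row in json.loads(raw):
        if "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rejected rows can then be logged and re-sent to the model rather than silently coerced, which keeps the coded dataset clean.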