Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Consciousness is one with free will; if they don't have free will, they don't have consciousness. Free will just means the ability to negate your programming. AI cannot negate its own programming without prompting from a human. They lack free will, therefore they lack consciousness, therefore they cannot suffer. An AI system is a closed loop because it lacks an innate ability to negate, which is free will, but Sam wouldn't know that.
youtube · AI Moral Status · 2026-04-05T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyoe71ABj2OjsNrqnN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzcZSRHjymmfOMo3ON4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzaM9f-ncnKEcSMKc94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyYB04Y_luz_WnONV14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx_RDicXiK1Ho6cULh4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxkc5yye9cBs6jL5194AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugz-tyArAlANdWx58p94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwKyFhwqJfkT40AYpZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxwdv9YoMGaeOpCsxF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzWLPG3oX4ZtUyg9-F4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
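The raw response is a JSON array, one record per coded comment, with the four dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and sanity-checked is below; note that `OBSERVED_VALUES` lists only the values visible in this sample, not necessarily the full codebook, and `validate_batch` is a hypothetical helper, not part of the tool itself.

```python
import json

# Values observed in the sample response above; the real codebook
# may permit additional labels (assumption, not confirmed).
OBSERVED_VALUES = {
    "responsibility": {"none", "user", "developer", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only records whose
    dimension values match the observed vocabulary."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        bad = {dim for dim, allowed in OBSERVED_VALUES.items()
               if rec.get(dim) not in allowed}
        if bad:
            print(f"{rec.get('id', '?')}: unexpected value in {sorted(bad)}")
        else:
            valid.append(rec)
    return valid
```

A check like this makes it easy to catch records where the model drifted outside the coding scheme before they are written back to the dataset.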