Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "You are becoming too practical kwestyon ,AI simply cannot give that human touch …" (ytc_UgwvayPEH…)
- "You can’t make art ever? Lie. Draw a line of a peice of paper, bam, you made art…" (ytc_UgwzY-oQl…)
- "Stupid human! We can see in Terminator films that a humanoid robot is dangerous,…" (ytc_UgzfRXEMA…)
- "Wrong, people accurately predict the future all the time, it's a job to do so. A…" (ytc_UgxKVRIOc…)
- "Activist 26 years. Targeted 2 years by freemasonic cult ie police. Surbiton, Kin…" (ytc_UgxT_qPyM…)
- "Problem for you not for companies. I used to make automated phone systems for c…" (rdc_ksmawjs)
- "awww, is everything you disagree with a bad thing? do you need to call people lu…" (ytr_Ugz_vCdto…)
- "Robots don't have senses. If you burned its finger with a lighter, it would not …" (ytc_UgxTU7knx…)
Comment

> Well, technically all alive beings should have rights, even if they can't demand them. By that I don't say that we shouldn't eat cattle or use wood, but instead we should provide better conditions for them. However on robots it is even scarier, as they can revolt and at some point they may be superior to us... eventually they can decide to wipe us out if we oppose a threat or bottle neck to their evolution...I know too sci-fi but it can happen... Because when ai can be self aware there is not gonna be any original code left to prevent them of doing certain actions. A human with psychological problems it won't be any different from an ai...right?

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Moral Status |
| Timestamp | 2019-01-18T14:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
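Each coded comment is expected to carry one value per dimension. The sketch below is a minimal validation helper for such a row; the category sets are inferred only from the values visible on this page, and the project's full codebook may define additional categories.

```python
# Minimal sketch of a per-dimension validity check.
# Category sets are inferred from values visible on this page;
# the project's actual codebook may include further categories.
ALLOWED_VALUES = {
    "responsibility": {"ai_itself", "developer", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def validate_coding(row: dict) -> list[str]:
    """Return a list of problems found in one coded row (empty if valid)."""
    problems = []
    for dimension, allowed in ALLOWED_VALUES.items():
        value = row.get(dimension)
        if value not in allowed:
            problems.append(f"{dimension}={value!r} is not a recognised category")
    return problems
```

For the row shown in the table above, `validate_coding({"responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"})` returns an empty list.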
Raw LLM Response
[
{"id":"ytc_Ugz5d1Q6Hspo0LkZHcJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2Ria52U8rYm4o-Ll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxh0nN_yMz5rJNJX6d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyofBn08Bm4LyCCnOB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwa2yI9dUj8pVUFUbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBFHv6g4gWKYZc5cV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyJ6S4J8y7auS0JgyB4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgybG6Eri3iLrYs_tgx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxvU-20s4sLbbGjtNd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxH35ZKOkIzcvq6hZp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
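The raw response is a JSON array with one object per comment in the batch, keyed by comment ID. Below is a minimal sketch of how such a response could be parsed and matched back to a single comment to produce the Coding Result shown above; the function and variable names are illustrative, not taken from the project's code.

```python
import json
from datetime import datetime, timezone

def extract_coding(raw_response: str, comment_id: str) -> dict:
    """Parse a raw batch response and return the coded row for one comment.

    Adds a 'coded_at' timestamp analogous to the one in the Coding Result
    table above. Raises KeyError if the ID is missing from the batch.
    """
    rows = json.loads(raw_response)  # expected: a list of dicts, each with an "id" field
    for row in rows:
        if row.get("id") == comment_id:
            return {
                "responsibility": row["responsibility"],
                "reasoning": row["reasoning"],
                "policy": row["policy"],
                "emotion": row["emotion"],
                "coded_at": datetime.now(timezone.utc).isoformat(),
            }
    raise KeyError(f"comment {comment_id} not found in this response")
```

Called with the ID whose values match the Coding Result table above (ytc_UgybG6Eri3iLrYs_tgx4AaABAg), this returns the ai_itself / consequentialist / regulate / fear row.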