Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or browse the random samples below.

Random samples
- `ytc_UgzFXDgl6…`: "So much fantasy in AI, AI is not going to replace us, but surely it will change …"
- `rdc_gx72c1e`: "The country with 4x as many people and is the manufacturing hub for most of the …"
- `ytc_Ugz0U-PgP…`: "In Beijing AI face recognition is used to prevent people from stealing toilet pa…"
- `ytr_UgwbjDkq5…`: "This is nothing compared to the city of Boston. They spent $600k in civil asset …"
- `ytc_UgyZ-mk9C…`: "to the AIs defense, AI art is like 4 years old, while humans have been drawing f…"
- `ytc_UgyJOA4lT…`: "What, this was 5 years ago. I thought this was now, what with all the new AI tec…"
- `ytc_UgxjKqblV…`: "i'm sad that some artists are bashing these artists because they drew based on a…"
- `ytc_UgyQxfJT8…`: "What do you think about use for song lyrics. My favorite artist Steven wilson d…"
Comment

> This was extremely intrigues me. Let's say you dismantle a simple toy robot. Would you feel bad for 'destroying' it ?
> Now let's say they make an extremely realistic pet puppy, complete with all the movements, emotions, sounds, need for affection and care and food (all artificial ofcourse). Making it cry when you hit it too hard, making it show sadness if you mistreat it. Would it still be ethical abusing a highly complex being that shows emotion and doesn't want to die?
> If toys or machines and AI become 100x as complex and realistic as now, the boundaries of being real or artificial will start to fade greatly.

Source: youtube · AI Moral Status · 2018-02-20T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
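Each coded record should only take values the codebook allows for each dimension. As a minimal sketch of such a sanity check, the value sets below are only those observed in this sample's records (not necessarily the full codebook), and `invalid_dimensions` is a hypothetical helper, not part of the actual pipeline:

```python
# Value sets observed in this sample; assumed, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "user", "developer"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed", "resignation"},
}

def invalid_dimensions(record):
    """Return the dimensions whose values fall outside the allowed sets."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above.
coded = {"responsibility": "user", "reasoning": "deontological",
         "policy": "unclear", "emotion": "mixed"}
print(invalid_dimensions(coded))  # []
```

Because dicts preserve insertion order, the returned list follows the dimension order of `ALLOWED`, which makes any validation failure easy to read.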
Raw LLM Response
```json
[
{"id":"ytc_Ugw4jv3CEWzURRu5HsN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMfYuv9cf_JLp-n9h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8aRZQRBx68_zKO9l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxCiFXiou1vqLS4Dxl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwu5KhSaSD6YSzpwd54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxQduWc7dqx2_EgNEd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzlzabsKq9ISy3CBTh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOjsFP0aIFzc2ww-54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxAr-F4o1eHq2J_kPp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugww6qRG8pzD28KzuvZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
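The lookup-by-comment-ID view above can be sketched with standard JSON parsing. This is a minimal illustration, assuming the raw response is a well-formed JSON array of records with an `id` field (the two records below are copied from the response above; `index_by_comment_id` is a hypothetical helper name):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_Ugw4jv3CEWzURRu5HsN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxAr-F4o1eHq2J_kPp4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]
"""

def index_by_comment_id(response_text):
    """Parse the raw response and build an id -> record lookup table."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

lookup = index_by_comment_id(raw_response)
record = lookup["ytc_UgxAr-F4o1eHq2J_kPp4AaABAg"]
print(record["reasoning"], record["emotion"])  # deontological mixed
```

A real implementation would also want to handle `json.JSONDecodeError` for malformed model output, since LLM responses are not guaranteed to be valid JSON.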