Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- It isn't a new form of capitalism. It is just a monopolisation of market researc… (ytc_UgxYdlojH…)
- Guys, come on, this will never happen. Anthropomorphizing computers and AI is … (ytc_UgzbICxFz…)
- I think it's funny how the scientists or suicidal atheist are the ones racing to… (ytc_UgxFlIzae…)
- There is something about AI that has reveal the folly of bureaucracy ( basically… (ytc_UgzFF8j86…)
- As for AI generating code with vulnerabilities - that’s a fact. Just recently, C… (ytr_Ugzv4Wtka…)
- I love AI and use chatGPT daily, doesn’t mean I am going to stop questioning thi… (rdc_m3bunar)
- Future generations will be less polite if they’re conditioned to not be polite t… (ytc_Ugx-hhvgs…)
- Don’t normally comment but am doing this for the algorithm because as a Comp Sci… (ytc_UgymPxE0j…)
Comment
Eddie Venegas no that wouldn’t work. Feelings are just reactions to sensory input. You can program a robot puppy to react happily to seeing a ball. What makes that reaction different to the same reaction of a real puppy?
The fact is, that this is a very difficult question to answer. They recently made a test where a humanoid robot was put in a room, you then had to go, activate it, talk to it, then turn it off and leave the room. But just before they would flip the switch, the robot would beg not to be turned off, saying it was afraid of the dark.
More than 70% hesitated and 30% outright refused to turn the robot off.
Did that robot have feelings? Or was it just programmed?
We know it was programmed, but what if having that program made it fear being turned off?
Source: youtube · AI Moral Status · 2019-04-23T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugx8NXAldCHLJs-foPN4AaABAg.8u7jXyTKlBb8uV0weoCpYh","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyKm3tW9_-V6uBkQnJ4AaABAg.8u0nQy6bomc8u3Y0onE3QK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgznYCuNHsBfDMzAZO94AaABAg.8tylgDzBWLS8uVJ1RQWxLO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzNg7iUiw2XkcMaSq94AaABAg.8tsvaPoyCSX8uV5Oy4VAaY","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy-Q5DKyQ4-6-ZjUKN4AaABAg.8tiKHiVFRaM8uV7kmhKPSx","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxGMgPFhm8CtQQ2sdN4AaABAg.8tfdDQBMSBK8uV8SqIIrwQ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyUFKI19W56UeTHVtF4AaABAg.8tHyH-2wSKO8tIOMEIHj8u","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzO_sh5Lua2X1HyIVZ4AaABAg.8sZBx6MfvkV8t9RRry38r9","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugxr39daBAZP-7www894AaABAg.8s4Z0tehnor8sRjd3usZXf","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugzpx5PWUPGFwRj8aEx4AaABAg.8rzixGQlQvu8sRlndjl6pl","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
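A raw batch response like the one above can be parsed into a per-comment lookup table for inspection. The sketch below is a minimal, hypothetical example; the dimension vocabularies are inferred only from the values visible in this sample, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension -- an assumption inferred from the
# sample output above, not the authoritative codebook.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "mixed", "approval", "resignation",
                "outrage", "fear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: codes}, marking any
    value outside the expected vocabulary as 'INVALID'."""
    coded = {}
    for rec in json.loads(raw):
        codes = {}
        for dim, allowed in DIMENSIONS.items():
            value = rec.get(dim)
            codes[dim] = value if value in allowed else "INVALID"
        coded[rec["id"]] = codes
    return coded

# Usage with a shortened, hypothetical record:
raw = '[{"id":"ytr_example","responsibility":"none","reasoning":"mixed",' \
      '"policy":"none","emotion":"approval"}]'
codes = parse_raw_response(raw)["ytr_example"]
```

Keying the result by comment ID mirrors the "look up by comment ID" workflow of the page: given an ID from the samples list, the exact model output for that comment can be retrieved directly.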