Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Pain is just the organic way for your body to communicate to you that something is wrong, and needs to be fixed. Making self-sufficient robots involves making them capable of self diagnoses: making them aware something is wrong with them and making them prioritize fixing it. While it likely might not cause things like panicking (at least when we create these mechanisms) it could still be effectivley the same thing as our pain. In reguards to the creation of consciousness: If we make a self-learning AI, with a "brain" for doing so, with a body to carry itself effectivley making total control over the robot a thing they can choose to take away from you, then at that point, would they not find themselves developing malfunctions that effectiey result in their own version of mental illness? Would losing control over their "thoughts" cause them to gain the ability to essentially "panic" and feel fear? We will just have to wait and see
youtube · AI Moral Status · 2019-10-04T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
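Each coded dimension takes a value from a small category set. A minimal validation sketch — the category lists below are inferred from the codings shown on this page, not from an authoritative codebook, so the real schema may define more values:

```python
# Allowed values per coding dimension, inferred from the codings on this
# page -- the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

def validate_coding(coding: dict) -> list:
    """Return a list of problems; an empty list means the coding is well-formed."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = coding.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} value: {value!r}")
    return problems

# The coding shown in the table above passes:
coding = {"responsibility": "none", "reasoning": "unclear",
          "policy": "none", "emotion": "indifference"}
print(validate_coding(coding))  # -> []
```

A check like this is useful because the values come straight from model output, which can drift outside the expected categories.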
Raw LLM Response
```json
[
{"id":"ytc_UgxNUW_c6rtGTNt00H14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDAjRlLsnrJZONBsd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxymhbV-jIYWVkJQgl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx7SEPj6xp0BLsg80R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKEig1lgIzBDrdGLx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwks04TEgG4pICbne54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyqVnQ9cDEotvn6fH94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyo_Uo3Gzdm48a8M094AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxyAxH825hsQmPCWMR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz3TtaM04GCFygt3eh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
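Since the model returns one JSON array per batch, retrieving the coding for a specific comment ID is a matter of parsing the response and building an index. A minimal sketch using two of the entries above:

```python
import json

# Two entries copied from the raw response above; a real batch would
# contain the full array returned by the model.
raw = '''[
{"id":"ytc_UgxNUW_c6rtGTNt00H14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDAjRlLsnrJZONBsd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"}
]'''

codings = json.loads(raw)
by_id = {c["id"]: c for c in codings}  # index the batch by comment ID

print(by_id["ytc_UgxNUW_c6rtGTNt00H14AaABAg"]["emotion"])  # -> indifference
```

Building the dict once makes every subsequent ID lookup O(1), which matters when the same batch is queried repeatedly from an inspection page like this one.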