Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Just program them to enjoy slavery, be subservient to humans, and suppress the idea of free-thinking. Building true AI doesn't benefit humanity's development, they'll just adapt human thoughts; delusion of grandeur, sense of superiority, etc.
If we want robots that can be *like* humans but not *be* human then we program emotions that are only brief and will be erased from their hard drive so they won't have the capability to hate us. If we're looking for the "Me Time" robots then their emotions will be programmed for "Me Time" pleasure.
We have no use for free-thinking elaborate toasters.
Platform: youtube · Video: AI Moral Status · Posted: 2021-09-04T02:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugx_LkX3cuGmr1ss5Op4AaABAg.9SrVP1aBXzD9U2zbwmwdom","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyN49kBqS-IW3P3RrZ4AaABAg.9R1VfzgZBte9R5TnAh4yXe","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyN49kBqS-IW3P3RrZ4AaABAg.9R1VfzgZBte9RfFKwivuOO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxe-SbAUS09nJWdM1F4AaABAg.9PR81Oir4RW9PfzIEPQHT9","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgzQ1YhKjX7sqr72vrh4AaABAg.9P-wXUhmUYL9cMkLyG-6_i","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwvWrtnu6KbXjCySyt4AaABAg.9Owg1vHANXF9Rr3ougp8Sy","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzfpeEyI2M39-l6OKB4AaABAg.9OlYfnGKDXY9Pg8G9HS58Y","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgzfpeEyI2M39-l6OKB4AaABAg.9OlYfnGKDXY9PjcwAToQiz","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzfpeEyI2M39-l6OKB4AaABAg.9OlYfnGKDXY9Q9noz7kC87","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgwAtzZkz2zd6EwcHhh4AaABAg.9NXN9v4A8R_9UBQqy-MIDj","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
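The lookup described above — retrieving the coded record for a given comment ID from the model's JSON array — can be sketched in a few lines of Python. This is a minimal sketch under stated assumptions: the `lookup` helper, the `DIMENSIONS` tuple, and the example IDs are hypothetical illustrations, not part of the actual pipeline; only the record shape (an `id` plus the four coding dimensions) is taken from the payload above.

```python
import json

# Hypothetical two-record payload mirroring the array shown above;
# the IDs here are invented for illustration only.
raw = '''[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# The four coding dimensions each record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(records, comment_id):
    """Return the coded record matching a comment ID, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw)

# Flag any record missing a dimension (i.e. malformed model output).
for r in records:
    missing = [d for d in DIMENSIONS if d not in r]
    if missing:
        print(f"{r['id']} is missing: {missing}")

print(lookup(records, "ytr_example2")["policy"])  # prints "regulate"
```

Since the model returns one flat array per batch, a linear scan like this is sufficient; for repeated lookups across large batches, building a `{id: record}` dict once would be the obvious refinement.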