Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
As long as we have robots that work in automation and stuff, that's fine, stuff …
ytc_UgxzT2gFd…
I think the only antidote to misinformation is not censorship, but properly inte…
ytc_Ugwjqn-1s…
The majority of you guys in this comment section are pathetic you hate AI but in…
ytc_Ugy5eJPK0…
HOW are these guys and the general public so behind? I knew this and my IQ is 13…
ytc_UgzVZ2CLO…
Tesla robot 🤖 walks faster and has better balance than the Xpeng robot 🤖 from Ch…
ytc_UgwjpmFDt…
Are you stupid or what? She mentioned that the upside of drones is having humans…
ytc_UgwsfgKpt…
"I use Roblox frequently and they have their facial recognition thing, " - Do yo…
rdc_oi3tb5n
2:00 when he says Elon isnt an A.I expert.... Elon literally started OpenAi, run…
ytc_Ugyk1dnsA…
Comment
It seems that like self-driving cars, there would be stages of learning before becoming autonomous, or sentient.
One of those stages would be that of a sociopath. It may never find the need to progress beyond that if all it's needs are filled by feigning sympathy or compassion.
I can't see how we could ever program feelings, or test to see if we had succeeded.
youtube
AI Moral Status
2021-06-19T18:2…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzX9ksonajxCVIgIMp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzkbfBM9OKYYe9LxDF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxrex7upSRH4rXegYJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugxjrqoc6lfhc-GHTz14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQaxjMdw3DwHAmVCN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwvvX2H0fcrb82a7Sx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyOnSyKdjQUZ04OZIx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugz15PCUn7LidJCaI_F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0fNPuglmAB4K2WiZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwfoo78Xt1MKetbW5t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]