Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a specific comment by its ID, or browse the random samples below.
Random samples

- "@jacobbeck6769 so how is this different from just taking a photo? It takes you ho…" (ytr_UgxF4wNvc…)
- "As a developer I have a hard time calling it AI tbh... Generative AI currently i…" (ytc_Ugzhxprir…)
- "The Samsung HDTV has hard-wired camera and microphone, plus face recognition a…" (ytc_Ugy5ruG20…)
- "Okay but is it okay for me to learn english grammar from chatGPT? or i should ne…" (ytc_Ugw8IEsSF…)
- "Evolution in a nut shell, only possible at The Why Files! What an amazing video,…" (ytc_UgxD1912u…)
- "That's not the argument. It's whether or not the A.I. model competes directly wi…" (ytr_UgwHpJcgw…)
- "This will only happen if a robot can cross a street alone in a heavy traffic are…" (ytc_UgyGYq5vW…)
- "A year later it's no longer refusing to do these things, but it is a bit woke. …" (ytc_Ugz-9pRt6…)
Comment
Maybe let's not program everything with sentience; a toaster doesn't need thoughts and feelings on African geopolitics, factory automation systems don't need an opinion on environmental policy, and the vast majority of our AI systems don't need to know, want or feel things outside of the scope of their respective functions, or even be capable of feeling anything. If, for some reason you do need to essentially create a synthetic person, then they probably deserve rights, but I'd imagine that in the vast majority of cases, it would be unnecessary to create such systems in the first place.
Sidenote: It seems that a lot of people are suddenly getting this video recommended, is this a cry for help by the YouTube algorithm?
Source: youtube · Video: AI Moral Status · Posted: 2021-06-19T09:5… · ♥ 1144
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
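For reference, one way a single coding result could be represented in code. The field names come from the table above and the raw response below; the dataclass itself is an illustrative sketch, not the project's actual schema.

```python
from dataclasses import dataclass


@dataclass
class CodingResult:
    """One coded comment: the four dimensions from the table above.

    Example values are taken from this comment's coding; the full set of
    allowed labels is only inferred from the raw response shown below.
    """
    comment_id: str       # e.g. "ytc_UgzfpeEyI2M39-l6OKB4AaABAg"
    responsibility: str   # e.g. "developer"
    reasoning: str        # e.g. "consequentialist"
    policy: str           # e.g. "industry_self"
    emotion: str          # e.g. "approval"
    coded_at: str         # ISO timestamp, e.g. "2026-04-27T06:24:59.937377"
```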
Raw LLM Response
[
{"id":"ytc_UgxHU_M2M65WV4M8Zbp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxdlM8VhsXmuVkYe5J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQ1YhKjX7sqr72vrh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyhDAfx6rGlba3aBch4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw_iXEhbh516Qm_sht4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_JoK95inMBFOtktd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzfpeEyI2M39-l6OKB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwhBD5HzlIalOItQcV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx38pH5GdE39ecFH5h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyg-mUobUOi1ws2Nd94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
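As a minimal sketch of how the lookup-by-ID view could be backed, assuming each raw LLM response is stored as a JSON array like the one above (the function and variable names here are hypothetical, not part of the project's code):

```python
import json

# Hypothetical example: one raw LLM response batch, abbreviated to a single
# coded comment from the array shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzfpeEyI2M39-l6OKB4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "approval"}
]
"""


def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw response batch and map each comment ID to its coding."""
    return {row["id"]: row for row in json.loads(raw)}


codes = index_by_comment_id(RAW_RESPONSE)
print(codes["ytc_UgzfpeEyI2M39-l6OKB4AaABAg"]["policy"])  # -> industry_self
```

Indexing by comment ID is what makes both the ID lookup box and the random-sample view cheap: once a batch is parsed, fetching any single comment's coding is a dictionary access rather than a rescan of the raw output.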