Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This is unironically the most realistic take I've seen on 'AI taking our jobs' l…" (ytc_UgyUM8Rau…)
- "Bookmarked to use when replying to people who tell me self driving cars are safe…" (ytc_UgwuNudR0…)
- "I agree that people shouldn't send personal data to AI, nor use them as therapis…" (ytc_UgwJmGROr…)
- "This was the best episode you ever did... Now I really see what Musk was talking…" (ytc_UgyVo0d2u…)
- "Except you can overwork AI with no problems (just throw another server in there)…" (rdc_ksq3jl8)
- "I agree with you, I really don't like and feel sad about AI art, human art is th…" (ytc_UgxiyBEUq…)
- "Out of all videos that answer this question, this is by far the best. I feel lik…" (ytc_UgwYn95p4…)
- "Haha, this is a cultural disadvantage of so many westerners. Their concerns abou…" (ytc_UgxaE2Bjo…)
Comment
a lot of what we experience as important, say love for instance, has a biological root as well as thought and memory. If you wanted AI more aligned to humans one way would be to give it some kind of reward sensation, some kind of guilt, hopefully not too much adrenaline, maybe some hormones or preference for bonding with others. It wouldn't have to be those chemicals just something that served those same functions. Also once it walks around in robots it can learn for itself in that way too. There is not much in human identity that can't be reduced to a set of preferences, many biology based. I prefer cold to hot, grapes to strawberries and so on.
youtube · AI Moral Status · 2026-03-05T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_UgywrwffJ7UVykrk7yN4AaABAg.ATrnMoEVCNMAU2fmSj4yiL","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgytV1pB9MINc2dSpMd4AaABAg.ATrcnWLGdy8ATrh3JzvqSD","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgytV1pB9MINc2dSpMd4AaABAg.ATrcnWLGdy8AU4tgmIWtPW","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxirK7zMYMdyUSLAzV4AaABAg.ATrbu5oGmuTATvm5xCXx90","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugy6u3kyBQ36uFtdJrt4AaABAg.ATr_NEw4itvATyLme0Wc2B","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyiFYVU0bGYFXPyrgB4AaABAg.ATrYdG8eEdnAVtXEZVlk14","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugw_4BOXYSEPssNONSt4AaABAg.ATrRRVuFqhaATrnVG9a8Yv","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgxnrBON8G5xjj0mjAd4AaABAg.ATrQWtkKELnATrzgThaOq9","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwrWbcNdt7nemWUMHd4AaABAg.ATrNfQtKpOYAUO48VCIvqD","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwrWbcNdt7nemWUMHd4AaABAg.ATrNfQtKpOYAUknhEH2Xig","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
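The raw response above is a JSON array with one record per coded comment, each carrying an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID follows; the field names come from the response shown here, while the sample IDs (`ytr_abc`, `ytr_def`) and the `index_codings` helper are hypothetical, and the real batch may use additional dimension values beyond those observed in this sample.

```python
import json

# Hypothetical raw LLM response in the same shape as the batch above
# (the IDs here are placeholders, not real comment IDs).
raw = """[
  {"id": "ytr_abc", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"},
  {"id": "ytr_def", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]"""

# Every record is expected to carry these fields.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_response: str) -> dict:
    """Parse a raw coding response and index the records by comment ID,
    rejecting any record that is missing an expected field."""
    records = json.loads(raw_response)
    by_id = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id

codings = index_codings(raw)
print(codings["ytr_abc"]["emotion"])  # approval
```

Indexing by ID up front makes the "look up by comment ID" path a single dictionary access, and the field check surfaces malformed records at parse time rather than at display time.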