Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples — click to inspect
Id like America to fuck off. No America, no Ai. No AI, no pink hair liberals, no…
ytc_UgxI7Q_pO…
We'll see a massive increase in the amount of projects created, a massive decrea…
ytc_UgwXOB-Rh…
I draw irl cute little cartoon cats; I like using a cheap scanner and ai to give…
ytr_Ugw7kr_RU…
the fact we have to ask if AI is dangerous proves our tech future is uncertain…
ytc_UgxsidEsB…
One of those self driving cars freaked me out in Vancouver Canada i was so confu…
ytc_UgwE76nqa…
I don't understand why people think AI is so powerful right now... they are lang…
ytr_UgzSiAszP…
My god Undertale/Deltarune Sound effects in a “Ai is bad” video, we have reached…
ytc_Ugy8Ge2Nz…
How AI will replace the human hearts, experience, life stories, human heart?! NO…
ytc_UgwmKLStx…
Comment
Yes, the alignment is a very real problem when dealing with AI. I have a funny example.
I made an AI vaccum cleaner and one of it's task is to go to the charging pad just before its battery is about to die by taking the most efficient route.
It once got stuck in the middle of the hallway and I picked it up and put it on the charging pad again.
Now, whenever its battery is about to die, it just sits in hallway because that according to it is the most efficient way to get back to the charging pad.
This AI vacuum cleaner is not very complex, I was able to fix this issue in no time but with the complex AI that we will see in future we might never find what led to any undesirable behavior and AI doesn't even has to be conscious for things like this to happen.
youtube
AI Moral Status
2023-08-20T21:4…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugypjv3bQ2Tz6_WpGpl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxz8U1BSVaQ54S54eB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyiOKIlotGd3U-H54N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw9zvAR2Zt7r1nO_4d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwadnjdnaiJXIk7-Zx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6NBuAW8DASm5TgeJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwSR5nKfd2aD4v_3uR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvVkHeFCMHKtXmkZN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxYh7MjLz_uAMCxhit4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzXX08fCS8uf74lRvl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
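The raw response above is a JSON array with one object per coded comment, keyed by the same dimension names that appear in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). The comment-ID lookup can be sketched as a small parser over that array; this is a minimal sketch, not the tool's actual implementation, and the IDs below are placeholders rather than real comment IDs:

```python
import json

# A raw LLM response in the format shown above: a JSON array of
# per-comment codings. The IDs here are hypothetical placeholders.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
"""

def lookup_coding(raw, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    return None

coding = lookup_coding(raw_response, "ytc_example1")
```

A lookup that finds no matching ID returns `None`, which lets the caller distinguish "comment was never coded" from a coding whose dimensions are all `"none"`.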