Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I would’ve thought by 2021 we would be so much further along than this. Billions of dollars in research, phones that can do amazing things, rockets that can land themselves, self driving cars......but we have robots that can barely have a conversation. Maybe i expected too much.
> We are certainly far, far away from worrying about terminator.
Source: youtube · Video: AI Moral Status · Posted: 2021-05-14T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzBT0OTpWw6oE5cwe54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxPXtIPuCjr7IBU7VB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx1Gq6QEzBn_Qv3A9t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwYBvtqhjaPNnRskcx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz57YL5RbzDRwvoizh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxIPSus5QRjrlz58f54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwcUqPIVEVvVSK0MxV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgydDedH1s_-a5tJkdl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxE_QVBACi-4NGkERt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyttFik_Od8BQrJ1Sl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
```
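The raw response is a JSON array of per-comment codings, keyed by comment ID. A minimal sketch of parsing such a response and looking up one comment's coding by ID — the inline `raw` string (a two-entry excerpt) stands in for wherever the stored response actually lives, which is an assumption for illustration:

```python
import json

# Excerpt of a raw LLM response: a JSON array of coding records,
# one per comment, each keyed by the comment's ID.
raw = """
[
  {"id": "ytc_UgzBT0OTpWw6oE5cwe54AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugx1Gq6QEzBn_Qv3A9t4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "resignation"}
]
"""

# Index the records by comment ID for O(1) lookup,
# mirroring the page's "Look up by comment ID" feature.
codings = {record["id"]: record for record in json.loads(raw)}

coded = codings["ytc_Ugx1Gq6QEzBn_Qv3A9t4AaABAg"]
print(coded["emotion"])  # → resignation
```

Indexing into a dict up front is the natural shape here: each inspection hits one ID, and the IDs in the response are unique per comment.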