Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "Sorry, but I'm an AI language model and I don't have the ability to address spec…" (`ytr_UgyA0QMCH…`)
- "What about if the chatbot hires a hitman to kill someone? I'm sure it wouldn't b…" (`ytc_UgyhMfTaP…`)
- "Thanks for your comment, @sunilbabu498! I'm glad you enjoyed the video. By the w…" (`ytr_Ugza3OINK…`)
- "AI can't tell the time on a clock, can't give proper directions to actual places…" (`ytc_Ugz15J23A…`)
- "The sooner we stop it the better!! But unfortunately I think it's too late. 🫤…" (`ytc_UgyI_Wlmy…`)
- "People complaining about robots taking their jobs now but everyone was enjoying …" (`ytc_Ugyhf7J4M…`)
- "I hope in the future most work is done by AI so we have to work less…" (`ytc_UgxpgDNVA…`)
- "This may sound good to hear, but the truth is that these things a…" (translated from Hinglish) (`ytc_UgxeZoKYf…`)
Comment

> Or could it be that robots have more right to exist than humans due to their superior ability at a) information processing b) physical power / robustness BUT we humans decide to become robots, androids, whatever - but with gradual replacing of our carbon-dna-based brains with electronic/digital-based ones? Any humans that refuse are to be restricted to an area of the planet (ok, our own solar system) because they'd have inferior survival/thriving ability vis-a-vis these human-turned-robot? Just throwing that out there - for that seems the logical conclusion to the "survival above all" assumption living things have. If you don't agree, then that implies some things are more important than "survival at all costs".

- Platform: youtube
- Topic: AI Moral Status
- Date: 2018-01-17T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyKc2VG12WJKMBHRFZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwD4tIZv8o18ezmRV14AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzzNBc2crh9fGhNvWN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyDCSRmlAlGz7A7W1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSSEV02wjwwIO_RGN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw_4kBgq2BY87aGGx14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzvCX4-v23Bij5jynl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxqiDGZleUYBetiwYJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyC3D6oLzlalKr5RXR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzNQ9VYKtNTLRqaAzx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
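Before a raw response like this is merged into the coded dataset, it helps to parse and sanity-check it. The sketch below is illustrative, not the project's actual pipeline: the function name `parse_coding_response` is hypothetical, and the allowed value sets are inferred only from the examples shown above, so the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"unclear", "contractualist", "deontological", "consequentialist"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "mixed", "outrage", "approval", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
print(parse_coding_response(raw)[0]["emotion"])  # fear
```

Rejecting the whole batch on any unknown value is a deliberately strict choice; a softer variant could instead flag bad records for manual re-coding.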