Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Yes, robots should have rights. No they should not have the same rights as humans. Even if they're self aware, their reward functions are still made by us, meaning we determine what their purpose in "life" is. We should not be worried about the ethics of robot rights, rather we should be worried about making AI safe so that it doesn't try to take over the world as soon as we turn it on

Platform: youtube · Video: AI Moral Status · Posted: 2020-05-13T13:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxFwg13HIwDYvN1xzB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzYCNhwxammyrS6RO14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwjBPwzT_Qw7n9UD2d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw7cMFZrGGurrR-LaB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzFihmfK6GnXiI18aR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwF7AYOXBXbRHE0Bx54AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxQB_SJFgAqADe4Bm54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwKaH_8G5iEjt8UWut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzY-XCMoxUorvFlSgR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzRnuXxDby7aOjA9Id4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
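The coding result shown above is recovered from the raw batch response by matching on the comment's ID. A minimal sketch of that lookup in Python, assuming the response is a JSON array with the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_codings` helper is hypothetical, not part of any pipeline shown here, and the sample below contains only two of the ten entries:

```python
import json

# A raw LLM batch response: a JSON array of per-comment codings,
# same shape as the full response shown above (two entries for brevity).
raw_response = """
[
  {"id": "ytc_UgwKaH_8G5iEjt8UWut4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzRnuXxDby7aOjA9Id4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

def index_codings(response_text: str) -> dict:
    """Parse a raw batch response and index each coding by its comment ID."""
    return {entry["id"]: entry for entry in json.loads(response_text)}

codings = index_codings(raw_response)

# Look up the coding for the inspected comment by its full ID.
coding = codings["ytc_UgwKaH_8G5iEjt8UWut4AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# → developer regulate fear
```

Indexing by `id` once, rather than scanning the array per lookup, is what makes an "inspect any coded comment" view cheap even for large batches.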