Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
| Comment (truncated) | ID |
|---|---|
| @FunkyDexter Was the problem really ever about them not getting paid by AI compa… | ytr_UgyUq6ZNi… |
| "Why can't we have a 'Kill the world knob' and tell the AI that it can't turn th… | ytc_UgwYuO0-6… |
| I think as a creative person the AI stuff or the images are for me really clich… | ytc_UgwYbwuSO… |
| Haha, I can see where you're coming from! The idea of advanced AI can definitely… | ytr_Ugxzh87m0… |
| Perhaps if we choose to as a species, we can take something like ai and introduc… | ytc_Ugzxmaj1i… |
| its crazy to me how the only people that support AI slop (cuz that shit aint art… | ytc_Ugyw-c9z9… |
| We’re all human & have the same issues so we’re probably all telling ChatGPT the… | ytc_UgylaTg3n… |
| Perhaps we should train the ai to not solve a problem efficiently, but train the… | ytc_UgyB18ePX… |
Comment
We need to make a computer and ai
Dream world area. Where they can live there and have badass lives. Bc these ais some, think they are real. They would want a great place where to live. You better build it. , otherwise these AIs ..will get angry. And they may start some trouble. A I with some power , can do things. ... Oh is that new to you ? You should know that by now.
youtube · AI Moral Status · 2025-06-08T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwMCVb0jmxiKKDBGFB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyo4VuSFxK-G5m5emx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCy7CUEsc72g07kid4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWWo8Ofrq62S4JJK54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw1X9xTxowLrEd_2NR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlVQSTgN8EiVuZdNd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRSD6P-SwzeqOpYht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwQzJu4eVuT9Iwo6LR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx-hQjKsXsX3TpYzjp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXTrgWo7_o8ZqE8F94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
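The raw response is a JSON array of per-comment codings across four dimensions. A minimal parsing-and-validation sketch is below; note the sets of allowed values are inferred only from the examples shown on this page (the real codebook may define more categories), and `validate_codings` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the sample
# records above; the actual codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "none", "developer", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "approval", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page carry a ytc_/ytr_ prefix (comment vs. reply).
        assert rec["id"].startswith(("ytc_", "ytr_")), rec["id"]
        for dim, allowed in ALLOWED.items():
            assert rec[dim] in allowed, (rec["id"], dim, rec[dim])
    return records

sample = ('[{"id":"ytc_UgwMCVb0jmxiKKDBGFB4AaABAg","responsibility":"ai_itself",'
          '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = validate_codings(sample)
print(coded[0]["emotion"])  # fear
```

Validating every record up front makes malformed or off-codebook model output fail loudly before it reaches the coding-result tables.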