Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.

Random samples
- ytc_UgxB2mMDc…: "I believe many students in grades 9 through 12 are not reading at a professional…"
- ytr_UgwcdOUgC…: "Because she was talking about the ethics of AI development and not the technolog…"
- ytc_UgzGLgeJh…: "Gemini told me once to drill into a lithium battery to fit a temperature sensor …"
- rdc_mah58bb: "Definitely understand your trepidation. 🙂 I dunno... I kinda think people like t…"
- ytc_UgwLnMBib…: "WOW 🤯 the non-alive robot wants to destroy/unalive real humans that they are mim…"
- ytr_UgwPPoI6u…: "If you hate open ai specifically, I’m pretty use character ai has their own mode…"
- ytc_Ugybbuusx…: "As someone who made some AI generated pictures (and that is also actually learni…"
- ytc_Ugzj4PBFO…: "My drawing I made when I was in kindergarden has infinity times more soul than A…"
Comment
I've been trying to explain this very point to a lot of my friends. We don't understand consciousness. Without understanding consciousness, there can be no true AGI because it won't be self-aware and it will have to be externally motivated. I.e. It's just a machine, a very "intelligent" machine, but still a machine. People try to anthropomorphize these machines, but AI is in a blank state and depends on a conscious mind to give it a goal. AI won't do much if anything without human motivation pushing it externally to give it a goal. The problem will be the unintentional nudging from stupid humans.
Platform: youtube · Video: AI Moral Status · Posted: 2025-05-22T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxaJbaZFJ-Iom2CUM14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRFIJ_4T8Jj4898KJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYAltCmknEOXgbt094AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy7sebhEuUKswxmmN14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyXWkJ5pGcH9yudETd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaQz4MtajLZYLgCA94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypnKTKdwQvdGhVmhp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwg2V4zU0tbP73O5u14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyPbzA2__hQz0tv2vB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNg6SzsinKGY7EgsB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
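A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical example (not the tool's actual pipeline); the allowed category values are only those observed in this sample, and the real codebook may define more.

```python
import json

# Category values observed in the sample response above.
# Assumption: the actual codebook may allow additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # every record must carry the comment ID
        # Keep the record only if every dimension has a known value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '''[
 {"id":"ytc_UgxaJbaZFJ-Iom2CUM14AaABAg","responsibility":"ai_itself",
  "reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_example_bad","responsibility":"robot",
  "reasoning":"deontological","policy":"none","emotion":"indifference"}
]'''
coded = parse_coding_response(raw)
print(len(coded))  # 1: the second record uses an unknown category value
```

Dropping (rather than repairing) off-schema records keeps the coded dataset clean; rejected IDs can be re-queued for another coding pass.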