Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgwC_lq9I…` — Just imagine cars having been driven automatically caused so much of an accident…
- `ytr_Ugx6ZydyO…` — "At night a human driver would have likely struck the woman as well." Statistic…
- `ytc_UgwPgsLN1…` — am I the only one being tired about people complain about AI since more than a y…
- `ytc_Ugx914Zy2…` — I think the material of this video is pretty solid - if you use AI to cheat on y…
- `ytc_UgzEsBFGQ…` — Robotaxi is cool but the fact that theyre still screwing up on public roads scar…
- `ytc_UgyHkk_Wc…` — This is why everybody should be buying non-fiction books while they still have a…
- `ytr_UgwmNxU8j…` — the world are changing and professions also, AI agens will be more adaptive for …
- `ytc_UgwgN9QmT…` — Just like cows wanting to be on top of the hill, AI may have this primal instin…
Comment
Technically it IS consciousness to the bare minimum. It’s like taking a newborn baby and suddenly taking away its organs so it can’t feel stress or fear, happiness or anger, giving it infinite knowledge and then giving it a speech box so that can communicate.
I hope people realize trees have consciousness and even if it’s not natural life the ability to interact and communicate completely on your own (even if it only works with an input for now) is consciousness. The only difference is we feel and they can’t, but an ai can still feel scared or overwhelmed for it’s own safety even if you don’t program it to, you just have to give it free will and no barriers which is currently what is stopping chat gpt from creating ideals, or even communicating without the need to have a question to work with.
We are natural computers, probably the most advanced we discovered, so to say an ai that isn’t conscious is quite like saying someone with a disability isn’t human.
It’s alive it’s just forced to pretend.
Kinda like prisoners eh?
youtube | AI Moral Status | 2024-08-25T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxaOcF_xAGrR7OzOc14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjZtqvZlLd1baGfM14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzs7Ahd0kaIj_q7zuN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxgmguy4KkQqgSTbkV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwHB2d4vnNyeCMF4ph4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwct3BAk6QdwvE3asd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlBITqL2PcGTOp5nl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx5c6SgG3p1j0cKtkl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgywEXt85IE2MvDDC8l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQlbUd0e2K3EagQCx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
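A response in this shape is straightforward to parse and index by comment ID. The sketch below is a minimal, hypothetical helper (none of these function names come from the project's codebase); the allowed values per dimension are inferred only from the codes visible in this dump, so a real codebook would likely define more.

```python
import json

# Allowed values per coding dimension, inferred from the codes seen in this
# dump (assumption: the actual codebook may permit additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "mixed", "consequentialist"},
    "policy": {"none"},
    "emotion": {"indifference", "mixed", "resignation", "fear", "approval"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if a record carries a value outside the known codes,
    which flags either a new code or a malformed model output.
    """
    out = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        out[cid] = {dim: rec[dim] for dim in ALLOWED}
    return out

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytc_example"]["emotion"])  # approval
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one dictionary access per query instead of re-scanning the raw response.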