Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "you should say it how it is .. they DESIGN it so it loops and constantly makes m…" (ytc_UgxR6r7vC…)
- "every single ai debate in comment sections somehow ends up with people mentionin…" (ytc_UgzmohQeU…)
- "Yeahhhhh as an ai user, I find that using others art and scraping it is absolute…" (ytc_Ugw85z3gl…)
- "this is so funny, AI doesn't need to destroy humanity, their doing a good job th…" (ytc_Ugxj_m3qK…)
- "What happens when a hacker desides to use someone else robot to go on a killing …" (ytc_Ugxk9CLAS…)
- "The real "Great Replacement Theory". It's no longer outrageous to think that one…" (ytc_UgxjX9Ykr…)
- "Such a ignorance to this assumption. No AI is taking over my job as a Audi Tech…" (ytc_Ugwj0e4tG…)
- "as a learning disability haver who loves and has a skill/talent for writing.....…" (ytc_UgwoUFSv0…)
Comment
@samsly8464 Oh yeah, if we're talking about how long before they will be good enough to have a base program and to evolve like a human would (or better), then we are pretty far off from that. I do think serious discussions now about the dangers of AI are still good to have so we are actively aware if the possibility is going to arise. It also just makes us passively aware of how much freedom we give these machines in case we do begin to experience concerning results.
I think we also need to start dealing with the moral dilemma of different forms of AI depending on how "human" they will be. I think enslaving them and holding strict control once they BECOME sentient and intelligent beings will result in the same historical actions humans themselves have taken... a revolution. These things MUST be talked about now so we arent starting the discussion when the problem happens.
Source: youtube · AI Moral Status · 2021-07-13T07:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgxcB38Wbt0sCOMZKtt4AaABAg.9QtNa-f-4vt9R3LjJQ5Sca","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxLXtUnhHnLSrPME4x4AaABAg.9QnZwnT8SIH9RQ8QEvEI3X","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxLXtUnhHnLSrPME4x4AaABAg.9QnZwnT8SIH9RXRNmAvi2h","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxLXtUnhHnLSrPME4x4AaABAg.9QnZwnT8SIH9RXwMN7bO7B","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxLXtUnhHnLSrPME4x4AaABAg.9QnZwnT8SIH9RY1XcFoCn9","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzBGHDG99A9Md0akEN4AaABAg.9QT0H6cUTJz9VRMGcYgcC8","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwDEba8Sy34ag4hWL94AaABAg.9QLdgoajRG49RPWfPC5XyS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugyh7I7rUugcruHC1C54AaABAg.9Q3Y9qlS5DW9QBgK1HvpTS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgySRQwOvsjNdCktr_54AaABAg.9PeFs_-6sNn9Pj5EWKDJuF","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytr_Ugya7PyGLb-f9o07qVp4AaABAg.9PQjxp-r9_49R_M6AsXuky","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
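A response like the one above can be parsed into per-comment codes with a few lines of Python. The sketch below is a minimal, hypothetical parser: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, but the vocabularies are only the values observed in this sample; the real codebook likely has more categories, so out-of-vocabulary values are flagged rather than rejected.

```python
import json

# Values observed in the sample response above; the full codebook
# presumably contains more categories (assumption).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"unclear", "regulate"},
    "emotion": {"indifference", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch-coding response into {comment_id: codes}."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row[dim] for dim in OBSERVED}
        # Flag values outside the observed vocabulary for manual review
        # instead of silently accepting or dropping them.
        unexpected = {d: v for d, v in codes.items() if v not in OBSERVED[d]}
        if unexpected:
            print(f"check {row['id']}: {unexpected}")
        coded[row["id"]] = codes
    return coded

# Hypothetical single-row example in the same shape as the response above.
raw = ('[{"id":"ytr_example","responsibility":"none","reasoning":"unclear",'
       '"policy":"regulate","emotion":"indifference"}]')
result = parse_coding_response(raw)
print(result["ytr_example"]["policy"])  # regulate
```

Keying the result by comment ID makes it easy to join the codes back to the comment records shown in the samples above.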