Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Companies who profit from AI are those who sell AI to other companies that "want…" (ytc_UgzDCqBfr…)
- "Giving the dam Woke ones that ruined all thing it pretty clear why WE ALL want t…" (ytc_UgxSuXeQn…)
- "yes, its a good idea on paper, however... as per usual the execution was complet…" (ytr_Ugy7nh2gI…)
- "SWE of 11+ years. Company hiring interns. Using multiple AI tools way over $100 …" (rdc_obviz83)
- "Americans, Europeans and chinese built their own companies. Africans are still w…" (ytr_Ugz1EElGi…)
- "How are folks gonna have money to pay for these AI robots if there no jobs?!…" (ytc_UgxVqHJop…)
- "We should allow the all knowing, all powerful entity to perfect every task to th…" (rdc_ju12vn1)
- "It can take your job, manipulate your point of view, and more importantly it's n…" (ytr_Ugy46y-nr…)
Comment

> The issue I see with AI is it has to learn. In order to learn, something has tell tell it when it is right or wrong. Someone has to tell it that all this data you are seeing about chemtrails is nonsense so don't include it when you process your inputs. As far as it writing code, code that works isn't the same as good code.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2025-07-24T13:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzH6TXipICLYs9pgFp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzyWf18CvHO95gfAoR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwX5YcFnlSjRPk1Vap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxCHXgRxtpPUg6cd994AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwMUMBop0KNA46o58N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugywzy_0LtEhRErC4Lt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwsUjs54L74Xgnvgxx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxwELpb3zk4KZ5kEjJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyIB2DOo1JeJsdFHKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzaBNGj1b-H78DO7AZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
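The raw response is a JSON array of per-comment codes, one object per comment, keyed by `id` with the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing such a response and looking up a code by comment ID — the two records here are copied from the response above, and indexing by `id` is an assumption about how a lookup tool might work, not the dashboard's actual implementation:

```python
import json

# Raw model output: a JSON array of per-comment codes (two records
# reproduced from the response above for illustration).
raw = '''[
  {"id": "ytc_UgzH6TXipICLYs9pgFp4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzyWf18CvHO95gfAoR4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

# Index the array by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the coded dimensions for one comment ID.
code = codes["ytc_UgzH6TXipICLYs9pgFp4AaABAg"]
print(code["responsibility"], code["emotion"])  # developer indifference
```

Each object carries the same four dimensions shown in the Coding Result table, so a lookup like this reproduces that table for any coded comment in the batch.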