Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The AI is not deciding that because of race, it is deciding that because of medical coverage. I'm not sure that is a problem as people that don't pay for something don't usually get it. If you are saying everyone should get free medical care, that is fine, but if you are saying that people with more medical insurance shouldn't receive more care, than what do they get for paying for a service? Correlation is not causation. The AI is not deciding that based on race, it is deciding it based on probability of medical insurance.
Platform: youtube
Topic: AI Bias
Posted: 2023-05-05T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxvBQe4ii_b1AsM11R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw90fuhZMcIlAn_-VV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxwrxOf6Xc-R6geYI94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwQlF0QE-Xvo3RqmAV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugyg7ODjg0hBxEyckul4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzu4Y0m2dUM02wDPVJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz5aO61dfKetLuEPrN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgycY_ey3fCwXvvBjSJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgymPSQmIfO8UNpBiNV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwqGQoyTOkZQQHxY4x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
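Looking up the coding result for a single comment from a batch response like the one above can be sketched as follows. This is a minimal example, not the tool's actual code: `index_by_id` is a hypothetical helper, the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response above, and it assumes the model output is a valid JSON array.

```python
import json

# Two records copied from the raw LLM response above; in practice this
# string would be the full model output.
raw_response = """
[
  {"id": "ytc_Ugyg7ODjg0hBxEyckul4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwqGQoyTOkZQQHxY4x4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the raw model output and key each coded record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
coded = codes["ytc_Ugyg7ODjg0hBxEyckul4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # ai_itself resignation
```

Keying the parsed array by `id` makes each lookup O(1), which matters when cross-referencing thousands of coded comments against their source records.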