Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- That's not A.I That's human pretending she's a robot Her tongue curled in speec… (ytc_UgyT2krfn…)
- Ai that is hyperintelligent enough to override a person's mind is the most dange… (ytc_UgxoHAUmS…)
- I will no deal with AI! I can barely deal with the Philippines. Whenever I start… (ytc_UgyYjGseL…)
- Coast to coast law enforcement is using predictive policing. They use it here in… (ytc_UgxmnQ0YA…)
- after watching this whole video, the way it was put as cyber punch heros/villian… (ytc_Ugzb2URYt…)
- Do we actually need autonomous cars, a few thoughts, how many drivers do Uber e… (ytc_Ugw5SUnDN…)
- AI is NOT conscious the only consciousness it has is what you give it Don't be m… (ytc_UgyVQhffL…)
- this reminds me of what happened up at Disney when they fired all of their hand … (ytc_UgwRONq_w…)
Comment
Yeah, there's really not a good argument there. If the guy was shot by police twice despite never being armed, that would be one thing. But if he is getting in fights with other people, than the AI found someone who is likely to get themselves into trouble. Which is exactly what it's supposed to do.
The discrepancy between levels of care when AI is involved in making those decisions is concerning, but there's actual data there to show that objectively sicker people are being denied care. You need to figure out if the AI is being inadvertently racist because of the data set, but there is something to at least deal with there. The guy getting shot feels like the software did its job. Fine people who are going to be involved in gun fights, and it found one.
Source: youtube · Video: AI Bias · 2023-01-07T17:1… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugwb8XldhEM05TUgOzp4AaABAg.9kHbrjdDS2d9k_MBVnIGK5","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwLsv1I35XtUlmMJSd4AaABAg.9kGDTpd3S9S9kH33mv4sFQ","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwLsv1I35XtUlmMJSd4AaABAg.9kGDTpd3S9S9kIHoRp8-ul","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwLsv1I35XtUlmMJSd4AaABAg.9kGDTpd3S9S9kINRG6Y0kl","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzgytquxwcoNLqA-8N4AaABAg.9kFxUgznOFZ9kGImTeCS2h","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwoH7y_8ZWXsL_9BAR4AaABAg.9kEnyKQxReC9kF6xZogN-I","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyCRVdYI4N4BKnA4r94AaABAg.9kCaBaQBBDg9kaJyzpYpXr","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytr_UgyCRVdYI4N4BKnA4r94AaABAg.9kCaBaQBBDg9keEznbvERw","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgytTeUutp8xDE05MJl4AaABAg.9kCZh9sKwaF9kH_qx4XPCH","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgyaEf1pQyyiFQhWAP54AaABAg.9kBSFJCLCGV9kC9_NQaqBo","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
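The inspection workflow above (parse a raw LLM response, then look up one coded comment by ID) can be sketched in Python. This is a minimal sketch, not the tool's actual implementation: the `SCHEMA` value sets are inferred only from the coded samples shown on this page, and the function name and example ID are hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the samples on this
# page (assumption -- the real codebook may define additional values).
SCHEMA = {
    "responsibility": {"ai_itself", "distributed", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability", "ban", "unclear"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into an
    id -> record index, validating each dimension against SCHEMA."""
    index = {}
    for record in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{record['id']}: bad {dim}={record.get(dim)!r}")
        index[record["id"]] = record
    return index

# Hypothetical single-record response, then a lookup by comment ID.
raw = ('[{"id":"ytr_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"mixed"}]')
codes = parse_response(raw)
print(codes["ytr_example"]["policy"])  # -> liability
```

Validating against the schema at parse time catches the common failure mode of LLM coders inventing off-codebook labels, rather than letting them flow silently into the results table.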