Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
We appreciate your feedback. On AITube, we focus on showcasing the capabilities …
ytr_UgywO2mdR…
There's no such thing as an "Ai artist"!
That's like someone talking other peopl…
ytc_UgzT4XXuA…
We know that there is a massive coverup in AI companies concerning the highly co…
ytc_UgwHvp9-c…
AI doesn‘t think by itself. It was PROGRAMED. AI is still programmed by a perso…
ytc_UgxS9JM-k…
These LLMs and their implementation is the greatest worst invention in human his…
ytc_UgwKgVTWT…
Being a current victim of Ai nonconsensual experimentation, I can fully say... w…
ytc_UgxTR1eTz…
Very interesting report. I've also seen an interesting video about crashes and h…
ytc_Ugy2OHKXt…
So just like everything else humans make worse in the end by trting to make it b…
ytc_UgwR37hea…
Comment
The second one cannot be said to be racist. As you've given almost no context to it
What happened was really simple, the prediction from the AI was that because he was a friend of a guy that died during a shooting, he was much more likely to be involved in a shooting, be it as the shooter or the one being shot. The police went to his house to tell him that so they could prevent it, either by stopping him from shooting, or protecting him from getting shot. The problema was that because of the predictive service, he was seen as a snitch and thus was shot. So it isn't clear of he was shot as was predicted, or the prediction made this happen.
youtube
AI Bias
2022-12-19T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgysLwPQSpFqZKTm77h4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx_kXmkaehDgOrBa5J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzhEwkrkJ16JioHJyZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgyoEbzYLggc9yMcvcx4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwYzeeXocSySZGQ96x4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugwk0FfPEoolcWET7EV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzUeSUL2K-080qwDx14AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzLrWtzcu_-FLUwsG94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwxOlOZTewYO7MJKD54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwd6GJ1rYTRSBMIiI54AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
```
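A raw response like the one above can be turned into a per-comment lookup table with a few lines of Python. This is a minimal sketch, not part of the tool itself: the `ALLOWED` sets below are inferred only from the codes visible in this response, and the full codebook may permit additional values.

```python
import json

# Allowed codes per dimension, inferred from the values observed in the
# raw response above; the full codebook may include more (assumption).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into a dict keyed by comment ID,
    rejecting entries whose codes fall outside the expected sets."""
    table = {}
    for entry in json.loads(raw):
        codes = {dim: entry[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{entry['id']}: unexpected {dim} code {value!r}")
        table[entry["id"]] = codes
    return table

# One entry from the raw response above, used as sample input.
raw = ('[{"id":"ytc_Ugx_kXmkaehDgOrBa5J4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugx_kXmkaehDgOrBa5J4AaABAg"]["reasoning"])  # consequentialist
```

Keying the table by comment ID matches the dashboard's "look up by comment ID" workflow: each coded comment can be retrieved directly from the parsed response.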