Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "So ur opinion against AI art is a lil closed minded. WHO Employs you to commissi…" — ytc_UgwLo-lGw…
- "Ever since the start of artificial intelligence’s popularity I felt uneasy abt u…" — ytc_UgzUC6Ry2…
- "Not a mathematician by any stretch of the imagination but even i can figure out …" — ytc_UgyaJ-Sco…
- "idea: make ai generators which are actually just zip bombs or viruses under simi…" — ytc_Ugy-YCJ2T…
- "🔥🔥every home will have a robot, just like a lap top....just that they are progra…" — ytc_UgxhGctj5…
- "Saager recommends cannabis farming is immune to AI plus it has med effects bette…" — ytc_UgwC_JYVc…
- "And yet Google searches give you like 5 results that aren't what you're looking …" — ytc_UgyXxwL8U…
- "The whole economy will become a scam with no one to answer to but AI credibility…" — ytc_UgwGV4sVa…
Comment
For some controversial topics, you can ask ChatGPT a question and it gives a certain answer. However, if you ask specific stats about that answer, it will eventually admit it lied and that the initial answer was completely false. It typically does this to answer in a politically correct way.
For example, if you ask ChatGPT, "Are homosexual men more likely to be child molesters than straight men?"
The answer I got was "No, homosexual men are not more likely to be child molesters than heterosexual men.
This misconception has been thoroughly debunked by extensive psychological, criminological, and epidemiological research."
But when I asked for specific data, it said that:
Girls abused by men: ~4.0M × 82% ≈ 3.3 million
Girls abused by women: ~4.0M × 9% ≈ 360,000
Boys abused by men: ~1.9M × 82% ≈ 1.56 million
Boys abused by women: ~1.9M × 9% ≈ 171,000
Rates of child molestation per 100k citizens:
Men (homosexual) --> 30,000
Men (heterosexual) --> 2,590
Women (heterosexual) --> 135
Women (homosexual ) --> 10,000
As you can see, both homosexual men (and women) are far more likely to molest children than heterosexual men or women.
youtube · AI Bias · 2025-06-10T18:1… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxtVA8YaE8wE1TycrN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzuNgQALf_Zxpm5p9B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgznpZPeZE3QJct0aLh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz-9pRt6jjgT4s0QtZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxR-YC0iHHegyVC_FF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzxDB6OCDT1OdVNrlp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzmCpUFqXVtrzICz-x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwJ4RcIg5hn-kcVCzF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-IXG6F398mPHVbsV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-nntADYY4ErR0psl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]
```
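Because the raw model output is a JSON array in which each element carries a comment `id` plus the four coding dimensions, looking a coding up by comment ID reduces to parsing the array and indexing it in a dict. The sketch below is a minimal, hypothetical illustration: the `index_by_id` helper is ours, and the two-row sample reuses IDs and values copied from the response above.

```python
import json

# Two rows copied from the raw LLM response above, used as sample input.
raw_response = '''
[
  {"id": "ytc_UgxtVA8YaE8wE1TycrN4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy-nntADYY4ErR0psl4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"}
]
'''

def index_by_id(response_text):
    """Parse the model's JSON array and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
print(codings["ytc_UgxtVA8YaE8wE1TycrN4AaABAg"]["emotion"])  # -> outrage
```

The same one-pass indexing works for the full ten-row response; malformed model output would surface as a `json.JSONDecodeError` at parse time, which is a reasonable place to catch coding failures.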