Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- rdc_l69etzu — "Yawn, who cares? Tired of seeing all this OpenAI drama and don't give a shit ab…"
- ytc_UgwKy-2hD… — "On the philosophical point whether AI could develop to a state where we'd need t…"
- ytc_Ugx2ZwtlJ… — "Listen, I get that some people might feel violated and I 100% don't support the …"
- ytc_Ugx1_jQ9g… — "Isn't ChatGPT abiding by the subjective vow to 'never cause harm' ethic of the H…"
- ytc_UgyBm5qCK… — "The best part about Shad claiming to be an artist using AI art, is his brother i…"
- ytc_UgzORmldr… — "He definitely knows its driverless car and giving instructions to its system to …"
- ytc_UgwhQZMKe… — "So now you guys are advocating for ai slop? It copies other artists and designer…"
- ytr_UgyIBRKQl… — "thing is people like amouranth as you gave as an example, have allowed themselve…"
Comment
“Suchir Balaji, a 26-year-old Indian-American artificial intelligence researcher and former OpenAI employee, died on November 26, 2024. His death was officially ruled a suicide by the San Francisco Chief Medical Examiner's Office.
However, Balaji's family has disputed this finding and publicly raised questions about the circumstances of his death, alleging foul play.
Balaji had recently become a whistleblower against OpenAI, accusing the company of violating copyright laws by using internet-sourced data to train its AI models. He had expressed growing concerns about the company's direction and its potential risks to humanity.”
Source: youtube · 2025-09-09T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzpihwrxvAu-VjIxlx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzhO2UzBHJmAVhfApB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz1GOeLPe5prnQXxhJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy6Y9sXH2zNmzrmWnJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzboKEKcZ9JF3cKjZ14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwViEaNnMrrf7OuBkF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwM3B14AuUcXSGE38V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxDotbeazAN2YuhNpp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx9uKoo2k8yed-LEhF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzg6lzJ3dWMVc3xFg94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
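The lookup-by-ID step described above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: `lookup_coding` is a hypothetical helper, and the two sample records are copied from the raw response shown above.

```python
import json

# Raw model output: a JSON array of per-comment codings, in the shape shown above.
# (Two records reproduced from the response for illustration.)
raw_response = """[
  {"id": "ytc_UgzpihwrxvAu-VjIxlx4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz1GOeLPe5prnQXxhJ4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw LLM output and return the coding dict for one comment ID."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return None  # model emitted malformed JSON; treat as uncodable
    # Linear scan is fine for batch sizes of ~10 comments per response.
    return next((r for r in records if r.get("id") == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugz1GOeLPe5prnQXxhJ4AaABAg")
print(coding["responsibility"])  # → unclear
```

A record whose every dimension is `"unclear"` is what surfaces in the Coding Result table above when the model could not classify the comment.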