Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Legally, the creator owns his art. Can yt change the original without marking i …
ytc_Ugy3OEKxL…
I ran the same questions through ChatGPT under the same rules and got a slightly…
ytc_Ugzbv-nYF…
AI is not going to take any bodies job unless it is allowed to progress in an UN…
ytc_Ugz2fj2Mx…
This was meh fear mongering garbage. You didnt even explain how people have jai…
ytc_UgxmuCRMz…
6:53 Do ChatGPT actually say things like "When i- when i was..." like human do??…
ytc_UgzKRNmYx…
ai "artists" are just the worst, I saw one the other day who posted a snippet of…
ytc_UgzpcmGm7…
So, they have a poorly made, low-quality car sold only on claims of autonomous d…
ytc_Ugw9G2kDw…
Wait until their AI buddy starts subtly influencing their political opinions, sp…
rdc_mlgsb2e
Comment
former OpenAI researcher, Suchir Balaji, publicly raised concerns about the company's practices, particularly regarding potential copyright violations, before his death in November 2024. His death was ruled a suicide by the Chief Medical Examiner, but his family has raised questions and concerns, including a request for an FBI investigation. Balaji's case has brought attention to the need for stronger whistleblower protections for individuals working in the rapidly evolving field of artificial intelligence.
youtube
2025-06-21T22:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzfUyDY2xfgxoYw_NF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxi4SVGbD_WEbDq4GF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz8D8fI0w7nKGcDYQB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzKWYq4Flt8aRVZCvF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOdgue7bKsBo_TfB54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxvMWYWdbJD98fubiZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugzwpghpkqh5dsGycVx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyNVXW6WuSKwpCuZ9l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy2P1uHX1pd8ZgmEP54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzVeOrxlNKrsqqa5Ud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
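The raw LLM response above is a JSON array with one object per coded comment, carrying the four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into a lookup table keyed by comment ID; `index_codes` and `raw_response` are hypothetical names for illustration, not the tool's actual code, and the two sample entries are copied from the response above:

```python
import json

# Hypothetical excerpt of a raw LLM response (two entries copied from above).
raw_response = '''[
  {"id": "ytc_UgzfUyDY2xfgxoYw_NF4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugxi4SVGbD_WEbDq4GF4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Map comment ID -> coded dimensions, skipping malformed entries."""
    codes = {}
    for entry in json.loads(raw):
        if not all(dim in entry for dim in DIMENSIONS):
            continue  # drop entries missing a required dimension
        codes[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return codes

codes = index_codes(raw_response)
print(codes["ytc_Ugxi4SVGbD_WEbDq4GF4AaABAg"]["emotion"])  # outrage
```

This mirrors the "Look up by comment ID" view: each full comment ID resolves to its coded values, and entries the model returned incompletely are simply skipped rather than shown with partial codes.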