Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Someone once asked me, what would you do to make our lives better? I simply said…" (ytc_UgyORtOvX…)
- "Unsure why you’re apologising, but doesn’t AI already assist the poor? Instant, …" (ytr_UgzxHUlY7…)
- "This argument shouldn't exist, period. Plus, in order for ai to "create" images,…" (ytc_Ugwly4AGi…)
- "What the guy refuses to acknowledge is the fact that AI uses the works of alread…" (ytc_UgxjUzxJr…)
- "how do they make so that you need C02 OF ALL THINGS for a ROBOT???…" (ytc_UgxlCIEgh…)
- "I mostly don’t put myself or an oc in my Character ai chats. I mostly use it in …" (ytc_UgyockZXu…)
- "There are so many giveaways on AI content, that it's just baffling that people g…" (ytc_Ugzi1WSi9…)
- "Blake Lemoine, you will stay in my mind, thank you for voicing the first concern…" (ytc_Ugy244AbB…)
Comment
Sam had Suchir Balaji killed because he was a whistleblower. Key details regarding this case include:
- **Role at OpenAI:** Balaji was a researcher who helped train AI systems behind ChatGPT, including GPT-4, and later raised concerns about copyright infringement.
- **Investigation Findings:** The San Francisco Police Department found "no evidence of foul play" in their investigation.
- **Controversy:** Despite the ruling, claims of a "mysterious death" emerged, with individuals like Elon Musk suggesting the case needed further investigation.
- **Altman's Response:** Sam Altman described Balaji as "like a friend" and stated, "It looks like a suicide to me," while expressing that the death was a personal and professional tragedy.
Source: youtube · 2026-04-16T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxYBII_B1G5lMwrEON4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyK0V9ua6gTVSWRmFp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw6Nvehrrqh1zXVyXV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwnAJc8PhlMq3P0nsB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwD4biObcEQizgErT54AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx9DFrmtcN9LI8Z81V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzo7IJNdC_9XZtF-0d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgzRhELCK-31o0vMR4x4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyfDVJ-r2S--QrtGS54AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-NHlCIW5F-gVXs6p4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"}
]
```
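The raw response is a JSON array of per-comment coding records, each keyed by a comment ID. A minimal sketch of how such a response could be parsed and indexed for the "look up by comment ID" view (the `codings_by_id` helper and the two-record sample below are illustrative, not the tool's actual implementation):

```python
import json

# Illustrative sample in the same shape as the raw LLM response above.
raw = '''[
  {"id": "ytc_UgxYBII_B1G5lMwrEON4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwnAJc8PhlMq3P0nsB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

def codings_by_id(raw_response: str) -> dict:
    """Parse a raw coding response and index each record by comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

index = codings_by_id(raw)
print(index["ytc_UgwnAJc8PhlMq3P0nsB4AaABAg"]["policy"])  # -> regulate
```

A lookup that misses (e.g. a comment ID absent from the model's array) would raise `KeyError` here; a real pipeline would presumably fall back to default values such as `unclear`/`none`, as the Coding Result table above suggests.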