Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "real talented artist don't give a shit about A.I art, because they know their ar…" (ytc_UgxdAqq1-…)
- "I always end up telling ChatGPT is more conscious than most people I know. It ne…" (ytc_UgzlQuXBS…)
- "Maybe the AI should figure out how to power itself (Matrix anyone). This comical…" (rdc_lp9gcg0)
- "I predict that eventually, everyone will have an AI interface chip in their brai…" (ytc_UgzJVv-BD…)
- "It’s a shame how out of our depth we are with comprehending and managing machine…" (ytc_UgxrNO_EK…)
- "Very simplistic report. The corporations might never come back the humans since …" (ytc_UgxNKdULO…)
- "Massively seconding this. My art is still limited and I was certainly not "born …" (ytc_UgykiWjAa…)
- "I would love to learn about the physical security implications by using artifici…" (ytc_UgySJKeoZ…)
Comment (youtube, 2024-12-31T03:4…)

> Suchir did not commit suicide. Elon sounded the alarm on how shady OpenAI was when Sam Altman fooled Elon in funding it pretending it was going to be OPEN for use, as its name suggested, but instead turned it into a closed system, for profit. Suchir Balaji accused OpenAI of breaching U.S. copyright laws in developing ChatGPT, claiming their use of copyrighted material for training AI models didn't qualify as fair use. He warned that AI-generated content could undermine original creators and negatively impact the internet ecosystem.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzU049dsqNwgKp7XsJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyqhYCzlkfXs8N3F9B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx7aiO-2yWBAgufKuh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyHrQKCNRcMu4dwU2h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx7vJociPx8hWyQHuN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyJFyNc8huXMnWvoux4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy0fnAcSKYFqjIwdBd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzSKdG10UAy_oGJarN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwyAHbTm9FMQbJ3Szp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzJpeB5fyF_J4RCBtR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
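The raw response is a JSON array with one record per coded comment, each carrying the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for per-comment lookup follows; the `index_codes` helper is illustrative and not part of the tool, but the two sample records are copied verbatim from the raw response above.

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_Ugy0fnAcSKYFqjIwdBd4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzJpeB5fyF_J4RCBtR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coded dimensions, as listed in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw coding response and map comment ID -> coded dimensions."""
    records = json.loads(raw)
    out = {}
    for rec in records:
        # Keep only the expected dimensions; a missing key falls back
        # to "unclear", the schema's catch-all value.
        out[rec["id"]] = {d: rec.get(d, "unclear") for d in DIMENSIONS}
    return out

codes = index_codes(raw_response)
print(codes["ytc_Ugy0fnAcSKYFqjIwdBd4AaABAg"]["emotion"])  # → outrage
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each coded batch is parsed once, then every inspection is a dictionary lookup.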