Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "At first, there will be 2 classes of people: The owners of AI and the rest of h…" (ytc_UgxgRHOSL…)
- "So much racism and so much hate among humans. How are we supposed to expect AI t…" (ytc_UgyXofBHI…)
- "Hi Roderick, you got the right answer. Kudos. The contest is over and winners ha…" (ytr_Ugzot1eVH…)
- "The people who train these models fill them with so much of their own ridiculous…" (ytc_Ugz_5pX6U…)
- "What would happen when that AI can create and run code for itself? When it can i…" (ytc_UgyrT7mN5…)
- "Personally i disagree on the "AI is stealing, AI is ruining…" as an artist. Why?…" (ytr_UgzWAniC8…)
- "I need to also rant about the fact that already in news media, people are saying…" (ytc_UgxPC5HLx…)
- "That's an intriguing thought! The way AI algorithms evolve can indeed be influen…" (ytr_Ugzp7RT4V…)
Comment
So i'm all for having regulations with AI. But as far as this exact example, It should be possible for ChatGPT to have a documentation process that will generate the links/sources it's pulled the information from. Like citations. In My opinion, as far as this exact case goes. The New York times is pissy because OpenAI made it to where people don't have to sleuth through it's websites meaningless BS to find what their looking for or wanting information about. It's multimillion dollar business sue one another to the detriment of regular people.
Source: youtube · Topic: AI Responsibility · Posted: 2026-04-11T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxR6pgJiXK68YCg5G54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzSLqlT0uFkccswhZB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugwnr-US85qnT5RRmzV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw1g_kQ7yfuanUJjdJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwkKAwwSUCYQHach3R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzKX0pdAKQSbszDkzt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgylHoLsXZKdsV-KfGd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxrxsN8OyoTHprHA-B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugzah7ADs2OjAvDH95R4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx8I5FZY_SvdxmQlL94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
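The "look up by comment ID" step above can be sketched in a few lines: parse the raw response as JSON and index the per-comment codings by their `id` field. This is a minimal sketch, not the tool's actual implementation; the field names (`id`, `responsibility`, `policy`, etc.) are taken from the raw response shown above, and the two entries here are a truncated sample of that array.

```python
import json

# Raw model output: a JSON array of per-comment codings
# (two entries from the full response above, for brevity).
raw_response = """
[
  {"id": "ytc_UgxR6pgJiXK68YCg5G54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxrxsN8OyoTHprHA-B4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a specific comment.
coding = codings["ytc_UgxrxsN8OyoTHprHA-B4AaABAg"]
print(coding["responsibility"], coding["policy"])  # distributed regulate
```

In practice the model sometimes wraps the array in prose or a code fence, so a real loader would strip everything outside the outermost `[` … `]` before calling `json.loads`.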