Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "The "for free" part seems tricky to me... I have no idea how can X, Meta, Google…" (`ytr_UgwR9A7_T…`)
- "What exactly does it mean for auto pilot (a driving assist) to be better than a …" (`ytc_UgxkGOHBL…`)
- "chatgpt has a lot of bias and still can be tricked by framing questions differen…" (`ytr_Ugw8Vu02K…`)
- "OpenAI lied about Vatican blessing Epstein. So maybe you should notice the Chris…" (`ytc_UgzRkBYCr…`)
- "To be fair, try talking to other humans about big philosophical ontological topi…" (`rdc_mljm3zq`)
- "@ann_undefined Yes, as a Senior using AI and working with AI/ML products I can a…" (`ytr_UgxH9db-1…`)
- "I don't think we need AI designed specifically to simulate a romantic partner to…" (`ytc_Ugy2OUPbd…`)
- "When you look at the details of how it happened, a human driver would've killed …" (`ytr_UgzOWpM4k…`)
Comment
Altman is clearly winging it here. He has no idea what is about to go down in terms of AI, and he knows it. "I used to be really excited about UBI. Now I'm kind of excited about it." What? WTF? These tools are fundamentally changing the economy, society at large, and they are shrugging their shoulders and saying they really don't know what's going to happen, but hey let's give it a go anyway? omg!
youtube
AI Moral Status
2025-07-27T01:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwlJl6aVjNUjf0t8PR4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgybQCURysjhysbXUxZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"oppose","emotion":"outrage"},
  {"id":"ytc_Ugz8WwyyiSbLqTlyJAZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw-uOgfzC-SRcqAENt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxya7Tqne7ZgKqdu7Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzH5ap22U7H0mspN_R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw44oYL2vOSdgl7jSt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzZt2GCHWCodz5vYYt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxd_QiNV73OtoZ468p4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyH9QHcDdU6gZB__UV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
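The raw response is a JSON array with one record per comment, keyed by comment ID, and the per-comment "Coding Result" view is a lookup into that array. A minimal sketch of that lookup in Python (the field names are taken from the response shown above; `raw_response` here is truncated to two of the ten records, and `index_codings` is a hypothetical helper, not part of any shown codebase):

```python
import json

# Two records copied from the raw LLM response above, for illustration.
raw_response = """
[
  {"id": "ytc_UgwlJl6aVjNUjf0t8PR4AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw44oYL2vOSdgl7jSt4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
"""

def index_codings(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
print(codings["ytc_Ugw44oYL2vOSdgl7jSt4AaABAg"]["emotion"])  # → outrage
```

Keying by `id` makes both access patterns on this page cheap: a direct lookup by comment ID, and sampling from the dictionary's values.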