Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "When it comes to AI, there's no better example than Ultron. A whole 10-15 minute…" (ytc_Ugx9tRauz…)
- "Why do you guys want that SO badly? Where's this INCREDIBLE impatience coming fr…" (rdc_m94hz1s)
- "The real issue is companies are putting the wrong talent in front of these tools…" (ytc_Ugwsp_i36…)
- "I kind of love whoever your AI girlfriend is. She's so funny <3 <3 <3…" (ytc_UgxWPVTva…)
- "Rules and regulations always seem to apply to some, while others revel in their …" (ytc_UgycWGogP…)
- "In most cases, replacing humans with AI can cause problems, but I see one case w…" (ytc_UgxrxA_4H…)
- "this is like that one time danny gonzalez tried making an "ai" app and the ai wa…" (ytc_Ugzq06j55…)
- "So, wait a second... According to the article, the flaw is that it was using su…" (rdc_e7ja25w)
Comment

> I will not trust a self-driving car until it is capable of driving through Bombay/Mumbay during rush hour without hitting anything or anybody.

Source: youtube
Posted: 2026-03-26T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwIjm_CwFoMPelMnZ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyIFt0xmgYbco21INd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz6tg8MN4e0VKN5UtF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwDc4Us9SoorjyisYt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyyOObhF9ie8JCHgld4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
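Since the model returns one JSON array per batch, a per-comment coding result is recovered by matching on the `id` field. A minimal sketch, assuming only the field names visible in the response above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `lookup` helper and the truncated sample data are illustrative, not part of the tool:

```python
import json

# Two rows excerpted from a raw batch response like the one shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgwIjm_CwFoMPelMnZ54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyIFt0xmgYbco21INd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def lookup(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for row in json.loads(raw):
        if row["id"] == comment_id:
            # Drop the ID so only the dimension/value pairs remain.
            return {k: v for k, v in row.items() if k != "id"}
    return None

print(lookup(RAW_RESPONSE, "ytc_UgwIjm_CwFoMPelMnZ54AaABAg"))
```

Keeping the lookup keyed on the model-echoed `id` is what lets a batch response be joined back to individual comments, as in the "Coding Result" panel above.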