Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.
- "As the 1% person who does not use AI to finish my work i have seen many of my cl…" (ytc_UgyVB2jUH…)
- "This is a turtle he's a robot that is made to talk to people he is pronounced to…" (ytc_UgwpHU_j7…)
- "That fucker in the end should explain to us whether he programmed a sense of hum…" (ytc_Uggg6DLAl…)
- "If you want to give you AI agent, an identity, get it a universal profile on luk…" (ytc_Ugx4UvG9e…)
- "Six years of university and not a single job. Exactly where I thought I'd be... …" (ytc_Ugxl7a2x0…)
- "I hope every streamer gets deepfaked. You're a public figure deal with it you cr…" (ytc_Ugz1A1y0E…)
- "@demodiums7216 Then let's design it better. AI isn't the problem. It hasn't even…" (ytr_UgzZXvGo_…)
- "When this problem with industrial automation arose, philosopher-economists inven…" (ytc_UgyFw7UgE…)
Comment
Nobody paused and nobody is _ever_ going to pause. If one person pauses, another gets to use that opportunity to get ahead. This is game theory 101. It doesn't matter if that seems illogical it is literally never going to change. This is a pandora's box situation and treating it any other way is actively harmful, reallocating resources toward pursuing an impossible outcome is tantamount to throwing them in the garbage. We should allocate far more resources towards advancing AI safety in parallel, not spend time and effort trying to get developers to stop developing.
youtube · AI Responsibility · 2025-05-21T17:1… · ♥ 923
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
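The four dimensions in this table can be checked against a small schema. A minimal validation sketch in Python, assuming only the value sets observed in the raw responses on this page (the full codebook may define additional values):

```python
# Allowed values per coding dimension, as observed in this page's
# raw LLM responses. This is an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "company", "user", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation", "mixed", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result above: every dimension coded "unclear" is valid.
record = {"responsibility": "unclear", "reasoning": "unclear",
          "policy": "unclear", "emotion": "unclear"}
print(validate(record))  # []
```

A record with an unknown value (say, a misspelled label from the model) would surface as a non-empty list, which makes malformed codings easy to flag before analysis.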
Raw LLM Response
```json
[{"id":"ytc_Ugze51w3Ob9E4jZ76514AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz3bv5ABhU4oZwVkzh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx6ly5qkRd63SuRPdJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzlGAr1woayHFmckBJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugzqgxx_GOyIrIhQv_h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKckZe8u1grR1nO1l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy3ivKUgyUtbDqQlWh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyRUGKC9HtfhWQ421t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxxFhxvn75o0NBT56B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyuV1EknIGxbZqBp_V4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
```
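Looking up a single comment's codes from a batch like the one above amounts to parsing the JSON array and indexing it by `id`. A minimal Python sketch, using a hypothetical two-record batch in the same shape (the shortened ids are invented for illustration):

```python
import json

# Hypothetical two-record batch shaped like the raw response above;
# the ids "ytc_AAA" and "ytc_BBB" are placeholders, not real comment ids.
raw = ('[{"id":"ytc_AAA","responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"approval"},'
       '{"id":"ytc_BBB","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"}]')

# Index the batch by comment id so any comment's codes can be
# retrieved directly, as in the "look up by comment ID" flow.
coded = {item["id"]: item for item in json.loads(raw)}
print(coded["ytc_BBB"]["policy"])  # regulate
```

Since model output is not guaranteed to be well-formed JSON, a real pipeline would wrap `json.loads` in error handling and quarantine batches that fail to parse rather than assume the shape holds.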