Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below to inspect.
- "Hmm, I have nothing against generative AI. But it's kind of strange that there a…" (ytc_UgyN3emEt…)
- "I read a lot of the comments down here, I was very glad to see that AI isn't bei…" (ytc_UgzC3IgTJ…)
- "Still waiting on my trip to mars for 100k......and the hyper loop i was looking …" (ytc_UgwkMzppZ…)
- "I think you’re underestimating it, or you’ve only barely used AI. I get your poi…" (ytc_Ugxu-Gr_7…)
- "Three billion artworks from hard working artist / ended on…. / The survivors / call…" (ytc_UgwgqqPvA…)
- "But yet he trusts ai to drive a car... / I mean seriously he would have to trust i…" (ytc_UgzIo3K0D…)
- "this is the latest answear from AI on why did AI tell him to use Bromide''''' / w…" (ytc_UgwLizaqE…)
- "Charlie, thank you so so much for supporting this fight for those of us who have…" (ytc_UgzzCyFoG…)
Comment
> Can you imagine trying to get inner city kids to behave in a booth?
> I taught at public school and university.
> One thing I understand about academia: they have zero foresight and their plans are idealistic. They say it's based on research and consensus. They never taught my classes.
> I loved my students, even when they tried (and failed) to test me.
> Where's that love going to come from with AI?
> They might try it in public schools, but what will happen when the students think it's funny to destroy the equipment?
> Who will supervise that?
> I know! How about iRobot equiped with tazers?😂😂😂
Source: youtube · Posted: 2026-03-28T10:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgwumkTSKlkUo73VhM14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxY5k1mwxH9v6S3UjF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw5b0zWUk_mj1TtmIJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0lhZMMHbZFiA9oZZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxWvVai9W2ZDiBWdUF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugx4wCelDKLz6WtgJ-54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxSjqvWkswhAJATkDp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxMqDPqtoNsYy3-wmV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwrbdf-RoqBznkOkTt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyEgHua6HsmOkwMgK54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"})
```
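Note that the response above is almost valid JSON: the model closed the array with `)` instead of `]`. A minimal parsing sketch in Python is below. The label vocabularies are inferred from the responses visible on this page (the real schema may include more labels), and the function name `parse_coding_response` is illustrative, not part of the tool:

```python
import json

# Dimension vocabularies inferred from the responses shown above, plus the
# "unclear" fallback seen in the Coding Result table. This is an assumption:
# the actual prompt's schema may allow additional labels.
ALLOWED = {
    "responsibility": {"none", "company", "government", "developer", "unclear"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "industry_self", "liability", "regulate", "unclear"},
    "emotion": {"approval", "outrage", "resignation", "indifference", "mixed", "unclear"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw batch-coding response into {comment_id: codes}.

    Tolerates a model that closes the JSON array with ')' instead of ']',
    as in the raw response shown above.
    """
    text = raw.strip()
    if text.startswith("[") and text.endswith(")"):
        text = text[:-1] + "]"  # repair the malformed array close
    records = json.loads(text)
    coded = {}
    for rec in records:
        cid = rec.pop("id")
        for dim, value in rec.items():
            # Map any out-of-vocabulary label to "unclear" rather than failing.
            if dim in ALLOWED and value not in ALLOWED[dim]:
                rec[dim] = "unclear"
        coded[cid] = rec
    return coded
```

Falling back to "unclear" for out-of-vocabulary labels is one plausible explanation for the all-"unclear" Coding Result above; a stricter pipeline might instead reject the batch and re-prompt.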