Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugwtc7ICq…`: Around 10 years ago, I translated a lecture of a Swiss-German professor who was …
- `ytr_UgywiWUN2…`: @carultch Your argument is misplaced about fair use and the nature of AI work. F…
- `ytc_Ugz01eLzy…`: So one day I'll be teaching my robot AI plumbing apprentices how to plumb a buil…
- `ytc_UgwJPVvQc…`: I like to think im decent at art now but like I tell everyone art is actually 1 …
- `ytc_UgzfUfOmh…`: Joke's on you, we watch learning algorithms try to figure out how to play video …
- `ytc_UgzQYV5r6…`: So, is AI going to pick up my garbage twice a week and fix my damn garage door? …
- `ytc_UgyktTjzT…`: MACHINE LEARNING IS WHERE THE BEHAVIOUR OF THE NEURONS ARE COPIED BY GIVING THE …
- `ytc_UgyhmK1B2…`: I like how all AI people talk how this, how that, but no one talks how all the A…
Comment

> These people are such jerks talking about the rules we have and the ethical practices of the companies building Ai in OUR DEMOCRACY. Well what do they have to say about AI being built In OTHER countries? “Oh well we have no control there … we can’t speak to that”. F’ing idiots. This entire thing is going to implode because for AI to do more damage then good it need to serve all mankind, together, no divided. It’s the division of man and country that will result in our own demise at the hand of AI. It’s so obvious.

youtube · AI Governance · 2026-04-07T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyITnnQi63Jc1r4u914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxFtoJszr3v3g44IpR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwuKy6QufX8TCo7_Ml4AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxfeTaqrBLqGNaIWnV4AaABAg","responsibility":"moderator","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwZPomTG_RHLiPkXlx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx8SETfOMr-czgQlId4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzijvHPfgviqRnbT_F4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwmV79b2N6I3xGx0np4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwi_oUaa1CyKC2SdIV4AaABAg","responsibility":"system","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgynScBmISJsQaXR4014AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
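The look-up-by-ID view above amounts to parsing the batch response and indexing its records. A minimal sketch in Python, assuming the raw response is a JSON array with the field names shown above (the `index_codes` helper is hypothetical, not part of the tool):

```python
import json

# A raw LLM batch response: one record per coded comment, with the
# four coding dimensions (responsibility, reasoning, policy, emotion).
raw_response = """
[
  {"id": "ytc_UgzijvHPfgviqRnbT_F4AaABAg",
   "responsibility": "government", "reasoning": "contractualist",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgynScBmISJsQaXR4014AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse a batch coding response and index the records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
print(codes["ytc_UgzijvHPfgviqRnbT_F4AaABAg"]["policy"])  # → regulate
```

With such an index, rendering the "Coding Result" table for a given comment ID is a simple dictionary lookup.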