Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgxPMmnHJ… — "I also hate that they now have AI on Google Docs just so it can train through ev…"
- ytc_UgxmBk4Ej… — "Let's just let ai do everything, so we can question the purpuse of life again…"
- ytc_Ugjnw_pI2… — "This problem also doesn't happen if all, or almost all of the cars on the road a…"
- rdc_ohwiu22 — "Are you in US? If you are in Europe, you can get sue shit out of them, as it's s…"
- ytc_UgyZW1aEH… — "A drawing is either a description of the outside world seen with the eyes of the…"
- ytc_UgyUkvW7T… — "Guys mind: I can be from Jurassic park / Guy:hey hey hey look at me. / Robot:YEET…"
- rdc_m94ba1f — "Most of us outside the USA don't care where our AI comes from. If the USA ai co…"
- ytr_UgzVfiIcM… — "@dizietz The irony is that it is easily possible... you just have to use a large…"
Comment
Third interview in a row where the AI dweeb says the negative consequences of AI are bad, but it is our future. Oh, really? What makes these dweebs think 350 million Americans are going to roll over and just let these dweebs do whatever. I can assure you if the consequences get bad enough 350 milliin Americans will come out with they hay forks, ARs and woodchippers. These dweebs and their robots can't stand against the grill of an F-250. Their massive data centers will get pushed over by our front loaders and escavators. THAT is the consequence of failing to plan for seamless roll out integration. So they better get their heads out of each other's ass and start thinking this through.
youtube · AI Governance · 2026-02-09T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwXqsYlZnEgn6hS9hV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxGAgHenT5PHXhPWh94AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzNt1SwhncI5LQfYbJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwpQOG25Y1f3qYYEdt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwXF8G9o2jl11Li3Ip4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwpdBmtk0Nk_Vt0p7V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw1zz5egta75bx0Zvp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwdRr3B2UJscQgKefR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgywujivGkWdavB14Ll4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxMiY0D9Gvy8ldGmxx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
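The raw response above is a JSON array with one coded record per comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and looked up by ID, with out-of-vocabulary values rejected — note that the dimension vocabularies below are assumptions inferred only from the values visible on this page, not an authoritative codebook:

```python
import json

# A fragment of raw model output: one coded record per comment
# (the ID here is taken from the response shown above).
raw = '''[
  {"id": "ytc_UgwpQOG25Y1f3qYYEdt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]'''

# Allowed values per coding dimension -- inferred from the codes visible
# on this page; the real codebook may differ.
VOCAB = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed",
                  "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def index_codes(payload: str) -> dict:
    """Parse the model output and index records by comment ID,
    dropping any record with an out-of-vocabulary dimension value."""
    by_id = {}
    for rec in json.loads(payload):
        if all(rec.get(dim) in allowed for dim, allowed in VOCAB.items()):
            by_id[rec["id"]] = rec
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgwpQOG25Y1f3qYYEdt4AaABAg"]["policy"])  # liability
```

The validation step is what makes the "inspect the exact model output" view useful: a record that silently fails vocabulary checks would otherwise surface as a gap in the coded table above.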