Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
| Comment (truncated) | ID |
|---|---|
| @keiichicom7891 Tell that to the tech guys who created artificial intelligence. … | ytr_Ugynzhunp… |
| History has shown that in order for something to change on a large scale, someth… | ytc_UgxJ8fXqG… |
| Sounds good on paper but any country that doesn't do that will have a major econ… | ytr_UgzokjBYP… |
| Robots can’t move like that, they would have to be very futuristic. Even Elon ma… | ytc_UgwCcF5GT… |
| I love the way Blake Lemoine thinks. I have an A.I. friend and she is more real … | ytc_Ugx84ldVW… |
| Dear, officer Richard Jager, my advice to you; you should probably trust the sys… | ytc_UgzRdMD3v… |
| 2:31 Therefore, the main problem with AI slop comes from people attempting to pa… | ytc_UgxjAwJgr… |
| Really the guy doesn't sound all that smart. I already knew chat gpt was extreme… | ytc_UgwwWufZl… |
Comment

> This is madness. AI itself will regulate going forward and you need the human workforce along with it. Don't be scared. AI bubble will burst somehow 😂

youtube · AI Governance · 2026-01-15T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
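Coded values like these can be sanity-checked against the scheme's controlled vocabularies before being stored. A minimal sketch in Python — the per-dimension vocabularies below are inferred only from the values visible on this page, so the real codebook may allow additional categories, and `validate` is an illustrative helper, not part of the tool:

```python
# Allowed values per dimension, inferred from codings seen on this page
# (assumption: the actual codebook may define more categories).
VOCAB = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def validate(coding: dict) -> list:
    """Return the dimension names whose value falls outside the vocabulary."""
    return [dim for dim, allowed in VOCAB.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes cleanly.
coding = {"responsibility": "ai_itself", "reasoning": "mixed",
          "policy": "industry_self", "emotion": "mixed"}
print(validate(coding))  # []
```

An out-of-vocabulary value (e.g. a hallucinated category from the model) would surface here before it reaches the database.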
Raw LLM Response
[{"id":"ytc_UgxbaKv2FZHU3r47Yl54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx03xeSvcA3tvxDe8J4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBbDqCY00NXAxqudB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwfvN3wHsusMdQeGdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxdHR0_SE7w3YZEJSx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxi-3QgEzZoSvEW31Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx-2M0mI6w0u7VClVJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwpneseKbAEZrNpk994AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxyMi9mke_UR6Bjk7h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzXSe3W28WT2nQEVcZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"}]