Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples
- @JedsterNYC I don't really know about dentists and hairdressers. You sit in a ch… (`ytr_UgwJkGVgP…`)
- AI is the Microsoft Office of this generation. Kids in 20 years will complain ab… (`ytc_Ugw-qEdvN…`)
- It's the real danger with AI: people are rushing to use it & getting some seriou… (`ytc_Ugxcl5Gm5…`)
- Trades jobs aren't real jobs. AI painters, AI plumbers, AI pipe layers. I ho… (`ytc_UgxACHml2…`)
- I blame this vid for my over 120 hours at least of screen time on character ai. … (`ytc_UgxTfrMxC…`)
- I do not think it has something to do with AI or programmed response not being c… (`ytc_UgynWbzYF…`)
- A person using AI to generate art isn’t acting as an artist, they’re acting as a… (`ytc_UgxFYeK89…`)
- Even if they were legitimately attempting an experimental new approach to lawyer… (`ytc_UgyDBvhhA…`)
Comment
Again, when you have Republicans at the helm, you have little to no oversight on being over-capitalized in the interest of solely making money. AI has many good uses, but it should never replace human intervention nor assume that there won't be human need. If we can protect vulnerable populations from being victimized from AI and keep it out of the hands of a few tech billionaires who have no interest in the human condition, then I think we can succeed. But at present, I see no sign of this with the likes of Musk, Zuckerberg, Thiel, Bezos, or Ellison. They are greedy, self-centered bastards that only use the technology for their own advantage and financial gain and not to help people or society improve.
youtube · AI Governance · 2025-12-30T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzTEofDEFbZGNj7fCR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4fCgyRxLaKhAMmKV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz1H2JsitOngbxxJRV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzWb_F6gvh4Kx4Gxzt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyBrsQS3mOEu8NzgPJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzLnFGY8OvjpLIG6Pl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzPXXR9S45Yhabp2KF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwonoOM4c1fqFYhMwJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwuYyX9rSIdpkaV9NN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxSBwQsgEdW0Z_Mkn94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
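The lookup-by-ID step above can be sketched in a few lines: the raw model output is a JSON array with one coding object per comment, so parsing it and indexing by the `id` field recovers the per-comment dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) shown in the table. This is a minimal sketch assuming only the response shape shown here; the helper name `index_by_id` is illustrative, not part of the tool.

```python
import json

# A small excerpt of the raw model output shown above: a JSON array
# with one coding object per comment.
raw_response = """
[
  {"id": "ytc_UgzWb_F6gvh4Kx4Gxzt4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyBrsQS3mOEu8NzgPJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each coding object by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_id(raw_response)
coding = codes["ytc_UgzWb_F6gvh4Kx4Gxzt4AaABAg"]
print(coding["policy"], coding["emotion"])  # -> regulate outrage
```

Keying on the full (untruncated) comment ID is what makes the inspect view above possible: the same ID ties the comment record to its row in the model's response.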