Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "But ai can be configured in such a way that all the work of human developers can…" (ytc_UgwY2euVp…)
- "Nah, they’re the only one who always focus on positives. In these few months job…" (ytr_UgysK3KHb…)
- "Ok lemme phrase it this way. What is art? It(the ai) is creating nothing. It is…" (ytc_Ugw9kXZe2…)
- "It isn't preposterous at all, it's just plain fact. Humans don't have inbuilt GP…" (ytc_UgyUe-5vT…)
- "Biggest spy ops in american history,making america great again .Ther was another…" (ytc_UgwFB59lP…)
- "I don't comment on YouTube videos but I needed to here. I spend every day of my …" (ytc_UgyEFQOca…)
- "@UnemployedStormtrooper ya and that's the kind of basic bitch thinking I'm accu…" (ytr_UgwBpKTMk…)
- "copilot is a tool. Just like hammer. Great for some things, not so much for othe…" (ytc_Ugx0-1egi…)
Comment
Utter nonsense. Genuine profit seeking companies do not want war. It is governments that lead us into death and so no, I would rather trust a private profit seeking companies to have the leads on AI than centralised governments. History is on my side.
As we watch the "godfather" of AI pander to an already politicized field on academia and science.
Source: youtube · AI Responsibility · 2025-07-23T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxOabUNkGS9MtFR_6h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzn8profBdVsYYWRkJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwVFe7BVwjrDaPaQo54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyiHGO5ATQdbipcPq94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzHL2SxgjqcAS-CO3V4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw-OonEfbpp7MnYx6N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwfkaQWhtExfpiW-QV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJ2faeC74Ad9Ax99h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxaH1amCKG4niFJlRZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxRH-MGYo3X6Iz3iwF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
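The per-comment view above is just a lookup into this batch array. A minimal sketch of that lookup, assuming the raw response is a JSON array of objects keyed by `id` as shown (the function name `index_by_id` is illustrative, not the tool's actual API):

```python
import json

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM batch response (JSON array) and map
    comment ID -> its coded dimensions (id field dropped)."""
    rows = json.loads(raw_response)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

# One row from the batch response above, used as sample input.
raw = ('[{"id":"ytc_UgyiHGO5ATQdbipcPq94AaABAg",'
       '"responsibility":"government","reasoning":"consequentialist",'
       '"policy":"industry_self","emotion":"approval"}]')

coded = index_by_id(raw)
print(coded["ytc_UgyiHGO5ATQdbipcPq94AaABAg"]["policy"])  # industry_self
```

Indexing once and looking up by ID keeps inspection O(1) per comment even when a batch codes hundreds of rows.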