Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "At least AI for military purposes are currently in its infancy at best. So we're…" (ytc_Ugyr20mjV…)
- "I thought Ai were actual beings that men created who were just really smart and …" (ytc_UgwXupdi0…)
- "Intelligence of the nation is the skilled and working class, intellect and educa…" (ytc_UgyqWeE3M…)
- "Do we have the energy to sustain this level of AI? Doesn’t AI require a ton of j…" (ytc_UgxQvCaZ7…)
- "I did this and it told me everyone is taking control over society elon musk is t…" (ytc_Ugw2b2zdo…)
- "It’s already a problem. Some folk are believing those obviously fake AI videos. …" (ytc_UgwqOwcSk…)
- "The difference isn't the medium of the art, it's that Ai art isn't made by a per…" (ytc_Ugx-Z96Em…)
- "I think too many crongresscritters are waiting until they can figure out how to …" (ytc_UgxCI_0lN…)
Comment
AI will not take over morals and ethics. By definition it doesn't have emotion - which is where you and I get that sense of 'this is wrong'. Immoral, unethical science is what all our Sci-fi horror and awful history is based on. Time to pray for a solar flare.
youtube · AI Governance · 2026-02-12T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyQDwBo1lDE75gIyK14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxijyw2JTSmgFlUD114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwVVn5I-BfeXY47f6V4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzsAHRfVavkdujhzeV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYGu41aUmCo62uTJx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzAdFQp3fkXaPKoSgp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwQPOE1FaNy5O0Wlct4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyjiJxO82MINJJ7F894AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz3bsmp2NCLLSz2yqB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgzfxCQvUWZ5U7VINDx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
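
The raw response above is a plain JSON array, one object per coded comment. A minimal sketch of the "look up by comment ID" step, assuming only the field names visible in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function name is illustrative, not part of the tool:

```python
import json

# Two rows copied from the raw LLM response above, used as sample input.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzAdFQp3fkXaPKoSgp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwQPOE1FaNy5O0Wlct4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw response array and key each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(RAW_RESPONSE)
coding = codings["ytc_UgzAdFQp3fkXaPKoSgp4AaABAg"]
print(coding["emotion"])  # fear
```

A dict keyed by ID makes each inspection an O(1) lookup rather than a scan of the array, which matters when a batch response covers many comments.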