Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I'm not a robot wind me up please and I will do it for you… (ytc_Ugx4J3KzA…)
- i think the only acceptable way of saying you own an ai art is if it is ai-assis… (ytc_Ugz7zXV7K…)
- She has been programmed to be very diplomatic. Avoiding tricky questions and g… (ytc_UgyfT9F43…)
- The worst thing humanity can do is unregulate AI and start an artificial intelli… (ytc_UgzNTjPaN…)
- can we pass laws banning ai please / we don’t need this in the public if shit like… (ytc_UgxfqGgSc…)
- I imagine Palantir is basically supplementing the Claude Code style harnesses de… (rdc_o856w4p)
- I think that a big problem in the AI safety discourse at the moment is that we a… (ytc_UgzmiJxCl…)
- "Real art is abelist, AI is accesable" first of, just becouse you are bad at dra… (ytc_UgzkP7BYJ…)
Comment
the man basically just told you that they creating AI and Robots to kill off humans🤨 it make you think that this “war” isn’t about what they say it make you think its really about this topic of discussion sound like a weapon of mass destruction(NUCLEAR WEAPON) to me 👁️
Platform: youtube
Category: AI Governance
Timestamp: 2025-09-06T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgwswHnHaifaBBOte-N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwF2WpH_3HooUeTvU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaeBMBwph_YWof71Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_r7SSSR6EPrKJri54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvTIt7IFCS6LI_mG54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw4LjCV4WInSBu24qR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzOXUfUZHSbbzW5FFx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzUj-5rmZZNE41vy4d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwVpvAFzdxRpdpj0fV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugx96vFLHJct_SAlK494AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```