Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Humans will be of no use to Ai.There will be peace for a thousand years????…" (ytc_Ugycs9yHU…)
- "I just want to know how this is possible. Like what's the technology behind it? …" (rdc_ohz4wn5)
- "Hey @nicholasholland3235, thanks for commenting! We're definitely pushing the li…" (ytr_UgyafPQXv…)
- "Blame the people in power, they are blinded by greed
  And once it all collapses a…" (ytr_UgwByplqN…)
- "Ai as it exist now is a collection of automation algorithms. so it can't hide a…" (ytc_UgzDTpLKI…)
- "Mathematicians are not calculators and they never was , but programmers write co…" (ytc_Ugx72SkZj…)
- "Someone who is in debt $100,000 from student loans is typically going to have a …" (rdc_d7korvm)
- "AI is not that smart or its programmed to give very poorly thought answers, say …" (ytc_Ugy5rLkop…)
Comment
What I take from this is that the greatest danger of AI is social unrest because of massive unemployment, all fueled by a race for maximum profit in a Capitalist system. Who are going to be the customers if no one has an income? Hungry people with knowledge, skills and tools can easily topple a political system where the criminals because of their wealth are so visible. The solution may be a police force and military of humanoid AI robots, but I think that at that point even grandma would have grabbed her Old Bess and headed for the Capital. Please don't bring the "Terminator" movie into reality, because we remember who pushed it and profitted from it. The AI can be unplugged if we chose to do that, and this issue will hopefully be dealt with in a fair way, but human nature is so predicatble, so I fear not.
youtube · AI Governance · 2025-11-02T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzeKFzktXRQUrXJ19Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzCkNEMh1YtNxkA0V14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxZVQUwMIIgIHYGSCF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw0dDjrKhv-ehVHh8h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-437MyP8xnxSZmN14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyNBub1m8HksaArBO94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwk3C-niINy0DZ8twd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz7iJOHwpKhBxsgiMR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz78MgPd0EYE-RlZlN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxslr7nIjiV-hd7l4Z4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
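The raw response is a JSON array of records, one per comment ID, with one value per coding dimension. A minimal Python sketch of how such a response can be parsed and checked against the codebook — assuming the allowed category sets are exactly the values observed in this sample (the full codebook may define more):

```python
import json

# Category sets observed in the sample response above (assumption: the real
# codebook may contain additional values).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "government", "user", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Example with the first record from the sample output.
raw = ('[{"id":"ytc_UgzeKFzktXRQUrXJ19Z4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codings = validate_codings(raw)
print(codings[0]["emotion"])  # fear
```

A check like this is useful as a guardrail before ingesting codings: a hallucinated category or a missing ID fails loudly instead of silently entering the dataset.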