Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
This is sooooo soooo wrong!!!! God help this nation. She sits and convinces peo… (ytc_UgxW4mQvC…)
i feel like the rise of AI in schools is kinda reflective of the school system i… (ytc_UgzoNpUMI…)
I do the same thing in my head thinking "really? It's just AI" and yet I thank h… (ytc_UgxFZTFIv…)
Ngl I don't realize what is the problem, personally I don't see AI art in the sa… (ytc_UgywXIr-F…)
Ai will rob those writers because it will do a better job. And people will be h… (ytr_UgwLunTLD…)
"I was only following orders." and title "Swiss Banks Admit to Holding Accounts … (ytr_Ugxa0mx9T…)
Honestly i think using pictures from Google as reference is just inferior to vid… (ytc_UgxcjFpBO…)
Or maybe even if the AI gets into someone’s head and tells them to murder someon… (ytr_UgyhMfTaP…)
Comment
Climate change is a scam.
(Yes, climate changes. It has always been changing. But we do not even know which way it is going.)
In any case, even if I believed that climate change is caused by humans and that it is going to obliterate the human race... We are still talking about the timescales that make it all a joke of a threat anyway.
By the time we are endangered, we will have the technology to control it.
And if not, then sorry - I cannot for one moment believe that we have the capacity to predict what the climate will look like in 100 years today, but we will not be able to do anything about it 100 years into the future...
And just so it is clear, I am on board with the nuclear threats as well as biological threats (although I think both are less dangerous than AI), I just consider it ridiculous that "climate change" is being listed next to them as an actual doomsday scenario.
It just takes all the seriousness away from all of those other threats.
youtube · AI Governance · 2026-02-24T19:3… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyguxlSmhlIKh4gZdd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgweDRYen7rHTPUc3lR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgywEqcCcgDCABDNsJt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWtSy0N1tjtMhxcZd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVfVxsND3Ua3tNcqV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwrISWJ7hLjSvcP1Zd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwyK5F1g2Q8-W0m5Wl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzGSkrwJrn7aXPYS454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlL3VQZqpQHX0KRBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0R9HqqG275eEcUxt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
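The raw response above is a JSON array of records, one per comment, keyed by comment ID with the four coding dimensions. A minimal sketch of how such a response could be parsed and indexed for the lookup-by-ID view, assuming the value sets observed in this batch (the actual codebooks may allow more values than appear here, and `index_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Abbreviated raw LLM response: two records copied from the batch above.
raw = '''
[
  {"id":"ytc_UgzGSkrwJrn7aXPYS454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyVfVxsND3Ua3tNcqV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
'''

# Value sets observed in this batch; the real codebooks may be larger (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed"},
}

def index_codings(raw_json: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID,
    skipping any record whose dimension values fall outside ALLOWED."""
    by_id = {}
    for rec in json.loads(raw_json):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgzGSkrwJrn7aXPYS454AaABAg"]["emotion"])  # indifference
```

Indexing by ID makes the "Look up by comment ID" view a single dictionary access, and the validation step guards against the model emitting labels outside the codebook.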