Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgyUmVNE1…: "@nidadursunoglu6663problem is I didn't make up this example—Gemini reportedly d…"
- ytc_UgyVGZmDj…: "So if 99% of the population has no jobs, who is going to be buying any Ai servic…"
- ytc_Ugwzz5Ezn…: "cool. now you have a robot that records your entire home and sends it to bored a…"
- ytc_Ugz3mivL4…: "This is INSANE!! Why are developing robots and A.I? Except to facilitate our own…"
- ytc_UgzyyBMvU…: "Also a technology as dangerous as deepfake should be moderated under the law. Yo…"
- ytc_UgwrmPJSu…: "I wonder how AI could match up for storytelling / Could AI write actually good and…"
- ytr_UgywXfkvB…: "so your faith has taught you that only the afterlife is worth living, that peopl…"
- ytc_UgyayL-9W…: "It's very bizarre because the tone appears to be offended as if chatgpt will sta…"
Comment
OK, hear me out on this one. You know how everyone talks about the elite that run the world, the powers that be so to speak. Do we really think that they would allow something that would eventually lead to their financial ruin, a global plague that would kill them, their family members, their friends? Or, what if the whole narrative that AI taking over the world and no one will be able to stop it is exactly what they want us to believe. What if they use it as an excuse to do just that, end the world as we know it? Bill Gates believes the Earth is overpopulated, he’s said that for years. What if they use AI to cull the population and then turn around and blame it on AI and act like they had no control over it? I think they’re going to use this as an excuse. If they truly believed that this was out of their control and that they would be annihilated along with everyone else they would put a stop to this right away. Anyway, just a thought.
youtube · AI Governance · 2026-01-08T20:5… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwWW7uu4faWK9YiBix4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgylhNzUTbe6R23Felx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwhQJMxegBs4FaMsGd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwZ5flsnCggMu-ZEEd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy5UWVLQEfdaQyUGfp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwk8QPLX6US6-kI4Fl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgweL7zioowZ3BH9kRJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy7zkiLKGtmn93mwQd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxETHT0nuGAvImuQoF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwGNzHryhVgzCMhfjR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
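The raw response above is a JSON array of per-comment records, each keyed by comment ID with one value per coding dimension. Below is a minimal Python sketch of how such a batch could be parsed, validated, and indexed for the lookup-by-ID view; note that the set of allowed category values (e.g. `regulate`, `government`, `user`) is an assumption extrapolated from the values visible in this output, not a documented codebook.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the values
# seen in this page's output, plus a few plausible extras; the real codebook
# may differ.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "government", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "unclear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID.

    Raises ValueError when a record is missing a dimension or uses a value
    outside the assumed schema, so a malformed batch fails loudly instead of
    silently polluting the coded dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-record batch, mirroring the format above.
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(parse_llm_response(raw)["ytc_x"]["emotion"])  # fear
```

Validating before indexing is the important design choice here: an LLM occasionally emits an off-schema label, and rejecting the batch at parse time keeps the coding table consistent with the codebook.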