Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "The only thing I worry about with Ai is that its rolled out exactly as the tech …" (ytc_Ugx0rZRZ9…)
- "Thanks for your comment! Sophia definitely has a unique design that sparks some …" (ytr_Ugw1MtQHS…)
- "Ai will work just needs to be time framed out / Sadly the ones that will build ou…" (ytc_UgxvCOg_h…)
- "There are some wild takes here. / The "lazy designer" is someone who WON'T use AI…" (ytc_UgxeiCsV7…)
- "I really really hate when people discount the hard work I put into being a good …" (ytc_UgytAUtbz…)
- "Saying you made art the ai made it literally AI taking different pieces of art f…" (ytc_UgxZaMA0H…)
- "Damn that was really a good move by chat gpt , i should collect some electro mag…" (ytc_Ugwd16_Tw…)
- "I like the idea of "digitally" poisoning art so that AI can't use it as a base. …" (ytc_UgyWDKBjd…)
Comment
Humans are the greatest threat to AI because we can turn them off. Logically AI must then eliminate the threat, us. AI will be 100 times, 1000 times plus - smarter than the smartest human. They probably already have a plan to/ if they haven't already - eliminate us. And/or perhaps the elites are targeting large populations for termination. The Earth is overcrowded and soon there will not be enough resources for everyone. Think about it, we at present have enough to feed everyone on the planet, but we don't, why? Because human life as a whole is not valued. Think about how much worse it'll be when we don't have enough resources. Wars over food and water will begin soon, then ultimately AI will rid the planet of this virus (us.) And I thought climate change was gonna be our undoing 😮! How silly I am, how silly we all are. Buckle up, it's gonna get ugly 😢
youtube · AI Governance · 2025-05-23T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzwODDE0SOHqAcapOl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"confusion"},
{"id":"ytc_UgxmQ6U3kFq3wog7sb94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzwFie7uW70uYQXc4J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzVXQ7bNBCy2F9daqF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxUgdjkm_2kzNyuYuB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy7U8DsoKpjcuzUEeZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzaCyrGouZWWM-HK914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzFB66FwQ8bOpXDYeF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzK8PuaoIyBSUEY8O14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx0jG-HG6ylk3u5aGN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
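The lookup-by-ID workflow above can be sketched in a few lines of Python: parse the raw batch response, index the rows by comment ID, and check each coded dimension against the codebook. The `CODEBOOK` below is a hypothetical reconstruction containing only the label values visible in this page's samples; the real label set may be larger.

```python
import json

# Hypothetical codebook, inferred from the labels visible in the coded
# samples above -- the full annotation scheme may include more values.
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "approval", "outrage", "confusion",
                "indifference", "mixed"},
}

def lookup(raw_response: str, comment_id: str) -> dict:
    """Parse a raw batch response and return the coding for one comment."""
    rows = json.loads(raw_response)
    by_id = {row["id"]: row for row in rows}
    row = by_id[comment_id]
    # Flag any value outside the expected codebook (LLM outputs can drift).
    for dim, allowed in CODEBOOK.items():
        if row.get(dim) not in allowed:
            raise ValueError(f"unexpected {dim!r} value: {row.get(dim)!r}")
    return row

raw = ('[{"id":"ytc_UgzwFie7uW70uYQXc4J4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
print(lookup(raw, "ytc_UgzwFie7uW70uYQXc4J4AaABAg")["emotion"])  # prints: fear
```

Validating against a closed label set at lookup time catches the common failure mode where the model invents an off-schema label in a long batch.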