## Raw LLM Responses

Inspect the exact model output for any coded comment.
### Comment

> The vast majority of people will no longer be needed by these individuals, as machines and artificial intelligence will replace the vast majority of the world's population. Therefore, they must already be planning a way to exterminate the undesirables so that only a tiny fraction can inherit the planet Earth.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2026-04-20T20:0… |
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
  {"id":"ytc_UgxSBiA4fMkzapFNoul4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxJ3-alPC27h5kk8WJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxkr4EsuUk3IV_errh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxZSqg6QEfyjtLJegt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxmafQrz0Av6n78q6l4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgygkseFd1HLE2ErTm54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwPd6FftoHX4pmd-tp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyFl74l1ay5bCgzR9l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyFw7UgE3PP2EvTRhJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgxmBi7t03LBZ_aXZOh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"}
]
```
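A raw batch response like the one above has to be parsed and validated before its rows become per-comment coding results. The sketch below shows one way to do that in Python, assuming the category vocabularies visible in the examples (e.g. `responsibility` drawn from `developer`, `company`, `user`, `ai_itself`, `distributed`, `unclear`); the actual codebook used by this tool may define different or additional values.

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample rows shown above; the real codebook may differ (assumption).
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "outrage", "resignation", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse an LLM batch response and keep only rows whose labels are valid.

    Returns the rows that have a comment id and an allowed value for every
    dimension in SCHEMA; anything else is silently dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or not row.get("id"):
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Invalid rows are dropped rather than repaired here; a production pipeline would more likely log them and queue the affected comments for re-coding, so that a malformed batch does not silently shrink the sample.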