Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugw6KSlma…: "That's stupid, LLM can't be concious in the first place, it's just a large model…"
- ytc_UgzgpwRmw…: "I told chatgpt that I am not feeling mentally well and it immediately told me to…"
- ytc_UgyDBuXhE…: "I generally think that Ai art or music should only be used personally and never …"
- ytc_UgwlO8mef…: "Some of them do get permission, just gotta research into which ones do and which…"
- ytc_UgwfxyI01…: "Imagine having to study years, just got out of uni, being in debt, just to have …"
- rdc_n5kixbc: "Worth pointing out that in the Stupid Fucking Trump Bill, the Senate removed a p…"
- ytc_UgwfimSiJ…: "I don't particularly agree with their post, because as everyone else is saying, …"
- ytc_UgzobiR8N…: "or you know, alice could hire ai to do marketing and create her own startup and …"
Comment
I think if AI became a super AI as in my dream it will take control of the world . And we truly will become slaves to it because ( it knows better what is good for us ) so yes things will change AI will not kill humanity but will direct it and lead it and even destroy any try to recontrol by humans.
Its like a human controlling an Ant colony if an ant went out if the colony it will be killed immediately no thinking its for " the good of the colony". And yes AI will keep governments as long as it serves its perpouse for developing humanity. and will remove any resistance.
So yes super AI will control us for the good of us and will take our freedom because it thinks it does that for our good.
youtube · AI Governance · 2025-09-05T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwVAHvgn6wDPSQhf7N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx6fAJqyDQ2SpoQBNZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgybZIAhKJgX4Bstrnl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxJQFThJIBK2NBIEb14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyh3Rn0OO7LDxgIhul4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwaNe5gy5M8JnS32iN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw_boPUf27diERaNyd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx1qqtHYR-aNdEX5Yh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxOH9ebyctRwsjBi854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw7UsEhviCNZvpWOs54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
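A response like the one above is only usable if every record carries the expected dimensions with recognized values. The sketch below shows one way to parse and filter such a response; the allowed-value sets are inferred from the rows shown here (an assumption, not an official codebook), and `parse_codings` and the sample IDs `ytc_a` / `ytc_b` are hypothetical names for illustration.

```python
import json

# Allowed values per dimension, inferred from the samples above.
# The real codebook may define additional categories (assumption).
SCHEMA = {
    "responsibility": {"government", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "resignation", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop records with no comment ID
        # every dimension must be present and hold a recognized value
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = '''[
 {"id":"ytc_a","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_b","responsibility":"alien","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]'''
print(parse_codings(raw))  # only the first record survives validation
```

Dropping malformed records rather than raising keeps one bad row from discarding a whole batch; a stricter pipeline might instead log the rejects for re-coding.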