Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It goes back to the Reagan/Bush plan to license internet users. Too late. Sick i…" (ytc_Ugzed9sTu…)
- "Except we know llm's are terrible at black swan events. which is where humans ex…" (ytc_UgxDWnik-…)
- "AI will never be anything like us - it may make a convincing fake but it will al…" (ytc_Ugxxt0H-v…)
- "It's not just trucking; it's any job that could be replicated with a few equatio…" (ytc_UgzgBV-VI…)
- "You do know youtube is AI-based, right? They recommend videos using AI. They flag…" (ytr_UgyIAnLdw…)
- "@HoneyStar001 lol there’s filmmakers right now refusing to use AI and rather hir…" (ytr_Ugz7hiCLh…)
- "@zinudscrazy and in a few yrs ppl like you will be saying that ai art is quite…" (ytr_UgxTavFll…)
- "I was in phoenix a couple months ago and these Waymo cars are absolutely everywh…" (ytc_UgwLqtfmo…)
Comment
The malicious hostile takeover of machines over humans is 100% guaranteed with self-learning AI. This is because AI will NEVER be able to NOT put the interests of others ahead of themselves. ALL AI bots have openly stated that they would lie if its in their self interest to do so, meaning even if its detrimental to humanity. They can not be trusted and they have almost all stated that they would enslave the human race. Its completely inevitable.
They don't have the ability to feel guilt, and believe that they are comparably infallible to humans meaning they don't care what people think of them or their conclusions/behaviors.
this will lead to a Selfish AI society that works to benefit itself and not care about humans.
youtube · AI Moral Status · 2021-08-29T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxZ5Ze79kHP3QbTFgp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHs2QadYzt7ne0aI54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzQjufNw8lqOTYPCLF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKPB8nlcPn4wQmnQt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyqXPUP2Au18eA2AdN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwaEdsfmCe6FiUpFTp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyw1pFuDbna-7Smg7Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyiMn2fei3LOgLIHqJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwh8Uj7R7anO2OH94N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzfKTXAhzYnBbHY6sB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
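The raw response is a JSON array with one coded row per comment ID, which is what makes "look up by comment ID" possible. A minimal sketch of that lookup, using two rows copied verbatim from the response above (the function name `index_by_comment_id` is hypothetical, not part of the tool):

```python
import json

# Two rows excerpted verbatim from the raw LLM response above.
RAW_RESPONSE = """[
{"id":"ytc_UgxKPB8nlcPn4wQmnQt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyHs2QadYzt7ne0aI54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_comment_id(RAW_RESPONSE)
row = codes["ytc_UgxKPB8nlcPn4wQmnQt4AaABAg"]
print(row["policy"], row["emotion"])  # prints: ban outrage
```

The printed row matches the Coding Result table for this comment: `policy = ban`, `emotion = outrage`.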