Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (previews truncated):

- ytc_UgxlAD_sH…: thank you for this interview.. this is Totally terrifying and obviously 100% WRO…
- ytc_UgwP8cOv4…: I asked ChatGPT. "If a hacker gains control of a person's device or account, the…
- ytc_UgywEJd3B…: There is a one key element to ai, one fundamental problem - AI is here for engag…
- ytc_UgxONBXGZ…: I make traditional art and take photos of them in dim lighting. One day, I conta…
- ytc_Ugy-4MY1N…: 2:01 This is wrong, the correct answer is "We don't want them to become our comp…
- ytr_Ugy3c0sBN…: Kinda late but there definitely is more push back, I see more "normies" object t…
- ytc_UgwQsq1-S…: Ai aint taking shit, the dumbest fucks on the planet bought up all the dram and …
- ytc_Ugz5Vfbgw…: 🎯 Key points for quick navigation: 00:00 *🧠 Artificial Intelligence Generates U…
Comment

1:00:30 "There's this weird thing... when I talk about A.I. stuff, people say maybe humanity deserves to die." - It's actually becoming a mainstream sentiment.

Watch the TV series Westworld. It lasted 4 seasons and was all about how humanity is bad and AI deserves to take the world from us.

Heck, it's not just A.I. When the topic is the environment it's also a common perspective. The idea that our destruction would be good because it would "save" the planet is something that has been growing for over 50 years now.

Source: youtube | Video: AI Moral Status | Posted: 2025-10-31T04:0… | Likes: 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwk3yYIJh1pwzxwKyN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwi2xjqi-pdQPTTlxd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx7-LkrL2fC3fUcJfB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwp3_F1Gv3Fe2k-TyF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzaolg_zLprYoPGCpp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxPuWEf9dSiucEu9ll4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw0h810WfN94wnGoxB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwNwLu5J5hQkzBQbbx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgygBs5NN5oRKAiUsEx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyIvZPXfkqxUIAhCPl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
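The raw response above is a JSON array in which each element codes one comment along the four dimensions shown in the table. A minimal sketch of how such a batch might be parsed and validated is below; the allowed labels in `CODEBOOK` are inferred only from the values visible on this page (the real codebook may define more categories), and `parse_codings` is a hypothetical helper name, not part of the actual pipeline.

```python
import json

# Allowed labels per coding dimension, inferred from values seen on this
# page; the project's real codebook may differ (assumption).
CODEBOOK = {
    "responsibility": {"company", "developer", "ai_itself", "distributed",
                       "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "resignation",
                "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    lookup keyed by comment ID, rejecting rows that are missing an ID or
    carry an off-codebook label."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        # Keep only the known dimensions, in codebook order.
        coded[cid] = {dim: row[dim] for dim in CODEBOOK}
    return coded
```

Validating each batch this way makes a malformed or hallucinated label fail loudly at ingest time instead of silently polluting the coded dataset; an alternative design would map bad labels to `"unclear"` and log them rather than raising.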