Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Possible reason for Google to be dismissive... They are creating a sentient AI t…" (ytc_Ugy9PMlQt…)
- "Oh yeah I had one on a Godzilla fanpage / Theirs so many art thieves with ai / A…" (ytr_UgyhjeeSK…)
- "It doesn’t. / Another logical conclusion of mine from the AI boom is that you sho…" (rdc_n7wkc2h)
- "When I was a kid I used to fear being an artist because whenever I looked online…" (ytc_Ugw17fQyO…)
- "i am genuinely happy to see ASI. No more dumb politicians, scientists are taken …" (ytc_UgxjHo5_d…)
- "Also he did forgot to mention about the mass layoffs in the gaming industry, and…" (ytr_UgzT0g0qh…)
- "Hate? Let me tell you how much i hate ai. If the word hate was engraved into eac…" (ytc_Ugwa_CNne…)
- "This is just good. If you have money to make the high risk ai stuff you have mon…" (ytc_UgwbdIAAO…)
Comment
When you ask Max how many lives he would end in order to save AI, he's clearly working under a picture of the future where AI is seamlessly integrated into critical infrastructure, so shutting down all AI would mean crashing civilization's critical infrastructure.
Max said you can't shut down AI without collapsing society. So he's operating under the impression that he's saving humanity by keeping AI running.
youtube
2025-11-01T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyIPBEUeu0gndIC2mp4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwMxIbC8lThRrSywDh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6M5mYzOsT5gnXeYB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzGlwG6v1LgQjvhz6t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxgjKuX5Bj8d9Wlyzl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugza2swWbvKBUBzRZsV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx6vjWo8Rdu0N339gp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzrq0pzhXczBwQW8Sl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy_vIHbFrljOqiUx1B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx9I-Bx8Hb4k0nQBeR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
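The raw response above is a JSON array of per-comment coding records, one object per comment ID with the four coded dimensions (responsibility, reasoning, policy, emotion). The "look up by comment ID" view can be sketched as below — a minimal illustration, not the tool's actual implementation; `raw_response` is a two-record stand-in for the full array, and `index_by_comment_id` is a hypothetical helper:

```python
import json

# Stand-in for the raw LLM response shown above (two records from the dump).
raw_response = """
[
  {"id": "ytc_UgyIPBEUeu0gndIC2mp4AaABAg", "responsibility": "user",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzrq0pzhXczBwQW8Sl4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
# Look up one coded comment by ID, as in the search box above.
print(codes["ytc_Ugzrq0pzhXczBwQW8Sl4AaABAg"]["responsibility"])  # distributed
```

With the full response loaded, the same dictionary serves both views: direct lookup by comment ID, and iteration over `codes.values()` to render random samples.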