Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgzBUp9cq…: "I'd argue if it's a truly aligned superintelligence, it would see those efforts …"
- ytc_UgylxB8WN…: "lol no thanks. ask where tech ceo are sending their kids, you can bet the top on…"
- rdc_n7k4d7z: "Well they didn't entirely nail it. Yes the naming scheme changes are a "*product…"
- ytc_UgyMkEnmY…: "Maybe AI and computer Oligarchs control the media ,politics and economy. Such …"
- ytc_Ugz1JzHYN…: "The people that fear AI the most also fear the “global order”. Yet somehow they …"
- ytc_UgxEfyNhj…: "replacing all or even most employees with a.i is stupid asf because when all the…"
- ytc_Ugwr9Gz0R…: "People act as if physical art, digital art and photography are all the same. The…"
- ytr_Ugw8zZhO0…: "I'm disgusted in your behaviour working towards the demise of humanity. You shou…"
Comment
This will probably already have been mentioned, but one scenario would be that humanity just gets wiped out because AI's make climate change worse to try and solve it. We might see a nuclear winter style change, because it reasons it should lower overall temperature to keep running the best it can and solve its problems in 1 shot.
Like they said in the video, there's millions of ways this could end badly.
Source: youtube · AI Moral Status · 2026-02-08T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
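Each coding result is a record whose four dimensions take values from a closed codebook. A minimal sketch in Python, assuming the value sets are exactly those observed in the samples on this page (the real codebook may include values not seen here):

```python
from dataclasses import dataclass

# Dimension values observed in this page's coded samples (assumption:
# the full codebook may contain additional values).
RESPONSIBILITY = {"developer", "company", "ai_itself", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"regulate", "liability", "ban", "none", "unclear"}
EMOTION = {"fear", "outrage", "resignation", "indifference", "mixed"}


@dataclass
class CodedComment:
    """One coded comment, matching the keys in the raw LLM response."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # True only if every dimension uses a value from the observed codebook.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```

Validating against a closed value set like this catches the common failure mode where the model invents an off-codebook label.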
Raw LLM Response
```json
[
{"id":"ytc_UgzJrN25Teyc-btld014AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzhD0gSRExJAwS0zah4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy04Sg04fwpiuLQ4894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxaYhBPs4zE_99anzN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwwuENLS4A5s89JlVR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx9F7Le4mP8wIJOi5N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyTXgAVrJyhNmWFsIt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwFYtpcIqNRv9ueuzd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyo4GC6ns59ypmOq-N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxoezlHXCPGTlCG_dh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
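The raw response is a JSON array of per-comment codes. A minimal sketch of loading such an array and indexing it by comment ID, which is the "look up by comment ID" operation this page offers (shown here on a two-row subset of the array above):

```python
import json

# Two rows copied from the raw LLM response above, as a sample payload.
raw = """[
{"id":"ytc_UgzJrN25Teyc-btld014AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwwuENLS4A5s89JlVR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Index every coded row by its comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's codes by ID.
row = codes["ytc_UgwwuENLS4A5s89JlVR4AaABAg"]
print(row["responsibility"], row["emotion"])  # ai_itself fear
```

Because the model returns the comment ID inside each object, the batch response can be joined back to the source comments without relying on output order.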