Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Humans learn by spending hours of practice on their pieces, AI art uses literal …" (`ytr_UgwFtcWav…`)
- "AI websites need a citation library available publicly as to not promote chat bo…" (`ytc_UgyQbi5jj…`)
- "as an east asian historian myself, you did the right thing. if you use ai for th…" (`ytr_UgxdvSQeR…`)
- "Just wanted to say that the theft of it is MORE THAN ENOUGH of a reason for arti…" (`ytc_UgyPLtOAA…`)
- "Me as an artist: okay, there's still a chance I'll have a job! :D / Me as an offi…" (`ytc_UgyqCPaW7…`)
- ""ai bad cuz it cucks you out of your humanity" / did i get it right?…" (`ytc_UgzmVCQTU…`)
- "One of these guys seems truly determined to ignore that he has a wonderful privi…" (`ytc_UgytqZkYn…`)
- "Very naive of you, if AI advances even further you could probably get that type …" (`ytc_Ugzi0I_r9…`)
Comment
Well done on letting the debate play out, it can't have been easy.
I watch/listen to Doom Debates in the hope of hearing good, reasoned arguments as to why our PDoom should be lower and I'm always disappointed.
The only explanations Ball had for his .01% PDoom seemed to be -
There's no evidence of risk, the AI companies won't let it happen, humans are too smart, extinction has never happened before why would it now?
Source: youtube · Posted: 2025-11-24T18:4… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwRY4E31dRSPY0xEeR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLt72yzaSZcysuV6t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwT2dzH-_BdnWda56x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzRLhb6YUSLbJOYATt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw3AdURtNvdT6vKMVh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzzS_y5pkeUnrwtLn14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzPxnJinf1syFPzTeJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz3oeIM3XJcmAYLG-N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzG2dGrn5UfIJ-7uSh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzh-ft4yAvSqIhEN-14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
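A downstream consumer has to parse this raw array and reject rows whose values fall outside the codebook. The sketch below shows one way to do that, with the allowed value sets inferred only from the samples visible above; the actual codebook may define additional categories, so `SCHEMA` here is an assumption, not the tool's real schema.

```python
import json

# Allowed values per dimension, inferred from the coded samples shown
# above (assumption: the real codebook may include more categories).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval",
                "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coded row against SCHEMA,
    raising ValueError on missing ids, unknown dimensions, or off-codebook values."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, value in row.items():
            if dim == "id":
                continue
            if dim not in SCHEMA:
                raise ValueError(f"{row['id']}: unknown dimension {dim!r}")
            if value not in SCHEMA[dim]:
                raise ValueError(f"{row['id']}: {dim}={value!r} not in codebook")
    return rows

# Hypothetical row for illustration; real ids look like ytc_Ug…
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"resignation"}]')
print(validate_batch(raw)[0]["emotion"])  # prints "resignation"
```

Validating at ingest time keeps a single malformed or hallucinated category from silently skewing the dimension counts later in the pipeline.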