Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- That's why the robot has to be a spider; I wouldn't want to be walking around in… (ytc_UgxE8cQFp…)
- IS THIS THE QUESTION NOBODY IS ASKING!!!! If 99% of people lose their jobs and … (ytc_UgwmclQam…)
- @Sir-Anthony-the-unworthy It's always the same 3 excuses you guys use because y… (ytr_UgzV6FmuY…)
- So ridiculous to create copies of us disconnected from what makes us good.,love.… (ytc_UgwZ9z652…)
- I can hire ai I trust it i m machine learning engineer I know how to optimise an… (ytc_UgwPdNeJI…)
- The question for medical students today is not if AI will replace radiologists i… (ytr_Ugws_ERjL…)
- We aren’t rallying to prevent the climate catastrophe from extincting our specie… (rdc_n0h7zog)
- _"Give me access to your email and your money and I'll change your life."_ That'… (ytc_Ugzvd52tI…)
Comment
We are in a horrid Catch 22 with respect to charging full steam ahead on AI development. If we put the brakes on we know other actors in the world will charge forward. Someone will take the chance on ending the world because what they gain by winning is so incredibly valuable. So, either we charge forward or we let China or Russia or any number of other nations charge forward. How can we create an "AI Non Proliferation Treaty" without an AI version of Hiroshima and Nagasaki?
Source: youtube · AI Moral Status · 2025-10-31T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyqhd1ojsGCVvZgPlt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzsTdoYt33NfZvZ-WV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz6ApNsK2WqjPxpYqd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwL0JQHjbK4UODn7jF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy8dhtkfIwBPGy6wbp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxPCcy5NCD0BmewJep4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxpQ7c_Q_2ku-3XfwV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwd86KTL7vHqjcQwql4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwanIugzMo42bsGlvd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwDbskoB4bN2da38SR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
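Since the model returns one JSON array of per-comment codes per batch, looking up a single comment's coding is a matter of parsing the array and matching on `id`. The following is a minimal sketch, assuming only the response shape shown above; `lookup_codes` is a hypothetical helper, not part of the tool, and the two entries in `raw_response` are abbreviated from the batch above.

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codes,
# each entry carrying the comment ID plus the four coded dimensions.
raw_response = '''[
  {"id": "ytc_UgzsTdoYt33NfZvZ-WV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxPCcy5NCD0BmewJep4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

def lookup_codes(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            # Drop the ID itself; keep only the coded dimensions.
            return {k: v for k, v in entry.items() if k != "id"}
    return None

codes = lookup_codes(raw_response, "ytc_UgzsTdoYt33NfZvZ-WV4AaABAg")
print(codes["policy"])  # prints "regulate"
```

A production version would also validate that each entry carries exactly the expected dimension keys, since a malformed model response would otherwise surface as a confusing `KeyError` downstream.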