Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "I wonder if this is why we don't see any other intelligence in the galaxy? The A…" (ytc_UgxV0TDLd…)
- "I don't understand why robot makers put human limitations into robots. Just add …" (ytc_UgwZ2Vnay…)
- "Stop using AI corporations talking points. Stop deflecting the harm AI companies…" (ytc_UgxW3NWk9…)
- "It is improving productivity but it won’t automate. I am using it daily and no, …" (ytc_UgymOlzqX…)
- "Ryan Why all these lies? Yes Teslas crash as all cars do- as we all know - but …" (ytc_UgyHsdsdC…)
- "I'm fearful of our future because it's unplanned! All these magical tools and to…" (ytc_UggQTlQGR…)
- "@Aliman25 oh sorry i thought you were talking about chatgpt and its an ai so it…" (ytr_UgxzrD4Qe…)
- "G she’s amazing I would never of guessed she’s a real life human playing the par…" (ytc_UgwUn6y7T…)
Comment
1:17:28 this is utterly ridiculous. Dean has the crazy opinion that there's only a 0.01% chance that a new superintelligent species would end humanity. If those at the head of frontier AI companies, blinded by wealth and power, think the same, then they'll race ahead (as they are doing now).
Perhaps more importantly, if you build a superintelligence, and you somehow realise it's too dangerous to deploy, it doesn't matter anymore. You have no control over it. If it wants to be deployed, it will deploy itself. A 2-year old cannot stop you from leaving a room. You cannot stop a superintelligence from leaving your "controlled" environment.
Source: youtube · 2025-11-21T00:4… · ♥ 23
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
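Each coding assigns one value per dimension. A minimal validation sketch, assuming the value sets below are the codebook (they are only the codes observed in this page's raw responses; the real scheme may include more):

```python
# Assumed codebook: built from the values observed in this page's
# raw LLM responses, NOT an authoritative coding scheme.
CODEBOOK = {
    "responsibility": {"company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def invalid_fields(coding: dict) -> list[str]:
    """Return the dimension names whose value is missing or not in the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above:
row = {"responsibility": "company", "reasoning": "consequentialist",
       "policy": "regulate", "emotion": "outrage"}
print(invalid_fields(row))  # → []
```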
Raw LLM Response
```json
[
{"id":"ytc_UgwT7kNtEnbroo-TmBN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyxY8jTl7gVshqg3hl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwZtlYjAycs5EqT-l94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxDyyFdGwGKiYxLHth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyLpLxuZqp_nffct6J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx7TKkEn1s5CnUk4D94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwEi5TIp-1mMNTfe4l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxsN7wp7gCzc5-vked4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwV5WUhEIWthwExu8B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzRnrgd51O3nE2NBGx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```