Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "What's going to happen is AI will take over. Then there will be no humans to car…" (ytc_Ugxw5RieN…)
- "if it's cost money you have to pay it with your data, i ain't against ai because…" (ytc_UgyOf_Uii…)
- "As far as I recall (I did university research on the top as part of my undergrad…" (ytc_Ugwz8iEmS…)
- "😂😂😂😂literally this I get ChatGPT to do the leg work and then create a prompt for…" (ytr_UgxCNn21S…)
- "To quote the study: “As evidenced through these services’ abilities to only rec…" (ytc_UgzpuEcYM…)
- "If OpenAI is going to profit from it then yes, they need to pay a portion of tho…" (ytc_UgzvPx2ET…)
- "Similar to Medicare and Medicaid, maybe we can have both socialism and capitalis…" (ytc_Ugx__aS6P…)
- "That's the thing. People don't get that AI is constantly improving. I don't thin…" (ytr_UgyCGVpII…)
Comment
@Josephkerr101 This is definitely addressed in his writings, but he tends to get a bit hung up on how most people struggle to follow the *how would it kill us*.
I can explain more of the *why would it kill us* from Yudkowsky's framework if you want, but the core idea is that he expects future AGIs to be optimisers of some sort (after all, we train current AI systems to optimise some sort of score, like prediction accuracy), and that general optimisers tend to take control of available resources and avoid countermeasures due to a phenomenon he coined as *instrumental convergence*. If the concept of instrumental convergence is not familiar, I'd be curious if you find it compelling after checking out some of his writing on it. If it is familiar, I'd be curious where you think the gap is, regarding Yudkowsky's explanation of *why the AI would kill us*.
youtube · AI Governance · 2024-11-12T10:5… · ♥ 29
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAimhHmYdLd","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAjRPne2vJB","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAjgxLAap2x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw65QRNZqXybNcskx94AaABAg.AAiXbYlEStfAAmmuu4wRqq","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugw65QRNZqXybNcskx94AaABAg.AAiXbYlEStfAAokZRx4eik","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytr_Ugw65QRNZqXybNcskx94AaABAg.AAiXbYlEStfAAqA9Jy7zvA","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxJUObSq16yHsAotv54AaABAg.AAiV57RtZaaAAiWFsL8hXk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz_o_4UNQo_GhkZI294AaABAg.AAiUrQt1I1zAAidX4Bf-1N","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxWZtuw753_fj8wEad4AaABAg.AAiRc3k91HjAAjBSmlsX0M","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwqfDGCzxp8g181J_54AaABAg.AAiNXtAPvrsAArCz1Ewg7J","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
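A response like the one above can be consumed with a small validation pass before the rows are stored. The sketch below is a hypothetical consumer, not the project's actual pipeline code: the allowed values per dimension are only those observed in this sample (the full codebook may define more), and the function name and drop-invalid-rows policy are assumptions.

```python
import json

# Dimension values observed in the sample response above; the real
# codebook may allow additional values (assumption for this sketch).
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed rows.

    A row is kept when it is a dict with an "id" and all four coding
    dimensions set to a recognised value; anything else is dropped so
    one malformed row does not poison the whole batch.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in OBSERVED_VALUES.items()):
            valid.append(row)
    return valid

# Hypothetical input: the first row is complete, the second is missing
# three dimensions and uses an unknown emotion, so it is rejected.
raw = ('[{"id":"ytr_x","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"indifference"},'
       '{"id":"ytr_y","emotion":"joy"}]')
print(len(parse_coding_response(raw)))  # → 1
```

Dropping bad rows (rather than raising) keeps a batch run alive when the model occasionally emits a stray value; the rejected IDs could instead be queued for re-coding.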