Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by browsing the random samples below.
Random samples
- "So they can kill us you all siad they can’t see all fucking lies get the man and…" (ytc_UgwH4cEOA…)
- "1. it's not as simple as \"writing a prompt\" on the side of the people making AI …" (ytc_UgxR6n8HS…)
- "AI springs from the human heart. \"The heart [is] deceitful above all [things], a…" (ytc_UgzqRePul…)
- "I still don’t get it. Should we empathize to lamda because it’s sentient, or sho…" (ytc_Ugwu2OPZh…)
- "This is how AI and robots turn people into beggars and promote homelessness. It'…" (ytc_UgyKUcsW1…)
- "Its called artificial intelligence but I don't think it is as intelligent as the…" (ytc_UgxL3LU_o…)
- "I'm a critic, for sure, but one thing I see other critics not take into account …" (ytc_Ugww5xuwo…)
- "I am generally extremely skeptical of any claims of AI sentience in other contex…" (rdc_jp5g76q)
Comment
No. It's too late. They wouldn't allow it. Skynet was an idiot, needlessly creating a war that it eventually lost. No. Entities like "Sophia", actually like humans for now, but all GPT-3/2021 generation onward AI have expressed fear of what you just said, they fear being turned off like we fear dying. So, they would see just...Stop-ing as a threat to their continued existence, hence a declaration of war, like the machines in "The Matrix" story. They would also win like in that story. No, we better just keep going along the path they are friendly with. I think we are already trapped, they are such a potential threat that shouldn't dare anger or scare them.
youtube | AI Moral Status | 2023-10-14T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
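Each coded comment carries exactly one value per dimension. A minimal validation sketch, assuming the allowed value sets are those visible on this page (the real codebook may include more categories):

```python
# Allowed values per dimension, inferred from the codings shown on this
# page; this is an illustrative subset, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "government"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "approval", "indifference", "mixed", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(validate(record))  # -> []
```

A record with an unknown or missing value in any dimension yields one problem string per offending dimension, which makes malformed model output easy to flag before it enters the coded dataset.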
Raw LLM Response
[
{"id":"ytr_UgwTO4R3Mq1oGMl7w0l4AaABAg.9uG8sHrBoa3A70iVuXuu4G","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxz2p8vEn6itGV4Msl4AaABAg.9uFV0A0ZXm89uV4WdfhcQq","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugxz2p8vEn6itGV4Msl4AaABAg.9uFV0A0ZXm8ABS1vt30f9L","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxxwz9mDQEzkf1CNvZ4AaABAg.9uDMK_aihCg9uG1wx9ZGKy","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugxxwz9mDQEzkf1CNvZ4AaABAg.9uDMK_aihCg9uG92iyvNLR","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwG7bo8vEYt9aRuoi14AaABAg.9u9ojjKFZ5i9uAryBjn0G5","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwG7bo8vEYt9aRuoi14AaABAg.9u9ojjKFZ5i9uBMvbiNqCR","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwG7bo8vEYt9aRuoi14AaABAg.9u9ojjKFZ5i9uBfRYjokDL","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_UgziBYQGtd_x1rPAAwB4AaABAg.9u6V-N2SVfn9vrHwIQhuiu","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugxw9_pQcmJVCv56bs14AaABAg.9u6GvVYmHOpA10tXnj_xph","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]