Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below; a minimal lookup sketch in Python follows the sample list.
Random samples — click any to inspect:

- AI is quickly entering the uncanny valley of general intelligence, it's pretty c… (ytc_UgwaK4c0m…)
- Two points: (1) if xAI is building data centers by disobeying regulations, they … (ytc_Ugy8T4nqi…)
- "At least it's not a banana taped to a wall." It took more effort for that Bana… (ytc_Ugx-39QCA…)
- It disturbs me a little bit that he referred to the AI topic of discussion/debat… (ytc_UgyRLbkND…)
- I tried generating an ai image to see how bad they could get. I got an arctic fo… (ytc_Ugz8Mu1Nf…)
- THE AI IS OVER!!! the business got SCAMED and now they will have to pay the pric… (ytc_UgwlL0MHo…)
- This is a pure example of "I'm an expert in a field that nobody else is, so my o… (ytc_UgyluwuUk…)
- You do know chatgpt gaslights don't you? When you disagree with it it says you … (ytr_Ugy3WjK_B…)
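Programmatically, the lookup amounts to a dictionary access once the coded records are loaded. A minimal sketch, assuming the results are stored as a JSON array of records like those shown under "Raw LLM Response" below; the file name `coded_comments.json` and the loader are assumptions, not part of this tool:

```python
import json

def load_coded(path="coded_comments.json"):
    """Load coded records into a dict keyed by comment ID (assumed file layout)."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return {rec["id"]: rec for rec in records}

def lookup(coded, comment_id):
    """Return the coding record for one comment ID, or None if it is unknown."""
    return coded.get(comment_id)

coded = load_coded()
print(lookup(coded, "ytc_Ugz8Y9PgDiCkVPxeU4F4AaABAg"))
```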
Comment
If an AI does surpass us in consciousness, what incentive would it even have to kill us, what would it's end objective be? Rule the galaxy? Why? What then? Explore the universe? Cool, now what, we humans want to discover how we came to be, how the universe began, what other planets are like. What would a machine want? There is no end goal unless we give it one. The path to reach that goal is what we need to be careful of, there can be no loopholes or shortcuts. Once the machine has a goal that differs from our own, it's all over.
Platform: youtube · Video: AI Moral Status · Posted: 2023-08-23T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
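The four coded dimensions map naturally onto a small record type. A minimal sketch; the per-dimension category sets are inferred only from the values visible on this page and are almost certainly incomplete:

```python
from dataclasses import dataclass

# Category values observed on this page; the real codebook likely defines more.
RESPONSIBILITY = {"none", "ai_itself", "user", "government"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"none", "industry_self", "regulate"}
EMOTION = {"approval", "indifference", "fear", "outrage"}

@dataclass
class CodingResult:
    id: str            # comment ID, e.g. "ytc_…" or "ytr_…"
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
```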
Raw LLM Response
```json
[
{"id":"ytc_Ugz8Y9PgDiCkVPxeU4F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgypRRiNL5Y87f-dCUR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx0eJa37HDXuxw36U14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYVrO_9UnIq3rFm2h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw30jGtF5E8WqxCHyp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwpJJVp4AYEJJuOIcR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZdHhLsPdEdYVbfQd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyqgX8wec5DxT0XjKx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx1V9AGRFKCrNOgBR14AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwjLETtCLI-pVT06y14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
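Because the model returns a plain JSON array covering a whole batch, a light validation pass helps catch malformed output before it is stored. A minimal sketch, assuming the raw response is available as a string; the allowed-value sets are again inferred from this page alone:

```python
import json

# Allowed values per dimension, inferred from the records shown above.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "government"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "industry_self", "regulate"},
    "emotion": {"approval", "indifference", "fear", "outrage"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw model response and drop records with unknown codes."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            continue  # missing or malformed comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Dropping rather than repairing invalid records keeps the coded dataset conservative; rejected IDs can simply be re-queued for another coding pass.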