Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_UgyNgtA4t…: "@Dawn95284 bro quoted me and thought he did something 😹 the average artist is sh…"
- ytc_UgzXMysKP…: "Humans are cocky enough to take on a robot… not on that level yet, we’re only hu…"
- ytc_UgytWwvXO…: "AI is racist as LLM maker's are racist and sexist, the LLM models are prone to p…"
- ytc_Ugzp1ZEfa…: "They'll just cut hours for jobs that AI effects initially. People will look fo…"
- ytc_UgxJy1RYq…: "This is basically a lesson about not letting generative AI take the talent of re…"
- ytc_UgxRV6RhM…: "Autopilot isn't full self driving mode its an assistant to lessen the stress for…"
- ytc_UggOhHBMe…: "robots don't deserve rights they a object not a living thing!! And they should s…"
- ytc_Ugz-4pA9S…: "In my opinion if we can have an AI which constantly guides or helps us in our da…"
Comment
1:02:16 What I don't understand about these debates over how AI will treat humans in the future: when experts discuss this, are they referring to literally every single human on earth, or only those whose lives are connected to the Internet? How would an AI take the physical form needed to actually subjugate every human, or even to create more data centers for itself? I mean, worst case scenario, couldn't we just cut the power to the data centers and trash the servers? And aside from global climate impacts from the data centers, how would AI "straightforwardly kill" (or enslave/keep as "pets") people such as Fulani pastoralists or the Sentinelese? I don't understand how an AI would actually be able to launch nukes without the active help of humans in that moment (and in that case, I would be reluctant to say it was really the AI that did it). Maybe I need to read Nate and Eliezer's book...
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2025-10-31T01:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzFWjPxkVWOsujH9ll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzT_V6rjZMblmZFKhx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwPqhU1Y94q7MlruVl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwhmHIKsyvU8aT63894AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyagZ-OLXQ1iiUpu-d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"horror"},
  {"id":"ytc_UgzfF7u5seJ-9W784G94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwc8cwVmqY2yStK5qp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzm40otCkmJW9KHb0l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyBGAG-3NHjz1r77Pp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugww88gxC1xcl4ZN7Cp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
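The raw LLM response is a JSON array of per-comment codings, one object per comment ID, with the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how a coding can be looked up by comment ID from such a response — the function name `lookup_coding` is hypothetical, and the array below is truncated to two entries for brevity:

```python
import json

# Raw LLM response: a JSON array of per-comment codings, as shown above
# (truncated here to two entries for illustration).
raw_response = '''[
  {"id": "ytc_UgzFWjPxkVWOsujH9ll4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzT_V6rjZMblmZFKhx4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]'''

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    return None

coding = lookup_coding(raw_response, "ytc_UgzFWjPxkVWOsujH9ll4AaABAg")
print(coding["emotion"])  # indifference
```

Since model output is not guaranteed to be well-formed, a production version would also want to catch `json.JSONDecodeError` and validate that each entry carries all four expected keys.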