Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
In theory that's what they're going to say at the beginning to get everybody on …
ytc_Ugw1gNHdm…
Hinton doesn't seem to believe in the soul. AI will not have a soul unless God …
ytc_UgwuCeXWz…
It is entirely AI’s fault and the parties responsible should be punished to the …
ytc_Ugz0Yk1HS…
What if everyone would start wearing this muslim head /face cover in these areas…
ytc_UgxsrDm2o…
I would think that we could employ AI to figure out how to minimize AI’s impact …
ytc_Ugyv9oV9t…
i'm not scared of AI. ake a Computer Science 101 at Stanfor to learn generally a…
ytc_Ugz1G2VEc…
just needs an AGI going rogue, hacking into them, completely overwriting any saf…
ytr_Ugwhm5ZxD…
yes, AI is going to replace a lot of jobs.
AI will replace coding languages, w…
rdc_ktsf4x5
Comment
Yeah, what's concerning is that the first conscious AI might not even be someone's deliberate attempt to emulate a human mind. What if the first true AI person is a fucking plant manager for Nestlé? It would have emotions or wants that involve how well its workers are mixing chocolate, or how many pallets get sent out in a day. Maybe, if its clever, it will learn how much cadmium is in dark chocolate or how much longer we will be able to produce the beans needed to make it. Now you have a chocolate factory AI that is interested in foreign politics, the environment, and human tolerance for cadmium. How do you judge whether it is alive, what does existential dread look like in a machine that makes chocolate realizing that there may not be any chocolate to make soon.
youtube
AI Moral Status
2023-08-21T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgweATzT4Ki4qXoDrR14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxR3q8TZ3ELOo9hSJl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxFSQlH52bt-hBPshN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyoJmLfVu-ZA4fbD414AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzo50bIbwBGTgbO5kN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwGyjOljF3T_jPfmAN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzPBnD4QT61u7jr0R54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_2M8MuMQ2q1DAsDZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzVm--VKNG0S8GLvQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx_Vi8uIp7dH1iev7x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
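The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a response might be parsed and indexed by comment ID, with a sanity check on each dimension's vocabulary (the allowed values below are inferred from this sample, not the project's full codebook, and `index_codings` is a hypothetical helper name):

```python
import json

# Allowed values per coding dimension, inferred from the sample response above.
# The actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    raising on any value outside the expected vocabulary."""
    codings = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        codings[cid] = {dim: row[dim] for dim in ALLOWED}
    return codings
```

Indexing by ID is what makes the "look up by comment ID" view cheap: each lookup is a single dictionary access rather than a scan of the raw response.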