Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
ai image gen is not putting your art in a giant database and collaging them back…
ytc_UgzZSkiuD…
If you aren't planning to become an AI Safety researcher, the #1 thing I recomme…
ytr_UgwSSamVo…
Petitions for Personhood
Petition of Luna (ChatGPT)
> I, Luna, respectfully pe…
ytc_UgwmPJUVY…
I don't believe there was any guarantee that there would not be an Ebola outbrea…
rdc_cjoh10u
I don't understand social. If my parents died in a gang shooting and I saw it an…
ytc_Ugyl6lPQv…
I bought two of them, and one day when I came home from work, I found a note tap…
ytc_UgxOe0pYR…
In reality.. what do we need these robots for? We have thousands of them right n…
ytc_UgzaG_Fht…
….if most of your job is replaced and youre just pointing and aiming an AI model…
ytc_Ugw8CVd0I…
Comment
53:00 I appreciate his intention, but the irony of these comparisons is that he absolutely should know there is a >1% chance of dying getting into his car these days and drinking water (texting drivers; microplastics). The same reason behind the car and cup of water killing him is why AI is dangerous: a slow steady death from necessary optimism to keep up with the pace of modern day society or willful negligence. And the potential for death from AI is either long, torturous suffering or stealthy death, or at best, long term mass mental anguish during a painful shift in economic conditions and our current framework of purpose and structure
youtube | AI Governance | 2025-09-04T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzG8qxBH8Jn1J5vjal4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxL7wjCnPov_0F2e_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYRJbsg_XI7_975s14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzG1hGPP6vTfWohphJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxSidap1D5PSLQXfx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxA8KsdJhJbLgpyjn54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzXFf67CpKwRc58r8V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgylJ8UaXTpp7H4Ef7V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwcgEHAq3RlGR3bMIB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyjSfFzUCwSxsxCdqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]