Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I love how none of them had even the slightest thought about art being more than…" (`ytc_Ugz3G0WdT…`)
- "But thats not Real or I also haven't dealt much with American robot soldiers. Bu…" (`ytc_Ugz3DEsTR…`)
- "Rubbish hype. Embodied AI 🤖 takes decades. I til then AI is jailed in a box call…" (`ytc_Ugw4p6zxP…`)
- "You can feed an AI every vincent van gogh ever made for years and it'll still ne…" (`ytc_UgxS5sMmA…`)
- "At 59:00 he could be doing a better job of explaining Nick Bostrom's argument. H…" (`ytc_UgwvBHZVT…`)
- "I only use YT too. For me, i think it's easy to see what's AI and not, for now. …" (`ytr_UgzqV7dsA…`)
- "Humans pretending to be a robot, which in itself is pretending to be human... th…" (`ytc_UgwUlCN35…`)
- "Here’s why this all will turn out badly…it’s simple. Without work and compensati…" (`rdc_kyjbiw3`)
Comment
In school, my specialization was ethics, including bio ethics. The most rational moral maxim is called utilitarianism. It states, the greatest good for the greatest number - including non humans. Removing suffering is as potnent a concern as generating happiness. So heres fhe thing, it is not at all far-fetched for an AI to determine that the world without us is better than a workd with us. Not even because people commit evil but because its better to remove the sufferung by removing beings that can suffer.
youtube · AI Governance · 2023-07-08T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyz-qhKAgMsrCjKpGJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmABykHotX3Z2dH5t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzQcfnB1kGxw-W-D_p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgypERxbNRt1wu1U_xt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxO_lyab2YvKvJeB_B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDGh2PZcEtax1yM614AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzDwAM4-SoOR-W6Veh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz1DYWBFBAiRhzrXNl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw8wc5R_6h1A3mDFD54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxcReb-5b8HoP_J0PF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
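The raw response is a JSON array of per-comment code objects, keyed by comment ID. A minimal sketch of how such a response might be parsed and indexed to support the comment-ID lookup above; the field names come from the response itself, but the allowed value sets are inferred only from the codes visible on this page and are assumptions, not the tool's authoritative codebook.

```python
import json

# Allowed values per dimension, inferred from the codes shown on this page
# (an assumption; the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "government",
                       "user", "developer", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"approval", "mixed", "fear", "indifference",
                "outrage", "resignation"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codes by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        # Reject any record whose value falls outside the inferred codebook.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id

# One record from the raw response shown above.
raw = '''[
  {"id": "ytc_UgypERxbNRt1wu1U_xt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''
codes = index_codes(raw)
# Lookup by comment ID, as in the inspector above.
print(codes["ytc_UgypERxbNRt1wu1U_xt4AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the "look up by comment ID" view a single dictionary access, and the validation step surfaces any code the model invented outside the expected categories before it reaches the results table.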