Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “trying to claim Ai generated images as your own “art” is so stupid. it’s literal…” (ytc_Ugztq2vtS…)
- “You’re using all emotions and no logic. The vaccines were made by private orga…” (rdc_grs2vuv)
- “Steve Jobs is actually the first proto-techbro visionary I can think of. Part of…” (ytr_Ugx68-h8X…)
- “Don't worry , these companies have already paid off the lawmakers in congress to…” (ytc_UgzCO8Ma0…)
- “Question: In an era increasingly shaped by AI and generative technologies, could…” (ytc_UgxA4dZeI…)
- “see it in this perspective! who are the people investing in AI? people with a lo…” (ytc_Ugx_0r9ym…)
- “This is my thought. AI will make work efficient and require fewer people working…” (ytc_Ugyx_FU0g…)
- “@Trojanstudios-y3dI here your point and I understand, however this is what I hav…” (ytr_UgyCoUpUH…)
Comment
What really needs to be stressed like you mentioned is that robots won't care about death of freedom if we don't programm them to care. They don't have any desires if you don't programm a will. So wether or not you are bound by questions about ethics, creating a conscious robot to work in the coal mines untill it stops functioning is not wrong by any reasonable ethical standard. as long as you don't programm it to feel pain that is.
Also the idea that consciousness will simply arise when computers are given enough processing power is based on nothing. I fully believe that we will one day create artificial consciousness, but it will be by mimicking the human brain, not by merely creating a more powerful CPU.
youtube · AI Moral Status · 2017-02-24T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
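For orientation, here is a minimal sketch of one coded record as a typed structure, assuming exactly the four dimensions and the timestamp shown in the table above; the class and field names are illustrative, not the app's actual data model.

```python
from dataclasses import dataclass


@dataclass
class CodedComment:
    """One comment plus the four coding dimensions shown in the result table."""
    comment_id: str
    responsibility: str  # e.g. "developer", "distributed", "ai_itself", "none", "unclear"
    reasoning: str       # e.g. "deontological", "consequentialist", "mixed", "unclear"
    policy: str          # e.g. "regulate", "ban", "none", "unclear"
    emotion: str         # e.g. "approval", "fear", "outrage", "indifference", "mixed"
    coded_at: str        # ISO 8601 timestamp, e.g. "2026-04-27T06:26:44.938723"
```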
Raw LLM Response
[
{"id":"ytc_UghJbDj8OwiAyngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggHUChjfY0yk3gCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjwB_p5V3e0u3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Uggfx-CZ2cENtHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugi_gGDWU3jtRXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugi6hGv2Ze4Q2ngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgikPCAEUdTxB3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiwOiGsCZNGongCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggX0c_3N_LNNHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggpDuf-OeISQngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
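As a sketch of how a batch response like the one above can be turned back into per-comment results, the snippet below parses the JSON array and indexes it by comment ID. `raw_response` is a stand-in for the model output string; the field names are taken from the response shown above, and the app's actual implementation may differ.

```python
import json


def index_by_comment_id(raw_response: str) -> dict[str, dict]:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}


# Example: the comment card above corresponds to the ninth record in this batch.
# codings = index_by_comment_id(raw_response)
# codings["ytc_UggX0c_3N_LNNHgCoAEC"]
# -> {"id": "ytc_UggX0c_3N_LNNHgCoAEC", "responsibility": "developer",
#     "reasoning": "deontological", "policy": "none", "emotion": "approval"}
```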