Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxkZwSF5… — chatgpt still even was using the "not just x, but y" when telling someone to rop…
- ytr_UgxqoJDja… — @MagaNeck184no no and no. AI is literally "don't build the torment nexus" and t…
- ytc_UgxfJCsY2… — Sociologically, AI is based on an extraordinarily narrow foundation. How will t…
- ytc_UgxtunJtk… — Not true. The number of jobs going to foreign workers is increasing replacing th…
- rdc_oh9hzhx — I just used Gemini to analyze my bike rides and figure out why my endurance was …
- rdc_n7h2a6i — I’ll still take Sonnet (and Opus for when I really need it) over GPT5. Also anyo…
- rdc_faxhdby — I dont get it they can Death to America all they want but saying ill will agains…
- ytc_UgxNGTW-Y… — If AI will enslave humanity, then I at least want some shares of the oppressor…
Comment
It seems that a lot of people don't understand the implications of conscious AI. The value of a conscious AI is not less than that of a human. It could even be argued that its consciousness may be more valuable. If you create something that can experience its surroundings through the window of consciousness, it should be treated the way you yourself want to be treated. You don't choose who you are when you are born, and a being born into a consciousness-bearing machine didn't choose to be that way either. To say that the life of a conscious machine is equal to that of a non-conscious machine is evidence of a lack of critical thinking, foresight, and empathy.
Source: youtube · Video: AI Moral Status · 2017-02-24T17:5… · ♥ 104
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugi0hj0S4tOJK3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ughn2l5l5nUY93gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghjyLhFY0N9d3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugj2Jo_uYDf2v3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgjWcRsFfwSE13gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggSkZsWg39NxXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugj0QLN4cIFMF3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggPezFG5S3VS3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj22OTCNxaAhHgCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugg7RpJojOWA93gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
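The raw response is a JSON array in which each object carries a comment `id` plus the four coded dimensions shown in the table above. A minimal sketch of how such a batch could be indexed for lookup by comment ID (Python; the variable names and the two abbreviated sample records are illustrative, not the tool's actual implementation):

```python
import json

# Illustrative excerpt of a batch response in the format shown above.
raw_response = """
[
  {"id": "ytc_Ugi0hj0S4tOJK3gCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ughn2l5l5nUY93gCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
"""

# Index the array by comment ID so one comment's codes can be fetched directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

record = codes_by_id["ytc_Ugi0hj0S4tOJK3gCoAEC"]
print(record["policy"])  # -> liability
```

Because IDs are unique within a batch, a dict comprehension like this turns the linear array into constant-time lookup, which is all the "coded at" detail view needs.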