Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Wow, I don't care so much. Targeted adds? Ok, addblocker. A book about my depres… (ytc_UgyhP7AcH…)
- One day, the Robots, Machines, and AI are going to become sentient and self-awar… (ytc_Ugw2P52LO…)
- y'all really just believe anything huh? Go research AI, go understand how LLMs … (ytc_Ugz87oJ55…)
- To me the only thing that matters is that we are seeing the models can be so muc… (rdc_m9ghdvu)
- Detailed Context: On November 8, 2023, An industrial robot crushed a worker to d… (ytc_UgxwXArbo…)
- I thought I've seen the first woman as she is in the "deepfake" version before w… (ytc_Ugw62BMci…)
- Man is a child playing with self-destructive hubris on so many vectors. I give t… (ytc_UgxnWUcBu…)
- I don’t care how good AI is I don’t put it at the same level as a human research… (ytc_UgxQT5ehF…)
Comment
Respect for Sir Roger Penrose, but I think his argument about AI and consciousness has logical flaws. He assumes intelligence requires consciousness without proving it - many systems show intelligent behavior without consciousness (like ant colonies organizing complex structures, immune systems adapting to threats). His definition of consciousness uses circular reasoning when saying "understanding requires consciousness." The Gödel theorem example doesn't actually prove consciousness is necessary for understanding mathematics. He's essentially confusing intelligence (problem-solving ability) with human-like understanding. It's like saying cars aren't "real transportation" because they don't have horses (just as horses were once the only form of "real" transportation)
youtube · AI Moral Status · 2025-04-30T06:0… · ♥ 14
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
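The dimension rows above mirror the keys of each record in the raw response below. As a minimal sketch of that record shape, assuming Python, using only the category values visible in this sample batch (the full codebook may define more):

```python
from typing import TypedDict

# One coded record, with field names taken from the raw LLM response below.
# The value lists in the comments are only what appears in this sample batch;
# they are assumptions, not the complete coding scheme.
class CodedComment(TypedDict):
    id: str              # platform-prefixed comment ID, e.g. "ytc_…" or "rdc_…"
    responsibility: str  # observed here: "none"
    reasoning: str       # observed here: "consequentialist", "deontological", "mixed"
    policy: str          # observed here: "none"
    emotion: str         # observed here: "indifference", "approval", "outrage"
```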
Raw LLM Response
```json
[
{"id":"ytc_Ugxlg-5UOm9vl96i6IB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxw2Oxwav0EowsDsl14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWuXciZp25XxRaLKx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwARcp23wfD6hn6Mt14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzZd1kEP1B3vi3kjdB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyHa7pj52iHoqkDYbt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtFNiUOKdzz6EohZt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyq-5aMNXrCsz-dXTd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRDNXTTF4BzXEfZPN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyaRzbajUyZrMxYESh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
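A minimal sketch of the "look up by comment ID" step, assuming the raw batch above has been saved to a local file (the filename raw_llm_response.json is hypothetical):

```python
import json

# Load the raw LLM response: a JSON array with one coded record per comment.
with open("raw_llm_response.json") as f:
    batch = json.load(f)

# Index the batch by comment ID so repeated lookups are O(1).
coded_by_id = {record["id"]: record for record in batch}

# Fetch a record by ID, e.g. the first ID in the batch above.
record = coded_by_id["ytc_Ugxlg-5UOm9vl96i6IB4AaABAg"]
print(record["reasoning"], record["emotion"])  # consequentialist indifference
```

Building the dict once up front keeps per-comment lookups constant-time no matter how large the coded batch grows.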