Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Did you happen to see the KTLA article about how Waymo admitted to having offsit…
ytc_Ugyr09Sjf…
Dude was born in 1941… preaching about AI and “robuts” there will be new industr…
ytc_Ugwz6XjTN…
A.I will kill major local economies! Just like the iphone iPod killed the music …
ytc_UgzvmGqZO…
Tucker citing cheating in college as an example of good, fun AI says so very muc…
ytc_UgyUuHX8K…
I've just spent 50 years watching humans run things, and I've read plenty of his…
rdc_m27wys0
The arguments are as old as the hills. There is nothing new here. It was amusing…
ytc_UgwY7wAb8…
Universal Basic Income. That´s what the industrial revolution promised us. That´…
ytr_UgzOzIA-1…
I’m a believer in the self aware AI and that’s why they keep shutting them down …
ytc_UgyrTJC8j…
Comment
Notice: I don’t believe there are any sentient LLMs at the point in which I write this.
Ok, so here’s the thing I disagree with. Does simply knowing how an AI generates its output discredit its sentience? If we eventually fully map out the human brain and manage to simulate it, would that mean humans are no longer sentient, because the brain is no longer a black box?
I feel like this view is a very limited way to look at whether or not something is sentient, since we don’t really know what makes something sentient to begin with.
A better way would be to analyse the AI as a blackbox and analyse it the same way you would analyse a human. That’s where you find the behavioural differences that truly show how AI and humans differ in terms of their intellectual capabilities.
youtube
AI Moral Status
2025-07-09T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwbDBpO3wcpVmr7pWh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwA4Qx-lrRdVIX_gtF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_Ugw4z-ugdOarxn-1gyh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz2zl1wo4pfLS-laZ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMz6nXI6EZsDoqfd94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwLRlzVTkHMDuIM6wl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzY7TXCNoqg6ApkMBx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx2hJtoLnk31LM_qrx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzFv2VbzAFa07PE4tB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxEgPdHw_Igit8V6vx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
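A raw response like the one above can be turned back into per-comment codes with a short parser. The sketch below is illustrative only: the dimension names and ID format come from the JSON shown here, but the `parse_coding_batch` helper and the `"unclear"` fallback for missing fields are assumptions, not part of the tool itself.

```python
import json

# Coding dimensions as they appear in the raw LLM responses above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a mapping from comment ID to its dimension values.
    Missing dimensions fall back to "unclear" (an assumed default)."""
    rows = json.loads(raw)
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in rows
    }

# Example using the first entry from the response above.
raw = ('[{"id":"ytc_UgwbDBpO3wcpVmr7pWh4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
codes = parse_coding_batch(raw)
print(codes["ytc_UgwbDBpO3wcpVmr7pWh4AaABAg"]["emotion"])  # indifference
```

Keying the result by comment ID makes it easy to join the codes back to the sampled comments listed higher up on the page.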