Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@revlarmilion9574 they're reactivating 3 mile island explicitly for microsoft's …" (`ytr_UgwIkz5MV…`)
- "Balancing AI in education is tricky, but Olovka's been helpful by supporting my …" (`ytc_Ugx4z_dC1…`)
- "Problem is most of the customer service is for things you have no alternative fo…" (`ytr_UgwwtUllc…`)
- "Government: use paper straws to save the environment / Government to corporations:…" (`ytc_Ugwvcxayv…`)
- ""Mark my words, AI is far more dangerous than nukes" …" (`ytc_UgyH-Rzr7…`)
- "fun fact: most of the video is ai generated. this cannot be any more ironic…" (`ytc_UgzClODvk…`)
- "This p*sses me off. We artists spend years building up our skills and style. AI …" (`ytc_Ugwm7OXt-…`)
- "Answered your question here! https://www.reddit.com/r/ChatGPT/comments/12wlnwy/a…" (`rdc_jhgkuvl`)
Comment
Sentience is defined as the capacity to feel and experience the external world. These people are conflating sentience with consciousness (the ability to have a conception of self) and sapience (the ability to have higher order rationality).
Almost all animals are sentient for example. Some species of apes are conscious (they can look at a mirror and know they are seeing themselves). But only humans are sapient.
LLMs are glorified parrots. They cannot generate a new thought. They cannot create something new. All their responses come from training data (and from the internet). LLMs do not even have the capacity to understand their own output. They are merely predicting what the next token should be. An LLM is an extremely expensive guessing algorithm. Nothing more.
If we want to reach AGI, we will have to try a different approach.
youtube · AI Moral Status · 2025-07-10T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
  {"id":"ytc_UgxRdZ-FQsgp4ESGU8t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwKq9tObCyJ1NwvlMd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz8JdWaO5PyAWeH_H54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxdESwERxgyxo3jH-d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyvhY-RIVGP8dpuhIx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxNdo6mtuuoVoQQOgV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzvUaAocoCZrPm3D0N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw7spDFDm1P1DFxDBl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwB8otNgByzVxXMd8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyroiwnalEJ5oeBsUN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
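The raw response above is a JSON array of per-comment codes, keyed by comment ID. A minimal sketch of how such a batch could be indexed and rendered into the "Coding Result" table shown earlier (the parsing logic here is an illustrative assumption, not the tool's actual implementation; the data is the first entry from the response above):

```python
import json

# First entry of the raw LLM response batch, as logged above.
raw = ('[{"id":"ytc_UgxRdZ-FQsgp4ESGU8t4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"unclear","emotion":"indifference"}]')

# Index the coded rows by comment ID so a single comment can be looked up.
codes = {row["id"]: row for row in json.loads(raw)}

# Render one comment's codes as a Dimension/Value markdown table.
row = codes["ytc_UgxRdZ-FQsgp4ESGU8t4AaABAg"]
print("| Dimension | Value |")
print("|---|---|")
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"| {dim.capitalize()} | {row[dim]} |")
```

This reproduces the four coded dimensions for `ytc_UgxRdZ-FQsgp4ESGU8t4AaABAg` (none / mixed / unclear / indifference), matching the Coding Result table above.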