Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated comment previews with their IDs):

- `ytc_UgweswMGN…`: "The difference is that if a Tech Bro no longer has Generative AI, they will just…"
- `rdc_liwrn5t`: "But since it's not, here's a bunch of our own AI images for our campaign."…
- `ytr_Ugxl6pYZ-…`: "Thanks for your comment! It's interesting how robots have evolved from the class…"
- `ytc_UgxjlG6t-…`: "As a software engineer, I use AI every single day and it's pretty good for codin…"
- `ytc_UgytV2w6z…`: "My first codes were written in 1968 at WVU as a senior in an AE. My first profes…"
- `ytc_Ugz1zjx5a…`: "After reading most of the comments here I have something to add. I think we may …"
- `ytc_UgwvVDR2a…`: "So it seems the question is will AI be obedient when it is vastly more intellige…"
- `ytc_UgwTiWKgZ…`: "When is a LAW not a Law? When it's an In-LAW = Intelligent Lethal Autonomous Wea…"
Comment

> I dont think reference to how LLMs work by itself is an argument against sentience. It's like saying "humans sentient? Give me a break. You clearly dont understand the humans brains. It's all neurons and synapses and shit, it's a physical deterministic system, where is this sentience of yours?"
>
> To be clear, I dont think AIs are sentient, but I still dont agree with that particular argument.

Source: youtube · Video: AI Moral Status · Posted: 2025-07-09T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
{"id":"ytc_UgxFtakmOJX6RgqfDZd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx-AOqS2UyBu7LPwKd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwlzlqbXugCH-VgJEh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx0kxMhkubu9wZBzS54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwibIS_zY85zVf1lTx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgymcYa0ABc8ikvUuEp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxMjekgtDReeaaqQkN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxdXf3K_FlcJfVZuxp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy9Loqq90Ec_e1BTMR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwP_4qACE5kKGMi8mF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
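A batch response in this shape can be parsed and indexed by comment ID for the kind of lookup the inspector performs. The sketch below is a minimal example, assuming the label sets observed in this sample batch (the real codebook may allow more values, and `parse_coding_response` and `ytc_example` are hypothetical names, not part of the pipeline):

```python
import json

# Category labels observed in this sample batch; treated here as an
# assumption about the codebook, not its authoritative schema.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"unclear", "mixed", "virtue", "consequentialist", "deontological"},
    "policy": {"unclear", "none"},
    "emotion": {"indifference", "outrage", "resignation", "approval"},
}

def parse_coding_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM coding response and index its records by comment ID.

    Raises ValueError on labels outside the observed vocabulary, which is
    one way a malformed model output could be surfaced before storage.
    """
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Hypothetical single-record batch in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"deontological",'
       '"policy":"unclear","emotion":"indifference"}]')
by_id = parse_coding_response(raw)
print(by_id["ytc_example"]["emotion"])  # indifference
```

Indexing by ID rather than keeping the list makes the "look up by comment ID" path a single dictionary access, and the validation step fails loudly instead of silently storing out-of-vocabulary labels.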