Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Hinton's (implied) argument doesn't really stand up to scrutiny. It directly relies on us assuming that an AI is organized and structured exactly like a human brain, and while all the various AI model designs out there are certainly very impressive in multiple ways, they're still not even close to that level of structural complexity.
We still don't even fully understand all the nuances of how our brains are structured, so of course we're not yet capable of constructing something that artificially maps to that in a 1:1 way.
youtube · AI Moral Status · 2025-06-25T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwYwE337l-JAwlGMCl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzklh6HMepoBfs1XMF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz3OjcDxyOC7R3UTB54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugwobb24zEA6ndlj0id4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx4wMFYUClAdIxhM6R4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyIFOc6BdGsRp4aqFt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz5xalaLUc5Fgx3OkZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxf_fAogpRhoO4E_YB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwiNGUeKgs9LpaQ0BJ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzb7cfnNswwRDRvs1l4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "approval"}
]
```