Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
If you are in school right now,there is no need to learn how to use AI . Most pe…
ytc_Ugwg2e0vX…
Funny how you avoid directly answering the originality accusation...
Because, as…
ytc_UgyBaforu…
I don’t think it will kill everyone - it will keep a small amount of people in s…
ytc_UgxYktEcm…
Probably, control to energy consumption and atomic technologies in many ways... …
ytr_UgxH38vUS…
But the robot is still a programmed computer that does not have a mind of its ow…
ytc_Ugx03iE8l…
I agree. Let us assume that Jobs go the way of the Dinosaur (soon, or very soon)…
ytc_UgwMHamjg…
We do not need this much ai. All it’s doing is taking every bit of privacy from …
ytc_UgwhDvCAz…
Fake hoax slam whatever shxt u want on this comment section...... I declair sham…
ytc_UgzKB7WUS…
Comment
This is a really wonderful interview. I'm definitely TikTok-brained when it comes to hour long videos (i.e. I bookmark it and then never end up watching it), but this whole discussion was very interesting. The consciousness thing, while something to be avoided is also interesting. Because you know what else is probably conscious? A bug. Yet no one cares about mistreating insects generally. Also the whole continuity discussion (i.e. does switching off a computer kill it), we don't even know if *we* have that (hopefully though). It's also entirely possible that your non-AI computer is conscious (or maybe even your toaster), since there's nothing I think we know that is inherently special about neural networks that would make them more prone to subjective experience. It's completely unknowable, but intensely freaky.
youtube · AI Moral Status · 2025-10-30T23:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyQQvsM0pZ2Pcc6XyJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwTd3mmuuFbwFhDz2R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyU6LkjfEy6yE7vsLd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz7__ixBdzXWW0i3wl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxA85KcJuD5T0fNzWx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyLmMZaehelGsHIHO54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzp_SS048QJdJ2zu8h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwhKU69kTjEZEGjyO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7kVAmMghtd4GIvqV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx5FQXrM4GiUFcw07V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
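Before accepting a raw response like the one above, each record can be checked against the coding dimensions. A minimal sketch in Python; the allowed labels here are inferred from the values visible on this page, not from a documented schema:

```python
import json

# Allowed values per dimension, inferred from labels seen in the coding
# table and raw responses on this page (an assumption, not an official schema).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself", "mixed"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue", "unclear", "mixed"},
    "policy": {"none", "regulate", "mixed"},
    "emotion": {"none", "approval", "outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate(records):
    """Return (comment id, dimension, bad value) for every out-of-schema field."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append((rec.get("id"), dim, value))
    return errors

# Example: a well-formed record produces no errors.
raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}]'
print(validate(json.loads(raw)))  # → []
```

Running this over a batch before writing it to the coding table catches the usual LLM failure modes: a misspelled label, a missing dimension, or a value outside the codebook.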