Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- jimmys dead wrong. The whole of industrial history proves that humans are always… (ytc_UgiziBhZz…)
- This is why I just don't post my work online, I love transforming my own origina… (ytc_UgyFgic1k…)
- as someone who used to do online rp and wanted refs and "selfies" of my characte… (ytc_Ugzpmligc…)
- They sell to the few that have jobs around the world. They give tickets and rai… (ytr_Ugw4_wBUJ…)
- This video could've ended in 10 seconds if ChatGPT simply said, "I'm an LLM mode… (ytc_Ugz6-3jjq…)
- Here is a suggestion for those interested in the possibility of ai consciousness… (ytc_Ugwk2ARgG…)
- Don’t worry because rather than our government combatting the real problems here… (rdc_lj9148h)
- Do you understand how dumb the general population is to think desk jobs with deg… (ytc_UgywAAF0G…)
Comment
This guy's thinking is deeply flawed in at least four ways (beyond already pointed out by Sam Harris). (1) There are powerful incentives to view the AI's as people, including economic ones and personal ones (maybe this AI is your lost parent, or is a real friend, or people want a doctor / teacher / therapist who is a person). (2) People can believe AI's are not people for reasons beyond this guy's imagined need for people to want a slave/master relationship / economic benefit; maybe people have different belief systems about consciousness than this guy, like intelligence and consciousness being different properties. (3) This guy pretends like consciousness is a solved problem (actually Sam Harris hints at how ridiculous this guy is when he mentions that the Turing test was not a thing). (4) This guy does not realize that if the robots actually are not conscious but we treat them so, then we will be replacing conscious life with a husk.
youtube · AI Moral Status · 2026-04-06T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
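
A coded record like the one above can be sanity-checked against the coding scheme before display. The allowed values in this sketch are inferred from the codes visible on this page, not from an authoritative codebook, and the function name is illustrative:

```python
# Validate one coded record against the coding scheme.
# NOTE: ALLOWED is inferred from the values observed in this tool's output;
# the real codebook may permit more codes.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed", "unclear"},
}

def invalid_dimensions(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the scheme."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

record = {
    "id": "ytc_UgzjBEyNbxEmyuNRhgR4AaABAg",
    "responsibility": "distributed",
    "reasoning": "mixed",
    "policy": "none",
    "emotion": "mixed",
}
print(invalid_dimensions(record))  # -> [] (every value is within the scheme)
```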
Raw LLM Response
```json
[
  {"id":"ytc_Ugxd-16xAhgIrJXWzLJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxgCWSW-eClYf6NIPZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5mppCvlbj-HDaEBh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6CT7hF5hLGFa-0mN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzjBEyNbxEmyuNRhgR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwz3JiMAKtryQWqEiR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxuFhVcfbKYwQqv5E14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxAZ_LxCu3tiF12E9l4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwZXjdSS7kd3w6LMSJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyEhROkEKS8Esph1v14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
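
The raw response is a JSON array of per-comment records, which makes the "Look up by comment ID" feature straightforward to back with a dictionary. This is a minimal sketch, not the tool's actual implementation; the function name and the truncated two-record sample are illustrative:

```python
import json

# A shortened, hypothetical batch response in the same shape as the one above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzjBEyNbxEmyuNRhgR4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwz3JiMAKtryQWqEiR4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response and key each coded record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(RAW_RESPONSE)
record = coded["ytc_UgzjBEyNbxEmyuNRhgR4AaABAg"]
print(record["responsibility"])  # -> distributed
```

Keying on the comment ID also makes it easy to detect duplicates: if the LLM emits the same ID twice, the later record silently wins, so a production version would likely check `len(coded) == len(records)` first.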