Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It all depends on how you define consciousness - surprisingly enough, there is n…" (ytc_UgzcbTP8K…)
- "AI fake art is being propped up by venture capital that's starting to pull out..…" (ytr_UgwFHlUq7…)
- "Mass adoption of Ai= mass reduction of jobs. Mass reduction of jobs means ai fai…" (ytc_UgxluX-dF…)
- "I think professional writers will start doing this soon, if not already. Adopt A…" (ytr_Ugxb9H27m…)
- "AI came with all these lofty promises of improving the lives of humans and curin…" (ytc_UgxZcLBw0…)
- "Idk why some of you even call it Art. I always called those AI images or videos.…" (ytc_Ugzs7aVrU…)
- "I've stopped using chatgpt due to the constant arguments. It seems everything I …" (ytc_UgyyKaibF…)
- "AI lacks the one thing essential for any artwork: creativity. You cannot make an…" (ytc_UgzZpW1Qm…)
Comment
Okay just to be clear on one thing, Yudkowsky DOES think he's predicting the future with his far-fetched sci-fi nonsense, the man has been pedaling skynet-esque conspiracy theory since lonnng before the advent of chat GPT and llms. Just look into the rationalist community, the man has honestly pushed conspiracies that have produced an alarming number of cults and I honestly thing the fact that the wider scientific community still takes him seriously is a little dangerous because of that. Nothing against Hank here, I just think this kind of thing needs addressing in regards to any mention of Yudkowsky
youtube · AI Moral Status · 2025-10-30T19:2… · ♥ 31
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwZaFKIYyCfdsSS1R94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy-H-lkhzRZ5AlKyL94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugytbd7OXqG2YXVgmGV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwYSjrR-3YQGIB4WPl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxAveG1jMFX8tL4D914AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxdAjh07VRTFIFpxst4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwbH6zoV39vGOifhLt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwXTRkXwUWPubPuc-N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOJ3BpZjD82hW7Uwx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzBT387s47sOwcPs1Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
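The lookup-by-comment-ID step can be sketched in Python: parse a raw batch response like the one above and key each record by its comment ID. The function name `index_by_id`, the validation step, and the single-record sample response are illustrative assumptions, not part of the tool itself; only the record shape comes from the batch shown above.

```python
import json

# The four dimensions each coded comment carries, per the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Hypothetical raw response, in the same shape as the batch shown above.
raw_response = """
[
  {"id": "ytc_UgxdAjh07VRTFIFpxst4AaABAg",
   "responsibility": "developer", "reasoning": "mixed",
   "policy": "unclear", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch response and key each record by its comment ID."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        # Guard against malformed LLM output: ID and every dimension must be present.
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec!r}")
        index[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return index

coded = index_by_id(raw_response)
print(coded["ytc_UgxdAjh07VRTFIFpxst4AaABAg"]["emotion"])  # → outrage
```

Keying the parsed records by ID makes the "look up by comment ID" view a single dictionary access, and the validation raises early if the model returns a record missing a dimension.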