Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `rdc_o5xfkra`: "I’m glad I work archaeology. I’m not replaceable by AI, so everybody who claimed…"
- `ytc_UgwlhUF1H…`: "Man i wished ai art didn't exist now We're getting called ai artist when We're r…"
- `ytc_UgyYN1ZwY…`: "Elon Musk’s and H. Tech approach should prioritize defense due to escalating con…"
- `ytc_UgzNajOyT…`: "To all the AI stans out there talking ableism, genuinely look up the history of …"
- `ytc_Ugx671th-…`: "Just a thought for discussion. If Quantum Information Panpschysim is proven real…"
- `ytc_Ugz4JiGK_…`: "Its one thing if its a human artist is learning to draw in a different style tha…"
- `ytc_UgyjsJXio…`: "I've used a few times. It takes a bit of thinking just to ask the AI the right w…"
- `ytc_UgwR5aqfE…`: "4:55 This is getting close to The Big Question(s), in my opinion. Today we have …"
Comment

> I listen to this, then I think about how much energy we have used to create these chatbots and models which we dont understand, and I feel like its a mistake. We have opened pandoras box here, and rushed over a precipice to which there is no going back from. I dont really know how to put into words how the evolution of this technology makes me feel, because its a lot and complicated, but its not good.

youtube · AI Moral Status · 2025-11-01T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz6idoqSOMT011KrQ54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6do0hhd3IvUePHQ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzNEnzhSgveLp2ys-d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxRGPaiFomrZXKI3Fl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxFSvkAbdRnDfCWTIp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw1sIOM6nQI3GAX3UR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzVouUmDwYZPSfiL7h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUgjrdxau62lzsYJ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy_UfQv7wTVXnaSLJB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwnWgz4MBt3ZkNVAj14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
```
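The raw response is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a batch could be parsed and indexed for per-ID lookup (the variable names and the two-row sample are illustrative, not the app's actual code; the field names mirror the table dimensions above):

```python
import json

# Illustrative two-row excerpt of a batch response, using two of the
# full IDs shown in the raw response above.
raw_response = """
[
  {"id": "ytc_UgxRGPaiFomrZXKI3Fl4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugz6idoqSOMT011KrQ54AaABAg",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "none", "emotion": "indifference"}
]
"""

# Index the batch by comment ID so a single coding can be looked up
# without rescanning the array.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxRGPaiFomrZXKI3Fl4AaABAg"]
print(coding["policy"], coding["emotion"])  # ban fear
```

This matches the Coding Result table above: the entry for `ytc_UgxRGPaiFomrZXKI3Fl4AaABAg` carries the distributed/consequentialist/ban/fear coding displayed for the selected comment.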