Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
I also think that many people are actually excited by catastrophic scenarios. It's in the human nature and many people proove it everyday by enjoying horror movies (or simply movies in wich there is always some kind of negative perspective and so on.). And I also share a similar point of view : we see some people working in AI and high tech fields talking about horror stories that will happen because of AI and so on. And it can also be a way to attract attention you know... When you are an expert in your field and try to scare people a little bit, it could also be a way to feel like people really need to refer to you to see more clearly about the situation. A kind of interaction based on some kind of anxiety codependance if you will... But there are also probably plenty of people who work in those fields who do not share those negative views. In fact, yesterday, I watch a video about one of those people, who has a much more optimistic vision about it.
Platform: youtube
Video: AI Moral Status
Posted: 2025-12-30T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxkC2kxOYE6EJyljkB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxobV1Zq_2GhvursrR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgweXz3zDuc5snqRLW54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOwJobqViJKblxmpx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwoJErS9Fc0ZSMxPRp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugwh4LXkO_QewU6hfeR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxz4QRV46B6u5iHkgt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxrGEWFMYRxwttGSTd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJ4zIRL_w4mdPPkId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwwoz0eeEByULU16PZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
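A downstream consumer of these raw responses has to parse and sanity-check them before trusting the codes. A minimal sketch in Python, assuming the allowed values per dimension are exactly those visible in the sample output above (the real codebook may define more categories, and `CODEBOOK` and `validate_response` are hypothetical names, not part of the tool):

```python
import json

# Allowed code values per dimension — inferred from the sample response
# above; an assumption, not the tool's actual codebook.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "user", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "approval", "mixed"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id", "")
        if not cid.startswith("ytc_"):
            raise ValueError(f"unexpected comment ID: {cid!r}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in CODEBOOK}
    return coded

# One record taken verbatim from the response above.
raw = ('[{"id":"ytc_UgwoJErS9Fc0ZSMxPRp4AaABAg",'
       '"responsibility":"government","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"approval"}]')
coded = validate_response(raw)
```

Indexing by comment ID is what makes the "look up by comment ID" view cheap: a single dict access recovers all four coded dimensions for any comment.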