Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

| Comment (truncated) | ID |
|---|---|
| If people have no work, they can't buy anything, and that would destroy the econ… | ytc_UgzhkGC6i… |
| when and if i get a tesla im going to start doing the stuff u do :) a question d… | ytc_Ugw100tHm… |
| The best thing a sentient AI could do for humanity is to prevent us from killing… | ytc_Ugyy6GEp1… |
| The people arriving to tell us how obviously Ai this is will be arriving in 3….2… | rdc_mubuqyd |
| I : Worst Side effects of AI / AI man : Future of Ai could be solve it / I : Side ef… | ytc_UgyhtPFVL… |
| It's AI that awakens those people to think AI is sentient 😂 I was one but just f… | ytc_UgztSGVgR… |
| Actually, it's unfortunately the problem with modern parents. They let a chat bo… | ytr_UgzY3BCtk… |
| @nmnm7742As is said with regards to guns, “Guns don’t kill people, people do.” … | ytr_Ugz_cITMk… |
Comment

> I think you're grossly missing the point about what's driving these products being built. Sora is pushed out because if it isn't sora first then it's Veo. And there's a lot of money behind the race rightfully so because people will pay to use this technology. If you can build it, you have to, or someone will do it before you. Even musk said he'd rather be a participant than a spectator.

youtube · AI Moral Status · 2025-11-04T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz5IrUl-At-Bbp7xaB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXQN8DPGzhg59PFdZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx_ujM_YSEOXowtVXh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz-FqF3Cjw837NCXpZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzbT4ni6D9X_SCpXtF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxWKKmo5Fq4J3bTVx54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOSBY719ntx_SgqTZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyPEGOYhaW4ag01Qtp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwS5zr8ParRGI_K07N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyNYCRV3Vk1tH-dZdN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
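The raw response above is a JSON array with one object per comment in the batch. A minimal sketch of how such a batch might be parsed and indexed by comment ID — note the allowed values below are inferred only from the codings visible on this page, not from an authoritative codebook:

```python
import json

# Dimension vocabularies inferred from the examples on this page;
# the real codebook may define values not shown here.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if any dimension holds a value outside ALLOWED,
    so malformed model output fails loudly instead of entering the dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {row.get(dim)!r} for {dim}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Looking up one entry of the parsed dict then mirrors the "Coding Result" table shown for a single comment.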