Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Do you people not understand that this behavior is criminal and can cause great …
ytc_UgzbcArSd…
That's hogwash. AI is not possible in our lifetime. He is projecting his desire …
ytc_UgyngZ02p…
What no one seems to be addressing are the safety issues. Driverless trucks are…
ytc_UgzCijwuw…
Why can’t AI go through every possible scenario, and then give that data to a dr…
ytc_UgxdMaIPF…
It's Life Jim. We need AI to carry forward the torch of intelligent life after o…
ytc_UgxLqRbbs…
Unfunded claim and ill-defined words (THINK, INTELLIGENT). Chain-of-thought in A…
ytr_UgySlXL1k…
AI prejudice is the least of my concerns. A mother brain in charge of nukes, th…
ytc_UgxiqYOVF…
No doubt UBI would help with automation-based unemployment. Too bad that this ly…
rdc_ogtwn91
Comment
I think a lot of it is what we choose to accept as "good enough". If AI art is "good enough", we won't need real artists. If AI music is "good enough" we don't need musicians. If AI movies are "good enough" we won't need filmmakers. If AI podcasts are "good enough", we don't need podcasters. If we start to demand more rather than just accepting "good enough", then AI will take a lot longer to replace us. Human authenticity is one of the few things we have left, and we need to hold on to that.
youtube
AI Governance
2025-09-05T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwtMZ498dGVfo_bcHd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwn4LMAaKJFfknwwI54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy736Rkwl_EJBQ7tyB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwQ_XSNGfoAHITRtKV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyUAZPnKQPOmFODa_94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwyyrIbaG3NGt7l-Q94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugysq7uIQRYlKcYFRO94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzQAc0zUuP3Vz60qzZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzKf2QovsHBfISLLsp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx9QPKaz0BPMgRvwT14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
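The raw response above is a JSON array of per-comment records, each carrying the same four coding dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. A minimal sketch of how such output could be parsed and validated is below; the helper name `parse_codings` and the strict missing-field check are illustrative assumptions, not part of the tool itself.

```python
import json

# A one-record excerpt of the raw model output shown above
# (the record matching the coding result table for the expanded comment).
raw = '''[
  {"id":"ytc_Ugy736Rkwl_EJBQ7tyB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]'''

# The five fields every record is expected to carry.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_json: str) -> dict:
    """Parse the model's JSON array into a dict keyed by comment ID,
    rejecting any record that is missing a coding dimension."""
    records = json.loads(raw_json)
    coded = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record missing fields: {missing}")
        # Store the four coding dimensions under the comment ID.
        coded[rec["id"]] = {k: rec[k] for k in REQUIRED_FIELDS - {"id"}}
    return coded

codings = parse_codings(raw)
print(codings["ytc_Ugy736Rkwl_EJBQ7tyB4AaABAg"]["emotion"])  # approval
```

Keying the parsed records by comment ID makes the "Look up by comment ID" view a plain dictionary lookup.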