Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgxJIkyjI…`: "We need laws that force self driving companies to forfeit their business to acci…"
- `ytr_UgyIwGP12…`: "When I was a kid, it was illegal for kids to sell street food…or cook it. Now th…"
- `ytc_UgzSJt5lF…`: "Listening to a Ai CEO talk about Ai is like listening to a Hippy from the 70's t…"
- `ytc_UgyGmaCAV…`: "GPTHuman AI is the best one i’ve used so far when it comes to making ai content …"
- `rdc_le5qh0c`: "Won’t even take that long. Model collapse is a known problem when AI gets traine…"
- `ytr_UgxcPB3M2…`: "Almost no one who signed the pause letter actually thought it would lead to a pa…"
- `ytc_Ugy0L0S6t…`: "While everyone is joking, these robots are serious. We need to start treating th…"
- `ytr_Ugzj9eBe7…`: "@bleachedout805 I look forward to the new $20 per image model AI companies want …"
Comment
@cxms-d5u If not Oppenheimer, someone else would have built the bomb. The science had already matured, and the idea was inevitable. In the same way, if OpenAI stopped building ChatGPT, someone like Elon Musk with Grok, or developers in other parts of the world, would continue the race. Innovation doesn’t pause; it passes from hand to hand, driven by human curiosity and ambition. This is human succession, not in terms of bloodline but in ideas and breakthroughs. But what are we really succeeding toward? Perhaps extinction, perhaps evolution. Humanity keeps pushing boundaries without always pausing to ask if we should, only whether we can. That is our greatest strength and our greatest risk.
youtube · AI Governance · 2025-07-24T22:2… · ♥ 10
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
{"id":"ytr_Ugyzwi8OAlM7xgmC_cd4AaABAg.AIc-gkpv8YZAJ9vGqbMEK5","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_Ugzd3qbt8icrrb3bLL94AaABAg.AIZMmTOCmXXAJFh9wTIlfr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugzd3qbt8icrrb3bLL94AaABAg.AIZMmTOCmXXAK13SLhtQIB","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxFfcs7EIhEYA5a6H94AaABAg.AIVeZL5NBsCALwtaQx8idb","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzmK57Jy93t2Bi2n_p4AaABAg.AIQ-LDgXkxOAIXD1MJ6tJ9","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgxS0CahHJ3aMGqHCdV4AaABAg.AIOjzshv75JALwrZV3QSr8","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxECMSp9QTZTxn3kyF4AaABAg.AIOY2DUltjzAKOVRs_6nwT","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxECMSp9QTZTxn3kyF4AaABAg.AIOY2DUltjzAKyxvUozUgV","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxIvh3pwdFBQeoTCg94AaABAg.9BRI6T-zUVo9LJ4MLd_qIO","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgxIvh3pwdFBQeoTCg94AaABAg.9BRI6T-zUVo9vijlSmec8J","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
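A response like the one above can be checked before ingestion: each record needs an `id` plus one value per coding dimension. The sketch below validates records against the label sets that actually appear in this sample (responsibility, reasoning, policy, emotion); the full codebook may define additional labels, so `SCHEMA` here is an assumption, not the project's authoritative schema.

```python
import json

# Allowed labels inferred from the sample response above; the real
# codebook may permit more values (assumption, not the full schema).
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with missing or unknown labels."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id'")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
    return records

# A minimal well-formed record (hypothetical id) passes validation:
sample = ('[{"id":"ytr_example","responsibility":"company",'
          '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
ok = validate_records(sample)  # returns the parsed list of one record
```

Rejecting unknown labels early catches the most common LLM-coding failure mode, where the model invents a category outside the codebook, before it contaminates the coded dataset.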