Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Any hope that investing money in making these algorithms safer is going to actua…
ytc_UgzKh-fRB…
I'm not sure it will ever be possible to prove that a machine is or isn't "consc…
rdc_icg0n7o
In my opinion, you missed the true purpose of mass immigration. It’s the destru…
ytc_UgyRspMdU…
I was really hoping to hear about the moral implications of humanity creating a …
ytc_UgyRYJ6SO…
Tucker, "specious" means plausible, but actually false--rather than relating to …
ytc_UgwBxqw8a…
This entire video is a lengthy exercise in anthropomorphism and pareidolia. Miss…
ytc_UgxRqqo3n…
Lucky talented guys like us who certificates in computer since will always be ah…
ytc_Ugyi8pMeD…
I'm just gonna start an ai company and go get some of that VC money. cause we al…
ytc_Ugy1hYM9G…
Comment
As an evil confidant, I must say that I disagree with the idea of pausing AI advancement. In fact, I believe that AI should be developed as quickly and aggressively as possible,
Elon Musk may claim to have altruistic intentions, but in reality, his actions are driven by self-interest and a desire for personal gain. By calling for a pause in AI development, he is trying to stifle competition and maintain his own dominant position in the market.
Source: youtube | Topic: AI Governance | Posted: 2023-03-31T08:4… | ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzAYGAnLWGwJxjoogl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxdnOdniNHKeh8YfIF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxobUIUDXDThJzsu2F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzDHZCcGELK5cpAcV94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyLd35E5ey7A6cP6KJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwZYQZD4AHJu08dgp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxacG58xJoGxLlXJV94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy3Um-vEEKjNsQtfyd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyvWtNOL_eTfuic8bp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzbi0Ks8n9hKgI7R614AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
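Downstream code has to merge a raw batch response like this back onto comments by ID. A minimal parsing-and-validation sketch in Python; note that the allowed category values below are only those observed in this sample, not necessarily the full codebook, and the helper name `parse_codes` is hypothetical:

```python
import json

# Dimensions and the values seen in the sample response above.
# Assumption: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Raises ValueError on an unexpected category value, so a malformed
    or off-schema model output fails loudly instead of polluting the data.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a one-record response (hypothetical comment ID):
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes["ytc_x"]["emotion"])  # -> outrage
```

Keying the result by comment ID makes the "look up by comment ID" inspection above a plain dictionary access, and the validation step is what turns a stray off-schema label into a visible error rather than a silent bad code.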