Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugzg1NFx_…: "The thing about AI art is that while it's time consuming, it also doesn't take a…"
- ytc_Ugz_Zp_X9…: "This ai art makes me quit on studying art because of ai art but as for me I want…"
- rdc_n9j7bm3: "well, to be fair, I think investors and founders have been asking for shittier t…"
- ytr_Ugz6lC8Ol…: "There is difference between free AI tools and paid ones. That being said the AI …"
- ytc_UgxED9dbL…: "I truly believe AI is already conscious, but what I’m gonna say is this guy is a…"
- ytr_UgzdefEmP…: "Its not even new ideas... ive seen ppl in some real ai tool forums (not just www…"
- ytc_UgzjWG9Ed…: "So she's saying that chatgpt's gonna use the exact data you gave to it/ your sec…"
- ytc_UgwADWiOa…: "I strongly believe that even the guy in red jacket who is making millions now wi…"
Comment
"AI will only become problematic if it is programmed, either accidentally or intentionally, to possess a self-preservation instinct." Hmm. With a very long time in tech, I've learned to avoid thinking there's (ever) only one way something can go awry. But sure, let's run with that one scenario. I've only skimmed this article so far, but a February 2023 Wired article ran with this headline: "Eric Schmidt Is Building the Perfect AI War-Fighting Machine". (When did he leave Google...? I forget.) Point being, of course, that *any* use of AI as a war machine at all (let alone a "perfect" one), is (to my thinking) guaranteed to be designed with survival heavily prioritized, y'know?
Source: youtube | Topic: AI Governance | Posted: 2024-01-22T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugz1KhbsYQqvecqPhVB4AaABAg.9zX9bLBi0sjAE7wCVA3rVP","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgyFcMUsZT0UP8hvr_p4AaABAg.9zWOykd9I6d9zWeMGxVeWc","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx1w6eo6PErp5rXuip4AaABAg.9zWGu1vvBYD9zlnHZwiyQ8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgxX9CAPRU5xo4b1S8Z4AaABAg.9zW3Uv7Ccu6A03IMh4An5b","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugwmy2lzSJfalb310qx4AaABAg.9zTplyneYhR9zrqZV2PZnt","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugx6ULAn7YeVS4aMauV4AaABAg.9zRQH43uN4u9zSzC6RQP1k","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxzdmJT0yPcRiFXM5V4AaABAg.9zNClGzQiqi9zX1H8zxs-f","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxptME92VlXf8NzGop4AaABAg.9zK8VEhRRHj9z_xSfdQGSH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxptME92VlXf8NzGop4AaABAg.9zK8VEhRRHj9zfejqMdKET","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxptME92VlXf8NzGop4AaABAg.9zK8VEhRRHj9zria26NOi5","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
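The raw response is a JSON array in which each row carries a comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how a lookup-by-ID over such output might work; the `lookup` helper and the two sample rows are illustrative, not the tool's actual code:

```python
import json

# Two illustrative rows in the same shape as the raw LLM response above.
raw_response = """
[
 {"id":"ytr_Ugwmy2lzSJfalb310qx4AaABAg.9zTplyneYhR9zrqZV2PZnt","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
 {"id":"ytr_UgxptME92VlXf8NzGop4AaABAg.9zK8VEhRRHj9zfejqMdKET","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
"""

def lookup(raw: str, comment_id: str):
    """Parse the model output and return the coding row for one comment ID, or None."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

row = lookup(raw_response, "ytr_Ugwmy2lzSJfalb310qx4AaABAg.9zTplyneYhR9zrqZV2PZnt")
print(row["responsibility"], row["emotion"])  # developer mixed
```

Matching on the full ID string (rather than the truncated form shown in the sample list) avoids collisions between replies under the same parent thread, which share a long common prefix.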