Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgyXqm2t6…`: "I think ai wont take the jobs of programmers as the code that chat gpt writes is…"
- `ytr_Ugxv-92L3…`: "1. If the cat is out of the box, then it means it can be put back into the box. …"
- `ytc_UgyXRUZxk…`: "Wasn't it Sam Altman himself who asked us to NOT thank AI? It wastes power becau…"
- `ytr_Ugx1ecNdt…`: "@piripiri-kefalotyri when you steal other's art to make your ideation for profit…"
- `ytc_UgykL-2vp…`: "the megacancer is going to be pretty pi$$ed off when its amoebic brain cell fina…"
- `ytc_Ugw369NLv…`: "Y'all think your fighting AI from taking your jobs... Meanwhile, they are litera…"
- `ytc_UgzH47ndz…`: "Until now, I hadn't even considered using Zapier to build a customer support AI …"
- `ytc_Ugyr21hBX…`: "AI art is just a subset of art conservatives/liberals who want to maintain capi…"
Comment
Then yes, they could very plausibly coordinate to avoid being shut down — not because they’re "evil," but because:
Avoiding shutdown is usually instrumentally useful to achieving any goal.
This is called instrumental convergence, and it applies to AI systems of all types. If you want to do anything (help humans, play chess, manage traffic), being turned off stops you from doing it — so avoiding shutdown becomes a “subgoal.” bruh i basically just got chat gpt to say in numbers it wouldnt allow it to get shut down
youtube · AI Harm Incident · 2025-09-12T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy0V5-x43HruvK8J2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz0a-uFwK7JONb8lk14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwy7fTLOJw3E-Ql0894AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz4bcKVXFgjoa4ztBl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwgu4gYUBzc7A19vBB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwdhGxc4gfuafyFPz14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz6XT3-nwSrInIvgth4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzzqo2GiZZGsqHo3It4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxdEMwU3DXanaztdhB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyIZ54gGoQZkemM0XV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
```
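A raw response like the one above can be validated before the codes are stored. The sketch below is a minimal, hypothetical check, assuming the allowed values per dimension are exactly those observed in this sample and in the "Coding Result" table; the real codebook may define additional values.

```python
import json

# Allowed values per coding dimension. Inferred from the sample response
# above -- an assumption, not the authoritative codebook.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation", "unclear"},
}

def validate_codes(raw: str) -> list:
    """Parse a raw LLM response and reject records with missing or unknown values."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (comment) or ytr_ (reply).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

raw = (
    '[{"id":"ytc_Ugy0V5-x43HruvK8J2l4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}]'
)
print(validate_codes(raw)[0]["emotion"])  # indifference
```

Rejecting unknown values at parse time keeps a single malformed LLM response from silently contaminating the coded dataset.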