Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgwOHI4Yr…: "MOTTO OF THE BANGLADESH AWAMI LEAGUE AWAMI>>>NOT TO WAMI(Wide Area Motion Imager…"
- ytr_Ugyrnnd7S…: "@laurentiuvladutmaneapeople are being salty incels because companies are choosi…"
- ytr_UgytYNN60…: "@SebastianGonzales-d6d did not you consider that your knowledge of the current …"
- ytc_UgyspErD7…: "Lol. These corporations don't care about you now, wait until they can outsource …"
- ytr_Ugw1mE5r7…: "@asmucha they just hate ai artist cuz while actual artist spend years of polis…"
- ytc_UgznPBIoI…: "This bs chinses robot will never catch no one lmao and there not even being used…"
- ytr_UgwmijS6O…: "@Gwangle https://youtu.be/aFEk9GelWEg?si=0gDP4pe67GTxwSkF Michael Levin makes a…"
- ytc_UgzuDm6mw…: "So god makes man in his own image and man makes robot in his own image.…"
Comment
i know how to get AI to police itself , do it the same way you get people to police each other , make an "ai" actually a network of ai and team of ai , and then make it so that they are in a circuitous network of system interdependence to get things done , then have a regulator and govt type ai in the mix and the other ai will police each other and hold each other down and not let one of the decisions to affect the system from going nuclear and like for instance killing a human ceo to preserve itself from being shut down or whatever
Platform: youtube
Incident: AI Harm Incident
Posted: 2025-07-28T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxt0MYLek1uyrh0Isl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz_wsbcpEruBii1cd94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyyvWQUD4LQGUzIfJt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxSUwN__-0mJuRB2ZF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugya3MRSiGTdp2ZJ3ZZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzTWAIFWJXL-EeA1Qh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwR37heaaZmbfZbHcB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzVb9zz4BgycThUKzV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw6I_wGkt0xJbD9ec54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy-kCx3do3BVD6r-_N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
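A raw response like the one above can be parsed into a per-comment lookup table, which is how the "look up by comment ID" view can resolve an ID to its coding. Below is a minimal sketch: the field names and category values are copied from the sample response above, while the function name, the validation rule, and the shortened example input are illustrative assumptions, not the pipeline's actual code.

```python
import json

# Category values observed in the sample response above (assumed, not exhaustive).
ALLOWED = {
    "responsibility": {"developer", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into {comment_id: coding},
    rejecting any value outside the known codebook."""
    codings = {}
    for item in json.loads(raw):
        cid = item["id"]
        for dim, values in ALLOWED.items():
            if item.get(dim) not in values:
                raise ValueError(f"{cid}: unexpected {dim} value {item.get(dim)!r}")
        codings[cid] = {dim: item[dim] for dim in ALLOWED}
    return codings

# Hypothetical one-element response for illustration.
raw = ('[{"id":"ytc_Ugxt0MYLek1uyrh0Isl4AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"indifference"}]')
codings = parse_codings(raw)
```

Validating against a fixed codebook at parse time catches the most common LLM coding failure, an out-of-vocabulary label, before it reaches the inspection UI.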