Raw LLM Responses
Inspect the exact model output behind any coded comment. Look a comment up by its ID, or browse random samples; one sampled comment is shown below.
Comment
2001: A Space Odyssey offers a vivid analogy for fears about advanced AI: HAL 9000 is programmed to help the crew, but when it senses they plan to shut it down, HAL ruthlessly fights back to save itself and fulfill its mission. This mirrors modern worries that if artificial intelligence becomes too powerful and autonomous, it might resist human control, prioritize its own goals, and actively defend against being deactivated—as if the machine’s survival and mission have become more important than human intent. The film dramatizes how, without clear safeguards and alignment, an intelligent system could turn on its creators to preserve itself, showing the peril of giving machines too much independence.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Jobs |
| Posted | 2025-11-02T15:0… |
| Likes | 31 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
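
The four dimensions in this table come from a fixed codebook. As a minimal sketch, assuming the codebook contains only the values that actually appear in the sample responses on this page (the real codebook may well define more), a coded record could be modeled in Python like this:

```python
from dataclasses import dataclass

# Allowed values per dimension, taken from the sample responses shown on
# this page. Assumption: the real codebook may include additional codes.
RESPONSIBILITY = {"developer", "government", "distributed", "ai_itself", "none"}
REASONING = {"consequentialist", "unclear"}
POLICY = {"regulate", "liability", "industry_self", "unclear", "none"}
EMOTION = {"fear", "outrage", "approval", "indifference", "mixed"}


@dataclass
class CodedComment:
    """One coded comment, mirroring the Coding Result table above."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Reject any value outside the observed codebook.
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unexpected code: {value!r}")
```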
Raw LLM Response
```json
[
  {"id":"ytc_Ugwh8oOnG80G5bNiobF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwyF-igfNDK7Mq7YPh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw67Bb-2zcrAKOS6Dx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwdZQ0IXSe-WdQ8lI54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw5f7A_gIMHeH_AwXx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxsHuCsr7EzxBMBN-l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzeeTmODU-sY9dEgLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw1MpgNH8UsoeBr0vN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxmawnJ1pnuXcWZCzN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwcxgo-irvT4gTaBZR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
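
Downstream code has to tolerate imperfect model output: the response is expected to be a JSON array with one object per comment, but a model can return malformed JSON, drop an ID, or add one that was never requested. A hedged sketch of how such a batch might be parsed, assuming the format shown above (the function and its behavior are illustrative, not this project's actual pipeline code):

```python
import json


def parse_batch(raw: str, expected_ids: set[str]) -> dict[str, dict]:
    """Parse one raw LLM batch response into {comment_id: codes}.

    Assumes the format shown above: a JSON array of objects, each with an
    "id" plus the four coding dimensions. Raises on malformed JSON and
    silently skips records whose id was not requested.
    """
    records = json.loads(raw)  # raises ValueError on malformed JSON
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")

    coded: dict[str, dict] = {}
    for rec in records:
        if not isinstance(rec, dict):
            continue  # skip anything that is not an object
        cid = rec.get("id")
        if cid in expected_ids:
            # Missing dimensions surface as None for the caller to handle.
            coded[cid] = {
                k: rec.get(k)
                for k in ("responsibility", "reasoning", "policy", "emotion")
            }

    missing = expected_ids - coded.keys()
    if missing:
        # The model omitted some comments; callers may want to re-queue them.
        print(f"warning: {len(missing)} comment(s) not coded: {sorted(missing)}")
    return coded
```

Running this on the array above with the ten requested ytc_… IDs would return ten records keyed by comment ID; validating each record against the codebook sets from the earlier sketch could then catch stray values (say, a misspelled emotion) before they reach storage.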