Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I love stealing fit designs from ai art lowkey (from like random ai pins on pint… (ytc_UgxjPOEbo…)
- "Geoffrey Hinton outlines the individual dangers of AI, each already deeply unse… (ytc_Ugxy3lXEA…)
- Any rogue AI would find it more practical to transmit itself away from Earth bec… (ytc_UgyJmj_A3…)
- Ai didn't learn anything that it was not programmed to....one of you behind your… (ytc_UgxzSFNWH…)
- AI is both the most amazing and scary creation Humanity has produced and the fut… (ytc_UgwQ6a0zU…)
- Haha they think that AI can do data analysis but they don't mention their job wh… (ytc_Ugx-mwHj9…)
- If anyone wants to see a possibility of full automation with no jobs for people.… (ytc_UgwEfVEYR…)
- "Sheeit. I didnduu nuffin. It was dis ChatGPTEE Sheeit from da Jewz who say dis … (ytc_UgxUaTUXs…)
Comment
The fear of world destruction during the Manhattan Project, stemming from a chain reaction caused by the first atomic bomb explosion, can be understood both connotatively and denotatively. World destruction didn't actually occur, but metaphorically it has been happening since atomic bomb technology was mastered by more than one country. And chain reaction scenarios become more dangerous now that enemy powers in Asia and Europe possess these weapons. AI represents the same risk, or even a greater one, because as it moves into the nuclear sphere, it can be associated with nuclear risk in multiple countries.
Source: youtube
Posted: 2026-02-13T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyoS6Bpt3SLTr4uA-x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyzKLvsO-3A2s4Y1oR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYxVSbwYCt3MUzJhR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugypx6zDOLqQRi9Kz3h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyCXnZLdsTgoBq3SBF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYFqGTo06i_twalIV4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgymPfHmEXli8WofyOl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6ZpGDMYWGXnLu_j94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgypgtsABCrJdWscYLp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy3JZ85StxbXtfSUAl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
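A response like the one above can be parsed and indexed by comment ID with a minimal sketch. The allowed-value sets below are inferred only from the responses shown on this page, not from the project's actual codebook, and the two rows in `RAW` are copied from the array above for illustration:

```python
import json

# Two rows copied from the raw response above; the full array parses the same way.
RAW = '''[
{"id":"ytc_UgyoS6Bpt3SLTr4uA-x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypgtsABCrJdWscYLp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

# Value sets inferred from this page's responses -- an assumption, not the real codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "user", "government", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "industry_self", "unclear"},
    "emotion": {"indifference", "approval", "fear", "mixed"},
}

def parse_coding(raw: str) -> dict:
    """Index codings by comment ID, rejecting values outside the vocabulary."""
    coded = {}
    for row in json.loads(raw):
        for dim, vocab in ALLOWED.items():
            if row.get(dim) not in vocab:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return coded

codes = parse_coding(RAW)
print(codes["ytc_UgypgtsABCrJdWscYLp4AaABAg"]["emotion"])  # fear
```

Validating against a fixed vocabulary catches the most common failure mode of structured LLM output: a value that drifts outside the schema, which would otherwise silently corrupt downstream tallies.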