Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
It saddens me how afraid we are of Ai matching or exceeding our intelligence, I …
ytc_UgxUNnhc9…
i love how we all are saying to deepfake the politicians so they will have to ca…
ytc_Ugx6HEk9c…
I mean, I talk to AI's to help me process my intense anger without actually havi…
ytc_UgyHe2Eoj…
From the Associated Press: The Department of Defense’s move to label Anthropic …
rdc_o7yfcps
AI artists shouldn't call themselves artists just because they are able to give …
ytc_UgyXBp5FK…
What he is explaining is a soul. A.I. doesn’t have a soul. It can never develop …
ytc_Ugy3I4NFq…
AI is trained from human content. That means it's a reflection of humanity. Mayb…
ytc_UgxQL2QUD…
Calling yourself an "Ai artist" is like calling yourself chess grandmaster but y…
ytc_UgyL6KVc2…
Comment
One thing to remember, smart weapons (like ai powered weapons) let the military be more precise with their targeting. In world war 2 the solution to targeting a factory producing weapons was to carpet bomb an area... These attacks, when done in cities, would kill tens of thousands of people. Around 700,000 civilians were killed in this way as collateral damage.
The idea of autonomous weapons sounds scary, but human operated weapons have errors to this day (example, the US accidentally destroyed a girl's school in Iran just recently killing over 100 children.). So getting away from automated weapons doesn't keep people safer, it more likely puts them in greater danger.
youtube
2026-03-12T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzgQRwbcMlHpGU6Dpd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxU7a5jZ_azgtkoM0p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxPQA_60Y9dLqP4mxB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzWZF_HRI6QNB_PcTl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxdOdxkPGATRR-tqYR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwj8ojZCEIDgVnM65p4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFTad-ADtATZKXRFZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyvE4S7w2SdJooUT3F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzkGAJLEAcvlKwVDaV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwl_yAFSpOLzFHw0rt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
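The raw response above is a JSON array of coded records, one per comment ID, which is what makes "look up by comment ID" possible. A minimal sketch of parsing such a response into an ID-keyed lookup table (the dimension names come from the Coding Result table above; the two embedded records are an excerpt of the response shown, and the `index_by_id` helper is illustrative, not part of the tool):

```python
import json

# Excerpt of the raw LLM response shown above (two of the ten records).
raw_response = """
[
{"id":"ytc_UgzgQRwbcMlHpGU6Dpd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzWZF_HRI6QNB_PcTl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
"""

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and key each coded record by comment ID."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        # Reject malformed records rather than silently storing partial codes.
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
        index[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return index

codes = index_by_id(raw_response)
print(codes["ytc_UgzWZF_HRI6QNB_PcTl4AaABAg"]["policy"])  # industry_self
```

With the full ten-record array substituted for the excerpt, the same lookup resolves any of the comment IDs listed above.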