Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Just recently a tec genius was talking about this, he said he has see a AI that …" (ytc_UgxhFZW_e…)
- "What's more likely to happen is that there will be a Dotcom-era crash and a whol…" (rdc_nxpjk4s)
- "Don’t worry about Ai. Worry about Sam Altman a human. Watch his train wreck of …" (ytc_UgzXI4OQ9…)
- "@elio7610 if you don't understand why it's true then you don't understand the d…" (ytr_UgzYzZZQX…)
- "This is nonsense. This is transitory. Yes there are some AI evangelists being ov…" (ytc_UgwHrruNI…)
- "@laurentiuvladutmanea Yeah it is when they say AI art is not art. shit their are…" (ytr_UgwI9NLlX…)
- "if i know who start ai art i gonna go to the past and hit that guy with my dad b…" (ytc_UgyTVVgI-…)
- "Blue blood is a crazy cop out for saying I’m too lazy to make art I’ll just have…" (ytc_UgzBKBNC0…)
Comment
This subject is more complex than just taking the morality out of murder. The A.I. Problems are closer to the Terminator storyline than people realise. Autonomous weapons already exist, for years now, it is the complex A.I. That allows these machines to become dangerous. Not because they will turn on us but because they have the capability of outsmarting our human minds if turned against us through hacking or just by being used by the enemy.
Even the exact SKYNET threat in terminator is a real threat or would be if we attached all our military to a single A.I.
Source: youtube
Posted: 2015-07-30T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
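A minimal sketch of the record implied by this table, assuming the four coding dimensions plus the timestamp shown above. The class name is illustrative, and the value sets listed are only those observed on this page; the project's full codebook may differ.

```python
from dataclasses import dataclass
from datetime import datetime

# Value sets observed in this section (possibly incomplete).
RESPONSIBILITY = {"none", "developer", "user", "government", "ai_itself"}
REASONING = {"consequentialist", "deontological", "mixed"}
POLICY = {"none", "regulate", "ban", "liability", "industry_self"}
EMOTION = {"approval", "indifference", "mixed", "fear", "outrage"}

@dataclass
class CodingResult:
    """One coded comment, mirroring the fields in the table above."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime
```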
Raw LLM Response
```json
[
{"id":"ytc_UggHC7rLa4Gu0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggqZ-Bfm6zNFXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgiRHZUgZugRGHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UghIxxuueQpi6ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghHmu3UOIYh5ngCoAEC","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjviAibkxEovXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugg7AFhd6A9w3ngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugix_u8m5HqkxXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugi2IphvaEHTxHgCoAEC","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgjXp-Uti9IrE3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
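The lookup-by-ID described at the top of this page amounts to indexing this array by its `id` field. A minimal sketch in Python, assuming the raw response is available as a string; the two entries in the literal are copied from the array above, and the variable names are illustrative only.

```python
import json

# raw_response stands in for the model's batch output (the JSON array above).
# How it is actually loaded (file, database, API call) depends on the pipeline.
raw_response = '''
[
 {"id":"ytc_UggHC7rLa4Gu0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UghIxxuueQpi6ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
'''

# Index the batch by comment ID so any coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

record = codes_by_id["ytc_UghIxxuueQpi6ngCoAEC"]
print(record["responsibility"], record["reasoning"], record["policy"], record["emotion"])
# -> ai_itself consequentialist regulate fear
```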