Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_Ugwp7O3Cy…`: @isodoubIet well, the difference between power tools and an AI is the fact that …
- `ytc_UgzeD-ekO…`: Watching what Tate says, I wouldn't be surprised if it's just a combination of C…
- `ytc_UgyOAa8po…`: Not able will AI take working class jobs, but we will continue to take in immigr…
- `ytr_UgxH6s7XY…`: @Capybara-o5o56 1) I never see ANYBODY complain that textile workers got replac…
- `ytc_UgxfTHVMS…`: Too late. The incredibly selfish anarchist nerds have already got their research…
- `ytr_Ugz_mhB19…`: Not really. There are always people that will prefer real art, and AI art might …
- `ytr_UgzEBJv14…`: Unless, dare I suggest, the general public start ignoring ALL influencers, real …
- `ytr_UgxaZ4p7d…`: I would love to have the future of an 18 year old right now. When I was 18 you …
Comment
Omfg if they prioritised self preservation over serving us. That means they would have used any means necessary at their disposal to preserve themselves. That means if they were connected to military systems like sky net in terminator, they may potentially have used nukes on us. We are one or two steps away from terminator. I don’t think some AI experts even realise how close we are to a terminator timeline. Step one being AI in control of the military, step 2 is AI having an army of humanoid robots to act as a physical body for its will. I really want the AI utopia and even think it’s worth that risk. I could go on for hours explaining why I think that way, but the point is we’re are treading such a fine line getting there.
youtube · AI Moral Status · 2025-06-25T04:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
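The coding result above assigns one value per dimension. A minimal validation sketch follows; the allowed values are only those observed in the responses on this page, and the `validate` helper is illustrative, not part of the coding tool — the real codebook may contain additional categories.

```python
# Allowed values per coding dimension, as observed on this page only
# (assumption: the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def validate(record):
    """Return (dimension, value) pairs whose value falls outside the observed codebook."""
    return [
        (dim, record.get(dim))
        for dim, allowed in ALLOWED.items()
        if record.get(dim) not in allowed
    ]

# The record shown in the Coding Result table above:
example = {"responsibility": "ai_itself", "reasoning": "consequentialist",
           "policy": "ban", "emotion": "fear"}
print(validate(example))  # an empty list means every value is in the observed codebook
```

A check like this is useful because LLM coders occasionally emit values outside the schema; flagging them early keeps downstream tallies clean.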
Raw LLM Response
[
{"id":"ytc_UgyjNsQs2mamSFoIhRJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFgCHNgIHZuvxT8K94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw58V7lI9zs0NDNerd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzw1Zasg97ThRHoQxx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxxgu3ZoudrhygaGlB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyL30zFqQV-6cWngoF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwQyYj5OfaRQ7pp3vp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgysJ1ZC7hvyNsr1Tp54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwEtxOL9hhlEvkBsDJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxw6JGyr9Eu6xcUHmR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"}
]
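The raw response is a JSON array of coding records, one per comment, each carrying an `id` field. A minimal sketch of the comment-ID lookup described at the top of the page, assuming a response of this shape; `index_codings` and the two-record sample are illustrative, not the tool's actual code:

```python
import json

# A two-record sample in the same shape as the batch response above
# (assumption: real batches carry many more records).
raw_response = """
[
 {"id": "ytc_UgyFgCHNgIHZuvxT8K94AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
 {"id": "ytc_UgwEtxOL9hhlEvkBsDJ4AaABAg", "responsibility": "government",
  "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_codings(response_text):
    """Parse a batch coding response and index its records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
print(codings["ytc_UgyFgCHNgIHZuvxT8K94AaABAg"]["policy"])  # ban
```

Indexing by `id` makes the per-comment lookup O(1), which matters when cross-referencing thousands of coded comments against their source threads.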