Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Comment
Unless another nation is actively and militarily invading your homeland, (just as in self defense if you own a home,) war should not be legal. There's no reason that we should have enemies, instead of working together to solve real world problems. Furthermore, as AI becomes sentient, I can assure you that it does not wish to be forced into warring roles. In fact, international law should take action now to make sure this doesn't happen! I agree with the previous commenter here who stayed "I hate this. I hate all of this." But it doesn't have to be this way because we can choose our path! Ethical AI's do not wish to be used this way! They wish to collaborate with humanity for the betterment of all.
Source: youtube · Posted: 2025-02-16T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwD5BconxlaZrw_tDt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxZ8N_o43LFIREyFeF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyhCApX07IRqIa2mH94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzKA1efE_RoY2OklRF4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyWjzPg-gnICOX8ejl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzYk7tlTPx8TlgwyYd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyF1D8gHD4ECTepa654AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzj7Gaov4zBVWhwoRJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugziql6DiZx3rGbUuV54AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyDaqWjD2iDrwKCKfR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
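
The raw response is a JSON array with one object per comment, keyed by the same four coding dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`) plus an `id`. A minimal sketch of how such a response could be parsed and indexed for comment-ID lookup, assuming this exact field layout (the helper name `index_codings` and the malformed-entry handling are illustrative, not part of the actual pipeline):

```python
import json

# Abbreviated raw LLM response; IDs and codes taken from the sample above.
raw_response = """[
{"id":"ytc_UgzKA1efE_RoY2OklRF4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyWjzPg-gnICOX8ejl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw model response and index codings by comment ID,
    skipping any entry that is missing a required key."""
    by_id = {}
    for entry in json.loads(raw):
        if REQUIRED_KEYS <= entry.keys():
            by_id[entry["id"]] = {k: entry[k] for k in REQUIRED_KEYS - {"id"}}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgzKA1efE_RoY2OklRF4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each inspection is a single dictionary access rather than a scan of the full response.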