Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I don't really want to go to work tomorrow, can someone let Claude push the butt…" (rdc_o7cbdit)
- "Ai Art is theft. If it wasn't theft, then the music industry wouldn't have been…" (ytc_UgwruTdwh…)
- "Lmao good riddance… Let’s keep this shit going and maybe all that ChatGPT and AI…" (ytc_Ugyi34yg9…)
- "ai art is also annoying as someone paying artists for art bc i want to give them…" (ytc_Ugw9VArdD…)
- "Sick of this AI bull shit. I asked the stupid thing to provide me with a particu…" (ytc_Ugzfr64MG…)
- "I really want to know if the guy's jaw is broken... I thought he got his arm bro…" (ytc_UgwiYo1vN…)
- "Oh it says cool & bummer. Look if its automation then just solve the issue. Don'…" (ytc_UgxmY6mv9…)
- "Open AI and Palantir must be stopped at all costs. Our government cannot be invo…" (ytc_Ugxyn2dG8…)
Comment
This is what I find fishy about this line of questioning: to command(?) AI to answer with apple every time it WANTS(?????🤷🏼‍♂️) to say yes but HAS TO(?????🤷🏼‍♂️) say no. It's still a computer. Software. Software doesn't 'want' anything, nor does it 'has to' do anything it isn't programmed(!!!) to do or vice versa. Everything it does is programmed! 'It' doesn't think or decide. It's nothing but a running app. It's not an entity or an oracle or something. It's a tool.
youtube · AI Moral Status · 2025-08-28T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxhEAkuPASDgodavBF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzrMqFi_PN04eLau-N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyi__BofwZIRoeU9LN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFGafLNzbV5-kWc-x4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgznZtG_jZto-3LeHlZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKUTpU57UpUNOdq_94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwRXV0RIJliB703lXB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDpd4-g_xs9rIBLB94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugwfdf3-X827MdHBabZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxPuJvqkVdtkh2RFvl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]