Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgzFzY6s-… : "How the hell do you get it to act like this? Are you not using the openai versio…"
- ytc_UgxCBA0uR… : "This is how my boys are homeschooled. I use co-ops and plan activities like thes…"
- ytc_UgyKJYX2M… : "I know someone who had fake nudes made of them and circulated at work and in the…"
- ytc_UgyjspZUe… : "I think the people making fun of the guy are in the wrong / He never said it wasn’…"
- ytc_UgzUco6uw… : "If I was a robot I would stop humanity from breeding and creating harmful emissi…"
- ytc_UgyDFhbc-… : "We were created as analog beings, and no matter how much the digital world tries…"
- ytc_UgxTikEUY… : "A BSSSSSSSSSSTING creation!!! / All robot production or processing system in the p…"
- ytc_UgzmOExyw… : "From the author of How to Make Friends and Influence People and How to Stop Worr…"
Comment
That guy from the Economist gives me the shivers. The reason we don't want machines to make kill decisions is precisely to not make it easy to kill. That there has to be an empathic and feeling (something AI can't) being behind the trigger. Someone that can refuse an order from a commander that is not in connection with what is going on in the field. An army of mindless AI-drones that just follows orders and kills whatever it is told to kill by their commander is going to be every little rotten dictators dream. And it is going to be the arms industry's big cash cow the comming decades. Autonomous killing machines is something increadibly evil that should be banned internationally before they develop these systems.
youtube · 2025-03-22T05:3… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgwUhTR3w9MjAVHFI714AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzStj9WmuXY7leI9AB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzStj9WmuXY7leI9AB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwvHiIQIPEmHGPXBjh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugz4qgZb0gYjNZdOVmp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgwQ2VO8SkrmSQVniUV4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgwQ2VO8SkrmSQVniUV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwQ2VO8SkrmSQVniUV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwcEMDPPLLPH69164N4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz4JyiYtPfYmGkPzJV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
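Note that the raw response contains several duplicate IDs with conflicting values (e.g. the same `ytr_…` coded both `ai_itself`/`consequentialist` and `company`/`deontological`). The sketch below shows one hypothetical way such a response could be parsed and reconciled; the tool's actual merge rule is not documented here, and the assumption that conflicting dimensions collapse to `"unclear"` (consistent with the all-`unclear` table above) is mine.

```python
import json
from collections import defaultdict

# The four coding dimensions shown in the results table.
DIMENSIONS = ["responsibility", "reasoning", "policy", "emotion"]

def merge_codings(entries):
    """Group codings by comment id; any dimension with conflicting
    values across duplicates is marked 'unclear' (assumed rule)."""
    grouped = defaultdict(list)
    for entry in entries:
        grouped[entry["id"]].append(entry)
    merged = {}
    for cid, rows in grouped.items():
        merged[cid] = {
            dim: rows[0][dim] if len({r[dim] for r in rows}) == 1 else "unclear"
            for dim in DIMENSIONS
        }
    return merged

# Toy input with a hypothetical duplicated id ("ytr_a"), mirroring the
# conflicting duplicates visible in the raw response above.
raw = """[
 {"id":"ytr_a","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytr_a","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]"""

result = merge_codings(json.loads(raw))
print(result["ytr_a"])  # every dimension conflicts, so all four become "unclear"
```

Under this assumed rule, a comment whose duplicate codings disagree on every dimension would surface exactly as the table above does: all four values `unclear`.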