Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
So, the focus of a lot of the fear around LAWS seems to be how effective they would be. Essentially a fire-and-forget assassin - there's even a scifi short about LAWS droneswarms executing mass murders of college students. What worries me, however, isn't how we're marching towards Skynet and the Terminator. Rather, I'm worried how stupid AI actually is. Giving the power of a Hellfire missile or a 50-cal to a complex program that cannot find all the bicycles in a picture is a recipe for disaster, simply because the AI cannot necessarily discriminate between valid and false targets. Add the issue that often the designers of AI have no idea how their programs came to the conclusions they did, and you may have a future where LAWS are hunting the horse to extinction because someone told a LAWS swarm to hunt protestors in dinosaur costumes. How did it get to horses from dinosaur costumes? How did other LAWS pick up on the idea? No one knows, but the horse is nearing extinction in 2050.
youtube
2026-03-10T20:0…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyA9kXd7xiiQKMXbCB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzjxjZOUjfaQipRYpp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwwpNN_xweBqUNxhu54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWKwDfLr3d_2ID03d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyEFdTlbF7F4Uc68dt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4-RFfed9D_BAj6RJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwfXsu0h2aalz08mXd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzMEkDcDlpNUIaXDT54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwyIL12YAbAC72wA9B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzLy--Rit9lri7HEm94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"mixed"}
]
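The raw response above is a JSON array of coding objects, one per comment ID, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of consuming such a response — the variable names and the truncated payload here are illustrative, not part of the tool:

```python
import json

# A subset of the raw LLM response shown above (illustrative; the real
# response contains one object per coded comment).
raw_response = """[
  {"id": "ytc_UgwyIL12YAbAC72wA9B4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzLy--Rit9lri7HEm94AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "mixed"}
]"""

# Index the codings by comment ID so any comment's result can be looked up.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for the comment displayed on this page.
coding = codings["ytc_UgwyIL12YAbAC72wA9B4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

This lookup reproduces the Coding Result table for the displayed comment (responsibility: developer, reasoning: consequentialist, policy: regulate, emotion: fear).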