Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Why don't we just get rid of the small number of people working on AI before the…" (ytc_UgyAnAyRC…)
- "This is a terrifying concept. I know it's probably a joke but imagine if an AI d…" (rdc_jcgpvht)
- "One minute in and I really can' t get past someone not ironing his bloody shirt.…" (ytc_Ugx6_rnvE…)
- "@lanedillon6365 Whenever I don't have ideas which is most of the time, as a pers…" (ytr_Ugyz9QZgf…)
- "interesting... remarks: 1. MW def: \"or a split roll\" ... for a hotdog, the rol…" (ytc_UgyahA87i…)
- "I don't know that this is fake or real but i know that robot is very dangerous f…" (ytc_UgyAVXu-E…)
- "An overpriced lookup databases is not artificial intelligence , Not even close, …" (ytc_UgyMAZlav…)
- "Ai is a tool, and with any tool it has good and bad uses. Don't hate the tool, h…" (ytc_UgxN9-v1o…)
Comment

> here's the problem, let's say we can make a drone that flies itself and, say, delivers packages for Amazon. It wouldn't take much to convert that device to shoot a gun, launch a rocket, or drop a bomb. The pandora's box is already open. We have drones that follow people around with a camera. It is just a matter of time, and even if the military doesn't put a gun on an autonomous drone, some jackass is going to do it at home.

Source: youtube · Posted: 2015-08-04T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UghtSkBgzSYBtHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjViNNXfNfSJHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggmA0mXDPRJZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjCKuvjORfp8ngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UggRPYH0T4jMPHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggAVsZqHgrQLHgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UggiVbomHzBmy3gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugh3w9U0giWCwngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjfNG0lGF6WFXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugi1I3DCzAfkyHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
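The raw response above is a JSON array with one object per coded comment, each carrying the four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A minimal sketch of how a lookup-by-ID view over such a response could work, assuming the model output is valid JSON (the `index_codes` helper name is ours, not part of the tool):

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UghtSkBgzSYBtHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UggRPYH0T4jMPHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse the model output and index the coded dimensions by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codes = index_codes(raw_response)
print(codes["ytc_UggRPYH0T4jMPHgCoAEC"]["emotion"])  # fear
```

In practice a parser like this would also need to tolerate malformed model output (truncated arrays, stray prose around the JSON), which is why inspecting the exact raw response, as this view does, is useful.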