Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Comment ID | Excerpt |
|---|---|
| ytr_UgxJE0i_O… | You bring up an interesting point! The dialogue touches on the balance between w… |
| ytc_UgynYmctv… | When I see this as somebody not working with AI, but who has been concerned with… |
| ytc_UgySDWIqL… | Back in a day. Where payphone still existed and you still had to spin dial 7 num… |
| ytc_UgzDACOG-… | I'd like to see one of these bullsh!t.driverless rigs drive around NYC for a day… |
| ytc_UgxwEa2BI… | None of these Tesla cars be allowed to drive in the road if they're in autopilot… |
| ytc_Ugz9b6ivt… | I think it makes more sense to ask if humans “think” in the way everyone just se… |
| ytc_Ugyfnju13… | If adults are engaging with this AI characters and questioning if they are truly… |
| ytc_Ugy3WiQde… | Sci-fi has been hinting at this for a long time. Human creativity tends to imagi… |
Comment
As long as we don't have Autonomous Cars, we won't have autonomous weapons either. If our most financially motivated AIs can't drive a car in civilian traffic then your autonomous tank isn't driving to the battlefield alone either. Note that military vehicles still have to be able to travel on the road, otherwise your logistics gets drastically worse (every AI-tank would have to be rail/air transported ANYWHERE, as they couldn't even drive 10 kilometers without running over civilians). That's before you even consider programming them for the battlefield, the civilian aspect would first need to be worked out...and it looks like that won't be happening any time soon either.
Aircraft would be simpler to code, but the margin of error is also significantly larger. Given how fast aircraft are, even a small lapse of judgement could mean your AI plane bombing the wrong country or your own guys, for example. The best we're going to get before 2050 is human AI teams.
Source: youtube · 2024-06-30T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgxbTFeIuAu_LrXP0e14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx--8E7ZbNSraiH5z94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_ystxGsXwKSU9SpZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJbhtSZPlnqbnMxJJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3Usn8VuoNozuQ9S14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwsyD0hQ-pqhrVjmrN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyoWx9xCS_CgMG6V2V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzjHpcT4zN2MjjNXRZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGxlx9nN99LzW96lp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZ4BuH5ouc_Ai8HyB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"}]