Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "There is another important problem with driverless cars. My car provides some p…" (ytc_UgzAGhhfq…)
- "Before returning devs need to demand job security. AI only grows as if continues…" (ytc_UgwTP11-3…)
- "I just don’t understand how if America slows down or henders ai development how …" (ytc_UgxtWtYJd…)
- "What could be the potential jobs that chatgpt could replace and what are the new…" (ytc_UgzMu8NO8…)
- "This happened to some extent with the very first chatbot programs that ran on Un…" (ytc_Ugx-sFLMm…)
- "Can you please interview Benedict Evans, technology analyst and former partner a…" (ytc_UgxY2ziLv…)
- "So everyone loses their job -> then who buys stuff and pays taxes? -> and someho…" (ytc_Ugw9WUZxS…)
- "As someone who works as an artist in a coorperate setting, that comment from Nic…" (ytc_Ugz_AOQ7e…)
Comment
Interesting issues brought up. But why does it matter if a person is killed by an automated predator drone than a human controlling the drone? The reason wasn't given.
At the end of the day, the person is still killed by a man led military. The robot is following the chain of command. If it's not following orders, then that's no different from a human soldier not following orders. Heck, if anything, and the panelist agree, robots will be better at discriminating combatants from non-combatants. Yet, people still feel icky about persons getting killed by robots.
A ban of autonomous "killer robots" seems to be mostly emotional, fueled by recent movies and popular literature.
Source: youtube · AI Responsibility · 2016-07-28T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
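The four coded dimensions in the table map one-to-one onto the fields of the raw model output below. A minimal sketch of validating a coding row before storing it, assuming a codebook inferred from the values that appear on this page (the real codebook may define more categories, and the `validate` helper is hypothetical):

```python
# Allowed values per dimension, inferred from the sample codings on this
# page — an assumption, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed"},
}

def validate(row: dict) -> list:
    """Return (dimension, value) pairs whose value is outside the codebook."""
    return [(dim, row.get(dim)) for dim in ALLOWED if row.get(dim) not in ALLOWED[dim]]

# The coding shown in the table above passes; an out-of-codebook value is flagged.
row = {"id": "ytc_Ugj34Qf8UOhxm3gCoAEC", "responsibility": "distributed",
       "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
print(validate(row))  # []
```

Flagging rather than raising keeps a batch run going when the model emits an unexpected label.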
Raw LLM Response
```json
[
{"id":"ytc_UgixlPFeQ1R8H3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiyK8Zp7jtHZ3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiEG0Zg29l0mHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugj34Qf8UOhxm3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjY_cocRkGtEHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ughzyf_JNlSVO3gCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ughc5f8nD8LA4XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggxCaNMyVyivXgCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiCjS6CNIM8t3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzEXf-BkR-bOUoMASp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"fear"}
]
```
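The "look up by comment ID" behavior above can be sketched in Python, assuming the raw response is a JSON array of per-comment codings like the one shown (field names taken from that sample; `index_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Abbreviated stand-in for the raw model output shown above: a JSON
# array with one coding object per comment ID.
raw_response = '''
[
  {"id": "ytc_Ugj34Qf8UOhxm3gCoAEC", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzEXf-BkR-bOUoMASp4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "ban", "emotion": "fear"}
]
'''

def index_codings(raw: str) -> dict:
    """Parse the model output and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_Ugj34Qf8UOhxm3gCoAEC"]["reasoning"])  # consequentialist
```

Indexing once on load makes each subsequent ID lookup a constant-time dictionary access rather than a scan of the array.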