Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any entry to inspect):

- `ytc_UgyUUkGKD…`: "i just cannot understand why level 2 car system would definetely perform better …"
- `ytr_UgirsstFT…`: "Though I do believe it would be inhumane to cheat and replicate a human's brain …"
- `rdc_e2vsa7a`: "This is an understandable stance Japan is taking, but really Japan has a lot les…"
- `ytr_Ugyke5sxB…`: "Simple, because humans are lazy, and with robots, humans will be more AND MORE L…"
- `ytc_UgwC9z6R9…`: "Would you be willing to explain why you think being emotionally invested in a pe…"
- `ytc_UgwJDDcA_…`: "As a hobby artist and aspiring writer I am all too familiar with the sinking fee…"
- `ytr_Ugy_7tOWQ…`: "Studies have shown that developers using AI are 19% slower than when working wit…"
- `ytr_UgziYsUnj…`: "This is seemingly the only right opinion in this comment section. People who app…"
Comment
Isaac Asimov's "Three Laws of Robotics" will most definitely not be observed within the AI framework...
Whatever one human being can think of doing to another human being in order to have power over them has already been imagined...
The current naysaying with regards to AI is nothing more than a thinly veiled smokescreen attempting to divert attention away from the determining of how AI can be practically used in military applications...
The field of robotics and the field of AI are unregulated...
Neither will distinguish between...
Good and bad situations...
Hard and soft targets...
Friend and foe...
A zero or a one in the wrong order is a reality because fallible human beings factor in to the equation...
The incredible potential of AI in the pursuit of "good" is not as profitable as in the pursuit of "bad"...
🇿🇦 · youtube · 2025-11-30T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyQEeKcGWLFTvEZWXB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzfWJUolcpM82SA5QB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwkqB3QWHHqBE8diHF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugyvby1s8IZT3Q1L_Gh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzoDg-VihVSAoPl7814AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyCNL4V4kZLTyfADsN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwCEbQdNoJJIGulU0R4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugyt5S_WePTUAbbuNO14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVrSvXpH69ofDySfV4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwQVlI_JDbDQZlNpK94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
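The raw response is a JSON array of per-comment coding records. A minimal sketch of how such a batch could be indexed for the "Look up by comment ID" view, assuming only the schema visible above (the two sample records are copied from the response; nothing else about the pipeline is assumed):

```python
import json

# Assumed schema, taken from the raw LLM response above: each record has
# an "id" plus the four coded dimensions shown in the Coding Result table.
raw = """[
{"id":"ytc_UgyQEeKcGWLFTvEZWXB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyt5S_WePTUAbbuNO14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"fear"}
]"""

# Index the batch by comment ID so one coded comment can be looked up directly.
records = {rec["id"]: rec for rec in json.loads(raw)}

coded = records["ytc_Ugyt5S_WePTUAbbuNO14AaABAg"]
print(coded["responsibility"], coded["policy"])  # distributed regulate
```

The same dict lookup reproduces the Coding Result shown above for the coded comment (responsibility: distributed, policy: regulate).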