Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgyLQxdAc…`: On 9 November 2023 A Man Got Crushed By A Robot In South Korea After Grabbing Hi…
- `rdc_m6youpm`: True. Same thing the military industrial complex would say. Except it's not wr…
- `ytc_Ugx4FFSo3…`: Overwhelmingly, these issues exist primarily because we collectively employ capi…
- `ytc_UgxrPe1ZE…`: there's this one meme video of elon musk owning a company where humans power the…
- `ytc_UgztN9XlE…`: I feel like the way to make it ethical is if 1. Artists submit their art into an…
- `ytc_UgxPJRvwx…`: I’m loosely aware of one tool that’s been developed along side AI deepfake stuff…
- `ytc_Ugwk4V9Tx…`: well like many has commented about lost jobs like self serve cash outs and Wal…
- `rdc_cjoyfun`: > My question is they spent 28 million dollars to train her. This is not a q…
Comment
> The unaliving from "AI" its simple:
> The person who deployed it or ordered its deployment without considering the risks is the culprit, look at "AI" as a hammer, not a magic wand or conscious being.
> if the hammer has a defect, who are you going to blame? really not that hard.

Source: youtube · Topic: AI Governance · Posted: 2025-07-04T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
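A coding result like the one above can be checked against the codebook's closed vocabularies before it is stored. A minimal validation sketch, with the per-dimension value sets inferred only from the coded values visible in this batch (the real codebook may contain more categories):

```python
# Allowed values per coding dimension, inferred from the sample batch shown
# on this page -- an assumption, not the tool's authoritative codebook.
SCHEMA = {
    "responsibility": {"user", "developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation"},
}

def validate(row: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items() if row.get(dim) not in allowed]

# The coding result from the table above passes cleanly.
print(validate({"responsibility": "user", "reasoning": "deontological",
                "policy": "liability", "emotion": "outrage"}))  # []
```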
Raw LLM Response
```json
[
  {"id":"ytc_UgzgnIDuJyhDvH8v18Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzTzDMZyxkOuHWO2kl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzDi4dLcPGOw1odQKN4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw7PzMW1oazR441Eyt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyFxOoqw2eFIkULkRJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwMIz6nVnG4rca8LBB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwO2rAEyFnpkqujS2J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxVqxWMoxJXvzo0mNR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugyvh3WYrfXm3ROw8NR4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzWa3WQSh9QnoaFq8l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
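A raw batch response in this shape can be parsed and indexed by comment ID for exactly the kind of lookup this page offers. A minimal sketch using an excerpt of the batch above (field names taken directly from the response shown):

```python
import json

# Excerpt of a raw batch response as returned by the model (two of the
# ten rows shown above).
raw = """
[
  {"id": "ytc_UgzTzDMZyxkOuHWO2kl4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzgnIDuJyhDvH8v18Z4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "regulate", "emotion": "indifference"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the comment inspected above and read off its four dimensions.
row = codes["ytc_UgzTzDMZyxkOuHWO2kl4AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# user deontological liability outrage
```

This matches the Coding Result table above: the row for `ytc_UgzTzDMZ…` carries the same four values the inspector displays.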