Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Great interview. A lot of enlightening topics covered, but I'd have liked to kno…" (ytc_Ugynn5YXg…)
- "I support this movement, hate to see Ai soulless slop being made and actually, n…" (ytc_UgztuwMyk…)
- "If most jobs are replaced, who are gonna pay for the services ? If ordnary peopl…" (ytc_UgyZ46gM-…)
- "It is an issue. But that is where copyright laws and other protections - hopeful…" (ytr_Ugwb7gQ-2…)
- "Even Amazon is getting rid of thousands of workers and replacing them with AI. L…" (ytc_UgzY-pEcS…)
- "It seems like we're reaching the [Uncanny Valley](https://en.wikipedia.org/wiki/…" (rdc_j8y44v2)
- "Looking back at this ai is such a FOMO thing now, every dev,corp,people want to …" (ytc_Ugx5jCueK…)
- "Until they have AI that can walk across a rough construction site and inspect re…" (ytc_UgyilRKEd…)
Comment
> Why self driving cars are bad.
> 1. Hackers. We all know hackers will do it if it grants them an opportunity at money. By imprisoning someone in a car that they could make crash at any moment, people will give away their bank info to save their life.
> 2. There will be thousands of bugs in the code (that car showed one of them) and no you can not get rid of all the bugs, they're always be some there. A saying programmers have can help explain it "99 bugs in the code, 99 bugs in the code, take one down, patch it up, 117 bugs in the code".
youtube
AI Harm Incident
2018-04-02T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
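Each coded comment carries the same four dimensions shown in the table. A minimal validation sketch, assuming the category sets inferred from the codes visible on this page (the full codebook may define additional values):

```python
# Allowed values per dimension, inferred from the codes visible in this batch.
# Assumption: the real codebook may include categories not seen here.
CODEBOOK = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "sadness", "resignation", "mixed"},
}

def validate(record: dict) -> list:
    """Return the dimensions whose coded value is not in the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if record.get(dim) not in allowed]

# The record from the table above passes cleanly.
record = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "ban", "emotion": "fear"}
print(validate(record))  # []
```

Running this check before storing a batch catches any response where the model drifted outside the expected categories.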
Raw LLM Response
```json
[
{"id":"ytc_Ugxu48WTenHCgOgQdwR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwLqzF1X1NEeHPH_C54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzv81s398PkyR_p_Wh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyMJgjlj3ONO3x1qBp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"},
{"id":"ytc_Ugy3Jvbtv1IIMc_Gb4h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwEaj_OvSl6JkcqRht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwmsL2y5MiyftdfEfh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwxaGE7TUjtUpSAjVd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgypSz2b-trrwPj1Wt54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwpojYDdejAB2qyjP54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
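The raw response is a JSON array of per-comment codes, so looking up a coded comment by its ID is a parse-and-index step. A minimal sketch (assuming this exact schema, using one record copied from the batch above):

```python
import json

# One record from the raw LLM response above, kept short for illustration.
raw_response = """[
{"id":"ytc_Ugy3Jvbtv1IIMc_Gb4h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

# Parse the model output and index records by comment ID for O(1) lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

code = by_id["ytc_Ugy3Jvbtv1IIMc_Gb4h4AaABAg"]
print(code["policy"], code["emotion"])  # ban fear
```

The same index supports the "look up by comment ID" workflow: a missing ID raises `KeyError`, which flags comments the model skipped in its response.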