Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "The AI we have now is trained on data but the core is word association & predict…" (ytc_UgxSrRL4C…)
- "AI does not lack accountability.. The accountability rested always with those wh…" (ytc_UgwJifqXI…)
- "The whistleblower from Open AI was unalived for exposing the stealing of peoples…" (ytc_UgxaN-4ci…)
- "Fully autonomous is not going to happen soon & I'm buggered if I would ever put …" (ytc_UgxM0Emi5…)
- "Why not only give planning to Data Centers if they have a minimum 50% or more r…" (ytc_UgzQO5DeM…)
- "@bjrnbulhoff499Tesla, waymo, zoox are all self driving vehicles they won't need…" (ytr_UgwrqW3-G…)
- "We can fix this by taxing AI providers to the hilt so that the unemployed can be…" (ytc_Ugzwk2Y9_…)
- "Of course Open AI would silence criticism by any means. That's just in the natur…" (ytc_Ugw5KK0oj…)
Comment
All these people on here commenting about how people get hit by cars driven by humans more than autonomous cars. First of all, we know that. There are more human driven cars than autonomous cars. That's a bullshit argument, though. That's like saying more people get killed by bullets than rail gun projectiles...
What this is about is liability under the law - if an autonomous car operating in full Level 5 mode hits and kills a human, who is responsible? The car? The car's owner? The manufacturer of the car? The manufacturer of the sensors? The algorithm programmer? Who?
Source: youtube | 2018-03-26T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx_cYfqNU3aDhdPlt14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwHuKYnGxVVo6rFHhx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9rO2wNOT-ikBr4B14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzojAAVa671GkDhr9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyKQyZyilGfh0x72Gt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzgX2RvjccjA-YPstx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxJ9c89mNiilIm2Ybp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxEzCpiwo6VeTlj5jd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy63pl_ZifMbW-O96h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzyGRU8kjp5e7zZ_SJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
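The raw response above is a JSON array mapping each comment ID to one value per coding dimension. A minimal sketch of how such a response could be parsed and validated in Python follows; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the sample above, but the allowed-value sets are only inferred from the values visible here, and the full codebook may contain codes not shown.

```python
import json

# Vocabularies inferred from the sample response above; the real
# codebook may include additional values not visible in this sample.
CODEBOOK = {
    "responsibility": {"company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"liability", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "unclear"},
}


def parse_raw_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response into {comment_id: codes}.

    Any missing dimension or out-of-vocabulary value is coerced to
    "unclear" rather than raising, so one malformed row cannot
    invalidate the whole batch.
    """
    coded = {}
    for row in json.loads(raw):
        coded[row["id"]] = {
            dim: row.get(dim) if row.get(dim) in allowed else "unclear"
            for dim, allowed in CODEBOOK.items()
        }
    return coded


# Hypothetical one-row response in the same shape as the sample above.
raw = (
    '[{"id":"ytc_example","responsibility":"company",'
    '"reasoning":"contractualist","policy":"liability","emotion":"outrage"}]'
)
print(parse_raw_response(raw)["ytc_example"]["policy"])  # liability
```

Coercing unknown values to "unclear" matches the convention already used by the coder itself, which emits "unclear" when a dimension cannot be determined.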