Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Anybody who thinks a computer programme is sentient needs to understand the Chin…
ytc_UgxTfnUuc…
Yeah, once AI will be able to take all jobs and provide security on it's own, th…
ytc_UgzbWnFkU…
It looks so obviously like AI, or if you sucked the soul out of a Pixar movie😭…
ytc_UgzFP2l1V…
your job will be among the first to fall to AI.
sorry to say, but no profession …
ytr_Ugz727sb3…
Fb meta help assures you that youre being helped by an actual human but you are …
ytc_UgwKKIYJ6…
Can we stop calling people who just generate and upload AI generated images "art…
ytc_UgzHC54Gu…
I don't understand the legalities of this. There are things you must check on ro…
ytc_UgzhyYsiH…
What you call “AI” is a broad spectrum of technologies—systems built on differen…
ytc_UgxsmY4-I…
Comment
Mistakes in tech can theoretically be fixed, but that doesn’t mean it’s easy to do so. Driverless technology is one of the hardest problems that scientists and engineers have ever tried to tackle, and it’s taken more than three decades to get this far. It will probably take at least another one to two decades before widespread deployment is truly feasible. Not to mention some truly large wrongful death liability lawsuits before all of the safety issues are truly solved. The biggest problems probably won’t arrive until there are more than a handful of self-driving vehicles on the road. (Imagine the kinds of massive accidents that can result when dozens of nearby vehicles all have the exact same engineering flaw in their control systems.)
Don’t fall for the hype: human drivers for all their flaws are still much safer than current driverless vehicles. Those engineers still have MILLIONS of edge cases to solve due to the complexity of the real world.
youtube
2025-06-14T18:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugh3FzoJfzQ_AngCoAEC.8FscLZYdIxJ8FveOC3HoGR","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgjJoH8IWktnQHgCoAEC.8FnjUL6dBut8Frn_W6sr9d","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgjS9qdGSMCpJXgCoAEC.8Fml_Qkf09K8FmoH9gIC3e","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxUk_n_t9NOdtvNmp14AaABAg.ASoh2Wy9fwPATeKeA88838","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzEZKiaNlygPhAUhdt4AaABAg.AL7JQ1B81BYATHsaoqs7b0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwMUsTZFlwbfFAuvP54AaABAg.AJcmpwxLdIFAOwx-8gprNY","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_UgwPpfyV3WeYUoFX1Jd4AaABAg.AJU9m920xQSAJstodmPgG2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgwPpfyV3WeYUoFX1Jd4AaABAg.AJU9m920xQSAP7JUj9lc8P","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwPpfyV3WeYUoFX1d4AaABAg.AJU9m920xQSAP7KMRjckxh","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugx4VXMB2aN3VFUxROZ4AaABAg.AJEM5IQckWQAJMZ3nYVpoK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
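The raw response above is a JSON array with one record per comment, each carrying the four coded dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such output could be validated before loading it into the coding table — assuming the allowed values are exactly those seen in these samples; the actual codebook may define more:

```python
import json

# Allowed values per dimension, inferred from the sample records above
# (an assumption, not an official codebook).
ALLOWED = {
    "responsibility": {"none", "company", "user", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"resignation", "outrage", "approval", "indifference", "fear"},
}

def validate(records):
    """Return (comment id, dimension, bad value) triples for every
    coded value that falls outside the allowed set."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

# Hypothetical example record, shaped like the raw response above.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate(json.loads(raw)))  # → [] when every value is in the codebook
```

A check like this catches malformed LLM output (misspelled labels, values outside the codebook) before it silently enters the coded dataset.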