Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below (previews and IDs are truncated as shown in the tool):

- ytc_Ugw4IN1tV… — "A LOT of this drive is built upon GREED! Not actually trying to make the world a…"
- ytc_UgxMDNAJY… — "In my head I was thinking / 'That's just ai' / But then I was like / 'That's ai that …"
- ytr_Ugwu7Bamc… — "Wow, those clips showing the dystopian future that would be forced upon cities b…"
- ytc_Ugx8DFeOw… — "Sentient AI is or are big brother psychopaths behind the tech curtain. They enj…"
- ytr_Ugxl_C2Ou… — "@Pun116 No way. AI is a product of humanity and thus it will just suck as much a…"
- ytc_UgzZ7jiKp… — "The! The? There are so many problems with self driving and you use the word 'The…"
- ytc_UgzyrD4Rr… — "Hes right. Its EASY for AI to automate creative mental work but hard to automate…"
- ytc_UgxTlnit2… — "'It will expand the gap between rich and poor.' Now that's a danger of AI that n…"
Comment
Self driving cars are inherently dumb. Anchor your design to something more reliable, like millions of years of hand eye coordination. Technology should retain an interface that adapts to our biomechanics, rather than inventing a new foundation to build off of, and marketing it to the entire world after, relatively, barely any testing at all.
(Comparing self driving tech, to manual driving biotech) one has 30 years of credible testing, the other is millions. Base your tech designs in the real world and what is known to work, it will better stand the test of time
Source: youtube · 2025-12-05T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxisDonWR5EfGguGal4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxPNzGFuvvq_pSZMFV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyWoDbKU3UvaLuDIJ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyuUTz-3FPvAKlJjoV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzNglMXN-55zUB8RMV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwpaD_hseQeY5pxWVt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzMzxT-JCgylWRc0tV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwBVEDGJbhNRqwrLmF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzY0a11ebebLyCsLmR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzxUQoKQtf5O8AjhU54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
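A raw response like the one above can be turned back into a per-comment lookup with a small validation pass. The sketch below is illustrative only: the allowed category values are inferred from the records visible in this response, and the real codebook may define more (the `SCHEMA` dict and `parse_response` helper are assumptions, not part of the tool).

```python
import json

# Allowed values per coding dimension, inferred from the response shown
# above -- an ASSUMPTION; the full codebook may include other categories.
SCHEMA = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, dropping any
    record whose values fall outside the inferred schema."""
    coded = {}
    for record in json.loads(raw):
        cid = record.pop("id")  # e.g. "ytc_UgzNglMXN-55zUB8RMV4AaABAg"
        if all(record.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[cid] = record
    return coded
```

Looking up a coded comment is then a plain dict access, e.g. `coded["ytc_UgzNglMXN-55zUB8RMV4AaABAg"]["policy"]` would return `"liability"` for the record shown in the table above.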