Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
So you're saying AI is just making shit up? Hmmm, maybe it's already more human …
ytc_UgygNz6Vw…
since they want to replace jobs with ai and robots, maybe all these companies sh…
ytc_UgzotLQkO…
We are creating our own demise with this new global AI race going on. Everything…
ytc_Ugysg9cmB…
I think that if people really wanna use gen Ai in their art, the least you can d…
ytc_UgwTetteQ…
What will happen when an artist who has a disability making it so they cannot ph…
ytc_UgzBjFOCz…
Anyone who believes that AI can code is on crack. I use it daily to help code an…
ytc_Ugyq8ApFE…
ITS NOT A DRAWING ITS NOT ART ART HAS THOUGHT BEHIND IT AI ONLY HAS A PROMPT…
ytc_Ugx0ku1oV…
Man said there is no cure for Cancer ♋ but an AI will be able to find cure for …
ytc_UgylK5bkA…
Comment
It doesn't have to be perfect, it only needs to be equal or better than humans.
Crash Rates per Mile
Tesla Autopilot/FSD: Tesla reports approximately one crash every 6.36 million miles when drivers are using its Autopilot technology.
Human Drivers (National Average): By comparison, the U.S. national average for all drivers is approximately one crash every 700,000 miles.
The Ratio: Based on these figures, Tesla claims its technology is roughly nine times safer than the average human driver.
youtube
2026-01-02T22:5…
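The crash-rate comparison in the comment above can be sanity-checked with a quick calculation. This is only a check of the arithmetic as quoted; the 6.36 million and 700,000 figures come from the comment itself, and Tesla's Autopilot mileage is not controlled for road type or driver mix:

```python
# Figures as quoted in the comment (not independently verified here)
autopilot_miles_per_crash = 6_360_000   # Tesla-reported Autopilot/FSD
human_miles_per_crash = 700_000         # U.S. national average

ratio = autopilot_miles_per_crash / human_miles_per_crash
print(f"Autopilot miles per crash is {ratio:.1f}x the human average")
```

The ratio works out to about 9.1, so "roughly nine times" is at least internally consistent with the two numbers the comment cites.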
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw9awRnFcvfy2BCg414AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzINYJcADeGTLMyG5l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxyqSeZe55E-Oscs4p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxTAzU8hRCU7-vqVLN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHqJGjjTpBivIYlrh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgySQ9pH9NAnXo2PS2x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxeb1pQB7dQXC-AjON4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZyNaxATWHY9hzkIB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzHDszg45sLWQeiZS54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwyT1RUnMYqk1dhKUt4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```
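For downstream analysis, a raw response like the one above can be parsed and checked against the coding scheme before tallying. A minimal sketch; the allowed value sets below are inferred from the codes visible on this page, not from the actual codebook, so the real pipeline may accept other values:

```python
import json
from collections import Counter

# Allowed values inferred from this page's examples; the real codebook may differ.
SCHEMA = {
    "responsibility": {"none", "company", "user", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "mixed", "indifference", "resignation"},
}

def validate(raw: str) -> Counter:
    """Parse a raw LLM response and tally emotion codes, raising on unknown values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
    return Counter(row["emotion"] for row in rows)

# Hypothetical one-row response, in the same shape as the dump above
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]'
print(validate(raw))  # Counter({'approval': 1})
```

Validating before counting catches the common failure mode where the model invents a label outside the codebook, which would otherwise silently pollute the tallies.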