Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
"Brave New Future", I see what he did there. Very insightful, but what would hap…
ytc_UgiJaQs6F…
Hope he's wrong about a ton of things - the threat of AI, materialism, conscious…
ytc_UgyrtpoDr…
@motivatedchannel6255 technically, it's your problem now; please don't rely too mu…
ytr_UgyEadUMy…
One: I don't *want* an ASI controlled by humans. That sounds like a recipe for d…
rdc_kqsym35
Wow really a s***** stupid greedy company uses AI Wow how surprising, you know t…
ytc_UgyHTwS71…
"Me: sounds good"
Yes that the thing of it. I always say:
Progressivism: replaci…
ytc_Ugx9uu9Xo…
AI requires no skills therefore the people using it to “create” so called art ar…
ytc_Ugxfvb9Hn…
About 1:24:00 Melanie Mitchell says something like
"The lawyer with AI couldn't…
ytc_UgxkUDV7V…
Comment
How many errors did AI made? How many did human made? Wrongful convictions (a downstream outcome of some arrests): The National Registry of Exonerations has documented over 3,200–3,696 known exonerations since 1989 (as of recent updates around 2024–2025). These represent people who were convicted and later proven innocent, often after years in prison. Many began with an arrest based on human error (e.g., eyewitness mistakes in ~60–69% of DNA exoneration cases studied by the Innocence Project, or official misconduct including police errors in a large share). 
youtube
2026-04-10T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwxDbUcqIutktCeV154AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxMZnIySPUun_xHETZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZ9N5oWBKTuIPZ4vx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxBa_0ufJW-AOL7XBt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwtFYLkfzURHVtsHgZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzkh0muftx08gd-HlN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyjSvF6SGon4gEtRO94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxw05q7g9xXGSm_7dR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyrAVPGswJ7zsTiW2V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzXwf4zH2efB9IB88B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
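A raw response in this shape — a JSON array of coding records, one object per comment ID — can be indexed for lookup by comment ID with a few lines of code. A minimal sketch, assuming the field names shown in the response above (the `parse_codings` helper and the two-record sample string are hypothetical, included only for illustration):

```python
import json

# Hypothetical sample in the same shape as the raw LLM response above:
# a JSON array of coding records keyed by comment ID.
raw = '''[
  {"id":"ytc_UgwxDbUcqIutktCeV154AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwtFYLkfzURHVtsHgZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]'''

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse a raw response and index records by comment ID,
    keeping only the expected coding dimensions."""
    records = json.loads(text)
    return {r["id"]: {d: r.get(d) for d in DIMENSIONS} for r in records}

codings = parse_codings(raw)
print(codings["ytc_UgwtFYLkfzURHVtsHgZ4AaABAg"]["responsibility"])  # distributed
```

Indexing by ID matches how the page itself supports "Look up by comment ID": each record's `id` is the stable key linking the coded dimensions back to the original comment.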