Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzxYrqVv… — "If the WEF is involved with the job loss through AI travelling long distances wi…"
- ytc_UgwDkJdpd… — "May be AI is good for \"some\" automated backend jobs. It's definitely a failure w…"
- ytc_Ugw6_652f… — "I have my own personal ai and have solved a lot of the issues BUT it’s not just …"
- ytc_UgzNRldY9… — "I'm all for freedom of speech and less regulation, but look at how much bias exi…"
- ytc_Ugz6GGOaN… — "You're arguing that today's AI cannot create that painting. I started using AI i…"
- ytc_Ugz7-GNzP… — "Dear Dr. Hinton, I agree with you about the degree of the danger. I disagree abo…"
- ytc_Ugxf_KtrT… — "Imagine when IA is set loose on the internet? See the movie: \"I robot\" with Will…"
- rdc_o62u2fn — "Your kids have agreed to sell your name and likeness for $100 Facebucks that all…"
Comment

> Let's put it this way. According to the National Highway Institute, EACH year in America, there's 40,000 fatal crashes involving hands on driving, since Tesla introduced their FSD in 2014, in that span of 10 years, there have been 481 deaths involving FSD. Everytime there's an accident or death involving autonomous driving, everyone is freaking out, but the 40,000 deaths involving idiots behind the wheels? just another day.

Source: youtube · Posted: 2024-06-22T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwJSKK0dh5Z6vkhDz94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzGe0vxkesuBOYo3E14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMAuNQrrHX7nrE8V54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyuIRDLiJc6o_gQLUp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLDuxscgu03-AIju14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwUAvfi5w5N3gAn8Wd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw6E1T_ICRUTF5Mgsd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgylGet2pIjpHSJHETZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyDzERpawr2oOVxiCp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugze2yR8KT2nehAG4xF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
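A raw response like the one above can be parsed and validated before it is stored as coding results. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the codes visible in this sample (the full codebook may contain more), and the function name `parse_coding_response` is an assumption, not part of the actual pipeline.

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may define additional values.
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "fear"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records


# Example: one well-formed record passes validation.
raw = (
    '[{"id":"ytc_example","responsibility":"user",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # indifference
```

Validating at parse time means a model that drifts off the codebook (for example, inventing a new emotion label) fails loudly instead of silently polluting the coded dataset.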