Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Laid off as a fresher from my previous company due to AI don't listen to people …" (ytc_UgxdC9q8j…)
- "artist can cry all about ai being bad but it is not gonna do nothing. And if a p…" (ytc_Ugz1UGLra…)
- "Art is an expression whether it's intentional or not. Every stroke, every detail…" (ytc_UgxiHEMM9…)
- "It could be finding things in the linguistic patterns of communication that we a…" (ytr_Ugy3SN-uY…)
- "Using technology and using ai are different, very different, technology has TOOL…" (ytc_UgyvXRHh5…)
- "I heard that, and what about your AI Lover, you acted like you were about to cum…" (ytc_UgxUesZpz…)
- "I can't believe no American saw this coming. Even Walmart has automated its work…" (ytc_UgynX_UE9…)
- "Ai models are trained on stolen art material to begin with. That's the main issu…" (ytc_UgxNDrqlj…)
Comment (youtube, 2023-07-30T08:2…)

Okay Currently, there are around 35 million self-driving cars with about 6,000 fatalities per year. In comparison, there are about 280 million non-self-driving vehicles with approximately 35,000 to 40,000 fatalities each year. The ratio of standard cars to self-driving cars is roughly 6:1, indicating that if self-driving cars were as common as standard ones, fatalities could increase to around 48,000 per year.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwdQ8rmz-7exCoCrMF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy6WptzMVjBxvOoSDt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgySybH3AOEPuK0i6D14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyGe1Nurkd6qbd-XY94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyoPh3giNJtLTihOcp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxLf4WHBSIqYOO_ouJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6yVCwj2WM_hqJSK94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy2-zH4UsWxsbKO9Tl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgztLBf8caqzf0Zp9UN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzMa-OKbQooJax1eVh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]