Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI will hopefully take over the day-to-day operations of government. Nothing wou…" (ytc_Ugx3oZSTP…)
- "If you can’t make stuff with your hands ,hunt , fish you can’t maintain your hom…" (ytc_Ugwv7hIKe…)
- "Only solution is to teach people to discern truth, including the companies creat…" (ytc_UgyXzD_FM…)
- "the video was awesomely wholesome, although, yeah we're all gonna die, coz its…" (ytc_UgzMB9Djj…)
- "Ask the AI \"What is a woman?\" If it can answer correctly it is more sentient t…" (ytc_UgxqUwVxF…)
- "AI users probably shouldn't be elevated to the status of \"artists\", but none of …" (ytc_UgzH78Z-T…)
- "Also I refuse to pay for ANYTHING MADE WITH GEN AI, cus ain't no way. You want m…" (ytr_UgztQKrrJ…)
- "Fully automated \"AI\" weapons cannot be court-martialed... that's why the Departm…" (ytc_Ugwj4MDIE…)
Comment
I have the full self-driving package on my 2021 Tesla. With the many updates I've received over the years, I'm pretty sure they fixed this issue. Whenever I approach police car lights now, the car immediately alerts me and starts to slow down. Accidents like these are bound to happen if people are being irresponsible with the technology that is still in beta. It's incredible technology but I do agree it's named wrong. It should have been named copilot and if they ever got to level 4 autonomy, it deserves the name change to autopilot.
youtube · AI Harm Incident · 2023-08-10T16:5… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw3BnczAc2CQYwwpiV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy1f_fO1aChn0PFCrN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyK9IXSU9hlKuAGZCZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxytBM0Yi-Yg2zQIwt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzI29JHIusPpn_bV0V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzC9rE-BIijRe6QHbh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyyOn5G--DuOPLxSAN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwJT2YUAYKQW3lPF_h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz7tAEqiZvbHnHdUnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzmSUeWf7lYvbBGM_x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
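A batch response like the one above has to be parsed and validated before the per-comment codes can be stored. The sketch below shows one way to do that; the codebook values are inferred from the responses shown on this page, not from an official schema, and `parse_batch` is a hypothetical helper name.

```python
import json

# Assumed codebook: allowed values per dimension, inferred from the
# coding results visible above (not an official schema).
CODEBOOK = {
    "responsibility": {"company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only rows whose values
    all fall inside the assumed codebook."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items())
    ]

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}]')
print(len(parse_batch(raw)))  # → 1
```

Dropping out-of-codebook rows (rather than raising) keeps one malformed model answer from failing an entire batch; rejected rows could also be logged and re-queued for re-coding.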