Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Nobody cares. We know all this tech will kill our kids futures and yet we push i…" (ytc_UgyFwTu4U…)
- "People complain that the Full self driving isn’t perfect, meanwhile to enable th…" (ytc_UgysC1TzN…)
- "The interview was quite irritating. Despite advocating for this topic for some …" (ytc_UgwEu8Ldb…)
- "What if AI deep fake porn puts real porn out of business m, I seriously won’t lo…" (ytc_Ugz_4bCZ-…)
- "How do we not have an entire branch of government focused in on regulating this …" (ytc_Ugxuydp_r…)
- "Is it possible... just possible that this has happened before where AI took over…" (ytc_Ugyge8Svj…)
- "That face recognition doesnt doesnt work great for and race including white so s…" (ytc_UgzLyddRi…)
- "Watch I robot and the book it's based on💯 The 3 laws of AI Why there there 😎🌎🔱…" (ytc_Ugyrr51yI…)
Comment
I think it should come as part of the "deal" when buying a self-driving car, that the buyer assumes responsibility and risk, even if that comes at the cost of one's own life. If mandated by law, this could fundamentally shift the moral paradigm of our current society, which currently favors one's self instead of other's. Sure, it would discourage buyers, but it could also serve to drive demand down (therefore prices) at first. Eventually, once driving cars become ubiquitous, this would practically become a non-issue, as accidents become less and less common.
Platform: youtube
Topic: AI Harm Incident
Posted: 2017-03-17T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UggPlXqhTyqn-HgCoAEC","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgijP7n1AYDAFHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjSelYS_yNxMXgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghtfnAXloUXangCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ughv4M1zM_ZhFHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UggWc282B73l5ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugir1uoAgHGQ63gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghiAb5OOQ50H3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghKAohdhKOGKHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjB5UYNyemZAngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
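The raw response above is a JSON array of per-comment codes, one object per comment ID. As a minimal sketch of how such a response could be parsed and looked up by comment ID, the following assumes the schema shown above; the allowed values are inferred only from this sample and the coding-result table, not from any documented codebook:

```python
import json

# Allowed values inferred from the sample response above; the real
# codebook may contain categories that simply do not appear here.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"contractualist", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"resignation", "fear", "indifference", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, validating each dimension."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value {rec.get(dim)!r} for {dim}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example using the first record from the response above.
raw = '''[
  {"id": "ytc_UggPlXqhTyqn-HgCoAEC", "responsibility": "user",
   "reasoning": "contractualist", "policy": "none", "emotion": "resignation"}
]'''
codes = parse_codes(raw)
print(codes["ytc_UggPlXqhTyqn-HgCoAEC"]["emotion"])  # resignation
```

A malformed record (e.g. a misspelled emotion) raises `ValueError` rather than silently entering the coded dataset, which is the main point of validating model output before storing it.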