Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples:

- "Guessing age is stupid af. Okay lets say you're an actual adult and enjoy videos…" (`rdc_n7e3tsf`)
- "@pointatobagroltd actually i am. Theres some good evidence, even from Elon's ow…" (`ytr_UgwCTEa6T…`)
- "*I CANT PLAY A SINGLE MUSICAL INSTRUMENT* and I am the most tone deaf person on …" (`ytc_Ugw3neo8r…`)
- "I like making music. It takes time and is hard to learn how to do so. I'm still …" (`ytc_UgzsB_Pyk…`)
- "What if you wrote a draft on a book, then use ai to help you edit. Like change t…" (`ytc_UgxvDCdZf…`)
- "I don't use AI for the fact that, because everyone else is, I'll be one of just …" (`ytc_Ugy8PyiAY…`)
- "As always, these SOTA chat bots reveal, in the most amazing way, how much knowle…" (`ytc_Ugz6nXCqU…`)
- "The big problem is actually content theft masquerading as music “creation”. AI h…" (`ytr_UgxPUs_cj…`)
Comment
Knowing the right answer to these dilemmas involves advanced programming and knowledge, something that autonomous cars won't have for a long time. The only goal here would be to minimize damage as much as possible. For example, say a deer ran in front of a car. If the car knew it could hit the deer and take the least collateral damage that way, then it should do it. More importantly, if there is no occupant the goal should be to save human lives as much as possible. Additionally, if there are more occupants or children in one vehicle compared to another, this could be another factor helping to make the best decision.
youtube · AI Harm Incident · 2019-12-05T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyY8hea89mmyVR35pN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwY7v7cjww7oOQlHrt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyB0_b6TK8PT7vISdx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyufpnIhu4nx08cBkN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgydFUcos-c2u9kplcx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyP5A_agca7B0tqbB54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwfznTscFqr9M1BaA14AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzjIPWV9pMw8zPPp914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4-0RV-1R8m98VkZ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyD53mqEZwoHOxpHE94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```