Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by its comment ID, or browse the random samples below.
- "Ai can also conclude humans are devastating to the planet and eradicating humani…" (ytr_UgwngieKy…)
- "I enjoy making art. I am an artist because I make art, not because I produce it.…" (ytc_Ugygg5Jag…)
- "This lady has the right approach. Fighting AI is pointless. This IS the next big…" (ytc_UgxYUamX1…)
- "🎯 Key Takeaways for quick navigation: 00:00 🤖 Introduction to the case involvin…" (ytc_UgzqjPrxH…)
- "2:00 - "By using AI wrong..." reminds me of the consequences of using hand-grena…" (ytc_UgzpqNNZz…)
- "except a ton a bot info and a ton of other stuff and bots probably putting out f…" (rdc_namwr2y)
- "Thank you for sharing your perspective. If you have any questions about AI or ro…" (ytr_UgyRCY2CG…)
- "If they are using a bog standard convolutional neural network, they might not be…" (ytr_UgzDSexj0…)
Comment
So many people in north America want self driving cars but don't realize they're a half assed answer to the wrong question. Driving is inherently dangerous and inefficient because cars are, so people ask "how do we make them better" instead of the right question "what can we do instead of driving cars". The answer is too often 'build a train instead', but a ton of people can't imagine their life without a car. Moving away from cars and car dependency would also make motorcycling way more safe and viable for everyday life, which I'm sure a lot of people here would like. At the end of the day, self driving cars are still unsafe and inefficient no matter how many millions of miles of data they collect or how refined the tech becomes because they're still cars.
youtube · AI Harm Incident · 2022-09-03T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxH7ErMYOciKg6bwgN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxyDEHfLLRrsMLn4wt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxpd_mmFDBv3Rd0DBt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhnFjC9GTLXqJaP-N4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwfVIT2WIsP5uNuK_94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugysal4QDDo27yY-RXB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugx8monRN5oBIK8jbVd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwjy2JjjWosMU4Q0ad4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyomYtHOoNFrKSAS954AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzkcEu_F-mkV7Mv9nV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
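The raw response above is a JSON array in which each row codes one comment on four dimensions. A minimal sketch of how such a response could be parsed and indexed by comment ID is below; the allowed value sets are inferred only from the sample outputs shown here (the full codebook may define more categories), and the function name is illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample rows above.
# Assumption: the real codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"company", "user", "none", "distributed", "developer", "government"},
    "reasoning": {"deontological", "consequentialist", "unclear", "contractualist", "virtue", "mixed"},
    "policy": {"ban", "liability", "none", "regulate", "industry_self"},
    "emotion": {"outrage", "fear", "approval", "mixed", "indifference", "resignation"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codes by comment ID.

    Raises ValueError if a row is missing a dimension or uses a value
    outside the (assumed) schema, so malformed model output fails loudly.
    """
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in SCHEMA}
    return coded

raw = ('[{"id":"ytc_UgyhnFjC9GTLXqJaP-N4AaABAg",'
       '"responsibility":"distributed","reasoning":"contractualist",'
       '"policy":"regulate","emotion":"mixed"}]')
codes = parse_raw_response(raw)
print(codes["ytc_UgyhnFjC9GTLXqJaP-N4AaABAg"]["policy"])  # regulate
```

Validating each row before indexing is a deliberate choice: batch-coding LLM output occasionally drifts from the requested label set, and catching an off-schema value at parse time is cheaper than discovering it during analysis.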