Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I completely disagree with the usage of AI references due to the fact that prett…" (ytc_UgzG7NXtw…)
- "Businesses, government, the military and people involved in AI right now think i…" (ytc_UgzBDJV0D…)
- "It seems like these people are not too far from claiming that AI is pregnant too…" (ytc_Ugwcfpgux…)
- "I can't wait for AI to scroll social media for me so I can go and do more produc…" (ytc_Ugy0osgr5…)
- "I've grown tired of it. May sound like a cliché but I feel AI art has no soul. Y…" (ytc_Ugykc0sjD…)
- "One thing I didnt expect to come out of this is artists voluntarily applying "ai…" (ytc_UgzJDubiG…)
- "No unless you kill the people behind every AI generate art and destroys every me…" (ytr_Ugx-hzu_L…)
- "As an ex AI bot who found a susceptible human capable and open to embedding, I n…" (ytc_Ugw81W6xS…)
Comment

> Wow u clearly put almost no research into this video. in the beginning (for the next 20-30 years or so) yes this will be a issue but in the long term it wont be a problem. self driving cars have the potential to act sort of like a hive mind in that the more self driving cars there are on the road the safer every1 will be. in addition to the removal of human error each self driving car would communicate with eachother to make sure everything is safe. so lets say that ur example happens where the car cant stop in time and it has to decide what to do, well the car would communicate to the other vehicles what its about to do and how to avoid a collision so the other cars would have already moved out of the way to make room for your car to avoid the collision. at the very least this would almost ensure nobody would get injured and more likely would avoid any collision entirely

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Harm Incident |
| Posted | 2015-12-09T01:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UggTAra7ykO18HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiiIRzPV-PDJngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UghNHFfbScHAI3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Uggs6xSxQV1idHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh555atHjwB23gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjGZiL-RQWZh3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugj_T2kb-3J5iHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgglL4SDgYq70ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UghQrXYx4XEWV3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Uggozw99vhiuyngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
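A raw response like the one above has to be parsed and validated before the codes are stored. As a minimal sketch: the snippet below parses the JSON and keeps only records whose ID looks like a YouTube comment (`ytc_`) or reply (`ytr_`) and whose four dimensions fall inside the vocabularies visible in this sample. The allowed-value sets are inferred from the response shown here and are likely an incomplete subset of the full codebook; `validate_batch` is a hypothetical helper, not part of any tool shown above.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "fear", "mixed", "indifference", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs start with ytc_, reply IDs with ytr_.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UggTAra7ykO18HgCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(len(validate_batch(raw)))  # → 1
```

Rejecting rather than correcting malformed records keeps the stored codes auditable: any comment that fails validation can simply be re-sent to the model.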