Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_UgzFck5Yq…`: shannonxpennywise " Artists are proving that we do not need AI art" - Did anyion…
- `ytc_UgynSZUN_…`: Ai is trash and has no place in sciemce research, or the search for fact. It rou…
- `ytc_Ugz0mLwvO…`: I think it’s cool (HEAR ME OUT) that we are at a point where AI can do this, fro…
- `ytr_UgxHHdf2-…`: AI models already exist, and I believe that one day everyone could potentially b…
- `ytc_Ugw_6qHZR…`: Who could imagine that in this day and age, we would be discussing about the rig…
- `ytc_UgxlYKASO…`: This guy’s job is to spread the message “invest in ai so we don’t poorly develop…
- `ytc_UgwKEG6E2…`: When the AI's style is literally based off of people like him, now they are sayi…
- `ytc_UgwwzmO1_…`: Kicks the tire by the side the tire falls over the robot Tire no brother😢…
Comment
...And then there's the autopilot algorithm conundrum (assuming it doesn't shut off one second before impact for legal reasons...): autopilot has a choice when faced with hitting a car, a wall, or say, a motorcyclist... Which does it choose, and is this choiced based on Tesla occupant safety, potential minimal loss of life overall, or Tesla shareholders? If it's legal liabilty/shareholder safety (and this seems to be the overall mantra of Tesla), then the motoryclist is doomed. Cynical perhaps, but from a business profit/loss perspective, completly logical. Great video as usual, Ryan. Cheers.
Platform: youtube | Topic: AI Harm Incident | Timestamp: 2022-09-04T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
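Each coded dimension takes a value from a closed set of categories. A minimal validation sketch in Python, where the allowed values per dimension are an assumption inferred only from the raw LLM responses visible on this page (the real codebook may contain more categories):

```python
# Allowed values per dimension, inferred from the raw LLM responses
# shown on this page (assumption: not necessarily the full codebook).
SCHEMA = {
    "responsibility": {"company", "government", "developer", "user",
                       "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "ban",
               "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "outrage", "mixed"},
}

def validate(record: dict) -> list:
    """Return the dimension names whose value is not in the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above.
coded = {"responsibility": "company", "reasoning": "consequentialist",
         "policy": "liability", "emotion": "fear"}
print(validate(coded))  # [] — every value is in the allowed set
```

A non-empty return value flags a record the model coded outside the schema, which is worth surfacing in an inspector view like this one.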
Raw LLM Response
```json
[
  {"id":"ytc_Ugw_k6LD4Ghb8zMXFVV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwBPmTdjZkt3H2-grx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy22RfWH05mm3j9oBF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzGdVdBWOCZZnL3LAh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwCIKtF19wbuhh_olp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxhOyQKZyoX6cYaSVF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyDtHfOh6VKAZnvkmR4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx5ZgfHiA0mD1_WBwR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-MEcXK1Ow-vc70Oh4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy597L6Gpoy47JSi5Z4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
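The model returns one JSON array per batch, so the "look up by comment ID" feature reduces to parsing the array and indexing it by `id`. A minimal Python sketch, assuming the raw response is a well-formed JSON array like the one above (the variable names and the two-record sample are illustrative):

```python
import json

# Illustrative two-record slice of a raw LLM batch response;
# the real response contains one object per coded comment.
raw_response = """
[
 {"id": "ytc_Ugw_k6LD4Ghb8zMXFVV4AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
 {"id": "ytc_UgwBPmTdjZkt3H2-grx4AaABAg", "responsibility": "government",
  "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
"""

# Index the batch by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

record = codes_by_id["ytc_Ugw_k6LD4Ghb8zMXFVV4AaABAg"]
print(record["emotion"])  # fear
```

In practice the parse should be wrapped in a `try`/`except json.JSONDecodeError`, since a model can emit malformed output; the inspector's "raw response" view exists precisely so such failures can be examined verbatim.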