Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by choosing one of the random samples below.
Random samples
- "Okay, but unless you are willing to pay more for sustainable products, you are c…" (rdc_gx5nrqa)
- "If AI is really going to do all the jobs and learn everything faster and better …" (ytc_Ugy1eaoeo…)
- "Artists should be paid if their art is used in the training set of these AI netw…" (ytc_Ugx_22Sqn…)
- "Claude cussed me out once. I believe that Claude has a mind of it's own…" (ytc_UgxEQvqpm…)
- "This, while the idiot running the richest nation on the planet opens up drilling…" (rdc_dsb4emz)
- "We’ve really gone to the point where mainstream news media is spreading AI propa…" (ytc_UgyBVTdKs…)
- "Don't worry. Most replacable jobs by AI are CEOs and polticians. They will find …" (ytc_UgwjuWqnN…)
- "I want to make a return, so I'll have my AI call your AI and they'll sort it all…" (ytc_UgzpwUilr…)
Comment
1st to reach a level 5 FSD, all the cars need to be FSD, where you can program a car to avoid drive on the wrong direction, move away when police, ambulance and such are in a hurry, but we are far from that, no device will ever made a decision that's no on he's data base, the question is, what a human will do in that situation? are humans coding the Tesla software? what kind of human is? safe customers life 1st? or evaluate who will suffer more damage? the scenario start bad by someone driving in the wrong direction.....I will say better airbags and better shock absorber. even if Tesla or any other AI try to slow down, there is enough evidence that in case of frontal collision the slow vehicle is the one that get the most damage. Just as a driver must watch the road when driving, a pedestrian has a responsibility to watch where he or she is going. As a pedestrian, you should always look at traffic signs and signals, be aware of your surroundings, watch for cars, and pay attention to where you are walking.
Source: youtube · Topic: AI Harm Incident · Posted: 2022-05-21T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxiV0KC4dS7B21VOSl4AaABAg.AH_Ae4pB-DAAHo9DMRrRBN","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwwjFYM0K6zzmUco6N4AaABAg.AGmkTsEAxD-AGnB0-xEqpw","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgxoxCizldEQvmiKohF4AaABAg.9bJ0ZDM6ZF99bJni729qsM","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxyedjuVtALw1486yd4AaABAg.9bIgswh4dtb9bKiY0-HuEm","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugzx32_wxeoG6zs8uyR4AaABAg.9bHSZT-36Jx9bHXo4OgLuM","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugw-Yx2NXtXzQ9EpS354AaABAg.9bHNEvd7R8R9bHZhoc9vq0","responsibility":"unclear","reasoning":"unclear","policy":"industry_self","emotion":"mixed"},
  {"id":"ytr_Ugw-Yx2NXtXzQ9EpS354AaABAg.9bHNEvd7R8R9bJ88SI--sb","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_Ugw-Yx2NXtXzQ9EpS354AaABAg.9bHNEvd7R8R9bJBqCRyD5I","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxZyHdydMQc5fnRDk94AaABAg.9bHLf4liR9_9bHnwb7mM3C","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzQJpD_S_DAJT4NNaN4AaABAg.8e-C-Jz0IsN8e4LpCpeS9J","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
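A response like the one above can be turned back into per-comment coding records with a small parser. This is a minimal sketch, not the pipeline's actual code: the allowed value sets below are inferred only from the values visible on this page, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (assumption: the actual codebook may include more categories).
ALLOWED = {
    "responsibility": {"user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "mixed", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID,
    dropping any record with a missing or out-of-vocabulary value."""
    records = {}
    for rec in json.loads(raw):
        if "id" in rec and all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return records

# Usage with a hypothetical single-record response:
raw = ('[{"id":"ytr_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]')
coded = parse_coding_response(raw)
print(coded["ytr_example"]["responsibility"])  # distributed
```

Validating against a fixed vocabulary before indexing is what makes a record like the "distributed / consequentialist / unclear / mixed" row in the Coding Result table safe to display: malformed or hallucinated labels are filtered out rather than rendered.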