Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (previews truncated):

- ytc_Ugy6CLT3r…: I am just a mere human (with what I would suggest of average intelligence) but c…
- ytr_UgwbzdC6Z…: Doctors are $$ driven; not hypothesis driven. The idea an AI with the same data …
- ytc_Ugxn-yVuh…: Robot: "my max speed is 20 mph, you have no chance to escape me" Me: *Gets in a …
- ytc_UgyBpptnO…: nah but fr, my dream career is being an animator and i swear if ai videos take o…
- ytr_UgypmhhTL…: Bully1ng and h@rrasment is bad but if it's against AI promoters I fully support …
- ytc_Ugz1a8CZC…: meta ( Zuckerburg) isnt a good person and when you have a person like him with …
- ytc_Ugz4-kuKD…: If you see a robot running around at 1500 mph 2000mph we are in trouble 🇺🇸🇨🇮🍀🐕🦺…
- ytc_UgzP31Z0Q…: 10:40 i would yuse references more alighned with the topic because human shoes a…
Comment
Why would "he" take responsibility? Tesla is a company. Thousands of people make up Tesla. Do you want them all to take "responsibility"?
Posted 2023 "According to the National Highway Traffic Safety Administration (NHTSA), Mercedes-Benz USA will be recalling 136,751 potentially affected M-Class and M-Class AMG vehicles in September of this year due to a defective cruise control system that may not automatically deactivate during braking."
Wonder how many people Mercedes killed before the recall, and after it because people don't take their cars in or even hear about recalls most of the time?
youtube · AI Harm Incident · 2025-01-31T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
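Each coded record carries the same four dimensions shown in the table. As a minimal validation sketch, the value sets below are only those observed in the sample response in this section; the full codebook may define additional categories:

```python
# Value sets observed in the sample LLM response below; the full
# codebook may allow more categories (assumption).
OBSERVED_VALUES = {
    "responsibility": {"company", "developer", "user", "none"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"liability", "industry_self", "regulate", "ban", "none"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation"},
}

def is_valid_coding(record: dict) -> bool:
    """Check that every coded dimension is present and uses an observed value."""
    return all(record.get(dim) in vals for dim, vals in OBSERVED_VALUES.items())

example = {"responsibility": "company", "reasoning": "deontological",
           "policy": "liability", "emotion": "indifference"}
print(is_valid_coding(example))  # True
```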
Raw LLM Response
```json
[
{"id":"ytr_UgzfUS0xtiBfuFvfCmB4AaABAg.AD7SBJJ_dIkADwxsQ8WC5f","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgweVnr66i0YqUdRuzR4AaABAg.AD0vAaKzzeSAD7YcDmBR5N","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgxURpG2BxlDiPhda_F4AaABAg.ACwUdEX21NxACwVQlWlV-B","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxURpG2BxlDiPhda_F4AaABAg.ACwUdEX21NxACwr20fZ4Ng","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwzp58DTN1IBX_8ERB4AaABAg.ACw-TLF_wW2ACwVlsbWevm","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwzp58DTN1IBX_8ERB4AaABAg.ACw-TLF_wW2ACx7dccGkCS","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxGdm19dqdWYCXBTRh4AaABAg.ACv87E-W_HcACvwaeRMegZ","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxjZse9UwquSPXxBa94AaABAg.ACu1wJvl8CpACuTbc5UP1I","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzkI3oblNzQq9goUhl4AaABAg.ACsco95dWTEACsxtfyLlyp","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgzgJcQ1Wv6HfExc3c54AaABAg.ACqYngCWwk5ACr-o_BUue2","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
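A raw response like the one above is a JSON array with one record per coded comment, so looking up a coding by comment ID is just parsing and indexing. A minimal sketch, with a short hypothetical ID standing in for the long real ones:

```python
import json

# Raw batch response: a JSON array with one record per coded comment.
# "ytr_example123" is a hypothetical stand-in for the real long IDs.
raw = '''[
  {"id": "ytr_example123", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability",
   "emotion": "indifference"}
]'''

records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}  # index records by comment ID

coding = by_id["ytr_example123"]
print(coding["responsibility"], coding["emotion"])  # company indifference
```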