Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `rdc_n7p3ir7` — "This release seemed less about quality improvement and more aligned to improved …"
- `ytc_UgybNkZYW…` — "For me AI is very limited. I cannot get it to code for me because its inefficien…"
- `ytc_UgylJfqiS…` — "If you have to make a robot, at least make them look like the Terminator💀💪. I w…"
- `rdc_jifhshs` — "This is manageable. As part of your prompts, specify to generate understandable…"
- `ytc_UgxnNuntc…` — "This isn't voiced by Stephen Fry. It's voiced by \"AI Stephen Fry\".. Find me a so…"
- `ytr_UgxUpcuXT…` — "Ed Zitron runs a youtube channel title \"Better Offline\" and speaks truthfully (a…"
- `ytc_UgwleIceZ…` — "Honestly, I think it COULD have been a useful tool. If you could feed it your ar…"
- `ytc_UgzWbmXFq…` — "Another way to defend against an AI such as ECHO is to introduce religion in the…"
Comment
> @daydreamer8373 not my point. We were refused a blind spot mirror on the other side of a a road by the highways dept on a dangerous blind corner. They said and it is right the mirror could cause accidents because of x, y, z. When you add a risk you are responsible for those extra risks. The AI is an extra risk. So lets say like the mirror it will stop accidents it still adds other risks. The car company is responsible for these much like the highways dept for the mirror.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2024-12-15T16:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugxq-Arosp4CKSLUml14AaABAg.AC43E-t0V9fAC4o6MNZYu9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy3FxGZNNXUp5Fekwt4AaABAg.AC3vg2Amhg1AC40Ltxt8Xy","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzHbqzrBBWC8OqJwPh4AaABAg.AC3okSxeu_oACBGPxGbi92","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxrOah3M9HFHRJr1p14AaABAg.AC3nyyHTi3jAC4ELnvPo9h","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgxrOah3M9HFHRJr1p14AaABAg.AC3nyyHTi3jAC54peZuob9","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxVm54_lFIV0XLdrA14AaABAg.AC3ejRG2RnZAC44cPTwFKx","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugy9ar8HNVVBaR8fbN14AaABAg.AC3_e3WNUAXAC3eSvQJIQC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx_tLvU6kwgFuAOLZN4AaABAg.AC3XyThZfokAC3eqUDtkCN","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx_tLvU6kwgFuAOLZN4AaABAg.AC3XyThZfokAC4E5ROjNlm","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgwfW2yb4q2Clg87emh4AaABAg.AC3X9G4I3ufAC6-SDV0uBR","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
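A raw response like the one above is a JSON array of per-comment codings keyed by comment ID. As a minimal sketch of how such a batch could be parsed and checked, here is a hypothetical helper (not part of this tool; the allowed label sets are inferred only from the values visible on this page and may be incomplete):

```python
import json

# Label sets per coding dimension, inferred from the examples above.
# ASSUMPTION: the real coding scheme may define additional labels.
SCHEMA = {
    "responsibility": {"none", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def parse_coded_batch(raw):
    """Parse a raw LLM response (JSON array of coded comments) into a
    dict keyed by comment ID, rejecting rows with unknown labels."""
    coded = {}
    for row in json.loads(raw):
        cid = row.pop("id")
        for dim, value in row.items():
            if dim not in SCHEMA or value not in SCHEMA[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = row
    return coded

# Usage with a made-up comment ID:
raw = ('[{"id":"ytr_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
batch = parse_coded_batch(raw)
print(batch["ytr_example"]["emotion"])  # fear
```

Validating against an explicit schema catches the common failure mode where the model invents a label outside the codebook, so bad rows surface at parse time rather than silently entering the coded dataset.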