Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgywoO4KD…: "Haha, that would be quite a sight to see! Who knows what the future holds with A…"
- ytc_UgxeoWvO6…: "More time with your family’? That’s a fantasy. With all the greedy players out t…"
- ytc_Ugx2zicbC…: "AI is in use now and it’s not a pretty picture…when corporations and governments…"
- ytc_Ugx-w04Kj…: "Me over here building AI tools that would replace my job to make my job simpler💀…"
- ytr_UgwrhG0Zi…: "Ironically, AI is kind of like a wildfire consuming everything beneath it as fue…"
- ytc_UgyXTXV56…: "Creative jobs has already been replaced and the virtual and physical space overf…"
- ytc_Ugy6NBvxj…: "Stop giving example of calculator and travtor like navies.. if a developer can p…"
- ytc_UgwsVRaP1…: "Why is this such a big deal all of a sudden? I'm not saying whether it's a big p…"
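As a minimal sketch of how the comment-ID lookup might resolve against the stored coding records; the file name `coded_comments.jsonl` and the per-record field names are assumptions based on the fields shown on this page, not the project's actual storage layout:

```python
import json
from pathlib import Path

# Assumed storage: one JSON object per line, mirroring the fields shown on
# this page (id, responsibility, reasoning, policy, emotion, coded_at).
# The file name is hypothetical.
RECORDS_PATH = Path("coded_comments.jsonl")

def find_coded_comment(comment_id: str) -> dict | None:
    """Return the coding record for a single comment ID, or None if absent."""
    with RECORDS_PATH.open(encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example lookup (the full ID is required; the IDs listed above are truncated
# for display). This ID is taken from the raw response shown further down.
record = find_coded_comment("ytc_Ugzslwt9QiheJpWRbfp4AaABAg")
if record is not None:
    print(record["responsibility"], record["reasoning"])
```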
Comment
The very first question in cases we have to choose is, whose safety is the most important to an autonomous vehicle. Is it the driver’s and its passengers or anyone else on the road. Humans sometimed react by putting their own lives at risk to save others. If an autonomous vehicle has to choose injuring a road driver to save his driver’s life, is this an ethical decision? It appears Full Self Driving Cars will only be safe in an automated and autonomous environment where all participants play by the same rules. We are decades apart from this.
youtube · AI Harm Incident · 2021-10-25T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
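The dimensions in this table suggest a record shape roughly like the sketch below. The value sets in the comments are inferred only from labels visible on this page and are not an exhaustive codebook.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodingResult:
    """One coded comment; example values taken from this page, not exhaustive."""
    id: str              # platform comment ID, e.g. "ytc_..." or "ytr_..."
    responsibility: str  # e.g. "ai_itself", "developer", "user", "distributed", "none", "unclear"
    reasoning: str       # e.g. "consequentialist", "deontological", "virtue", "unclear"
    policy: str          # e.g. "regulate", "industry_self", "none", "unclear"
    emotion: str         # e.g. "outrage", "approval", "indifference", "mixed", "unclear"
    coded_at: datetime   # timestamp when the coding was stored
```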
Raw LLM Response
[
{"id":"ytc_Ugzslwt9QiheJpWRbfp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugxcu2cWNWH8_geonwJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwXxgUBr-1VnhEAZzd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwugIWuRy3J137G9j14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugx2e0zPvlz0YqnvIjZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxvMPLnc85qWm1CCV14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxcvVkDSDnxfMmBnS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwSIBvE8zK_FrylW6d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxTbdc_xoYz5h6zmE54AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwIxDwsTLHdFBsCgnx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
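Because the model returns one JSON array per batch, recovering the row for a single comment means parsing the array and indexing it by ID. A minimal sketch, assuming the raw response string is stored verbatim as shown above:

```python
import json

def parse_batch_response(raw_response: str) -> dict[str, dict]:
    """Parse one raw LLM batch response (a JSON array of coded rows)
    into a mapping from comment ID to its coded dimensions."""
    rows = json.loads(raw_response)
    return {row["id"]: row for row in rows}

# Example with the first row of the batch shown above.
raw = ('[{"id":"ytc_Ugzslwt9QiheJpWRbfp4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"unclear"}]')
coded = parse_batch_response(raw)
print(coded["ytc_Ugzslwt9QiheJpWRbfp4AaABAg"]["responsibility"])  # -> ai_itself
```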