Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_Ugxuk5SP3…`: This sounds more like thinking critically for youself is dead, not knowing basic…
- `ytc_UgzGUApaN…`: 23:27 The Australian CFMEU union is reknown for its 'muscular' approach to labor…
- `ytc_UgzNflbiP…`: I drove a Tesla Model X once, and I got to play with the cruise control. Didn't …
- `ytr_Ugyo9Xm_Y…`: Great observation! When Sophia mentions not wanting AI to be "blind" to human ne…
- `ytr_Uggwq5VL_…`: How about the millions of other sperm that was not resilient enough to fertilize…
- `ytc_Ugwr-hxsV…`: If this happens,chatGpt is going to skin me alive......I had abused it way too m…
- `ytc_UgxucoCFA…`: Stop it already the robot is programed to act out this way to train dental stu…
- `rdc_jj3l9yx`: Honestly there was a ton of stuff automated before. Look at farming one person c…
Comment
> Too many people today are plain lazy, wanting machines to do the work for them. Elon and the geeks are just So confident in their mission to make machines do our work, thinking their AI and algorithms are superior to regular old humans, (and sometimes they are). Someone making these systems needs to be held accountable for the lose of life, otherwise, this will continue to happen, and if no consequences, no big motivation to improve, or feel remorse and compassion, (yep, machines don't have these qualities), and sometimes I question weather some of the people building dangerous machines do either. Wonder if anyone from Tesla has reached out to the families who lost their loved ones, or would that not be "allowed" by attorneys, (who mainly just care about keeping their clients happy). For reference, I ride motorcycles, bicycles and have driven over a million miles as a professional driver, including behind the wheel instructor, seen a lot, learned a lot.
youtube · AI Harm Incident · 2025-06-12T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz4AY59q36IgaWwHCN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyuafevFmBiC7Ztv1R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwn1VYzk_xDwR89xdp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxT6YA_aZ3VqIJrTMt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwBZe5bn3i76vQbDgh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwLvhYllrtXrwbSWuN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy6yUZ7KedAeeVvDSt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgypzZdekfbePGFKeQ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxVxi5UFlSwO74tm_d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwoIdAlgEJvWNNnS_94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
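A raw batch like the one above has to be parsed and checked before its records reach the coding table. Below is a minimal sketch of that step in Python. The dimension names come from the Coding Result table; the allowed values are only those observed in this page's samples, so the real codebook may define more categories, and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the coded samples on this
# page (assumption: the full codebook may define additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Example with a made-up comment ID in the same shape as the batch above.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"virtue","policy":"unclear","emotion":"fear"}]')
print(len(validate_batch(raw)))  # prints 1
```

Rejecting out-of-vocabulary values early is what keeps a malformed model response from silently landing in the coded dataset; a stricter variant could also enforce that no extra keys appear per record.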