Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Also if you're switching from OpenAI because of their DoD/DoW deal, you might wa… (`rdc_o7xnv15`)
- Nobody is suggesting there will be no more software engineers, it's more like we… (`rdc_oi3xnwn`)
- Just watched a video of an Israeli sniper drone hovering menacingly in the sky, … (`ytc_Ugy4atiMi…`)
- all those AI-hypers sound more and more like the Jehova’s Witnesses, who have be… (`ytc_UgyKrgd3Z…`)
- Ai is an excuse to fire people and then they hire foreign workers who are cheape… (`ytc_UgwlZxWiC…`)
- We should be glad there are so many alarming reports and whistleblowers revealin… (`ytc_UgyUOEkZl…`)
- One of the goals was f AI is to get rid of gatekeepers. Every will be available … (`ytc_UgzV7eYWP…`)
- I wish everyone thought like this :( I’ve actually cried several times because o… (`ytc_UgxFrsNI8…`)
Comment
Not quite right blaming accidents on what is essentially adaptive cruise control. You wouldn’t blame the car of any other company for the same kind of accident if their cruise control rammed another motorist, you blame the driver. How can you as a driver not notice running up on another motorist on the highway?
Autopilot doesn’t disengage before accidents to shift blame from tesla to you, the fault was yours regardless as operator. Instead, autopilot disengages to prevent the car from continuing to drive after the accident.
The name autopilot isn’t banned in Germany, nor is it misleading anybody. Anybody forking out the 10k to buy it has read about its capabilities beforehand and nobody thinks they are completely autonomous.
Source: youtube · Category: AI Harm Incident · Posted: 2022-09-03T15:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwXq3HjQzB2quHhhkt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwvHBsrIpyOZYvPAyh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy67ZbxeNoCBQfq5-54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxUfRHoPubgJTkiHkp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxH5fJDcTsxiGO-UDd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzz84wo-J31GrMQ03x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw45s8wYvl_u4RiUt14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzvT7QXBAILjw5TX1t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz1Zc3DzI9F4YNWIT54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwOouGOdK2zx3_88lp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
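A response like the one above can be indexed by comment ID for the lookup shown at the top of this page. Below is a minimal sketch in Python. The dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the coding result above; the allowed value sets are assumptions inferred only from the values visible in this one response, so the real codebook may contain more categories.

```python
import json

# Assumed codebook: sets are inferred from the values visible in this
# sample response and may be incomplete relative to the real codebook.
CODEBOOK = {
    "responsibility": {"user", "company", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability", "ban", "unclear"},
    "emotion": {"indifference", "outrage", "resignation", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the (assumed) codebook.
    """
    by_id = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {value!r}")
        # Keep only the coded dimensions, dropping any extra keys.
        by_id[cid] = {dim: rec[dim] for dim in CODEBOOK}
    return by_id
```

With the parsed index, a lookup by comment ID is a plain dictionary access, e.g. `coded["ytc_UgxH5fJDcTsxiGO-UDd4AaABAg"]["emotion"]`. Validating against a codebook up front means a model response that drifts from the expected categories fails loudly instead of silently polluting the coded dataset.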