Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgzbTiF6I…`: If no one has any jobs who will be buying all this stuff Robots and Ai are makin…
- `ytr_Ugyz_Wb_l…`: I don't see ball sports or most other sports going away in my lifetime and I'm i…
- `ytr_Ugy_nRaBd…`: Perplexity* suggested I conclude like this: "Where I’m going with this is: if a…
- `ytc_Ugz3GMuSe…`: For those people who dont know sora AI is a program where you can make video all…
- `ytc_Ugx2df_rp…`: who said robot will "demand" rights? did we asked animals or demand them? no if …
- `ytr_UgwWQdzjD…`: @kurthoffmann7160 So in theory... ai becomes like the Replicators of Star Trek…
- `ytc_UgzdRrQhE…`: US Military moved away from using biometrics as a lock to locations or informati…
- `ytr_UgzuGrGLD…`: Jewish mothers are overprotective and can have a lot of influence even as a chil…
Comment
I know its off topic, but I own a 2020 Subaru Outback, that has features like adaptive cruise control (speeds up and slows down) and steering assist (keeps you lane centered while driving).
I picked the car up in NJ and drove it to Oregon. Over 3000 miles of testing these features, it nearly killed me twice. The first time on the freeway, the second time on a major highway. I'm mostly sharing this here so if others see it they might think twice about trusting self-driving features.
The first incident, there was a 50 mph bend in a 70 mph freeway section. I was in the right lane, driving during the day in clear conditions. As it entered the turn, it was going rather fast for the corner and the g-forces hit the car as it tried to cut into the turn. Rather than slowing down, what it did is shut itself off. Meaning in the middle of a high speed corner, my car suddenly stopped turning and went straight for the barrier on the outside of the turn. So keep in mind, it will shut itself off in the most dangerous situations, leaving you half a second to save your own life as it starts driving you off the road at high speed.
The second incident on a highway: it was a 4 lane, 2 each way, highway. I was in the left lane, passing a semi on my right. There was a semi in the inner lane of oncoming traffic. It was again a clear weather, daytime driving scenario. As the semi passed me in oncoming traffic, the wind hit my car, that normal wind buffeting that happens when large vehicles drive by. I don't know if my car thought it was a tight corner, or misinterpreted that wind buffetting, as it was a straight section of road with no turns, but my car suddenly swerved hard to the right, nearly sending me into the underside of the semi trailer next to me before shutting off its steering assist. So to be clear there, I was driving along on a straight road in good visibility on a clear sunny day, and my car suddenly threw itself hard to the right at 75mph and tried to drive under a semi. the sharp jerk of the steering wheel also then caused the steering assist to shut off, meaning it didn't try to correct or straighten, just tried to randomly kill me then turned itself off.
I don't know how much better tesla autopilot is, but I would not trust any vehicles self driving features at this stage. Not with your life. Pay attention, keep your hands on the wheel, and expect it to possibly try to murder you randomly.
youtube · AI Harm Incident · 2025-08-16T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgyssMyO0kAAt67H4DZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxrdAYFX3yS5LOvFKF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx5mm5eR-d3KaUzob14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzl51G2rS_vMSUVkKB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy-tlU-WjrTCsCy0yd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxjZCH6Hk1w9QNV2Hd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzp-eWGrr7NTFkCVKt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyhZUsJSIINe5P1Gh14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxfuNjssXt8GwLMWJp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxrU_jWhgswWvmljPV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}]
```
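The coding-result table above is a single entry drawn from a raw JSON array like the one shown: one object per comment ID, each carrying the four coding dimensions. A minimal sketch of how such a response could be parsed and looked up by comment ID (the `lookup_coding` helper and the inline sample data are illustrative assumptions, not part of the actual pipeline; the real IDs are abbreviated here):

```python
import json

# Raw model output, as in the panel above: a JSON array of coding objects,
# one per comment ID, with four coding dimensions each (sample data).
raw_response = """[
 {"id": "ytc_Ugx5mm5eR-d3KaUzob14AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_Ugzl51G2rS_vMSUVkKB4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]"""

# The four dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse the raw LLM response and return the coding for one comment."""
    codings = {entry["id"]: entry for entry in json.loads(raw)}
    entry = codings[comment_id]  # raises KeyError if the ID was not coded
    return {dim: entry[dim] for dim in DIMENSIONS}

coding = lookup_coding(raw_response, "ytc_Ugx5mm5eR-d3KaUzob14AaABAg")
print(coding)
# {'responsibility': 'company', 'reasoning': 'consequentialist',
#  'policy': 'unclear', 'emotion': 'fear'}
```

Keying the parsed array by `id` is what makes "look up by comment ID" a single dictionary access rather than a scan of the array.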