Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I hate to break it to you, it’s not AI, it’s always been the case…. Unless you n…
ytc_UgyQrGVPL…
@zip10031
What's fun about AI art
Genuine question btw
Like "oh wow i told thi…
ytr_UgxjjSgWW…
Omg! It's the RoboRishi 2000, he's turning into an ai cyborg before our very eye…
ytc_UgwRT6hj_…
Just wanted to tell you guys that any day where I come home and see one of your …
ytc_Ugj-mRAJ0…
The very definition of AI is that it can program & improve itself beyond its ori…
ytr_Ugj_RptSN…
It definitely adds a unique touch to her character, doesn't it? Sophia's design …
ytr_Ugz528RcH…
@BringYourOwnLaptop Exactly! If you look at the pace it's improving. I remember …
ytr_Ugw-xt2js…
If AI starts curing diseases, Big Pharma will destroy it. Skynet aint got nothi…
ytc_UgxzcgDkF…
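The lookup-by-ID feature above could be backed by something as simple as an in-memory index keyed on comment ID. This is a hypothetical sketch, not the app's actual storage layer; the field names (`id`, `text`) and the example IDs are assumptions:

```python
def build_index(comments):
    """Index a list of comment records by ID for O(1) lookup."""
    return {c["id"]: c for c in comments}

# Hypothetical example records; real IDs look like "ytc_…" / "ytr_…".
comments = [
    {"id": "ytc_EXAMPLE123", "text": "If AI starts curing diseases..."},
    {"id": "ytr_EXAMPLE456", "text": "What's fun about AI art"},
]

index = build_index(comments)
print(index["ytc_EXAMPLE123"]["text"])  # → If AI starts curing diseases...
```

A dict keyed on the stable comment ID keeps both random sampling (`random.choice(list(index))`) and direct lookup cheap.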
Comment
I used to have such high praise for your channel. Conflating Tesla’s Autopilot with AI is wrong and I hoped you knew that.
Given the number of miles driven, how does that compare to the number of motorcyclists hit by people. And how has it improved?
And where is the responsibility of the driver in this?
Radar does not really give distance without computation, just like computer vision doesn't know distance without computation.
Never mind the ever increasing improvements.
They're actually not making billions on Autopilot. Autopilot works much like an airplane's autopilot.
And the German decision was overturned.
Or how about not replace stock lights with hard to see, narrow rear lights?
But again, the responsibility is the driver, even for those participating in the FSD beta program.
youtube
AI Harm Incident
2022-09-03T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw-ZdR--2Vx0krS-yp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx6CbqyCoJV9VZAHqp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx_VF7ouu_c4BQCcjB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxr3KETI0jxrkeuK2t4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzMuZxDfaC23kZiXY54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXKsnW-5E_prR2eAJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwA4Urx2Kap4bd-lox4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx1oQpkJoVjhejiktl4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzlSFViqLTok2aEOQB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3STGn4vRbXyqHx7Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
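The raw response above is a JSON array of per-comment codings over the four dimensions shown in the result table. A minimal sketch of how such a response could be parsed and sanity-checked follows; the allowed value sets are inferred only from the examples on this page, not from an exhaustive codebook, so treat them as assumptions:

```python
import json

# Vocabulary inferred from the sample output above (an assumption,
# not the project's full codebook).
ALLOWED = {
    "responsibility": {"user", "company", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"outrage", "resignation", "approval", "indifference",
                "fear", "mixed"},
}

def parse_codings(raw):
    """Parse a raw LLM response, keeping only records whose values
    fall inside the allowed vocabulary for every dimension."""
    records = json.loads(raw)
    return [r for r in records
            if all(r.get(dim) in vals for dim, vals in ALLOWED.items())]

raw = ('[{"id":"ytc_EXAMPLE","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
print(len(parse_codings(raw)))  # → 1
```

Filtering out-of-vocabulary records (rather than raising) makes it easy to count how often the model drifts from the requested labels.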