Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Ha, I was just in Phoenix and while we were walking around there was a Waymo car… (ytc_UgwdPaCqa…)
- I'm reminded of reading that Richard Gatling and Alfred Nobel believed that thei… (ytc_UgzUi4BjI…)
- If by "unqualified" they meant "lacking formal qualifications" that's not necess… (rdc_e7jcafz)
- I rather have her as a wife than a real person, at least she truly fitful… (ytc_UgyaOetsT…)
- AI predicts he's 99.9% more likely to be involved in a shooting... Gets shot. A… (ytc_UgyJFUOJ0…)
- Writing on May 2, 2024: 15 years from now, it will be a Robot asking humans que… (ytc_UgzVTYaFR…)
- This is sad. I smoked a pre rolled Kush. I went and washed my laundry and rode m… (ytc_UgxtzXHMP…)
- A beautiful video and great points. Some time ago, I generated images by AI fo… (ytc_UgyMHr460…)
Comment
They forgot to mention. Autopilot is just a slightly more advanced version of standard cruise control. Of course, any car with this would crash if you weren't paying attention. What the video flaws on is that Tesla also has full self-driving, which can manoeuvre itself to save you from a crash in milliseconds. Many people think Autopilot is full self-driving when it's not. Tesla offers both Autopilot and full self-driving, but many people like to flaw on Autopilot to make Tesla look bad. There are many more crashes caused by humans than Autopilot, yet alone Tesla’s full self-driving beta.
youtube · AI Harm Incident · 2024-12-14T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgykhBnK03A1cnzoMK14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz_0-FCKcGUog-xMyR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyB0K3BpkDGlIv0-0d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyfqxdC6amIdtgpkxN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugzr2jhU0SC7IdM0fVB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx1VuB22UrMVB9ECqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgydaUlBFkjLkGIJRpR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxNsYdmvLWHKfavuTx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzoM0nD_LE-irFS2tx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxT0dy2tMWQ7ZEFb9N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
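The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such an output could be parsed and checked before it reaches the coding-result view; the allowed values below are inferred from the table and samples shown on this page, not an authoritative codebook:

```python
import json

# Allowed values per dimension, inferred from the examples on this page
# (an assumption, not the project's full coding scheme).
ALLOWED = {
    "responsibility": {"user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "fear"},
}

def validate_codes(raw: str) -> dict:
    """Parse a raw LLM response and index valid rows by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# One row from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgykhBnK03A1cnzoMK14AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codes = validate_codes(raw)
print(codes["ytc_UgykhBnK03A1cnzoMK14AaABAg"]["emotion"])  # indifference
```

Indexing by ID is what makes the "look up by comment ID" view cheap: once validated, each coded comment is a constant-time dictionary lookup.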