Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Just 3 months debunks this fear mongering.
Insurance companies starting to offer 50% premiums for people who only use FSD. Because the data show it is safer.
Safety monitors removed from Robotaxis in Austin.
Thousands of deaths every year from stupid fallible humans. Dozens of Waymos failing and sometimes causing accidents. Nobody cares. Tesla? Front page news.
Pretty much every single claim and misleading statements in this video - including conflating FSD vs AutoPilot - is already proven incorrect.
Meanwhile, not a single report of the multitude of instances where FSD saved people from serious accidents.
The bias here is revolting.
More perfect union? Get out of the way and let people who actually change the world do what they do.
Platform: youtube
Posted: 2026-01-23T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy011oVul5NVjLqJ5V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwUHADFhw6wH-Py2M14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzPyG6rngHjQjt1W254AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwBF2BQT9Kn_GGk4AF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxQG0LE0_tqip_6ho54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyXq61Pek2NPzS1Gd14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzm09lsxiPjIjZP_Uh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwGUUuO5-VGM6zitgl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyMcCS4Z5KjE2lMEYJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxLcagTXbrSn-172Th4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```