Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If we are gonna talk about how AI gives people wrong medical answers can we also…" (ytc_Ugy8D7hB7…)
- "AI is gonna be like social media, some use it as a must for business, some love …" (ytc_UgxWgx2b8…)
- "I'm a 30 year Silicon Valley worker (3 start ups; 1 big tech company). Layof…" (ytc_UgwCi6igX…)
- "FYI, this is actually an AI pretending to be Bernie because it realised that hum…" (ytc_UgwDQ0NC4…)
- "Dishwasher= robot, Microwave= robot, blender= robot... A self cleaning toilet wi…" (ytc_UghUorBK9…)
- "Where can I sign the petition to make Facial Recognizing software die. They are …" (ytc_UgxyyOIpF…)
- "I think its funny how "hiring" is misspelled on the first billboard. Mustve use…" (ytc_Ugx6AlaTZ…)
- "What people doesn't understand: When china will finish to develope super intelig…" (ytc_Ugyno1bRZ…)
Comment
Tesla owner here. Putting aside my thoughts on Elon personally and Tesla as a company, here is my perspective:
Autopilot is really just Tesla's branding for cruise control. Full self-driving (FSD) is what contains the capabilities for actually steering, stopping, merging, signaling, etc. automatically. This is also what caused some confusion in the video Mark Rober did on the subject.
Regardless of if you're driving yourself, using the autopilot feature, or using FSD, you still have to be operating the vehicle with your eyes on the road and in full control. If you are distracted and an accident happens as a result, that is still negligence.
YouTube · AI Harm Incident · 2025-08-15T18:3… · ♥ 15
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw3SaUrZX2yOzQ_6Oh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxVHhzoJA5ZGTRNuXB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgysDxWZFwMGJvuStPt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwVm3rQ3D6BcwQpuwx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyfVw9u8F179VQgM8J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxv1KqFmv4nmdmGM5F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxVNOUcSMXW2HAejnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyMPT61al517ReEEip4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxQ4DmOOAHfyLa3tvR4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyK1M9lzunLIii4XUd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"fear"}
]
```
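A batch like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the coding vocabulary is exactly the set of values observed in this sample response (the real codebook may include categories that did not appear in this batch); the function name `parse_coding_response` is illustrative, not part of any existing tool.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may define categories not seen in this batch.
ALLOWED = {
    "responsibility": {"none", "user", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"unclear", "none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "mixed", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse the model's JSON array into a dict keyed by comment ID.

    Raises ValueError (or json.JSONDecodeError) on malformed output or
    out-of-vocabulary codes, so a bad batch is caught before storage.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        for dim, vocab in ALLOWED.items():
            if row.get(dim) not in vocab:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the response stored in `raw`, `parse_coding_response(raw)["ytc_Ugw3SaUrZX2yOzQ_6Oh4AaABAg"]["emotion"]` would return `"indifference"`, matching the coding table shown for the inspected comment.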