Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
The Tech giants are even turning on each others work and breakthroughs in AI. Ap…
ytc_UgzyNZVtB…
Bernie ur so far from the true reality, your job is the easiest to be replaced b…
ytc_Ugxi5Yt-z…
Now,many people are concerned about these AI ROBOTS because they could already s…
ytc_Ugzb0lMDX…
Agreed.. Especially when data seems to be the bottleneck to AI being useful, and…
rdc_kr57tkm
being an AI artist is equivalent to being on a security camera and claiming you’…
ytc_Ugx7wQdOv…
I wonder if the Nanowrimo team were to find-and-replace every “AI” in their stat…
ytc_UgyV0QT6o…
Philosophically and scientifically you are making a bunch of leaps in logic. We …
ytc_Ugw_Do7TE…
Plenty of article that show how facial recognition and deeplearning for IA are p…
ytc_UgwAk9he8…
Comment
It’s clear that 60 Minutes Australia is anti Tesla. I own a 2024 Model Y HW4 and everyday I use FSD and it drives almost perfect since nothing is perfect in this world, of course I’m still there to be watchful from other bad drivers but Tesla’s FSD is driving by the book. I trust my FSD 95% and 5% is for me to be watchful as it pass other vehicles, check my side mirrors and center mirrors just to know your position in the complex traffic and so it takes away my driving stress. The crashes shown in this video were old autopilot but not FSD( full self driving)
Have you guys interviewed Tesla owners with FSD? Not autopilot. There’s a big difference on these 2 technologies.
youtube
AI Harm Incident
2025-10-21T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyzdkvLPoiaBmtkoHV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwSTkXLPR0Wuhseaj94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzKfgci56c0ArXTdiV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVNOog-IW9ub0tEdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy2U2hYAG9OzjMCfVh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz444ENnpZ60wAqBtd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwFa4aVrIl8zOcGb9V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyll7pk3Tw6oA3EpUJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyvSKiydW_CGIyHNe14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyy6Vx1cZWe0xIwkfR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
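The raw response above is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" operation, assuming the response parses as shown (the helper name `lookup_coding` is hypothetical, and only the first two rows are reproduced here):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, as shown above
# (truncated to two entries for illustration).
raw_response = '''[
{"id":"ytc_UgyzdkvLPoiaBmtkoHV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwSTkXLPR0Wuhseaj94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]'''

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for a given comment ID, or None if absent."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgwSTkXLPR0Wuhseaj94AaABAg")
print(coding["responsibility"], coding["emotion"])  # company outrage
```

In practice the same lookup backs the "Look up by comment ID" view: parse the stored raw response once, then index the rows by `id`.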