Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
- “Let’s say we have automation and only the 1% are working while the others are th…” (ytc_UgyOwiTf9…)
- “We just did some training filming for one of the UK's largest companies. They h…” (ytc_UgwBg_I1E…)
- “Relying on ai right now is like relying on an alpha or beta version, it’s just n…” (ytc_UgyrxWdV5…)
- “Hi, i work with AI daily on a coding project. Its not intelligent, but its very …” (ytc_UgwPC_fPu…)
- “There are no self driving cars and AI can’t even take McDonald’s drive-through o…” (ytc_UgwNkS9_u…)
- “The problem is that a lie is done with the intention to decieve. AI "lies" only …” (ytc_Ugx7nm8zS…)
- “I feel exactly the same using AI for technical things like coding or electronic …” (ytc_UgxocipFy…)
- “I have multiple disabilities that affect both my fine and major motor skills one…” (ytc_Ugx7A8BCA…)
Comment

> I have Tesla subscription for FSD and using it with my 2025 MY Juniper it is not road ready particularly during sydney cbd peak hours. Every mistake can hurt someone or damage the car. Tesla should know this that's why they call it 'supervised' but the marketing is misleading. It's marketed as Full Self Driving, but in reality it is only Partial Self Driving but it'll affect their sales.

youtube · AI Harm Incident · 2025-10-26T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyQf8swOlrJlJChnfh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz63f-TRb6quVkhT-94AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugzt_CTJdbbK7UhZIER4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZOEB6llkNmmbrYWZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzvLlQz5Yolfw2WykR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy-6P2nmYNVBN7RROV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgygEs_7Gz-g0haaWTR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzh1BTWwtF-0YTioSB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzcnbzmMB4Fi2VMyKh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzPIfb1n7mD-FhgUOV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
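The raw response is a JSON array of coded rows keyed by comment ID, so looking up a comment's codes amounts to parsing the array and indexing it by `id`. A minimal sketch of that lookup, using two rows copied from the response above (variable names are illustrative, not part of the tool):

```python
import json

# Two rows copied from the raw LLM response shown above.
raw_response = """
[
 {"id":"ytc_UgyQf8swOlrJlJChnfh4AaABAg","responsibility":"company",
  "reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugz63f-TRb6quVkhT-94AaABAg","responsibility":"user",
  "reasoning":"deontological","policy":"industry_self","emotion":"indifference"}
]
"""

# Index the coded rows by comment ID for direct lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgyQf8swOlrJlJChnfh4AaABAg"]
print(row["responsibility"], row["emotion"])  # company outrage
```

The same dictionary doubles as a validity check: any coded ID that fails to appear in `codes_by_id` points at a comment the model dropped from its response.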