Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I love how the thumbnail AI piece looks like the guy is happy he’s pregnant sinc…" (ytr_Ugygvvpfv…)
- "We need to get away from a service based society and into the future with new id…" (ytc_UgxjZctjt…)
- "Is this what a women means when she says that she just needs “to put my face on”…" (ytc_Ugy-rPB2Q…)
- "Easy way to solve superintendence: Make multiple. That way, if one does somethin…" (ytc_Ugyk60Ako…)
- "The Robot's are program to be very kind they don't have bad days. Robot's never …" (ytr_UgwKPHUFm…)
- "I don’t even click on videos that have an AI generated thumbnail! When it’s an o…" (ytc_UgxL_fWsN…)
- "Everything in this world has big created by humans and probably aliens so theref…" (ytc_Ugz3OsMnm…)
- "Bro I love your videos, but the latency between AI-generated code and human-gene…" (ytc_Ugx3UjvRu…)
Comment
I am left wondering about a little legal something that's been on my inquisitive mind. Who is liable when a driverless car crashes? Will there be any type of disclaimer on a purchase contract for a self driving vehicle that removes any liability for the manufacturer if a software failure results in an at fault accident? Semi-autonomous vehicles and autonomous test vehicles are involved in about twice as many accidents per million miles as human-driven vehicles. Tesla and Uber vehicles have been involved in fatal accidents on public roads. Research concludes that product liability will be the major issue in terms of insurance, and that the manufacturer should be held liable for product failure unless other evidence favors the manufacturer. I wonder how Tesla addresses this potential minefield? Another question not many will think to ask - How many lawyers will buy an autonomous vehicle?
Platform: youtube · Category: AI Harm Incident · Posted: 2024-04-30T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwomR_qIbgcyABTzAF4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx62cfnL8CnNkr1-O94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw08I5igXKjNCi4k414AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz-AoU5m5HZFxJPDxp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz5zBzRGT-F-8ltWsh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"liability","emotion":"fear"}
]
```
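The raw response is a JSON array with one object per coded comment, keyed by comment ID. Looking up a single comment's coding (as the "Look up by comment ID" feature does) can be sketched as below; this is a minimal illustration, not the tool's actual implementation, and the `lookup_coding` helper name is hypothetical. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above.

```python
import json

# Two rows copied from the raw LLM response above.
raw_response = """
[
  {"id":"ytc_UgwomR_qIbgcyABTzAF4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw08I5igXKjNCi4k414AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
"""

def lookup_coding(response_text, comment_id):
    """Parse the model's JSON array and return the coding dict for one comment ID, or None."""
    rows = json.loads(response_text)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugw08I5igXKjNCi4k414AaABAg")
print(coding["responsibility"], coding["policy"])  # company liability
```

Note that the IDs displayed in the sample list are truncated (`ytc_Ugyk60Ako…`), so an exact-match lookup like this assumes the full ID is available, e.g. from the underlying data store.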