Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Aw geez, I feel like there’s already a ton of AI-generated images hidden in my s…
rdc_n3x6wt2
Why make it look human. The dataset ai uses is based on what the programmer set.…
ytc_Ugz9CG5vv…
Another occupation that would be AI proof is altering and mending clothes. I ma…
ytc_UgxMIwjX5…
@sammy45654565 I don't think that's how it works or that's what his views are. "…
ytr_UgyY5iyOM…
@Blacksun169unfortunately crimes, especially of a sexual nature will continue to…
ytr_Ugw15iV8o…
If we all have jobs, then we are independent. We are free to make our own choice…
ytc_Ugw8YgcwW…
More people have more impact. Who would have thought.
It’s not deflecting. It’…
rdc_gx846fe
There is a trend on YouTube of "brave dogs" saving children. A couple of the th…
ytc_Ugy9a5Z1H…
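Looking up a coded comment by its ID can be sketched as a simple dictionary index. This is a minimal illustration, not the tool's actual implementation; the prefix-to-source mapping (`rdc_`, `ytc_`, `ytr_`) is an assumption inferred from the sample IDs above, and the example records are hypothetical.

```python
# Hypothetical lookup helper for coded comments, assuming records are
# stored as a list of dicts keyed by an "id" field like those shown above.
# The prefix-to-source mapping below is inferred from the sample IDs,
# not documented behavior.
SOURCE_PREFIXES = {
    "rdc_": "reddit comment",
    "ytc_": "youtube comment",
    "ytr_": "youtube reply",
}

def build_index(records):
    """Index records by comment ID for O(1) lookup."""
    return {r["id"]: r for r in records}

def lookup(index, comment_id):
    """Return the coded record for an ID, plus its inferred source type."""
    record = index.get(comment_id)
    source = next(
        (name for prefix, name in SOURCE_PREFIXES.items()
         if comment_id.startswith(prefix)),
        "unknown",
    )
    return record, source

# Hypothetical records for illustration only.
records = [
    {"id": "rdc_n3x6wt2", "responsibility": "distributed"},
    {"id": "ytc_example_id", "responsibility": "user"},
]
index = build_index(records)
record, source = lookup(index, "rdc_n3x6wt2")
```

Unknown IDs return `None` for the record and `"unknown"` for the source, so a caller can distinguish a missing comment from an unrecognized prefix.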
Comment
Not going on the side of Tesla as the car did register the stop and car so it should of stopped but there is also a reason USA laws, generally all states, say you must not be negligent in driving, aka, keep eyes on road, hand on steering wheel, etc. Few states allow for fully autonomous vehicles and yes Florida is one of them. So my guess is the court has to decide if Tesla's self driving feature is a driving assist or a autonomous vehicle. If the court decides with an assist, the driver is responsible, if an autonomous vehicle, then the company is liable but this also gets more complicated as you are required special insurance for autonomous vehicles which just means the vehicle was operating with no insurance and I couldn't say whose at fault, or where it ends.
youtube
AI Harm Incident
2025-09-12T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx2Mv8TukwB_VeXp5F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwh4Fu7UDRAU7pAi3F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz1l6tgW8rZ2O6IvNR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxgJ8QZX3PfSZD4tah4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7G17Z3BZ8dJT_syl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRWiRPSeOxdyV-BY94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwflz_W5JsX-kJlGa94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugwy8FJEbVVI2EFEf594AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzvndmsEWPB0d9uFaF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxH7EpIvK13GD55Jl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
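A raw response like the array above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validation pass; the allowed value sets are assumptions drawn only from the values visible in this sample, not an official codebook.

```python
import json

# Allowed code vocabularies, inferred solely from values observed in the
# sample response above — an assumption, not a documented schema.
ALLOWED = {
    "responsibility": {"user", "none", "company", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "unclear", "ban", "liability", "regulate"},
    "emotion": {"indifference", "resignation", "outrage", "approval",
                "mixed", "fear"},
}

def validate_response(raw: str):
    """Parse the model's JSON output and report out-of-vocabulary codes.

    Returns (rows, errors) where each error is (comment_id, dimension, value).
    """
    rows = json.loads(raw)
    errors = []
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                errors.append((row.get("id"), dim, row.get(dim)))
    return rows, errors

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"fear"}]')
rows, errors = validate_response(raw)
```

A row that passes yields an empty error list; a misspelled or novel code surfaces as a `(id, dimension, value)` triple, which is usually easier to triage than a hard failure.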