Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “The evil Ai apocalypse isn’t going to be like what we see in Hollywood, it’s goi…” (ytc_Ugwzckq7p…)
- “Sincerley have to disagree with some of these AI experiements, especially the An…” (ytc_UgxuWB4bE…)
- “She needs a heart,...without love compassion etc, she is juts maths and that is …” (ytc_Ugy9Iyn_J…)
- “Bro is forcing chatgpt into evolution he is trying bring humanity to doom close …” (ytc_Ugw8Jr4xU…)
- “Thank you Doctor Forde!! This is a serious problem that needs to be addressed BE…” (ytc_UgyWDXqCf…)
- “Why are you talking with ai anyways, are you socially exhausted at interacting w…” (ytc_UgxbGoz-8…)
- “Capitalism will define the economic model and the idea that 'fairness' will be c…” (ytc_Ugyy6Ej5u…)
- “Fear mongering media. Musk plays on the ignorance of the masses. Anyone can loo…” (ytr_Ugyd8Esew…)
Comment

> While all that is bad, this video does not show ANY statistics on acciadents per mile driven.
> They should be forced to change their advertisting for autopilot and maybe they should be forced to add lidar to their cars.
> But all of the accidents in the video are anecdotal evidence when you dont show statistics that compare the accident rate between humans, Tesla autopilot and other autonomous cars. Their autopilot might kill people but humans driving cars also kill people, its only about whether the autopilot or the human kills more people per mile driven. Neither of them will be perfect but we have to choose the option that kills fewer.

youtube · AI Harm Incident · 2024-12-22T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxORfCJeYhQ90mQR-x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwqD49e-ky6JFSPiOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxn2uU0oiGXuR5diWZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw9Sww4677knQ7sAgV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyFOW-BPndm8X5M72x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw4t-KWuoB4kr7kjUh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzK7M1kvBFAYcmy4pJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzV02fshO2oZy9Hvbx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx77lh1atO_WC2rAkV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw34_h7cYPf4j1w5RN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
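The by-ID lookup described at the top of this page can be sketched as follows, assuming each raw LLM response is a JSON array of coded entries like the one shown here. The payload is abridged to two entries for brevity, and the `lookup` function name is illustrative, not part of the actual tool.

```python
import json

# Abridged raw LLM response (two entries taken verbatim from the array above).
raw_response = '''
[
  {"id": "ytc_Ugx77lh1atO_WC2rAkV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgyFOW-BPndm8X5M72x4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
'''

# Parse the JSON array and index the codings by comment ID for constant-time lookup.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; raises KeyError if absent."""
    return codings[comment_id]

result = lookup("ytc_Ugx77lh1atO_WC2rAkV4AaABAg")
print(result["responsibility"], result["policy"])  # company regulate
```

The first entry matches the Coding Result table above (responsibility: company, reasoning: consequentialist, policy: regulate, emotion: mixed), which is how the table view is derived from the raw response.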