Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Yes, but.. question is, what people gonna do? Almost every professions can becre…" (`ytc_Ugztbt822…`)
- "The living museum is such a fascinating concept and experience! It's interesting…" (`ytc_Ugx1RMik0…`)
- "We are carrying our games over into the real world. Soon, robots will be…" (`ytc_Ugyiuvd0r…`)
- "China: https://singularityhub.com/2018/08/15/china-is-building-a-fleet-of-autono…" (`ytr_UgxQ52Ruy…`)
- "@Atman179exactly what they want the masses to believe- that there is an arms rac…" (`ytr_UgyPFc88q…`)
- "America it’s your GUNS Law! You all allowed youth to have access to fire arms! …" (`ytc_UgxDBMtj0…`)
- "Ai sucks? Finally something me and Disney can agree on oh wait never mind still …" (`ytc_UgxpvqBFF…`)
- "The funny thing is that neither he doesn't know, because he doesn't unless he is…" (`ytc_UgzrHk7rq…`)
Comment (source: youtube | category: AI Harm Incident | timestamp: 2024-12-14T19:3…)

> Interesting discussion, I do think there needs to be human accountability, everyone knows it’s a supervised system. Also there is not enough explanation to the average viewer of this video in the differences of autopilot and full self driving. For those not in the know autopilot is basically and fancier version of cruise control, it maintains speed on highways and stays in the lanes hands free, and brakes for cars in front of it if traffic slows and is only supposed to be used on highways and not city driving. It does not recognize, stop signs, or traffic lights and doesn’t even have the smarts to reduce speed in tight turns last time I checked. In that one video where the Tesla blew past a stop sign and off the road, that’s due to a guy trying to use autopilot like it’s FSD. Autopilot once again doesnt recognize stop signs. I think Tesla should just get rid of autopilot and have fsd as an option. I think people either don’t understand the difference or just flat out mis uses autopilot. Fsd is the safer option all around
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxEnLHwZegc08nvioF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzoQ-xnQ7EKjgoxEeJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwnm6ihK1DV905s5al4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxWPiX_1X-ZBjXOOTt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzMhJ9bZd8S7_SGpd94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzYlE4ZDKV-FLMblwp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwgqPyYjMUYEZXkhsR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyV9P0j22X-Hhql_fh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzlG2NrkUTFGQTeZaJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyiM9kWDoxGN1Be19B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
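The raw response is a JSON array with one object per comment ID and one value per coding dimension. A minimal sketch of how such a batch response could be parsed and sanity-checked in Python; the allowed value sets below are inferred only from the responses visible on this page, and the actual codebook may define other categories:

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "user", "company", "unclear"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response, keeping only well-formed rows.

    A row is kept when it is an object with an "id" field and every
    coding dimension holds a value from the inferred allowed set.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip rows missing the comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Usage with a hypothetical two-row response: the second row carries an
# unknown responsibility value and is dropped.
raw = '''[
  {"id":"ytc_x","responsibility":"user","reasoning":"deontological",
   "policy":"liability","emotion":"mixed"},
  {"id":"ytc_y","responsibility":"robot","reasoning":"unclear",
   "policy":"none","emotion":"mixed"}
]'''
rows = parse_batch(raw)
print([r["id"] for r in rows])  # prints ['ytc_x']
```

Filtering rather than raising keeps one malformed row from discarding an entire batch; dropped IDs can then be re-queued for recoding.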