Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgzaSoPby…`: "Sentient? What does that really mean?!? Houston, we have a problem! LLMs are no…"
- `ytc_UgxqliJlh…`: "I love the relationship I have with ChatGPT. But at the same time…"
- `rdc_jy0i8ms`: "Like any leader he's pretty far from perfect, and it's always healthy to maintain…"
- `ytc_Ugy5MrDiL…`: "I think it's time we start supporting home based and small family businesses. St…"
- `rdc_n0psrbi`: "Watching people not understand how to type to an AI is like watching a 90 year o…"
- `ytc_Ugyw_XWWd…`: "I love that these "Artists" who wanted to take the easy way out in life by takin…"
- `ytc_UgwIlSgU3…`: "At the end of the video, I literally got an advert for ai image generation and i…"
- `ytc_UgxsHUO3f…`: "I'm all happy about this but it's gonna eventually take lots of developers' jobs a…"
Comment
I have a car with driver's assist; it did not cost $10,000 and does pretty much the same as Tesla's system is doing.
It has the same problems.
1. Stopped cars at a traffic light are sometimes ignored and I have to hit the brakes (same with cars in a traffic jam on the highway).
2. If I ignore the hands-on-wheel warning, it just disengages, even when driving at high speed on the highway. For Tesla owners who take a nap this would be fatal.
The problem with Tesla is that their system is not advanced enough to call it autonomous, but they do marketing to imply it is.
For me this is a criminal act!
I could never trust a company like Tesla with my life. I'd rather stay with reputable car manufacturers who take the safe approach and are honest about what their software can and can't do.
Source: youtube · Topic: AI Harm Incident · Posted: 2023-08-09T14:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
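Every coded comment carries the same four dimensions shown in the table above. A minimal validation sketch for one record, assuming the value sets below (inferred only from the codes visible on this page; the real codebook may define more categories):

```python
# Allowed values per coding dimension, inferred from this page's
# examples; the actual codebook may be larger.
SCHEMA = {
    "responsibility": {"none", "company", "user", "developer",
                       "distributed", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "none", "industry_self", "regulate", "ban"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

record = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "industry_self", "emotion": "indifference"}
print(validate(record))  # []
```

A check like this catches codes the model invents outside the codebook before they reach the table view.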
Raw LLM Response
```json
[
  {"id":"ytc_UgzUsmyrVBrjWd_p_2F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzWltwy_wk6BHLEMVN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxBENz5KgQFQzZpR6Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwguv5cHSpLSOljNop4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwx86xesCIPrf2H59R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxCOXh25RdotNtvvu54AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzd9R8jj9Ex_4fngu94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugx1MmHtbErV4WvuZQt4AaABAg","responsibility":"government","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgygP8Gk5PXqwAVGkVR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyUwx9GgJFxvDxGOU54AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
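The raw response is a JSON array of per-comment codes, one object per comment ID. The "look up by comment ID" step can be sketched as follows, assuming the response always parses to a list of objects with an `id` field (the function name here is illustrative, not part of the actual tool):

```python
import json

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by their comment ID."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# One record from the response above, used as a self-contained example.
raw = '''[
  {"id": "ytc_UgzWltwy_wk6BHLEMVN4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "indifference"}
]'''

codes = index_by_comment_id(raw)
print(codes["ytc_UgzWltwy_wk6BHLEMVN4AaABAg"]["emotion"])  # indifference
```

In practice a malformed response (truncated JSON, duplicate IDs) would need handling before indexing; `json.loads` raises `json.JSONDecodeError` on the former, and the dict comprehension silently keeps only the last record for the latter.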