Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "So when AI takes everyone's jobs, who's gonna BUY their services? RICH people ha…" (`ytc_UgyVWD035…`)
- "Whether or not AI makes art is a futile debate. Everything made by humans can be…" (`ytc_UgzU_D2V-…`)
- "It seems more like unchecked capitalism than ai. I'm just cautious of not leanin…" (`ytc_UgwGYnCrq…`)
- "This is not an AI issue. This is people being lazy/selfish/unscrupulous for thei…" (`ytr_UgwoYJ0aM…`)
- "Large language models are basically predictive text. They are fancy versions of …" (`rdc_nnl31sd`)
- "Don't steal art and make it ai but I'm glad that they didn't leak the girls name…" (`ytc_UgzoOBX_K…`)
- "@LinkEX Well, tech advancement is exponential and not linear. So, the same fit t…" (`ytr_UgzZNBFGR…`)
- "she is right. i don't trust him cus in the adam grant podcast with him as a gues…" (`ytc_UgwPeZ8Ul…`)
Comment
What I think is truly stupid is "training" AI by showing it ten million images of something, and hoping it figures out what to do. What if there aren't ten million images of overturned box trucks, or stopped school buses or whatever? Someone dies, that's what. How about starting by programming in the basics, like don't run into stuff in front of you? Then program in the laws, like don't pass stopped school buses with their red lights on, and only then show it ten million images of crosswalks, etc.
youtube | AI Harm Incident | 2025-06-03T15:5… | ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwZN40dbUO8iW1I2D54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz9IcuOJhD_qu13TS54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugw7lJOgxCxuzcrr0X14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmZUOTTg1hgqawAZN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzDRW70rXfdESBfb_J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzyo5-rzBNjf-zc_2V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxWsEphWy4zNaWNB_Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzBzjAcE2lRwsdW7Y54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw6luAGfJlNWg7o6_14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzPOYCzP2NtlOYtlEF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
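A raw response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the four dimensions shown in the table above; the allowed value sets are inferred only from the codes visible in this section, and the real codebook may include additional categories.

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# section (assumption: the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"outrage", "mixed", "approval", "fear", "resignation"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed rows.

    A row is kept when it is a dict with an "id" field and every coded
    dimension holds one of the allowed values.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid


raw = (
    '[{"id":"ytc_example","responsibility":"developer",'
    '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]'
)
print(parse_coding_response(raw))
```

Rows that fail validation are silently dropped here; in practice you would likely log them for re-coding instead.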