Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The ones scared are the ones that can’t take care of their daily needs without t…" (ytc_Ugxq1Oghn…)
- "As an actual researcher in biology, I laughed out loud when he said AI can do Ph…" (ytc_UgyJhUy98…)
- "I hate to bring it to Neil this bluntly, but he's utterly wrong on the subject o…" (ytc_UgwLMSGpK…)
- "Idk i like what you’re saying but it just sounds like copium. If what you were s…" (ytc_UgxdaPp86…)
- "if ai takes over who's gonna buy things? rich people who else could it be…" (ytc_Ugy9w08OB…)
- "Bro this is an intesional face we know robot are not cable are not so advance bu…" (ytc_Ugw4_yWGY…)
- "Everyone’s panicking about AI like it’s some sudden existential threat, but soci…" (ytc_UgwmNesAa…)
- "'You should take a human taxi around here. Why would I do that when there's waym…" (ytc_UgxvjNfYS…)
Comment
Self-driving cars face much bigger challenges than the hype suggests. Companies like Waymo have made real progress in limited areas, but the video is right that full autonomy is still extremely difficult. They still rely heavily on remote human operators for edge cases, construction, bad weather, and unpredictable situations. Mapping the entire world perfectly is basically impossible because roads and cities constantly change. After spending hundreds of billions, true robotaxis that work reliably everywhere without any human backup are still years away. The technology is impressive in controlled environments but struggles in the messy real world. This looks more like a long, expensive evolution than a quick revolution. Interesting to watch how it develops.
Platform: youtube
Incident: AI Harm Incident
Posted: 2026-04-24T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzAmE6WFrpH79DLRGt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyzCjgaFEh83VUiYSd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDhspAqJOkTYj0k7t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzaHJNYfrWfY8Ku_K54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugxagg68u_yyF4egUG14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy1sktigPFtf2eRQQ14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyYLrYZnzkJne7C1XZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMeFv49mTEeTt0ObN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyQiF2alDI_Y_CUWLp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyauiBGrLE4lAQJ0Xl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
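A response of the shape shown above can be loaded and sanity-checked before the records enter the coding table. The sketch below is a minimal example, assuming the category values observed in this sample (the full codebook may define more, and the helper name `parse_coding_response` is hypothetical). It also defensively strips markdown fences, since LLMs sometimes wrap JSON output in them.

```python
import json
import re

# Category values observed in the sample response above.
# Assumption: the real codebook may include additional values.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "indifference", "outrage", "mixed",
                "approval", "resignation"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    # Strip optional ```json ... ``` fences the model may have added.
    cleaned = re.sub(r"^```(?:json)?\s*|\s*```$", "", raw.strip())
    records = json.loads(cleaned)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={value!r}"
                )
    return records
```

A record that uses a value outside the observed vocabulary raises `ValueError` with the offending comment ID, which makes it easy to spot drift in the model's output rather than silently storing a malformed code.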