Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgwJEShoE… — "You can go straight to AI for everything and forget how to learn anything becaus…"
- ytc_UgwRtFy82… — "The team that manages Howard University's websites did an amazing presentation o…"
- ytc_Ugzt_2e6W… — "Focus on electric trucks over autonomous trucks. I don’t want to drive on roads …"
- ytc_UgxTIekQL… — "Search lesser weeder, this thing uses AI to I'D weeds, then hits them with a las…"
- ytr_UgyxbI2NL… — "Again with this argument. You are inspired. The AI is not inspired by anything. …"
- ytr_Ugy7RF5T8… — "It's so annoying that we're on the precipice of the future in Star Trek where no…"
- ytc_UgwEQjcxN… — "LoL, the two critical terms thrown around are “intelligence” and “learning.” Thi…"
- ytc_UgyBNtFHT… — "TL;DR 💡 AI will transform labor by increasing demand for services rather than el…"
Comment
Tesla Autopilot did not kill anyone. The responsible driver being assisted with Autopilot killed them. Since autopilot is still learning, this is why drivers are still responsible for any actions taken by their vehicle. Autopilot some day, will be better than any of us. As a rider myself, I am not afraid of Autopilot, just inattentive drivers.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2022-11-29T22:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzxz48txMxhQeiCBNd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzhuiJI9p_eVb8udIl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwzPyIM8H7iRtRiqqF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyKQLq0botMsua25RZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzR9i35eMPKGoLn7ZZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyF6aWxUW0YAyLJSst4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx5oqkYGEbplxxy90d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgynoeuXTRbvaAdSGoZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyWHtt-bvpIkZp8YkR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyIIb2CL_D9Co96AR94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
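The lookup-by-ID feature above can be reproduced programmatically: the raw LLM response is a JSON array in which each element carries a comment `id` plus the four coding dimensions. A minimal sketch in Python, assuming that array shape holds for every response (the helper name `lookup_codes` and the trimmed two-row sample are illustrative, not part of the tool):

```python
import json

# Trimmed sample of a raw LLM response: a JSON array of per-comment codes,
# each with "id", "responsibility", "reasoning", "policy", and "emotion".
RAW_RESPONSE = """
[
 {"id":"ytc_UgyKQLq0botMsua25RZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzhuiJI9p_eVb8udIl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def lookup_codes(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding row for one comment ID,
    or None if that ID was not coded in this response."""
    for row in json.loads(raw):
        if row["id"] == comment_id:
            return row
    return None

codes = lookup_codes(RAW_RESPONSE, "ytc_UgyKQLq0botMsua25RZ4AaABAg")
```

For the sample above, `codes["responsibility"]` is `"user"` and `codes["emotion"]` is `"indifference"`, matching the Coding Result table.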