## Raw LLM Responses

Inspect the exact model output for any coded comment, or look a comment up by its ID.
### Random samples

- "I feel like a looot of non artists just under appreciate drawings/paintings yk? …" (ytc_UgzullNkA…)
- "How can we prove whether or not AI is conscious when we can't even really prove …" (ytc_UgzGjnHqv…)
- "@jpablo700 Is there a sexual orientation term for people who are attracted to A…" (ytr_UgwDwRWfi…)
- "I usually avoid commenting because the comments section quite often on any indiv…" (ytc_UgzWbGCNZ…)
- "More money and energy going into artificial intelligence than human intelligence…" (ytc_UgygkCvmb…)
- "This is why AI must be regulated properly as soon as possible. This will get so …" (ytc_Ugytv7zS9…)
- "We won't, but we can know this for pretty well certain, if an ai can be consciou…" (ytc_UgxsexWjY…)
- "Ai is not a tool drawing software is a tool. Ai is like a pencil that draws for…" (ytc_UgxZmvrGt…)
### Example: coded comment

The comment text below is quoted verbatim from the dataset, typos included.

> sure, river was at fault. However the system not recognizing emergency lights flashing in its lane is a massive design flaw. this is not a video game, you dont release broken code to a self driving vehicle. because as we know, humans are often stupid and will do things like sleep of a night of drinking while there car drives them home. knowing that about humans.. its why police stop and often arrest intoxicated drivers... Tesla should have programming that can react the same way as a sober human to ANY road hazard. It doesn't, yet there selling them to consumers and people are being hurt or killed as a result.
>
> Perhaps the driver of this car might have got an Uber home if they were too drunk to pay attention to the road. but chose to drive, trusting that the car would help them avoid an accident.

- Platform: youtube
- Video: AI Harm Incident
- Posted: 2023-10-06T19:2…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
  {"id":"ytc_UgzDmyYGb04_fl5Vhx14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwPan9_yS9N0kZ3Ln94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzD7Z_RuEo0hqILsQ14AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy0Vd-5pqYE53uy_Ud4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy8QI_p5qcZnihxq5Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw_2mObqmhfhrkC-5h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyHjykj2neMsIYtQpN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzo5urqv9tcQ9rn5q14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz-r2lvVjWBCFXTVr94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzycewy1LUka5BbmMp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
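The raw response is a JSON array of per-comment codings, one object per comment ID. A minimal sketch (not the project's actual pipeline code, and using hypothetical placeholder IDs) of how such a response might be parsed and indexed to support the look-up-by-comment-ID view:

```python
import json

# Hypothetical raw LLM response: a JSON array of per-comment codings,
# mirroring the shape shown above (placeholder IDs, not real dataset IDs).
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index coding records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
print(codings["ytc_example1"]["policy"])  # regulate
```

Indexing by `id` is what makes the per-comment inspector cheap: each lookup is a dictionary access rather than a scan over the whole response array.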