Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgwEkF0At… : "Honestly I've been giving openai product ideas ai don't have time to build mysel…"
- rdc_jmv6ge3 : "AI isn't a singular thing. We've had incredible advances in AI for decades. Curr…"
- ytc_UgxFuWH5g… : "No, this doesn't mean that we need to get rid of AI; this means that we shouldn'…"
- ytc_UgzKjolFL… : "I thought AI was fun, it was cool and fun until i found out that artists all ove…"
- ytc_UgybaP9nC… : "i am pro-ai, but if it doesn't work, why continue using it? broken stuff shouldn…"
- ytc_UgxcYx2Z2… : "Wait no, I don't have robot humans on my bingo card, can you wait till next year…"
- ytc_Ugxv-v1nE… : "The AI guardian (which can be the recipient of the AI) can make a request to the…"
- ytr_UgyPsUztZ… : "then everyone will just die then. But people won't accept that. They will rev…"
Comment
I wonder if you can open the door while this kind of car is moving. Like, I can only think of 3 options:
1) The safety lock is automatically on and you physically can't open the door until the ride is over or the car is stopped
2) You can open the door but the car realizes a door is open and comes to a stop
3) The car doesn't notice a door is open and keeps driving
If it's 2 then that would make for a simple and effective emergency brake for the passenger in instances like this. If it's 3 that's just plain dangerous why would you design a self driving car like that? And if it's 1 I feel that would also cause issues where the passenger just gets trapped in the car like here.
youtube · AI Harm Incident · 2025-01-27T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugy29KjxLa48Bn-SCTJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzARL_OEG6MhneFjEB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzmmoaM3YGkROVxI1p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy6NjpU6i7JnIWnv4l4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxJ9MaVc3CsNR3saYd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyPWYJMttc_P44I7Ud4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxcxEBywmPOeK3OASl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw37TcFf-Mel2ML85p4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzhJrrSY2vPwpTW1TB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxeKru1Mu7QeDTkGQt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
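Because the raw response is a plain JSON array of per-comment codings, the "look up by comment ID" view reduces to parsing the array and indexing it by `id`. A minimal sketch of that lookup — the two records are copied from the response above, but the indexing code itself is an assumption about how such a tool could work, not the tool's actual implementation:

```python
import json

# Raw model output for a batch, truncated here to two records for brevity.
# Field names ("id", "responsibility", "reasoning", "policy", "emotion")
# match the actual response shown above.
raw = '''[
  {"id": "ytc_Ugy29KjxLa48Bn-SCTJ4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzhJrrSY2vPwpTW1TB4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

records = json.loads(raw)

# Index the batch by comment ID so any coded comment can be inspected directly.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_UgzhJrrSY2vPwpTW1TB4AaABAg"]
print(coding["responsibility"], coding["policy"])  # company regulate
```

Note that a stray `)` in place of the closing `]` (as in the uncorrected output) would make `json.loads` raise `json.JSONDecodeError`, so validating each batch at parse time catches malformed model responses early.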