Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Like Anthropic? They will sign the deal with govt and you will think they are s…" (rdc_o86rqy4)
- "This whole premise is just absolutely ridiculous. Just because Tesla allows some…" (ytc_UgxMiJHEF…)
- "If your young and using AI, then how can your brain develop when there is no fri…" (ytc_Ugzq7xQCG…)
- "Now jokes aside, if they do let this dummy copyright this AI art, we're gonna ge…" (ytc_UgzFChr_R…)
- "As a tech bro, a good if autistic argument against ai art is simply one of capab…" (ytc_Ugz2mW-nV…)
- "A. I. has been around for decades and yes, creatives use it to collaborate work.…" (ytc_Ugy7rfVHr…)
- "The point here is, when an artist uses other artists work they make sure to cred…" (ytc_UgzSnSdyS…)
- "I just find this world horrible and cold-hearted now. Everyone thinks…" (translated from German) (ytc_UgyrO1bv7…)
Comment (youtube, AI Harm Incident, 2020-11-26T21:0…)

> I think the self-driving car always should favour the lives of the people outside. Also, I think the driver should always have his hands on the wheel. If something comes up, the driver could make take over and make a decision. *edit* Let's be clear, though, self-driving cars are much better than humans.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyr0RMpIPjrTwnqobN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx-IgbdLnExAW_EhZF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxN7dBvLWbIzar2GFB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEykcQ7MVTCsx7g7d4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwvKTI5iOHqUc77Qtp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0YcSTWrba2PmZtwp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugylzv1Xaz0MgWpxUfJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw3EQZ9U6NBDWibkPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxJ57uNPwCyWaPdS5p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxsG5XLEWsnnq4EsCR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
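The raw response is a JSON array with one object per comment ID, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response can be parsed and then looked up by comment ID, as the search box above does; this is a hypothetical illustration, not the tool's own code, and the sample row is copied from the response above:

```python
import json

# A raw LLM response in the shape shown above: a JSON array of coded comments.
raw_response = """
[
  {"id": "ytc_Ugx-IgbdLnExAW_EhZF4AaABAg",
   "responsibility": "company",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "approval"}
]
"""

# Build an id -> coded-dimensions mapping, mirroring "Look up by comment ID".
coded = {row["id"]: row for row in json.loads(raw_response)}

row = coded["ytc_Ugx-IgbdLnExAW_EhZF4AaABAg"]
print(row["responsibility"], row["policy"])  # -> company regulate
```

Because the model sometimes cannot decide, a real pipeline would also want to handle the `"unclear"` values seen in several rows (e.g. by treating them as missing data rather than as a category).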