Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
What will never be replaced is cybersecurity, because attackers innovate …
ytr_UgztFTp1O…
I hope AI can be used to replace animal t*sting. I have severe trauma from that.…
ytc_UgxGdhzOY…
The AI optimists completely miss the part where someone has to pay for all this;…
ytc_UgxAlyZDD…
Do people that hate their jobs and wish they could be doing something they love …
ytc_Ugzd6If0Y…
I tried that but my chatgpt said -
Aww, 27 days without you? That’s gonna feel …
ytc_Ugz1TE-km…
Indian Transport minister atleast has the ghuts to say, he wont allow automation…
ytc_Ugyk3PIMD…
SO YOU ARE SAYING: AI can write an essay and AI can THEN grade that same essay a…
ytc_Ugxqsxf2n…
We living in the most pathetic timeline. Our AIs even not real ones. We stuck wi…
ytc_Ugyf29f7M…
Comment
I'm seeing an increase of Tesla on Tesla violence. lol But what I don't get is why the emergency protocols don't intervene? For example, the red Model 3 in number 28 ... why did it plow into the car in front of it? Shouldn't it have stopped automatically before hitting it? I own a Model Y, I rarely use autopilot. But I do expect my car to automatically brake in an emergency situation. So does anybody know why number 28 did not stop ...?
youtube · AI Harm Incident · 2024-02-20T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzyjEbpEflICGXfoNx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw100tHm0S3YWQgvR14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxE85deQYtodZPkNuV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwIsq7LYxzTe16qcJR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwMpzW4UPcmuhAjYyx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzK1pTOS_Mm8Y4stHB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6BQdjhfXUnISF2Qx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgydFN0APYUdsChfFYV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgybqM-4Ts1TL2_fCLZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwQtxTXxbB-sNQKGDl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
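A response like the one above can be checked mechanically before the codings are stored. The sketch below parses the JSON array and keeps only records whose values fall inside the allowed category sets; note these sets are inferred from this single sample, so the real codebook may contain additional categories, and the `ytc_`/`ytr_` ID-prefix check is likewise an assumption based on the IDs shown here.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above; the actual codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"liability", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and return only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset appear to carry a ytc_/ytr_ prefix.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

sample = ('[{"id":"ytc_UgzyjEbpEflICGXfoNx4AaABAg","responsibility":"company",'
          '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
print(len(validate_codings(sample)))  # → 1
```

Rejected records can then be routed back for re-coding rather than silently dropped, which keeps the "unclear" codes distinguishable from parse failures.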