Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a coding by its comment ID, or browse the random samples below.

Random samples
- "Ok here is my stance on this. I think if you want to use AI to create scenarios …" (`ytc_UgywF0J2z…`)
- "People are already using AI for medical advice. It's going to be necessary with…" (`ytc_UgxJhX8MT…`)
- "if they post art that looks too good tell them to show the layers and select som…" (`ytc_UgzHdaJb7…`)
- "@greymatterindustries Once again you people prove you have no idea how AI art ac…" (`ytr_UgxVw7keu…`)
- "If Dave sets ChatGPT to interact with him in a Dutch way, which is super direct…" (`ytr_UgyzrymNp…`)
- "Yes, but in our time such kind eyes could only belong to a robot. A person comes out …" (`ytc_UgyL96RKT…`)
- "On no level do I see it as a replacement for actual artists! For me, I have no m…" (`ytc_Ugzdjts3R…`)
- "the male robot is designed to be a potential killer 'how can you be so nice'…" (`ytc_UgynbDnht…`)
Comment
You do realize that Tesla Full Self-Driving is in a testing phase, and Tesla is still collecting information? The driver is in full control, not Autopilot. There were only 100,000 Tesla cars with FSD until just a few days ago; they have expanded. Drivers are in full control and responsible. Tesla is nowhere near the car being able to drive itself yet. Everybody knows this. I don't understand why you would make a false video telling people lies. People driving a Tesla with FSD know they are required to be in full control and must be alert. Accidents that happen are caused by people not following Tesla's rules, and they know the rules. They are fully responsible behind that wheel.
youtube · AI Harm Incident · 2022-09-25T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxXVkeLc73exKwsnlB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugyu0FZNCfrbF-KwDKd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzF0u64lGT56NL13LN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw5bVbyOPWJ9RZ6jLB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxZnoHi1HspvpD2gZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPndd7uyLJPKqe2ER4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwMWvk4rXhKTaOH7gp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxMHxJ1BUzHabVI-k54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwqH2p6mCR22Qxf7794AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJBGD_pPEExd3J8rN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"fear"}
]
```
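The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of the "look up by comment ID" step, assuming only this structure (the two sample entries and the `raw_response` variable below are illustrative, not part of the tool):

```python
import json

# Illustrative raw model response, truncated to two entries;
# the field names mirror the JSON array shown above.
raw_response = '''
[
  {"id": "ytc_UgxXVkeLc73exKwsnlB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugyu0FZNCfrbF-KwDKd4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
'''

# Index the batch by comment ID so one coding can be retrieved directly.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_UgxXVkeLc73exKwsnlB4AaABAg"]
print(coding["responsibility"], coding["policy"])  # company regulate
```

Keying the batch by `id` is what lets the viewer map each table row (Responsibility, Reasoning, Policy, Emotion) back to the exact object the model emitted for that comment.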