Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugxy6Vi1Z…: "A question I have is can you create Ai without it ever malfunctioning.. makes me…"
- ytc_UgxSHRV0Z…: "Like it or not there is a presence here and they are here to stay, we could you …"
- ytc_UgxX2W5Ix…: "I am surprised and saddened to hear of these cases. I’ve had instances where Cha…"
- ytc_Ugwe0PpqH…: "I was so embarrassed when I proudly announced in class last year that I did digi…"
- ytc_UgyX53NRM…: "I think this is a similar problem to determining if animals are people: they're …"
- ytr_UgxQvC2Rp…: "We appreciate your perspective. It's important to remember that while AI technol…"
- ytc_UgxG-_YmG…: "Thank you for your posting. I have read comments about ”writers” who are claimin…"
- ytc_UgzRnz8y6…: "Giving a machine rights would be silly. We have to be controlling - we can't be…"
Comment
"Very sad story. I would never trust a self driving car, let alone a prototype. The company should be held responsible for this."
youtube · AI Harm Incident · 2021-11-18T16:5… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | unclear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"ytc_UgyIuqFcQ-WfWqe2Avt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzOCEiIcC6sQ5VSpgF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyqvfYf4dyrIJLFpId4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugwjl0BKPMuUFgaNlMt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwV9ERZaSoelEhaIIJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"sadness"}
]
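The raw response above is a JSON array with one record per coded comment, each carrying the four dimensions shown in the coding-result table. A minimal sketch of how such an array might be parsed and validated follows; the `parse_codings` helper and its error handling are illustrative assumptions, not the tool's actual code, and the sample record is abbreviated to one entry from the array above.

```python
import json

# Dimension names taken from the coding-result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# One record from the raw LLM response shown above, used as sample input.
raw_response = """[
  {"id": "ytc_UgwV9ERZaSoelEhaIIJ4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "sadness"}
]"""

def parse_codings(raw: str) -> dict:
    """Parse a raw coding response into {comment_id: {dimension: value}}.

    Raises ValueError if any record is missing a coding dimension, so a
    malformed model output fails loudly instead of silently dropping fields.
    """
    out = {}
    for row in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in row]
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing {missing}")
        out[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return out

codings = parse_codings(raw_response)
print(codings["ytc_UgwV9ERZaSoelEhaIIJ4AaABAg"]["emotion"])  # prints sadness
```

Keying the result by comment ID mirrors the "look up by comment ID" view: each coded comment can then be fetched directly to compare the table values against the raw model output.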