Raw LLM Responses
Inspect the exact model output for any coded comment; lookup is by comment ID.
Random samples:
- "please also upload cctv footage of Tesla crashed at high speed with self driving…" (ytc_UgzTnva45…)
- "So if ai robotics replace our jobs, we wont have the funds to purchase these rob…" (ytc_Ugzg1xUTI…)
- "Awesome video.. but there's a lot more negatives about Ai that should be said. 2…" (ytc_UgzK8NrCJ…)
- "This is just manufacturing consent by the very makers of "AI". Their only uninte…" (ytc_UgzxblkWu…)
- "would choose forgiveness and restoration over punishment wherever possible. Non-…" (ytr_UgzHmGKuq…)
- "I don't think CO2 is a real threat.BUT my suspicion actually is we pay in Europe…" (ytc_Ugzz2DxID…)
- "> "I think people don't realize the effort it takes. It takes me several hours s…" (ytc_UgyaTzJWQ…)
- "How can AI keep receiving electricity after we're gone? Why can't we just pull t…" (ytc_UgypZrH6s…)
Comment
> Ai has one major Flore it has no common sense it may be intelligence but that does not make it right lots of intelligent people are stupid they are good at what they do only but can’t become involved. If common sense is inputted it puts a considerable right and wrong into the fact and it should be the priority that right action is the best outcome for humans not the ai.

youtube · AI Harm Incident · 2025-09-13T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
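A coding result like the table above can be sanity-checked before it is stored. Below is a minimal sketch; the per-dimension vocabularies are inferred solely from the values visible on this page, and the function name is illustrative, so the real codebook may define more categories.

```python
# Minimal validation sketch for one coding result. The allowed values per
# dimension are only those observed on this page; the actual codebook may
# be larger.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "resignation", "fear"},
}

def invalid_fields(coding: dict) -> list[str]:
    """Return the dimensions whose value is missing or outside the known set."""
    return [dim for dim, ok in ALLOWED.items() if coding.get(dim) not in ok]

coding = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "unclear", "emotion": "mixed"}
print(invalid_fields(coding))  # → []
```

An empty list means every dimension carries a recognized value; any other result names the dimensions to re-code.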
Raw LLM Response
```json
[
{"id":"ytc_UgzUPrrlbpENjit66xZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyLSkGxAQv8TLTgQ854AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzpaBbtk5-oZOzliPh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzUAaro-XLVKSpUS4J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwnVpN0B8bP8Or2VKx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwWXw-IKvTZq_ONn6B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwLN1WFAd7iZq7Kngl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxMbeUK9D5KwjpsExN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzrDYBui0s6iHtEfEh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyoLy3FJtHrX0THrD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
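The batch response above can be turned into the comment-ID lookup this page describes with a short parse-and-index step. A minimal sketch, assuming the JSON array format shown (the field names `id`, `responsibility`, `reasoning`, `policy`, `emotion` come from the response itself; the function name and the two-row excerpt are illustrative):

```python
import json

# Two rows excerpted from the raw batch response above, verbatim.
RAW_RESPONSE = """[
{"id":"ytc_UgzUPrrlbpENjit66xZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwnVpN0B8bP8Or2VKx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a batch response and key each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgwnVpN0B8bP8Or2VKx4AaABAg"]["policy"])  # → regulate
```

Indexing once and looking up by ID keeps inspection O(1) per comment, which matters when a batch holds thousands of coded rows.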