Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by browsing the random samples below.
Random samples

- "The only thing ai should be used for is assisting people and internet humor. I w…" (ytc_Ugws3-S-P…)
- "A.I can't make art. What we see coming from A.I is not art. It's just taking art…" (ytc_UgyjhQ3IA…)
- "dude... he does the exact same thing to any other ai, male or not. maybe stop pr…" (ytr_Ugx2vsFF8…)
- "If telling the truth is terrible i don’t care millions of people have that app m…" (ytr_Ugx7a5zsX…)
- "Agi will never happen, ai models are primitive a form of filtering nothing like …" (ytc_Ugz36WPoa…)
- "@claytonwoodcock6942 This system is not one of censorship; it's a system of AI …" (ytr_UgxfGQ0FY…)
- "A robot walks into a bar. The bartender says 'We don't serve your kind.' The rob…" (ytc_UgzYoOiZa…)
- "...if it thinks like a duck... well, and this is where it becomes most uncomfort…" (ytc_UgyLSSHsX…)
Comment

People, for the most part, are terrible drivers, as is demonstrated by the massive road tolls every nation with cars has. Now, before you go all 'terrorist plot' keep in mind that 9/11, the most successful terrorist attack, killed fewer people than are killed anually by caffeine, but 34000 people died in car accidents in the US in 2012. If we can take the control of 2 tonne killing machines out of the panicky, incompetent and lazy hands they're in and automate it we'll be better off.

youtube | AI Harm Incident | 2014-05-26T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgxDaMBOSiNJD2siXy94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyN5DcdkN4_CPV9W4N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJgB9bBLAVLTAPPcl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyvkgE0W9UcAxjR8s54AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugimzyh73Mem3ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj-0SjwbhJOkngCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgiSbOcYg8LpjHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjzNHMI5Dl8n3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiKPIaGrg1XJHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UggDy9xEJWdA5HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]