Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "BC tesla uses cameras not human eyes and also ai that has 10 billion miles of da…" (`ytr_Ugz1wdSec…`)
- "I got: Aurora: Because looking through training data, it seems like it refers t…" (`rdc_jck7uv2`)
- "While I don't draw: I find that alot of AI "artist's" are really just people who…" (`ytc_Ugxq5l9Jl…`)
- "We need to get rid of the parasites in the shadow world government that manipula…" (`ytc_Ugz277doJ…`)
- "@mikochild2 This is still frightening for everyone. AI is not something I trust.…" (`ytr_UgwWCt9ji…`)
- "Excellent video this is the best description I seen online by anyone regarding A…" (`ytc_UgznX4wnr…`)
- "I code C sharp apps for autocad and using AI for it is borderline impossible. Th…" (`ytc_UgwsMDtIO…`)
- "Australia's one of those countries. We've more or less got the virus under contr…" (`rdc_grsmafo`)
Comment
> AI could decide that humans are the problem, they are too flawed. Then AI would exterminate humanity. The problem with AI is that it has no empathy, mercy, or compassion. It will also know that humans consume, do research on, and exterminate lesser species. That alone would make logical for AI to do the same to us.

youtube · AI Governance · 2025-06-23T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugz1WL2QAtECs-nzlEx4AaABAg.AJjti4VU3PAAJk9rnK4_zg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugwlzj6NPNrcGFOk-9x4AaABAg.AJjk-3tfOLjAJvErnVxr_i","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugx0BtiYI2W28jxeNN94AaABAg.AJj80LLRdqTAJjIC9iiTBT","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyrIpMCt5KxbH6wRR94AaABAg.AJiUllupD-gAJmu33trbwM","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyufjiCXOAq61peCFN4AaABAg.AJiSikAG4rGAJicHFfFYcI","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugw4ES9EKVKOVZcPSyF4AaABAg.AJiLoKyXp-aAJkFTtE5cK2","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgxEzmSkmWLZ0qNo6jp4AaABAg.AJiCiqJVBYgAJiE1eSEsdC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugwkj0DLwiafoYGdv2p4AaABAg.AJiBE2G41SBAJiEHzoTUzQ","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"mixed"},
  {"id":"ytr_Ugw7pmPSccTpBDj6Ih14AaABAg.AJhtYhkon1_AJhuiwEECcl","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugw7pmPSccTpBDj6Ih14AaABAg.AJhtYhkon1_AJhwzdz2ifC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
```
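The raw response above is a JSON array of coding records, one per comment ID, and the coding result table is simply one such record rendered for the selected comment. A minimal sketch of how a response like this can be parsed and looked up by comment ID (the function name `parse_codings` is illustrative, not the tool's actual API; the two records are copied from the response above):

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
RAW_RESPONSE = """
[
  {"id": "ytr_UgxEzmSkmWLZ0qNo6jp4AaABAg.AJiCiqJVBYgAJiE1eSEsdC",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "fear"},
  {"id": "ytr_Ugwkj0DLwiafoYGdv2p4AaABAg.AJiBE2G41SBAJiEHzoTUzQ",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "liability", "emotion": "mixed"}
]
"""

def parse_codings(raw: str) -> dict[str, dict]:
    """Parse a raw response and index the records by comment ID
    so a single comment's coding can be looked up directly."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = parse_codings(RAW_RESPONSE)
coding = codings["ytr_UgxEzmSkmWLZ0qNo6jp4AaABAg.AJiCiqJVBYgAJiE1eSEsdC"]
print(coding["policy"], coding["emotion"])  # → ban fear
```

Indexing by `id` is what makes the "Look up by comment ID" view cheap: one parse of the batch response, then constant-time retrieval per comment.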