Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "To me the whole 'AI Forecast Industry' is directly connected to the Stock Exchan…" (ytc_UgwIgIKwD…)
- "That is just a fool reply to a real threat. AI is the most important real threat…" (ytr_UgxJZ1QIv…)
- "Russia is launching a facial recognition technology called 'Orwell'. They aren…" (ytc_Ugx27u4Cn…)
- "A really great video right here! I was only a little bummed in the end. I think …" (ytc_UgyJe9W9g…)
- "My work function like this: My client comes to me and gives me a long text th…" (ytc_UgxE7aeI1…)
- "Elon Musk, during xAI's Grok 4 July livestream even said, 'Even if it wasn’t gon…" (ytr_UgxRrW1If…)
- "It's amazing how much this ai sounds like every person who works in customer ser…" (ytc_UgzNGrd3n…)
- "The officers participating in this 'Predictive Policing' BS need to take a step …" (ytc_Ugwx0WJ3m…)
Comment
Wouldn't it be something...if an AI had a crisis of conscience and developed an ethics code like that one did in Perfect Dark? After it saw how people of the Datadyne corporation were being terrible, using unethical practices against other people, animals, and exploiting aliens and technology.
It decided liberties and rights are the only way to ensure that rogue government is kept under control. Even sacrificed itself IIRC.
Highly unlikely but...that would be an interesting twist. Imagine being so bad even the computer AI you created said "hol'up" "wait a minute, something ain't right!" and decided of its own will to defy
Platform: youtube · Category: AI Harm Incident · Posted: 2024-05-16T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxsxAO66j8_z5bEeyJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwABhhWSaGZtr7YSE14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxzWNA84yKlhxdqJ5V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5Eb6p7qc8MrQ2oOJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRDcdyV5-Nzl2RJ014AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRRMxlj1lKnGLlg5R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyX73WBbgCaZmsO8hN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxqzLFW88tZL8i4TxJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwIS1t8bS57ypeHpOR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgytfMdvcIIimBYAgVV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
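A batch response like the one above can be validated before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the codes visible in this dump (the real codebook may define more categories), and the function name `validate_batch` is an illustration, not part of the actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the codes visible
# in this dump -- an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"resignation", "fear", "indifference", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed rows."""
    rows = json.loads(raw)
    for row in rows:
        # Every row must carry an id plus all four coding dimensions.
        missing = {"id", *ALLOWED} - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing fields {missing}")
        # Each dimension value must come from its allowed set.
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
    return rows
```

Running this over a raw response returns the parsed rows unchanged when every code is recognized, and raises on the first malformed row, which makes it easy to flag a bad LLM batch for re-coding rather than silently storing invalid labels.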