Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “Our enemies don’t stop using AI just because we stop. They would not enforce law…” (`ytr_Ugz7fFJTI…`)
- “Machine learning and ai will allow to process data that scientists have collecte…” (`ytc_UgzrxFmf0…`)
- “Just train AI on open source content. I'm not anti-generative ai personally, bu…” (`ytc_UgzQz1IPE…`)
- “The factories are already fully automated and highly efficient. If anything he'…” (`ytc_Ugw6p7ikJ…`)
- “A single unit landlord who doesn’t use a broker isn’t covered by the fha and is …” (`rdc_ikpn1zo`)
- “Hell yes i can't wait to this happen. I can't wait for the wipe out of the human…” (`ytr_Ugy2qeKfU…`)
- “brainrotted slop enjoyers, the level of cope is crazy. But as long as ppl like u…” (`ytr_UgwaOrO0U…`)
- “The problem is the 6000 that die by AI are not part of the 40000 that die by man…” (`ytc_UgwdASYEo…`)
Comment
Well, humans are destroying the planet, throwing garbage in the ocean and destroy anything for profit. Many humans don’t seem to care - they don’t care about nature, animals, they don’t even care about humans. Since the 70ties, maybe longer we know about the micro plastics that are very bad for our health, but more and more are produced - just for profit 😢 Good humans are killed because they try to save us - and humans laugh about it 😢 do I really trust humans to lead us to a better future? Maybe we just trust in intelligence far beyond us to save us. Maybe AI isn’t going to turn out like the evil / bad humans - maybe AI will help us and guid us to a better future ❤ let’s pray for that
youtube · AI Harm Incident · 2025-09-16T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzsnN8nQrzsROSwKjF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzaQEe_66-YHlYJKgR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw_ZS9-z9G3hV7syKp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxgwqiceG4ZosDZBBl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyFscoHoSxlL9q4rIl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgywZ_RkaN6FrbK4fVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw5W3jsquS7rZD2w2p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw8tzdit3rk8owlTbB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxiKYMKH1c8D4Cl7jR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzOr8XdTOaQNCGLpyp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
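A response like the one above can be parsed and sanity-checked before the codings are used. Below is a minimal sketch, assuming the four dimensions take the categorical values seen in this page's examples (the full category sets, and the helper name `parse_coding_response`, are assumptions for illustration, not part of the actual pipeline).

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# above; the real codebook may define more categories.
SCHEMA = {
    "responsibility": {"ai_itself", "user", "developer", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch-coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records missing the comment ID
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# A two-record example: one valid coding, one with an out-of-schema value.
raw = '''[
  {"id": "ytc_UgyFscoHoSxlL9q4rIl4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_example_bad", "responsibility": "martians",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]'''

codings = parse_coding_response(raw)
print(len(codings))  # → 1 (the malformed record is dropped)
```

Dropping malformed records rather than raising lets a batch run continue past the occasional off-schema model output; a stricter pipeline might instead log and re-prompt for the failed IDs.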