Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples — click to inspect

- @UlungJiK Copy and paste this text into chatgpt. Keep in mind its just trying t… (ytr_UgwF5UFKv…)
- It's not even problematic now, if the subject did not consent, it's revenge corn… (ytr_UgxbPmFCE…)
- 9:20 This guy’s statement shows how simpleminded humans generally are. He says, … (ytc_Ugx7wiRJq…)
- My brother survived. We thought he was gone and it was amazing when he woke up. … (ytc_Ugyz7L92A…)
- "I thought ChatGPT was a search engine" > The Price Is Right 'you lose' sound p… (ytc_UgwdmKCnv…)
- @JohnJones-f2g I was thinking of agentic reasoning. It was a side product (when… (ytr_UgzFdwW5f…)
- This whole thing with real artist vs ai kinda reminds me of the Balled of John H… (ytc_UgyRJSUU5…)
- i dont know if anyones done it before but i name Chatgpt say the N word multiple… (ytc_UgwFvMmDn…)
Comment

> AI is not aware of God. It cannot believe in God because it has no connection to God, therefore it will have no morals. Humans have the advantage of our God connection, which will overcome all AI. I understand this at a level that most humans do not, because most people do not have a connection with Jesus Christ and God.

youtube · AI Harm Incident · 2025-07-24T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugzikh0u2G-eT4a0Bld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxbBMKUo8fwMdcFATp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzYXGGcjethWIBR9pJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzpsYwuf3rgi16G24d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxofOsRg_qyJAYZHNR4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwn0dhZSuvAaU1LswJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZSRqhReK2ilCiDrR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxsninEFxPhj_nLE854AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTjY3b81Ae5nlAx9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx-xws0m8S4CoXEd_t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
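The raw response above is a JSON array of per-comment codes, one object per comment, keyed by the comment ID. A minimal sketch of how such a batch response could be parsed and indexed for the lookup-by-ID feature (the field names follow the response shown; the function name and the shortened two-record payload are illustrative, not the tool's actual implementation):

```python
import json

# A shortened, illustrative version of the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_Ugzikh0u2G-eT4a0Bld4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzpsYwuf3rgi16G24d4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]'''

def index_codes(response_text: str) -> dict:
    """Parse a batch coding response and index the records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)

# Look up one comment's coded dimensions by its ID.
rec = codes["ytc_UgzpsYwuf3rgi16G24d4AaABAg"]
print(rec["responsibility"], rec["reasoning"], rec["emotion"])
# → ai_itself deontological approval
```

Indexing by ID makes each inspector lookup a constant-time dictionary access rather than a scan over the full response array.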