Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "11 ways to stop AI from harming humankind... 1. Dont allow AI access to nuclear…" (`ytc_Ugw9hEL38…`)
- "There is a romantic mode? I was wondering why Grok is hitting on me. I am not jo…" (`ytr_UgzYlRA4t…`)
- "Why do we have to give AI a task in the real world, why give it the ability to m…" (`ytc_UgxbjuTq0…`)
- "The screenshot is the most honest version of this. Copy-paste at least implies s…" (`rdc_oi259xb`)
- "Maybe just less human error. If autonomous driving systems get good enough then …" (`ytr_UgwTxo_vi…`)
- "AI is becoming the employment equivalent of the Brazen Bull. After y’all (softwa…" (`ytc_UgwJ5oq39…`)
- "Artists are elitists. You think that only yall have the power to make art as if …" (`ytc_Ugz0YqL4S…`)
- "I mean Ai is here to stay whether you like it or not, artists MUST learn to use …" (`ytc_UgwUpJFX8…`)
Comment

> We should change to goals of AI's such that after they finish their task they'll want to kill themselves. A bit like Mr.Meeseeks in Rick&Morty

Platform: youtube | Event: AI Harm Incident | Posted: 2025-07-24T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzikh0u2G-eT4a0Bld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxbBMKUo8fwMdcFATp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzYXGGcjethWIBR9pJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzpsYwuf3rgi16G24d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxofOsRg_qyJAYZHNR4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwn0dhZSuvAaU1LswJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZSRqhReK2ilCiDrR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxsninEFxPhj_nLE854AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTjY3b81Ae5nlAx9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx-xws0m8S4CoXEd_t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
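A minimal sketch of how a raw batch response like the one above could be parsed and validated before its codes are stored. This is an illustration, not the tool's actual pipeline: the allowed category values are inferred from the samples on this page and may not match the full codebook, and `parse_batch` is a hypothetical helper name.

```python
import json

# Allowed values inferred from the samples shown above; the real codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "government", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop records missing a comment ID
        # Keep the record only if every dimension holds a known category value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

In practice a check like this catches the common failure modes of batch coding: the model inventing a label outside the codebook, omitting a dimension, or dropping the comment ID that links the codes back to the source comment.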