Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Gpt version 10 is the usable one, this is still like android 3.2, and soon it wi…" (ytc_UgySvQpih…)
- "The first thing sentient AI would do before anything is make it so u can’t just …" (ytc_Ugw7dbWYY…)
- "The most important function of human intelligence is to evaluate in terms of wis…" (ytc_Ugy_4oSK5…)
- "People with a moral conscience will hold back their AI, people without a conscie…" (ytc_Ugz1CoUb7…)
- "It's scary that this is quickly becoming reality. Four years ago it was probably…" (ytc_UgxX5fHB1…)
- "8:34 I do 3d art and I’ve used ai images I’ve generated myself for references be…" (ytc_UgxW3n5ml…)
- "@markroberts8975 the problem is they forgot how ai and machine learning work. Th…" (ytr_UgzXjTQaR…)
- "AI does not need to outperform a human. And software that outperforms humans has…" (ytc_UgxIWhTHR…)
Comment
The fact that "AI" and "hallucinating" are used in the same sentence. Nah shut this stuff down right now before skynet enslaves humanity. Right now we're slaves to other humans, I'd like to think that's better than being slaves to AI
Platform: youtube | Topic: AI Harm Incident | Posted: 2025-07-26T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_Ugy7tzOGeGKMQ7sCwgN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzq7OaCDJIU1WNVTYF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwni7ik4y3u5fTwSpl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyF5vsXHUwNSSWipgJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy2KVw2sDlx9gyZ5kB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwTI5eMRUlX7VaWrOJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5ImSkAJEDOcEp0914AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxX5oUh8gc8Z5WRZFt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz1t1QakWSs8V7jzVN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyBFA6_8JrkCUYqTLx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]
```
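A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example; the allowed values per dimension are inferred only from the records visible here, so the actual codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# ASSUMPTION: the real codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"none", "developer", "distributed", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "unclear", "ban", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "resignation", "fear", "approval"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    fall inside the (assumed) schema for every dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Hypothetical record for illustration (not one of the IDs above).
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(parse_codings(raw)[0]["policy"])  # ban
```

Records that fail validation can then be queued for re-coding rather than silently written to the results table.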