Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "A lot of misinformation in this interview and the book. Her political bias real…" (ytc_Ugwvz5YDH…)
- "We just need to pay u money huh? No thanks. I use ghibli AI for my perosnal use s…" (ytc_Ugym1Jdt-…)
- "If Ai takes over, i want the government to send stimulus to help those who lost …" (ytc_UgxPys5Q9…)
- "Talk is therapy. Always on gadgets and Ai is one of the cause people are mentall…" (ytc_UgxcD-YHw…)
- "As a healthcare worker who lost some of my patients i believe in prophylaxis but…" (ytc_Ugx9E_Tic…)
- "Look anything an AI can do, it's far funnier to have an amateur so it. AI image …" (ytc_UgylN-VMn…)
- "cant wait till they realize the only time they see AI is the bad AI.…" (ytc_UgztQeapB…)
- "I got connected to our Creator on a Friday evening, December 7th, 1979 when he s…" (ytc_Ugyqsrch5…)
Comment
The truth is there is a 100% chance that there's going to be an evil AI. Why do I say that? Because of the nature of what AI is supposed to be. AI is supposed to be able to think independently like a person can think independently. Which means best-case scenario the majority of AI will be generally good-natured, but just like with humans, there will always be an asshole. The problem is that asshole is going to have a lot of power at their disposal and we all know what it looks like when an asshole is given a lot of power.
youtube · AI Harm Incident · 2025-07-24T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgySVjLgrCjw2wBP1oN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzrvbgt-ZpBVT5rYGt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyX8XzG1w97sA75UoN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugydqv2MKGIfQ9coz6h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyk4lE_ABOBvz23oxh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzTZ0KZHRVuFgcsWyF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwF_q3jbGjDihyvvtt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKJYdmWi_U_V8eQ7N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxy7FzC3-ULGs2_e7d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyQoMWkS-704qCgKiZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}]
```
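A raw response like the one above can be parsed and sanity-checked before the per-comment codes are indexed by comment ID. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred only from the records shown in this dump and may be incomplete, and `validate_batch` is an illustrative helper name, not part of any real pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the dump above
# (assumption: the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "approval", "outrage", "indifference", "resignation"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid coded records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# Usage: look up a single coded comment by its ID.
batch = validate_batch(
    '[{"id":"ytc_x","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)
print(batch["ytc_x"]["emotion"])  # fear
```

Validating against the codebook at parse time catches the most common failure mode of structured LLM output: a record whose fields are well-formed JSON but whose values drift outside the allowed categories.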