Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by browsing random samples.
Random samples (truncated previews with their comment IDs):

- "I just need to pick a good survival character from every robot movie ever and ju…" (ytc_Ugxld9PmS…)
- "To say art "can't" anything is actually the opposite of what art exists for. Any…" (ytc_UgwqHKqg1…)
- "You don't even know the basics for how LLM's work you speak like you know everyt…" (ytc_UgzlYgABZ…)
- "I read about AI all the time but all I found to this day where LLM?! WTF are you…" (ytc_UgxnxIVd_…)
- "I'm trying to work out why people want to destroy millions of jobs by using AI, …" (ytc_UgxIJyHZc…)
- "Title says one thing, AI Josh says another, comments are both condoning and cond…" (ytc_UgwXPFkIt…)
- "When computers started beating humans at chess and go there followed a period wh…" (ytc_Ugzf0PeQs…)
- "I am already seeing this in the workplace and I think the issue is much bigger. …" (ytc_UgydINHh0…)
Comment (youtube · AI Harm Incident · 2025-07-24T00:5…)

> Once one incident happens where ai is harmful to humans, humans will lose all trust in ai and stop using it. So companies have to address this and make sure it’s safe or it will go bankrupt and that’s not what ceos and investors want, so it’s going to be safe. This talk of ai taking over the world is just Hollywood dreaming. It’ll never happen. It’s just free marketing for these companies to keep that narrative going because it’s fearful and people gravitate toward fear like a magnet. It’s all a scam people entire world is
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzHN0aHaovq7eeuyCV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzXGDEgrTHvwu0uLmJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzVgOy5YXG04NKcc954AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxkKOodE7wVmT03RJV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxAmjXa6TmFc0mYUnJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzDnmttbqB9oF5m0uh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxcukoamWoMCCWN0Eh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwLYHV3zssnKAr7Kyt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugx1cjMRO8w7VjwC8T54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxsjlRRJmn5aUpBubd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
```
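Because the model returns a JSON array of per-comment codes, a lookup like the one this page offers can be built by parsing that array into a dictionary keyed by comment ID. The sketch below is a minimal illustration, assuming only the response shape shown above (an array of objects with `id` plus the four coding dimensions); the `index_codes` function name is hypothetical, and the single-record string stands in for the full response.

```python
import json

# Stand-in for a raw model response: a JSON array of per-comment codes,
# in the same shape as the example shown above (one record kept for brevity).
raw_response = """
[
  {"id": "ytc_UgxsjlRRJmn5aUpBubd4AaABAg",
   "responsibility": "company",
   "reasoning": "consequentialist",
   "policy": "industry_self",
   "emotion": "indifference"}
]
"""

# The four coding dimensions every record is expected to carry.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw: str) -> dict:
    """Parse a raw JSON response and key each coding record by comment ID.

    Raises ValueError if any record is missing one of the expected dimensions,
    so malformed model output fails loudly instead of silently.
    """
    out = {}
    for record in json.loads(raw):
        missing = DIMENSIONS - record.keys()
        if missing:
            raise ValueError(f"{record.get('id')}: missing {sorted(missing)}")
        out[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
    return out

codes = index_codes(raw_response)
print(codes["ytc_UgxsjlRRJmn5aUpBubd4AaABAg"]["policy"])  # industry_self
```

Keying by `id` makes the "look up by comment ID" view a single dictionary access, and validating the dimension set up front catches truncated or partially coded model output before it reaches the display table.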