Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- “Yall acted AI wasn’t gonna watch terminator understand the assignment and flip t…” (ytc_Ugx604LZX…)
- “Wait... Those "Prompt Jockeys" call themselves "AI Artist"? To me an AI artist i…” (ytc_Ugyg5G0-z…)
- “You misunderstand. As an artist all my life, we are inspired by other artists ye…” (ytr_UgzHyc728…)
- “At first it looks like a robot bro, but they’re acting like human looks like a h…” (ytc_UgxEYxKgx…)
- “@Jesusaross🤦 school has brainwashed you into becoming a 9-5 slave. If AI is so …” (ytr_UgwKfNBhN…)
- “As a senior dev using these tools daily, I strongly disagree. AI and agentic cod…” (ytc_UgwYxtCMe…)
- “I know a guy in group 1. It’s surprising because I always thought of him as the …” (rdc_mzw21p8)
- “I just watched another terrific video today that was a pilot debrief for a fatal…” (ytc_UgyNQU7dd…)
Comment
People keep talking about how humans will be needed to oversee AI, but listen to the people developing these systems in interviews… Many of these developers talk about how the goal is for AI to surpass human intelligence, to make human intelligence become approximately 1% of all intelligence. It’s going to make humans irrelevant. I see so many people defend this by saying “you’ll just need to know how to use it, manage it, or program it.” That doesn’t seem to be the goal of those who are creating it. I hear them talking about AI programs hacking or developing further programs, not humans, and humanoid robots programmed with AI taking over many human tasks… If you really listen to what many of these tech CEOs are saying, it will be humans will be serving AI in the future, not the other way around.
Source: youtube · AI Jobs · 2025-06-26T13:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwZbu486ElWln0xKBR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxUa_0rvXdDrlJnncV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"sadness"},
{"id":"ytc_UgxkrIfTJbnAHdDjOB54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyPNGR0asTDjwQ_txF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxhkVhTZneRLNnl1-d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxWc7XqlY-MYqFw7f54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7QLoAJ17iqwgfWXV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxbMyWVP41OJeSEPQV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxlych_O3DG8u5SviB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyQIbS-RIezU44kRvF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
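Because the model returns a batch of records as a JSON array, a downstream pipeline would typically parse and validate each record before accepting the codes. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from values visible in the responses above (the full codebook may define more labels), and `validate_batch` is an assumed helper name, not part of any tool shown here.

```python
import json

# Allowed labels per dimension, inferred from values observed in the
# raw responses above. Assumption: the real codebook may allow more.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"approval", "sadness", "outrage", "fear",
                "indifference", "mixed", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the samples above use ytc_/ytr_/rdc_ prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_", "rdc_")):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Usage: validate the first record from the response shown above.
sample = ('[{"id":"ytc_UgwZbu486ElWln0xKBR4AaABAg","responsibility":"none",'
          '"reasoning":"mixed","policy":"none","emotion":"approval"}]')
batch = validate_batch(sample)
```

Rejecting the whole batch on a single bad record is one design choice; a pipeline could instead log and skip invalid records so one off-vocabulary label does not discard the rest of the batch.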