Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I'm on the fence...but... the AI computer farms: Just because it thinks fast doe…" (ytc_Ugz2e7nSD…)
- "Interesting worst case scenario. But out of the steps outlined, the first is the…" (ytc_Ugzbo0M12…)
- "I will be so pissed if a consciousness robot is made in my lifetime and not give…" (ytc_Ugg8fO3M_…)
- "Automation and robotics will necessarily replace human workers due to the curren…" (ytc_UgwYQvCXA…)
- "1 is definitely ai 2 is the real one if you think 2 is ai then you dont know wha…" (ytc_Ugzzjr_g4…)
- "I don't look at any art and say why is that dot there. I say, I like that, or I…" (ytc_UgybcZhcA…)
- "I wouldn't worry until AI can actually drive a car. They've been saying that for…" (ytc_Ugw5ZbFIT…)
- "Nice. Yeah, if they can get a strong LLM behind that to make it smarter then I a…" (rdc_mfgc7v2)
Comment (youtube · AI Responsibility · 2025-07-16T23:1…)

> They're trying to force control through fear. The goal is to impose rules that stop regular people from using AI to compete with corporations. But the truth is, there's no way to audit what data these bots were trained on; control is an illusion. This is just corporate protectionism, nothing more.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgwLWijgKdxuhN5eyzN4AaABAg.AM2RVy2uJU_ASOLJCJzAKN","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugw9i9K2c0-xFms5pmd4AaABAg.ALnUK3BXWSgALt-b30scJ-","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxxRt-IFadooFzBpGh4AaABAg.AKoLuL1HzpUAN9vXFZC8wC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugzi6MeZJ4G0Ah1TP5h4AaABAg.AKM2wVgNwlNAKM45hA4ImD","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgwdTum2wRqjtsO4c814AaABAg.AKLuO2mjCytAKeShIUaAK0","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgyVfbZveQ2pB4vPJbF4AaABAg.AKJN3OhqGzTAKeTgaT4BJp","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwsXUxsYQhRjPzg2Tp4AaABAg.AJosPni6ZEUALxQEOLVZ7t","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw9TAsaTarq49Vb37h4AaABAg.AJW6tXF5EjJAKeMbeySi0x","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_Ugw9TAsaTarq49Vb37h4AaABAg.AJW6tXF5EjJAQYdpyWyDXn","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxeNpTkeAElp5MPnG94AaABAg.AJKGQL8r-28AJMq6Hrz-Qw","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```