Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I use ai art but I use it for world building because I’m a team of one and I don…" (ytc_Ugwlgt5zQ…)
- "You can't take those chatbots seriously. They always show you snippets of conver…" (ytc_UgxcH1uCI…)
- "AI is better than human in many many factors if you believe in spirituals If not…" (ytc_UgxZFpHAd…)
- "Well this is two entirely different segments so let's talk about both: Segment …" (ytc_Ugz_B_ULI…)
- "Why don't they do something about the actually dangerous part? Anyway, my point…" (rdc_o0kcq59)
- "I really like this video, you did a great job of showing why artists dont like A…" (ytc_UgzZZLoto…)
- "My job as a ghostwriter got replaced by AI only for companies that mostly rely o…" (ytc_UgxSayR-j…)
- "We' ll have to make sure that AI works on electricity so we'll be able to let t…" (ytc_UgwFeOTiI…)
Comment
The biggest danger of AI is automating human work out of existence. Imagine a world with no attorneys, no teachers, no journalists, no truck drivers, no cab drivers, no chefs, basically no humans working in any field in any substantial way. Imagine linking Boston Dynamics robots to an AI that is smarter than every human who ever lived by orders of magnitude, and you'll have an idea of what is coming.
youtube | AI Governance | 2023-04-19T12:1… | ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyqbRAFFl2b2VUq54x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy0F_7Iw4VYcYXYaDd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyuNsXX3txlE0m82-B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyvG5j86zdAvQQX47V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzcdJDGMVXDPfUP_Rx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgysYfUR-9jEMryiSCV4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzTEyP9ejywH6gWDMx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxveA3JtZGM2HyQf6J4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzVLKuwniyRoY5k5AV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwEdSeUrydmUrbTwOZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
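The lookup-by-comment-ID view above can be reproduced directly from a raw response like this one. A minimal Python sketch, assuming the model returns a JSON array whose records carry an `id` plus the four coding dimensions shown in the result table (the `index_codes` helper is illustrative, not part of the tool, and the two embedded records are copied from the sample output):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (two records
# copied from the sample output above).
raw = '''[
  {"id":"ytc_UgzcdJDGMVXDPfUP_Rx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwEdSeUrydmUrbTwOZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

# The four coding dimensions visible in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Parse the raw model output and index each coding by comment ID."""
    records = json.loads(raw_json)
    by_id = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec!r}")
        by_id[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgzcdJDGMVXDPfUP_Rx4AaABAg"]["policy"])  # regulate
```

Indexing by ID makes the "Look up by comment ID" view a constant-time dictionary access, and the validation step surfaces any record where the model dropped a dimension.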