Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “The AI future is so scary. Yes there are a lot of good that can come out of AI …” (ytc_UgwRpkUZS…)
- “Dude high on hopium 🤣 comparing the cute little calculators with the monstrous A…” (ytc_UgwqidiQi…)
- “Lidar is too expensive, that's why they'll never have it. Waymo is much better i…” (ytr_UgyCN_1HF…)
- “There are a lot of good use cases for ai, you just shoudnt use it to replace cre…” (ytr_UgyClVfyN…)
- “AI art is fucking soulless. Yes, as an artist I didn't learn how to draw art in …” (ytr_UgzJx0Sd2…)
- “I don’t understand if you are disabled and can’t use your hands and fingers to d…” (ytc_UgyM2Wp4K…)
- “@SocksWithSandalsEnjoyer It _is_ snobbish to react in abject horror as the char…” (ytr_UgyKqzwW8…)
- “Just keep asking those questions and you will get different answers you can even…” (ytc_UgyO32wy1…)
Comment
The “utopia” will be for the few. The irony is we choose it. We don’t have to give Alphabet more money by using their products. As he said, we will choose the cheaper thing made by AI and robots over the more expensive thing made by a person. A company like Alphabet is so large they can just keep buying different diverse companies and bank more and more money. They will make things more automated and streamlined so they can net more profit. Beneficial for them and consumers at the cost of the people in labor.
youtube · AI Governance · 2025-12-04T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz9MCy7-LWsK6ywqNd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxUVl1Ne2CGo_UhLEF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy6VJPQ8LAtdmOrcXt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzc920NKu5EhIZhzJ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxVZIrEnPauMp9KOdB4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwqAoXCctafZpo6GW54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzpWG10qIYRgXqFhdd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx2nppn9o6X_5M9Y0t4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyuI4KxQ3wqDFjFWuF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwY94ItOEC4OZ2KAHJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
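The raw response above is a plain JSON array, one object per comment, keyed by comment ID. A minimal Python sketch of the "look up by comment ID" step might parse that array and index it by ID; the dimension fields (`responsibility`, `reasoning`, `policy`, `emotion`) match the response shown, while `index_by_comment_id` is a hypothetical helper, not part of the actual tool.

```python
import json

# A raw coder response in the same shape as the one shown above:
# a JSON array of per-comment coding objects. (Single row kept for brevity.)
raw_response = """
[
  {"id": "ytc_UgwY94ItOEC4OZ2KAHJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index the dimension values by comment ID."""
    rows = json.loads(raw)
    # Drop the "id" key from each row so the value holds only the coded dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgwY94ItOEC4OZ2KAHJ4AaABAg"]["policy"])  # → regulate
```

With an index like this, inspecting the exact model output for any coded comment is a single dictionary lookup rather than a scan over the array.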