Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples

- "Would we be free to actually take care of each other and have time for each othe…" (`ytc_UgxXj8zLz…`)
- "@KakyoinCherryMaster1989 it's fucked up and wrong but, this has happened for ye…" (`ytr_UgwWJLppr…`)
- "Great video. I've encountered the toxicity you refer to as well when challenging…" (`ytc_Ugx-xCLDx…`)
- "This is absolutely the best explanation of how these AI models work starting fro…" (`ytc_UgzIl4YDW…`)
- "What bullshit is this? AI is just today a fancy word for a nice algorythm. We ar…" (`ytc_Ugwv9JBuz…`)
- "Really isn’t that serious. The AI is also using multiple sources not just the ti…" (`ytc_UgwgfQDWF…`)
- "look up algorithm-generated dinosaurs. they all look like eldrich abominations (…" (`ytc_UgyqUM3yU…`)
- "Why do fucking old school senators like him think that AI can be regulated and c…" (`ytc_Ugwyj7-oi…`)
Comment

> Automation has always been framed as a negative and yet we always seem to gain more productivity and wealth as a result. The extreme level of automation brought by Ai and next level robotics, though, might be so disruptive that it decreases our quality of life instead of increases it. Still, I think there's little we can do about it. What are we going to do? Pass laws that say you can't automate things?

youtube · AI Jobs · 2025-05-29T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwxTTPxZcUVeEf3q3d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzhblFDj-zm0v4c9254AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw8vtYLLWfQq02cw-x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzA-6EuFi8IqBf15R14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxZIb6tXa2hL-DCNdF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyCtQv0cAwbGTlHmxB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw9JaWBRzh2Xynj-vZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx40VrK7eaIVe4lFC54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgytlE-O7T3l8nMIyA54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyuNDAKLxZ7i9r6Z-t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
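The raw response is a JSON array of per-comment coding objects, each carrying the comment ID plus the four coded dimensions. A minimal sketch of how a lookup-by-ID could work over such a batch (the variable names are illustrative, and the two rows here are abbreviated from the response above):

```python
import json

# Raw batch output as returned by the model: a JSON array of
# coding objects, one per comment, each keyed by a comment ID.
raw_response = """
[
  {"id": "ytc_UgwxTTPxZcUVeEf3q3d4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyuNDAKLxZ7i9r6Z-t4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Index the codings by comment ID so a single comment's
# coding can be fetched in O(1) time.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment by its ID.
coding = codings["ytc_UgyuNDAKLxZ7i9r6Z-t4AaABAg"]
print(coding["emotion"])  # resignation
```

Because the model returns one object per comment, building this dict once lets the interface resolve any "look up by comment ID" request without rescanning the array.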