Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- don't worry it will take 75 years before a AI robot could do any job reliably. … (ytc_UgyiQnyx0…)
- AI will be achieved maybe some time in future. I just don't think people realize… (ytc_UgwNftwT3…)
- When I tap the subscribe, like, or any buttons at all in landscape mode, they do… (ytc_UgxWyYjcY…)
- Come on are humans controlled by algorithms? No algorithm can imitate humans. Hu… (ytr_UgztoN_on…)
- 1. This is a modern spin on "slavery". The largest element of operational cost… (ytc_Ugx8xywTt…)
- Let's not forget when Google made an AI specifically to diversity hire but had t… (ytc_UgxeSEhIg…)
- I hate to break the inevitable to people who should know this,but over 70 percen… (ytc_UgwdXmnFM…)
- This is by far the scariest episode i watched…with the technology we have today … (ytc_UgzIf3G_J…)
Comment
@davidjstraley1647 pretty easy to come up with some explanations.
1. Because if they don’t these people will quite literally starve and no longer be able to participate in consumerism. There still needs to be a way to acquire currency in an economic system.
2. You heavily tax owners of the AI capital who replaced the jobs of the people who need UBI.
youtube · AI Jobs · 2025-07-31T12:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgyDEgaC6dLUEwiFID94AaABAg.AJqrZh261FdAN67hTpP-fJ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgyIZDdfZIrhqQnG5F94AaABAg.AJqaD-50k84ALiLLFO_GN6","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugw0lzf3ucFWcvpanz94AaABAg.AJq16_Xboe2AKc0n1D3hCi","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytr_Ugw0lzf3ucFWcvpanz94AaABAg.AJq16_Xboe2AKnTXBHkkcn","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugw0lzf3ucFWcvpanz94AaABAg.AJq16_Xboe2AKyhHKEJDF0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyQyG4Q-Ch4ysw5NBF4AaABAg.AJpcc1lA8B_AJtzvyhYTq9","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytr_UgyQyG4Q-Ch4ysw5NBF4AaABAg.AJpcc1lA8B_ALfPhot7NMf","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_Ugx6VBxzt8oz9Sx9X3Z4AaABAg.AJnvU_aQl7GAJq1Szijd1u","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx6VBxzt8oz9Sx9X3Z4AaABAg.AJnvU_aQl7GAJqNbFKTGtr","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgycEL58-Wv7HIkOmEd4AaABAg.AJmTKv3Xx7fALEuKX4_yr5","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
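The raw response is a JSON array, one record per coded comment, with one value for each of the four coding dimensions. A minimal Python sketch of parsing and sanity-checking such an output is below; the allowed value sets are inferred only from the examples on this page (the real codebook may define more categories), and `parse_codes` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "ban", "regulate", "industry_self"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # every record must reference a comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"approval"}]')
print(len(parse_codes(raw)))  # → 1
```

Dropping (rather than raising on) malformed records keeps a single bad line from discarding an otherwise usable batch; a stricter pipeline could log the rejects instead.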