Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Stop building human robot the creation is belongs to God do not create this huma…
ytc_Ugw6bnE5M…
An interesting variation of this would be to allow agents to debate one another.…
ytc_UgyrXlxS2…
It would be funny if AI destroyed us so they could build enough processing cente…
ytc_UgxQzXWUp…
"It's not because any of this is true." No, this true, based on facts. The pro…
ytc_UgwXEhqMB…
No, you're not.
Example:
[Have we, professional developers, already lost the b…
rdc_oc3bcp4
One way to temporarily help people with AI is to reduce the work week. We can st…
ytr_UgwWAYmZW…
or other scenario: a multi-polar AI world
Hypothesis 1 – Eco-AI as Guardian
A c…
ytc_Ugxusezzj…
Lame points. Larger NNs will be able to be more multitasking. The example with s…
ytc_UgzxjIkuR…
Comment
Imagine using AI so efficiently, making products so fast, so cheap… but No One is able to purchase BECAUSE they lost their JOBS 😅
People dies from starvation, Businesses go bankruptcy 😂
This is my “innocent” imagination, any Economic Genius here can share your opinions?
youtube · AI Jobs · 2025-10-21T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyWx43yQ0JvfWxZStJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzBnVqo0DcyZ4lMYnd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzQThPL1eEs3CyVUjl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyAUSITXvryySkdFR14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwocHxkjCmkGXj6aYN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugza2z0Xf0wRyj_pAa94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwfybqnFjBvfw3qo7F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxpXArrYxO0hl8awfF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyIWB6XFmCArYNWMcp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwDaXD2t6kUzKVhYId4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
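A raw response like the one above can be parsed and indexed for the per-comment lookup this view supports. The sketch below assumes the schema shown in the sample (an `id` plus the four coding dimensions); the allowed-value sets are only those observed in this output, not necessarily the codebook's full label sets, and the function name is hypothetical.

```python
import json

# Value sets observed in the sample response above (assumed, not exhaustive).
OBSERVED_DIMENSIONS = {
    "responsibility": {"government", "distributed", "none", "user", "ai_itself", "company"},
    "reasoning": {"consequentialist", "virtue", "unclear", "contractualist", "deontological"},
    "policy": {"liability", "unclear", "regulate", "none", "industry_self", "ban"},
    "emotion": {"approval", "mixed", "indifference", "fear"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID,
    rejecting any record whose dimension value falls outside the observed sets."""
    by_id = {}
    for item in json.loads(raw_response):
        for dim, allowed in OBSERVED_DIMENSIONS.items():
            if item.get(dim) not in allowed:
                raise ValueError(f"{item.get('id')}: unexpected {dim}={item.get(dim)!r}")
        by_id[item["id"]] = item
    return by_id

# Example lookup using the first record from the sample response.
raw = ('[{"id":"ytc_UgyWx43yQ0JvfWxZStJ4AaABAg","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"approval"}]')
codings = index_codings(raw)
print(codings["ytc_UgyWx43yQ0JvfWxZStJ4AaABAg"]["policy"])  # liability
```

Validating against a closed label set at parse time catches the common failure mode where the model emits an off-codebook value that would otherwise silently skew the tallies.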