Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated; use the comment ID to look up the full text):

- ytc_UgwXGP8Zl…: "Humans are vey very scared species. They are even scared to give permission to t…"
- ytc_UgzP70ix2…: "We put a team on preventing hallucinations. Once they showed some immediate succ…"
- ytc_UgzfWPyNC…: "More Knowledge Gives More Wisdom..! Human Knowledge is Very Poor to gain Wisdom…"
- ytr_Ugx3RUL4o…: "Only thing is that if it fails we can recover. If AI takes over its over once an…"
- ytc_UgxdwpNMJ…: "There's a reason why a guy was sitting in the front seat of the Tesla. He's ther…"
- ytc_UgxWTB5JK…: "Humans must eat these AI robots! It's the only way! Seriously though, it will be…"
- ytr_UgxLgOxHG…: "I think regardless it's here to stay. AI art companies are going to start poppin…"
- ytc_UgyQbcQh6…: "How come developers of AI fail to see that the very thing they create today may…"
Comment
The problem is, (as it’s often said), who will fund the demand if everyone isn’t working?
We’ve also seen companies REFUSE to engage in pricing wars when demand spikes. Hospitals are an example… they’d rather pay overtime or run short-staffed than bump up pay to meet labor demand. The same concept will occur with demand from AI production; they’ll save billions in labor, and keep prices the same. Wait till nuclear power occurs, you’ll see prices the same as fossil fuel energy.
youtube · AI Jobs · 2025-11-17T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxhLwZmUJ2F8IeWw0F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgywqxswB7s8wx_o4N54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwXLfq2JEaNEXekPvV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgykwYTEtYFZcuOv3bJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwzPvhKUQx7LJ8jIh54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzyzfL2mV9cHSOb1hN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXSK_N3-Eedu7tCNZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyN9fsVoWAxyYs5_954AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyZUK9pnCaEKUItEYR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzMj7WBXIydGrv-2-p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
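The raw LLM response above is a JSON array of coded comments, one object per comment ID, with one value per coding dimension. A minimal sketch of how such a response could be parsed, indexed for lookup by comment ID, and checked against the codebook (the allowed values below are inferred from the examples on this page; the real codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the coded examples above.
# This is an assumption: the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "resignation", "fear"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by comment ID for fast lookup."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

def validate(rec: dict) -> list:
    """Return the (dimension, value) pairs that fall outside the schema."""
    return [(dim, rec.get(dim))
            for dim, allowed in SCHEMA.items()
            if rec.get(dim) not in allowed]

# One record copied from the response above, used as sample input.
raw = '''[
  {"id": "ytc_UgyN9fsVoWAxyYs5_954AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "liability",
   "emotion": "fear"}
]'''

codes = index_codes(raw)
rec = codes["ytc_UgyN9fsVoWAxyYs5_954AaABAg"]
# rec["emotion"] → "fear"; validate(rec) → [] (all values within the schema)
```

Indexing by ID mirrors the "look up by comment ID" workflow above; the validation step catches off-schema labels before they reach the coded-comment table.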