Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgwpF_maD…` — "If that guy did post an AI image as a comparison before that post then no, you a…"
- `ytr_UgyqMKDD4…` — "@ someone said that AI is exactly like a person’s brush strokes but it doesn’t e…"
- `ytc_Ugx4Jelm8…` — "gates would be one of the last people id go to for anything on ai.…"
- `ytc_Ugw0ryjVn…` — "Funny thing about AI servers. It requires gallons of water to cool down. and h…"
- `ytc_UgxkprNhW…` — "The unions did this. All across the county they are striking. So they invented a…"
- `ytc_UgzqaH1MP…` — "Why does this guy claim that Elon Musk has no moral compass? As he demonstrated …"
- `ytc_UgyFViyVP…` — "My phone can’t spell words , auto correct is still a joke. But AI will be perf…"
- `ytc_UgyKWlcv_…` — "Professor, I am currently reading your wonderful book \"The Age of Surveillance C…"
Comment
In theory, one would have no issue with this, if we were living in a Star Trek-like world. As long as there is a true universal basic income and the means to produce resources/energy independently, etc. But we don't live in such a world, and people that would benefit from UBI fight against it. Even if we made robotics and AI illegal in America, US companies would still be competing with automation overseas. The problem with technology like this is that it's a true Pandora's Box; we can't go backwards, and the technology will absolutely happen. Solutions to this aren't necessarily to try to force people to live in a backwards facing state, but in the short term, there are no other options. Until we reach a point where true UBI is accepted and feasible, we have to chain billionaires from ruining the planet.
youtube · AI Jobs · 2025-10-08T16:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugy9k0bcMPvsAIgnwuZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxAJpalnUcADfjZgLd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOAcHqaht_vjaaEBB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyPJIQPRr7mx0o9DYp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzo-UX0eSuV3RWZQvl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzuMnzYunU49CiH9e94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwwiLrUlK0nezpO7dd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzm4ViWohVcY9JEbYl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzmvxKgfpTOCpOBIzh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwua7i1r0_VHXJ847R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
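A response in this shape can be validated and indexed in a few lines. The sketch below (Python; the allowed value sets are inferred from this one batch, not from a documented schema) parses an excerpt of the array and looks up a record by comment ID, mirroring the "Look up by comment ID" view above:

```python
import json

# Excerpt of the raw LLM response: one coding record per comment.
raw = '''
[
  {"id": "ytc_Ugwua7i1r0_VHXJ847R4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_Ugy9k0bcMPvsAIgnwuZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
'''

records = json.loads(raw)

# Index by comment ID so lookup is a single dict access.
by_id = {r["id"]: r for r in records}

# Sanity-check each record against the value sets observed in this batch.
# NOTE: these sets are an assumption inferred from the response above.
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "outrage",
                "indifference", "resignation"},
}
for rec in records:
    for dim, allowed in ALLOWED.items():
        assert rec[dim] in allowed, f"{rec['id']}: bad {dim} value {rec[dim]!r}"

code = by_id["ytc_Ugwua7i1r0_VHXJ847R4AaABAg"]
print(code["policy"], code["emotion"])  # regulate resignation
```

Rejecting out-of-vocabulary values at parse time catches the most common LLM coding failure (an invented label) before it silently enters the dataset.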