Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by ID. Random samples from the coded corpus:
- "I think all the technologies that we didn't invest in, so we could free up capit…" (`ytc_UgzcakTuQ…`)
- "The only good use of deep fake tech I've seen has come out of Hollywood. It's be…" (`ytc_UgwW4pUHv…`)
- "Look at the eyes emotions are a mix of many muscle and faces keep traces of the …" (`ytc_Ugxz1jn-G…`)
- "imo robots deserve Rights but not like Adults , more like Children have a few Ri…" (`ytc_UggpPoRog…`)
- "The I in AI does not mean intelligence - means idiot. AI is human generated pro…" (`ytc_UgwXj31gZ…`)
- "😢out of it all the crowd cheering for a robot says it all we are doomed…" (`ytc_UgyO2sgHQ…`)
- "AI can reveal uncomfortable truths that some in society are unwilling to acknowl…" (`ytc_UgxmpC7s-…`)
- "It feels more difficult to *not* use AI. I fundamentally disagree with the usage…" (`ytc_Ugz-fEsip…`)
Comment

> @mr.frandy7692 I think ignoring inflection points in the cost of fuel, which we've used to simultaneously replace labor and produce goods and systems that human labor isn't capable of building and maintaining is a much more pressing issue than the facade of "ai" will ever have.
>
> There is no AI as per Gödel's incompleteness theorems that is simultaneously logically consistent and complete. The LLMs are constrained by the perimeters set by the equally flawed logic of the programmers, who seemingly don't understand the simple fact that there isn't infinite fuel in a finite volume, and as Odum's Maximum Power principle proves: that the most complex systems are the most energy intensive.
>
> Are you really in a position to dictate whether you'll "let" people starve? They won't starve, as Gustave Le Bon explained in "The Crowd," they'll kill each other.
Source: YouTube · Video: "AI Jobs" · Posted: 2025-09-22T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytr_UgyUy0uLCgZAZY4LjTp4AaABAg.ANNkDwwkw6EANNwnpjQyKR","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwH5YM9uXZSnkZ1YbV4AaABAg.ANNjuHKy1aOANNpauNRwxt","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwH5YM9uXZSnkZ1YbV4AaABAg.ANNjuHKy1aOANNzB-Xr7jO","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwH5YM9uXZSnkZ1YbV4AaABAg.ANNjuHKy1aOANO1oW_02WJ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzoNOT_oEn9Gbqcn814AaABAg.ANNjAHu38qFANNnsXg--Y9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx7x5AK7DkZ1aCZavB4AaABAg.ANNgMRXe61yANOCu8jNMid","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugxzk3oJX6vhnU8ZDBN4AaABAg.ANNeHlRy3y4ANNhtTG_JCw","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyVYs4EJmvYnIbyqYl4AaABAg.ANNdfmSJ8FsAOREZcb44W8","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugxz0ssUpkG2fSwdQBl4AaABAg.ANNdZ3ubKAfANNgC852dsA","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugxz0ssUpkG2fSwdQBl4AaABAg.ANNdZ3ubKAfANOrf5RS5tS","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
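The raw response is a JSON array of per-comment records, each keyed by comment ID and carrying the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of the lookup-by-ID step, using two records copied verbatim from the response above (the indexing code itself is an illustration, not the tool's implementation):

```python
import json

# Two coded records copied from the raw LLM response above.
raw = """[
  {"id": "ytr_UgyUy0uLCgZAZY4LjTp4AaABAg.ANNkDwwkw6EANNwnpjQyKR",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugx7x5AK7DkZ1aCZavB4AaABAg.ANNgMRXe61yANOCu8jNMid",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]"""

records = json.loads(raw)

# Build an ID -> record index so any coded comment can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytr_Ugx7x5AK7DkZ1aCZavB4AaABAg.ANNgMRXe61yANOCu8jNMid"]
print(rec["responsibility"], rec["emotion"])  # government fear
```

A dict comprehension is enough here because comment IDs are unique within a response; a missing ID surfaces as a `KeyError` rather than a silently empty result.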