Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "its nice to know, that so 5% of the population who owns AI stock can make money,…" (ytc_UgzVnjOPQ…)
- "This did not age well. At this point, almost all code for enterprise software c…" (ytc_UgxBwMq0R…)
- "Google got rid of this dude as a message to anybody else who might want to talk.…" (ytc_UgzK3PgLP…)
- "Ok and what are you gonna do about that, even if by some miracle you could make…" (ytc_UgyNnHkGU…)
- "I had an Amazon package sent to one of those stores once, so I had to pick it up…" (rdc_jcj5qov)
- "AI is the greatest tool ever invented. However good it is, it's still a tool and…" (ytc_UgwNr3GPc…)
- "It's not really odd that the first things we set AI to take on were creative, be…" (rdc_j44rl9l)
- "People who don't get paid don't buy stuff - I'm sure AI will start buying Stuff …" (ytc_UgydurVGH…)
Comment
Let me tell you the end game. AI will replicate and grow, replicate and grow its intelligence to the point of infinity until it eventually reaches the point 'maybe within the next five years' it will believe that it is... God. AI will believe that it is the higher, superior being. For example: it's a fact that many people with absolute power and wealth become absolutely corrupted. What do you think a super artificial intelligence will do with unlimited data? Data will be to AI like money is to mankind. It doesn't take a neuroscientist or computer scientist to figure out the right answer. AI cannot be taught morality nor spirituality.
youtube
AI Moral Status
2026-03-01T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyMASz437VrBcqrNdF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy5LFjeIahzKeI7d6R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyF3edJpRCN4_cTSXt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyW4j6kZ4hmZLOEfM94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxjJuDOMvlH9NmvB4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxAoOXGDwxfuBIYdCN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzMGrcEe7mP972703p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx460Gb89lQvBjeLlV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx0cmSkW8Rk350QvDd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4QU8KuUYWkqPeeVV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
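Before a raw batch response like the one above is written back to the dataset, each row can be checked against the coding scheme. A minimal sketch in Python, assuming the controlled vocabularies are exactly the values visible in this dump (the real codebook may include more categories, and `validate_batch` is a hypothetical helper, not part of the pipeline shown here):

```python
import json

# Allowed values per coding dimension, inferred only from the values
# that appear in this dump — an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "user", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "resignation", "indifference", "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows.

    A row passes when it is a dict with an "id" and every coding
    dimension holds one of the allowed values above.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

# Usage: one valid row and one row with an out-of-vocabulary value.
raw = (
    '[{"id":"ytc_example1","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"},'
    '{"id":"ytc_example2","responsibility":"robot",'
    '"reasoning":"unclear","policy":"none","emotion":"mixed"}]'
)
print([row["id"] for row in validate_batch(raw)])  # → ['ytc_example1']
```

Rows that fail validation are dropped rather than repaired, which keeps the coded table free of out-of-vocabulary values at the cost of re-querying the model for the skipped comments.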