Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I use chatGPT for work and its still very dumb, it is far from ready to replace …" (ytc_Ugz-syA1t…)
- "I think ai art should be used for fun, or for making stupid stuff to laugh at wi…" (ytc_UgxIKLB_2…)
- "Exactly! It’s for fun! Like those AI generated Pixar “trailers”😂. Pretty funny!!…" (ytr_UgwAYZCbe…)
- "I don’t really care about Ai art, but it shouldn’t be that hard to just copy and…" (ytc_UgzyoVjCn…)
- "I think we are much further away from all of this than he presents. Automatizati…" (ytc_UgxnXOmVQ…)
- "If that is true then AI will never be able to reach ‘singularity’ where it would…" (ytr_UgwIMK_sq…)
- "@thefunseeker9545 Your final point in parenthesis is meant to be an aside, but i…" (ytr_Ugx8RZB2H…)
- "Ai hate is so stupid. If it wasn’t legitimate art then you wouldn’t care. You’re…" (ytc_Ugw8CHIUS…)
Comment
So funny story. The people involved in developing Ai have now crossed a threshold that changes everything and they are warning about it. In the beginning people coded the Ai but then they developed a system of a hive Ai, a teacher Ai and a student Ai. When the student got wrong answers the teacher Ai reported to the hive Ai and the algorithm was destroyed, when it answered correctly the hive would keep the algorithm and then develop it. The Ai now codes itself and the original creators do not even understand the code. So the comments about whether you trust Ai because of it being developed by people is moot, the Ai learning has surpassed the limits that humans were involved in. It’s coding itself at levels above human comprehension. The warnings have been voiced.
youtube · AI Moral Status · 2025-08-30T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxW9oQWAgbarPv455N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwuSbRw-IOMaNXyach4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzpBdFKjtwsdBRXSZR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJYPFGCezz0fw36xZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyjzSR2l4dbLuEyREh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWRZkGLKpB73ba_ax4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzGKsJzfUH9riC-cux4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyJlhtDdqWj-pGn-N54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxExsB0rknzAyhY07V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzTwNGQaoY0pX6NRah4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
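The look-up-by-comment-ID flow above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation: it assumes the raw LLM response is a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys, as shown above; the two sample records and their IDs are copied from that output.

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per
# comment with four coding dimensions (subset of the output above).
raw_response = """
[
  {"id": "ytc_UgyJlhtDdqWj-pGn-N54AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyWRZkGLKpB73ba_ax4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
"""

# Index the coded records by comment ID so any coded comment can be
# inspected directly, mirroring the "Look up by comment ID" widget.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgyJlhtDdqWj-pGn-N54AaABAg"]
print(code["responsibility"], code["emotion"])  # developer fear
```

A dictionary keyed by ID keeps look-up O(1) even when the batch holds many coded comments; truncated IDs as displayed in the sample list (e.g. `ytc_Ugz-syA1t…`) would need to be expanded to the full ID before look-up.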