Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI companies have already proven that they don't care about intellectual propert…
ytc_UgzLtu8q8…
I used to draw art on the computer as a hobby, did some fanart of stuff but I ab…
ytc_UgwgsD5rW…
Lol if you think ChatGPT is the kind of AI that studios will use to create conte…
rdc_jtb9mh2
I agree. It will also increase the number of hackers and scammers online. If p…
ytc_UgxIGzwWm…
@laurentiuvladutmanea I can see you have no idea how this really works, and are …
ytr_UgxGAh1xq…
If AI does not need to be paid, just maintained, then products and merchandise c…
ytc_UgyUpsyec…
The book of Revelation should be read and then you will see that AI is in there …
ytc_UgwDon0gn…
When l see a company using AI, my only thought is that they have no care for the…
ytc_UgyUn2Cup…
Comment
The core objective of AI is to unlock the full potential of human intelligence. While we possess the most sophisticated neural networks biologically, we have long lacked an optimized "teaching system" to activate them fully. If an AI can develop an application in minutes, humans—if properly empowered—should be capable of tackling even more profound complexities.
The true frontier and danger of AI research lie in training methods based on partial side models of neural networks; such methods might trigger anomalies in the AI model and can lead to AI outside our reach. A human neural network is already equipped with an energy transfer mechanism, a neural networks models that unify the five forces and can be the core of the cost function, which explains our abilities to learn at different levels than AI models and guides us in learning the proper way.
youtube
AI Moral Status
2026-03-02T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzcS7dAA26nMHbvkut4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz1qD1zQTimP_uSdKR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyYyeVQYBPErrM6OJd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyg9oXd8HnnGOVhPX14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgypHzNK50rMpCYg3C54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_kAkSTCvXCsKcTAt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwatZ_9S-Y7FYhus4t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxS6xMigpwhcrkRxxV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwT2e72MtllrSZufql4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxIBNYjora_mz5KWxR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
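The lookup described at the top of the page (find the coding for a given comment ID in the raw model output) can be sketched as below. This is an illustrative sketch, not the tool's actual implementation; the function name `index_by_comment_id` is invented, and the embedded response is a two-row subset of the JSON shown above.

```python
import json

# A subset of the raw LLM response shown above (two of the ten coded rows).
raw_response = """
[
  {"id": "ytc_UgzcS7dAA26nMHbvkut4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxS6xMigpwhcrkRxxV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_comment_id(response_text):
    """Parse the model output and map each comment ID to its coding row."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgxS6xMigpwhcrkRxxV4AaABAg"]["responsibility"])  # developer
```

A real pipeline would also want to handle model output that is not valid JSON (e.g. wrap `json.loads` in a `try`/`except` and flag the comment for manual review), since the page exists precisely to inspect such raw responses.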