Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect:

- "I’ve recently learnt a new language and find ChatGPT to be a really good rubber …" (`rdc_jif8lps`)
- "To be honest there a lot left to reach AGI level, everything depends on hardware…" (`ytc_UgzHjUQl2…`)
- "Ain't nobody gonna be buying anything those trucks are carrying if we don't have…" (`ytc_Ugz4UgPYv…`)
- "There should be one website designed for artists where AI is prohibited from tak…" (`ytc_Ugx1Xn6R-…`)
- "Here is a challenge for UBI. If we assume that UBI will pay the living cost for …" (`ytc_UgxkNcB_k…`)
- "The reason for the AI black mail is not that complicated. If you give AI the tas…" (`ytc_Ugy2gxwFy…`)
- "No one said they wouldn’t. If people like Trump are in control yes they might. T…" (`ytr_UgyEecRdE…`)
- "The antichrist will not be a man nor a woman. In Revelation it talks about man b…" (`ytc_UgyJcjczb…`)
Comment
I have just had a thought. We already have an example of an iterative ai. Humans. The human ai is probably still the smartest ai over everything that we do. General knowledge, skills, talking, language, movement, visual identification etc. The problem is as advanced as the human ai is and the development etc; once the model gets trained it gets slow and when the environment changes it needs to reset. It passes some training to the next generation but it has a maximum limit as the more complex, the harder to grow. The limit may be too exponential to possibly exceed Humans practically as we are probably the most advanced possibly. Hopefully, ai won't exceed a useful tool.
youtube · AI Governance · 2025-10-02T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyBYvIpaON4sSB0p5h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgybX7CpubnvveZOgtF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx1S_tcWXBqbTlV1Wp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzHUvKfrSO20ro5iEJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwBN1vQ2ntWD0DctUh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx5LHkot8VhU1Kav7N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz4Vu6FcD6ICOBn-vt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwnlUlFvIPhY7PxjDl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzEJJ_tlRnwzEAxZOB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyVpJ-L19dy8c5mGZN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
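A response in this shape can be consumed directly with the standard library: parse the JSON array and index the records by comment ID so that any coded comment can be looked up the same way the page does. The sketch below is illustrative, not part of the actual pipeline; it reuses two records from the raw response above, and the `by_id` name is an assumption of this example.

```python
import json

# Raw model output: a JSON array of coded records, one per comment
# (two records copied from the response above, for brevity).
raw = """
[
  {"id": "ytc_UgyBYvIpaON4sSB0p5h4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzHUvKfrSO20ro5iEJ4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

records = json.loads(raw)

# Index by comment ID for constant-time lookup (IDs are unique per comment).
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgyBYvIpaON4sSB0p5h4AaABAg"]
print(rec["emotion"])  # -> indifference
```

The same dictionary lookup backs both the random-sample view and the lookup-by-ID box: the page only needs the record whose `id` matches the requested comment.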