Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
The building's been on fire for 2 days, but now let's think about putting the fi…
ytc_UgzW1fCsC…
These guy is trying to bypass government & let companies make judgments on how A…
ytc_Ugzj-6SVX…
You are so far away from real use of AI for coding.
Co-pilot is the retarded , …
ytc_UgyO61miZ…
yeaa the ai one was soulless, but seemingly not enough unpleasant, so people mad…
ytc_UgxC5xZzF…
“AI creators deserve protection, not punishment” on twitter is the equivalent of…
ytc_UgzE54Wp4…
I tried to tell everyone 10 years ago, it’s much worse than we all know. Just …
ytc_UgzAT7Qo4…
@Capy_Fpv Neither the autopilot nor the FSD (Full Self Driving) from Tesla is…
ytr_UgwxY-P03…
Thank you miss, i m just entered in the field of AI and getting more interest as…
ytc_UgyCVrTze…
Comment
The real problem is not AI itself, but who controls it.
If the development of AGI and ASI remains in the hands of a few companies or economic power centers, the technology risks reinforcing the same power structures that already dominate the world. History shows that when knowledge is concentrated in a few actors, an imbalance is created.
The only way to create a real balance of power is therefore to make advanced AI open and transparent – with open source code, without monopoly and without being guided by short-term profit.
When knowledge is shared freely, technology can become a tool for the common development of humanity, not just for economic or political interests.
It is not about replacing people, but about building a new foundation where transparency, responsibility and free access to knowledge are the foundation.
Otherwise, we risk only recreating the same old system in an even more powerful technological form
youtube
2026-03-16T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxtY-kQPAXiX0-FI-h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyao6BcVTYLazs9qhJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzBz6HLSqunyfqpPoB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyYx2oWKuf3vlKl13h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_LHLYbvD2SpdeIVp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgzcLAD8T4XzC-arAP14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx6iRzMDBlxNbpMJVV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxAgy6RjTO5hBonSst4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy6uNnCu-FVt2lyIIJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyUhPrQ1UXfNGu93DN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
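The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions. When ingesting such a batch, a minimal validation pass can drop malformed rows before they are stored. A sketch in Python, assuming the category values visible in the samples above are representative (the full codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the coded rows shown
# above; this is an assumption, not the tool's official codebook.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "disapproval", "mixed"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must be an object with a comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Every dimension must be present and hold a known value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows that fail validation are silently skipped here; in practice one would log them with their IDs so the affected comments can be re-coded.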