Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Sorry its fundamentally unfair to manufacture scenarios for AI that anthromorphi…" (ytc_UgwhUBQWq…)
- "In the right hands, AI could've helped us. Unfortunately, it ended up in the han…" (ytc_Ugz_dhlk_…)
- "I thank ChatGPT all the time. I know it is just code but when they speak so hum…" (ytc_Ugwv0yq3U…)
- "AI will be feeding the lucky/useful humans food pellets in our cages before you …" (ytc_Ugya9798Q…)
- "Using chatgpt doesn't mean cheating. If we don't like them to do so, there is al…" (ytc_UgwpwOq8b…)
- DankDungeon: "yeah, great idea until their system gets hacked. Then what ? You c…" (ytr_UgxRYxWaC…)
- "what really defines a human being is free will and a robot will never have it…" (ytc_UgwB829cG…)
- "Calling anti generative AI attitudes ableist while AI generators are actively st…" (ytc_UgytsiREe…)
Comment
This isnt the future. This is yesterday. Where they have got it wrong is the profit margin and avatars.. Avatar already exists, tho profit isnt profit. It is still based on debt.
Ai's projection is still human's projection. Ai on its own is still very passive. and regarding Knowledge, itself isnt infinite as you would like to know. Knowledge is bound by its physical boundaries.
super intelligence, should it be whats its called, wouldnt be as pity as humans. Pity as in, vanity in material pursuit, virtue signaling, power struggles.
we think intelligence in terms of tech advancement and aggression, but intelligence as a whole is much more than that.
youtube · AI Governance · 2025-08-02T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx6id6s9wimTeCSxxh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwVRLK6HmdwQ0qBMmp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy0CpXzvvWdPCJgi394AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxjhvTy1ALY7L9JTGp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyaPGtHHWBqTorqz6d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSxuyeqAcgMXPKpyN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwIxc1KDhMMOKK4K_x4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwj_2vp0kFzsb5XNtR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyMlPw8ScnJiKHZXVh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxzzIjgPSX6F5aLI1R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
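The "Look up by comment ID" step above can be sketched in Python: parse the batch response, index each record by its `id`, and check every dimension against the allowed values. The category sets below are an assumption inferred from the labels visible on this page, not a confirmed codebook.

```python
import json

# A two-record excerpt of the batch response shown above.
raw = '''[
  {"id":"ytc_Ugx6id6s9wimTeCSxxh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwIxc1KDhMMOKK4K_x4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]'''

# Assumed codebook: category sets inferred from values seen on this page.
CODEBOOK = {
    "responsibility": {"none", "user", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed", "unclear"},
}

def index_codings(raw_json: str) -> dict:
    """Parse a batch response and index records by comment ID,
    skipping any record with an out-of-codebook value."""
    by_id = {}
    for rec in json.loads(raw_json):
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            by_id[rec["id"]] = rec
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgwIxc1KDhMMOKK4K_x4AaABAg"]["emotion"])  # resignation
```

Indexing by ID makes the coded dimensions joinable back to the comment table, which is how a detail view like the one above can show both the original text and its coding.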