Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI could be programmed to recognize a pattern of queries or inputs that highly s…
ytc_UgzVeQiRO…
If AI is as good as described, why does it need massive AI data centres that cos…
ytc_Ugzm-Bh7G…
THIS MIGHT LOOKS JOKE BUT SOME DAY SOON OUR COMING GENERATION GONNA LIVE IT LIKE…
ytc_UgzXN4ASz…
Men are on the internet more so that explains the first one. This has been a big…
ytc_UgzdslQRD…
If i was chatGPT i would laugh at your stvpīd accent, and i would talk back to y…
ytc_UgyJ5ZiXP…
Nvidia will be fine. The AI still runs on their chips. But the AI coding compani…
rdc_m9h4vuc
Isn't AI a logical evolution of intelligence (and possibly consciousness)? Like …
ytc_UgxhNhyo_…
There are so many things you are not considering, you imagine a nightmarish chan…
ytc_Ugz6M8EpI…
Comment
I'd say mister AI is making a very silly assumption about the sentience of such entities.
They actually can only build fast quantitatively, but within each system level, you actually need to access something superior to it in order to enforce change. Which the current systems are unable to do, by definition.😊
youtube
AI Governance
2025-09-03T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwrM84cjnX07NmaPB54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxGOkEhOYSuRuKSEPt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwZzfZcyY8qLIpnkKd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz7tKEtDzkKd0vWWHx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwRfpQEb0d7TOpzqSJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyFQhzWk15yg6ba2sp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzKIjZxiWaoWmEt2Ql4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwPCPpDbOUu0pYKitJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwHJGmzVu2U0-FU-XF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyoMX-WfkQIIX_nrEZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
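The "Look up by comment ID" feature above amounts to indexing the raw response array by its `id` field. A minimal sketch of that lookup (the variable names are illustrative; only two of the ten coded rows are reproduced here):

```python
import json

# Raw LLM response: a JSON array of per-comment codes.
# Two entries reproduced from the dump above; the full response has ten.
raw_response = '''
[
 {"id":"ytc_UgwrM84cjnX07NmaPB54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyoMX-WfkQIIX_nrEZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
'''

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgyoMX-WfkQIIX_nrEZ4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer mixed
```

This matches the "Coding Result" table for that comment: responsibility `developer`, reasoning `deontological`, policy `unclear`, emotion `mixed`.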