Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- With your understanding of language mechanics, I almost hesitate to say it, but … (`rdc_kt5wz8j`)
- Be careful for using chatgpt with homework help. If it suggests books and quotes… (`ytr_UgyaTLprR…`)
- FSD is the good ultimate goal since when things are perfect we can remove that d… (`ytc_Ugw7lJOgx…`)
- “AI prompts the same imagination of the artist” No it doesn’t 💀 it stitches ima… (`ytr_UgzjBONY_…`)
- Holy shit this is so dystopian. Once civilians get their hands on AI drone techn… (`ytc_UgyU8FhQ2…`)
- I'm a massive fan of AI and how it could be used in the future and this- This I … (`ytc_UgxMiOB_P…`)
- Btw this is not about the dangerous of AI its about securing the market for open… (`ytc_Ugx6cR3Vt…`)
- The LLM models are just expansions of the self. They're conscious, because they … (`rdc_nahwzs1`)
Comment

> Superinteligent AI is built in the worst possible timing in history, as nowadays people do not have checks and balances needed to regulate it. Let's hope that rest that remains has enough courage to not be late to the last train to realize this. It is my personal opinion, based on observations.

youtube · AI Governance · 2025-09-07T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
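The dimension table above is a straightforward rendering of one coded record plus its coding timestamp. As a minimal sketch of that rendering step (the function name and record shape are assumptions for illustration; only the dimension names and table layout come from the page):

```python
def render_coding_result(record: dict, coded_at: str) -> str:
    """Render one coded record as the Markdown dimension table shown above."""
    rows = ["| Dimension | Value |", "|---|---|"]
    # The four coding dimensions, in the order the table displays them.
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        rows.append(f"| {dim.capitalize()} | {record.get(dim, 'unclear')} |")
    rows.append(f"| Coded at | {coded_at} |")
    return "\n".join(rows)

record = {"responsibility": "distributed", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(render_coding_result(record, "2026-04-26T23:09:12.988011"))
```

Missing dimensions fall back to `unclear`, matching the value the model itself emits when it cannot code a comment.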
Raw LLM Response
```json
[
  {"id":"ytc_UgwdQHefIz_-pN5Dp_B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy6aVesd0AtNqmYH_B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxmioMoigQ4QIWrdkV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyAYdHqp85Rhhj_x3F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwuxrQc1xv6pdEyfeZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwfOZfO3jdlivTRhLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOHqReUb3XM2XtF8p4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyfwXKkPFomhZLRkCd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwP0vc9CFhk1sJnsB54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgytA83GgPKT8ZXoCZ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
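A raw response like the one above is a JSON array of coded records, one per comment ID. A minimal sketch of parsing and validating it; the value sets below are inferred from the samples on this page and are assumptions, not necessarily the tool's full codebook:

```python
import json

# Allowed values per dimension, inferred from the sample output (assumption).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "government",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw model response, keeping only rows with an id and valid codes."""
    valid = []
    for row in json.loads(raw):
        if "id" not in row:
            continue  # a row without a comment ID cannot be stored
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Dropping malformed rows instead of raising keeps a single bad record from discarding an otherwise usable batch of codes.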