Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "What if you have someone read it that has a photographic memory? That alone woul…" (`ytc_UgxJXRgdU…`)
- "Very disgusting 🫣 😒 😑 😕. Stop embarrassed your beautiful country and remove thi…" (`ytc_Ugyua53ek…`)
- "The situation is, AI could easily squash you as well as everyone you have ever k…" (`ytc_Ugw5YacrY…`)
- "humans who trained this AI are moreso at fault than the users who are mostly obl…" (`ytr_Ugwln5DbK…`)
- "This is the best tl;dr I could make, [original](https://inews.co.uk/news/politic…" (`rdc_hm6vd5x`)
- "The term \"innocent until proven guilty\" apparently doesn't apply if you are blac…" (`ytc_UgwGeMNsh…`)
- "It was when a colleague was talking about a compilation of \"guess what's AI\" vid…" (`ytr_Ugz0RWrFB…`)
- "I’ve said this before but not having children is definitely not just an economic…" (`rdc_lj8zkju`)
Comment
So, if we humans are living in a simulation in which unchecked AI progress will inevitably create a Superintelligence that would likely wipe us out, then it's high time for the "God" running this simulation to intervene. Or is this simulation designed to test whether humans generally will opt to cooperate in order to save themselves? I lean toward believing that whatever happens, we've got it coming.
youtube · AI Governance · 2025-09-06T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxlmscQ58XZpJ3X3qN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxTzZzrlvspzdxncQZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzXQmq86ckpNbExv554AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgycraD6cejkzMLMCqR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx5DHngiRCRwRdMPGx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzQ82qHLz5HpXlA8nB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxnIUvsBHZP39QZhUJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyKmnwumadPd0dehW94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyyGegSnJVB8FtgEYB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxfP5k-S_ebMI9dqbV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
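A raw response like the one above can be turned into a per-comment lookup in a few lines. The sketch below is a minimal, hypothetical parser: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown here, but the `parse_codings` helper and its validation are illustrative assumptions, not the tool's actual code.

```python
import json

# Raw model output, truncated to three rows from the response above.
raw_response = """[
  {"id": "ytc_UgxlmscQ58XZpJ3X3qN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyyGegSnJVB8FtgEYB4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxfP5k-S_ebMI9dqbV4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]"""

# The four coding dimensions shown in the result table above.
FIELDS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into a comment-ID -> coding mapping,
    rejecting rows that are missing any coding dimension."""
    codings = {}
    for row in json.loads(raw):
        missing = [f for f in ("id",) + FIELDS if f not in row]
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing {missing}")
        codings[row["id"]] = {f: row[f] for f in FIELDS}
    return codings

codings = parse_codings(raw_response)
print(codings["ytc_UgyyGegSnJVB8FtgEYB4AaABAg"])
```

Looking up `ytc_UgyyGegSnJVB8FtgEYB4AaABAg` recovers the distributed/contractualist/none/fear coding displayed in the result table for the selected comment.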