Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Too bad the conversation goes sideways when talking about human life simulation. Seriously, especially in front of the dangers of AI, this kind of debate is unresponsible. Geology is a testomy on how life is real. A computer cannot do that. Also fantasies of this kind are immediatly debunked by the simple use of it : what's the point for a super intelligent specie to simulate humankind. We are so awfull, we are just a random error in the universe who tries to find a path and a future. This kind of thinking is another reason we are not in a simulation : it's dumb. Focus of potentials/ or true dangers and stop living in concepts... that's how you destroy lives : by forgetting it is precious.
| Source | Project | Posted |
|---|---|---|
| youtube | AI Governance | 2025-09-10T08:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyMv2pUHmnSZaq1lZJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyuH7seMtgTH-C8BuZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwYu7lEKBPfhxB69Cx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyrdEDP9v70WUDEsP14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyNkyq9L9hQOyaL4Vh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzg084dcp1355JfVyR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwRcR3nyRHtaeHsJI54AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxcUI6n_3egU-Y_pXd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgyJupLUaeZhtXTezp54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxndZmf1yVRyogq5mN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
```
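A raw LLM response like the one above can be parsed and checked before ingestion. Below is a minimal Python sketch of such a validation step; the `SCHEMA` dict and the `validate_records` helper are illustrative assumptions (the allowed values listed are only those observed in this response, not necessarily the full codebook).

```python
import json

# Allowed values per coding dimension — an assumption based only on the
# values that appear in this particular response, not the full codebook.
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"outrage", "fear", "mixed", "approval"},
}

def validate_records(raw: str) -> list:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Skip anything that is not a dict with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension holds an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgyMv2pUHmnSZaq1lZJ4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
print(len(validate_records(raw)))  # 1 record passes
```

Records that fail the check are dropped rather than repaired, so a malformed model output never silently enters the coded dataset.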