Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The "We had a policy against sentient AI" is hilarious! We all know policy is a …
ytc_Ugzr2qcdV…
ai art is fine as long as you dont claim it as your own ngl…
ytc_UgxN9YB5v…
Ai will take up jobs ..not today ..maybe tomorrow ..but definitely day after tom…
ytc_UgxpAT7ph…
They need to make a law. Ai shouldn’t be allowed to replace human jobs, it shoul…
ytc_UgxpKOko6…
We’re arguing semantics with this debate. If someone wants to call the AI the ar…
ytr_Ugxm2w06R…
AI is ruining everything online where you can’t believe anything anymore, it’s r…
ytc_UgwaUTbH1…
Not if we stop it, it won't.
No one knows how to get a broadly superhuman AI to …
ytr_UgzjLsKym…
You're missing the point. You can't compare AI to just "better tools". And the c…
ytc_UgzLP1Gsp…
Comment
I avoid pay to win games like the devil avoids holy water.
It is a matter of self-respect and the value of my time in life.
If I play a game, I am looking for a fair challenge where my skill and intellect are tested against those of other humans. Therein, though, may also lie the crux for the future: as AI develops further, distinguishing it from other humans will become more and more difficult, and while for training runs that may seem fine, my sense of competitiveness tells me that I won't much enjoy playing against AI. They either let me win or they don't, but there will be no contest, as the AI could always win. So, as a wise movie once told us, the only way to win is not to play.
youtube
AI Harm Incident
2024-07-30T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugz338t9p9FeIIt5AMV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwxEgTSlnqNlb8hJop4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzGupDnAhWuM-HbyPV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzpGGNfAqMHa8rT7MR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxCcl8suyuvDTHge0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzJYuFlVyVUoivdyfl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyiNkXl_wFjjjChPgh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx_aIf2fNPyTCLVWe54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxyIumokLGdYbU_e_J4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyqenx6aplnocPQIIB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
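For downstream analysis, a batch response like the one above can be parsed and indexed by comment ID, which is also what the "Look up by comment ID" view needs. A minimal Python sketch, assuming the raw output is a JSON array with exactly the five fields shown; the function name, variable names, and the two-item excerpt are illustrative, not the tool's actual implementation:

```python
import json

# Two-item excerpt of a raw batch coding response, for illustration.
raw_response = """
[
  {"id":"ytc_Ugz338t9p9FeIIt5AMV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxyIumokLGdYbU_e_J4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
"""

# The five coding dimensions every record must carry.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(text):
    """Parse a batch coding response and index the records by comment ID."""
    records = json.loads(text)
    codings = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {missing}")
        codings[rec["id"]] = rec
    return codings

codings = index_codings(raw_response)
print(codings["ytc_UgxyIumokLGdYbU_e_J4AaABAg"]["emotion"])  # indifference
```

Validating the field set before indexing catches truncated or malformed model output early, before it silently corrupts the coded dataset.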