Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The killer AI that is out to destroy the world could assemble the weapon, make t…
ytc_UgyT_hMms…
It's probably not incorrect as much as incompetent at doing such task. That is c…
rdc_jflttlv
Honestly, the more "advanced" "AI" gets the shittier it is. Every AI chat model …
ytc_UgywFlQ10…
Meaningful things don't have to be done for money. I don't see the problem. Ther…
ytr_UgyYMju6Q…
These are the type of questions this individual wants to ask a machine like this…
ytc_Ugxot3AgP…
AI art is, for the most part boring
no effort means no achievement (developing …
ytc_Ugza6n78z…
Here's an interesting quandary if they're going to do artificial general intelli…
ytc_Ugyt50KGp…
Shad always had a limited understanding of a lot of things he was talking, he ma…
ytc_UgyZF72sf…
Comment
What happens? Society will likely break apart (or pull together if we’re smart). There are a few distinct scenarios. The first is everything basically becomes free (or near free). The second scenario is more of a Hunger Games type of situation…where there’s the ultra poor and ultra rich. The ultra poor are pitted against one another to keep everyone in line. Given the human ego, the second scenario seems more likely. But a third possibility is AI dominates and decides to eliminate the human race . The Terminator scenario, which is probably the most plausible. AI decides humans cause the most death and destruction on earth and they decide to clean house.
youtube
AI Harm Incident
2025-02-09T04:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgyN74Tce6rRy2pk3TV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytc_Ugy-hk42k0196pVN5wF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgzcWlqH1pY8MkWYVXl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgxJJWDCyphXqTBgrBd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgyiF0YbWVVfW_Q3KQd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytc_UgzkITlqyL_ovF6frOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgyJuP8ofHrXIvod81t4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},{"id":"ytc_UgyDGNQf5vUDwHf21HV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_Ugx-YfAEFWkR57xR86l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},{"id":"ytc_Ugzc_4LGJjO0Jx-f6354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
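A minimal sketch of how the "Look up by comment ID" view could work: parse the raw JSON array the model returns and index each coding record by its `id`. The two entries below are copied verbatim from the raw response above; the function name `index_by_id` is illustrative, not the app's actual API.

```python
import json

# Two real records from the raw LLM response shown above (truncated to two
# entries for brevity); each record carries the four coding dimensions.
raw_response = '''[
  {"id": "ytc_UgyDGNQf5vUDwHf21HV4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzc_4LGJjO0Jx-f6354AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse the JSON array and key each coding record by its comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codes = index_by_id(raw_response)
record = codes["ytc_UgyDGNQf5vUDwHf21HV4AaABAg"]
print(record["emotion"])    # fear
print(record["reasoning"])  # consequentialist
```

The first record matches the "Coding Result" card above (reasoning `consequentialist`, emotion `fear`), which is how the table values can be recovered from the raw output.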