Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Humanity really has two endings with AGI.
One, we end up like Wall-E or Idiocracy. Dumb as a box of rocks while robots take care of us. I wouldn't be surprised if the AI decided to exterminate humanity through pleasure. Everyone lives in Matrix-like realities they control until they die. With nobody reproducing in the real world, humanity subtly and painlessly vanishes.
Option 2 is the Elon Musk route. We evolve with the AGI, installing neural and physical implants that make us smarter and stronger. By merging with the AI, humanity takes another step forward in our evolution. Of course this also will come with problems. People will now be able to hack your brain and body, and authoritarianism and other dystopian scenarios exist, but humanity lives on and expands to new worlds.
youtube
2024-12-01T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzAWRieI5a8VBWU-fF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaVG0hG7dtXJH75IF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyjMg61zuT9O_7PlBB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzZrVDy8FsydvR4cHF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwFzF75HONz2aQoqGd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwIsBDNWAAxORhdb3Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgypQjVgwG6izsH1hVJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyYs3EgWOtz_ibrevF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZ2mvFgWHMTDY1kqR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwtC5j-ao0kMgr2E7F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]