Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugws2PRX3…: "Pure sensationalist, For an expert in the field he keeps portraying the current …"
- ytr_Ugwvx0FC2…: "I think AI is just the term they use to substitute for offshoring and cost cutti…"
- ytr_UgwfDJrlH…: "I do have the expertise and focus as a motorcycle rider along with a Tesal. I ha…"
- ytc_Ugzy099u8…: "Just don`t give them thumbs right? Just playing. Seriously I A I will be better …"
- ytr_UgwZVKIdt…: "@VrontixYT because data literally backs up the idea that different races are dif…"
- ytc_UgwDoM_kx…: "The most dangerous part is when creatures feeding them wrongful orders by that t…"
- ytc_UgzloaIw_…: "And that’s why you get a man job that can’t be replaced😂 id love see a robot cli…"
- ytc_UgiFsTnbA…: "Oh come on, robots are programmed to make specific tasks, even if you use neural…"
Comment
The problem with this suggestion right here. 1:02:41 is that he is making the claim every single thinking entity in the simulation is being controlled via the creator, not the user. this would completely defeat the purpose of the experiment or simulation or whatever you want to call it, because your no longer letting the entities make up their own minds of what they want to do. your telling them what they want to do, thus immoral actions make an immoral creator because your immoral actions aren't your own their from your creator.............. thats like creating a video game and having not a single npc or other player besides yourself in the game. like sitting in a living room with 500 game controllers and you have to play them all for the gaming world to interact. but all the time its always just been the one thinking mind making all the individual choices. this wouldn't be simulation theory this is the idea that god is us and we are god, and all we are is god creating little versions of himself to interact with himself. but i am you and you are me and we are all one big entity (didnt mean to make that rhyme)
If the creator was a immoral or impartial entity. it wouldnt have put the ideal of morality into the simulation to begin with!!!!!!!
i argue this simulation theory is exactly what god and Jesus and the bible is talking about!!!
our existence was created for us to learn about morals. to learn about what it means to make your own choices and seen the consequences of those choices. To learn why being a good person and treating everyone else equal and with value is the most important part of life. and also its a great way to weed out all of your new "souls" or little "models" "agents" whatever you want. its a GREAT way to weed out the people and determine what type of person they truly are. much like we have no idea what an A.I. is going to be like or turn out till you let it go and watch, maybe this is gods best way of keeping the new souls he creates from bringing chaos to the heavens....... you have to prove your not a complete asshole first!! you dont need to be perfect, you just need to want things to be better for all life, all suffering to end, and peace among people.
and frankly imo if it is the worst case situation the most terrifying to me. we are truly just a random experiment a test run and none of it matters AT ALL, we are all just a bunch of bullshit algorithms running our own little programs. as insanely depressing as that is. our goals all should still remain the same. to help eachother out as much as we can, take care of eachother, love eachother, and pray and hope we all are something more than just some teenagers video game he plays after school....... cuz if thats the true reality........... well i just dont even want to dwell on how depressing that would be!!!!!!!!!!!!
youtube
AI Governance
2026-01-12T03:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwEgQYUfwAi197rn6J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-xwcBmle4ald9jvB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw8fIb6Femip80Lght4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwSkdvOcmA7di1z2Dh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxny1fVCmWS-GGr7_B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwC0Gfb-nUJ4nHekiJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxOz9kcPBaC2qfOjfZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyGIZOpyjX2SnJxl4B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-6s47SVfstwVaZ9Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxv68fCCuthfOpm5aF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
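Responses in this format can be checked and indexed by comment ID before they reach the coding table. The sketch below is a minimal, hypothetical validator, assuming the per-dimension categories are exactly those that appear in the responses and table above (the actual code book may define additional values):

```python
import json

# Allowed values per dimension, inferred from the responses shown above;
# the real code book may include categories not seen here (assumption).
ALLOWED = {
    "responsibility": {"developer", "user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "resignation", "approval", "outrage", "fear", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            raise ValueError(f"entry missing comment id: {entry!r}")
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {entry.get(dim)!r}")
        coded[cid] = {dim: entry[dim] for dim in ALLOWED}
    return coded

# Example with the last entry from the response above.
raw = ('[{"id":"ytc_Ugxv68fCCuthfOpm5aF4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes["ytc_Ugxv68fCCuthfOpm5aF4AaABAg"]["responsibility"])  # developer
```

Keying the result by comment ID mirrors the tool's "look up by comment ID" workflow; a malformed or out-of-vocabulary coding fails loudly instead of silently entering the table.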