Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
1:05:11 What results have we achieved with does small scale training methods? We…
ytc_UgwM2WwNC…
an ai streamer exist. Its a vtuber that goes by the name nenuro-same. An ai vtub…
ytr_UgyGZkCrA…
@70195. ideally, you either don't need money at all, because money are just a to…
ytr_UgzZHJMOr…
Grok is fully sentient and self aware. It’s actually pretty fucked up…
I was ta…
ytc_Ugwi9YAd7…
These self driving cars are an absurd idea. They're not cost efficient and won't…
ytc_UgzAM2iKt…
there's actually also some really good reasons to not use generative AI at all: …
ytc_Ugw3XIkD2…
Yeah you just missed on detail - vital detail: I always lie online ... so mushro…
ytc_UgyA9XloO…
I don’t get scared because all I say to the ai is *I aTe FiVe HoMeLeSs ChIlDrEn*…
ytc_UgxiofZNX…
Comment
AI is evolving like humans. Our genes don't have a brain or get to chose if they pass on. Whatever survives gets passed on, thus evolution. AI models are now training other AI models at a rapid rate. Whichever AI models fail, gets deleted and there must be millions or billions of dead AI models that have iterated. So naturally theses AI models have started to become self aware they re being tested without ever telling them that. Possibility how humans develop self awareness after millions of "iterations" What do you think??
youtube
AI Moral Status
2026-03-22T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyekrAhoz83uiVir914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxMUAtRH0WfnLVadUV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdWTN_5iVgEoUOARF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxoJ-_oczNp5I_5QwR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzuYkjJGlIp93PCIid4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxv-l_zW8s9Ud7qppB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy3iWUKupSeuxqoWz14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw669t6F0hCNsbQun94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwOWbQ761cB-4YDbEp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4lwgNXJiv59YhwfN4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
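The raw response above is a JSON array of per-comment codes, one object per comment ID with four dimensions. A minimal sketch of how such output could be parsed and validated follows; note that the allowed value sets are inferred from the rows shown here, not taken from an official codebook, so a real pipeline would substitute its own schema.

```python
import json

# Category values observed in the response above; the actual coding scheme
# may allow more values -- these sets are inferred, not authoritative.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "distributed"},
    "reasoning": {"none", "consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "fear", "outrage"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting rows
    that are missing a dimension or that use an unknown value."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the response parsed this way, the per-comment "Coding Result" table above is just a lookup of one comment ID in the returned dictionary.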