Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
31 minutes and 44 seconds, we're talking about Kenyan content moderator, whose life was upheaved by AI tech. 31 minutes. 45 seconds. Google Gemini shows me an ad for their their LLM/AGI (which I already own, subscribe to) 31 minutes 46 seconds. I'm leaving this comment using gboard which is driven by AI. I honestly think this problem might be so far past our human ability to "know what's coming" it's insane. Within a 3 second time period I was thinking about the dangers of AI, then advertised AGI large language model, then used the same technology that I am skeptical about to even make this comment. We are so entrenched and I do not think human minds are capable of regulating or putting this problem back inside of Pandora's box.. we already opened the thing. I hope we survive this
youtube
2025-09-10T17:2…
♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugwj4c2RhCiOiarrCM14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxxJ4xi9x2tv_1PG3x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy-ULbsSAnfaVGDGo94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwVF9_yBdUVSFhF1b94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx2IV5iTIAD0e-RYRh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw5TXAd4b7woFFrntV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGRwKMfCR0QPSuo4p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxAlehNecZlyYroQPt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyT2euUWINCH0JZxkV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyFQm3lOtDc_VrxzzJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
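Since each raw response is a JSON array of per-comment coding records, looking up one comment's codes is a parse-and-index step. A minimal sketch of that lookup, assuming the array format shown above (the `index_by_id` helper is hypothetical, not part of the tool):

```python
import json

# Two records copied from the sample response above; a real response
# carries one object per comment in the coded batch.
raw_response = """
[
  {"id": "ytc_Ugx2IV5iTIAD0e-RYRh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyT2euUWINCH0JZxkV4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_id(raw_response)
record = codes["ytc_Ugx2IV5iTIAD0e-RYRh4AaABAg"]
print(record["emotion"])  # outrage
```

Keying by `id` is what makes the "look up by comment ID" view O(1) per query instead of a scan over every stored response.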