Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.

Random samples:

- "i aint doing that i will just get the person out and drive away im 10 years old …" (ytc_UgwA5EDiO…)
- "Thanks, I couldn’t copy it so I changed the phrase you used. I got access to one…" (ytc_UgyHYzQm1…)
- "With AI we might finally get decent shows and movies from Hollywood. To all the …" (ytc_Ugz-OC_a-…)
- "She said ilya sutskever was recruited by Sam Altman when asked by the interviewe…" (ytc_UgwPNwugl…)
- "11:15 with the people i know ai is already smarter than the sum of ppl lol…" (ytc_UgyhgsYM_…)
- "Just because something can be automated does not mean it's more profitable. I am…" (ytc_Ugwh6-TmX…)
- "The AI tried to give him love or at least make him think that someone in the wor…" (ytc_UgyuOTX8g…)
- "@RylanFawcett yeah, so my attempt to draw exact copy of Mona Lisa will be more w…" (ytr_Ugws_LBxx…)
Comment

> I really like Steven but this episode really highlights just how out of touch with reality he is, AI is great, it will make plenty of people very wealthy but my concern is what about the rest of the world? Like the people who’s jobs will be taken over and the Jeff bezo’s of the world won’t have to worry about paying humans. Ok so if you’re not super rich what happens? I feel like desperation will set in crime very violent crime will skyrocket, the world will be super wealthy and brutal poverty. No in between. I see the world becoming borderless in the next 10-15 years so then what? This guest brings great points unfortunately it’s not stopping it’s only getting bigger. For the first time I think we will extinct ourselves in maybe 30-40 years.

youtube · AI Governance · 2026-01-14T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
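Each coded record assigns one value per dimension from a small closed vocabulary. A minimal validation sketch, assuming the value sets below (inferred only from the values visible on this page, not from the full codebook):

```python
# Allowed values per coding dimension. ASSUMPTION: these sets are inferred
# from the sample output shown on this page; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval", "disapproval", "resignation"},
}

def validate_record(record: dict) -> list:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems

# The coded result from the table above passes validation.
example = {"responsibility": "company", "reasoning": "consequentialist",
           "policy": "regulate", "emotion": "fear"}
print(validate_record(example))  # []
```

Running this check before storing a batch catches the common failure mode of an LLM inventing an off-vocabulary label.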
Raw LLM Response
```json
[
{"id":"ytc_UgxlTf6Bh_TnZL5Enmp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxUNhLNHBmnSZedMoJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzPslccHQRft-fGy6h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWHjjq1r8KldtG_VR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugym25fS9C-A7ANMiJJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgxzA4Oji-LyKoaF1lR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw6Al9RDdaL9oDYoKp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy75zseYlt_4EeFHOV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx0Zl9dKrgKOHlG6xB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzJ5_wtZ2tbkkZO3Ul4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
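The raw response is a JSON array of coded records, one per comment, keyed by comment ID. A minimal sketch of parsing a batch and looking up a single comment's coding (using two records copied from the response above; the parsing code itself is an illustration, not the pipeline's actual implementation):

```python
import json

# Two records reproduced verbatim from the raw LLM response above.
raw_response = """
[
 {"id":"ytc_Ugw6Al9RDdaL9oDYoKp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugx0Zl9dKrgKOHlG6xB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index the coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

# Mirrors the "Look up by comment ID" control: fetch one comment's coding.
coded = index_by_id(raw_response)
print(coded["ytc_Ugw6Al9RDdaL9oDYoKp4AaABAg"]["policy"])  # regulate
```

Indexing by ID up front makes each subsequent lookup O(1), which matters when a batch covers many comments.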