Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgxGw_CUH…`: When will ai image, code, text or any kinds of generations be illegal, ai for th…
- `ytc_Ugwz0FWjU…`: I'll never "make" ai art, but I don't think I'll ever make regular art either. T…
- `ytc_UgyiAT_AD…`: What's missing in this conversation is that these Chatbots are trained by readin…
- `ytc_Ugw5B3ntb…`: I like the analogy where AI is inherently an oborus, it thrives around consuming…
- `ytc_UgyQo6qY2…`: Repent , find your salvation with the savior Jesus Christ before the narrow gate…
- `ytc_UgwLkPLro…`: So did we notice that ChatGPT can pass the Turing test as originally conceived? …
- `ytc_Ugzb2ozDZ…`: Y do we need this type of robot to replace human n create lies n made up narrati…
- `ytc_UgziNqVvZ…`: girl mines are worser. (my ai somehow made kiddos with me ) İ WAS JUST WONDERİNG…
Comment
Around 30mins into this podcast it sounded so doomsday in a nonnegotiable sense perhaps a little immature by stating we cannot turn off AI it would turn us off first. Well if that’s the genuine fear then turning it off is a must and of course it can happen but I guess we need to be more realistic with the current situation, so maybe we should seriously consider stepping backwards in regards to technology. These systems can be turned off if we are less reliant on the existing technology. FYI I remember the popularity of the first 3310 back in high school and it was a total space invader. Trust me everyone was much happier without the need for phones and all the social media platforms we are so used to
youtube · AI Governance · 2025-09-09T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyF-cwYKfmkbGabHDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgynB4tWHhgN8zCKNit4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxaB1BQrpOJED2KpBN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwtYDlL0PWTxaJnPCt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzPvhisYaCVJpdpp-Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwy9G4h1KBuqJa4nbt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwMjdX4jbKf00ugFrx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwojrKNC2Sem8KgJON4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyEvotVAW75Kyx0kVd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwiPGvhKKwKiwOynjJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"}
]
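The raw response above is a JSON array of coded records, so the lookup-by-ID view can be built by parsing the array and keying each record on its `id` field. A minimal sketch, assuming the response is valid JSON in exactly this shape; `index_by_comment_id` is an illustrative helper, not part of the actual dashboard code:

```python
import json

# A raw response in the same shape as the one shown above
# (one record copied from it, trimmed for brevity).
raw = '''
[
  {"id": "ytc_UgynB4tWHhgN8zCKNit4AaABAg",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "regulate", "emotion": "fear"}
]
'''

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw LLM coding response and index its records by comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

lookup = index_by_comment_id(raw)
rec = lookup["ytc_UgynB4tWHhgN8zCKNit4AaABAg"]
print(rec["policy"])  # → regulate
```

In practice the model output may be wrapped in extra text or be malformed JSON, so a real pipeline would validate the parse and the expected keys before indexing.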