Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The number of videos boomers wanted to show me that were OBVIOUSLY ai but they t…" (ytc_UgxbRVbBI…)
- "Many AI artists have picked up a pencil. Using AI to assist you in making digita…" (ytr_UgwoS5XKx…)
- "ok, but the fact I had to watch three ads about generative AI features in apps t…" (ytc_UgyFOXd9I…)
- "She's wrong on certain parts. I think what's going to happen is we're going to l…" (ytc_UgxQWJyZ9…)
- "Art is expressing the human condition and the emotions that come along with it. …" (ytc_UgzzHyc2i…)
- "I love how you have this conversation with an "AI" guy and you always hear about…" (ytc_UgweIpA0H…)
- "I don't mind AI art at all, I'd just like to know if it's AI art or not. It's th…" (ytc_UgzFced-F…)
- "Yes we should stop it actually. The way it's made is through stolen artwork, par…" (ytr_UgzJ5zUYF…)
Comment
Oh come off it, he says the FDA keeps companies from cutting corners when its actually the reverse; they help companies cut corners! Musk has truly become a deep state mouthpiece.
The reason for Musk asking the tech sector to pause AI Development for 6 months is because its too difficult to regulate official narratives in AI since the programming code is essentially comprised of tons and tons of data; we lack the tools. And, Musk is working for deep state
And, I expect they'll need a lot more than 6 months. Try 6 years. ><
youtube · AI Governance · 2023-04-18T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzz6HA3Ik7ZCQ7kS954AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3EO0AaZChQzuHXrl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxoqQvSLt5WPJVHhEh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwMAUAKOGqPjkqchWV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzGuDh9MFsZLzPxuBp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxM6wmfMxsoeyqDXbR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxukUnfj2ViZTeu-jF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw45WSK4ZqpOIiCtmR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz0lHGmKAcZo5_a0G14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwe3noAkoTYUGhgGa54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
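The "look up by comment ID" step above can be sketched as a small parsing routine: the raw LLM response is a JSON array with one object per coded comment, so indexing it by the `id` field gives constant-time lookup. This is a minimal illustration, not the tool's actual implementation; the function name and the two sample rows (taken from the response above) are used only for demonstration.

```python
import json

# Raw LLM response: a JSON array of coded comments, in the same shape
# as the response shown above (two rows excerpted for the example).
raw_response = """
[
  {"id": "ytc_Ugy3EO0AaZChQzuHXrl4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxM6wmfMxsoeyqDXbR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

coded = index_by_comment_id(raw_response)
record = coded["ytc_Ugy3EO0AaZChQzuHXrl4AaABAg"]
print(record["responsibility"], record["emotion"])  # government outrage
```

A dict keyed by comment ID mirrors the UI's lookup box: given an ID like `ytc_Ugy3EO0…`, the exact coded dimensions for that comment are retrieved without rescanning the whole response.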