Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> There will have to be a huge breakthrough as it's basically plateaud. The code is so junked up and obsfucated, every AI is trained on other AI with the same garbage code. The code is so bad, most working on it do not even know what certain modules work so they're just left alone. Startups are always a cash grab scam. Their product works great in a vacuum when demo'd, but real world variables throw it all out of place. Not to mention the majority of AI's are trained on existing AI models. That report from MIT was altered slightly to prevent the bubble from bursting this early.

Source: youtube · Topic: AI Responsibility · Posted: 2025-10-01T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyUhEA3HoH6z7Gn4A54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxzfuWCN1N5F_n_bKV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzlTtlIJFZwsxG4c5F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyUMJmxNnqMnfWwP8d4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz0z2FWQG57fIcjTQl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxczCXcEHL2aI3rwll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzjun0B7POdhTJ4eFF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzSRTfMI6HVCeEZktJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxIDPZ5oC4_VrsWmOl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxC1acnJ5bjQMjrmi54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
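The raw response above is a JSON array with one object per coded comment, each carrying the comment ID and the four coding dimensions shown in the table. A minimal sketch of parsing such a batch and looking up a single comment's coding might look like the following; `index_codings` is a hypothetical helper name, and the assumption (based only on the example above) is that the model always returns a flat JSON array with these exact field names:

```python
import json

# A small batch response in the same shape as the example above
# (two entries copied from it, reformatted for readability).
raw_response = """
[
  {"id": "ytc_UgyUMJmxNnqMnfWwP8d4AaABAg",
   "responsibility": "company", "reasoning": "virtue",
   "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzSRTfMI6HVCeEZktJ4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]
"""

# The four coding dimensions observed in the response schema.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a batch response and index codings by comment ID,
    skipping any entry that is missing a coding dimension."""
    out = {}
    for entry in json.loads(raw):
        if "id" in entry and all(dim in entry for dim in DIMENSIONS):
            out[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return out

codings = index_codings(raw_response)
print(codings["ytc_UgyUMJmxNnqMnfWwP8d4AaABAg"]["responsibility"])  # company
```

Indexing by ID first makes the "look up by comment ID" step a plain dictionary access, and silently skipping malformed entries is one possible policy when the model occasionally drops a field; a stricter pipeline might raise instead.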