Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
What about Isaac Asimov's 3 Laws of robotics. Asimov was a science fiction autor…
ytc_Ugzn0917B…
Ai was never meant to be an assist, you just feel for marketing my dude.…
ytr_UgxFy_65s…
Ai art is art. . . As a general attendee i couldn’t care less how the art was cr…
ytc_Ugw0tZCte…
And this is exactly why AI generated content needs to have a hard watermark on t…
ytc_UgwmdQ-Jg…
This isn't an accurate depiction of what happened. There were researchers that t…
ytc_UgyDnXpud…
I am shocked how many people find it hard to understand the difference between a…
ytr_UgxUcAO2m…
I think the other major flaw with AI is that it's just visual noise. Art is usu…
ytc_UgyghsVsr…
https://youtu.be/UclrVWafRAI?si=Q4DLrRzHQcIfZdti&t=1324
Wow this 'expert' is d…
ytc_Ugy_qmlYM…
Comment
I don't agree with the guy. We don't have the compute power to have both powerful and available machine learning models. We don't even really have AI, just fancy distributed neural networks. We're at least 15 to 30 years away from machine learning models which can actually replace humans globally. Most "powerful" machine learning models required huge datacenters to work, which means they cannot operate outside of network connection. The power these models consume is so immense, they have to rely on thousands of subscriptions to make it worth it. It's never as simple as it looks.
youtube · AI Governance · 2025-09-07T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwKDdRpub9uAqPg2oV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwP6BrUluoFLDFaJ-V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzdfRwTdci3AMEezrR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyQDMDWeVpuoJSvwV54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyK1r15R4CVKA6gRuB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzDJON914wkWKHq94l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzaYIhspn0v2dihJz54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzbJzRgZMXL_32KM7V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzPkeDL2bxVrol1SdF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzrqhtcvZ6gYDStS_N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
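The raw response above is a plain JSON array, one object per comment, with four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal parsing sketch is below; `parse_codings` and `OBSERVED` are hypothetical names, and the value sets are only those visible in this one sample, so the real codebook likely contains additional categories.

```python
import json

# Values observed in the sample response above; the full codebook
# presumably allows more categories than this (assumption).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "distributed", "government", "company", "unclear"},
    "reasoning": {"unclear", "consequentialist", "virtue"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting unseen values."""
    codings = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in OBSERVED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        codings[cid] = {dim: row[dim] for dim in OBSERVED}
    return codings

raw = ('[{"id":"ytc_UgwKDdRpub9uAqPg2oV4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
print(parse_codings(raw)["ytc_UgwKDdRpub9uAqPg2oV4AaABAg"]["emotion"])  # indifference
```

Keying the result by comment ID mirrors the "Look up by comment ID" view above: a coding is retrieved with a single dictionary lookup, and any value outside the expected sets fails loudly rather than being silently tabulated.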