Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up with its comment ID. A random sample of coded comments:
- `ytc_UgxGub7I8…` — "Totally fake, but looks cool. The concept is scary though. Let's not go any furt…"
- `rdc_oi3frbj` — "These dumb kids are just using technology incorrectly, they don't realize how in…"
- `ytc_UgyRWi0ZA…` — "Funny how he worried but continues to contribute to his worries by creating AI t…"
- `ytr_Ugx5i7PTW…` — "@SarahNGeti there is 0% chance of this. From someone who works with AI on a dail…"
- `ytc_UgyssteKp…` — "I think an ant has more consciousness than chatGPT! I don't think it's about com…"
- `ytr_Ugzvpkq4X…` — "@KucheKlizma What you wrote makes no sense. First of all, LLMs do use gradient d…"
- `ytc_UgxYHLoUy…` — "guess what? they’ve been listening to everything in your environment for as long…"
- `ytc_UgyVEwaLR…` — "Ok AI takes over and shuts everything down but then what... if there is no power…"
Comment (youtube, 2023-03-21T22:1…, ♥ 3):

> Great video...man you're always 20 steps ahead of what I already know on stuff and this crap is scary. As an artist I do not like anything about A.I. art. But how can we stop it? Too many people are pro-A.I. It reminds me of when Metallica tried to stand against Napster...they were really fighting the future. The future does things because it CAN and to make money and doesn't factor in "SHOULD we?".
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzdp3C3Xr0YvPzuVyt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxm1PgfFt0rEarJ8zp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyu5C8CbdvY6DS61eB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxJgYnVoO1pueALIuZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUYfhS5EKKhsb7RAh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzf6XwOsLQLOQVcr8p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwm6C5mhnLcCYPhhRh4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyQC97sX2ipD1uE-ih4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
{"id":"ytc_UgxXG5cNrW2MKMWT2Wd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzE5kCUcLe9S9J-1sd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"regulate","emotion":"mixed"}
]
```
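Because the raw response is a JSON array of per-comment codes, it can be parsed, validated, and indexed by comment ID directly. A minimal sketch in Python; the allowed vocabularies below are inferred from the values visible on this page, not an authoritative codebook:

```python
import json

# Allowed values per dimension, inferred from the samples above
# (assumption — the real codebook may permit more values).
VOCAB = {
    "responsibility": {"distributed", "none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"unclear", "none", "regulate", "liability", "ban", "industry_self"},
    "emotion": {"fear", "indifference", "approval", "outrage", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    raising ValueError on any out-of-vocabulary dimension value."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in VOCAB.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in VOCAB}
    return coded

# Usage with a single-record response (hypothetical ID for illustration):
raw = ('[{"id":"ytc_X","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codes = index_codes(raw)
```

Validating at parse time catches the occasional malformed batch before the codes reach the analysis tables above.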