Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
If you want to see AI disappear create AI CEO's. In case you couldn't hear it th…
ytc_UgzGMcoTN…
Ai art deserves this, unless you program the ai yourself then you shouldn't be a…
ytc_Ugy-ZupRB…
That's not at all how that works.
Robots have no way of doing any of that. The …
ytr_UgzByUwBi…
It seems that like self-driving cars, there would be stages of learning before b…
ytc_UgyOnSyKd…
AI threatens to take jobs, but they are firing people before AI actually does ta…
ytc_UgxEKw9Se…
what really defines a human being is free will and a robot will never have it…
ytc_UgwB829cG…
What?? I thought the big mean racist cops only targeted people for their skin co…
ytc_UgyPCRDxt…
Stop calling these machine learning tools "AI."
They're not AI. They have no in…
ytc_UgzoFsbcy…
Comment
Listen.. why you people still doing this? Lemme just.. ok so it’s simple: use AI on it self. You tell it to make its own perfect prompt, using its full potential and then yes you tell it what it’s specific for even tho this is not necessary most times. Then you tell it to reverse engineer it, what ever it might be to make sure it I bullet proof. Any questions hit me up and I will do you an example
youtube
Viral AI Reaction
2025-04-21T20:3…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugyoj8KkDhdmCCbsHmd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwsRaG1gSTyv12ySSp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwabJYT5mjnVzpuL-F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-5plsyHN7RPjakFZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFhBA3NPSpNZXMxjx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJA-fnzqz1wl67CxN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGnBa5ttP_2FQV4Ed4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyhCz8n5ba8zVF9XTl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwOHGHQAW-fjNcsKDV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugymj2iDzxi4dejMebB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
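The raw response above is a JSON array in which each record codes one comment along the four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and sanity-checking such a batch is below; the allowed-value sets are inferred only from the values visible in this sample, so the real codebook may be larger, and `parse_coded_batch` is a hypothetical helper name, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the values that
# appear in this page's "Coding Result" table and raw response.
# Assumption: the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments),
    keeping only records whose dimensions match the known codebook."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every dimension must be present and within its allowed set.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with one well-formed record (hypothetical ID):
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"approval"}]')
print(len(parse_coded_batch(raw)))  # 1
```

Dropping off-codebook records (rather than raising) keeps one malformed line from discarding a whole batch; a stricter pipeline could log the rejects instead.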