Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "You don't need 50 pages to explain how this goes bad. At the point at which AI n…" — ytc_UgyBtDptm…
- ""AI is inevitable..." just so true as "AI using against AI is inevitable...". S…" — ytc_UgwEJBKQ0…
- "AI is only a projection of humans. If humans use it with bad intention, they wil…" — ytc_Ugx_NWmtE…
- "Why are the AI artists wasting their time eating? Just use an IV drip smh🤦…" — ytc_Ugxl-49Nz…
- "It can barely make a decent cupcake recipe, the issue is all the lies around it …" — ytc_UgwM5Oc7G…
- "This is Astro, a $1600 robot on wheels that spying on you throughout your house.…" — ytc_Ugxo8xvUW…
- "If AI mangers to replace any branch of doctors 70-80 percent of job doing peopl…" — ytc_UgwacBdez…
- "Still you need time and if it's not for your job can be seen as time lost, while…" — ytc_UgzlicPWm…
Comment (youtube, 2025-10-15T04:1…, ♥ 86)

> Ankur, prompt engineering isn’t really a long-term career path. It’s a temporary phase while AI systems are still learning to understand humans better. Eventually, AI will outperform humans even in writing its own prompts — after all, generating and refining language is exactly what it does best.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyvVktenvTPVhig2fN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxGInIhIh9rcykOsXl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwliONC1ISazS-b8A54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgztkQa_sULfGqilM2F4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgztysOzZk1fj_pnPEh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzCTc_X9Lm3xBXjGCN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw6siuulwgFp7YDNLh4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgytG2uJhjTkxB8LRRR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxW23LYd_y_rlmi1pZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxrciEbkG0r5mQAdf14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
```
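The raw model response is a JSON array with one object per comment ID, carrying the four coding dimensions shown in the table above. A minimal sketch of the lookup step, assuming this shape (the function and variable names here are illustrative, not the tool's actual code):

```python
import json

# A shortened copy of the batch response above (first two entries only),
# used as sample input for the sketch.
raw_response = """
[
  {"id": "ytc_UgyvVktenvTPVhig2fN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxGInIhIh9rcykOsXl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# The four dimensions used in the coding-result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_codings(payload):
    """Parse the model's JSON array and index codings by comment ID,
    keeping only the expected dimensions (missing ones fall back to
    'unclear', mirroring the value used elsewhere in the responses)."""
    rows = json.loads(payload)
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in rows
    }


codings = index_codings(raw_response)
print(codings["ytc_UgyvVktenvTPVhig2fN4AaABAg"]["emotion"])  # indifference
print(codings["ytc_UgxGInIhIh9rcykOsXl4AaABAg"]["emotion"])  # fear
```

Indexing by ID first makes the "look up by comment ID" view a constant-time dictionary access rather than a scan over the array.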