Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- ytr_UgzF4oQ9E…: "@zettovii1367 It's easy to denounce because it is used mainly for what's popular…"
- ytr_UgykN8Y3G…: "Ai steals real artist work and gives it to you on a silver platter. Theres a rea…"
- ytr_UgwKxEZas…: "I'm training one right now. It calculates probabilities nothing more. I ask a qu…"
- ytc_Ugx5iox_Q…: "95% of ai project fail. Ai isn’t at a point that companies could trust replacing…"
- ytc_UgzuDGenF…: "I watched the whole video and therefore you can monetize it. Now; You have every…"
- ytc_Ugwd2tN5F…: "I say we recall a whole bunch of these bad ai systems and redo the code and wher…"
- ytc_Ugw7kooKl…: "Yes automation is the future. do you idiots still cry about the milk man losing …"
- ytc_Ugy-gl-vO…: "Im gonna spoil it: you clicked on this video expecting to hear about madmen. I…"
Comment
Ankur, please research this more deeply. AI isn’t just another innovation like the Industrial or Computer Revolution — it’s fundamentally different. This could be the last major creation humans ever make, as we’re essentially building a creator. The few jobs that emerge to manage AI will only exist temporarily, until AI becomes capable of managing itself — remember, it’s intelligent. Within the next decade, there will be no jobs left that require human thinking — none at all.
Source: youtube · Posted: 2025-10-15T04:1… · ♥ 55
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyvVktenvTPVhig2fN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxGInIhIh9rcykOsXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwliONC1ISazS-b8A54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgztkQa_sULfGqilM2F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgztysOzZk1fj_pnPEh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzCTc_X9Lm3xBXjGCN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw6siuulwgFp7YDNLh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgytG2uJhjTkxB8LRRR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxW23LYd_y_rlmi1pZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxrciEbkG0r5mQAdf14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```