Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I am scared by the time when I go to apply for college to get a degree in digita…" (ytc_Ugx2IId2Q…)
- "they are making the AI smarter by giving it for free the more people use the mor…" (ytc_Ugzz_vmpM…)
- "Ai will cannibalize itself working in a vacuum. The only reason it’s learned any…" (ytc_UgxrejGWH…)
- "If you just enjoy making art, why do you care? Your art is for the feeling, the …" (ytr_Ugyt_XLEw…)
- "It most likely would not care one way or the other, just like we have no ill wil…" (rdc_jfa9sgi)
- "This was literally the exact thing almost that I wanted to start about 2 years a…" (ytc_Ugw9z5Wka…)
- "It's corruption. Also when yall cheap. You see the self driving waymo cars they'…" (ytc_UgyTQuGEW…)
- "gpt-4 10 billion dollars? iirc the training cost was more than 100 million but n…" (ytc_Ugw1P-oDU…)
Comment
And do you know who "AI" will take out first? Yes, the rich, indeed, the richest. "You reap what you sow" springs immediately to mind.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2026-01-08T20:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwWW7uu4faWK9YiBix4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgylhNzUTbe6R23Felx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwhQJMxegBs4FaMsGd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwZ5flsnCggMu-ZEEd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy5UWVLQEfdaQyUGfp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwk8QPLX6US6-kI4Fl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgweL7zioowZ3BH9kRJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy7zkiLKGtmn93mwQd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxETHT0nuGAvImuQoF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwGNzHryhVgzCMhfjR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
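A batched response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not the tool's actual implementation: the allowed values per dimension are inferred only from the samples shown here (the real codebook may define more), and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# ASSUMPTION: the actual codebook may include additional values.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate_batch(raw):
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # A record must carry an id plus a recognized value
        # for every coding dimension.
        if "id" in rec and all(
            rec.get(dim) in values for dim, values in ALLOWED.items()
        ):
            valid.append(rec)
    return valid

# Hypothetical one-record batch in the same shape as the output above.
raw = (
    '[{"id":"ytc_example","responsibility":"company",'
    '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]'
)
print(len(validate_batch(raw)))  # prints 1: the record passes all checks
```

Records that fail validation (an unrecognized label, or a missing dimension) would be dropped here; a production pipeline might instead queue them for re-coding.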