Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Many good points were made, but some parts of AI were left out. He worked at Goo…" (ytc_Ugz7rlydf…)
- "Remember folks. AI didn't take your job. Your job was given to AI. By your bosse…" (ytc_UgyVqGmkB…)
- "Your info on drones is incorrect. Most of them are automated - humans are not co…" (ytc_UgwC2Pd4v…)
- "No. We dont do anything similar. Inspiration is nothing like the mindless patter…" (ytr_UgyVZ6su6…)
- "In a shoping center, it takes more effort to sort the corrals and return to the …" (ytr_Ugz_USuH1…)
- "AI Mistakes are just a statistical issue at the end. Hence you get another "vali…" (ytc_UgyTz-YrG…)
- "These predictions are always wild, but at least I can use AICarma to keep track …" (ytc_UgwNOmVWE…)
- "This is what we should be fighting , not wearing masks! I don't like robots or a…" (ytc_UgzgZrTN-…)
Comment

> 1% is a lot I agree if someone gave me a gun with 100 places for bullet and there was just 1 bullet and I had to play Russian roulette with it I would be shitting my self. However it also may be if we don't make the AI it's 10% chance we go extinct because AI didn't save us.

youtube · Viral AI Reaction · 2025-11-04T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
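A coded record like the one above can be sanity-checked against the label sets that appear in this viewer. A minimal sketch follows; note that the allowed-value lists are inferred only from the samples shown on this page, not from the full codebook, so they are an assumption.

```python
# Sketch: validate one coded record against the label sets observed in this
# viewer. NOTE: these value lists are inferred from the samples shown here;
# the real codebook may allow additional labels.
OBSERVED_LABELS = {
    "responsibility": {"ai_itself", "none", "distributed", "company", "developer"},
    "reasoning": {"consequentialist", "unclear", "virtue", "deontological"},
    "policy": {"regulate", "none", "unclear", "liability"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "outrage"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks well-formed."""
    problems = []
    for dim, allowed in OBSERVED_LABELS.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unseen label for {dim}: {value!r}")
    return problems

# The record coded above validates cleanly:
coded = {"responsibility": "distributed", "reasoning": "consequentialist",
         "policy": "unclear", "emotion": "fear"}
print(validate(coded))  # → []
```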
Raw LLM Response
```json
[
{"id":"ytc_UgxRp1FA7gXM1jWLpIR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxaNbWcRnQRl1KuM_J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAB0HpMtmna2dYhXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx6RclOAdj3_Udv6Y94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzj-N1sLJb_pPmnu714AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxirD4hIabays5V_lF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxjotiOGCB80vvLJ7V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxnBcQ7Y_LyLGiYuyt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzCE_8Pi3e4FPK9D114AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwwQCC_1Tcs7idZ3UV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
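The raw response above is a JSON array of coded comments, one object per comment ID, which is what makes the "look up by comment ID" view possible. A minimal sketch of that lookup, assuming only the array format shown above (the helper itself is illustrative, not the viewer's actual implementation):

```python
import json

# Sketch: parse a raw LLM response (a JSON array of coded comments, as shown
# above) and build an id -> record index for lookup by comment ID.
raw = '''[
  {"id": "ytc_UgxRp1FA7gXM1jWLpIR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxaNbWcRnQRl1KuM_J4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''

records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}  # comment ID -> coded record

rec = by_id["ytc_UgxRp1FA7gXM1jWLpIR4AaABAg"]
print(rec["emotion"])  # → fear
```

Indexing by `id` up front turns each lookup into an O(1) dictionary access instead of a scan over the batch.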