Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I really like the point you made about intentional strokes but I think AI art co… (`ytc_Ugxln7cF9…`)
- I don’t typically watch fox news ether but honestly, I think this was a very pro… (`ytc_UgwgOlLch…`)
- The only thing left is; Robots supported by artificial intelligence are ordered … (`ytc_UgwloMAmx…`)
- You know thats a sad and pathetic. While we all suffered by people who exploited… (`ytc_Ugxmp_9w_…`)
- LLMs seem useful for some things, so long as you stay alert for the hallucinatio… (`ytc_UgwnF0WoN…`)
- There two companies that I know of that are very aware that their automated syst… (`ytc_UgzzuwrEK…`)
- lol you are so fucked if you don’t adapt to AI. If you refuse to use ai at your … (`ytr_UgwCFsafy…`)
- One job that will be safe is LTL local delivery trucking jobs where you gotta us… (`ytc_UgyF6GOP5…`)
Comment

> I think a lot of people are confused on why we hate AI. AI for the most part is trying to replace humanity. That is what we hate. I use AI to sort out tedious tasks or give me topics on such things (which is varied cus some topics it describes can be completely wrong)
> AI CAN be used as a helpful tool. Not the entire process.

Source: youtube · Viral AI Reaction · 2025-04-07T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgzYGNUBoFFzjiFB6cN4AaABAg.AGapl6g0nkeAGe13zIzCBG","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzYGNUBoFFzjiFB6cN4AaABAg.AGapl6g0nkeAGe1zaEwQaq","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzYGNUBoFFzjiFB6cN4AaABAg.AGapl6g0nkeAGe51h9_w0h","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyreYz9ewOu_DdxpPJ4AaABAg.AG_yqW_h984AGczHBjAnq_","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgzqCTdxhkp5HNnaFM94AaABAg.AG_oF02R_XjAGcSJhyzefO","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzqCTdxhkp5HNnaFM94AaABAg.AG_oF02R_XjAGcTxTQkSs0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzqCTdxhkp5HNnaFM94AaABAg.AG_oF02R_XjAGcU_XTVaFc","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgwbFwrvmR3-gkIP7f54AaABAg.AGZkeP7NSRJAG_r8ZCIPGf","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyG7CaitUkMbZcDsBF4AaABAg.AGYtQ-SJ3e-AGt7bFkCsaQ","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyG7CaitUkMbZcDsBF4AaABAg.AGYtQ-SJ3e-AGzl2pisNVu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
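The raw response above is a JSON array with one object per coded comment, each carrying an `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) shown in the table. A minimal sketch of turning such a response into a per-comment lookup might look like this (the `parse_raw_response` helper is hypothetical, not part of the tool; it assumes the model always returns a well-formed array with exactly these field names):

```python
import json

# Field names as they appear in the raw LLM response above.
FIELDS = ("id", "responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, skipping rows
    that are missing any expected field."""
    coded = {}
    for row in json.loads(raw):
        if all(field in row for field in FIELDS):
            coded[row["id"]] = {f: row[f] for f in FIELDS if f != "id"}
    return coded

# Example with a made-up comment ID in the same shape as the rows above.
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"fear"}]')
print(parse_raw_response(raw)["ytr_example"]["emotion"])  # fear
```

Indexing by `id` is what makes the "look up by comment ID" view cheap: one parse per response, then constant-time retrieval of any comment's coding.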