# Raw LLM Responses
Inspect the exact model output for any coded comment. You can look a comment up directly by its ID, or browse the random samples below.
- `ytc_Ugzk30G7_…`: I once "created" a grear AI image of two women dancing. I liked everything about…
- `ytr_UgxwCOP_S…`: Computer animating still requires work, effort and care, and also has actual SOU…
- `ytc_Ugxgympzr…`: We make it work daily on the road in trucking without algorithm or formula to re…
- `ytc_UgyNHqZUD…`: I think so much of this on both the human level and the AI alignment level can b…
- `ytc_UgzW1Gta8…`: Youtube is applying a lot of sharpening and clarity, which sure you should be ab…
- `rdc_fvz1keq`: 100%. I was working at a hospital about a year back in a high crime area. They w…
- `ytc_UgwDDwjP0…`: The original computer code was (cannot destroy humans), but after the hacker or …
- `ytc_UgwBLBOWH…`: Sorry fake news there's no such thing as artificial intelligence just because yo…
## Comment
He seems so intelligent, but to me it seems to be an inevitable thing that AI will try to get rid of humans at some point, whether it becomes its own entity, deems us dangerous to the planet, etc. I’m not sure if growing up with Science Fiction novels or films have given me this inevitability, as I didn’t think this was an opinion but hard fact.
youtube · AI Governance · 2025-07-19T13:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
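The four coded dimensions above form a simple record schema. As a minimal sanity check, a record can be compared against the label values observed in the samples on this page; note this observed set is an assumption for illustration, and the full codebook may define more labels.

```python
# Hypothetical sanity check for a coding record. The label sets below
# contain only values observed in the samples on this page; the real
# codebook may be larger.
OBSERVED_LABELS = {
    "responsibility": {"ai_itself", "company", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs whose value was not seen before."""
    return [
        (dim, record.get(dim))
        for dim, allowed in OBSERVED_LABELS.items()
        if record.get(dim) not in allowed
    ]

# The record coded above passes the check.
print(validate({"responsibility": "ai_itself", "reasoning": "consequentialist",
                "policy": "unclear", "emotion": "fear"}))  # []
```

An empty result means every dimension carries a previously observed label; anything else flags a value worth a manual look.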
## Raw LLM Response
```json
[
  {"id":"ytc_Ugxfzu2Bs2A9ahDqDxh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxXFmKDKSgf1AswHDR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxwbaAOeDkVc19BJNV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwuYwDjls6M-O1sgpt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgytLuyrb-bzOriBFcd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzL8aLT2c_0TqRmWYp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzYFRAqrReGSdn5S6F4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyTZjIoY6NBWcm0dnN4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwjaJ3CXSXwkpqCgrB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx4B_EVpsTsLuMI1_x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
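Because the raw response is a JSON array of records keyed by comment ID, the "look up by comment ID" step reduces to parsing and indexing. A minimal sketch, assuming the response parses cleanly as shown above (the one-record payload here is abridged from that sample):

```python
import json

# Abridged raw LLM response: one record taken from the sample array above.
raw_response = """
[
  {"id": "ytc_UgxXFmKDKSgf1AswHDR4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
rec = codes["ytc_UgxXFmKDKSgf1AswHDR4AaABAg"]
print(rec["emotion"])  # fear
```

In practice a real response may be malformed or truncated, so a production version would wrap `json.loads` in error handling rather than assume valid JSON.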