Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Isn’t it ironic that a professor is concerned about his books being copied by Anthropic and is suing over it, yet collaborates with a former Google CEO—someone involved in building a search engine that collected vast amounts of information without explicit permission, creating a powerful competing database? Today, Google has grown so large that it can influence politics at the highest levels. This highlights a deeper issue: accountability is essential when data is taken and repurposed as if it were one’s own.
youtube · AI Governance · 2026-03-25T13:1… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwIqNU_TMR537ePFTZ4AaABAg","responsibility":"society","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwvSMYiYT_IuovoE314AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwY-jQW29BYypAf75F4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugz5TFxae2j2uFRv29R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgynE9iH1O3nO18qyCt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy09ItQRfK9BBvo-8p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxt_5jb-i6PwtrWlz14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyX_x6pmFaQoqqHYGt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzK0YstJ0vgqcNzZEN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyOoLGrCvVUx8LfmrJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
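A response like the one above can be turned into per-comment codings with a small parser. The sketch below is a minimal, hypothetical example: the allowed category values are inferred only from the sample records shown here (the actual codebook may define more), and any record with a missing id or an out-of-codebook value is dropped rather than coerced.

```python
import json

# Category values inferred from the sample response above;
# the real codebook may include additional values (assumption).
ALLOWED = {
    "responsibility": {"society", "company", "developer", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response (a JSON array of records) into
    {comment_id: dimensions}, keeping only fully valid records."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            continue  # skip records without a comment ID
        dims = {dim: record.get(dim) for dim in ALLOWED}
        # Keep the record only if every dimension has an allowed value
        if all(dims[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = dims
    return coded
```

Validating against a closed value set like this catches the common failure mode where the model invents a category label that the downstream analysis does not expect.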