Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Okay another thing I just wanted to add, as a writer who dearly wishes she could… (ytc_UgzgKHKBf…)
- And its strange how all these tech people are the ones doing the fear mongering… (ytr_UgxmTc-3y…)
- I'm a software developer who occasionally dabbles in art, so take my opinion wit… (ytc_UgyFJTC-B…)
- As an Aeronautical Engineering I am working in an Operative System with a super … (ytc_UgzSjiQm1…)
- I'm disabled and an outsider artist, in no way does AI bring anything to the tab… (ytc_UgyVsQ3Dn…)
- Remotely controlled Tesla robot is called autonomous humamnoid robot. The drivin… (ytc_UgwcmeUn6…)
- Girl just explained how I'd react if someone saw my A.I chats, some aren't bad b… (ytc_UgyKPdPa8…)
- 55:05 huh, that’s interesting, that’s the first thing I asked it. The YouTube … (ytc_UgyKp43VL…)
Comment
Thank you for speaking about this. My concern is that there's no regulation or apparent controls in place as this technology is being developed. Additionally, the guardrails that would've been in place have been eliminated, either by DOGE and its financial cuts or by the gutting of oversight government agencies. Privatization of this type of technology is risky, and the fact that what we are currently experiencing can be sold to the highest bidder makes it even riskier. This type of technology, as in years past, is getting fielded as "approved" without thorough testing, so it's like building the plane while you're flying it. AI has told children to kill themselves; how is that protecting them? It's putting $ in corporate pockets, but at what cost to life?
youtube
AI Jobs
2025-12-05T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgypTS4svymXxB10vtR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxmSVEuV5ZmqrqELDN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzdQwDm7FwaSe9xQw54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyNV1inBsBYY-wDC854AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxL77utegpyRXR_c-B4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzYoABgmEKtACyWfr14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyqI6B_u6CutX90xXx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw8HLgf6TjR5gCoFeB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"approval"},
{"id":"ytc_UgyvJoT5TGEQISVK6tJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzkZPaQue3Km0IiMwZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"}]
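A raw response in this shape (a JSON array of per-comment codes) can be turned into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical example of doing that with the standard library; the two sample rows are abbreviated from the response above, and the helper name `index_codes` is an assumption, not part of the pipeline.

```python
import json

# Abbreviated sample of a raw coding response: a JSON array where each
# element carries a comment ID plus the four coded dimensions shown in
# the "Coding Result" table (responsibility, reasoning, policy, emotion).
RAW = '''[
 {"id": "ytc_UgypTS4svymXxB10vtR4AaABAg",
  "responsibility": "government", "reasoning": "consequentialist",
  "policy": "regulate", "emotion": "mixed"},
 {"id": "ytc_UgzkZPaQue3Km0IiMwZ4AaABAg",
  "responsibility": "government", "reasoning": "deontological",
  "policy": "regulate", "emotion": "mixed"}
]'''

def index_codes(raw: str) -> dict:
    """Map comment ID -> coded dimensions, dropping the redundant id key."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codes = index_codes(RAW)
print(codes["ytc_UgypTS4svymXxB10vtR4AaABAg"]["policy"])  # regulate
```

Indexing by ID like this is what makes the "look up by comment ID" view possible: each displayed comment's coded dimensions are a single dictionary access away.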