## Raw LLM Responses

Inspect the exact model output behind any coded comment, either by looking it up by comment ID or by browsing the random samples below (click a sample to inspect it):
- “So what happens when you lose your phone? Can't travel? Or just facial recogniti…” (ytc_Ugzm72qX4…)
- “Do we REALLY have to use the "cloud" for everything? I would happily return to …” (ytc_UgymKjXNY…)
- “It sounded so much like AI, pretty stilted with very little warmth. I was not as…” (ytc_UgyORG6aC…)
- “LLM's can't even count to 100. I wonder how much hype this lunatic is passing. A…” (ytc_UgyjrY-eK…)
- “So AI is here to do all the fun things and leave us to do the boring things.…” (ytc_Ugx3Bs1C_…)
- “My take on this is simple. Dont make robots smart enough to be conscious. Make…” (ytc_UgxymhbV-…)
- “Ya, I thought so too. Turns out it was from 2 yrs ago. Gemini's come a long wa…” (ytr_UgwO5z315…)
- “I'm in big tech right now. The vast majority of employees actually being replace…” (rdc_nma086z)
### Comment

youtube · AI Governance · 2026-02-13T19:2…

> As Engineer, i been dealing with this question for a while, i specialize in hardware design, but also deal in cybersecurity and there has been talks in many events i been to about the problems with AI having access to hardware internal processes without secure control shut offs. We could develop hardware key access and security shut off directives, but they are still controlled by some form of programming, we might have to make independent programming developments to control AI access, but the development is highly complex.
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
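Each coded record can be checked against the coding scheme before it is stored. Below is a minimal validation sketch in Python, assuming the value sets observed on this page (e.g. `responsibility` ∈ developer/company/government/…) form the full codebook; the real scheme may allow more values, and the helper name `validate_record` is illustrative.

```python
# Allowed values per dimension, inferred from the examples in this section.
# Assumption: the real codebook may contain values not seen here.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing 'id'")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems
```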
### Raw LLM Response
```json
[
{"id":"ytc_Ugz_aHN8kBtfstKzhL54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxhDrLhihfJklufwqJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxd9wUk0U5EI2mZ3Ud4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzc0PoCCvDmeuJG4uV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxVbHjeWASL5m4o7xV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwslQqsHxWQW_5qHvB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwdhCZevxjWCzmc-DZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwmLPWBpcbYahcCfap4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzBV_JHrjry5uLOftB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwTgAk4kLjFxczti1t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
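The raw response is a JSON array covering a whole batch, so the "look up by comment ID" feature reduces to indexing the parsed records by `id`. A minimal sketch, assuming the raw response is available as the JSON string shown above; the function name `index_by_id` and the variable `raw` are illustrative, not the tool's actual API.

```python
import json

def index_by_id(raw_response: str) -> dict[str, dict]:
    """Parse one raw batch response and index its coded records by comment ID."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# Hypothetical usage, pulling one comment's coding out of the batch above:
# codings = index_by_id(raw)
# codings["ytc_UgzBV_JHrjry5uLOftB4AaABAg"]["emotion"]  # -> "indifference"
```

Note that the record shown in the Coding Result table above (responsibility: developer, reasoning: consequentialist, policy: regulate, emotion: indifference) corresponds to the `ytc_UgzBV_JHrjry5uLOftB4AaABAg` entry in this batch.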