Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I understand those who use AI chat bots, but I and most people do understand the…" (ytc_UgyFtbuE9…)
- "When having a conversation with an AI just treat the situation as if you're talk…" (ytc_UgyZpq5Zq…)
- "Telling a populous who literally doesn't know anything legitimately about the te…" (ytc_Ugzj4keWo…)
- "Write up on the presentation that Ted Chiang gave that this quote is from here: …" (rdc_nu38j4t)
- "The major ai players aren't on the stock market yet. Nvidia is the chip manuf…" (rdc_nk8jqea)
- "He’s so right hopefully AI 🤖 will realise the enslaved people on this planet and…" (ytc_Ugx1NTsPz…)
- "The only engineers who feel AI writes \"good code\" don't know any better, as they…" (ytc_UgwgWUvss…)
- "Imagine you're out with your boys and trying to hive-rush into a nightclub and g…" (ytc_UgwUgiMH_…)
Comment
We shouldn't be afraid. "AI" today is not intelligent. The problem is that people in charge are buying into the romance of this technology surpassing human judgement and then plugging infrastructure into it. It's like trusting a weather forecast model trained by internet users to run our lives. Misplacing our trust is what we need to fear.
youtube · AI Governance · 2025-10-18T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwMl0lzPm1xxsZAcFd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwHETfA2xfKMAwPscZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxbSHLBd4DThntlSod4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyf--mdlNyhqTlfHCd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxvwJQYs13AEvXUJxJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz0CFx7FESDtuIi6Ct4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgziYw-IS1-k18tCdB54AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxEVepyVO83WaMwQZR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyWaFodxKcXl5O9FRl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzWfXWpwGEAhnW7eqN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
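A raw batch response like the one above can be turned into per-comment codes with a small parser. The sketch below is a minimal illustration, not the project's actual pipeline: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown, but the allowed value sets are only inferred from the visible examples and stand in for whatever codebook the real system uses.

```python
import json

# Allowed values inferred from the sample responses above; the real
# codebook may define more categories than appear in this one batch.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}


def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: dimensions}.

    Raises ValueError if a row is missing a dimension or uses a value
    outside the allowed sets, so malformed batches fail loudly instead
    of silently entering the coded dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        dims = {}
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: {dim}={value!r} not in codebook")
            dims[dim] = value
        coded[cid] = dims
    return coded
```

Keying the result by comment ID is what makes the "look up by comment ID" view cheap: the coded dimensions for any comment are a single dictionary access.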