Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `rdc_m26ol5b`: Because that means companies that own automation technology (not klarna) ca…
- `ytc_UgyXssjMv…`: Do not overlook the position paper "Copyright Registration Guidance: Works Conta…
- `ytc_UgxKmz_Zs…`: i wonder if gpt6/7/8/9.... once it reaches superintelligence, could hypothetical…
- `ytr_UgwmlDTAA…`: People like you live under a rock. You have for several months top level open so…
- `ytc_Ugyl2D7vi…`: Why would you want the government having any part or controll with AI? I underst…
- `rdc_f506xlr`: Good! We need that sort of education where our systems have so obviously failed …
- `ytc_UgznwBqFx…`: People should be very careful about the use of AI. Knowledge is power and the mo…
- `rdc_g9tg2lq`: Because certain political figures have touted it as a solution to tge pandemic t…
Comment
The "Gorilla Problem" analogy is actually terrifying when you think about it. If we are building something smarter than us, we have to be sure it's aligned. But honestly, the more immediate problem for me is aligning my budget with all these new models. I canceled my direct OpenAI and Anthropic subs because it was getting too expensive to "keep up with the race" Stuart talks about. Switched to omnely so I can access all the top models in one place without going broke before the singularity hits.
youtube · AI Governance · 2025-12-07T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgylOcMtmfYPRLyA_uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz0s_5F0fL7Yc6h9pB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwCTFAC3tuaqQyd4rJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmHU68lQswaDEmhOd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy9c5M8aiFACFvwDkd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzslRkuK_KSVVjq6CV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxqiq20CC4lLEtT6Oh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAcj9D3tb7hktFuIJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUI-XmKN6ijviTF6N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyW72yvXfzgauYvOVF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
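The raw response is a JSON array with one coding record per comment. A minimal sketch of how such a response could be parsed into an ID-keyed lookup and validated; the allowed values per dimension below are inferred from the samples on this page, and the actual codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (an assumption; the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) into a
    dict keyed by comment ID, rejecting any out-of-vocabulary value."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Usage: look up a single comment's coding by its ID.
raw = ('[{"id":"ytc_UgylOcMtmfYPRLyA_uV4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"unclear","emotion":"outrage"}]')
codings = parse_codings(raw)
print(codings["ytc_UgylOcMtmfYPRLyA_uV4AaABAg"]["emotion"])  # → outrage
```

Keying by comment ID matches the "Look up by comment ID" workflow above, and the validation step catches a model drifting outside the coding scheme before the record is stored.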