Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "100% this. This is the singularity sub. All we care about is acceleration. OpenA…" (`rdc_m98fdru`)
- "AI is incredible to inspire actual artists and the anti-AI brigade has become ex…" (`ytc_UgzPTeba4…`)
- "I've been doing something like that with vs code copilot and it's better, game c…" (`ytc_Ugz9Yt1vL…`)
- "Yes, coding will not go away. AI tools will imporve and likely support, but we s…" (`ytc_UgyTHc_7z…`)
- "It seems you guys are assuming that LLMs can only be bad for people with mental …" (`ytc_Ugxbk-hDV…`)
- "i tried to explain this to someone online and their response was the reference a…" (`ytc_UgxF6m_3b…`)
- "AI art replicates the way humans create art, and indeed a good artist can create…" (`ytc_UgxcGHuxd…`)
- "If a business you spend money fires everyone for ai just easy stop buyinh there,…" (`ytc_Ugwa82e4A…`)
Comment
Well yeah we still don't know how the brain works exactly and yet they're trying to mimic one, it's very possible they won't reach AGI within the next 100 years because of that. Yet a few lucky breakthroughs could also attain it within a month.. who knows, maybe our brains mimic thinking the same way AI ends up doing.
Either way the US would never risk China getting it first.
youtube · AI Governance · 2025-12-07T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxRMlkPWGZmJGP-Let4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxRrW1If8xX27oRAgx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxGO4IXsZSM7ncU14Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxRt46Pmx0VD_lrllp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwQtHxKf06CvG_5N294AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-1_DRHgpA2F-C5RN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxH2mgWIi_roUFOzht4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1Xt9-0rHI93CwGip4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdL1inWvEHlyr3gvV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7wnUK14_gKgXp9mR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
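The raw response is a JSON array with one object per comment, keyed by `id`, which is what makes the "Look up by comment ID" view possible. A minimal sketch of that lookup (Python assumed; `raw_response` is abridged to two rows of the array above):

```python
import json

# Abridged raw LLM batch response: one coded record per comment ID.
raw_response = """
[
  {"id": "ytc_UgxRMlkPWGZmJGP-Let4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw1Xt9-0rHI93CwGip4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the batch by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for one comment, or None if it is absent."""
    return codes.get(comment_id)

record = lookup("ytc_Ugw1Xt9-0rHI93CwGip4AaABAg")
print(record["policy"], record["emotion"])  # → regulate fear
```

The second record here carries the same four dimension values shown in the "Coding Result" table above (government / consequentialist / regulate / fear).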