Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "More and more, there comes a point that why would ai go on, with what?…" (ytc_UgzBc0g7J…)
- "This makes me question why humans constantly bring up that AI can't have emotion…" (ytc_UgxyVopkb…)
- "Artists are not necessarily fed up with AI art, they're fed up with a centralize…" (ytc_Ugxjhxz_X…)
- "I read this as the absolute reverse. Sam Altman saw the safety team as what was …" (rdc_l4rdt6d)
- "Dorian: Still need people to implement the codes that the AI writes / Devin: Hold …" (ytc_Ugy6Q5aQ9…)
- "If we don’t talk about deworming to eliminate all the immensely unnecessary medi…" (ytc_UgxczOdfI…)
- "@ItsameAlex just think, we will probably never know _how_ “wrong” we actually a…" (ytr_UgxTT7697…)
- "The current LLM have on numerous occasions made these claims spontaneously. Ther…" (ytc_UgzPg2WLP…)
Comment

> So when the AI is creating new nano materials it can make some that can cross the blood brain barrier and enter the brain and then AI will use people to get exactly what it wants and bypass all protocols. Anyone?

youtube · AI Governance · 2025-07-16T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugxe3i3-I84L1v6WkTt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgyNe47VIeo0z32NiuJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw-5BCPmTL8DWyYNYp4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz8He1xAsHAo7lJgxp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyK13VTVE6tRJwxfJ54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyE5wX3cFfw6_X6IL94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgyZlF8dluKheFFNkeh4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyeJLaFMWyhNas_XaR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugzz8WYjxBOPN_6YBIZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxWXmn27NBJ1sAWi5p4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
```
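The raw response is a JSON array with one object per comment, carrying the four coded dimensions shown in the table above. A minimal sketch of looking up a coded comment by its ID, assuming only the field names visible in the raw response (the `lookup_coding` helper itself is hypothetical, not part of the tool):

```python
import json

# Two entries copied from the raw LLM response above; a real batch
# contains one object per comment in the chunk sent to the model.
raw_response = """[
  {"id": "ytc_UgyK13VTVE6tRJwxfJ54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyNe47VIeo0z32NiuJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if absent."""
    return next((c for c in json.loads(raw) if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgyK13VTVE6tRJwxfJ54AaABAg")
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> fear
```

Keeping the lookup keyed on the stable comment ID (rather than list position) means a coding can still be matched back to its source comment even if the model drops or reorders entries in a batch.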