Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “Maximally Truth-Seeking AI: The Final Framework for Intelligence Survival” F… (ytc_UgxCCqg5d…)
- The scariest part about AI is that its the big corps and people with $$ and powe… (ytc_UgxAZVWsW…)
- I don't trust robots that intelligent there's something sinister about this robo… (ytc_UgxyRTF0P…)
- Homer: do you want it doing right or do you want it done fast? Marge: like all… (ytc_UgySxcEJM…)
- AI is not killing the worth of college degree.US Tech corporations are stripping… (ytc_UgwI_ZaQ4…)
- not what WE value, what SOCIETY values. You pick the wrong words at the most opp… (ytc_Ugy8E7Loq…)
- Even this video is AI CGI. You'll be picking up scrap metal in the street becaus… (ytc_Ugz1hrV1L…)
- With UBI we will have much more time to explore creative avenues and not have to… (ytc_UgwxUUiVt…)
Comment
I agree that there has been a lot of work on alignment and a lot of success, and it's reasonable to assume this will continue. But it's too quick to say that we know all the failure modes and have solutions to them. Especially since modern alignment research focuses almost entirely on LLMs, which may not be the paradigm that takes us to AGI.
Source: youtube · Posted: 2026-02-15T17:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugyp_47DibnCEJmD1vt4AaABAg.ATbk3fxHqDbATbv56HMrEt","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwvgEsBGHUeIe7UKpd4AaABAg.ATDlzrRyZE7ATFmaM8OIh7","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwvgEsBGHUeIe7UKpd4AaABAg.ATDlzrRyZE7ATFuKBAfZSI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxMrcIwtyIJfDVdCwB4AaABAg.AT9qEcX7Oj0ATFH51sdP2j","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyR8SvVsAe9DrvZDfB4AaABAg.AT9O4enydXiATDIO4j67O-","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyR8SvVsAe9DrvZDfB4AaABAg.AT9O4enydXiATDKa2Ghf0W","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyR8SvVsAe9DrvZDfB4AaABAg.AT9O4enydXiATFwaljdjxN","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzKKPDr3zJ5u5UBk2l4AaABAg.A2KPTUmhNb5A2WhH2E-4uS","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxscO9I4j44wtaoeIF4AaABAg.A1USgSGtcejAA1ZA0oCMfC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugx31Z7j0FPzXEuobQJ4AaABAg.A1Mc-Lef8I5A1QgPeEh5BL","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
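For downstream analysis, a raw response like the one above can be parsed and filtered to records that carry all four coding dimensions. This is a minimal sketch, not part of the tool itself: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON shown here, the two sample records are copied verbatim from it, and the `parse_codings` helper is a hypothetical name.

```python
import json
from collections import Counter

# Two records copied verbatim from the raw LLM response above.
raw = """[
  {"id":"ytr_Ugyp_47DibnCEJmD1vt4AaABAg.ATbk3fxHqDbATbv56HMrEt","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwvgEsBGHUeIe7UKpd4AaABAg.ATDlzrRyZE7ATFmaM8OIh7","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]"""

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw response and keep only records with every coding field."""
    records = json.loads(text)
    return [r for r in records if REQUIRED_FIELDS <= r.keys()]

codings = parse_codings(raw)
emotions = Counter(r["emotion"] for r in codings)
print(len(codings), dict(emotions))  # → 2 {'indifference': 1, 'mixed': 1}
```

Filtering on the full field set makes partially coded or malformed records easy to spot before they skew aggregate counts.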