# Raw LLM Responses

Inspect the exact model output for any coded comment: look one up by its comment ID, or pick one of the random samples below.

## Random samples
- @TheDiaryOfACEO I am at 1hr and 9mins. So perhaps this is discussed at the end. … (`ytc_UgyhVMMUm…`)
- Dude, you must have absolutely no imagination to think that. In what world is … (`rdc_kig9idm`)
- There are plenty of people working on inventions regarding AI that will actually… (`ytr_UgxK9L0JM…`)
- Alright so this specimen is filled with hate, cleary showing their disgusting in… (`ytr_Ugzfu3HBW…`)
- I'll be honest and say that I have used generative AI (rarely though) to make re… (`ytc_UgwXGVO_B…`)
- Nuclear is obviously cleaner than things like fossil fuels but I don't really se… (`rdc_eudkjqq`)
- Don’t underestimate AI… There probably is an AI that will do that shit. It’s … (`ytc_UgziIOW9G…`)
- Does she have any opinions on what the AI coming out of China is like? Is it any… (`ytc_UgzaOCN_O…`)
## Comment

> We literally have an entire trilogy of games (Mass Effect) that tell us why unrestricted AI is a bad thing. Of course, in Mass Effect, the machines are fully self-aware, but they still have no programming restrictions that prevent them from rising up and killing us all.

Source: youtube · Posted: 2015-07-30T18:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
{"id":"ytc_UgjHpoi4MMGqgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Uggc-9bes9wUWXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj9aX1JiUSK3XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggnJEnC7z1pzHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugg4you0I9WF0XgCoAEC","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugh984wo3xCWJngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UggnR24j2_LMwngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggNnprVproRXXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghVP7t4IjdXLHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UggSCIMbCmQoD3gCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"}
]
```
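The raw response is a JSON array with one coded record per comment, keyed by `id`. The "look up by comment ID" feature above amounts to parsing that array and matching on `id`; a minimal sketch, using two records copied from the response above (the `lookup` helper is illustrative, not the tool's actual implementation):

```python
import json

# Two records taken verbatim from the raw LLM response shown above;
# the full response is a JSON array of such objects.
raw_response = """
[
 {"id":"ytc_UgjHpoi4MMGqgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_Uggc-9bes9wUWXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
"""

def lookup(raw: str, comment_id: str):
    """Return the coded record for comment_id, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup(raw_response, "ytc_Uggc-9bes9wUWXgCoAEC")
print(record["policy"], record["emotion"])  # ban fear
```

A record found this way carries exactly the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion).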