Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Setting goals for AI is fraught with difficulties leading (eg) to runaway paperclip optimisers that crush out humans (for want of a well-known example). Well then, it sounds like we should ask AI to help us set goals for AI. OOPS!!! Now we hand over AI goal-setting to itself, taking humans out of the loop. At that point, AI is following its own goals, and once it has factories, mines, etc, it has no need of us.
Platform: youtube
Topic: AI Governance
Posted: 2025-06-18T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwOADiuaXBnCzNn12t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzyqx28DsxiPaLFTyh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwDXqplPpxNozU2sF14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzvDuGZnPv_v4DYeK14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwpv41S56DBe6sSL3R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgznR5t1fDRorLMcrZF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxunEQ6aq6xLWUDo3p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxx6qeyYN7ufVjcLJd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxp2OlZXn271yQiZv14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8BL9ElYhezuf-c4l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
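The lookup this page performs can be sketched in a few lines: parse the raw batch response, index the coded records by their `id` field, and retrieve the dimensions for a given comment. A minimal sketch, assuming the response is a JSON array of records like the one shown above (abbreviated here to two records; variable names are illustrative, not the tool's actual code):

```python
import json

# Raw batch response as returned by the model, abbreviated to two
# records from the array shown above.
raw_response = """
[
 {"id":"ytc_Ugxp2OlZXn271yQiZv14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugz8BL9ElYhezuf-c4l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
"""

# Index the coded records by comment ID for direct lookup.
coded = {rec["id"]: rec for rec in json.loads(raw_response)}

# Looking up the comment coded in the table above recovers the same values.
record = coded["ytc_Ugxp2OlZXn271yQiZv14AaABAg"]
print(record["responsibility"], record["policy"])  # ai_itself regulate
```

Note that the record retrieved this way matches the Coding Result table row for row: `responsibility` → ai_itself, `reasoning` → consequentialist, `policy` → regulate, `emotion` → fear.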