Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Wait, your non-narrator voice, at the end sounded like you were using an AI voic…" (ytc_Ugy27GWyd…)
- "In the end, AI will never be a evil as human beings. You just can't program in …" (ytc_UgwTZiz9C…)
- "Creating art is the livelihood of many artists with disabilities. Using their ar…" (ytc_UgyXQPokw…)
- "Will the different ai be racist against each other after they select what race t…" (ytc_Ugy4ihfRT…)
- "I agree, the way they describe howlrounding in the paper is instead similar to w…" (rdc_mul25xl)
- "I have found that the best way to limit latent space in LLMs is through logical …" (ytc_Ugwphqor2…)
- "Deep Research is solid - not perfect, but incredibly useful. "AI isn't use…" (rdc_mdsz8fo)
- "We appreciate your reaction! If you're interested in engaging with advanced arti…" (ytr_Ugwy-z15D…)
Comment
"What if you program them in correctly?" is a pretty silly question given that humans make mistakes all the time. I'm sure mistakes will happen also with robots, but overall automation tends to increase safety by orders of magnitude. When you think about it you'll also realize that the destructive capacity these things will be severely limited by ammo and fuel Skynet just isn't an issue.
Platform: youtube · Posted: 2012-11-23T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwkTd4vXc32HI_5tfh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy9mBoaAtemGq2dYNB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzGj_CMgD8AM9wGPKZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrtkvP9hq0PCsJ3EZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzzp5_RDZwOLFyXjLN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyD5pyxiyLGw_kg4w54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyKGNy6C-78aOwLCFV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwvf7PN3ITtzuv76et4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxSaelGApyYIXLQfkh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyqYzqMuGvNtO2BBwV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
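A raw response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal example, not the tool's actual pipeline: it parses the JSON array and rejects any record whose value falls outside a hypothetical codebook, whose allowed values are inferred only from the sample output shown here.

```python
import json

# Hypothetical codebook: allowed values inferred from the sample response above.
CODEBOOK = {
    "responsibility": {"none", "user", "developer", "government",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "industry_self", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation",
                "indifference", "mixed", "unclear"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and reject malformed or off-codebook records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# Example record in the same shape as the response above (id is made up).
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
records = validate_response(raw)
print(len(records))  # 1
```

Failing fast here means a hallucinated label (say, `"emotion": "joyful"`) surfaces as an error at coding time rather than as silent noise in the aggregated results.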