Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @DAVID-ql1vo There are no self aware AI's and there won't be for a long time.… (ytr_Ugxx-y4CG…)
- Bold of you to assume that from this moment up to infinity AI couldn't create th… (ytc_UgyLpQKE6…)
- It’s quite interesting reading about the neural networks he pioneered that resul… (ytc_Ugz0Fl0ms…)
- I’ve noticed that AI-generated videos don’t do well with written words. The word… (ytc_UgydTerpI…)
- I completely agree. I've had a lot of fun playing with AI and code generation s… (rdc_moywep5)
- Don’t be a fool the u.s government has some of the most sophisticated CCTV face … (ytc_UgxmrRoIS…)
- I used to draw when I was younger, then because of depression, I didn't draw any… (ytc_UgzZMK6zn…)
- That's why Dr House breaks into patients houses and discover the stupid shit the… (ytc_UgxKRJmy_…)
Comment
@theofficialness578 I think it will do what it's taught is best. If we're not teaching it value for the human life, then how will it learn? It needs to be brainwashed to believe that humans are detrimental to even their own existence, that we are the utmost important component of life, that every single human life needs to be protected. It needs something similar to the Ten Commandments mandatorily programmed in every model. A Code of Ethics. If we're going to create something and equip it with tools that could be used against, we should at least create it to not want to.
There are some humans that would much rather remove those ants, and that's because of compassion. Unfortunately with us, some have it some don't. But if we had the ability to "program" people, everyone would. We have the ability to program AI.
youtube · AI Governance · 2025-10-03T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugxd6tb0mcBPAFB6q_p4AaABAg.ANs-vhVBzp3ANwSq_Y113w","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzTtu-5wdkpZHWiaap4AaABAg.ANqls69G3lRANspG5ACv6p","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxGK8O6H57EdRHIuvp4AaABAg.ANqiRcd0e1BANsr9vyIaf3","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANoXlKVUGxO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANpTWQ6hEuc","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANq7fuyysxH","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANq7xFkWLN2","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwN1ljQ2pitfzFPVr14AaABAg.ANmVl2TfD6OANq1ZNvNc4W","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwN1ljQ2pitfzFPVr14AaABAg.ANmVl2TfD6OANqP_3ZvdhS","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_UgybxWrl9nJzsKEejot4AaABAg.ANmPP7QKWYsANmS3TL3Mgr","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
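The raw response above is a JSON array with one record per coded comment, each carrying the four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response might be parsed and validated before display is below; note that `validate_response` is a hypothetical helper, and the allowed-value sets are inferred only from the codes visible in this section, not from the project's full codebook.

```python
import json

# Allowed values per dimension, inferred from the codes displayed above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "approval", "indifference", "fear", "resignation"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and check each record's coded dimensions."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, vocab in ALLOWED.items():
            value = rec.get(dim)
            if value not in vocab:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Usage with a single-record response in the same shape as the array above:
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
records = validate_response(raw)
print(len(records))  # 1
```

Rejecting out-of-vocabulary values at parse time is one way to catch the occasional malformed or hallucinated code before it reaches the results table.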