Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Comment

> I think "what if they are programmed incorrectly" is a bad argument since human error will always be present. That is, in the same way that a person will fail to make functional AI, a person will also fail to make functional legislation against AI. We have to assume that both jobs will be taken by capable people who will do the job right.

Source: youtube · Posted: 2012-11-23T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx5dP8NJy371uDnZUl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwK3RxPwZd6sbJMWYN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzP6JxLsp7G-_zPJjF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugz-PYlQsmI6wyKavGx4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyQsa-3IYrvs8lr_RR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz06NtLdw7g-t0ZaYB4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw8QdQjcPI7G0qBQ6Z4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwBIGlJlUEPCE009EF4AaABAg", "responsibility": "developer", "reasoning": "contractualist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugy5yevvSwWXXGsNSft4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy9m0mE6K9DhDieUXp4AaABAg", "responsibility": "government", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"}
]
```
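Since the raw LLM response is a JSON array of per-comment codings keyed by `id`, looking up a coding by comment ID amounts to parsing the array and indexing it. A minimal sketch (the two sample records are copied from the response above; the helper name `index_codings` is hypothetical, not part of the tool):

```python
import json

# A raw LLM response: a JSON array of coding records, one per comment.
# These two records are taken verbatim from the response shown above.
raw_response = """
[
  {"id": "ytc_Ugx5dP8NJy371uDnZUl4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzP6JxLsp7G-_zPJjF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID."""
    codings = json.loads(raw)
    return {record["id"]: record for record in codings}

by_id = index_codings(raw_response)
print(by_id["ytc_UgzP6JxLsp7G-_zPJjF4AaABAg"]["policy"])  # regulate
```

In practice the parse should also be validated (e.g. reject records missing any of the four dimensions), since the model output is not guaranteed to be well-formed JSON.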