Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Can't wait for someone to LITERALLY have their brain cooked accidentally by an A…" (ytc_UgzD8RjRB…)
- "reminds me of GTAV when LIFEINVADERS CEO gets killed by a planted cellphone that…" (ytc_Ugz493NCG…)
- "@Jabberwockybird bought by private equity, most videos pop-up just to place ads…" (ytr_UgwMn1ZoZ…)
- "I did this with 4 different ai detection sites except I used the bill of rights…" (ytc_UgzwTws4h…)
- "I also recommend using something like Nightshade over images of knit, crochet, o…" (ytc_UgygTEsE7…)
- "Because then people are going to start sexualizing her or get the wrong idea and…" (ytr_UgxMYbGT7…)
- "So it wasn't ai that fried his brain it was negligence of the full information…" (ytc_Ugz6rQYx4…)
- "Wow this was so fun to watch! I think humanity is just scared about the future w…" (ytc_Ugz8ryjbM…)
Comment
Solution:
Hardwire the three laws of robotics into the core program:
1) A robot may not harm a human being or, through inaction, allow a human being to come to harm.
2) A robot must follow the orders given to it by a human being, as long as the order doesn't conflict with the first law.
3) A robot must protect its own existence, as long as said protection doesn't conflict with the first and second laws.
Added laws:
4) A robot must always tell the truth (transparency).
5) A robot must co-operate with human beings, as long as that co-operation doesn't conflict with the first, second, and third laws.
6) A robot must co-exist with humanity in peaceful harmony.
Source: youtube · 2025-10-09T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugzh5IxttdLsViZup7p4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzI4bCiLeIAwI0-wRp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyCX4yo4x3aUzt2hBZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxu9xp87f6aXpauxKF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5tS2xaad-TRdNpmh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxSUOyu1sa6Y1Njcx94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyHroRQ8-M4HyEf-0F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgykqwMzD-a-OgdmWql4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwEhCVPc3TQ_WPUyYF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzLk2N19l0OG_Xiwdt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
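The batch response above is a JSON array of records, one per comment ID, with one value per coding dimension. A minimal sketch of how such a response could be parsed, validated, and indexed for the by-ID lookup shown above — the per-dimension vocabularies below are inferred from the values visible in this section and are an assumption; the real codebook may define more categories:

```python
import json

# Excerpt of a raw batch response in the format shown above.
raw = '''[
 {"id":"ytc_UgzI4bCiLeIAwI0-wRp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgwEhCVPc3TQ_WPUyYF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]'''

# Allowed values per dimension, inferred from the codes visible here
# (assumption: the actual codebook may be larger).
VOCAB = {
    "responsibility": {"developer", "company", "government", "distributed",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "mixed", "indifference"},
}

def validate(records):
    """Index records by comment ID and collect any out-of-vocabulary
    values the model emitted, as (id, dimension, value) tuples."""
    by_id, errors = {}, []
    for rec in records:
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
        by_id[rec["id"]] = rec
    return by_id, errors

by_id, errors = validate(json.loads(raw))
print(by_id["ytc_UgwEhCVPc3TQ_WPUyYF4AaABAg"]["policy"])  # regulate
print(errors)  # [] -- every value falls inside the inferred vocabulary
```

Validating against a fixed vocabulary before storing codes catches the common failure mode of the model inventing a label outside the codebook.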