Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Yes I would still be this critical", hit the nail on the head, the only differe… (ytc_UgxKYivYa…)
- Finally, someone on YouTube actually points out the stupidity of these "defenses… (ytc_UgwTzTfHi…)
- Hi Cal, think this works. Having a separate upload to centre on the AI stuff mig… (ytc_UgxDn7_JL…)
- I’m the only one in my friend group who doesn’t have character ai besides one ot… (ytc_Ugyn0Fz1q…)
- I am ok with living in 2020s but this ai stuff is making me wanna go back… (ytr_UgyxXwDB3…)
- About a year ago, Saagar said AI will improve productivity and will not cause lo… (ytc_UgzaCYPjg…)
- AI bros be like, "AI isn't a scam! You're just afraid of being replaced!" then p… (ytc_UgxYDtd_x…)
- 0:21 If Nightshade ever stops becoming a viable way to poison AI art will just f… (ytc_Ugxo7lg_m…)
Comment
Isaac Asimov's 3 laws in AI.
• The First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
• The Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
• The Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Source: youtube · 2023-09-15T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwEKgeFmE_vAH57uL94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwRYK2KOp80bf6UxDd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5QbRVkE8k9BYNbS54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw1BUHkeRMOSh11_6B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw_oQXSfjyAfCnKYql4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzKiR9T1TFevf7yOvN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw5GmQhNpykH4oiCkp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzPe1RZ3FpUvhTtiQB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxPkYzgKmtN3b9ebzJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSDwuhkcYmp4x8sMB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
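A batch response like the one above can be parsed into a per-comment lookup with a small validation pass. The sketch below is a minimal, hypothetical implementation: the allowed category sets are inferred only from the values visible in the samples above (the full codebook may define more), and the example IDs are made up.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# Hypothetical: the project's actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "mixed", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coded rows)
    into {comment_id: {dimension: value}}, rejecting rows that use an
    unknown dimension or a value outside the codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row.pop("id")
        for dim, value in row.items():
            if dim not in ALLOWED or value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = row
    return coded

# Usage with made-up IDs (real IDs look like ytc_Ugw...):
raw = """[
 {"id":"ytc_example1","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_example2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]"""
result = parse_coding_response(raw)
print(result["ytc_example1"]["policy"])  # regulate
```

Raising on out-of-codebook values, rather than silently coercing them, makes it easy to spot the occasional malformed LLM row before it contaminates the coded dataset.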