Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugz1oO7Al…: To be fair, there must be thousands of times human drivers do stupid sh*t like …
- ytc_Ugy-SFBWf…: I'm a time travelling pleadian. Working on keeping you stupid humans alive. I've…
- ytc_UgzlQW2VA…: here’s the thing cameras are all around and we can’t deny that and if we’re not …
- ytc_UgwlZCW6s…: Disney wants to eliminate competition for their own Ai slop. Coming soon to Disn…
- ytc_Ugx97rG3Q…: Meanwhile Oracle laid off 30,000 employees last week. " Coding is ending this ye…
- ytr_Ugytao46q…: this is too narrow of a mindset. AI won't look anything like it does today 10 ye…
- ytc_UgxstcQUe…: Humans are evil, and who created ai? Humans, We are horrible, some of us kill an…
- ytc_UgwipBOGj…: Dear Turtle, as for the part 2:30-3:00, basically smaller models - this is alrea…
Comment
He's a fan of sci-fi but doesn't mention the book, I, Robot by Isaac Asimov, where there are the Three Laws:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Source: youtube · Posted: 2026-02-18T05:4… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy_ktNRq9GhPaQzxZ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwhSv9TXepfYzfQAfR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugys7Q9NazuUV7PwhYB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzqTlB41VXFeorCsMB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgydBHOjvEO5Aeo72ch4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxIjc0JMrHvxzcvlJF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugyu89iTdEuGxgXyd7h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwavegenCr_-Jazi5V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzE9ScnVhqerTsdVqx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwoNFNeinU9xv6JW5p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
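The raw response is a JSON array with one object per comment, carrying the four coding dimensions shown in the table above. A minimal sketch of how such output might be parsed and validated before being stored (the field names come from the response itself; the sets of allowed category values are assumptions inferred from the values visible in this sample, and the real codebook may define others):

```python
import json

# Allowed values per dimension (assumed, inferred from the sample
# responses on this page; the actual codebook may differ).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_codes(raw):
    """Parse a raw LLM response and keep only rows that fit the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row needs a string comment ID.
        if not isinstance(row.get("id"), str):
            continue
        # Every dimension must be present with an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical example row in the same shape as the raw response above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"indifference"}]')
print(parse_codes(raw))  # the one valid row survives
```

Filtering rather than raising keeps one malformed row from discarding an entire batch; rejected rows could instead be logged and queued for re-coding.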