Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugz13TrUq…`: "Heyyy your videos are really informative and thanks you for the amazing content …"
- `ytr_Ugxg-4iD6…`: "Betch, the problem is that we are inventing "General Intelligence". In the first…"
- `ytc_UgzcpGGTM…`: "All this wonderful deepfake technology and the young shivarajkumar in Ghost look…"
- `ytr_UgyQH9pBx…`: "Humanlike Writer has been working great for my affiliate content - it passes AI …"
- `ytr_UgwMQi489…`: "Hey @Mille..! Thanks for your comment, it had me laughing harder than a malfunct…"
- `ytc_UgxJ3TK-Z…`: ""An artist has no right to speak to gods". Gods as random dudes that have no tal…"
- `rdc_e7j7w3s`: "I think the bigger meta "story" is that we as humans are handing over tasks to "…"
- `ytc_UgwxvGtCD…`: "Everything I’ve asked Claude to do it has gotten horribly wrong. And it has been…"
Comment
Giving robot sentience is a bad idea. Have you never watched any robot movies. Also when making a robot you have to abide by the rules of robotics that robots must always do everything their owner says if it doesn’t infringe with rule 2. Rule 2; the robot cannot harm or let anyone be harmed.
youtube · AI Moral Status · 2018-07-13T18:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugx7YznFYEUKkMe1iBd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzahW5WKawAqoKCB7t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRHWKvJT8IhKO-_qF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbgNKJMW57e2gSy1B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzYCJpRzmrEA7SN_ll4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7dI6ViiYSCEbnzft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzinrD6hweefSHzu-x4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy9MR1jF5P4ZT51IHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy75Vkh-6d8zWFeqFZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnZ11_1Tt2abQ2lgh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}]
```
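The raw response is a JSON array of per-comment records keyed by comment ID, with the dimensions `responsibility`, `reasoning`, `policy`, and `emotion`. A minimal sketch of how such a payload might be parsed and looked up by ID (the field names come from the response above; the helper function itself is illustrative, not part of the tool):

```python
import json

# Two example records in the batch format shown above (IDs and values
# taken from the raw response; any real payload would have one record
# per coded comment).
raw = '''[
  {"id": "ytc_UgzahW5WKawAqoKCB7t4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy75Vkh-6d8zWFeqFZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

def index_by_comment_id(payload: str) -> dict:
    """Parse a batch coding response and index its records by comment ID."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw)
print(codes["ytc_UgzahW5WKawAqoKCB7t4AaABAg"]["emotion"])  # fear
```

Since model output is not guaranteed to be well-formed JSON, production code would wrap the `json.loads` call in error handling and validate that each record carries all four dimensions before trusting it.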