Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If people are that concerned about self-driving cars, then they should be voting…" (ytc_UgwkPsRWz…)
- "Kinda doubt it. But if that really happens and the AI does end up whipping the e…" (ytc_UgxlLjuZ2…)
- "And yet mankind want to destroy its own people by replacing AI with man force ! …" (ytc_UgwQZjRDN…)
- "What's critically missing is Issac Asimov's Three Laws for Robots: "The first…" (ytc_UgzB9ylG0…)
- "Do AI have a degree. Why does the teacher have to have a degree. These kids are …" (ytc_UgzOGvJz2…)
- "AI artists arent artists, they are prompters. Let us be real! Humans are inheren…" (ytc_UgwvilNyI…)
- "Except for the good, all the AI people like Wall-E cuz somehow they are more hum…" (ytr_Ugzx6qkJV…)
- "The experts always say were five years away from the ultimate game changer in te…" (ytc_UgyAQKeAn…)
Comment
If you strip ai from all the data it learned and trained on it wouldn't be able to generate anything so to argue that it's like human training is just plain tomfoolery, the original artwork's role in human inspo ends there . At the inspo part. And nothing more. Here the original work is fundamentally crucial to get an end result at all.
Source: youtube · Posted: 2025-04-13T06:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgxvcHb4Vde4mMxiNSR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzqjKHaYtMFOlrGo7B4AaABAg","responsibility":"industry","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgysmYUJoFDbz1LZSmV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyjQiiv6_h6Hn9LIDZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwpFm3GMA8AtmYRVHB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLrmmEzx0GmwhSa3t4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwLXYtOuaxvub-J_Yt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzEqw3Wd37fn6t9YH94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxBFdAoVjQrY11TJJx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwg8CJfojk22QSfhR14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]
```
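Because the raw response is a JSON array with one record per comment, looking up a coding by comment ID (as the search box above does) reduces to parsing the array and building an ID-keyed index. A minimal sketch in Python — the field names mirror the response above, but the two embedded records and the lookup itself are illustrative, not the tool's actual implementation:

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# (Two example records copied from the response above.)
raw_response = '''[
{"id":"ytc_UgxvcHb4Vde4mMxiNSR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwpFm3GMA8AtmYRVHB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]'''

# Index the records by comment ID so a single coding can be fetched in O(1).
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = codings["ytc_UgwpFm3GMA8AtmYRVHB4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # ai_itself mixed
```

In practice the parsed records would also be validated against the coding schema (the four dimensions shown in the table above) before being stored.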