Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxEQ6d0Z…: "When will we take down the data centers, built to record our every eye movement,…"
- ytc_Ugz_AE-Cp…: "To add onto your point about "inspiration", an example I often use when explaini…"
- ytc_Ugw-Zy66p…: "No! No no no! Stop it now! Shut it down! We don't need a real I Robot going on!…"
- ytc_UgyKiepod…: "And the funny thing is all these scientists and engineers are racing to advance …"
- ytc_UgzVag5G_…: "🚨 IMPORTANT: This isn't just Taylor Swift's problem—it's everyone's problem. Wha…"
- ytc_UgxbLCufL…: "I think AI will not replace artist tho, but the "People who use it" will replace…"
- ytc_UgxL_158j…: "Ring, Alexa and all ai is surveillance that people are installing to spy on them…"
- ytr_Ugw6onAEm…: "No, what you did was tell a generative ai what to make. You did not create it…"
Comment
Once you get rid of, the nation of India & all Indians who live abroad [1st, 2nd, 3rd generation & goes on]. Then, you minimize the danger [60%] of world population who sleeps & wakes with Artificial Intelligence, then you have the rest [40%] (2) clean the mess which you created & supposed you dreamed of bringing the balance in [21st century].
At the end of the day, you have reach the point (2) asking yourself what wrong you did, & still, you don't see it because you enjoy all the dirty benefits of Artificial Intelligence along with your friends [Elon Musk, Mark Zuckeberg ....].
At the end of the day, you will be responsible (4) the end of human kind in this planet because of the wrong policy you follow along with your rich friends
youtube · AI Jobs · 2025-12-10T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
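Each coded record should use only values from the closed codebook. A minimal validation sketch in Python, assuming a closed codebook; the allowed values below are only those observed on this page (the full codebook may contain more), and `ALLOWED`/`validate` are hypothetical names, not part of the actual pipeline:

```python
# Allowed values per coding dimension, inferred only from the labels
# visible on this page -- the real codebook may include others.
ALLOWED = {
    "responsibility": {"none", "government", "developer", "company",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological",
                  "mixed", "virtue"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record ([] if clean)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems

# The coding shown in the table above:
coded = {"responsibility": "unclear", "reasoning": "unclear",
         "policy": "ban", "emotion": "fear"}
print(validate(coded))  # []
```

A check like this catches a common LLM-coding failure mode: the model inventing a label outside the schema, which would otherwise silently skew downstream counts.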
Raw LLM Response
```json
[
  {"id":"ytc_Ugymab7WwQeZdbgymxJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxQGpS2gnwjviFrO7J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyJ3hMpDqSuutl33uR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwBL5cbzKBfEOUEQxV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzR5XsSOt0YbyN9HLh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyP1qn8vD-qOmbug0N4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2xWaj9Jx9OjpCF454AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgysO4RsbliQyvjFVkp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw1MB0apNlzZ8XDZg54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyc62dw3OCas3jTTRJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
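The "look up by comment ID" feature above implies indexing the parsed response by `id`. A minimal sketch, assuming the model returns a JSON array like the one shown; the two records here are copied verbatim from the raw response, and `by_id` is a hypothetical name:

```python
import json

# A raw model response: a JSON array of coded records, one per comment.
raw_response = '''[
  {"id": "ytc_UgwBL5cbzKBfEOUEQxV4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgysO4RsbliQyvjFVkp4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]'''

# Index coded records by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = by_id["ytc_UgwBL5cbzKBfEOUEQxV4AaABAg"]
print(rec["policy"], rec["emotion"])  # ban fear
```

In practice the raw string would come straight from the model API, so `json.loads` should be wrapped in error handling: truncated or non-JSON output is a routine failure when asking an LLM for structured responses.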