Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples

- Definitely agree. In the past, "AI" was a dangerous term to use when marketing y… (ytr_UgxtjMfky…)
- Lol what is this guy on about than maybe 10-15 years from now AI and robots will… (ytc_Ugyt-pFdc…)
- AI will be the death of jobs - There are numerous jobs that I would historically… (ytc_Ugz9qCHs4…)
- @7:48 "whether you'll be one of them". Because if you haven't already launched … (ytc_UgxC-3dUx…)
- Typing an AI prompt is closer to commissioning artwork than making it. It's like… (ytc_UgzzSlvFL…)
- I think weaponized AI is inevitable. Instead of human life, we would sacrifice a… (ytc_UgzQoiHSI…)
- Who cares? We asked for this with ai, and simps will always be a problem. Who ca… (ytc_UgzxPLVaQ…)
- Rereading Childhood’s End by Clarke led me to replace the Overlords with AI and … (ytc_UgzLZYVIc…)
Comment

> Remembering the movie Endhiran in which Robot refuses to obey the Scientist 🤣🤣 Time is not far away in reality 😅 Robots are going to dominate humans 🤣🤣 Sun never will set in the Robot Empire 😆😆

youtube · AI Governance · 2025-05-30T08:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id": "ytc_Ugz_E52oHfeofjIpmAx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxzRBcfQs-XTUFG7_l4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxwkSi4qWYy3OC1mFl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxETt_kq4JjXRLl0Sp4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz4efrA-LY8BpErvUR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwWb8uw0bObhpDm3rd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugx8gEg7BcR-P2V36hp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxYsAEEkG6kGUInAFB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw6wsbliaTWLgWL0sR4AaABAg", "responsibility": "developer", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzyRP2s3fEzSEGLYg54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```