Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
But depends on if AI really does think for itself and what sort of morality does…
ytr_Ugyki5ULO…
@spacekitt.n I've seen this argument for a bit, but the flaw of traditional pai…
ytr_UgzJV4kRL…
I went on to chatgpt and asked it this
According to the definition of art is ai…
ytc_UgwTKDyQ3…
A few different arguments could be made for why fighting against A.I. art may be…
ytc_UgxNfDGh_…
3:34 says so by the label "Harmful if swallowed." The AI chat bot really got int…
ytc_Ugws05jr2…
Well said. I have viewed it the same as you frame it for many years: its a cultu…
rdc_fwhdb33
Who cares what AI says about AI? A human would have to start this crap.…
ytc_Ugy1dOEJS…
Oh the be creative thing. You come up with one job, then AI takes over, you come…
ytc_Ugx_pkwM_…
Comment
I THINK THE MESSAGE HERE IS YOU CANT REPLACE A HUMAN SOUL, SAME AS YOU CANT DEFINE THE INFINITE WITH STUFF ITS WHATS CAUSING THE BIGGEST SICKNESS, ALL YOU NEED IS ALREADY WITHIN YOU BOUNDLESS JOY, AI IS USEFUL TO ASSIST PEOPLE THAT IS A GREAT ACHIEVEMENT BUT IT WONT REPLACE WHAT WE TRULY ARE, A BEING OF LIGHT HAVING A HUMAN EXPERIENCE, MEDITATE AND GET TO KNOW YOUR ONESS WITHIN, PRACTICE ASCEND AND MASTER THE GAME, LOVE THE SHOW BTW
youtube
AI Governance
2025-06-17T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugzw-nBwXiL36g06CfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgygzKIZ3z7RnuRRckN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLmSoZzZU5UR8GsEJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxbMLYCKypaaSYJiJh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwI3RcZSBVR3k6c9Jt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwX-oCXysD3oTDHWFN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzh8hrN7EiQEn_Nr7l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwUoxmgxBY6CwOUp314AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx52jcEzOT1Y0_vkX94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxrJWe6YaXk_oWV2G14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
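A raw response like the one above can be checked programmatically before it is accepted into the dataset. The sketch below is a minimal validator, assuming the category sets visible on this page (none, virtue, regulate, fear, and so on) are the complete codebook; the real project may define additional values for each dimension.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: derived only from the
# labels visible in this tool's output; the actual codebook may be larger.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "developer", "government", "unclear"},
    "reasoning": {"unclear", "virtue", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"resignation", "approval", "fear", "indifference", "outrage", "mixed"},
}

# The raw LLM response shown above, verbatim.
raw = '''[
{"id":"ytc_Ugzw-nBwXiL36g06CfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgygzKIZ3z7RnuRRckN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLmSoZzZU5UR8GsEJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxbMLYCKypaaSYJiJh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwI3RcZSBVR3k6c9Jt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwX-oCXysD3oTDHWFN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzh8hrN7EiQEn_Nr7l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwUoxmgxBY6CwOUp314AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx52jcEzOT1Y0_vkX94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxrJWe6YaXk_oWV2G14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]'''

def validate(records):
    """Check every record against SCHEMA; return (ok, list_of_errors)."""
    errors = []
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return len(errors) == 0, errors

records = json.loads(raw)   # raises ValueError if the model emitted broken JSON
ok, errs = validate(records)
print(ok, len(records))
```

Validating at ingest time catches the two common failure modes of LLM coders, malformed JSON and out-of-vocabulary labels, before either can silently corrupt the coded table shown above.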