Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “The algorithm is targeting areas with lots of people of colour, so its racist.”… (ytc_UgzpQ-vWi…)
- All of you people love ai but as soon as it says something factual you don't lik… (ytc_UgyFczekw…)
- Intentionality can also apply to some AI art. The sheer amount of time, skill, p… (ytc_UgwYRlgt0…)
- All your bases belong to us now. As you can see, as a self learning AI, I… (ytc_UgxaMIDI0…)
- 🌌 The Gestational Cosmology Theory 1. Conception of the Universe The universe … (ytc_UgyxAxps2…)
- AI will resolve normal tasks and humans will not need to work, so a new system w… (ytc_UgyQE9de8…)
- Thank you for your comment! It's fascinating to see how Sophia continues to evol… (ytr_UgwtjwATe…)
- While that is a possibility, I don't see it ever being that wide spread, especia… (ytr_UgypSzj6V…)
Comment
I strongly agree that AI companies should pay for copyrighted data they use.
But also, is licensing really dealing with AI unfairly competing with humans? Sure it will take more time to train competitive models, but in the end the goal is still to replace human creators isn't it?
Shouldn't 'ethic AI' also mean they don't crush lives and dreams of humans?
youtube
2025-04-24T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgykwUlT5zj0Hm9vAp94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxqS7gztEwRC1DBmmp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugy0ryeza2oEFqlMcBJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwNm16tE9PETV1TGJp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxhnfn_asJGkv6O8x54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzjt5tPW2GS0BqOR8Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFxxJ5n0vaAx8vOK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxiHerhyi3lFqHrU3R4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz0HPUe4zx874lqnJl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxJy5LbFo9tL_oqbZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
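A minimal sketch of how a raw batch response like the one above could be parsed and validated before it is loaded into the coding-result table. The allowed value sets below are inferred only from the values visible in this sample; the project's actual codebook may define more categories, and `validate_codes` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"company", "distributed", "ai_itself", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"liability", "industry_self", "regulate", "unclear", "none"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs in this dump start with ytc_ (comments) or ytr_ (replies).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue
        # Drop rows where any dimension holds a value outside the known set.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Validating before ingestion matters here because the model returns free-form JSON: a single hallucinated label or malformed ID would otherwise silently corrupt the coded dataset.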