Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzdE27xv…: "AI does not feel fear or compassion. Mercy and Loyalty are just words with defin…"
- ytc_UgwhwZpZy…: "Neat. I do wish we lived in a world that creatives didn’t have to devote time t…"
- ytc_UgynV6Mo2…: "Them: i love ai art. Why dont you? How dare you offend us? AI: takes their day …"
- ytc_UgylOF9aw…: "how about we save the worlds resources and not speak to them aka "AI" or the peo…"
- ytc_UgxexkQg6…: "We need to give name for Tesla autopilot. So feel free to leave , a name for tha…"
- ytc_Ugy4xgYWA…: "Its crazy that ai has shown promise in finding cancer early, yet people insist o…"
- ytr_UgxrXljgy…: "We appreciate your perspective. In our live broadcasts on AITube, we delve into …"
- ytc_UgxIUXZ3i…: "This doesn't work with current LLMs, they are heavily system prompted to respond…"
Comment
Automation is all about making life easier... until it impacts your field.
Best thing to have would be a solid system to look up and remove artwork from the training set that is not permitted for use by said art's creator. One with some solid guards in place against misuse.
It should also be kept as open source as possible to avoid corporate monopolisation of a skill democratisation tool.
Source: youtube · Video: Viral AI Reaction · Posted: 2022-12-27T13:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwnYMUbNM6yXDftmxB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugw9_qrk8AVy97tNJC14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwg0DveoGgSsETYpmt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyAB_XKta4GaFTBYf54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxRKxl7XrJCYrViX9R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwuMagaEuCUPKrH2nJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgydyF1ykZK36Fz3EFx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwCSBA9t3U1XuhKRnp4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugxkg5zQgRtT1REBdMt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy33KTt9cDA1v45ktJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
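The raw response above is a JSON array with one object per comment, so the "look up by comment ID" step amounts to parsing the array and indexing it by the `id` field. A minimal sketch in Python (the two rows are copied from the response above; the `lookup` helper name is an assumption for illustration, not part of the pipeline):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment,
# with the four coding dimensions alongside the comment ID.
raw_response = """
[
  {"id": "ytc_Ugwg0DveoGgSsETYpmt4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugy33KTt9cDA1v45ktJ4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "approval"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str):
    """Return the coding dict for a comment ID, or None if it was not coded."""
    return coded.get(comment_id)

result = lookup("ytc_Ugy33KTt9cDA1v45ktJ4AaABAg")
print(result["policy"], result["emotion"])  # prints: regulate approval
```

With the full (untruncated) IDs shown in the raw response, the same index also resolves the truncated IDs displayed in the sample list above.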