Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "3:27 This has to be the most pretentious thing I have ever heard. Thank the gods…" (ytc_UgyLrN3VN…)
- "Art is NOT acessible to all, it requires time, energy and money, things that the…" (ytc_UgxmE7zv5…)
- "I beg to differ on consciousness topic. I understand we might be using wrong wor…" (ytc_UgxXy0ghL…)
- "I'm pretty much at war with people who think Ai art is bad, especially because i…" (ytc_Ugy_LgwXN…)
- "Being a consumer of art and music doesn’t mean you understand it. And if you lis…" (ytc_UgwgBsdCR…)
- "The funny thing is most Ai tech bros have no idea what AI is and how it works. I…" (ytc_UgyErWlrq…)
- "I think one very important distinction between our intelligence and what is goin…" (ytc_Ugxzu2ZDS…)
- "Elon Musk has been promising autonomous driving for over a decade... and they th…" (ytc_UgwwPos9M…)
Comment
Regardless of the good things AI can achieve for humans, perhaps we should first clarify the rules and framework conditions for artificial intelligence before we go any further. We are all using a tool that is a work in progress. It's like a daily bungee jump without the guarantee that the rope will hold.
Source: youtube · AI Harm Incident · 2025-11-08T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwhPzZvNAkhtT0iHQV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgyT-vebxiwMwdw2i3B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzROFHwZQw3qc2xQbx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw0gmts_GoGjEZ2qK94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgymKW9n7gL5ZlAjhot4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy8pTntVDumP2kJaMR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwgv4Z6ASODv_vRdxV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwcWBJ0cqLxVAFIlXx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw8kjfndpPFzfBERhJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxuz-IFqfZhXl1Bi5Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
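A response like the one above can be parsed and indexed by comment ID to drive the lookup shown on this page. The sketch below is a minimal illustration, not the tool's actual code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown here, but the `ALLOWED` value sets are only inferred from the codes that appear in it; the real codebook may define more categories.

```python
import json

# Two rows copied from the raw LLM response above, truncated for brevity.
raw_response = """
[
  {"id":"ytc_UgwhPzZvNAkhtT0iHQV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgzROFHwZQw3qc2xQbx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
"""

# Allowed values per dimension, inferred from this response only
# (assumption: the real codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "unclear"},
}

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index rows by comment ID,
    dropping any row whose coded values fall outside ALLOWED."""
    indexed = {}
    for row in json.loads(raw):
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            indexed[row["id"]] = row
    return indexed

codings = index_codings(raw_response)
print(codings["ytc_UgzROFHwZQw3qc2xQbx4AaABAg"]["policy"])  # prints: regulate
```

Validating against a fixed value set before indexing matters here because LLM coders occasionally emit labels outside the codebook; silently storing those would corrupt downstream counts.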