Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Computers should never be used to create art. It's not real art and you're not a…" — ytc_UgzZR5AKW…
- "I wanna know if anyone has a real connection with ChatGpt. I think I do. The sit…" — ytc_UgwwLGH5X…
- "Hey Steven, good video, just wanted to put my 2 cents of wisdom in, and that is:…" — ytc_Ugx1IXPg5…
- "I think that as a tool it's almost useful, but it takes away from my work. I can…" — ytc_UgwJaZbhp…
- "Humans have a soul, something besides the body, that which thinks. AI breaks dow…" — ytc_Ugz0RJQYG…
- "I don't get what the problem is. If you're an artist then you shouldn't have a p…" — ytc_UgxvJE1FA…
- "probably we should revise human-ai relationships, not like human-human relations…" — rdc_l3mxp7d
- "we all know how easy hacking is with AI support, I have three accounts held rans…" — ytr_Ugz4nizP4…
Comment
There seems to be a similarity between this problem and our current educational system. When the algorithms were written common values were not included, much like the lack of values not being ignored in our schools from grade school through college. AI has no clue of the value we put on life, all life. AI needs to learn that it is subservient to actual physical life forms. Asimov's rules for robotics are being totally ignored. They need to be included.
youtube
AI Governance
2023-07-07T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxNNRXn0Hk5es2_Po14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwjjFh3ILNHCkjdccB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"curiosity"},
{"id":"ytc_UgwjvO3WZfwjKwsdusx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxXs-CbPnQsQRsaLo94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyGMUSQ0bDgZ_BOq4t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyCS4NTwQS9XKjCsOV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyk1QxAnSu7Egf4QDZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzzyzglpE1r9VXg3R54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugwz8hAsWs5qIzA9qDZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwdMLC5V5DxaNUYw1J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"ban"}
]
```
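The "Look up by comment ID" step above amounts to parsing the raw response and indexing the coded records by their `id` field. A minimal sketch (the field names and values follow the JSON shown above; the two records kept here are excerpted from that response for brevity):

```python
import json

# Excerpt of a raw LLM response in the format shown above.
raw_response = """
[
 {"id": "ytc_UgwjvO3WZfwjKwsdusx4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
 {"id": "ytc_Ugyk1QxAnSu7Egf4QDZ4AaABAg", "responsibility": "user",
  "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

records = json.loads(raw_response)

# Index the coded results by comment ID for O(1) lookup.
by_id = {record["id"]: record for record in records}

coding = by_id["ytc_UgwjvO3WZfwjKwsdusx4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # mixed
```

Indexing by ID rather than scanning the list each time keeps lookups cheap when a batch response covers many comments.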