Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples

- "OMG! This is a prime example of we were all worrying about. I occasionally poste…" (ytc_UgzcQMum8…)
- "I'm two ways about it. Mass surveillance via facial recognition is just unethica…" (rdc_gqja3es)
- "Please please please dont force people to use it, hwp and k-cyber security softw…" (rdc_myroxja)
- "@randominternetperson3 lol. By the same logic, a real therapist isn't gonna "fi…" (ytr_UgxwrzEyH…)
- "I already dislike AI art as it is. But you gave me more reasons why too dislike …" (ytc_Ugw7CMuWF…)
- "Seems like we are already dealing with superintelligence not with the computers …" (ytc_UgyxEOaLb…)
- "The kids should be more concerned about their future. Remember the robot toys we…" (ytc_UgyJ0HaFH…)
- "Hey.. the robot is programmed the way it is needed..so it is only saying what we…" (ytc_Ugwtauz-W…)
Comment

> The second question is something that is also a concern/topic of debate in programming spheres with codex AI tools like copilot which are deemed as a way to smuggle open source codes into proprietary software. It is also trained on publicly viewable code for software and is then used of code that isn't shared and charged for proprietary software

Source: youtube · Topic: AI Responsibility · Posted: 2023-01-10T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyow5nwQ_tw-h3_Twt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw-qKx9AEz9SO1nEVp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz4PtxJdqXPHgQM3ip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx3ky_GRXdgpgZWGjV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxDWAurlcyK7UeYyzp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxmvOyf7KVDJvJ-fKp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxuMaFCj4Naj5zTZPB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwDx_PYQsie7mSmzyR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz5d4-8TSdBPBhq94l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzNheKzByeDE7FmUv14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
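The batch response above is a JSON array pairing each comment ID with its four coded dimensions. A minimal sketch of the ID lookup this page performs, assuming the raw model output parses as such an array (the helper name `index_by_id` and the two sample records are illustrative, not the tool's actual code):

```python
import json

# Raw batch output from the coding model: one object per comment ID.
# (Two records copied from the response above, for illustration.)
raw_response = """[
  {"id": "ytc_UgxuMaFCj4Naj5zTZPB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwDx_PYQsie7mSmzyR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the model output and key each coded record by its comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codes = index_by_id(raw_response)
record = codes["ytc_UgxuMaFCj4Naj5zTZPB4AaABAg"]
print(record["responsibility"], record["emotion"])  # company fear
```

Keying the parsed records by `id` makes the "look up by comment ID" view a constant-time dictionary access, matching coded values like the Coding Result table shown above back to their source comments.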