Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Robots are terrifying to me. Ever since seeing the film "I, Robot.". Poor Regina… (ytc_Ugy0vEKH3…)
- I'm admittedly only 40 minutes in so far, but to me the main issue is that Yudko… (ytr_Ugzl3OaI9…)
- risk management, ethics should be explored properly. Human's ugly and dark sides… (ytc_UgxRQa1zW…)
- So uh the same thing is happening for the ai artists / But replace the wood with … (ytr_UgwmOEvi3…)
- Fully automated luxury communism please, as a part of this universal basic servi… (ytc_Ugw_Va46p…)
- I honestly hate people who think they are artists when they print other peoples … (ytc_UgzRUZbRW…)
- My immediate thought back when AI-art started getting popular was "what a cool t… (ytr_Ugx6eolNI…)
- Yesterday i was shitting right. (all good stories start like this) I saw a video… (ytc_Ugz7LsfeI…)
Comment

> Look at what happened with the colonial pipeline being hacked. Making trucks fully autonomous is a bad idea. Not to mention it would kill millions of jobs. Make them more fuel efficient or use an alternative energy source all together. An adaptive cruise control and lane hold is fine. But not driverless!

Source: youtube · AI Jobs · 2021-05-11T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
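Each coded comment is one record in a fixed four-dimension schema. A minimal validation sketch is below; the allowed-value sets are assumptions inferred from values observed in the raw responses shown on this page, not an authoritative codebook, so adjust them to match the actual coding scheme.

```python
# Hypothetical validator for one coded record. The ALLOWED sets are
# assumptions inferred from observed tool output, not an official codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "fear", "unclear"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems
```

Running this over every record before ingesting a batch catches the occasional off-schema value the model emits.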
Raw LLM Response

```json
[{"id":"ytc_UgzKobM3NiED2mhCyEd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy--17hGw1fiZLoN3t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4sE9jXXw55cH25o94AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwt8xfdCnW4I2bocPB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEnERSTKOnNO8QXGx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx-anXLywjIuJisHJN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZZh6gxmn7zsBZ6md4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw0Otd5fmJG3ZJVkgp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwjEaZ_McaiKcDQwoV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxypTIRhca2oslZhKR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
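The raw response is a JSON array of per-comment codes, so the "look up by comment ID" view reduces to indexing the parsed array by `id`. A small sketch, using two records taken verbatim from the response above (note that `json.loads` raises `json.JSONDecodeError` on malformed model output, so production code should handle that):

```python
import json

# Two records copied from the raw LLM response shown on this page.
raw = '''[{"id":"ytc_UgzKobM3NiED2mhCyEd4AaABAg","responsibility":"none",
"reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwt8xfdCnW4I2bocPB4AaABAg","responsibility":"company",
"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'''

# Build a lookup table keyed by comment ID.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw)}

rec = codes_by_id["ytc_Ugwt8xfdCnW4I2bocPB4AaABAg"]
print(rec["policy"], rec["emotion"])  # regulate fear
```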