Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Seriously if your poisoning the future of AI growth your a selfish piece of tras…
ytc_UgyrvE4ki…
so long as the perception of consciousness exists so does the consciousness? Our…
ytc_UgxumOtW_…
We need to keep AI to the worlds of Sciences, Medicine, and even hard labor wher…
ytc_Ugwjrh2fP…
@caitlinsnowfrost8244 the AI is still trained to use the art in the original dat…
ytr_Ugy8U4l5f…
Soon A.I. will tell you what job you will be designated to. We are in the pot o…
ytc_UgwMKEaFI…
AI could easily destroy human civilisation. If AI ever exceeds the abilities of …
ytc_UgxUl464t…
@ basically, it slightly changes each and every pixel of the art. the difference…
ytr_UgwQ8UjJg…
I wonder if you would say the same about intentional ai creations like with cont…
ytc_Ugz-w4ZJ5…
Comment
What we need is a professional ethical guideline for developers of AI. Drones will have autonomy when it is demonstrated that they can make better decisions than humans. If you program empathy and ethics into a robot it will not compromise, unlike some guy sitting behind a joystick.
youtube
2012-11-23T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx5dP8NJy371uDnZUl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwK3RxPwZd6sbJMWYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzP6JxLsp7G-_zPJjF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz-PYlQsmI6wyKavGx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyQsa-3IYrvs8lr_RR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz06NtLdw7g-t0ZaYB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw8QdQjcPI7G0qBQ6Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwBIGlJlUEPCE009EF4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugy5yevvSwWXXGsNSft4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy9m0mE6K9DhDieUXp4AaABAg","responsibility":"government","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
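A raw response like the one above has to be parsed and checked before its rows can be stored as coding results. The sketch below is a minimal validator, not the project's actual pipeline: the allowed values per dimension are inferred only from the samples shown on this page, and the real codebook may include more categories.

```python
import json

# Allowed values per dimension, inferred from the samples above
# (assumption: the real codebook may define additional categories).
SCHEMA = {
    "responsibility": {"developer", "government", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors.

    An empty list means every row parsed and every dimension value
    fell inside the inferred schema.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"not valid JSON: {e}"]

    errors = []
    for i, row in enumerate(rows):
        if "id" not in row:
            errors.append(f"row {i}: missing comment id")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append(f"row {i} ({row.get('id', '?')}): "
                              f"bad {dim}={value!r}")
    return errors
```

Returning a list of errors rather than raising on the first one makes it easy to reject a whole batch and re-prompt the model once, instead of failing row by row.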