Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “Get a good Trade down longevity will be better than wasting money on college , w…” (`ytc_Ugz5VENrw…`)
- “SO MUCH OF TODAYS RESPOCES IS THAT ISNT FAIR WHAT IF AI DECIDES HUMANS …” (`ytc_UgxrE2eaH…`)
- “Did he really try that with Justice Manzanet-Daniels? Wow, that is the last jud…” (`ytc_Ugz-2NzqV…`)
- “Also face recognition are not as accurate as we think for brown and black people…” (`ytc_UgwKplNmy…`)
- “I don’t like Ai art myself because I want to train my own mind to draw I don’t n…” (`ytc_UgwRlojOs…`)
- “I've lost photography competitions and then found out competitors were using AI …” (`ytc_UgzdB10AG…`)
- “I am kind of both sided. First of all AI what i think should had meant to be an …” (`ytc_UgyM0gNiF…`)
- “10 ways to stop AI from harming humankind... 1. Dont allow AI access to nuclear…” (`ytc_UgyGB9d8G…`)
Comment
> It’s too late to go back with AI, it’s already been released and utilized by the public. Even if laws are released, there is no ways of enforcing them— think of it as the same as pirating music and movies, it’s such a mass spread issue that punishing the unethical use would be nearly impossible.
>
> But let’s say there are laws and regulations, how are we going to stop people discretely using it on the dark web or hidden websites? Sure, it’d be more of a challenge to make it accessible... but if someone is so mentally deranged to make deep fakes of a nonconsensual person(s), then what’s stopping them from using it anonymously.
>
> It’s just a matter of adapting to the way things are now and doing our best to minimize the damage AI poses on ethical use. But I can’t imagine how we could possibly stop something of this magnitude.
Source: youtube · Posted: 2023-07-19T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxMmedBY7huL3YNL_N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOXhvbjXlZcLut3yJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzxxp54I_EVDv55TvN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxenvIyoqdQuMmXxE94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxzvKngei2bC-pf51B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz52YPEEUU7xo0KZWh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyDJ-mAAC8T6topkEN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxpSEyaZsqSK9CiwJN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx32PUv_7ycuPuqvSd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxcPVvGE-DdA_HKcWx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
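The raw LLM response above is a plain JSON array, one object per coded comment, so looking up a comment's codes by ID is a parse-and-index operation. A minimal sketch of that lookup in Python, using a two-row excerpt of the array (the variable names `raw` and `codes` are illustrative, not part of the tool):

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codes.
raw = """[
{"id":"ytc_Ugx32PUv_7ycuPuqvSd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxcPVvGE-DdA_HKcWx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

# Index the parsed rows by comment ID so any coded comment
# can be fetched in O(1) from its ytc_... identifier.
codes = {row["id"]: row for row in json.loads(raw)}

code = codes["ytc_Ugx32PUv_7ycuPuqvSd4AaABAg"]
print(code["responsibility"], code["policy"], code["emotion"])
# → distributed regulate resignation
```

The printed values match the Coding Result table for this comment (responsibility: distributed, policy: regulate, emotion: resignation).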