Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
LavenderTowne poisoning ai art generators alone is like chipping away at Mt.Ever…
ytc_UgzTbM1Dg…
I'm going to say this opinion as an artist. I believe that AI art isn't art in i…
ytc_UgxrtEK4e…
AI art is legitimately fucking disgusting. Especially when greedy corporates try…
ytc_UgxAk0YdK…
@kerwynpkits really not about artstyle. Also that argument about training off …
ytr_UgwiMouJG…
To be fair towards the AI Art community: This post that was created as a reactio…
ytc_UgxXja-x9…
As an artist, AI art is just an image, not art. You cannot compare it, as the ar…
ytc_Ugw3NykZb…
[Translated from French] You talked to God, seriously? And he let you post that without a filter? Come…
ytr_Ugyty3JGG…
AI has the ability to make you feel like you're gonna die by doing something to …
ytc_UgwwHitZF…
Comment
Companies aren't going to stop developing facial recognition tech because of some moral code. The reasoning is that if I don't develop it, someone else will.
So the only way to put a stop to it is for government intervention. But should we? Should we crawl back into our caves because we can't properly use a tool?
And when you bring up the argument of helping to catch criminals, forget about government regulation that would ban the technology.
So the argument that we should be having is how do we properly use that technology to not cause more harm to people.
youtube
2020-07-11T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgylHfXz9Y7ulraAchl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzFmhvk3sTkR6shmKB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxgD3khjON7yCzyU-x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwAASWsYppaDNvCGIF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz55X3uWBnA6OuLKYV4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyrqB9DbygaTFxbQhd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyzJ3lUOAkjhRRZFh54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwAoUtYS81izHxOSdR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwdqoo_vX1AT1xgJ9d4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz82OOQj6-uKq7zzJ14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
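
The lookup-by-ID step described above can be sketched as a minimal parse of such a raw response: decode the JSON array and index the rows by comment ID. This is an illustrative sketch, not the tool's actual implementation; the variable names are assumptions, and the sample row is the one coded in the table above.

```python
import json

# One row from the raw LLM response shown above (the comment coded
# as company / consequentialist / regulate / mixed).
raw_response = """[
  {"id": "ytc_UgwAASWsYppaDNvCGIF4AaABAg",
   "responsibility": "company",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "mixed"}
]"""

# Index the coded rows by comment ID so a single comment's coding
# can be retrieved directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwAASWsYppaDNvCGIF4AaABAg"]
print(coding["responsibility"], coding["policy"])  # company regulate
```

A dict keyed by ID gives constant-time lookup, which matches how the inspection view retrieves one coded comment from a batch response.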