Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Consider this: humans are facing an unfair competition in simple tasks. The fact…
ytc_Ugy6dzo7S…
Counter argument: Your point is entirely emotional and based on what you think a…
ytc_UgyNuQqiQ…
Oh you want a 100$ to make a draw? AI makes it for free. Cry…
ytc_Ugw_nGQZY…
Art comes from the soul. Thats what AI lacks, a soul. Life. Its art is lifeless …
ytc_UgzrR7ALA…
People and governments have still not waken up. They think Ai is similar to Inte…
ytc_UgxvWB9w9…
Ai is an addition radio & Tv , your kids won't be able to live without this , Th…
ytr_Ugzt-M0E8…
I’m going to raise a point, drawing is a skill, and it takes a time investment t…
ytc_UgyG_W9XY…
What a bunch of nonsense. Ai doesn't think or reason, it only predicts text. It …
ytc_Ugw0uOVwT…
Comment
What do you think happens to a system you feed with unknown data from all kinds of sources...?
The amount of data you need to train an AI is so significant that the control of the data sets often falls far too short. There is already evidence of AI bias in the code.
youtube · AI Moral Status · 2025-06-04T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxyUFlx3vqzrBVcE-J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzPUIsljKakarmejzV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxseywyidAD9PcFdnd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzx8VkgQxgVBRFOEFZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxCAik_r6-j0uLzPF54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy5pNhT781DINcrxIF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwMh8xvyhS5kN8CkNh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzJ0K2uvjtjudpYxGp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzwqTj5ygv_u5nexQR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgylhIXEfFv9ZQ7-jXZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
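The raw response above is a JSON array of coded records, one per comment ID. A minimal sketch of how such output might be parsed and validated before it reaches the coding table, assuming the allowed value sets inferred from the sample records shown (a real code book likely defines additional categories):

```python
import json

# Dimension -> values observed in the sample response above; a real code
# book may allow more categories, so treat these sets as illustrative.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "outrage", "approval"},
}

def parse_coded(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records.

    A record is kept if it has an "id", exactly the expected dimensions,
    and every dimension value is in the allowed set.
    """
    valid = []
    for rec in json.loads(raw):
        if "id" not in rec:
            continue
        dims = {k: v for k, v in rec.items() if k != "id"}
        if set(dims) == set(ALLOWED) and all(v in ALLOWED[k] for k, v in dims.items()):
            valid.append(rec)
    return valid

# Hypothetical example record, shaped like the response above.
raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
print(len(parse_coded(raw)))  # 1
```

Dropping malformed records rather than raising keeps a batch coding run alive when the model occasionally emits an off-schema value; the rejected IDs can then be re-queued for a second pass.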