Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "If AI replaces all those jobs, the economy will plunge into depression, corporat…" (ytc_UgwTm-YXl…)
- "Honestly I've gotten quite good at telling when ChatGPT's the one writing someth…" (ytc_UgxwBLWel…)
- "@Aleks96 the fact you don't make a distinction between how humans interact with …" (ytr_UgwjaZvwy…)
- "Most of it relies on cloud based computations using large data sets and propriet…" (ytr_Ugz6UszeR…)
- "What I don't like are the videos and narratives that look and sound real but tur…" (ytc_UgyI2z8R7…)
- "Art is what you love dearly. If you spend time on something you don't love then …" (ytr_Ugz3OsMnm…)
- "Physical art takes an hour, digital art takes 45 minutes, Ai takes 10 seconds. T…" (ytc_UgyWrwwyY…)
- "There is a third controversy that you aren't addressing in this. Who gets the c…" (ytc_UgzJ5SaS9…)
Comment
I'm all for a.i intelligence. I will say quite a few words here. There has to be some sort of fire wall or code. I'm no computer wiz but for a.i to learn from us especially in the future is scary and dangerous. A.i is epic and brain tickling. For an a.i to learn good is epic!!! But this world is filled with bad. The good is few and far between. How does a programmer safe guard this knowing the a.i can ultimately out class in technical computer skills? They learn. They develop. They share between each other, wifi or wireless 5g or something better in the future. Ai is amazing but keep them separate. Don't let them share through wireless link
Platform: youtube · Topic: AI Moral Status · Posted: 2023-05-26T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzFBLGfI9QiezcXcKN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugzn7Xr4e__v1nEauhN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugy9fQtem9BZZ2QhQvt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_Ugx6TUg3p5DuljgEpZN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgyiBeiypvvOUjzK5054AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_Ugzmp80PP_jNDkJp8214AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugy0Hzfc9e3-easvBeN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugxhp-kGPNOzVMwq2BR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgwbfmkRoQ_7MlOBZo54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugxq_FjqcICgv8mvx2V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
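The coding-result table above is derived from the raw JSON array the model returns: one object per comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (only two of the records shown are reproduced; nothing beyond the visible schema is assumed):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per
# comment, with the four coding dimensions seen in the table above.
raw_response = """[
 {"id":"ytc_UgzFBLGfI9QiezcXcKN4AaABAg","responsibility":"developer",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugx6TUg3p5DuljgEpZN4AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

records = json.loads(raw_response)

# Index the records by comment ID so a "look up by comment ID" query
# is a single dictionary access rather than a scan of the array.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgzFBLGfI9QiezcXcKN4AaABAg"]
print(rec["policy"], rec["emotion"])  # regulate fear
```

Indexing by ID also makes it easy to detect when the model dropped or duplicated a comment: compare `by_id.keys()` against the set of IDs that were sent in the coding batch.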