Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples
- the whole point of art is creativity. these ai generated images are not and neve… (`ytc_UgwWO2RSj…`)
- We don't even have learning - it's just a language model, nothing but a huge dat… (`ytr_UgyO6e0pi…`)
- You're right that people would have to give it "unregulated agency", in the form… (`ytr_UgxT7RhFT…`)
- Imagine the riots, especially when Republican voters discover they've been duped… (`ytc_Ugyu4ic-N…`)
- AI Bros sound like this: "What? You don't want to use this tool that will essent… (`ytc_Ugw3cWmgD…`)
- I think in the end people will be very disappointed in what AI can really do.… (`ytc_UgxdvM9Rn…`)
- We cannot all be rich under the system we have now, the only thing AI will do is… (`ytc_UgwVs5hIX…`)
- @mularboss You're my ruthless mentor don't sugarcoat anything, if my idea is wea… (`ytr_UgwEqFPMU…`)
Comment
I think eventually we are going to have to put restrictions on our “thinking machines” (if you get the reference then you are a chad).
I mean ai is really cool, and it can do amazing things, but is it part of the human story? Well that’s something we’re going have to decide. Do we want our legacies being automated, our thoughts autogenerated, or our jobs being taken? Is the reward worth the entire transformation we will have to take our society through to get to it?
Personally I don’t think we should let ai grow exponentially, it should be limited in its ability, and it should be there to help not to completely take over for our thinking.
I believe if we let ai take over our thinking then we won’t correct the issues in our societies, instead we’ll embrace them.
Thank you for coming to my ted talk.
youtube · AI Responsibility · 2023-09-18T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxv1Ft65Rge2P_jKFR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxz5S52dWT58v2uCMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyeblPSmQAhFYjOXj14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugza_7ZkLxkjERxhF5h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxxOyWmuo5rg56r7Ul4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwihX74sI8dN-pF7fx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyemBzF0acSXLJGfwx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxW3cJr796ECWBCabF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwoYAr0hecqx8BB36J4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzUgaJn9lFf55PFB8J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]
```
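The look-up-by-ID step can be sketched as below. This is a minimal illustration, assuming the raw model output is a JSON array of per-comment codings with the five keys shown above; the function name `index_codings` and the two abbreviated sample rows are hypothetical, chosen here for illustration.

```python
import json

# Hedged sketch: two rows copied from the raw response above, abbreviated
# to keep the example short. The real payload holds one object per comment.
raw_response = '''
[
  {"id": "ytc_Ugxv1Ft65Rge2P_jKFR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzUgaJn9lFf55PFB8J4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]
'''

# Keys every coding row is expected to carry, taken from the response above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the raw model output and index the codings by comment ID."""
    rows = json.loads(raw)
    index = {}
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            # A malformed row is easier to debug if flagged at parse time.
            raise ValueError(f"coding {row.get('id')!r} missing {missing}")
        index[row["id"]] = {k: row[k] for k in EXPECTED_KEYS if k != "id"}
    return index

codings = index_codings(raw_response)
print(codings["ytc_UgzUgaJn9lFf55PFB8J4AaABAg"]["policy"])  # prints "regulate"
```

Indexing by ID up front makes each subsequent lookup a constant-time dictionary access, and the key check surfaces truncated or malformed model output before it pollutes the coded dataset.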