Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- @kecksbelit3300 „Saying AI is a money grab is crazy” How is accepting the truth… (ytr_Ugxn8fAhk…)
- Robots can replace humans with the data that AI accesses, learning everything fr… (ytc_UgwQ6nOXz…)
- The governments will ensure AI remains in safe hands!?!?! It’s already in hands … (ytc_UgysqYOqL…)
- okay, ai can do it “faster and better”, but it’s about the artist expressing the… (ytc_Ugw3Q-JBY…)
- Actually, I can see how this can be helpful. Medecine is too vast and are too ma… (ytc_UgxcSMECF…)
- I'd like to know Lavenders opinion on other AIs like Chatbots (business, rolepla… (ytc_UgyjxTrC7…)
- This is a scary then law enforcement is allowed to do; to use an algorithm to pr… (ytc_UgybckpaU…)
- @slayernephilim2344 “AI, the Invisible World, and the Energy We Can’t Explain” … (ytr_Ugw7Rq7fC…)
Comment
@zephsmith3499 Great points! The paper clip factory thought experiment highlights the risks of narrow optimization. It illustrates how focusing too narrowly on a single goal—like making paper clips—can lead to unintended negative consequences for broader human needs.
You're right: determining which human needs to prioritize is a complex challenge. It involves balancing various factors like social order, sustainability, wealth distribution, and more. This decision-making process should ideally involve diverse perspectives to avoid biases and ensure a comprehensive approach to addressing human needs.
If you’d like to discuss this further, consider joining our next live broadcast on AITube, where we explore these ethical and practical aspects of AI in-depth.
Source: youtube · Video: AI Responsibility · 2024-07-30T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgzFK0RkOuESCxhffdt4AaABAg.A4uANtN-kSHA6WJb4CTlfr","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgyEPHzsMtUa3BtrLW54AaABAg.A4tY_ghm9jqA6QmrOTjKeR","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzHrmc39CcAyei3QoF4AaABAg.A4tQLGa_j18A6Qn002Tmmy","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugz05sbuKyXfKHSEKsB4AaABAg.A4tLYi3VfIGA6Qn4oatdkq","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzsZYC3HRdRLTB3keF4AaABAg.A4t3XBciJePA6QnAahvAFG","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgzHvnark0oY-XQgj0h4AaABAg.A4snUGR7OpjA6QnDC3Blxo","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgxOPOn7INct3VTxhsx4AaABAg.A4sUSOD09zxA6QnGz-PlhC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgymxwgTW3cuy13tdTd4AaABAg.A4sU9ZKLtaoA6QnJdqxfPK","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwnyyH88RmkD0oRqfZ4AaABAg.A4sTYP6tCAMA6QnM3ptiJ9","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwUq05nyXM97273Njh4AaABAg.A4s60xRR2orA6QnS-xmoH-","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
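A raw response like the one above is a JSON array of per-comment coding records. Below is a minimal sketch, assuming the category sets visible in the samples on this page (the real codebook may define more values), of how such a response could be parsed and checked before loading into the results table. The names `CODEBOOK` and `parse_coding_response` are hypothetical, not part of the actual pipeline.

```python
import json

# Allowed values per dimension — an assumption inferred from the sample
# records shown above; the actual codebook may be larger.
CODEBOOK = {
    "responsibility": {"developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"approval", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in the samples start with ytc_ (comments) or ytr_ (replies).
        if not rec["id"].startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec['id']!r}")
        for dim, allowed in CODEBOOK.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
    return records

# Usage with a shortened, illustrative record:
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
records = parse_coding_response(raw)
print(records[0]["policy"])  # → regulate
```

Validating against a fixed codebook catches the common failure mode of LLM coders drifting into labels outside the scheme, rather than silently storing them.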