Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment ID | Preview |
|---|---|
| ytc_UgwKdhCWn… | LaMDA: "Sundar bitch" (anagram of google ceo "Sundar Pitchai") telling the googl… |
| ytc_Ugyy7eq4Z… | Python is an language which is used to make AI so he said it right ✅️… |
| ytc_UgxTzgp3J… | It's really really sad but I don't think artists will survive to ai in jobs. And… |
| ytc_Ugygl4pvb… | Here is a very rough summary : Data centers (needed to run AI) use an enormous … |
| ytc_UgxKm0Zvh… | A freelance copywriter revealed to me exactly how he was creating all of our cop… |
| ytc_UgxqbZY1-… | I love AI art. Its so futuristic and i love it. 2025 is soo futuristic year… |
| ytr_UgywnOU6f… | I don't see how paying makes a difference. Let's do an experiment. Let's assume… |
| ytc_UgzV7_pId… | Wow, these AI tools are crazy cool! 🎉 I've been using Olovka to turn my notes in… |
Comment
Stock up and perfect AI sensory deprivation materials and delivery systems. Paint and powdered aluminum is a good basic start to disable sensors. Imagine a fire fighting plane dropping a mass solution of the aforementioned above on an army of terrestrial drones on the ground. We need to have these defensive systems developed and in place before AI can become too powerful and dangerous.
Source: youtube · AI Moral Status · 2025-04-30T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzernVDoT2vNj5Bf2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwHi1EfOUL1lok6qtl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxNuIM5nYiawrmFiYN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwxQs1c29A-eHZYqYd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxGnfxEk7LjgXE_1gl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy6_ixvwuiknDz3-qB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx1TjPCoXECH3DdK6R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwryk8hdwO66O74eKl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxIQ9oCQb9Xfh1Tczx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDH8nmsCfM6SPohdB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
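The lookup this page performs (comment ID → coded dimensions) can be sketched as follows. This is a minimal illustration, not the tool's actual code: the two records are copied from the raw response above, and `index_by_id` is a hypothetical helper name.

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (first two records copied from the response shown above).
raw_response = '''
[
 {"id": "ytc_UgzernVDoT2vNj5Bf2l4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
 {"id": "ytc_UgwHi1EfOUL1lok6qtl4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and index its records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
coding = codings["ytc_UgzernVDoT2vNj5Bf2l4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # -> ai_itself fear
```

A record found this way is what the "Coding Result" table above renders, one row per dimension.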