Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "I think that AI should only be used for shits and giggles, its fun sometimes to …" (ytc_Ugzf_vQ5T…)
- "People need to understand that its a progress. World will not stop just to keep …" (ytc_UgwQ3M8sj…)
- "How about we learn from cezar Milan the dog whisperer. We need to acknowledge wh…" (ytc_UgyXhRm5m…)
- "The thing is that probably 80% (Yes I'm pulling that number out of my ass but it…" (rdc_dt9my99)
- "Facial recognition software is still in it's infancy. No it doesn't have a harde…" (ytc_Ugw8Lx7Ff…)
- "A great podcast for the first hour, shedding much light on AI and the near futur…" (ytc_UgxXB-K1w…)
- "AI , no it wipes out the office workers, marketing, advertising and marketing pe…" (ytc_Ugw1uq4qz…)
- "If being thoroughly autistic and ADHD counts as disabled, then I’d like to vouch…" (ytc_UgycDu7v0…)
Comment

> In my opinion fear of super intelligence is misplaced. I think that the probability of obtaining super intelligence from LLM is very small, BUT we have a more realistic existential problem that we need to worry about: greed + human stupidity + LLMs. The scenario in which a politician trusts a hallucinating chatbot with nuclear codes and starts WW3, or a power grid gets shut down because it's managed by Elon's psychotic bot. Those scenarios do not need HAL 9000 to exist. They can happen tomorrow. We should be worrying more about stupid people in positions of power wielding dangerous toys and less about if those toys will ever become gods.

Source: youtube · "AI Moral Status" · 2025-11-01T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzRh0tHKNCKQy2KIIR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz_n4W79Fmn80LNQVl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx22XObPjNjxXkNhRl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQEPB7q-YXgW0X5el4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz3rTWVwPuHcUXRDrx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz2gjHSAWqy-ucUHxx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzTJYKlgNhMzJBfLLV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzfPWTXIJAWxtjxgmN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyY0AhUSgPZnAv66A14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxaUlhv8u0hKYZSr994AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
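A batch response like the one above can be parsed and indexed by comment ID with a few lines of Python. This is a sketch, not the tool's actual implementation: the `index_codes` helper and the `DIMENSIONS` tuple are assumptions, and the two records in `raw` are copied from the response above.

```python
import json

# Two records copied from the raw LLM response above (truncated for brevity).
raw = '''
[
  {"id": "ytc_UgzRh0tHKNCKQy2KIIR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxaUlhv8u0hKYZSr994AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Parse a batch response and index records by comment ID,
    skipping any record that is missing a coding dimension."""
    records = json.loads(raw_json)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

codes = index_codes(raw)
print(codes["ytc_UgxaUlhv8u0hKYZSr994AaABAg"]["policy"])  # → regulate
```

Guarding on missing keys matters here because model output is not guaranteed to be schema-complete; a stricter variant would reject the whole batch instead of dropping bad records.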