Which is worse? A world where human beings engineer a more successful intelligence, or one where we engineer more successful biological life forms? What if both things are happening simultaneously? Most of us are not scientists who happen to live in Cambridge or Boston, which evidently are the main hotspots for thinking about these sorts of problems. What if you're just a humble writer who gets itchy when it seems that near-future probabilities are outpacing the cataclysmic conditions in your imagined worlds? I think we need a CalamityCon. Seriously. We need a place to learn about these issues in greater depth and contribute to the thinking about preventive measures. Note: This is not the duty of writers, and no one should be obligated to do any such thing, but it surely wouldn't harm the world's science community to hear from us, no?