The Altruism Equation: Science Fiction
Nov. 26th, 2009 07:03 pm

They didn't realize what they had found, or what that relationship would do to us, on that first day. The signs were there, if they'd found it conceivable: we marched toward the closest civilization group to make contact, armed and ready for danger. We found it abandoned, as if they had vanished into the forest when they heard us coming. If it hadn't been for our body-heat sensors, we never would have found them. They didn't make a sound. When we rolled back the mat that the village had hidden under and they stared up at us with big eyes, at first we laughed a little and then tried to reassure them. Then we saw the dead babies. Their species was close enough that we could see what had happened. A couple of the greener recruits turned and vomited in the bushes. But--the aliens, they didn't act like there was anything wrong. Oh, sure, they grieved, yes; the mothers sat there holding their smothered babies with tears rolling down their faces.
Inspiration: Radiolab's podcast about the needs of the many vs. the few, and how humans only follow that about 50% of the time, and almost never follow it when the person is outside their immediate view. The examples being 1) smothering a baby so that it doesn't cry and give away the whole village to people who will kill everybody, and 2) jumping into a lake to save a drowning girl even though it will ruin a very expensive suit, but not being willing to give the money that could have bought that suit to an organization that will save a girl across the world.
Story Potential: High.
Notes: There's a lot of thinking involved in making this work out, though. Because there are a couple of really easy paths this story could go down that would be less interesting. Avoid 1) the "Giving Tree" scenario in which we destroy a native (alien) culture, 2) their goodness humbles us and we miraculously become a better society, and (probably) 3) it's a virus, we catch it, our civilization collapses. Also avoid the "Noble Savage" paradigm--make them equal to us in tech, etc. How would that evolve? So it's really about figuring out a way to fit their worldview into our worldview or vice versa in a way that will benefit all. Make us always see them as being the girl in the lake in front of us, or something like that. Exchange programs? Symbiosis (might be too close to 3)?
Also, dayumn am I coming up with good titles lately!