Science, Technology & Health: December 2003 Archives

I've written about comment paradigms before, and discussed some reasons the high-traffic sites don't want to bother with comments at all.

But what if a site owner could charge commenters $0.01 per comment? Using technology like BitPass it would be possible to configure Movable Type in such a manner. By restricting access to the comment CGI script and requiring the commenter to enter a BitPass login, the owner of a busy site could make a few bucks a day, at least. Heck, most people would probably be willing to pay $0.10 or more to leave a comment.
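Conceptually, the gate is simple. Here's a rough Python sketch of the idea — purely illustrative, since BitPass's real API isn't documented here; `verify_payment` is a hypothetical stand-in for whatever token check the service would actually perform, and the token store is a toy in-memory dictionary.

```python
# Illustrative sketch only: verify_payment() is a hypothetical stand-in
# for a real BitPass token check, not the actual BitPass API.

PRICE_PER_COMMENT = 0.01  # dollars per comment

# Toy in-memory store of paid tokens and their remaining balances.
_paid_tokens = {"token-abc123": 0.05}

def verify_payment(token: str, price: float) -> bool:
    """Deduct `price` from the token's balance if it covers the charge."""
    balance = _paid_tokens.get(token, 0.0)
    if balance >= price:
        _paid_tokens[token] = balance - price
        return True
    return False

def post_comment(token: str, text: str, comments: list) -> bool:
    """Accept the comment only when the micropayment clears."""
    if not verify_payment(token, PRICE_PER_COMMENT):
        return False
    comments.append(text)
    return True
```

A spammer with no valid token gets rejected before the comment is stored, which is the whole point: the marginal cost of each junk comment stops being zero.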

Charging to leave comments would also eliminate much of the hassle involved. People would be less likely to post spam, and even though the cost-per-comment would be low, flame wars would get expensive. When people have to spend money (even small amounts) to access a service, they're generally more careful and conservative.

I'm hardly the first one to come up with this idea, but I became aware of the BitPass technology through an email from Bill Hobbs (who's moving to www.billhobbs.com soon). Here's a draft document from BitPass explaining how to charge for access to scripts. It's the wave of the future, folks.

I mentioned last month that Dell is closing their Indian tech support centers, and Jay Solo has a great example of exactly why.

Maybe IBM should pay attention and consider Dell's experience before deciding to move many of its employees overseas. I'm not against companies finding cheaper labor (in fact I expect it, as a shareholder), but if cheap labor results in shoddy products then it isn't going to be profitable, and that's what counts.

Apparently the fellows at WETA had some problems writing their battle simulator: the soldiers kept running away.

"For the first two years, the biggest problem we had was soldiers fleeing the field of battle," Taylor said.

"We could not make their computers stupid enough to not run away."

Richard Taylor is the special effects designer, and I highly doubt that any of the software engineers would agree that they spent two years making their system "stupider". I'm very familiar with such simulations (they're related to my Ph.D. field, after all), and I have no doubt that the programmers had to spend some time tweaking the numbers. But if they had to actually remove functionality (or in some other way make the software "stupider") to get the result they wanted, it was probably due to time or budget constraints.
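To make the "tweaking the numbers" point concrete, here's a toy sketch — purely illustrative, and not WETA's actual software — in which each simulated soldier flees when the local odds look worse than its courage parameter allows. "Fixing" runaway agents in a model like this means tuning one number, not removing intelligence.

```python
# Toy agent-behavior sketch (not WETA's battle simulator): an agent
# flees when the perceived enemy-to-ally ratio exceeds its courage.

def will_flee(enemies_near: int, allies_near: int, courage: float) -> bool:
    """Return True when the local odds look worse than courage allows."""
    threat = enemies_near / max(allies_near, 1)
    return threat > courage

# A timid agent breaks at 2-to-1 odds...
print(will_flee(enemies_near=10, allies_near=5, courage=1.5))  # True
# ...while raising the courage parameter keeps the same agent fighting.
print(will_flee(enemies_near=10, allies_near=5, courage=3.0))  # False
```

Nothing in the second call is "stupider" than the first; it's the same logic with a different constant, which is the kind of tuning I suspect the programmers were actually doing.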

(HT: GeekPress.)

I've written before about the increasing average marriage age, and FoxNews reports that:

The average age at which American women are having their first child has climbed to an all-time high of 25.1, the government said Wednesday.

There are a lot of factors, and I'm glad to see the dramatic reduction in teen pregnancies, which I think is largely attributable to abstinence programs.

What really caught my attention, however, was that the main FoxNews page linked to the article with the text "Preheating the Oven".

In an amazing coincidence, the world is getting fatter as it's getting richer.

The WHO believes there is a world-wide epidemic: "Obesity has reached epidemic proportions globally, with more than 1 billion adults overweight -- at least 300 million of them clinically obese -- and is a major contributor to the global burden of chronic disease and disability." Indeed, some say that "epidemic" is simply not a big enough word to describe the size of the overweight problem. "The word 'epidemic' doesn't even do this justice. It is one of the most profound medical crises we've had in generations," said Eric Topol, chief of cardiology at a US clinic in Cleveland. ...

We are not being killed off by an obesity epidemic, although many people are plumper. In the developed world, work has become less physical and food is more abundant. We are living longer, healthier lives. However, there are some negative cultural factors. Too often snacking has replaced the family meal and kids are getting less exercise as parents drive them everywhere, too fearful to let them walk the streets.

Naturally, this "epidemic" prompts many fascists to cry out for government intervention, but what's the point?

For thousands of years the prime struggle of humanity was to kill enough food to feed your family. Thanks to technology, we're past that, and the genes that once served us so well are starting to fall into disrepute. Our bodies don't need to use calories so efficiently, and storing fat for later no longer yields a useful survival advantage -- in fact, it may make you less able to survive. The solution isn't to force people to eat better and exercise more if they don't want to, the solution is to wait.

Within a few generations the fattest genes will be weeded out of the population as fat people die earlier and have fewer children. The problem -- with respect to the human population as a whole -- is self-correcting. Those of us born with less efficient metabolisms will have more kids and pass our genes on, and in a few hundred years humans may all require the 4000 calories a day we Americans love to shovel down our gullets.

Contrary to popular misconception (no pun intended), the so-called "morning-after pill" is not an "abortion pill". It generally works the same way standard birth-control pills do: by preventing conception, not implantation.

Conception occurs when a sperm fuses with an egg to create a zygote, and this is the stage at which most pro-lifers believe life begins. Implantation occurs when a zygote implants itself in the tissue lining of the mother's uterus. Under normal circumstances, it's fairly common for conception to occur without being followed by a successful implantation, and the zygote is subsequently lost during the woman's period.

No one who accepts common birth-control pills (which occasionally do fail to prevent conception, but then succeed in preventing implantation) can reasonably object to the "morning-after pill" on the basis that it "causes abortions". That said, some conservative groups still object to the pill, ostensibly for health reasons.

It doesn't make sense to approve over-the-counter access to a high dose of this drug, when a lower-dose [birth control pills] cannot be obtained without a medical exam, physician oversight and prescription, said Wendy Wright, CWA's [Concerned Women for America] senior policy director.

That's a very reasonable argument, but it already seems silly to me that so many drugs are so heavily regulated. I'm not a doctor -- but I play one on TV -- and maybe birth-control pills do require physician supervision, but it seems unlikely to me considering how widespread their use is around the world. Most of the CWA's objections appear to be pretty feeble, and I'm not sure what their real motivation is. I suspect that since their donors are pro-life (and possibly ignorant of the details I mentioned above) the organization feels pressured to find some reason to object to the pills.

Nevertheless, I find myself agreeing with Planned Parenthood on this issue (amazing, I know).

"Wider access to emergency contraception will prevent hundreds of thousands of unintended pregnancies and abortions every year," said Planned Parenthood President Gloria Feldt. "There is no scientific basis for denying...over-the-counter availability," she added. ...

The group also said its research indicates that widespread availability of EC could prevent 1.7 million unintended pregnancies and 800,000 abortions each year in the United States.

800,000 fewer abortions in America each year would be an astounding achievement, and well worth the minor potential health concerns raised by CWA.

As you know, I enjoy the images Google uses to commemorate significant days. Today is the 100th anniversary of the Wright Brothers' first flight at Kitty Hawk, and Google celebrates with a commemorative logo.

Twenty years ago a movie named Blade Runner -- ostensibly about human-like robots rebelling against their masters -- adroitly confounded the near-certainty we all hold of our own memories. In the end, the question was: how do you know that you're not just a replicant, with implanted false memories of a childhood that never existed?

Now, new research is continually showing that human memory is incredibly malleable, and that we're wise to doubt ourselves, no matter how clear our memory may seem.

"We can easily distort memories for the details of an event that you did experience," says Loftus. "And we can also go so far as to plant entirely false memories - we call them rich false memories because they are so detailed and so big."

She has persuaded people to adopt false but plausible memories - for instance, that at the age of five or six they had the distressing experience of being lost in a shopping mall - as well as implausible ones: memories of witnessing demonic possession, or an encounter with Bugs Bunny at Disneyland. Bugs Bunny is a Warner Brothers character, and as the Los Angeles Times put it earlier this year, "The wascally Warner Bros. Wabbit would be awwested on sight" at Disney.

Elizabeth Loftus' research has obvious implications for the reliability of eyewitness testimony. And it was as a result of her findings that in 1994 she co-wrote her book, The Myth of Repressed Memory, and took a strong stand in the recovered memory debate of the 90s, for which she was reviled by those who claimed to have uncovered repressed memories of abuse - alien, sexual or otherwise.

In Memento we all pitied Leonard Shelby, who had no long-term memory and couldn't remember anything that happened more than a few minutes in the past. But really, how much better are our own memories? They're mostly amalgamations and approximations of real events, all jumbled together and distorted by perspective, time, conscious desires, and self-delusion.

It's fascinating to me that every time I turn on talk radio the host is discussing news and issues from the internet. Bill Hobbs has written about blogs becoming journalism (and I've commented), and in a large way journalism is being shaped by the net as well. Internet news sites are far more mainstream than blogs -- of course -- and the fact that traditional media is starting to "link" back to the net is strongly indicative that we're in the last phase of the news revolution. Surveys show that in 2003 more Americans say the internet is an important source of information than say the same about television, radio, newspapers, or magazines.

Since radio is no longer the top-tier medium it once was, radio producers are less reluctant to credit internet sources than television producers seem to be. Television news shows will direct viewers to the network's website, but I've never seen a show mention the Drudge Report (for instance) except when they're doing a story about internet media itself. Likewise, I've never seen a sit-com mention a website as an offhand pop-cultural reference. Nevertheless, television is losing audience hours to the net at a dramatic rate.

Television executives have more to fear than a future filled with gross-out reality shows. The Internet is rapidly eroding television viewing hours and emerging as a powerful information medium in its own right, according to a study being released today by the University of California-Los Angeles.

In the same way that television eclipsed radio as the primary medium for entertainment and information, the Internet poses a major threat to television.

"The thing that's easy to prove is that Internet users watch less television," said Jeffrey I. Cole, director of UCLA Center for Communication Policy, which conducted the study. "What we've been trying to see is does their Internet time come out of television time? The early indications are pretty clear that it does." ...

Internet users watched about 4.8 fewer hours of television each week than non-users. And the decline in TV viewing hours grows more dramatic as Internet users gain experience. Internet veterans watch about 5.8 fewer hours of TV than non-users.

With the rise of streaming media and high-bandwidth connections in people's homes, it seems clear that the days of television as the Big Man on the media Campus are numbered. How will we know when we've reached the turning point? Television shows will start "linking" to websites they don't own, just as radio shows do.

About this Archive

This page is an archive of entries in the Science, Technology & Health category from December 2003.

Science, Technology & Health: November 2003 is the previous archive.

Science, Technology & Health: January 2004 is the next archive.

Find recent content on the main index or look in the archives to find all content.
