Denver — E-mails were flooding in from all over the country. Something strange was going on with the Internet, alarmed computer users wrote. Google, eBay and other big sites had suddenly disappeared. Kyle Haugsness scanned the reports and entered crisis mode.
Part of the Internet was broken. For the 76th time that week.
Haugsness was on duty for the Internet Storm Center, the closest thing to a 911 emergency-response system for the global network. He and a few colleagues began investigating and discovered that a hacker had taken advantage of yet another security hole. As many as 1,000 companies had effectively had their connections "poisoned," so when their employees typed in legitimate addresses they were taken to bogus Web destinations. Haugsness wrote up an alert and a suggested solution, and posted it on the Web.
Then, Haugsness turned back to his inbox. In the few hours he had spent sleuthing that March day, several dozen e-mails detailing other suspected issues had piled up.
Built by academics when everyone online was assumed to be a "good citizen," the Internet today is buckling under the weight of what is estimated to be nearly a billion diverse users surfing, racing, and tripping all over the network.
Hackers, viruses, worms, spam, spyware and phishing sites have proliferated to the point where it’s nearly impossible for most computer users to go online without falling victim to them. Last year, Carnegie Mellon University’s CERT Coordination Center logged 3,780 new computer security vulnerabilities, compared with 1,090 in 2000 and 171 in 1995. Over the past decade, computer security firm Symantec Corp. has cataloged 11,000 vulnerabilities in 20,000 technologies from 2,000 vendors.
"I’m very pessimistic about it all," said Haugsness, who has worked for the storm center for two years. "There are huge problems and outages all the time, and I see things getting worse."
Originally developed by the Defense Department, the Internet is now a global electronic communications network made up of hundreds of millions of computers, servers and other devices run by various governments, academic institutions, companies and individuals. Because no one entity owns it, the network depends on goodwill to function smoothly.
The Internet has become so huge — and so misused — that some worry that its power to improve society has been undermined. Now a movement is gathering steam to upgrade the network, to create an Internet 2.0. How, or even if, that could be done is a subject of much debate. But experts are increasingly convinced that the Internet’s potential will never be met unless it’s reinvented.
"The Internet is stuck in the flower-power days of the ’60s during which people thought the world would be beautiful if you are just nice," said Karl Auerbach, a former Cisco Systems Inc. computer scientist who volunteers with several engineering groups trying to improve the Internet.
Many of the bugs in the Internet are part of its top layers of software, the jazzy, graphics-heavy, shrink-wrapped programs that come loaded on new computers or sold in retail stores. But some of the most critical issues were built into the network’s core design, written decades ago and invisible to the average user.
For example, a way to verify the identity of the sender of an e-mail or other communication is only just becoming available, meaning that many criminals roam the network with relative anonymity. And the system that matches the addresses people type to the Web sites they want is vulnerable to hackers, who can redirect users to sites they never meant to visit.
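The system in question is the domain name system, which turns a name like www.example.com into the numeric address computers actually use; a poisoned answer silently substitutes an attacker's number. The article does not describe any particular defense, but as a hedged illustration of the kind of sanity check an administrator might script, the following Python sketch compares what the local resolver returns against a pinned list of addresses known to be correct. The hostname and pinned addresses are placeholders, not real operational data.

```python
# A minimal sketch, assuming an administrator keeps a short pinned list of
# correct addresses for critical sites. If the local resolver returns
# something outside that list, the answer may have been tampered with.
import socket

# Hypothetical pinned addresses; a real deployment would maintain these.
KNOWN_GOOD = {
    "www.example.com": {"93.184.216.34"},
}

def looks_poisoned(hostname: str) -> bool:
    """Return True if the resolver's answer falls outside the pinned set."""
    try:
        _, _, addresses = socket.gethostbyname_ex(hostname)
    except socket.gaierror:
        return False  # name did not resolve; not itself evidence of poisoning
    pinned = KNOWN_GOOD.get(hostname)
    if not pinned:
        return False  # nothing pinned for this host, so nothing to compare
    return not any(addr in pinned for addr in addresses)

for host in KNOWN_GOOD:
    status = "suspicious answer" if looks_poisoned(host) else "matches pinned list"
    print(host, "->", status)
```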
Technological solutions for many of those problems have existed for years, but it’s been difficult to build a consensus to implement them. Arguments about global politics, potential profits and ownership of intellectual property have plagued groups trying to fix things.
"The problem with the Internet is that anything you do with it now is worth a lot of money. It’s not just about science anymore. It’s about who gets to reap the rewards to bringing safe technologies to people," said Daniel C. Lynch, 63, who as an engineer at the Stanford Research Institute and at the University of Southern California in the 1970s helped develop the Internet’s framework.
As the number of users exploded to more than 429 million in 2000 from 45 million in 1995, Lynch remembered watching in horror as hackers defaced popular Web sites and shady marketers began to bombard people’s e-mail inboxes with so much spam that real messages couldn’t get through.
When the Internet’s founding fathers were designing the network in the 1960s and 1970s, they thought a lot about how it would survive attacks from the outside, such as tornadoes, hurricanes, even nuclear war. What they didn’t spend much time thinking about was internal sabotage. Only several hundred people had access to the first version of the Internet, and most knew each other well. "We were all pals," Lynch said. "So we just built it without security. And the darn thing got out of the barn."
Years passed before the Internet’s founders realized what they had created.
"All this was an experiment. We were trying to figure out whether this technology would work. We weren’t anticipating this would become the telecommunications network of the 21st century," said Vinton G. Cerf, 62, who with fellow scientist Robert T. Kahn, 66, helped draft the blueprints for the network while it was still a Defense Department research project.
Even as he marveled at the wonders of instant messaging, Napster and other revolutionary tools that would not have been possible without the Internet, Leonard Kleinrock, 71, a professor at UCLA who is credited with sending the first message from one computer to another in 1969 (the fragment "lo," the first two letters of "login," transmitted before the receiving system crashed), began to see the Internet’s dark side. "Right now the Internet is running amok and we are in a very difficult period," Kleinrock said.
Some technologists have said the Internet or parts of it are so far gone that it should be rebuilt from scratch, and over the past decade there have been several attempts to do so. But most now agree that the network has become too big and unruly for a complete overhaul.
For now, groups are working on what are essentially bandages for the network.
Today, a complicated bureaucracy of groups known by their abbreviations helps govern the network: the IETF (the Internet Engineering Task Force, which sets technical standards), ICANN (the Internet Corporation for Assigned Names and Numbers, which manages the naming system for Web sites) and the W3C (the World Wide Web Consortium, which develops technologies for the Web). But their power is limited and their legal standing is murky. Some have recently argued that the United Nations should take over some regulatory functions, and firms have set up their own standards groups to suit their own interests.
The one thing everyone seems to agree on is that security must be the priority for the next-generation Internet. Major companies are promoting technology that will give recipients of e-mail "return addresses," or a better way of ensuring that senders are who they say they are, though the companies disagree on whose technology should be used. A group of scientists from the IETF, perhaps the most important standards-making body for the network, is working on a way to better collect and share information on computer intrusions.
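The article leaves the competing proposals unnamed, but the shared idea behind e-mail "return addresses" can be sketched: the owner of a domain publishes which mail servers may send on its behalf, and a receiving server checks the connecting machine against that list. The Python toy below is a hedged illustration of that check only; real systems publish the list in DNS, which is hardcoded here to keep the sketch self-contained, and every domain and address is made up from reserved documentation ranges.

```python
# A simplified sketch of sender verification: a message claiming to come
# from a domain is accepted only if it arrives from a machine that domain
# has publicly authorized. All data below is invented for illustration.
from ipaddress import ip_address, ip_network

# Hypothetical published policy: networks allowed to send for each domain.
PUBLISHED_SENDERS = {
    "example.com": [ip_network("192.0.2.0/24")],
}

def sender_is_authorized(claimed_domain: str, connecting_ip: str) -> bool:
    """Check the delivering server against the claimed domain's policy."""
    networks = PUBLISHED_SENDERS.get(claimed_domain, [])
    ip = ip_address(connecting_ip)
    return any(ip in net for net in networks)

# Mail claiming to be from example.com but arriving from an address
# outside the published range fails the check.
print(sender_is_authorized("example.com", "192.0.2.25"))   # True
print(sender_is_authorized("example.com", "203.0.113.9"))  # False
```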
Internet2, a consortium of mostly academic institutions that has built a screaming-fast network separate from the public Internet, is testing a technology that allows users to identify themselves as belonging to a particular group. Douglas E. Van Houweling, president of Internet2 and a professor at the University of Michigan, thinks the system could be used to limit access to, say, chat rooms for women with children on a certain soccer team, or to subscribers of certain magazines or newspapers, without relying on passwords.
"You’ve heard the saying that on the Internet nobody knows you’re a dog, and that’s of course the problem," Van Houweling said. "Authentication will allow communities to form where people are known and therefore can be trusted."
But there’s a trade-off for such security. The network becomes balkanized, with more parts of it closed to most people. Auerbach, who has been involved with ICANN and the IETF, said more security raises the "specter of central authorities."
Lynch believes the Internet will never truly be secure, though, because of the diversity of software and devices that run on it. If one has a flaw, others are vulnerable.
For years computer designers have tried to build a machine that lives up to the "orange book," a Defense Department specification formally known as the Trusted Computer System Evaluation Criteria. It describes a bug-free, completely secure computer that has to be built in a clean room by designers who have gone through extensive background checks and are not allowed to communicate with anyone outside the project.
"There have been a few computer systems built like this for the military and they vanish, just vanish. Nobody talks about them anymore," Lynch said. "They have been created, but for the average person they may as well not exist. "
Until that perfect machine is built for consumers, it will be up to people like Haugsness at the Internet Storm Center to keep the network up and running. The center is operated by the SANS Institute, a Bethesda, Md.-based nonprofit dedicated to computer security. But most of its work is done by an eclectic group of volunteers who sign on remotely from around the world, including a former National Security Council staff member and a grandmother in Iowa. Haugsness is in his late 20s and is an avid snowboarder and mountain biker.
One recent Sunday afternoon, Haugsness was at his company’s office checking the storm center reports. One person said he had found a new variant of a program that allowed hackers to take over a computer by creating a "back door" through holes in its security system. There were also complaints about a few phishing e-mails that tried to trick people into giving up their personal information. Internet traffic patterns worldwide seemed fine — only a few sections had congestion that would qualify as serious, or "red."
Nothing "super bad" so far, Haugsness concluded. All in all, only about a half-dozen documented problems. That might have been considered a disaster a decade ago. But it was a pretty good day for the Internet in 2005.
Ariana Eunjung Cha, Washington Post
[Source: San Francisco Chronicle]