Victus Spiritus


Optimal Dynamic Network Paths: why current wireless Internet frameworks fail

18 Oct 2009

Mobile vs. Internet Communication: we share one Internet split by greed

Current wireless Internet providers use a single service provider or access point to reach the full Internet backbone. Whether you're on AT&T's network through your iPhone, or Verizon's network for your soon-to-be-released Droid, you send and receive information first through a local wireless connection. But there's a problem with this restricted network design: not all available paths for local wireless connectivity are equally burdened, and enforcing structure on information flow at the beginning and end of wireless access only serves to slow it down.

I made an off-the-cuff comment about mobile networks on Fred Wilson's post on mobile tech this morning. What we really need is for all the wireless providers to come to an agreement and work out a solution that allows access to ANY of their networks as a local solution. The least burdened/most capable network provider (judged by an unbiased third party/business) would always win out. Evenly distributing wireless data would provide value to all users/customers and profit to the providers themselves.
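To make the idea concrete, here's a toy sketch (in Python) of how a neutral arbiter might pick the least-burdened network for a given connection. The carrier names and load numbers are invented for illustration, and real selection would weigh signal quality, capacity, and pricing, not just load:

```python
def pick_provider(providers):
    """Return the provider with the most spare capacity.

    `providers` maps a carrier name to its current load as a
    fraction of capacity (0.0 = idle, 1.0 = saturated).
    """
    return min(providers, key=providers.get)

# Hypothetical load readings published by an unbiased third party.
loads = {"AT&T": 0.92, "Verizon": 0.55, "Sprint": 0.71}
print(pick_provider(loads))  # -> Verizon
```

In this sketch the arbiter simply routes each new connection to whichever network reports the lowest load, which is the "least burdened provider always wins out" rule in its simplest form.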

How the Internet Works

Here's a quick introduction to how the Internet works with regard to data routing (routing for dummies). Real-time packet navigation/routing is non-trivial because complete knowledge of the network isn't available at any single point along the journey.
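As a rough illustration of routing without global knowledge, here's a toy Python sketch of hop-by-hop forwarding: each router consults only its own local table (destination to next hop), never a map of the whole network. The topology and costs are invented for illustration:

```python
# Each router knows only its own table: destination -> (next_hop, cost).
# No single node holds the full path; it emerges hop by hop.
tables = {
    "A": {"D": ("B", 3)},
    "B": {"D": ("C", 2)},
    "C": {"D": ("D", 1)},
}

def forward(dest, start):
    """Follow each router's local table until the destination is reached."""
    path = [start]
    node = start
    while node != dest:
        next_hop, _cost = tables[node][dest]
        path.append(next_hop)
        node = next_hop
    return path

print(forward("D", "A"))  # -> ['A', 'B', 'C', 'D']
```

Real protocols (distance-vector, link-state) also keep these tables updated as conditions change, which is where the hard part of dynamic routing lives.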

As to why I believe the Internet has become so important, I refer you to another comment on Fred's blog (I spend a lot of time commenting there) in response to a comment from my friend Shana Carp.

I was just having an old man's mid-afternoon siesta (I was up till 1am playing with Google Wave, then got up at 5am with Michelle) and dreaming about network link strength (a cool visualization chart that I don't know how to code -> much like my current foray into building a Google Wave robot with Scala).

Google's breakthrough in information mapping was counting links. Real-time information doesn't have time to develop links, so we require other quality measures. Every time I follow a link, some part of me feels like I'm strengthening a neural connection of some super smart but as of yet non-existent entity. Our usage of the Internet "teaches it" by making direct connections between abstract concepts. Loosely speaking, we train the neural network of the web. Crowdsourcing sites do this with votes, social sites do it with likes, and Twitter does it with retweets.

The further strides we make in building the Internet's depth, making it a richer experience, the greater we are rewarded. It's a feedback loop: the more I improve Internet usage efficiency by adding useful services, the richer I become. The intelligence of the Internet isn't an isolated entity either; it's composed of a billion or so people actively using it, generating and connecting content. Our attention, and the time and thoughts we share here, are changing the structure of the web in a real way.

Look at what became of traditional advertising: Google's $20-billion-plus revenue stream didn't come from nowhere. It came directly out of the pockets of traditional advertising, which was a much larger market than $20 billion. So Google shrunk the industry and captured the revenue - classic disruption.

Information Theory

Shannon's Information Theory provides us with a measure of a channel's information transmission capability, and its direct relation to the entropy of the information distribution. The more the underlying probability distribution tends toward higher entropy (a less predictable system), the greater the amount of information that can be transmitted over a channel.
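A minimal sketch of Shannon entropy in Python, showing that the less predictable distribution carries more information per symbol (a fair coin yields a full bit per flip, a heavily biased one far less):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per flip.
print(entropy([0.5, 0.5]))  # -> 1.0
# A heavily biased coin carries much less information per flip.
print(entropy([0.9, 0.1]))  # roughly 0.47 bits
```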

I believe this applies to dynamic network paths as well as to theoretical channels (this is a hypothetical leap for me, a gut feeling). Restrict portions of the path, and you limit the overall information transmission capability of the network. This is one of the reasons the Internet works so well for information sharing: it minimizes network path restriction. When telecommunication providers restrict the beginning and end of a wireless packet's journey (to their own service), they are in fact reducing the choices for the network path, and ultimately the information transmission rate through the network.

As an interesting aside, with regard to text transmission (blogs, text messages, email), this excerpt describes the measured entropy of human language (though our text data can be encoded in many different binary formats):

A long string of repeating characters has an entropy rate of 0, since every character is predictable. The entropy rate of English text is between 1.0 and 1.5 bits per letter,[1] or as low as 0.6 to 1.3 bits per letter, according to estimates by Shannon based on human experiments.
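The repeating-character claim is easy to check empirically. This Python sketch estimates per-character entropy from single-letter frequencies alone (a crude estimate that ignores context between letters, so it overstates the true entropy rate of English):

```python
from collections import Counter
import math

def char_entropy(text):
    """Empirical per-character entropy (bits) from letter frequencies."""
    counts = Counter(text)
    n = len(text)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(char_entropy("aaaaaaaaaa"))           # -> 0.0 (fully predictable)
print(char_entropy("the quick brown fox"))  # several bits per character
```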

[wave id="!w+ZCtUJ1z4A"]