Internet Development Layers
The Internet is evolving in layers of applications. New programs and businesses sit on top of multiple functional web layers while providing APIs to external, potentially higher-order applications. One sharp example is Zemanta, which has a highly functional API: its interface exposes effective semantic tools for extracting metadata from natural written language. I'd like to fulfill a user need for real-time search results based on the thoughts people express within social media. This could facilitate discovery of others who are interested in discussing similar topics. The long-term goal I have for a business I'm working to start up (along with some sharp friends) is to create the ultimate web-based virtual assistant.
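To make the idea concrete, here is a tiny local stand-in for what a semantic-extraction call might return. The function name, stopword list, and return shape are my own inventions for illustration, not Zemanta's actual interface (a real call would go over HTTP to their service):

```python
import re
from collections import Counter

# A tiny, illustrative stopword list; a real semantic service does far more
# than frequency counting, but the output shape (a short topic list) is similar.
STOPWORDS = {"the", "a", "an", "and", "or", "is", "to", "in", "of", "i", "about"}

def extract_topics(text, limit=3):
    """Naive stand-in for a semantic-extraction API:
    return the most frequent non-stopword terms in a status update."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(limit)]

topics = extract_topics("Thinking about real time search and real time ranking")
```

Feeding each incoming status through a function like this is what would let a search index match statuses by topic rather than by literal keyword.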
There are several advantages to having a public API:
- First, it invites developers to create tools that can add remarkable value to your baseline service
- Second, it allows your products to reach customers in a personalized manner, potentially increasing the value your product has to each user
- Once your software is woven into the rich tapestry of the Internet, you have greater potential for (unpredicted) monetization by future applications
- Each product has a limited window to explode in value and fill a need before a competitor does. An open API increases the chance of reaching a product's maximum potential, because layered development and information consumption are rapid (you learn faster how users actually utilize your products)
Be careful not to make the mistake of depending on an outside business as the sole source of required functionality. It's important to be familiar with multiple competitors and be ready to move to a stronger utility if your baseline choice has been bought up or gone under. The alternative is leveraging multiple external tools and using an aggregated source of information. Building on top of other applications actually makes this type of software change easier, as long as you can fulfill the information requirements of the original interface.
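One way to hedge against a provider disappearing is to put a thin interface between your code and each external service, so swapping or aggregating backends becomes a local change. A sketch, with made-up vendor names and stubbed logic standing in for real HTTP calls:

```python
class TopicExtractor:
    """Interface the application codes against, rather than any one vendor API."""
    def topics(self, text):
        raise NotImplementedError

class VendorAExtractor(TopicExtractor):
    # In reality this would call an external HTTP API; stubbed for illustration.
    def topics(self, text):
        return [w for w in text.lower().split() if len(w) > 6]

class VendorBExtractor(TopicExtractor):
    # A second, differently-behaved backend: picks out capitalized terms.
    def topics(self, text):
        return [w.strip(".,") for w in text.split() if w.istitle()]

class AggregatedExtractor(TopicExtractor):
    """Combine several backends so no single vendor is a point of failure."""
    def __init__(self, backends):
        self.backends = backends

    def topics(self, text):
        merged = []
        for backend in self.backends:
            for topic in backend.topics(text):
                if topic not in merged:
                    merged.append(topic)
        return merged

extractor = AggregatedExtractor([VendorAExtractor(), VendorBExtractor()])
combined = extractor.topics("Semantic search on Twitter")
```

If one vendor is acquired or shuts down, only its adapter class needs to change; everything written against `TopicExtractor` keeps working.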
Real Time Search Authority
Experts in the field have user respect, and may be outspoken and/or well informed. But how do we democratize real-time filtering without delaying it? As quick background, Google's famous Internet search engine uses PageRank: Google monitors links (among other things) to a page from outside sources, and the weight of a link is a function of how many external links point to the parent page.
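The core of that idea can be sketched in a few lines: each page's score is redistributed along its outbound links until the scores settle. This is a simplified toy version (uniform damping, no handling of pages with zero outbound links), not Google's production algorithm:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to.
    Returns an approximate PageRank score per page (scores sum to 1)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline, then receives shares
        # of the rank of each page that links to it.
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                share = rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += damping * share
        rank = new_rank
    return rank

# A three-page cycle: by symmetry each page converges to 1/3.
ranks = pagerank({"a": ["b"], "b": ["c"], "c": ["a"]})
```

The point for real-time search is that this kind of link analysis needs time to accumulate, which is exactly what a live stream of statuses doesn't have.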
There is an analogous function within Twitter: retweeting. FriendFeed has a similar feature, the "like" function, identical to Facebook's like. The problem with ranking real-time information using these criteria is that they can only accrue as fast as people consume and review the content. But they can be used to identify influential or trusted sources of information, and real-time search can push statuses from these sources up in search priority, as long as the semantic meaning of the status falls within an area of expertise of the broadcaster.
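That boosting rule might look like the following sketch: a status gets a priority bump only when its author is trusted *and* one of its extracted topics falls inside that author's recorded expertise. The trust table and weights here are arbitrary placeholders:

```python
# Hypothetical trust data: author -> set of topics they are considered expert in,
# built up over time from retweets/likes of their past statuses.
EXPERTISE = {
    "alice": {"search", "ranking"},
    "bob": {"cooking"},
}

def score_status(author, topics, base_score=1.0, authority_boost=2.0):
    """Boost a status only when the author's expertise covers one of its topics."""
    if EXPERTISE.get(author, set()) & set(topics):
        return base_score * authority_boost
    return base_score

# Three authors post about "search"; only alice is trusted on that topic.
results = sorted(
    [("alice", ["search"]), ("bob", ["search"]), ("carol", ["search"])],
    key=lambda status: score_status(*status),
    reverse=True,
)
```

Note the gating: bob is trusted somewhere (cooking), but gets no boost here, because the semantic topic of the status is outside his expertise.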
Another method of recognizing real-time authority is to check the "hotness" of a status. Reddit takes advantage of this simple algorithm on its social crowdsourcing site. Hotness is driven by the derivative (the change per time interval) of likes, up votes, or retweets, and can signal a fast-rising or important status. Unfortunately this method of detection is susceptible to being gamed by coordinated groups.
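A minimal version of that signal is just the rate of change of votes over a recent window. Reddit's production ranking is more involved (it also decays scores by submission time), but the derivative idea itself is small:

```python
def hotness(vote_counts, window=3):
    """Approximate the derivative of votes: average change per interval
    over the last `window` samples. vote_counts is a list of cumulative
    vote totals sampled at equal time intervals."""
    if len(vote_counts) < 2:
        return 0.0
    recent = vote_counts[-window:]
    return (recent[-1] - recent[0]) / (len(recent) - 1)

steady = hotness([10, 11, 12, 13])   # slow, steady accumulation
spike = hotness([10, 10, 40, 90])    # sudden burst of attention
```

The spike scores far higher than the steady climber even though both have modest totals, which is exactly why a coordinated group voting in a burst can game this signal.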