Thursday, December 29, 2011

Web content should be measured so that we get high-quality results and recommendations

Search is the killer application of the 21st century. We face an information-overload problem, and search and recommender systems help solve it. The world's knowledge is available on the web, and it is reachable through filtering and joining techniques. Keyword-based filtering is currently the prominent seeking technique; it lets us retrieve relevant documents from the web. Hence the quality of the information we receive depends not only on the ranking algorithm but also on how good the keywords (high-quality, relevant keywords) are that we enter into the search system. Often even the best keywords do not return the required content.
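To make the keyword-based filtering idea concrete, here is a minimal sketch: score each document by how many query keywords it contains and return matches in decreasing order of overlap. Real search engines use far richer signals (term weighting, link analysis, and so on); the documents and query below are invented for illustration.

```python
def keyword_filter(query, documents):
    """Rank documents by the number of query keywords they contain."""
    keywords = set(query.lower().split())
    scored = []
    for doc in documents:
        words = set(doc.lower().split())
        overlap = len(keywords & words)  # how many query terms appear in the doc
        if overlap > 0:
            scored.append((overlap, doc))
    # Python's sort is stable, so ties keep their original document order
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored]

docs = [
    "ranking algorithms for web search",
    "cooking recipes for beginners",
    "web content quality and search ranking",
]
print(keyword_filter("web search ranking", docs))
```

Note how the result depends entirely on the words chosen: the cooking document never surfaces for this query, and a poorly chosen query would miss relevant documents just as the paragraph above describes.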
I believe human intelligence has stayed roughly constant for centuries. Intelligence increases very slowly over long periods of time, but knowledge, on the other hand, increases exponentially in a short period of time. Based on the quantity and quality of the knowledge stored in our brains, our thinking/processing power and the results of that thinking get better. Similarly, the web should hold high-quality data so that it enhances our thinking and machine thinking alike.
The ever-growing web has the quantity factor by default, but the quality of its content is questionable and cannot be measured easily. Web systems should be designed so that they give back quality information. For example, search systems have scratched the surface of quality (on the web) by counting the number of inbound links to a page. But this is only a start; we have to go further in this direction to build a quality web. For example: "ranking humans" in order to estimate the quality of the content people write. Then we could easily map experts who write about their subjects to the novices (who may be experts in other fields) who need that particular content.
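The inbound-link idea mentioned above can be sketched in a few lines: treat each link to a page as a vote for that page's quality. This is only the crude counting step, not a full link-analysis algorithm such as PageRank, and the link graph below is made up for illustration (each key lists the pages it links to).

```python
# Invented link graph: each page maps to the pages it links out to.
links = {
    "expert-article": [],
    "blog-post": ["expert-article"],
    "forum-thread": ["expert-article", "blog-post"],
}

def inbound_link_counts(graph):
    """Count how many pages link to each page (links as quality votes)."""
    counts = {page: 0 for page in graph}
    for targets in graph.values():
        for target in targets:
            counts[target] += 1
    return counts

print(inbound_link_counts(links))
```

Here "expert-article" collects the most votes. The same counting could, in principle, be applied to authors instead of pages, which is one way to read the "ranking humans" suggestion above.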
