The accumulation of clicks on certain homepages causes the polarization of information.
If we think of the Internet as a library and a search engine as its card file, we can better understand the power that those search engine companies have. Searching the Internet happens through the search engine, whose algorithm acts as a librarian. When somebody types in some query, the algorithm goes to the card file and searches the cards that are filed under certain topics.
The cards tell where the books are found; homepages are the books of the Internet. But there is a difference between the Internet and a physical library: the search words used on the Internet are mostly topics, such as "Arctic birds".
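To make the card file image concrete, here is a minimal sketch of that idea as an inverted index mapping topic keywords to homepage addresses. All topics and URLs below are invented for illustration; real search engines are of course far more complex.

```python
# A minimal sketch of the "card file": an inverted index that maps
# topic keywords to homepage addresses. The topics and URLs are invented.

card_file = {
    "arctic birds": [
        "https://example.org/arctic-wildlife",
        "https://example.com/bird-guide",
    ],
    "northern lights": [
        "https://example.net/aurora-photos",
    ],
}

def look_up(topic: str) -> list[str]:
    """Return the homepages filed under the given topic,
    like a librarian pulling the cards from one subject drawer."""
    return card_file.get(topic.lower(), [])

print(look_up("Arctic birds"))
# ['https://example.org/arctic-wildlife', 'https://example.com/bird-guide']
```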
The search engine checks the cards sorted under that topic, and the user gets a list of homepages that handle that kind of subject. At this point, I must explain what indexing means. When searches are made, the system stores the search words and the network addresses of the homepages that are clicked after the search.
So when a homepage gets clicks, its position in the list of homepages rises. Indexing the search means that there is a list of the most clicked homepages, and at the top of the list is the most clicked one. And this is the thing that causes criticism: the clicks accumulate on certain homepages, which causes the polarization of the data.
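Below is a minimal sketch of that indexing idea, assuming it works roughly as described above: the system remembers which homepages were clicked for each search word and ranks them by accumulated clicks. The data is invented for illustration.

```python
from collections import defaultdict

# search word -> homepage URL -> accumulated click count
clicks = defaultdict(lambda: defaultdict(int))

def record_click(search_word: str, url: str) -> None:
    """Store one click made after searching for the given word."""
    clicks[search_word][url] += 1

def ranked_results(search_word: str) -> list[str]:
    """Return homepages for a search word, most clicked first."""
    counts = clicks[search_word]
    return sorted(counts, key=counts.get, reverse=True)

record_click("arctic birds", "https://example.org/arctic-wildlife")
record_click("arctic birds", "https://example.org/arctic-wildlife")
record_click("arctic birds", "https://example.com/bird-guide")

print(ranked_results("arctic birds"))
# ['https://example.org/arctic-wildlife', 'https://example.com/bird-guide']
```

The homepage with more recorded clicks always comes first, which is exactly the feedback loop that concentrates attention on a few pages.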
People sometimes say that information is power, and that would make Google the most powerful company in the world. The data, or rather the "big data", is unsorted data. This "big data" is stored on the servers of the network, and it would put the power of information into the hands of the people, if they have the will and ability to search and sort that data.
The algorithms, artificial intelligence-based virtual robots, search and connect the data bits in the databases, and what those programs retrieve depends on the commands they are given. The second thing is that the authority the searcher has limits the data the algorithm can get. So if the person who uses those algorithms has a certain authority, they can collect every type of data about the targeted person.
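Here is a small sketch of that second point, the searcher's authority limiting what the algorithm may return. The permission levels and records are invented for illustration only, not any real system's model.

```python
# Invented example records with a crude access level on each one.
records = [
    {"subject": "person A", "field": "public profile", "level": "public"},
    {"subject": "person A", "field": "location history", "level": "restricted"},
    {"subject": "person A", "field": "messages", "level": "restricted"},
]

def collect(subject: str, authority: str) -> list[dict]:
    """Return only the records the searcher's authority allows."""
    allowed = {"public"} if authority == "public" else {"public", "restricted"}
    return [r for r in records if r["subject"] == subject and r["level"] in allowed]

print(len(collect("person A", "public")))      # 1 record visible
print(len(collect("person A", "restricted")))  # all 3 records visible
```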
The big data that Google stores for indexing search results on the Internet is what makes that company so powerful. When I say that big data is unsorted data, the point is that inside the servers of the Internet there is a card file, or database, that tells the search algorithm where to find a certain homepage. And this is the problem with new homepages.
When people use Google or some other search engine to get data about something they want, they just write the words into the search box. The method that Google and other search engines use is that the computer searches the card file for the most used homepages, and then the search engine puts the most clicked homepage at the top. This makes it difficult for new homepages to get clicks. If the virtual card file of the search engine were physical, the most clicked homepages would have their cards at the front.
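Continuing the click-ranking sketch above (with invented numbers), this is why a new homepage struggles: with zero recorded clicks it sits at the bottom of the list, where hardly anyone sees it.

```python
# Invented click counts; the "brand new" homepage has none yet.
counts = {
    "https://example.org/arctic-wildlife": 120,
    "https://example.com/bird-guide": 45,
    "https://example.net/brand-new-bird-blog": 0,
}

ranking = sorted(counts, key=counts.get, reverse=True)
print(ranking[-1])
# https://example.net/brand-new-bird-blog  -- stuck last until it somehow gets clicks
```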