By the mid-1990s, the positioning and visibility of websites in search engines began to be valued, mainly for their commercial and marketing importance, creating an opportunity for internet users who began to specialize in Search Engine Optimization (SEO) techniques, since occupying the first positions on the results pages can be crucial for a website.
This process consists of improving the visibility of a website in the results of the different search engines. Although it is also commonly called web positioning, that term is less precise, since it includes traffic sources other than search engines.
In the case of natural or organic positioning, the objective is to appear in the highest possible positions of the search results for one or several specific keywords, which means optimizing the structure and content of a website and increasing its reputation through mentions. This, in turn, translates into paid placements on commercial sites and has given rise to what is called Search Engine Marketing (SEM).
Continuing with these three-letter acronyms, more recently Social Media Optimization (SMO) has emerged, referring to the strategy carried out to position a company, institution or person on the net, generally with an advertising or commercial purpose, a task that usually falls to the social network administrator, known as the Community Manager.
And where does all this lead? To the fact that a greater quantity of data sources exists every day, while at the same time new digital sources are developed that allow them to be monitored in real time through instruments, sensors and internet transactions, driving a new sector called Big Data. And why BIG? Because, according to IBM statistics, 90% of the data currently circulating in the digital world was generated in the last two years, and 2.5 quintillion bytes of data are created every day, an estimate that keeps growing because of the dynamism of this field of knowledge, which changes every second.
More data? Facebook's more than one billion registered users generate over 1,500 updates per second. The trading platform eBay constantly updates the data of more than 100 million active users, clearly differentiating their buyer profiles.
And who breaks the record in data handling? The LHC (Large Hadron Collider) laboratory, home to the largest particle accelerator in the world, located on the border between France and Switzerland, where, as popular culture has it, the "God particle" is being sought. In only three years of operation it has accumulated some 100 petabytes of data (100,000,000,000,000,000 bytes); since that figure is unpronounceable, we could say it is equivalent to about 700 years of high-quality HD movie footage.
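As a quick check of that 700-year comparison, here is a minimal back-of-envelope sketch in Python; the assumption of roughly 16 GB per hour of high-quality HD video (about 36 Mbit/s, in the range of Blu-ray bitrates) is ours, not a figure from the laboratory.

# Back-of-envelope check of the "700 years of HD movie" comparison.
# Assumed figure (not from the article): high-quality HD video at
# roughly 16 GB per hour, i.e. about 36 Mbit/s.
TOTAL_BYTES = 100 * 10**15          # 100 petabytes = 10^17 bytes
BYTES_PER_HOUR_HD = 16 * 10**9      # assumed bytes per hour of HD video
HOURS_PER_YEAR = 24 * 365.25

hours_of_video = TOTAL_BYTES / BYTES_PER_HOUR_HD
years_of_video = hours_of_video / HOURS_PER_YEAR
print(f"{hours_of_video:,.0f} hours, about {years_of_video:,.0f} years of HD video")
# prints: 6,250,000 hours, about 713 years of HD video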
And finally, the practical question: what supercomputer will be able to manage these gigantic quantities of constantly changing data? The Cray Titan, built by the firm Cray Inc., has been operational since October 2012. It was expected to exceed a capacity of 20 petaflops, although so far it has reached a sustained speed of 17.59 petaflops, making it the fastest supercomputer in the world. It is a hybrid system comprising a total of 560,640 processor cores, of which 299,008 belong to 16-core AMD Opteron 6274 CPUs and the remaining 261,632 to NVIDIA Tesla K20X graphics processors. It also includes 10,000 hard disks of 1 terabyte at 7,200 rpm, with a transfer speed of 240 GB/s.
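The core figures quoted above can be cross-checked with simple arithmetic; the short Python sketch below only verifies that the numbers are mutually consistent, and says nothing more about Titan's actual node layout.

# Consistency check of the Titan core counts quoted above.
TOTAL_CORES = 560_640               # total processor cores
OPTERON_CORES = 299_008             # cores in 16-core AMD Opteron 6274 CPUs
CORES_PER_OPTERON = 16

cpu_chips = OPTERON_CORES // CORES_PER_OPTERON
gpu_attributed_cores = TOTAL_CORES - OPTERON_CORES
print(f"Opteron chips: {cpu_chips:,}")                                   # 18,688
print(f"Cores attributed to Tesla K20X GPUs: {gpu_attributed_cores:,}")  # 261,632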
Without a doubt, with all the data circulating in our digital world, these supercomputers may at some point be able to give substance to what the Swiss psychiatrist Carl Gustav Jung called our "collective unconscious", common to all human beings of all times and places in the world, something that still lies beyond reason.