Yesterday, I came across some seriously scary stuff that kept me awake for the better part of the night.
In the late 90s, a group of stock brokers and software analysts developed the "Web-bot" to predict stock market trends. Their idea was to monitor Internet activity (search terms, discussion posts, and the like) and develop natural language algorithms to study human behaviour online, then use it to judge upward/downward market shifts.
Since then, they claim to have expanded the project, and they cite a long list of events they say they predicted a full 60-90 days before they occurred. Prominent among them are the 9/11 terrorist attacks and Hurricane Katrina: http://en.wikipedia.org/wiki/Web_Bot
What's more, they even claim that their Web-bot predicts the world will end in 2012. Let that sink in for a minute.
Now, their work is quite similar to what we are trying to achieve. While we are trying to prove the existence of "time travellers", they use something they define as the "collective human subconscious" to tap into the future. The Web-bot technology apparently taps into an area of preconscious awareness. Sort of an "I sense a disturbance in the Force" (Obi-Wan in Star Wars) kind of thing. I would like to draw your attention to a series of experiments conducted by Dean Radin in the field of "unconscious perception studies", which seem to show that human beings in a lab react to a stimulus a full 6 seconds before it actually occurs!! http://www.emergentmind.org/PDF_files.htm/timereversed.pdf
Supposedly "random" numbers generated all over the world appeared to become less random immediately prior to 9/11. The Web-bot, therefore, uses the whole pool of people on the internet to study these precognitive trends and make predictions.
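To give a sense of what "less random" even means in a testable way, here is a minimal sketch (my own illustration, not the actual method used in those RNG studies) that flags a window of random bits whose count of 1s drifts too far from the fair-coin expectation:

```python
import math
import random

def z_score_of_ones(bits):
    """Z-score of the count of 1s vs. the fair-coin expectation.
    For a truly random source, |z| > 3 should be very rare; a run of
    windows with large |z| is what "less random" would look like."""
    n = len(bits)
    ones = sum(bits)
    mean = n / 2
    std = math.sqrt(n) / 2  # std dev of a Binomial(n, 0.5) count
    return (ones - mean) / std

# Simulated RNG window of 10,000 bits; a real study would stream
# hardware-RNG output through windows like this one.
window = [random.getrandbits(1) for _ in range(10_000)]
print(f"z = {z_score_of_ones(window):+.2f}")
```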
Here is a group that claims to have predicted the DC sniper shootings and the Columbia disaster: http://www.urbansurvival.com/bot4.htm
Mapping language is not easy. While we seek to use existing search engines, the Web-bot people went a step further and built their own. The way your standard search-engine-next-door (think Google) works is by using automated robots called "spiders". These spiders traverse the web, indexing every new page and its information. The web-bots work in a similar way: they "look for particular kinds of words. It targets discussion groups, translation sites, and places where regular people post a lot of text."
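For the curious, a spider boils down to something like the sketch below. This is my own toy version, not the Web-bot's code; the target words are hypothetical and a real crawler would also need politeness rules (robots.txt, rate limits), but it shows the fetch-scan-follow loop:

```python
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

TARGET_WORDS = {"earthquake", "market", "panic"}  # illustrative only

def crawl(seed_url, max_pages=50):
    """Breadth-first crawl from seed_url, recording pages that
    mention any target word."""
    seen, queue, hits = set(), deque([seed_url]), []
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=5).text
        except requests.RequestException:
            continue  # dead link or timeout: skip it
        soup = BeautifulSoup(html, "html.parser")
        text = soup.get_text(" ").lower()
        if any(word in text for word in TARGET_WORDS):
            hits.append(url)
        for link in soup.find_all("a", href=True):
            queue.append(urljoin(url, link["href"]))
    return hits

print(crawl("http://example.com"))
```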
When a "target word" is found,, the web bots take a small 2048 byte snip of surrounding text and send it to a central collection point. The collected data is then filtered, using at least 7-layers of linguistic processing , which is then reduced to numbers and then a resultant series of scatter chart plots. Viewed over a period of time, the scatter chart points tend to coalesce into highly concentrated areas. Each dot on the scatter chart might represent one word or several hundred.
In one paper (I lost it; give me some time and I will find it), they present a "tipping point" theory: "A 'tipping point' occurs when events from the future pass some mathematical point, where the general direction of many future events changes based on the events in the past that lead up to it." The suggestion is that time is fluid and that, in quantum terms, the human mind has an ability to perceive the future.
Pretty scary huh?
This research has far-reaching implications for our quest for time travellers. Firstly, it probably provides a blueprint for how to go about our search. However, even if we uncover some evidence of possible precognition, how can we be sure that it is not "unconscious perception" but evidence of a possible time traveller?
Whew! That took a while to compile.