Big data expected to drive 12,000 jobs in Northern Virginia
Big data and analytics is the collection and analysis of large sets of information, a practice that reaches into virtually every industry.
Retailers like Wal-Mart and eBay use big data to process and analyze the billions of transactions they make every year to find trends and inefficiencies.
The healthcare industry uses databases of information to create new drugs, process large caches of patient information and review epidemiological trends in diseases.
Scientists used big data to help find the Higgs boson, or the “God Particle.”
The report, sponsored by the Northern Virginia Technology Council, The George Washington University and Attain, a cloud-computing solutions firm, showed that 14 percent of the big data and analytics companies in Northern Virginia employ more than 1,000 people each, for a total of more than 1 million jobs in Northern Virginia alone.
One of the major hurdles the report laid out for the big data industry was a skills gap: the number of qualified workers falls well short of the demand expected over the next few years.
Some area schools have taken notice of the expected shortfall in qualified employees. George Washington partnered with IBM for a big data master's program last year, and the University of Virginia earlier this year received $10 million for its Data Sciences Institute.
"Recognizing the increasing need for training in this area, 90 percent of colleges, universities and other educators who responded to the survey already offer or intend to offer courses or programs specifically targeting Big Data and Analytics within the next five years," read the report.
In a similar way, Stanford University used its proximity and brain power to become the leading feeder school for Google.
Because revenues are expected to grow quickly in the next few years, and because big data can be applied to almost any field of interest, demand for big data services is expected to keep growing as computational power rises.
Computing power and falling prices
Moore's Law, originated by Gordon Moore in a 1965 paper for Electronics Magazine, stated that the number of components that could fit on a computer chip would roughly double every year.
His model proved surprisingly accurate, and even though the doubling period is now closer to two years, many companies have adopted it as an industry barometer when planning for future advances in computing.
Calculations that now take seconds once took hours or days.
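The compounding effect behind that speedup can be illustrated with a short sketch. This is not from the report; it simply assumes an idealized doubling of chip component counts every two years, per the revised form of Moore's Law described above (the starting count and time span are hypothetical):

```python
def components(initial, years, doubling_period=2):
    """Project a chip's component count forward, assuming it
    doubles once every `doubling_period` years."""
    return initial * 2 ** (years / doubling_period)

# A hypothetical chip with 1 million components, projected a
# decade ahead: five two-year doublings give a 32x increase.
print(components(1_000_000, 10))  # → 32000000.0
```

Exponential growth like this is why workloads that once took hours or days now finish in seconds.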
In an interview with Intel, Moore said, "What I was trying to do was to get across the idea that this was the way electronics was going to become cheap."
Much of the technology needed to crunch such large data sets was not available at today's size and, most importantly, price point in years past, but the democratization of computing has made startups and expansion in the field far easier and less costly.
The expected growth in big data means great potential for growth in jobs in the region.