Big Data is Huge

We grew up in the early days of the internet, which began with a screaming modem dialing onto the information highway. Data had always been out there, but it was not easy to find until search engines arrived on the scene. Once the data was found, the next effort was to download, format, import, and then analyze it. The tools of the day: FTP, Pascal, spreadsheets, and Visual Basic.

As the internet continues to expand at an exponential rate, so have the availability and size of data, to the point that it's now called big data. Modems have been replaced by high-speed connections, and physical cables are no longer required thanks to WiFi. Search engines have been refined, making it seem easier to find data on almost anything. The tools/apps have expanded to the point of massive oversaturation: if you don't like one application, you can assuredly find an alternative to suit your liking. In most cases, efficiency is lost with too many choices.

Since the beginning, we have worked with HTML, PHP, SQL, assorted scripting languages, and spreadsheets. To power through the big piles of data, we use the R/Python one-two punch. We make big data tap out.
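What does "powering through" look like in practice? One minimal sketch on the Python side of the punch: stream a big file one row at a time instead of loading the whole thing into memory. The file name and column here are hypothetical, standing in for any large dataset.

```python
import csv
import io

def running_stats(rows, column):
    """Compute count and mean of a numeric column in a single pass,
    so only one row ever lives in memory at a time."""
    count, total = 0, 0.0
    for row in rows:
        count += 1
        total += float(row[column])
    return count, (total / count if count else 0.0)

# Simulate a big file with an in-memory buffer; a real run would pass
# csv.DictReader(open("huge.csv")) instead.
data = io.StringIO("value\n" + "\n".join(str(i) for i in range(100_000)))
count, mean = running_stats(csv.DictReader(data), "value")
# count -> 100000, mean -> 49999.5
```

The same one-pass idea scales from spreadsheet-sized files to data sets far larger than RAM; R's streaming readers play the equivalent role on the other side of the punch.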

Big Data can be a scary proposition, but not to us. We dig big, fat, scary data sets.

Data is huge: roughly 90% of the world's data was collected or uploaded in the last two years. The ultimate answers are usually obscured by the white noise of getting things done. Too often, we don't have the luxury of optimizing based on properly aligned data.

So what's the best technique to gather the answers? One thing is for sure: there is no single answer. There are lots of paths to being better.

What will make you bang your head: TRYING to find the best tool/app/program to solve data handling issues. Yes, TRYING, not finding. The combinations are mind-boggling.

That being said, we have our go-to tools and can slice and dice data of any size to find the information it holds.

Contact us to start the discussion.