GPU Computing: The Key to Unleashing the Mysteries of All That Data

This post originally appeared as GPU Computing: The Key to Unleashing the Mysteries of All That Data on our COO Carlos Meléndez's Under Development blog at InfoWorld, and is reprinted with permission from IDG.

A few years ago, Marc Andreessen accurately predicted that “software is eating the world.” An update to that declaration today could very well be that data is eating the world.

According to IDC, by 2025, the world will create 180 zettabytes of data per year (up from 4.4 zettabytes in 2013). Until recently, most of that data was structured, meaning it was presented in rows and columns and could be easily entered, stored and sorted. Today, much of the data is unstructured, coming from different websites, devices and databases, and taking the form of video, images, graphics and more.

All of these reams of structured and unstructured data hold the key to helping organizations better understand their customers, follow patterns of behavior to predict future actions, uncover breakthroughs to cure diseases and make communities safer, among a myriad of other things.

What to Do With All That Data

To harness this goldmine of data, artificial intelligence and machine learning have emerged, using algorithms trained on data to find patterns. The problem is, traditional central processing units (CPUs) just can’t adequately handle the bulk processing required for this complex boatload of data, and this is where graphics processing unit (GPU) computing comes in.

GPU-based servers are fast becoming as necessary for effectively processing artificial intelligence data (especially for deep learning algorithms) as CPUs have been for virtually everything else. While CPUs can eventually process tons of unstructured data, in a matter of days or hours, GPUs can do so in a matter of minutes.
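To make that difference concrete, here is a minimal sketch, in Python using the PyTorch library, of how the very same matrix multiplication can run on a CPU and then be handed to a GPU. The sizes are illustrative only, and the actual speedup depends entirely on your hardware:

```python
import torch

# The same large matrix multiplication, first on the CPU, then on the GPU.
a = torch.rand(8192, 8192)
b = torch.rand(8192, 8192)

# CPU: tensors live on the default device.
c_cpu = a @ b

# GPU: move the data to the graphics card and repeat the operation.
if torch.cuda.is_available():
    a_gpu = a.to("cuda")
    b_gpu = b.to("cuda")
    c_gpu = a_gpu @ b_gpu      # executed in parallel across thousands of GPU cores
    torch.cuda.synchronize()   # wait for the GPU to finish before using the result
```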

If CPUs are Race Cars, then GPUs are Cargo Trucks

Yet while GPU computing is the hot new technology, it’s really nothing new. GPUs have been used for many years in gaming applications, as well as for video playback and analysis. They were designed to operate on entire arrays of data at once, which is exactly what makes them so fast at processing large datasets.

At Wovenware, we like to describe the difference between CPUs and GPUs as the difference between a race car and a cargo truck moving lots of supplies from one place to another. A race car is really fast and compact and can get to where it needs to go quickly, yet it may need to make many trips to accomplish the task. A cargo truck, on the other hand, may not be as fast, but may require only one trip to deliver the goods.

The Forerunner of GPU-Enabled Deep Learning

The current leader of the “cargo truck” fleet is NVIDIA, which started out as a video card manufacturer providing superior 3-D rendering for gamers. Yet it saw how it could help process big data, so it created a computing platform that allows data scientists to use GPUs to process that data more efficiently and quickly than ever before.

NVIDIA has enabled this level of use for deep learning algorithms through its software platform, CUDA, which allows the graphics hardware to be programmed for purposes quite different from what it was originally intended for.
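For a feel of what programming against CUDA looks like, here is a minimal sketch of a CUDA kernel written in Python through the Numba library’s CUDA bindings (an assumption on our part for illustration; CUDA itself is more commonly programmed in C/C++). Each GPU thread processes a single element of the arrays:

```python
import numpy as np
from numba import cuda

@cuda.jit
def add_arrays(x, y, out):
    # Each GPU thread computes one element of the result.
    i = cuda.grid(1)
    if i < x.size:
        out[i] = x[i] + y[i]

n = 1_000_000
x = np.ones(n, dtype=np.float32)
y = np.ones(n, dtype=np.float32)
out = np.zeros(n, dtype=np.float32)

# The launch configuration spreads the work across the GPU's many cores.
threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_arrays[blocks, threads_per_block](x, y, out)  # launch the kernel on the GPU
```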

As the market continues to heat up, however, new players are entering the ring with competitive advantages and new features and capabilities, and big companies such as Intel and Google continue to focus on GPU computing. This competition will only serve to inspire continued innovation and lower costs.

GPU Investment Considerations

Adopting this technology is not something to be taken lightly, however. Consider these requirements:

Cost. One GPU card alone can cost more than $10K, and most deep learning projects require a minimum of four GPU cards.

Specialized Hardware Architecture. Companies looking to deploy GPU servers should understand that the investment goes beyond the GPUs themselves: the servers require a special hardware architecture, as well as lots of CPUs and RAM. GPU servers are almost always custom-built to meet the specific machine learning needs of the developer.

Specialized Facilities. GPU servers require special housing, with specific power, airflow and temperature requirements.

Operating GPU servers also requires specialized expertise, ongoing maintenance and significant financial investment. Most companies, especially mid-market ones, just don’t have the ability or budget to handle it in-house, and are turning to the few artificial intelligence and software engineering service providers that can provide the GPU capabilities to build and manage machine learning projects.

GPU computing could eventually become the standard for all software development projects as machine learning takes over. Understanding its origins, capabilities and possibilities gives organizations of all shapes and sizes a considerable leg up in the fast-paced world of business.

How Smart is Artificial Intelligence Really?

Let me start by correcting a common misperception. Artificial intelligence isn’t really intelligent at all. Intelligence is about being able to learn and make judgments along the way. It’s not enough to acquire and apply knowledge, which computers are able to do by leveraging AI and big data; true intelligence is also about self-awareness, creativity and being able to pose problems.

There are many types of intelligence, including emotional intelligence, which requires the ability to accurately read people and situations, to have empathy, and to know how to respond. No software today or in the near future has any of these capabilities of true intelligence.

So, if AI isn’t really about intelligence, what is it about? What we call AI and smart apps today is really software that excels at pattern recognition: the ability to sift through large quantities of data and find the patterns humans cannot perceive. These programs use those patterns to make highly educated “guesses” about future outcomes, known as predictive analytics, based on huge amounts of historical data.

To make this happen, these programs rely on data scientists to continually design and refine algorithms, as well as give them access to large stores of relevant data. The data scientists begin with a set of questions to explore and then develop a hypothesis to predict what will happen. They feed a comprehensive set of data points, the variables, into the smart app and test it to see if the hypothesis is correct. If not, they remove some variables, add others to see what happens, and so on; this is how they train the model.
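As a simplified sketch of that workflow, here is what the train-and-test loop might look like in Python with the scikit-learn library. The dataset below is synthetic and the setup hypothetical, but the shape of the process is the same: fit on historical examples, test the hypothesis on held-out data, then adjust the variables and repeat.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in for historical data: X holds the variables, y the known outcomes.
X, y = make_classification(n_samples=5000, n_features=10, random_state=0)

# Hold some data back so the hypothesis is tested on examples never seen in training.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)             # train the model on history

predictions = model.predict(X_test)     # test the hypothesis
print("Accuracy:", accuracy_score(y_test, predictions))

# If accuracy is poor, drop some variables (columns of X), add others, and refit.
```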

Despite their lack of true “intelligence,” however, these smart apps have provided critical value in a variety of fields, from healthcare and medical devices to retail and security.

On the path to intelligence

The closest we have to “intelligent” computers today are Siri, Alexa and Cortana, because to some limited degree you can interact with them: you can ask them questions and get answers back. While people might think Watson is intelligent, in the end it simply knows a lot of information, which it can process quickly; it doesn’t have the capability of learning or experimenting on its own.

We as humans have the ability to make these computers better, if not actually smarter. To do so, we should focus our efforts on trying to create AI that doesn’t require the massive amounts of data it relies on today. At the same time, we need to keep improving the tools we use to create better algorithms.

While we might be able to create intelligent computers someday, before we can do so we need to better understand how our own brains work and how we learn. We could apply that understanding to try to create software that could actually learn by itself, and perform the “scientific method” on its own by coming up with a hypothesis, performing experiments and learning from the outcome.

This would be a worthwhile endeavor. “Intelligent” computers could help us solve some of the most vexing, complex problems that we haven’t been able to solve, such as curing cancer and other diseases and stopping global warming, among a myriad of others. But on the path to creating intelligence, we have to be careful not to let algorithms decide ethical or situational questions. For example, a self-driving car should not decide whether to cross a bridge that is about to collapse or drive through a live electrical wire in the street, both very real infrastructure concerns in Puerto Rico right now.

In the end, AI programs today are still number crunchers; it’s just that the huge amount of data they crunch, and the algorithms they are programmed to use, enable them to recognize patterns and make predictions. We’re probably still more than 50 years away from understanding how our brains and intelligence work, which we must do before we can begin to create truly intelligent computers. And we’re a long way from having to worry about computers like HAL in 2001 or Skynet in The Terminator dominating humans.

Where do we go from here?

In the meantime, we should focus our efforts on using AI to solve specific business problems, like predicting defects in medical devices before they are used in people or forecasting the likelihood of patient hospital admissions, and on developing algorithms that do these jobs better. In the end, that will not only make businesses more productive, profitable and competitive, but also solve problems for the greater good.

Is the Mid-Market Ready for AI?

Artificial Intelligence (AI) is finally allowing organizations to do something with all that data, and transforming the way they do business, regardless of their size or industry. From companies using virtual assistants in the boardroom to answer specific questions, to ecommerce firms using chatbots to act as personal shoppers, AI-based tools are augmenting the role of humans, absorbing amounts of data that would be impossible for a human.

But until now, many businesses have thought of AI as something straight out of Star Wars. The reality is, it’s alive and well today and being implemented to solve specific business challenges. Mid-market companies are embracing AI solutions, and adoption is easier than you might imagine.

AI is Not Just for the Big Guys

Not only is AI not confined to huge businesses with the resources, cash and expertise to implement it, but some of its benefits are actually especially suited to mid-market companies.

AI’s ability to take over key functions of humans allows companies with limited staff to seem much larger and to improve customer service, even providing 24/7 call center support when it would be impossible to hire enough people to do so. Additionally, at approximately $25K to build a single-purpose chatbot, the cost is within reach for most mid-market firms, which would pay much more to hire a customer service representative. These chatbots can quickly and accurately answer customers’ questions or interact with them as they shop.
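To illustrate the idea at its very simplest, here is a toy sketch in Python of the pattern behind a single-purpose chatbot: match the customer’s question against known topics and answer from a script. A production chatbot would rely on trained language models rather than keyword matching, so treat this only as a conceptual illustration:

```python
# A toy single-purpose chatbot: look for known topics in the question
# and answer from a script; hand off to a human when nothing matches.
RESPONSES = {
    "hours": "We're open 9am to 6pm, Monday through Friday.",
    "shipping": "Standard shipping takes 3 to 5 business days.",
    "return": "You can return any item within 30 days for a full refund.",
}

def answer(question: str) -> str:
    q = question.lower()
    for keyword, reply in RESPONSES.items():
        if keyword in q:
            return reply
    return "Let me connect you with a human representative."

print(answer("What are your store hours?"))  # prints the scripted hours reply
```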

If you’re just starting your AI journey, don’t worry, you’re not too late to the game, but you should start now, to reap the benefits in the next few years. So what are the key considerations for mid-market companies looking to take the cognitive plunge?

It’s all about the data. While it’s not necessary to modernize legacy equipment to run AI programs, it is critical that you have a way to store data, which is the lifeblood of good AI apps. If you don’t currently have the data needed to train an algorithm, it can typically be collected in about three to six months.

Ask the right questions. Before you even think about building a predictive model, you need to know what you are trying to solve and work backwards, making sure you have the right data to find it. For example, in insurance it may be to determine where fraud may be occurring; in banking it may be to identify the customers who may be good candidates for new loans, or who may actually default on a loan. You may also want to determine churn rate, the rate at which customers leave your company compared with new ones coming in. Gathering the correct data helps to ensure accurate predictive models.

Don’t go it alone. While implementing AI doesn’t impose huge modernization requirements, it does require experienced data scientists working alongside software engineers. Since these positions are not typically filled at mid-market companies, work with a service provider that can provide these capabilities: one that knows what data to collect and how to collect it, and that has the high-performance hardware and infrastructure to support your project.

Continuously maintain the app. Once you build a predictive analytics model, a chatbot or another form of AI, it doesn’t end there. The algorithm you develop is only as good as the data it receives. It’s important to understand that an AI project is not a one-time thing; it must be continuously maintained, retrained and fed updated information to remain effective.
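As a rough sketch of what that ongoing maintenance can look like, here is a Python routine, using the scikit-learn and pandas libraries, that refits a churn model whenever a fresh export of customer history arrives. The file name and column names here are hypothetical:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def retrain_churn_model(history_csv: str) -> RandomForestClassifier:
    """Refit the churn model on the latest labeled customer records."""
    data = pd.read_csv(history_csv)        # hypothetical export of customer history
    X = data.drop(columns=["churned"])     # the variables describing each customer
    y = data["churned"]                    # the known outcome: did the customer leave?
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X, y)
    return model

# Re-run on a schedule (e.g., monthly) so predictions reflect current customer
# behavior rather than last year's.
model = retrain_churn_model("customer_history_latest.csv")
```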

Mid-market companies are embracing AI as a way to increase customer satisfaction, better understand customer behavior and automate tasks so that humans are available to take on more strategic roles. It’s clearly the future of computing, and companies that don’t embrace it could be at a clear disadvantage.

Puerto Rico Tech Companies Start the Work of Rebuilding

I recently spoke with Leigh Buchanan of Inc. Magazine about how tech businesses like Wovenware are keeping customer projects moving forward despite the ravages of Hurricane Maria.  Check out her story.

As I mentioned, the key to success is preparing in advance, with backup generators, multiple Internet providers, and constant customer communication. As an artificial intelligence and nearshore software provider, we were also fortunate to be able to send some of our software engineers and data scientists to customer locations on the mainland, without all the hassles that would have occurred if we were a non-U.S. nearshorer.

We’re getting back on track but have learned some valuable lessons about how to prepare for worst-case scenarios.

Wovenware Goes to Washington

On September 13, I had the great honor of participating in a congressional hearing in Washington D.C. on the effectiveness of HUBZone reform, at the invitation of Ranking Member Nydia Velázquez (D-NY) and Chairman Steve Chabot (R-OH).

Wovenware HUBZone Reform Testimony

I am very supportive of HUBZone, a Small Business Administration (SBA)-run program that provides federal contracting opportunities to small businesses in economically distressed areas. As co-founder of a HUBZone-certified company, I was asked to provide testimony on my experiences in the program and share my thoughts about the need for reform of this historically underutilized support program.

While I was thrilled to take part in the hearing, the timing could not have been worse. Hurricane Irma had just hit the island, and the Category 5 Hurricane Maria was on its way. But I was committed to doing my part to improve the HUBZone program so that small businesses across Puerto Rico and in other areas of the U.S. could get the contracts and support they need to thrive. In light of the devastation from the recent hurricanes, it’s something that is needed now more than ever.

Wovenware with Congresswoman Velázquez

Right after our CEO Christian Gonzalez and I arrived in D.C. that morning, we met in Congresswoman Velázquez’s office to discuss my testimony. I was allotted only five minutes to testify, along with three other small business owners, followed by a brief Q&A, but I worked hard to communicate my passionate support of the program.

Despite the U.S. government being the world’s largest buyer of goods and services, spending billions of dollars each year with both large and small businesses, federal agencies still struggle to comply with the minimal requirement of awarding three percent of their contracts to HUBZone-certified companies. I firmly believe the proposed reforms to the HUBZone program have the potential to change this situation, and I was inspired to see that the other small business owners who testified that day felt as strongly as I did about the need for reform.

Wovenware with Senator Risch

After the session, we met with advisors from different congressional offices and procurement officers from various agencies, followed by a meeting with Senator James Risch (R-ID), Chairman of the Senate Committee on Small Business and Entrepreneurship.

Wovenware with Congresswoman González-Colón

We also met with Puerto Rico’s Resident Commissioner Jenniffer González-Colón, who was very pleased with our accomplishments as a HUBZone member and a local success story. We look forward to continuing to work closely with her to help promote HUBZone.

The bill to reform HUBZone, H.R. 3294, will go to Congress sometime in November or December, and we will be sure to provide updates on it in this blog.

To cap off a productive day, we enjoyed a fabulous dinner at Founding Farmers in D.C., a farm-to-table restaurant owned by the North Dakota Farmers Union and Farmers Restaurant Group. It was a perfect ending to a perfect day.

Moving Business Forward Despite Maria’s Devastation

Puerto Rico has been dealt one of the worst natural disasters to hit the island in more than 80 years, and as I write this, we are struggling with shortages of electricity, fuel, water and food, among other things.

But as I mentioned last month when I testified before Congress about HUBZone reform, here on the island we are resilient, resourceful and committed to keeping our economy and people moving forward.

At Wovenware, this deep commitment extends to our customers, and we have implemented measures to make sure our customers’ projects across the U.S. and beyond keep moving ahead as we continue to develop and support their AI needs, legacy modernization initiatives and other solutions.

Over the last few days, we’ve implemented a triage system to ensure that the most critical and time-sensitive projects — which would have been impacted the most by delays — are operational and on schedule. While we do not yet have power, we are fortunate that our San Juan headquarters received little damage, and we continue to meet our customers’ needs.

For example, we have implemented flexible work arrangements, with many of our software engineers working out of customers’ local offices, such as Claro’s headquarters, or the local Medicaid office for our software development work with the Department of Health. Some of our partners, like our bank, BanescoUSA, and our Internet service provider, Icomm Networks, have also provided temporary space for us. Finally, we’re also renting additional temporary space from local businesses, and have staff members working from home where they can.

For a major customer in Portland, Oregon, we’re sending six of our software professionals to work on-site, and we will do the same for customers in Washington D.C. and other locations. We’re able to seamlessly continue our work onsite in these locations because we’re a U.S.-based nearshorer, which means we don’t have to worry about visa requirements or foreign worker regulations.

After Hurricane Nightfall

Have no doubt, Hurricane Maria has caused extensive damage to our island, and it will take quite some time for us to achieve normalcy. Yet our perseverance and resilience are as strong as ever, and while our physical headquarters does not yet have power, we are open for business.

Our most important asset always has been our people – their expertise, excellence, and commitment to our customers – and all of that remains as strong as ever.

UPDATE (September 29, 2017): We are back at our office today! The office backup power generator was fixed yesterday and we are all together working from our headquarters.