While it is bad form to sneer at the rapid fall of cryptocurrencies, some serious opportunities are emerging as a result. For those not aware, crypto miners in the past few years have been buying up pretty much every high-capacity GPU available on the market. This bid up prices and reduced availability to the point where even major cloud providers could not get their hands on current models.
When combined with Moore’s law, this has led to a situation where the average GPU being used for anything other than crypto is several years old and roughly a quarter as powerful as normal market conditions would support. But this has also led many software companies to avoid optimizing their wares for GPUs. So, on average, the software you are using is probably ten times slower than it should be.
That is probably the largest market opportunity in a generation, and smart companies should be looking now at how to exploit it. Speeding up your word processor or spreadsheet by ten times is unlikely to unlock any major business value. But there are several important areas where it will.
Analyzing data and database systems
The most obvious area is database systems, particularly those operating on big data. The digitization of the world overall has not slowed down, and as a result, systems built on top of legacy databases are struggling these days just to keep up. This isn’t always apparent to end users as a database issue but typically manifests as painfully slow screen refresh rates or stuck busy cursors.
This has been mitigated somewhat by a move to cloud computing with automatic horizontal scaling (adding more CPUs). However, as data volumes get really big, the process of moving data across systems and between CPU boxes becomes rate-limiting. The result is diminishing returns, where doubling the compute applied only gets you, for example, 50% more speed.
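To see why, consider a toy model (with assumed, purely illustrative numbers) in which the compute portion of a job parallelizes cleanly across nodes but the data-movement portion does not:

```python
# Toy model of sub-linear scaling (illustrative numbers, not a benchmark):
# compute parallelizes across nodes, but moving data between them does not.

def wall_clock_hours(nodes, compute_hours=8.0, data_movement_hours=1.0):
    """Estimated job time: the compute share shrinks as nodes are added,
    while the data-movement share stays roughly fixed."""
    return compute_hours / nodes + data_movement_hours

t4, t8 = wall_clock_hours(4), wall_clock_hours(8)
print(f"4 nodes: {t4:.1f}h, 8 nodes: {t8:.1f}h, speedup: {t4 / t8:.1f}x")
# 4 nodes: 3.0h, 8 nodes: 2.0h, speedup: 1.5x -- double the hardware, 50% more speed
```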
The implicit response by most companies in this circumstance is, essentially, to stop even looking at all the data. For instance, you might aggregate hourly data to daily or daily to monthly. Under normal operating conditions with well-understood data, this can be fine. However, it comes at some risk because modern data science techniques require access to the primary granular data in order to drive a fundamental type of insight: anomaly detection.
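As a minimal sketch of that risk (synthetic data, not any particular workload), here is how a single anomalous hour can vanish once hourly readings are rolled up to daily averages:

```python
import numpy as np
import pandas as pd

# Synthetic hourly sensor readings for 30 days, with one anomalous hour.
rng = np.random.default_rng(0)
hours = pd.date_range("2023-01-01", periods=24 * 30, freq="h")
readings = pd.Series(rng.normal(100, 5, size=len(hours)), index=hours)
readings.iloc[400] = 400  # a single extreme hour

daily = readings.resample("D").mean()  # the usual aggregation

print(f"Max hourly reading: {readings.max():.0f}")  # ~400: an obvious outlier
print(f"Max daily average:  {daily.max():.0f}")     # ~113: looks unremarkable
```

The point is not the specific numbers but the pattern: once the aggregate is the only thing stored, the anomaly is unrecoverable.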
Don’t ignore outliers
Anomalies can be either good or bad, but they are rarely neutral. They represent your best and worst customers, and your company’s best and worst responses. They include issues of high business risk as well as high reward. So solving a technology limitation by ignoring outliers is penny-wise and pound-foolish.
A classic example might be the utilities that until recently (and sometimes still) use 1km-resolution data to monitor tree-strike and forest fire risk. A single pixel in such a system might contain 1,000 healthy trees and one dead one. But it only takes a single tree hitting a power line to spark a wildfire big enough to bankrupt a major utility.
The business risk, in that case, is hidden within decades-old data collection decisions underneath even older database technology — but it is nonetheless very real. And today would be a very good time to start addressing it since sources and methods have evolved rapidly over the last five years and have generally not exploited either GPU analytics or new hardware.
A similar situation exists with prospect and customer data within many businesses. An accounting mindset and older technology can lead to routine aggregation of data into monthly and quarterly reports ad nauseam. But you should never forget that your customers are individuals, whose cumulative experience across multiple touchpoints forms the basis of their likelihood to buy or recommend (or not). Just as with the risk above, market opportunities are hidden by default in common aggregations like sums and averages.
This brings up another very important issue in business analytics: who within an enterprise is empowered to find such risks or opportunities. Perhaps the most important reason to upgrade older systems with GPU analytics is the availability of interactive no-code visual analytics. As the name implies, this allows a far wider range of people within an organization to notice a risk or opportunity and dig in interactively to confirm or dismiss it. That could well be a salesperson or a frontline employee not traditionally thought of as a ‘data analyst’ or ‘data scientist.’
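As a toy illustration (synthetic numbers and a hypothetical schema), a monthly average can look perfectly stable while an individual customer quietly walks away:

```python
import pandas as pd

# Hypothetical order data: three customers across two months.
orders = pd.DataFrame({
    "customer": ["A", "B", "C", "A", "B", "C"],
    "month":    ["2023-01"] * 3 + ["2023-02"] * 3,
    "spend":    [100, 100, 100, 140, 150, 10],  # customer C is quietly churning
})

monthly_avg = orders.groupby("month")["spend"].mean()
per_customer = orders.pivot(index="customer", columns="month", values="spend")

print(monthly_avg)    # 100 -> 100: the aggregate report shows nothing unusual
print(per_customer)   # customer C's spend dropped from 100 to 10
```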
Next steps for data and current systems
All business situations are unique, so an enterprise’s next move here may vary. But as a simple next step, managers should consider which of the business functions they are responsible for rely on datasets or software tools more than five years old. Then look more specifically at what ‘big’ data is available relative to current systems and what value it might bring.
If they see an area of opportunity, they then have to consider what kind of quick pilot they might organize to validate it. Paradoxically, without access to interactive GPU analytics, the opportunity itself can be hard to evaluate. So businesses should talk to vendors and consider testing in a cloud environment. The crypto miners’ pain may well be enterprises’ gain.
Mike Flaxman is product manager with Heavy AI.