Capitalize on the fall in cryptocurrency mining

Whatever you think of the rapid fall of cryptocurrencies, serious opportunities are emerging as a result. For those unaware, crypto miners have, over the past few years, purchased just about every high-capacity GPU on the market. This buying spree drove up prices and reduced availability to the point where even major cloud providers couldn’t get their hands on current models.

Combined with Moore’s Law, this has created a situation where the average GPU used for anything other than crypto is several years old; with GPU performance roughly doubling every couple of years, that makes it perhaps a quarter as powerful as normal market conditions would support. The shortage has also led many software companies to skip optimizing their products for the GPU. So, on average, the software you’re using is probably ten times slower than it should be.

This is probably the biggest market opportunity in a generation, and smart companies should now be looking to tap into it. Speeding up your word processor or spreadsheet tenfold is unlikely to unlock major business value. But there are several important areas where it will.

Data analytics and database systems

The most obvious area is database systems, especially those running on big data. The digitization of the world has not slowed down, and as a result, systems built on legacy databases are struggling to keep pace. This isn’t always obvious to end users as a database issue; it usually manifests as painfully slow screen refreshes or stuck busy cursors.


This was somewhat mitigated by the move to cloud computing with automatic horizontal scaling (adding more processors). However, as data volumes grow very large, moving data between systems and between CPU enclosures becomes rate-limiting. The result is sublinear scaling, where doubling the compute applied yields only, say, 50% more speed, as the sketch below illustrates.
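A minimal sketch makes the arithmetic concrete. Assuming the data movement behaves like a fixed serial fraction of the workload (Amdahl’s law), adding processors yields diminishing returns; the 25% serial fraction below is an illustrative assumption, not a measured figure.

```python
# A minimal Amdahl's-law sketch: if a fraction "s" of the total work
# (e.g., moving data between systems and enclosures) cannot be
# parallelized, adding processors gives diminishing returns.
def speedup(n_processors: int, serial_fraction: float) -> float:
    parallel_fraction = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_processors)

# Illustrative assumption: 25% of wall-clock time goes to data movement.
s2, s4 = speedup(2, 0.25), speedup(4, 0.25)
print(s2)       # 1.6x over a single processor
print(s4)       # ~2.29x
print(s4 / s2)  # ~1.43 -- doubling the processors buys well under 2x
```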

The default response of most companies in these circumstances is, essentially, to stop looking at all of the data. For example, hourly data gets aggregated to daily, or daily to monthly. Under normal operating conditions with well-understood data, this may be fine. But it carries real risk, because modern data science techniques need access to the primary, granular data to deliver one of their most fundamental kinds of insight: anomaly detection.
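A toy example shows what aggregation costs; the readings below are invented for illustration. Averaging hourly values into daily ones dilutes a severe spike into something easy to dismiss, while a simple z-score on the raw data flags it immediately.

```python
import numpy as np
import pandas as pd

# Hypothetical hourly sensor readings: steady at ~100, with one extreme spike.
rng = pd.date_range("2023-01-01", periods=24 * 7, freq="h")
values = np.full(len(rng), 100.0)
values[80] = 2000.0  # a single anomalous hour
hourly = pd.Series(values, index=rng)

# Daily averaging dilutes the spike: the anomalous day reads ~179 vs ~100,
# easy to write off as noise.
daily = hourly.resample("D").mean()
print(daily.round(1))

# A simple z-score on the raw hourly data recovers the exact anomalous hour.
z = (hourly - hourly.mean()) / hourly.std()
print(hourly[z.abs() > 3])
```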

Don’t ignore outliers

Anomalies can be good or bad, but they are rarely neutral. They represent your best and worst customers, and your company’s best and worst responses. They encompass your biggest business risks as well as rewards. Papering over a technological limitation by ignoring outliers is therefore penny-wise and pound-foolish.

A classic example is utilities, which until recently – and sometimes still – used 1km-resolution data to monitor tree-related wildfire risk. A single pixel in such a system can contain 1,000 healthy trees and one dead one. But it takes only one tree striking a power line to start a wildfire large enough to bankrupt a major utility.

The business risk in this case is hidden in decades-old data collection decisions sitting beneath even older database technology – but it is very real nonetheless. And now would be a very good time to start tackling it, since data sources and methods have evolved rapidly over the past five years and have generally not yet exploited GPU analytics or new hardware.

Uncover hidden market opportunities

A similar situation exists with prospect and customer data in many businesses. An accounting mindset and older technology can lead to data being systematically rolled up into monthly and quarterly reports ad nauseam. But never forget that your customers are people, whose cumulative experience across multiple touchpoints determines how likely they are to buy or recommend you (or not). As with the risks above, market opportunities are masked by default in common aggregations such as sums and averages, as the sketch below illustrates.
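Here is a hypothetical sketch of the same masking effect in customer data (the customer IDs and spend figures are invented): a flat overall average conceals a single customer whose behavior deserves a closer look, which a per-customer grouping surfaces at once.

```python
import pandas as pd

# Hypothetical touchpoint-level records; IDs and spend are invented.
events = pd.DataFrame({
    "customer": ["a", "a", "a", "b", "b", "b", "c", "c", "c"],
    "spend":    [10,  12,  11,   9,  10,  11,  10,  11, 400],
})

# The overall average is a single, unremarkable number...
print(events["spend"].mean())  # ~53.8: one figure, no story

# ...but a per-customer breakdown surfaces the outlier worth investigating.
print(events.groupby("customer")["spend"].agg(["sum", "max"]))
# customer "c" stands out immediately (sum=421, max=400)
```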

This raises another very important question in business analysis: who within a company is empowered to find these risks or opportunities? Perhaps the most important reason for upgrading older systems with GPU analytics is the availability of no-code, interactive visual analytics. As the name suggests, this allows more people within an organization to notice a risk or opportunity and interactively drill down to confirm or reject it. That could be a salesperson or a frontline employee who is not traditionally considered a “data analyst” or “data scientist.”

Next steps for current data and systems

Every business situation is unique, so next steps will vary. As a simple starting point, though, managers should determine which of the business functions they are responsible for rely on datasets or software tools that are more than five years old. Then look more specifically at the “big data” available to those functions versus their current systems, and the value it could deliver.

If they see an area of opportunity, they should then think about what kind of rapid pilot project could validate it. Paradoxically, without access to interactive GPU analytics, that can be difficult to assess. Companies should therefore talk to vendors and consider a test in a cloud environment. The pain of crypto miners may well be the gain of other companies.

Mike Flaxman is a product manager at Heavy AI.

