If your organization is running SAP’s software products and applications, SAP’s in-memory computing platform, SAP HANA, offers game-changing potential for delivering business value through data analytics. However, as EMC IT recently discovered, there are key steps you should take to pave the way for leveraging this new platform.
EMC is still a relatively new customer of SAP’s software products and applications, having gone live with a large-scale greenfield SAP deployment back in July 2012. The implementation program continues to thrive: we have added more SAP modules and solutions over the past couple of years and are now gearing up for a large SAP CRM deployment in 2016.
As the deployment has matured, we’ve had a few opportunities to leverage SAP’s new in-memory computing platform: SAP HANA. Following some initial success with the “Sidecar” solution, which supports real-time analytics against a custom HANA data mart populated with replicated data from the source ERP instance, we realized we needed to focus on a couple of key areas in order to operationalize this new platform. For starters, we needed to build employee skills that would allow us to appropriately leverage the capabilities of the SAP HANA platform, and to drive ROI, we also needed to drive down our hardware costs.
We put a concerted effort into training and enabling our SAP technical teams (system administrators, BASIS administrators, and developers), but much of our skills development was on-the-job learning. We’ve found that while some tasks are actually simplified (for example, a DBA no longer needs to manage indexes or tablespaces), the critical skills needed were more in the development area.
Traditional ABAP developers are taught to write database-agnostic code and bring the logic into the application tier. Now, with HANA being SAP’s own in-memory database platform, this approach is being turned on its head: drive the logic down into the database. Fortunately, we have some excellent database developers within EMC IT who have been instrumental in translating their skills to the HANA platform. One of our developers has even been accepted into the HANA Distinguished Engineer program. Coupled with senior BASIS and DBA skills, this has been the backbone of empowering our HANA deployments.
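To make the “code pushdown” shift concrete, here is an illustrative sketch of the same aggregation done two ways. SQLite stands in for SAP HANA purely for illustration; on HANA this pushdown would typically be expressed in SQLScript or a CDS view rather than Python, and the table and column names here are invented for the example.

```python
# Illustrative "code pushdown" sketch: compute an aggregate inside the
# database instead of looping over rows in the application tier.
# SQLite stands in for SAP HANA; the sales table is a made-up example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 100.0), ("EMEA", 250.0), ("APJ", 75.0)],
)

# Traditional application-tier style: pull every row across the wire,
# then aggregate in a loop in application code.
totals_app = {}
for region, amount in conn.execute("SELECT region, amount FROM sales"):
    totals_app[region] = totals_app.get(region, 0.0) + amount

# Pushed-down style: let the database engine do the aggregation and
# return only the (much smaller) result set.
totals_db = dict(
    conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)

assert totals_app == totals_db  # same answer, far less data movement
```

The two approaches return identical results; the difference is where the work happens. On an in-memory, column-oriented engine like HANA, pushing the aggregation down avoids shipping every row to the application server and lets the database exploit its columnar storage.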
The other focus area for us has been in managing infrastructure demand, which we’ve done through different channels. We first collaborated closely with VMware to virtualize our SAP HANA deployments, starting with all non-production environments. This has allowed us to tailor our compute allocation to specific CPU/memory requirements rather than over-allocating on a physical appliance. Comparing all of our non-production HANA environments today against the equivalent set of traditional appliances, we estimate we have driven down our infrastructure costs by a factor of roughly three.
Another option for driving down compute costs is to minimize your DRAM (in-memory) demand in the first place. It’s rather simple: a smaller database means less data to load into memory, which means less memory and compute required. So how can we reduce our database size? The obvious answer is applying best practices for data cleansing and archiving. In the case of a data warehouse such as SAP Business Warehouse (BW), the notion of archiving is not quite as straightforward, as many customers need to store historical records for long-term analytics and/or compliance purposes.
Fortunately, SAP has a solution for decreasing your BW footprint called Nearline Storage (NLS). While this solution has been around for quite some time, it wasn’t until recently (3Q’13) that SAP offered the integration of this solution with a secondary database store: SAP IQ. SAP IQ is columnar-based and designed for high-volume analytics (in fact, under its former name Sybase IQ it has been a dominant player in the financial markets for years), and off-loading data out of your primary BW database and into SAP IQ showed no performance degradation in our initial testing. In fact, we saw some performance improvements when querying against IQ.
In the case of our SAP BW deployment, we were able to leverage NLS and SAP IQ to shrink our database size by roughly 38 percent, from 7.2 TB to 4.5 TB. Since we implemented this solution ahead of our BW on HANA migration, we significantly drove down our infrastructure demands for BW on HANA.
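The arithmetic behind these savings can be sketched in a few lines. The 2x working-memory multiplier below is a commonly cited HANA sizing rule of thumb (data footprint plus working space), used here as an assumption rather than an exact figure for any particular system:

```python
# Back-of-the-envelope check of the NLS savings described above.
before_tb, after_tb = 7.2, 4.5  # BW database size before/after NLS

reduction = 1 - after_tb / before_tb
print(f"Database shrank by {reduction:.1%}")  # 37.5%, i.e. roughly 38 percent

# Assumed rule of thumb: HANA RAM ~ 2x the in-memory data footprint
# (data plus working space). Actual sizing depends on the workload.
RAM_MULTIPLIER = 2
print(f"Estimated RAM before NLS: ~{before_tb * RAM_MULTIPLIER:.1f} TB")
print(f"Estimated RAM after NLS:  ~{after_tb * RAM_MULTIPLIER:.1f} TB")
```

Because HANA keeps active data in DRAM, every terabyte moved to nearline storage roughly doubles up as avoided memory under this rule of thumb, which is what made NLS such an effective precursor to the BW on HANA migration.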
The methodical approach we took to adopting the SAP HANA platform, including executive-level support and a focus on skills development and infrastructure management, ultimately paved the way for a successful BW on HANA deployment. While query response times have improved as expected, what’s been interesting is how much more smoothly our data and batch loading cycles are running now. In fact, we estimate that we’ve driven down support costs by more than 50 percent due to the decrease in “hand holding” that our loads now need. Furthermore, we’re able to run some loads more frequently throughout the day, giving our business users fresher data.
And we’re not done. As our BI Director Pat O’Sullivan adds, “HANA has been a game changer for us; it has driven record performance and a consistency in delivery that we haven’t seen before. We’re now exploring how we can capitalize on HANA technologies and capabilities in other areas to drive additional business value.”
We have initiatives under way to bring real-time reporting capabilities into the BW instance itself, streamlining traditional BW and real-time analytics into a single instance.
As EMC IT continues to evolve and mature in the SAP HANA space, feel free to follow our story on Twitter.