Taking the Data Center to the Next Level

Data is growing exponentially, challenging IT organizations to adopt and embrace new methodologies to harness and empower data faster than ever. Yet, the challenge isn’t just managing data’s explosion but doing so in a way that provides easier access and performance for the business while keeping costs to a minimum.

This is the equation facing most database administrators: how can they meet the challenge of simplifying data while pleasing the business? Through EMC IT's implementation of XtremIO and the benefits derived from the all-flash array, we achieved balanced performance with simple provisioning and a suite of powerful features, along with database consolidation and faster response times.

For more information about EMC IT’s use of Flash and XtremIO, read Flash Comes Down to Earth: High-performance Storage Goes Mainstream by KK Krishnakumar, VP and Chief IT Architect, EMC IT.

Assessing Data Loss Costs: Value-Driven Protection of the Bottom Line

By Omer Sagi — Data Scientist, EMC IT

In an age when most companies are investing to become data-driven, the value of data is increasingly a key criterion in IT decisions, and protecting that data becomes paramount to those decisions.

When making backup-related decisions, justifying the price involves estimating the potential capital loss to the organization when data is lost or becomes unavailable. Understanding the value of data, and of access to that data, is key when prioritizing backup technology, or even when deciding which infrastructure to protect during a cyber-attack. However, estimating this loss is not trivial.

I recently worked on a research project with a team of academic partners at Ben-Gurion University for prioritizing data replication to minimize the monetary loss in the case of a disaster. The method we derived can limit the costs of data loss, and could provide a high return on investment (ROI) of up to one million dollars per incident.
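
To make the idea of value-driven prioritization concrete, here is a minimal sketch of ranking datasets for replication by expected loss avoided per dollar of replication cost. The dataset names, values, probabilities and costs below are illustrative assumptions, not figures from the study.

```python
# Hypothetical sketch: rank datasets for replication by expected loss avoided
# per dollar of replication cost. All numbers are illustrative, not from the study.

datasets = [
    # name, business value ($), annual probability of loss, annual replication cost ($)
    ("order_db",     2_000_000, 0.02, 15_000),
    ("crm_archive",    500_000, 0.05,  8_000),
    ("test_sandbox",    50_000, 0.10,  5_000),
]

def replication_priority(value, p_loss, cost):
    """Expected annual loss avoided per dollar spent on replication."""
    expected_loss = value * p_loss
    return expected_loss / cost

ranked = sorted(datasets, key=lambda d: replication_priority(d[1], d[2], d[3]), reverse=True)
for name, value, p_loss, cost in ranked:
    score = replication_priority(value, p_loss, cost)
    print(f"{name}: expected loss ${value * p_loss:,.0f}/yr, priority {score:.2f}")
```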

Continue reading

Your Data Protection Strategy: An Evolving Business/IT Conversation

By Paul Gogan — Manager, Cloud Platform Protection and Storage, EMC IT

Creating a data protection strategy for your organization is a little bit like selecting the right insurance policy for your home. It isn’t the most flashy of endeavors and nobody likes paying those insurance premiums, but when a hurricane rips the roof off your house, you’re glad that you took the time to do it right.

Structuring your data protection strategy is not exclusively an IT decision. It's primarily a business decision involving a range of stakeholders, not just IT, with IT providing the products, solutions and processes to execute that strategy based on the value of the data and the objectives of the business.

Data protection is not a one-size-fits-all process, as we in EMC IT have come to learn. The following are best practices and lessons learned that EMC IT uses to create and maintain our data protection strategy.

Continue reading

Adopting the SAP HANA Platform to Power Your SAP Implementation

By Mike Harding — Senior Technical Architect, EMC IT

If your organization is running SAP’s software products and applications, SAP’s in-memory computing platform, SAP HANA, offers tremendous game-changing potential for delivering business value via data analytics. However, as EMC IT recently discovered, there are key steps you should take to pave the way for leveraging this new platform.

EMC is still a relatively new customer of SAP's software products and applications, having gone live with a large-scale greenfield SAP deployment back in July 2012. The implementation program continues to thrive, adding more SAP modules and solutions over the last couple of years and gearing up for a large SAP CRM deployment in 2016.

Continue reading

Service Portfolio Management: Where the Rubber Meets the Road

By KK Krishnakumar — Vice President and Chief IT Architect

When it comes to running your IT operation like a business to deliver IT as a Service (ITaaS) and competing with outside providers, Service Portfolio Management (SPM) is where the rubber meets the road.

SPM is the process by which your IT organization makes sure your service catalog is providing the right mix of services that will meet customers’ needs and deliver business value while at the same time enabling you to be a financially viable service provider. Or, put in plain business terms, SPM is how you make sure you are selling the right product mix to meet your customers’ demands (and needs) at the right price to keep you in business–to keep IT relevant. It is basic supply and demand.
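
To put the supply-and-demand framing in concrete terms, here is a minimal, hypothetical sketch of scoring service catalog entries for financial viability; the service names, costs and prices are made-up illustrations, not EMC IT's actual portfolio model.

```python
# Hypothetical sketch: score catalog services on margin and demand so a
# portfolio review can flag offerings that are not financially viable.
from dataclasses import dataclass

@dataclass
class CatalogService:
    name: str
    unit_cost: float      # cost to deliver one unit of the service
    unit_price: float     # price charged back to the business
    monthly_demand: int   # units consumed per month

    def monthly_margin(self) -> float:
        return (self.unit_price - self.unit_cost) * self.monthly_demand

catalog = [
    CatalogService("virtual_machine_std", unit_cost=42.0, unit_price=55.0, monthly_demand=1200),
    CatalogService("legacy_report_feed",  unit_cost=18.0, unit_price=12.0, monthly_demand=40),
]

for svc in catalog:
    status = "viable" if svc.monthly_margin() > 0 else "review: losing money or retire"
    print(f"{svc.name}: margin ${svc.monthly_margin():,.0f}/month -> {status}")
```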

That said, achieving SPM as you transform your traditional IT operation to ITaaS has its challenges. EMC IT has been in the process of transforming to an ITaaS model for several years now. And just as our transformation journey has been a learning process, so has our journey to effective SPM.

Continue reading

Best Practices for Virtualizing Your Oracle Database – Datastores

By Darryl Smith — Chief Database Architect, EMC IT

First off, my apologies for delaying the last part of this four-part blog series for so long. I have been building a fully automated application platform-as-a-service product for EMC IT that allows us to deploy entire infrastructure stacks in minutes, all fully wired, protected and monitored, but that topic is for another blog.

In my last post, Best Practices For Virtualizing Your Oracle Database With VMware, the best practices were all about the virtual machine itself. This post will focus on VMware's virtual storage layer, called a datastore. A datastore is storage mapped to the physical ESX servers onto which a VM's LUNs, or disks, are provisioned. This is a critical component of any virtual database deployment because it is where the database files reside. It is also a silent killer of performance, because no metric will tell you outright that you have a problem, just unexplained high I/O latencies.
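
To show how that silent latency can be surfaced, here is a generic sketch that flags datastores whose average I/O latency exceeds a threshold, assuming latency samples have already been exported from your monitoring tooling; the datastore names, sample values and threshold are illustrative assumptions, not measurements from EMC IT.

```python
# Hypothetical sketch: flag datastores whose average I/O latency exceeds a
# threshold, given latency samples (in milliseconds) exported from monitoring.

LATENCY_THRESHOLD_MS = 20  # illustrative threshold for database workloads

latency_samples_ms = {
    "datastore_prod_01": [4, 6, 5, 7, 5],
    "datastore_prod_02": [18, 35, 42, 28, 31],  # the "silent killer"
}

def average(samples):
    return sum(samples) / len(samples)

for datastore, samples in latency_samples_ms.items():
    avg = average(samples)
    if avg > LATENCY_THRESHOLD_MS:
        print(f"WARNING {datastore}: avg I/O latency {avg:.1f} ms exceeds {LATENCY_THRESHOLD_MS} ms")
    else:
        print(f"OK      {datastore}: avg I/O latency {avg:.1f} ms")
```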

Continue reading

Best Practices For Virtualizing Your Oracle Database With VMware

By Darryl Smith — Chief Oracle Architect for EMC IT

In this blog (the third in a series on virtualizing Oracle), I will describe the best practices that EMC IT developed as we virtualized our most mission critical and highly transactional databases. You can find the earlier blogs here: [Running Oracle on Virtual Infrastructure Really Pays Off, Best Practices for Virtualizing Your Oracle Database]

There are two schools of thought when you talk to people about virtualization. The first, from the infrastructure point of view, is all about getting more efficiency out of the physical infrastructure layer. You can push this approach to the extreme, but it comes at the cost of higher administrative overhead to constantly troubleshoot performance and functionality issues. The other point of view is mainly about reserving all of the resources of the underlying servers, just in case the application needs them. Fortunately, with VMware vSphere you can have both by taking a more balanced approach.

I promised, in my earlier posts, that I would publish the secret sauce to achieving both great performance and high efficiency when virtualizing Oracle databases – so here it is. I have broken it up into four categories: memory, networking, CPU and storage (vSphere datastores). I will actually save the datastore best practices for the next and last post in this series, due to their complexity.
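
As a purely illustrative sketch of how such practices could be captured and checked per category, the snippet below compares a VM definition against hypothetical memory, networking, CPU and storage settings; the specific values are placeholder assumptions, not the recommendations published in this series.

```python
# Hypothetical sketch: represent virtualization practices by category and check
# a VM definition against them. Values are placeholders, not the recommendations
# published in this series.

best_practices = {
    "memory":     {"memory_reservation": True},     # e.g. reserve database memory
    "networking": {"nic_type": "vmxnet3"},          # paravirtual NIC
    "cpu":        {"overcommit_ratio_max": 2.0},    # keep CPU overcommit modest
    "storage":    {"scsi_adapter": "pvscsi"},       # datastores covered next post
}

vm_config = {
    "memory_reservation": False,
    "nic_type": "vmxnet3",
    "overcommit_ratio_max": 4.0,
    "scsi_adapter": "pvscsi",
}

for category, rules in best_practices.items():
    for setting, expected in rules.items():
        actual = vm_config.get(setting)
        ok = actual <= expected if isinstance(expected, float) else actual == expected
        status = "OK" if ok else f"check ({actual!r} vs {expected!r})"
        print(f"{category}/{setting}: {status}")
```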

Continue reading

Database as a service …

Let us jump feet first into 'database as a service'. So what do we mean by this? We have three database platforms that we can provide 'slices' of to our business users. Oracle and SQL Server have been the traditional platforms we have built upon, and Greenplum is something we have adopted quickly that lends itself to 'database as a service' very well.

How have we done this? Tier, Consolidate, Virtualize

Of course, this has been a journey in its own right. We started off by looking at the database tiering models required, based on business criticality, required availability and I/O profiles. At EMC, we separate the mission-critical applications and databases (as in revenue-impacting and/or customer-facing, typically with stringent RTO/RPO and data loss constraints) from the business-critical applications and databases (impacting sub-processes vs. enterprise processes).

To gain efficiencies of scale, we decided to consolidate the mission-critical and business-critical Oracle databases into 3-node and 8-node Oracle grid architectures (and along the way reduced the number of Oracle versions from nine down to one). We also consolidated and virtualized a number of production and non-production databases on the business-critical side. This consolidation and virtualization exercise reduced the number of databases and database servers from 50+ to single digits. This has provided the basic technology foundation for implementing database as a service on the Oracle platform. The current environment gives us a mechanism by which a large environment can be sliced to serve different needs at different points in time, with standardized and published service levels and predictable scalability and performance.
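
As a simplified illustration of tier-based placement, the sketch below routes a requested database 'slice' to a consolidated grid based on its criticality; the tier names mirror the post, but the classification rule, grid sizes and platform labels are illustrative assumptions.

```python
# Hypothetical sketch: place a requested database "slice" onto a consolidated
# grid based on its tier. Tier names follow the post; the placement rules,
# grid sizes and platform labels are illustrative assumptions.

GRIDS = {
    "mission_critical":  {"nodes": 3, "platform": "Oracle grid"},
    "business_critical": {"nodes": 8, "platform": "Oracle grid"},
}

def classify(revenue_impacting: bool, customer_facing: bool) -> str:
    """Mission critical if it touches revenue or customers, else business critical."""
    return "mission_critical" if (revenue_impacting or customer_facing) else "business_critical"

def provision_slice(db_name: str, revenue_impacting: bool, customer_facing: bool) -> str:
    tier = classify(revenue_impacting, customer_facing)
    grid = GRIDS[tier]
    return f"{db_name} -> {tier} tier on {grid['nodes']}-node {grid['platform']}"

print(provision_slice("order_entry", revenue_impacting=True, customer_facing=True))
print(provision_slice("hr_reporting", revenue_impacting=False, customer_facing=False))
```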

Continue reading