Setting a course for integrating IT operations to bring together Dell and EMC in the largest merger in high-tech history is a bit like climbing a mountain. You need to decide where to start and map out the path you will take to get to the top.
Defining the integration journey for Dell IT (the name of the new combined organization) was the first step as we began molding both companies’ IT resources into one IT organization to serve what was becoming a 140,000-person global company. And while we still have a ways to go in our integration effort, here are some of the methods we used and lessons we have learned so far that might make your IT integration journey a little less daunting.
It’s the holiday season, that time of year when we get together with friends, family and loved ones. Big gatherings are common, fun for some and stressful for others. My family has grown beyond the dining room in my Mom’s house, so we now find ourselves having the big family holiday dinner quite literally in the play room above her garage. Yes, we carry all of the holiday dinner out of the kitchen, up the stairs, through a hallway, and into this room, because it’s the one big room in the house where we can all comfortably sit and eat. So Mom gets a little stressed, but luckily she has a nice, empathetic family. Inefficiency be damned, we optimize for the moment.
IT executives face a similar dilemma in managing day-to-day data center operations, but their customers and user base are a touch less empathetic. Fortunately, we now have the ability to expand off-premises, so we don’t have to buy that big expensive house anymore and can pay for capacity on demand. But we still struggle with how to optimize those expansion capabilities and manage the TCO of our enterprise IT infrastructure assets.
How do you cool today’s modern data centers, which run increasingly high-density, high-performance equipment built to manage exploding amounts of enterprise data? It’s a substantial challenge for data center managers. Fortunately, we at Dell IT have found a way to take the heat off of such cooling demands.
After many months of careful experimentation, we recently determined that by using a cold aisle containment approach in our Durham, N.C., data center, we can safely maintain our equipment at 78 degrees F. This is six degrees warmer than the original design threshold of 72 degrees F. The increase means we can now leverage free-air cooling—air circulated from outside rather than mechanically cooled air—in our data center 80 percent of the time instead of 60 percent. (Think of it as opening a window in your house rather than running the air conditioner.) This will cut our cooling costs by 25 percent.
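The arithmetic behind that gain is worth spelling out. As a rough sketch (the function and figures below simply restate the percentages above; they are not a Dell sizing tool):

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def mechanical_cooling_hours(free_air_fraction):
    """Hours per year the data center must fall back on mechanical cooling."""
    return HOURS_PER_YEAR * (1 - free_air_fraction)

# Original 72 F threshold: free-air cooling 60 percent of the time.
before = mechanical_cooling_hours(0.60)
# New 78 F threshold with cold aisle containment: free-air 80 percent.
after = mechanical_cooling_hours(0.80)

print(f"Mechanical cooling: {before:,.0f} h/yr -> {after:,.0f} h/yr "
      f"({(before - after) / before:.0%} fewer hours)")
```

Mechanical-cooling hours drop from roughly 3,500 to 1,750 per year. Total cooling cost falls by the smaller 25 percent quoted above, presumably because free-air circulation still draws fan power.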
Organizations of every size are in the initial phases of modernizing their IT infrastructure to transform themselves into digital businesses. One of the key themes of that modernization is the use of Flash-based storage technologies in the data center to achieve such benefits as peak performance, consistent operational levels, and higher throughput rates.
However, just because you have Flash storage doesn’t mean your IT application is going to perform miraculously. Getting the most out of your Flash environment takes understanding your application requirements, operating characteristics, SLAs and your entire application ecosystem. It takes careful analysis of your use case, workload patterns, data services, capacity needs, and operational skills to determine which Flash platform will provide the desired results. The goal of this blog is to provide a set of guidelines from which organizations can make informed decisions.
Fortunately, we at Dell IT have a rich Flash storage portfolio and, working with our engineering teams, we have put together the Top 10 Tips and Tricks on how you can optimize your Flash environment in today’s cloud-enabled, Big Data-focused IT world. These platforms, products, tools and best practices will help you make the most of your Flash investments, whatever modernization stage you are at.
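The analysis step described above—weighing workload patterns, capacity needs, and data services against candidate platforms—can be sketched as a simple weighted scoring exercise. The platform names, attributes, and weights below are purely illustrative assumptions, not a Dell EMC sizing methodology:

```python
# Hypothetical workload profile: weights reflect what matters most
# for this use case (an IOPS- and latency-heavy application).
workload = {"iops": 0.4, "latency": 0.3, "capacity": 0.2, "data_services": 0.1}

# Hypothetical candidate platforms, each rated 1-10 per attribute.
platforms = {
    "all_flash_array": {"iops": 9, "latency": 9, "capacity": 6, "data_services": 8},
    "hybrid_array":    {"iops": 6, "latency": 6, "capacity": 9, "data_services": 7},
}

def score(platform_attrs, weights):
    """Weighted fit score: higher means a better match for this workload."""
    return sum(platform_attrs[k] * w for k, w in weights.items())

best = max(platforms, key=lambda name: score(platforms[name], workload))
print(best)  # all_flash_array wins for this IOPS/latency-heavy profile
```

In practice the attribute list would include SLAs, operational skills, and cost, and the ratings would come from benchmarking rather than guesswork; the point is only that the selection criteria should be explicit.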
The data lake has not only allowed IT to open up Big Data to a broader community of internal business users, but is now helping us channel unprecedented amounts of information to Dell EMC customers as well.
Using data lake technology, for example, IT and our Dell EMC business groups forged a groundbreaking partnership to allow customers to leverage Big Data to monitor and proactively manage their IT environments. We created a tool called MyService360, an online solution that gives Dell EMC Support customers and partners easier and faster access to near real-time service information. It features a personalized dashboard that provides customers with a 360-degree view of their environment and customer service experience.
Launched last May, MyService360 only scratches the surface of the potential value that is expected to springboard from leveraging Big Data in the data lake. Having all the data in a centralized location provides easy access and gives developers and data scientists the opportunity to gain data insights that would be extremely difficult to achieve without the data lake. Those insights can then be used to create metrics that we can share to empower our customers.
Whether companies refer to results, outcomes, ROI, or case studies, Big Data and data science are finally moving beyond the hype and proving to deliver dividends over time. Several new Big Data technologies and predictive tools have been launched to meet the growing demand within business and technology groups to harness the constant growth of both structured and unstructured data within and outside of the enterprise. But such technologies and tools won’t be effective unless you define the problem to be addressed.
Most data science initiatives start with a proof of concept (PoC), or in some cases with a proof of value (PoV) if the foundational concept is clearly established. Developing a pipeline of PoCs through working sessions with data scientists, business subject matter experts (SMEs), data experts, and leaders can be extremely helpful. Then prioritize the PoCs by stack-ranking each one on business value and ease of implementation, which factors in data availability, granularity, and quality.
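The stack-ranking step above can be sketched in a few lines. The PoC names, scores, and weights here are invented for illustration; in practice the values would come from the working sessions with SMEs and data experts:

```python
# Hypothetical PoC pipeline, each item scored 1-10 on business value
# and ease of implementation (which folds in data availability,
# granularity, and quality).
pocs = [
    {"name": "churn-prediction", "value": 8, "ease": 7},
    {"name": "demand-forecast",  "value": 9, "ease": 3},
    {"name": "support-triage",   "value": 6, "ease": 9},
]

def priority(poc, value_weight=0.6, ease_weight=0.4):
    """Simple weighted score; tune the weights to your organization."""
    return value_weight * poc["value"] + ease_weight * poc["ease"]

ranked = sorted(pocs, key=priority, reverse=True)
for p in ranked:
    print(p["name"], round(priority(p), 1))
```

A high-value PoC that is hard to implement (poor data quality, coarse granularity) may rank below a modest one that can ship quickly, which is exactly the trade-off stack-ranking is meant to surface.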
As organizations unleash the power of the data lake by giving the business broader access to more and more data, they are facing a growing IT dilemma: how to keep improperly governed or poor-quality data from polluting the data lake.
While IT’s traditional approaches to managing data governance and quality have been quite effective over the years, the magnitude of data in today’s data lake is much larger than traditional data warehouse levels. Traditional tools and tactics are being overwhelmed by Big Data in the lake.
There are, however, strategies that organizations can use to reshape data governance and quality standards in the Big Data world. While our tactics and tools are still evolving, I will share some of the efforts we are developing at EMC IT to keep our data lake clean.
Successful companies like Ford and Netflix have deployed more than just innovative consumer service models; they also use cutting-edge cloud native IT architecture to quickly adapt to changing market demands.
Cloud Native is an architectural principle that helps IT developers write applications in a way that maximizes the use of cloud environments, eliminating the tight coupling of applications to underlying infrastructure. Combined with the right Platform as a Service (PaaS) capabilities, this approach reduces your organization’s time to market, increases responsiveness to customer feedback and cuts operating costs—all the things today’s innovative companies thrive on.
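One small, concrete example of that decoupling is taking configuration from the deployment environment rather than hard-coding infrastructure details, so the same build runs unchanged on a laptop, a PaaS, or any cloud. This is a minimal sketch of the pattern; the variable name and default URL are assumptions, not a Dell EMC standard:

```python
import os

def database_url(default="postgres://localhost:5432/dev"):
    """Resolve the database endpoint from the deployment environment.

    The PaaS injects DATABASE_URL at deploy time, so the application
    never needs to know which infrastructure it is bound to.
    """
    return os.environ.get("DATABASE_URL", default)

print(database_url())
```

When the platform rebinds the service (say, during a failover or a move between clouds), only the injected environment changes; the application code and build artifact stay the same.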
If your organization is struggling with how to keep your enterprise data secure in the cloud, you aren’t alone. The fact is, the modern data center poses some fairly new security challenges and there is no rule book on how to meet them. Even in security, we are learning as we go.
From using analytics to predict how our storage arrays will perform in the field, to engineering product configurations to best meet customers’ future needs, EMC is just beginning to tap into the gold mine of intelligence waiting to be extracted from our new data lake.
In fact, we are currently working on dozens of business use cases that are projected to drive millions in revenue opportunities. And we are just scratching the surface. There’s a lot more data available, more to be harvested, and more analytics to be built out as data scientists and business users hit their stride in exploring a new era of data-driven innovation at EMC.
As I noted in my earlier blog (The Analytics Journey Leading to the Business Data Lake), EMC IT embarked on creating a data lake to transition from traditional business intelligence to advanced analytics more than two years ago. A key focus of this effort was to address the fact that data scientists and business users seeking to leverage our growing amount of data were stifled by the need for such projects to go through IT, which was a costly and slow process that discouraged innovation.
We now have the foundation and tools in place to use data and analytics to create sustainable, long-term competitive differentiation. To get here, we worked closely with EMC affiliate Pivotal Software, Inc. to mature together and leverage the multi-tenancy capabilities of their Big Data Suite.