Venkat Chakravarthi's blog

03 Sep 2019


Five Azure Cloud Migration Challenges That Can Catch You Off Guard

Legacy data warehouses are choking under the weight of new unstructured and fast data sources, and enterprises are struggling to address challenges like secure data access, reliable backup storage, scalability, and increasing ownership costs. Moving to a cloud-based modern data platform like Azure can help organizations address these critical issues by providing the benefits of improved scalability and reduced costs.

However, the journey to modernization is not easy—the process of transformation to Azure can be time-consuming and error-prone. Here are the top five challenges that organizations face when migrating their ETL & EDW workloads to Azure, and ways to address them:

1. Identifying the transformation candidates

One of the critical problems organizations face is understanding the differences between cloud-hosted applications and traditional on-premise deployments. Enterprises need to understand the unique characteristics of Azure as a cloud platform, and then decide which workloads to move to Azure and which to retain on-premise. Automated, system-generated recommendations can help identify transformation candidates, such as expensive, resource-intensive, or poor-performing workloads and rarely used data.
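Such a recommendation can be as simple as a weighted score over workload statistics pulled from query logs and billing data. A minimal sketch in Python; the field names, weights, and thresholds here are illustrative assumptions, not part of any specific tool:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    monthly_cost: float      # on-premise cost attributed to this workload
    cpu_hours: float         # resource consumption over the period
    avg_runtime_min: float   # mean execution time
    runs_per_month: int      # usage frequency

def migration_score(w: Workload) -> float:
    """Weighted heuristic: expensive, resource-hungry, slow, or rarely
    used workloads score higher and are stronger migration candidates."""
    rarely_used = 1.0 if w.runs_per_month < 5 else 0.0
    return (0.4 * w.monthly_cost / 1000
            + 0.3 * w.cpu_hours / 100
            + 0.2 * w.avg_runtime_min / 60
            + 0.1 * rarely_used)

def recommend(workloads, top_n=3):
    """Return the top-N candidates for moving to Azure."""
    return sorted(workloads, key=migration_score, reverse=True)[:top_n]
```

A real assessment engine would derive these inputs automatically from the warehouse's query logs rather than hand-entered figures, but the ranking principle is the same.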

2. Assessing application compatibility

Compatibility issues discovered after going into production can lead to severe service disruptions. Therefore, apart from ensuring compatible databases on Azure, it is essential to verify application compatibility before migrating workloads. Testing applications in the cloud and running a proof of concept before migrating entire workloads are effective ways of checking application compatibility.

3. Managing existing dependencies 

Existing applications do not run in isolation. Therefore, before migrating workloads to the cloud, it is crucial to have a detailed understanding of all application dependencies. Knowing which shadow applications exist, and how often and when each application communicates with other applications and servers, can help minimize interruptions during migration. It is also essential to determine the right processes, capacity planning strategy, and service level agreements before deploying applications in the cloud.
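One way to turn a dependency inventory into a migration plan is to group applications into waves, so that nothing moves before the systems it depends on. A minimal sketch; the application names and the wave heuristic are illustrative, not from the post:

```python
from collections import defaultdict

def migration_waves(dependencies):
    """Group applications into migration waves. `dependencies` maps each
    app to the set of apps it calls; apps with no dependencies go in
    wave 0, their direct consumers in wave 1, and so on."""
    apps = set(dependencies) | {d for deps in dependencies.values() for d in deps}
    wave_of = {}

    def wave(app, seen=()):
        if app in wave_of:
            return wave_of[app]
        if app in seen:
            raise ValueError(f"circular dependency involving {app}")
        deps = dependencies.get(app, set())
        w = 0 if not deps else 1 + max(wave(d, seen + (app,)) for d in deps)
        wave_of[app] = w
        return w

    for app in apps:
        wave(app)
    waves = defaultdict(list)
    for app, w in sorted(wave_of.items()):
        waves[w].append(app)
    return dict(waves)
```

A cycle in the graph is surfaced as an error rather than silently ordered, since circular dependencies usually mean two systems must be migrated together.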

4. Addressing security concerns

Security is one of the major concerns of CIOs when considering moving workloads to the cloud. Deploying security policies across hybrid infrastructures is challenging and needs a centralized security console to implement policies across all endpoints and workloads. To avoid data breaches, enterprises should consider an automated solution that seamlessly integrates with both on-premise and cloud workloads without creating manageability issues or affecting performance. 

5. Planning disaster recovery

Unknown issues might crop up during migration. Therefore, enterprises should have a robust backup and recovery plan in case of an application error during migration. Although catastrophic data loss is unlikely in an Azure migration, preparing for it in advance and having a backup solution in place is essential to ensure minimal damage.

 

Venkat Chakravarthi
VP of Modern Data Architecture Practice
08 Aug 2019


Is hybrid cloud the future of data warehouse modernization?

The need for digital transformation is compelling enterprises to move from traditional data warehouses to the cloud. Gartner estimates that the worldwide public cloud services market will increase by over 17 percent to $206 billion in 2019. IDC forecasts global spending on cloud services and infrastructure to reach $210 billion in 2019. These estimates highlight the increasing adoption of intelligent cloud-based technologies.

However, not all businesses can move to the public cloud, which offers more of ‘one-size-fits-all’ solutions. Enterprises invested in on-premise infrastructure and business models cannot simply discard their entire model and move to the cloud. Therefore, enterprises looking to combine the best of both worlds are adopting hybrid cloud, which is a combination of private, public, and on-premise, allowing more flexibility and data deployment options for enterprise workloads.

The hybrid cloud market is emerging. The RightScale 2019 State of the Cloud Report reveals that 84 percent of respondents have a multi-cloud strategy (vs. 81 percent in 2018), while 58 percent of enterprises (up from 51 percent in 2018) have adopted a hybrid cloud strategy. Gartner predicts that by 2020, 90 percent of enterprises will adopt hybrid cloud, which leverages existing infrastructure investments and technologies and combines them with flexible, expandable cloud resources.

Realizing the trend, Microsoft has enabled cloud services like Azure to work with on-premise solutions for enterprises to transform ETL workloads from legacy data warehouses to the cloud.

Reasons for hybrid cloud adoption

The Cisco Global Cloud Index forecasts that by 2021, 94 percent of all workloads and compute instances will be processed in cloud data centers. While the reasons for cloud adoption may vary from business to business, the primary reasons are as follows:

Scalability: As enterprise data warehouses are exploding with massive volumes and variety of data, scalability and flexibility are becoming prime considerations while choosing a cloud infrastructure.

Cost efficiency: Artificial intelligence enables enterprises to process huge volumes of data in real time, leading to a revolution in areas like predictive analytics, dynamic pricing, and intelligent chatbots. As AI matures, implementation costs will decrease, enabling enterprises to take advantage of AI and ML solutions in the cloud.

Custom integrations: Enterprises can customize their solutions to integrate legacy systems with third-party solutions. Though the solutions are targeted, they are not domain-specific, which gives developers open API access to customize them to suit their requirements.

Data protection: Cloud providers take the security burden off the shoulders of IT departments by establishing rules, privileges, and providing role-based access across the organization to avoid confusion.

Securing the hybrid cloud

With more enterprises adopting hybrid cloud for flexibility and productivity, enterprise data is left in the hands of third-party cloud vendors. As traditional data security cannot protect data in the cloud, companies are leveraging the capabilities of cloud-based security solutions to gain visibility into and control of their SaaS-driven work environments. Enterprises can lessen the risk of a data leak in the cloud by adopting compliance measures like:

Security certifications

Cloud service providers are using Service Organization Control reports (i.e., ISAE 3402 reports) to certify their control environments. Many providers are also opting for security certifications from third parties like ISO/IEC 27001 to define their compliance strategy.

Security audits

Businesses can perform security audits before choosing a cloud provider to ensure that its security policies align with the organization's goals and that confidential data is not at risk. Audits may involve remote testing, onsite visits, or third-party auditors.

Signing a contract with defined compliance

The cloud provider is responsible for managing the controls and implementing the security measures. Therefore, it is crucial for enterprises to define their privacy and compliance requirements and read through the terms and conditions before signing the contract. Businesses need to ensure that all their security requirements are addressed and legally documented in the contractual agreement.

Combined with intelligent technologies and digital business services, including machine learning, artificial intelligence, and the Internet of things (IoT), cloud computing can propel companies into a new dimension of competitiveness. At Impetus, we have the expertise to help you seamlessly transform your workloads to the cloud and achieve operational excellence with the right security, access, and governance controls. To know more about how we can help you transform your existing data warehouse, contact us. 

 

Venkat Chakravarthi
VP of Modern Data Architecture Practice
15 Mar 2019


11 points to consider for a solid EDW transformation strategy

Making the move from EDW to big data can be daunting. A thorough understanding of requirements, possible scenarios, and processes is crucial to ensure a smooth transition. Organizations must also be equipped to deal with risks such as data loss or, even worse, a failed implementation.

But even before enterprises embark on their transformation journey, they must clearly establish the business need and end goal of their EDW transformation. Here are 11 questions you must consider for a solid Enterprise Data Warehouse transformation strategy:

1

WHY?

What is your fundamental business reason for an EDW transformation?

2

COST & CAPACITY

Is your organization looking to free up premium storage capacity and reduce recurring cost of ownership and operations?

3

DEV/TEST CYCLES

How can you avoid the long, complex, and error-prone development, testing, and verification cycles by selecting an automated and validated approach?

4

AGILITY

How can architectural elasticity and scalability complement business priorities, reducing time-to-market and boosting business agility?

5

POSITIONING

How can data-driven assessments and insight-driven recommendations be employed to mitigate risks, save time, and reduce effort?

6

GOING CODE-FREE

How can you overcome the skill set gap and the risks associated with manual logic transformation by going code-free?

7

EXISTING INVESTMENTS

Is it possible to reuse your EDW investments by transforming not just the data but scripts, views, reports, business logic and code, and more?

8

DRIVING INNOVATION

How can you drive the innovation agenda by improving data availability across the enterprise and staying ahead of the churn?

9

A PROVEN SOLUTION

Is there a platform that’s proven, reliable, fully automated, and capable of transforming all the required workloads?

10

OPTIMIZING EFFICIENCY

How can you optimize IT teams’ productivity by automating, simplifying, and de-risking transformation of EDW, ETL, analytical, and reporting workloads to the big data warehouse?

11

HYBRID APPROACH

How can optimized performance be achieved across cloud, on-premise, and hybrid strategies?

 

  An Automated EDW Assessment and Transformation Solution is the answer.

 

Venkat Chakravarthi
VP of Modern Data Architecture Practice
15 Oct 2018


Five Things to Consider Before Performing a Workload Migration

Legacy data warehouse transformation is complicated and risky. A successful migration requires a detailed evaluation of multiple parameters, including queries, tables, sub-queries, database views, users, applications, target query execution engines, and more.

To help guarantee that your workload migration is primed for success, we’ve put together a list of five things to consider before you get started.

Many companies have minimized the risks of a workload migration project by automating the whole process. Automation reduces the time required and removes error-prone manual activities, transforming the experience with a powerful, proven process. (More about that later.)

1. Plan every detail by establishing your modernization objectives, determining the strategy and target blueprint, and defining the roadmap and expected ROI. (More on that below.)

When moving to a big data platform, your migration requires a tried and tested strategy to be sure to deliver the expected outcome.

Before getting started, work with the IT organization and your workload migration partners to define the overall goal of the migration and the project success criteria. We recommend starting by analyzing the current EDW workloads and the requirements for new analytical processing in the modernized environment, then preparing the data to be migrated. (By the way, we call this the assessment phase, and we have tools that can automate this all-important process for you and help ensure that you do not overlook any critical steps.)

2. Make predictions on performance in the new environment

Many companies rely on the broad experience of their partners to help with the approximation here, but we have seen far too many organizations skip this step. To understand how your applications will perform in the new environment, you need to understand the current performance profile inside out. Gathering and understanding performance statistics is mission-critical, and performance optimization can help with huge datasets. However, this step is complicated, and many organizations choose to work with partners even during the planning stage of the workload migration initiative. Successful migration hinges on getting this preparation right, and using a repeatable methodology is crucial at this step. (We've got one.)
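A first cut at a performance profile can be computed directly from the warehouse's query logs. A simplified sketch, assuming each log record carries an application name and a runtime in seconds (the record shape is an assumption for illustration):

```python
import statistics

def performance_profile(log_records):
    """Summarize query-log records into a per-application profile:
    query count, mean runtime, and (approximate) p95 runtime in seconds."""
    by_app = {}
    for rec in log_records:
        by_app.setdefault(rec["app"], []).append(rec["runtime_s"])
    profile = {}
    for app, runtimes in by_app.items():
        runtimes.sort()
        # nearest-rank p95: index into the sorted runtimes
        p95 = runtimes[min(len(runtimes) - 1, int(0.95 * len(runtimes)))]
        profile[app] = {
            "queries": len(runtimes),
            "mean_s": round(statistics.mean(runtimes), 2),
            "p95_s": p95,
        }
    return profile
```

Comparing the same profile computed before and after a pilot migration gives a concrete, repeatable way to check whether the new environment meets the current baseline.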

3. Plan for the costs

It’s essential to know how much your application will cost to run in the new environment. Typically, migration to a big data environment curbs overall cost. Effective cost planning also ensures that the project does not get derailed by inaccurate forecasting or project planning, because a new system running alongside a legacy solution can increase operational cost by 100%. After migration, however, many enterprises realize a 300% ROI.

4. Identify aging or outdated software

Look, we still love, value, and appreciate the legacy data warehouse. It’s been the rock of the enterprise for decades. You don’t have to eliminate it entirely, but if you want the best capabilities and performance, it’s time to migrate to a modernized warehouse architecture. We can help you identify what you need to migrate and we can even define a data-driven roadmap for you.  In fact, we can simplify, optimize, and automate the whole process for you using our proven methodology.

5. Visualize your long-term success

Consider the long-term costs and benefits when making workload migration decisions. For example, investing in a good partner with experience might cost more upfront than tackling this internally, but the key is to build for scale and flexibility in the long term.

Now what?

Impetus Technologies specializes in helping large organizations to modernize their decision support environments. We are helping many organizations realize their EDW workload modernization goals with a successful implementation from start to finish. We do this by applying automation and our many lessons learned from our experience as the proven partner of choice for many leading enterprises. We offer a unique mix of full lifecycle consulting services, software tools, data science capabilities, and technology expertise:

  • Full lifecycle services

  • Technology strategy

  • Solution architecture

  • Production implementation

  • Ongoing support

No need to reinvent the wheel when doing workload migration today. For a free consultation with our experts and to learn more, call us today.

 

Venkat Chakravarthi
VP of Modern Data Architecture Practice
31 Oct 2018


Migrating Workloads from Netezza to Big Data: An Automated Approach

Manual migration can be complicated. Niche technologies, human error, lengthy assessment phases, testing, validation, execution, and the migration itself – these all contribute to the complexity.

The Impetus Workload Migration Solution addresses these complexities and minimizes migration risks.

Our approach: We work with you to ensure 100% execution

The Impetus Workload Migration Solution migrates workloads in four phases:

  • Initial business assessment

  • Execute a pilot workload to prove the ROI

  • End-to-end migration: Using the complete data and workloads

  • Post-migration considerations


Phase 1: Initial business assessment

In the first phase, our automated tools assess business goals and the existing data warehouse environment.

  • The goals are mapped to the pre-defined SLAs and performance benchmarks.

  • The assessment engine processes the query logs for your workloads to perform an in-depth analysis of all your system entities and provides recommendations for migration.

The comprehensive assessment of the Netezza data warehouse does the following:

  • Recommends the ideal migration candidates and their precise positioning on Hadoop

  • Furnishes low-level insights such as the most active users and applications, the most expensive transactions, as well as the most complex, resource-intensive, and frequently used entities

With this automated assessment paradigm, the Impetus Workload Migration Solution defines a clear migration scope and strategy.

Phase 2: PoC with client workloads

"83% of data migration projects either fail or exceed their budgets and schedules." – Gartner

Automating the complete migration process helps keep the project within its cost and schedule budget. The pilot project highlights how the Impetus Workload Migration Solution does that.

The pilot migrates a sample of your workloads (data and logic) in less than a week. Additionally, you can validate the migrated data and metadata by applying numerous aggregate checks using the automated validation framework.
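The internals of the validation framework aside, the aggregate checks it applies are conceptually simple: compare row counts and per-column aggregates between the source table and its migrated copy. An illustrative sketch, not the framework's actual implementation:

```python
def aggregate_checks(source_rows, target_rows, numeric_cols):
    """Compare simple aggregates (row count and per-column sums) between
    a source table and its migrated copy; returns a list of mismatches
    as (check_name, source_value, target_value) tuples."""
    mismatches = []
    if len(source_rows) != len(target_rows):
        mismatches.append(("row_count", len(source_rows), len(target_rows)))
    for col in numeric_cols:
        s = sum(r[col] for r in source_rows)
        t = sum(r[col] for r in target_rows)
        if abs(s - t) > 1e-9:  # tolerance for floating-point drift
            mismatches.append((f"sum({col})", s, t))
    return mismatches
```

In practice both sides of the comparison would be computed by the databases themselves (e.g., `SELECT COUNT(*), SUM(col)` on Netezza and on Hive) so that no data needs to leave either system for validation.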

The pilot phase allows you to do the following:

  • Automatically translate SQL scripts and stored procedures to HQL/Spark SQL

  • Access a built-in library of Netezza-specific User-Defined Functions (UDFs) and keywords to fill in target system gaps

  • Output lock-in free code

  • Avoid long development, testing, and validation cycles

  • Achieve 100% automated logic translation of your scripts using the translation expertise built into our automation engine, and create migration workflows to execute the transformed scripts on your preferred execution engine

  • Reload the transformed data to your Netezza data mart for critical reporting/analytical consumption
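Conceptually, the SQL translation step maps Netezza dialect constructs onto their Hive/Spark SQL equivalents. The sketch below illustrates the idea with a few regex rewrites; a real translator parses the SQL rather than pattern-matching, and the mappings shown are illustrative examples only:

```python
import re

# A few example Netezza-to-Hive/Spark SQL rewrites (illustrative, not
# exhaustive): function renames and a data-type substitution.
REWRITES = [
    (r"\bNVL\s*\(", "coalesce("),
    (r"\bSTRPOS\s*\(", "instr("),           # same (string, substring) arg order
    (r"\bNOW\s*\(\s*\)", "current_timestamp()"),
    (r"\bVARCHAR\s*\(\s*\d+\s*\)", "STRING"),
]

def translate(sql: str) -> str:
    """Apply the dialect rewrites to a SQL string."""
    for pattern, replacement in REWRITES:
        sql = re.sub(pattern, replacement, sql, flags=re.IGNORECASE)
    return sql
```

Regexes break down quickly on nested expressions, string literals, and stored-procedure control flow, which is why automated translation engines work from a parsed representation of the SQL instead.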

Phase 3: End-to-end migration

The end-to-end migration is implemented in small phases. A comprehensive business and workload assessment is conducted to establish business priorities, based on the workload to be migrated.

For instance, optimized data modeling can be performed during this phase to get a performance boost on Hadoop.

The Impetus Workload Migration Solution also lets you:

  • Load data directly from files to Hive tables

  • Migrate schema using the DDL files

  • Check audit logs and lineage

  • Migrate database views

Phase 4: Post-migration considerations

Some essential considerations post-migration are:

  • How often must the data be refreshed?

  • Apart from migration, are there workloads that need to be archived, manipulated, retained, or destroyed?

  • How can the data lake-based infrastructure be capitalized on and aligned with business goals?

  • How would data be governed?

  • How would incremental data and continuous ingestion be handled?

  • What should be the access pattern for various workloads?

  • Does my platform provide a unified view?

Conclusion

The Impetus Workload Migration Solution allows a quick, effortless migration to big data, ensuring faster time-to-value for both short-term and long-term business benefits. It also lets you add new data processing capabilities, address capacity constraints, and replace the traditional tools that can choke your systems.

 

Venkat Chakravarthi
VP of Modern Data Architecture Practice
