
Instrumenting the AWS Cloud Migration Deployment

Every Amazon Web Services (AWS) cloud migration season runs the risk of being followed by a sequel season dominated by buyer’s remorse.

Remorse can set in when a company that signs up to rebuild its applications as AWS cloud native (building and running applications in AWS so they can take full advantage of the platform's benefits) finds itself saddled with a poorly executed cloud deployment project that is over budget and behind schedule. The AWS cloud dream of decreased operating costs and increased scalability turns sour, and the company considers either settling for a lift-and-shift project that compromises the benefits of AWS or scrapping the migration altogether. The good news is that this can be prevented.

The Issue: The Legacy Application

One of the main reasons why AWS cloud migration deployments fail is hidden deep in the underbelly of the business world: the “legacy” application. Typically, the legacy application supports a business-critical function, is hosted in a company-owned data center, and is built on dated architectures and software packages. The application is held together by a patchwork of fixes and a tapestry of temporary remedies layered on over the years. To compound matters, the application is poorly documented and maintained by a group of teams, each of which knows only one specific part of the application rather than the whole. In some cases, the business user community that prioritized requirements and guided the original deployment has long been disbanded, and institutional knowledge about the application is left in the hands of a precious few.

Untangling the Mess

So how does one efficiently untangle the legacy application and decipher the business rules that hold it together? By being use-case and tactic driven. Let’s say the application being migrated is a CRM (Customer Relationship Management) application. The use cases surrounding CRM are broadly supported by customer activation tactics (campaigns) and business intelligence insights (reports). By inventorying these tactics and reports first, and then mapping the data assets needed to support them, the migration team can create a backlog of requirements that can then be prioritized and delivered.

When we start to focus our attention on delivery, the key activity to design and instrument a process around is the mapping step performed by the analyst. As mentioned earlier, the mapping step helps identify the data assets that are required to migrate the tactics or reports to the cloud. In cases where these data assets have not yet been accounted for in the data model or the business rules, data engineers will need to be engaged to address the gaps. Once the gaps are addressed, the tactic or report is “unblocked” and can be developed in the cloud environment.
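The mapping-and-unblocking step described above can be sketched in a few lines of code. This is a minimal illustration, not the article's actual tooling; the data model contents and attribute names are assumptions made up for the example.

```python
# Hypothetical sketch of the gap check an analyst's mapping step performs.
# The data model and attribute names below are illustrative assumptions.

def find_gaps(required_attributes, data_model):
    """Return the attributes a tactic/report needs that the data model lacks."""
    return sorted(set(required_attributes) - data_model)

def is_unblocked(required_attributes, data_model):
    """A tactic or report can be built in the cloud once it has no data gaps."""
    return not find_gaps(required_attributes, data_model)

data_model = {"customer_id", "email", "last_purchase_date"}  # assumed contents
campaign_needs = ["customer_id", "email", "loyalty_tier"]    # assumed tactic

print(find_gaps(campaign_needs, data_model))    # ['loyalty_tier']
print(is_unblocked(campaign_needs, data_model)) # False: engage data engineering
```

Once data engineering adds the missing attribute to the model, the same check returns no gaps and the tactic moves into cloud development.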

As the saying goes, a process that can be measured is one that can be improved. Here are a couple of dashboards that can help you instrument the AWS cloud migration process. The first provides insight into how the data mapping exercise is progressing at the tactic or report level.

Migration Dashboard

In the dashboard above, analysis shows that Object 1 uses 100 data attributes, of which 80 have been mapped to the data model. The remaining 20 will need to be designed and developed by the data engineering team. The dashboard also shows the work effort remaining (30 hours for Object 1) once the gaps are addressed and the object is “unblocked”.
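The dashboard's per-object metrics reduce to simple arithmetic over the mapping inventory. The sketch below reproduces the Object 1 numbers from the article; the record layout and field names are assumptions for illustration.

```python
# Illustrative sketch: computing the dashboard row for each migration object.
# Field names and the record structure are assumptions; only the Object 1
# figures (100 attributes, 80 mapped, 30 hours) come from the article.
objects = [
    {"name": "Object 1", "total_attrs": 100, "mapped_attrs": 80,
     "dev_hours_remaining": 30},
]

for obj in objects:
    gaps = obj["total_attrs"] - obj["mapped_attrs"]          # 20 for Object 1
    pct_mapped = 100 * obj["mapped_attrs"] / obj["total_attrs"]  # 80%
    print(f'{obj["name"]}: {pct_mapped:.0f}% mapped, {gaps} gaps, '
          f'{obj["dev_hours_remaining"]}h of work once unblocked')
```

Tracking these three numbers per tactic or report gives the migration team an at-a-glance view of where data engineering effort is still needed.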

The second dashboard provides insight into how data attributes identified as missing from the data model get validated as confirmed data gaps that need to be addressed by data engineers. The funnel view below acknowledges that institutional knowledge residing within organizational silos can come in handy for devising workarounds and ad hoc solutions while the data engineering team works on a more permanent, longer-term solution.
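A funnel like this is essentially a cumulative count of attributes that have reached each validation stage. The sketch below is one way to tally it; the stage names and the sample attribute statuses are assumptions invented for the example, not from the article.

```python
from collections import Counter

# Hypothetical funnel stages, ordered from first identified to confirmed gap.
# Stage names and sample data are assumptions for illustration only.
STAGES = ["identified_missing", "reviewed_by_sme", "confirmed_gap"]

attribute_status = {
    "loyalty_tier": "confirmed_gap",
    "channel_pref": "reviewed_by_sme",   # an SME found a silo workaround
    "opt_in_date": "identified_missing",
    "ltv_score": "confirmed_gap",
}

counts = Counter(attribute_status.values())

# Each funnel bar counts attributes at that stage or any later stage.
for i, stage in enumerate(STAGES):
    reached = sum(counts[s] for s in STAGES[i:])
    print(f"{stage}: {reached}")
```

Only the attributes that fall through to the bottom of the funnel become confirmed gaps for the data engineering backlog; the rest are absorbed by workarounds along the way.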

Data Gaps Funnel

Parting advice: before starting the migration project, invest time in taking stock of the tactics (campaigns or reports) associated with each of your use cases and assessing whether they are still relevant and performing consistently with your company’s current business realities. Such an assessment provides insight and clarity for defining an MVP (minimum viable product) for your migration project, and it will go a long way toward ensuring that season 2 of your cloud story is less about remorse and more about realizing the benefits of cloud migration.

Merkle recently entered into a strategic collaboration agreement with AWS. Learn more about our AWS capabilities here.
