Speeding Up the Time to Value: Parallel Development

Merkle’s modernized marketing database has expanded its capabilities, using big data to source data for business intelligence, campaign management, and audience management users. However, a new marketing database delivers no value to our clients or to Merkle while it is being implemented; clients are simply waiting for the audience management, campaign management, and business intelligence capabilities they need.

Cutting the time to develop that marketing database is a win for both Merkle and our clients. Most custom database development follows a step-by-step, or “waterfall,” approach. These projects require each component to be individually thought through, designed, and sometimes even coded before the next component can be built. This time-consuming approach is illustrated in the typical database build below:

Custom, Waterfall Development Life Cycle Steps


Step 1: The requirements need to be gathered from the client, often using the much-maligned “blank sheet” (or “what do you want?”) approach, which has four typical outcomes. First, the client spins real and perceived needs far outside the agreed scope, creating time and cost issues from the very beginning of the project. Second, the client senses what they need but takes a very long time to articulate and properly document it, unnecessarily prolonging the process. Third, the client is irritated (“Aren’t you the expert?! You tell me what I need!”), which creates apprehension about whether they chose the right vendor. Fourth, the requirements are developed per plan. One chance in four of getting it right is not a great way to start a project!

Step 2: The data model for the marketing database typically needs to be defined, designed, and built before anything else can get started. Starting with the data model tends to create challenges, because inevitable additions and changes come out of the detailed design work on the campaigns, business intelligence, and data integration. These changes alone can increase a project timeline by 20% to 30%.

Step 3: Data integration (a.k.a. ETL) typically accounts for 70% of the cost and risk of a data project. Custom ETL tends to be designed according to the discipline of the individual designer/developer working on it. Custom ETL efforts are rarely on time and are usually fraught with significant design and coding defects.

Step 4: Customer data integration (CDI) is often performed as a separate activity and is rarely treated as part of the ETL work, missing the opportunity to leverage common activities.

Step 5: Campaign management is another of the “what do you want” activities. Campaign planning requirements and campaign metrics (such as acquisition and retention metrics) are often gathered in an ad hoc manner that hurts design and development as well as the final user acceptance of the campaigns.

Step 6: Custom business intelligence development is the classic “it is what I asked for, but not what I need” scenario. Waterfall BI development typically documents the requirements, then designs and develops the reports with little to no review by the end user until the final output is produced. Often the end users do not see the reports or dashboards until user acceptance testing, which is when any discrepancies between expectation and reality are revealed.

Sixty percent of the time, these projects take longer than expected and cost a client the better part of a year to implement. There is a faster way, and Merkle’s industry-based marketing database assets are the key: key design assumptions defined as best practice and developed into pre-built assets. For example, if the best-practice assets include pre-built third normal form and dimensional data models in Hadoop and SQL Server, component-based data integration assets in Informatica, industry-focused, pre-built campaign management in Red Point, and BI reports and dashboards in Tableau, then it is possible to perform parallel development with multiple work streams.

What is a Work Stream?

When building a house, there are multiple “threads” of work. For example, after a house has been framed, electricians, plumbers, and heating and cooling engineers design and develop their threads, or “work streams.” Each of these disciplines has known engineering best practices, patterns, and assets that allow them to build their components quickly and repeatably. Merkle’s marketing database assets offer the same advantage: we can significantly cut the time, effort, and cost by using the assets in the discovery, design, and development phases of the project in parallel. This is portrayed in the next example.

Asset-based, Work Stream-based Development Life Cycle


Step 1 – Discovery Phase: This project starts very differently from the waterfall approach, with the project team leveraging the pre-built, industry-based Merkle campaign segments and business intelligence dashboards to help scope the best practices for that industry with the client. Rather than taking the “blank sheet” approach, the Merkle team builds instant credibility by showing industry expertise and helping reduce potential scope. The requirements for the BI and campaign management help drive the dimensional model requirements. Using the campaign management and BI requirements, the project’s data modeler would glean and customize the subject areas from the Merkle industry-based data model to build a project conceptual data model. At the same time, a small team of data integration BSAs would perform source system profiling, then map the data from the client’s identified source systems to the Merkle industry-based ODS data model, and finally create a project conceptual data integration model (similar to a process diagram or data flow diagram).
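The source-to-ODS mapping exercise above can be sketched in a few lines. This is a hypothetical illustration: the system names, table names, and subject areas are invented for the example, not drawn from the Merkle data model.

```python
# Sketch of the discovery-phase source-to-target mapping exercise.
# All source systems, tables, and subject areas here are hypothetical.
source_mappings = [
    {"source": "crm.contacts",    "target_subject": "Customer",    "target": "ods.customer"},
    {"source": "crm.accounts",    "target_subject": "Customer",    "target": "ods.household"},
    {"source": "pos.sales",       "target_subject": "Transaction", "target": "ods.order"},
    {"source": "web.clickstream", "target_subject": "Interaction", "target": "ods.touchpoint"},
]

def conceptual_di_model(mappings):
    """Group mappings by ODS subject area -- a rough stand-in for the
    conceptual data integration (process/data flow) diagram."""
    model = {}
    for m in mappings:
        model.setdefault(m["target_subject"], []).append((m["source"], m["target"]))
    return model

for subject, flows in conceptual_di_model(source_mappings).items():
    print(subject, "<-", [src for src, _ in flows])
```

Even this simple grouping makes the downstream work visible: each subject area becomes a candidate work stream with its own extract and load jobs.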

Step 2 – Campaign Management Work Stream: A small team of campaign management architects, BSAs, and developers integrates the segments and measures needed for the campaigns, along with the channel formats and output components, into the campaign management tool. The team then prototypes the campaign management solution with the end user and hardens the templates for production.
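A pre-built campaign segment is essentially a named filter plus its channel and measures. The sketch below is a minimal, hypothetical stand-in for such a template; the field names, thresholds, and channel are illustrative assumptions, not RedPoint configuration.

```python
# Hypothetical sketch of an industry-based campaign segment template.
# Field names, thresholds, and channels are assumptions for illustration.
SEGMENTS = {
    "lapsed_high_value": {
        "filter": lambda c: c["lifetime_value"] > 500 and c["days_since_purchase"] > 180,
        "channel": "email",
        "measures": ["reactivation_rate", "revenue_per_contact"],
    },
}

def select_audience(customers, segment_name):
    """Apply a segment's filter and return the matching customer IDs."""
    seg = SEGMENTS[segment_name]
    return [c["customer_id"] for c in customers if seg["filter"](c)]

customers = [
    {"customer_id": 1, "lifetime_value": 900, "days_since_purchase": 200},
    {"customer_id": 2, "lifetime_value": 120, "days_since_purchase": 300},
]
print(select_audience(customers, "lapsed_high_value"))  # [1]
```

Because the segment definition is data rather than code, prototyping with the end user amounts to adjusting thresholds and measures, not rebuilding logic.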

Step 3 – Business Intelligence Work Stream: The BI architects and BSAs prototype the look and feel of the dashboards and reports. Another session verifies the report navigation and hardens the reports for production. By having the end users deeply involved in two to three prototyping sessions, the issues of “it is what I asked for, but not what I need” are usually mitigated and user acceptance becomes a much easier task.

Step 4 – Customer Data Integration: The CDI work stream architect and BSA determine any required customization of the besting rules and customize or augment the pre-built Connected Recognition (cR) CDI components for the cR input files. The final step is to prototype the data with the end user to ensure that the matching rules meet the business requirements and that any PII data requirements are satisfied.
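A simple way to picture a matching rule set is as an ordered cascade of checks against normalized fields. The sketch below is only an assumption-laden illustration — real cR-CDI components and besting logic are far richer, and the rules and field names here are invented.

```python
# Minimal sketch of deterministic CDI matching rules. These two rules
# and the field names are hypothetical, not the cR-CDI rule set.
def normalize(record):
    """Standardize the fields used for matching."""
    return {
        "email": record.get("email", "").strip().lower(),
        "name":  record.get("name", "").strip().lower(),
        "zip":   record.get("zip", "").strip()[:5],
    }

def is_match(a, b):
    a, b = normalize(a), normalize(b)
    if a["email"] and a["email"] == b["email"]:
        return True                                          # rule 1: exact email
    return a["name"] == b["name"] and a["zip"] == b["zip"]   # rule 2: name + ZIP

print(is_match({"email": "Jo@X.com", "name": "Jo",  "zip": "10001"},
               {"email": "jo@x.com", "name": "Joe", "zip": "10001"}))  # True
```

Prototyping with the end user then becomes a matter of reviewing matched and unmatched pairs and tightening or loosening these rules.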

Step 5 – Data Integration Work Stream: In parallel, the DI architects and DI developers leverage the Merkle component-based data integration blueprint to:

  • Design and develop a source system extract job from the mappings, one per source system
  • Modify and/or extend the data quality and transformation common components jobs
  • Modify and/or extend the ODS subject area jobs
  • Modify and/or extend the ODS-to-MDB job
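The component-based idea behind these bullets can be sketched as a pipeline of reusable stages. The functions below are hypothetical stand-ins mirroring the stage names above, not Informatica components.

```python
# Sketch of a component-based DI blueprint as composable pipeline stages.
# Stage names mirror the bullet list; the bodies are illustrative only.
def extract_source(rows):
    """One extract job per source system (here: a pass-through copy)."""
    return [dict(r) for r in rows]

def standardize_state(rows):
    """A reusable data-quality common component."""
    for r in rows:
        r["state"] = r["state"].strip().upper()
    return rows

def load_ods_customer(rows):
    """An ODS subject-area job: key the rows for the ODS table."""
    return {r["customer_id"]: r for r in rows}

def pipeline(rows, stages):
    for stage in stages:
        rows = stage(rows)
    return rows

ods = pipeline([{"customer_id": 1, "state": " ny "}],
               [extract_source, standardize_state, load_ods_customer])
print(ods[1]["state"])  # NY
```

The payoff is that a new source system only adds an extract stage; the data-quality and subject-area components are modified or extended, not rewritten.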

Step 6 – Data Management Work Stream: The project data modeler has a very different role on these types of projects. Since models already exist, they spend most of their time reconciling changes and extensions to the model, coming primarily from the campaign management and business intelligence work streams, and secondarily from the source system mappings in the data integration work stream. The data modeler plays an important role in quality assurance by verifying data mapping fields and business rules, thus reducing downstream data integration testing errors.
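The data modeler's quality-assurance role above can be made concrete with a small check: verify that every mapped target field actually exists in the model before data integration testing begins. The model and mappings below are hypothetical.

```python
# Hypothetical QA check for the data modeler: flag mappings whose
# target field is missing from the data model, before DI testing.
MODEL = {"ods.customer": {"customer_id", "first_name", "email"}}

mappings = [
    ("crm.contacts.id",    "ods.customer", "customer_id"),
    ("crm.contacts.fname", "ods.customer", "first_name"),
    ("crm.contacts.mail",  "ods.customer", "e_mail"),  # typo to be caught
]

def unmapped_fields(mappings, model):
    """Return every mapping whose target field is not in the model."""
    return [(src, tbl, fld) for src, tbl, fld in mappings
            if fld not in model.get(tbl, set())]

print(unmapped_fields(mappings, MODEL))
```

Catching a mis-typed target field here is far cheaper than discovering it as a failed load during downstream testing.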

It is important to note that while assets help reduce cost, time, and quality errors, there is always some risk in running multiple work streams in parallel. However, there is sufficient evidence that using assets in parallel development saves time and cost for clients, thereby shortening the time to value.

The ability to reduce the time, cost, and risk of a typical database deployment by 30% to 40% can cut a project’s cost by $300,000 to $400,000 per deployment and its timeline by three to four months, providing both the client and Merkle faster time to value.
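The arithmetic behind those figures can be made explicit under an assumed baseline. The baseline of roughly $1,000,000 and ten months below is an illustrative assumption consistent with the savings quoted, not a stated Merkle figure.

```python
# Worked arithmetic for the savings claim, assuming an illustrative
# baseline build of about $1,000,000 and ten months.
baseline_cost, baseline_months = 1_000_000, 10

for reduction in (0.30, 0.40):
    print(f"{reduction:.0%}: save ${baseline_cost * reduction:,.0f} "
          f"and {baseline_months * reduction:.0f} months")
```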

Time to Value in Digital Systems Integration

Digital systems integration is the functional and technical integration of the components of the Connected Customer Platform. It provides the connective tissue between traditional marketing data platforms and adtech. For example, in a media execution solution, there may be a need for data from a DMP to be sourced from an FMP and integrated into an adtech package such as MediaMatch. Each of those integration points has a functional use case and recommended best practices for creating a combined capability among its component parts. The time we save clients by having this expertise and these accelerators again provides faster time to value.

The ideal end state for our clients to realize the full benefits of a connected customer strategy is an instantiated Connected Customer Platform. Our Merkle Data Management Platform and digital systems integration assets will help our clients realize time to value sooner with less cost and risk.

Join the Discussion