Posted by: Kalyanaraman Kuppuswamy on March 31, 2020.

Migration - A Mystical Technology Ritual

Whenever we think of migrating “on premise” assets to the cloud, most of the thinking revolves around data migration alone. This compulsive instinct has to be moderated, as one has to shepherd through several other activities before touching the data. Movement of data is the final big baggage of migration in the agile execution journey/sprint cycles, and the exercise should not be approached as if it were only about data. This is where many SIs may go wrong. Such incongruities should be removed and cleansed from our thoughts.

While various approaches exist, migration of “on premise assets” to the cloud is a combined movement of business data, metadata and the technology assets that carry the data from point A to B – right from the birth of data all the way to reports/dashboards.

In the appliance’s context, for instance, it should never be looked at as a LIFT AND SHIFT of whatever is inside. Instead, it should be approached through a dissection of usage data – what is actually consumed through reports/dashboards/alerts and other manifestations of data consumption.

We should remember to question the legitimacy of the data inside an APPLIANCE BOX, as to whether or not it is 100% consumed. The industry captures data as if there is no tomorrow, and every customer romanced appliances for ages because they could capture tons and tons of data; finally, they realized that the love affair cost money too – forcing them to divorce.

Further, one must interrogate the consumability of data through the prism of how well reports/dashboards really served business units in taking informed decisions. Analysts’ metrics say that only 51% of organizations take decisions truly guided by data – while acknowledging aberrations even among those who claim to have constituted an INSIGHT-driven enterprise.

When transformation starts operating at the consumption layer (post rationalization of reports), the following clarity will emerge:

  • Learn to start migration at the reports/KPI rationalization layer, to cleanse the GARBAGE at the informational delivery layer
  • Remove all reports that have not been accessed (in other words, have aged) for a long period of time (a minimal sketch of such an aging check follows this list)
  • Be confident of those reports/dashboards or other forms of informational consumption assets (be they statistical or cognitive assets) - before travelling deeper into the MIGRATION PILGRIMAGE
  • As a consequence of the above, one would be intelligent enough to know what constitutes the “REPORTS’ SEMANTIC FRAMEWORK” to migrate
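
To make the aging rule above concrete, here is a minimal sketch in Python. It assumes a hypothetical BI usage extract (usage.csv) with report_name and last_accessed columns; the column names and the 180-day threshold are illustrative choices, not a prescription from any particular BI tool.

```python
# Minimal sketch: flag stale reports from a BI usage log before migration.
# Assumes a hypothetical usage extract "usage.csv" with columns
# "report_name" and "last_accessed" (ISO dates); the 180-day aging
# threshold is illustrative.
from datetime import datetime, timedelta
import csv

STALE_AFTER = timedelta(days=180)  # illustrative aging threshold

def stale_reports(usage_file: str, today: datetime) -> list[str]:
    stale = []
    with open(usage_file, newline="") as f:
        for row in csv.DictReader(f):
            last_accessed = datetime.fromisoformat(row["last_accessed"])
            if today - last_accessed > STALE_AFTER:
                stale.append(row["report_name"])
    return stale

if __name__ == "__main__":
    for name in stale_reports("usage.csv", datetime.utcnow()):
        print(f"Candidate for removal before migration: {name}")
```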

This semantic layer migration could be at a department level or at a business unit level; the choice depends on the consulting flavor we bring to the table, based on customer needs. It is followed by the data objects or data sources contributing to, or feeding, those reports/dashboards in the semantic layer.

Trace the ETLs/data ingestion processes that carry data to those data source/database objects, which in turn feed the reports (in other words, the REPORTS SEMANTIC LAYER).
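
A minimal sketch of that backward trace, assuming lineage metadata is available as a simple mapping from each asset to its upstream feeders (all asset names below are hypothetical):

```python
# Minimal sketch of lineage tracing: walk backwards from a report to the
# tables and ETL jobs that feed it. The LINEAGE dictionary is a stand-in
# for whatever metadata/lineage store is actually available.
from collections import deque

# node -> list of upstream nodes (tables, ETL jobs, source systems)
LINEAGE = {
    "sales_dashboard": ["mart.sales_summary"],
    "mart.sales_summary": ["etl.load_sales_summary"],
    "etl.load_sales_summary": ["stg.orders", "stg.customers"],
    "stg.orders": ["src.erp_orders"],
    "stg.customers": ["src.crm_customers"],
}

def upstream_assets(report: str) -> set[str]:
    seen, queue = set(), deque([report])
    while queue:
        node = queue.popleft()
        for parent in LINEAGE.get(node, []):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen

print(sorted(upstream_assets("sales_dashboard")))
```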

Ascertain, at the database/data source layer, whether certain portions or blocks of data are used or not. If they have not been used for a considerable time, discard them from the migration, thereby helping arrest the wastage of storage resources in the target environment. Approaching migration through the prism of data consumption is the wiser tactic. However, a reverse sweep of migratory activities from the source system perspective will complement this top-down method.
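
As a rough illustration of that pruning step, the sketch below assumes a per-table usage extract (table name, size, last-queried date) is available from the source appliance; the sample rows and the one-year idle cut-off are purely illustrative.

```python
# Minimal sketch: split source tables into "migrate" and "leave behind"
# based on how long they have sat unqueried, and estimate the target
# storage avoided. Sample data and the 365-day cut-off are hypothetical.
from datetime import datetime, timedelta

TABLES = [
    {"name": "finance.gl_detail",  "size_gb": 800,  "last_queried": "2020-02-11"},
    {"name": "finance.gl_archive", "size_gb": 4200, "last_queried": "2017-06-03"},
    {"name": "sales.orders",       "size_gb": 950,  "last_queried": "2020-03-28"},
]

def migration_plan(tables, today, max_idle=timedelta(days=365)):
    keep, drop = [], []
    for t in tables:
        idle = today - datetime.fromisoformat(t["last_queried"])
        (drop if idle > max_idle else keep).append(t)
    return keep, drop

keep, drop = migration_plan(TABLES, datetime(2020, 3, 31))
print("Migrate:", [t["name"] for t in keep])
print("Leave behind:", [t["name"] for t in drop],
      "saving", sum(t["size_gb"] for t in drop), "GB in the target")
```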

Post migration, carrying out the following tasks will significantly help in managing the target environment better.

  • Catalog the target system using metadata – covering applications, their execution behavior and their consumption of resources (both at the data layer as well as on the KPI/reports front)
  • Use a test data management capability to help refine testing, in addition to automating it.
  • Enforce stringent data quality (DQ) rules at the source system, to stop/prevent data adulteration
  • Bring in BI GOVERNANCE and DATA GOVERNANCE, to instill sanity over time in the target ecosystem.
  • Institute the ability to cognitively POLICE the target (post-migration) environment, using AI/ML techniques, to arrest and reprimand rebellious/reckless behaviors related to the creation of redundant business KPIs/reports/dashboards/statistical/cognitive assets – which could lead to chaos once again (a sketch of one such redundancy check follows this list).
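
As a taste of what such policing could build on, here is a minimal redundancy check that compares the fields each report consumes and flags heavily overlapping pairs. The catalog and the Jaccard-overlap heuristic are illustrative stand-ins; the cognitive/AI-ML policing described above would go much further (usage patterns, semantics, anomaly detection).

```python
# Minimal sketch of a redundancy check over a (hypothetical) report
# catalog: flag report pairs whose consumed fields overlap heavily.
from itertools import combinations

REPORT_FIELDS = {
    "daily_sales":      {"order_id", "order_date", "net_amount", "region"},
    "regional_revenue": {"order_date", "net_amount", "region"},
    "churn_summary":    {"customer_id", "churn_flag", "segment"},
}

def redundant_pairs(catalog, threshold=0.7):
    for (a, fa), (b, fb) in combinations(catalog.items(), 2):
        overlap = len(fa & fb) / len(fa | fb)  # Jaccard similarity
        if overlap >= threshold:
            yield a, b, round(overlap, 2)

for a, b, score in redundant_pairs(REPORT_FIELDS):
    print(f"Possible duplicate KPIs/reports: {a} vs {b} (overlap {score})")
```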

The above discipline makes the target environment perfectly suited for a DataOps-driven analytics approach. DataOps is all about ensuring maximized throughput from analytics by creating a culture of data transparency, traceability and trust. Since metadata-driven governance paves the way for such a vibrant DNA, efficiency and agility become its natural consequence.

Another close cousin aiding DataOps is the establishment of BI on BI. It emboldens organizations to monitor the usage of “business insights consumption” over the cloud, which inevitably wraps the intake of the underlying infra/software assets along with it. The industry is getting into the habit of measuring ROI from established IT faculties, and “BI on BI” in the data/analytics context fits the bill perfectly – especially in the “post cloud migration” context.
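
A minimal “BI on BI” sketch, assuming the target platform exposes report view counts and a way to attribute cloud spend to each report (both inputs below are hypothetical stand-ins for real usage telemetry and billing exports):

```python
# Minimal "BI on BI" sketch: combine report view counts with the cloud
# spend attributed to each report and surface cost per consumption.
views = {"daily_sales": 1240, "regional_revenue": 37, "churn_summary": 0}
monthly_cost_usd = {"daily_sales": 310.0, "regional_revenue": 290.0, "churn_summary": 120.0}

for report, n in sorted(views.items(), key=lambda kv: kv[1]):
    cost = monthly_cost_usd.get(report, 0.0)
    if n:
        print(f"{report}: {n} views, ${cost:.0f}/month, ${cost / n:.2f} per view")
    else:
        print(f"{report}: never viewed, ${cost:.0f}/month spent with no consumption")
```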

In a recent study, Gartner said that growth in enterprise IT spending on cloud-based offerings will be faster than growth in traditional (non-cloud) IT offerings through 2022. A high percentage of IT spending dedicated to cloud adoption is indicative of where the next-generation, disruptive business models will emerge.

About The Author

Kalyanaraman Kuppuswamy

With over 20 years of industry experience, Kalyanaraman Kuppuswamy is a seasoned data analytics professional. A hardcore, tech-savvy specialist, Kalyan has held a variety of technical and business roles across delivery, presales and consulting - spanning verticals like Telecom, Hi-Tech, Semiconductor, Insurance, Banking and Manufacturing.

Kalyan currently heads IPs/Solutions/Platforms for Tech Mahindra’s Data and Analytics competency, in pursuit of his endeavor to build “thought-provoking” differentiators for Tech Mahindra.

Among Kalyan’s many specialties is his ability to pen stimulating technical blogs and white papers (in the data/analytics space) on emerging technology trends and “business theme drifts”. He holds an MS degree in Software Engineering from Fairfield University, USA.