Sunday, October 5, 2014

Digital Data Quality – Can You Improve It Through Auditing and Governance?

I have heard someone say that life was much simpler when Apple was only a fruit and we had to worry about just one web analytics tool for our web site. Apple, the company, and the proliferation of digital technologies have turned everything upside down – in a good and exciting way.

Talking about digital data quality is not considered as hot as talking about analytics, big data or machine learning. That will probably never change! But I am sure companies will do more about digital data quality in the future than they do today. In the end, data quality is one of the underlying foundations of better reporting, analytics and decision making.

Everybody agrees on the importance of data quality, but collectively we probably still need to figure out an overall strategy from a data quality perspective. Who can argue against the need for your digital data to be accurate, relevant, complete, timely, etc.? Haven’t we worked on data quality, data lineage and data stewardship for more than a decade in the offline world of data warehouses, MDM and CRM? We still have, however, a long way to go in the digital world as far as data quality is concerned.

We have definitely become smarter, more practical and more sensitive about data quality. We have also learned to live with imperfections and realize that a 5-10% discrepancy is probably acceptable given the huge effort required to fix it in the real world. We are also learning how to improve data quality iteratively: understanding the gaps and shortcomings in the process, developing feedback loops, and so on. But a more methodical and structured approach is required to improve data quality, instead of reacting only after a glaring discrepancy shows up in a critical report.

Before we talk about data quality, we need to talk about the digital data collection process. You collect data with the help of tags. For those who are not aware, a tag is simply a chunk of code, usually JavaScript, that performs the task of data collection for various purposes. Tagging has historically been painful: it was very hard to make it agile from both business and IT perspectives, given the constraints of release cycles in an enterprise environment. Unfortunately, the complexity of managing it manually without a tool, and the effort behind it, is sometimes not recognized. All of this has been changing in the last few years with a new category of tools called Tag Management Systems.
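
To make the idea of a tag concrete, here is a minimal sketch of one, written as TypeScript against the standard browser DOM. The vendor endpoint and parameter names are made up for illustration – real vendor tags differ in detail but follow the same pattern: gather a few values from the page, encode them into a URL, and fire a one-pixel image request.

    // Minimal sketch of an analytics tag (hypothetical endpoint and parameters).
    function fireAnalyticsTag(): void {
      const params = new URLSearchParams({
        page: window.location.pathname,          // which page was viewed
        ref: document.referrer,                  // where the visitor came from
        res: `${screen.width}x${screen.height}`, // screen resolution
        t: Date.now().toString(),                // cache-buster
      });
      // The classic collection mechanism: a 1x1 "pixel" GET request.
      const pixel = new Image(1, 1);
      pixel.src = `https://collect.example-vendor.com/b?${params.toString()}`;
    }

    fireAnalyticsTag();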

For those who are new to this area, a Tag Management System lets marketers easily insert snippets of code – the tags mentioned above – which enable third-party tracking, analysis, reporting, remarketing, conversion tracking, optimization, and much more. A marketer can log in to the tool and add, edit or delete tags as they see fit, without needing code-level access. To make it even simpler, these technologies are already integrated with other ad-tech companies, so the marketer can just tick a box to activate the appropriate tags. Tag managers give marketing control over their own little space on a web page. For example, the 5 to 20 tags you might have on any given page are replaced by a single container; that container holds code which listens to rules defined in the tag manager's backend, dictating when to fire which tags. Tag Management Systems can also do some cool things, like reducing the cost of a proof of concept for a new tool in your ecosystem and correcting campaign issues in real time, among many others. Companies are also implementing innovative techniques like the data layer, which separates data collection, manipulation and delivery from the web page structure. A data layer defines events and information uniformly across the site – basically a consistent place to store and retrieve data values so that different tags can easily and quickly find the same piece of data.
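
Here is a minimal sketch of the data layer idea in TypeScript. The event and attribute names are illustrative, not any specific vendor's API; the point is that the page publishes facts about itself in one consistent place and knows nothing about the tags that consume them.

    // Illustrative data layer: a uniform place for page data and events.
    type DataLayerEvent = { event: string } & Record<string, unknown>;

    const dataLayer: DataLayerEvent[] = [];

    // The page pushes facts about itself; it knows nothing about the tags.
    dataLayer.push({
      event: "pageView",
      pageCategory: "product-detail",
      productId: "SKU-1234",
      price: 49.99,
    });

    // A container script then matches backend-defined rules against these
    // events and fires the appropriate vendor tags.
    function onDataLayerEvent(e: DataLayerEvent): void {
      if (e.event === "pageView" && e.pageCategory === "product-detail") {
        // fire product-analytics and remarketing tags here
      }
    }

    dataLayer.forEach(onDataLayerEvent);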

The world of tag management continues to gain traction. The recent $47.2 million funding of Tealium – one of the leading tag management vendors – validates the upside of this new space. There are a dozen vendors – Tealium, TagMan, Ensighten, DC Storm, SiteTagger, Google Tag Manager, etc. – each with their own strengths and weaknesses, often rated on the number of tags they can support, among other criteria.

But before you plan your tag management strategy, don’t you need to think about the existing tagging gaps and errors in your web pages? Even if you bought a brand new tag management tool today, you would still have existing tags throughout your website, probably deployed over the last decade. The most common tagging problems are incomplete deployment, incorrect configuration, missing configuration, duplication, and old tags that were never removed. How do you know where to start when implementing your new tag management tool? Basically, you need to audit your existing web page landscape in order to develop any approach.

I still think the importance of tag auditing is not well understood: first, because it is new, and second, because the process around tag auditing needs a broader conversation across different business units. Using a tool like ObservePoint, founded by John Pestana (co-founder of Omniture), to audit your website can be a good option. There are a few other tools in the marketplace if you want to explore. Tag auditing is about validating tags and identifying tag placement and configuration problems. It helps you find missing tags, tags that are not firing, and incorrect variables and parameters. And it presents its findings in easy-to-read reports.
There are two components of a tag audit – a site scan and monitoring. During the scan, the system tests the web site and catalogs tag data for every page. The monitoring component – also called "simulations" – is put into place to detect the sudden disappearance of tags or unexpected tag variable changes.
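
Commercial auditors execute pages in a real browser and inspect the network requests they generate, but the core scan idea can be sketched in a few lines. The following TypeScript (Node 18+, which ships a built-in fetch) only checks raw HTML for expected tag signatures; the URLs and signatures are placeholders.

    // Minimal tag-scan sketch: flag missing and duplicated tags per page.
    const pages = ["https://www.example.com/", "https://www.example.com/products"];

    // Hypothetical signatures: substrings present when a given tag is deployed.
    const expectedTags: Record<string, string> = {
      analytics: "collect.example-vendor.com",
      remarketing: "ads.example-partner.com",
    };

    async function scan(): Promise<void> {
      for (const url of pages) {
        const html = await (await fetch(url)).text();
        for (const [name, signature] of Object.entries(expectedTags)) {
          const count = html.split(signature).length - 1; // occurrences
          if (count === 0) console.log(`${url}: MISSING tag "${name}"`);
          else if (count > 1) console.log(`${url}: DUPLICATE tag "${name}" (${count}x)`);
        }
      }
    }

    scan().catch(console.error);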

A tag auditing tool like ObservePoint enhances your investment in your tag management solution, no matter which tag management tool you own. An improperly deployed data collector can result in broken web pages, loss of site traffic and, subsequently, lost sales – you definitely want to prevent all of that from happening. If you worry about compliance, there is also a risk of data leakage if unauthorized tags are deployed or old ones are not removed by mistake.

Even if you own ObservePoint and generate reports with it, you still need to define a digital data quality governance process to make it work. There is no standard way of doing it, since all of this is new and every organization is different. There are many questions to answer! Who owns fixing the various issues identified in the audit report? How do you prioritize? How do you know it is working well? Do you know who owns digital data quality in your organization? Should the team that collects and normalizes data be responsible for its quality? Should your analytics team, IT team, QA team or the individual business units be accountable for each type of tag and data? What should be the frequency of your audits? Who decides whether an alert from your tag monitoring system is a critical issue or not?

We do need to recognize the learning curve in the world of tag management and auditing, as it is still not mature. Enhancing digital data quality is a hard and often tedious job for the people who do the work. But it can be very rewarding in the long run! Also, improving digital data quality is a collective responsibility – not something that can be owned solely by your analytics or IT team, though they can certainly lead the effort.

Monday, September 22, 2014

Online Video Personalization - Why Is It Becoming So Important?

All of us have heard the phrase, “A picture is worth a thousand words.” But according to Dr. James McQuivey of Forrester Research, a minute of video is worth 1.8 million words. I am not aware of the exact math Dr. McQuivey used to reach that conclusion, but through experience we know that video does leave a lasting impression.

In the last few years we have seen unprecedented growth in online video. According to Cisco, 74% of all internet traffic in 2017 will be video. And there are other cool things we keep hearing about video:
· Web pages with video are 53 times more likely to appear on the first page of Google.
· 52% of marketers say that online video has “among the best ROI.”
· Visitors who view videos are 2 times more likely to purchase.
· Every survey will tell you that there is a significant increase in digital video production budgets.
· 10% of all video viewing in 2013 was online. And more and more people are using mobiles and tablets to watch videos, for longer durations.

But most companies in the video space still face plenty of challenges:
· They don’t really understand how video contributes to sales in a quantifiable way. It is even more difficult to understand the impact of video on sales if you are also a brick-and-mortar retailer, i.e. you have some kind of omni-channel strategy.
· Most of them have very little idea how many online videos they should produce every year. There is no quantifiable rationale behind those decisions.
· Online video still has volume challenges when it comes to professional-grade content.
· It is not well understood how to target videos – basically, video personalization: using data to identify the consumer and serve the relevant video based on context. We understand the viewing habits of large sets of people at a high level, but most of us remain clueless about the viewing habits of individual users.

Personalization of online video is a big opportunity for online retailers from a sales and marketing perspective. It should be a big part of their overall personalization strategy. They need to tie their audience profiles and context to their video inventory with the help of analytics. A targeted video has all the benefits of personalized content, and in addition it increases engagement time. It is more likely to influence a potential customer to buy from you than any other type of content on your website. It is also more economical to target videos through the online channel. On top of that, video always has the potential to go viral.

Personalization of online videos should also take the customer or prospect journey into account. If the prospect is just considering your products or services, target a product or service video or a case study. For conversion, a presenter-led video makes more sense; for loyalty, a training or customer-service video might be appropriate for a returning customer.
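
A first cut of this kind of journey-based targeting can be as simple as a lookup table. The stage and video-type names below are illustrative; a real system would rank the candidates with analytics rather than picking the first one.

    // Sketch of journey-stage video targeting (names are illustrative).
    type JourneyStage = "consideration" | "conversion" | "loyalty";
    type VideoType = "product-overview" | "case-study" | "presenter-led" | "how-to";

    // One rule per stage, mirroring the mapping described above.
    const videosForStage: Record<JourneyStage, VideoType[]> = {
      consideration: ["product-overview", "case-study"],
      conversion: ["presenter-led"],
      loyalty: ["how-to"],
    };

    function pickVideo(stage: JourneyStage): VideoType {
      return videosForStage[stage][0]; // naive: take the top candidate
    }

    console.log(pickVideo("consideration")); // -> "product-overview"
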
It is also rare to see web properties recommending videos the way YouTube or Netflix do. Why?
 
Utilization of video is probably better understood on the self-service side when it comes to “how to” videos, where it delivers significant savings by reducing calls to call centers.
 
Maybe one of the reasons for the lack of a video personalization strategy is that most companies are still trying to figure out the foundational pieces of online video – things relating to video management, infrastructure and performance, SEO, video players, e-commerce support, mobile support, responsive design, etc. They rely on one of the video platform vendors like Brightcove, Ooyala, Adobe/Scene7, Kaltura, KIT digital or Invodo to take care of those foundational pieces. Personalization of video is generally not on the near-term roadmap. Most of these vendors do a good job of building the foundational framework for you and have their strengths in certain areas. You will have to figure out an evaluation process to choose one of them.
 
 
HTML5 players, Schema.org markup for videos, mobile/tablet support and video interactivity are among the important things happening in the online video world. Companies like Adways and Wirewax offer great options to integrate with your video platform if you are interested in producing interactive videos.
 
But despite this new breed of video vendors, in the end you will have to own your video personalization strategy, because it is based on the data you have about your visitors. It will also depend on the analytical and machine learning capabilities of your organization, and on how efficiently you can make different teams – content production, IT, CRM, creative and analytics – work together.
 
There are also other types of personalization happening on the video front. AT&T has tried using personalized video to explain mobile bills to its customers, using technology developed at its AT&T Foundry innovation center in Ra’anana, Israel. It will be interesting to see how this trend evolves. We will probably see this kind of personalization applied to the re-targeting of video ads.
 
In the end, a personalization- and analytics-driven video strategy is still a distant thought for most companies. But companies that are serious about offering the best online experience don’t have all the time in the world to figure out their video personalization strategy. Most large companies have all the tools, content, data infrastructure and resources in place. They just need champions, the right thought leadership and executive sponsorship to make it happen. Video personalization shouldn’t be an afterthought in your personalization strategy – it should be one of the key drivers.

Saturday, March 15, 2014

Predictive Analytics Conference

Predictive analytics has come a long way in the last few years. It continues to get simpler, and the consensus is that you don't need a Ph.D. to practice it. It is no longer a backroom activity, and modern tools are allowing it to be deployed to a much larger population within an organization. You also don't need to wait for your data warehouse to be in place before you can start predictive analytics. The tools continue to make it much easier to visualize and explain the results.

In general, the field is evolving at a rapid pace. Still, most organizations, with a few exceptions, don't have a strategy and roadmap in place. More awareness is required to understand use cases and best practices. Yes, there are more predictive analytics books on this topic, which again validates the popularity of the discipline. But a conference like Predictive Analytics World is a good place to start, offering a great opportunity to network and understand how others are benefiting from this discipline.

I am a blog partner with Predictive Analytics World, and you can get a 15% discount using the code "PMBP14." Here are some upcoming dates:
Predictive Analytics World Toronto – May 12-15, 2014 – www.pawcon.com/toronto/2014
Predictive Analytics World Chicago – June 16-19, 2014 – www.pawcon.com/chicago/2014
Predictive Analytics World Manufacturing Chicago – June 17-18, 2014 – http://www.pawcon.com/mfg/2014/

Sunday, December 23, 2012

Web Content Management: An Opportunity to Integrate with Big Data Analytics and Machine Learning!

The Web Content Management (WCM) space has come a long way in maturity from the mid-90s to where it is today. Many companies, especially the ones that rely on their online business for revenue generation, are probably going through the third or fourth iteration of overhauling their WCM solution. There are hundreds of commercial and open source offerings in this space, catering to all sizes and different needs. The WCM vendors have also done a good job of keeping pace with customers' demands. In fact, WCM has graduated to being part of Web Experience Management (WEM) or Customer Experience Management (CXM), where WCM is one aspect of the solution along with web analytics, optimization, targeting, A/B testing, social media integration, personalization and marketing integration. Overall, the space is mature, and all the major players – HP, Adobe, Oracle, OpenText, SDL, IBM, Microsoft and hundreds of others – are in it. All of them are trying hard to find their own niche and differentiation, as it is a crowded space.

CMS tools have also become more business-friendly and no longer require IT involvement for every small task. The business now has a better ability to handle tasks like content creation, presentation management and workflow without IT intervention. These tools continue to offer advanced features like in-context editing, multi-site management, multi-channel delivery, social media integration, globalization, etc., though there is definitely room for improvement in multi-channel delivery, globalization and social media. All leading products support standards like J2EE/.NET and have a cloud offering, so that is no longer a competitive differentiation. Mobile is definitely a must-have now, and everyone – CMS vendors as well as businesses – is trying to keep pace with the increasing number of varied mobile devices in the market. There are definitely more channels and platforms to worry about. Content is no longer thought of as just a web page! None of this means that, because of these advancements in tools and technology, all projects run smoothly. They don't, and the main issues are lack of CMS experience, inability to structure CMS projects correctly from a timeline perspective, and in some cases a disconnect between business and IT. I also wonder where we go from here as far as the capabilities of CMS tools are concerned.

The vendors' focus on building an integrated digital marketing experience with content at the core is the right thing, but one important aspect is still lacking: the ability to offer recommendations and deep personalization as part of the experience. The bar is set by Amazon.com, and it is the model everyone can look up to. It is not an easy thing to build, but CMS vendors need to think of ways to offer templates that can enable it. They also need to think of innovative ways to integrate with all the work happening in the big data analytics world. In my experience, companies have a very good understanding of the level of personalization and recommendation they should provide to their end users. They just don't know how! They are constrained by the limitations of their IT departments in delivering this functionality. IT also has a big challenge, as it doesn't have access to out-of-the-box tools or an easier path to provide such functionality. Analytics and machine learning – the technology behind deep personalization and recommendations – are not well understood by many IT departments. Such systems are also not easy to build, as an Amazon.com-scale investment in recommendation technology is not easily affordable. Something to keep in mind for CMS tool vendors!
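
To make the building block concrete, here is a minimal sketch of item-to-item collaborative filtering – the general approach Amazon.com popularized – in TypeScript, using cosine similarity over per-user interaction counts. The data is invented for illustration; a production system would work from real behavioral data and a far larger catalog.

    // Item-to-item collaborative filtering sketch (illustrative data).
    // Each item is a vector of interaction counts for users u1..u4;
    // items whose vectors point the same way get recommended together.
    const views: Record<string, number[]> = {
      "article-a": [3, 0, 1, 2],
      "article-b": [2, 0, 0, 3],
      "article-c": [0, 4, 1, 0],
    };

    function cosine(a: number[], b: number[]): number {
      let dot = 0, na = 0, nb = 0;
      for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        na += a[i] * a[i];
        nb += b[i] * b[i];
      }
      return na && nb ? dot / Math.sqrt(na * nb) : 0;
    }

    // Recommend the k items most similar to the one being viewed.
    function similarTo(item: string, k = 2): string[] {
      return Object.keys(views)
        .filter((other) => other !== item)
        .sort((x, y) => cosine(views[y], views[item]) - cosine(views[x], views[item]))
        .slice(0, k);
    }

    console.log(similarTo("article-a")); // -> ["article-b", "article-c"]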

Thursday, September 29, 2011

Data Without Borders: Data can be a burden if it is not set free!

Wikipedia notes that philanthropy etymologically means "the love of humanity" – love in the sense of caring for, nourishing, developing, or enhancing. Historically, philanthropy has always been associated with generous donations of money. It will continue to be associated with generous donations, but some professionals can make a bigger impact by donating their skill set than their money. Yes, I am talking about the skill set of data science! "Data science" is a relatively newly coined term, probably originating with data geeks working on hard data problems at companies like LinkedIn, Facebook and other technology firms that needed these experts to make sense of, and draw insights from, the vast amounts of data being produced every day. Data Without Borders, a newly founded organization, seeks to match non-profits in need of data analysis with freelance and pro bono data scientists who can help them with data collection, analysis, visualization, or decision support. The concept is brilliant and makes sense!

There are various initiatives out there where technology is being leveraged creatively to help non-profit organizations. The Bill and Melinda Gates Foundation recently funded a new digital-media hub called ViewChange.org. The hub uses semantic technology to create a platform that combines the video-sharing power of YouTube with the open information of Wikipedia and the mission of your favorite advocacy organization. I have written about it in more detail in a post titled "Philanthropy Goes Semantic." Ushahidi, which started as a simple web site mapping reports of violence in Kenya, is another non-profit tech company; it specializes in developing free and open source software for information collection, visualization and interactive mapping. To my knowledge, Hans Rosling, a medical doctor and statistician with decades of work studying outbreaks in Africa, is probably the first data science philanthropist. He co-founded the Gapminder Foundation, which developed the Trendalyzer software, acquired by Google, that converts international statistics into moving, interactive graphics. His TED presentation about the best stats you have ever seen is worth watching.

The genesis of the idea behind "Data Without Borders" is to match NGOs – who are sitting on lots of data that nobody looks at because of resource and budget constraints – with data scientists who have the energy, time and passion to make sense of this data. The timing of this initiative couldn't be better, because data scientists now have a common and noble cause to rally behind! It is the beginning of a powerful vision, but it will surely have its own challenges. Having some experience with an NGO myself, I can say that sustaining the enthusiasm and commitment of data scientists over the long term can be challenging. We are all aware that data scientists are going to be among the most sought-after, busiest and highest-paid professionals of the next decade! So, in this context, I would choose a good data scientist with more commitment over a rock-star data scientist. Also, a weekend data hackathon probably won't be enough, because data science is an iterative process and will require ongoing engagement. It is still not clear to me why there are no initiatives like open government data for NGOs, to enable powerful data mashups. I am aware of new standards like IATI, but that is more about aid spending by governments. In this context, I believe that too much data can be a burden if it is not set free and used effectively. Ideally, in the case of NGOs, open data shouldn't have political or privacy barriers. In the end, the co-founders of "Data Without Borders" will need all possible support, structure and maybe funding to be successful in their mission. Winston Churchill rightly said, "We make a living by what we get, but we make a life by what we give."