
The Evolution of the Web and Digital Experience Monitoring

Digital Experience Monitoring (DEM) is so business-critical that it's featured in a Gartner Magic Quadrant. At the same time, it is so new that even Wikipedia does not know what it is.

(Related read: What is Digital Experience Monitoring (and Why Should You Care)?)

Why is that? Monitoring as a practice is as old as dirt, and digital experiences have been around since at least the birth of the web in the 1990s. The next logical question is: How did developers and operations engineers (pre-DevOps) fix application performance anomalies in the past?

The answer reveals many fascinating stories, many of which come with lessons on how to improve digital experiences in the future.

APPLICATIONS ARE BUILT ON HOPE

The distance between a development sandbox and a production environment can be tiny or even non-existent. Sometimes they are virtual machines on the same box. Yet the experience of using an app in the sandbox is normally vastly different from using it in the wild. In the early days of app development, the best you could do was test locally and hope for the best.

It was clear that there had to be a better way to test if a fix addressed the problem in production and if that fix triggered other problems elsewhere in the code. That was the beginning of regression testing.

Performance variation in web-based apps often turned out to be related to the user's OS and choice of browser, which meant manually testing against each combination. In response, developers put their heads together and co-created automated testing platforms like Selenium.

Selenium supports test scripts in the most common languages, such as C#, Java, PHP, Python, and Ruby. Those tests then report on how a web app runs on virtually any browser and OS. Selenium became popular in part because it is open source and free to use under the Apache 2.0 license.
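To make that concrete, here is a minimal sketch of what a Selenium test can look like in Python. The URL, the title check, and the headless Chrome setup are illustrative assumptions, not details from this article or from any particular product.

    # Minimal sketch of a Selenium browser test in Python (Selenium 4 API).
    # The URL and the title assertion are hypothetical placeholders.
    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    options = Options()
    options.add_argument("--headless=new")  # run Chrome without opening a window

    driver = webdriver.Chrome(options=options)
    try:
        driver.get("https://example.com/login")   # placeholder URL
        assert "Login" in driver.title, "unexpected page title"
        print(f"Smoke test passed on {driver.name}")
    finally:
        driver.quit()                             # always release the browser

Pointing the same script at webdriver.Firefox() or webdriver.Edge() is how teams cover the browser-and-OS matrix without repeating the manual testing described above.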

A WORLD LIT ONLY BY BROWSERS

Before the mobile revolution, which kicked off with the first iPhone at Macworld 2007, Internet Explorer (IE) still ruled the browser market. In 2004, IE6 held onto around 95 percent of that market because Microsoft Windows was still the predominant OS for businesses.

At the time, new browsers like Firefox were coming online with the most advanced features built in. These included tabbed browsing, spell check, bookmarks, download managers, geolocators and address bar search.

Although Firefox saw 60 million downloads within nine months, IE6 was still the standard. Developers had to ensure their web apps worked on IE6 and couldn't afford to take advantage of the latest advances in browser technology. This is a good example of how the digital experience tends to be limited by the delivery channel rather than by what is possible in development. IE would soon lose relevance, however, as a result of the mobile revolution.

HOW APPS BROKE AWAY FROM THE WEB

There were smartphones long before the iPhone, and there were apps long before the App Store. What shook the world in 2007 was that the iPhone took 13 percent of the market within a year, eclipsing the BlackBerry and pushing Google into releasing the Android OS for free. Microsoft's Windows Mobile would soon be out of commission, and the app ecosystem exploded. By 2013, there were 2 million apps: a million in Apple's App Store and another million on Google Play.

Native mobile app development very quickly caught up to web app development, yet both delivery pathways continued to be problematic from the standpoint of application performance monitoring (APM). While web apps continued to face challenges from disruption in the browser market, native apps introduced a new problem.

By minimizing their dependence on network connectivity, native apps could run blazingly fast on the tiny processors inside phones. Users became accustomed to, and then psychologically dependent on, that kind of speed. Once you have driven a Lamborghini, it is frustrating to drive a Kia.

While there are various metrics on the ratio of loading times to revenue loss, Kissmetrics estimated that a one-second page delay costs a busy e-commerce site about $2.5 million in lost sales per year. No matter how many customers you serve or what industry you are in, even sub-second delays now mean the difference between a user staying in your app and abandoning it for the competition.
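For rough intuition, the arithmetic behind a figure like that is simple. The numbers below follow the assumptions commonly attributed to the Kissmetrics infographic, a site earning $100,000 a day and roughly a 7 percent conversion drop per second of delay, and are illustrative rather than measured.

    # Back-of-the-envelope sketch of the Kissmetrics-style estimate in Python.
    # Both inputs are assumptions for illustration, not measured data.
    daily_revenue = 100_000            # assumed $100,000/day e-commerce site
    conversion_drop_per_second = 0.07  # assumed ~7% fewer conversions per 1s delay

    annual_loss = daily_revenue * conversion_drop_per_second * 365
    print(f"Estimated annual revenue loss: ${annual_loss:,.0f}")  # roughly $2,555,000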

The most careful development teams have built ways to find and eliminate problems before they drive away users. What they created was synthetic monitoring, also known as proactive monitoring: using scripts to simulate what a real user would do with your app. You can think of it as the pre-production version of DEM.
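As a simplified sketch, a synthetic check might drive a browser through a key user journey and alert when a performance budget is blown. The URL, element ID, and three-second budget below are assumptions for illustration, not anything prescribed by a particular DEM product.

    # Simplified sketch of a synthetic (proactive) monitoring check in Python,
    # reusing Selenium to play the part of a real user. The URL, element ID,
    # and latency budget are hypothetical.
    import time

    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support import expected_conditions as EC
    from selenium.webdriver.support.ui import WebDriverWait

    BUDGET_SECONDS = 3.0  # assumed performance budget for this journey

    driver = webdriver.Chrome()
    try:
        start = time.monotonic()
        driver.get("https://example.com/checkout")  # simulate the user journey
        WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.ID, "place-order"))
        )
        elapsed = time.monotonic() - start

        if elapsed > BUDGET_SECONDS:
            print(f"ALERT: checkout took {elapsed:.2f}s (budget {BUDGET_SECONDS}s)")
        else:
            print(f"OK: checkout rendered in {elapsed:.2f}s")
    finally:
        driver.quit()

In practice, a script like this runs on a schedule (cron, CI, or a monitoring platform) and from several locations, so regressions surface before real users ever hit them.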

THE CONSTANCY OF CHANGE

The performance of web-based and native apps is only the beginning. IoT is now at the forefront of development as data collection and processing move into practically every device you touch throughout the day.

Cars, kitchen appliances, digital assistants, and even clothing are becoming connected and aware. All of these channels and more must be optimized for the best digital experience and operate flawlessly. Users have become accustomed to instantly accessing what they need and now have very little tolerance for poor performance. New competitors arise from every corner of the globe as barriers to entry tumble, so users do not need to wait; abandoning a poorly performing app is just good sense.

Any company with a digital presence, which is essentially every company, can no longer afford poor performance; it translates directly into user churn and lost revenue. For now and the foreseeable future, business continuity is a matter of DEM.