6 business lessons learned as a result of COVID-19


The unprecedented coronavirus pandemic has caused changes in mindset, attitude, direction, and behavior for organizations, Cisco found.

Image: Niyaz_Tavkaev, Getty Images/iStockphoto

A Cisco report released Wednesday identified six key business lessons learned from the coronavirus pandemic. The unprecedented nature of COVID-19 caused businesses to make critical decisions never before faced, resulting in attitudes and practices that will remain well past the pandemic. 

SEE: Return to work: What the new normal will look like post-pandemic (free PDF) (TechRepublic)

The report, titled A New Perspective in the Modern Workplace, was conducted by Freeform Dynamics on behalf of Cisco between late 2019 and May/June 2020. The research explored the significant impact COVID-19 had on the enterprise, particularly the change in working behavior.

One of the most blatant effects the coronavirus had on the working world was reflected in the acceleration of remote work. The enterprise has an average of 4.7 times more home workers now compared with before the pandemic, the report found.

This transition came with growing pains, forcing organizations to quickly provide the technological infrastructure and resources necessary to conduct work from afar. Despite those challenges, nearly three-quarters (74%) of respondents said their businesses will in some ways emerge stronger from the crisis. 

To help ensure all organizations come out stronger, the report identified the following six lessons for companies to consider. 

Six key lessons post-COVID 

1. What business agility really means

The coronavirus provided a big lesson in agility for companies, which had to adapt and respond quickly to changing events. This situation made companies realize how important it is to be comfortable with change and willing to shift gears when necessary, the report found. 

For many organizations, the shift unveiled the gaps and weaknesses in their businesses, helping them pinpoint errors and correct them for the future. 

2. The real value of modern technology options

The virus also emphasized the importance of technology, as tech was relied on more than ever for companies to conduct work. The majority of respondents (67%) said they agree or strongly agree that the pandemic accelerated their adoption of cloud-based communications, collaboration, and productivity tools, the report found. 

More than half (58%) of respondents also said they ended up using technology that was already available to them but previously rejected or ignored, forcing organizations to recognize the tech they had been wasting. 

Ultimately, respondents confirmed that digital collaboration tools are the new normal and will remain a staple well after the pandemic, particularly video conferencing.

3. The true nature of workforce productivity

Working from home revealed how many distractions there are in the office, according to the report. Many respondents found themselves more productive in their home environment without the interruptions they had in a physical office. 

This way of work has also caused many healthy practices to surface, many of which respondents believe will stay after the pandemic. Some of these practices that will stay include businesses being more trusting and empowering of employees (53%), managers increasing flexible working hours (49%), virtual teams working across locations and departments (38%), and agile teams forming and disbanding around specific activities (37%). 

4. The essential nature of social interaction

While digital collaboration tools have become critical to remote work and will remain post-pandemic, the new way of work also unveiled the need for social interaction for humans, the report found.

Some 64% of respondents cited the loss of informal kitchen and watercooler-style exchanges as a challenge. To mitigate this gap, respondents said they host social video conferencing meetups (67%), social chat channels (54%), news catch-ups (46%), and interactive competitions (36%).

5. The future of health and wellbeing

The virus has naturally had a huge impact on health and wellbeing. Some 76% of respondents working from home said they find it hard to maintain work-life balance, and 73% of managers said a big challenge for them was maintaining staff momentum and morale, the report found. 

However, the increases in these challenges have placed them at the forefront of organizations’ thinking, forcing them to find ways to handle them. 

Some 47% of employees believe that, as a result, employee wellbeing and work-life balance will receive increased emphasis long-term. More than half (56%) said the same for employee engagement, the report found. 

6. The extended talent opportunity

Remote work has also resulted in remote recruitment and hiring, but companies are beginning to see the value of this widened talent pool. 

Half of respondents said they believe that the increased acceptance of remote and flexible working would almost certainly or probably lead to a more inclusive recruitment policy, and to recruitment of individuals from a broader geographical area even after the pandemic, the report found.

The report recommended business pros take all six of these lessons into account moving forward to help boost their workplace maturity and resiliency. 

For more, check out The rise of the digital workplace and the new future of work: Experts weigh in on TechRepublic.


Big data lessons: 5 things COVID-19 has taught us


As challenging as it’s been to weather the COVID-19 crisis, there are valuable big data lessons we can gain from it. Here are a few.

Image: Ca-ssis, Getty Images/iStockphoto

We have learned a lot about big data in action during the COVID-19 crisis. Going forward, these lessons will make it easier for enterprises and vendors to deliver better big data projects and products.

SEE: TechRepublic Premium editorial calendar: IT policies, checklists, toolkits, and research for download (TechRepublic Premium) 

Here are five big data lessons learned from the COVID-19 crisis:

1. Visualization is paramount

By now, most of us are very familiar with seeing COVID-19 spread maps on TV and the internet. The maps identify hot spots throughout the world and report the state of the virus state by state in the US. 

These geographic pictorial maps were facilitated by placing statistical data on mapping engines, thereby combining both structured statistics and unstructured map-based visuals to create an overall result. 

SEE: Big data management tips (free PDF) (TechRepublic)

The maps work because we can easily relate to the geography the maps represent, as well as to the statistical data on the virus superimposed on them. In this case, presenters are using a best-case visualization of information, one that ensures their messages about COVID-19 spread and hot spots can be easily understood by audiences.
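The mapping technique described here boils down to joining structured statistics to geographic identifiers and bucketing them into shading levels. The sketch below illustrates that idea in minimal Python; all state codes, case rates, and thresholds are invented for illustration, not real data or the report's methodology.

```python
# Minimal sketch of how a COVID-19 hot-spot map combines structured
# statistics with geography: per-state rates are joined to state codes
# and bucketed into the intensity levels a mapping engine would shade.
# All figures here are illustrative, not real case data.

CASES_PER_100K = {  # structured statistics keyed by state
    "NY": 950, "TX": 430, "MT": 60, "CA": 510, "WY": 35,
}

def intensity(rate: float) -> str:
    """Bucket a case rate into the shading level a choropleth would use."""
    if rate >= 500:
        return "hot spot"
    if rate >= 200:
        return "elevated"
    return "low"

def build_layer(stats):
    """Produce the state -> shading layer a mapping engine would render."""
    return {state: intensity(rate) for state, rate in stats.items()}

layer = build_layer(CASES_PER_100K)
print(layer["NY"])  # hot spot
print(layer["MT"])  # low
```

A real implementation would hand this layer to a mapping engine (a choropleth library, for example) rather than printing it, but the join-then-bucket step is the same.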

2. Big data is an enabler

Sometimes the value of big data is not so much in the data it presents, but in the capabilities it offers.

During the COVID-19 crisis, we have seen big data play an important role as an enabler because of its ability to process video, audio, and other types of non-standard information in ways that structured data processing can’t.

SEE: Inside UPS: The logistics company’s never-ending digital transformation (free PDF) (TechRepublic)

A prime example is the widespread use of telemedicine that has facilitated virtual doctor appointments between home-bound patients and their healthcare providers. Businesses with home-bound employees are also using video conferencing as a means of staying in touch and conducting virtual meetings. 

This flow of big data to participants around the world has created opportunities for real-time collaboration and information exchanges that COVID-19 “lockdowns” would otherwise have prevented.

3. Integration and aggregation play key roles

The best big data apps are those that unlock data value from both structured and unstructured data. In the fight against COVID-19, Internet of Things (IoT)-based devices, such as thermometers and contact tracers, can be combined with statistical data to track virus outbreaks so the outbreaks can be mitigated. To produce these comprehensive COVID-19 tracing and detection engines, data scientists must choose the best sources of data to integrate and aggregate so they can create a composite picture of what’s going on. Big data toolsets enable them to do this.
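The integrate-and-aggregate pattern described above can be sketched briefly: unstructured streams of device readings are aggregated per region, then combined with structured case statistics into a composite signal. The counties, readings, and thresholds below are hypothetical examples, not values from any real tracing system.

```python
# Illustrative sketch of integrating IoT thermometer streams with
# structured case statistics to flag possible outbreak counties.
# Counties, readings, and the 38.0 C fever threshold are hypothetical.

from collections import defaultdict

readings = [  # (county, temperature in Celsius) from connected thermometers
    ("Adams", 38.6), ("Adams", 36.8), ("Adams", 39.1),
    ("Brown", 36.5), ("Brown", 36.9),
]
confirmed_cases = {"Adams": 120, "Brown": 4}  # structured statistics

def fever_rate(rows, threshold=38.0):
    """Fraction of readings at or above the fever threshold, per county."""
    totals, fevers = defaultdict(int), defaultdict(int)
    for county, temp in rows:
        totals[county] += 1
        if temp >= threshold:
            fevers[county] += 1
    return {c: fevers[c] / totals[c] for c in totals}

def flag_outbreaks(rates, cases):
    """Flag counties where device data and case statistics both point up."""
    return {c: rates[c] > 0.5 and cases.get(c, 0) > 50 for c in rates}

rates = fever_rate(readings)
print(flag_outbreaks(rates, confirmed_cases))  # {'Adams': True, 'Brown': False}
```

The point is the composite: neither stream alone is conclusive, but aggregating both gives the fuller picture the article attributes to big data toolsets.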

4. Big data projects are mission-critical

There are at least 78 COVID-19 vaccine projects underway worldwide. None of these vaccine formulations and trials could occur without the assistance of big data processing. This squarely positions big data processing as a mission-critical application, indispensable for developing a vaccine that can end the pandemic. 

SEE: How COVID-19 data highlights the search for a single version of the truth (TechRepublic)

Being able to process data at the speed of big data for drug and vaccine formulations, and then being able to use big data in-memory processing to connect and communicate with IoT and automated systems on manufacturing floors, will also determine how quickly mass quantities of vaccine can be distributed to the population. This IoT processing helps companies analyze product manufacturing and quality, and it speeds product to market.

5. Collaboration is vital

COVID-19 is a worldwide pandemic. Logic dictates that the best approach to such a global health problem is global cooperation and information exchange so we can find a cure. This kind of information collaboration can easily be done with big data projects, as it has been in the open source community. 

The key is countries being willing to share what they know so we can collectively find a cure. In the current political environment, this is not the case, but the big data collaboration tools are there. The lesson going forward is that we can achieve results faster when we actually use those tools to collaborate.


Walmart lessons: Leveraging your “lazy assets” for more profit


The giant retailer is using its parking lots to host drive-in movies. What assets does your IT department have that could be used?

Image: J. Michael Jones / Getty Images

A few short years ago, most pundits had written off Walmart, the giant retailer that revolutionized commerce but was being threatened and seemingly bested by Amazon. In the past few months, however, Walmart seems to have rekindled its competitive spirit and has gained ground on Amazon in areas ranging from a unique data center offering that leverages its stores to local grocery delivery, which has made a significant impact as millions stay home due to COVID-19.

SEE: How to cultivate an inclusive workplace for LGBTQ employees (free PDF) (TechRepublic)

Recently, Walmart announced the rather brilliant concept of turning its parking lots into drive-in movie theaters. Initially, this might seem like an odd offering for a retailer, but on closer inspection it’s an incredibly savvy move on two primary dimensions. First, Walmart is leveraging a unique and largely underutilized asset it already owns: its thousands of parking lots. Second, movies create a broader “brand halo” and position hundreds of people within a dozen paces of its core retail offering.

The power of lazy assets

Walmart’s parking lots are the quintessential lazy asset: a physical object or capability that is key to your core business but otherwise underutilized. Stores, especially those in rural areas, obviously need parking, and Walmart is famous for locating its stores in lower-cost rural areas and providing what seems like miles of blacktop to accommodate peak parking demand. That land is otherwise unused, and a minimal investment in projection equipment and temporary screens easily turns it into a movie theater uniquely suited to an era of social distancing and consumers craving entertainment outside their homes.

SEE: Walmart Plus subscription service will reportedly launch this month (CNET)

These parking lots are also a perfect asset in that they’re difficult to copy. A competitor could easily acquire hundreds of projectors, but acquiring the real estate to create thousands of square miles of parking within driving reach of most US residents would be costly and time consuming. This is also not the first time Walmart has leveraged the power of its retail footprint: the company announced it would use stores as distributed data centers optimized for “edge computing,” where geographic proximity to a data center is required for applications like self-driving vehicles or Internet of Things (IoT) platforms.

Feeding the core business

The second element of this announcement that makes this so compelling is that the drive-in business directly augments Walmart’s core retail business. In many rural communities, Walmart is not only a store, but a grocer, eye doctor, and informal community hub. Adding “entertainment” to that list of features makes sense, especially since a drive-in movie theater requires minimal additional staff and puts potentially hundreds of additional customers at your store, most of whom might want a snack with their movie or need to do some shopping. It’s not much of a cognitive leap to imagine someone entering a store for a pack of Skittles and coming out with a half-dozen other items.

Look for your lazy assets

Walmart’s move may seem outside the purview of technology leaders who don’t exert control of a company’s physical assets. However, at most companies an “inverse Pareto principle” of sorts is probably in operation, with 80% of your assets utilizing only 20% of their capabilities. Consider your major software and technology platforms. Do your employees use more than 20% of the capabilities of an asset as simple as Microsoft Outlook? Does more than 20% of the data swimming in your data lakes get analyzed and leveraged into action? Does that peak capacity designed into your data center or cloud computing projects ever get used? An asset as simple as a desktop workstation that’s generally left on in the evening is a lazy asset, and video production companies have used exactly these assets to augment their video “render farms” when office workers have gone home.

SEE: TechRepublic Premium editorial calendar: IT policies, checklists, toolkits, and research for download (TechRepublic Premium)

As part of your planning, take an inventory of your most expensive assets, as these are likely ubiquitous, difficult to duplicate, and potentially underutilized. Consider how each asset could be otherwise leveraged, even if the idea seems as strange as a retailer creating ad hoc drive-in movies, as there may very well be a direct link back to your core business. When doing this exploration, it can be helpful to bring in outsiders from other groups or even outside your company, as they bring the “beginner’s mind” sometimes needed to question assumptions that seem obvious to an expert.

While you may not be airing summer blockbusters using excess server capacity, it’s highly likely that there are technology assets that can create new value, and ideally augment the core business as well. Done well, you’ll also turbocharge the return delivered from assets you already own, creating a performance that’s anything but lazy.


3 big data lessons from a COVID-19 mapping and modeling project


Gathering data at the speed of life can make it hard to discern real information from a large amount of input. One data modeling and mapping project was able to make it work.

Image: libre de droit, Getty Images/iStockphoto

Finding a single version of the truth on the epidemiology of COVID-19 has proven elusive during this pandemic. There is no national case registry or medical inventory database. The epidemiological forecasting models used by federal and state governments, such as SIR (Susceptible-Infectious-Recovered) models and the projections from the IHME (Institute for Health Metrics and Evaluation), lack reliable data. There is clearly a need to help public officials better discern and navigate health and economic risks.

SEE: Return to work: What the new normal will look like post-pandemic (free PDF) (TechRepublic)

“I manage four different data labs throughout the world, and for the first few weeks of COVID-19, we were scrambling,” said Eric Haller, executive vice president and global head of Experian DataLabs, which provides advanced data analytics and research. “We had to learn how to shelter in place and to work remotely, but we were driven by a huge sense of responsibility to help government and healthcare providers sort through the data so we could make progress on the pandemic.”

The goal of lab efforts was to develop reliable data that could pinpoint and predict virus hot spots.

“Our process took about six weeks to build a core map that tracked COVID-19 outbreaks and responses,” Haller said. “We wanted to be able to provide the information to governments and healthcare so they could identify the hot spots and where they needed to double down with efforts for hard-hit communities.”

Data streams analyzed

Haller said there were three primary data streams that the analytics looked at.

The first was disease spread, as represented by the number of cases and the number of deaths. A second data stream provided co-morbidity rates: For those patients who died during a COVID-19 episode, how many had pre-existing conditions that made them especially vulnerable, such as heart disease or asthma? 

“From the correlations of this data, we began to develop a health risk score on a county-by-county basis,” Haller said.

SEE: Robotic process automation: A cheat sheet (free PDF) (TechRepublic)

A third data stream looked at social determinants and their effect on COVID-19 spread. How many patients had mobility, such as ready access to public transit? How dense was the housing in the areas where these individuals lived?

The team also looked at demographics, such as which age groups were the most vulnerable.

“What we did was blend all three data models into a master model for over 3,000 counties,” Haller said. “This made it simple for users to drill down into any particular county that they wanted to in order to see more specific data.”
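Blending several component models into one master score per county can be sketched as a weighted combination. The component scores and the equal weighting below are assumptions for illustration; the article does not disclose how Experian DataLabs actually weighted or blended its three streams.

```python
# Hedged sketch of blending three data streams (disease spread,
# co-morbidity, social determinants) into one per-county risk score.
# Scores and equal weights are illustrative assumptions only.

county_models = {
    # county: (spread, co-morbidity, social-determinant scores),
    # each already normalized to a 0-100 scale
    "Cook":    (80, 60, 70),
    "Fayette": (20, 40, 30),
}

WEIGHTS = (1 / 3, 1 / 3, 1 / 3)  # hypothetical: equal weight per stream

def master_model(models, weights=WEIGHTS):
    """Blend the component scores into one drill-down risk score per county."""
    return {
        county: round(sum(w * s for w, s in zip(weights, scores)), 1)
        for county, scores in models.items()
    }

risk = master_model(county_models)
print(risk["Cook"])     # 70.0
print(risk["Fayette"])  # 30.0
```

The same structure extends to 3,000-plus counties: the master dictionary is what a drill-down interface would query for any particular county.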

Haller’s teams also creatively used unstructured data such as maps and photos to deduce information like housing density through aerial maps.

Lessons learned

For those responsible for data modeling and analytics development, there are three key takeaway points from this project:

1. Obtaining quality data is harder than data modeling

“When we compiled data from different states and localities, there were inconsistencies in data that we had to reconcile,” Haller said. “For instance, in New York State, they were reporting the number of COVID-19 deaths but also the number of ‘probable’ COVID-19 deaths. Some of this data was subjective, and we didn’t have a method to scrub that data.”

2. Using big data is good if you can eliminate the noise

For an item such as population density, the analytics team used available GPS data, but mapping was still inconsistent because GPS data continuously changes. “When there were questions, we had to use our own perspective to determine what was happening,” Haller said.
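One generic way to suppress this kind of noise, offered here purely as a sketch since the article does not specify the team's method, is to run a sliding median over a time series of GPS-derived density readings so transient spikes don't register as real change. The readings and window size below are invented for illustration.

```python
# Illustrative noise suppression for a series of GPS-derived density
# readings: a sliding median flattens one-off spikes while preserving
# the underlying level. Values and window size are hypothetical.

from statistics import median

def median_smooth(series, window=3):
    """Replace each value with the median of its surrounding window."""
    half = window // 2
    smoothed = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        smoothed.append(median(series[lo:hi]))
    return smoothed

# Density estimates with one spurious GPS spike at index 2
raw = [100, 102, 400, 101, 99]
print(median_smooth(raw))  # the 400 spike is flattened to 102
```

A median (rather than a mean) is the usual choice here because a single outlier cannot drag it far from the surrounding values.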

3. The project can move faster than you think

“We found that we could quickly adjust to having to work and collaborate remotely. The seriousness of the situation also helped us to move faster than we might have in a non-emergency mode,” Haller said. “When you work under emergency conditions like these, the smaller issues that can disrupt projects tend to disappear.”
