By WalkingTree October 23, 2020
The Covid-19 pandemic has affected every aspect of our lives. It has caused major shifts in the way IT and big data teams work, and another change is happening right now in DevOps: the pandemic has accelerated the blending of data analytics and DevOps, meaning developers, data scientists, and product managers are working more closely together than ever before.
Organizations have started to rethink how they use big data, generating a movement toward merging DevOps methodologies with big data analytics. The COVID emergency has also placed a new emphasis on getting analytics insights and results to market quickly, prompting revisions in IT operations and culture. Let’s take a look at some of them.
Waterfall to DevOps development
Developing and deploying big data applications is an iterative process; it doesn’t follow the traditional IT waterfall methodology. Yet a majority of IT departments still use the waterfall development paradigm, with separate silos within IT for development and deployment. These functions have to come together with end users in the more collaborative and iterative process of big data application development.
Fewer absolutes for quality
The testing of big data applications is more relative and less absolute. This is a tough adjustment for IT because, in a traditional transaction system, data is either moved correctly from one place to another or a value is correctly derived from data and logic. With big data, results might start off being only 80% accurate, with the business deeming them close enough to indicate an actionable trend.
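The difference between absolute and relative quality checks can be illustrated with a minimal sketch. The function names and the 80% threshold below are hypothetical, chosen only to mirror the example above: a transactional test demands an exact match, while a big data test accepts results once they clear a business-defined accuracy bar.

```python
# Hypothetical sketch: absolute (transactional) vs. relative (big data) testing.

def transfer_is_exact(source_total, destination_total):
    # Transactional test: the moved value must match exactly.
    return source_total == destination_total

def trend_is_actionable(predictions, actuals, threshold=0.80):
    # Big data test: accept results once accuracy clears a business threshold.
    correct = sum(p == a for p, a in zip(predictions, actuals))
    accuracy = correct / len(actuals)
    return accuracy >= threshold

# An 80%-accurate result would fail an exact-match test,
# but passes the relative "close enough to act on" test.
preds = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
truth = [1, 0, 1, 1, 1, 1, 0, 0, 1, 1]
print(trend_is_actionable(preds, truth))  # 8 of 10 correct -> True
```

In practice the threshold itself becomes a negotiated artifact between the business and IT, which is exactly the cultural shift this section describes.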
Read on to learn more about how big data and DevOps are no longer separate silos.