In November 2019, we partnered with a global Chemical & Consumer Goods company to help them digitally transform their business. With over 50,000 employees, the client was ready to move beyond traditional operations and automate their business activities. The first step towards digitizing the business was finding a way for the company's employees and partners to store, access, and share substantial amounts of data on a single platform instead of juggling multiple tools. Our team delivered a complete software solution for the client's needs: a web and cloud native platform to store and access data in a secure and efficient manner.
The client is a global Chemical & Consumer Goods company that operates worldwide, with headquarters in Western Europe. Active in markets ranging from beauty care to laundry and home care, the company has been in business for over 100 years.
Ready to digitally transform their business and automate everyday internal company tasks, the client needed a secure web and cloud platform where data could easily be stored and exchanged within the company.
What was RUBICON’s task?
RUBICON’s goal was to develop a generic and highly modular web-based portal that enables users to access and exchange relevant data located in the Azure cloud (from Data Lake, Data Warehouse, and other sources) in a highly secure and well-managed manner.
The platform we created is called the Data Lake Management Portal, where all company data is managed in one place: the cloud.
While developing the portal, our team overcame the following challenges:
In November 2019, after being briefly introduced to the client’s project, our team organized a Lean Inception Workshop with both our team and the client’s team in their main office. The goal behind the workshop was to create the very first version of the product for the first phase of development. During the four days, both teams collaborated to understand the client’s demands. We worked together to define the product vision and goals, explore features and user journeys, and review the technical/business/UX components. By the fourth day of the workshop, we had release version 01 ready for development with a set of clear guidelines and features.
Once we created the product roadmap and introduced the idea to the RUBICON team, we began onboarding the team members dedicated to the project. The roles that made up the team were:
Following the Agile framework, our team worked in Scrum by splitting up the workload into two-week sprints.
Once release version 01 was complete, we launched the beta version with the client’s team, giving them access to the portal so they could incorporate it into their work routine.
After launching release version 01, RUBICON organized a set of user testing interviews. Based on our previous user testing experience, we subscribe to Jakob Nielsen's well-known finding that you only need 5 users to identify approximately 80% of all usability problems. The reasoning is that as you add more and more users, you learn less and less, because you keep seeing the same issues again and again.
We performed user interviews with our candidates, one at a time. The interviews were done remotely, with the following team:
The interviewer guided the candidate through release version 01 with a series of essential questions (non-leading, open-ended questions that encourage the candidate to think critically). Each candidate, who had never seen the product before, logged into the system, explored the product, and voiced their concerns and suggestions out loud. Users came from different departments, so we had input from all of the personas we defined at the beginning of the project.
As a result, we had a list of improvements and additional features that needed to be developed in order for the product to deliver more value to all users. We also knew that we were on the right path, as 80% of our candidates asked for features that were in development at the time.
After analyzing the feedback, we hosted another Lean Inception Workshop for release version 02; this time the workshop was held remotely due to the COVID-19 pandemic. Its purpose was to extend the feature set and define the second phase of the project. We took the user testing results into consideration and defined our goals and features for release version 02. With all of those results combined, our team started working on the second phase of development.
Once we implement the features for release version 02, we will run a second wave of user testing, define the improvements we need to make, organize a new workshop, and start the process all over again.
The Data Lake Management Portal is a unified, user-friendly web portal and service designed and developed to give an unlimited number of users quick and easy access to data stored in the Data Lake. The portal enables users to browse, download, upload, request, share, and view available data within the company.
The portal contains the following features:
Functional Features
Non-Functional Requirements
The client requested a scalable cloud solution with the following requirements:
The goal was to create a cloud native solution to support the requested features and requirements. The focus was on Azure Platform as a Service (PaaS) offerings, keeping operational effort to a minimum so the client wouldn’t have to maintain and manage resources such as storage, servers, applications, and services.
Following industry-proven practices for scalable and performant cloud solutions, RUBICON implemented the solution shown in the diagram below:
This architecture consists of a front-end single-page application written in JavaScript that accesses backend REST APIs. The solution serves static content from Azure Blob Storage and implements the back-end APIs with Azure App Services and Azure Functions. Users sign into the web application with their Azure AD credentials, and Azure AD returns an access token, which the application uses to authenticate API requests. Furthermore, the back-end APIs authenticate against and invoke other company services and APIs by implementing the OAuth 2.0 On-Behalf-Of flow.
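From the SPA's side, authenticating an API request comes down to attaching the Azure AD access token as a bearer credential. The sketch below is a minimal illustration of that step only; the endpoint URL and the `withBearerToken` helper are hypothetical, and in the real portal the token would come from the Azure AD sign-in flow (e.g. via a library such as MSAL.js) rather than a plain string.

```typescript
// Shape of an outgoing API request from the SPA (illustrative).
interface ApiRequest {
  url: string;
  headers: Record<string, string>;
}

// Attach the Azure AD access token as a bearer credential.
// In the real portal the token is obtained during Azure AD sign-in;
// here it is passed in as a plain string for illustration.
function withBearerToken(url: string, accessToken: string): ApiRequest {
  return {
    url,
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
  };
}

// Example: a hypothetical files endpoint on the portal backend.
const request = withBearerToken("https://portal.example.com/api/files", "<token>");
// request.headers.Authorization === "Bearer <token>"
```

The back-end APIs would then validate this token and, when calling further company services, exchange it for a downstream token via the On-Behalf-Of flow.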
The architecture includes the following components:
For provisioning Azure resources we used Terraform and Terragrunt. These infrastructure-as-code templates make it easy to automate deployments and provision different environments in minutes, for example replicating a load-testing environment only when needed, which saves costs.
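As a rough sketch of this setup (all resource names, regions, and module paths below are illustrative, not the client's actual configuration), a Terraform module defines the resources once, and a small Terragrunt file per environment supplies the variables:

```hcl
# modules/portal/main.tf -- illustrative Terraform module
resource "azurerm_resource_group" "portal" {
  name     = "rg-datalake-portal-${var.environment}"
  location = var.location
}

resource "azurerm_storage_account" "static_content" {
  name                     = "stportal${var.environment}"
  resource_group_name      = azurerm_resource_group.portal.name
  location                 = azurerm_resource_group.portal.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

# environments/loadtest/terragrunt.hcl -- per-environment inputs
# terraform {
#   source = "../../modules/portal"
# }
#
# inputs = {
#   environment = "loadtest"
#   location    = "westeurope"
# }
```

Because the whole environment is described in code, a temporary copy (e.g. for load testing) can be created with a single `terragrunt apply` and destroyed again when the tests are done.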
Our DevOps team implemented continuous integration (CI) and continuous deployment (CD) pipelines using Azure DevOps Pipelines services that automatically build, test, and deploy every source code change. Azure DevOps provides Repos for source code control, Pipelines for CI/CD, Artifacts to host build artifacts, and Boards for developer collaboration and coordination. With these processes in place, we focus on the development of the applications rather than the management of the supporting infrastructure.
Frontend CI is configured to trigger on every git push to feat, fix, and chore branches. The Build stage includes jobs and tasks that clone the repo, install npm tools, build the solution, run unit tests, and then package and publish artifacts to Azure Artifacts. When the source branch is merged in, the build pipeline triggers the release pipeline.
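A trimmed-down Azure Pipelines definition for such a frontend CI build might look like the following; the branch filters match the description above, while task versions, the Node version, and the artifact paths are assumptions:

```yaml
# azure-pipelines.yml (frontend CI) -- illustrative sketch
trigger:
  branches:
    include:
      - feat/*
      - fix/*
      - chore/*

stages:
  - stage: Build
    jobs:
      - job: BuildAndTest
        pool:
          vmImage: ubuntu-latest
        steps:
          - task: NodeTool@0          # install the npm toolchain
            inputs:
              versionSpec: "14.x"
          - script: npm ci
            displayName: Install npm dependencies
          - script: npm run build
            displayName: Build the solution
          - script: npm test
            displayName: Run unit tests
          - task: PublishBuildArtifacts@1
            inputs:
              pathToPublish: dist     # assumed build output folder
              artifactName: frontend
```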
Backend CI/CD for containers is a single pipeline with a Build and a Deploy stage. The Build stage installs the .NET Core SDK, builds the solution, runs integration tests using Key Vault secrets, and publishes test results and build artifacts. The Deploy stage is skipped unless the source branch is develop or master; it contains two tasks: building and pushing the Docker image to the Container Registry, and deploying the container image to Azure App Service for Containers.
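The branch-gated Deploy stage described above can be expressed with a stage-level condition; the sketch below shows the shape of such a pipeline, with the registry, service connection, and app names being hypothetical placeholders:

```yaml
# Backend CI/CD for containers -- illustrative sketch
stages:
  - stage: Build
    jobs:
      - job: Build
        pool:
          vmImage: ubuntu-latest
        steps:
          - task: UseDotNet@2           # install the .NET Core SDK
            inputs:
              packageType: sdk
              version: "3.1.x"          # assumed SDK version
          - script: dotnet build --configuration Release
          - script: dotnet test         # integration tests (Key Vault secrets
                                        # would be injected as pipeline variables)
          - task: PublishBuildArtifacts@1
            inputs:
              pathToPublish: $(Build.ArtifactStagingDirectory)

  - stage: Deploy
    # Skipped unless the pipeline runs on develop or master
    condition: and(succeeded(), in(variables['Build.SourceBranchName'], 'develop', 'master'))
    jobs:
      - job: Deploy
        steps:
          - task: Docker@2              # build and push the image
            inputs:
              command: buildAndPush
              containerRegistry: portal-acr        # hypothetical service connection
              repository: portal-backend
              tags: $(Build.BuildId)
          - task: AzureWebAppContainer@1           # deploy to App Service for Containers
            inputs:
              azureSubscription: portal-subscription   # hypothetical
              appName: portal-backend-app              # hypothetical
              containers: portalacr.azurecr.io/portal-backend:$(Build.BuildId)
```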
The initialization stage of the IaC pipeline initializes a resource group with a storage account and a container that serves as the Terraform remote state. In the Plan stage, we validate and create the plan for the infrastructure; before we can do that, we first need to download Terragrunt and the secrets file stored in Azure DevOps. Finally, in the Apply stage, after a manual approval check, we apply our changes to the infrastructure.
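The three stages above might be sketched in Azure Pipelines YAML as follows; the resource names, secure-file name, and exact Terragrunt invocations are illustrative assumptions:

```yaml
# IaC pipeline -- illustrative sketch of the Init / Plan / Apply stages
stages:
  - stage: Init
    jobs:
      - job: Init
        steps:
          - script: |
              # Create the resource group and storage account that hold the
              # Terraform remote state (idempotent; names are placeholders)
              az group create --name rg-tfstate --location westeurope
              az storage account create --name sttfstate --resource-group rg-tfstate

  - stage: Plan
    jobs:
      - job: Plan
        steps:
          - task: DownloadSecureFile@1    # secrets file stored in Azure DevOps
            inputs:
              secureFile: terraform-secrets.tfvars
          - script: |
              # Download the Terragrunt binary, then validate and plan
              terragrunt validate
              terragrunt plan -out=tfplan

  - stage: Apply
    jobs:
      - job: WaitForApproval
        pool: server                      # agentless job for the manual check
        steps:
          - task: ManualValidation@0
            inputs:
              instructions: Review the Terraform plan before applying.
      - job: Apply
        dependsOn: WaitForApproval
        steps:
          - script: terragrunt apply tfplan
```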
Seven months in, we had developed a secure web portal with a simple, modern UX/UI design that enables users to easily read and write files in the underlying data lake. We successfully helped the client say goodbye to their traditional ways of doing business and transition to their new software, making their operations more efficient and automated.
Applications & Data
Microsoft Azure Cloud
Frontend
DevOps
Company: Global Chemical & Consumer Goods company
Region: Western Europe
Industry: Consumer Goods
Project Duration: 2019 - ongoing
Project: Cloud Native & Web Development