McQueen Autocorp is a leading US-based used car selling company with a buyer network spread across the United States. McQueen Autocorp offers a nationwide, modern car selling service that provides sellers with an instant offer for their car(s), using price prediction algorithms and real-time bids from an extensive network of licensed buyers.
Disclaimer: The name McQueen Autocorp is a placeholder as there is an NDA signed between both parties.
The custom application used by our client was operating via classic manual scheduling.
Because the team maintained the system manually, jobs would pause and then had to be checked and rerun by hand. As a result, jobs were often skipped or failed to run on time, which led to frequent failures during application initialization. The system was not reliable.
Scheduled jobs required manual intervention for monitoring and execution, which was tedious. Because the data pipelines were complex, tracing and fixing bugs to get the application running again could take hours.
Their top priority was finding a solution that would eliminate manual intervention, provide bug traceability, and orchestrate workflows and jobs.
Maruti Techlabs was already working with McQueen Autocorp on a different project when this issue surfaced. Looking at our technical expertise, the founders decided to onboard us to solve the data orchestration challenge.
Our engineers understood the challenge and quickly got to work. Since the classic method of manual orchestration was the root cause of frequent application breakdown, we decided to automate data orchestration.
After analyzing various data orchestration tools, we decided to go ahead with Apache Airflow. We chose Airflow because it is written in Python, which made it easy to customize to fit our needs.
To avoid replicating the main application, we first moved it off the existing server. Earlier, the jobs ran mostly sequentially, which increased the likelihood of failure. Within Airflow, we restructured the jobs to run in parallel with one another.
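The restructuring described above can be sketched as an Airflow DAG in which formerly sequential jobs become independent parallel tasks. This is a minimal illustration, not the client's actual code; the task names (`fetch_bids`, `update_prices`, `sync_inventory`) and the schedule are hypothetical.

```python
# Hypothetical sketch: three jobs that previously ran one after another
# are declared as parallel Airflow tasks between a start and a join step.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def fetch_bids():
    """Placeholder for a job that pulls real-time buyer bids."""


def update_prices():
    """Placeholder for a job that refreshes price predictions."""


def sync_inventory():
    """Placeholder for a job that syncs the vehicle inventory."""


with DAG(
    dag_id="car_data_pipeline",          # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@hourly",         # assumed cadence for illustration
    catchup=False,
) as dag:
    start = PythonOperator(task_id="start", python_callable=lambda: None)
    finish = PythonOperator(task_id="finish", python_callable=lambda: None)

    parallel_jobs = [
        PythonOperator(task_id="fetch_bids", python_callable=fetch_bids),
        PythonOperator(task_id="update_prices", python_callable=update_prices),
        PythonOperator(task_id="sync_inventory", python_callable=sync_inventory),
    ]

    # The list on the right-hand side makes the three jobs run in parallel
    # instead of chaining them sequentially.
    start >> parallel_jobs >> finish
```

Because the three tasks share no dependency edges with one another, the Airflow scheduler can run them concurrently, so a failure or delay in one job no longer blocks the others.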
As the number of DAGs (directed acyclic graphs) in the Airflow clusters grew, managing and scheduling them became more challenging. We decided to move to Astronomer to simplify and scale the management of Airflow clusters. Astronomer made it easy to control multiple Airflow deployments from a central UI, providing a consolidated view of metrics such as DAGs, statuses, and logs across all clusters.
After setting up the Astronomer platform, we provisioned cluster users via Okta login. We then created a CI/CD pipeline for deploying code from development into production. This automated process allowed us to develop new features quickly while maintaining the security and reliability of the application. Finally, to ensure easy deployment and maintenance of the entire setup, we hosted it on the AWS cloud.
We follow Agile, Lean, & DevOps best practices to create a superior prototype that brings your users’ ideas to fruition through collaboration & rapid execution. Our top priority is quick reaction time & accessibility.
We really want to be your extended team, so apart from the regular meetings, you can be sure that each of our team members is one phone call, email, or message away.
A big part of our DevOps consulting and implementation is the synergistic effect of continuous involvement from various stakeholders. With DevOps implementation, experience a workload shift toward reliable, high-quality software applications and services backed by automation.