Deployment-related interview questions
Created on: Jan 15, 2025
-
What key aspects should be considered during a code review?
- Code Quality and Readability
  - Adherence to coding standards: check for consistent indentation, naming conventions, and formatting against the team's coding standards and guidelines.
  - Code clarity and conciseness: verify that the code is well-structured, easy to understand, and avoids unnecessary complexity.
  - Comments and documentation: ensure non-obvious logic is commented and public interfaces are documented.
- Functional Correctness
  - Test coverage: ensure edge cases and error handling are tested and overall code coverage is adequate.
- Security and Vulnerabilities: check that the code adheres to security best practices, such as input validation, output encoding, and protection against common vulnerabilities like SQL injection. Consider running static analysis tools like SonarQube or Checkmarx.
- Check for code modularity.
- Review best practices: be timely, be respectful.
-
Can you walk me through your development workflow and explain how you take your code changes from local development to production? Below is the process:
- Development: write, test, and debug the code, then deploy it to the dev branch.
- Code review: peer review and lead review.
- Merge into the QA branch and deploy to the QA environment.
- Move the Jira task to QA and assign it to the QA engineer; make any changes QA requests.
- After QA sign-off, create a merge request from the feature branch to the prod branch and merge it.
- The deployment plan should include a detailed rollback strategy, impact assessment, and communication plan.
- After deploying the change, monitor whether anything is affected. Fix any minor bug forward; if it is a major issue, roll back.
- A CI/CD pipeline automates the build, test, and deployment steps across these environments.
- Check the logs in ELK immediately after deployment (a sketch of this check follows the list).
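As a rough illustration of that post-deployment log check, here is a minimal Python sketch using the Elasticsearch client (8.x-style API). The index pattern app-logs-*, the level and @timestamp fields, and the 15-minute window are all assumptions about how the logs are shipped; adjust them to the actual mapping.

```python
# Minimal post-deployment log check against an ELK stack.
# Assumptions: logs are indexed under app-logs-*, and each document
# carries a "level" field and an "@timestamp" field.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed ELK endpoint

resp = es.search(
    index="app-logs-*",
    query={
        "bool": {
            "filter": [
                {"term": {"level": "ERROR"}},
                # Only look at logs written since the deployment window.
                {"range": {"@timestamp": {"gte": "now-15m"}}},
            ]
        }
    },
    size=10,  # sample a few error documents for quick inspection
)

error_count = resp["hits"]["total"]["value"]
print(f"ERROR logs in the last 15 minutes: {error_count}")
for hit in resp["hits"]["hits"]:
    print(hit["_source"].get("message", "<no message>"))
```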
-
What deployment strategies are used, and can you explain canary deployment?
- Rolling Deployment: Gradually replace the old version with the new version, one instance at a time.
  - Cons: risk of incompatibility between old and new versions during the rollout; rollback is harder if issues occur.
- A/B Testing Deployment: Deploy the new version to a specific group of users to test specific changes or features.
- Blue-Green Deployment:
- Concept: Deploy a new version of the application (blue) alongside the existing version (green) but route traffic to only one version.
- Steps:
- Deploy the new version (blue) to a separate environment.
- Perform testing and validation on the blue version.
- Switch the traffic from the green environment to the blue environment once it's verified.
- Roll back to green if issues arise.
- Canary Deployment: Gradually roll out the new version to a subset of users before a full rollout.
- Steps:
- Deploy the new version to a small percentage of users.
- Monitor for errors, performance issues, or bugs.
- Gradually increase the number of users accessing the new version.
- Fully roll out if the new version performs well (see the sketch after this list).
- Recreate/All at once: the traditional strategy where the changes are deployed to all servers simultaneously, resulting in some downtime.
  - This deployment type is usually done during a maintenance window, i.e., off-peak hours, to minimize the impact.
  - Deployment is faster than rolling updates, but users are impacted.
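To make the canary steps concrete, here is a minimal Python sketch of the rollout loop: shift a growing share of traffic to the new version, watch an error-rate signal at each stage, and abort back to 0% if it degrades. The set_canary_traffic and error_rate functions are hypothetical hooks for whatever routing layer and metrics store are actually in use.

```python
import time

CANARY_STEPS = [5, 10, 25, 50, 100]  # percent of traffic per stage
ERROR_BUDGET = 0.01                  # abort if error rate exceeds 1%

def set_canary_traffic(percent: int) -> None:
    # Hypothetical hook: in practice this would update a load balancer,
    # service mesh, or ingress weight.
    print(f"Routing {percent}% of traffic to the new version")

def error_rate() -> float:
    # Hypothetical hook: in practice this would query the metrics store.
    return 0.0

def canary_rollout() -> bool:
    for percent in CANARY_STEPS:
        set_canary_traffic(percent)
        time.sleep(300)  # let metrics accumulate at this stage
        if error_rate() > ERROR_BUDGET:
            set_canary_traffic(0)  # send everything back to the old version
            return False
    return True  # fully rolled out

if __name__ == "__main__":
    print("rollout succeeded" if canary_rollout() else "rolled back")
```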
-
What is the difference between canary and blue-green deployment?
The main difference is that canary deployment releases updates incrementally to a small group of users, while blue-green deployment switches all traffic at once between two identical environments.
-
What are the differences between A/B testing and canary deployment?
An A/B test's purpose is usually to measure users' response to a new UI, feature, etc. (in a way, how much they like it). You already know the new version works, so you randomly serve both versions of the application to all users; the split can be 50-50, 80-20, 90-10, anything. Sometimes the functionality itself is not even the point: you might simply want to see which version attracts more customers.
A canary is more focused on how well the new feature works, or whether it works at all: you are not yet positive the new version will behave as expected. The split is usually 90-10 or 80-20 (A >> B), never 50-50, because if it goes wrong you don't want half of your users to have a bad experience.
The most important difference (and this is what almost no one talks about) is that canary deployment has session affinity: instead of serving both versions to every user, it randomly assigns some users to the new version and keeps each user pinned to the same version (see the sketch below).
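As an illustration of that session affinity, here is a small Python sketch that assigns users to the canary deterministically by hashing a stable user ID, so the same user always lands on the same version. The 10% bucket size and the ID-hashing scheme are assumptions for the example.

```python
import hashlib

CANARY_PERCENT = 10  # assumed share of users pinned to the canary

def version_for(user_id: str) -> str:
    # Hash a stable user identifier into a bucket in [0, 100).
    # The same user_id always maps to the same bucket, which is
    # what gives the canary its session affinity.
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return "v2-canary" if bucket < CANARY_PERCENT else "v1-stable"

# Example: each user consistently sees one version across requests.
for uid in ["alice", "bob", "carol"]:
    print(uid, "->", version_for(uid))
```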
-
How would you do a canary deployment for, say, the booking page of a movie booking system?
-
Deploy Two Versions of Your Booking Service
- booking-v1: the current stable version.
- booking-v2: the new canary version with the updated functionality.
-
Use a Load Balancer or API Gateway
- Set up a traffic split using a load balancer like NGINX, HAProxy, or Apache:

```nginx
upstream booking_service {
    server booking-v1.com weight=95;
    server booking-v2.com weight=5;
}

server {
    listen 80;
    location / {
        proxy_pass http://booking_service;
    }
}
```

- Start with 5% of traffic going to booking-v2 and gradually increase it as you gain confidence in the new release.
-
Monitor and Analyze Metrics
- Use monitoring tools like Prometheus + Grafana to track the performance of both versions (a sketch follows).
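As a sketch of what that monitoring could look like in code, the snippet below queries Prometheus's HTTP API for the per-version HTTP 5xx error rate. The metric name http_requests_total with a version label and the Prometheus URL are assumptions; substitute whatever the services actually export.

```python
import requests

PROMETHEUS = "http://localhost:9090"  # assumed Prometheus endpoint

def error_rate(version: str) -> float:
    # Assumed metric: http_requests_total{version=..., status=...},
    # a counter labelled per deployment version and HTTP status.
    query = (
        f'sum(rate(http_requests_total{{version="{version}",status=~"5.."}}[5m]))'
        f' / sum(rate(http_requests_total{{version="{version}"}}[5m]))'
    )
    resp = requests.get(f"{PROMETHEUS}/api/v1/query", params={"query": query})
    result = resp.json()["data"]["result"]
    return float(result[0]["value"][1]) if result else 0.0

print("booking-v1 error rate:", error_rate("booking-v1"))
print("booking-v2 error rate:", error_rate("booking-v2"))
```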
-
Rollback Strategy
- If issues are detected in the new version (booking-v2):
  - Stop routing traffic to booking-v2.
  - Direct all traffic back to booking-v1.
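Assuming the NGINX weighted-upstream setup sketched above, one way to automate this rollback is to rewrite the upstream block with booking-v2 taken out of rotation (NGINX's down parameter) and hot-reload NGINX. The config path, the error threshold, and the error_rate stub are assumptions for illustration; the stub stands in for a real check such as the Prometheus query above.

```python
import subprocess

UPSTREAM_CONF = "/etc/nginx/conf.d/booking_upstream.conf"  # assumed path
ERROR_BUDGET = 0.01  # assumed threshold: roll back above a 1% error rate

def error_rate(version: str) -> float:
    # Placeholder; see the Prometheus sketch above for a real implementation.
    return 0.0

def rollback_canary() -> None:
    # Rewrite the upstream block so booking-v2 is out of rotation,
    # then hot-reload NGINX to apply it without dropping connections.
    conf = (
        "upstream booking_service {\n"
        "    server booking-v1.com;\n"
        "    server booking-v2.com down;  # canary disabled\n"
        "}\n"
    )
    with open(UPSTREAM_CONF, "w") as f:
        f.write(conf)
    subprocess.run(["nginx", "-s", "reload"], check=True)

if error_rate("booking-v2") > ERROR_BUDGET:
    rollback_canary()
```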
-
