Job Title: Data Engineer (Databricks and AWS)
Duration: 2 Hours
Budget: ₹22,000 – ₹26,000 (based on screening)
Tech Stack: Databricks, Python, PySpark, AWS, SQL, Git

Job Description:
We are looking for an experienced Data Engineer to provide short-term support. The role involves working on data pipelines, transformations, and analytics using Databricks and AWS.

Responsibilities:
- Develop and optimize data pipelines using Databricks, PySpark, and Python
- Work with AWS services and SQL-based data processing
- Manage code and versioning using Git
- Troubleshoot and optimize data workflows

Requirements:
- Strong hands-on experience with Databricks, PySpark, and Python
- Good knowledge of AWS data services and SQL
- Experience with Git and collaborative development
- Ability to deliver within short timelines