We are hiring a highly skilled DevOps Engineer with strong expertise in cloud platforms, automation, and distributed systems. The role requires hands-on experience with GCP, Databricks, Python, and PySpark, with a focus on building tools to automate and monitor large-scale systems.
Responsibilities:
Design, build, and maintain scalable and reliable DevOps solutions.
Automate deployment, monitoring, and system operations.
Manage and optimize workloads on GCP and/or Databricks.
Develop scripts and applications using Python and PySpark.
Troubleshoot system issues and improve system performance.
Collaborate with development, testing, and operations teams.
Requirements:
6+ years of overall IT experience.
Minimum 2 years of experience with GCP or Databricks (experience with both preferred).
Minimum 2 years of experience with Python and PySpark.
Proven experience building automation and monitoring tools for large-scale distributed systems.
Strong problem-solving, debugging, and system-optimization skills.
Interview Process:
1st Round: Virtual Interview
2nd Round: Face-to-Face Interview (Bangalore – Whitefield)