We are seeking an experienced Data Engineer to design, build, and optimize large-scale data platforms that support advanced analytics and reporting. The role involves working closely with cross-functional teams to translate business needs into reliable, scalable data solutions while ensuring data quality, governance, and performance.
Key Responsibilities:
Analyze existing data structures and workflows to identify inefficiencies and improvement opportunities
Design and implement optimized data models, schemas, and data pipelines for analytics and reporting
Build and maintain scalable data platforms including data lakes and data marts
Collaborate with business and technical teams to define and implement actionable data requirements
Develop and enforce data governance, quality assurance, and security strategies to ensure the compliance, reliability, and accuracy of data systems
Provide technical leadership and mentorship on data architecture and best practices
Build and maintain monitoring, logging, and alerting for data pipelines
Establish and maintain SLAs covering pipeline performance, data freshness, and data availability
Stay current with industry trends and emerging data technologies
Required Qualifications:
Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
6+ years of experience architecting and building large-scale data platforms
Proven experience as a Data Engineer or in a similar role
Strong background in database design and development
Proficiency in SQL, data modeling, and ETL processes
Experience with data lakes, data marts, and analytics platforms
Knowledge of reporting tools such as Looker, Power BI, or similar
Experience with Google BigQuery or equivalent data warehouses
Familiarity with cloud platforms (GCP preferred)
Experience with big data technologies such as Hadoop and Spark
Strong problem-solving skills and attention to detail
Excellent communication and collaboration skills
Nice-to-Have Skills:
Experience with event-driven architectures and streaming platforms such as Kafka
Experience building real-time data processing systems with stream-processing frameworks such as Spark Streaming or Flink
Knowledge of PostgreSQL