Senior Data Engineer
Engineering
Glendale, California, 91201
Contract
Ref.: 140104
Job Summary:
Our client is seeking a Senior Data Engineer to join their team! This position is located in Glendale, California.
Duties:
- Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
- Build tools and services to support data discovery, lineage, governance, and privacy
- Collaborate with other software and data engineers and cross-functional teams
- Work with a tech stack that includes Airflow, Spark, Databricks, Delta Lake, Kubernetes, and AWS
- Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
- Contribute to developing and documenting internal and external standards and best practices for pipeline configurations, naming conventions, and more
- Ensure high operational efficiency and quality of Core Data platform datasets to meet SLAs and ensure reliability and accuracy for stakeholders in Engineering, Data Science, Operations, and Analytics
- Participate in agile and scrum ceremonies to collaborate and refine team processes
- Engage with customers to build relationships, understand needs, and prioritize both innovative solutions and incremental platform improvements
- Maintain detailed documentation of work and changes to support data quality and data governance requirements
Requirements:
- 5+ years of data engineering experience developing large data pipelines
- Proficiency in at least one major programming language, such as Python, Java, or Scala
- Strong SQL skills and the ability to create queries to analyze complex datasets
- Hands-on production experience with distributed processing systems such as Spark
- Experience interacting with and ingesting data efficiently from API data sources
- Experience coding with the Spark DataFrame API to create data engineering workflows in Databricks
- Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
- Experience developing APIs with GraphQL
- Deep understanding of AWS or other cloud providers, as well as infrastructure-as-code
- Familiarity with data modeling techniques and data warehousing best practices
- Strong algorithmic problem-solving skills
- Excellent written and verbal communication skills
- Advanced understanding of OLTP versus OLAP environments
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401(k) offered