GCP Data Engineer
Fractal
It’s fun to work in a company where people truly BELIEVE in what they are doing!
We’re committed to bringing passion and customer focus to the business.
Design and develop data-ingestion frameworks, real-time processing solutions, and data transformation frameworks leveraging open-source tools.
Hands-on experience with technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, the Hadoop platform, Hive, Presto, Druid, and Airflow.
Deep understanding of BigQuery architecture, best practices, and performance optimization.
Proficiency in LookML for building data models and metrics.
Experience with DataProc for running Hadoop/Spark jobs on GCP.
Knowledge of configuring and optimizing DataProc clusters.
Offer system support as part of a support rotation with other team members.
Operationalize open source data-analytic tools for enterprise use.
Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification.
Understand and follow the company's development lifecycle to develop, deploy, and deliver solutions.
Minimum Qualifications:
• Bachelor’s degree in Computer Science, CIS, or related field
• Experience on projects involving the implementation of the software development life cycle (SDLC)
If you like wild growth and working with happy, enthusiastic over-achievers, you’ll enjoy your career with us!
Not the right fit? Let us know you’re interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts as new job postings that match your interests become available!
Apply now
To help us track our recruitment effort, please indicate in your cover/motivation letter where you saw this job posting (jobs-near-me.eu).