About the Role:
We are seeking a highly skilled and experienced Data Engineering Architect to join our growing team. As a Data Engineering Architect, you will play a critical role in designing, building, and scaling Google’s massive data infrastructure and platforms. You will be a technical leader and mentor, driving innovation and ensuring the highest standards of data quality, reliability, and performance.
Responsibilities:
Design and Architecture:
Design and implement scalable, reliable, and efficient data pipelines and architectures for various Google products and services.
Develop and maintain data models, schemas, and ontologies to support diverse data sources and use cases.
Evaluate and recommend new and emerging data technologies and tools to improve Google’s data infrastructure.
Collaborate with product managers, engineers, and researchers to define data requirements and translate them into technical solutions.
Data Processing and Pipelines:
Build and optimize batch and real-time data pipelines using Google Cloud Platform (GCP) services such as Dataflow, Dataproc, Pub/Sub, and Cloud Functions (a brief illustrative sketch follows this list).
Develop and implement data quality checks and validation processes to ensure data accuracy and consistency.
Design and implement data governance policies and procedures to ensure data security and compliance.
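For illustration only, and not part of the formal role description: the streaming-pipeline work above might resemble the following minimal Apache Beam sketch targeting the Dataflow runner. The project, topic, table, and schema names are placeholders, not real resources.

```python
# Minimal sketch: read JSON events from Pub/Sub, parse them, and append
# them to BigQuery. All resource names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode; Dataflow runner and project flags would be supplied here.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events"  # placeholder topic
            )
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",              # placeholder table
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

In practice the parse step would also carry the data quality checks mentioned above (schema validation, dead-lettering of malformed records) before anything reaches the warehouse.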
Data Storage and Management:
Design and implement scalable data storage solutions using GCP services such as BigQuery, Cloud Storage, and Spanner (see the sketch after this list).
Optimize data storage and retrieval for performance and cost-effectiveness.
Implement data lifecycle management policies and procedures.
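Likewise illustrative only, assuming the google-cloud-bigquery client and a hypothetical analytics.events table: one way the storage and lifecycle work above can come together is a partitioned, clustered table whose partitions expire automatically.

```python
# Sketch: create a day-partitioned, clustered BigQuery table with a
# one-year partition expiration. Project, dataset, and field names are
# placeholders chosen for illustration.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("event_id", "STRING"),
    bigquery.SchemaField("ts", "TIMESTAMP"),
    bigquery.SchemaField("payload", "STRING"),
]

table = bigquery.Table("example-project.analytics.events", schema=schema)

# Partition by event timestamp and drop partitions after ~1 year,
# which handles lifecycle management at the storage layer.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="ts",
    expiration_ms=365 * 24 * 60 * 60 * 1000,
)

# Clustering improves scan performance and cost for common filter columns.
table.clustering_fields = ["event_id"]

client.create_table(table)
```

Partitioning plus clustering is a common lever for the performance and cost-effectiveness goals above, since queries prune partitions and scan fewer bytes.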
Team Leadership and Mentorship:
Provide technical leadership and guidance to data engineers and other team members.
Mentor and coach junior engineers to develop their skills and expertise.
Foster a culture of innovation and collaboration within the team.
Qualifications:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
8+ years of experience in data engineering or a related field.
Strong understanding of data warehousing, data modeling, and ETL processes.
Expertise in designing and implementing large-scale data pipelines and architectures.
Proficiency in SQL and at least one programming language such as Python or Java.
Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
Experience with open-source data processing frameworks such as Hadoop, Spark, and Kafka.
Excellent communication, interpersonal, and collaboration skills.
Preferred Qualifications:
Experience with data governance and data quality management.
Experience with machine learning and data science.
Experience with containerization and orchestration technologies such as Docker and Kubernetes.
Contributions to open-source projects or communities.
Google Cloud Professional Data Engineer certification.