Onix

Technical Lead

Onix United States

About Onix:


Onix is a trusted cloud consulting company that helps organizations get the most out of their technology with cloud-powered solutions, best-in-class services, and the Datametica Birds, a suite of data migration products that unleash AI potential.


We deliver exceptional results for our customers because of our 20+ years of cloud experience, depth of technology expertise, and IP-driven data and AI solutions.


We offer solutions across a wide range of use cases and industries that are tailored to the unique needs of each customer. From advanced cloud security solutions to innovative AI capabilities and data migration products, we have you covered. Our global team of experts is among the most reliable, talented, and knowledgeable in the industry.


Summary:


Onix seeks an experienced GCP Technical Lead (Data Platform) with a strong background in managing and optimizing data platforms. The ideal candidate will have hands-on primary expertise in PostgreSQL, Starburst/Trino, DBT, Dataform, BigLake (GCS), BigQuery, Cloud Observability, and GCP services. They should also possess secondary skills in Spark, Dataproc, Dataflow, Python, and Airflow, along with experience in data and database migration.


Role: Technical Lead

Location: Remote


Primary Responsibilities:


● Provide technical leadership and guidance to the data platform team.

● Oversee the design, implementation, and maintenance of scalable, high-performance data platforms.

● Analyze source data solutions such as PostgreSQL and Starburst/Trino, then architect, optimize, and migrate them to BigQuery and other GCP services.

● Ensure data models and architectures meet business requirements and best practices.

● Lead data migration projects, including:

  ○ Migration from PostgreSQL to BigQuery or Cloud SQL.

  ○ Migration of Spark workloads to Dataproc.

  ○ Migration of historical data from AWS to GCP.

● Develop and implement migration strategies, ensuring data integrity, security, and minimal downtime.

● Design and implement robust ETL/ELT processes using DBT, Dataform, and other tools.

● Ensure data pipelines are efficient, maintainable, and meet performance requirements.

● Manage and optimize BigLake (GCS) and BigQuery environments.

● Implement best practices for data storage, partitioning, and querying to ensure cost-effectiveness and performance.

● Utilize GCP services to enhance data platform capabilities.

● Implement and manage cloud observability tools to monitor data pipelines and infrastructure health.

● Use Spark, Dataproc, Dataflow, and Python for complex data processing tasks.

● Implement orchestration and scheduling of data workflows using Airflow.

● Work closely with data engineers, data scientists, and other stakeholders to meet data requirements.

● Mentor team members and provide technical training and guidance.


Minimum qualifications:


● 8+ years of experience in data engineering, data architecture, and technical leadership.

● 5+ years of overall experience architecting, developing, testing, and implementing data platform projects using GCP components (e.g., BigQuery, Dataflow, Dataproc, DLP, Bigtable, Pub/Sub, Composer, etc.).

● Good understanding of data structures.

● Experience working with large datasets and solving difficult analytical problems.

● Experience with Git for source code management.

● Experience with structured and unstructured data.

● End-to-end data engineering and lifecycle management (including non-functional requirements and operations).

● Experience working with client teams to design and implement modern, scalable data solutions using a range of new and emerging Google Cloud Platform technologies.

● Experience automating manual processes to speed up delivery.

● Good understanding of data pipelines (batch and streaming) and data governance.

● Experience deploying code from lower environments to production.

● Good communication skills for gathering and understanding business requirements.

● Ability to manage a team of 3-4 people.

● Ability to work closely with an offshore team in India.


Preferred qualifications:


● Mandatory skills: PostgreSQL, Starburst/Trino, DBT, Dataform, BigLake (GCS), BigQuery, Cloud Observability, and GCP services.

● Secondary skills: Spark, Dataproc, Dataflow, Python, Airflow.

● Knowledge of ETL migration from on-premises environments to GCP.

● SQL Performance Tuning

● Batch/Streaming Data Processing

● Fundamentals of Kafka and Pub/Sub for handling real-time data feeds.

● Good to have: certifications such as GCP Professional Cloud Architect or GCP Professional Data Engineer.

● Ability to communicate with customers, developers, and other stakeholders.

● Ability to mentor and guide team members.

● Good presentation skills.

● Strong team player.


Education:


  • Bachelor's degree in Computer Science or equivalent practical experience.
  • Travel Expectation: Up to 15% depending on project need


It is the policy of Onix to ensure equal employment opportunity in accordance with the Ohio Revised Code 125.111 and all applicable federal regulations and guidelines. Employment discrimination against employees and applicants due to race, color, religion, sex (including sexual harassment), national origin, disability, age (40 years old or more), military status, or veteran status is illegal. Onix will only employ those who are legally authorized to work in the United States or Canada.

  • Seniority level: Mid-Senior level
  • Employment type: Full-time
  • Job function: Information Technology
  • Industries: IT Services and IT Consulting
