Wavelynx is seeking an experienced Senior Data Platform Engineer to play a pivotal role in shaping and maintaining our core data infrastructure. You will be a key contributor, responsible for the reliability, scalability, and evolution of our data lake and orchestration systems. This role requires a strong blend of software engineering principles, deep data expertise, and a proactive approach to building robust, future-proof data solutions that power our analytics, reporting, and operational needs. If you're passionate about owning critical data systems and enabling data-driven decision-making across an organization, we encourage you to apply!
What You'll Do:
Own and maintain Airflow orchestration infrastructure: Ensure the reliability, efficiency, and scalability of our data workflows through expert management and continuous improvement of our Airflow environment.
Oversee the architecture and health of the data lake: Design, implement, and maintain the core architecture of our data lake, including robust pub/sub mechanisms for handling credential updates and efficient event-driven ingestion processes.
Manage critical data integrations: Take ownership of data integrations with key financial systems (e.g., NetSuite), ensuring the accurate and timely delivery of critical reporting datasets to our Finance department and external partners (e.g., Apple).
Develop and evolve the data lake as a central platform: Continuously enhance and expand our data lake to serve as a versatile and reliable central platform, supporting future analytics initiatives, comprehensive reporting, and seamless cross-team data access.
Implement best practices for data quality, security, and governance within the data platform.
Collaborate closely with data analysts, data scientists, and other engineering teams to understand data needs and provide scalable solutions.
Troubleshoot complex data issues, identify root causes, and implement long-term solutions.
Stay abreast of industry trends and emerging technologies in data engineering and cloud platforms, evaluating and recommending new tools and approaches.
What You'll Bring:
Bachelor's degree in Computer Science, Engineering, or a related technical field.
5+ years of progressive experience in data engineering, software engineering with a data focus, or data platform roles.
Expert-level proficiency with Apache Airflow: Demonstrated experience in designing, deploying, managing, and troubleshooting complex Airflow DAGs and infrastructure.
Strong experience with cloud data platforms (e.g., AWS, GCP, Azure) and their respective data storage and processing services (e.g., S3, ADLS, GCS; Redshift, Snowflake, BigQuery; EMR, Dataproc, Glue).
Deep understanding and hands-on experience with building and maintaining data lakes, including various ingestion patterns (batch, streaming, event-driven).
Proficiency in at least one modern programming language (e.g., Python, Scala, Java) with a strong emphasis on writing clean, maintainable, and efficient code.
Strong SQL skills and experience working with relational and/or analytical databases.
Experience integrating with third-party APIs and enterprise systems (e.g., ERPs like NetSuite).
Familiarity with pub/sub messaging systems (e.g., Kafka, Kinesis, Pub/Sub).
Experience with version control systems (e.g., Git).
Excellent problem-solving abilities, strong analytical skills, and meticulous attention to detail.
Ability to work independently, prioritize tasks effectively, and manage multiple projects simultaneously.
Strong communication and interpersonal skills, with the ability to explain complex technical concepts to non-technical stakeholders.
Compensation and Benefits:
Salary: $130,000 - $145,000
Potential equity for some roles
Great rates on company-sponsored medical, dental, and vision plans, with HSA-eligible options available
401(k) with up to a 6% company match
Paid holidays, vacation, and sick leave