We are seeking a talented and dynamic Data Engineer with expertise in Google Cloud Platform (GCP) and Python development to join our team. In this role, you will help build scalable and efficient data pipelines that support our data-driven initiatives. You will leverage cutting-edge technologies in the cloud space to deliver accurate and actionable insights, empowering teams to make data-informed decisions and drive business growth.
As a GCP and Python Data Engineer, you will work closely with cross-functional teams (marketing, digital, customer care, product, IT) to deliver data-driven solutions that enrich customer data, optimize processes, and build long-term customer relationships. You will be responsible for transforming raw data into meaningful insights and delivering reliable data pipelines that directly support business goals.
Responsibilities
- Design, implement, and maintain scalable and efficient data pipelines within the Google Cloud Platform (GCP) ecosystem, utilizing industry-standard tools such as Cloud Data Fusion, Cloud Composer, Python, and Dataflow.
- Optimize BigQuery tables and queries for performance, and use Cloud Functions for event-driven automation.
- Implement real-time messaging with Pub/Sub and ensure proper monitoring and alerting via Cloud Monitoring.
- Develop, manage, and optimize solutions for the company’s customer data systems, with a strong focus on maintaining data quality, synchronization, and orchestration for loyalty and credit data.
- Work in an Agile environment, applying SDLC principles and Scrum methodology. Use tools such as Jira, Confluence (wiki), Bitbucket/GitHub, and Atlassian Bamboo for continuous integration and deployment.
- Ensure adherence to software security best practices and product scalability. Apply general data warehousing principles, maintain comprehensive documentation, and follow best practices in refactoring and testing to ensure system reliability and maintainability.
- Use Terraform for infrastructure automation and provisioning.
- Demonstrate excellent communication and documentation skills, especially when working with business users.
Qualifications:
Must Have:
- 3+ years of hands-on experience in software development, with a Bachelor's degree in a relevant field of study.
- Strong knowledge of cloud technologies and distributed computing, specifically with Google Cloud Platform (GCP) or AWS.
- Experience with cloud-based big data analytics platforms such as BigQuery, Bigtable, Redshift, or Snowflake.
- Solid background and hands-on experience in Business Intelligence (BI), Data Warehousing, and ETL projects, including work with DWH appliances.
- In-depth experience with Python and SQL, the foundational technologies for data processing and manipulation.
- Hands-on experience with ETL/ELT processes, managing large datasets, and building data pipelines.
- Experience using version control, CI/CD, and automation tools.
- Proficiency with open-source tools and technologies, and the ability to use and extend them where appropriate to develop solutions.
- Excellent analytical skills, with a proven ability to troubleshoot, clean, and prepare data for analysis.
- Experience with data quality and data cleansing.
Benefits