Data Engineer
We are seeking a skilled Data Engineer to join our expanding team of experts.
This role is pivotal in designing and developing Snowflake Data Cloud
solutions, with responsibilities including building data ingestion pipelines,
establishing sound data architecture, and implementing stringent data
governance and security protocols.
The ideal candidate is a proficient data pipeline builder and adept data
wrangler who enjoys optimizing data systems from the ground up. Working closely
with database architects, data analysts, and data scientists, the Data Engineer
will play a crucial role in ensuring consistent, optimal data delivery across
ongoing customer projects.
This position demands a self-directed individual comfortable navigating the
diverse data needs of multiple teams, systems, and products. If you are
excited to contribute in a startup environment and to support our customers in
their next generation of data initiatives, we invite you to explore this
opportunity.
Roles & Responsibilities:
- Build and optimize scalable data ingestion pipelines using Snowflake.
- Design and implement secure, governed data architectures.
- Collaborate with cross-functional teams to ensure reliable data delivery.
- Work on ETL/ELT development using tools like Fivetran, Matillion, or dbt.
- Support and improve data infrastructure on cloud platforms (AWS, Azure,
GCP).
- Write efficient, production-ready SQL and Python code.
- Partner with stakeholders to support data initiatives across multiple
projects.
Qualifications & Skills
- Bachelor’s degree in Engineering, Computer Science, or a related technical discipline.
- 3–6 years of experience in relevant technical roles, with demonstrated expertise in data management, database development, ETL processes, and data preparation.
- Minimum of 1 year of hands-on experience with the Snowflake Data Cloud platform, including architecture design, data modeling, and solution implementation.
- Proven track record in developing data warehouse solutions and building ETL/ELT ingestion pipelines.
- Strong capability in processing and deriving insights from large, diverse datasets.
- Proficient in SQL and Python scripting; knowledge of Scala and JavaScript is a plus.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Hands-on experience with ETL/ELT tools compatible with Snowflake, such as Matillion and Fivetran.
- Working knowledge of dbt (data build tool) is an added advantage.
- Excellent interpersonal and communication skills, with the ability to establish and maintain strong client relationships.
- Strong project management and organizational abilities.
- Collaborative mindset, with the ability to work effectively in cross-functional and Agile team environments.
- Full professional proficiency in English (written and verbal) is required.
Signs You May Be a Great Fit
- Impact: Play a pivotal role in shaping a rapidly growing venture studio.
- Culture: Thrive in a collaborative, innovative environment that values creativity and ownership.
- Growth: Access professional development opportunities and mentorship.
- Benefits: Competitive salary, health/wellness packages, and flexible work options.