Cloud Engineer (Snowflake)
Ref. no.: 8/12/2024/JW/89161

At Antal we have been dealing with recruitment for over 20 years. Because we operate in 10 specialised divisions, we have an excellent grasp of current industry trends. We precisely determine the specific nature of each job, identifying the key skills and necessary qualifications. Our mission is not only to find a candidate whose competences fit the requirements of a given job advertisement, but above all a position that meets the candidate's expectations. Employment agency registration number: 496.
Role Description
As a Cloud Engineer specializing in Snowflake, you will play a pivotal role within our Insights & Data team, contributing to the development of advanced data solutions. Your primary responsibilities will include:
- Designing, developing, and maintaining Snowflake data pipelines that support various business functions.
- Collaborating with cross-functional teams to understand data requirements and implement scalable solutions.
- Optimizing data models and schemas for performance and efficiency.
- Ensuring data integrity, quality, and security throughout the data lifecycle.
- Implementing monitoring and alerting systems to identify and address issues proactively.
- Planning and executing migrations from on-premises data warehouses to Snowflake.
- Developing AI, ML, and Generative AI solutions.
- Staying up to date with Snowflake best practices and emerging technologies to drive continuous improvement.
Company Description
Our company is renowned for delivering state-of-the-art Data solutions, primarily focusing on Cloud & Big Data engineering. We develop robust systems capable of processing extensive and complex datasets, utilizing specialized Cloud Data services across platforms like AWS, Azure, and GCP. Our expertise spans the entire Software Development Life Cycle (SDLC) of these solutions, with a strong emphasis on leveraging data processing tools, extensive programming, and the adoption of DevOps tools and best practices. Furthermore, within our AI Center of Excellence, we undertake Data Science and Machine Learning projects, focusing on cutting-edge areas such as Generative AI, Natural Language Processing (NLP), Anomaly Detection, and Computer Vision. We pride ourselves on fostering a culture of innovation and excellence.
Requirements
- Minimum of 3 years of experience in Big Data or Cloud projects, specifically in processing and visualization of large and/or unstructured datasets, including at least 1 year of hands-on Snowflake experience.
- Understanding of Snowflake's pricing model and cost optimization strategies for managing resources efficiently.
- Experience in designing and implementing data transformation pipelines, either natively in Snowflake or with Snowflake partner tools.
- Familiarity with Snowflake’s security model.
- Practical knowledge of at least one public cloud platform across Storage, Compute (including serverless), Networking, and DevOps, backed by commercial project experience.
- At least basic knowledge of SQL and one of the programming languages: Python, Scala, Java, or bash.
- Proficient command of English.
What We Offer
- Permanent employment contract from the first day.
- Hybrid, flexible working model.
- Equipment package for home office.
- Private medical care with Medicover.
- Life insurance.
- NAIS benefit platform.
- Access to 70+ training tracks with certification opportunities, and platforms with free access to Pluralsight, TED Talks, Coursera, Udemy Business, and SAP Learning HUB.
- Community Hub with over 20 professional communities focused on areas such as Salesforce, Java, Cloud, IoT, Agile, AI.
Recruitment Process
- Initial Application Review
- Preliminary Acceptance
- Telephone Interview
- Client Verification
- Job Interview
- Offer
- Screening
We encourage qualified candidates to apply via the application form.