Company: Veritaz AB
Employment type: Full Time
Link: https://ec.europa.eu/eures/portal/jv-se/jv-details/MzEzMzAyOCAxMjE?jvDisplayLanguage=sv
Application deadline:

About the position

Veritaz is a fast-growing IT consulting firm. Our company is made up of insanely bright people from over 4 countries, with offices in Sweden, the UK, the US, and Pakistan. The journey has been incredible so far, but it is only the beginning.

Assignment description:

We are looking for a Data Engineer who is experienced in data modelling and DI tools and is interested in being part of a multi-platform agile solutions environment. Do you value openness, transparency, and empowerment? Our squad is a high-performing, cross-functional team on a mission to provide win-win exchanges for our customers.

What you'll do:

● Set the technical direction, and create and maintain optimal data pipeline architecture.
● Work in a multi-functional agile team to assemble large, complex data sets that meet functional and non-functional business requirements.
● Work with state-of-the-art data processing frameworks, technologies, and cloud platforms.
● Help drive optimization, testing, and tooling to improve data quality; be an active member of the Data Engineering community and collaborate with other engineers across the cluster.
● Increase developer productivity by building innovative tools that reduce the maintenance overhead of working with data pipelines and help increase platform reliability.
● Advocate and advance modern, agile software development, and help develop and foster good engineering practices.
● Help ensure that solutions are scalable, sustainable, and architecturally sound, and that technical debt is incurred consciously and repaid in a reasonable time.
● Demonstrate and champion an appetite for knowledge, and never stop developing as a Data Engineer.

Who you are:

● Deeply knowledgeable and passionate about modern data architecture principles, e.g. data lake, data warehouse, and data mesh.
● You have a steady foundation in coding, are comfortable working with data in Python and SQL, and have worked with cloud technologies, e.g. AWS and Azure.
● You have worked with modern data formats such as Parquet, and have experience designing, developing, and deploying data pipelines based on the use case, e.g. batch vs. streaming.
● Extensive hands-on experience as a Data Engineer developing large-scale data solutions in an agile environment.
● Knowledgeable about data modelling, data access, and data storage techniques, and passionate about crafting clean code and test-driven development.
● You know and care about sound engineering practices like continuous integration and delivery, defensive programming, and automated testing.
● Pragmatic: you understand the trade-offs between the perfect solution and a working solution.
● Experience with Kubernetes, data security and privacy, distributed computing (such as Spark, Ray, or Dask), and microservices.