Data Engineer [Rekrutacja prowadzona online]

Posted: Monday, 3 August 2020, 11:00
  • Category: Job offers / Other
  • Work type: Other
  • Contract: Other
  • Additional: No requirements
  • Employer: PwC Polska

Job description:

PwC is a powerful network of over 250,000 people across 157 countries, all committed to delivering quality in Assurance, Tax, Advisory & Technology services. Match your curiosity with continuous opportunities to learn, grow and make an impact. Join PwC and be a game changer.
We are looking for candidates eager to work in the field of data engineering in a Big Data environment. You would join a long-term project for PwC US, developing and maintaining a data lake of consumer data with over 50 data sources and ca. 40 TB total volume. Additionally, you will have a chance to join similar efforts for the Polish market, which are at an early stage and offer a great opportunity to implement your own ideas and show creativity.
At PwC Data Analytics we aim to deliver end-to-end solutions by collaborating closely with Business Experts from other PwC teams (Strategy & Operations, Financial Services, Digital Transformation, etc.), Developers and IT professionals.
Our team employs specialists in Machine Learning, Big Data solutions and architecture, statistics and its applications, and Deep Learning, with varied backgrounds (computer science, mathematics, physics, engineering and economics) and levels of seniority (from juniors with 1-2 years of experience to country leaders with more than 10 years on the market).

Your responsibilities:

- Development of data lakes and data marts

- Designing, implementing and monitoring ETLs

- Integration of various data sources into concise data products

- Evaluation of new data vendors

- Migrating data processing to a cloud environment

- Data quality control

- Ad-hoc data extracts

- Writing documentation

Our requirements:

- At least 3 years of professional experience

- Working experience with Hadoop ecosystem (Spark, Hive, YARN) or MS Azure (HDInsight, Databricks)

- Good knowledge of Python

- Practical knowledge of version control in git

- At least intermediate knowledge of SQL

- At least B2 English, written and spoken - this is a must, as communication with the main project stakeholders is in English

- Experience with analysis of large data sets

- Preferred fields of study: Mathematics, Computer Science, Physics, Operations Research or related

- Interest in Big Data and willingness to learn new tools and solutions

- Communication skills

Additional assets will be:

- Working knowledge of Scala

- Familiarity with Linux and bash scripting

- Fair knowledge of algorithms and data structures

We offer:

- Working in an international team

- Flexible working hours and the option of home office

- A broad offer of technical trainings and conferences

- Subsidized language courses

- Regular team building initiatives, including hackathons, parties and away-days

- Dynamic, project driven work environment

- Excellent working conditions and friendly working atmosphere

- Attractive compensation with additional benefits package