
Data Engineers (refnr:22)

Core Competence:

  • BSc, MSc or PhD degree in Computer Science, Informatics, Information Systems or equivalent
  • 5 years’ experience in data engineering
  • Excellent communication skills in written and spoken English
  • Experience building ‘big data’ data pipelines, architectures and data sets
  • Experience building processes supporting data transformation and workload management
  • Knowledge of message queuing, stream processing, and scalable ‘big data’ data stores
  • Experience in Data warehouse design and dimensional modeling
  • Contributor to, or owner of, a GitHub repository

3–5 years’ experience using the following software/tools:

  • NoSQL databases such as Cassandra, Solr, MongoDB, etc.
  • Tools/software for big data processing such as Hadoop, Spark
  • Handling data streams with tools such as Flink, Spark-Streaming, Kafka or Storm
  • Data and model pipeline tools such as Azkaban, Luigi, Airflow or Dataiku
  • Docker containers, orchestration systems (e.g. Kubernetes) and job schedulers
  • Serverless architectures (e.g. Lambda, Kinesis, Glue)
  • Microservices and REST APIs
  • SQL and traditional relational database systems (RDBMS)
  • Experience in leading projects in the field of data engineering
  • Strong analytical skills and ability to acquire new knowledge
  • Leadership interest and skills
  • Outgoing and easy to cooperate with

Programming and scripting languages:

  • Python
  • Java
  • C++
  • Scala
  • Go