
What the company says about the job

For our client in Berlin, we are currently looking for a freelance Realtime Data Engineer (m/f/d).

#Key Facts

  • Start: 20.06.2024
  • End: 31.12.2024 (with the option of extension)
  • Capacity: 100%
  • Location: 95% Remote, 5% Berlin

#Tasks/Activity Description

  • Design, develop, and maintain real-time data pipelines and streaming applications (see the sketch after this list).
  • Architect and implement streaming data ingestion processes from various sources, ensuring reliability, scalability, and low-latency processing.
  • Optimize and tune performance of real-time data processing systems for efficiency and throughput.
  • Implement monitoring, alerting, and logging solutions to ensure the health and reliability of real-time data infrastructure.
  • Develop, maintain, and use deployment pipelines (following the infrastructure-as-code paradigm).
  • Produce clean, efficient code based on specifications and guidelines.
  • Take self-directed ownership of the assigned software development track and incidents.
  • Collaborate with peers on the assigned projects, such as TMD architects, other members of the Scrum product team, subject-matter experts, and data architects.
  • Professionally maintain all software and create updates regularly to address customer and company concerns.
  • Analyze and test programs and products before formal launch.
  • Troubleshoot coding problems quickly and efficiently to ensure a productive workplace.
  • Actively seek ways to improve business software processes and interactions.
  • Prepare training materials and deliver training to other project team members on the use of software applications.
  • Join daily stand-up meetings and project meetings on site or remotely, and work in a scaled Scrum environment.
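
For orientation only, here is a minimal sketch of the kind of real-time pipeline work described above: a Python loop that consumes events from one Kafka topic, applies a small transformation, and produces the result to a downstream topic. It assumes the confluent-kafka client; the broker address, consumer group, and topic names (raw-events, clean-events) are illustrative placeholders, not details of this project.

```python
import json
import time

from confluent_kafka import Consumer, Producer

# Placeholder connection settings -- not taken from the project description.
BROKER = "localhost:9092"

consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "demo-pipeline",        # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": BROKER})

consumer.subscribe(["raw-events"])      # hypothetical source topic

try:
    while True:
        msg = consumer.poll(1.0)        # wait up to 1 s for the next record
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue

        # Minimal processing step: parse the event and enrich it with a timestamp.
        event = json.loads(msg.value())
        enriched = {**event, "processed_at": time.time()}

        # Forward the enriched record to a downstream topic.
        producer.produce("clean-events", json.dumps(enriched).encode("utf-8"))
        producer.poll(0)                # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()
```

Scaling such a consumer across several instances in the same consumer group, adding schema validation, and wiring in metrics and alerting are the kinds of reliability, throughput, and monitoring concerns the tasks above refer to.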


#Offer/Profile Requirements

  • Excellent problem-solving and communication skills
  • German language skills


#Must-Haves

  • Bachelor's degree or higher in Computer Science, Engineering, or related field.
  • 5 years of experience in data engineering, with a focus on real-time data processing.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Strong experience with real-time data processing frameworks such as Apache Kafka, Apache Flink, or Apache Spark Streaming.
  • Solid understanding of distributed computing principles and microservices architecture.
  • Experience with on-premises Kubernetes platforms as well as with cloud platforms such as AWS, GCP, or Azure.
  • English language skills (Level C1)

If you are interested in this project, I look forward to receiving your current CV along with information about your hourly rate.