Open Source Data Platform Developer for TDP
- Categories: Big Data
- Tags: DevOps, Hadoop, Databricks, Kubernetes, TDP
Job Description
We are seeking an experienced developer to join our team and build TDP, the 100% open source data platform.
In this role, you will design, develop, and implement an innovative data platform that serves as the backbone of large organizations’ data infrastructure.
Responsibilities
- Work with stakeholders to identify data requirements and develop a roadmap for the data platform
- Develop and implement a scalable and reliable data platform architecture
- Design and build data pipelines to integrate data from various sources into the platform (see the sketch after this list)
- Create and maintain data models, schemas, and database structures
- Implement robust data security and access controls
- Develop tools and features for data analysis and visualization
- Contribute to open source communities and collaborate with other developers to improve the platform
- Write high-quality, maintainable, and scalable code
- Work closely with cross-functional teams to ensure the data platform meets business requirements
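To give a concrete, non-authoritative flavour of the pipeline work mentioned in the list above, here is a minimal sketch of a Spark Structured Streaming job that ingests a Kafka topic into HDFS, using technologies listed in the required skills below. The broker address, topic name, and paths are hypothetical, and the job assumes the Spark Kafka connector is available on the cluster.

```python
# Illustrative sketch only: stream events from a Kafka topic into HDFS as Parquet.
# Broker, topic, and paths are placeholders, not part of TDP itself.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("events-ingestion").getOrCreate()

# Read the raw event stream from Kafka and keep key, value, and timestamp.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "events")
    .load()
    .select(
        col("key").cast("string"),
        col("value").cast("string"),
        col("timestamp"),
    )
)

# Append the stream to HDFS as Parquet files, with a checkpoint for recovery.
query = (
    events.writeStream.format("parquet")
    .option("path", "hdfs:///data/raw/events")
    .option("checkpointLocation", "hdfs:///checkpoints/events")
    .start()
)
query.awaitTermination()
```

A production pipeline would add schema enforcement, partitioning, and access controls on top of this skeleton, but the overall shape of the work is the same.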
If you are passionate about open source software and data-driven technologies, want to make a difference in the world of data, and would like to become an open source contributor, we would love to hear from you.
This is a full-time position located in the Paris region with competitive compensation and benefits.
Required Skills
- At least 3 years of experience in software development
- Experience in designing and building large-scale, distributed systems
- Strong proficiency in programming languages such as Python, Java, or Go
- Experience with database systems such as MySQL, PostgreSQL, or Elasticsearch
- Familiarity with open source technologies such as Kafka, Spark, and Hadoop
- Knowledge of data modeling and data architecture
- Familiarity with data security and access controls
- Excellent problem-solving skills and the ability to work independently or in a team environment
- Strong communication skills and the ability to work collaboratively with cross-functional teams
Company presentation
Adaltas specializes in data processing and storage. We operate on-premises and in the cloud to empower our clients’ teams in architecture, operations, data engineering, data science and DevOps.
We are active contributors to the TDP project, working closely with our customers. We invite you to consult Alliage, our support and consulting offer dedicated to TDP.
We are looking for big data experts familiar with the Hadoop and Kubernetes ecosystems. Anyone with a solid background in Linux operations and an interest in distributed systems is also welcome to apply.
Remuneration
Compensation is based on your experience and skills. Salaries for this position start at €66,000 per year with 3 years of experience. Meals and transportation are covered. We provide a Linux laptop with at least 32 GB of RAM, a 1 TB SSD, and an 8-core/16-thread CPU.
Contact
For additional information and to submit your application, please send us an email or contact David Worms directly:
- david@adaltas.com
- +33 6 76 88 72 13
- https://www.linkedin.com/in/david-worms/
Looking for new challenges? If no job description matches your expectations and becoming a consultant at Adaltas is your career choice, submit an unsolicited application.
Open job opportunities
Python developer for TDP, the open-source data platform
Contribute to the open-source TDP project and participate in building the server and engine of the platform.
Data Engineer Databricks and Azure - mid level developer
Collaborate with other data engineers, business analysts and data scientists to solve challenging business problems on the Databricks and Azure platforms
Big data architect with CDP - senior developer
Design and develop solutions including platform architecture, data ingestion, data lakehouse architecture and data science usages.
Data streaming engineer - mid level developer
Prototype, build, deploy and operate data ingestion pipelines on critical infrastructure, generate KPIs with real-time queries.
Big data administrator Cloudera CDP - mid level developer
Deploy and operate big data clusters based on the Cloudera CDP platform.