Infrastructure as code (IaC)
The IaC concept applies the principles of agile development to the provisioning and operation of IT infrastructure. Systems are decoupled from the physical hardware: provisioning and maintenance are handled by software, and manual work is no longer necessary. Changes become possible within minutes, often even seconds, which reduces both risk and cost. Solutions can be deployed and brought to market faster and with more confidence.
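To make the concept concrete, here is a minimal, hypothetical Ansible playbook (the file name `site.yml`, the `webservers` host group, and the NGINX example are illustrative assumptions, not taken from any article listed below). It declares the desired state of a web server; because the declaration is idempotent, re-running it after a change is the fast, low-risk operation described above.

```yaml
# site.yml - hypothetical example: declare the desired state of web servers.
# Running `ansible-playbook -i inventory site.yml` converges the hosts toward
# this state; re-runs only change what has drifted.
- name: Provision web servers
  hosts: webservers
  become: true
  tasks:
    - name: Install NGINX
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Ensure NGINX is started and enabled at boot
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```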
Related articles
Ansible variables: choosing the right location
Categories: DevOps & SRE | Tags: Infrastructure, Ansible, IaC, YAML
Defining variables for your Ansible playbooks and roles can become challenging as your project grows. Browsing the Ansible documentation, the diversity of Ansible variable locations is confusing, to… A small illustrative variable layout is sketched after this entry.
Mar 15, 2022
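As a hedged illustration of the topic (not an excerpt from the article), one common convention keeps group-level defaults in `group_vars` and per-host overrides in `host_vars` next to the inventory; host values take precedence over group values. The variable name and host names below are placeholders.

```yaml
# inventory/group_vars/webservers.yml
# Default for every host in the "webservers" group (illustrative variable).
http_port: 8080
---
# inventory/host_vars/web01.yml
# Override for the single host "web01"; host_vars wins over group_vars.
http_port: 9090
```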
GitOps in practice, deploy Kubernetes applications with ArgoCD
Categories: Containers Orchestration, DevOps & SRE, Adaltas Summit 2021 | Tags: Argo CD, CI/CD, Git, GitOps, IaC, Kubernetes
GitOps is a set of practices to deploy applications using Git. Application definitions, configurations, and connectivity are stored in a version control system such as Git. Git then serves as… A hedged example of an Argo CD Application manifest is sketched after this entry.
Dec 16, 2021
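For readers new to Argo CD, the sketch below shows what such a declarative deployment can look like: an `Application` resource pointing the cluster at a path in a Git repository. The repository URL, path, and namespaces are placeholder assumptions, not values from the article.

```yaml
# Hypothetical Argo CD Application: Argo CD continuously syncs the cluster
# with the Kubernetes manifests stored under a path in a Git repository.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: demo-app
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://example.com/org/demo-app.git   # placeholder repository
    targetRevision: main
    path: manifests
  destination:
    server: https://kubernetes.default.svc
    namespace: demo
  syncPolicy:
    automated:
      prune: true      # delete resources removed from Git
      selfHeal: true   # revert manual drift on the cluster
```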
Spring 2022 internship - building a Data Lab
Categories: Data Science, Learning | Tags: MongoDB, Spark, Argo CD, Elasticsearch, Internship, Keycloak, Kubernetes, OpenID Connect, PostgreSQL
Job Description Over the last few years, we have developed the ability to use computers to process large amounts of data. The ecosystem has evolved into a large offering of tools and libraries and the creation…
By David WORMS
Nov 24, 2021
Internship in Big Data infrastructure with TDP
Categories: Infrastructure, Learning | Tags: Cyber Security, DevOps, Java, Hadoop, IaC, Internship, TDP
Job Description Big Data and distributed computing are at Adaltas' core. We support our partners in the deployment, maintenance, and optimization of some of France's largest clusters. Adaltas is also an…
By Daniel HARTY
Oct 25, 2021
Internship in Data Engineering
Categories: Front End, Learning | Tags: Metrics, Monitoring, Hive, Kafka, Delta Lake, Elasticsearch, IaC, Internship, Kubernetes, Streaming
Job Description Data is a valuable business asset. Some call it the new oil. The data engineer collects, transforms, and refines raw data into information that can be used by business analysts and data…
By David WORMS
Oct 25, 2021
Internship in Web Technologies
Categories: Front End, Learning | Tags: DevOps, LDAP, React.js, CI/CD, Docker, GraphQL, IaC, Internship, Kubernetes, Node.js, OAuth2
Job Description As part of its Big Data activities, Adaltas Academy is an information-sharing platform bringing together articles, training content, and a knowledge base. The users of the platform are…
By David WORMS
Oct 14, 2021
Using Cloudera Deploy to install Cloudera Data Platform (CDP) Private Cloud
Categories: Big Data, Cloud Computing | Tags: Ansible, Cloudera, CDP, Cluster, Data Warehouse, Vagrant, IaC
Following our recent Cloudera Data Platform (CDP) overview, we cover how to deploy CDP Private Cloud on your local infrastructure. It is entirely automated with the Ansible cookbooks published by…
Jul 23, 2021
Plugin architecture in JavaScript and Node.js with Plug and Play
Categories: Front End, Node.js | Tags: Asynchronous, DevOps, Programming, Agile, JavaScript, Open source, Release and features
Plug and Play helps library and application authors to introduce a plugin architecture into their code. It simplifies complex code execution with well-defined interception points, also called hooks…
By David WORMS
Aug 28, 2020
Spark Streaming part 3: DevOps, tools and tests for Spark applications
Categories: Big Data, Data Engineering, DevOps & SRE | Tags: DevOps, Learning and tutorial, Spark, Apache Spark Streaming
Whenever services are unavailable, businesses experience large financial losses. Spark Streaming applications can break, like any other software application. A streaming application operates on data…
May 31, 2019
InfraOps & DevOps Internship - build a Big Data & Kubernetes PaaS
Categories: Big Data, Containers Orchestration | Tags: DevOps, LXD, Hadoop, Kafka, Spark, Ceph, Internship, Kubernetes, NoSQL
Context The acquisition of a high-capacity cluster is in line with Adaltas' desire to build a PaaS-type offering to use and provide Big Data and container orchestration platforms. The platforms are…
By David WORMS
Nov 26, 2019
Gatsby.js, React and GraphQL for documentation websites
Categories: Adaltas Summit 2018, Front End | Tags: Gatsby, HTTP, JAMstack, React.js, SEO, API, GitOps, GraphQL, JavaScript, Markdown, Node.js
In the last few months, I have started to redesign some of our Open Source project websites. This includes the websites of the Node.js CSV project, the Node.js HBase client and the Nikita project, our…
By David WORMS
Apr 1, 2019
Jumbo, the Hadoop cluster bootstrapper
Categories: Infrastructure | Tags: Ambari, Automation, Ansible, Cluster, Vagrant, HDP, REST
Introducing Jumbo, a Hadoop cluster bootstrapper for developers. Jumbo helps you deploy development environments for Big Data technologies. It takes a few minutes to get a custom virtualized Hadoop…
Nov 29, 2018
Hadoop cluster takeover with Apache Ambari
Categories: Big Data, DevOps & SRE, Adaltas Summit 2018 | Tags: Ambari, Automation, iptables, Nikita, Systemd, Cluster, HDP, Kerberos, Node, Node.js, REST
We recently migrated a large production Hadoop cluster from a "manual" automated install to Apache Ambari; we called this the Ambari Takeover. This is a risky process and we will detail why this…
Nov 15, 2018
Managing User Identities on Big Data Clusters
Categories: Cyber Security, Data Governance | Tags: LDAP, Active Directory, Ansible, FreeIPA, IAM, Kerberos
Securing a Big Data cluster involves integrating or deploying specific services to store users. Some users are cluster-specific while others are available across all clusters. It is not always easy to…
By David WORMS
Nov 8, 2018
Lando: Deep Learning used to summarize conversations
Categories: Data Science, Learning | Tags: Micro Services, Open API, Deep Learning, Internship, Kubernetes, Neural Network, Node.js
Lando is an application to summarize conversations, using Speech to Text to transcribe the recording of a meeting into text and Deep Learning techniques to summarize its content. It allows users to…
By Yliess HATI
Sep 18, 2018
Ambari - How to blueprint
Categories: Big Data, DevOps & SRE | Tags: Ambari, Automation, DevOps, Operation, Ranger, REST
As infrastructure engineers at Adaltas, we deploy Hadoop clusters. A lot of them. Let's see how to automate this process with REST requests. While really handy for deploying one or two clusters, the…
Jan 17, 2018
Multi-Repo, Multi-Node Gating at Massive Scale
Categories: Cloud Computing, DevOps & SRE, Open Source Summit Europe 2017 | Tags: Infrastructure, Jenkins, Red Hat, Zuul, Ansible, CI/CD, OpenStack
This is a recap and personal review of Monty Taylor's presentation of OpenStack's Continuous Integration tool Zuul at the Open Source Summit 2017 in Prague (not to be confused with Netflix's Zuul project…
Oct 28, 2017
From Dockerfile to Ansible Containers
Categories: Containers Orchestration, DevOps & SRE, Open Source Summit Europe 2017 | Tags: pip, Shell, Ansible, Docker, Docker Compose, YAML
This talk was an introduction to the Dockerfile format and to the Ansible Container tool, followed by a comparison of both. It was held by Tomas Tomecek from Red Hat's containerization team. The Dockerfile…
Oct 25, 2017
Managing authorizations with Apache Sentry
Categories: Data Governance | Tags: Hue, Database, LDAP, Nikita, Sentry, Ansible, CDH, Deployment
Apache Sentry is a system for enforcing fine-grained, role-based authorization to data and metadata stored on a Hadoop cluster. In this article, we will show you how we are using Apache Sentry at…
By Axel JACQIN
Jul 24, 2017
Asynchronous array iteration in Node.js with Each
Categories: Node.js | Tags: Asynchronous, CoffeeScript, JavaScript, Release and features
Control flow in Node.js is the sort of problem for which almost every developer has created and published their own library. These libraries usually aim at reducing the spaghetti code made of deeply nested callbacks. I…
By David WORMS
Jul 18, 2012
Announcing Mecano, a set of functions for system deployment
Categories: DevOps & SRE, Node.js | Tags: Automation, Infrastructure, CoffeeScript, JavaScript, Open source
Update July 2016: Mecano has been renamed Nikita. We are releasing Node Mecano on GitHub, which gathers common functions used while deploying systems. The idea was to group those functions into a…
By David WORMS
Feb 12, 2012