Suman Debnath, Principal Machine Learning Advocate at AWS, traces his first experiments with open source to a curiosity about building better tools while working at Toshiba. In a free-wheeling chat with OSFY’s Yashaswini Razdan, he speaks about the fundamental skills needed to build a career in DevOps, such as knowledge of a basic programming language, networking and automation.
Q. What are the career options available for a software engineer in domains where open source is important and useful?
A. There are plenty of opportunities in DevOps, such as working on the DevOps pipelines of SaaS companies, or in quality engineering (QE)/quality assurance (QA) and automation. Most SaaS companies are built on top of a cloud vendor with a strong, completely automated pipeline. No one does everything manually because, for a SaaS provider, scale is important. So, if you have DevOps skills, you have the opportunity to work on the DevOps pipelines of these SaaS companies. When it comes to QA and automation for software or hardware products, most test cases in any test team are automated. As a DevOps engineer, you can build that skillset because you already know how to program and automate things.
Q. Apart from DevOps, are there any other roles that software engineers can aspire for where open source plays an important role?
A. Yes, there are. Consider MLOps, a subdivision of DevOps tailored for machine learning products; it is a set of practices that automate and simplify machine learning (ML) workflows and deployments. If you’re interested in machine learning and have a DevOps background, you can transition into MLOps roles, leveraging your understanding of automation and containerisation. This transition offers opportunities to move from manual testing to automation and from DevOps to MLOps engineering. With open source tools like Kubernetes and Docker, you can build pipelines for machine learning models, creating datasets and training, testing, and deploying models efficiently.
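As a purely illustrative sketch of one stage in such a pipeline, the Python snippet below trains and saves a model. It assumes scikit-learn and joblib are installed; in a real MLOps setup, a stage like this would typically run inside its own Docker container, triggered by a Kubernetes job or a CI/CD workflow rather than by hand.

```python
# train_model.py -- one illustrative "training" stage of an ML pipeline.
# Assumes scikit-learn and joblib are installed; in practice this script
# would run inside a container as part of an automated workflow.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
import joblib

# Create the dataset (in practice it would be pulled from versioned storage).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Train and evaluate the model.
model = LogisticRegression(max_iter=500)
model.fit(X_train, y_train)
print(f"Test accuracy: {model.score(X_test, y_test):.3f}")

# Persist the trained model so a later deployment stage can pick it up.
joblib.dump(model, "model.joblib")
```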
Moreover, open source is fundamental across various domains within computer science. For instance, proficiency in containers or programming languages like Python opens numerous opportunities and allows you to adapt to different roles within the tech industry.
What distinguishes an average DevOps engineer from a proficient one is programming skill. While many DevOps professionals excel in automation, aspiring to roles in software development requires a mindset shift towards building products rather than solely testing them. Continuous learning and expanding your skillset beyond automation are vital for long-term career growth. By understanding the inner workings of systems and staying adaptable, you can carve out a successful career path beyond the boundaries of DevOps.
Q. What skills are important for DevOps roles?
A. At its core, DevOps consists of infrastructure and automation. The core skills needed to get into DevOps include a very good understanding of operating systems such as Linux; a programming language such as Python, Bash, Ruby or PowerShell; networking; and automation. These are the fundamentals.
Q. What are the tools that engineers need to learn to build those skills?
A. There are plenty of tools in DevOps, which can be overwhelming, and no one is expected to know everything. These tools and technologies can be learned as you start working with them. To get started, you can pick a few tools depending on your background and interest. Essential tools include Git for version control; Jenkins, GitLab CI or CircleCI for CI/CD; Ansible, Puppet or Chef for configuration management; Docker and Kubernetes for containerisation; Terraform or AWS CloudFormation for infrastructure as code; and Prometheus, Grafana, and the ELK Stack for monitoring and logging. Proficiency in scripting languages like Bash and Python is also important.
You don’t need to learn all of the above at first, but a strong understanding of networking, the basics of operating systems, and proficiency in scripting languages like Bash and Python are crucial. The rest of the skills can be picked up once you start working with them. Tools evolve, so a strong understanding of containers and a programming language will help you adapt to new ones. Focus on mastering the basics and building a solid foundation so that you can easily transition to new tools as needed.
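As a small, hedged illustration of the kind of everyday scripting proficiency being described, here is a hypothetical Python automation check. The disk-usage threshold and the health-check URL are made up for illustration; they are not drawn from any specific setup mentioned in the interview.

```python
# health_check.py -- a hypothetical example of routine DevOps automation:
# check local disk usage and verify that a service endpoint responds.
import shutil
import sys
import urllib.request

DISK_THRESHOLD = 0.9                          # alert if >90% full (illustrative value)
SERVICE_URL = "http://localhost:8080/health"  # hypothetical health endpoint

def disk_ok(path: str = "/") -> bool:
    usage = shutil.disk_usage(path)
    return (usage.used / usage.total) < DISK_THRESHOLD

def service_ok(url: str = SERVICE_URL) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False

if __name__ == "__main__":
    checks = {"disk": disk_ok(), "service": service_ok()}
    for name, ok in checks.items():
        print(f"{name}: {'OK' if ok else 'FAILED'}")
    # A non-zero exit code lets a CI job or cron-based monitor flag the failure.
    sys.exit(0 if all(checks.values()) else 1)
```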
Q. Are there any emerging tools or technologies that can help DevOps professionals to upskill?
A. In today’s landscape, having a basic understanding of cloud computing is necessary for DevOps professionals. While it’s impossible to learn everything about cloud platforms, grasping key concepts like virtual machines, virtual networks, and auto-scaling is essential. Cloud-native companies, which are prevalent nowadays, heavily rely on cloud infrastructure, making cloud skills indispensable. DevOps professionals can explore various cloud services and products, such as AWS CodePipeline for continuous integration and delivery (CI/CD) and Elastic Container Service (ECS) for container management. Transferring existing skills, like Git proficiency or Docker knowledge, to cloud platforms can be beneficial for career growth.
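As a minimal sketch of transferring existing scripting skills to the cloud, the snippet below uses boto3 (the AWS SDK for Python) to list running EC2 instances. It assumes boto3 is installed and AWS credentials and a region are already configured; it is only an illustration, not a walkthrough of the services mentioned above.

```python
# list_instances.py -- illustrative only: list running EC2 instances with boto3.
# Assumes boto3 is installed and credentials/region are configured
# (for example via `aws configure`).
import boto3

ec2 = boto3.client("ec2")

# Ask only for instances that are currently running.
response = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)

for reservation in response["Reservations"]:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["InstanceType"])
```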
Schools and colleges are now focusing on cloud computing skills early on, which prepares students for more advanced learning and the transition into corporate environments. While specific emerging tools or technologies may vary, the overarching theme for upskilling in DevOps is mastering cloud computing concepts and leveraging cloud platforms to enhance automation and infrastructure management capabilities.
Q. How have cloud-native technologies impacted DevOps hiring?
A. Understanding cloud-native technologies has become a baseline requirement for DevOps roles, especially at companies established after 2010 that are built largely on cloud infrastructure. Forums like the CNCF (Cloud Native Computing Foundation) and resources provided by organisations like the Linux Foundation offer valuable insights into these technologies. The CNCF has standardised frameworks around containerisation, addressing the lack of standardisation that previously led to compatibility issues because different companies built their products using different tools and frameworks. This standardisation has enabled interoperability and facilitated smoother integration of services such as AWS CodePipeline for CI/CD and Elastic Container Service (ECS) for container management.
Rather than focusing solely on vendor-specific implementations, DevOps professionals should understand the blueprint or skeleton of these technologies for more versatile skillsets that are transferable across different environments. In terms of hiring, big companies prioritise core fundamentals and the ability to explain concepts from first principles. They often have the resources and time to train individuals extensively. In contrast, startups prioritise industry-ready skillsets and value candidates who can quickly become productive.
Q. Are there some common challenges that companies face when hiring for DevOps positions?
A. One challenge many companies come across is that candidates possess in-depth knowledge of specific tools but lack flexibility in adapting to new technologies because of their deep immersion in a particular tool. This makes any tool or framework transition required by the company difficult. Many individuals pursuing DevOps roles also undervalue programming skills, believing these are not essential for their career path, which is a wrong notion.
Candidates need to know the underlying technologies and frameworks and build programming skills rather than just mastering specific tools, even if those skills are not directly used in their current role, because they must adapt to an evolving technology landscape. While tools and products may change, the focus should remain on learning the fundamentals and building a solid foundation so that new tools can be adopted with ease.
Q. When it comes to DevOps hiring, what role does certification play?
A. Certification in DevOps is not the primary factor for hiring managers or clients. However, this doesn’t mean certifications hold no value. They may not directly help to secure a job, but can serve as a validation of one’s learning and commitment. They represent the skills and knowledge acquired outside of formal education or work experience. They can be beneficial in scenarios where individuals are transitioning into new roles or fields, such as moving into DevOps from another area. For hiring managers, this means that the individual has dedicated time and effort to upskilling. For startups or mid-sized companies lacking the resources to extensively assess candidates, certifications can serve as a useful screening tool. They provide assurance that candidates possess fundamental knowledge and can quickly become productive members of the team.
However, certifications should not be viewed as a guarantee of employment. Building a strong portfolio of practical projects and demonstrating real-world experience often holds more weight in today’s competitive job market.
Q. How has the adoption of DevOps practices evolved over the past few years and where do you see this trend going?
A. DevOps practices have certainly evolved. Initially, automation was minimal or non-existent, with tasks like server setup and software testing being performed manually. However, with the rise of DevOps, organisations have transitioned from standalone automation frameworks to comprehensive DevOps ecosystems, optimising not only internal testing and validation but also providing automated tools for customers. This shift has drastically reduced manual testing and infrastructure setup, leading to faster delivery cycles and improved efficiency.
This evolution has come with its own set of challenges, particularly around ensuring backward compatibility and addressing security concerns. However, tools and technologies have emerged to address these challenges, with open source solutions playing a significant role in enabling rollback mechanisms and enhancing security in DevOps pipelines.
Looking ahead, DevOps is likely to continue evolving at a rapid pace, driven by the need to deploy software faster and more efficiently, with the goal of shortening release cycles to minutes or even seconds. Companies are increasingly focused on further streamlining their processes, emphasising not just automation but also backward compatibility and security. Professionals need to adapt and upgrade their skills to integrate automation, security, and backward compatibility so that they can deliver software at breakneck speed while maintaining reliability and security standards.
Q. How important are security, compliance and regulations for somebody in the open source ecosystem?
A. Understanding security, compliance, and regulations is important for anyone in the open source ecosystem, even if it’s not their primary focus. While individuals may not need to grasp the legal intricacies of every company or product, having a basic understanding of security vulnerabilities is essential. This means being aware of potential risks when using open source tools and staying informed about any security breaches or updates.
Considerations around security become especially important when dealing with sensitive data, as demonstrated by regulations like GDPR. In today’s landscape, data is at the core of nearly every aspect of technology, from DevOps to machine learning. Therefore, even a basic understanding of security principles can be invaluable. Practically, this means being mindful of best practices, such as avoiding storing access keys in public repositories or saving passwords in plain text.
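To make that last point concrete, here is a minimal Python sketch of the practice described: reading a credential from an environment variable instead of hard-coding it in source that might end up in a public repository. The variable name DB_PASSWORD is hypothetical.

```python
# Illustrative sketch: read credentials from the environment instead of
# hard-coding them in source code that could land in a public repository.
# The variable name DB_PASSWORD is hypothetical.
import os
import sys

# Bad practice (never do this): password = "s3cret-in-plain-text"

password = os.environ.get("DB_PASSWORD")
if not password:
    # Failing fast is safer than silently falling back to a default secret.
    sys.exit("DB_PASSWORD is not set; refusing to start.")

print("Credential loaded from the environment; connecting to the database...")
```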