Automating a shift-left CI/CD security workflow to track, report, and analyze software security vulnerabilities

Enhancing Software Security: Streamlining and Analyzing Vulnerabilities with Shift-Left CI/CD Automation

Introduction:

Security is a crucial aspect of any product, and it is essential to identify and address vulnerabilities in a timely manner. In this post, we will discuss an automated shift-left CI/CD security workflow that can help you quickly identify the Open Web Application Security Project (OWASP) Top Ten vulnerabilities in your applications. Our workflow utilizes industry technologies such as Jenkins, Postgres, JIRA, Python, SQL, ZAP, and our own SAS products to track and report Dynamic Application Security Testing (DAST) security vulnerabilities. By implementing this workflow, you can enhance the efficiency of your work, ensure the security of your product, and earn the confidence and trust of your customers.

Before implementing this workflow, SAS faced challenges in identifying OWASP vulnerabilities, gathering accurate and up-to-date information, and reporting and tracking vulnerabilities across applications. To overcome these challenges, we designed a schema and view in a shared Postgres database to store security data, developed a Python script to parse security scan results, and utilized our own SAS products to report and analyze vulnerability status and trends. With this automated process, we were able to improve our work process, ensure accountability, and reduce manual intervention.

In addition, we integrated OWASP ZAP into our workflow to run security scans. ZAP is an open-source security tool that complies with strict security standards. We also used SAS Studio to query security data and generate SAS data sets, and integrated the SAS code into our Job Execution Service to process and load the data into CAS. This allowed us to design a SAS Visual Analytics report that provides a visual analytical view of the security data. By implementing a Jenkins pipeline, we were able to initiate ZAP, configure settings, run test cases, parse security XML reports, send records to our database, and trigger SAS Job Execution.

Overall, our shift-left automated security workflow has provided us with the knowledge and data necessary to improve our products, troubleshoot issues, and make informed decisions. The SAS Visual Analytics report allows interested parties to access and analyze security data, and the retention and collection of this data is valuable for auditing purposes. The report provides information such as elapsed time, types of alerts, deployable units, and their endpoints with security violations. It also allows for filtering by release year, severity, or deployable unit. Security champions can search historical records in JIRA tickets, and security experts and developers can use the data for security design reviews. Data analysts can even build models and predict security vulnerabilities.

Our shift-left automated workflow has been well received and has allowed security champions to focus on other priorities. We have shared our Jenkins job with other teams and continue to collect security data regularly. The SAS Visual Analytics report keeps us up-to-date with security findings and trends, and we have learned valuable lessons throughout our DevOps journey. We have learned to manage security scan resources wisely, fine-tune ZAP settings to exclude irrelevant endpoints, and maintain consistent formatting and layout in JIRA. This workflow has made our DevOps journey more relevant and has opened up numerous opportunities for improvement.

Full Article: Enhancing Software Security: Streamlining and Analyzing Vulnerabilities with Shift-Left CI/CD Automation

Shift-Left CI/CD Security Workflow – Ensuring Secure Products with Automated Processes

Security is a crucial aspect when it comes to developing and deploying software products. As technology continues to evolve, the need for robust security measures becomes even more important. Open Web Application Security Project (OWASP) vulnerabilities are a common concern that developers face. However, identifying and mitigating these vulnerabilities can be time-consuming and challenging without appropriate tools and processes in place.

In this article, we will discuss how SAS, a leading analytics software company, implemented an automated shift-left CI/CD security workflow to address these challenges. By leveraging industry technologies like Jenkins, Postgres, JIRA, Python, SQL, ZAP, and their own SAS products, SAS has made significant strides in improving their security measures. Let’s dive into the details.

Challenges Addressed by SAS

Before implementing the automated shift-left CI/CD security workflow, SAS faced several challenges. One of the primary challenges was identifying the top OWASP vulnerabilities efficiently. This required spending a significant amount of time searching for and gathering accurate and up-to-date information. Additionally, there was no cohesive way to track and report vulnerabilities across different applications.

To overcome these challenges, SAS set out to collect Dynamic Application Security Testing (DAST) scan results and analyze them using their own SAS product stack. The goal was to create a centralized database, named Continuous Quality Metrics (CQM), to store and analyze security vulnerability data. This automated process would not only save time but also help reduce false positives over time.

Implementation of the Automated Security Workflow

SAS implemented a step-by-step process to establish their automated security workflow. Here’s an overview of the implementation:

1. Designing the Database: SAS created a schema and view in a shared Postgres database (CQM) to store security data. The schema included data such as test case name, elapsed time, product name, and release number. A hedged sketch of what such a table could look like appears after this list.

2. Parsing Security Scan Results: A Python script was developed to parse the XML security scan results. This script also implemented an alert filter to flag known vulnerabilities and automatically generate Jira tickets for newly identified vulnerabilities (see the second sketch after this list).

3. Storing Security Data: The Python script sent all alert records to the CQM database, allowing for cross-referencing, accountability, and minimizing manual intervention.

4. Data Analysis: SAS Studio was used to develop code to query the CQM database and produce a SAS data set with the security results.

5. Creating Visual Analytics Report: A SAS Visual Analytics report was designed to aggregate the security data collected from the scans. This report offered a visual analytical view of the security data of the products.

6. Jenkins Pipeline: A Jenkins pipeline was built to tie all the steps together. This pipeline initiated ZAP, configured ZAP settings, ran test cases on a deployed server, parsed the security XML report, and triggered the SAS Job Execution Service; a sketch of that final trigger call follows below.
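
To make steps 1 and 3 more concrete, the following is a minimal sketch, in Python with psycopg2, of what such a storage layer could look like. The schema, table, and column names, as well as the connection string, are illustrative assumptions based only on the fields mentioned above (test case name, elapsed time, product name, release number) plus the alert attributes a DAST scan typically produces; SAS's actual CQM design is not published in this article.

```python
import psycopg2

# Hypothetical DDL for the CQM security data; the real schema, table, and
# column names used by SAS are not published, so these are illustrative only.
DDL = """
CREATE SCHEMA IF NOT EXISTS cqm;
CREATE TABLE IF NOT EXISTS cqm.security_alerts (
    id            BIGSERIAL PRIMARY KEY,
    test_case     TEXT NOT NULL,          -- test case name
    product_name  TEXT NOT NULL,          -- product / deployable unit
    release_num   TEXT NOT NULL,          -- release number
    elapsed_secs  INTEGER,                -- scan elapsed time in seconds
    alert_name    TEXT,                   -- ZAP alert name
    risk          TEXT,                   -- High / Medium / Low / Informational
    endpoint      TEXT,                   -- URL that triggered the alert
    scanned_at    TIMESTAMPTZ DEFAULT now()
);
"""

INSERT_SQL = """
INSERT INTO cqm.security_alerts
    (test_case, product_name, release_num, elapsed_secs, alert_name, risk, endpoint)
VALUES (%(test_case)s, %(product_name)s, %(release_num)s,
        %(elapsed_secs)s, %(alert_name)s, %(risk)s, %(endpoint)s);
"""

def store_alerts(records, dsn="dbname=cqm user=cqm_writer host=db.example.com"):
    """Create the table if needed and bulk-insert parsed alert records."""
    conn = psycopg2.connect(dsn)
    try:
        with conn, conn.cursor() as cur:  # commits on success, rolls back on error
            cur.execute(DDL)
            cur.executemany(INSERT_SQL, records)
    finally:
        conn.close()
```

A flat alert table like this keeps the SAS Studio query in step 4 to a simple SELECT, while the view mentioned in step 1 can aggregate or filter it for reporting.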
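
The parsing and triage logic of step 2 could look roughly like the sketch below. The XML element names reflect the ZAP report layout as we understand it and should be verified against the report your ZAP version produces; the Jira base URL, credentials, project key, and the KNOWN_ALERTS filter list are placeholders, and the ticket is opened through Jira's standard REST issue-creation endpoint.

```python
import xml.etree.ElementTree as ET
import requests

# Alerts already triaged and accepted; anything not in this set gets a Jira ticket.
KNOWN_ALERTS = {"X-Content-Type-Options Header Missing"}

def parse_zap_report(xml_path):
    """Yield one flat record per (alert, instance) pair from a ZAP XML report.

    Element names follow the ZAP XML report layout; verify them against the
    report generated by your ZAP version.
    """
    root = ET.parse(xml_path).getroot()
    for site in root.findall("site"):
        for item in site.findall(".//alertitem"):
            name = item.findtext("alert", default="")
            risk = item.findtext("riskdesc", default="")
            for inst in item.findall(".//instance"):
                yield {
                    "alert_name": name,
                    "risk": risk,
                    "endpoint": inst.findtext("uri", default=site.get("name", "")),
                }

def open_jira_ticket(record, base_url, auth, project_key="SEC"):
    """Open a Jira issue for a newly identified alert (project key is a placeholder)."""
    payload = {
        "fields": {
            "project": {"key": project_key},
            "summary": f"ZAP alert: {record['alert_name']}",
            "description": f"Endpoint: {record['endpoint']}\nRisk: {record['risk']}",
            "issuetype": {"name": "Bug"},
        }
    }
    resp = requests.post(f"{base_url}/rest/api/2/issue", json=payload, auth=auth, timeout=30)
    resp.raise_for_status()
    return resp.json()["key"]

def triage(xml_path, base_url, auth):
    """Apply the alert filter: keep every record, ticket only the unknown alerts."""
    records = list(parse_zap_report(xml_path))
    for rec in records:
        if rec["alert_name"] not in KNOWN_ALERTS:
            rec["jira_key"] = open_jira_ticket(rec, base_url, auth)
    return records
```

In the real workflow, the pipeline would enrich each record with the test case name, product name, release number, and elapsed time before sending it to CQM.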
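
Finally, for step 6, the last pipeline stage has to hand the loaded data off to SAS by triggering the Job Execution Service. One way a Jenkins stage could do that is to call a small Python helper such as the one below (for example via a `sh` step). The host, access token, and job definition URI are placeholders, and the endpoint and media type follow SAS Viya's public REST documentation as we understand it; verify them against your Viya release.

```python
import requests

def trigger_sas_job(viya_url, access_token, job_definition_uri):
    """Submit a job request to the SAS Viya Job Execution service.

    The endpoint shape follows SAS Viya's public REST documentation; the host,
    token, and job definition URI are placeholders for your environment.
    """
    resp = requests.post(
        f"{viya_url}/jobExecution/jobs",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/vnd.sas.job.execution.job.request+json",
        },
        json={"jobDefinitionUri": job_definition_uri},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()  # job representation with state and links for polling
```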

Benefits of the Automated Workflow

Implementing this automated shift-left CI/CD security workflow provided several benefits for SAS. Here are some notable advantages:

1. Efficient Vulnerability Detection and Reporting: The automated process allowed SAS to quickly identify and report vulnerabilities without manual intervention. By using ZAP for security scans and Jira for ticket generation and triaging, the workflow streamlined the vulnerability management process.

2. SAS Visual Analytics: The integration of SAS Visual Analytics enabled the creation of comprehensive reports on security data. These reports provided valuable insights, including elapsed time of scans, types of alerts, and historical records for troubleshooting.

3. Data Accessibility and Auditing: Interested parties had access to the SAS Visual Analytics report, facilitating auditing processes. The retention and collection of security data proved invaluable for analyzing trends and making informed decisions.

4. Enhanced Collaboration: The automated workflow was shared among teams within the organization, facilitating collaboration in identifying and resolving security vulnerabilities.

Lessons Learned and Future Endeavors

SAS’s DevOps journey saw significant improvement through the implementation of their automated shift-left CI/CD security workflow. The process empowered security champions to focus on other priorities, reduced manual responsibilities, and enabled efficient resource management.

SAS continues to expand their security data collection efforts and utilize SAS Visual Analytics to explore more opportunities for improving product security. Lessons learned from this journey include the importance of resource management, continuous data collection, and collaboration across teams.

Conclusion

Security is a shared responsibility, and SAS recognized the need for an automated shift-left CI/CD security workflow to ensure the security of their products. By utilizing industry technologies and their own SAS products, SAS successfully implemented an automated process to identify, track, and report vulnerabilities efficiently. The integration of ZAP, Jira, Postgres, and SAS Visual Analytics proved instrumental in achieving their security goals.

The shift-left CI/CD security workflow not only enhanced the efficiency of SAS’s work processes but also instilled confidence and trust in their customers. With the right tools and processes in place, SAS continues to prioritize security in their product development life cycle, making it a win-win situation for everyone involved.

Summary: Enhancing Software Security: Streamlining and Analyzing Vulnerabilities with Shift-Left CI/CD Automation

In this post, we explore the implementation of an automated shift-left CI/CD security workflow that tracks and reports the top ten OWASP vulnerabilities in applications. The workflow utilizes technologies such as Jenkins, Postgres, JIRA, Python, SQL, ZAP, and SAS products to make the process more efficient and secure. Before implementing this workflow, gathering accurate and up-to-date information on vulnerabilities was time-consuming and challenging. To overcome these challenges, a schema and view were designed to store security data, and a Python script was created to parse security scan results. Automated JIRA tickets are generated for new vulnerabilities, and SAS Studio is used to query and analyze the data. The workflow is integrated into Jenkins pipeline stages, and a SAS Visual Analytics report provides a visual analytical view of the security data collected. By implementing this workflow, organizations can improve their product security, gain the trust of customers, and make their work processes more efficient.

Frequently Asked Questions:

1. What is Data Science and why is it important?
Data Science is the interdisciplinary field that involves the extraction of knowledge and insights from various types of data. It combines computer science, statistics, and domain expertise to analyze and interpret data to drive informed decision-making. Data Science is important as it helps organizations make sense of vast amounts of data, uncover patterns, and derive valuable insights to improve their operations, enhance efficiency, and gain a competitive edge.

2. What are some key skills required for a career in Data Science?
A successful Data Scientist needs to possess a combination of technical and analytical skills. Some key skills required for a career in Data Science include proficiency in programming languages such as Python or R, strong statistical analysis and predictive modeling abilities, data visualization skills, proficiency in working with big data frameworks like Hadoop or Spark, and domain knowledge expertise.

3. How is Machine Learning related to Data Science?
Machine Learning is a subset of Data Science that focuses on the development of algorithms and models that can learn and make predictions or decisions without being explicitly programmed. Data Science includes the broader domain of collecting, processing, and analyzing vast amounts of data, while Machine Learning specifically deals with using algorithms to automatically learn from data and make predictions or decisions.

4. What are the steps involved in the Data Science process?
The Data Science process generally involves the following steps:

1. Define the problem: Clearly define the problem that needs to be solved or the question that needs to be answered using data.

2. Collect and preprocess data: Gather relevant data from various sources, clean and preprocess it to remove inconsistencies or errors.

3. Explore and analyze data: Perform exploratory data analysis to identify patterns, correlations, and potential insights using statistical techniques and data visualization.

4. Develop and evaluate models: Build predictive models based on the problem and the available data. Evaluate their performance, iterate, and fine-tune as necessary.

5. Communicate results: Present the findings and insights obtained from the analysis in a clear and concise manner to stakeholders, using visualizations or reports.

5. How is Data Science used in real-world applications?
Data Science is used in a wide range of real-world applications across industries. Some common examples include:

– Financial institutions use Data Science to detect fraudulent transactions and assess credit risk.

– Healthcare organizations leverage Data Science to analyze patient data and predict disease outcomes.

– E-commerce companies employ Data Science to personalize product recommendations based on user behavior and preferences.

– Transportation and logistics companies use Data Science to optimize routes and manage efficient supply chains.

– Social media platforms utilize Data Science to analyze user behavior, personalize content, and target advertisements.

Overall, Data Science is a valuable discipline that helps organizations make data-driven decisions, improve processes, and gain competitive advantages in today’s data-driven economy.