🐛 Qualys Parser: Support for Monthly PCI Scan #9328

Merged 1 commit on Jan 19, 2024
86 changes: 49 additions & 37 deletions dojo/tools/qualys/csv_parser.py
@@ -1,4 +1,4 @@
import csv
import io
import logging
import re
@@ -43,33 +43,34 @@

    report_findings = []

    for row in csv_reader:
        if row.get("Title") and row["Title"] != "Title":
            report_findings.append(row)
        elif row.get("VULN TITLE"):
            report_findings.append(row)
    return report_findings
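For illustration only (not part of the diff), a minimal sketch of how the updated filter treats the three kinds of rows it can encounter: a standard row keyed by "Title", a repeated header row, and a monthly PCI row keyed by "VULN TITLE".

# Hypothetical rows, shaped like the dicts csv.DictReader yields for each report style
standard_row = {"Title": "Web Server Stopped Responding", "QID": "86476"}
header_echo_row = {"Title": "Title"}  # a repeated header line inside the CSV
monthly_pci_row = {"VULN TITLE": "Web Server Stopped Responding", "QID": "86476"}

kept = []
for row in (standard_row, header_echo_row, monthly_pci_row):
    if row.get("Title") and row["Title"] != "Title":
        kept.append(row)
    elif row.get("VULN TITLE"):
        kept.append(row)

print(len(kept))  # 2: the header echo is dropped, the monthly PCI row is now kept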


def _extract_cvss_vectors(cvss_base, cvss_temporal):
    """
    Parses the CVSS3 Vectors from the CVSS3 Base and CVSS3 Temporal fields and returns as a single string.

    This is done because the raw values come with additional characters that cannot be parsed with the cvss library.
    Example: 6.7 (AV:L/AC:L/PR:H/UI:N/S:U/C:H/I:H/A:H)
    Args:
        cvss_base:
        cvss_temporal:
    Returns:
        A CVSS3 Vector including both Base and Temporal if available
    """

    vector_pattern = r"^\d{1,2}.\d \((.*)\)"
    cvss_vector = "CVSS:3.0/"

    if cvss_base:
        try:
            cvss_vector += re.search(vector_pattern, cvss_base).group(1)
        except IndexError:
            _logger.error(f"CVSS3 Base Vector not found in {cvss_base}")
        except AttributeError:
            _logger.error(f"CVSS3 Base Vector not found in {cvss_base}")
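As a quick sanity check of the vector pattern above (a sketch, not from the diff; the temporal handling is truncated here), the regex strips the leading score and parentheses from the raw Qualys value:

import re

vector_pattern = r"^\d{1,2}.\d \((.*)\)"
cvss_base = "6.7 (AV:L/AC:L/PR:H/UI:N/S:U/C:H/I:H/A:H)"

match = re.search(vector_pattern, cvss_base)
print("CVSS:3.0/" + match.group(1))
# CVSS:3.0/AV:L/AC:L/PR:H/UI:N/S:U/C:H/I:H/A:H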
@@ -108,7 +109,6 @@
        "5": "Critical",
    }
    dojo_findings = []
    for report_finding in report_findings:
        if report_finding.get("FQDN"):
            endpoint = Endpoint.from_uri(report_finding.get("FQDN"))
@@ -129,44 +129,56 @@
        if finding_with_id:
            finding = finding_with_id
        else:
            if report_finding.get("Title"):
                finding = Finding(
                    title=f"QID-{report_finding['QID']} | {report_finding['Title']}",
                    mitigation=report_finding["Solution"],
                    description=f"{report_finding['Threat']}\nResult Evidence: \n{report_finding.get('Threat', 'Not available')}",
                    severity=severity_lookup.get(report_finding["Severity"], "Info"),
                    impact=report_finding["Impact"],
                    date=parser.parse(
                        report_finding["Last Detected"].replace("Z", "")
                    ),
                    vuln_id_from_tool=report_finding["QID"],
                    cvssv3=cvssv3
                )
                cve_data = report_finding.get("CVE ID")
                # Qualys reports regression findings as active, but with a Date Last
                # Fixed.
                if report_finding["Date Last Fixed"]:
                    finding.mitigated = datetime.strptime(
                        report_finding["Date Last Fixed"], "%m/%d/%Y %H:%M:%S"
                    )
                    finding.is_mitigated = True
                else:
                    finding.is_mitigated = False

                finding.active = report_finding["Vuln Status"] in (
                    "Active",
                    "Re-Opened",
                    "New",
                )

                if finding.active:
                    finding.mitigated = None
                    finding.is_mitigated = False
            elif report_finding.get("VULN TITLE"):
                finding = Finding(
                    title=f"QID-{report_finding['QID']} | {report_finding['VULN TITLE']}",
                    mitigation=report_finding["SOLUTION"],
                    description=f"{report_finding['THREAT']}\nResult Evidence: \n{report_finding.get('THREAT', 'Not available')}",
                    severity=report_finding["SEVERITY"],
                    impact=report_finding["IMPACT"],
                    date=parser.parse(
                        report_finding["LAST SCAN"].replace("Z", "")
                    ),
                    vuln_id_from_tool=report_finding["QID"]
                )
                cve_data = report_finding.get("CVEID")
            finding.unsaved_vulnerability_ids = (
                cve_data.split(",") if "," in cve_data else [cve_data]
            )

        finding.verified = True
        finding.unsaved_endpoints.append(endpoint)
        if not finding_with_id:
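Taken together, the changed block now recognizes two header dialects: the standard host-detection CSV ("Title", "Solution", "Threat", "Severity", "Impact", "Last Detected", "CVE ID") and the monthly PCI export ("VULN TITLE", "SOLUTION", "THREAT", "SEVERITY", "IMPACT", "LAST SCAN", "CVEID"). A rough mapping sketch, inferred from the diff rather than copied from it:

# Column names each branch reads when building a Finding (sketch, not an excerpt)
STANDARD_COLUMNS = {
    "title": "Title",
    "mitigation": "Solution",
    "description": "Threat",
    "severity": "Severity",        # mapped through severity_lookup
    "impact": "Impact",
    "date": "Last Detected",
    "cve": "CVE ID",
}
MONTHLY_PCI_COLUMNS = {
    "title": "VULN TITLE",
    "mitigation": "SOLUTION",
    "description": "THREAT",
    "severity": "SEVERITY",        # already a word, used as-is
    "impact": "IMPACT",
    "date": "LAST SCAN",
    "cve": "CVEID",
}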
2 changes: 2 additions & 0 deletions unittests/scans/qualys/monthly_pci_issue6932.csv
@@ -0,0 +1,2 @@
IP,HOSTNAME,LAST SCAN,QID,VULN TITLE,TYPE,SEVERITY,PORT,PROTOCOL,OPERATING SYSTEM,IS_PCI,FALSE POSITIVE STATUS,CVSS_BASE,Q_SEVERITY,THREAT,IMPACT,SOLUTION,CVSS_TEMPORAL,CATEGORY,RESULT,BUGTRAQID,CVEID
192.168.0.1,abv.xyw.com.fj,22/09/2022 13:01,86476,Web Server Stopped Responding,POTENTIAL,Medium,80,tcp,Linux 2.x,Fail,Requested,6.4,3,The Web server stopped responding to 3 consecutive connection attempts and/or more than 3 consecutive HTTP / HTTPS requests. Consequently the service aborted testing for HTTP / HTTPS vulnerabilities. The vulnerabilities already detected are still posted. For more details about this QID please review the following Qualys KB article:<BR> <A HREF= https://success.qualys.com/support/s/article/000003057#:~:text=The%20exhaustive%20Web%20Testing%20Skipped network%20bandwidth%20is%20being%20overloaded TARGET= _blank ></A><P>,The service was unable to complete testing for HTTP / HTTPS vulnerabilities since the Web server stopped responding.,Check the Web server status. <P> If the Web server was crashed during the scan please restart the server report the incident to Customer Support and stop scanning the Web server until the issue is resolved. <P> If the Web server is unable to process multiple concurrent HTTP / HTTPS requests please lower the scan harshness level and launch another scan. If this vulnerability continues to be reported please contact Customer Support.,6.1,Web server,The web server did not respond for 4 consecutive HTTP requests. After these the service was still unable to connect to the web server 2 minutes later.,-,
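To see why the new branch fires for this fixture, a minimal sketch (the path is assumed relative to the repository root, matching the test below) that reads the single data row with csv.DictReader:

import csv

with open("unittests/scans/qualys/monthly_pci_issue6932.csv", newline="") as f:
    row = next(csv.DictReader(f))

print(row["VULN TITLE"])  # Web Server Stopped Responding
print(row.get("Title"))   # None, so the parser takes the VULN TITLE branch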
8 changes: 8 additions & 0 deletions unittests/tools/test_qualys_parser.py
@@ -1,4 +1,4 @@
from ..dojo_test_case import DojoTestCase, get_unit_tests_path
from dojo.models import Test
from dojo.tools.qualys.parser import QualysParser

@@ -130,3 +130,11 @@
        self.assertEqual(
            finding.severity, "Critical"
        )

    def test_parse_file_monthly_pci_issue6932(self):
        testfile = open(
            get_unit_tests_path() + "/scans/qualys/monthly_pci_issue6932.csv"
        )
        parser = QualysParser()
        findings = parser.get_findings(testfile, Test())
        self.assertEqual(1, len(findings))
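As a hedged aside on the new test (a suggestion, not part of the change): opening the fixture with a context manager exercises the same parser path while closing the file automatically.

    def test_parse_file_monthly_pci_issue6932_with_context_manager(self):
        # Hypothetical variant of the test above: same fixture, same assertion,
        # but the file handle is closed automatically
        with open(
            get_unit_tests_path() + "/scans/qualys/monthly_pci_issue6932.csv"
        ) as testfile:
            parser = QualysParser()
            findings = parser.get_findings(testfile, Test())
        self.assertEqual(1, len(findings))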