Hi folks,
Thank you to all who participated in today's call, and Happy Thanksgiving to those who observe it.
We discussed Eric's suggested changes to the CSAF 2.0 schema via the previously mentioned pull request.
https://github.com/oasis-tcs/csaf/pull/21
Patrick (AT&T) observed that there may be a typo in the schema: a JSON array should have square brackets "[ ]", but the current schema uses "{ }".
Patrick, after we adjourned I did a little more research, and you are correct that a JSON *document* array should use "[ ]". However, in a JSON Schema, an array type is correctly represented as an object:
"type" : "array" ,
"contains" : {
"type" : "number"
}
}
This document includes detailed information and examples:
https://json-schema.org/understanding-json-schema/reference/array.html
I will merge the pull request today. Thank you for all your collaboration and feedback!
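To make the distinction concrete, here is a minimal Python sketch using only the standard library (the satisfies_contains helper is hypothetical and handles only this one sub-schema, not a general validator):

```python
import json

# A JSON *document* array uses square brackets:
doc = json.loads('[1, 2, 3]')
assert isinstance(doc, list)

# A JSON *Schema* describing an array is itself a JSON object (curly
# braces); the "type": "array" member is what marks it as an array schema:
schema = json.loads('''
{
  "type": "array",
  "contains": { "type": "number" }
}
''')
assert isinstance(schema, dict)
assert schema["type"] == "array"

def satisfies_contains(instance, subschema):
    # Minimal hand-rolled check of the "contains" keyword: at least one
    # array element must match the sub-schema (only "type": "number"
    # is handled in this sketch).
    if subschema.get("type") == "number":
        return any(isinstance(x, (int, float)) and not isinstance(x, bool)
                   for x in instance)
    raise NotImplementedError(subschema)
```

So ["a", 2] satisfies the "contains" constraint above, while ["a", "b"] does not.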
Regards,
Omar Santos
Cisco PSIRT
Email: os@cisco.com
PGP: https://keybase.io/santosomar

On Nov 18, 2019, at 5:50 PM, Eric Johnson <eric@tibco.com> wrote:
Hi Luke,
Thanks so much for replying.
My reply here ended up being longer than I intended. Even so, I suggest reading through to the end, because I have proposed directions there for how to resolve this. I'm not sure which one is right, and I hope people will chime in.
On Mon, Nov 18, 2019 at 1:22 PM Luke Tamagna-Darr <ltamagnadarr@tenable.com> wrote:

Lucas Tamagna-Darr
Director of Engineering - Detection Automation
Tenable Network Security
ltamagnadarr@tenable.com

On Thu, Nov 14, 2019 at 11:38 PM Eric Johnson <eric@tibco.com> wrote:
Hi CSAF-TC
See previous email for issue #1 related to using the JSON schema from first.org. This email raises a 2nd issue.
To wit: first.org does not define any compliance criteria, at least none that I could find. CVSS score structures could be valid according to the schema, but still incorrect.
Questions:
Do we care if the score is inconsistent - for example, the score does not match the vector, or the severity does not match the score?
Yes, if only because it is fairly simple to create a vector->score->severity calculator, and there should be an expectation that the suppliers of the data specify accurate information, rather than the consumers validating the consistency of the vector/score/severity.
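As a rough illustration of how simple such a calculator is, here is a sketch of the CVSS v3.1 base-score math (weights and the Roundup helper follow the CVSS v3.1 specification; base metrics only, temporal and environmental metrics are omitted, and the vector is assumed well-formed):

```python
import math

# CVSS v3.1 base-metric weights (Privileges Required differs when Scope
# is Changed).
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.20}
AC = {"L": 0.77, "H": 0.44}
PR_U = {"N": 0.85, "L": 0.62, "H": 0.27}   # Scope: Unchanged
PR_C = {"N": 0.85, "L": 0.68, "H": 0.50}   # Scope: Changed
UI = {"N": 0.85, "R": 0.62}
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}

def roundup(value):
    # "Roundup" from CVSS v3.1 Appendix A: the smallest number with one
    # decimal place that is >= value, with a floating-point guard.
    i = int(round(value * 100000))
    return i / 100000.0 if i % 10000 == 0 else (i // 10000 + 1) / 10.0

def base_score(vector):
    # Parse "CVSS:3.1/AV:N/AC:L/..." into a metric -> value dict.
    m = dict(p.split(":") for p in vector.split("/")[1:])
    iss = 1 - (1 - CIA[m["C"]]) * (1 - CIA[m["I"]]) * (1 - CIA[m["A"]])
    changed = m["S"] == "C"
    impact = (7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
              if changed else 6.42 * iss)
    pr = (PR_C if changed else PR_U)[m["PR"]]
    exploitability = 8.22 * AV[m["AV"]] * AC[m["AC"]] * pr * UI[m["UI"]]
    if impact <= 0:
        return 0.0
    raw = 1.08 * (impact + exploitability) if changed else impact + exploitability
    return roundup(min(raw, 10))

def severity(score):
    # Qualitative severity rating scale from the CVSS v3.1 specification.
    for limit, name in ((0.0, "None"), (3.9, "Low"), (6.9, "Medium"),
                        (8.9, "High"), (10.0, "Critical")):
        if score <= limit:
            return name
```

For example, base_score("CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H") yields 9.8, which severity() maps to "Critical"; a consumer could compare that against the score and severity stated in a document.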
What are the conformance criteria? Do we leave it unspecified, leave it up to the implementation to check, or do we require that implementations check for score data consistency? If we allow implementations to continue with inconsistent data, do we require that actual values be generated from the vector?
If we're going to allow for inconsistent data, I suggest we make the score and severity optional and only the vector required.
The regular expression in the first.org JSON schema allows for bogus vectors. Do we expect implementations to catch those bogus vectors?
This seems like a reasonable expectation.
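For instance, a pattern-only check cannot easily reject a vector that repeats a metric or omits a required one. A sketch of the kind of structural check an implementation could layer on top of the regex (helper name hypothetical; CVSS v3.x base metrics only):

```python
# Allowed values per CVSS v3.x base metric (temporal and environmental
# metrics are omitted from this sketch).
ALLOWED = {
    "AV": ("N", "A", "L", "P"), "AC": ("L", "H"), "PR": ("N", "L", "H"),
    "UI": ("N", "R"), "S": ("U", "C"),
    "C": ("H", "L", "N"), "I": ("H", "L", "N"), "A": ("H", "L", "N"),
}

def is_valid_v3_base_vector(vector):
    # Each base metric must appear exactly once, with an allowed value.
    prefix, sep, rest = vector.partition("/")
    if not sep or prefix not in ("CVSS:3.0", "CVSS:3.1"):
        return False
    seen = set()
    for part in rest.split("/"):
        key, sep2, val = part.partition(":")
        if (not sep2 or key in seen or key not in ALLOWED
                or val not in ALLOWED[key]):
            return False
        seen.add(key)
    return seen == set(ALLOWED)
```

This accepts a well-formed base vector but rejects, say, one that lists AV twice or drops the A metric, which a permissive regex may let through.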
I really want to agree with all of these compliance expectations, except I don't know how to resolve this question. The following concerns come up:
- The CVRF specification doesn't identify these constraints as requirements.
- Is it presumptuous for us (well, me, at least) to decide what to normatively require, when first.org doesn't state any normative requirements either?
- The conformance clause of CVRF (surprisingly!) doesn't require enforcement of these constraints.
- The approach implemented allows for extensibility.
- We already cannot account for consistency.
CVRF doesn't identify these constraints as requirements:
Looking at the language of the CVRF specification for some clarification: "The vuln:BaseScoreV3 element MUST be present exactly once inside every vuln:ScoreSetV3 and contains the numeric value of the computed CVSS version 3 base score."
Unfortunately, the statement is unclear. Perhaps it should have been written as "The vuln:BaseScoreV3 element MUST be present exactly once inside every vuln:ScoreSetV3 and
MUST contain the numeric value of the computed CVSS version 3 base score." As the original is written, the "MUST" only applies to "be present exactly once ...".
So this statement is not a normative requirement that the base score correspond to the vector.
Or this, about the CVSS v3 vector: "The specific notation is expected to follow the guidelines set forth in the CVSS v3 documentation at [CVSS3]". "Is expected" != "MUST", especially since section 1.2 specifically calls out that RFC 2119 is in use here. While CVRF does require the vector to conform to a regex, there are lots of strings that match the regex but are not valid CVSS vectors.
My conclusion --> no normative statements in CVRF about compliance currently address the fact that CVSS scores and vectors could be inconsistent or wrong.
First.org doesn't state normative requirements
Validating against the schema is useful, but I don't see any normative requirements beyond that. Assuming the experts at first.org know more than I do, I'd love to have some explicit direction from them. As it is, I don't want to presumptively impose a misguided requirement.
The CVRF conformance requirements don't enforce these constraints
None of the constraints in section 6 are mentioned in the conformance requirements. I was surprised by this. Maybe this is an error in the specification?
Extensibility
The new model allows for multiple different scoring systems, and I added support for CVSS v3.1 to the pull request I submitted. In the future, someone may wish to add a cvss_v40 property. Existing implementations will not have the means to validate that CVSS 4.0 data is compliant. I believe we want people to be able to express scoring extensions without having to go back and change/update the underlying CSAF specification. I'm not sure how we can do both if we add normative requirements about validating scoring data.
Inconsistency already allowed
The moment we allow multiple scores from different scoring systems, or different versions of the same system (CVSS v2, v3, v3.1), for the same product within a vulnerability, those scores risk being inconsistent. If someone assigns a CVSS v3.0 and a v3.1 score to a list of products, and the CVSS vectors aren't the same, what does that mean? Is one of the scores wrong? I don't fundamentally see the difference between having two scores that disagree with each other, and one score that is internally inconsistent.
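Concretely, the situation looks something like this (property names here are illustrative, not the final CSAF 2.0 schema):

```json
{
  "scores": [
    {
      "cvss_v30": {
        "vectorString": "CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
        "baseScore": 9.8
      }
    },
    {
      "cvss_v31": {
        "vectorString": "CVSS:3.1/AV:L/AC:H/PR:N/UI:N/S:U/C:H/I:H/A:H",
        "baseScore": 7.4
      }
    }
  ]
}
```

Each entry is internally consistent, yet the two disagree about the same product, and no schema validation will flag that.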
--------------------------------
The question, then, is: if CVRF and first.org are not our guide, then what do we do?
I'm comfortable with raising the conformance requirements to make the CVSS scoring information consistent. However, TIBCO doesn't have a legacy of CVRF documents that might need to migrate to the new JSON format, so I'm not a good judge of this question. Other companies do have that data. If Cisco looks to convert their hundreds of CVRF documents and discovers that 20% have malformed CVSS scoring information, then what? Do we expect them to go back over all those documents and fix them? The counter-argument is that nobody has identified the discrepancy until now, so maybe we don't really care?
Of course, the question of legacy documents can be resolved, based on real-world data. Someone could add specific compliance logic to a fork of the project I've shared, and run it over the documents they have, to see if that causes problems. If
it doesn't, we could just move forward with stricter compliance.
Three proposal directions that I can think of:
1. Do nothing - validating with the schema is sufficient for CSAF purposes, even if the scores are inconsistent.
2. Document the problem, and the suggested solution.
3. Normatively require the intended approach to process malformed data ("in case the base score is inconsistent with the provided vector, an implementation MUST produce an error, and MUST recompute the base score based on the provided vector.")
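The third direction, sketched in Python (names are hypothetical; compute_base_score stands in for any vector->score calculator an implementation already has):

```python
def reconcile_base_score(stated_score, vector, compute_base_score):
    # Behavior of the "MUST produce an error, MUST recompute" proposal:
    # if the stated base score disagrees with the score recomputed from
    # the vector, report an error and trust the vector.
    computed = compute_base_score(vector)
    if stated_score != computed:
        print(f"error: stated base score {stated_score} does not match "
              f"score {computed} computed from {vector}; "
              f"using {computed}")
        return computed
    return stated_score
```

A consumer would call this once per score object and continue processing with the returned value.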
Eric.