Data Integrity ECDSA Cryptosuites v1.0
Achieving Data Integrity using ECDSA with NIST-compliant curves
W3C Candidate Recommendation Snapshot
More details about this document
- This version:
- https://www.w3.org/TR/2023/CR-vc-di-ecdsa-20231121/
- Latest published version:
- https://www.w3.org/TR/vc-di-ecdsa/
- Latest editor's draft:
- https://w3c.github.io/vc-di-ecdsa/
- History:
- https://www.w3.org/standards/history/vc-di-ecdsa/
- Commit history
- Implementation report:
- https://w3c.github.io/vc-data-integrity/implementations/
- Editors:
- Manu Sporny (Digital Bazaar)
- Marty Reed (RANDA Solutions)
- Greg Bernstein (Invited Expert)
- Sebastian Crane (Invited Expert)
- Authors:
- Dave Longley (Digital Bazaar)
- Manu Sporny (Digital Bazaar)
- Feedback:
- GitHub w3c/vc-di-ecdsa (pull requests, new issue, open issues)
- Related Specifications
- The Verifiable Credentials Data Model v2.0
- Verifiable Credential Data Integrity v1.0
- The Edwards Digital Signature Algorithm Cryptosuites v1.0
- The BBS Digital Signature Algorithm Cryptosuites v1.0
Copyright © 2023 World Wide Web Consortium. W3C® liability, trademark and permissive document license rules apply.
Abstract
This specification describes a Data Integrity Cryptosuite for use when generating a digital signature using the Elliptic Curve Digital Signature Algorithm (ECDSA).
Status of This Document
This section describes the status of this document at the time of its publication. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.
The Working Group is actively seeking implementation feedback for this specification. In order to exit the Candidate Recommendation phase, the Working Group has set the requirement of at least two independent implementations for each mandatory feature in the specification. For details on the conformance testing process, see the test suites listed in the implementation report.
This document was published by the Verifiable Credentials Working Group as a Candidate Recommendation Snapshot using the Recommendation track.
Publication as a Candidate Recommendation does not imply endorsement by W3C and its Members. A Candidate Recommendation Snapshot has received wide review, is intended to gather implementation experience, and has commitments from Working Group members to royalty-free licensing for implementations.
This Candidate Recommendation is not expected to advance to Proposed Recommendation any earlier than 24 January 2024.
This document was produced by a group operating under the W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.
This document is governed by the 03 November 2023 W3C Process Document.
This specification defines a cryptographic suite for the purpose of creating and verifying proofs for ECDSA signatures in conformance with the Data Integrity [VC-DATA-INTEGRITY] specification. ECDSA signatures are specified in [FIPS-186-5] with elliptic curves P-256 and P-384 specified in [NIST-SP-800-186]. [FIPS-186-5] includes the deterministic ECDSA algorithm, which is also specified in [RFC6979].
This specification uses either the RDF Dataset Canonicalization Algorithm [RDF-CANON] or the JSON Canonicalization Scheme [RFC8785] to transform the input document into its canonical form. It uses one of two mechanisms to digest and sign: SHA-256 [RFC6234] as the message digest algorithm and ECDSA with Curve P-256 as the signature algorithm, or SHA-384 [RFC6234] as the message digest algorithm and ECDSA with Curve P-384 as the signature algorithm.
The elliptic curves P-256 and P-384 of [NIST-SP-800-186] are referred to as secp256r1 and secp384r1 respectively in [SECG2]. In addition, this notation is sometimes used in ECDSA software libraries.
This section defines the terms used in this specification. A link to these terms is included whenever they appear in this specification.
- data integrity proof
- A set of attributes that represent a digital proof and the parameters required to verify it.
- private key
- Cryptographic material that can be used to generate digital proofs.
- challenge
- A random or pseudo-random value used by some authentication protocols to mitigate replay attacks.
- domain
- A string value that specifies the operational domain of a digital proof. This could be an Internet domain name like example.com, an ad-hoc value such as mycorp-level3-access, or a very specific transaction value like 8zF6T8J34qP3mqP. A signer could include a domain in its digital proof to restrict its use to a particular target, identified by the specified domain.
- cryptographic suite
- A specification defining the usage of specific cryptographic primitives in order to achieve a particular security goal. These documents are often used to specify verification methods, digital signature types, their identifiers, and other related properties.
- decentralized identifier (DID)
- A globally unique persistent identifier that does not require a centralized registration authority and is often generated and/or registered cryptographically. The generic format of a DID is defined in [DID-CORE]. Many—but not all—DID methods make use of distributed ledger technology (DLT) or some other form of decentralized network.
- controller
- An entity that has the capability to make changes to a controller document.
- controller document
- A set of data that specifies one or more relationships between a controller and a set of data, such as a set of public cryptographic keys.
- subject
- The entity identified by the id property in a controller document. Anything can be a subject: person, group, organization, physical thing, digital thing, logical thing, etc.
- distributed ledger (DLT)
- A non-centralized system for recording events. These systems establish sufficient confidence for participants to rely upon the data recorded by others to make operational decisions. They typically use distributed databases where different nodes use a consensus protocol to confirm the ordering of cryptographically signed transactions. The linking of digitally signed transactions over time often makes the history of the ledger effectively immutable.
- verifier
- A role an entity performs by receiving data containing one or more data integrity proofs and then determining whether or not the proof is valid.
- verifiable credential
- A standard data model and representation format for expressing cryptographically-verifiable digital credentials, as defined by the W3C Verifiable Credentials specification [VC-DATA-MODEL-2.0].
- verification method
- A set of parameters that can be used together with a process to independently verify a proof. For example, a cryptographic public key can be used as a verification method with respect to a digital signature; in such usage, it verifies that the signer possessed the associated cryptographic private key.
"Verification" and "proof" in this definition are intended to apply broadly. For example, a cryptographic public key might be used during Diffie-Hellman key exchange to negotiate a shared symmetric key for encryption. This guarantees the integrity of the key agreement process. It is thus another type of verification method, even though descriptions of the process might not use the words "verification" or "proof."
As well as sections marked as non-normative, all authoring guidelines, diagrams, examples, and notes in this specification are non-normative. Everything else in this specification is normative.
The key words MAY, MUST, MUST NOT, and SHOULD in this document are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.
A conforming proof is any concrete expression of the data model that complies with the normative statements in this specification. Specifically, all relevant normative statements in Sections 2. Data Model and 3. Algorithms of this document MUST be enforced.
A conforming processor is any algorithm realized as software and/or hardware that generates or consumes a conforming proof. Conforming processors MUST produce errors when non-conforming documents are consumed.
This document contains examples of JSON and JSON-LD data. Some of these examples are invalid JSON, as they include features such as inline comments (//) explaining certain portions and ellipses (...) indicating the omission of information that is irrelevant to the example. Such parts need to be removed if implementers want to treat the examples as valid JSON or JSON-LD.
The following sections outline the data model that is used by this specification to express verification methods, such as cryptographic public keys, and data integrity proofs, such as digital signatures.
These verification methods are used to verify Data Integrity Proofs [VC-DATA-INTEGRITY] produced using Elliptic Curve cryptographic key material that is compliant with [FIPS-186-5]. The encoding formats for these key types are provided in this section. Lossless cryptographic key transformation processes that result in equivalent cryptographic key material MAY be used during the processing of digital signatures.
The Multikey format, as defined in [VC-DATA-INTEGRITY], is used to express public keys for the cryptographic suites defined in this specification.
The publicKeyMultibase property represents a Multibase-encoded Multikey expression of a P-256 or P-384 public key.
The Multikey encoding of a P-256 public key MUST start with the two-byte prefix 0x8024 (the varint expression of 0x1200) followed by the 33-byte compressed public key data. The resulting 35-byte value MUST then be encoded using the base-58-btc alphabet, according to the Multibase section in the [VC-DATA-INTEGRITY] specification, and then prepended with the base-58-btc Multibase header (z).
The Multikey encoding of a P-384 public key MUST start with the two-byte prefix 0x8124 (the varint expression of 0x1201) followed by the 49-byte compressed public key data. The resulting 51-byte value MUST then be encoded using the base-58-btc alphabet, according to the Multibase section in the [VC-DATA-INTEGRITY] specification, and then prepended with the base-58-btc Multibase header (z). Any other encodings MUST NOT be allowed.
Developers are advised to not accidentally publish a representation of a private key. Implementations of this specification will raise errors in the event of a Multicodec value other than 0x1200 or 0x1201 being used in a publicKeyMultibase value.
{ "id": "https://example.com/issuer/123#key-0", "type": "Multikey", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "zDnaerx9CtbPJ1q36T5Ln5wYt3MQYeGRG5ehnPAmxcf5mDZpv" }
{ "id": "https://example.com/issuer/123#key-0", "type": "Multikey", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "z82LkvCwHNreneWpsgPEbV3gu1C6NFJEBg4srfJ5gdxEsMGRJ Uz2sG9FE42shbn2xkZJh54" }
{ "@context": [ "https://www.w3.org/ns/did/v1", "https://w3id.org/security/data-integrity/v1" ], "id": "did:example:123", "verificationMethod": [{ "id": "https://example.com/issuer/123#key-1", "type": "Multikey", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "zDnaerx9CtbPJ1q36T5Ln5wYt3MQYeGRG5ehnPAmxcf5mDZpv" }, { "id": "https://example.com/issuer/123#key-2", "type": "Multikey", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "z82LkvCwHNreneWpsgPEbV3gu1C6NFJEBg4srfJ5gdxEsMGRJ Uz2sG9FE42shbn2xkZJh54" }], "authentication": [ "did:example:123#key-1" ], "assertionMethod": [ "did:example:123#key-2" ], "capabilityDelegation": [ "did:example:123#key-2" ], "capabilityInvocation": [ "did:example:123#key-2" ] }
The secretKeyMultibase property represents a Multibase-encoded Multikey expression of a P-256 or P-384 secret key (also sometimes referred to as a private key).
The encoding of a P-256 secret key MUST start with the two-byte prefix 0x8626 (the varint expression of 0x1306) followed by the 32-byte secret key data. The resulting 34-byte value MUST then be encoded using the base-58-btc alphabet, according to the Multibase section in the [VC-DATA-INTEGRITY] specification, and then prepended with the base-58-btc Multibase header (z). Any other encodings MUST NOT be allowed.
The encoding of a P-384 secret key MUST start with the two-byte prefix 0x8726 (the varint expression of 0x1307) followed by the 48-byte secret key data. The resulting 50-byte value MUST then be encoded using the base-58-btc alphabet, according to the Multibase section in the [VC-DATA-INTEGRITY] specification, and then prepended with the base-58-btc Multibase header (z). Any other encodings MUST NOT be allowed.
Developers are advised to prevent accidental publication of a representation of a secret key, and to not export the secretKeyMultibase property by default when serializing key pairs as Multikey.
This section details the proof representation formats that are defined by this specification.
The verificationMethod property of the proof MUST be a URL. Dereferencing the verificationMethod MUST result in an object containing a type property with the value set to Multikey.
The type property of the proof MUST be DataIntegrityProof.
The cryptosuite property of the proof MUST be ecdsa-rdfc-2019 or ecdsa-jcs-2019.
The created property of the proof MUST be an [XMLSCHEMA11-2] formatted date string.
The proofPurpose property of the proof MUST be a string, and MUST match the verification relationship expressed by the verification method controller.
The value of the proofValue property of the proof MUST be an ECDSA signature produced according to [FIPS-186-5], and SHOULD use the deterministic ECDSA variant, using the curves and hashes specified in Section 3. Algorithms. The signature MUST be encoded according to section 7 of [RFC4754] (sometimes referred to as the IEEE P1363 format), and then encoded using the base-58-btc header and alphabet as described in the Multibase section of [VC-DATA-INTEGRITY].
{ "@context": [ {"myWebsite": "https://vocabulary.example/myWebsite"}, "https://www.w3.org/ns/credentials/v2" ], "myWebsite": "https://hello.world.example/", "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-rdfc-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8 fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "proofValue": "z2iAR3F2Sk3mWfYyrinKzSQpSbvfxnz9kkv7roxxumB5RZDP9JUw5QAXuchUd huiwE18hyyZTjiEreKmhH3oj9Q8" } }
The following section describes multiple Data Integrity cryptographic suites that utilize the Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5]. When generating ECDSA signatures, the deterministic ECDSA algorithm variant SHOULD be used.
Implementations SHOULD fetch and cache verification method information as early as possible when adding or verifying proofs. Parameters passed to functions in this section use information from the verification method — such as the public key size — to determine function parameters — such as the cryptographic hashing algorithm.
When the RDF Dataset Canonicalization Algorithm [RDF-CANON] is used with ECDSA algorithms, the cryptographic hashing function that is passed to the algorithm MUST be determined by the size of the associated public key. For P-256 keys, SHA-2 with 256 bits of output is utilized. For P-384 keys, SHA-2 with 384 bits of output is utilized.
When the RDF Dataset Canonicalization Algorithm [RDF-CANON] is used, implementations of that algorithm will detect dataset poisoning by default, and abort processing upon detection.
The ecdsa-rdfc-2019 cryptographic suite takes an input document, canonicalizes the document using the Universal RDF Dataset Canonicalization Algorithm [RDF-CANON], and then cryptographically hashes and signs the output, resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.
To generate a proof, the algorithm in Section 4.1: Add Proof in the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in Section 3.1.3 Transformation (ecdsa-rdfc-2019), the hashing algorithm is defined in Section 3.1.4 Hashing (ecdsa-rdfc-2019), and the proof serialization algorithm is defined in Section 3.1.6 Proof Serialization (ecdsa-rdfc-2019).
To verify a proof, the algorithm in Section 4.2: Verify Proof in the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in Section 3.1.3 Transformation (ecdsa-rdfc-2019), the hashing algorithm is defined in Section 3.1.4 Hashing (ecdsa-rdfc-2019), and the proof verification algorithm is defined in Section 3.1.7 Proof Verification (ecdsa-rdfc-2019).
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.1.4 Hashing (ecdsa-rdfc-2019).
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type) and a cryptosuite identifier (cryptosuite). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
- If options.type is not set to the string DataIntegrityProof and options.cryptosuite is not set to the string ecdsa-rdfc-2019, then a PROOF_TRANSFORMATION_ERROR MUST be raised.
- Let canonicalDocument be the result of applying the Universal RDF Dataset Canonicalization Algorithm [RDF-CANON] to the unsecuredDocument.
- Set output to the value of canonicalDocument.
- Return canonicalDocument as the transformed data document.
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.1.6 Proof Serialization (ecdsa-rdfc-2019) or Section 3.1.7 Proof Verification (ecdsa-rdfc-2019). The hash algorithm used must match the security level of the curve used: SHA-256 for curve P-256, and SHA-384 for curve P-384.
The required inputs to this algorithm are a transformed data document (transformedDocument) and a canonical proof configuration (canonicalProofConfig). A single hash data value, represented as a series of bytes, is produced as output.
- Let transformedDocumentHash be the result of applying the SHA-256 (SHA-2 with 256-bit output) or SHA-384 (SHA-2 with 384-bit output) cryptographic hashing algorithm [RFC6234] to the transformedDocument, for curve P-256 or curve P-384 respectively. transformedDocumentHash will be exactly 32 or 48 bytes in size, respectively.
- Let proofConfigHash be the result of applying the SHA-256 (SHA-2 with 256-bit output) or SHA-384 (SHA-2 with 384-bit output) cryptographic hashing algorithm [RFC6234] to the canonicalProofConfig, for curve P-256 or curve P-384 respectively. proofConfigHash will be exactly 32 or 48 bytes in size, respectively.
- Let hashData be the result of concatenating proofConfigHash (the first hash) followed by transformedDocumentHash (the second hash).
- Return hashData as the hash data.
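As a non-normative illustration, the following JavaScript sketch implements this hashing step for the P-256 case using Node.js's built-in crypto module. The function name hashRdfc2019 is hypothetical; both inputs are assumed to be canonical N-Quads strings.

import { createHash } from 'node:crypto';

function hashRdfc2019(canonicalProofConfig, transformedDocument) {
  // SHA-256 for P-256 keys; substitute 'sha384' for P-384 keys.
  const proofConfigHash = createHash('sha256').update(canonicalProofConfig, 'utf8').digest();
  const transformedDocumentHash = createHash('sha256').update(transformedDocument, 'utf8').digest();
  // proofConfigHash is the first hash, transformedDocumentHash the second.
  return Buffer.concat([proofConfigHash, transformedDocumentHash]); // 64 bytes for P-256
}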
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
- Let proofConfig be an empty object.
- Set proofConfig.type to options.type.
- If options.cryptosuite is set, set proofConfig.cryptosuite to its value.
- If options.type is not set to DataIntegrityProof and proofConfig.cryptosuite is not set to ecdsa-rdfc-2019, an INVALID_PROOF_CONFIGURATION error MUST be raised.
- Set proofConfig.created to options.created. If the value is not a valid [XMLSCHEMA11-2] datetime, an INVALID_PROOF_DATETIME error MUST be raised.
- Set proofConfig.verificationMethod to options.verificationMethod.
- Set proofConfig.proofPurpose to options.proofPurpose.
- Set proofConfig.@context to unsecuredDocument.@context.
- Let canonicalProofConfig be the result of applying the Universal RDF Dataset Canonicalization Algorithm [RDF-CANON] to the proofConfig.
- Return canonicalProofConfig.
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value, represented as a series of bytes, is produced as output.
- Let privateKeyBytes be the result of retrieving the private key bytes associated with the options.verificationMethod value as described in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Retrieving Cryptographic Material.
- Let proofBytes be the result of applying the Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5], with hashData as the data to be signed using the private key specified by privateKeyBytes. proofBytes will be exactly 64 bytes in size for a P-256 key, and 96 bytes in size for a P-384 key.
- Return proofBytes as the digital proof.
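For illustration only, the following sketch shows this signing step using the Web Crypto API, whose ECDSA output is the raw r||s concatenation consistent with the encoding required by this suite. privateKey is assumed to be a P-256 CryptoKey; note that Web Crypto produces randomized rather than deterministic [RFC6979] signatures, so a dedicated library is needed for the deterministic variant this suite recommends.

// hashData is the concatenated hash value produced by the hashing algorithm.
async function signHashData(privateKey, hashData) {
  const signature = await crypto.subtle.sign(
    { name: 'ECDSA', hash: 'SHA-256' }, // use 'SHA-384' for P-384 keys
    privateKey,
    hashData
  );
  return new Uint8Array(signature); // 64 bytes for P-256, 96 bytes for P-384
}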
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData), a digital signature (proofBytes) and proof options (options). A verification result represented as a boolean value is produced as output.
- Let publicKeyBytes be the result of retrieving the public key bytes associated with the options.verificationMethod value as described in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Retrieving Cryptographic Material.
- Let verificationResult be the result of applying the verification algorithm Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5], with hashData as the data to be verified against the proofBytes using the public key specified by publicKeyBytes.
- Return verificationResult as the verification result.
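A corresponding non-normative verification sketch using the Web Crypto API, where publicKey is assumed to be a P-256 CryptoKey:

async function verifyHashData(publicKey, proofBytes, hashData) {
  // Resolves to true if proofBytes is a valid ECDSA signature over hashData.
  return crypto.subtle.verify(
    { name: 'ECDSA', hash: 'SHA-256' }, // use 'SHA-384' for P-384 keys
    publicKey,
    proofBytes,
    hashData
  );
}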
The ecdsa-jcs-2019 cryptographic suite takes an input document, canonicalizes the document using the JSON Canonicalization Scheme [RFC8785], and then cryptographically hashes and signs the output, resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.
To generate a proof, the algorithm in Section 4.1: Add Proof of the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite-specific transformation algorithm is defined in Section 3.2.3 Transformation (ecdsa-jcs-2019), the hashing algorithm is defined in Section 3.2.4 Hashing (ecdsa-jcs-2019), and the proof serialization algorithm is defined in Section 3.2.6 Proof Serialization (ecdsa-jcs-2019).
To verify a proof, the algorithm in Section 4.2: Verify Proof of the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite-specific transformation algorithm is defined in Section 3.2.3 Transformation (ecdsa-jcs-2019), the hashing algorithm is defined in Section 3.2.4 Hashing (ecdsa-jcs-2019), and the proof verification algorithm is defined in Section 3.2.7 Proof Verification (ecdsa-jcs-2019).
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.2.4 Hashing (ecdsa-jcs-2019).
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type) and a cryptosuite identifier (cryptosuite). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
- If options.type is not set to the string DataIntegrityProof and options.cryptosuite is not set to the string ecdsa-jcs-2019, then a PROOF_TRANSFORMATION_ERROR MUST be raised.
- Let canonicalDocument be the result of applying the JSON Canonicalization Scheme [RFC8785] to the unsecuredDocument.
- Set output to the value of canonicalDocument.
- Return canonicalDocument as the transformed data document.
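A minimal, non-normative sketch of this transformation, assuming the canonicalize npm package (an implementation of [RFC8785]) is available. Note that this sketch treats a mismatch of either option as an error:

import canonicalize from 'canonicalize';

function transformJcs2019(unsecuredDocument, options) {
  if (options.type !== 'DataIntegrityProof' || options.cryptosuite !== 'ecdsa-jcs-2019') {
    throw new Error('PROOF_TRANSFORMATION_ERROR');
  }
  // canonicalize returns the RFC 8785 canonical JSON serialization as a string.
  return canonicalize(unsecuredDocument);
}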
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.2.6 Proof Serialization (ecdsa-jcs-2019) or Section 3.2.7 Proof Verification (ecdsa-jcs-2019). The hash algorithm used must match the security level of the curve used: SHA-256 for curve P-256, and SHA-384 for curve P-384.
The required inputs to this algorithm are a transformed data document (transformedDocument) and a canonical proof configuration (canonicalProofConfig). A single hash data value, represented as a series of bytes, is produced as output.
- Let transformedDocumentHash be the result of applying the SHA-256 (SHA-2 with 256-bit output) or SHA-384 (SHA-2 with 384-bit output) cryptographic hashing algorithm [RFC6234] to the transformedDocument, for curve P-256 or curve P-384 respectively. transformedDocumentHash will be exactly 32 or 48 bytes in size, respectively.
- Let proofConfigHash be the result of applying the SHA-256 (SHA-2 with 256-bit output) or SHA-384 (SHA-2 with 384-bit output) cryptographic hashing algorithm [RFC6234] to the canonicalProofConfig, for curve P-256 or curve P-384 respectively. proofConfigHash will be exactly 32 or 48 bytes in size, respectively.
- Let hashData be the result of concatenating proofConfigHash (the first hash) followed by transformedDocumentHash (the second hash).
- Return hashData as the hash data.
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
- Let proofConfig be an empty object.
- Set proofConfig.type to options.type.
- If options.cryptosuite is set, set proofConfig.cryptosuite to its value.
- If options.type is not set to DataIntegrityProof and proofConfig.cryptosuite is not set to ecdsa-jcs-2019, an INVALID_PROOF_CONFIGURATION error MUST be raised.
- Set proofConfig.created to options.created. If the value is not a valid [XMLSCHEMA11-2] datetime, an INVALID_PROOF_DATETIME error MUST be raised.
- Set proofConfig.verificationMethod to options.verificationMethod.
- Set proofConfig.proofPurpose to options.proofPurpose.
- Let canonicalProofConfig be the result of applying the JSON Canonicalization Scheme [RFC8785] to the proofConfig.
- Return canonicalProofConfig.
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value, represented as a series of bytes, is produced as output.
- Let privateKeyBytes be the result of retrieving the private key bytes associated with the options.verificationMethod value as described in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Retrieving Cryptographic Material.
- Let proofBytes be the result of applying the Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5], with hashData as the data to be signed using the private key specified by privateKeyBytes. proofBytes will be exactly 64 bytes in size for a P-256 key, and 96 bytes in size for a P-384 key.
- Return proofBytes as the digital proof.
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData), a digital signature (proofBytes), and proof options (options). A verification result represented as a boolean value is produced as output.
- Let publicKeyBytes be the result of retrieving the public key bytes associated with the options.verificationMethod value as described in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Retrieving Cryptographic Material.
- Let verificationResult be the result of applying the verification algorithm, Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5], with hashData as the data to be verified against the proofBytes using the public key specified by publicKeyBytes.
- Return verificationResult as the verification result.
The Working Group is seeking implementer feedback on these generalized selective disclosure functions, as well as horizontal security review of the features from parties at W3C and IETF. Those reviews might result in significant changes to these functions, migration of these functions to the core Data Integrity specification (for use by other cryptographic suites), or the removal of the algorithms from the specification during the Candidate Recommendation phase.
The following section contains a set of functions that are used throughout cryptographic suites that perform selective disclosure.
The following algorithm canonicalizes an array of N-Quad strings and replaces any blank node identifiers in the canonicalized result using a label map factory function, labelMapFactoryFunction. The required inputs are an array of N-Quad strings (nquads), and a label map factory function (labelMapFactoryFunction). Any custom options can also be passed. An N-Quads representation of the canonicalNQuads as an array of N-Quad strings, with the replaced blank node labels, and a map from the old blank node IDs to the new blank node IDs, labelMap, is produced as output.
- Run the RDF Dataset Canonicalization Algorithm [RDF-CANON] on the joined nquads, passing any custom options, and as output, get the canonicalized dataset, which includes a canonical bnode identifier map, canonicalIdMap.
- Pass canonicalIdMap to labelMapFactoryFunction to produce a new bnode identifier map, labelMap.
- Use the canonicalized dataset and labelMap to produce the canonical N-Quads representation as an array of N-Quad strings, canonicalNQuads.
- Return an object containing labelMap and canonicalNQuads.
The following algorithm canonicalizes a JSON-LD document and replaces any blank node identifiers in the canonicalized result using a label map factory function, labelMapFactoryFunction. The required inputs are a JSON-LD document (document) and a label map factory function (labelMapFactoryFunction). Additional custom options (such as a document loader) can also be passed. An N-Quads representation of the canonicalNQuads as an array of N-Quad strings, with the replaced blank node labels, and a map from the old blank node IDs to the new blank node IDs, labelMap, is produced as output.
- Deserialize the JSON-LD document to RDF, rdf, using the Deserialize JSON-LD to RDF algorithm, passing any custom options (such as a document loader).
- Serialize rdf to an array of N-Quad strings, nquads.
- Return the result of calling the algorithm in Section 3.3.1 labelReplacementCanonicalizeNQuads, passing nquads, labelMapFactoryFunction, and any custom options.
The following algorithm creates a label map factory function that uses an input label map to replace canonical blank node identifiers with another value. The required input is a label map, labelMap. A function, labelMapFactoryFunction, is produced as output.
- Create a function, labelMapFactoryFunction, with one required input (a canonical node identifier map, canonicalIdMap), that will return a blank node identifier map, bnodeIdMap, as output. Set the function's implementation to:
  - Generate a new empty bnode identifier map, bnodeIdMap.
  - For each map entry, entry, in canonicalIdMap:
    - Use the canonical identifier from the value in entry as a key in labelMap to get the new label, newLabel.
    - Add a new entry, newEntry, to bnodeIdMap using the key from entry and newLabel as the value.
  - Return bnodeIdMap.
- Return labelMapFactoryFunction.
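An illustrative JavaScript sketch of this factory, assuming labelMap and canonicalIdMap are Map instances:

function createLabelMapFunction(labelMap) {
  return function labelMapFactoryFunction(canonicalIdMap) {
    const bnodeIdMap = new Map();
    for (const [input, canonicalId] of canonicalIdMap) {
      // Look up the replacement label for each canonical identifier.
      bnodeIdMap.set(input, labelMap.get(canonicalId));
    }
    return bnodeIdMap;
  };
}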
The following algorithm creates a label map factory function that uses an HMAC to replace canonical blank node identifiers with their encoded HMAC digests. The required input is an HMAC (previously initialized with a secret key), HMAC. A function, labelMapFactoryFunction, is produced as output.
- Create a function, labelMapFactoryFunction, with one required input (a canonical node identifier map, canonicalIdMap), that will return a blank node identifier map, bnodeIdMap, as output. Set the function's implementation to:
  - Generate a new empty bnode identifier map, bnodeIdMap.
  - For each map entry, entry, in canonicalIdMap:
    - HMAC the canonical identifier from the value in entry to get an HMAC digest, digest.
    - Generate a new string value, b64urlDigest, and initialize it to "u" followed by appending a base64url-no-pad encoded version of the digest value.
    - Add a new entry, newEntry, to bnodeIdMap using the key from entry and b64urlDigest as the value.
  - Return bnodeIdMap.
- Return labelMapFactoryFunction.
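A non-normative sketch of this factory using Node.js's crypto module, assuming SHA-256 and taking the HMAC key directly rather than a previously initialized HMAC API:

import { createHmac } from 'node:crypto';

function createHmacIdLabelMapFunction(hmacKey) {
  return function labelMapFactoryFunction(canonicalIdMap) {
    const bnodeIdMap = new Map();
    for (const [input, canonicalId] of canonicalIdMap) {
      // Node's 'base64url' digest encoding is unpadded, as required.
      const digest = createHmac('sha256', hmacKey).update(canonicalId).digest('base64url');
      bnodeIdMap.set(input, 'u' + digest); // "u" is the Multibase base64url-no-pad header
    }
    return bnodeIdMap;
  };
}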
A different primitive could be created that returned a label map factory function that would instead sort the resulting HMAC digests and assign labels in the produced label map using a prefix and integers based on their sorted order. This primitive might be useful for selective disclosure schemes, such as BBS, that favor unlinkability over minimizing unrevealed data leakage.
The following algorithm replaces all blank node identifiers in an array of N-Quad strings with custom scheme URNs. The required inputs are an array of N-Quad strings (inputNQuads) and a URN scheme (urnScheme). An array of N-Quad strings, skolemizedNQuads, is produced as output. This operation is intended to be reversible through the use of the algorithm in Section 3.3.6 deskolemizeNQuads.
- Create a new array of N-Quad strings, skolemizedNQuads.
- For each N-Quad string, s1, in inputNQuads:
  - Create a new string, s2, that is a copy of s1, replacing any occurrence of a blank node identifier with a URN ("urn:"), plus the input custom scheme (urnScheme), plus a colon (":"), plus the value of the blank node identifier. For example, a regular expression of a similar form to the following would achieve the desired result: s1.replace(/(_:([^\s]+))/g, '<urn:custom-scheme:$2>').
  - Append s2 to skolemizedNQuads.
- Return skolemizedNQuads.
The following algorithm replaces all custom scheme URNs in an array of N-Quad statements with a blank node identifier. The required inputs are an array of N-Quad strings (inputNQuads) and a URN scheme (urnScheme). An array of N-Quad strings, deskolemizedNQuads, is produced as output. This operation is intended to reverse the use of the algorithm in Section 3.3.5 skolemizeNQuads.
- Create a new array of N-Quad strings, deskolemizedNQuads.
- For each N-Quad string, s1, in inputNQuads:
  - Create a new string, s2, that is a copy of s1, replacing any occurrence of a URN ("urn:"), plus the input custom scheme (urnScheme), plus a colon (":"), plus the value of the blank node identifier, with a blank node prefix ("_:") plus the value of the blank node identifier. For example, a regular expression of a similar form to the following would achieve the desired result: s1.replace(/(<urn:custom-scheme:([^>]+)>)/g, '_:$2').
  - Append s2 to deskolemizedNQuads.
- Return deskolemizedNQuads.
The following algorithm replaces all blank node identifiers in an expanded JSON-LD document with custom-scheme URNs, including assigning such URNs to blank nodes that are unlabeled. The required inputs are an expanded JSON-LD document (expanded), a custom URN scheme (urnScheme), a UUID string or other comparably random string (randomString), and a reference to a shared integer (count). Any additional custom options (such as a document loader) can also be passed. It produces the expanded form of the skolemized JSON-LD document (skolemizedExpandedDocument) as output. The skolemization used in this operation is intended to be reversible through the use of the algorithm in Section 3.3.9 toDeskolemizedNQuads.
- Initialize skolemizedExpandedDocument to an empty array.
- For each element in expanded:
  - If either element is not an object or it contains the key @value, append a copy of element to skolemizedExpandedDocument and continue to the next element.
  - Otherwise, initialize skolemizedNode to an object, and for each property and value in element:
    - If value is an array, set the value of property in skolemizedNode to the result of calling this algorithm recursively passing value for expanded and keeping the other parameters the same.
    - Otherwise, set the value of property in skolemizedNode to the first element in the array result of calling this algorithm recursively passing an array with value as its only element for expanded and keeping the other parameters the same.
  - If skolemizedNode has no @id property, set the value of the @id property in skolemizedNode to the concatenation of "urn:", urnScheme, "_", randomString, "_" and the value of count, incrementing the value of count afterwards.
  - Otherwise, if the value of the @id property in skolemizedNode starts with "_:", preserve the existing blank node identifier when skolemizing by setting the value of the @id property in skolemizedNode to the concatenation of "urn:", urnScheme, and the existing value of the @id property.
  - Append skolemizedNode to skolemizedExpandedDocument.
- Return skolemizedExpandedDocument.
The following algorithm replaces all blank node identifiers in a compact JSON-LD document with custom-scheme URNs. The required inputs are a compact JSON-LD document (document) and a custom URN scheme (urnScheme). The document is assumed to use only one @context property at the top level of the document. Any additional custom options (such as a document loader) can also be passed. It produces both an expanded form of the skolemized JSON-LD document (skolemizedExpandedDocument) and a compact form of the skolemized JSON-LD document (skolemizedCompactDocument) as output. The skolemization used in this operation is intended to be reversible through the use of the algorithm in Section 3.3.9 toDeskolemizedNQuads.
- Initialize expanded to the result of the JSON-LD Expansion Algorithm, passing document and any custom options.
- Initialize skolemizedExpandedDocument to the result of the algorithm in Section 3.3.7 skolemizeExpandedJsonLd.
- Initialize skolemizedCompactDocument to the result of the JSON-LD Compaction Algorithm, passing skolemizedExpandedDocument and any custom options.
- Return an object with both skolemizedExpandedDocument and skolemizedCompactDocument.
The following algorithm converts a skolemized JSON-LD document, such as one created using the algorithm in Section 3.3.8 skolemizeCompactJsonLd, to an array of deskolemized N-Quads. The required input is a JSON-LD document, skolemizedDocument. Additional custom options (such as a document loader) can be passed. An array of deskolemized N-Quad strings (deskolemizedNQuads) is produced as output.
- Initialize skolemizedDataset to the result of the Deserialize JSON-LD to RDF algorithm, passing any custom options (such as a document loader), to convert skolemizedDocument from JSON-LD to RDF in N-Quads format.
- Split skolemizedDataset into an array of individual N-Quads, skolemizedNQuads.
- Set deskolemizedNQuads to the result of the algorithm in Section 3.3.6 deskolemizeNQuads, with skolemizedNQuads and "custom-scheme:" as parameters. Implementations MAY choose a urnScheme different from "custom-scheme:", so long as the same scheme name was used to generate skolemizedDocument.
- Return deskolemizedNQuads.
The following algorithm converts a JSON Pointer [RFC6901] to an array of paths into a JSON tree. The required input is a JSON Pointer string (pointer). An array of paths (paths) is produced as output.
- Initialize paths to an empty array.
- Initialize splitPath to an array by splitting pointer on the "/" character and skipping the first, empty, split element. In JavaScript notation, this step is equivalent to the following code: pointer.split('/').slice(1).
- For each path in splitPath:
  - If path does not include ~, then add path to paths, converting it to an integer if it parses as one, leaving it as a string if it does not.
  - Otherwise, unescape any JSON Pointer escape sequences in path and add the result to paths.
- Return paths.
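An illustrative JavaScript sketch of this conversion:

function jsonPointerToPaths(pointer) {
  const paths = [];
  const splitPath = pointer.split('/').slice(1);
  for (const path of splitPath) {
    if (!path.includes('~')) {
      // Convert purely numeric segments to integers; keep others as strings.
      paths.push(/^\d+$/.test(path) ? parseInt(path, 10) : path);
    } else {
      // Unescape JSON Pointer sequences: "~1" becomes "/", then "~0" becomes "~".
      paths.push(path.replace(/~1/g, '/').replace(/~0/g, '~'));
    }
  }
  return paths;
}

For example, a (hypothetical) pointer "/credentialSubject/degrees/0/name" yields ['credentialSubject', 'degrees', 0, 'name'].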
The following algorithm creates an initial selection (a fragment of a JSON-LD document) based on a JSON-LD object. This is a helper function used within the algorithm in Section 3.3.13 selectJsonLd. The required input is a JSON-LD object (source). A JSON-LD document fragment object (selection) is produced as output.
- Initialize selection to an empty object.
- If source has an id that is not a blank node identifier, set selection.id to its value. Note: All non-blank node identifiers in the path of any JSON Pointer MUST be included in the selection; this includes any root document identifier.
- If source.type is set, set selection.type to its value. Note: The selection MUST include all types in the path of any JSON Pointer, including any root document type.
- Return selection.
The following algorithm selects a portion of a compact JSON-LD document using paths parsed from a JSON Pointer. This is a helper function used within the algorithm in Section 3.3.13 selectJsonLd. The required inputs are an array of paths (paths) parsed from a JSON Pointer, a compact JSON-LD document (document), a selection document (selectionDocument) to be populated, and an array of arrays (arrays) for tracking selected arrays. This algorithm produces no output; instead, it populates the given selectionDocument with any values selected via paths.
- Initialize parentValue to document.
- Initialize value to parentValue.
- Initialize selectedParent to selectionDocument.
- Initialize selectedValue to selectedParent.
- For each path in paths:
  - Set selectedParent to selectedValue.
  - Set parentValue to value.
  - Set value to parentValue[path]. If value is now undefined, throw an error indicating that the JSON Pointer does not match the given document.
  - Set selectedValue to selectedParent[path].
  - If selectedValue is now undefined:
    - If value is an array, set selectedValue to an empty array and append selectedValue to arrays.
    - Otherwise, set selectedValue to an initial selection passing value as source to the algorithm in Section 3.3.11 createInitialSelection.
    - Set selectedParent[path] to selectedValue.
- Note: With path traversal complete at the target value, the selected value will now be computed.
- If value is a literal, set selectedValue to value.
- If value is an array, set selectedValue to a copy of value.
- In all other cases, set selectedValue to an object that merges a shallow copy of selectedValue with a deep copy of value, e.g., {...selectedValue, ...deepCopy(value)}.
- Get the last path, lastPath, from paths.
- Set selectedParent[lastPath] to selectedValue.
The following algorithm selects a portion of a compact JSON-LD document using an array of JSON Pointers. The required inputs are an array of JSON Pointers (pointers) and a compact JSON-LD document (document). The document is assumed to use a JSON-LD context that aliases @id and @type to id and type, respectively, and to use only one @context property at the top level of the document. A new JSON-LD document that represents a selection (selectionDocument) of the original JSON-LD document is produced as output.
- If pointers is empty, return null. This indicates nothing has been selected from the original document.
- Initialize arrays to an empty array. This variable will be used to track selected sparse arrays to make them dense after all pointers have been processed.
- Initialize selectionDocument to an initial selection passing document as source to the algorithm in Section 3.3.11 createInitialSelection.
- Set the value of the @context property in selectionDocument to a copy of the value of the @context property in document.
- For each pointer in pointers, walk the document from root to the pointer target value, building the selectionDocument:
  - Parse the pointer into an array of paths, paths, using the algorithm in Section 3.3.10 jsonPointerToPaths.
  - Use the algorithm in Section 3.3.12 selectPaths, passing document, paths, selectionDocument, and arrays.
- For each array in arrays:
  - Make array dense by removing any undefined elements between elements that are defined.
- Return selectionDocument.
The following algorithm relabels the blank node identifiers in an array of N-Quad strings using a blank node label map. The required inputs are an array of N-Quad strings (nquads) and a blank node label map (labelMap). An array of N-Quad strings with relabeled blank node identifiers (relabeledNQuads) is produced as output.
- Create a new array of N-Quad strings, relabeledNQuads.
- For each N-Quad string, s1, in nquads:
  - Create a new string, s2, such that it is a copy of s1 except each blank node identifier therein has been replaced with the value associated with it as a key in labelMap.
  - Append s2 to relabeledNQuads.
- Return relabeledNQuads.
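A short, non-normative sketch of this relabeling:

function relabelBlankNodes(nquads, labelMap) {
  // Replace each blank node identifier with its mapped value from labelMap.
  return nquads.map(s1 =>
    s1.replace(/(_:([^\s]+))/g, (match, _, label) => `_:${labelMap.get(label)}`)
  );
}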
The following algorithm selects a portion of a skolemized compact JSON-LD document using an array of JSON Pointers, and outputs the resulting canonical N-Quads with any blank node labels replaced using the given label map. The required inputs are an array of JSON Pointers (pointers), a skolemized compact JSON-LD document (skolemizedCompactDocument), and a blank node label map (labelMap). Additional custom options (such as a document loader) can be passed. The document is assumed to use a JSON-LD context that aliases @id and @type to id and type, respectively, and to use only one @context property at the top level of the document. An object containing the new JSON-LD document that represents a selection of the original JSON-LD document (selectionDocument), an array of deskolemized N-Quad strings (deskolemizedNQuads), and an array of canonical N-Quads with replacement blank node labels (nquads) is produced as output.
- Initialize selectionDocument to the result of the algorithm in Section 3.3.13 selectJsonLd, passing pointers, and skolemizedCompactDocument as document.
- Initialize deskolemizedNQuads to the result of the algorithm in Section 3.3.9 toDeskolemizedNQuads, passing selectionDocument as skolemizedCompactDocument, and any custom options.
- Initialize nquads to the result of the algorithm in Section 3.3.14 relabelBlankNodes, passing labelMap, and deskolemizedNQuads as nquads.
- Return an object containing selectionDocument, deskolemizedNQuads, and nquads.
The following algorithm is used to output canonical N-Quad strings that match custom selections of a compact JSON-LD document. It does this by canonicalizing a compact JSON-LD document (replacing any blank node identifiers using a label map) and grouping the resulting canonical N-Quad strings according to the selection associated with each group. Each group will be defined using an assigned name and array of JSON pointers. The JSON pointers will be used to select portions of the skolemized document, such that the output can be converted to canonical N-Quads to perform group matching.
The required inputs are a compact JSON-LD document (document), a label map factory function (labelMapFactoryFunction), and a map of named group definitions (groupDefinitions). Additional custom options (such as a document loader) can be passed. The document is assumed to use a JSON-LD context that aliases @id and @type to id and type, respectively, and to use only one @context property at the top level of the document. An object containing the created groups (groups), the skolemized compact JSON-LD document (skolemizedCompactDocument), the skolemized expanded JSON-LD document (skolemizedExpandedDocument), the deskolemized N-Quad strings (deskolemizedNQuads), the blank node label map (labelMap), and the canonical N-Quad strings (nquads) is produced as output.
- Initialize skolemizedExpandedDocument and skolemizedCompactDocument to their associated values in the result of the algorithm in Section 3.3.8 skolemizeCompactJsonLd, passing document and any custom options.
- Initialize deskolemizedNQuads to the result of the algorithm in Section 3.3.9 toDeskolemizedNQuads, passing skolemizedExpandedDocument and any custom options.
- Initialize nquads and labelMap to their associated values in the result of the algorithm in Section 3.3.1 labelReplacementCanonicalizeNQuads, passing labelMapFactoryFunction, deskolemizedNQuads as nquads, and any custom options.
- Initialize selections to a new map.
- For each key (name) and value (pointers) entry in groupDefinitions:
  - Add an entry with a key of name and a value that is the result of the algorithm in Section 3.3.15 selectCanonicalNQuads, passing pointers, labelMap, skolemizedCompactDocument as document, and any custom options.
- Initialize groups to an empty object.
- For each key (name) and value (selectionResult) entry in selections:
  - Initialize matching to an empty map.
  - Initialize nonMatching to an empty map.
  - Initialize selectedNQuads to nquads from selectionResult.
  - Initialize selectedDeskolemizedNQuads from deskolemizedNQuads from selectionResult.
  - For each element (nq) and index (index) in nquads:
    - Create a map entry, entry, with a key of index and a value of nq.
    - If selectedNQuads includes nq, then add entry to matching; otherwise, add entry to nonMatching.
  - Set name in groups to an object containing matching, nonMatching, and selectedDeskolemizedNQuads as deskolemizedNQuads.
- Return an object containing groups, skolemizedExpandedDocument, skolemizedCompactDocument, deskolemizedNQuads, labelMap, and nquads.
The following algorithm cryptographically hashes an array of mandatory to disclose N-Quads using a provided hashing API. The required inputs are an array of mandatory to disclose N-Quads (mandatory) and a hashing function (hasher). A cryptographic hash (mandatoryHash) is produced as output.
- Initialize bytes to the UTF-8 representation of the joined mandatory N-Quads.
- Initialize mandatoryHash to the result of using hasher to hash bytes.
- Return mandatoryHash.
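A minimal, non-normative sketch, assuming SHA-256 as the hasher and that each N-Quad string already ends with a newline:

import { createHash } from 'node:crypto';

function hashMandatoryNQuads(mandatory) {
  // Join the N-Quads and hash the UTF-8 representation of the result.
  const bytes = Buffer.from(mandatory.join(''), 'utf8');
  return createHash('sha256').update(bytes).digest();
}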
The Working Group is seeking implementer feedback on these cryptographic suite functions, as well as horizontal security review of the feature from parties at W3C and IETF. Those reviews might result in significant changes to these algorithms, or the removal of the algorithms from the specification during the Candidate Recommendation phase.
This section contains subalgorithms that are useful to the ecdsa-sd-2023 cryptographic suite.
The following algorithm serializes the data that is to be signed by the private key associated with the base proof verification method. The required inputs are the proof options hash (proofHash), the proof-scoped multikey-encoded public key (publicKey), and the mandatory hash (mandatoryHash). A single sign data value, represented as a series of bytes, is produced as output.
- Return the concatenation of proofHash, publicKey, and mandatoryHash, in that order, as sign data.
The following algorithm serializes the base proof value, including the base signature, public key, HMAC key, signatures, and mandatory pointers. The required inputs are a base signature baseSignature, a public key publicKey, an HMAC key hmacKey, an array of signatures, and an array of mandatoryPointers. A single base proof string value is produced as output.
- Initialize a byte array, proofValue, that starts with the ECDSA-SD base proof header bytes 0xd9, 0x5d, and 0x00.
- Initialize components to an array with five elements containing the values of: baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers.
- CBOR-encode components and append it to proofValue.
- Initialize baseProof to a string with the Multibase base64url-no-pad-encoding of proofValue, as described in the Multibase section of [VC-DATA-INTEGRITY]. That is, return a string starting with "u" and ending with the base64url-no-pad-encoded value of proofValue.
- Return baseProof as base proof.
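A non-normative sketch of this serialization, assuming the cbor and multiformats npm packages:

import cbor from 'cbor';
import { base64url } from 'multiformats/bases/base64';

const BASE_PROOF_HEADER = new Uint8Array([0xd9, 0x5d, 0x00]);

function serializeBaseProofValue({ baseSignature, publicKey, hmacKey, signatures, mandatoryPointers }) {
  const components = [baseSignature, publicKey, hmacKey, signatures, mandatoryPointers];
  const encoded = cbor.encode(components);
  const proofValue = new Uint8Array(BASE_PROOF_HEADER.length + encoded.length);
  proofValue.set(BASE_PROOF_HEADER);
  proofValue.set(encoded, BASE_PROOF_HEADER.length);
  // base64url.encode prepends the Multibase header "u".
  return base64url.encode(proofValue);
}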
The following algorithm parses the components of an ecdsa-sd-2023 selective disclosure base proof value. The required input is a proof value (proofValue). A single parsed base proof object, containing five elements, using the names "baseSignature", "publicKey", "hmacKey", "signatures", and "mandatoryPointers", is produced as output.
- Ensure the proofValue string starts with u, indicating that it is a multibase-base64url-no-pad-encoded value, throwing an error if it does not.
- Initialize decodedProofValue to the result of base64url-no-pad-decoding the substring after the leading u in proofValue.
- Ensure that the decodedProofValue starts with the ECDSA-SD base proof header bytes 0xd9, 0x5d, and 0x00, throwing an error if it does not.
- Initialize components to an array that is the result of CBOR-decoding the bytes that follow the three-byte ECDSA-SD base proof header. Ensure the result is an array of five elements.
- Return an object with properties set to the five elements, using the names "baseSignature", "publicKey", "hmacKey", "signatures", and "mandatoryPointers", respectively.
The following algorithm creates data to be used to generate a derived proof. The inputs include a JSON-LD document (document), an ECDSA-SD base proof (proof), an array of JSON Pointers to use to selectively disclose statements (selectivePointers), and any custom JSON-LD API options (such as a document loader). A single object, disclosure data, is produced as output, which contains the "baseSignature", "publicKey", "signatures" for "filteredSignatures", "labelMap", "mandatoryIndexes", and "revealDocument" fields.
- Initialize baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers to the values of the associated properties in the object returned when calling the algorithm in Section 3.4.3 parseBaseProofValue, passing the proofValue from proof.
- Initialize hmac to an HMAC API using hmacKey. The HMAC uses the same hash algorithm used in the signature algorithm, i.e., SHA-256 for a P-256 curve.
- Initialize labelMapFactoryFunction to the result of calling the createHmacIdLabelMapFunction algorithm, passing hmac as HMAC.
- Initialize combinedPointers to the concatenation of mandatoryPointers and selectivePointers.
- Initialize groupDefinitions to a map with the following entries: key of the string "mandatory" and value of mandatoryPointers, key of the string "selective" and value of selectivePointers, and key of the string "combined" and value of combinedPointers.
- Initialize groups and labelMap to their associated values in the result of calling the algorithm in Section 3.3.16 canonicalizeAndGroup, passing document, labelMapFactoryFunction, groupDefinitions, and any custom JSON-LD API options as parameters. Note: This step transforms the document into an array of canonical N-Quad strings with pseudorandom blank node identifiers based on hmac, and groups the N-Quad strings according to selections based on JSON Pointers.
- Initialize relativeIndex to zero.
- Initialize mandatoryIndexes to an empty array.
- For each absoluteIndex in the keys in groups.combined.matching, convert the absolute index of any mandatory N-Quad to an index relative to the combined output that is to be revealed:
  - If groups.mandatory.matching has absoluteIndex as a key, then append relativeIndex to mandatoryIndexes.
  - Increment relativeIndex.
- Determine which signatures match a selectively disclosed statement, which requires incrementing an index counter while iterating over all signatures, skipping over any indexes that match the mandatory group:
  - Initialize index to 0.
  - Initialize filteredSignatures to an empty array.
  - For each signature in signatures:
    - While index is in groups.mandatory.matching, increment index.
    - If index is in groups.selective.matching, add signature to filteredSignatures.
    - Increment index.
- Initialize revealDocument to the result of the "selectJsonLd" algorithm, passing document, and combinedPointers as pointers.
. - Run the RDF Dataset Canonicalization Algorithm [RDF-CANON] on the joined combinedGroup.deskolemizedNQuads, passing any custom options, and get the canonical bnode identifier map, canonicalIdMap. Note: This map includes the canonical blank node identifiers that a verifier will produce when they canonicalize the reveal document.
- Initialize verifierLabelMap to an empty map. This map will map the canonical blank node identifiers the verifier will produce when they canonicalize the revealed document to the blank node identifiers that were originally signed in the base proof.
-
For each key (
inputLabel
) and value (verifierLabel
) in `canonicalIdMap:-
Add an entry to
verifierLabelMap
usingverifierLabel
as the key and the value associated withinputLabel
as a key inlabelMap
as the value.
-
Add an entry to
-
Return an object with properties matching
baseSignature
,publicKey
, "signatures" forfilteredSignatures
, "verifierLabelMap" forlabelMap
,mandatoryIndexes
, andrevealDocument
.
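The two index-tracking loops above are the subtlest part of this algorithm. The following non-normative Python sketch shows one way to implement them; it assumes groups is a structure whose matching maps are keyed by absolute N-Quad index, as produced by canonicalizeAndGroup, and that those keys iterate in ascending statement order.

def remap_and_filter(groups, signatures):
    mandatory_matching = groups["mandatory"]["matching"]
    selective_matching = groups["selective"]["matching"]
    combined_matching = groups["combined"]["matching"]

    # Convert absolute mandatory indexes to indexes relative to the
    # combined (revealed) statements.
    mandatory_indexes = []
    for relative_index, absolute_index in enumerate(sorted(combined_matching)):
        if absolute_index in mandatory_matching:
            mandatory_indexes.append(relative_index)

    # Signatures exist only for non-mandatory statements, in order; skip
    # absolute indexes occupied by mandatory statements while pairing them.
    filtered_signatures = []
    index = 0
    for signature in signatures:
        while index in mandatory_matching:
            index += 1
        if index in selective_matching:
            filtered_signatures.append(signature)
        index += 1
    return mandatory_indexes, filtered_signatures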
The following algorithm compresses a label map. The required input is a label map (labelMap). The output is a compressed label map.
- Initialize map to an empty map.
- For each entry (k, v) in labelMap:
  - Add an entry to map with a key that is a base-10 integer parsed from the characters following the "c14n" prefix in k, and a value that is a byte array resulting from base64url-no-pad-decoding the characters after the "u" prefix in v.
- Return map as compressed label map.
The following algorithm decompresses a label map. The required input is a compressed label map (compressedLabelMap). The output is a decompressed label map.
- Initialize map to an empty map.
- For each entry (k, v) in compressedLabelMap:
  - Add an entry to map with a key that adds the prefix "c14n" to k, and a value that adds a prefix of "u" to the base64url-no-pad-encoded value for v.
- Return map as decompressed label map.
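A minimal non-normative sketch of both mappings, assuming label map keys of the form "c14nN" and values of the form "u" followed by a base64url-no-pad string, as produced by the HMAC-based labeling:

import base64

def compress_label_map(label_map):
    # "c14n0" -> 0, "u<base64url>" -> raw digest bytes.
    return {
        int(k[len("c14n"):]):
            base64.urlsafe_b64decode(v[1:] + "=" * (-(len(v) - 1) % 4))
        for k, v in label_map.items()
    }

def decompress_label_map(compressed_label_map):
    # Inverse of compress_label_map.
    return {
        "c14n%d" % k:
            "u" + base64.urlsafe_b64encode(v).rstrip(b"=").decode("ascii")
        for k, v in compressed_label_map.items()
    }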
The following algorithm serializes a derived proof value. The required inputs are a base signature (baseSignature), public key (publicKey), an array of signatures (signatures), a label map (labelMap), and an array of mandatory indexes (mandatoryIndexes). A single derived proof value, serialized as a byte string, is produced as output.
- Initialize compressedLabelMap to the result of calling the algorithm in Section 3.4.5 compressLabelMap, passing labelMap as the parameter.
- Initialize a byte array, proofValue, that starts with the ECDSA-SD disclosure proof header bytes 0xd9, 0x5d, and 0x01.
- Initialize components to an array with five elements containing the values of: baseSignature, publicKey, signatures, compressedLabelMap, and mandatoryIndexes.
- CBOR-encode components and append it to proofValue.
- Return the derived proof as a string with the base64url-no-pad-encoding of proofValue as described in the Multibase section of [VC-DATA-INTEGRITY]. That is, return a string starting with "u" and ending with the base64url-no-pad-encoded value of proofValue.
The following algorithm parses the components of the derived proof value. The required input is a derived proof value (proofValue). A single derived proof value object is produced as output, containing five elements, using the names "baseSignature", "publicKey", "signatures", "labelMap", and "mandatoryIndexes".
- Ensure the proofValue string starts with u, indicating that it is a multibase-base64url-no-pad-encoded value, throwing an error if it does not.
- Initialize decodedProofValue to the result of base64url-no-pad-decoding the substring after the leading u in proofValue.
- Ensure that the decodedProofValue starts with the ECDSA-SD disclosure proof header bytes 0xd9, 0x5d, and 0x01, throwing an error if it does not.
- Initialize components to an array that is the result of CBOR-decoding the bytes that follow the three-byte ECDSA-SD disclosure proof header. Ensure the result is an array of five elements: a byte array of length 64, a byte array of length 36, an array of byte arrays, each of length 64, a map of integers to byte arrays of length 32, and an array of integers, throwing an error if not.
- Replace the fourth element in components using the result of calling the algorithm in Section 3.4.6 decompressLabelMap, passing the existing fourth element of components as compressedLabelMap.
- Return derived proof value as an object with properties set to the five elements, using the names "baseSignature", "publicKey", "signatures", "labelMap", and "mandatoryIndexes", respectively.
The following algorithm creates the data needed to perform verification of an ECDSA-SD-protected verifiable credential. The inputs include a JSON-LD document (document), an ECDSA-SD disclosure proof (proof), and any custom JSON-LD API options, such as a document loader. A single verify data object value is produced as output containing the following fields: "baseSignature", "proofHash", "publicKey", "signatures", "nonMandatory", and "mandatoryHash".
- Initialize proofHash to the result of performing RDF Dataset Canonicalization [RDF-CANON] on the proof options and then cryptographically hashing the result. The hash used is the same as the one used in the signature algorithm, i.e., SHA-256 for a P-256 curve. Note: This step can be performed in parallel; it only needs to be completed before this algorithm needs to use the proofHash value.
- Initialize baseSignature, publicKey, signatures, labelMap, and mandatoryIndexes to the values associated with their property names in the object returned when calling the algorithm in Section 3.4.8 parseDerivedProofValue, passing proofValue from proof.
- Initialize labelMapFactoryFunction to the result of calling the "createLabelMapFunction" algorithm, passing labelMap.
- Initialize nquads to the result of calling the "labelReplacementCanonicalize" algorithm, passing document, labelMapFactoryFunction, and any custom JSON-LD API options. Note: This step transforms the document into an array of canonical N-Quads with pseudorandom blank node identifiers based on labelMap.
- Initialize mandatory to an empty array.
- Initialize nonMandatory to an empty array.
- For each entry (index, nq) in nquads, separate the N-Quads into mandatory and non-mandatory categories:
  - If mandatoryIndexes includes index, add nq to mandatory.
  - Otherwise, add nq to nonMandatory.
- Initialize mandatoryHash to the result of calling the "hashMandatoryNQuads" primitive in Section 3.3.17, passing mandatory.
- Return an object with properties matching baseSignature, proofHash, publicKey, signatures, nonMandatory, and mandatoryHash.
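The separation loop above is straightforward; a non-normative sketch in Python:

def split_by_mandatory_indexes(nquads, mandatory_indexes):
    # Separate canonical N-Quads into mandatory and non-mandatory lists,
    # preserving their relative order.
    mandatory_set = set(mandatory_indexes)
    mandatory, non_mandatory = [], []
    for index, nq in enumerate(nquads):
        (mandatory if index in mandatory_set else non_mandatory).append(nq)
    return mandatory, non_mandatory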
The Working Group is seeking implementer feedback on this cryptographic suite as well as horizontal security review on the feature from parties at W3C and IETF. Those reviews might result in significant changes to this algorithm, or the removal of the algorithm from the specification during the Candidate Recommendation phase.
The ecdsa-sd-2023 cryptographic suite takes an input document, canonicalizes the document using the Universal RDF Dataset Canonicalization Algorithm [RDF-CANON], and then cryptographically hashes and signs the output, resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.
To generate a base proof, the algorithm in Section 4.1: Add Proof in the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in Section 3.5.2 Base Proof Transformation (ecdsa-sd-2023), the hashing algorithm is defined in Section 3.5.3 Base Proof Hashing (ecdsa-sd-2023), and the proof serialization algorithm is defined in Section 3.5.5 Base Proof Serialization (ecdsa-sd-2023).
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.5.3 Base Proof Hashing (ecdsa-sd-2023).
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type), a cryptosuite identifier (cryptosuite), and a verification method (verificationMethod). The transformation options MUST contain an array of mandatory JSON pointers (mandatoryPointers) and MAY contain additional options, such as a JSON-LD document loader. A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
- Initialize hmac to an HMAC API using a locally generated and exportable HMAC key. The HMAC uses the same hash algorithm used in the signature algorithm, which is detected via the verificationMethod provided to the function, i.e., SHA-256 for a P-256 curve.
- Initialize labelMapFactoryFunction to the result of calling the createHmacIdLabelMapFunction algorithm, passing hmac as HMAC. A non-normative sketch of the resulting label mapping is given after these steps.
- Initialize groupDefinitions to a map with an entry with a key of the string "mandatory" and a value of mandatoryPointers.
- Initialize groups to the result of calling the algorithm in Section 3.3.16 canonicalizeAndGroup, passing labelMapFactoryFunction, groupDefinitions, unsecuredDocument as document, and any custom JSON-LD API options. Note: This step transforms the document into an array of canonical N-Quads with pseudorandom blank node identifiers based on hmac, and groups the N-Quad strings according to selections based on JSON pointers.
- Initialize mandatory to the values in the groups.mandatory.matching map.
- Initialize nonMandatory to the values in the groups.mandatory.nonMatching map.
- Initialize hmacKey to the result of exporting the HMAC key from hmac.
- Return an object with "mandatoryPointers" set to mandatoryPointers, "mandatory" set to mandatory, "nonMandatory" set to nonMandatory, and "hmacKey" set to hmacKey.
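The following non-normative Python sketch illustrates the kind of label mapping produced in the second step; it assumes, for illustration only, that the HMAC input is the canonical label string (e.g., "c14n0"), with the normative behavior defined by the createHmacIdLabelMapFunction algorithm. The 32-byte SHA-256 digest is base64url-no-pad encoded and prefixed with "u", yielding blank node labels of the shape seen in the test vectors in the appendix.

import base64
import hashlib
import hmac

def hmac_id_label(hmac_key, canonical_label):
    # Derive a pseudorandom blank node label from a canonical one.
    digest = hmac.new(hmac_key, canonical_label.encode("utf-8"),
                      hashlib.sha256).digest()
    return "u" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")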
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.5.5 Base Proof Serialization (ecdsa-sd-2023).
The required inputs to this algorithm are a transformed data document (transformedDocument) and canonical proof configuration (canonicalProofConfig). A hash data value represented as an object is produced as output.
- Initialize proofHash to the result of calling the RDF Dataset Canonicalization algorithm [RDF-CANON] on canonicalProofConfig and then cryptographically hashing the result using the same hash that is used by the signature algorithm, i.e., SHA-256 for a P-256 curve. Note: This step can be performed in parallel; it only needs to be completed before this algorithm terminates, as the result is part of the return value.
- Initialize mandatoryHash to the result of calling the algorithm in Section 3.3.17 hashMandatoryNQuads, passing transformedDocument.mandatory.
- Initialize hashData as a deep copy of transformedDocument and add proofHash as "proofHash" and mandatoryHash as "mandatoryHash" to that object.
- Return hashData as hash data.
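Both hashes reduce to plain SHA-256 invocations for the P-256 suite. A non-normative sketch, assuming canonical_proof_config is the canonical N-Quads string and mandatory is the list of mandatory N-Quad strings:

import hashlib

def hash_canonical_proof_config(canonical_proof_config):
    # proofHash: SHA-256 over the UTF-8 bytes of the canonical proof options.
    return hashlib.sha256(canonical_proof_config.encode("utf-8")).digest()

def hash_mandatory_nquads(mandatory):
    # mandatoryHash: SHA-256 over the joined mandatory N-Quads; a P-384
    # suite would use SHA-384 for both hashes instead.
    return hashlib.sha256("".join(mandatory).encode("utf-8")).digest()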
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the base proof hashing algorithm.
The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
- Let proofConfig be an empty object.
- Set proofConfig.type to options.type.
- If options.cryptosuite is set, set proofConfig.cryptosuite to its value.
- If options.type is not set to DataIntegrityProof or proofConfig.cryptosuite is not set to ecdsa-sd-2023, an INVALID_PROOF_CONFIGURATION error MUST be raised.
- Set proofConfig.created to options.created. If the value is not a valid [XMLSCHEMA11-2] datetime, an INVALID_PROOF_DATETIME error MUST be raised.
- Set proofConfig.verificationMethod to options.verificationMethod.
- Set proofConfig.proofPurpose to options.proofPurpose.
- Set proofConfig.@context to unsecuredDocument.@context.
- Let canonicalProofConfig be the result of applying the Universal RDF Dataset Canonicalization Algorithm [RDF-CANON] to the proofConfig.
- Return canonicalProofConfig.
The following algorithm specifies how to create a base proof; it is called by an issuer of an ECDSA-SD-protected Verifiable Credential. The base proof is to be given only to the holder, who is responsible for generating a derived proof from it, exposing only selectively disclosed details in the proof to a verifier. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value represented as a series of bytes is produced as output.
- Initialize proofHash, mandatoryPointers, mandatoryHash, nonMandatory, and hmacKey to the values associated with their property names in hashData.
- Initialize proofScopedKeyPair to a locally generated P-256 ECDSA key pair. Note: This key pair is scoped to the specific proof; it is not used for anything else and the private key will be destroyed when this algorithm terminates.
- Initialize signatures to an array where each element holds the result of digitally signing the UTF-8 representation of each N-Quad string in nonMandatory, in order. The digital signature algorithm is ES256, i.e., uses a P-256 curve over a SHA-256 digest, and uses the private key from proofScopedKeyPair. Note: This step generates individual signatures for each statement that can be selectively disclosed using a local, proof-scoped key pair that binds them together; this key pair will be bound to the proof by a signature over its public key using the private key associated with the base proof verification method.
- Initialize publicKey to the multikey expression of the public key exported from proofScopedKeyPair. That is, an array of bytes starting with the bytes 0x80 and 0x24 (which is the multikey p256-pub header (0x1200) expressed as a varint) followed by the compressed public key bytes (the compressed header with 2 for an even y coordinate and 3 for an odd one, followed by the x coordinate of the public key). A non-normative sketch of this encoding is given after these steps.
- Initialize toSign to the result of calling the algorithm in Section 3.4.1 serializeSignData, passing proofHash, publicKey, and mandatoryHash as parameters to the algorithm.
- Initialize baseSignature to the result of digitally signing toSign using the private key associated with the base proof verification method.
- Initialize proofValue to the result of calling the algorithm in Section 3.4.2 serializeBaseProofValue, passing baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers as parameters to the algorithm.
- Return proofValue as digital proof.
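A non-normative sketch of the publicKey encoding and the toSign construction above, assuming the third-party cryptography package and that pk is the proof-scoped P-256 public key:

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec

P256_PUB_MULTIKEY_HEADER = bytes([0x80, 0x24])  # varint of multicodec 0x1200

def multikey_p256_public_key(pk):
    # Multikey bytes: header followed by the SEC1 compressed point, i.e.,
    # a 0x02/0x03 parity byte plus the 32-byte x coordinate.
    compressed = pk.public_bytes(serialization.Encoding.X962,
                                 serialization.PublicFormat.CompressedPoint)
    return P256_PUB_MULTIKEY_HEADER + compressed

def serialize_sign_data(proof_hash, public_key, mandatory_hash):
    # toSign per Section 3.4.1: proofHash || publicKey || mandatoryHash.
    return proof_hash + public_key + mandatory_hash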
The following algorithm creates a selective disclosure derived proof; it is called by a holder of an ecdsa-sd-2023-protected verifiable credential. The derived proof is to be given to the verifier. The inputs include a JSON-LD document (document), an ECDSA-SD base proof (proof), an array of JSON pointers to use to selectively disclose statements (selectivePointers), and any custom JSON-LD API options, such as a document loader. A single selectively revealed document value, represented as an object, is produced as output.
- Initialize baseSignature, publicKey, signatures, labelMap, mandatoryIndexes, and revealDocument to the values associated with their property names in the object returned when calling the algorithm in Section 3.4.4 createDisclosureData, passing the document, proof, selectivePointers, and any custom JSON-LD API options, such as a document loader.
- Initialize newProof to a shallow copy of proof.
- Replace proofValue in newProof with the result of calling the algorithm in Section 3.4.7 serializeDerivedProofValue, passing baseSignature, publicKey, signatures, labelMap, and mandatoryIndexes.
- Set the value of the "proof" property in revealDocument to newProof.
- Return revealDocument as the selectively revealed document.
The following algorithm attempts verification of an ecdsa-sd-2023 derived proof. This algorithm is called by a verifier of an ECDSA-SD-protected verifiable credential. The inputs include a JSON-LD document (document), an ECDSA-SD disclosure proof (proof), and any custom JSON-LD API options, such as a document loader. A single boolean verification result value is produced as output.
- Initialize baseSignature, proofHash, publicKey, signatures, nonMandatory, and mandatoryHash to the values associated with their property names in the object returned when calling the algorithm in Section 3.4.9 createVerifyData, passing the document, proof, and any custom JSON-LD API options, such as a document loader.
- If the length of signatures does not match the length of nonMandatory, throw an error indicating that the signature count does not match the non-mandatory message count.
- Initialize publicKeyBytes to the public key bytes expressed in publicKey. Instructions on how to decode the public key value can be found in Section 2.1.1 Multikey.
- Initialize toVerify to the result of calling the algorithm in Section 3.4.1 serializeSignData, passing proofHash, publicKey, and mandatoryHash.
- Initialize verificationResult to the result of applying the verification algorithm of the Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5], with toVerify as the data to be verified against the baseSignature using the public key specified by publicKeyBytes. If verificationResult is false, return false.
- For every entry (index, signature) in signatures, verify every signature for every selectively disclosed (non-mandatory) statement:
  - Initialize verificationResult to the result of applying the verification algorithm of the Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5], with the UTF-8 representation of the value at index of nonMandatory as the data to be verified against signature using the public key specified by publicKeyBytes.
  - If verificationResult is false, return false.
- Return verificationResult as verification result.
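A non-normative sketch of the per-statement verification loop above, assuming the third-party cryptography package and the 64-byte raw (r || s) signature encoding used by this suite:

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.utils import encode_dss_signature

def verify_raw_p256(public_key, signature, data):
    # Convert r || s to the DER form the library expects, then verify.
    r = int.from_bytes(signature[:32], "big")
    s = int.from_bytes(signature[32:], "big")
    try:
        public_key.verify(encode_dss_signature(r, s), data,
                          ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

def verify_non_mandatory(public_key, signatures, non_mandatory):
    # Lengths were checked to match in an earlier step.
    return all(verify_raw_p256(public_key, sig, nq.encode("utf-8"))
               for sig, nq in zip(signatures, non_mandatory))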
This section is non-normative.
Before reading this section, readers are urged to familiarize themselves with general security advice provided in the Security Considerations section of the Data Integrity specification.
The integrity and authenticity of a secured document that is protected by this cryptographic suite is dependent on a number of factors including the following:
- correct application of a digital signature to the document
- choice of an appropriate signature algorithm (ECDSA) and its parameters (P-256, P-384)
- correct implementation and usage of the digital signature algorithm, particularly with respect to well-known problem areas
- proper management of the private and public keys used for signing and verification
In the following sections, we review these important points and direct the reader to additional information.
This section is non-normative.
The ECDSA signature scheme has the EUF-CMA (existential unforgeability under chosen message attacks) security property. This property guarantees that any efficient adversary who has the public key pk of the signer and received an arbitrary number of signatures on messages of its choice (in an adaptive manner) cannot output a valid signature for a new message (except with negligible probability).
SUF-CMA (strong unforgeability under chosen message attacks) is a stronger notion than EUF-CMA. It guarantees that for any efficient adversary who has the public key pk of the signer and received an arbitrary number of signatures on messages of its choice, it cannot output a new valid signature pair for a new message nor a new signature for an old message (except with negligible probability). The ECDSA signature scheme does not have the SUF-CMA property, while other schemes, such as EdDSA [FIPS-186-5], do.
Per [NIST-SP-800-57-Part-1], in the absence of large-scale quantum computers, a security strength level of 128 bits requires a key size of approximately 256 bits, while a security strength level of 192 bits requires a key size of 384 bits. The [NIST-SP-800-186] recommendations include curves P-256 and P-384 at these respective security strength levels.
This section is non-normative.
The ECDSA algorithm as detailed in [FIPS-186-5] states: "A new secret random number k, 0 < k < n, shall be generated prior to the generation of each digital signature for use during the signature generation process." The failure to properly generate this k value has led to some highly publicized integrity breaches in widely deployed systems. To counter this problem, a hash-based method of determining the secret number k, called deterministic ECDSA, is given in [FIPS-186-5] and [RFC6979].
Verification of an ECDSA signature is independent of the method of generating k. Hence it is generally recommended to use deterministic ECDSA unless other requirements dictate otherwise. For example, using different k values results in different signature values for the same document, which might be a desirable property in some privacy-enhancing situations.
This section is non-normative.
The security of the ECDSA algorithm is dependent on the quality and protection of its private signing key. Guidance in the management of cryptographic keys is a large subject and the reader is referred to [NIST-SP-800-57-Part-1] for more extensive recommendations and discussion. As strongly recommended in both [FIPS-186-5] and [NIST-SP-800-57-Part-1], an ECDSA private signing key is not to be used for any other purpose than ECDSA signatures.
ECDSA private signing keys and public verification keys are strongly advised to have limited cryptoperiods [NIST-SP-800-57-Part-1], where a cryptoperiod is "the time span during which a specific key is authorized for use by legitimate entities or the keys for a given system will remain in effect." [NIST-SP-800-57-Part-1] gives extensive guidance on cryptoperiods for different key types under different situations and generally recommends a 1-3 year cryptoperiod for a private signing key.
To deal with potential private key compromises, [NIST-SP-800-57-Part-1] gives recommendations for protective measures, harm reduction, and revocation. Although we have been emphasizing the security of the private signing key, assurance of public key validity is highly recommended on all public keys before using them, per [NIST-SP-800-57-Part-1].
Ensuring that cryptographic suites are versioned and tightly scoped to a very small set of possible key types and signature schemes (ideally one key type and size and one signature output type) is a design goal for most Data Integrity cryptographic suites. Historically, this has been done by defining both the key type and the cryptographic suite that uses the key type in the same specification. The downside of doing so, however, is that there might be a proliferation of different key types in multikey that result in different cryptosuites defining the same key material differently. For example, one cryptosuite might use compressed Curve P-256 keys while another uses uncompressed values. If that occurs, it will harm interoperability. It will be important in the coming months to years to ensure that this does not happen by fully defining the multikey format in a separate specification so cryptosuite specifications, such as this one, can refer to the multikey specification, thus reducing the chances of multikey type proliferation and improving the chances of maximum interoperability for the multikey format.
Before reading this section, readers are urged to familiarize themselves with general privacy advice provided in the Privacy Considerations section of the Data Integrity specification.
The following section describes privacy considerations that developers implementing this specification should be aware of in order to avoid violating privacy assumptions.
This section is non-normative.
All test vectors are produced using deterministic ECDSA. The implementation was validated against the test vectors in [RFC6979].
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representation of the public key, and the representation of the private key, are shown below.
{ "publicKeyMultibase": "zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "privateKeyMultibase": "z42twTcNeSYcnqg1FLuSFs2bsGH3ZqbRHFmvS9XMsYhjxvHN" }
Signing begins with a credential without an attached proof, which is converted to canonical form and then hashed, as shown in the following three examples.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": ["VerifiableCredential", "AlumniCredential"], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" } }
<did:example:abcdefgh> <https://www.w3.org/ns/credentials/examples#alumniOf> "The School of Examples" . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/ns/credentials/examples#AlumniCredential> . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/description> "A minimum viable example of an Alumni Credential." . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/name> "Alumni Credential" . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#credentialSubject> <did:example:abcdefgh> . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#issuer> <https://vc.example/issuers/5678> . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#validFrom> "2023-01-01T00:00:00Z"^^<https://www.w3.org/2001/XMLSchema#dateTime> .
517744132ae165a5349155bef0bb0cf2258fff99dfe1dbd914b938d775a36017
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{ "type": "DataIntegrityProof", "cryptosuite": "ecdsa-rdfc-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ] }
_:c14n0 <https://purl.org/dc/terms/created> "2023-02-24T23:36:38Z"^^<https://www.w3.org/2001/XMLSchema#dateTime> . _:c14n0 <https://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#DataIntegrityProof> . _:c14n0 <https://w3id.org/security#cryptosuite> "ecdsa-rdfc-2019"^^<https://w3id.org/security#cryptosuiteString> . _:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> . _:c14n0 <https://w3id.org/security#verificationMethod> <https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP> .
1e00437865de4485028892c7da6f5e95de2fefe6ad72d684d2bec55e870ba9a0
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the ECDSA signature, and then base-58-btc encode the signature.
1e00437865de4485028892c7da6f5e95de2fefe6ad72d684d2bec55e870ba9a0517744132ae165a5349155bef0bb0cf2258fff99dfe1dbd914b938d775a36017
a5d86febb7125f5e964de2be5a49048c5fdbca1516b3c41cca836199d645a1b4105a9af3525893ba09cff76e5c43b2b4dcb61e2018fa3c47d646510b15824a6d
z4KKHqaD4F7GHyLA6f3wK9Ehxtogv5jQRFpQBM4sXkSf7Bozd7bAf7dF6UkfM2aSCBMm24mPvaFXmzQmimzaEC3SL
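A non-normative Python sketch of this final step follows; it assumes the third-party cryptography and base58 packages. Note that the test vectors use deterministic ECDSA [RFC6979], while the cryptography package produces randomized ECDSA signatures, so the sketch's output will differ from the proofValue above on each run (though either verifies).

import base58  # third-party base58btc codec; an assumption of this sketch
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.utils import decode_dss_signature

def sign_proof_value(private_key, proof_hash, doc_hash):
    # Proof options hash first, then document hash; ES256 hashes this
    # 64-byte message again with SHA-256 before signing.
    data = proof_hash + doc_hash
    der_sig = private_key.sign(data, ec.ECDSA(hashes.SHA256()))
    r, s = decode_dss_signature(der_sig)
    raw = r.to_bytes(32, "big") + s.to_bytes(32, "big")  # 64-byte signature
    return "z" + base58.b58encode(raw).decode("ascii")   # multibase base58btc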
Assemble the signed credential with the following two steps:
- Add the proofValue field with the previously computed base-58-btc value to the proof options document.
- Set the proof field of the credential to the augmented proof option document.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": [ "VerifiableCredential", "AlumniCredential" ], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-rdfc-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "proofValue": "z4KKHqaD4F7GHyLA6f3wK9Ehxtogv5jQRFpQBM4sXkSf7Bozd7bAf7dF6UkfM2aSCBMm24mPvaFXmzQmimzaEC3SL" } }
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representation of the public key, and the representation of the private key, are shown below.
{ "publicKeyMultibase": "z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "privateKeyMultibase": "z2fanyY7zgwNpZGxX5fXXibvScNaUWNprHU9dKx7qpVj7mws9J8LLt4mDB5TyH2GLHWkUc" }
Signing begins with a credential without an attached proof, which is converted to canonical form, and then hashed, as shown in the following three examples.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": ["VerifiableCredential", "AlumniCredential"], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" } }
<did:example:abcdefgh> <https://www.w3.org/ns/credentials/examples#alumniOf> "The School of Examples" . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/ns/credentials/examples#AlumniCredential> . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/description> "A minimum viable example of an Alumni Credential." . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/name> "Alumni Credential" . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#credentialSubject> <did:example:abcdefgh> . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#issuer> <https://vc.example/issuers/5678> . <urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#validFrom> "2023-01-01T00:00:00Z"^^<https://www.w3.org/2001/XMLSchema#dateTime> .
8bf6e01df72c5b62f91b685231915ac4b8c58ea95f002c6b8f6bfafa1b251df476b56b8e01518e317dab099d3ecbff96
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{ "type": "DataIntegrityProof", "cryptosuite": "ecdsa-rdfc-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "proofPurpose": "assertionMethod", "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ] }
_:c14n0 <https://purl.org/dc/terms/created> "2023-02-24T23:36:38Z"^^<https://www.w3.org/2001/XMLSchema#dateTime> . _:c14n0 <https://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#DataIntegrityProof> . _:c14n0 <https://w3id.org/security#cryptosuite> "ecdsa-rdfc-2019"^^<https://w3id.org/security#cryptosuiteString> . _:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> . _:c14n0 <https://w3id.org/security#verificationMethod> <https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ> .
496154e093b85e177b218821cb7f0307fe7062d0aec5b7d31bad9c44c4e6c32c4943f18cb81b9b515412636b715e43e0
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the ECDSA signature, and then base-58-btc encode the signature.
496154e093b85e177b218821cb7f0307fe7062d0aec5b7d31bad9c44c4e6c32c4943f18cb81b9b515412636b715e43e08bf6e01df72c5b62f91b685231915ac4b8c58ea95f002c6b8f6bfafa1b251df476b56b8e01518e317dab099d3ecbff96
8b0de12071420d74ea7f88e95f3b29048372dfbba62003d2384e1b960a3197b99316d4af8488358f8409570397867be8cc4190892b8e12b157d5edf335aa051a9c0bf5df5e4ffe1cfd7b9577e6ef8b5b034234963b8ae7688888835cb15b08a2
zpuEu1cJ7Wpb453b4RiV3ex7SKGYm3fdAd4WUTVpR8Me3ZXkCCVUfd4M4TvHF9Wv1tRNWe5SkZhQTGYLUxdugFRGC2uyYRNTnimS6UMN6wkenTViRK1Mei7DooSBpumHHjYu
Assemble the signed credential with the following two steps:
- Add the proofValue field with the previously computed base-58-btc value to the proof options document.
- Set the proof field of the credential to the augmented proof option document.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": [ "VerifiableCredential", "AlumniCredential" ], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-rdfc-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "proofPurpose": "assertionMethod", "proofValue": "zpuEu1cJ7Wpb453b4RiV3ex7SKGYm3fdAd4WUTVpR8Me3ZXkCCVUfd4M4TvHF9Wv1tRNWe5SkZhQTGYLUxdugFRGC2uyYRNTnimS6UMN6wkenTViRK1Mei7DooSBpumHHjYu" } }
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representation of the public key, and the representation of the private key, are shown below.
{ "publicKeyMultibase": "zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "privateKeyMultibase": "z42twTcNeSYcnqg1FLuSFs2bsGH3ZqbRHFmvS9XMsYhjxvHN" }
Signing begins with a credential without an attached proof, which is converted to canonical form and then hashed, as shown in the following three examples.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": ["VerifiableCredential", "AlumniCredential"], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" } }
{"@context":["https://www.w3.org/ns/credentials/v2","https://www.w3.org/ns/credentials/examples/v2"],"credentialSubject":{"alumniOf":"The School of Examples","id":"did:example:abcdefgh"},"description":"A minimum viable example of an Alumni Credential.","id":"urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33","issuer":"https://vc.example/issuers/5678","name":"Alumni Credential","type":["VerifiableCredential","AlumniCredential"],"validFrom":"2023-01-01T00:00:00Z"}
59b7cb6251b8991add1ce0bc83107e3db9dbbab5bd2c28f687db1a03abc92f19
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{ "type": "DataIntegrityProof", "cryptosuite": "ecdsa-jcs-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod" }
{"created":"2023-02-24T23:36:38Z","cryptosuite":"ecdsa-jcs-2019","proofPurpose":"assertionMethod","type":"DataIntegrityProof","verificationMethod":"https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP"}
76a77cf0331cef09562cb471efb7513ead132a07b83b3d9aea2a5149ba8ac342
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the ECDSA signature, and then base-58-btc encode the signature.
76a77cf0331cef09562cb471efb7513ead132a07b83b3d9aea2a5149ba8ac34259b7cb6251b8991add1ce0bc83107e3db9dbbab5bd2c28f687db1a03abc92f19
05e27fa5aa9cc1bb37c18794e35c35d588d30b839e4f7f59c81bfbd81047c9c66d1d31063135b38860e66265586582f6521550aa4f22fa9558666532439a38cd
z7pnwfec5k9N26YDUjjDxjJEijAdEoAbJY2n3CTx3CYvzzRxcV5UkmECmLmQcA8eYTsDQ6GHCFDSk7Yb1hd4uN5a
Assemble the signed credential with the following two steps:
- Add the proofValue field with the previously computed base-58-btc value to the proof options document.
- Set the proof field of the credential to the augmented proof option document.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": [ "VerifiableCredential", "AlumniCredential" ], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-jcs-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "proofValue": "z7pnwfec5k9N26YDUjjDxjJEijAdEoAbJY2n3CTx3CYvzzRxcV5UkmECmLmQcA8eYTsDQ6GHCFDSk7Yb1hd4uN5a" } }
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representation of the public key, and the representation of the private key, are shown below.
{ "publicKeyMultibase": "z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "privateKeyMultibase": "z2fanyY7zgwNpZGxX5fXXibvScNaUWNprHU9dKx7qpVj7mws9J8LLt4mDB5TyH2GLHWkUc" }
Signing begins with a credential without an attached proof, which is converted to canonical form and then hashed, as shown in the following three examples.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": ["VerifiableCredential", "AlumniCredential"], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" } }
{"@context":["https://www.w3.org/ns/credentials/v2","https://www.w3.org/ns/credentials/examples/v2"],"credentialSubject":{"alumniOf":"The School of Examples","id":"did:example:abcdefgh"},"description":"A minimum viable example of an Alumni Credential.","id":"urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33","issuer":"https://vc.example/issuers/5678","name":"Alumni Credential","type":["VerifiableCredential","AlumniCredential"],"validFrom":"2023-01-01T00:00:00Z"}
3e0be671cc1881035d463158c80921973dab3534d4f8dfacf4ff2725a4115eb718e49d66de0e90e7365cd6062abf2259
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{ "type": "DataIntegrityProof", "cryptosuite": "ecdsa-jcs-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "proofPurpose": "assertionMethod" }
{"created":"2023-02-24T23:36:38Z","cryptosuite":"ecdsa-jcs-2019","proofPurpose":"assertionMethod","type":"DataIntegrityProof","verificationMethod":"https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ"}
8ba1ac588bdcb2675b84d55abeda3352504bcd190d8028ecece9de84288b8d69499c10c65ed76c821a1b4c51588b371d
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the ECDSA signature, and then base-58-btc encode the signature.
8ba1ac588bdcb2675b84d55abeda3352504bcd190d8028ecece9de84288b8d69499c10c65ed76c821a1b4c51588b371d3e0be671cc1881035d463158c80921973dab3534d4f8dfacf4ff2725a4115eb718e49d66de0e90e7365cd6062abf2259
2a3a6157c6d26ed20574155b6884916d0528b2057d89b7b855e40ee471708058d0bd7e39559bc0586e6b460f74f69d5d85d5dcee2cdbaea3febf8241ef425e974a9486025fca0c1dc1c7b1fedd25a17032d98bae98bd23e6962865c2fb1d7b19
zFYhRwKuucKxM7dnL69VpnwmU9UD2wc5HfFjXfxKH82pEybv18EfxaT8m53kyMfrDQneYnsLCZ35UE2KwZTkd4zN7vNHdVseyjW5apJJ9NkfpUiTGUayG2yaZvWu6Gd8EDYk
Assemble the signed credential with the following two steps:
- Add the proofValue field with the previously computed base-58-btc value to the proof options document.
- Set the proof field of the credential to the augmented proof option document.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": [ "VerifiableCredential", "AlumniCredential" ], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-jcs-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "proofPurpose": "assertionMethod", "proofValue": "zFYhRwKuucKxM7dnL69VpnwmU9UD2wc5HfFjXfxKH82pEybv18EfxaT8m53kyMfrDQneYnsLCZ35UE2KwZTkd4zN7vNHdVseyjW5apJJ9NkfpUiTGUayG2yaZvWu6Gd8EDYk" } }
Demonstrating selective disclosure features, including mandatory disclosure, selective disclosure, and overlap between mandatory and selective disclosure, requires an input credential document with more content than the previous test vectors. To avoid excessively long test vectors, the starting document test vector is based on a purely fictitious windsurfing (sailing) competition scenario. In addition, we break the test vectors into two groups: those that would be generated by the issuer (base proof) and those that would be generated by the holder (derived proof).
In order to add a selective disclosure base proof to a document the issuer needs the following cryptographic key material:
- The issuers private/public key pair, i.e., the key pair corresponding to the verification method that will be part of the proof.
- A per proof private/public key pair created by the issuer just for this proof. This is an ephemeral, single use key pair where the private key is not kept after the proof has been generated.
- An HMAC key. This is used to randomize the order of the blank node ids, to avoid potential information leakage from the blank node id ordering. This key is used only once and is shared between the issuer and holder. The HMAC in this case is functioning as a pseudorandom function (PRF).
The key material used for generating the add base proof test vectors is shown below. Multibase representation is used for the P-256 key pairs, and the HMAC key is given as a hexadecimal string.
{ "baseKeyPair": { "publicKeyMultibase": "zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "secretKeyMultibase": "z42twTcNeSYcnqg1FLuSFs2bsGH3ZqbRHFmvS9XMsYhjxvHN" }, "proofKeyPair": { "publicKeyMultibase": "zDnaeTHfhmSaQKBc7CmdL3K7oYg3D6SC7yowe2eBeVd2DH32r", "secretKeyMultibase": "z42tqvNGyzyXRzotAYn43UhcFtzDUVdxJ7461fwrfhBPLmfY" }, "hmacKeyString": "00112233445566778899AABBCCDDEEFF00112233445566778899AABBCCDDEEFF" }
In our scenario, a sailor is registering with a race organizer for a series of windsurfing races to be held over a number of days on Maui. The organizer will inspect the sailor's equipment to certify that what has been declared is accurate. The sailor's unsigned equipment inventory is shown below.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 5.5, "sailName": "Kihei", "year": 2023 }, { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7.0, "sailName": "Lahaina", "year": 2020 }, { "size": 7.8, "sailName": "Lahaina", "year": 2023 } ], "boards": [ { "boardName": "CompFoil170", "brand": "Wailea", "year": 2022 }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] } }
In addition, to let other sailors know what kinds of equipment their competitors may be sailing on, it is mandatory that each sailor disclose the year of their most recent windsurfing board and full details on two of their sails. Note that all sailors are identified by a sail number that is printed on all their equipment. This mandatory information is specified via an array of JSON pointers, as shown below.
["/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2"]
The result of applying the above JSON pointers to the sailor's equipment document is shown below.
[ { "pointer": "/credentialSubject/sailNumber", "value": "Earth101" }, { "pointer": "/credentialSubject/sails/1", "value": { "size": 6.1, "sailName": "Lahaina", "year": 2023 } }, { "pointer": "/credentialSubject/boards/0/year", "value": 2022 }, { "pointer": "/credentialSubject/sails/2", "value": { "size": 7, "sailName": "Lahaina", "year": 2020 } } ]
Transformation of the unsigned document begins with canonicalizing the document as shown below.
[ "_:c14n0 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n", "_:c14n0 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:c14n0 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n1 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:c14n1 <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<https://www.w3.org/2001/XMLSchema#double> .\n", "_:c14n1 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n2 <https://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n", "_:c14n2 <https://www.w3.org/2018/credentials#credentialSubject> _:c14n6 .\n", "_:c14n3 <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n", "_:c14n3 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:c14n3 <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n4 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:c14n4 <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n4 <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<https://www.w3.org/2001/XMLSchema#double> .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#boards> _:c14n0 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#boards> _:c14n3 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n1 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n4 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n5 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n7 .\n", "_:c14n7 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:c14n7 <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<https://www.w3.org/2001/XMLSchema#double> .\n", "_:c14n7 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n" ]
To prevent possible information leakage from the ordering of the blank node ids, these are processed through a PRF, i.e., the HMAC, to give the canonicalized HMAC document shown below. This represents an ordered list of statements that will be subject to mandatory and selective disclosure, i.e., it is from this list that statements are grouped.
[ "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<https://www.w3.org/2001/XMLSchema#double> .\n", "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<https://www.w3.org/2001/XMLSchema#double> .\n", "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n", "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n", "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<https://www.w3.org/2001/XMLSchema#double> .\n", "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw <https://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n", "_:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw <https://www.w3.org/2018/credentials#credentialSubject> _:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk .\n", "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n", "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#boards> _:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 .\n", "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#boards> _:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc .\n", "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n", "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sails> _:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY .\n", "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sails> _:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg .\n", "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sails> _:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 .\n", "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk 
<https://windsurf.grotto-networking.com/selective#sails> _:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ .\n", "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n", "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n" ]
The above canonical document gets grouped into mandatory and non-mandatory statements. The final output of the selective disclosure transformation process is shown below. Each statement is now grouped as mandatory and non-mandatory, and its index in the previous list of statements is remembered.
{ "mandatoryPointers": [ "/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2" ], "mandatory": { "dataType": "Map", "value": [ [ 0, "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n" ], [ 1, "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<https://www.w3.org/2001/XMLSchema#double> .\n" ], [ 2, "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 8, "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 12, "_:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw <https://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n" ], [ 13, "_:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw <https://www.w3.org/2018/credentials#credentialSubject> _:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk .\n" ], [ 14, "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n" ], [ 15, "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 16, "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 17, "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#boards> _:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 .\n" ], [ 19, "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n" ], [ 20, "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sails> _:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY .\n" ], [ 23, "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sails> _:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ .\n" ] ] }, "nonMandatory": { "dataType": "Map", "value": [ [ 3, "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n" ], [ 4, "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<https://www.w3.org/2001/XMLSchema#double> .\n" ], [ 5, "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 6, "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n" ], [ 7, "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n" ], [ 9, "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n" ], [ 10, "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<https://www.w3.org/2001/XMLSchema#double> .\n" ], [ 11, "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#year> 
\"2023\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 18, "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#boards> _:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc .\n" ], [ 21, "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sails> _:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg .\n" ], [ 22, "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sails> _:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 .\n" ], [ 24, "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n" ], [ 25, "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n" ], [ 26, "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<https://www.w3.org/2001/XMLSchema#integer> .\n" ] ] }, "hmacKeyString": "00112233445566778899AABBCCDDEEFF00112233445566778899AABBCCDDEEFF" }
The next step is to create the base proof configuration and canonicalize it. This is shown in the following two examples.
{ "type": "DataIntegrityProof", "cryptosuite": "ecdsa-sd-2023", "created": "2023-08-15T23:36:38Z", "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ] }
_:c14n0 <https://purl.org/dc/terms/created> "2023-08-15T23:36:38Z"^^<https://www.w3.org/2001/XMLSchema#dateTime> .
_:c14n0 <https://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#DataIntegrityProof> .
_:c14n0 <https://w3id.org/security#cryptosuite> "ecdsa-sd-2023"^^<https://w3id.org/security#cryptosuiteString> .
_:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> .
_:c14n0 <https://w3id.org/security#verificationMethod> <did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP> .
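As a sketch, this canonicalization might be performed with a JSON-LD library such as the third-party pyld package (an assumption; any RDFC-1.0 [RDF-CANON] implementation works, and a document loader able to resolve the remote context is required).

    from pyld import jsonld  # third-party package, used here as an example

    proof_config = {
        "type": "DataIntegrityProof",
        "cryptosuite": "ecdsa-sd-2023",
        "created": "2023-08-15T23:36:38Z",
        "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP",
        "proofPurpose": "assertionMethod",
        "@context": [
            "https://www.w3.org/ns/credentials/v2",
            {"@vocab": "https://windsurf.grotto-networking.com/selective#"},
        ],
    }

    # URDNA2015 is the canonicalization algorithm on which RDFC-1.0 is based.
    canonical_proof_config = jsonld.normalize(
        proof_config,
        {"algorithm": "URDNA2015", "format": "application/n-quads"},
    )
    print(canonical_proof_config)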
In the hashing step, we compute the SHA-256 hash of the canonicalized proof options to produce the proofHash, and we compute the SHA-256 hash of the join of all the mandatory nquads to produce the mandatoryHash. These are shown below in hexadecimal format, followed by a sketch of the computation.
{ "proofHash": "9c5c9b189f06cfa9d9f21a838ccb9b04316f07ad1a517bfd4955ee28c6a8229c", "mandatoryHash": "68046f063278f8c5aa1b3880d96bb9c41fc259e7538ca7a4ffa47143739465a4" }
We compute the baseSignature over the concatenation of the proofHash, proofPublicKey, and mandatoryHash using the issuer's long-term privateKey. We compute the signatures array by signing each non-mandatory nquad using the per-proof private key, proofPrivateKey. These signatures, the proofPublicKey, and the mandatoryPointers, which are fed to the final serialization step, are shown below, followed by a sketch of the signing operations.
{ "baseSignature": "aa97b51f9d595388f666a44d3fe3a303473517b3d9ffba2740fb6f7d26cadedefd4c31519978517c11a64d22bd45a3dff9e1432a758a0a6ce7604d6544c9d548", "publicKey": "zDnaeTHfhmSaQKBc7CmdL3K7oYg3D6SC7yowe2eBeVd2DH32r", "signatures": [ "29560a336e740c370d58e610a549980b567935984947089ef2f5547aff78406b2df375a1a16eff50ce8950bb0a8cd5478f1665650ca8bcde31c3533f9a19fa4c", "b2a312833d0479a3ded079a8e5237524d6660a3899f8234907849c9b22b8eacee10e8b4d66e187c4a19a0050b8dced05c5c29e502f7aabfcfd3c6785e42dd5a5", "99ea0453c235673c45c91f90331c284a4a911bd13cfc2692ae11fc4c3dadfad577d872800b884990a97ac47bb3e972f61be574726e9dfeb5a8ce386699568c8d", "6d030c724a5c300128d79582e82f0ca376c2116146b4e4e4331343f8b24c75f9024a8bc1ed19c247b497ce4e7ed1856282d27917383bd400286f9ca0d5634d93", "a501e56111a551592ff965f53a3258275f4ec3b8a9bd5c14be6f744a7dc88cd44bbb2f69af90c79b7c1595530f5692ea8d2f84f3f8c90de255e696346df2bf03", "88fbd1f623059d6ec7066a505b0e98ef02206431b10a6da2f9363b54861225fe758a1150ad9d9704e533ef6c47a9adb81e7e7757e916236537c83dfc30865850", "f265d93a396dd61bf53c81be22b00571bdc0f5f1f34cb464676f936ee67c763a0d0c85262de8953f723b96a017f7edc53ab56afce72df15067b4b2915ba4b34a", "b51cd241db4dcabc6bc9316aad8b84d1faaccb564917d3761eb6a85b59b8a5e198d47f997adda7cfa1ac61deb5dad6514c335121e42ed1b3f32cff87b4794e21", "a8b4dd55ed4929196a3f718423aa27772106b113bb0b93600398a8c470d7b9f58f22f9a98d87038c7b4d4fcd3bc63e293ecb785f2f4b7b006594ea09898fd03c", "e7dd74deab1b23d508cdc537604a5a070a678809f28de9b0ffb11a3cdddfdfbb221fbf51c2897ab5fef6e9050f5226ab3729a085265e5ef21b8e9c6a94acfe50", "9ba4c197083703da56c811f33829a96537d2ea85848ec3c4c1067de6387fd96e2ee9e9058cb256b502a0450933798ce3a54a4dcf72038546c1b9c735514b2b68", "8121f67d303babc42991177e53ebf36f5bdf11998a0e9283d39222986fe7f7b32da8e3089f490256b0af8eebc5747fc3db36d1eb4a2bf8eae5cde0a9b9c63801", "b4e3f3753d4ff6466fee9b4f40d14a3ff1f693407982ec4ced6d56fe97d3dccd19257e1a96946be4cfc6c01b30d3d983be64360c67c34c917b3662e9131f7efb", "81c0573e5a47bf25f4d475d72cf9f1645ae72afd676ada0f51d2af30c32c978103efbd522e1841b7adc7422bbd0b9ffeb9c4c69482adf71ac8850a6190b3428f" ], "mandatoryPointers": [ "/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2" ] }
Finally, the values above are run through the algorithm of Section 3.4.2 serializeBaseProofValue to produce the proofValue, which is used in the signed base document shown below, followed by a sketch of the serialization.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 5.5, "sailName": "Kihei", "year": 2023 }, { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7, "sailName": "Lahaina", "year": 2020 }, { "size": 7.8, "sailName": "Lahaina", "year": 2023 } ], "boards": [ { "boardName": "CompFoil170", "brand": "Wailea", "year": 2022 }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-sd-2023", "created": "2023-08-15T23:36:38Z", "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "proofValue": "u2V0AhdhAWECql7UfnVlTiPZmpE0_46MDRzUXs9n_uidA-299Jsre3v1MMVGZeFF8EaZNIr1Fo9_54UMqdYoKbOdgTWVEydVI2EBYI4AkAipyzhm2PxbjPgEqUpJDsbCEdhPJ-zJdqtVEOrRMM4uT2EBYIAARIjNEVWZ3iJmqu8zd7v8AESIzRFVmd4iZqrvM3e7_jthAWEApVgozbnQMNw1Y5hClSZgLVnk1mElHCJ7y9VR6_3hAay3zdaGhbv9QzolQuwqM1UePFmVlDKi83jHDUz-aGfpM2EBYQLKjEoM9BHmj3tB5qOUjdSTWZgo4mfgjSQeEnJsiuOrO4Q6LTWbhh8ShmgBQuNztBcXCnlAveqv8_TxnheQt1aXYQFhAmeoEU8I1ZzxFyR-QMxwoSkqRG9E8_CaSrhH8TD2t-tV32HKAC4hJkKl6xHuz6XL2G-V0cm6d_rWozjhmmVaMjdhAWEBtAwxySlwwASjXlYLoLwyjdsIRYUa05OQzE0P4skx1-QJKi8HtGcJHtJfOTn7RhWKC0nkXODvUAChvnKDVY02T2EBYQKUB5WERpVFZL_ll9ToyWCdfTsO4qb1cFL5vdEp9yIzUS7svaa-Qx5t8FZVTD1aS6o0vhPP4yQ3iVeaWNG3yvwPYQFhAiPvR9iMFnW7HBmpQWw6Y7wIgZDGxCm2i-TY7VIYSJf51ihFQrZ2XBOUz72xHqa24Hn53V-kWI2U3yD38MIZYUNhAWEDyZdk6OW3WG_U8gb4isAVxvcD18fNMtGRnb5Nu5nx2Og0MhSYt6JU_cjuWoBf37cU6tWr85y3xUGe0spFbpLNK2EBYQLUc0kHbTcq8a8kxaq2LhNH6rMtWSRfTdh62qFtZuKXhmNR_mXrdp8-hrGHetdrWUUwzUSHkLtGz8yz_h7R5TiHYQFhAqLTdVe1JKRlqP3GEI6ondyEGsRO7C5NgA5ioxHDXufWPIvmpjYcDjHtNT807xj4pPst4Xy9LewBllOoJiY_QPNhAWEDn3XTeqxsj1QjNxTdgSloHCmeICfKN6bD_sRo83d_fuyIfv1HCiXq1_vbpBQ9SJqs3KaCFJl5e8huOnGqUrP5Q2EBYQJukwZcINwPaVsgR8zgpqWU30uqFhI7DxMEGfeY4f9luLunpBYyyVrUCoEUJM3mM46VKTc9yA4VGwbnHNVFLK2jYQFhAgSH2fTA7q8QpkRd-U-vzb1vfEZmKDpKD05IimG_n97MtqOMIn0kCVrCvjuvFdH_D2zbR60or-OrlzeCpucY4AdhAWEC04_N1PU_2Rm_um09A0Uo_8faTQHmC7EztbVb-l9PczRklfhqWlGvkz8bAGzDT2YO-ZDYMZ8NMkXs2YukTH3772EBYQIHAVz5aR78l9NR11yz58WRa5yr9Z2raD1HSrzDDLJeBA--9Ui4YQbetx0IrvQuf_rnExpSCrfcayIUKYZCzQo-EeB0vY3JlZGVudGlhbFN1YmplY3Qvc2FpbE51bWJlcngaL2NyZWRlbnRpYWxTdWJqZWN0L3NhaWxzLzF4IC9jcmVkZW50aWFsU3ViamVjdC9ib2FyZHMvMC95ZWFyeBovY3JlZGVudGlhbFN1YmplY3Qvc2FpbHMvMg" } }
In order to create a derived proof, a holder starts with a signed document containing a base proof. The base document we will use for these test vectors is the final example from Section A.5.1 Base Proof above. The first step is to run the algorithm of Section 3.4.3 parseBaseProofValue to recover baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers, as shown below.
{ "baseSignature": "aa97b51f9d595388f666a44d3fe3a303473517b3d9ffba2740fb6f7d26cadedefd4c31519978517c11a64d22bd45a3dff9e1432a758a0a6ce7604d6544c9d548", "proofPublicKey": "zDnaeTHfhmSaQKBc7CmdL3K7oYg3D6SC7yowe2eBeVd2DH32r", "hmacKey": "00112233445566778899aabbccddeeff00112233445566778899aabbccddeeff", "signatures": [ "29560a336e740c370d58e610a549980b567935984947089ef2f5547aff78406b2df375a1a16eff50ce8950bb0a8cd5478f1665650ca8bcde31c3533f9a19fa4c", "b2a312833d0479a3ded079a8e5237524d6660a3899f8234907849c9b22b8eacee10e8b4d66e187c4a19a0050b8dced05c5c29e502f7aabfcfd3c6785e42dd5a5", "99ea0453c235673c45c91f90331c284a4a911bd13cfc2692ae11fc4c3dadfad577d872800b884990a97ac47bb3e972f61be574726e9dfeb5a8ce386699568c8d", "6d030c724a5c300128d79582e82f0ca376c2116146b4e4e4331343f8b24c75f9024a8bc1ed19c247b497ce4e7ed1856282d27917383bd400286f9ca0d5634d93", "a501e56111a551592ff965f53a3258275f4ec3b8a9bd5c14be6f744a7dc88cd44bbb2f69af90c79b7c1595530f5692ea8d2f84f3f8c90de255e696346df2bf03", "88fbd1f623059d6ec7066a505b0e98ef02206431b10a6da2f9363b54861225fe758a1150ad9d9704e533ef6c47a9adb81e7e7757e916236537c83dfc30865850", "f265d93a396dd61bf53c81be22b00571bdc0f5f1f34cb464676f936ee67c763a0d0c85262de8953f723b96a017f7edc53ab56afce72df15067b4b2915ba4b34a", "b51cd241db4dcabc6bc9316aad8b84d1faaccb564917d3761eb6a85b59b8a5e198d47f997adda7cfa1ac61deb5dad6514c335121e42ed1b3f32cff87b4794e21", "a8b4dd55ed4929196a3f718423aa27772106b113bb0b93600398a8c470d7b9f58f22f9a98d87038c7b4d4fcd3bc63e293ecb785f2f4b7b006594ea09898fd03c", "e7dd74deab1b23d508cdc537604a5a070a678809f28de9b0ffb11a3cdddfdfbb221fbf51c2897ab5fef6e9050f5226ab3729a085265e5ef21b8e9c6a94acfe50", "9ba4c197083703da56c811f33829a96537d2ea85848ec3c4c1067de6387fd96e2ee9e9058cb256b502a0450933798ce3a54a4dcf72038546c1b9c735514b2b68", "8121f67d303babc42991177e53ebf36f5bdf11998a0e9283d39222986fe7f7b32da8e3089f490256b0af8eebc5747fc3db36d1eb4a2bf8eae5cde0a9b9c63801", "b4e3f3753d4ff6466fee9b4f40d14a3ff1f693407982ec4ced6d56fe97d3dccd19257e1a96946be4cfc6c01b30d3d983be64360c67c34c917b3662e9131f7efb", "81c0573e5a47bf25f4d475d72cf9f1645ae72afd676ada0f51d2af30c32c978103efbd522e1841b7adc7422bbd0b9ffeb9c4c69482adf71ac8850a6190b3428f" ], "mandatoryPointers": [ "/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2" ] }
Next, the holder needs to indicate what else, if anything, they wish to reveal to the verifiers, by specifying JSON pointers for selective disclosure. In our windsurfing competition scenario, a sailor (the holder) has just completed their first day of racing and wishes to reveal to the general public (the verifiers) all the details of the windsurfing boards they used in the competition. The corresponding pointers are shown below. Note that this selection slightly overlaps the mandatory disclosed information, which included only the year of their most recent board.
["/credentialSubject/boards/0", "/credentialSubject/boards/1"]
To produce the revealDocument, i.e., the unsigned document that will eventually be signed and sent to the verifier, we append the selective pointers to the mandatory pointers, and input these combined pointers along with the document without proof to the algorithm of Section 3.3.13 selectJsonLd, to give the result shown below. A simplified illustration of pointer-based selection follows the result.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7, "sailName": "Lahaina", "year": 2020 } ], "boards": [ { "year": 2022, "boardName": "CompFoil170", "brand": "Wailea" }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] } }
Now that we know what the revealed document looks like, we need to furnish the verifier with appropriately updated information: which statements are mandatory, the signatures for the selected non-mandatory statements, and the mapping between the canonical blank node ids for the reveal document and a subset of the HMAC-based blank node ids. Running step 6 of Section 3.4.4 createDisclosureData yields a variety of index lists for groups of statements relative to the original document. A portion of the indexes for those groups is shown below.
{ "combinedIndexes": [0, 1, 2, 6, 7, 8, 12, 13, 14, 15, 16, 17, 18, 19, 20, 23, 24, 25, 26], "mandatoryIndexes": [0, 1, 2, 8, 12, 13, 14, 15, 16, 17, 19, 20, 23], "nonMandatoryIndexes": [3, 4, 5, 6, 7, 9, 10, 11, 18, 21, 22, 24, 25, 26], "selectiveIndexes": [6, 7, 8, 12, 13, 17, 18, 24, 25, 26] }
The verifier needs to be able to aggregate and hash the mandatory statements. To enable this, we furnish them with a list of indexes of the mandatory statements adjusted to their positions in the reveal document. In the previous example, the combinedIndexes show, in order, the indexes of all the original nquads (statements) that make up the reveal document. To come up with the adjusted mandatory indexes shown below, we obtain the position of each of the original mandatory indexes relative to the combinedIndexes. A sketch of this computation follows the example.
{"adjMandatoryIndexes":[0,1,2,5,6,7,8,9,10,11,13,14,15]}
We also have to furnish the verifier with a list of signatures for those selective statements (nquads) that are not mandatory. The original list of signatures corresponds to every non-mandatory statement, and the indexes of these in the original document are given above. We compute a list of adjusted signature indexes by taking the position of each selective index in the non-mandatory index list, ignoring any selective index not present in that list. We then use the adjusted signature indexes to obtain the filtered signature list. These lists are shown below, followed by a sketch of the computation.
{"adjSignatureIndexes":[3,4,8,11,12,13],"filteredSignatures":["6d030c724a5c300128d79582e82f0ca376c2116146b4e4e4331343f8b24c75f9024a8bc1ed19c247b497ce4e7ed1856282d27917383bd400286f9ca0d5634d93","a501e56111a551592ff965f53a3258275f4ec3b8a9bd5c14be6f744a7dc88cd44bbb2f69af90c79b7c1595530f5692ea8d2f84f3f8c90de255e696346df2bf03","a8b4dd55ed4929196a3f718423aa27772106b113bb0b93600398a8c470d7b9f58f22f9a98d87038c7b4d4fcd3bc63e293ecb785f2f4b7b006594ea09898fd03c","8121f67d303babc42991177e53ebf36f5bdf11998a0e9283d39222986fe7f7b32da8e3089f490256b0af8eebc5747fc3db36d1eb4a2bf8eae5cde0a9b9c63801","b4e3f3753d4ff6466fee9b4f40d14a3ff1f693407982ec4ced6d56fe97d3dccd19257e1a96946be4cfc6c01b30d3d983be64360c67c34c917b3662e9131f7efb","81c0573e5a47bf25f4d475d72cf9f1645ae72afd676ada0f51d2af30c32c978103efbd522e1841b7adc7422bbd0b9ffeb9c4c69482adf71ac8850a6190b3428f"]}
The last important piece of disclosure data is a mapping of canonical blank node ids to HMAC-based ids, the labelMap, computed according to Section 3.4.4 createDisclosureData, steps 12-14. This is shown below, along with the rest of the disclosure data minus the reveal document.
{ "baseSignature": "aa97b51f9d595388f666a44d3fe3a303473517b3d9ffba2740fb6f7d26cadedefd4c31519978517c11a64d22bd45a3dff9e1432a758a0a6ce7604d6544c9d548", "publicKey": "zDnaeTHfhmSaQKBc7CmdL3K7oYg3D6SC7yowe2eBeVd2DH32r", "signatures": [ "6d030c724a5c300128d79582e82f0ca376c2116146b4e4e4331343f8b24c75f9024a8bc1ed19c247b497ce4e7ed1856282d27917383bd400286f9ca0d5634d93", "a501e56111a551592ff965f53a3258275f4ec3b8a9bd5c14be6f744a7dc88cd44bbb2f69af90c79b7c1595530f5692ea8d2f84f3f8c90de255e696346df2bf03", "a8b4dd55ed4929196a3f718423aa27772106b113bb0b93600398a8c470d7b9f58f22f9a98d87038c7b4d4fcd3bc63e293ecb785f2f4b7b006594ea09898fd03c", "8121f67d303babc42991177e53ebf36f5bdf11998a0e9283d39222986fe7f7b32da8e3089f490256b0af8eebc5747fc3db36d1eb4a2bf8eae5cde0a9b9c63801", "b4e3f3753d4ff6466fee9b4f40d14a3ff1f693407982ec4ced6d56fe97d3dccd19257e1a96946be4cfc6c01b30d3d983be64360c67c34c917b3662e9131f7efb", "81c0573e5a47bf25f4d475d72cf9f1645ae72afd676ada0f51d2af30c32c978103efbd522e1841b7adc7422bbd0b9ffeb9c4c69482adf71ac8850a6190b3428f" ], "labelMap": { "dataType": "Map", "value": [ [ "c14n0", "u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38" ], [ "c14n1", "uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw" ], [ "c14n2", "ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc" ], [ "c14n3", "uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk" ], [ "c14n4", "ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ" ], [ "c14n5", "u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY" ] ] }, "mandatoryIndexes": [ 0, 1, 2, 5, 6, 7, 8, 9, 10, 11, 13, 14, 15 ] }
Finally, using the disclosure data above with the algorithm of Section 3.4.7 serializeDerivedProofValue, we obtain the signed derived (reveal) document shown below. A sketch of this serialization, including label map compression, follows the document.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7, "sailName": "Lahaina", "year": 2020 } ], "boards": [ { "year": 2022, "boardName": "CompFoil170", "brand": "Wailea" }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-sd-2023", "created": "2023-08-15T23:36:38Z", "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "proofValue": "u2V0BhdhAWECql7UfnVlTiPZmpE0_46MDRzUXs9n_uidA-299Jsre3v1MMVGZeFF8EaZNIr1Fo9_54UMqdYoKbOdgTWVEydVI2EBYI4AkAipyzhm2PxbjPgEqUpJDsbCEdhPJ-zJdqtVEOrRMM4uThthAWEBtAwxySlwwASjXlYLoLwyjdsIRYUa05OQzE0P4skx1-QJKi8HtGcJHtJfOTn7RhWKC0nkXODvUAChvnKDVY02T2EBYQKUB5WERpVFZL_ll9ToyWCdfTsO4qb1cFL5vdEp9yIzUS7svaa-Qx5t8FZVTD1aS6o0vhPP4yQ3iVeaWNG3yvwPYQFhAqLTdVe1JKRlqP3GEI6ondyEGsRO7C5NgA5ioxHDXufWPIvmpjYcDjHtNT807xj4pPst4Xy9LewBllOoJiY_QPNhAWECBIfZ9MDurxCmRF35T6_NvW98RmYoOkoPTkiKYb-f3sy2o4wifSQJWsK-O68V0f8PbNtHrSiv46uXN4Km5xjgB2EBYQLTj83U9T_ZGb-6bT0DRSj_x9pNAeYLsTO1tVv6X09zNGSV-GpaUa-TPxsAbMNPZg75kNgxnw0yRezZi6RMffvvYQFhAgcBXPlpHvyX01HXXLPnxZFrnKv1natoPUdKvMMMsl4ED771SLhhBt63HQiu9C5_-ucTGlIKt9xrIhQphkLNCj6YA2EBYIOGCDmZ9TBxEtWeCI9oVmRt0eHRGAaoOXx08gxL2IQt_AdhAWCBWRS4GuU5oQsZVBYkPgz-pbltwoQdYY1s6s8D-oA4orALYQFggkR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSScD2EBYIJNAHl4CeHuptV7FefsRbndC_9ZlI2fTH34YOWmRk8g5BNhAWCB9RYlEdD2PWOZQocsvyTqbQJnyDi_veA6lozmQvdkAJAXYQFgg2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOaNAAECBQYHCAkKCw0ODw" } }
This section is non-normative.
This section contains the substantive changes that have been made to this specification over time.
Changes since the First Public Working Draft:
- Added cryptography suite that uses JSON Canonicalization Scheme.
- Added test vectors for ECDSA (rdfc and jcs).
- Moved normative definition of Multikey to Data Integrity specification.
- Added selective disclosure functions.
- Added cryptography suite for ecdsa-sd-2023.
- Mitigated poison datasets in ECDSA when canonicalizing.
- Renamed cryptosuite names to align across all cryptosuites.
- Ensured strong guidance to use deterministic ECDSA, if available.
- Specified how to pass the cryptographic hash function to [RDF-CANON], if used.
- Added secretKeyMultibase representation.
- [FIPS-186-5]
- FIPS PUB 186-5: Digital Signature Standard (DSS). U.S. Department of Commerce/National Institute of Standards and Technology. 3 February 2023. National Standard. URL: https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.186-5.pdf
- [NIST-SP-800-186]
- Recommendations for Discrete Logarithm-based Cryptography: Elliptic Curve Domain Parameters. Lily Chen; Dustin Moody; Karen Randall; Andrew Regenscheid; Angela Robinson. National Institute of Standards and Technology. February 2023. URL: https://doi.org/10.6028/NIST.SP.800-186
- [RDF-CANON]
- RDF Dataset Canonicalization. Gregg Kellogg; Dave Longley; Dan Yamamoto. W3C. 31 October 2023. W3C Candidate Recommendation. URL: https://www.w3.org/TR/rdf-canon/
- [RFC2119]
- Key words for use in RFCs to Indicate Requirement Levels. S. Bradner. IETF. March 1997. Best Current Practice. URL: https://www.rfc-editor.org/rfc/rfc2119
- [RFC3986]
- Uniform Resource Identifier (URI): Generic Syntax. T. Berners-Lee; R. Fielding; L. Masinter. IETF. January 2005. Internet Standard. URL: https://www.rfc-editor.org/rfc/rfc3986
- [RFC4754]
- IKE and IKEv2 Authentication Using the Elliptic Curve Digital Signature Algorithm (ECDSA). D. Fu; J. Solinas. IETF. January 2007. Proposed Standard. URL: https://www.rfc-editor.org/rfc/rfc4754
- [RFC6234]
- US Secure Hash Algorithms (SHA and SHA-based HMAC and HKDF). D. Eastlake 3rd; T. Hansen. IETF. May 2011. Informational. URL: https://www.rfc-editor.org/rfc/rfc6234
- [RFC6901]
- JavaScript Object Notation (JSON) Pointer. P. Bryan, Ed.; K. Zyp; M. Nottingham, Ed.. IETF. April 2013. Proposed Standard. URL: https://www.rfc-editor.org/rfc/rfc6901
- [RFC6979]
- Deterministic Usage of the Digital Signature Algorithm (DSA) and Elliptic Curve Digital Signature Algorithm (ECDSA). T. Pornin. IETF. August 2013. Informational. URL: https://www.rfc-editor.org/rfc/rfc6979
- [RFC8174]
- Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words. B. Leiba. IETF. May 2017. Best Current Practice. URL: https://www.rfc-editor.org/rfc/rfc8174
- [RFC8785]
- JSON Canonicalization Scheme (JCS). A. Rundgren; B. Jordan; S. Erdtman. IETF. June 2020. Informational. URL: https://www.rfc-editor.org/rfc/rfc8785
- [VC-DATA-INTEGRITY]
- Verifiable Credential Data Integrity 1.0. Manu Sporny; Dave Longley; Greg Bernstein; Dmitri Zagidulin; Sebastian Crane. W3C. 21 October 2023. W3C Working Draft. URL: https://www.w3.org/TR/vc-data-integrity/
- [XMLSCHEMA11-2]
- W3C XML Schema Definition Language (XSD) 1.1 Part 2: Datatypes. David Peterson; Sandy Gao; Ashok Malhotra; Michael Sperberg-McQueen; Henry Thompson; Paul V. Biron et al. W3C. 5 April 2012. W3C Recommendation. URL: https://www.w3.org/TR/xmlschema11-2/
- [DID-CORE]
- Decentralized Identifiers (DIDs) v1.0. Manu Sporny; Amy Guy; Markus Sabadello; Drummond Reed. W3C. 19 July 2022. W3C Recommendation. URL: https://www.w3.org/TR/did-core/
- [NIST-SP-800-57-Part-1]
- Recommendation for Key Management: Part 1 – General. Elaine Barker. National Institute of Standards and Technology. May 2020. URL: https://doi.org/10.6028/NIST.SP.800-57pt1r5
- [SECG2]
- SEC 2: Recommended Elliptic Curve Domain Parameters. Certicom Research. January 27, 2010. URL: https://www.secg.org/sec2-v2.pdf
- [VC-DATA-MODEL-2.0]
- Verifiable Credentials Data Model v2.0. Manu Sporny; Orie Steele; Michael Jones; Gabe Cohen; Oliver Terbu. W3C. 4 November 2023. W3C Working Draft. URL: https://www.w3.org/TR/vc-data-model-2.0/