Solid QA
More details about this document
- This version: https://solidproject.org/ED/qa
- Latest version: https://solidproject.org/ED/qa
- History: Commit history
- Editors: Sarven Capadisli
- Created
- Published
- Modified
- Feedback: GitHub solid/specification (pull requests, new issue, open issues)
- Language: English
- Document Type: Specification
- Version: 0.3.0
- In Reply To: About Solid, Test Suite Panel Charter
- Policy
  - Rule: Offer
  - Unique Identifier: https://solidproject.org/ED/qa#document-policy-offer
  - Target: https://solidproject.org/ED/qa
  - Permission
Copyright © 2023–2024 the Contributors to QA, Version 0.3.0, published by the Solid Community Group under the W3C Community Contributor License Agreement (CLA). A human-readable summary is available.
Abstract
This document describes the Solid Quality Assurance (QA) policy, processes, and procedures. It details the requirements and recommendations for the publication and use of Solid technical reports, test suites, test cases, test assessments, and test reports to improve the quality of technical reports at critical development stages, promote wide deployment and proper implementation of technical reports through open implementation reports, help produce quality test suites, and advance the development and assessment of test cases.
Status of This Document
This report was published by the Solid Community Group. It is not a W3C Standard nor is it on the W3C Standards Track. Please note that under the W3C Community Contributor License Agreement (CLA) there is a limited opt-out and other conditions apply. Learn more about W3C Community and Business Groups.
Introduction
This section is non-normative.
Specifications in the Solid ecosystem, collectively the Solid Technical Reports [SOLID-TECHNICAL-REPORTS], describe how implementations can be interoperable by using Web communication protocols, global identifiers, authentication and authorization mechanisms, data formats and shapes, notifications, and query interfaces.
Writing tests in a way that allows implementations to conform to the requirements of a technical report gives Solid projects confidence that their software is compatible with other implementations. This in turn gives authors of technical reports and software implementers confidence that they can rely on the Solid ecosystem to deliver on the promise of interoperability based on open standards. Implementation and interoperability experience can be verified by reporting on implementations passing open test suites.
The goal of this document is to describe the Solid Quality Assurance (QA) policy, processes, and procedures. The document details the requirements and recommendations for the publication and consumption of Solid technical reports, test suites, test cases, test assessments, and test reports to:
- improve the quality of technical reports at critical stages of their development;
- promote wide deployment and proper implementation of technical reports with open implementation reports;
- help produce quality test suites;
- advance the development and assessment of test cases.
This document is influenced by the W3C Quality Assurance activity work encompassing W3C processes, specification authoring and publishing, and quality assurance, including: W3C Process Document, Variability in Specifications, QA Framework: Specification Guidelines, The QA Handbook, Test Metadata, Evaluation and Report Language (EARL) 1.0 Schema.
This specification is for authors and editors of Solid technical reports, developers of test suites and test cases, and implementers of Solid projects.
Terminology
This section is non-normative.
The Solid QA defines the following terms. These terms are referenced throughout this document.
- URI
- A Uniform Resource Identifier (URI) provides the means for identifying resources [RFC3986].
Namespaces
Prefix | Namespace | Description |
---|---|---|
dcterms | http://purl.org/dc/terms/ | [DC-TERMS] |
doap | http://usefulinc.com/ns/doap# | DOAP |
earl | http://www.w3.org/ns/earl# | [EARL10-Schema] |
prov | http://www.w3.org/ns/prov# | [prov-o] |
rdf | http://www.w3.org/1999/02/22-rdf-syntax-ns# | [rdf-schema] |
skos | http://www.w3.org/2004/02/skos/core# | [skos-reference] |
solid | http://www.w3.org/ns/solid/terms# | Solid Terms |
spec | http://www.w3.org/ns/spec# | Spec Terms |
td | http://www.w3.org/2006/03/test-description# | Test Description |
Data Formats
This specification uses the RDF language to describe technical reports, test reports, test suites, and test cases. Implementers are encouraged to produce human-readable and machine-readable representations with RDFa in host languages such as HTML and SVG.
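For orientation, the prefix labels in the Namespaces table correspond to the following Turtle prefix declarations. This is a non-normative sketch; Turtle is used here only as one possible concrete RDF syntax, and the final triple is a hypothetical illustration.

```turtle
# Prefix declarations matching the Namespaces table.
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix doap:    <http://usefulinc.com/ns/doap#> .
@prefix earl:    <http://www.w3.org/ns/earl#> .
@prefix prov:    <http://www.w3.org/ns/prov#> .
@prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
@prefix solid:   <http://www.w3.org/ns/solid/terms#> .
@prefix spec:    <http://www.w3.org/ns/spec#> .
@prefix td:      <http://www.w3.org/2006/03/test-description#> .

# A hypothetical resource described with one of these vocabularies.
<https://example.org/resource> dcterms:title "An example resource"@en .
```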
Conformance
This section describes the conformance model of the Solid QA.
Normative and Informative Content
All assertions, diagrams, examples, and notes are non-normative, as are all sections explicitly marked non-normative. Everything else is normative.
The key words “MUST”, “MUST NOT”, “SHOULD”, and “MAY” are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.
The key words “strongly encouraged”, “strongly discouraged”, “encouraged”, “discouraged”, “can”, “cannot”, “could”, “could not”, “might”, and “might not” are used for non-normative content.
Specification Category
The Solid QA identifies the following Specification Category to distinguish the types of conformance: API, notation/syntax, set of events, processor behaviour, protocol.
Classes of Products
The Solid QA identifies the following Classes of Products for conforming implementations. These products are referenced throughout this specification.
- Technical Report
- A Technical Report is a document outlining recommendations and the level of conformance that various classes of products, processes, or services can achieve [qa-glossary].
- Test Case
- A Test Case is an individual test with a purpose that maps to measurable or testable behaviours, actions, or conditions in a Technical Report [qa-glossary].
- Test Report
- A Test Report is a resource that describes the level of conformance of a project’s tests against Test Cases.
- Test Suite
- A Test Suite is a collection of documents and software designed to verify an implementation's degree of conformance by using Technical Reports and Test Cases, and it generates Test Reports [qa-glossary].
Interoperability
Interoperability of implementations for Test Suite and Technical Report is tested by evaluating an implementation’s ability to consume and process data that conform to this specification.
Technical Report
Technical Report Description
The Solid Technical Reports Contributing Guide provides the recommendations for publishing technical reports following the Linked Data design principles, where significant units of information, such as concepts and requirements, are given an identifier, and described with a concrete RDF syntax. The Spec Terms vocabulary provides classes and properties that can be used to describe any significant unit of information in technical reports, as well as supporting the description of test cases and test reports. The SKOS data model can be used to identify, describe, and link concepts and definitions across technical reports.
Add other requirements from Spec Terms and SKOS.
- One spec:testSuite property to refer to a test suite.
- One spec:implementationReport property to refer to an implementation report.

Defining URI Templates for implementation report, e.g., https://solidproject.org/test-reports/{technical-report-short-name}/summary, and test report, e.g., https://solidproject.org/test-reports/{technical-report-short-name}/{uuid}.
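A non-normative sketch in Turtle of a technical report description using the two properties above, with hypothetical URIs following the URI Templates sketched here:

```turtle
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix spec:    <http://www.w3.org/ns/spec#> .

# Hypothetical technical report linking to its test suite and
# implementation report summary (all URIs are placeholders).
<https://solidproject.org/ED/example>
  dcterms:title "Example Technical Report"@en ;
  spec:testSuite <https://example.org/example/test-suite/> ;
  spec:implementationReport <https://solidproject.org/test-reports/example/summary> .
```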
Test Report
This section describes the requirements for publishing test reports and summaries.
Test Report Description
solid/test-suite-panel/issues/5
The Test Report Description includes information pertaining to conformance and interoperability of an implementation.
Document metadata:
- One dcterms:license property to indicate the license of the report.
- One dcterms:created property to indicate the date and time of the report.
- One publication status property (TBD) to state the publication status of the report.
- One activity association property (TBD) to indicate the agent that had a role in the production of the report.
- One submitted by property (TBD) to indicate the entity that proposed the report for publication.
- One approved by property (TBD) to indicate the entity that accepted the report for publication.
- Zero or one dcterms:description property to include additional notes about the report from the submitter.
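A non-normative sketch in Turtle of the document metadata above; the report URI, license, and date are hypothetical, and the properties still marked TBD are omitted until they are decided:

```turtle
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix xsd:     <http://www.w3.org/2001/XMLSchema#> .

# Hypothetical test report with the listed document metadata.
<https://solidproject.org/test-reports/example/0a1b2c3d-0000-0000-0000-000000000000>
  dcterms:license <https://creativecommons.org/licenses/by/4.0/> ;
  dcterms:created "2024-03-16T00:00:00Z"^^xsd:dateTime ;
  dcterms:description "Submitter notes about this report."@en .
```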
Additional notes about the report can be expressed as a Web Annotation (oa:Annotation).
The agent (person or software) that had a role in the production of the test report can be expressed as a PROV-O activity association (prov:wasAssociatedWith).
For publication status, some TRs use pso:holdsStatusInTime with pso:withStatus; is there something from a common vocabulary? What visual cues should be required to indicate publication status at a glance?
Implementations that are software projects MUST be described with the Description of a Project vocabulary [DOAP].
Description of a project:
- One rdf:type property whose object is doap:Project.
- Zero or one doap:name property to state the name of the project.
- Zero or one doap:repository property to indicate the location of the project's source code.
- Zero or one doap:release property to indicate the version information of a project release.
- Zero or one doap:maintainer property to refer to the maintainers of a project.
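A non-normative sketch in Turtle of a project description using the properties above; the project, repository, and maintainer URIs are hypothetical, and the versioned project URI follows the encouragement in the note below.

```turtle
@prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix doap: <http://usefulinc.com/ns/doap#> .

# Hypothetical software project being tested.
<https://example.org/project/example-server/v1.2.3#this>
  rdf:type doap:Project ;
  doap:name "Example Server" ;
  doap:repository [ rdf:type doap:GitRepository ;
                    doap:location <https://example.org/source/example-server> ] ;
  doap:release [ rdf:type doap:Version ; doap:revision "1.2.3" ] ;
  doap:maintainer <https://example.org/profile/maintainer#me> .
```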
Note: Test Report Description: Description of a Project
The subject of the doap:Project for Test Report Description coincides with the object of the earl:subject in Test Assertion Description.
To help distinguish project releases in test reports, it is encouraged to use versioned URIs for projects.
While some information about the project can be accompanied with the test report, it is encouraged that projects are self-describing documents.
What should be the recommendation for implementations that are not software projects? Perhaps equivalent to a top concept of spec:ClassesOfProducts?
Test assertions:
Test reports MUST incorporate additional information about test criteria indicating test authors and reviewers, review status, version of the test criterion, software and setup used to run the tests, provenance, coverage, and test suite (see also Test Assertion Description).
Test reports with approved status MUST NOT include assertions related to test criteria with rejected review status (td:rejected).
Note: Test Report Description: Test Assertion
To help distinguish test criteria, it is encouraged to use versioned URIs for criteria.
To convey the association between a test criterion and its reviews, it is encouraged to use the Web Annotation Vocabulary with the oa:assessing motivation.
Should Web Annotation Vocabulary be required to convey the relationship between test criteria and reviews?
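Pending an answer to that question, a non-normative sketch in Turtle of how a review could be associated with a versioned test criterion using the Web Annotation Vocabulary; the oa prefix and all URIs are assumptions of this sketch.

```turtle
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix oa:  <http://www.w3.org/ns/oa#> .

# Hypothetical review of a test criterion, motivated by assessment.
<https://example.org/reviews/42>
  rdf:type oa:Annotation ;
  oa:motivatedBy oa:assessing ;
  oa:hasBody <https://example.org/reviews/42#note> ;
  oa:hasTarget <https://example.org/test-suite/criteria/example-criterion/v1> .
```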
Test Case Description
- One rdf:type property whose object is td:TestCase.
- One spec:requirementReference property to refer to the specification requirement that the test case is testing.
- One td:reviewStatus property to indicate the status of a test (at the time when the test was run) with an object one of: td:unreviewed, td:onhold, td:assigned, td:accepted, td:approved, td:rejected.
- Zero or one td:input property to indicate the parameter or data that are needed for the test execution.
- One td:expectedResults property to indicate the results that a conformant implementation is expected to produce when this test is executed.
- Zero or one td:preCondition property to indicate a condition that must be met before the test is executed.
- Zero or one td:purpose property to state the reason for the test with additional context.
- Zero or one dcterms:title property to provide a human-oriented name for the test.
- Zero or one dcterms:description property to provide a description of the nature and characteristic of the test.
- One or more dcterms:contributor properties to indicate individuals or organisations that contributed to this test.
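A non-normative sketch in Turtle of a test case description with the properties above; all URIs and literal values are hypothetical, and whether some objects are literals or resources is left open here.

```turtle
@prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix td:      <http://www.w3.org/2006/03/test-description#> .
@prefix spec:    <http://www.w3.org/ns/spec#> .
@prefix dcterms: <http://purl.org/dc/terms/> .

# Hypothetical test case mapped to a single specification requirement.
<https://example.org/test-suite/cases/example-case/v1>
  rdf:type td:TestCase ;
  spec:requirementReference <https://solidproject.org/ED/example#example-requirement> ;
  td:reviewStatus td:accepted ;
  td:expectedResults "The response has status code 200 and a Content-Type header."@en ;
  td:preCondition "The resource under test exists on the server."@en ;
  td:purpose "Checks that the server responds as required when the resource is requested."@en ;
  dcterms:title "Example requirement test"@en ;
  dcterms:description "Exercises one hypothetical requirement of the example technical report."@en ;
  dcterms:contributor <https://example.org/profile/author#me> .
```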
spec:testScript may need to be rdfs:subPropertyOf td:input or td:informationResourceInput, or use those properties instead.
When referencing a requirement with spec:requirementReference, it is strongly encouraged to use versioned URLs of technical reports where available, with preference to URLs covered by a persistence policy.
Test Assertion Description
A Test Assertion indicates measurable or testable statements of behaviour, action, or condition derived from a specification's requirements. A test assertion is stated by an entity carrying out the test, indicates the contextual result of the test, is produced with a particular process, and is based on a criterion that is used to evaluate an implementation.
Test assertions MUST use the Evaluation and Report Language 1.0 Schema [EARL10-Schema].
Test Suite implementers are encouraged to follow the Developer Guide for Evaluation and Report Language 1.0 [EARL10-Guide].
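A non-normative sketch in Turtle of a single EARL assertion, reusing the hypothetical project and test case URIs from the sketches above:

```turtle
@prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix earl: <http://www.w3.org/ns/earl#> .

# Hypothetical assertion: who tested what, against which criterion,
# how the test was run, and with what outcome.
<https://solidproject.org/test-reports/example/0a1b2c3d-0000-0000-0000-000000000000#assertion-1>
  rdf:type earl:Assertion ;
  earl:assertedBy <https://example.org/profile/tester#me> ;
  earl:subject <https://example.org/project/example-server/v1.2.3#this> ;
  earl:test <https://example.org/test-suite/cases/example-case/v1> ;
  earl:mode earl:automatic ;
  earl:result [ rdf:type earl:TestResult ; earl:outcome earl:passed ] .
```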
Test Report Notification
GitHub repo/directory and/or report sent as an LDN (TBD: either including or requesting maintainer’s approval.)
Should Activity Vocabulary and Linked Data Notifications be one of the ways to submit test reports as a notification?
Publication of reports can be pre-authorized for approved tests.
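Since the submission mechanism is still an open question, the following is only one possible shape, assuming Activity Streams 2.0 and Linked Data Notifications were adopted; all URIs are placeholders.

```turtle
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix as:  <https://www.w3.org/ns/activitystreams#> .

# Hypothetical notification announcing a test report to an inbox.
<https://example.org/notifications/announce-test-report>
  rdf:type as:Announce ;
  as:actor <https://example.org/profile/submitter#me> ;
  as:object <https://solidproject.org/test-reports/example/0a1b2c3d-0000-0000-0000-000000000000> ;
  as:target <https://solidproject.org/test-reports/example/> .
```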
Project Maintainer Input and Review
The project maintainer will be notified to review the publication of a report; if there is no objection within a certain time (TBD), it can be approved for publication by the submitter (or the test suite panel). During the review process, maintainers will be given the opportunity to provide explanatory notes to accompany the report.
Implementation Report Description
The Implementation Report Description refers to the criteria and resolutions set forth by the Group publishing a Technical Report to demonstrate implementation experience, refers to individual test reports, and provides a summary.
Document metadata:
Similar to Test Report Description document metadata. Reuse or redefine?
Referencing individual test reports:
- One or more spec:testReport properties to refer to test reports.
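A non-normative sketch in Turtle of an implementation report summary referencing individual test reports; the URIs follow the hypothetical URI Template used earlier.

```turtle
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix spec:    <http://www.w3.org/ns/spec#> .

# Hypothetical implementation report summary.
<https://solidproject.org/test-reports/example/summary>
  dcterms:title "Example Technical Report: Implementation Report"@en ;
  spec:testReport <https://solidproject.org/test-reports/example/0a1b2c3d-0000-0000-0000-000000000000> ,
                  <https://solidproject.org/test-reports/example/4e5f6a7b-0000-0000-0000-000000000000> .
```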
Test Suite
This section describes the requirements for test suites and test metadata.
solid/test-suite-panel/issues/6
Test Suite Description
Description of test suites (e.g., license, date, written by, reviewed by), the software (e.g., repository, version, maintainers), the setup (e.g., platform, configuration), provenance (e.g., activity, entity, agent), and input about the environment (e.g., combination of subject and setup). These descriptions should meet the requirements of reporting (issue 5) and the test review checklist (issue 7).
- One rdf:type property whose object is spec:TestSuite (TBD).
- One or more spec:testCase properties to refer to test cases.
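A non-normative sketch in Turtle of a test suite description; spec:TestSuite is still marked TBD above, so the class (and the other details) may change.

```turtle
@prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix spec:    <http://www.w3.org/ns/spec#> .
@prefix dcterms: <http://purl.org/dc/terms/> .

# Hypothetical test suite referring to its test cases.
<https://example.org/test-suite/>
  rdf:type spec:TestSuite ;
  dcterms:title "Example Test Suite"@en ;
  spec:testCase <https://example.org/test-suite/cases/example-case/v1> ,
                <https://example.org/test-suite/cases/another-case/v1> .
```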
Test Environment
Documentation on:
- how to set up a test environment, how to build the test system (if necessary).
- how to write tests that are, in general, short and self-contained, and perhaps provide best practices and guidelines.
- how to run tests (provide a set of steps to test and obtain the result). Tests may be run in different ways depending on the specification requirement, e.g., command-line, containers, Web browser.
Do preliminary checks against multiple implementations to catch anomalies or to tease out issues with the tests themselves. Test authors and specification authors should coordinate regarding the test design.
Open a pull request for the updated test, mark what it replaces, mark it as to be reviewed, and request reviews.
Notify specification authors and editors (and other group of contributors) about new tests and request reviews, e.g., tagging on GitHub.
Tagging the project maintainer (issue 5).
Provide information indicating to what extent the test suite completely or proportionally covers the specification it aims to support (issue 5).
Link to project maintainer’s WebID or GitHub account.
Test Assessment
This section describes the process for authoring and reviewing tests.
solid/test-suite-panel/issues/7
Test Review Policy
- Test review has a URI and its contents are publicly accessible when dereferenced.
- Test reviewer can be anyone (other than the original test author) that has the required experience with the specification. TBD whether at least one reviewer must be the author of the specification.
Test Review Criteria
- The test has a URI and its contents are publicly accessible when dereferenced.
- The test links to specification requirements.
- The CI jobs on the pull request have passed. (TBD)
- It is obvious what the test is trying to test.
- The test passes when it’s supposed to pass.
- The test fails when it’s supposed to fail.
- The test is testing what it thinks it’s testing.
- The specification backs up the expected behaviour in the test.
- The test is automated as (TBD) unless there’s a very good reason for it not to be.
- The test does not use external resources. (TBD)
- The test does not use proprietary features (vendor-prefixed or otherwise).
- The test does not contain commented-out code.
- The test is placed in the relevant location.
- The test has a reasonable and concise (file)name.
- If the test needs to be run in some non-standard configuration or needs user interaction, it is a manual test.
- The title is descriptive but not too wordy.
Considerations
This section details security, privacy, accessibility and internationalization considerations.
Some of the normative references in this specification point to documents with a Living Standard or Draft status, meaning their contents can still change over time. It is advised to monitor these documents, as such changes might have implications.
Security Considerations
This section is non-normative.
Privacy Considerations
This section is non-normative.
Accessibility Considerations
This section is non-normative.
Internationalization Considerations
This section is non-normative.
Security and Privacy Review
This section is non-normative.
These questions provide an overview of security and privacy considerations for this specification as guided by [SECURITY-PRIVACY-QUESTIONNAIRE].
- What information might this feature expose to Web sites or other parties, and for what purposes is that exposure necessary?
- ..
- Do features in your specification expose the minimum amount of information necessary to enable their intended uses?
- ..
- How do the features in your specification deal with personal information, personally-identifiable information (PII), or information derived from them?
- ..
- How do the features in your specification deal with sensitive information?
- ..
- Do the features in your specification introduce new state for an origin that persists across browsing sessions?
- ..
- Do the features in your specification expose information about the underlying platform to origins?
- ..
- Does this specification allow an origin to send data to the underlying platform?
- ..
- Do features in this specification allow an origin access to sensors on a user’s device?
- ..
- What data do the features in this specification expose to an origin? Please also document what data is identical to data exposed by other features, in the same or different contexts.
- ..
- Do features in this specification enable new script execution/loading mechanisms?
- ..
- Do features in this specification allow an origin to access other devices?
- ..
- Do features in this specification allow an origin some measure of control over a user agent’s native UI?
- ..
- What temporary identifiers do the features in this specification create or expose to the web?
- ..
- How does this specification distinguish between behaviour in first-party and third-party contexts?
- ..
- How do the features in this specification work in the context of a browser’s Private Browsing or Incognito mode?
- ..
- Does this specification have both "Security Considerations" and "Privacy Considerations" sections?
- ..
- Do features in your specification enable origins to downgrade default security protections?
- ..
- How does your feature handle non-"fully active" documents?
- ..
Change Log
This section is non-normative.
The summary of editorial and substantive changes in this section is based on the W3C Process Document Classes of Changes [W3C-PROCESS].
Acknowledgements
The Community Group gratefully acknowledges the work that led to the creation of this specification, and extends sincere appreciation to those individuals that worked on technologies and specifications that deeply influenced our work.
The Community Group would like to thank the following individuals for their useful comments, both large and small, that have led to changes to this specification over the years:
- Alain Bourgeois
- April Daly
- Emmet Townsend
- Hadrian Zbarcea
- Kjetil Kjernsmo
- Michiel de Jong
- Pete Edwards
- Ted Thibodeau Jr
- Tim Berners-Lee
- Wouter Termont
- Yvo Brevoort
References
Normative References
- [DC-TERMS]
- Dublin Core Metadata Terms, version 1.1. DCMI Usage Board. DCMI. 11 October 2010. DCMI Recommendation. URL: http://dublincore.org/documents/2010/10/11/dcmi-terms/
- [DOAP]
- DOAP: Description of a Project. URL: https://github.com/ewilderj/doap
- [EARL10-Schema]
- Evaluation and Report Language (EARL) 1.0 Schema. Shadi Abou-Zahra. W3C. 2 February 2017. W3C Working Group Note. URL: https://www.w3.org/TR/EARL10-Schema/
- [LDN]
- Linked Data Notifications. Sarven Capadisli; Amy Guy. W3C. 2 May 2017. W3C Recommendation. URL: https://www.w3.org/TR/ldn/
- [prov-o]
- PROV-O: The PROV Ontology. Timothy Lebo; Satya Sahoo; Deborah McGuinness. W3C. 30 April 2013. W3C Recommendation. URL: https://www.w3.org/TR/prov-o/
- [RDF-SCHEMA]
- RDF Schema 1.1. Dan Brickley; Ramanathan Guha. W3C. 25 February 2014. W3C Recommendation. URL: https://www.w3.org/TR/rdf-schema/
- [RDF11-CONCEPTS]
- RDF 1.1 Concepts and Abstract Syntax. Richard Cyganiak; David Wood; Markus Lanthaler. W3C. 25 February 2014. W3C Recommendation. URL: https://www.w3.org/TR/rdf11-concepts/
- [RFC2119]
- Key words for use in RFCs to Indicate Requirement Levels. S. Bradner. IETF. March 1997. Best Current Practice. URL: https://datatracker.ietf.org/doc/html/rfc2119
- [RFC3986]
- Uniform Resource Identifier (URI): Generic Syntax. T. Berners-Lee; R. Fielding; L. Masinter. IETF. January 2005. Internet Standard. URL: https://datatracker.ietf.org/doc/html/rfc3986
- [RFC8174]
- Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words. B. Leiba. IETF. May 2017. Best Current Practice. URL: https://datatracker.ietf.org/doc/html/rfc8174
- [RFC9110]
- HTTP Semantics. R. Fielding; M. Nottingham; J. Reschke. IETF. June 2022. Internet Standard. URL: https://www.rfc-editor.org/rfc/rfc9110
- [RFC9113]
- HTTP/2. M. Thomson; C. Benfield. IETF. June 2022. Internet Standard. URL: https://www.rfc-editor.org/rfc/rfc9113
- [qa-glossary]
- Quality Assurance glossary. K. Dubost; M. Skall. W3C. 28 April 2005. W3C Quality Assurance and Conformance activity. URL: https://www.w3.org/QA/glossary
- [skos-reference]
- SKOS Simple Knowledge Organization System Reference. Alistair Miles; Sean Bechhofer. W3C. 18 August 2009. W3C Recommendation. URL: https://www.w3.org/TR/skos-reference/
- [test-metadata]
- Test Metadata. Patrick Curran; Karl Dubost. W3C. 14 September 2005. W3C Working Group Note. URL: https://www.w3.org/TR/test-metadata/
- [W3C-HTML]
- HTML. W3C. 28 January 2021. W3C Recommendation. URL: https://www.w3.org/TR/html/
- [WEBARCH]
- Architecture of the World Wide Web, Volume One. Ian Jacobs; Norman Walsh. W3C. 15 December 2004. W3C Recommendation. URL: https://www.w3.org/TR/webarch/
- [WEBID]
- WebID 1.0. Andrei Sambra; Stéphane Corlosquet. W3C WebID Community Group. 5 March 2014. W3C Editor’s Draft. URL: https://www.w3.org/2005/Incubator/webid/spec/identity/
Informative References
- [EARL10-Guide]
- Developer Guide for Evaluation and Report Language (EARL) 1.0. Carlos A. Velasco; Shadi Abou-Zahra. W3C. 2 February 2017. W3C Working Group Note. URL: https://www.w3.org/TR/EARL10-Guide/
- [qa-handbook]
- The QA Handbook. Lofton Henderson. W3C. 6 September 2005. W3C Working Group Note. URL: https://www.w3.org/TR/qa-handbook/
- [qaframe-spec]
- QA Framework: Specification Guidelines. Karl Dubost; Lynne Rosenthal; Dominique Hazaël-Massieux; Lofton Henderson et al. W3C. 17 August 2005. W3C Recommendation. URL: https://www.w3.org/TR/qaframe-spec/
- [SECURITY-PRIVACY-QUESTIONNAIRE]
- Self-Review Questionnaire: Security and Privacy. Theresa O'Connor; Peter Snyder. W3C. 16 December 2021. W3C Group Note. URL: https://www.w3.org/TR/security-privacy-questionnaire/
- [SOLID-TECHNICAL-REPORTS]
- Solid Technical Reports. Sarven Capadisli. W3C Solid Community Group. 16 March 2024. Living Document. URL: https://solidproject.org/TR/
- [W3C-PROCESS]
- W3C Process Document. Elika J. Etemad / fantasai; Florian Rivoal; W3C Process Community Group. 3 November 2023. URL: https://www.w3.org/policies/process/