The audit requires verification of the operation of CAcert's Assurance or Registration Authority business (in PKI language, the Assurer is known as the Registration Authority, or RA). See DRC: "A.2.y The CP details how the CA verifies that RAs operate in accord with the CA's policies." The trivial audit control would be to have an external auditor verify the operation of Assurers; this is impractical for cost reasons.
Because of the difficulties in the scalability of the overall assurance process, it is considered necessary and accepted practice to make assurance self-verifying. This goal was established at the 20090517 MiniTOP on Assurance. Audit therefore instructs the Assurance team to prepare and operate a reliable process of self-verification, and to report the results reliably up to the Board and Auditor.
The basic control is to have a co-auditor be assured by each Assurer. A co-auditor is an Assurer who has special qualifications, below.
Reporting is on the basis of Assurers (not assurances). Ideally, each Assurer should be processed at least once per season, but in practice we will attempt to reach a small minority such as 10%. Where an Assurer is processed by different co-auditors, this represents an opportunity to confirm the results, but the records should be treated as duplicates.
The objective of the co-audit process is to measure the confidence that the Assurer's future and past assurances are reliable, in the aggregate, as per the AP 1.1 Assurance Statement. The test is not over the assurance per se, nor is it explicitly directed at any one Assurer. Hence, results are taken to provide a view over all assurances and Assurers; they are not a direct reflection of any individual Assurer.
During the co-audited assurance, the co-auditor checks for the quality of the assurance by means of a standardised set of tests.
The assurance should aim to cover a basic set of points, which each co-auditor is required to know. Each set should include from 6 to 10 points. Most or all should be explicitly checked during the assurance. The results should be recorded, statistically summarised and reported, including positive, negative, and invalid (e.g., not done or not applicable). See A2.
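The recording and summarising step above can be sketched in code. This is a hypothetical illustration, assuming each co-audited assurance is recorded as a mapping from test point to one of the three outcomes named in the text (positive, negative, invalid); the test names and data layout are made up for the example.

```python
from collections import Counter

# Hypothetical sketch: each co-audited assurance records an outcome per
# test point: "pass", "fail", or "invalid" (not done / not applicable).
def summarise(results):
    """Tally outcomes across all recorded test points, with fractions."""
    tally = Counter()
    for assurance in results:          # one dict of outcomes per assurance
        tally.update(assurance.values())
    total = sum(tally.values())
    return {outcome: (count, count / total) for outcome, count in tally.items()}

# Illustrative season data (test names are placeholders, not the real set).
season = [
    {"D": "pass", "N": "pass", "C": "fail", "AP": "invalid"},
    {"D": "pass", "N": "fail", "C": "pass", "AP": "pass"},
]
print(summarise(season))
```

Reporting fractions alongside raw counts keeps the aggregate view comparable across seasons even when the number of co-audited assurances varies.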
For the purpose of maintaining statistical consistency, the set should be agreed in advance and standardised across a large group of assurances, e.g., one set of tests over a stated season. See A1.
The set of checks should not be widely publicised. Although there is no decision to make it secret, it should be kept reasonably hidden to ensure the validity of the statistical results.
Prepare a set of CAP forms before the event for the check over each Assurer. The co-auditor is encouraged to use blank forms, and to let the Assurer guide them in filling it out (or however the Assurer desires).
Any use of false documents or conflicting information should be planned carefully. It can be non-specific, with no indication of which events or which uses. It needs to be approved by the Assurance Officer or a delegate, documented in advance, and revealed afterwards.
Notes on process, for you the Co-Auditor:
Each test is an assurance by the Assurer over the co-auditor. Mutual assurance is now the norm, so that should be done too, but for the auditing process the co-auditor should instruct the Assurer to go first.
Assurers will make mistakes, due to the deliberate use of confusing inputs or the pressure of the process. A mistake is a chance to improve quality: after the co-audit is completed, go back through the assurance and point out the errors and the solutions. Disputes should not be filed against the Assurer for making normal mistakes.
On the report form, note:
Before the end of the event, the co-auditor must hand over the results to the Event Coordinator. The co-auditor's report should be signed and dated. Your report is a CAcert Assurer Reliable Statement and should be marked as such with "CARS" next to the signature.
Each co-audited assurance should be entered into Casper at the end of the event so as to capture the information for statistical presentation. See A2. Entry into Casper is also a CARS.
On the day, the Event Coordinator leads and is responsible for collating the day's results up to Assurance Officer, Board and Auditor (CARS).
The role of co-auditor was defined at Brussels MiniTOP as
At an organised event with co-auditing, the Assurance Officer may deputise the Event Coordinator or another Senior Assurer to handle the results of the co-auditing.
As all results are CARS up to Assurance Officer, and from the Assurance Officer to the Board and Auditor, the Assurance Officer is responsible for the overall quality of the programme.
The Event Coordinator's documentation of the process should consist of:
The Event Coordinator is responsible for selecting one result for each Assurer from the data supplied by each co-auditor. At the discretion of the Event Coordinator, any earlier checks can be treated as training runs, with emphasis on covering all the needed points. The last check is the full test run, which produces the result.
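The selection rule above (earlier checks are training runs; the last check produces the result) can be sketched as follows. This is an illustrative sketch, not the actual collation tool; the record fields and email addresses are invented for the example.

```python
# Hypothetical sketch: keep only the chronologically last record per Assurer,
# treating all earlier checks as training runs.
def select_final_results(records):
    """records: list of dicts with 'assurer', 'date' (ISO), and 'result' keys."""
    final = {}
    for rec in sorted(records, key=lambda r: r["date"]):
        final[rec["assurer"]] = rec      # later records overwrite earlier ones
    return list(final.values())

runs = [
    {"assurer": "a@example.org", "date": "2010-03-01", "result": "fail"},
    {"assurer": "a@example.org", "date": "2010-03-02", "result": "pass"},
    {"assurer": "b@example.org", "date": "2010-03-01", "result": "pass"},
]
print(select_final_results(runs))
```

Sorting by date before overwriting guarantees the surviving record is the final run, regardless of the order in which co-auditors submitted their data.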
The report should be sent to Assurance Officer, for visibility to External Auditor and Board. The report itself does not include personal details (name, email, individual results). It should probably be published.
The Assurance Officer and the Events Team Leader should coordinate the process and collate group results via CASPER. Auditor should be consulted and kept informed of progress.
At the end of the season, the Assurance Officer delivers a final statement (CARS) over the Assurance of CAcert as a whole. A (private) record of the tests needs to be kept by the AO so as to ensure goals and consistency are met across the seasons.
The 2010 series is in full swing! Tests were discussed at meetings in Brussels. The final set was agreed in Hannover, and trialled there at CeBIT. Available co-auditors were tested (4).
The checks conducted in 2009 by Auditor were: D, a/em, N, C, Dx, AP. Additional questions asked were: how many EPs, and Challenge or not?
Documents used were a set of 4: one good but unfamiliar passport, two valid but minor documents, and one unacceptable document. All were unfamiliar (or meant to be unfamiliar). Auditor used 2 complete sets which were alternated on the day in order to sow confusion and chaos among the ranks of gossipers.
Casper ("Co-Auditing System for Periodic Evaluation of RAs") is a small database system to collect the list of tested assurances by co-auditors, collate them and display the custom statistical presentations needed. As of this moment, it provides entry, search and presentation (total, by country and by co-auditor).
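The presentations Casper provides (total, by country, by co-auditor) amount to simple groupings over the co-audit records. The following is a hypothetical sketch of that collation, not Casper's actual code; the record fields are invented for illustration.

```python
from collections import defaultdict

# Hypothetical sketch of Casper-style collation: group co-audit records by a
# chosen key (country tag or co-auditor id) for summary presentation.
def collate(records, key):
    groups = defaultdict(int)
    for rec in records:
        groups[rec[key]] += 1
    return dict(groups)

records = [
    {"country": "DE", "co_auditor": 1},
    {"country": "DE", "co_auditor": 2},
    {"country": "AT", "co_auditor": 1},
]
print("total:", len(records))
print("by country:", collate(records, "country"))
print("by co-auditor:", collate(records, "co_auditor"))
```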
It is located at fiddle.it and forms one mini-app in a grab bag on that site (which also hosts mini-challenge and survey apps).
Conditions. The site's access conditions include CCA, etc. Access is only permitted using CAcert client certificates. The default access allows only the summarised presentation of data.
Access Control. Those who are marked as co-auditors in the system can search and enter new records on co-audited assurances. Those who are marked as admins in the system can set the co-auditing flag.
Who. Current admins are Ulrich (Assurance Officer) and Iang (site author). SSH access is available to Iang and Philipp Dunkel. Backups are collected by Iang (escrowed on laptop). All are Assurers.
Database Structure. The site does not use a SQL database.
Table | Fields | Comments
---|---|---
Certificate | email address, serial number, common name, expiry, member-Id | Cert is stored and matched to name, email details in Member |
Member | member-Id, nickname, status, since, certs (by S/N), flags = {Admin, Co-Auditor, FindMe, Special}, status = {Assurer, Board, Policy, Arbitrator, Events, etc }, mini-challenge scores, | member-Id (mid) is a unique number stored locally, internally |
Co-audit | co-auditor's mid, assurer's email, date, location, tags = {DE, AT, NL, etc}, tests response = (to 6 tests, pass, fail, not done), series = {2009, 2010}, experience points {0-50}, | The assurer is only known by the email address, generally unknown to fiddle |
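The three record types in the table above can be sketched as simple data classes. This is an illustrative model only, assuming the fields listed in the table; the names and defaults are not the actual Casper schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the three record types described in the table above;
# field names and types are illustrative, not the actual Casper schema.
@dataclass
class Certificate:
    email: str
    serial: str
    common_name: str
    expiry: str
    member_id: int          # matched to name/email details in Member

@dataclass
class Member:
    member_id: int          # "mid": unique number stored locally, internally
    nickname: str
    flags: set = field(default_factory=set)    # e.g. {"Admin", "Co-Auditor"}
    status: set = field(default_factory=set)   # e.g. {"Assurer", "Events"}

@dataclass
class CoAudit:
    co_auditor_mid: int
    assurer_email: str      # the assurer is known only by email address
    date: str
    location: str
    tags: set = field(default_factory=set)     # e.g. {"DE", "AT", "NL"}
    tests: dict = field(default_factory=dict)  # test -> pass / fail / not done
    series: int = 2010
    experience_points: int = 0                 # 0-50
```

Keeping the assurer's email as the only identifier in `CoAudit` mirrors the note in the table that the assurer is generally unknown to fiddle.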