

Five documentation patterns that weaken sealed field review reports - drawn from 729 real FRRs

After analysing 729 field review reports (FRRs) from an Ontario structural practice, we found five recurring documentation patterns that make reports harder to defend under scrutiny. None are engineering errors - they are drafting errors that happen when the bottleneck between site visit and sealed document is too tight.

The corpus

We read 729 field review reports from a single Ontario structural engineering practice. The reports span 14 categories - garage repair, roofing, balcony rehabilitation, envelope specifications, curtain wall, underground parking, structural framing, and others. They cover projects across the Greater Toronto Area over several years of active practice.

This was not a survey. We did not ask firms to self-report their documentation habits. We read the reports - every observation paragraph, every photo caption, every code citation, every revision note. The goal was to understand how sealed field review reports are actually produced, not how they should be produced in theory.

Five patterns appeared often enough to be systemic. They appeared across categories, across projects, and across different authors within the same firm. Each pattern weakens the report's defensibility - its ability to withstand scrutiny from a code authority, a tribunal, or a professional engineering regulator. And each pattern has the same root cause: the time pressure between the site visit and the signed document.

Pattern 1: Missing scope boundaries

A strong field review report states two things with equal precision: what was reviewed and what was not reviewed. The "not reviewed" classification matters as much as the "conforming" classification, because it defines the boundary of the engineer's professional opinion.

In practice, we found that roughly one in four reports described what was inspected without explicitly stating what was excluded. A report covering a concrete slab pour would describe the rebar placement, the formwork, and the concrete mix - but would not state that the shoring, the vapour barrier, or the waterproofing membrane were not part of the review scope.

The risk is specific. If a dispute arises about the shoring and the report does not explicitly exclude shoring from its scope, the report can be read as implying that shoring was observed and found acceptable. The engineer's silence becomes an implicit opinion they never intended to offer.

Under OBC 1.2.2.2, the engineer is responsible for confirming that the work is in general conformance with the plans. Stating what falls outside the review scope is not hedging - it is accurate reporting of what the engineer observed and what they did not. A report that documents its boundaries is harder to misread than a report that leaves its boundaries to inference.

Pattern 2: Generic photo captions

Photo documentation is the evidentiary backbone of a sealed field review. The photographs anchor the engineer's observations to a specific time, place, and condition. When a report states "the rebar placement was observed to be in general conformance," the photograph is the evidence that the observation occurred.

The pattern we found across the corpus: many photo captions describe the subject generically rather than specifically. "Site photo" instead of "Second-floor slab, grid lines C–F / 3–7, 32 MPa concrete placement in progress." "Rebar detail" instead of "Bottom mat reinforcement at column C4, 20M bars at 300 mm spacing, chairs at 1200 mm centres."

Generic captions weaken the evidentiary chain in two ways. First, they make it harder for a reviewer - whether a code official, a tribunal, or the engineer's own principal - to verify that the photograph supports the stated observation. The reviewer has to look at the photograph and infer the connection rather than reading the caption and confirming it. Second, generic captions are harder to cross-reference when a report spans multiple visits. If three reports in a sequence all contain a photograph captioned "site photo," the reader cannot determine what each photograph documents without opening the image itself.

The root cause is time. Writing a specific photo caption - one that names the structural element, the grid location, the material, and the condition observed - takes 30 to 60 seconds per photograph. A report with 12 photographs needs 6 to 12 minutes of captioning time. At 9 PM, after a full day of site visits, that time gets compressed. The captions become generic because the engineer knows what the photo shows and forgets that the reader does not.
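The four ingredients of a specific caption - element, grid location, material, observed condition - amount to a small structured record. As an illustration only (the field names here are hypothetical, not part of any firm's workflow), the idea can be sketched like this:

```python
from dataclasses import dataclass

@dataclass
class PhotoContext:
    """The four fields a specific caption needs (hypothetical names)."""
    element: str    # structural element, e.g. "Second-floor slab"
    location: str   # grid reference, e.g. "grid lines C-F / 3-7"
    material: str   # material or member spec, e.g. "32 MPa concrete"
    condition: str  # condition or activity observed

def caption(ctx: PhotoContext) -> str:
    # Joins the four fields in the order a reviewer reads them:
    # element, location, material, observed condition.
    return f"{ctx.element}, {ctx.location}, {ctx.material}, {ctx.condition}"

print(caption(PhotoContext(
    element="Second-floor slab",
    location="grid lines C-F / 3-7",
    material="32 MPa concrete",
    condition="placement in progress",
)))
```

The point of the sketch is that a caption with all four fields filled in cannot be "site photo" - the structure itself prompts for the specificity the reader needs.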

Pattern 3: Standard references without clause numbers

A defensible field review report ties its observations to specific regulatory standards. The Ontario Building Code, CSA material standards, and the approved drawings form the regulatory frame against which the engineer evaluates the work. Citing these standards is not decoration - it is the mechanism that connects the engineer's observation to a measurable, verifiable criterion.

The pattern: many reports cite standards by name without clause numbers. "Work was observed to be in general conformance with CSA standards" instead of "Bottom mat reinforcement spacing was verified against CSA A23.3-14, clause 7.4.1." "Concrete placement conformed to OBC requirements" instead of "Concrete placement procedures were observed in accordance with OBC Part 4 and CSA A23.1-14."

The distinction matters under scrutiny. A code official reviewing a report that cites "CSA standards" cannot determine which specific requirement the engineer evaluated against. If a structural element later fails and the question is whether the field review was adequate, the generic citation provides no evidence that the engineer checked the relevant clause. The specific citation - CSA A23.3-14, clause 7.4.1 - proves that the engineer knew the requirement, checked the work against it, and formed a professional opinion.

Across the 729-report corpus, the reports with clause-level citations tended to come from more senior engineers or from explicit quality assurance templates that included citation fields. The reports with generic citations tended to come from free-form drafting - engineers writing prose from memory without a template that prompted for specificity.

Pattern 4: Findings without recommendations

A complete field review finding has three parts: the observation (what was seen), the finding classification (conforming, non-conforming, or not reviewed), and the recommendation (what should happen next). The observation describes the physical state. The classification compares it to the standard. The recommendation tells the contractor, the code authority, and the project file what action is required.

The pattern we observed: reports that identify non-conformances but omit the recommendation. A report might state that "the rebar spacing at the east wall was observed at 350 mm centres, exceeding the 300 mm spacing specified on drawing S-201." The non-conformance is clear. But the report does not state whether the contractor should remove and replace, whether an engineering assessment of the as-built condition is required, or whether the non-conformance is minor enough to accept with documentation.

The missing recommendation creates two problems. First, it leaves the contractor without clear direction. The contractor knows that something is wrong but not what the engineer expects them to do about it. This generates phone calls, email chains, and site meetings that could have been avoided by a single sentence in the report. Second, it weakens the report's defensibility at re-inspection. When the engineer returns for the follow-up visit, they must re-evaluate the non-conformance from scratch because the original report did not record what corrective action was expected.

The recommendation does not need to be lengthy. "Contractor to remove and replace reinforcement at east wall to achieve 300 mm spacing per drawing S-201" is one sentence and closes the loop. The finding without a recommendation is an observation. The finding with a recommendation is a professional opinion. The seal covers the opinion.
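The three-part structure of a finding - observation, classification, recommendation - can be expressed as a simple completeness rule: a non-conformance without a recommendation is incomplete. A minimal sketch (names and fields are illustrative, not any firm's actual schema):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Classification(Enum):
    CONFORMING = "conforming"
    NON_CONFORMING = "non-conforming"
    NOT_REVIEWED = "not reviewed"

@dataclass
class Finding:
    observation: str                      # what was seen
    classification: Classification       # comparison against the standard
    recommendation: Optional[str] = None  # what should happen next

    def is_complete(self) -> bool:
        # A non-conformance without a recommendation is an observation,
        # not a professional opinion: flag it as incomplete.
        if self.classification is Classification.NON_CONFORMING:
            return self.recommendation is not None
        return True

incomplete = Finding(
    "Rebar spacing at east wall observed at 350 mm centres, "
    "exceeding the 300 mm specified on drawing S-201",
    Classification.NON_CONFORMING,
)
complete = Finding(
    incomplete.observation,
    Classification.NON_CONFORMING,
    "Contractor to remove and replace reinforcement at east wall "
    "to achieve 300 mm spacing per drawing S-201",
)
```

A template that enforces this rule - refusing to mark a non-conformance closed for drafting until the recommendation field is filled - mechanically prevents the pattern.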

Pattern 5: Revision history gaps

Sealed field review reports are living documents. A firm producing 50 FRRs per month will re-issue 10 to 15 of them as construction progresses, deficiencies are addressed, and follow-up inspections occur. Each re-issue should clearly state what changed from the prior revision - which findings were updated, which non-conformances were resolved, which new observations were added.

The pattern: re-issued reports that increment the revision number (Rev 1 → Rev 2) without documenting what changed. The reader receives a new version of the report and must compare it line by line against the prior version to understand the update. In a complex project with dozens of reports across multiple visits, this line-by-line comparison is impractical. The revision history becomes opaque.

The risk is both practical and regulatory. Practically, a code official reviewing a re-issued report needs to know whether the previously noted non-conformance has been addressed. If the revision history does not state this explicitly, the code official must re-read the entire report and compare it to the prior version - work that should have been done once by the author, not repeatedly by every reader.

From a regulatory standpoint, PEO's standard of care includes maintaining adequate records of professional work. A revision lifecycle without documented changes is an incomplete record. If a dispute arises about when a deficiency was resolved and by whom, the revision history is the primary evidence. A revision history that says only "Rev 2 - updated" provides no evidence of what was updated or why.

The fix is straightforward: a revision table at the top of the report or a summary paragraph stating "Rev 2: Finding 3 (east wall rebar spacing) updated from non-conforming to conforming following contractor remediation on [date]. Photographs 8–10 updated to reflect current condition." Two sentences. Complete audit trail.
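Rendering that revision table from structured records, rather than retyping it, keeps the audit trail consistent across re-issues. A sketch under the same conventions (all field names hypothetical):

```python
from dataclasses import dataclass

@dataclass
class RevisionEntry:
    """One row of the report's revision table (hypothetical fields)."""
    rev: int       # revision number
    date: str      # when the revision was issued
    summary: str   # what changed, stated explicitly

def revision_note(entries: list[RevisionEntry]) -> str:
    # Renders the audit trail newest-first, one line per revision,
    # so a reviewer never has to diff two report versions by hand.
    ordered = sorted(entries, key=lambda e: e.rev, reverse=True)
    return "\n".join(f"Rev {e.rev} ({e.date}): {e.summary}" for e in ordered)

history = [
    RevisionEntry(1, "2024-01-10", "Initial issue."),
    RevisionEntry(2, "2024-02-02",
                  "Finding 3 (east wall rebar spacing) updated from "
                  "non-conforming to conforming following contractor "
                  "remediation. Photographs 8-10 updated."),
]
print(revision_note(history))
```

However it is produced, the output is the same two-sentence record the paragraph above describes: what changed, when, and why.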

Why these are drafting problems, not engineering problems

None of these five patterns reflect failures of engineering judgment. The engineers who authored these reports observed the construction work competently, formed appropriate professional opinions, and took the right photographs. The engineering was sound.

The documentation was rushed.

These patterns emerge from the drafting bottleneck - the 45 to 90 minutes between returning from site and producing a signed document. When a principal is writing their third report of the day at 9 PM, the scope boundaries get assumed rather than stated, the photo captions get shortened, the code citations lose their clause numbers, the recommendations get deferred to a phone call, and the revision notes get truncated to "updated."

Each of these shortcuts saves two to five minutes of drafting time. Across a 50-report month, that is roughly two to four hours recovered per shortcut. But each shortcut also weakens the report's ability to withstand scrutiny - the exact quality that a sealed document is supposed to guarantee.

The solution is not to tell engineers to draft more carefully. They already know what a defensible report looks like. The solution is to reduce the time pressure that produces the shortcuts. When drafting time drops from 90 minutes to 15 minutes, the engineer has time to write the specific photo caption, include the clause number, state the recommendation, and document the revision history. The quality problems disappear not because the engineer changed their behaviour, but because the bottleneck that forced the shortcuts no longer exists.
