What makes a review defensible
A sealed field review report under Ontario Building Code section 1.2.2.2 carries a specific weight: a licensed professional is attesting that they observed construction work and formed a professional opinion about its conformance. If that opinion is later challenged - in a dispute, a building failure investigation, a PEO complaint, or a liability claim - the report is the primary evidence of what the engineer knew, when they knew it, and what they did about it.
Defensibility is not about legal boilerplate or disclaimer language. It is about the structural integrity of the document itself. A defensible review is one where every finding can be traced back to a specific observation, evaluated against a specific standard, supported by specific evidence, and connected to a specific recommendation. If any link in that chain is missing, the report has a gap - and gaps are what opposing counsel looks for.
The five elements
Across hundreds of field review reports from Ontario structural practices, the pattern is consistent. Reports that would withstand scrutiny share five elements. Reports that would not are missing at least one.
1. Observation specificity
A defensible review names exactly what was observed, where on the site it was observed, and under what conditions. "Concrete pour observed" is not specific. "Second-floor suspended slab pour, grid lines C–F / 3–7, 32 MPa specified mix, observed during placement and initial finishing at 14:00" is specific.
Observation specificity matters because it establishes what the engineer actually saw versus what they assumed or inferred. A report that says "structural steel framing inspected" without specifying which bays, which connections, and which floor level leaves open the question of whether the engineer actually looked at the element that later failed.
The standard of care under Ontario Regulation 941 (the PEO regulation under the Professional Engineers Act) does not require that an engineer observe every square metre of construction. OBC 1.2.2.2 field reviews are sampling-based - the professional selects representative elements and evaluates them. But the report must be clear about what was sampled. If bay A3 was inspected but bay A7 was not, the report should say so. Silence on scope is a defensibility gap.
2. Standard reference
Every finding in a defensible review is anchored to a specific standard. The standard might be the Ontario Building Code itself (Part 4 structural design, Part 9 small buildings, Part 3 fire protection), a referenced material standard (CSA A23.3 for concrete, CSA S16 for steel, CSA O86 for wood), the approved structural drawings, or the project specifications.
The standard reference serves two functions. First, it establishes the basis for the finding - the engineer is not expressing a personal preference but evaluating against an objective criterion. Second, it provides the regulatory hook that gives the finding its authority. A field review report under 1.2.2.2 is a regulatory document. Its findings carry weight because they reference the standards that the code itself incorporates by reference.
Common failures: citing "the Ontario Building Code" generically without a section number. Citing "CSA standards" without specifying which one. Stating that work "does not meet specifications" without identifying which specification clause. Each of these weakens the finding because it forces the reader to guess what the engineer was measuring against.
3. Finding classification
A defensible review classifies each observed element with a clear finding: conforming, non-conforming, or not reviewed. These are not the only possible classifications - some firms use "conforming with minor deviation," "requires further review," or condition-based scales - but the critical requirement is that every element has a classification, and every classification is defined.
The "not reviewed" classification is as important as the other two. It explicitly documents what the engineer did not inspect during this visit. A report that lists only conforming and non-conforming findings implicitly suggests that everything was reviewed, which may not be true. If the engineer inspected the concrete formwork but not the reinforcing steel because the rebar was not yet placed, the report should say "reinforcing steel: not reviewed - not installed at time of visit."
Defensibility problems arise when a finding is ambiguous. "Steel connections appear adequate" is not a finding - it is an impression. "Steel connections at grid line B4/level 2, high-strength bolted moment connection: conforming to CSA S16-19 clause 22 and drawing S-301 detail 4" is a finding. The difference is that the second version can be verified, challenged, or confirmed by a subsequent reviewer. The first version cannot.
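The requirement that every element carry exactly one defined classification maps naturally onto a small data model. A minimal sketch, assuming a three-value scheme like the one above - the names `Finding` and `Classification` are illustrative, not drawn from any particular firm's tooling:

```python
from dataclasses import dataclass
from enum import Enum

class Classification(Enum):
    # Firms may extend this set, but every element gets exactly one value.
    CONFORMING = "conforming"
    NON_CONFORMING = "non-conforming"
    NOT_REVIEWED = "not reviewed"

@dataclass
class Finding:
    element: str                    # what and where: grid line, level, element type
    classification: Classification
    standard_ref: str               # the specific clause or drawing detail
    note: str = ""                  # e.g. why an element was not reviewed

# A verifiable finding, in the spirit of the example above:
f = Finding(
    element="Steel connections, grid B4 / level 2, HS bolted moment connection",
    classification=Classification.CONFORMING,
    standard_ref="CSA S16-19 clause 22; drawing S-301 detail 4",
)
print(f"{f.element}: {f.classification.value} to {f.standard_ref}")
```

The point of the enum is that "appears adequate" is unrepresentable: a record cannot exist without one of the defined classifications.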
4. Photo evidence
Photographs are the evidentiary backbone of a field review report. A finding without a photo is a professional opinion without physical evidence. That opinion may be correct, but it is harder to defend because it rests entirely on the engineer's credibility rather than on documented reality.
Defensible photo evidence has three properties:
Relevance. The photo shows the element being discussed in the finding. A wide-angle shot of a construction site is context, not evidence. A close-up of the specific connection, pour section, or deficiency cited in the finding is evidence.
Caption accuracy. The caption describes what the photo shows and ties it to the finding. "Photo 7: Second-floor slab pour, grid C–F / 3–7, 32 MPa placement in progress" connects the image to the observation. "Photo 7: Site photo" does not.
Temporal anchoring. The photo's metadata (or the report's narrative) establishes when the image was taken. Construction sites change daily. A photo taken on March 15 is evidence of conditions on March 15. If the report is dated March 18 and the non-conformance was remediated on March 16, the photo must clearly belong to the March 15 visit - otherwise the timeline is ambiguous and the evidence is weakened.
The most common defensibility gap in photo evidence is mismatched captions: a caption that describes one element paired with a photo that shows another. This happens when reports are drafted hours or days after the site visit and the engineer is working from memory to match 15 photos to 15 observations. It is a transcription error, not a judgment error - but it creates the appearance of sloppiness in a document whose entire purpose is to demonstrate diligence.
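Two of these properties, temporal anchoring and placeholder captions, are mechanically checkable if photos carry capture dates and real captions (the caption-to-image mismatch itself is not, since it requires looking at the image). A sketch under those assumptions - the field names are hypothetical, not a real tool's schema:

```python
from datetime import date

def photo_issues(photos, visit_date):
    """Flag photos whose capture date does not match the visit date,
    and captions that identify nothing (pure placeholders)."""
    issues = []
    for p in photos:
        if p["taken"] != visit_date:
            issues.append(f"{p['id']}: taken {p['taken']}, visit was {visit_date}")
        if p["caption"].strip().lower() in {"", "site photo"}:
            issues.append(f"{p['id']}: caption does not identify an element")
    return issues

photos = [
    {"id": "Photo 7", "taken": date(2024, 3, 15),
     "caption": "Second-floor slab pour, grid C-F / 3-7, 32 MPa placement"},
    {"id": "Photo 8", "taken": date(2024, 3, 16), "caption": "Site photo"},
]
for issue in photo_issues(photos, date(2024, 3, 15)):
    print(issue)
```

Here Photo 8 is flagged twice: its capture date falls outside the visit, and its caption describes nothing.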
5. Recommendation chain
A defensible review does not just identify a non-conformance - it states what should happen next. The recommendation closes the loop between observation and action.
Recommendations in a field review report typically fall into three categories:
- Proceed. The work conforms. Construction may continue.
- Hold for resolution. A non-conformance was identified. The specific deficiency must be addressed before the affected work proceeds. The report should name what "addressed" means - repair to drawing specification, supplemental engineering review, submission of a revised shop drawing.
- Requires re-inspection. The engineer needs to return to site to verify that the non-conformance has been remediated. This triggers a re-issued report, which is why the revision lifecycle matters.
The recommendation must be proportional to the finding. A minor deviation from a non-structural specification does not warrant a stop-work recommendation. A structural deficiency in a load-bearing connection does. Proportionality is a judgment call, but the report should make the reasoning visible - why this finding warrants this recommendation.
The recommendation chain also connects to the revision lifecycle. When a non-conformance is found, the report is issued with a "hold" recommendation. When the contractor remediates, the engineer re-inspects and re-issues the report with an updated finding and a "proceed" recommendation. The two reports - the original and the re-issue - form a pair. A defensible documentation practice keeps both in the project record and cross-references them by report number and revision.
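The original/re-issue pairing can be kept honest with an explicit cross-reference in the project record: same report number, incremented revision, and a pointer back to the superseded issue. A minimal sketch of the idea, with illustrative field names:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReportIssue:
    number: str                       # stable report number across revisions
    revision: int
    recommendation: str
    supersedes: Optional[str] = None  # cross-reference to the prior issue

original = ReportIssue("FR-2024-031", 0, "hold for resolution")
reissue = ReportIssue("FR-2024-031", 1, "proceed",
                      supersedes="FR-2024-031 rev 0")

# Both issues stay in the record; the re-issue points back at the original.
print(f"{reissue.number} rev {reissue.revision} supersedes {reissue.supersedes}")
```

The design choice worth noting: the original is never overwritten. The pair together documents the non-conformance, the remediation, and the verification.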
Where reports break down
The five elements above are individually straightforward. The defensibility failures I have seen across hundreds of reports almost never come from a lack of engineering competence. They come from the drafting process itself - the conversion of good engineering judgment into a written document under time pressure.
Template drift. A firm's template evolves over years as different engineers modify it. Section headers change names, photo numbering conventions diverge, regulatory citations go stale (referencing a superseded edition of a CSA standard). The report still communicates, but the inconsistencies create openings for a reviewer who wants to question the firm's quality assurance.
Observation-finding disconnect. The engineer observed a concrete pour and noted it was conforming, but the report's observation section describes "formwork inspection" because the drafter reused language from a previous report and forgot to update it. The finding is correct; the supporting narrative is about a different element. This happens when drafting is done by copying a prior report and modifying it - the most common drafting method in the industry.
Missing "not reviewed" entries. The engineer inspected three of seven structural elements on a given visit. The report lists findings for three elements and says nothing about the other four. Did the engineer not inspect them, or did the drafter forget to include them? The report is ambiguous. In a dispute, ambiguity is liability.
Caption-photo mismatch. As described above, a transcription error that undermines photo evidence. More common in reports with 10+ photos, where the drafter is sorting and captioning images after the fact.
Stale regulatory citations. The report cites CSA A23.3-14 when the project was designed to CSA A23.3-19. The finding is likely correct under either edition, but the stale citation suggests the drafter was working from an old template. In a formal review or dispute, the question becomes whether the engineer evaluated against the correct edition - even if the answer is obviously yes.
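Two of these failure modes, missing "not reviewed" entries and stale citations, are exactly the kind of mechanical gap a simple check can catch, given a declared visit scope and a table of the project's applicable standard editions. A sketch under those assumptions:

```python
def completeness_gaps(scope_elements, findings):
    """Elements in the visit scope with no finding of any kind.
    Each should become an explicit 'not reviewed' entry, not an omission."""
    reported = {f["element"] for f in findings}
    return [e for e in scope_elements if e not in reported]

def stale_citations(cited, applicable):
    """Cited standard editions that differ from the project's applicable ones."""
    return {std: (ed, applicable[std]) for std, ed in cited.items()
            if std in applicable and ed != applicable[std]}

scope = ["footings", "formwork", "reinforcing steel", "anchor bolts"]
findings = [{"element": "footings", "classification": "conforming"},
            {"element": "formwork", "classification": "conforming"}]
print(completeness_gaps(scope, findings))
# → ['reinforcing steel', 'anchor bolts']

cited = {"CSA A23.3": "14", "CSA S16": "19"}
applicable = {"CSA A23.3": "19", "CSA S16": "19"}
print(stale_citations(cited, applicable))
# → {'CSA A23.3': ('14', '19')}
```

Neither check replaces judgment; both convert an ambiguity that opposing counsel could exploit into a prompt the drafter resolves before the report goes out.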
The drafting-quality connection
Every one of these failure modes is a drafting problem, not an engineering problem. The engineer's judgment was sound. The conversion of that judgment into a written document introduced errors because the drafting process is manual, repetitive, and performed under time pressure by professionals whose primary skill is engineering, not technical writing.
This is the connection between defensibility and drafting tooling. A tool that generates the report narrative from structured field observations - with the correct regulatory citations for the project's applicable standards, photo captions tied to specific findings, and explicit "not reviewed" entries for elements outside the visit's scope - eliminates the transcription errors that create defensibility gaps.
The engineering judgment remains the engineer's. The observation, the finding classification, the recommendation - these are professional decisions that no tool replaces. What the tool replaces is the 45 to 90 minutes of Word-template typing where those decisions get transcribed into prose. That transcription step is where defensibility erodes, one small error at a time.
The signature at the end
A defensible sealed review ends the same way it begins: with the licensed professional's judgment. The five elements - observation specificity, standard reference, finding classification, photo evidence, and recommendation chain - are the structure that supports that judgment in writing.
The seal and signature on the last page mean the engineer has reviewed this document and stands behind it. That is the weight of the stamp. The document needs to be worthy of that weight - not because a regulator might ask, but because the engineer's professional reputation is bound to every page.