Choosing the Right Remote Sensing Tool for the Job: Comparing Accuracy, Cost, and Actionability

A technology landscape rich in remote-sensing tools like LiDAR, satellite imagery, and aerial photography offers significant advantages for vegetation management. These tools help utilities identify and address risks faster, optimize operations budgets, mitigate wildfires, and support planners in the office and teams in the field with actionable insights and network-wide situational awareness.

But choosing the right technology from such a vast landscape can be a complex challenge. It demands that you understand both your specific use cases and which tool (or tools) can best support them. And it only becomes more complicated when considering the issue of cost. 

We spoke with utility leaders who have firsthand experience evaluating remote-sensing technology and choosing the right tool for the job. The insights, facts, and advice that follow should arm you with a clearer understanding of:

  • Hard cost and accuracy figures for comparing two of the most common remote-sensing technologies, LiDAR and satellite

  • Accuracy: why it’s vitally important in your vegetation tech and why more granular data isn’t always better

  • Concrete examples of how operations teams put different vegetation intelligence technologies to work within their programs

Cost and accuracy by the numbers

Neither cost nor accuracy exists in a vacuum; both are critical factors in evaluating any technology.

The accuracy figures below are verified. Cost figures are approximate and were calculated from publicly available information, resiliency plan filings, and interviews with our current customers.

                             Horizontal accuracy   Vertical accuracy   Approximate cost
LiDAR*                       6–10 cm               3–10 cm             $350–$450 per mile
Overstory’s satellite data   61–122 cm             91–182 cm           $90–$175 per mile (including AI processing)
Aerial patrol                Highly variable       Highly variable     Variable, and growing
Foot patrol                  Highly variable       Highly variable     Variable, and growing
* Relative accuracy, or the accuracy of a point-to-point measurement such as the distance between a tree branch and a cable. This number is reported by utility customers.

Understanding trade-offs

A glance at the table shows that LiDAR is more accurate than Overstory’s satellite imagery. But LiDAR can also cost 4–5x more, and that figure is for Geiger-mode LiDAR. With other types of LiDAR, the cost can jump to 10x that of satellite imagery.

Considering accuracy or cost in isolation is an oversimplification. Does higher accuracy or lower cost make one of these technologies better than the other? No. The key is not to treat these numbers as ends in themselves: higher accuracy is not always better, and neither is lower cost.

As an example, CenterPoint Energy recently requested $9.9M in capital spending for a LiDAR-based model of its network as part of its pre-Beryl System Resiliency Plan. But Texas’s PUC staff recommended denying that request for failing to consider more cost-effective alternatives. And Texas isn’t alone—utility commissions in Michigan and Illinois have also publicly pushed back on the high cost of LiDAR for vegetation programs in the last year.

Though satellite or aerial data may have better suited the PUC’s budget requirements in this instance, those sources likely wouldn’t offer the accuracy necessary for asset modeling. Understanding the trade-offs helps teams combine the technologies thoughtfully to optimize for cost, time, and impact.

Nick Day, ComEd’s Principal Vegetation Program Manager, explained the idea well in a recent webinar on future-proofing operations: You don’t always need to spend significantly more money to get the highest accuracy data, he said. “The important thing with [remote-sensing] technology is understanding how it fits your needs and what your acceptable accuracy threshold is.”

Comparing the actionability of LiDAR and satellite

Consider how you might use vegetation risk analysis across your network to plan mid-cycle maintenance work.

The highest-accuracy LiDAR provider would deliver an impressive point cloud data set that shows every point of encroachment for every circuit you maintain. But this amount of accuracy often makes the data less actionable—not more. 

Many utilities using LiDAR for maintenance planning find that the data sets include too many points of encroachment. 

The density of the data makes it difficult to determine which spans carry the highest risk. So teams thin the data, keeping only the worst-offending encroachment in every 10 feet of line, which takes time. Then they inspect the circuits individually, ranking spans across each circuit in descending order of risk. Only at the end of this resource-intensive process are they ready to plan maintenance work, proceeding one circuit at a time.
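The thinning-and-ranking workflow described above can be sketched roughly as follows. The data shapes, field names, and the 10-foot bucket size are illustrative assumptions, not any vendor’s actual format:

```python
def thin_points(points, bucket_ft=10):
    """Keep only the worst (lowest-clearance) encroachment point
    in each 10-foot bucket along a span.

    points: list of (position_ft, clearance_ft) tuples (hypothetical shape).
    """
    worst = {}
    for position_ft, clearance_ft in points:
        bucket = int(position_ft // bucket_ft)
        # Replace the bucket's point only if this one has less clearance
        if bucket not in worst or clearance_ft < worst[bucket][1]:
            worst[bucket] = (position_ft, clearance_ft)
    return sorted(worst.values())


def rank_spans(spans):
    """Rank spans by their worst remaining clearance, riskiest first."""
    return sorted(spans, key=lambda s: min(c for _, c in s["points"]))
```

Even in sketch form, the point stands: every step here is extra processing a team must perform before the LiDAR data becomes a work plan.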

The story is different when you prioritize actionability over granularity.

Overstory offers vegetation intelligence sourced from more affordable satellite and aerial imagery. That intelligence combines remote-sensing data with wildfire maps, your asset and ROW data, your tailored risk framework, and other information that matters to you. AI trained on years of utility data helps us identify meaningful patterns and deliver the actionable insights needed to meet your specific business goals.

For instance, you can use the framework to perform a fast stack-ranking of risk across thousands of spans. This enables the same type of risk-ranking you would perform with LiDAR data, but at the level of the whole network rather than one circuit at a time. The process is also significantly faster and 4–5x less expensive than LiDAR.
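A network-wide stack ranking of this kind can be sketched as a weighted scoring pass over all spans. The factor names and weights below are hypothetical placeholders for a utility’s tailored risk framework:

```python
# Hypothetical risk framework: factor weights sum to 1.0
WEIGHTS = {"encroachment": 0.5, "wildfire": 0.3, "tree_density": 0.2}


def risk_score(span):
    """Weighted sum of normalized (0-1) risk factors for one span."""
    return sum(weight * span[factor] for factor, weight in WEIGHTS.items())


def stack_rank(spans):
    """Sort every span in the network, highest risk first."""
    return sorted(spans, key=risk_score, reverse=True)
```

Because the scoring is a single pass over tabular span data rather than per-circuit point-cloud inspection, it scales to the whole network at once.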

Common use cases for satellite and LiDAR

If you still feel lost about which tool is right for your project, there’s good news: Both satellite imagery and LiDAR are well established and widely used, so there are many precedents to help guide your decision. 

Here are a few use cases where utilities may prioritize one technology or the other. 

Cycle maintenance planning: Satellite

As we’ve discussed, satellite imagery can be processed with AI to produce highly accurate risk analysis at network-wide scale and automatically stack-rank thousands of spans. Combined with a risk framework tailored to your network goals, this insight makes it fast to prioritize your work and make the most of your O&M resources.

Wildfire mitigation: Satellite and LiDAR

AI-processed satellite imagery may not be able to resolve at the level of the individual object, but it can offer macro-level insights that are useful for wildfire mitigation planning. For example, it can layer wildfire risk with encroachment risk and tree density, pinpoint high-risk tree species, and identify spans across the network that could be deprioritized because they are free of trees. This actionable data has proven particularly useful in setting critical, timely benchmarks in wildfire mitigation plan filing and execution.

LiDAR offers more detail on the trees themselves. This can be valuable for wildfire control in areas of high risk. In spans where a single branch can cause a wildfire, LiDAR can show every point of encroachment. This enables you to assess and mitigate risk at the most granular level. 

Asset documentation: LiDAR

Design data is notoriously unreliable as a reflection of the real world. For example, it’s common for a pole in the field to measure differently from the standard poles in the supply yard. That’s why many utilities use LiDAR for documentation. The technology captures high-accuracy data that can provide engineering-grade measurements of your real-world assets.

Identifying Off-ROW Hazards: Satellite

Tree health analysis isn’t possible with LiDAR point clouds, and humans often miss hazard trees (those showing signs of decline and within striking distance of conductors) when they’re off the right of way. Satellites, however, can analyze tree health and fall-in risk with surprising accuracy (often up to 90%).

But how does cost come into play? As an example, one mid-Atlantic utility cooperative found that half of its outages come from hazard trees outside the right of way. At roughly $6,000 to restore each outage from an off-ROW hazard tree, and about 500 such outages annually, that amounts to $3,000,000 in annual restoration costs. A 10% reduction in those outages alone would entirely pay for the cost of satellite intelligence.
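The cooperative’s arithmetic, using the figures above:

```python
cost_per_outage = 6_000    # restoration cost per off-ROW hazard-tree outage
outages_per_year = 500     # annual outages from off-ROW hazard trees

annual_cost = cost_per_outage * outages_per_year   # $3,000,000 per year
savings_at_10_pct = annual_cost * 0.10             # $300,000 freed by a 10% reduction

print(annual_cost)        # 3000000
print(savings_at_10_pct)  # 300000.0
```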

Asset modeling and engineering applications: LiDAR

The measurements derived from LiDAR data are accurate enough to enable advanced modeling applications. One common application of LiDAR data is pole loading, or calculating whether a pole has enough size and weight-bearing capacity to support another asset like a cell tower. 

LiDAR data is even accurate enough to document the size and position of a cable in space. That makes it ideal for modeling cable sag and sway. 

Arboriculture: Remote sensing plus boots on the ground

Satellite data can also be processed with AI to identify spans that exhibit risks like crown dieback, which can be an early sign of disease. That’s enough information to know you need to send an arborist out for an assessment in the field.

With LiDAR, you gain a higher level of accuracy in tree heights and more information about things like crown shape and tree width. 

In either case, an arborist on the ground will likely identify things like codominant stems and other defects with the highest accuracy. Because of this, it’s often most cost-effective to use a combined approach, using remote sensing data to identify regions that need more attention from arborists on the ground.

Which tool is best for compliance? It depends.

Many states require utilities to inspect lines for encroachment within a minimum clearance distance and then report the results. Since failing to comply can mean paying a significant fine, teams often elect to capture their lines with the highest possible measurement accuracy. Many utilities choose LiDAR or aerial patrol here because of their long history of use and acceptance by regulatory bodies, though satellite is often more accurate than aerial patrol and sufficient in many cases.

Key considerations for assessing remote sensing technology

To choose the right remote sensing technology for any given project, you can ask yourself the following questions.

  • What are my goals?

  • How accurate should the data be to help me make the “right” decisions?

    • If the data is off by six inches, would that change my decision? 

    • If it’s off by three feet?

    • At what threshold does a tool stop being accurate enough?

  • What scale am I working at?

    • Does the use case focus on individual objects, like assets?

    • Or am I working at the scale of my whole network, making decisions for thousands of spans at once?

    • Am I somewhere in between?

  • Does higher-accuracy data for this use case offer enough value to justify… 

    • The cost differential?

    • The extra time?

    • The extra resources to produce actionable insights?
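As a rough illustration only, the questions above could be collapsed into a toy decision heuristic. The thresholds here are assumptions chosen for illustration, not industry standards:

```python
def suggest_tool(required_accuracy_cm, network_scale):
    """Toy heuristic built from the checklist above.

    Centimeter-level accuracy needs push toward LiDAR; network-scale
    work at looser tolerances favors satellite. Thresholds are
    illustrative assumptions, not prescriptive guidance.
    """
    if required_accuracy_cm <= 10:   # engineering-grade: asset modeling territory
        return "LiDAR"
    if network_scale:                # thousands of spans at once
        return "satellite"
    return "either (weigh cost, time, and resources)"
```

In practice the answer is rarely this mechanical; the checklist exists precisely because the trade-offs vary by program.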

Investing in the right combination for grid resilience

The rapid evolution of these technologies—alongside the rapidly changing landscape of labor, market, and weather challenges—signals an important moment for operations teams. 

Utilities have the opportunity to rethink how they approach safety, reliability, and operational efficiency. And this choice is more than just choosing between one technology or another—it’s about embracing innovation to meet tomorrow’s reliability needs.

The decision to adopt remote sensing technologies is an investment in grid resilience. By considering how cost, accuracy, and (most importantly) actionability can solve key operational challenges, utilities can drive smarter, faster, and more efficient decision-making.

With the right tools and strategic approach, utilities can protect not only their infrastructure, but also the communities and ecosystems that depend on it.
