Introduction to Full Waveform LiDAR: A Presentation

  • (gentle music)

  • - [Keith] Okay, hello everybody. I'm Keith Krause.

  • I'm also with the Airborne Observation Platform team.

  • I'm going to take what Tristan did,

  • and go a little more advanced.

  • I'm going to keep this presentation a little shorter than

  • the number of slides I have. So, fortunately,

  • Tristan covered several of my slides already,

  • and I'm going to try to introduce a little more

  • LiDAR theory, because it provides the context

  • that will hopefully make sense of why our

  • waveform data looks the way it does.

  • Hopefully, that will make sense, and also talk a little bit

  • more about, you know, questions around

  • target detection and that sort of thing.

  • Some of that's going to come up a little more

  • in the waveform.

  • Tristan already kind of showed you discrete LiDAR.

  • You're essentially finding returns or objects.

  • You get geolocations. That could be X, Y, Z

  • on a map. Intensity, other attributes.

  • That's great. But the hope is, with full waveform LiDAR,

  • you're actually measuring that entire signal

  • as a function of time.

  • The hope is that you can do more with that data.

  • We'll talk a little more about that.

  • One of the challenges in the past is that

  • full waveform LiDAR data just hasn't been

  • available to people.

  • There's a handful of groups that are working

  • with it right now, but you don't typically

  • see lots of papers or presentations on the subject.

  • We're hoping to change that.

  • At the current moment, we have LiDAR products

  • from some of our 2013 and 2014 flights.

  • That's available by request.

  • Unfortunately, we weren't able to collect any

  • waveform data last year, due to some instrument

  • hardware issues.

  • But we've been collecting it this year, in 2016,

  • and we're currently processing that data,

  • so it should become available, hopefully,

  • within the next couple of weeks.

  • As I mentioned, we hope more people get involved

  • with waveform LiDAR.

  • So, these are just more graphical representations

  • of what you've already seen from Tristan.

  • But I think the big thing in terms of waveform LiDAR

  • and what I'm going to talk about, just keep in mind right,

  • once again, you have this outgoing laser pulse,

  • some time goes by, and then you're able to record

  • some reflection of light as a function of time.

  • We're going to keep coming back to this change in time.

  • This is another form of one of the figures Tristan showed,

  • but from Texas A&M and Dr. Sorin Popescu.

  • We're going to zoom a little more into this plot here

  • in a second.

  • Once again, it's a 2D beam that's interacting

  • with objects as a function of time.

  • Keep in mind with LiDAR, time is distance.

  • Distance is time.

  • So, you have discrete return and full waveform.

  • Discrete return, there's usually onboard processing.

  • That's real-time. It'll look at that signal and

  • try to do target detection, and then it does the ranging.

  • In Sorin's figure, he kind of talks about this idea that

  • depending on what sort of algorithm or hardware is used,

  • there could be a period of time where it detects an object,

  • and then might have to reset itself.

  • So, it can actually miss things.

  • The nice thing about the full waveform is,

  • you'll capture this entire signal as a function of time.

  • So, hopefully with post-processing,

  • you can go in and get more detail,

  • but as you'll see in a minute,

  • there are some complications too.

  • The hope is, looking at these waveforms,

  • just like with discrete data, you can start

  • to maybe imagine, based on the way the tree structure is,

  • that you might have overstory and some understory,

  • and maybe the ground.

  • You can start thinking about stratification of

  • either vegetation or other objects.

  • I'm not going to spend too much time here,

  • but just the general process of LiDAR is,

  • you fire your laser. You record your signal.

  • You do some sort of target detection.

  • Basically, once you've identified a target,

  • you can then look at the change in time

  • between that outgoing pulse and the received pulse.

  • You do some calculations that convert

  • that time-of-flight into a range.

  • Then from the range, you have your GPS/IMU, and you can

  • figure out what direction the scan mirror is pointed at,

  • and then that gets you your coordinates.

  • So, just like discrete return points can have geolocation,

  • full waveforms can too.

  • You'll see with the product

  • that we include geolocation information.
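
As a rough sketch of that geolocation step (not the production code, which chains several rotation matrices for the scan mirror and aircraft attitude), the geometry reduces to projecting the range along the beam's unit direction vector; the function and the example coordinates below are hypothetical:

```python
import numpy as np

def geolocate(sensor_xyz, beam_unit_vector, target_range_m):
    # Platform position comes from GPS/IMU; the beam direction is assumed
    # to already fold in the scan-mirror angle and aircraft attitude.
    return np.asarray(sensor_xyz, dtype=float) + \
        target_range_m * np.asarray(beam_unit_vector, dtype=float)

# Hypothetical example: platform at ~1200 m elevation, beam pointing
# straight down, UTM-like map coordinates in meters.
print(geolocate([500000.0, 4300000.0, 1200.0], [0.0, 0.0, -1.0], 983.0))
```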

  • In general, ranging follows kind of the basic

  • speed of light calculations from, I don't know,

  • a couple hundred years ago.

  • But essentially, in this case, you know the speed of light.

  • You have the change in time between

  • that outgoing and return pulse.

  • Remember the light has to travel there

  • and then also come back.

  • So the distance corresponds to half of that round-trip time.

  • And then, of course, you have the index of refraction

  • of air, because that laser light's actually going to

  • slow down a little bit traveling in air than say,

  • it would in space if you were in a vacuum.

  • So, that's just that absolute range.
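
In code, that absolute-range calculation is a one-liner; here is a minimal sketch, assuming a typical refractive index for air of about 1.0003:

```python
C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s
N_AIR = 1.0003            # assumed refractive index of air

def time_of_flight_to_range(delta_t_seconds):
    # The pulse travels out and back, so halve the round trip,
    # and light moves slightly slower in air than in vacuum.
    return (C_VACUUM / N_AIR) * delta_t_seconds / 2.0

# A round trip of ~6.56 microseconds gives the ~983 m mentioned shortly
# (the talk rounds the time to "about 6500 nanoseconds").
print(f"{time_of_flight_to_range(6.56e-6):.0f} m")  # -> 983 m
```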

  • You might also hear the term range resolution.

  • Some people call this something different, but Tristan mentioned,

  • you know, when objects get too close to each other,

  • you can't resolve them anymore,

  • and I'll show a figure of that.

  • But, essentially, that's going to be driven by

  • the outgoing pulse shape. So, these laser pulses don't

  • instantaneously jump up to a peak signal.

  • It does take time for it to ramp up, fire that laser,

  • and then ramp back down.

  • So, that shape will actually cause blurring,

  • and that's why you can't resolve objects that are too close together.

  • So, there are several different algorithms

  • for how you would do ranging.

  • Different manufacturers will use their different

  • proprietary algorithms.

  • I'm just going to show the really simple ones.

  • You can imagine that if you have your outgoing laser pulse,

  • then some time goes by. It reflects off,

  • in this case, probably the ground,

  • since you just get a single peak.

  • We're going to find the peaks, and then, in this case,

  • we're going to say, well, let's go and figure out

  • where the 50% energy is on the left side.

  • This would be called leading edge detection.

  • That's done in this case, mostly because,

  • if you look at the shape of this outgoing pulse,

  • it actually is kind of pushed more onto the right side.

  • So, it's not perfectly Gaussian.

  • It's a combination of things: you have a sharper rise

  • than you do a fall, and then the other pieces.

  • This is the ground, so it's pretty simple,

  • but if you're interacting with the canopy,

  • you can imagine that that left edge

  • is going to be the top of the canopy,

  • so that might be where you actually want to range to.
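
A minimal sketch of that leading-edge idea (real systems use proprietary, often hardware-level detectors; the 50% fraction follows the slide, and the function here is illustrative):

```python
import numpy as np
from scipy.signal import find_peaks

def leading_edges(waveform, fraction=0.5, min_height=None):
    # For each detected peak, walk left until the signal first drops
    # below fraction * peak, then interpolate for a sub-bin crossing.
    peaks, _ = find_peaks(waveform, height=min_height)
    edges = []
    for p in peaks:
        level = fraction * waveform[p]
        i = p
        while i > 0 and waveform[i - 1] >= level:
            i -= 1
        if i > 0:  # linear interpolation between bins i-1 and i
            lo, hi = waveform[i - 1], waveform[i]
            edges.append((i - 1) + (level - lo) / (hi - lo))
        else:
            edges.append(float(i))
    return edges
```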

  • I guess, one other thing to note.

  • The time between the outgoing pulse and the return pulse

  • ends up being about 6500 nanoseconds.

  • When you do all the conversion, that comes out to about,

  • in this case, 983 meters.

  • You can imagine, if we're trying to fly

  • at about 1000 meters above the ground,

  • you have some terrain variation,

  • and there you're getting 983.

  • So, this may address your question a little bit,

  • but you can see, just with discrete and waveform,

  • you might get multiple peaks.

  • So, in this case, you could identify three objects,

  • and each of them has a leading edge.

  • So, you could identify in the discrete return,

  • three targets.

  • If you were just looking at the relative time difference

  • between these,

  • maybe you could say, this is the ground,

  • and this is the canopy top, and in this case,

  • the canopy would be 14 meters tall.

  • So, you can start to see, that might be one way that

  • you might analyze waveform data.

  • Rather than building a canopy height model on a raster grid,

  • you might be able to identify canopies and ground

  • within a single laser pulse, and now,

  • start looking at distance measurements that way.

  • So, a little more on range resolution and target separation.

  • This, hopefully illustrates what Tristan talked about.

  • In this case, I've just done a simulation,

  • and we're using a 10 nanosecond outgoing pulse,

  • which is typical of the Optech system,

  • I think at 70 kilohertz.

  • 100 kilohertz might be a little wider,

  • so it would actually blur more.

  • But you can see in this case,

  • if you have a 10 nanosecond wide Gaussian,

  • and you take two ideal targets,

  • and put them 40 nanoseconds away from each other,

  • clearly you can see two peaks, and that's easy.

  • If you move them closer, you can see that the signal

  • starts to blend in the middle,

  • but you can still identify them.

  • Even here, no problem.

  • But you can see here, if you actually separate them

  • by exactly one of the full width half maxes,

  • to you and me, we still see kind of a double peak,

  • but actually a lot of algorithms might have a hard time

  • trying to determine where exactly those two peaks are,

  • and it might still say that there's one peak.

  • And as you get below,

  • if you get less than the full width half max,

  • you still had two targets in the original,

  • but you can see the signal sums into a single shape.

  • So, at this point, you've effectively lost your ability

  • to say there's definitely two objects there.

  • It could just be one object that was brighter.

  • And as you go even further, same kind of thing.

  • And you'll see, if we put some actual Gaussians on this.

  • At least in this case,

  • if you had a really sensitive algorithm,

  • you might say that, I only have one object,

  • but it's not a perfect Gaussian,

  • so maybe there's something else there.

  • But at this point, at half the full width half max,

  • you'd probably have no way of knowing

  • that there's two objects.

  • So, that's kind of the idea of range resolution.

  • You can imagine different branches in a tree,

  • if they're too close together, their signal is just

  • going to sum up and it's going to look like one big branch.
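
A small simulation along those lines, assuming an ideal 10 ns FWHM Gaussian pulse and two point targets, shows where a naive peak finder stops separating them:

```python
import numpy as np
from scipy.signal import find_peaks

FWHM_NS = 10.0
SIGMA = FWHM_NS / 2.355         # FWHM = 2*sqrt(2*ln 2)*sigma ~ 2.355*sigma
t = np.arange(0.0, 200.0, 1.0)  # 1 ns time bins

def pulse(center_ns):
    return np.exp(-0.5 * ((t - center_ns) / SIGMA) ** 2)

# Two ideal targets at decreasing separations.
for sep in (40.0, 20.0, 10.0, 5.0):
    blended = pulse(80.0) + pulse(80.0 + sep)
    n = len(find_peaks(blended)[0])
    print(f"separation {sep:4.1f} ns -> {n} peak(s) found")
# At one FWHM the dip between the peaks is only a few percent deep, so
# noise or a detection threshold can easily hide it; below one FWHM the
# two returns merge into a single shape.
```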

  • Not going to talk too much about this,

  • other than I do have a figure to kind of explain this.

  • But, one of the challenges with all these systems,

  • is being able to write the data fast enough to keep up.

  • Kind of as a comparison, the hyperspectral data.

  • You have a 640x480 array.

  • You're running it at 100 lines per second.

  • That's effectively equivalent to the data rate

  • the LiDAR runs at, at 100 kilohertz,

  • if we had 310 time bins that we were trying to save out.

  • Now, the difference is, the spectrometer has a fancy

  • computer, and I think it simultaneously writes

  • to four hard drives at the same time.

  • Whereas, the LiDAR, I think has a single hard drive.

  • So, there's kind of games you have to play,

  • making sure you're saving out that data fast enough,

  • or else the laser's going to keep firing,

  • and it'll just miss everything.

  • As an example, you might love to save the entire data space

  • from when you fired that outgoing laser all the way

  • through the air and down to the ground and back,

  • but unfortunately, that would be over 6000 bins of data,

  • and just with 100 kilohertz, which is our nominal PRF,

  • and if we had 8-bit data, let's say,

  • though most of the newer systems

  • are running higher than that, like 16 bits,

  • you'd actually need to write out

  • at about 5 gigabits per second.

  • Now, the other day I just copied some data

  • from a hard drive and it was running at like

  • 30 megabits per second.

  • So, you can imagine, it's orders of magnitude.

  • You just can't save everything.
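
The arithmetic behind those numbers is straightforward; the 8-bit, 6000-bin, and 310-bin figures follow the talk:

```python
# Full-record data rate at the nominal PRF.
prf_hz = 100_000   # 100 kHz pulse repetition frequency
bins_full = 6000   # bins to cover outgoing pulse to ground and back
bits_per_bin = 8   # 8-bit samples (newer systems use 16)

print(prf_hz * bins_full * bits_per_bin / 1e9, "Gbit/s")  # -> 4.8, ~5

# The hyperspectral comparison: 640 x 480 pixels at 100 lines/s is about
# the same sample rate as the LiDAR saving 310 bins per pulse.
print(640 * 480 * 100, "vs", prf_hz * 310, "samples/s")
```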

  • So there's a solution to that,

  • which we're going to talk about: multiple segments.

  • We don't save all the data.

  • The challenges are, you have to set a threshold.

  • If you set that threshold too high, you'll miss stuff.

  • If you set it too low, things like haze in the atmosphere

  • could trigger the LiDAR.

  • A lot of times they'll limit how many bins you can save,

  • so you might burn up that entire space

  • and not even get close to the ground.

  • So, this is just kind of an example

  • of how the multiple segment works.

  • This is a simulated waveform over here.

  • Maybe we set a threshold at 50 DNs.

  • You can see anything that's above,

  • would definitely be triggered as a target.

  • We might also buffer it a little bit,

  • so that's where these green lines are.

  • But you can see here, we've totally missed this

  • low signal peak,

  • and when we save out the waveform data,

  • it's just gone and we never knew it existed.
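
A toy version of that multi-segment capture, with an assumed threshold and buffer, makes clear why the low peak disappears: bins that never cross the threshold are simply never written out.

```python
import numpy as np

def capture_segments(waveform, threshold=50.0, buffer_bins=5):
    # Keep only bins near above-threshold samples; everything else is
    # discarded on board and cannot be recovered from the product.
    keep = np.zeros(len(waveform), dtype=bool)
    for i in np.flatnonzero(waveform > threshold):
        keep[max(0, i - buffer_bins):i + buffer_bins + 1] = True
    # Convert the boolean mask into (start, stop) pairs of saved segments.
    edges = np.flatnonzero(np.diff(np.r_[0, keep.astype(int), 0]))
    return list(zip(edges[::2], edges[1::2]))
```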

  • This is just one more example where I moved that

  • center feature over a little bit.

  • You'll see this a lot of times in our waveforms where

  • you'd say, oh well, there must still be something here,

  • but unfortunately you've lost that data.

  • So, that's something you can't recover unless

  • we were to go and drop that threshold,

  • and try to recreate the data.

  • Another thing that you'll see is, it's digital data.

  • We have to sample it onto some time bin.

  • We'll actually do a digitization to one nanosecond.

  • Now, based on those range resolution type numbers,

  • that still applies here.

  • One nanosecond is about 15 centimeters.

  • We're saving the data in 15 centimeter range bins.

  • But some of the ranging target algorithms,

  • they can do better than that.

  • So, you can get higher absolute precision.

  • But, what happens is, if you have a wide outgoing pulse here,

  • with the raw simulation and the digitized version,

  • you can see it still looks pretty much like the original.

  • But with the newer systems, they're taking the pulse widths,

  • making them really short, because what that does is

  • give us more 3D structure resolution.

  • But you can see,

  • when you put this onto a one nanosecond grid,

  • now you start to get funny kind of triangles and flat tops,

  • and other weird artifacts.

  • Essentially, if you were just working with this raw data

  • by itself, you might run into errors.

  • So, a lot of the algorithms in processing,

  • will actually go and fit this with some sort of a shape

  • that maybe is on a higher resolution and can recreate

  • where the actual peaks are.
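
A sketch of that idea: digitize a short pulse onto the 1 ns grid, then fit a Gaussian model to recover the peak position with sub-bin precision (the pulse width, amplitude, and peak position here are made up for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, amp, center, sigma):
    return amp * np.exp(-0.5 * ((t - center) / sigma) ** 2)

t_bins = np.arange(0.0, 20.0)                      # 1 ns sample times
true_center = 9.3                                  # sub-bin peak position
samples = gaussian(t_bins, 100.0, true_center, 2.0 / 2.355)  # 2 ns FWHM

# The blocky 1 ns samples still constrain a smooth model well enough
# to recover the peak location to a fraction of a bin.
popt, _ = curve_fit(gaussian, t_bins, samples, p0=(80.0, 10.0, 1.0))
print(f"fitted center: {popt[1]:.2f} ns (truth: {true_center} ns)")
```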

  • And then you'll see, there's also noise.

  • Some of that could just be electronic noise in the system,

  • or in some cases, you know, remember,

  • even though the laser is at 1064 nanometers,

  • and they kind of try to hold the receiver

  • to only see that wavelength,

  • the sun is reflecting off of trees at 1064 nanometers.

  • That will actually cause sort of an overall bias

  • that might raise the signal up.

  • So, that might be something you want to look at,

  • like an offset, and do some relative scaling.
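
That correction can be as simple as estimating the offset from bins recorded before the first return and subtracting it; the length of the pre-trigger region here is an assumption:

```python
import numpy as np

def remove_baseline(waveform, n_pre_bins=20):
    # Median of the assumed pre-trigger bins estimates the combined
    # solar-background and electronic-noise offset.
    return waveform - np.median(waveform[:n_pre_bins])
```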

  • Just really quick on our product right now.

  • It's a series of binary files.

  • We have the outgoing pulses. We have the return waveforms.

  • We also provide geolocation information.

  • Essentially, I've gone and geolocated

  • what I think the first return is,

  • but also provided other information to be able

  • to transfer that to any other bin in the waveform.

  • We also provide some observation information.

  • That's viewing geometry, distances.

  • And then finally, we provide some ephemeris data

  • of the GPS and IMU.

  • The hope was even though

  • we're doing the geolocation ourselves,

  • if somebody really wanted to go back to the beginning,

  • and recreate it all, they have, hopefully,

  • all the information they need to do rigorous calculations.

  • And then, finally, there's also some QC files.

  • Essentially, they're just point clouds that were derived

  • from the waveform so that we know if it worked or not.

  • So, if there was some big bug in my processing code,

  • we would see it where it might not map things

  • on the ground correctly.
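
Reading arrays like these with numpy is typically a few lines; the file names, data type, and record sizes below are purely hypothetical placeholders, since the real product documents its own layout:

```python
import numpy as np

# Hypothetical layout for illustration only; substitute the record sizes
# and dtype documented with the actual product files.
N_PULSES, N_OUT_BINS, N_RET_BINS = 1_000_000, 100, 500

outgoing = np.fromfile("outgoing_pulses.bin", dtype=np.uint16)
outgoing = outgoing.reshape(N_PULSES, N_OUT_BINS)   # one row per pulse

returns = np.fromfile("return_waveforms.bin", dtype=np.uint16)
returns = returns.reshape(N_PULSES, N_RET_BINS)

one_waveform = returns[12345]   # one horizontal slice = one laser pulse
```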

  • And you can see, this is just what the waveform product

  • looks like.

  • So, you have your laser pulse number on the vertical.

  • You have your outgoing pulses.

  • So, this is its own data array.

  • You have return waveforms as their own data array.

  • If we just grab one horizontal slice out of here,

  • so this is just one laser pulse,

  • you can see you get a waveform with multiple peaks.

  • This is a good example of where,

  • if you were just looking at peaks,

  • you would say, oh well there's four peaks,

  • but, like my algorithm actually

  • can't do the leading edge on this guy,

  • so it'll say, well I know there's something here,

  • but I don't know how to geolocate it.

  • So, sorry.

  • The other thing that I want to use this slide for

  • is to say some of the power of the waveform is,

  • you can see how there's kind of these bumps going on

  • sometimes on the right.

  • So you have light interacting with the canopy,

  • and there's photons doing multiple scattering,

  • and they kind of get delayed a little bit.

  • And so, with waveform, you might actually be able

  • to take advantage of this, whereas all of that information

  • is just thrown away in the discrete LiDAR.

  • Going to skip over geolocation stuff.

  • You can see different targets make different shapes,

  • but it's not very straightforward.

  • It's not like, oh, well conifer trees are always going to

  • look like this.

  • So, even though this is a nice example,

  • the real world is never as nice.

  • But you can see, bare soil is pretty much a hard target.

  • The return waveform is going to look very much like

  • the outgoing pulse shape.

  • Different trees.

  • Deciduous trees might have more reflection off the top,

  • whereas a conifer with its cone shape,

  • you're going to have more photons coming later

  • from lower levels in the tree.

  • In some cases, like pine plantation,

  • where they've cut down several of the trees,

  • you'll have a beam that'll hit part of a tree,

  • but then it also hits the ground.

  • So you can see a strong ground return, but also some

  • vegetation.

  • Just once again, the power of waveform.

  • Here are four different plots of what might only come out as

  • a single return in the discrete data,

  • but you can see the shapes are very different

  • from each other.

  • The hope is that, with waveform LiDAR,

  • this information can be extracted.

  • I'm just going to show you one quick example

  • of what you might do with LiDAR data.

  • I'm not going to explain this too much other than to say,

  • here's a raw waveform.

  • I've smoothed it out.

  • There are algorithms called watershed segmentation.

  • Effectively, that's looking for the peaks,

  • and then kind of separating them into different objects.

  • So, you can imagine, if you flipped this upside down

  • and you filled it with water,

  • that explains kind of the watershed concept of the

  • different sections would be different watersheds.

  • Then what I've done is, taking the peak of the first return,

  • I just calculated the rise time.

  • So, from the left to the peak.

  • Fall time: peak to maybe some fraction of the energy,

  • which could be where it ends.
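
A simplified sketch of that rise/fall metric (the smoothing amount, peak threshold, and end-of-pulse fraction are illustrative choices, not the values used in the talk):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import find_peaks

def rise_fall_times(waveform, smooth_sigma=1.5, end_fraction=0.1):
    # Smooth, take the first peak, then measure rise (left edge to peak)
    # and fall (peak to where the signal decays to end_fraction of peak).
    w = gaussian_filter1d(np.asarray(waveform, dtype=float), smooth_sigma)
    peaks, _ = find_peaks(w, height=0.2 * w.max())
    if len(peaks) == 0:
        return None
    p = peaks[0]
    level = end_fraction * w[p]
    left = p
    while left > 0 and w[left - 1] >= level:
        left -= 1
    right = p
    while right < len(w) - 1 and w[right + 1] >= level:
        right += 1
    return p - left, right - p   # rise and fall times, in bins
```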

  • And now, I've gone and done that for several laser pulses,

  • and colorized a point cloud based on that fall time.

  • So, in this case, blue and purple is going to be

  • a very short fall time,

  • so you can see the bare earth and ground come out blue.

  • Some of the pine plantation tends to have more of that

  • structure as a function of time,

  • so it'll come out oranges and reds.

  • One of the challenges here is,

  • on this map, you might say oh I can see the pine plantation,

  • this is an easy land cover classification,

  • but you can see red speckled throughout

  • what are probably deciduous oak trees here.

  • So, maybe you have more yellows and reds here,

  • but it might not be as straightforward

  • to just make a land cover map.

  • It's just something to think about, of what you might be able

  • to do with waveform.

  • And then, just finally, what a lot of the universities

  • that are working with waveform,

  • they tend to do this thing called deconvolution.

  • So, the idea is, once again, you have that outgoing pulse,

  • it blurs the data, and essentially, deconvolution,

  • they're just trying to sharpen that up and see what

  • the underlying structure might really look like.

  • So, this is just a basic example of one algorithm

  • called Richardson-Lucy.

  • You can see the raw waveform looks like this.

  • As you start to deconvolve, it actually turns

  • more into a Gaussian shape.

  • And then here now, you kind of see three features.

  • As you keep going, it says, well, there's really

  • two objects here, and then there's two over here.
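
A textbook 1D Richardson-Lucy iteration looks like the following (a generic sketch, not the specific implementations compared in the figure); each pass sharpens the estimate, and over-iterating is exactly how the unrealistic artifacts mentioned next can appear:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy_1d(observed, psf, iterations=20, eps=1e-12):
    # psf is the outgoing-pulse shape that blurred the scene (normalized).
    psf = psf / psf.sum()
    estimate = np.full(observed.shape, observed.mean(), dtype=float)
    for _ in range(iterations):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / (reblurred + eps)           # correction factor
        estimate *= fftconvolve(ratio, psf[::-1], mode="same")
    return estimate
```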

  • Now, one of the challenges with this is,

  • is any of this real?

  • A lot of times people might end up

  • doing an intensity threshold.

  • So, in some cases, I think I ran this kind of like

  • the plot you just saw with the fall time,

  • and sometimes, like with these points,

  • you'll end up getting noise above and below the ground.

  • When you look at that you say,

  • well this isn't even realistic.

  • It's just an artifact of this algorithm.

  • It's kind of like buyer beware with some of these algorithms

  • that if you don't totally know what they're doing,

  • and you overprocess, it might not be realistic.

  • There's a lot of research going on with simulations

  • and doing ray tracing in the 3D CAD world

  • to sort of understand if this is real or not.

  • And then finally, this is an example

  • from Tan Zhou at Texas A&M.

  • He's just done some different processing levels

  • and presented this nice figure.

  • So, you have your discrete return up top.

  • If you were just to take those full waveforms

  • and put them in a 3D scene,

  • they'd be very blurry and confusing.

  • But he's analyzed them through just fitting Gaussians

  • to that raw waveform,

  • or running two different deconvolution approaches.

  • Really, the hope with the deconvolution is

  • if those objects that we saw in the previous slide were real

  • you might be able to get more of a densified point cloud

  • than you could with the discrete data,

  • because hopefully the waveform is picking up

  • some of those objects that were lost.

  • And with that, I will leave it to questions.

  • [audience applauding]

  • (gentle music)
