
Data Management Plan

DMP Template v2.0.1 (2015-01-01)

Please provide the following information, and submit to the NOAA DM Plan Repository.

Reference to Master DM Plan (if applicable)

As stated in Section IV, Requirement 1.3, DM Plans may be hierarchical. If this DM Plan inherits provisions from a higher-level DM Plan already submitted to the Repository, then this more-specific Plan only needs to provide information that differs from what was provided in the Master DM Plan.

URL of higher-level DM Plan (if any) as submitted to DM Plan Repository:
Always left blank

1. General Description of Data to be Managed

1.1. Name of the Data, data collection Project, or data-producing Program:
2017 WA DNR Lidar: Willapa Doty, WA
1.2. Summary description of the data:

No metadata record was provided with the data. This record is populated with information from the GeoTerra, Inc. technical report downloaded from the Washington Dept. of Natural Resources Washington Lidar Portal.

GeoTerra, Inc. was selected by the Washington Department of Natural Resources (DNR) to provide LiDAR remote sensing data, including LAS files of the classified LiDAR points and derivative products, for an approximately 117.9 square mile area defined by the provided boundary. Airborne LiDAR mapping technology provides 3D information for the surface of the Earth, including ground information, vegetation characteristics, and man-made features. LiDAR for this project was acquired on January 27 and 28, 2017.

In addition to the lidar point data, bare earth Digital Elevation Models (DEMs) created from the lidar points are also available. These data are available for custom download at the link provided in the URL section of this metadata record.

Taken From: Item Identification | Abstract
Notes: Only a maximum of 4000 characters will be included.
1.3. Is this a one-time data collection, or an ongoing series of measurements?
One-time data collection
Taken From: Extents / Time Frames | Time Frame Type
Notes: Data collection is considered ongoing if a time frame of type "Continuous" exists.
1.4. Actual or planned temporal coverage of the data:
2017-01-27 to 2017-01-28
Taken From: Extents | Time Frame - Start, Time Frame - End
Notes: All time frames from all extent groups are included.
1.5. Actual or planned geographic coverage of the data:
W: -123.557322, E: -123.163595, N: 46.82169, S: 46.621944
Taken From: Extents | Geographic Area Bounds, Geographic Area Description
Notes: All geographic areas from all extent groups are included.
1.6. Type(s) of data:
(e.g., digital numeric data, imagery, photographs, video, audio, database, tabular data, etc.)
Model (digital)
1.7. Data collection method(s):
(e.g., satellite, airplane, unmanned aerial system, radar, weather station, moored buoy, research vessel, autonomous underwater vehicle, animal tagging, manual surveys, enforcement activities, numerical model, etc.)
No information found
1.8. If data are from a NOAA Observing System of Record, indicate name of system:
Always left blank due to field exemption
1.8.1. If data are from another observing system, please specify:
Always left blank due to field exemption

2. Point of Contact for this Data Management Plan (author or maintainer)

2.1. Name:
NOAA Office for Coastal Management (NOAA/OCM)
Taken From: Support Roles (Metadata Contact) | Person
Notes: The name of the Person of the most recent Support Role of type "Metadata Contact" is used. The support role must be in effect.
2.2. Title:
Metadata Contact
Always listed as "Metadata Contact"
2.3. Affiliation or facility:
NOAA Office for Coastal Management (NOAA/OCM)
Taken From: Support Roles (Metadata Contact) | Organization
Notes: The name of the Organization of the most recent Support Role of type "Metadata Contact" is used. This field is required if applicable.
2.4. E-mail address:
coastal.info@noaa.gov
Notes: The email address is taken from the address listed for the Person assigned as the Metadata Contact in Support Roles.
2.5. Phone number:
(843) 740-1202
Notes: The phone number is taken from the number listed for the Person assigned as the Metadata Contact in Support Roles. If the phone number is missing or incorrect, please contact your Librarian to update the Person record.

3. Responsible Party for Data Management

Program Managers, or their designee, shall be responsible for assuring the proper management of the data produced by their Program. Please indicate the responsible party below.

3.1. Name:
No information found
Taken From: Support Roles (Data Steward) | Person
Notes: The name of the Person of the most recent Support Role of type "Data Steward" is used. The support role must be in effect.
3.2. Position Title:
Data Steward
Always listed as "Data Steward"

4. Resources

Programs must identify resources within their own budget for managing the data they produce.

4.1. Have resources for management of these data been identified?
Yes
4.2. Approximate percentage of the budget for these data devoted to data management (specify percentage or "unknown"):
Unknown

5. Data Lineage and Quality

NOAA has issued Information Quality Guidelines for ensuring and maximizing the quality, objectivity, utility, and integrity of information which it disseminates.

5.1. Processing workflow of the data from collection or acquisition to making it publicly accessible
(describe or provide URL of description):

Lineage Statement:
The NOAA Office for Coastal Management (OCM) downloaded the laz files from the Washington Lidar Portal.

Process Steps:

  • Flights were planned to acquire LiDAR data within the provided boundary, totaling approximately 118 square miles. The flight plan was designed with a minimum of 50% overlap in swath footprint to minimize laser shadowing and gaps. Utilizing this flight plan in conjunction with flying in opposing directions allowed GeoTerra to ensure the final point density across the project. Flight planning was performed using Optech Flight Management System (FMS) software to calculate optimum parameters in order to meet project requirements and accommodate terrain variations. The Optech Galaxy sensor produces a pulse rate range of 35 - 550 kHz and can record up to 8 range measurements per laser pulse emitted. PulseTRAK and SwathTRAK technology were employed, allowing the sensor to maintain regular point distribution and constant-width flight lines despite changes in terrain.
  • During the aerial LiDAR survey, the Airborne GNSS (AGNSS) technique was utilized to obtain X, Y, Z coordinates of the laser during acquisition. The data collected during the two flights (27 and 28 January 2017) were post-processed into a Smoothed Best Estimate of Trajectory (SBET) binary file of the laser trajectory. Once the SBET had been created, it was used to geo-reference the laser point cloud during the mapping process. The LiDAR data were acquired utilizing an Optech Galaxy sensor with integrated Applanix POS AV GNSS/IMU systems. During the flight, the receiver on board the aircraft logged GNSS data at a 1 Hz interval and IMU data at a 200 Hz interval. After the flight, the GNSS and IMU data were post-processed using NovAtel's Waypoint Products Group software package, Inertial Explorer Version 8.70.3114. Three Plate Boundary Observatory (PBO) Continuously Operating Reference Stations (CORS) (P415, P417 and P430) located within approximately 30 km of the project area were used as ground control stations during the flights and were held at their surveyed NAD83(2011)(epoch 2010.0) positions relative to the ground control points (GCP) previously established during the project. The CORS and their positions are as follows: P415: 46 39 21.55257, -123 43 47.45466, -15.121 m (ellipsoidal height); P417: 46 34 29.04177, -123 17 52.48619, 102.786 m (ellipsoidal height); P430: 47 00 13.82430, -123 26 10.33804, -4.108 m (ellipsoidal height). For the 27-Jan-2017 flight all three CORS were used, and for the 28-Jan-2017 flight P417 and P430 were used. Lever arm offsets between the IMU and the L1 phase center of the aircraft antenna were computed within Inertial Explorer for the flight mission and then combined with the fixed lever arm from the IMU to the mirror, which was held at the internal Optech-provided values of x = -0.051, y = -0.153, z = 0.003 meters (where positive x = right, positive y = forward, positive z = up). This resulted in a precise trajectory of the laser that was output as an NAD83(2011)(Epoch 2010.0) SBET file with data points every 1/200 of a second.
  • Raw range data from the sensor was decoded using Optech's LMS software. Instrument corrections were then applied to the laser ranges and scan angles. Afterwards, the range files were split into the separate flight lines. The laser point computation used the results of the decoding, description of the instrument, and locations of the aircraft (from the SBET files) as inputs and calculated the location of each point for every laser pulse emitted from the sensor.
  • Relative and absolute adjustment of all strips was accomplished using Optech's LMS and TerraMatch software. Optech's LMS software performed automated extraction of planar surfaces from the point cloud according to parameters specified for this project. Tie plane determinations established the correspondence between planes in overlapping flight lines. All plane centers of the lines that formed a block were organized into a gridded matrix. Planes from overlapping flight lines, co-located to within an acceptable tolerance, were then tested for spatial accuracy. A set of accurately calculated tie planes was selected for self-calibration. Selection criteria included variables such as size and shape of the plane, the number of laser points, slope of the plane, orientation of the plane with respect to flight direction, location of the plane within the flight line, and the fitting error. These criteria affect the overall correction, as they determine the geometry of the adjustment. Self-calibration parameters were then calculated. After these parameters were determined, they were used to re-calculate the laser point locations (x, y, z). The planar surfaces were then re-calculated for a final adjustment. Afterward, the planes were analyzed to assess the internal fit of the data block as a whole. For each tie plane, the mean values were computed for each flight line that overlapped the tie plane, and mean values of the point-to-plane distances were plotted over scan angle. Additionally, the flight mission was further reviewed and adjusted in TerraMatch using a tie line approach. This method allows adjustments in areas where planes are not easily determined. The process began as the software measured the differences between lines (observations) in overlapping strips. These observed differences were translated into correction values for the system orientation: easting, northing, elevation, heading, roll, pitch and mirror scale. After a tight relative fit was achieved, an absolute vertical offset was calculated using surveyed control points. The algorithm computes an average value of the height difference over all control points by comparing each control point to the laser points within a specified radius around it (a minimal sketch of this computation is given after this list). During absolute adjustment, the data were shifted by +0.201 ft. Point 16-357-007 was withheld from the statistics because it was collected on top of a concrete block, but it was used to visually assess the horizontal fit. LiDAR QC points were obtained using post-processed kinematic GNSS data from a moving vehicle along selected roads within the project area boundary. The rover (vehicle) was processed against one of 13 temporary base stations located throughout the survey sites. These stations were positioned by the National Geodetic Survey (NGS) Online Positioning User Service (OPUS) with output in NAD83(2011)(Epoch 2010.0). The post-processed kinematic data relative to the temporary base stations were then filtered by the following criteria: fixed-ambiguity positions only, 3D quality better than 0.2 feet, and no two consecutive points spaced closer than 50 feet horizontally. This resulted in 61,392 usable points for all three phases of the project, which were used to QC the vertical fit of the LiDAR data. Of the 8,151 points used in the statistical comparison, only 21 were outside the -0.5 ft to 0.5 ft range, meaning 99.9% of points fit within project specification.
  • Once the point cloud adjustment was achieved with the desired relative and absolute accuracy, all strips in LAS format were brought into classification software. Rigorous selection algorithms built within TerraScan were used to automatically classify the data. To ensure accurate ground classification, various parameters were defined. Data from the edges of the strips were omitted during the initial ground classification to maintain quality, and grounding was initiated at low seed points and gradually increased. A tailored approach was formulated for different areas within the project, with various specifications used to determine how aggressive the automated ground classification algorithm should be. In relatively flat or urban areas, a more tempered approach was used so as not to include small buildings and urban features. In the more rural areas, a more aggressive grounding approach was used to better capture steep slopes and sharp natural features that might otherwise be ignored as ground features. Once the ground surface was established, points above the ground were extracted into separate classes including vegetation, structures and water. Significant buildings and structures were auto-extracted by searching above-ground classes for planar features. QC procedures were implemented in LP360 and TerraScan to manually check and correct any remaining misclassifications. Several routines were implemented to identify bird strikes and other high-noise points as well as overlap points. The routines employed were: isolated points, in which points with few neighbors within a defined 3D search radius were classified as class 18 high noise (a minimal sketch of this routine is given after this list); a height filter, in which, after the ground surface was created, a height-above-ground threshold was determined and points beyond it were removed; manual checks using automatic and semi-automatic methods (subtracting the ground surface from a first-return raster highlights areas to check visually for any outstanding points), with low points and noisy ground points also found using several similar routines; classifying points which are lower than others in their immediate neighborhood; and excluding from the ground surface points whose triangles do not meet the triangle edge length criteria during ground triangulation, which ensures that some noisy points are excluded from the ground surface. Additionally, in the effort to maintain the highest quality ground representation, the data went through a process of identifying and excluding data on the outer edge of flight swaths that did not meet GeoTerra's quality standard. Due to the nature of an oscillating mirror scanner, the data farthest from nadir is somewhat disrupted, resulting in less accurate point returns; these data are not utilized in the representation of the terrain surface. The least accurate data from the outer edge were extracted to class 12-Overlap. All the remaining data went through GeoTerra's standard classification process of defining ground and above-ground features. Once ground points were identified and classified in the middle part of the flight line, a quality base from neighboring flight lines was created that could be used to compare the class 12-Overlap data against the quality ground returns from the nadir collection. If data from class 12-Overlap were within a tight range of height above and below the nadir ground plane, they were reclassified from 12-Overlap to 02-Ground. If the data were outside of that range, they were not considered to have met the standard of quality needed for the ground surface and were left in the 12-Overlap class. These data can be left in the dataset for later use as supplemental reference information, but should not be considered quality information from which to take measurements or conduct analysis.
  • The final dataset was cut into delivery tiles. Tiles were created according to the contractual division of USGS quadrangles. Data within a 100 ft buffered boundary were reviewed for classification. Cross strips were left in the dataset as class 00 and were not used in other classification determinations or any LiDAR derivative products.
  • 2022-03-03 00:00:00 - The NOAA Office for Coastal Management (OCM) downloaded this data set from the Washington Lidar Portal. The total number of files downloaded and processed was 289. No metadata record was provided with the data. This record is populated with information from the GeoTerra, Inc. technical report downloaded from the Washington Dept. of Natural Resources Washington Lidar Portal. The data were in Washington State Plane South (NAD83 2011), US survey feet coordinates and NAVD88 (Geoid12A) elevations in feet. From the provided report, the data were classified as: 1 - Unclassified, 2 - Ground, 3 - Low Vegetation (1.5 - 5 ft), 4 - Medium Vegetation (5 - 10 ft), 5 - High Vegetation (> 10 ft), 6 - Structures, 7 - Low Noise, 9 - Water, 12 - Overlap, 17 - Bridge Decks, 18 - High Noise. OCM processed all classifications of points to the Digital Coast Data Access Viewer (DAV). Classes available in the DAV are: 1, 2, 3, 4, 5, 6, 7, 9, 12, 17, 18. OCM performed the following processing on the data for Digital Coast storage and provisioning purposes: 1. An internal OCM script was run to check the number of points by classification and by flight ID, and the GPS and intensity ranges. 2. Internal OCM scripts were run on the laz files to convert from orthometric (NAVD88) elevations to ellipsoid elevations using the Geoid12A model, to convert from Washington State Plane South (NAD83 2011), US survey feet coordinates to geographic coordinates, to convert elevations from feet to meters, to assign the geokeys, to sort the data by GPS time, and to zip the data to the database and to the Amazon S3 bucket (a sketch of this type of conversion is given after this list).
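
The absolute vertical adjustment described in the strip adjustment step above can be illustrated with a short sketch. This is not GeoTerra's implementation; it only assumes the LiDAR points and surveyed control points are available as NumPy arrays, and the search radius is an illustrative placeholder rather than a value taken from the technical report.

    import numpy as np
    from scipy.spatial import cKDTree

    def vertical_offset(lidar_xyz, control_xyz, radius=3.0):
        """Mean control-minus-LiDAR height difference within a horizontal search radius."""
        tree = cKDTree(lidar_xyz[:, :2])                      # index LiDAR points by x, y
        diffs = []
        for cx, cy, cz in control_xyz:
            idx = tree.query_ball_point((cx, cy), r=radius)   # LiDAR points near this control point
            if idx:                                           # skip control points with no nearby returns
                diffs.append(cz - lidar_xyz[idx, 2].mean())
        return float(np.mean(diffs))

    # Applying the computed shift (the report states +0.201 ft for this project):
    # lidar_xyz[:, 2] += vertical_offset(lidar_xyz, control_xyz)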
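
The "isolated points" routine named in the classification step above (points with few neighbors within a defined 3D search radius flagged as class 18, high noise) might look like the following sketch. The radius and neighbor threshold are illustrative assumptions, not values from the GeoTerra report.

    import numpy as np
    from scipy.spatial import cKDTree

    HIGH_NOISE = 18  # high-noise class used in this data set

    def isolated_point_mask(xyz, radius=5.0, min_neighbors=3):
        """Boolean mask of points with fewer than min_neighbors within a 3D radius."""
        tree = cKDTree(xyz)
        neighbors = tree.query_ball_point(xyz, r=radius)      # each result includes the point itself
        counts = np.array([len(n) - 1 for n in neighbors])
        return counts < min_neighbors

    # Reclassifying the flagged points, assuming a parallel array of class codes:
    # classification[isolated_point_mask(xyz)] = HIGH_NOISE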
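
OCM's conversion scripts are internal, but the reprojection and vertical-datum conversion described in the final process step can be sketched as a PDAL pipeline. The EPSG codes below are my reading of the stated source and target datums (NAD83(2011) / Washington South in US survey feet with NAVD88 heights in feet, to NAD83(2011) geographic coordinates with ellipsoid heights in meters) and should be verified before use; the vertical conversion also requires the GEOID12A grid to be available to PROJ.

    import json
    import pdal

    pipeline = [
        "input.laz",                                 # one of the 289 downloaded tiles
        {
            "type": "filters.reprojection",
            "in_srs": "EPSG:6597+6360",              # assumed: WA South State Plane (ftUS) + NAVD88 height (ftUS)
            "out_srs": "EPSG:6319"                   # assumed: NAD83(2011) geographic 3D, ellipsoid heights (m)
        },
        {
            "type": "filters.sort",
            "dimension": "GpsTime"                   # sort points by GPS time
        },
        {
            "type": "writers.las",
            "filename": "output.laz",
            "compression": "laszip",                 # write compressed LAZ
            "a_srs": "EPSG:6319"                     # assign geokeys for the output CRS
        }
    ]

    pdal.Pipeline(json.dumps(pipeline)).execute()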
5.1.1. If data at different stages of the workflow, or products derived from these data, are subject to a separate data management plan, provide reference to other plan:
Always left blank
5.2. Quality control procedures employed
(describe or provide URL of description):
No information found

6. Data Documentation

The EDMC Data Documentation Procedural Directive requires that NOAA data be well documented, specifies the use of ISO 19115 and related standards for documentation of new data, and provides links to resources and tools for metadata creation and validation.

6.1. Does metadata comply with EDMC Data Documentation directive?
No
Notes: All required DMP fields must be populated and valid to comply with the directive.
6.1.1. If metadata are non-existent or non-compliant, please explain:

Missing/invalid information:

  • 1.7. Data collection method(s)
  • 3.1. Responsible Party for Data Management
  • 5.2. Quality control procedures employed
  • 7.1.1. If data are not available or has limitations, has a Waiver been filed?
  • 7.4. Approximate delay between data collection and dissemination
  • 8.3. Approximate delay between data collection and submission to an archive facility
Notes: Required DMP fields that are not populated or invalid are listed here.
6.2. Name of organization or facility providing metadata hosting:
NMFS Office of Science and Technology
Always listed as "NMFS Office of Science and Technology"
6.2.1. If service is needed for metadata hosting, please indicate:
Always left blank
6.3. URL of metadata folder or data catalog, if known:
Always listed as the URL to the InPort Data Set record
6.4. Process for producing and maintaining metadata
(describe or provide URL of description):
Metadata produced and maintained in accordance with the NOAA Data Documentation Procedural Directive: https://nosc.noaa.gov/EDMC/DAARWG/docs/EDMC_PD-Data_Documentation_v1.pdf
Always listed with the above statement

7. Data Access

NAO 212-15 states that access to environmental data may only be restricted when distribution is explicitly limited by law, regulation, policy (such as those applicable to personally identifiable information or protected critical infrastructure information or proprietary trade information) or by security requirements. The EDMC Data Access Procedural Directive contains specific guidance, recommends the use of open-standard, interoperable, non-proprietary web services, provides information about resources and tools to enable data access, and includes a Waiver to be submitted to justify any approach other than full, unrestricted public access.

7.1. Do these data comply with the Data Access directive?
Yes
7.1.1. If the data are not to be made available to the public at all, or with limitations, has a Waiver (Appendix A of Data Access directive) been filed?
No information found
7.1.2. If there are limitations to public data access, describe how data are protected from unauthorized access or disclosure:

None

7.2. Name of organization or facility providing data access:
NOAA Office for Coastal Management (NOAA/OCM)
Taken From: Support Roles (Distributor) | Organization
Notes: The name of the Organization of the most recent Support Role of type "Distributor" is used. The support role must be in effect. This information is not required if an approved access waiver exists for this data.
7.2.1. If data hosting service is needed, please indicate:
Taken From: Data Management | If data hosting service is needed, please indicate
Notes: This field is required if a Distributor has not been specified.
7.2.2. URL of data access service, if known:
Taken From: Distribution Info | Download URL
Notes: All URLs listed in the Distribution Info section will be included. This field is required if applicable.
7.3. Data access methods or services offered:

Data is available online for bulk and custom downloads.

7.4. Approximate delay between data collection and dissemination:
No information found
7.4.1. If delay is longer than latency of automated processing, indicate under what authority data access is delayed:

8. Data Preservation and Protection

The NOAA Procedure for Scientific Records Appraisal and Archive Approval describes how to identify, appraise and decide what scientific records are to be preserved in a NOAA archive.

8.1. Actual or planned long-term data archive location:
(Specify NCEI-MD, NCEI-CO, NCEI-NC, NCEI-MS, World Data Center (WDC) facility, Other, To Be Determined, Unable to Archive, or No Archiving Intended)
NCEI-CO
8.1.1. If World Data Center or Other, specify:
Taken From: Data Management | Actual or planned long-term data archive location
Notes: This field is required if archive location is World Data Center or Other.
8.1.2. If To Be Determined, Unable to Archive or No Archiving Intended, explain:
Taken From: Data Management | If To Be Determined, Unable to Archive or No Archiving Intended, explain
Notes: This field is required if archive location is To Be Determined, Unable to Archive, or No Archiving Intended.
8.2. Data storage facility prior to being sent to an archive facility (if any):
Office for Coastal Management - Charleston, SC
Taken From: Physical Location | Organization, City, State, Location Description
Notes: Physical Location Organization, City and State are required, or a Location Description is required.
8.3. Approximate delay between data collection and submission to an archive facility:
No information found
8.4. How will the data be protected from accidental or malicious modification or deletion prior to receipt by the archive?
Discuss data back-up, disaster recovery/contingency planning, and off-site data storage relevant to the data collection

Data is backed up to tape and to cloud storage.

9. Additional Line Office or Staff Office Questions

Line and Staff Offices may extend this template by inserting additional questions in this section.

Always left blank