Data Management Plan
GUID: gov.noaa.nmfs.inport:55359 | Published / External
DMP Template v2.0.1 (2015-01-01)
Please provide the following information, and submit to the NOAA DM Plan Repository.
Reference to Master DM Plan (if applicable)
As stated in Section IV, Requirement 1.3, DM Plans may be hierarchical. If this DM Plan inherits provisions from a higher-level DM Plan already submitted to the Repository, then this more-specific Plan only needs to provide information that differs from what was provided in the Master DM Plan.
1. General Description of Data to be Managed
To support the modeling of storm-induced flooding, the USGS Coastal National Elevation Database (CoNED) Applications Project has created an integrated 1-meter topobathymetric digital elevation model (TBDEM) for the Southern California Coast and Channel Islands. The Channel Islands are a chain of eight islands where years of isolation have created unique fauna, flora, and archeological resources. The archipelago extends for 160 miles (257 km) between San Miguel Island in the north and San Clemente Island in the south. Five of the islands are part of Channel Islands National Park (Anacapa, Santa Cruz, Santa Rosa, San Miguel, and Santa Barbara), and the waters surrounding these islands make up the Channel Islands National Marine Sanctuary. High-resolution coastal elevation data are required to identify flood, storm, and sea-level rise inundation hazard zones and to support other earth science applications, such as the development of sediment transport and storm surge models. The new TBDEM consists of the best available multi-source topographic and bathymetric elevation data for Southern California, including the Channel Islands onshore and offshore areas. The Southern California TBDEM integrates 49 different data sources, including topographic and bathymetric LiDAR point clouds, Interferometric Synthetic Aperture Radar (IFSAR), hydrographic surveys, single-beam acoustic surveys, and multi-beam acoustic surveys obtained from USGS, NOAA, California State University, Monterey Bay, and Los Angeles County. The topographic and bathymetric surveys were sorted and prioritized based on survey date, accuracy, spatial distribution, and point density to develop a model based on the best available elevation data.
Because bathymetric data are typically referenced to tidal datums (such as Mean High Water or Mean Low Water), all tidally referenced heights were transformed into the orthometric heights normally used for mapping elevation on land (based on the North American Vertical Datum of 1988, NAVD88). The spatial resolution is 1 meter, with coverage extending from the Mexican border to Point Conception and offshore to a depth of 2,847 meters. The overall temporal range of the input topography and bathymetry is 1930 to 2014: the topographic surveys are from 2005-2014, and the bathymetric surveys were acquired between 1930 and 2014. Some of the nearshore void zone (not covered by LiDAR or multibeam) was filled with NOS surveys from 1967 to 2013.
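The tidal-to-orthometric conversion described above can be illustrated with a minimal sketch. The project itself used NGS geoid models and NOAA's VDatum tool; the function and offset value below are hypothetical placeholders, not real datum separations.

```python
# Minimal sketch of shifting a tidally referenced height to NAVD88.
# The 0.8 m separation used below is a made-up illustration value;
# real separations come from NOAA's VDatum tool for a given location.

def tidal_to_navd88(height_tidal: float, datum_offset: float) -> float:
    """Shift a height referenced to a tidal datum (e.g., MLLW) to NAVD88.

    datum_offset is the local separation between the tidal datum surface
    and NAVD88 zero (positive when the tidal datum lies below NAVD88 zero).
    """
    return height_tidal - datum_offset

# A sounding of -12.3 m relative to MLLW, with a hypothetical
# MLLW-to-NAVD88 separation of 0.8 m at this site:
h_navd88 = tidal_to_navd88(-12.3, 0.8)
print(h_navd88)
```

The real transformation also accounts for horizontal position, since datum separations vary spatially; VDatum handles that lookup internally.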
Notes: Only a maximum of 4000 characters will be included.
Notes: Data collection is considered ongoing if a time frame of type "Continuous" exists.
Notes: All time frames from all extent groups are included.
Notes: All geographic areas from all extent groups are included.
(e.g., digital numeric data, imagery, photographs, video, audio, database, tabular data, etc.)
(e.g., satellite, airplane, unmanned aerial system, radar, weather station, moored buoy, research vessel, autonomous underwater vehicle, animal tagging, manual surveys, enforcement activities, numerical model, etc.)
2. Point of Contact for this Data Management Plan (author or maintainer)
Notes: The name of the Person of the most recent Support Role of type "Metadata Contact" is used. The support role must be in effect.
Notes: The name of the Organization of the most recent Support Role of type "Metadata Contact" is used. This field is required if applicable.
3. Responsible Party for Data Management
Program Managers, or their designee, shall be responsible for assuring the proper management of the data produced by their Program. Please indicate the responsible party below.
Notes: The name of the Person of the most recent Support Role of type "Data Steward" is used. The support role must be in effect.
Programs must identify resources within their own budget for managing the data they produce.
5. Data Lineage and Quality
NOAA has issued Information Quality Guidelines for ensuring and maximizing the quality, objectivity, utility, and integrity of information which it disseminates.
(describe or provide URL of description):
- 2016-02-01 00:00:00 - The principal methodology for developing the integrated topobathymetric elevation model can be organized into three main components. The "topography component" consists of the land-based elevation data, which is primarily composed of high-resolution LiDAR data. The topographic source data include LiDAR data from different sensors (topographic, bathymetric) with distinct spectral wavelengths (NIR, 1064 nm; green, 532 nm). The "bathymetry component" consists of hydrographic sounding (acoustic) data collected using boats, rather than bathymetry acquired from LiDAR. The most common forms of bathymetry used include multi-beam, single-beam, and swath. The final component, "integration", encompasses the assimilation of the topographic and bathymetric data along the near-shore based on a predefined set of priorities. The land/water interface (+1 m to -1.5 m) is the most critical area, and green-laser systems such as the Experimental Advanced Airborne Research LiDAR (EAARL-B) and the Coastal Zone Mapping and Imaging LiDAR (CZMIL), which cross the near-shore interface, are valuable in developing a seamless transition. The end product from the topography and bathymetry components is a raster with associated spatial masks and metadata that can be passed to the integration component for final model incorporation. Topo/Bathy Creation Steps: Topography Processing Component: a) Quality-control check the vertical and horizontal datum and projection information of the input LiDAR source to ensure the data are referenced to NAVD88 and NAD83, UTM. If the source data are not NAVD88, transform the input LiDAR data to the NAVD88 reference frame using current National Geodetic Survey (NGS) geoid models and VDatum. Likewise, if required, convert the input source data to NAD83 and reproject to UTM. b) Check the classification of the topographic LiDAR data to verify the data are classified with the appropriate classes.
If the data have not been classified, then classify the raw point cloud data into non-ground (class 1), ground (class 2), and water (class 9) classes using LP360-Classify. c) Derive associated breaklines from the classified LiDAR to capture internal water bodies, such as lakes, ponds, and inland waterways. Inland waterways and water bodies will be hydro-flattened where no bathymetry is present. d) Extract the ground returns from the classified LiDAR data and randomly subset the points spatially into two point sets: 95 percent of the points for the "Actual Selected" set and the remaining 5 percent for the "Test Control" set. The "Actual Selected" points will be gridded in the terrain model along with associated breaklines and masks to generate the topographic surface, while the "Test Control" points will be used to compute the interpolation accuracy (Root Mean Square Error) from the derived surface. e) Generate the minimum convex hull boundary from the classified ground LiDAR points to create a mask that traces the perimeter of the exterior LiDAR points. The mask is then applied in the terrain to remove extraneous terrain artifacts outside the extent of the ground LiDAR points. f) Using a terrain model based on triangulated irregular networks (TINs), grid the "Actual Selected" ground points with the breaklines and the minimum convex hull boundary mask at a 1-meter spatial resolution using a natural neighbor interpolation algorithm. g) Compute the interpolation accuracy by comparing elevation values in the "Test Control" points to values extracted from the derived gridded surface; report the results in terms of Root Mean Square Error (RMSE).
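Steps d, f, and g above (the 95/5 split and the interpolation-accuracy check) can be sketched as follows. This is a minimal illustration on synthetic points: the project used natural neighbor interpolation, for which SciPy's linear TIN interpolator stands in here, and the terrain surface is invented.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(42)

# Hypothetical classified ground returns: (x, y) positions and elevations z.
n = 2000
xy = rng.uniform(0, 100, size=(n, 2))
z = np.sin(xy[:, 0] / 15) + 0.1 * xy[:, 1]  # synthetic smooth terrain

# Step d: randomly split the points 95% "Actual Selected" / 5% "Test Control".
idx = rng.permutation(n)
cut = int(0.95 * n)
sel, ctl = idx[:cut], idx[cut:]

# Step f (stand-in): grid the selected points with a TIN-based interpolator.
# The project used natural neighbor; SciPy's 'linear' TIN method is used
# here only for illustration.
z_pred = griddata(xy[sel], z[sel], xy[ctl], method="linear")

# Step g: interpolation accuracy as RMSE over the control points.
valid = ~np.isnan(z_pred)  # drop control points outside the TIN hull
rmse = np.sqrt(np.mean((z_pred[valid] - z[ctl][valid]) ** 2))
print(f"interpolation RMSE: {rmse:.4f} m")
```

Holding out a control set like this gives an accuracy estimate that is independent of the points used to build the surface, which is why the split precedes gridding.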
- 2016-03-01 00:00:00 - Bathymetry Processing Component: a) Quality-control check the vertical and horizontal datum and projection information of the input bathymetric source to ensure the data are referenced to NAVD88 and NAD83, UTM. If the source data are not NAVD88, transform the input bathymetric data to the NAVD88 reference frame using VDatum. Likewise, if required, convert the input source data to NAD83 and reproject to UTM. b) Prioritize and spatially sort the bathymetry based on date of acquisition, spatial distribution, accuracy, and point density to eliminate any outdated or erroneous points and to minimize interpolation artifacts. c) Randomly subset the bathymetric points spatially into two point sets: 95 percent of the points for the "Actual Selected" set and the remaining 5 percent for the "Test Control" set. The "Actual Selected" points will be gridded in the empirical Bayesian kriging model along with associated masks to generate the bathymetric surface, while the "Test Control" points will be used to compute the interpolation accuracy (Root Mean Square Error) from the derived surface. d) Spatially interpolate bathymetric single-beam, multi-beam, and hydrographic survey source data using an empirical Bayesian kriging gridding algorithm. This geostatistical interpolation method accounts for the error in estimating the underlying semivariogram (the variance structure of the data) through repeated simulations. e) Cross-validation: compare the predicted value in the geostatistical model to the actual observed value to assess the accuracy and effectiveness of the model parameters by removing each data location one at a time and predicting the associated data value. The results will be reported in terms of RMSE. f) Compute the interpolation accuracy by comparing elevation values in the "Test Control" points to values extracted from the derived gridded surface; report the results in terms of RMSE.
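The leave-one-out cross-validation in step e can be sketched as below. Empirical Bayesian kriging itself is not reproduced here; a simple inverse-distance-weighted predictor stands in for the geostatistical model, and the soundings are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bathymetric soundings: (x, y) positions and depths (m, NAVD88).
pts = rng.uniform(0, 50, size=(60, 2))
depth = -5.0 - 0.2 * pts[:, 0] + rng.normal(0, 0.05, 60)

def idw_predict(xy_train, z_train, xy_query, power=2.0):
    """Inverse-distance-weighted prediction -- a simple stand-in for the
    empirical Bayesian kriging model described in the text."""
    d = np.linalg.norm(xy_train - xy_query, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return np.sum(w * z_train) / np.sum(w)

# Step e: remove each sounding in turn, predict it from the rest,
# and report the prediction error as RMSE.
errors = []
for i in range(len(pts)):
    mask = np.arange(len(pts)) != i
    pred = idw_predict(pts[mask], depth[mask], pts[i])
    errors.append(pred - depth[i])
rmse = np.sqrt(np.mean(np.square(errors)))
print(f"cross-validation RMSE: {rmse:.3f} m")
```

Because every point is predicted without using its own value, this statistic reflects how well the model parameters generalize across the survey area.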
- 2016-04-04 00:00:00 - Mosaic Dataset Processing (Integration) Component: a) Determine the priority of input data based on project characteristics, including acquisition dates, cell size, retention of features, water surface treatment, visual inspection, and presence of artifacts. b) Develop an ArcGIS geodatabase (Mosaic Dataset) and spatial seamlines for each individual topographic (minimum convex hull boundary) and bathymetric raster layer included in the integrated elevation model. c) Generalize seamline edges to smooth transition boundaries between neighboring raster layers, and split complex raster datasets with isolated regions into individual unique raster groups. d) Develop an integrated shoreline transition zone from the best available topographic and bathymetric data to blend the topographic and bathymetric elevation sources. Where feasible, use the minimum convex hull boundary to create a buffer that logically masks input topography/bathymetry data. Then, through the use of TINs, interpolate the selected topographic and bathymetric points to gap-fill, if required, any near-shore holes in the bathymetric coverage. Topobathymetric LiDAR data sources such as the EAARL-B or CZMIL systems provide up-to-date, high-resolution data along the critical land/water interface within the inter-tidal zone. e) Prioritize and spatially sort the input topographic and bathymetric raster layers based on date of acquisition and accuracy to sequence the raster data in the integrated elevation model. f) Based on the prioritization, spatially mosaic the input raster data sources to create a seamless topobathymetric composite at a cell size of 1 meter using blending (spatial weighting). g) Perform a visual quality assurance (Q/A) assessment on the output composite to review the mosaic seams for artifacts. h) Generate spatially referenced metadata for each unique data source.
The spatially referenced metadata consist of a group of geospatial polygons that represent the spatial footprint of each data source used in the generation of the topobathymetric dataset. Each polygon is populated with attributes that describe the source data, such as resolution, acquisition date, source name, source organization, source contact, source project, source URL, and data type (topographic LiDAR, bathymetric LiDAR, multi-beam bathymetry, single-beam bathymetry, etc.).
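The prioritized mosaicking in steps e and f of the integration component can be sketched as follows. The layers here are tiny synthetic rasters, and a simple first-valid-cell rule stands in for the spatially weighted blending used in the actual model.

```python
import numpy as np

# Hypothetical 1 m raster layers covering the same grid, ordered from
# highest to lowest priority (step e); NaN marks cells with no data.
topo = np.full((4, 4), np.nan)
topo[:2, :] = 10.0    # e.g. recent LiDAR topography
bathy = np.full((4, 4), np.nan)
bathy[1:, :] = -3.0   # e.g. multi-beam bathymetry
fill = np.zeros((4, 4))  # e.g. legacy NOS surveys as lowest-priority gap fill

def priority_mosaic(layers):
    """Mosaic rasters by priority: each cell takes the first non-NaN value
    (step f, minus the spatial-weight blending used in the real model)."""
    out = np.full(layers[0].shape, np.nan)
    for layer in layers:
        gap = np.isnan(out)
        out[gap] = layer[gap]
    return out

dem = priority_mosaic([topo, bathy, fill])
print(dem)
```

Cells covered by the high-priority topography keep its values even where the bathymetry overlaps, which mirrors how the sorted raster sequence controls the composite.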
- 2019-01-18 00:00:00 - The NOAA Office for Coastal Management (OCM) received the Digital Elevation Model (DEM) file from USGS. The data were in UTM Zone 11 (NAD83) coordinates and NAVD88 elevations in meters. The bare earth raster files were at a 1 meter grid spacing. OCM performed the following processing on the data for Digital Coast storage and provisioning purposes: 1. Tiled the large DEM file into smaller files using Global Mapper. 2. Copied the files to https
(describe or provide URL of description):
6. Data Documentation
The EDMC Data Documentation Procedural Directive requires that NOAA data be well documented, specifies the use of ISO 19115 and related standards for documentation of new data, and provides links to resources and tools for metadata creation and validation.
- 1.7. Data collection method(s)
- 3.1. Responsible Party for Data Management
- 4.1. Have resources for management of these data been identified?
- 4.2. Approximate percentage of the budget for these data devoted to data management
- 5.2. Quality control procedures employed
- 7.1. Do these data comply with the Data Access directive?
- 7.1.1. If data are not available or have limitations, has a Waiver been filed?
- 7.1.2. If there are limitations to data access, describe how data are protected
- 7.4. Approximate delay between data collection and dissemination
- 8.1. Actual or planned long-term data archive location
- 8.3. Approximate delay between data collection and submission to an archive facility
- 8.4. How will the data be protected from accidental or malicious modification or deletion prior to receipt by the archive?
(describe or provide URL of description):
7. Data Access
NAO 212-15 states that access to environmental data may only be restricted when distribution is explicitly limited by law, regulation, policy (such as those applicable to personally identifiable information or protected critical infrastructure information or proprietary trade information) or by security requirements. The EDMC Data Access Procedural Directive contains specific guidance, recommends the use of open-standard, interoperable, non-proprietary web services, provides information about resources and tools to enable data access, and includes a Waiver to be submitted to justify any approach other than full, unrestricted public access.
Notes: The name of the Organization of the most recent Support Role of type "Distributor" is used. The support role must be in effect. This information is not required if an approved access waiver exists for this data.
Notes: This field is required if a Distributor has not been specified.
Notes: All URLs listed in the Distribution Info section will be included. This field is required if applicable.
Data are available online for bulk and custom downloads.
Notes: This field is required if applicable.
8. Data Preservation and Protection
The NOAA Procedure for Scientific Records Appraisal and Archive Approval describes how to identify, appraise and decide what scientific records are to be preserved in a NOAA archive.
(Specify NCEI-MD, NCEI-CO, NCEI-NC, NCEI-MS, World Data Center (WDC) facility, Other, To Be Determined, Unable to Archive, or No Archiving Intended)
Notes: This field is required if archive location is World Data Center or Other.
Notes: This field is required if archive location is To Be Determined, Unable to Archive, or No Archiving Intended.
Notes: Physical Location Organization, City and State are required, or a Location Description is required.
Discuss data back-up, disaster recovery/contingency planning, and off-site data storage relevant to the data collection
9. Additional Line Office or Staff Office Questions
Line and Staff Offices may extend this template by inserting additional questions in this section.