
Data Management Plan

DMP Template v2.0.1 (2015-01-01)

Please provide the following information, and submit to the NOAA DM Plan Repository.

Reference to Master DM Plan (if applicable)

As stated in Section IV, Requirement 1.3, DM Plans may be hierarchical. If this DM Plan inherits provisions from a higher-level DM Plan already submitted to the Repository, then this more-specific Plan only needs to provide information that differs from what was provided in the Master DM Plan.

URL of higher-level DM Plan (if any) as submitted to DM Plan Repository:
Always left blank

1. General Description of Data to be Managed

1.1. Name of the Data, data collection Project, or data-producing Program:
2004 Southwest Florida Water Management District Lidar: Pasco District
1.2. Summary description of the data:

This metadata record describes the ortho & lidar mapping of Pasco County, FL. The mapping consists of lidar data collected using a Leica ALS-40 Lidar Sensor, contour generation, and production of natural color orthophotography with a 30-cm GSD using imagery collected with a Leica ADS-40 Aerial Digital Camera.

Original contact information:

Contact Name: Steve Dicks

Contact Org: Southwest Florida Water Management District

Phone: 352-796-7211

Taken From: Item Identification | Abstract
Notes: Only a maximum of 4000 characters will be included.
1.3. Is this a one-time data collection, or an ongoing series of measurements?
One-time data collection
Taken From: Extents / Time Frames | Time Frame Type
Notes: Data collection is considered ongoing if a time frame of type "Continuous" exists.
1.4. Actual or planned temporal coverage of the data:
2004-01-23 to 2004-05-15
Taken From: Extents | Time Frame - Start, Time Frame - End
Notes: All time frames from all extent groups are included.
1.5. Actual or planned geographic coverage of the data:
W: -82.815369, E: -82.04915, N: 28.48311, S: 28.166952
Taken From: Extents | Geographic Area Bounds, Geographic Area Description
Notes: All geographic areas from all extent groups are included.
1.6. Type(s) of data:
(e.g., digital numeric data, imagery, photographs, video, audio, database, tabular data, etc.)
No information found
1.7. Data collection method(s):
(e.g., satellite, airplane, unmanned aerial system, radar, weather station, moored buoy, research vessel, autonomous underwater vehicle, animal tagging, manual surveys, enforcement activities, numerical model, etc.)
No information found
1.8. If data are from a NOAA Observing System of Record, indicate name of system:
Always left blank due to field exemption
1.8.1. If data are from another observing system, please specify:
Always left blank due to field exemption

2. Point of Contact for this Data Management Plan (author or maintainer)

2.1. Name:
NOAA Office for Coastal Management (NOAA/OCM)
Taken From: Support Roles (Metadata Contact) | Person
Notes: The name of the Person of the most recent Support Role of type "Metadata Contact" is used. The support role must be in effect.
2.2. Title:
Metadata Contact
Always listed as "Metadata Contact"
2.3. Affiliation or facility:
NOAA Office for Coastal Management (NOAA/OCM)
Taken From: Support Roles (Metadata Contact) | Organization
Notes: The name of the Organization of the most recent Support Role of type "Metadata Contact" is used. This field is required if applicable.
2.4. E-mail address:
coastal.info@noaa.gov
Notes: The email address is taken from the address listed for the Person assigned as the Metadata Contact in Support Roles.
2.5. Phone number:
(843) 740-1202
Notes: The phone number is taken from the number listed for the Person assigned as the Metadata Contact in Support Roles. If the phone number is missing or incorrect, please contact your Librarian to update the Person record.

3. Responsible Party for Data Management

Program Managers, or their designee, shall be responsible for assuring the proper management of the data produced by their Program. Please indicate the responsible party below.

3.1. Name:
No information found
Taken From: Support Roles (Data Steward) | Person
Notes: The name of the Person of the most recent Support Role of type "Data Steward" is used. The support role must be in effect.
3.2. Position Title:
Data Steward
Always listed as "Data Steward"

4. Resources

Programs must identify resources within their own budget for managing the data they produce.

4.1. Have resources for management of these data been identified?
No information found
4.2. Approximate percentage of the budget for these data devoted to data management (specify percentage or "unknown"):
No information found

5. Data Lineage and Quality

NOAA has issued Information Quality Guidelines for ensuring and maximizing the quality, objectivity, utility, and integrity of information which it disseminates.

5.1. Processing workflow of the data from collection or acquisition to making it publicly accessible
(describe or provide URL of description):

Process Steps:

  • 2004-09-30 00:00:00 - New ground control was established to control and orient the photography, and included only photo-identifiable features. The ground control network and airborne GPS data were integrated into a rigid network through the completion of a fully analytical bundle aerotriangulation adjustment. 1. The digital aerial photo data was ingested into the ISTAR processing system by uploading the data from portable hard drives. 2. The coverage of the imagery was checked for gaps and a directory tree structure for the project was established on one of the workstations. This project was then accessed by other workstations through the network. The criteria used to establish the directory structure and file naming conventions avoid confusion or errors due to inconsistencies in digital data. The project area was reviewed against the client-approved boundary. The technician verified that the datum and units of measurement for the supplied control were consistent with the project requirements. 3. The photogrammetric technician performed an automatic triangulation of the data using the ISTAR processing system. The aerotriangulation adjustment merged the airborne GPS, IMU, and ground control data into a project-wide network. 4. While ground control points (GCPs) were used, reliance on the GPS-/IMU-derived orientation parameters required significantly fewer GCPs than are typically used in aerotriangulation. 5. The adjustment was performed for each sortie and then multiple sorties were merged to produce a project-wide adjustment. 6. The aerotriangulation component of the ISTAR suite utilized the airborne GPS as a separate control source and held the IMU (Inertial Measurement Unit) parameters rigidly. 7. The accuracy of the final solution was verified by running the final adjustment, placing no constraints on any quality control points. The RMSE values for these points must fall within the tolerances above for the solution to be acceptable.
  • 2004-08-17 00:00:00 - EarthData has developed a unique method for processing lidar data to identify and remove elevation points falling on vegetation, buildings, and other aboveground structures. The algorithms for filtering data were utilized within EarthData's proprietary software and commercial software written by TerraSolid. This software suite of tools provides efficient processing for small- to large-scale projects and has been incorporated into ISO 9001 compliant production workflows. The following is a step-by-step breakdown of the process. 1. Using the lidar data set provided by EarthData, the technician performed calibrations on the data set. 2. Using the lidar data set provided by EarthData, the technician performed a visual inspection of the data to verify that the flight lines overlapped correctly. The technician also verified that there were no voids, and that the data covered the project limits. The technician then selected a series of areas from the data set and inspected them where adjacent flight lines overlapped. These overlapping areas were merged and a process which utilizes 3-D Analyst and EarthData's proprietary software was run to detect and color code the differences in elevation values and profiles. The technician reviewed these plots and located the areas that contained systematic errors or distortions that were introduced by the lidar sensor. 3. Systematic distortions highlighted in step 2 were removed and the data was re-inspected. Corrections and adjustments can involve the application of angular deflection or compensation for curvature of the ground surface that can be introduced by crossing from one type of land cover to another. 4. The lidar data for each flight line was trimmed in batch for the removal of the overlap areas between flight lines. The data was checked against a control network to ensure that vertical requirements were maintained. Conversion to the client-specified datum and projections was then completed. The lidar flight line data sets were then segmented into adjoining tiles for batch processing and data management. 5. The initial batch-processing run removed 95% of points falling on vegetation. The algorithm also removed the points that fell on the edge of hard features such as structures, elevated roadways, and bridges. 6. The operator interactively processed the data using lidar editing tools. During this final phase the operator generated a TIN based on a desired thematic layer to evaluate the automated classification performed in step 5. This allowed the operator to quickly re-classify points from one layer to another and recreate the TIN surface to see the effects of edits. Geo-referenced images were toggled on or off to aid the operator in identifying problem areas. The data was also examined with an automated profiling tool to aid the operator in the reclassification. 7. The data were separated into a bare-earth DEM. A grid-fill program was used to fill data voids caused by reflective objects such as buildings and vegetation. The final DEM was written to ASCII XYZ and LAS formats. 8. The reflective surface data were also delivered in ASCII XYZ and LAS formats. 9. Final TIN files were created and delivered.
  • 2005-04-15 00:00:00 - This process describes the method used to compile hydro-breaklines to support H&H modeling efforts. The hydro-breaklines produced for this project include only water features and should not be confused with traditional stereo-graphic or field-survey-derived breaklines. Watershed Concepts and EarthData utilized techniques developed for FEMA floodmap modernization projects to synthesize 3D breaklines using digital orthophotos and lidar data. 1. For larger streams (widths greater than 50 feet), breaklines were collected on the left and right water edge lines. The 2D lines defining streams and other water bodies were manually digitized into ArcView shape file format from the ADS-40 digital imagery. Flat water bodies such as ponds were collected by examining points near the edge of water, where a low point could be quickly identified. This allowed the operators to draw an even-elevation breakline at that elevation around the water body's perimeter. 2. A bounding polygon, created from the edge-of-bank lines, was used to remove all lidar points from within the channels of streams and bodies of water. This step ensures that the lidar bare-earth point files match the breaklines. 3. The elevation component of the 3D streamlines (breaklines) was derived from the lowest adjacent bare-earth lidar point and was adjusted to ensure that the streams flow downstream. The best elevation that can be derived for the 3D streamlines is the water surface elevation on the date the lidar data was acquired. 4. Automatic processes assigned elevations to the vertices of the centerline based on surrounding lidar points. The lines were then smoothed to ensure a continuous downhill flow. Edge-of-bank vertices were adjusted vertically to match the stream centerline vertices. 5. The new 3D lines were then viewed in profile to correct any anomalous vertices or remove errant points from the lidar DTM, which cause unrealistic "spikes" or "dips" in the breaklines. 6. For this project, hydro breaklines were generated in the manner described above for all streams and water bodies. a) A 2000 to identify any quality issues. b) An automated routine was run to check the data for closure of water bodies. c) An evaporation routine was run to remove lidar points from water bodies. d) A final routine was run to check the generated TINs for anomalies, including points outside township/range boundaries and elevation extremes. 7. New TINs were then created from the remaining lidar points and newly created breaklines. 8. The breakline data set was then put into ESRI shape file format. 9. The 1-foot contours were generated in Microstation (using 2-foot specifications) with an overlay software package called TerraSolid. Within TerraSolid, the Terramodeler module was utilized to first create the TIN; a color relief was then created and reviewed for any irregularities before the contour generator was run. The contours were checked for accuracy over the DTM and then the index contours were annotated. At this point the technician identified any areas of heavy tree coverage by collecting obscure shapes. Any contours found within these shapes were coded as obscure. The data set was viewed over the orthos before the final conversion. The contours were then converted to Arc/Info, where final QC AMLs were run to verify that no contours were crossing. The contours were delivered in ESRI .shp format as a merged file. Due to the nature of the breaklines collected in accordance with FEMA guidelines, the contours do not meet any specified accuracy requirement and are delivered as is.
  • 2004-09-15 00:00:00 - The digital orthophotography was produced in natural color at a natural scale of 1:2,400 with a 1 ft pixel resolution. A step-by-step breakdown of the digital orthophoto production process follows. 1. Digital image swath files were visually checked for image quality on the networked ISTAR processing farm. 2. The digital image files were loaded onto the digital orthophoto production workstation. The following information was then loaded onto the workstation: - The camera parameters and flight line direction - Ground control and pass point locations - The exterior orientation parameters from the aerotriangulation process - An ASCII file containing the corner coordinates of the orthophotos - The digital elevation model - Project-specific requirements such as final tile size and resolution - Orientation parameters developed from the aerotriangulation solution. A coordinate transformation based on the camera calibration fiducial coordinates was then undertaken. This transformation allowed the conversion of every measured element of the images to a sample/line location. Each pixel in an image was then referenced by sample and line (its horizontal and vertical position) and matched to project control. 3. The newly re-sected image was visually checked for pixel drop-out and/or other artifacts that might degrade the final orthophoto image. 4. DTM data were imported and written to the correct subdirectory on disk. 5. The DTM file was re-inspected for missing or erroneous data points. 6. A complete differential rectification was carried out using a cubic convolution algorithm that removed image displacement due to topographic relief, tip and tilt of the aircraft at the moment of exposure, and radial distortion within the camera. Each final orthophoto was produced at a natural scale of 1:2,400 with a 1 ft pixel resolution. At this point in the process, the digital orthophotos covered the full aerial frame. 7. Each digital orthophoto image was visually checked for accuracy on the workstation screen. Selected control points (control panels or photo-identifiable points) visible on the original film were visited on the screen, and the X and Y coordinates of the location of the panel or photo-identifiable point were measured. This information was cross-referenced with the X and Y information provided by the original ground survey. If the orthophoto did not meet or exceed NMAS standards, the rectification was regenerated. The digital orthophotos were then edge-matched using proprietary software that runs within the Z/I Imaging OrthoPro software package. Adjoining images were displayed in alternating colors of red and cyan. In areas of exact overlap, the image appears in gray-scale rendition. Offsets were colored red or cyan, depending on the angle of displacement. The operator panned down each overlap line at map scale to inspect the overlap area. Any offset exceeding accuracy standards was re-rectified after the DTM and AT information was rechecked.
  • 2004-09-15 00:00:00 - 8. Once the orthos were inspected and approved for accuracy, the files were copied to the network and downloaded by the ortho finishing department. This production unit was charged with radiometrically correcting the orthophotos prior to completing the mosaicking and clipping of the final tiles. The image processing technician performed a histogram analysis of several images that contained different land forms (urban, agricultural, forested, etc.) and established a histogram that best preserves detail in highlight and shadow areas. EarthData International has developed a proprietary piece of software called "Image Dodging." This radiometric correction algorithm was utilized in batch and interactive modes. Used in this fashion, the routine eliminated density changes due to sun angle and changes in flight direction. A block of images was processed through image dodging in batch mode and displayed using Z/I Imaging OrthoPro software. At this point the images had been balanced internally, but global differences in color and brightness remained and were adjusted interactively. The technician assigned correction values for each orthophoto and then displayed the corrected files to assess the effectiveness of the adjustment. This process was repeated until the match was considered near seamless. The files were then returned to digital orthophoto production to mosaic the images. 9. The processed images were mosaicked using the Z/I Imaging software. The mosaic lines were set up interactively by the technician and were placed in areas that avoided buildings, bridges, elevated roadways, or other features that would highlight the mosaic lines. File names were assigned. 10. The finishing department performed final visual checks for orthophoto image quality. The images were inspected using Adobe Photoshop, which enabled the technician to remove dust and lint from the image files interactively. Depending on the size and location of the flaw, Photoshop provided several tools to remove it. Interactive removal of dust was accomplished at high magnification so that repairs were invisible. 11. The final orthophoto images were written out in GeoTIFF format.
  • 2008-01-25 00:00:00 - The NOAA Office for Coastal Management (OCM) received the files in LAS format. The files contained lidar elevation measurements. The data were in the Florida State Plane projection with the NAVD88 vertical datum. OCM performed the following processing to make the data available within the Lidar Data Retrieval Tool (LDART): 1. The data were converted from Florida State Plane West coordinates to geographic coordinates. 2. The data were converted from NAVD88 (orthometric) heights to GRS80 (ellipsoid) heights using Geoid03. 3. The LAS data were sorted by latitude and the headers were updated.
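The aerotriangulation step above accepts a solution only when the RMSE at unconstrained quality-control points falls within tolerance. A minimal sketch of that check follows; the residual values and the 0.15 m tolerance are illustrative assumptions, not figures from the project specification.

```python
import math

def rmse(residuals):
    """Root-mean-square error of a list of check-point residuals."""
    return math.sqrt(sum(e * e for e in residuals) / len(residuals))

# Hypothetical residuals (meters) at quality-control points withheld
# from the adjustment; tolerance is likewise illustrative.
dx = [0.04, -0.07, 0.10, -0.02, 0.05]
dz = [0.09, -0.12, 0.06, 0.11, -0.08]
tolerance_m = 0.15

print(rmse(dx) <= tolerance_m, rmse(dz) <= tolerance_m)  # -> True True
```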
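The batch vegetation filtering described in the second step relies on proprietary and TerraSolid algorithms; a much cruder stand-in that conveys the idea is to keep only the lowest return per grid cell, since canopy and rooftop returns sit above nearby ground. This sketch is illustrative only and is not the production method.

```python
def lowest_point_per_cell(points, cell_size):
    """Keep only the lowest (x, y, z) return in each grid cell, a crude
    approximation of bare-earth filtering: higher returns on vegetation
    and structures in the same cell are discarded."""
    cells = {}
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size))
        if key not in cells or z < cells[key][2]:
            cells[key] = (x, y, z)
    return list(cells.values())

# Two returns in one cell (canopy at 20.0 m over ground at 3.1 m),
# one return in a neighboring cell.
pts = [(0.5, 0.5, 20.0), (0.8, 0.2, 3.1), (5.5, 0.5, 3.0)]
print(lowest_point_per_cell(pts, 5.0))  # -> [(0.8, 0.2, 3.1), (5.5, 0.5, 3.0)]
```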
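The hydro-breakline step smooths centerline elevations "to ensure a continuous downhill flow." One simple way to enforce that constraint, offered here only as a sketch of the idea, is to clamp each vertex to the running minimum along the downstream direction:

```python
def enforce_downstream_flow(elevations):
    """Clamp successive stream-centerline vertex elevations to the running
    minimum so the 3D breakline never rises in the downstream direction."""
    out = []
    lowest = float("inf")
    for z in elevations:
        lowest = min(lowest, z)
        out.append(lowest)
    return out

# Illustrative vertex elevations (feet) listed upstream to downstream.
print(enforce_downstream_flow([12.0, 12.4, 11.8, 11.9, 11.1]))
# -> [12.0, 12.0, 11.8, 11.8, 11.1]
```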
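The orthophoto accuracy check is performed against NMAS. For map scales larger than 1:20,000, NMAS requires that at least 90% of well-defined test points fall within 1/30 inch at map scale; at 1:2,400 that is 80 inches, or roughly 6.7 ft on the ground. The check-point offsets below are hypothetical.

```python
def nmas_pass(offsets_ft, scale_denominator):
    """NMAS horizontal test for scales larger than 1:20,000: at least 90%
    of well-defined points must fall within 1/30 inch at map scale."""
    tolerance_ft = (scale_denominator / 30.0) / 12.0  # map inches -> ground feet
    within = sum(1 for d in offsets_ft if d <= tolerance_ft)
    return within / len(offsets_ft) >= 0.9

# Hypothetical offsets (feet) between orthophoto positions and the
# original ground survey; 9 of 10 fall within ~6.67 ft, so the tile passes.
offsets = [1.2, 3.5, 0.8, 6.1, 2.4, 4.9, 1.7, 5.5, 2.0, 7.2]
print(nmas_pass(offsets, 2400))  # -> True
```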
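The final OCM step converts NAVD88 orthometric heights to GRS80 ellipsoid heights using Geoid03. The underlying relation is simply h = H + N, where N is the geoid undulation interpolated from the model; the sketch below assumes an undulation value rather than reading the actual GEOID03 grid, and the sample numbers are illustrative.

```python
def orthometric_to_ellipsoid(h_orthometric, geoid_undulation):
    """Ellipsoid height h = orthometric height H + geoid undulation N.
    N would be interpolated from a geoid model such as GEOID03 and is
    negative over much of the conterminous United States."""
    return h_orthometric + geoid_undulation

# Illustrative only: a 10.0 m NAVD88 elevation with an assumed
# undulation of -24.5 m (plausible magnitude for west-central Florida).
print(orthometric_to_ellipsoid(10.0, -24.5))  # -> -14.5
```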
5.1.1. If data at different stages of the workflow, or products derived from these data, are subject to a separate data management plan, provide reference to other plan:
Always left blank
5.2. Quality control procedures employed
(describe or provide URL of description):
No information found

6. Data Documentation

The EDMC Data Documentation Procedural Directive requires that NOAA data be well documented, specifies the use of ISO 19115 and related standards for documentation of new data, and provides links to resources and tools for metadata creation and validation.

6.1. Does metadata comply with EDMC Data Documentation directive?
No
Notes: All required DMP fields must be populated and valid to comply with the directive.
6.1.1. If metadata are non-existent or non-compliant, please explain:

Missing/invalid information:

  • 1.6. Type(s) of data
  • 1.7. Data collection method(s)
  • 3.1. Responsible Party for Data Management
  • 4.1. Have resources for management of these data been identified?
  • 4.2. Approximate percentage of the budget for these data devoted to data management
  • 5.2. Quality control procedures employed
  • 7.1. Do these data comply with the Data Access directive?
  • 7.1.1. If data are not available or have limitations, has a Waiver been filed?
  • 7.1.2. If there are limitations to data access, describe how data are protected
  • 7.4. Approximate delay between data collection and dissemination
  • 8.1. Actual or planned long-term data archive location
  • 8.3. Approximate delay between data collection and submission to an archive facility
  • 8.4. How will the data be protected from accidental or malicious modification or deletion prior to receipt by the archive?
Notes: Required DMP fields that are not populated or invalid are listed here.
6.2. Name of organization or facility providing metadata hosting:
NMFS Office of Science and Technology
Always listed as "NMFS Office of Science and Technology"
6.2.1. If service is needed for metadata hosting, please indicate:
Always left blank
6.3. URL of metadata folder or data catalog, if known:
Always listed as the URL to the InPort Data Set record
6.4. Process for producing and maintaining metadata
(describe or provide URL of description):
Metadata produced and maintained in accordance with the NOAA Data Documentation Procedural Directive: https://nosc.noaa.gov/EDMC/DAARWG/docs/EDMC_PD-Data_Documentation_v1.pdf
Always listed with the above statement

7. Data Access

NAO 212-15 states that access to environmental data may only be restricted when distribution is explicitly limited by law, regulation, policy (such as those applicable to personally identifiable information or protected critical infrastructure information or proprietary trade information) or by security requirements. The EDMC Data Access Procedural Directive contains specific guidance, recommends the use of open-standard, interoperable, non-proprietary web services, provides information about resources and tools to enable data access, and includes a Waiver to be submitted to justify any approach other than full, unrestricted public access.

7.1. Do these data comply with the Data Access directive?
No information found
7.1.1. If the data are not to be made available to the public at all, or with limitations, has a Waiver (Appendix A of Data Access directive) been filed?
No information found
7.1.2. If there are limitations to public data access, describe how data are protected from unauthorized access or disclosure:

None

7.2. Name of organization or facility providing data access:
NOAA Office for Coastal Management (NOAA/OCM)
Taken From: Support Roles (Distributor) | Organization
Notes: The name of the Organization of the most recent Support Role of type "Distributor" is used. The support role must be in effect. This information is not required if an approved access waiver exists for this data.
7.2.1. If data hosting service is needed, please indicate:
Taken From: Data Management | If data hosting service is needed, please indicate
Notes: This field is required if a Distributor has not been specified.
7.2.2. URL of data access service, if known:
Taken From: Distribution Info | Download URL
Notes: All URLs listed in the Distribution Info section will be included. This field is required if applicable.
7.3. Data access methods or services offered:

These data can be obtained online at the following URL: https://coast.noaa.gov/dataviewer

7.4. Approximate delay between data collection and dissemination:
No information found
7.4.1. If delay is longer than latency of automated processing, indicate under what authority data access is delayed:

8. Data Preservation and Protection

The NOAA Procedure for Scientific Records Appraisal and Archive Approval describes how to identify, appraise and decide what scientific records are to be preserved in a NOAA archive.

8.1. Actual or planned long-term data archive location:
(Specify NCEI-MD, NCEI-CO, NCEI-NC, NCEI-MS, World Data Center (WDC) facility, Other, To Be Determined, Unable to Archive, or No Archiving Intended)
No information found
8.1.1. If World Data Center or Other, specify:
Taken From: Data Management | Actual or planned long-term data archive location
Notes: This field is required if archive location is World Data Center or Other.
8.1.2. If To Be Determined, Unable to Archive or No Archiving Intended, explain:
Taken From: Data Management | If To Be Determined, Unable to Archive or No Archiving Intended, explain
Notes: This field is required if archive location is To Be Determined, Unable to Archive, or No Archiving Intended.
8.2. Data storage facility prior to being sent to an archive facility (if any):
Office for Coastal Management - Charleston, SC
Taken From: Physical Location | Organization, City, State, Location Description
Notes: Physical Location Organization, City and State are required, or a Location Description is required.
8.3. Approximate delay between data collection and submission to an archive facility:
No information found
8.4. How will the data be protected from accidental or malicious modification or deletion prior to receipt by the archive?
Discuss data back-up, disaster recovery/contingency planning, and off-site data storage relevant to the data collection
No information found

9. Additional Line Office or Staff Office Questions

Line and Staff Offices may extend this template by inserting additional questions in this section.

Always left blank