JOIN US AT OUR WORKSHOPS

GIS for NG9-1-1, ETL Ensures GIS for Everyone

Over the last couple of decades, government organizations such as states, counties, cities, and special districts have come to see themselves as enterprises — not just collections of individual groups of public servants. Nowhere is this shift in thinking more evident than in how they use information technology. When it comes to data, there is a need to consolidate datasets across departments — and, where possible, into master datasets — to improve data quality, reduce duplicated maintenance effort, and standardize data across the enterprise. A key tool in these efforts is “ETL”.

Short for “Extract, Transform, and Load”, ETL is a process that pulls data from one or more sources, reshapes it, and delivers it to a data warehouse or other unified data repository. That’s geek-speak for “take it from one place, tweak it a bit, and use it in another place”. Generally, these steps involve:

Extract – Data is copied from one or more sources.

Transform – The schema (say, a field’s data type or name) may need to be changed to work in the target system. Data values may need to be normalized, validated, or reported on. There may also be geospatial transformations, such as coordinate system conversions, that need to take place.

Load – The transformed data is moved into the target database.
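The three steps above can be sketched in a few lines of Python. The source data, the field names (`AddrNum`, `StName`), and the in-memory “warehouse” are all hypothetical stand-ins for real systems, but the shape of the pipeline is typical:

```python
import csv
import io

# Extract - copy rows from a source (a CSV export stands in for the source system)
SOURCE = """addr_num,street_name,permit_no
101,Main St,BP-2021-004
205,Oak Ave,BP-2021-011
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

# Transform - rename fields to the target schema and cast/normalize values
def transform(rows):
    return [
        {"AddrNum": int(r["addr_num"]), "StName": r["street_name"].upper()}
        for r in rows
    ]

# Load - move the transformed rows into the target store (a list stands in here)
def load(rows, target):
    target.extend(rows)
    return target

warehouse = []
load(transform(extract(SOURCE)), warehouse)
print(warehouse[0])  # {'AddrNum': 101, 'StName': 'MAIN ST'}
```

In a production ETL the same three functions would talk to real databases or file geodatabases, but the separation of concerns stays the same.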

Think of ETL like this: Cows produce milk. Milk is extracted from the cow to create many things we enjoy. Depending on what needs to be made, the milk is transformed into — well, different milks like buttermilk, but also butter, cheese, and ice cream!

GIS stakeholders can “build and maintain once, use many times.” A single maintenance database combines all the features and attributes needed to build new features and maintain existing features of multiple types, while the ETL process creates system-specific outputs from that data for many enterprise systems.

Next Generation 9-1-1 (NG9-1-1) will require such ETLs for GIS data in many cases. This is because the GIS data, particularly road centerlines and site/structure address points, were initially built for other purposes using other data schemas, coordinate systems, and levels of precision. For instance, a city’s planning department, which began creating address points as it issued them about a decade ago, may need to make the following adjustments before the points can be used in an NG9-1-1 call routing system:

1. Attributes unnecessary for NG9-1-1, such as building permit number, need to be dropped
2. NG9-1-1 attributes such as Emergency Service Number (ESN) and MSAG Community need to be imputed
3. Address and road name data needs to be parsed at a more granular level for NG9-1-1
4. Data normalization between feature classes should be validated, and reports created to support QA/QC
5. The coordinate system, frequently maintained in a State Plane Coordinate System, needs to be transformed to WGS84
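A minimal sketch of a few of those adjustments in Python. The attribute names, the ESN lookup table, and the simple street-name parser here are all hypothetical; a production ETL would use authoritative lookup data and a far more robust parser, and would also reproject coordinates (e.g., with a library such as pyproj or with arcpy), which is omitted here:

```python
import re

# Hypothetical ESN lookup keyed by community (an assumption, not a standard table)
ESN_BY_CITY = {"Springfield": "101"}

def adjust_point(pt):
    out = dict(pt)
    # 1. Drop attributes unnecessary for NG9-1-1
    out.pop("building_permit_no", None)
    # 2. Impute NG9-1-1 attributes from a lookup
    out["ESN"] = ESN_BY_CITY.get(out["city"], "")
    # 3. Parse the full street name into finer-grained parts
    #    (naive split: optional pre-directional, name, street type)
    m = re.match(r"(?:(N|S|E|W)\s+)?(.+?)\s+(St|Ave|Rd|Blvd)$", out.pop("street"))
    if m:
        out["PreDir"] = m.group(1) or ""
        out["StName"] = m.group(2)
        out["StType"] = m.group(3)
    return out

pt = {"city": "Springfield", "street": "N Main St", "building_permit_no": "BP-7"}
print(adjust_point(pt))
```

Running this drops the permit number, fills in the ESN, and splits “N Main St” into its pre-directional, name, and type components.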

 

Some of these transformations, such as the coordinate system transformation, can be handled by the NG9-1-1 functional element called the Spatial Interface (SI), but most should be executed prior to submission to the SI. Those remaining transformations are, in many cases, best handled with an automated ETL.

Why ETL?

  1. The ETL process saves the time and effort of handling data manually.
    The biggest advantage of the ETL process is that it helps you gather, transform, and consolidate data in an automated way. This means you can save the time and effort of importing rows and rows of data manually.
  2. ETL makes it easier to work with complex data.
    Over time, multi-jurisdictional and other data requirements, as well as timelines, introduce complexity for 9-1-1. Add the requirements brought by other data stakeholders — the Planning Department in our example — and you will face more voluminous, complex, and diverse datasets and maintenance workflows. Throw more attributes into the mix and you can find yourself formatting data around the clock. Incoming data files can also arrive in different formats, layouts, and types. This is where ETL can simplify things for you.
  3. ETL reduces the risks associated with human error.
    No matter how careful you are with your data, you aren’t safe from making errors. For instance, data may be accidentally duplicated in the target system, or manual inputs could be entered incorrectly. By eliminating human intervention, an ETL tool can help you dodge such scenarios.
  4. The ETL process boosts the government enterprise’s return on investment (ROI).
    The mantra for data maintainers in the government enterprise must become, “Build and maintain once, use many times.” The same address points feature class in our example should serve systems such as asset management, permitting, and computer-aided dispatch (CAD) in addition to NG9-1-1. This makes the business case for these datasets a no-brainer for budget decision makers. As you save time, effort, and resources, the ETL process ultimately increases your ROI.
  5. ETL enables non-technical data users.
    It is critical that GIS professionals not attempt to drag 9-1-1 or public safety professionals into their world. Educate them on the things they need to understand, but embrace the magician inside: ETL is magic to non-technical users, and that’s okay!
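On the human-error point above, one common ETL safeguard is de-duplicating records by a stable key before loading. A tiny sketch, assuming each address point carries a unique `site_id` (a hypothetical field name):

```python
def dedupe(rows, key="site_id"):
    """Keep the first record seen for each key; drop later duplicates."""
    seen = set()
    unique = []
    for r in rows:
        if r[key] not in seen:
            seen.add(r[key])
            unique.append(r)
    return unique

rows = [{"site_id": 1, "addr": "101 Main St"},
        {"site_id": 2, "addr": "205 Oak Ave"},
        {"site_id": 1, "addr": "101 Main St"}]  # accidental duplicate

print(len(dedupe(rows)))  # 2
```

Building a check like this into the pipeline means the duplicate never reaches the target system, no matter how it crept into the source.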

 

“Embrace the suck”

In military circles, this phrase is used to tell warfighters to consciously accept or appreciate something that is extremely unpleasant but unavoidable. Likewise, there are some challenges that GIS professionals will have to work through both with technical staff as well as other stakeholders.

  1. Apples-to-oranges transformations
    For some NG9-1-1 datasets, especially in the case of multi-jurisdictional aggregation, some significant transformations will need to be made. It is critical that any ETL process, whether completely automated or semi-automated, be tested to ensure that it delivers the desired outputs in every use case. Once you achieve that, test it again and again.
  2. Source datasets aren’t always aligned
    Because datasets — again, especially in the case of multi-jurisdictional aggregation — will sometimes come from disparate sources and have disparate origins, there will be cases where they aren’t well aligned geospatially, in schema, in metadata, and so on. This poses a significant, but in almost every case surmountable, challenge. As the consummate data and/or GIS professional, perseverance and a solid “break/fix/test” methodology will carry you through.
  3. ETL may create intermediate datasets that must be managed
    The creation of intermediate datasets is an almost certain reality. In a completely automated process this can be managed without much issue. Where the process is semi-automated, good documentation, training, and quality checks are needed to ensure these intermediate datasets don’t introduce error or confusion for stakeholders.
  4. Other technical challenges
    Other peculiarities, based on the software available, the experience of participants, or other factors, will undoubtedly arise. The pain will be worth it when the process yields life-saving GIS data for the men and women answering 9-1-1 calls. You will have had a part in helping citizens in their time of need, and more than likely in saving lives. Buy into the mission and let it motivate you to success!

 

NG9-1-1 GIS Data Implications

ETL should be embraced by GIS professionals supporting NG9-1-1 everywhere it is appropriate. This will allow the GIS professional to maintain a master database with multiple outputs, including, but not limited to, NG9-1-1 call routing, CAD, and other enterprise datasets. It will require a culture of quality assurance in both GIS and 9-1-1 shops. Quality of service and quality of data should drive excellence. This culture of quality assurance will embrace all tools, methodologies, and opportunities to make data, workflows, and systems better and more supportive of the line-level telecommunicators and dispatchers who are the “first first responders”!

Want to Know More?

Don’t forget to visit Jeff Ledbetter, GISP, ENP and the entire DATAMARK team at NENA 2021 in Columbus, OH next week (July 24 – 29)! We will be stationed at Booth #901 and ready to answer your burning questions about the ETL process, and more! We can’t wait to see you there.

/ July 22, 2021
