EIIP Virtual Forum Presentation — November 16, 2005

GIS Day Special Event
Making Decisions About 'Sensitive' Geospatial Data

Michael Domaratz
National Geospatial Programs Office
U.S. Geological Survey

Amy Sebring
EIIP Moderator

The following version of the transcript has been edited for easier reading and comprehension. A raw, unedited transcript is available from our archives. See our home page at http://www.emforum.org

[Welcome / Introduction]

Amy Sebring: Happy GIS Day everyone and welcome to the EIIP Virtual Forum! Ava is joining us remotely from the IAEM conference and our speaker is on the road as well, and we are hoping his connection will hold up today! Today we will be discussing the new "Guidelines for Data Access in Response to Security Concerns" recently adopted by the Federal Geographic Data Committee (FGDC).

Now it is my pleasure to introduce today's speaker, Michael Domaratz. Mike is a member of the National Geospatial Programs Office of the U.S. Geological Survey. He works on implementing The National Map, a project to provide current and accurate digital map data for the United States.

His presentation today, however, is based on work he led as co-chair of the Homeland Security Working Group of the Federal Geographic Data Committee (FGDC). In addition to the guidelines, the working group is developing map symbols for emergency response (which were presented to the EIIP community in December 2003), "standard" geospatial data sharing agreements, guidance on geospatial content useful for homeland security applications, and other activities. Welcome Mike, and thank you for joining us today. I turn the floor over to you to start us off please.


Mike Domaratz: Thanks, Amy, and hello and welcome to all! Geospatial data underpin one-half of the Nation's domestic economic activities. The data aid our international competitiveness, support a large array of Federal, state, local, and tribal government activities, and serve the general public. Many public, private, and non-profit organizations originate geospatial data. Public dissemination is essential to the missions of many organizations.

The events of September 11, 2001, greatly heightened concerns that public access to geospatial data might increase the vulnerability to an attack. Federal and other organizations made different, and sometimes contradictory, decisions about access to data. They withdrew access, attempted to "sanitize" data, or decided to make no changes. In some cases it was difficult to learn the consequences of such actions or why access was changed.

The working group developed the guidelines to help organizations decide on reasonable access to sensitive data and to avoid unnecessary safeguards. They balanced (sometimes competing) principles ranging from the public's right to participate in government to the public's "right to know" to protection of sensitive information. The guidelines and related materials are available through the working group's web site at http://www.fgdc.gov/fgdc/homeland/index.html under "policy support."

It is important to note that the majority of geospatial data are appropriate for public release. However, a small portion of these data could pose risks to security and may therefore require safeguarding.

The Guidelines

The guidelines are organized as a sequence of decisions. Each decision is accompanied by related instructions and discussion. The sequence is illustrated in this decision tree. Amy, slide 1 please.

[Slide 1]

The guidelines have three sections that ask these questions:

1. Is it your decision to apply safeguards to these data?

2. Do these data need to be safeguarded?

3. What safeguards are authorized and justified?

We'll take each section in turn.

Section 1: Is it your decision to apply safeguards to these data?

This section answers the question "who gets to decide." Organizations that originate geospatial data decide. Such organizations are best positioned to understand the usefulness of the data, benefits that users receive from them, and comparable sources of information.

There are other interested parties who might contribute to the decision. Law enforcement and emergency management agencies experienced in homeland security matters may be sources of information about security consequences of disseminating geospatial data. Users can provide insights into the benefits of data dissemination. Others who benefit from the data also have equity at stake even if they do not use the data directly.

Originating organizations should document their use of the guidelines. A record will help organizations review the consistency of their decisions, recall their reasoning during subsequent reviews of a decision, and explain a decision if challenged. So that's section 1: The "originating organization" decides, but can get advice from other parties.

Section 2: Do these data need to be safeguarded?

This section is a three-part test to decide if the geospatial data need safeguards. If the data fail any part, safeguarding of the data is not justified.

The test is adapted from the RAND Corporation report "Mapping the Risks: Assessing the Homeland Security Implications of Publicly Available Geospatial Information." (The report is available for free online download and order in book form through http://www.rand.org/publications/MG/MG142)


The first part, "usefulness," is a mini user needs assessment in which an adversary is the user. The questions are: (1) do the data provide information about the location and nature of facilities or features that would allow an adversary to select critical targets and (2) do the data provide information that offer intimate knowledge of a facility, its characteristics, or its operations that is helpful in executing an attack and/or maximizing the resulting damage?

Concern centers on data that provide very specific and timely information. Examples include information about the relative importance of a feature, the timing of activities, previously identified vulnerabilities, and measures for protecting facilities and responding to attacks or damage. In many cases, the attribute component of geospatial data is more likely to be sensitive than is the location component. Sensitive information does not include the fact of existence of a facility at a particular place or the general layout of a facility. Care should be taken not to automatically assume that the high cost or accuracy of data means that the data have high value to an adversary.


If the data are "useful," we go to the second part, "uniqueness." This part evaluates the likelihood that actions you take to safeguard "useful" (or sensitive) information will be effective. Efforts to safeguard useful information that is readily available through open sources or observation are unlikely to reduce vulnerabilities.

The goal is to identify "useful" information that is unique, not just geospatial data that are unique. Other publications and media may disclose the same information found in geospatial data. Consider relevant historical data in addition to contemporary data. A thirty-year-old facility has thirty years of records; depending on the adversary's needs, "newer" may not always be "better." An example might help further explain the "uniqueness" test. This image is an annotated aerial photograph of downtown Washington DC. Amy, slide 2 please.

[Slide 2]

Let's hypothesize that there is a surface-to-air missile battery (shown as a light blue symbol in the upper left) and that this battery is a method of protecting the annotated facilities. So knowledge of it might be "useful" to an adversary and the annotated image passes the "usefulness" test. On to the "uniqueness" part of the test. Amy, slide 3 please.

[Slide 3]

It turns out, however, that the battery is not hypothetical and our geospatial data are not the only source of this information. In fact, as illustrated by the newspaper clippings, the information is quite well known and is readily observable. (If you're ever walking north on 17th Street in front of the Old Executive Office Building, look up.) So the annotated image fails the "uniqueness" test and safeguards are not justified. So those are the "useful" and "unique" tests. On to the last test for the section.

Cost and Benefit

If the data are "useful" AND "unique," we go to the third part, "cost and benefit." Originating organizations should consider the magnitude of the security risk incurred versus the benefits that accrue from the dissemination of any particular data. The benefits should be evaluated using quantitative and qualitative measures. Include among the societal benefits the opportunity costs of reduced availability of data resulting from safeguarding.

In summary for section 2: Safeguarding is justified only for data that contain sensitive information, that are the unique source of the sensitive information, and for which the security risk outweighs the societal benefit of dissemination.

Linking back to the decision tree for the guidelines (slide 1), we've gone through section 1 and decided that it was our decision to make. Then we went through section 2, which has three tests: usefulness, uniqueness, and cost and benefit. If the data are not useful to an adversary, then the data are not sensitive. So we don't need to safeguard the data. If the data are "useful" but not "unique" (that is, the information is known or knowable from available sources other than our data), then we don't need to safeguard the data. If the data are "useful" and "unique", then we need to decide the merits of safeguarding versus releasing the data ("cost and benefit"). If the data fail any of the tests in section 2, then safeguards are not warranted. We'll now go to section 3, which assumes that safeguards are warranted, at least based on our evaluation.
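The decision sequence described above can be sketched as a small function. This is only an illustrative summary of the flow, not part of the FGDC guidelines; the four boolean inputs stand in for judgments that the originating organization must make itself.

```python
def decide_safeguards(is_our_decision: bool,
                      is_useful: bool,
                      is_unique: bool,
                      risk_outweighs_benefit: bool) -> str:
    """Illustrative sketch of the guidelines' decision sequence.

    Each input represents a judgment by the originating organization,
    informed where appropriate by law enforcement, emergency management
    agencies, and data users.
    """
    # Section 1: who gets to decide?
    if not is_our_decision:
        return "refer to the originating organization"
    # Section 2, part 1: usefulness to an adversary
    if not is_useful:
        return "no safeguards needed (not useful to an adversary)"
    # Section 2, part 2: uniqueness of the sensitive information
    if not is_unique:
        return "no safeguards needed (information available from other sources)"
    # Section 2, part 3: cost and benefit
    if not risk_outweighs_benefit:
        return "no safeguards needed (societal benefit outweighs security risk)"
    # Section 3: change the data or restrict access, using minimum safeguards
    return "apply minimum authorized safeguards (section 3)"
```

Note that the tests short-circuit: failing any one of the section 2 tests ends the evaluation with no safeguards, mirroring the decision tree in slide 1.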

Section 3: What safeguards are authorized and justified?

The guidelines offer two options: change the data to remove sensitive information or restrict the data. Originating organizations should maximize possible access to data, and so the guidelines emphasize the use of the minimum safeguards required to prevent access by a potential adversary.

Change the Data

To change data, originating organizations remove the security risk by redacting or removing sensitive information and/or reducing the sensitivity of information. Techniques can include data simplification, classification, aggregation, statistical summarization, or other information reduction methods. For example, one can block details (as in the roof in the image on the left) or reduce resolution (as in the image on the right) in the next slide. Do not place disinformation in geospatial data. Amy, slide 4 please.

[Slide 4]

Organizations should review the changed data to ensure that the change(s) deal effectively with the security concern. Changes should be described in documentation for the data. If changing the data is not an option, organizations can restrict access to, use of, or redistribution of the data. Restrictions should be commensurate with the assessed risk. For example, the maps on the next slide depict the same place at three times. Amy, slide 5 please.

[Slide 5]

Between 1940 and 1953, the valley in the center of the maps was flattened and infrastructure and streams were re-routed. The 1968 map shows buildings in the valley. These buildings were built in the 1940s; information about them was restricted from public access in the 1953 edition of the map.

Restrict the Data

Originating organizations that restrict data should have written policies that identify data that can be accessed, used, and/or redistributed, the conditions under which these actions may occur, and the organizations that are permitted to access, use and redistribute the data. Include these terms and conditions with transfers of such data to ensure that organizations that receive the data know the restrictions. Care should be taken to ensure that the release of the data does not enable others to force additional dissemination of the data under freedom of information laws.

For both changing and restricting data, organizations must ensure that they have the authority to take these actions. If they do not have the authority, they may seek it from an appropriate decision maker. The decision maker may provide the authority to safeguard the data, overrule the conclusion that the data require safeguarding, or find that there are no legal means to safeguard the data.

In summary for section 3: Coming out of section 2, we decided that we had data worthy of safeguards. In section 3, two options are offered: changing data (somehow removing the sensitive pieces of data) or, if that's not an option, restricting the data to prevent release of the sensitive pieces. We also are challenged to be sure that we have the authority to take these actions BEFORE we act, and to have in place an infrastructure (agreements, policies, etc.) that makes our intentions known. (And our bosses have to know what we're up to and are comfortable with those actions.)

One final thought is the need to work with "neighbors" to avoid circumstances in which different organizations make contradictory decisions. Amy, slide 6 please.

[Slide 6]

The originating organizations that produced the two rows of images in the illustration followed similar advice for the same area, but came up with different results. Do such actions reduce vulnerabilities? So the integrity of each individual action depends in part on the integrity of the collection of actions as a whole.

That's all I have for the presentation portion of today's activities. I'm now available to answer any questions you'd like to post! I turn the floor back to our Moderator.

Amy Sebring: Thank you very much Mike. Now, to proceed to your questions.

[Audience Questions & Answers]

Amy Sebring: Mike, how does Freedom of Information or state open access laws pertain to this? If you are in possession of somebody else's data under provisions of restricted access, is that data in your possession thereby subject to such requests?

Mike Domaratz: First, let me start off with "I'm not a lawyer," but from the Federal side the main point is to understand up front what restrictions if any are needed, be sure you have the necessary authorities, and then take actions in a way that you safeguard the sensitive data.

Joe Sukaskas: Mike, if data sets are produced by different organizations that do not choose to restrict them (Sec. 1 - the originating organization decides), but the aggregate effect of those data sets is sensitive (Sec. 2 - useful, unique, cost/benefit), how does the model work? For example, if a major gas line and major telecom line intersect, but the owners of those infrastructures don't know about the others' existence, what happens?

Mike Domaratz: The combined data would be a new set subject to the rules of its "originator," although I'm not sure that merely knowing about the intersection is particularly useful.

Tim Nolan: I work at a local government (County). We have trouble getting the other local governments to share critical data with us. Do you have any suggestions? Each city, for instance, is suspicious about releasing data to our local Homeland Security Department. How can we convince them that we share the same goal?

Mike Domaratz: This obviously is a tough problem, and not unique to security concerns. What we've found in Katrina/Rita is that organizations in the area that were unwilling to share before the event suddenly were open to sharing. The problem is that an event is a terrible time to be on a learning curve. So that's why the working group is working on a common set of agreements.

Burt Wallrich: I just want to congratulate you on developing a model that does not use the very real security threats we face as a basis for unreasonable secrecy. I wish there was more of this type of thinking in the federal government.

Mike Domaratz: Thanks Burt, but there's lots more work to do in the area.

Chidi Ugonna: What independent measures can be put in place to ensure that information that is critical to the public is not stifled according to criteria 3?

Mike Domaratz: We've encouraged organizations to document their decisions. We also encourage them (to the extent possible) to make these decisions public. And we encourage them to work with their user communities.

Lonnie Meinke: You mentioned the data may be considered sensitive but must be unique to warrant protection. What would you respond to the comment "Why make it easier to find?" Anyone who has done a data search can attest to the frustration of not being able to locate data simply because your search parameters were not specific enough. The more times/places the data set is available the more likely it is to be found.

Mike Domaratz: "Uniqueness" is a tough one for people to take but the point is that the facts are ALREADY known and easy to find -- why spend time on useless activities that provide a false sense of security?

Isabel McCurdy: Mike, what is an 'adversary'? Maybe the question is "Who is the adversary?"

Mike Domaratz: "People who would do you harm". Initially we used "terrorist" but they're not the only people about whom to be concerned.

Amy Sebring: Mike, you mentioned the workgroup is working on a common set of agreements? These will be models for data originators to use I gather. Do you know when these will be available?

Mike Domaratz: That group's schedule calls for something to be available the middle of next year (I think; I'm not on the group).

Lori Wieber: If I understand this correctly, it would be inappropriate to attempt to mask new aerial photos of a critical infrastructure installation, for example an electrical substation, if there were older aerial photos already in the public arena. Newer photos could reveal additional security measures. In this case is the data originator the aerial photographer?

Mike Domaratz: Each edition could have a separate originator and if the newer version showed sensitive information not on other versions, it might be worthy of safeguards.

Amy Sebring: Mike, do you know if the workgroup is interested in getting feedback from those who attempt to apply the decision tree for future consideration? If so, what is your suggestion for providing feedback?

Mike Domaratz: The group is looking for examples of people applying the guidelines, both to share with others and to find shortcomings. In the short term you can provide feedback through me at [email protected].

Chidi Ugonna: Are there similar activities like the FGDC guidelines happening in other countries at risk from adversarial actions? Any moves towards an international collaboration?

Mike Domaratz: Not that I'm aware of, although international access like that provided by Google might spur something.


Amy Sebring: That's all we have time for today. Thank you very much Mike for an excellent job. We hope you enjoyed the experience. Please stand by a moment while we make a couple of quick announcements.

Again, the formatted transcript will be available later today. If you are not on our mailing list and would like to get notices of future sessions and availability of transcripts, just go to our home page and click on Subscribe.

Thanks to everyone for participating today. We stand adjourned but before you go, please help me show our appreciation to Mike for a fine job and excellent information.