EIIP Virtual Forum Presentation — April 8, 2009

The NIMS Supporting Technologies Evaluation Program
Assisting Practitioner Decision Making

Camille Osterloh
Task Lead, NIMS STEP
National Incident Management System Support Center

Jim Goodson
Task Lead, Product Evaluation
National Incident Management System Support Center

Chad Foster
Associate Director, Incident Management Program
Eastern Kentucky University

Amy Sebring
EIIP Moderator

The following has been prepared from a transcription of the recording. The complete slide set (Adobe PDF) may be downloaded from http://www.emforum.org/vforum/NIMSSTEP/NIMSSTEPpresentation.pdf for ease of printing.


[Welcome / Introduction]

Amy Sebring: Good morning/afternoon everyone. Welcome to EMforum.org. Our topic today is the NIMS Supporting Technologies Evaluation Program (NIMS STEP). This program provides an independent, objective evaluation of commercial and government software and hardware products to assist in the implementation of the National Incident Management System.

Please note that there are three handouts available for download today. There is a one-page fact sheet about the NIMS STEP program and a one-page fact sheet about the NIMS STEP Assessor program, as well as a letter that explains how you can participate, which Camille will be describing.

This week’s poll on our homepage is, "Should grant-funded software be required to comply with NIMS standards? Yes or No." Please take time to participate by voting and review the results thus far.

[Slide 1]

Now it is my pleasure to introduce today’s speakers: Camille Osterloh is a member of the National Incident Management System Support Center (NIMS SC) Test and Evaluation staff. She is the task lead for the NIMS Supporting Technology Evaluation Program (NIMS STEP) and works with teams of Subject Matter Experts in homeland security, emergency management, and response as well as engineering staff.

In her previous experience with Detachment 1 Air Force Operational Test and Evaluation Center (Det 1 AFOTEC), Mrs. Osterloh supported multi-year Advanced Concept Technology Demonstration programs and rapid, non-traditional military utility assessments.

Chad Foster is a Program Manager from Eastern Kentucky University. Chad oversees implementation of the National Incident Management System Support Center - a program that operates under a cooperative agreement between FEMA and the Justice and Safety Center at EKU. This program is designed to develop new responder tools, enhance technology integration and interoperability, and provide technical assistance and support to the incident management and response community.

Before joining EKU in 2006, Chad served as Special Projects Coordinator for the Emergency Management Accreditation Program or EMAP. From 2002 to 2005 and following a career in the military, Chad worked for the Council of State Governments where he directed and managed the organization’s public safety and justice policy work.

Welcome to you both, and thank you very much for being with us today. I now turn the floor over to Camille to start us off please.

[Presentation]

Camille Osterloh: Good Afternoon Everyone. First of all I wanted to thank the EM Forum for the invitation today. We’re excited to be talking with you about some of the activities here at the NIMS Support Center including our NIMS STEP program. I’m the task lead with SAIC for the evaluation program. I’ll cover a bit of background information about our program for those who haven’t heard about NIMS STEP before and also discuss some of our plans for this year. After our presentation, I’ll pull up our website and walk through how to apply for the program for any vendors that are interested. From our website you can also download a copy of our program guide for more information. We’ll also demonstrate how to access results from our evaluations on the Responder Knowledge Base. I’m joined on the phone today by a number of our team members that have helped build the evaluation program and are responsible for conducting evaluations.

Before we get into the specifics of our evaluation activities I’d like to turn it over to Chad Foster for some opening remarks about NIMS and the NIMS Support Center. Chad is our program lead from Eastern Kentucky University for the NIMS Support Center Program.

[Slide 2]

Chad Foster: Thanks, Camille, and good morning/afternoon everybody. Thanks for having us today to talk about the NIMS STEP program. I’ll just take two or three minutes here to provide a high level overview of who we are and our relationship to NIMS. As Amy mentioned, the NIMS Support Center operates under a cooperative agreement between FEMA and Eastern Kentucky University. We provide support to the Incident Management Systems Integration Division of FEMA, which is responsible for providing assistance to states and local jurisdictions with the implementation of NIMS.

I think most of you are familiar with what NIMS is. It was first released in 2004 and an updated version came out in December of 2008. One of the key requirements in NIMS is the need for interoperable communications and information management systems, and underpinning that requirement is the need for standards and related testing. That’s really the genesis of the NIMS STEP program.

The NIMS Support Center manages the day-to-day functions of the program for IMSI, and most of the work is done in Somerset, Kentucky at our NIMS Support Center facility. Over the past two or three years we’ve worked on the development of the program with FEMA, with DHS S&T, and with various other practitioners through our practitioner working group. It has been a number of years in development.

Our focus is on NIMS STEP which Camille will cover in detail. I did want to talk about some of our other task areas just to make attendees aware of what they are. We’d be happy to answer additional questions about them or steer you in the right direction to get information.

[Slide 3]

The first area, other than the evaluation work which Camille will talk about, is a number of systems development projects that I’ll just touch on here. The first one is the Incident Resource Inventory System (IRIS), which many of you may be aware of. It’s a software tool available for download through the FEMA NIMS Resource Center website. It’s a basic resource inventorying tool that allows users to inventory both typed and non-typed resources. I think it has been available for download for about two or three years now. The most recent version (2.2) is now available. We encourage you to go to the NIMS Resource Center website, which is www.fema.gov/nims, to download a copy of that.

Another system we’ve been working on, largely in partnership with the Emergency Management Institute, is the Exercise Simulation System, a server-based tool to assist them and others with training and exercises. It essentially consolidates all the community information that they use into a single, virtual environment. It allows students to make decisions and generate maps, among other capabilities. Again, that is a new software system available to EMI, and to others upon request.

[Slide 4]

Much of the training and exercise support that we provide to FEMA is delivered through our facility in Somerset. The slide in front of you has a few snapshots or images of the simulated Emergency Operations Center there. We’re able to provide support both on-site in Somerset with any training and exercise needs, or remotely. I just wanted to make you aware of that capability.

[Slide 5]

Like I said before, there are a number of other projects that we support IMSI on. Most of the products and services are available through the NIMS Resource Center web page that I provided earlier. The facility that we have in Somerset supports our test and evaluation work, and with that I’ll turn it back over to Camille.

[Slide 6]

Camille Osterloh: Our evaluation program is called the NIMS Supporting Technology Evaluation Program (NIMS STEP) and we conduct objective evaluations of software products. We also look at hardware in some instances but primarily in a supporting capacity for the software under evaluation, such as a cell phone or portable device. The overarching goals of the program are to evaluate products for their incorporation of NIMS concepts and help improve interoperability and information sharing capabilities among the emergency response and management community. Through our testing program, we help ensure that the products adhere to the NIMS recommended technical standards.

We look at systems that are designed for emergency managers and responders. Specifically, we focus on tools with some type of operational capability, so those that are used during an incident or event. The general product categories we review are listed here, including command and control and resource management applications, as well as alert and warning systems, and communication tools. In our program guide we also outline a few other technology categories but the ones listed here are typically the primary function of the products we evaluate.

When we conduct an evaluation, we’re looking at two primary aspects. The first is the product’s adherence to NIMS criteria, and the second is the product’s adherence to the NIMS recommended technical standards.

The NIMS criteria assessment is conducted by our Subject Matter Experts, so emergency managers and responders with real world experience. You may hear me referring to them as SMEs or Assessors throughout the discussion today. We’ve extracted key concepts from the NIMS document and National Response Framework and use a standardized worksheet to assess a product's incorporation of NIMS concepts and principles. I have a slide here in a few which lists the categories SMEs review and we’ll talk about that portion in more detail.

We also verify that technologies conform to interoperability standards and can exchange critical messages during disasters. Our test engineer reviews products for several standards which were developed by the OASIS Emergency Management Technical Committee. These standards are intended to support the exchange of information between different or disparate systems.

The two standards we currently evaluate products against are the Common Alerting Protocol (CAP) and the EDXL-Distribution Element. We have links to OASIS on our website if you’re interested in more information or in obtaining copies of those standards.

In November of 2008, OASIS released two new standards as part of the EDXL Suite. They’re called Hospital Availability and Resource Messaging. We’re in the process of adding the two new OASIS standards to our process and are currently seeking a couple of vendors to participate in a pilot for those evaluations.

One frequent question I get from vendors is whether they’re still eligible for an evaluation if they don’t implement the technical standards or if they only address one of them. The answer is yes…we can still review the product for just the NIMS criteria if it doesn’t address any of the technical standards, or look at it against just one of the standards.

[Slide 7]

There is no cost to participate in our program except for the time and effort to support planning, setup, and training activities. The vendor is responsible for providing our team whatever training they would typically provide to someone that purchases the product. After the evaluation, we develop a comprehensive report which addresses all the team’s findings. We also develop a short three to four page Summary modeled after the SAVER program summaries. Both of these documents are available through the Responder Knowledge Base website which provides vendors that participate added exposure to the community. Vendors can also use results to identify gaps and enhance their product in the future. Ultimately, we believe this leads to better products available in the field.

[Slide 8]

We recognize that there are a lot of systems out there to choose from. From a practitioner perspective, we’re trying to provide objective evaluations of products in a consistent and easy to read format. The summaries from our pilot program and several full reports are available online. Practitioners can also reference these recommended technical standards and criteria when looking to purchase or develop products.

[Slide 9]

Just a bit of background about our program. We initiated developing the infrastructure here at the facility in early 2006. We have a simulated Emergency Operations Center and a Simulation Cell that functions as our Test Laboratory.

In 2007 we conducted a series of pilot evaluations to define our processes and focused on developing the test procedures for the Common Alerting Protocol standard. In developing our test procedures we worked closely with DHS Science & Technology staff.

In 2008, we expanded the technical portion of our program to include the EDXL-DE standard. We continued a series of pilot evaluations, completing 17 products that year. We’ve been working on putting together some statistics about the products we’ve evaluated in an Annual Report and found that they were deployed across the country, in at least 22 states.

Finally, we formed a working group which includes some of our key federal partners within DHS as well as practitioners and vendor representatives. Our last meeting with our working group was in early March. In the development and implementation of our program, we’re trying to collaborate with all stakeholders. We also initiated the development of our program website which was deployed last year.

[Slide 10]

Here is a snapshot of where we’re going this year. We launched our website this past October and are formally accepting applications for the evaluation program now. We’ve been focusing on outreach activities such as journal articles and these types of discussion forums, and are planning to attend several conferences this year. We’re conducting evaluations throughout the year on an ongoing basis, and we’re posting the results online so the community will have access to them through the Responder Knowledge Base. We’ve recently gone back to all of our pilot participants and developed short (3-4 page) summaries based on the full reports. These have all been published online.

We’re expanding our assessor program this year as well. I have a slide on that here in a few so won’t go into much detail about it here.

Again, we’re planning an in-person meeting of our working group this fall. We’ll also conduct several calls with the group throughout the year as they provide input to both our Standards and NIMS STEP task areas.

We’re in the process of developing procedures for the Hospital Availability and Resource Messaging standards. Actually, we’ve completed a draft of our procedures in coordination with DHS S&T staff and are currently looking for several vendors to participate in a pilot evaluation. If you know of any good candidates please let us know.

Jim Goodson is our Chief of Test and Evaluation and leads our laboratory accreditation efforts through A2LA which is the American Association for Laboratory Accreditation. Jim has been instrumental in the development of NIMS STEP and other T&E efforts here at the facility. The NIMS SC laboratory was recently accredited under the 17025 standard and Jim is working on obtaining certification under the 17020 standard for inspections as well.

Outreach is key to our efforts this year. We have several program fact sheets, which are available for you to download today through the handouts tab. We use the fact sheets at conferences and other events to get the word out about our program. We’re also working currently on several journal articles, and plan to have one in the APCO journal soon. Certainly, if you have other ideas for outreach please let us know during the Q&A period.

[Slide 11]

This is a high level overview of the steps we go through for each product we evaluate. To start the process, vendors can register and apply online at nimsstep.org, and we review applications on a monthly basis with our Product Selection Committee. We coordinate the logistics with approved vendors and put together a plan specific to each product. Prior to each evaluation, our lead Subject Matter Expert for the event develops a scenario appropriate for the system. It typically takes one to two months of planning prior to an evaluation.

Our evaluation team for each product consists of five key people: three Subject Matter Experts, a Test Engineer, and an analyst. This team looks at a product over the course of a week. The test engineer runs the appropriate set of test procedures for the technical standards, and our SMEs work through a scenario with the product to get hands-on experience with it. At the conclusion of the evaluation they complete the NIMS STEP worksheet related to the NIMS concepts and principles portion, which ends up in our evaluation report. Our analyst leads the vendor coordination activities, is responsible for synthesizing all the data we collect, and works with our technical writer to develop the report.

After the evaluation is complete, we return the product to the vendor and remove it from our systems. Within about 45 days we have a draft report available for the vendor to review and they have an opportunity to provide the evaluation team feedback and a response letter before we finalize the report. As I mentioned, we also develop the three to four page summary and post the results online.

[Slide 12]

I wanted to go into a bit more detail about the OASIS technical standards that are part of our program and how we evaluate products against them. In Fiscal Year 2007 the CAP and EDXL-DE standards were added to the NIMS Recommended Standards List, and the HAVE and RM standards were recently added as well. Our testing is currently geared toward these interoperability standards.

These are all XML based standards, and are often referred to as part of the EDXL Suite. The Common Alerting Protocol (CAP) standard was released in October 2005 and provides a format for exchanging emergency alerts and public warnings. The next standard, EDXL-DE, is often referred to as the envelope. This standard helps route messages to different recipients. It can, for example, carry a CAP or other message type as a payload. The next two standards, HAVE and RM, were released in November of 2008, so these are fairly new. The Hospital Availability standard provides a format for hospital status information, what services they provide, and their resources. Similarly, RM is focused on message sets for exchanging resource information. Again, these standards are available free of charge at the OASIS website if you’re interested.
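To give a sense of what one of these XML messages looks like, here is a minimal, illustrative sketch in Python that assembles a bare-bones CAP 1.1 alert with its required elements. This is not part of the NIMS STEP tooling; the element names and namespace follow the OASIS CAP 1.1 specification, while the identifier and sender values are purely hypothetical.

```python
# Illustrative only: builds a bare-bones CAP 1.1 alert to show the message shape.
# Element names and namespace follow OASIS CAP 1.1; sample values are hypothetical.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.1"
ET.register_namespace("", CAP_NS)

def build_sample_cap_alert() -> bytes:
    alert = ET.Element(f"{{{CAP_NS}}}alert")
    # Required top-level elements of a CAP <alert>
    for tag, value in [
        ("identifier", "example-0001"),                  # hypothetical identifier
        ("sender", "[email protected]"),            # hypothetical sender
        ("sent", datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")),
        ("status", "Exercise"),
        ("msgType", "Alert"),
        ("scope", "Public"),
    ]:
        ET.SubElement(alert, f"{{{CAP_NS}}}{tag}").text = value

    # One <info> block with its required children
    info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
    for tag, value in [
        ("category", "Safety"),
        ("event", "Test Message"),
        ("urgency", "Expected"),
        ("severity", "Minor"),
        ("certainty", "Likely"),
    ]:
        ET.SubElement(info, f"{{{CAP_NS}}}{tag}").text = value

    return ET.tostring(alert, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    print(build_sample_cap_alert().decode("utf-8"))
```

A message like this could, in turn, be carried as a payload inside an EDXL-DE envelope for routing to different recipients.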

[Slide 13]

This table shows how we test products for their adherence to the technical standards. All of our procedures are structured basically the same way and products are rated in each of these areas by our test engineers. This is an example for the CAP tests. The first and second columns here are just how we refer to the Evaluation Case. The last column talks about the objective during that portion of the test.

I won’t go into too much detail on this. We generate messages from the system under test and run them through several automated tools to ensure the message is structured properly. That is accomplished in the first two evaluation cases. In the third case, our engineer manually checks everything in the XML against the OASIS standard, including business rules, making sure cardinality requirements are met and mandatory elements are present.

The first three test cases are directly related to the standard. The final test case (#4) is focused on the intent of interoperability, above and beyond the standard itself. Here we’re looking at whether the system under evaluation can successfully exchange messages with a disparate (or different) system, which must be from a different vendor. For example, we use DMIS and the OPEN backbone during the majority of our evaluations.

The EDXL-Distribution Element, HAVE, and RM test processes follow the same general format.
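As a rough illustration of the kind of automated structural check described above for the first two evaluation cases, the sketch below validates a captured message against the published CAP 1.1 schema. It assumes the lxml library and a locally saved copy of the OASIS CAP 1.1 XSD (named CAP-v1.1.xsd here); it is not the actual NIMS STEP test harness, and schema validation alone does not cover the manual business-rule review or the interoperability exchange test.

```python
# Illustrative sketch: validate a captured message against the CAP 1.1 XML schema.
# Assumes lxml is installed and the OASIS CAP 1.1 XSD has been saved locally.
from lxml import etree

def validate_cap_message(message_path: str, schema_path: str = "CAP-v1.1.xsd") -> bool:
    schema = etree.XMLSchema(etree.parse(schema_path))
    doc = etree.parse(message_path)
    if schema.validate(doc):
        print("Message is structurally valid against the CAP 1.1 schema.")
        return True
    # Report each structural problem the schema validator found
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
    return False

if __name__ == "__main__":
    validate_cap_message("sample_alert.xml")
```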

[Slide 14]

As I mentioned, we review each product for NIMS Concepts and Principles. This slide shows the main categories our Subject Matter Experts are looking at. SMEs have a worksheet that they use for each product which breaks each of these criteria down further. The detailed criteria are available in Appendix A of our new NIMS STEP Guide which is available on the main page of our website if you’re interested in more detail on these criteria.

So the main criteria are Emergency Support, Hazards, Resource Management…as you can see listed here. We’ve recently restructured these categories to fold Scalability into Communication and Information Management and have built out criteria on Common Operating Picture.

In the first criterion, we’re looking at what ESFs and Incident Command functions the system applies to and identifying any potential obstacles to implementing the product. In terms of the Hazards criterion, we’re looking at what hazards the system is applicable to (manmade or natural). If it applies to the product, we look at the Resource Management components to ensure they are consistent with the process outlined in NIMS and whether the product supports FEMA typed and non-typed resources.

In the Communication and Information Management criteria, we look at how the product shares information with other (different) systems. So, for example does the system under evaluation have a way to get the information into any other system or is it restricted to only other users of the same product. We’re also looking at the product’s use of plain language or clear text and any information security issues.

As I mentioned, we moved Scalability to this section and added some more detailed questions regarding Common Operating Picture. In terms of scalability, the focus is on how the product scales from a local incident to something at a state, regional, or even national level. This is an important concept of NIMS. Within this criterion the SMEs consider how the product would apply to multi-agency, multi-jurisdiction, and multi-discipline efforts. Under Command and Management we’re focused on whether the product supports the management characteristics of ICS and is consistent with the terminology.

This portion of the evaluation is conducted by three Subject Matter Experts. This includes two of our staff members and we have a new addition to the program this year where we include an external assessor as part of the evaluation team.

[Slide 15]

Here I’ve listed the minimum qualifications to volunteer as an assessor for our program. As you can see we have criteria for years of experience and completed training, among others. If you’re interested in volunteering, we’ve uploaded a copy of our Background and Interest form to the Handouts section today. Please download a copy, complete it, and send it to us via email at [email protected] (see third bullet). We reach out to volunteers when we have a product that matches someone’s particular background. The program reimburses participants for allowable travel, meal, and incidental expenses. After participating, you’ll receive documentation that you can use to support your professional development goals, such as Certified Emergency Manager. We’re currently working on getting some information about our assessor program onto our NIMS STEP website.

[Slide 16]

Most of our evaluations are conducted here at our facility in Somerset, Kentucky. For web-based products we can set up a remote evaluation. As I mentioned, we have a simulated emergency operations center and a Simulation Cell that we use for evaluating products.

[Slide 17]

I’d like to turn it over to Jim Goodson, our Chief of Test and Evaluation, for a few minutes to talk about the NIMS SC laboratory accreditation.

Jim Goodson: Thank you, Camille. In late 2007 and early 2008, DHS S&T asked the NIMS Support Center, through FEMA IMSI, to investigate accrediting the laboratory through a formalized process. The process selected was through, as Camille mentioned, the American Association for Laboratory Accreditation. Specifically, we looked at validating our procedures and processes in accordance with ISO standard 17025 for testing and calibration laboratories.

In the fall of 2008, that accreditation began in earnest with a visit by A2LA where they evaluated our capabilities. In fact, I have to commend Camille and Tim Gilmore, our lead engineer, because in that evaluation we came up with zero technical deficiencies on the processes and procedures they had developed. The findings focused more on our management system and quality system, on how we can better manage and validate our processes for the customer at large.

In December of 2008 we received our accreditation through A2LA, and at the same time A2LA identified an opportunity for us to look at being accredited under ISO 17020, which is for inspection bodies and focuses primarily on the non-technical portion of our evaluations, specifically those criteria that our subject matter experts evaluate.

We have, in fact, submitted an application package for that particular standard (ISO 17020), and that is in the evaluation process at A2LA this week. We should be getting a response back in the very near future on the status of that particular accreditation.

We’re looking at leveraging our capabilities. Our laboratory is a straightforward IT laboratory. We are the only IT laboratory so far accredited in the United States under this particular standard. We do anticipate others will be doing this in the future. We look forward to participating with the industry when that does occur.

Other programs are looking at this program as a model to use in the future, such as possibly the FEMA IPAWS program. If that does in fact happen, we’re looking forward to supporting them in kind. We do promote the use of the laboratory, in particular for IT evaluations, and as Camille said, it is prepared to accept products through the STEP program as soon as vendors apply for those particular opportunities.

Camille Osterloh: I’ll spend a few minutes pulling up our program website to demo it, as well as show how to access the results on RKB.

[http://www.nimsstep.org]

This is the main page. I’d like to call your attention to a couple of items. Here on the right-hand side, there’s a Documents Download section, where there are a couple of documents that I mentioned today—our NIMS STEP guide outlines the program and in Appendix A it includes the worksheet that our subject matter experts use during the evaluation.

This is a copy of the Fact Sheet. You actually have a more recent version in the hand-outs tab today, so please use that version. We have a copy of a case study that our standards team worked on that talks about how the Common Alerting Protocol and the EDXL-DE standards are implemented in the field. I think that will be of interest to you if you get a chance to look at it.

If you’re a vendor and you’re interested in participating in our program, you need to go to the vendor registration and just provide some basic information to set up a user account with us. Once that’s complete, you can log in to our site and select the "apply now" button, which brings up a basic intent-for-evaluation form. You provide some general information about the product and submit it, and as I mentioned, our selection committee reviews vendor products on a monthly basis.

The next part that I’d like to talk about is right here, we have a link on our site to the Responder Knowledge Base website. There’s a recent news release, actually, about the NIMS STEP report. I’ll go ahead and follow that link and then I’ll show you how to do a keyword search on it.

[http://www.rkb.org]

This is the news release. It talks about the different products that we’ve evaluated. At the end of it you can find links to all the NIMS STEP reports, and I’ll follow those.

Here it has actually brought up all the items that were hit on, including the fact sheet as well as some of the news releases. If you click on the Operational Assessment tab, it brings up specifically the summaries and the reports from the evaluation activities.

If you don’t follow that link and just go to the Responder Knowledge Base home page, you can do a search on the keyword "NIMS STEP", and it will basically bring you to the same location, where you can click the Operational Assessment tab. So you don’t have to follow that link if it’s not available.

[Slide 18]

And with that, our contact information is listed here. Please feel free to email us at [email protected]. This gets distributed to our team so please let us know if you have any further questions after the discussion today.

For more information about the program again, visit our website at www.nimsstep.org. If you’re a vendor interested in applying for the program you can now go online and register, then complete an application form to submit a product. Our Product Selection Committee meets on a monthly basis to review vendor applications. Our next review cycle will be in mid-April.

And with that, if anyone on our team has anything to add…otherwise, I’d turn it back to you Amy for any questions from the group.

Amy Sebring: Thank you very much. Now, to proceed to our Q&A.

[Audience Questions & Answers]

Question:
Mitch Saruwatari: Is there a second evaluation following vendor modification to mitigate any identified gaps found in the first evaluation?

Camille Osterloh: That’s a very good question, Mitch. We do have an opportunity for vendors to re-apply for the program, and there is a re-test request process they can go through. The results on the website are initially based on the product as we evaluate it at that time.

Question:
Chuck: Do RKB or OASIS require secure or vetted login IDs or procedures?

Camille Osterloh: That’s a good question, Chuck. To access the reports and summaries, no log-in or password is required, if that’s what you’re looking for. The OASIS standards are available to anyone. You can go to their website, and there are links to them from our NIMS STEP website. If you search on the Emergency Management Technical Committee, you can obtain copies of the standards free of charge; no user name or password is required to access those, only for the membership area.

Question:
Isabel McCurdy: Camille, do you accept vendors from Canada?

Camille Osterloh: Yes, we do. One of the criteria that our selection committee considers is whether the product has at least one deployment within the United States; that is the criterion for international applicants. We do encourage applications from Canada.

Moderator: I believe Canada has adopted CAP up there and they have a CAP profile as well, do they not?

Camille Osterloh: Yes.

Question:
Geoff Hoare: What are good introductory resources for understanding the extent of standardization that is occurring for disaster response inventory types of information? For those without an IT background, this field is confusing. Particularly when trying to educate response partners vetting different vendors with incompatible databases.

Camille Osterloh: I would encourage anyone to look at the NIMS document and the National Response Framework and look at some of the key concepts in there, and also to reference the OASIS standards.

Moderator: I think one of the links that we have on today’s page is a NIMS alert that came out recently about adopting the new technical standards. It is a little confusing, and if you’re a practitioner, you may need to get some IT help as far as the technical side of it.

Camille Osterloh: Right.

Question:
Dan Linehan: What are the limits placed upon your reviews for marketing or other commercial applications?

Jim Goodson: As far as marketing is concerned, we do have a frequently asked questions section, and one of the questions addresses the degree to which you can use our information for your marketing purposes. Please take a look at that; it will help clarify the degree to which you can.

The most important thing to remember is that coming through the evaluation process does not mean that FEMA or anyone else that’s a part of the evaluation program is stating that the product is NIMS compliant. We don’t do a compliance check in that respect. Take a look at the website and look at the frequently asked questions on that particular topic.

Camille Osterloh: Vendors can certainly use the evaluations and reports for their purposes. There are no restrictions on their distribution once they are finalized.

Question:
Mitch Saruwatari: Do you consider participation with the Unified Incident Command and Decision Support (UICDS) program in your evaluation?

Camille Osterloh: No, we don’t.

Question:
Roop Dave: What is validity of evaluation in this highly dynamic technology area?

Jim Goodson: The test that we do is actually a point-in-time test of the product as it’s delivered. We recognize that there is a lot of volatility in the market and people are always competing and improving their products. There is a clear understanding that there will be changes to the products after they come in to test.

The findings that we provide many times will instigate the vendor to make a change, sometimes immediately, sometimes in their standardized change process. If somebody comes back in for tests, we would want to take a look at that product for the changes since it last was tested here. Clearly, that’s an area of sensitivity for the product evaluation review team to take a look at and see what those changes are.

Question:
Ian Hay: How do you see integrating private sector preparedness with NIMS compliance? (to ensure we are speaking the same language during a disaster)?

Chad Foster: I’m not sure in terms of private sector preparedness specifically, but from a NIMS STEP perspective, many of the products we have looked at, or expect to look at, are used in both the public and private sectors. That has been true of the twenty or so products we’ve evaluated so far.

One of the requirements that state and local jurisdictions have through the compliance program is to encourage adoption of NIMS among the private sector and NGOs within their jurisdiction. It definitely is an important part of NIMS, from a high level.

Question:

Moderator: Can you talk a little bit about what is going on with the Common Alerting Protocol right now? In fact, the IPAWS profile is out for review and comment as we speak.

Jim Goodson: Presently the CAP IPAWS profile is in public review. It’s going to be in public review through the 2nd of May. Once that review is complete, there will be some findings that they’re going to have to digest and roll into a corporate level integration of those changes as they’re deemed appropriate. I think there will be an additional short review after that to look at the FEMA profile that is actually somewhat independent of the public review that’s ongoing.

All indications are that within the next six weeks or so, I would imagine that IPAWS profile itself will be well into the vetting process.

Question:
Amy Sebring:
For those not familiar with the acronym, IPAWS is Integrated Public Alert and Warning System. FEMA is working under some FCC rules on that to meet requirements. I mostly wanted to highlight the fact that CAP is also going to be the basis for the CMAS system, that’s the Commercial Mobile Alert System that we’re all eagerly awaiting. Have you looked at any testing compliance issues with CMAS at all for the future?

Jim Goodson: No, we haven’t. We actually don’t have the tasking to get into the details of that at this time. We do attend the federal working groups as observers in the peanut gallery, basically listening to the processes and the work going on, but right now we’re really interested people on the sidelines watching what’s going on.

Question:
Ian Hay: Are there Federal programs that offer private sector, CI and NGO's NIMS training at no cost?

Chad Foster: From what I understand, all of the independent study courses that EMI has on the website are available to the general public, including the private sector. Go to the EMI web page under training and search for the NIMS courses; those are all available. I know states coordinate some of the more advanced in-person courses on ICS and other NIMS topics.

I would encourage everybody to go to the NIMS Resource Center web page and see what they and EMI offer in terms of independent study courses. Otherwise, you may want to contact your state NIMS coordinator to find out what opportunities may be there within your state as a private sector entity. [http://www.fema.gov/emergency/nims/NIMSTrainingCourses.shtm]

Question:
Geoff Hoare: I may have missed it, but are there a number of scenarios that others can use that have been used in the virtual world exercise capability? Or is this still too new?

Chad Foster: There are exercise scenarios that FEMA has developed, as well as others I would say, that are available. We’d be happy to point you in the right direction in terms of getting your hands on the scenarios; if you would like, just send a note to the email address provided on the slides. There are scenarios developed by FEMA; I’m just not sure whether they’re available on the web somewhere.

Question:
Dan Linehan: What is the typical lead time from the time a vendor registers for a review to the time the review takes place? Is it a requirement or suggested that the vendor appear at Eastern Kentucky?

Camille Osterloh: No, there’s no requirement that the vendor necessarily come to the NIMS Support Center. The process typically takes a couple of months. Once the vendor gets their application in, our selection committee holds its regular meeting around the middle of each month, so it could take a couple of weeks to get into a review. Then it’s about a month or so of planning with the vendor. But we’re conducting evaluations throughout the year, so we will work with the vendor to schedule a time that works.

[Closing]

Amy Sebring: Time to wrap for today. Thank you very much Camille, Chad and Jim for an excellent job. We wish you success with your efforts in the future.

Please stand by just a moment while we make a couple of quick announcements...

Again, the recording should be available later today. If you are not on our mailing list and would like to get notices of future sessions and availability of transcripts and recordings, just go to our home page to Subscribe.

Don't forget to vote in our poll, and PLEASE take a moment to do the rating/review! I am going to load the rating/review form into Live Meeting so you can complete it on the spot. Note: We are asking you to rate the relevance of the information, and this will assist our future visitors.

Thanks to everyone for participating today. We stand adjourned.