Posts Tagged: igis
It's amazing how quickly 2017 flew by. The dominant theme of current events this year seemed to be that of change, and not always for the better. But as the news cycle spins faster and faster, we in IGIS continued to focus on the long wave of progress, building bridges between ANR's research and extension mission and the super-exciting developments in geospatial technologies and data. Our year was extremely busy. A few of the big highlights are below.
Strategic Planning. We started out 2017 winding up an internal strategic planning process we began in 2016. After numerous discussions, reflections, and iterations, the result was a draft Strategic Plan for our program based around five core goals.
- Provide GIS training and support services across the ANR continuum and program implementation cycle
- Expand ANR's capacity for drone research and applications
- Strengthen collaborations within ANR, UC, and beyond
- Be a bridge for ANR to access cutting-edge geospatial data, tools, science, and research
- Sustain and develop ANR's flux tower network
Our Strategic Planning will continue in 2018 so we can incorporate the key recommendations from our Five-Year Program Review.
Five-Year Program Review. 2017 marked our Program's fifth anniversary (talk about time flying!), so it was time for our Five-Year Review. This was a welcome opportunity, as a lot has changed since IGIS was established in 2012, and we needed to assess what we've achieved and where we need to focus moving forward. The Review started with a presentation by Maggi Kelly to Program Council in January 2017, and concluded eleven months later in December 2017 with a presentation by Committee Chair Michael Cahn on the Review's key findings and recommendations. In between were numerous information requests from the Review Committee, an extremely interesting Ripple Effect Mapping stakeholder exercise in June, and a lot of behind-the-scenes work by the Review Committee including meetings, surveys, phone interviews, and writing. We are enormously grateful to all the members of the Review Committee, Michael Cahn for chairing the process, Jennifer Caron-Sale for coordinating everything, and all of the stakeholders who provided input through surveys, meetings, and interviews. We are looking forward to hearing the final recommendations from Program Council and Vice President Humiston early next year, and have already started to implement many of the draft recommendations. We want to use this information to make IGIS more effective and useful in the future.
Mojave Desert, which Sean and Andy Lyons mapped over a week for a project studying desert tortoise habitat. To help carry the load, all IGIS staff obtained their FAA Part 107 Remote Pilot licenses in March, and we now have basic drone equipment at all staff locations. This new capacity was put to good use shortly after the wildfires in northern California, when Shane Feirer flew a series of missions for UCCE Forest Advisor Yana Valachovic and fire ecologists seeking to understand fire dynamics and impacts. We also partnered with Drone Scholar on the novel #Fly4Fall project, and helped collect images of vegetation changing color worldwide.
DroneCamp was designed to give participants exposure to the full range of skills needed to collect data with drones safely, legally, and effectively. 36 participants from all over California and as far away as Hawaii came to Davis for three intensive days at the end of July. The curriculum covered a wide range of topics, from mission planning to flight operations to visualizing the processed data. It was a huge amount of work but very successful, and we've enjoyed staying in touch with everyone through the new California Drone Mapping email list we launched in August. Watch this space for an announcement about DroneCamp 2018!
Workshops. Our workshop calendar filled up quickly. Aside from DroneCamp, we held 7 workshops on drones, 4 workshops on GIS and mobile data collection, 2 remote sensing workshops, and 1 workshop on spatial analysis with R. These were held all over the state in collaboration with more hosts than ever before, including RECs, UCCE offices, 3 UC campuses, and for the first time ever a private aerial imaging company. In-person workshops are a lot of effort, but we enjoy doing them, and many spin-off projects have come out of workshop conversations. Our workshops are complemented by an increasing number of resources on our website, including Tech Notes on important workflows and a growing number of videos on our YouTube channel. In 2018 look for a new training needs assessment that will help us continue to meet the professional development needs of the Division.
Looking ahead to 2018. In 2018, we are looking forward to incorporating the recommendations from our Program Review and completing our Strategic Plan. A big theme for 2018 will be finding more and better ways to connect with and stay in touch with all parts of ANR, and expand the reach of our small staff. We already have a number of new projects in the pipeline that will keep everyone busy, in addition to the usual portfolio of ongoing GIS support, workshops and drone services. Our research work will continue tackling some of the big technical bottlenecks in working with drone data, including data management, image processing, and extracting more juice from the high resolution 3D data. We're all going to the Statewide Conference in April and will be taking part in several sessions - see you in Ontario!
Happy New Year everyone from us at IGIS and ANR.
Recently I was fortunate to work with the IGIS team in Santa Rosa and Sonoma to explore why so many homes and buildings were lost in the October Tubbs and Nuns Fires. With IGIS's Shane Feirer, we collected drone-based video to record how the fires burned through the vegetation near and around the lost structures.
We observed several sites where there was little fire activity in the forests or woodlands, yet the homes burned. This type of video helps us document how devastating a wind-driven ember fire can be, and the important lessons we can learn to be better prepared for wildfire.
From this experience I came away with a painful reminder that we all need to do a better job of focusing on fuels near our homes (e.g. combustible wood mulches used in landscaping, lawn furniture, leaf accumulations, dry landscape plants, etc.), especially in the 5 feet immediately adjacent to our homes. While the Tubbs Fire originated in a grassy area in Calistoga, the 40-70 mph winds picked up embers from the burning vegetation and created spot fires ahead of the flaming front. In a short time these embers were blasted into homes via attic or soffit vents (critical to let moisture out of a building), or they ignited combustible materials close to buildings; these types of exposures are the primary way the Tubbs Fire started to consume homes. Eventually the Tubbs Fire moved into the more densely populated areas of the Fountain Grove subdivision in Santa Rosa, and with each new home that ignited, a new source of embers was created. The embers that came from the burning buildings included 2 x 4s, chunks of wood the size of a frisbee, and other materials. These materials were blasted over Highway 101 onto homes and businesses in the urban center of Santa Rosa, a place most thought could not be impacted by wildfire. The winds persisted until mid-morning on October 9th, providing considerable time for an ember to find a weakness in a home. All of us hope we never have a fire like this again, but as history shows us, California's most damaging fires typically occur in September and October and are often wind-driven.
For many years UC has worked to educate homeowners about fire preparedness in the Wildland Urban Interface (WUI). These fires have resulted in the largest number of structure losses to date in California, and we all need tools to better understand how to learn from these experiences. I greatly appreciate IGIS's willingness to help me collect some critical data in a time-sensitive manner.
The National Agriculture Imagery Program (NAIP) is a USDA program that has collected aerial imagery once a year during the growing season for the entire continental US since it was established in 2003. The images are taken from airplanes and then stitched together to create a four-band (blue, green, red, & near-infrared) digital orthomosaic at a one-meter resolution.
This dataset has become the foundation for many important analyses across the country, including crop distribution maps and agricultural forecasts, and has served as the basemap for countless maps. For us in IGIS, NAIP imagery is one of our go-to datasets for map backdrops or raw data for things like classification or georeferencing drone images. ESRI makes NAIP imagery easy to use by distributing it through their ArcGIS online platform.
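Because NAIP includes a near-infrared band alongside the visible bands, one of the most common first analyses is a vegetation index such as NDVI. The sketch below uses a tiny synthetic 2x2 array in place of a real NAIP tile; with a library like rasterio you would instead read the red and NIR bands from the GeoTIFF (band order varies by product vintage, so check your tile's metadata).

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red)

# Synthetic stand-in for a NAIP tile's red and near-infrared bands.
# Top row simulates healthy vegetation (NIR reflectance > red);
# bottom row simulates bare soil or pavement (red > NIR).
red = np.array([[50, 60], [200, 210]], dtype=np.uint8)
nir = np.array([[150, 160], [90, 80]], dtype=np.uint8)

print(ndvi(red, nir))  # positive values in the top row, negative below
```

NDVI ranges from -1 to +1, with higher values indicating denser, healthier vegetation, which is why the NIR band makes NAIP so useful for crop and land-cover classification.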
Aside from its national scope and high quality, a big reason why NAIP imagery has been so widely used is that it has always been public domain, available at no cost through the USDA Geospatial Data Gateway. But this may soon be changing. As recently reported on GIS Lounge, the USDA Farm Services Agency (FSA) is considering making NAIP a Commercial-Off-The-Shelf (COTS) product subject to a license. This means everyone, including government agencies and researchers, would have to start paying for the data and would face limitations on how they could use it.
According to a recent presentation by John Mootz, Imagery Program Manager at the FSA Aerial Photography Field Office, the switch to a license model may become necessary because the current funding model isn't working. NAIP is funded under an innovative arrangement where costs are shared by state and federal agencies (not unlike Cooperative Extension). However, some states haven't been paying their bills, leaving a $3.1 million shortfall over the past few years. Aside from being financially unsustainable, late or missing payments cause delays in scheduling flights which can result in images being collected past the peak agriculture growth season. The time it takes to process the data, which has already stretched from 2 years to 3 years, is also affected.
What does this mean for ag? For cash-strapped agencies, researchers, and members of the public, losing access to a valuable dataset is never a good thing. But a lot of questions remain unanswered. Collecting geospatial data is expensive, particularly for the entire USA, and as we've already seen with NAIP, funding shortfalls can affect the quality and timeliness of the data. A lot of other data collection is funded through license models, which can work well if they are affordable and the licenses are tailored to the different needs of users. How will the state agencies that have been funding NAIP respond if FSA switches to a license model? Could other technologies fill the gap, such as the many commercial high-resolution satellites now in orbit? Can political will be mobilized to convince USDA that collecting data that serves a public good is a good role for government, or has that cow left the barn?
FSA needs to make a decision by May 1, 2018 whether or not to change the distribution model for the NAIP 2019 data (to be collected in summer 2019). They are currently collecting impact statements "to allow FSA leadership a clear understanding before they make the final decision". MapBox, the popular mapping engine, for example has come out strongly in favor of keeping NAIP open. If this affects you, please leave a comment below and also consider letting the FSA know how your work would be impacted, for better or worse, if NAIP data become licensed.
GO FAIR is an initiative to promote and support data stewardship that allows data to be Findable, Accessible, Interoperable, and Reusable. I was pleased to attend the launch of the first North American FAIR network last week at the San Diego Supercomputer Center at UC San Diego.
Coping with a Data Tsunami
To say that we live in a data-rich world is an understatement. We live in a data-drenched world (a fact I'm constantly reminded of by the 'hard drive full' warnings that pop up on my computer on a weekly basis). Thanks to simultaneous, order-of-magnitude advances in our ability to produce, disseminate, and store all manner of data, people working in fields from economics to physics to agriculture are struggling to benefit from, rather than be paralyzed by, the volume and diversity of data we produce. And this is by no means a problem only affecting academics, as more and more individuals, private companies, and organizations are collecting and working with large volumes of data, from personal health sensors to drones.
Adding to the challenge, there are often major barriers to getting data to talk to each other. They may be stored in different formats, use different scales or units of analysis, or be under different restrictions. If you've ever carried personal health data from one doctor's office to another by hand, you know what I mean.
FAIR Data Stewardship Principles
These are not new problems, but they have taken on an increased sense of urgency as the challenge gets worse and the demand for integrated analyses of complex problems grows. GO (Global Open) FAIR is a European-based initiative that has two faces: i) a set of principles for data stewardship, and ii) a growing network of institutions and programs that are taking tangible steps toward a world in which data are Findable, Accessible, Interoperable, and Reusable. FAIR certainly doesn't mean that collected data have to be free or open access, but data stewards should have a way to share information about the existence of data, and a means for access when appropriate.
The FAIR principles mirror what open science advocates have argued for many years. As a program, GO FAIR has gained more traction than many of its predecessors. Following endorsements from the European Commission and other international bodies, the EU has already committed €2 billion to the first phase of implementation. Starting in 2018, the major EU funding agencies will require applicants to submit data stewardship plans that align with the FAIR principles. The initiative is also investing a lot in training people to use metadata standards and tools, many of which already exist.
How is This Relevant for ANR?
ANR academics are impacted by the data tsunami in at least two ways (neither for the better). Like all practicing scientists, we have to deal with the usual challenges of managing large volumes of data, the frustrations of not being able to find or use data that others have collected, and the burden of all the gymnastics one must do to combine data from different sources into a robust, repeatable analysis. On top of that, as public servants whose work is funded by taxpayers, we have an additional moral and legal responsibility to be good stewards of all data collected for our public mission, which means ensuring the data we collect remain discoverable and accessible for other studies. Similarly, our extension mission also requires us to help California growers and land stewards get the most value from the data they collect, with tools that address their requirements for privacy and security.
While this may all seem like a lot to think about and additional work, the rewards are pretty exciting as the following video shows:
How Close are Your Data to Being FAIR?
For many of us, putting the principles of FAIR data stewardship into practice will require a step or two we're not accustomed to, such as i) generating metadata in a format that can be read by both people and machines, and ii) storing our data (and metadata) for the long term. The table below, from a recent Nature article, breaks down the gold standard a little further.
F1. (meta)data are assigned a globally unique and persistent identifier
F2. data are described with rich metadata (defined by R1 below)
F3. metadata clearly and explicitly include the identifier of the data it describes
F4. (meta)data are registered or indexed in a searchable resource
A1. (meta)data are retrievable by their identifier using a standardized communications protocol
A1.1 the protocol is open, free, and universally implementable
A1.2 the protocol allows for an authentication and authorization procedure, where necessary
A2. metadata are accessible, even when the data are no longer available
I1. (meta)data use a formal, accessible, shared, and broadly applicable language for knowledge representation
I2. (meta)data use vocabularies that follow FAIR principles
I3. (meta)data include qualified references to other (meta)data
R1. meta(data) are richly described with a plurality of accurate and relevant attributes
R1.1. (meta)data are released with a clear and accessible data usage license
R1.2. (meta)data are associated with detailed provenance
R1.3. (meta)data meet domain-relevant community standards
Wilkinson, Mark D., et al. "The FAIR Guiding Principles for scientific data management and stewardship." Scientific Data 3 (2016): 160018.
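To make a few of these principles concrete, a metadata record that both people and machines can read might look like the sketch below. The field names loosely follow a DataCite-style convention, and the identifier, title, and values are hypothetical placeholders rather than a real registration.

```python
import json

# A minimal machine-readable metadata record, loosely modeled on
# DataCite-style fields. All values here are hypothetical placeholders;
# a real record would carry a registered DOI and your repository's schema.
record = {
    "identifier": {"type": "DOI", "value": "10.9999/example.drone.2017"},  # F1: persistent identifier
    "title": "Orthomosaic of study site, summer 2017",                     # F2: rich, descriptive metadata
    "creators": ["IGIS, UC ANR"],
    "dates": {"collected": "2017-07-15", "published": "2017-12-01"},
    "license": "CC-BY-4.0",                                                # R1.1: clear usage license
    "provenance": {                                                        # R1.2: detailed provenance
        "platform": "quadcopter UAV",
        "processing": "structure-from-motion photogrammetry",
    },
    "relatedIdentifiers": [],                                              # I3: qualified links to other (meta)data
}

# Serializing to JSON makes the record machine readable while staying
# legible to a human reader.
print(json.dumps(record, indent=2))
```

Depositing a record like this in an indexed repository would address F4 (registered in a searchable resource), and keeping the metadata online even if the imagery itself is retired addresses A2.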
As a research technology unit, I think we're doing fairly well in terms of keeping our data organized and accessible for the long term. However, after looking at our data management practices through the FAIR lens, I now see that our metadata misses some important characteristics, that many of the quality metrics aren't machine readable, and that we need to learn more about metadata repositories and discoverability, particularly for our drone data. These are challenges common to many new sources of geospatial data, and we look forward to engaging with the new arm of the GO FAIR network to develop solutions.
Under the Fly4Fall campaign, amateur drone hobbyists across the globe are invited to take aerial 360 photos with their drone and contribute them to a collection of fall landscapes that will grow over time.
Never taken an aerial 360 photo before? Me neither, but fortunately it recently got a whole lot easier with a free iOS app called Hangar 360. The Hangar app flies your DJI drone for you, climbing to the height you program and then taking about 25 photos in a circle at three different angles to the horizon. The whole thing takes about 2 minutes, and you can collect multiple panos per flight. You then land the drone (but don't turn it off just yet!), transfer the photos from the drone to your phone over WiFi, and then upload the photos to Hangar. Hangar stitches the photos for you in the cloud (also free!), and sends you a link. The results are stunning! See the panoramic photo below of Kearney REC made by IGIS's Robert Johnson earlier this week.
Inspired by citizen science initiatives like the Christmas Bird Count and Project BudBurst, where large numbers of naturalists record observations in a coordinated way, Fly4Fall is part non-professional science project, part art, part community building, and a whole lot of fun. Crutsinger discussed some of the potential science angles in a recent LinkedIn post.
Full instructions can be found at Fly4Fall.com. Currently, the Hangar app only works on iOS, unfortunately, and only with DJI drones (but the list includes most of the popular ones). Android enthusiasts can check out Litchi, which includes similar functionality but costs $25 and you have to process the images on your own (look for tutorials online).
Of course, like any drone flight, you have to follow the rules: only fly in permitted areas, don't fly directly over people, and be safe!
We look forward to seeing the Fly4Fall panoramas coming in. Feel free to use the comment box below to share your experiences and thoughts!