Introduction
The 2020 COVID-19 pandemic changed the way educators work across the board. Like many groups, we had to quickly retool our geospatial workshops from a largely in-person format to online delivery. We soon learned that simply replacing the overhead projector with a webcam and Zoom was not going to cut it. It's been an interesting journey, with lots of lessons learned, some surprising benefits, and a number of ongoing challenges and frustrations.
In this two-part post, I reflect upon lessons and aspirations for a very specific type of online training: live workshops focused on showing people how to use GIS, photogrammetry, and statistical software. These reflections stem from several workshops I was part of in 2020, including our first-ever virtual DroneCamp, a new workshop focused on using climate data for community resilience and adaptation planning, a workshop on working with climate data in R, and two 12-hour workshops on geospatial analysis with R that I taught in collaboration with SCGIS and BayGeo. Some of these lessons are specific to the new online technology, while others go back to age-old best practices for effective teaching.
In Part I (this post), I review some of the lessons, techniques, and conversations about training we had in 2020. In Part II, I discuss some aspirations and directions for 2021. But first let's review the particular requirements and challenges for this specific type of online training.
The Curious Circumstances of Live Online Software Workshops
Several characteristics of live virtual workshops frame the challenges we face:
- One-time engagement. Unlike a course that meets repeatedly over many weeks, we generally have a single point of contact with workshop participants. Even when a series of workshops is thematically connected, the workshops are almost always offered as standalone modules, so we can hope, but not expect, that people have attended previous ones.
- Diverse backgrounds. Also unlike a traditional classroom, where you can mostly assume students meet minimum prerequisites, workshop participants generally have a wider range of backgrounds, both in their familiarity with the content (e.g., GIS concepts) and with the specific software tools being used. Particularly in 'Intro' workshops (which we do a lot of), even basic computer skills like unzipping a zip file cannot be taken for granted.
- Hands-on emphasis. The primary goals of our workshops are to impart practical skills, so the format requires a strong hands-on software component. There's a huge demand for hands-on software training, and many, many benefits for the learner, but incorporating hands-on practice also adds a lot of moving parts.
- People connecting from home. Even before COVID, most participants in virtual workshops were joining from home. This generally means people are connecting from a laptop of unknown specifications, using a slow to moderate internet connection. Computer hardware, workspace distractions, and the need to multi-task are generally all over the map when people are connecting from home, and there's not a lot we can do to influence these factors.
- Voluntary, self-motivated participants. The people who attend our workshops are there because they want to be. In many cases they're actually paying something out of pocket. Self-motivated students are delightful to work with, because they're already curious and see value in the content. But it also raises expectations. Unlike a traditional classroom, where students are primarily accountable to the instructor, in adult learning we are accountable to our students. If our materials are not clear or coherent, we definitely hear about it.
Workshop Goals and Design
What is the value of live online instruction anyway?
Unlike a traditional classroom, online instruction forces us to rethink our value proposition, because all of a sudden there's instant competition on par with what we're delivering - the vast amount of high-quality pre-recorded content out there. Groups like ours have to ask ourselves why someone would sign up for our live online workshop when hundreds of hours of video and online lessons are available on demand, often for free or for a very modest subscription. Should we just save ourselves the trouble and point them to existing resources?
Online instruction forces us to rethink our value proposition, because all of a sudden there's instant competition on par with what we're delivering - the vast amount of high-quality pre-recorded content out there.
Having taught numerous workshops both in-person and online, as well as recorded many hours of video tutorials on computational modeling, I see a number of niches for live instruction that recorded on-demand content will never be able to match.
1. Helping beginners get up to speed. Beginners often need to hear concepts presented in different ways, with lots of examples, and the opportunity to ask questions about fundamental terms and concepts. For this audience, live presentations from an experienced instructor, coupled with a generous number of check-ins and opportunities for Q&A, are an efficient and effective way to move up the learning curve.
2. Reaching verbal learners. Many people are verbal learners, and cement their understanding by echoing and exploring concepts they've just encountered. Live instruction, when done well, provides opportunities for feedback on both the breadth and depth of their understanding from peers and the instructor. This is particularly helpful when people are digesting new ways of thinking or exploring how new tools might be useful in their world. It is much harder to achieve with pre-recorded content. A well-designed recording can introduce important concepts before the technical material and include glossaries and the like, which helps, but there is still no opportunity for Q&A, and some learners may get left behind.
3. Exploring applications. Another learning goal that benefits greatly from live instruction is how to apply a technology to a particular domain. "How GIS can help you manage your ranch" would be an example of this type of learning goal. A workshop mixing basic concepts and tools, case studies, open-ended discussions, and perhaps group feedback on participants' action plans is a good fit for this type of goal.
4. Planning to engage. It's worth noting that while live online instruction can facilitate discussion, discussion is by no means guaranteed unless the instructor builds it in. This means doing things like capping the student-to-instructor ratio, building in pause points to address questions and have discussions, and structuring exercises to encourage interaction. One of my best virtual workshops this year was one in which the exercises were done in small groups in Zoom breakout rooms. Each breakout room had an instructor in the wings, but the students basically talked through the exercise and helped each other work through the material. Contrast that with a more traditional approach, where the instructor gives a 45-minute intro presentation and then prompts the students to individually work through a long exercise.
5. When live instruction loses out to pre-recorded content. The higher up you go on the learning curve, the more live instruction starts to lose its edge. Intermediate and advanced users know what to ask Google or YouTube, and are knowledgeable enough to follow a guide.
Niche topics are another domain where the scales probably tip in favor of pre-recorded content. I've already decided that the tutorials for an R package I'm developing to import climate data into R (an important but decidedly niche task) will be primarily on-demand recorded videos. Those will be coupled, however, with an extension of our online office hours, which I've found to be a much better way to help intermediate and advanced users work through challenges that can't be resolved by a Google search.
The higher up you go on the learning curve, the more live instruction starts to lose its edge.
What to do about the registration fee?
The question of registration fees for training always makes for an interesting conversation. When done thoughtfully, conversations about how much to charge delve into the host organization's mission, financial realities in the short and long term, organizational branding and outreach, and the needs of and impact on related activities. More often than not, these discussions illuminate both tensions and synergies between different goals and perspectives, and the final result reflects some combination of necessity and compromise.
In my experience, there are a half dozen principles, or logics, people commonly invoke when thinking about how to price a workshop:
1) Make it free
Making workshops free has great appeal at many levels. It aligns with the values and mission of public universities to provide training and research for the masses. Since most of us are already paid by taxpayers directly or indirectly, training is something we can do to give back and make all the effort we put into research and innovation pay off. Technology is also liberating, so making it available to everyone is a way to reach those parts of society that need it the most.
Despite all of the above, we've come to conclude that charging nothing is rarely a good idea. In both in-person and online training, we've seen the worst attendance rates at workshops with no registration fee. Not surprising really. It's easy to sign up for events that bear no cost, no commitment, and no risk. We all do it. But that makes it equally easy to blow off when something else comes up at the last minute. I put a lot of effort into preparing for workshops, so even if a workshop goes very well, if only 5 out of 30 registrants show up I feel like my time has not been well spent.
Charging even a nominal amount like $5 will not only increase the attendance rate, but also increase the odds that the right people show up. That small fee will get people to read the workshop details a little more closely and decide if it's really worth their time. They're also more likely to do any preparation tasks you send them, such as installing software and downloading data.
Charging even a nominal amount like $5 will not only increase the attendance rate, but also increase the odds that the right people show up.
Of course there are tradeoffs and limits to charging even a nominal fee. For a small unit within a public university like ours, just the mechanics of collecting money come with a lot of rules to follow. We're fortunate that we can outsource this task to another unit (thank you PSU!). If we had to do it ourselves, collecting small fees might not even be worth the effort. Also, when we teach on one of our own campuses, we can't really charge students to attend (this is also where our attendance rates are most unpredictable).
We also never want cost to be a barrier for people who need the training but lack the resources. Our go-to model has always been to provide a fee waiver or scholarship application whenever there's a fee involved. This isn't hard to conceptualize or incorporate, but it's more work for someone to plan and execute (and probably not what the instructor wants to focus on).
2) Meet financial necessities
Sometimes financial necessity is the major driver behind revenue goals. If an organization's revenue model depends on income from training, those targets will certainly be front and center. Salary support is likely to be a major factor in these cases. Sometimes the training program itself may require a dedicated instructor or coordinator, although this seems less common in the geospatial world than other fields. Another common case is when some staff are on soft money and there are shortfalls or gaps in funding. There could also be another project within the organization, perhaps a pilot project, that is under-funded and needs financial support to keep going. For public institutions like ours, all of this is framed by the long-term trends in state funding, which lately seems to fluctuate between flat in good years and downward in bad years.
3) Recoup implementation costs
Another common goalpost for setting revenue goals is how much it costs to run the workshop. For in-person events, this typically includes the cost of renting a venue, snacks, and transportation for the instructors. On a couple of occasions we've also been asked to pay for a computer lab administrator's time to install software. The costs for virtual workshops are generally much lower, but could include the cost of your virtual platform. As of this writing, a 500-person Zoom webinar license costs $1400 / year. Other online platforms will be in the same ballpark, possibly more if they're tailored to virtual instruction. If you choose to hire someone to support registration, marketing, or back-end support, those will be additional costs. An expense we encountered for the first time in 2020 was the need to hire American Sign Language interpreters for one of our virtual workshops, at the rate of $140/hour.
Recouping implementation costs has always been the 'floor' of the revenue goals for our workshops, because we don't have a budget line for training operations. In many cases we stop at cost recovery because we or our collaborators want the training to be as accessible as possible. In the same spirit, we keep costs down to a bare minimum by collaborating with partners who can contribute meeting space, computer facilities, support staff, or even accommodation.
4) What will the market bear?
Another approach to thinking about price is to research the going rate for similar types of training. This isn't always practical, because what we offer is typically not available elsewhere (a good thing; otherwise we probably shouldn't be doing it), and the "market" we're targeting may be very different from what other programs are serving. Nonetheless, there are many groups around the country and the world who provide geospatial training of various forms, and it can be informative to do a little price research to get a benchmark.
5) Prioritize people, not dollars
Most groups that offer geospatial training have other core activities, such as funded research, consulting services, or software development. In many cases, the real value in providing training is not a revenue stream but the relationships it develops and the market share it builds. Reaching people may therefore be a more important goal than maximizing revenue. And not just any people, it has to be the right people. This will probably favor keeping the registration fee relatively modest, but you might want to bump it up if you're aiming for a smaller niche audience.
Research-centric groups like ours fall into this general category. Providing training is part of our mission but not the core. Our bread-and-butter work is technology innovation and bringing geospatial expertise to research and extension collaborations. Hence the goals for all our training initiatives include a bit of marketing, expanding and strengthening our networks, and gathering ideas for new research and extension projects. This is our core business, and training has definitely helped develop it. Due in part to our exposure through workshops, we've become a hub for both knowledge and people in our core areas of expertise, and we have had several funded projects that started out as a side conversation at a workshop (something I wish Zoom were better equipped for!).
6) Can we get donations or sponsorship?
Aside from registration fees, donations and sponsorship may be able to support workshop costs. This requires further strategic thinking, because donors and sponsors will only be interested in supporting communities and events that advance their interests. In the geospatial world, this will likely include marketing opportunities for a sponsor's goods and services, so the audiences have to match. Another goal for many private and non-profit organizations is expanding technology to underserved audiences. These are elements that can't be tacked on to an agenda two weeks before the workshop is held. They require thoughtful planning and explicit efforts from the very beginning in how the workshop is framed, designed, and marketed.
We've dabbled in soliciting donations and sponsorships over the years for DroneCamp, with modest results. Corporate sponsors haven't been a great fit, because they're looking for exposure to large numbers of potential customers, which our in-person workshops simply don't deliver. That might change, however, with more online programming - over 330 people attended our first virtual DroneCamp in 2020. Raising money for scholarships has been an easier sell, because the amounts are lower and what we provide is really valuable to several resource-poor and underserved audiences, starting with students. All of this requires a lot of strategizing and legwork, which ultimately may be the biggest bottleneck in raising this type of funding.
Techniques for Online Teaching
More Important than Ever: Engagement and Active Learning
Classroom teachers quickly learn that the keys to keeping students engaged and focused are active learning and a bit of performance. The same is even more true for online instruction. In addition to all the usual distractions and challenges of presenting complex material, online instructors have to contend with dozens of other windows to click on, temptation from other devices, and Zoom fatigue. A long-winded presentation in a classroom may result in some glazed eyes; that same dry presentation given online may result in losing your students and never getting them fully back.
Being an engaging presenter starts with your screen persona. The usual characteristics of a good presenter translate well - personable, spontaneous, good use of intonation, a pace that is neither too slow nor too fast, a little comedy here and there, etc. I often don't feel naturally engaging, particularly when I've been scrambling to develop new content, but I've also learned that much of this is performative, can be practiced, and can to some extent be scripted.
However, the biggest impact on the level of engagement, in my experience, is not the instructor's personality but the workshop structure and flow. Gone is my traditional 45-minute opening presentation. My general rule of thumb now is to get people doing something hands-on within 15 minutes. Even if it's just running sample code they don't really understand yet, I aim to get them quickly interacting with the software and keep them busy after that (see the sketch below for the kind of opening task I mean). I can guarantee this is more interesting than listening to my voice, and it also motivates them to watch the next batch of slides or demo.
The biggest impact on the level of engagement in my experience is not the instructor's personality but the workshop structure and flow.
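To make this concrete, here is a minimal sketch of the kind of "just run it" opening task I have in mind for an R workshop. It's a hypothetical example (not taken from the actual workshop materials) that uses only the sample data bundled with the sf package, so there is nothing extra to download:

```r
# A first "just run it" task: read a sample dataset that ships with the sf
# package and make a quick map. Participants don't need to understand every
# line yet - the goal is simply to get them interacting with R right away.
library(sf)

# sf includes a North Carolina counties shapefile, so no downloads are needed
nc <- st_read(system.file("shape/nc.shp", package = "sf"))

# Peek at the attribute table, then plot one column as a quick choropleth map
head(nc)
plot(nc["BIR74"], main = "NC births, 1974 (sample data)")
```

Even a trivial map like this gives people an early win, and it surfaces setup problems in the first few minutes rather than an hour in.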
There are many other tried-and-true techniques to make online instruction more engaging. Polls, whiteboard activities, panel discussions with audience questions, etc., are all fairly easy to integrate in platforms like Zoom. I'm a big fan of Google Docs and Etherpad (an open-source alternative to Google Docs) for more collaborative work that gives everyone a chance to contribute. I have yet to try collaborative diagramming with tools like Miro or Google Jamboard, but these can help collect and organize ideas or topics from a group.
Teach According to How People Learn
The highlight of many software workshops is the hands-on exercises, where a guided workflow culminates in a satisfying output almost by magic. Students love it, but we all know that no matter how satisfying, completing an exercise does not by itself represent successful learning. Getting students to the point where they can apply the concepts and tools to solve different problems is a far higher bar to meet, and it requires us to teach according to how people actually learn.
There are many aspects of aligning teaching with learning, but some of the most pertinent ones for software instruction are i) understanding your students' backgrounds and aspirations, ii) constantly flipping back and forth between higher-level concepts and the nitty-gritty of an application, iii) dishing out material in mentally digestible chunks, and iv) getting continuous feedback in the form of assessments. This list is just scratching the surface; tomes have been written about effective instruction. A good overview of the subject that I have bookmarked is a recent webinar, What Every Data Scientist Should Know About Education, by Greg Wilson, co-founder of Software Carpentry and now at RStudio.
Technical Challenges
Software setup: An old headache that just got worse
One of the biggest headaches in any software-centric workshop is getting everything set up on participants' computers. Geospatial workshops in particular have several requirements: software has to be installed, licenses obtained and activated, and workshop materials, including data, distributed. One of the biggest benefits of conducting workshops in computer labs is that this setup process can be dealt with beforehand; in a virtual setting, that is no longer an option.
This past summer I was a helper in a 3-hour live online workshop where literally the first hour was spent on setup. The 'normal' way of installing a critical extension had recently broken due to an update (care to guess which software company?). More than half of the participants couldn't proceed, so the instructor had to walk through an alternative setup process. Spending a third of the workshop on setup is a bit extreme, but even in a 'good' workshop there is always a handful of people who fall behind from the get-go because they're working through setup problems. Instructors have a tough choice to make when this happens: take time to help the few people who can't get started, to the detriment of everyone else, or keep going and hope they catch up.
Our group has learned that setup issues are best dealt with proactively, before the workshop starts. Starting with the workshop description, I make it clear that i) setup is a required task, ii) it is the responsibility of each participant, and iii) it needs to be complete before the workshop starts. This is coupled with very detailed setup instructions shared at least a week in advance, and pre-workshop tech support for anyone who needs help.
Our group has learned that setup issues are best dealt with proactively, before the workshop starts.
DroneCamp 2020 was one of our biggest setup challenges ever. There were six hands-on workshops involving five large photogrammetry and GIS programs (Pix4D, Metashape, OpenDroneMap, ArcGIS Pro & QGIS). Data downloads for some of the workshops ran to several gigabytes. Anticipating challenges, we first held full practice workshops with each instructor 2-3 weeks in advance to test everything, including setup. Those lessons were incorporated into detailed installation instructions, which were posted on the website over a week in advance. We also held two days of drop-in office hours the week before for anyone having trouble.
To provide some structure and incentive for completing setup, we also assigned a modest "Step Zero" exercise for each program, in which participants showed that they had successfully installed the software and loaded some data. Although we stopped short of collecting these before the course began, we made it quite clear in our messaging that if people didn't have their computers set up in time for the workshop, we weren't going to be able to help them, although they could still learn something by watching the slides and demo. This combination of clear communication about expectations, detailed instructions, pre-workshop tech support for those who need it, and plenty of time to get things set up generally paid off. None of the workshops got bogged down with setup issues, and instructors were able to jump right into the main content.
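DroneCamp's Step Zero exercises were software-specific (open the program, load a sample project), but for an R-based workshop the same idea can be boiled down to a short script that participants run and report back on. Here is a minimal sketch under assumed requirements - the package list, data folder, and minimum R version are placeholders, not the actual requirements from any of our workshops:

```r
# Step Zero (hypothetical example for an R workshop): run this script and
# send us the messages it prints, so we know your setup is ready.

# 1. Check the R version (placeholder minimum of 4.0)
if (getRversion() < "4.0.0") {
  message("Please upgrade R to version 4.0 or later before the workshop.")
} else {
  message("R version OK: ", getRversion())
}

# 2. Check that the required packages are installed (placeholder list)
required_pkgs <- c("sf", "dplyr", "ggplot2")
missing <- setdiff(required_pkgs, rownames(installed.packages()))
if (length(missing) > 0) {
  message("Please install: ", paste(missing, collapse = ", "))
} else {
  message("All required packages found.")
}

# 3. Check that the workshop data folder was unzipped to the expected location
data_dir <- "~/workshop_data"   # placeholder path from the setup instructions
message("Data folder found: ", dir.exists(path.expand(data_dir)))
```

Something this simple doubles as a checklist for the instructor: if someone can't paste back three "OK" messages, you know exactly which setup step to troubleshoot before day one.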
The Single Screen Problem
A typical live software workshop requires participants to juggle a minimum of four windows: the instructor's webcam, the instructor's screen share (slides or a software demo), the participant's own application (e.g., ArcGIS Pro or Pix4D), and a written exercise guide. On top of that, everyone needs access to the Zoom chat window, the participants panel to raise their hand, the Zoom control bar to mute and unmute themselves, and possibly a note-taking application.
In the best-case scenario, where someone has one or even two external monitors, that's a lot of windows to manage. For participants who have nothing but their laptop, it's almost hopeless.
I try to mitigate the single-screen problem in my workshops proactively. Starting with the workshop description, I strongly encourage people to use a second monitor. I've never made it an outright requirement, although I've been tempted to. I also provide tips on juggling multiple windows on a single screen, including third-party utilities that let you pin certain windows on top and utilities that can save a configuration of window sizes.
Because there are so many windows to juggle, I try to be cognizant during the workshop that not all participants may be seeing the slide I'm talking about or the application window I'm sharing. I constantly verbalize which window or slide I'm on, where I'm clicking, etc., and whenever we switch gears I pause to give people a chance to get the right window open. For my R workshops, I've also changed how students do the short exercises. I used to have them copy and paste prompts and starter code from a slide deck into RStudio. That kept them active (a good thing) and ensured they were following the slides. However, it gets cumbersome very quickly on a single screen. I've since switched to R Notebooks, where the prompts and starter code are ready to go, and I can use markdown text to explain the task.
Because there are so many windows to juggle, I try to be cognizant during the workshop that not all participants may be seeing the slide I'm talking about or the application window I'm sharing.
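For illustration, here is a minimal sketch of what one of those ready-to-go notebook exercises might look like. The prompt and variable names are hypothetical (it uses the built-in mtcars dataset rather than actual workshop data), but the structure is the point: a short markdown prompt followed by runnable starter code in the same document.

````markdown
## Your turn

Using the built-in `mtcars` data frame, filter to the cars with more than
6 cylinders and count how many there are. Edit the starter code below,
then run the chunk.

```{r}
library(dplyr)

# Starter code: try changing the condition (e.g., cyl >= 4, or mpg > 25)
big_engines <- mtcars %>%
  filter(cyl > 6)

nrow(big_engines)
```
````

Because everything lives in a single RStudio window, participants never have to flip back to the slide deck mid-exercise, which is exactly the single-screen relief I was after.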
The Small Screen Problem
Related to the single-screen problem is the small-screen problem. I have a beautiful 26" high-resolution external monitor, so I never have problems with legibility or enough real estate for multiple windows. But the participants viewing my screen share may be working on a 14" laptop. Zoom gives viewers two options when the shared screen is a different resolution than their own: automatically resize my high-res screen share (and reach for a magnifying glass to read the tiny font), or crop it and watch just part of my screen share (hopefully the right part).
I don't know of a great solution, but when I teach I try to mitigate the small-screen problem by first reducing the resolution of my beautiful 26" external monitor down to a modest 1080p, even though it's capable of considerably more. That makes everything look slightly fuzzy to me, but I know it will come out better for most of the people watching my shared screen. To make it easier to follow my mouse, I also increase the mouse pointer size, enable a feature that visually highlights the mouse location when I press the Control key, and sometimes turn on mouse trails. I've seen other presenters, particularly on YouTube, use Windows Magnifier or a similar utility like ZoomIt to help people follow the details.
But most importantly, I try to tackle the small-screen problem by simply slowing down and verbalizing each step when I'm doing a demo. I also generally invite people to just watch a few steps, and then pause so they can repeat those steps on their own, rather than try to keep pace with me. As we go, I ask how people are doing so I can adjust my pace. There may be one or two who get hopelessly behind, and others who get bored because I'm going too slow, but my goal is to make sure most people are able to follow.
Part II: Goals and Directions for 2021