On 21 April 2021, Lou Woodley and Jenny East of the Center for Scientific Collaboration and Community Engagement (CSCCE) hosted an interactive session on virtual events as part of the Code for Science & Society (CS&S) grantee workshop series. They focused on how to select and test online tools to help facilitate your meeting activities, and shared a guidebook to help you decide what tool to choose. This post, authored by Jenny and CSCCE’s communications director, Katie Pratt, gives an overview of the workshop and the motivation behind creating the guidebook.
Why use online tools?
We’re all used to meeting online, with many of those meetings taking place on video-conferencing platforms such as Zoom. While these platforms offer several built-in facilitation features, including whiteboarding and breakout rooms, there are instances where you might need to supplement them with additional functionality.
There is a wide variety of online tools out there offering advanced whiteboards, sticky notes, voting, brainstorming spaces, and more. As an organizer, it can feel a little overwhelming to decide which tool will meet your needs while also being accessible to your participants.
A guidebook to save time and support tool selection
To help meeting organizers and community managers make these decisions strategically and efficiently, we created a guidebook with checklists and tips to assess whether a tool will meet your needs, including fitting into your budget. We also walk through how to test the product, an essential step to make sure the tool works as advertised.
About the workshop
The session started with an overview of a scenario that often plays out when planning virtual events. We walked through some basic needs and considerations, the “must haves,” and also highlighted some features that would be “nice to have.” In this example use case, a remote team was gathering on Zoom to plan an online conference and needed a brainstorming space to collaborate.
Once the team’s requirements were identified, we discussed creating a shortlist of tools to explore. We then walked through the BASICS for three different tools by examining their websites. Assessing the BASICS (Branding and customization, Accessibility, Sign-up requirements, IP and privacy considerations, Costs, Solidity) allowed us to determine whether to move forward with logging in and trying out a particular tool. We then used the SCORE checklist (Set up a sandbox, Configure settings, Outline your use case in the sandbox, Revisit requirements, Evaluate tool overall) to walk through configuring settings, trying out relevant activities such as adding and sorting ideas, and confirming that we were meeting the original requirements (please see the guidebook for a detailed discussion of the BASICS and SCORE checklists).
Participants in the session noted that they could find some of their answers on a tool’s website, and were eager to experiment with how the tools work. There were also questions about accessibility from a user-inclusion perspective, and participants noted that some of these tools are currently improving on this front. The session provided a great opportunity for reflection and discussion on how to select tools efficiently and effectively in order to enhance virtual events and engage participants.
About the CSCCE
The Center for Scientific Collaboration and Community Engagement champions the importance of human infrastructure for effective collaboration in STEM. We provide training and support for the people who make scientific collaborations succeed at scale, and we also research the impact of these emerging roles.
If you are interested in learning more about CSCCE’s online training offerings, which can be customized for your organization or team, please visit our website: cscce.org. We also offer training for individuals, and host a vibrant community of practice of more than 250 scientific community managers.