The Dat conference this year has finished. We are happy that so many people showed up, and we think that overall it turned out well.
In this article, we would like to review what we did and what we learned from it, so that other operators - or our future selves - do not repeat our mistakes.
Warning: We did try to keep this review as short as reasonable, but it is quite long because we did a lot. Please bear with us and read carefully if you wish to learn from our efforts.
I have made this longer than usual because I have not had time to make it shorter.
- Call for Proposals
- Website & Registration
- Big Blue Button
- Streaming to Youtube
- Before and After
We had been looking into holding a Dat-related event since the beginning of 2020. As the coronavirus crisis unfolded, we quickly scrapped our plans for a physical event and decided to move to an online format, making this our first online event.
Diego, Franz, and Martin volunteered in the online call to organize the event. At first, we used Github Team Discussions for sharing our progress. We had an initial kick-off on March 22nd, but afterward progress stalled, so we set up a weekly one-hour call. Initially, the meetings were at a time similar to our consortium calls, for which Martin would need to be up late at night. We soon switched to a time that worked for everyone. For the first few meetings, we would announce them using Github Discussions, but we turned that into a Google Calendar invite as people had a hard time keeping track of it.
After a few meetings, we started to work on the homepage and invited Santiago to help us with it. Over time, he would become a full member of the team 😍.
During our meetings, we would keep notes in HackMD, as it was the most convenient and best-known tool for all of us. We started using a "Prev - Next" navigation to make sure that people could easily catch up on previous meetings.
Finding a good date
We discussed a few dates for the Dat event in the Dat consortium. The date of the 30th and 31st of July was chosen as it gave us time to organize but was still in the foreseeable future. We chose weekdays as those were easier for consortium members to attend.
We are friendly with dweb-camp and ournetworks; the former did not have a date for 2020 yet, while ournetworks was scheduled for September 8–13, 2020. Diego joined their weekly calls, as they are outside the timezones of the other organizers.
Initially, we planned for the Dat Conference to take place just before ournetworks, but because of scheduling delays we ended up a little over a month early.
We tried Github issues, but they didn't work for us. Likewise, Github projects didn't work for us either. What we did instead was create action items during each meeting and review them to make sure that we made some progress.
- Set up weekly or bi-weekly calls and use a time at which everyone is awake; Share invites using (Google) Calendar.
- Use a "Prev - Next" navigation in your meeting notes to make sure that other people can catch up.
- Don't use Github-discussions.
- Your initial team is likely to stay as-is without recruitment.
- We need a HackMD equivalent running in Beaker.
- Specify action-items with deadlines clearly.
This year we wanted to have a Call for Proposals that was as open as possible to the wishes of the community. To highlight that we use Github, we thought it would be enough to simply create Github issues.
Please submit your ideas as issues in our issue tracker.
As this was not accepted by the community (no one signed up), we quickly opened a more conventional CfP using Google Forms:
Please submit your ideas in this submission form.
Being "open to all suggestions" didn't give potential presenters a frame of reference for what was possible. We were also very busy and didn't promote the CfP enough, but the Google Form did its job well enough. We got just about enough proposals for a conference. 😅
Then we had to sort and order the talks and put the schedule on the website, and whoever took on that task had to keep a few things in mind:
- Respect the presenters' availability.
- Respect timezones when specifying the times.
- Give space (breaks) between the entries.
- Notify all presenters of things that are happening.
Moving to Pretalx
We also had to gather some information missing from our CfP. While looking for a good tool to display the schedule, we found Pretalx, and it turned out to be quite smart:
- It automatically generates emails to presenters.
- It has a lot of useful features & flags.
- It deals (pretty) well with Timezones.
- It supports iCal out-of-the-box.
- It is OpenSource.
So we went with it. We imported all the talks we received through the google form and sent out the invitations to the presenters to confirm their talks.
Note: If you don't use Pretalx: confirm the talks with the speakers similarly as Pretalx would.
Arranging the entries to a schedule with Pretalx is a breeze:
We pulled the Pretalx data through an API into our Netlify hosted website (inherited from the previous dat event) and rendered it in our style.
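A sketch of that build step: mapping one page of the Pretalx talks API onto the session objects a site template renders. The endpoint and field names follow the public Pretalx API, but verify them against your Pretalx version; the output shape is just an illustration:

```javascript
// Map one page of the Pretalx talks API to flat session objects
// that a static-site template can render.
function toSessions(apiPage) {
  return apiPage.results.map((talk) => ({
    code: talk.code,                                  // Pretalx talk code
    title: talk.title,
    speakers: talk.speakers.map((s) => s.name).join(', '),
    start: talk.slot && talk.slot.start,              // ISO timestamp
  }));
}

// At build time, one would fetch e.g.
//   https://pretalx.example.org/api/events/<event>/talks/
// and follow the `next` links to collect all pages before mapping.
```

Running this at build time (rather than in the browser) keeps the rendered schedule static, which plays well with Netlify and CDN caching.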
That doesn't mean everything with Pretalx went well; there were a few issues we had to figure out:
- Make sure the database collation is set up right pretalx-docker#22, else emojis and other unicode-characters don't work.
- Pressing the delete button on emails deletes emails without a second question pretalx#967.
- Speakers are not sortable pretalx#975.
- There is no good hook to trigger an update of the Netlify page pretalx#964 - We manually triggered the updates.
- If the speakers don't activate their account within the preset limit, the sign-up breaks pretalx#554 - we had to work around this issue using direct SQL access.
- Pretalx doesn't clearly show presenters which timezone is used for the event, which is problematic if you are not clear about this in the CfP description: pretalx#924.
This was an online event, and we had a relatively complex scheduling requirement, with speakers from Australian, European, and US timezones attending. To make sure we had space for all of them, we scheduled the second day from 9:30 to 24:00, which meant that the moderators had to attend in shifts.
It also meant that we couldn't expect the attendees to attend all sessions equally, and that our recordings would become all the more important. Furthermore, it meant that we didn't have an after-hours special event for everyone to attend. We initially thought it might be a good idea to use gather.town or Mozilla Hubs for a meetup, but figured that it would exclude some attendees, and we wanted to focus on things beneficial to all. Had the event lasted up to 5 hours and focused on a single timezone, an after-party would have been more reasonable.
One way to reduce the time would have been to run multiple talks at the same time, in multiple rooms. This, however, would have meant setting up multiple recording setups, and we would have needed twice as much moderation and extra coordination. And while the shorter duration would have helped with the first issue, we would still have needed to accommodate the presenters of each timezone 😓.
To make sure that we got a breather between sessions and could fix any technical issues that came up, we added a 20min break between sessions, which we occasionally had to reduce to 15min to fit all presentations. This turned out to be a really good thing, as we did indeed have to deal with technical issues, some sessions went over time, and we had time to make sure that the next speaker was ready (a quick tech check if needed).
When we received the presentations through the Google Form, we left it open to enter any duration, and by happy accident we think that was a good idea. Each session was just as long as the presenter felt comfortable with.
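Building a timetable from sessions of varying length boils down to simple arithmetic: each session starts where the previous one ended, plus a break. A minimal sketch; the session list, start time, and break length are made-up examples:

```javascript
// Format minutes-since-midnight as "H:MM".
function hhmm(minutes) {
  return `${Math.floor(minutes / 60)}:${String(minutes % 60).padStart(2, '0')}`;
}

// Build a timetable from sessions of varying duration, inserting a
// fixed break between them. Times are minutes since midnight to keep
// the arithmetic trivial.
function buildTimetable(sessions, startMinute, breakMinutes) {
  let cursor = startMinute;
  return sessions.map(({ title, minutes }) => {
    const slot = { title, start: cursor, end: cursor + minutes };
    cursor = slot.end + breakMinutes;
    return slot;
  });
}
```

For example, two sessions of 30 and 45 minutes starting at 9:30 with 20-minute breaks yield slots at 9:30-10:00 and 10:20-11:05.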
- The CfP needs to be private - a public CfP (using Github-issues) doesn't work.
- Use Pretalx from the Beginning.
- Be clearer in our event description and with our CfP goals.
- Put more effort into the promotion of the CfP.
- It is doable to set up a schedule with sessions of different durations and that turned out to be a good thing.
- Add generous breaks (15-20min) between sessions.
- If you send out emails, make sure that they are seen.
- Support rixx on Liberapay or Patreon as the software they build is useful.
3. Website & Registration
The 2019 website was hosted on https://events.datproject.org; we moved it to https://events.dat.foundation and split it up into two separate folders. The domain was provided by the Dat foundation, its source code lives at datproject/public-events, and it is hosted on Netlify. Since we quickly ran out of Netlify data-transfer quota, we activated the Cloudflare CDN to cache the data.
Registration through Eventbrite
We knew before the event that attendees would likely forget about it if there was no way to remind them. To prevent this, we put an event-registration page online, which we replaced with the "how-to-join" page once the event started, as we wanted to make sure that late joiners would find their way.
We added Eventbrite registration to the website.
4. Big Blue Button
For the main system at the conference, we considered different tools. We ended up trying the OSS tool Big Blue Button, and it immediately checked all our boxes:
- It gives the option for multiple presenters to show their webcams simultaneously, alongside a presentation or screen share.
- It gives the option to vote on questions (which was useful during the event).
- It allows for up to 200 attendees in a room, which is quite nice.
- We were able to moderate attendees. (in case of CoC issues)
So it had a decent set of functionality and was open source, but Big Blue Button needs to be hosted somewhere, and as we didn't want to stress ourselves with server issues while dealing with the conference, we asked Collocall to host it for us. They made us a very fair offer and we took it.
We had two hiccups with presenters sharing their screen. One was during "Changing to "Hyper" and the future of the Dat ecosystem" (Youtube), where the share got stuck on part of the screen. It was fixable by restarting the screen share, and the Youtube Editor made it possible to cut the section out relatively smoothly.
The other hiccup was during "Inclusive indexing of research outputs through decentralized web", where the presenter shared the wrong screen and thought the presentation had moved on, while none of the attendees saw the change. We eventually managed to get back to the correct screen share, but it was broken for a while. We could have easily prevented this by asking the presenters to share their presentation beforehand and step through two slides to check that it works. However, what we failed to notice is that it broke entirely on the stream, which meant we had to fix the video track in post-editing and re-upload the video. That took about 4-5 hours.
Slow chat issue
When we had many chat messages, the user interface for entering new messages would become very slow. This seems to be a Big Blue Button issue (related to React); Collocall is now aware of it, and it will hopefully be fixed in upcoming BBB versions.
Breakout rooms failure
During "Inclusive indexing of research outputs through decentralized web", Vinodh wanted to break out the room for separate discussions. For some reason, even though it had worked during the test, the BBB UI locked up when trying to create the breakout rooms. We worked around this by holding the workshop in one room.
Backing up Notes & Chat
Before ending the BBB session we saved the notes and we saved the chat log every time before we cleared it, to keep a record of the data. We stored it on our google drive.
The streaming servers (see the next section) added two extra users named "Youtube Stream Bot", which showed up in the attendee list and would eventually also be sent into breakout rooms. There should be a feature for bot users in BBB.
Moderator help is limited
During some sessions, the presenter would ask the moderator or co-presenters to support them by preparing polls or other tasks. To do so, the moderator needed to "take the presenter role", which meant that the screen shared by the presenter was turned off, and when the presenter role was passed back, the screen sharing had to be reinitiated.
- BBB has bugs, some of which only show after running BBB for a long time, and some of which only show with many attendees.
- In an initial test, ask the presenter to step through a few slides and check that they show as intended.
5. Streaming to Youtube
BBB supports recording, but it stores every session as multiple files (video streams, presentations, etc.) that would need to be stitched together afterward to publish shareable videos.
We wanted to have a video live stream on a different platform that would also allow smooth following of the event on a mobile phone or in a browser that happens not to be supported by BBB.
Considered Alternative Platforms:
- Twitch: At first we thought it would be a better idea to start a Twitch stream for the event, but it turned out that Twitch is not embeddable with third-party cookies disabled.
- Dacast: Dacast has the most reasonable streaming pricing model and allows embedding of streams, but hosting the videos there afterward would have caused a lot of additional work.
- Vimeo: Vimeo's premium plan supports video streaming on your own website, but it's rather expensive at a fixed ~70 USD/month. It would, however, allow nice video modification and embedding.
In the end, we went with Youtube, as the support for scheduling, community captions, and the immediate availability of the live videos was very attractive.
In our first tests, it was also possible to embed live videos on a page with third-party cookies disabled, but maybe that was a temporary bug or a fluke; it turned out that it didn't work, and we had to redirect the streams to Youtube directly: datproject/public-events#d4bbb0fe
Note: we also set up a third-party-cookie test for this, in case you might need it: nginx third-party-cookie-test
For Youtube, we created an account using our organizer email address and a password that each organizer knows.
We discussed quite a bit whether we should have one Youtube stream for the whole event, two Youtube streams (one for each day), or one per session.
We chose 1 per session with the following arguments:
+ We will lose motivation after the event and cutting a live stream apart will be painful.
+ People following the stream will see when the sessions start using a time indicator.
+ If something happens to fail, only one session will, and we can repair things until the next session.
- It will be more stress during the event.
- It is a lot to set up during the busy time before the event.
One of the things that certainly was well appreciated was the direct streaming to Youtube. Once all speakers were confirmed, we used a Sketch file, Data Populator, and a generated JSON file of the sessions to generate the thumbnails for each Youtube session.
We then created and recorded one dummy session on Youtube live. This was useful because it would later allow us to create the new session using the first one as a template:
This makes it rather quick to fill out the rest of the information, as further settings (only available through Youtube Studio) are copied too, like license, language, or comment settings.
The only problem that came up is that the time needs to be entered in your local timezone. Now, we could have made our lives easy and simply changed the timezone of our computers when setting up, but we instead created a small HTML file, served with npx http-server, that shows the sessions in the current timezone.
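The conversion itself is just the browser's Intl API: given a session's ISO timestamp, format it in a target timezone (or, by default, whatever timezone the machine is set to). A minimal sketch; the timestamp below is a made-up example:

```javascript
// Render an ISO timestamp in a given timezone (defaulting to the
// machine's local one), so organizers can copy the correct local
// time into Youtube's scheduling form.
function localTime(iso, timeZone = undefined) {
  return new Intl.DateTimeFormat('en-GB', {
    dateStyle: 'medium',
    timeStyle: 'short',
    timeZone, // undefined = use the machine's current timezone
  }).format(new Date(iso));
}

// e.g. localTime('2020-07-30T09:30:00+09:00') on a machine set to
// UTC shows the 00:30 UTC equivalent of 9:30 JST.
```

The little page simply looped over the session JSON and printed `localTime(session.start)` for each entry.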
And while we thought we were clever using the template, it brought another issue: by copying the stream, it also copies the stream key to each new video, showing this error message:
Now, that is because each video (correctly) has auto-start activated, so the live stream starts when we press the button in OBS:
Naturally, Youtube doesn't know which live stream to auto-start if they all have the same stream key. As this feature seems to be made for streamers who run repeating streams (once a week?), they don't mind. But we did.
So we had to check each live-stream setting and create a new stream key for it. We used the talk code as an identifier, as that was easier to remember:
To stream the video from BBB, we set up two streaming computers, so that we would have a backup: one Mac and one stationary Windows computer, each with two screens! Each of the computers was connected to a TeamViewer account that all of the organizers have access to.
Important: We made sure that both computers had enough hard disk space available to keep the recordings!
On each computer, we installed OBSStudio, which allows streaming to Youtube. For every session, we had to create a separate OBSStudio profile. (Each profile stores a different stream target in OBSStudio.)
So, we created one profile for every Youtube video we set up:
Each profile contains the stream key we copied from Youtube. We named each stream, and for the naming we made sure that the names sort correctly (OBSStudio automatically orders the names).
Also, we would prepare 3 scenes that can be used during the live-stream.
- A few minutes before the session started, we would switch to the "Before Session" scene, which shows the /live page using a "Browser" input source, displaying the next session about to start.
- Then we'd check that the correct profile was set up and press "Start Streaming".
- We would go to the Youtube video for the current presentation and make sure that the video was running.
Human Error Recovery:
- If the profile belonged to a stream that wasn't started yet, we could simply change the description of the video and hope nobody noticed.
- If the profile had already been recorded, we would press "Stop Streaming" and start again, as streaming to a finished stream wouldn't work.
- During the session, we would switch to the "Big Blue Button" scene, which does a screen capture of the second screen. We set the resolution of the second screen to 720p (1280x720). On the second screen, we would have a browser window open, full-screen, logged in to BBB.
To remove the BBB elements that we didn't want to show up in the stream we created a little bookmarklet that we would run after the browser showed the BBB room view.
- Right after the session ended, we would switch to the "After Session" scene, which is pretty much like "Before Session", except that after the session /live shows the next session, while /live-end shows "thanks for joining the previous session".
- We monitored the YouTube output until the "After session" scene was displayed before pressing "Stop Streaming", to ensure that the whole video session was captured by Youtube.
- We ended the stream in Youtube Live Stream (top right, current live stream).
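The bookmarklet mentioned above boils down to hiding a handful of elements by CSS selector. A sketch of how such a bookmarklet can be generated; the selectors below are hypothetical placeholders, not the actual BBB class names:

```javascript
// Build a bookmarklet that hides the given elements. Paste the
// resulting string as the URL of a bookmark, then click the bookmark
// after the BBB room view has loaded.
function makeHideBookmarklet(selectors) {
  const body = selectors
    .map(
      (sel) =>
        `document.querySelectorAll(${JSON.stringify(sel)})` +
        `.forEach((el) => { el.style.display = 'none'; });`
    )
    .join('');
  return `javascript:(() => {${body}})();`;
}

// Example with placeholder selectors for the chrome we hid:
const bookmarklet = makeHideBookmarklet(['.navbar', '#chat-messages']);
```

Generating the string from a selector list makes it easy to adjust when a BBB upgrade changes the UI.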
Refreshing the Browser windows
The browser window in "Before Session" and "After Session" occasionally froze. To make sure it shows the correct content, OBSStudio has a slightly hidden "Refresh cache of current page" feature that allows updating the page.
On macOS, the setup works much like on Windows, except that it doesn't support audio recording out of the box. You need to install the iShowU audio plugin, define it as an audio input source in OBSStudio, and select it as the audio output device in the macOS audio settings. More about this here: obsproject/forum#77163.
If the BBB window in Chrome on the second screen is active for too long, it will stop sending audio, resulting in a missing audio channel.
To prevent this, it is necessary to restart Chrome and log into BBB again. Curiously, this Chrome issue happened on both streaming computers (Windows and Mac) simultaneously.
The BBB window in Chrome may also get entirely stuck with the audio still running; when this went unnoticed by the moderator, the video had to be reconstructed in a later effort.
- Just redirect to the Youtube playlist unless you have a trusted, well-viewed Youtube channel that allows live embedding.
- Change your timezone to the event's timezone when entering times into Youtube.
- On the streaming machines: restart the browser for the stream if there are long enough breaks between the sessions.
- Make at least one example-run of the Streaming setup with every organizer. It is good for everyone to be prepared.
- Find a way to pay the developer of iShowU to make sure that it works in the future on Mac.
Kevin: you can, but the audio playthrough won't work, as that is baked into the iShowU apps themselves. I'm pondering if I write a paid-for variant of this.
- Live streams make breakout rooms difficult, as the live-stream computers are also logged-in.
- Disabling the recording button in BBB avoids confusion. It can and should be disabled in the setup.
- Always record talks on both the main and backup computer, otherwise content may be lost if the main computer has an issue. Press "Start recording" on the backup computer and the "Start streaming" on the main computer. This way you can always "Start streaming" on the backup computer while it is still recording if the main computer fails.
- Check that the avatar in the Youtube stream is actually moving, and not just showing a frozen face, to make sure that the browser is not stuck; also look at the audio level to see that it is in the green area.
6. Before & After
On short notice and with only a little time at hand, we prepared only a short script to introduce the event to people. We wanted to record it before the first session but were short on time, so we held it live on the first day and recorded it on Zoom during a break for the second day. (At the beginning of the second day, Martin would be by themselves, as they live in the most eastern timezone.)
The topics covered in the introduction can be seen in our video.
We showed the video before the first session of the second day using the video feature of BBB.
After the event
As briefly mentioned before, we skipped the after-hours party. But after the first day, a group of people independently shared a gather.town link, and we had discussions there in a small group.
On the second day, a group of people stayed after the event as well and we had small conversations.
Also, we opened a comm-comm event and announced it quickly in the last session of the event and on the discord chat. (Notes)
Following the comm-comm event, we prepared a newsletter summary in which we linked all videos and summarized the event itself.
Before the event, we were considering other expenses: like the aforementioned gather.town, sending gifts to presenters, or making sure every presenter has a good microphone. We were also considering asking attendees for money - only an amount big enough to ensure that people would show up, maybe ~8 USD - and finding sponsors for the event, to make the video work not an entirely volunteer effort and also as a means to donate to Dat.
As we are volunteer organizers and the structure of the Dat foundation has been in flux in 2020, we decided to treat the event as an "outside" effort. This meant that we would not consider any form of income. As we did have some costs, we were able to convince the Dat foundation to support us with 2000 USD.
The Dat foundation asked us to present a combined receipt with all statements for the event. We needed 300 USD for hosting BBB with Collocall. We furthermore decided that each of the organizers would receive a token 100 USD for their effort. For the work on the website, Santiago got 1050 USD, as initially promised. We also decided to support Pretalx, BBB, and OBSStudio with 50 USD donations each (the BBB donation went directly to Collocall, as per BBB instructions). The remaining 100 USD went to Franz for making sure that the finances are in order (in case some tax or accounting costs show up at a later time). - (OpenCollective Receipt)
Acknowledgement for non-monetary Support
- We were happy to use the Netlify plan from the Dat foundation for the homepage hosting.
- The Pretalx instance and the organizers' email address were provided by Georepublic.
- The streaming computers were provided by friends of the team.
- Eventbrite provides its service for free to free events.
- Cloudflare offers free caching bandwidth for our domain.
- It is probably obvious, but we should still acknowledge that we didn't need to pay for the Youtube account.
- The initially used Google Form was likewise provided free of charge by Google.
- Instead of a zoom call for the introduction, it would have been better to just reserve a presentation slot and Youtube stream.
- Practice the introduction a few times to make sure you get it right.
- Announcing the follow-up comm-comm call would have been good to do in the intro.
- It would have been good to place the introduction in the welcome-notes of the BBB channel.
- It would have reduced some friction to settle the event's sponsorship before the kick-off, as we spent quite a bit of time figuring this out after the fact.