- Students have assigned seats to make it easy for a teacher to scan the room and see who is absent
- Students cannot speak until they raise their hand and are called upon, minimizing disruptions
- Teachers can share their screen, but students cannot
A fully-featured educational application may not use these features exactly, but let’s review how they work in case you want to implement similar features in your own application.
Is everybody here? This question comes up a lot in a classroom, or even in a team meeting where the same group gets together each day. Most off-the-shelf conferencing applications sort attendees alphabetically, by the order in which they joined, or in some other way that does not fit the use case.
If your workflow involves taking attendance or waiting for a quorum, this can be a disruption that you don’t typically have in the real world. Looking around a physical classroom or meeting room, it is usually easy to see who is there and who is missing. This sample app demonstrates that workflow by having participants identify which student they are, and then placing them in their assigned seat.
The way this sample app accomplishes this is through use of the Firebase Realtime Database. As a participant joins, they are correlated with an attendee record in the database.
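As a rough sketch of what that correlation could look like, the helper below builds an attendee record for a joining participant. The field names and the attendees/ database path are assumptions for illustration; the sample app’s actual data model may differ.

```javascript
// Hypothetical helper: correlate a joining Dolby.io participant with a
// roster entry so the app can render them in their assigned seat.
// Field names here are illustrative, not the sample app's actual schema.
function seatRecord(participantId, student) {
  return {
    participantId,        // Dolby.io participant id for this session
    name: student.name,   // roster name the joining user selected
    seat: student.seat,   // fixed seat index used to position the video tile
    present: true,        // makes scanning for absences easy
  };
}

// In the app, a record like this would be written to the Realtime Database
// when a participant joins, e.g. (Firebase v8-style API):
//   firebase.database()
//     .ref('attendees/' + student.seat)
//     .set(seatRecord(participant.id, student));
```

Because each student maps to a fixed seat, an empty seat in the UI immediately shows who is absent.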
Another common issue in the virtual classroom is audio nuisances. If somebody forgets to mute themselves, they may be unaware and make noises or start talking and disrupt the teacher or meeting. To help solve this problem, the sample application forces all participants into a muted state. In order to speak, a student must not only raise their hand but also be called on by the teacher.
The student that is called upon now has an opportunity to speak. By pressing the space bar, they can begin streaming audio content and get a visual waveform indicator to help confirm they are no longer on mute.
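The space-bar push-to-talk described above could be wired roughly as follows. The shouldTransmit helper and the calledOn flag are assumptions for illustration, not the sample app’s exact code:

```javascript
// Only a student who has been called upon may transmit, and only while
// the space bar is held down (push-to-talk). Illustrative helper.
function shouldTransmit(state) {
  return state.calledOn && state.spacePressed;
}

// In the browser, key events would drive the Dolby.io SDK, e.g.:
//   window.addEventListener('keydown', (e) => {
//     if (e.code === 'Space' && shouldTransmit({ calledOn, spacePressed: true })) {
//       VoxeetSDK.conference.startAudio(VoxeetSDK.session.participant);
//     }
//   });
//   window.addEventListener('keyup', (e) => {
//     if (e.code === 'Space') {
//       VoxeetSDK.conference.stopAudio(VoxeetSDK.session.participant);
//     }
//   });
```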
This type of visual feedback can be immensely valuable. When you cannot hear somebody speaking, many troubleshooting steps may be required just to identify whether it is the talker or the listener who has a problem to resolve:
✓ Audio is enabled for the conference
✓ Hardware (microphone) is plugged in and not muted
✓ Media device (microphone) is not muted by the operating system
✓ Media device (microphone) is allowed in the browser
✓ Local participant is not muted, which would stop transmitting audio
✓ Remote participant is not muted, which would stop receiving audio from that stream
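Some of these checks can be automated in the browser. As one hedged example, navigator.mediaDevices.enumerateDevices() returns the available devices, and an empty label on an audio input usually means microphone permission has not been granted yet. The micStatus helper below is a hypothetical name, not part of the sample app:

```javascript
// Summarize microphone availability from an enumerateDevices() result.
// An empty label on an audioinput typically means the browser has not
// yet been granted microphone permission.
function micStatus(devices) {
  const mics = devices.filter((d) => d.kind === 'audioinput');
  return {
    present: mics.length > 0,                    // hardware is visible at all
    permitted: mics.some((d) => d.label !== ''), // browser permission granted
  };
}

// Usage in the browser:
//   navigator.mediaDevices.enumerateDevices().then((devices) => {
//     console.log(micStatus(devices));
//   });
```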
The sample app implements this behavior by responding to an event and calling stopAudio() as needed for each participant, based on the hand-raising and called-upon state stored in the Firebase Realtime Database.
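That enforcement could be sketched as a pure decision applied inside a Realtime Database listener. The field names and listener wiring below are assumptions for illustration; consult the repository for the actual implementation.

```javascript
// Decide what audio action to take for an attendee based on database state:
// muted by default, allowed to speak only when their hand is raised AND the
// teacher has called on them. (Illustrative field names.)
function audioAction(attendee) {
  return attendee.handRaised && attendee.calledOn ? 'startAudio' : 'stopAudio';
}

// A database listener would apply this per participant, e.g.:
//   attendeeRef.on('value', (snapshot) => {
//     const attendee = snapshot.val();
//     if (audioAction(attendee) === 'stopAudio') {
//       VoxeetSDK.conference.stopAudio(participant);
//     }
//   });
```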
Sharing the contents of a screen, window, presentation, or video is an essential collaboration tool for educators. Many off-the-shelf applications, however, change the layout when a screen is shared, which creates a problem: it becomes harder to verify that everybody is seeing the same thing while maintaining a view of everybody in the conference.
The example project includes screen sharing as a fundamental part of the view. The teacher is allowed to share their screen, but in this demo students are not.
For collaboration and verifying student work, this could be extended to mirror the audio behavior: a student raises their hand and is granted permission to speak and share their screen. Either way, the student view is different from the teacher view so that the most important information is visible for each role.
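The role gate itself is simple. A sketch of what it might look like, using a hypothetical helper rather than the sample app’s exact code:

```javascript
// Only the teacher role may start a screen share in this design.
// (Illustrative helper; role names are assumptions.)
function canShareScreen(role) {
  return role === 'teacher';
}

// Before invoking the Dolby.io SDK:
//   if (canShareScreen(currentUser.role)) {
//     VoxeetSDK.conference.startScreenShare();
//   }
```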
To get started running this demo:
```shell
git clone https://github.com/dolbyio-samples/meet-dolbyio-classroom.git
cd meet-dolbyio-classroom
```
You may want to watch or star the repository to receive notifications for any project updates or bug fixes.
You’ll set these values in src/utils/voxeetUtils.js
```javascript
const consumerKey = '<DOLBYIO_COMMUNICATIONS_API>';
const consumerSecret = '<DOLBYIO_COMMUNICATIONS_SECRET>';
```
To run this demo application, you’ll need to sign up for a Firebase account. The steps to do this can be accomplished entirely from the Firebase Console.
At a high level:
- Create a Project
- Create a Web App
- Copy the firebaseConfig into src/providers/Firebase.js
- Create a Realtime Database in test mode
Firebase makes this process rather easy to accomplish. There is no other setup code to run, as the data model will be populated when the app runs.
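For reference, the firebaseConfig object copied from the console looks roughly like the sketch below. The placeholder values are illustrative; paste your own project’s snippet into src/providers/Firebase.js.

```javascript
// Illustrative shape of the firebaseConfig snippet from the Firebase Console.
// Replace every placeholder with the values from your own project.
const firebaseConfig = {
  apiKey: '<FIREBASE_API_KEY>',
  authDomain: '<PROJECT_ID>.firebaseapp.com',
  databaseURL: 'https://<PROJECT_ID>-default-rtdb.firebaseio.com',
  projectId: '<PROJECT_ID>',
  storageBucket: '<PROJECT_ID>.appspot.com',
  messagingSenderId: '<SENDER_ID>',
  appId: '<APP_ID>',
};

// The app would then initialize Firebase with it, e.g. (v8-style API):
//   firebase.initializeApp(firebaseConfig);
```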
Run the App
Once Dolby.io and Firebase credentials are added to the project you can run the application:
```shell
npm install
npm run start
```
The app was built with create-react-app, so it should load immediately in an open web browser with hot-refresh, allowing changes you make to the code to be previewed immediately. If you are new to React, the Create React App site is a good place to start.
That’s it, you should have the demo app up and running.
If you want to learn more, visit our Project Gallery where some of the technical details of this Classroom Example Project are explained in more detail. You can also find a link to the source code on GitHub from the gallery page.
If you would like to get some additional guidance, we are currently running the Dolby.io Build the World Hackathon with over $25k and 200+ prizes. For this event, we’ve highlighted the classroom sample app on our meet.dolby.io showcase application. There is still time to join the event and take advantage of the opportunity to participate in our Office Hours and ask questions about the Communications APIs.