Intuitive Scheduling Tool for Direct Knowledge Sharing

Get Started Now

About CereBro

During a semester, students often find parts of their coursework difficult to understand. Finding the right person to discuss questions and doubts with, someone who can help them work through those difficulties effectively, is harder than it seems. More adventurous students, looking to build professional skills beyond the scope of their academic curriculum, often want to learn those skills from others with expertise in the areas they wish to explore. Finding a mentor who is willing to share their knowledge, and who is a good match for an eager student, is another challenge.

Our application, CereBro, aims to bridge the gap between students who seek knowledge and students who possess it and are eager to share it with avid learners. It is a platform that connects knowledge-seeking students with their mentor counterparts. CereBro has been conceptualized as a tool to match students with prospective mentors using an effective matching algorithm.


Requirement Gathering Tools

User Survey

    We created a survey using Google Forms and shared it with other students through various online channels. The survey recorded 20 responses from UIC students.
  • Students generally prefer to seek help with their coursework, or to learn about a topic, from friends and people they know.
  • Students were not hesitant to help a fellow student with their coursework, even if they did not know each other.
  • Fewer students preferred going to TAs and professors for help with their coursework, compared to friends.
  • Students ranked teaching and communication skills as their top priority when learning from a tutor.
Interviews

    We conducted interviews with UIC professors. These interviews gave us valuable insight into how professors, our secondary users, select their TAs. Some of the questions asked were:
    Q: Other than educational qualifications, what interpersonal skills would you look for? How would you rank them?
    A: They said they would give preference to teaching skills and communication skills. We used this information as one of the factors in rating our tutors.
    Q: If you face a shortage of TAs for a course, how do you go about finding them?
    A: Most professors faced a shortage of TAs because they were unable to find students with the necessary technical knowledge. This helped us decide to make professors secondary users of our application.
    Q: Would peer ratings be a good measure for identifying the skills and the level of coursework expertise that a student may possess?
    A: The professors felt that peer ratings would be helpful provided they are unbiased and fair. This led us to incorporate measures such as a “like” button for reviews.

Product Analysis

    Meetup is a platform that facilitates offline group meetings of people with common interests, in localities around the world.
    Points taken:

  • Notifying the tutor of student requests, and vice versa.
  • Tagging topics.
  • Allow the user to set/update the interests in their profile.
  • Allow the student and tutor to contact each other in some form.

Functional Requirements

  • The application shall allow a new user to register with the system.
  • The system shall notify the user when a potential learner requests their assistance, or when a potential tutor is available to provide assistance.
  • The system shall allow a learner to post questions and categorize them.
  • The system shall suggest meeting times for a learner-tutor pair by determining a reasonable time based on both of their time preferences.
  • The system shall allow learners to rate tutors on a standardized rating scale.
  • The system shall suggest potential tutors to learners based on time preferences, location, rating, and expertise in areas/topics relevant to the learner.
  • The system shall provide tutors with a list of learners requesting their assistance.
  • The system shall allow users to contact each other.
  • The system shall provide secondary users with the progress of students serving as tutors, based on their ratings in the areas of interest of the professors.
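The matching and scheduling requirements above could be sketched roughly as follows. This is a minimal illustration only: the `Tutor` data model, the scoring weights, and the slot representation are our own assumptions for the sketch, not CereBro's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical data model -- illustrative only, not CereBro's real schema.
@dataclass
class Tutor:
    name: str
    topics: set          # areas of expertise, e.g. {"HCI", "Python"}
    rating: float        # 0-5 average from learner reviews
    distance_km: float   # distance from the learner
    free_slots: set      # time preferences, e.g. {"Mon 10:00", "Tue 14:00"}

def suggest_tutors(question_topics, tutors, max_results=3):
    """Rank tutors by topic overlap, rating, and proximity (assumed weights)."""
    def score(t):
        overlap = len(question_topics & t.topics)
        return 2.0 * overlap + t.rating - 0.1 * t.distance_km
    # Only tutors with at least one matching topic are considered.
    relevant = [t for t in tutors if question_topics & t.topics]
    return sorted(relevant, key=score, reverse=True)[:max_results]

def common_slot(learner_slots, tutor):
    """Suggest a meeting time that both parties marked as free, or None."""
    shared = learner_slots & tutor.free_slots
    return min(shared) if shared else None
```

A real scheduler would compare actual datetimes rather than strings, but the idea is the same: intersect the two availability sets and propose a slot from the overlap.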

The Design Diary



Formative Evaluation

For formative evaluation, we conducted cognitive walkthroughs using paper prototypes. Three users took on two personas across the tasks and were given a goal for each task. They were asked to think aloud at each action step of every task. At the end of each task, they were asked to provide feedback about their experience of the walkthrough.

The users performed the following tasks:

  • Task 1 - You are a Computer Science student at UIC. You have a question about your HCI coursework. Post the question “What is the difference between getting the right design and getting the design right?” and find potential tutors.
  • Task 2 - You have posted a question on the platform and received suggestions for potential tutors. Select the suggested tutor you think can best answer your question, and view their profile.
  • Task 3 - You have posted a question, received tutor suggestions, selected your preferred tutor, and viewed their profile. Request the tutor to answer your question.
  • Task 4 - Now you are a tutor who has received a request for help from a learner. Respond by either accepting or rejecting the request.




The feedback from the users is listed below.

  • A clear confirmation that a question was posted successfully would be helpful.
  • Tutor skills were mistaken for clickable buttons; clickable and non-clickable elements need to be visually differentiated.
  • It would be better to show the dashboard after responding to a request, instead of a blank requests page.




Summative Evaluation



We asked two users to perform three tasks by directly interacting with the application.

  • Task 1 - Register on CereBro as both a learner and a tutor.
  • Task 2 - As a learner, ask a question, choose the top-rated tutor, and request a session in the earliest available slot.
  • Task 3 - As a tutor, view all requests, accept the first request, and reject the second.

We measured the time each user took to complete a task, success on the task, and the errors that occurred during execution. We also measured subjective workload using NASA TLX surveys, with the following results.

The following points were noted.

  • Quick transitions reduced wait times, which increased user satisfaction.
  • Testing under more rigorous conditions would improve matching accuracy and provide better results.
  • User accuracy was >95% on average.
  • Both users gave the application a 4.5/5 rating.

Demo

CereBro - Learner




CereBro - Tutor


Meet our team



Our Tech Stack