Video Call Intercom with Vibration Sensor for Deaf Communication

Category: Hardware + Software (IoT & Accessibility)
Difficulty: Intermediate
Time to Build: 4–6 weeks
Prerequisites: Python basics, Raspberry Pi, local networking, GPIO wiring
Deliverables: SRS, architecture diagram, wiring schematic, FastAPI/Tkinter code, test report, working prototype


Problem Statement & Expected Outcome

Problem Statement
Traditional intercoms and doorbells provide no tactile alert and only limited visual cues, making them largely inaccessible to deaf users.

Expected Outcome
A fully functional Raspberry Pi intercom with vibration-based call alerts, live video communication, and optional gesture and speech recognition for seamless, inclusive interaction.

Abstract

Build a smart intercom that combines vibration alerts, video calling, and optional sign-language and speech-to-text modules. Running on a Raspberry Pi with Python, it communicates entirely over the local IP network, so calls incur no service cost, and it offers a user-friendly touch interface.

Details: User Stories & Acceptance Criteria
  • As a deaf user, I can receive vibration alerts and see incoming calls on screen.

  • As a family member or visitor, I can initiate a video call and communicate in real time.

  • Acceptance: Vibration motor activates within one second of a call; video call connects within three seconds on a broadband network.

Scope & Modules

Module 1: Server Management

A FastAPI server tracks connected devices and their IP addresses so that any unit can discover and call another.
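
A minimal sketch of what that registry could look like, assuming an in-memory dictionary and illustrative endpoint names (`/register` and `/devices`); the kit's actual routes may differ:

```python
# Device-registry sketch (assumed endpoints /register and /devices).
from fastapi import FastAPI, Request
from pydantic import BaseModel

app = FastAPI()
devices: dict[str, str] = {}   # name -> IP address, kept in memory

class Device(BaseModel):
    name: str                  # human-readable label shown in the UI

@app.post("/register")
async def register(device: Device, request: Request):
    # Record the caller's IP so other intercom units can reach it directly.
    devices[device.name] = request.client.host
    return {"name": device.name, "ip": devices[device.name]}

@app.get("/devices")
async def list_devices():
    # The touch UI polls this to populate its "connected devices" list.
    return devices
```

Run it with `uvicorn server:app --host 0.0.0.0 --port 8000` (assuming the file is saved as server.py) so other units on the LAN can reach it.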

Module 2: Video Communication

Python sockets stream video frames and control messages between units for low-latency communication.
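
One common approach, sketched below, is to JPEG-encode each frame with OpenCV and length-prefix it over a TCP socket; the receiver IP, port, and wire format here are assumptions, not the kit's exact protocol.

```python
# Sender sketch: JPEG-encode frames and length-prefix them over TCP.
import socket
import struct

import cv2

RECEIVER_IP = "192.168.1.50"   # assumed address of the other intercom unit
PORT = 9000                    # assumed port

cap = cv2.VideoCapture(0)      # Pi camera exposed as video device 0
sock = socket.create_connection((RECEIVER_IP, PORT))

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # JPEG compression keeps per-frame payloads small enough for Wi-Fi.
        ok, jpeg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 70])
        data = jpeg.tobytes()
        # 4-byte big-endian length header tells the receiver where each frame ends.
        sock.sendall(struct.pack(">I", len(data)) + data)
finally:
    cap.release()
    sock.close()
```

On the receiving side, the same 4-byte length prefix is read back to rebuild each JPEG before decoding it with cv2.imdecode.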

Module 3: Gesture Interpretation

MediaPipe tracks hand landmarks and matches them against a trained dataset to recognize 24 common words and phrases.
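
The landmark-extraction step might look like the sketch below; the classifier that maps landmarks to the 24 words and phrases is not shown, since its training is specific to the dataset, and would consume the returned coordinate list.

```python
# Sketch: extract 21 hand landmarks per frame with MediaPipe Hands.
# A trained classifier (not shown) would take the returned coordinates
# and map them to one of the 24 words/phrases.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)

def landmarks_from_frame(frame):
    """Return a flat list of (x, y, z) landmark coordinates, or None."""
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # MediaPipe expects RGB input
    result = hands.process(rgb)
    if not result.multi_hand_landmarks:
        return None
    lm = result.multi_hand_landmarks[0].landmark
    return [(p.x, p.y, p.z) for p in lm]           # 21 normalized points
```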

Module 4: Call Alerts

A GPIO-driven vibration motor activates whenever an incoming call is detected.
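
A sketch of the alert routine using RPi.GPIO; BCM pin 18 and the pulse pattern are assumptions, and the motor should be driven through a transistor per the wiring schematic rather than directly from the GPIO pin.

```python
# Vibration-alert sketch using RPi.GPIO. Pin 18 (BCM) is an assumption;
# use whichever pin the wiring schematic assigns to the motor driver.
import time

import RPi.GPIO as GPIO

MOTOR_PIN = 18

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOTOR_PIN, GPIO.OUT, initial=GPIO.LOW)

def vibrate(pulses=3, on_s=0.4, off_s=0.2):
    """Pulse the motor a few times when an incoming call is detected."""
    for _ in range(pulses):
        GPIO.output(MOTOR_PIN, GPIO.HIGH)
        time.sleep(on_s)
        GPIO.output(MOTOR_PIN, GPIO.LOW)
        time.sleep(off_s)
```

Called right after the server signals an incoming call, a short pulse train like this comfortably meets the one-second acceptance target.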

Module 5: User Interface

Tkinter-based touch interface shows connected devices and lets users place or accept calls.
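
A stripped-down version of such a UI might look like this; the server address and the /devices endpoint follow the assumptions in the Module 1 sketch, and the call button only prints a placeholder where the video module would be invoked.

```python
# Tkinter sketch: list devices from the (assumed) FastAPI /devices endpoint
# and place a call when a device is selected.
import tkinter as tk

import requests

SERVER = "http://192.168.1.10:8000"   # assumed address of the FastAPI server

def refresh(listbox):
    listbox.delete(0, tk.END)
    for name, ip in requests.get(f"{SERVER}/devices", timeout=2).json().items():
        listbox.insert(tk.END, f"{name}  ({ip})")

def call_selected(listbox):
    selection = listbox.curselection()
    if selection:
        # Hook into the socket/video module here.
        print("Calling", listbox.get(selection[0]))

root = tk.Tk()
root.title("Intercom")
devices = tk.Listbox(root, font=("Helvetica", 18), height=6)   # large touch targets
devices.pack(fill=tk.BOTH, expand=True, padx=10, pady=10)
tk.Button(root, text="Refresh", command=lambda: refresh(devices)).pack(side=tk.LEFT, padx=10)
tk.Button(root, text="Call", command=lambda: call_selected(devices)).pack(side=tk.RIGHT, padx=10)
root.mainloop()
```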

Proposed Architecture & Tech Stack
Option A (Recommended for MVP):
  • Hardware: Raspberry Pi 4, camera module, vibration motor, XPT2046 touchscreen

  • Backend: FastAPI

  • Frontend/UI: Python Tkinter

  • Libraries: OpenCV, MediaPipe, RPi.GPIO, Python socket

Option B (Web UI Alternative):

  • Backend: Django/Channels for signaling (see the signaling sketch after this list)

  • Frontend: React or Vue for browser-based dashboard

  • Optional: WebRTC for browser-based video streaming
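
For Option B, WebRTC peers still need a signaling channel to exchange SDP offers/answers and ICE candidates. A minimal Django Channels consumer that relays messages within a room is sketched below; the room routing and message shape are assumptions, not part of the kit.

```python
# Minimal WebRTC signaling relay using Django Channels (illustrative sketch).
# Clients in the same room exchange SDP offers/answers and ICE candidates
# through this WebSocket; assumes a URL route like path("ws/signal/<room>/", ...).
from channels.generic.websocket import AsyncJsonWebsocketConsumer

class SignalingConsumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        self.room = self.scope["url_route"]["kwargs"]["room"]
        await self.channel_layer.group_add(self.room, self.channel_name)
        await self.accept()

    async def receive_json(self, content):
        # Relay offer/answer/candidate messages to the whole room
        # (including the sender; clients filter out their own messages).
        await self.channel_layer.group_send(
            self.room, {"type": "signal.message", "message": content}
        )

    async def signal_message(self, event):
        await self.send_json(event["message"])

    async def disconnect(self, code):
        await self.channel_layer.group_discard(self.room, self.channel_name)
```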

KPIs & Analytics
  • Call connection latency

  • Gesture recognition accuracy (%)

  • Uptime of intercom service

  • Number of successful vibration alerts per day

Milestones & Timeline
  • Week 1: Finalize SRS, wiring schematic, and database (if needed)

  • Week 2: Set up FastAPI server and Pi camera streaming

  • Week 3: Integrate vibration motor and develop basic UI

  • Week 4: Add sign-language recognition and speech-to-text

  • Week 5: Test and tune performance

  • Week 6: Documentation, final testing, and presentation

Who It’s For
  • Students/Capstone Teams seeking a socially impactful IoT project

  • Instructors wanting a ready-to-deploy pilot kit

  • Institutions looking for a scalable, low-cost communication aid

Progress Checklist

  • SD card flashed and Pi boots correctly
  • Camera tested with live preview (see the quick check after this list)
  • Vibration alert functioning
  • Video call tested between two devices
  • Gesture and speech modules verified
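
For the camera checklist item, a quick preview test with OpenCV is usually enough; this assumes the Pi camera is exposed as video device 0 and a display is attached.

```python
# Quick camera check: open the default camera and show a live preview.
import cv2

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("Camera preview", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```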

Resources & Links

Download Project Kit (ZIP)

Related Abstracts You May Like


Dynamic Route Rationalization Model using AI & ML

Writing Pen and Pad for Children with Specific Learning Disability

Learning Path Dashboard for Enhancing Skills – Build Guide & Project Kit
