Hi everyone! I’m Asif Khan, and in this blog post, I’m excited to take you through the process of building a Real-Time Restricted Area Monitoring System. This system combines the power of YOLOv11 for real-time person detection and FastAPI for fast video streaming, enabling us to monitor a specific area and instantly notify us if someone enters or comes too close to that restricted zone.
Using YOLOv11, a cutting-edge object detection model, along with FastAPI, a high-performance web framework, gives us the perfect combination for both fast and accurate video analysis. This project will give you the tools and knowledge to implement a real-time monitoring system with live video streaming and object detection capabilities. Let’s dive into the details of how we can make this system work smoothly and efficiently!
The main focus of this project is to track individuals entering or approaching a predefined restricted area. If a person enters the area, the system will display a warning message.
Use Case
The system is ideal for use in environments such as:
- Security Surveillance: Monitor restricted areas in buildings, factories, or secure zones.
- Hospital Rooms: Ensure that no one enters a restricted medical area.
- Private Offices: Protect confidential areas by alerting if unauthorized individuals approach.
The system provides a real-time video feed with the detection of a person near or inside the restricted area. It’s an efficient solution for enhancing security without requiring complex setups.
Installation Guide
Before running the system, make sure you have Python 3.12 installed. Follow the steps below to set up your environment.
1. Install Python 3.12
Make sure you have Python 3.12 installed. You can download it from the official Python website.
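You can confirm the installed version from a terminal:
python --version
It should report 3.12.x before you continue.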
2. Install Dependencies
You’ll need a few Python libraries to run this project. Install them using pip:
pip install fastapi
pip install uvicorn
pip install opencv-python
pip install ultralytics
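Or install them all in one go:
pip install fastapi uvicorn opencv-python ultralytics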
3. Install the Model Weights
Make sure to download the YOLOv11 weights (best.pt) that are used for object detection. You can download the YOLOv11 model from this link: YOLOv11 Model
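Before wiring the model into the streaming app, it can help to confirm that the weights load and that class ID 0 really maps to "person". A minimal sketch, assuming best.pt sits in your working directory and test.jpg is any sample image you have on disk:
from ultralytics import YOLO

model = YOLO("best.pt")                # load the downloaded weights
print(model.names)                     # mapping of class IDs to names; "person" should be ID 0
results = model("test.jpg", conf=0.5)  # quick inference on a local sample image
print(len(results[0].boxes), "detections")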
Complete Code
Here’s the full code for the Real-Time Restricted Area Monitoring System:
from fastapi import FastAPI, WebSocket, WebSocketDisconnect
import cv2
from ultralytics import YOLO
import asyncio
import base64
from fastapi.responses import FileResponse
import numpy as np

app = FastAPI()

# Load the YOLOv11 model
model = YOLO("best.pt")

# VideoCapture object (0 = default webcam; a video file path also works)
cap = cv2.VideoCapture(0)

# Define the classes to detect (only the "person" class, class ID 0)
classes = [0]  # 0 is the class ID for "person"

# Define the confidence threshold
conf_thresh = 0.5

# Define the proximity threshold (how close to the ROI boundary a bounding box should be considered "near")
PROXIMITY_THRESHOLD = 30  # Pixels


async def stream_video(websocket: WebSocket):
    try:
        while True:
            ret, frame = cap.read()
            if not ret:
                # Restart the video if it ends (useful when streaming from a file)
                cap.set(cv2.CAP_PROP_POS_FRAMES, 0)
                continue

            # Define the egg-shaped region (ellipse parameters)
            center = (frame.shape[1] // 2, frame.shape[0] // 2)  # Center of the frame
            axes = (150, 70)     # Semi-major and semi-minor axes in pixels
            angle = 0            # Rotation angle
            start_angle = 0
            end_angle = 360
            color = (0, 0, 255)  # Red color for the ellipse
            thickness = 2        # Line thickness

            # Draw the egg-shaped region (ellipse) on the frame
            cv2.ellipse(frame, center, axes, angle, start_angle, end_angle, color, thickness)

            # Run the YOLOv11 model on the frame, detecting only the "person" class
            results = model(frame, classes=classes, conf=conf_thresh)

            # Get the annotated frame with bounding boxes
            annotated_frame = results[0].plot()

            # Flag to track whether anyone is inside or near the ROI
            someone_in_roi = False

            # Iterate over the detections and check if any bounding box is inside or near the ROI
            for result in results[0].boxes:
                x1, y1, x2, y2 = map(int, result.xyxy[0])  # Bounding box coordinates
                class_id = int(result.cls[0])               # Class index as integer
                class_name = model.names[class_id]          # Class name from model.names

                # Calculate the center of the bounding box
                bbox_center = ((x1 + x2) // 2, (y1 + y2) // 2)

                # Check if the center of the bounding box is inside or near the ellipse (ROI)
                if is_point_near_ellipse(bbox_center, center, axes):
                    someone_in_roi = True

            # Show a message depending on whether someone is inside or near the ROI
            if someone_in_roi:
                cv2.putText(annotated_frame, "Someone Inside Restricted Area!", (50, 100),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
            else:
                cv2.putText(annotated_frame, "No one is inside Restricted Area!", (50, 100),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)

            # Encode the frame to JPEG
            _, buffer = cv2.imencode(".jpg", annotated_frame)

            # Convert the frame to base64
            frame_b64 = base64.b64encode(buffer).decode("utf-8")

            # Send the frame to the client
            await websocket.send_text(frame_b64)

            # Introduce a small delay to reduce CPU usage
            await asyncio.sleep(0.03)  # ~30 FPS
    except WebSocketDisconnect:
        print("WebSocket disconnected.")
    finally:
        cap.release()


def is_point_near_ellipse(point, center, axes):
    """Check if a point is inside or near the ellipse, with a proximity threshold."""
    x, y = point
    cx, cy = center
    a, b = axes

    # Ellipse equation: ((x - cx)^2 / a^2) + ((y - cy)^2 / b^2) <= 1
    distance_to_ellipse = ((x - cx) ** 2) / (a ** 2) + ((y - cy) ** 2) / (b ** 2)

    if distance_to_ellipse <= 1:
        return True  # The point is inside the ellipse
    elif distance_to_ellipse < (1 + PROXIMITY_THRESHOLD / max(a, b)):
        return True  # The point is near the ellipse (within the proximity threshold)
    return False


@app.get("/")
def home():
    """Serve the HTML file for WebSocket-based streaming."""
    return FileResponse("index.html")


@app.websocket("/ws/video")
async def websocket_endpoint(websocket: WebSocket):
    """WebSocket endpoint for low-latency video streaming."""
    await websocket.accept()
    await stream_video(websocket)
index.html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Real-Time Restricted Area Monitoring System</title>
    <style>
        /* General reset */
        * {
            margin: 0;
            padding: 0;
            box-sizing: border-box;
        }

        body {
            font-family: Arial, sans-serif;
            background: #f3f4f6;
            color: #333;
            display: flex;
            flex-direction: column;
            align-items: center;
            justify-content: center;
            min-height: 100vh;
        }

        h1 {
            font-size: 2.5rem;
            color: #1a73e8;
            margin-bottom: 20px;
            text-align: center;
        }

        #video-stream {
            width: 90%;
            max-width: 800px;
            border: 5px solid #1a73e8;
            border-radius: 10px;
            box-shadow: 0px 4px 10px rgba(0, 0, 0, 0.1);
        }

        footer {
            margin-top: 20px;
            font-size: 0.9rem;
            color: #555;
            text-align: center;
        }

        footer a {
            color: #1a73e8;
            text-decoration: none;
        }

        footer a:hover {
            text-decoration: underline;
        }
    </style>
    <script>
        let ws = new WebSocket("ws://localhost:8000/ws/video");

        // Log WebSocket connection status
        ws.onopen = function() {
            console.log("WebSocket connection established.");
        };

        ws.onmessage = function(event) {
            let img = document.getElementById("video-stream");
            img.src = "data:image/jpeg;base64," + event.data;
        };

        ws.onerror = function(error) {
            console.error("WebSocket error:", error);
        };

        ws.onclose = function() {
            console.log("WebSocket connection closed.");
        };
    </script>
</head>
<body>
    <h1>Real-Time Restricted Area Monitoring System</h1>
    <img id="video-stream" alt="Video Stream Loading..." />
    <footer>
        Powered by <a href="https://ultralytics.com/" target="_blank">YOLOv11</a> | Created by <a href="https://apycoder.com" target="_blank">ApyCoder</a>
    </footer>
</body>
</html>
Explanation
- YOLOv11 Model: The object detection model used here is YOLOv11, which is capable of detecting various objects in real time, including people. The model is loaded with the weights file best.pt.
- Region of Interest (ROI): The restricted area is defined as an ellipse drawn on the video frame. The system checks whether a detected person is within or near this restricted area.
- WebSocket Streaming: The system uses FastAPI with WebSocket for real-time video streaming. It continuously captures frames, processes them with YOLOv11, and sends the annotated frames to the client.
- Proximity Detection: If a person enters or approaches the restricted area (based on proximity), the system displays a warning message on the video feed; a quick standalone check of this logic is sketched right after this list.
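To build some intuition for the proximity check, here is a small standalone sketch of the same ellipse test with a few hypothetical points, assuming a 640x480 frame so the ROI center is (320, 240) and using the same axes and threshold as in the main script:
PROXIMITY_THRESHOLD = 30  # pixels, same value as in the main script

def is_point_near_ellipse(point, center, axes):
    """Return True if the point is inside the ellipse or within the proximity margin."""
    x, y = point
    cx, cy = center
    a, b = axes
    # Normalized ellipse equation: values <= 1 are inside the ellipse
    value = ((x - cx) ** 2) / (a ** 2) + ((y - cy) ** 2) / (b ** 2)
    return value <= 1 or value < (1 + PROXIMITY_THRESHOLD / max(a, b))

center, axes = (320, 240), (150, 70)
print(is_point_near_ellipse((320, 240), center, axes))  # True  - exactly at the center
print(is_point_near_ellipse((470, 240), center, axes))  # True  - on the boundary (x offset equals the semi-axis)
print(is_point_near_ellipse((320, 100), center, axes))  # False - well outside the proximity margin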
You’re all set! To start the application, run the following command (replace yourapp with the name of the Python file that contains the code above):
uvicorn yourapp:app --reload
Once the server is running, you can access the application locally at http://localhost:8000.
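If you want to sanity-check the stream without opening a browser, a minimal client sketch using the websockets package (an extra dependency not listed above, installed with pip install websockets) could look like this:
import asyncio
import base64
import websockets

async def check_stream():
    # Connect to the FastAPI WebSocket endpoint defined above
    async with websockets.connect("ws://localhost:8000/ws/video") as ws:
        frame_b64 = await ws.recv()          # one base64-encoded JPEG frame
        jpeg_bytes = base64.b64decode(frame_b64)
        print(f"Received a frame of {len(jpeg_bytes)} bytes")

asyncio.run(check_stream())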
Conclusion
This Real-Time Restricted Area Monitoring System is a powerful and efficient solution for security and surveillance. By combining YOLOv11 for object detection and FastAPI for high-speed video streaming, the system provides real-time alerts and continuous monitoring.
For more projects like this, make sure to check out my YouTube channel and blog for more tutorials and examples!