Advanced Computer Vision for Precision Orchard Yield Prediction
An advanced agricultural computer vision solution that quantifies flower density in tree fruit orchards using deep learning and image analysis. This system combines depth-based background removal, YOLO segmentation, intelligent clustering algorithms, and multi-factor density calculations to provide accurate bloom density assessments critical for yield prediction, crop-load management, and thinning decisions in commercial orchards.
The pipeline integrates Depth-Anything-V2 for monocular depth estimation, custom-trained YOLO models for instance segmentation, DBSCAN spatial clustering for biologically meaningful grouping, and sophisticated density metrics combining flower pixel coverage with edge detection analysis.
Accurate flower density estimation is essential for orchard management decisions including yield forecasting, optimal thinning, and crop-load planning. Manual counting is labor-intensive and impractical for commercial operations, while flowers present unique detection challenges due to dense clustering, high occlusions, variable lighting, and similar appearance to background foliage. Traditional object detection approaches struggle with densely packed flowers (>100 per image), exhibiting error rates five times higher than regression-based methods in high-density scenarios.
Depth-Anything-V2 generates monocular depth maps for intelligent background separation, isolating foreground flowers from distant foliage
Custom-trained YOLO model performs instance segmentation with precise polygon masks for individual flowers at pixel level
DBSCAN-based spatial clustering with geometric union operations groups flower detections into biologically meaningful clusters
Multi-factor density calculation combining flower pixel coverage and edge pixel metrics with configurable weighting schemes
Depth-based segmentation leveraging geometric cues to separate foreground flowers from complex backgrounds with 50th percentile threshold
Segmentation + regression pipeline maintains accuracy with >100 flowers per image, handling extreme occlusions in full-bloom conditions
DBSCAN spatial clustering with adaptive epsilon tuning (0.15-0.30) mirrors natural branch structure with ~8 clusters per image
Dual-factor calculation: (flower_weight × Flower%) + (edge_weight × Edge%), customizable for different crop morphologies
Dynamic brightness adjustment in HSV color space ensures consistent detection from harsh midday sun to overcast skies and shadowed areas
Automated Excel generation with per-cluster analysis, image-level summaries, and dataset-wide statistics with visual annotation overlays
Vision Transformer encoder generates relative depth maps from single RGB images. Background pixels (higher depth values) are identified using quantile-based thresholding and removed:
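A minimal sketch of the quantile-based removal step, assuming the depth map arrives as a NumPy array where larger values mean farther away (matching the "background pixels (higher depth values)" convention above; some models emit inverse depth, in which case the comparison flips):

```python
import numpy as np

def remove_background(image, depth_map, quantile=0.5):
    """Zero out pixels whose relative depth exceeds the per-image quantile.

    Assumes larger depth values mean farther away; flip the comparison
    for models that output inverse depth.
    """
    threshold = np.quantile(depth_map, quantile)  # 50th percentile by default
    foreground = depth_map <= threshold           # keep nearer pixels
    result = image.copy()
    result[~foreground] = 0                       # remove background
    return result, foreground
```

Because the threshold is a quantile of each image's own depth values, the cut adapts to the scene rather than relying on a fixed distance.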
Adaptive image enhancement compensates for variable lighting in HSV color space:
Custom-trained YOLO model with pixel-level flower segmentation generating polygon masks. Optimized with a low confidence threshold (0.0001) to capture faint/partial flowers, an IoU threshold (0.45) balancing false positives against false negatives, and 640×640 input resolution for detail and efficiency.
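A sketch of the inference settings and mask extraction; with ultralytics, the parameters below are passed as keyword arguments to `model.predict`, and the helper is duck-typed on the `masks.xy` attribute so it runs without a trained checkpoint:

```python
# With ultralytics these settings are used as, e.g.:
#   YOLO("flower_seg.pt").predict(image, **SEG_PARAMS)
# ("flower_seg.pt" is a hypothetical checkpoint name).
SEG_PARAMS = {
    "conf": 0.0001,  # very low threshold keeps faint/partial flowers
    "iou": 0.45,     # NMS IoU balancing false positives vs. false negatives
    "imgsz": 640,    # 640x640 inference resolution
}

def extract_polygons(result):
    """Collect per-instance polygon vertices from a YOLO result object.

    Duck-typed on `masks.xy` so the sketch carries no hard dependency;
    with ultralytics, each entry is an (N, 2) array of boundary points.
    """
    if result.masks is None:
        return []
    return list(result.masks.xy)
```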
Detected flowers grouped into biologically meaningful clusters with dynamically calculated epsilon:
Iterates through epsilon factors (0.15–0.30) to achieve target cluster counts (~8 per image) matching typical branch structure.
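The epsilon sweep can be sketched with scikit-learn's DBSCAN; deriving epsilon from the image diagonal is an assumption of this sketch, standing in for whatever per-image scale the pipeline uses:

```python
from sklearn.cluster import DBSCAN

def cluster_flowers(centroids, image_diag, target_clusters=8,
                    eps_factors=(0.15, 0.20, 0.25, 0.30)):
    """Sweep DBSCAN epsilon until the cluster count nears the target.

    Epsilon is eps_factor * image_diag; the diagonal as scale reference
    is an illustrative choice for this sketch.
    """
    best_labels, best_gap = None, float("inf")
    for factor in eps_factors:
        labels = DBSCAN(eps=factor * image_diag,
                        min_samples=2).fit_predict(centroids)
        n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
        gap = abs(n_clusters - target_clusters)
        if gap < best_gap:
            best_gap, best_labels = gap, labels
    return best_labels
```

Keeping the labels whose cluster count lands closest to ~8 lets the grouping track branch structure rather than a fixed distance threshold.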
Density percentage combines two independent metrics with configurable weights:
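A minimal version of the calculation; the 0.7/0.3 split below is an illustrative default, since the weights are described as configurable per crop morphology:

```python
def cluster_density(flower_pixels, edge_pixels, cluster_area,
                    flower_weight=0.7, edge_weight=0.3):
    """Weighted density score per cluster.

    The 0.7/0.3 defaults are illustrative; the pipeline exposes both
    weights for tuning to different crop morphologies.
    """
    flower_pct = 100.0 * flower_pixels / cluster_area
    edge_pct = 100.0 * edge_pixels / cluster_area
    return flower_weight * flower_pct + edge_weight * edge_pct
```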
Supports multiple ViT encoder sizes (Small/Base/Large/Giant), with ViT-Large (256 features) as the default configuration. Pre-trained checkpoints are loaded in gradient-disabled evaluation mode for efficient inference.
Direct access to mask.xy coordinates preserving precise boundary information. Coordinates scaled from 640×640 inference to original image dimensions maintaining geometric accuracy.
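A scaling helper under those assumptions; note that recent ultralytics versions already report `masks.xy` in original-image coordinates, so explicit rescaling like this applies when working from inference-resolution masks:

```python
import numpy as np

def scale_polygon(polygon, inference_size=(640, 640),
                  original_size=(1920, 1080)):
    """Rescale (x, y) vertices from inference resolution to the original
    image; sizes are (width, height) tuples. The 1920x1080 default is
    only an example original resolution."""
    sx = original_size[0] / inference_size[0]
    sy = original_size[1] / inference_size[1]
    return np.asarray(polygon, dtype=float) * np.array([sx, sy])
```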
Comprehensive error handling for invalid geometries using Shapely's buffer(0) operation. Try-except blocks prevent pipeline failures from self-intersections and edge cases.
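A sketch of that repair path with Shapely:

```python
from shapely.geometry import Polygon

def safe_polygon(coords):
    """Build a polygon, repairing invalid geometry with buffer(0);
    returns None when nothing usable remains."""
    try:
        poly = Polygon(coords)
        if not poly.is_valid:
            poly = poly.buffer(0)  # resolves self-intersections and slivers
        return poly if not poly.is_empty else None
    except Exception:
        return None
```

Returning None for unrecoverable geometry lets callers skip a bad detection instead of crashing mid-batch.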
Fuses RGB, HSV, LAB, and grayscale representations. Masks combined via bitwise OR creating comprehensive flower pixel classifier robust to color variation.
End-to-end orchestration: depth → brightness → YOLO → clustering → density. Generates individual and aggregated Excel reports with tqdm progress bars.
Green polygon boundaries for clusters, red overlay (30% transparency) for flower pixels, green overlay (20%) for edge pixels, numbered labels with density percentages.
✓ Instance segmentation with polygon masks captures overlapping flowers. Clustering + union operations merge partially occluded flowers into coherent representations without individual delineation
✓ Shapely's buffer(0) repair operation and try-except blocks gracefully handle self-intersections, zero-area slivers, and geometric edge cases
✓ Depth-based foreground segmentation removes background elements using geometric priors. Quantile-based threshold adapts to each image's depth distribution
✓ Dynamic brightness adjustment in HSV color space with multi-color-space flower detection (RGB/HSV/LAB/grayscale) ensures illumination robustness
✓ CUDA GPU acceleration with CPU fallback, 640×640 YOLO inference resolution, NumPy vectorization, and batch processing amortizing model loading overhead
✓ Maximum area constraints (15-30% of image) prevent oversized clusters. Adaptive DBSCAN epsilon tuning with ~8 cluster target ensures biologically meaningful groupings
Early-season flower density estimates predict harvest volume, enabling better market planning and resource allocation
Quantitative bloom density data informs thinning decisions (chemical/mechanical) to optimize fruit size and quality
Temporal density tracking across seasons identifies high/low-performing tree sections for targeted interventions
Automated phenotyping accelerates breeding programs by efficiently quantifying flowering traits across cultivars
Pipeline integrates with UAVs, ground robots, or autonomous tractors for large-scale automated orchard assessment
Data-driven insights empower orchard managers with precision agriculture tools for sustainable production
This Automated Flower Density Estimation System demonstrates the power of integrating modern computer vision techniques—monocular depth estimation, YOLO instance segmentation, spatial clustering, and multi-factor analysis—to solve complex agricultural challenges. By providing accurate, automated bloom density quantification, the system empowers orchard managers with data-driven insights for yield prediction, crop-load management, and precision thinning practices. The modular, configurable pipeline architecture ensures adaptability across diverse fruit crops and deployment scenarios, positioning it as a valuable tool for advancing precision agriculture.
This project showcases advanced agricultural computer vision, deep learning deployment, and production-grade system development. Feel free to reach out for technical discussions or collaboration opportunities.