Smart EOQ Models: Incorporating AI and Machine Learning for Inventory Optimization
Traditional Economic Order Quantity (EOQ) models rely on static assumptions (e.g., constant demand D, fixed holding cost h) and therefore fail in volatile environments. This research advances dynamic inventory control through an AI-driven framework:

1. Demand Forecasting: Machine learning (LSTM/GBRT) estimates time-varying demand:

   Dₜ = f(Xₜ; θ) + εₜ

   (Xₜ: covariates such as promotions and seasonality; εₜ: residuals)

2. Adaptive EOQ Optimization: Reinforcement Learning (RL) dynamically solves the following optimization problem:

   min over Qₜ, sₜ of E[ Σₜ ( h·Iₜ⁺ + b·Iₜ⁻ + k·δ(Qₜ) ) ]

   subject to: Iₜ = Iₜ₋₁ + Qₜ − Dₜ

Where:
• Qₜ: order quantity at time t
• sₜ: reorder point at time t
• h: holding cost per unit
• b: backorder (shortage) cost per unit
• k: fixed ordering cost
• δ(Qₜ): indicator function (1 if Qₜ > 0, else 0)
• Iₜ⁺: inventory on hand (positive part of Iₜ)
• Iₜ⁻: backordered inventory (negative part of Iₜ)
• Dₜ: demand at time t

Validation was performed using sector-specific case studies:
• Pharma: a perishability constraint Iₜ⁺ ≤ τ (τ: shelf life) reduced waste by 27.3%.
• Retail: promotion-driven demand volatility (σ²(Dₜ) up 58%) was mitigated, cutting stockouts by 34.8%.
• Automotive: RL optimized multi-echelon coordination, reducing shortage costs by 31.5%.

The framework reduced total costs by 24.9% versus stochastic EOQ benchmarks. Key innovation: closed-loop control, where Qₜ = RL(stateₜ) adapts to real-time supply-chain states.
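The forecasting step Dₜ = f(Xₜ; θ) + εₜ can be sketched with a gradient-boosted regression tree (GBRT), one of the two model families named above. This is a minimal illustration, not the paper's implementation: the covariates (a promotion indicator and a seasonal term) and the synthetic demand process are assumptions chosen for the example.

```python
# Sketch of the demand-forecasting step D_t = f(X_t; theta) + eps_t
# using GBRT. Covariates and the demand-generating process are
# hypothetical, for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
T = 500
promo = rng.integers(0, 2, T)                    # promotion flag (0/1)
season = np.sin(2 * np.pi * np.arange(T) / 52)   # weekly seasonality term
X = np.column_stack([promo, season])             # covariate matrix X_t

# Synthetic demand: baseline + promo lift + seasonal swing + noise eps_t
D = 100 + 30 * promo + 15 * season + rng.normal(0, 5, T)

model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
model.fit(X[:400], D[:400])        # fit f(.; theta) on the first 400 periods
D_hat = model.predict(X[400:])     # forecast D_t over the hold-out window

mae = np.mean(np.abs(D_hat - D[400:]))
print(f"hold-out MAE: {mae:.1f} units")
```

In practice the forecast D̂ₜ (or its full predictive distribution) would feed the downstream RL policy rather than a fixed-demand EOQ formula.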
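The per-period cost in the objective, h·Iₜ⁺ + b·Iₜ⁻ + k·δ(Qₜ), together with the balance equation Iₜ = Iₜ₋₁ + Qₜ − Dₜ, can be made concrete with a short simulation. The sketch below evaluates a fixed (s, Q) policy; in the paper's framework the RL agent would choose Qₜ and sₜ adaptively, and the cost parameters, Poisson demand, and initial inventory here are illustrative assumptions.

```python
# Minimal simulation of the EOQ cost structure:
#   per-period cost = h*I_t^+ + b*I_t^- + k*delta(Q_t)
#   inventory balance: I_t = I_{t-1} + Q_t - D_t
# (s, Q) are held fixed here; an RL policy would set them per period.
import numpy as np

h, b, k = 1.0, 5.0, 50.0          # holding, backorder, fixed ordering costs
s, Q = 20, 100                    # reorder point and order quantity (fixed)

rng = np.random.default_rng(1)
demand = rng.poisson(30, 200)     # stand-in for the ML demand forecast D_t

I, total_cost = 50, 0.0           # initial inventory, accumulated cost
for D_t in demand:
    Q_t = Q if I <= s else 0      # order only when stock reaches the reorder point
    I = I + Q_t - D_t             # inventory balance equation
    total_cost += h * max(I, 0) + b * max(-I, 0) + k * (Q_t > 0)

avg_cost = total_cost / len(demand)
print(f"average cost per period: {avg_cost:.1f}")
```

Comparing this average cost across candidate policies is exactly the signal an RL agent would use as its (negative) reward when learning Qₜ and sₜ.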