Despite the automation of many industrial and logistics processes, human workers are still often involved in the manual handling of loads. These activities lead to many work-related disorders that reduce the quality of life and productivity of aging workers. A biomechanical analysis of such activities is the basis for a detailed estimation of biomechanical overload, thus enabling focused prevention actions. Thanks to wearable sensor networks, it is now possible to analyze human biomechanics by an inverse dynamics approach in ecological conditions. The purpose of this study is the conceptualization, formulation, and implementation of a deep learning-assisted, fully wearable sensor system for the online evaluation of the biomechanical effort that an operator exerts during a manual material handling task. In this paper, we present a novel, computationally efficient algorithm, implemented in ROS, to analyze the biomechanics of the human musculoskeletal system by an inverse dynamics approach. We also propose a method for estimating the load and its distribution, relying on an egocentric camera and deep learning-based object recognition. This method is suitable for objects of known weight, as is often the case in logistics. Kinematic data, along with foot contact information, are provided by a fully wearable sensor network composed of inertial measurement units. The results show good accuracy and robustness of the system for object detection and grasp recognition, thus providing reliable load estimation for a high-impact field such as logistics. The outcome of the biomechanical analysis is consistent with the literature. However, improvements in gait segmentation are needed to reduce discontinuities in the estimated lower-limb articular wrenches.
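To give a concrete sense of the kind of quantity an inverse-dynamics pipeline produces, the sketch below computes the static support torque at each joint of a planar serial chain holding a known load at the end effector. This is a deliberately simplified illustration: the function name, midpoint center-of-mass assumption, and planar 2D statics are choices made here for clarity, not the paper's actual 3D dynamic algorithm, which additionally handles accelerations, foot contact, and full articular wrenches.

```python
import numpy as np

def static_joint_torques(link_lengths, link_masses, joint_angles,
                         load_mass, g=9.81):
    """Static support torque (N*m) at each joint of a planar serial chain
    carrying a point load at the end effector. Illustrative only: link
    centres of mass are assumed at the link midpoints, and dynamics
    (accelerations, inertias) are ignored."""
    n = len(link_lengths)
    abs_angles = np.cumsum(joint_angles)        # absolute link orientations
    joints = np.zeros((n + 1, 2))               # joint positions, base at origin
    for i in range(n):
        joints[i + 1] = joints[i] + link_lengths[i] * np.array(
            [np.cos(abs_angles[i]), np.sin(abs_angles[i])])
    coms = 0.5 * (joints[:-1] + joints[1:])     # assumed midpoint centres of mass
    torques = np.zeros(n)
    for j in range(n):
        # gravitational moment of every distal link about joint j
        for i in range(j, n):
            torques[j] += link_masses[i] * g * (coms[i, 0] - joints[j, 0])
        # moment of the handled load, applied at the end effector
        torques[j] += load_mass * g * (joints[n, 0] - joints[j, 0])
    return torques
```

For a fully horizontal two-link chain (0.3 m and 2 kg per link) holding a 10 kg load, `static_joint_torques([0.3, 0.3], [2.0, 2.0], [0.0, 0.0], 10.0)` yields roughly 70.6 N·m at the proximal joint and 32.4 N·m at the distal one, showing how the handled load dominates the proximal effort.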