The evaluation of bone diagenetic phenomena over archaeological timescales has a long history; however, little is known about the origins of the microbes driving bone diagenesis, or about the extent of bone diagenesis over short timeframes, such as in forensic contexts. Previously, bottom-up proteomic analysis of non-collagenous proteins (NCPs) revealed potential biomarkers for estimating the post-mortem interval (PMI). However, there is still a need to better understand the diagenetic processes taking place over forensic timeframes, and to clarify whether proteomic analyses can support more reliable models for PMI estimation. To address these knowledge gaps, we designed an experiment using whole rat carcasses, defleshed rat long bones, and excised but still-fleshed rat limbs, which were either buried in soil or exposed on a clean plastic surface, left to decompose for 28 weeks, and retrieved at different time intervals. This study aimed to assess differences in bone protein relative abundances across deposition modalities and intervals. We further evaluated the effects of extrinsic factors, autolysis, and gut and soil bacteria on bone diagenesis via bottom-up proteomics. Results showed six proteins whose abundance differed significantly between samples subjected to microbial decomposition (gut or soil bacteria) and those exposed to environmental factors. In particular, muscle- and calcium-binding proteins were more prone to degradation by bacterial attack, whereas plasma and bone marrow proteins were more susceptible to extrinsic agents. Our results suggest that both gut and soil bacteria play key roles in bone diagenesis and protein decay over relatively short timescales, and that bone proteomics is an effective tool for distinguishing microbially driven from extrinsically driven diagenesis.