Deep Neural Networks (DNNs) have revolutionized many fields through their ability to model complex patterns, yet their performance hinges critically on the availability of large-scale, accurately labeled datasets. Because building such datasets is costly and time-consuming, research has turned to Learning with Noisy Labels (LNL), which aims to reduce reliance on perfectly labeled data. This paper introduces CoFix, an LNL framework that integrates sample selection with semi-supervised learning to address noisy labels during training. CoFix employs a Gaussian Mixture Model (GMM) to dynamically partition the training data into clean and noisy subsets, applying semi-supervised learning to both. Inspired by the FixMatch algorithm, CoFix refines its consistency regularization and pseudo-labeling strategies, enhancing data augmentation and temperature sharpening. CoFix also incorporates label smoothing into the loss function to further improve model performance. Our experiments demonstrate that CoFix outperforms state-of-the-art methods, achieving significant improvements in fewer training epochs, particularly at lower noise levels. Its consistent results across various benchmark datasets and noise levels attest to its robustness and versatility. The contributions of this paper include a novel LNL method with enhanced generalization capability, an investigation of the impact of label smoothing on the loss function, and extensive experiments confirming CoFix's efficiency and adaptability to different noise levels. © 2024 World Scientific Publishing Company.
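The GMM-based clean/noisy split mentioned above can be illustrated with a minimal sketch. It assumes the common LNL practice (as in DivideMix-style methods) of fitting a two-component GMM to per-sample training losses and treating the posterior of the low-mean component as the probability that a sample's label is clean; the function name `clean_probability` and the threshold are illustrative, not CoFix's exact procedure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture


def clean_probability(losses, threshold=0.5):
    """Partition samples into clean/noisy via a 2-component GMM on losses.

    Assumption (not from the paper's abstract): clean samples tend to have
    lower loss, so the component with the smaller mean is taken as 'clean'.
    Returns per-sample clean probabilities and a boolean clean mask.
    """
    losses = np.asarray(losses, dtype=np.float64).reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, max_iter=100, random_state=0)
    gmm.fit(losses)
    clean_comp = int(np.argmin(gmm.means_.ravel()))  # low-mean component
    p_clean = gmm.predict_proba(losses)[:, clean_comp]
    return p_clean, p_clean > threshold
```

In a full training loop, the clean mask would select the labeled subset for supervised loss, while the remaining samples would be used as unlabeled data for FixMatch-style pseudo-labeling.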