The ground-based microwave sounding radiometers installed alongside wind profilers at nine weather stations of the Korea Meteorological Administration have been operating for more than 4 years. Here we assess the characteristics of the observation data by comparing the measured brightness temperature (Tb) with reference data. For the current study, the reference data are prepared by radiative transfer simulation using temperature and humidity profiles from a numerical weather prediction model instead of conventional radiosonde data. Based on 3 years of data, from 2010 to 2012, we were able to characterize the effects of the absolute calibration on the quality of the measured Tb. We also show that when clouds are present the comparison with the model exhibits high variability due to the presence of cloud liquid water, making cloudy data unsuitable for assessing the radiometer's performance. Finally, we show that differences between modeled and measured brightness temperatures are unlikely to be due to a shift in the selection of the center frequency but more likely due to spectroscopy issues in the wings of the 60 GHz absorption band. With proper consideration of data affected by these two effects, there is excellent agreement between the measured and simulated Tb: the regression coefficients are better than 0.97, and the biases are smaller than 1.0 K, except for the 52.28 GHz channel, which shows a rather large bias and variability of -2.6 and 1.8 K, respectively.
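The comparison statistics reported above (bias, variability, and regression coefficient between measured and simulated Tb) can be computed as in the following minimal sketch. It assumes the measured and simulated brightness temperatures for one channel are available as paired NumPy arrays; the function name and array variables are illustrative, not from the study.

```python
import numpy as np

def tb_comparison_stats(tb_meas, tb_sim):
    """Compare measured vs. simulated brightness temperatures (K).

    Returns:
        bias: mean(measured - simulated), in K
        std:  sample standard deviation of the differences, in K
        corr: Pearson correlation coefficient (the regression metric)
    """
    tb_meas = np.asarray(tb_meas, dtype=float)
    tb_sim = np.asarray(tb_sim, dtype=float)
    diff = tb_meas - tb_sim
    bias = diff.mean()
    std = diff.std(ddof=1)
    corr = np.corrcoef(tb_meas, tb_sim)[0, 1]
    return bias, std, corr

# Illustrative use with synthetic data: a measured series offset by a
# constant 0.5 K from the simulation gives bias = 0.5, std = 0, corr = 1.
tb_sim = np.linspace(250.0, 290.0, 100)
tb_meas = tb_sim + 0.5
bias, std, corr = tb_comparison_stats(tb_meas, tb_sim)
```

In practice such statistics would be computed per channel and restricted to clear-sky cases, since (as noted above) cloud liquid water inflates the variability of the comparison.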