Journal of Sensors
Volume 2019 | Article ID 8392583 | 7 pages

Research Article | Open Access

Minimizing Device-to-Device Variation in the Spectral Response of Portable Spectrometers

Academic Editor: Jesús Lozano
Received: 24 Aug 2018
Accepted: 15 Nov 2018
Published: 28 Jan 2019


As portable spectrometers have been developed, research on spectral analysis has evolved from a traditional laboratory-based closed environment to a network-connected open environment. Consequently, its application areas are expanding in combination with machine learning techniques. The device-to-device variation in the spectral response of portable spectrometers is a critical issue in a machine learning-based service scenario, since the classification performance is highly dependent on the consistency of the spectral responses from each spectrometer. To minimize device-to-device variation, a cuboid prism is employed instead of a combination of mirrors and a prism to construct the optical system of the spectrometer. The spectral responses are then calibrated to correct the pixel shift on the image sensor. Experimental results show that the proposed method can minimize the device-to-device variation in the spectral response of portable spectrometers.

1. Introduction

Visible/near-infrared (VIS/NIR) spectroscopy has been widely used in various domains, such as diagnostics, assessing food quality, and pharmaceutical testing. However, most spectroscopy studies have been limited to controlled laboratory environments, where experiments that employ VIS/NIR analysis can be performed. In addition, conventional spectrometers are generally expensive and bulky, which makes them impractical for many applications. Fortunately, several portable spectrometers have become commercially available due to advancements in sensor and IT technology; e.g., startup companies such as Stratio, SCiO, and TellSpec have developed pocket-sized spectrometers [1–3]. Several research groups recently demonstrated that a low-cost smartphone spectrometer can be designed using off-the-shelf components [4–6].

These portable spectrometers are generally coupled with a smart device and connected to a server via a network. Figure 1 depicts a typical service scenario for the use of a portable spectrometer for spectral analysis. The portable spectrometer scans an object and transfers the measured spectral response to a smartphone app. The role of the smartphone app is two-fold: it controls and operates the spectrometer, and it analyzes the spectral response scanned by the spectrometer using classification parameters. The classification parameters depend on the type of application; they are precalculated by a server and downloaded via the internet for the application in use. The final classification results are displayed in the smartphone app.

A conventional way of analyzing the spectral response of an object is to search for specific spectrum peaks, which are associated with a resonance frequency of a molecular structure. Therefore, to apply this method to classify an object, the molecular structure of an object and its resonance frequency need to be fully understood. It is expensive and time-consuming to compile a comprehensive database of resonance frequencies for different objects. However, the process of spectral analysis with a portable spectrometer offers certain advantages. Machine learning is employed to classify an object with a portable spectrometer [7–13]. The advantage of utilizing machine learning is that the classification no longer searches only for the spectral peak of an object but also considers its overall shape by analyzing properties such as slope and intensity ratio. Thus, spectral analysis can be performed without exact knowledge of the molecular structure of the target object.

The process of classifying an object using a portable spectrometer and machine learning is depicted in Figure 2. Machine learning algorithms such as SVM (support vector machine), AdaBoost, and neural networks require a training phase to calculate the classification parameters for objects from a database of spectral responses. A large number of spectral responses obtained from portable spectrometers is required to obtain reliable classification parameters during the training phase. Once the parameters for analysis are calculated, they can be preloaded or downloaded by a smartphone app. The spectrum of an object is measured with a portable spectrometer, and the spectral response is transmitted to a smartphone. The smartphone app analyzes the spectral response using the classification parameters calculated by machine learning and then displays the classification results.

Considering this new service scenario for the use of a portable spectrometer, we note that it is very important to minimize the device-to-device variation in spectral responses. If a significant device-to-device variation in spectral responses exists, the classification performance of the spectrometer cannot be guaranteed. First, it is important to maintain consistency across the spectral responses used to calculate the classification parameters. To simplify the collection of sufficient spectral responses during the training phase, it is necessary to obtain spectral response data from different spectrometers. Second, device-to-device variation in the spectral response may result in misclassification during the measurement phase: if the spectral response used in the measurement phase differs from that in the training phase, an accurate classification result cannot be expected. Therefore, the device-to-device spectral responses must be as consistent as possible. In other words, the spectral response measured by one spectrometer should be similar to that measured by another spectrometer.

To minimize the device-to-device variation, careful attention should be paid to the manufacturing process. A slight mismatch in the optical system or image sensor of the portable spectrometer can result in a considerable discrepancy in spectral responses. In this paper, we suggest a method for coping with imperfections in the manufacturing process and compensating for a spectrum shift in the image sensor.

The rest of the paper is organized as follows. In Section 2, we explain the causes of device-to-device variation in spectral response of portable spectrometers and their effect on classification performance. In Section 3, we propose a novel method to minimize this device-to-device variation, and the performance of the proposed scheme is evaluated in Section 4. Finally, conclusions are presented in Section 5.

2. Device-to-Device Variation in Spectral Response of a Portable Spectrometer

2.1. Optical System and Device-to-Device Variation

Figure 3 presents an example of the optical system adopted in a typical portable spectrometer. It consists of mirrors, lenses, and a prism. The path of light from a source can be controlled using mirrors, and lenses can be used to collect or scatter light. The aligned light then passes through the prism, where it is refracted twice, with the refraction angle dependent on wavelength. Finally, the dispersed spectrum is projected onto the detector or image sensor. In other words, the purpose of the optical system is to transmit light of a specific wavelength to a predetermined position on the image sensor. When there is a discrepancy between devices, the position of light of a specific wavelength on the image sensor will be inconsistent. Any device-to-device variation ultimately results in a shift of the spectrum on the image sensor along the wavelength dimension.
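The wavelength-to-position mapping described above can be sketched numerically. The following is a minimal illustration, not the authors' actual optical design: the Cauchy coefficients, prism apex angle, and focal length below are hypothetical values chosen only to show that shorter wavelengths are refracted more strongly and therefore land at a different position on the sensor.

```python
import math

def cauchy_index(wavelength_nm, A=1.5046, B=4200.0):
    """Approximate refractive index of a BK7-like glass via Cauchy's
    equation n = A + B / lambda^2 (coefficients are illustrative)."""
    return A + B / (wavelength_nm ** 2)

def thin_prism_deviation_deg(wavelength_nm, apex_angle_deg=10.0):
    """Thin-prism approximation: deviation angle ~ (n - 1) * apex angle."""
    n = cauchy_index(wavelength_nm)
    return (n - 1.0) * apex_angle_deg

def sensor_position_mm(wavelength_nm, focal_length_mm=25.0):
    """Project the deviation angle onto the sensor plane (hypothetical
    25 mm effective path), giving the spot position in millimeters."""
    delta = math.radians(thin_prism_deviation_deg(wavelength_nm))
    return focal_length_mm * math.tan(delta)

# Shorter (bluer) wavelengths see a higher index, are deviated more,
# and land farther from the undeviated axis than longer wavelengths.
blue = sensor_position_mm(450.0)
red = sensor_position_mm(650.0)
```

Because the mapping from wavelength to sensor position depends on every angle in this chain, any small misalignment in the chain translates directly into a positional (pixel) shift of the spectrum.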

There are several causes of device-to-device variation in spectral response. First, the angle of the mirror must be fine-tuned during assembly, because a small rotational displacement of the mirror is magnified on the image sensor, resulting in a large spectral response variation. It is also important to maintain an exact angle of the prism for the same reason. Finally, the axis of the optical system and the image sensor can be misaligned, which leads to distortion or shifting of the image projected on the sensor.

2.2. The Effect of Spectral Shift on the Classification Performance

To investigate the effect of spectral shift on object classification using machine learning, we evaluated eight common food powders that are visually indistinguishable: salt, sugar, cream, flour, bean, corn, rice, and potato powder [13]. The food powders were measured using a LinkSquare spectrometer [14]. One hundred spectra of each food powder were collected and used to train a classification model. A supervised machine learning algorithm was employed to solve the classification problem: a 6-layer CNN (convolutional neural network) followed by a 3-layer fully connected neural network [15], implemented in the Torch framework [16].

After the training was completed, food powders were identified using the trained CNN model. Each input spectrum was classified by the index of the largest value at the output of the last fully connected layer of the trained model. The classification accuracy was then evaluated while shifting the spectral data of the food powders along the wavelength dimension.
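The evaluation procedure above can be sketched in a few lines. This is an illustrative stand-in, not the paper's actual pipeline: in place of the trained CNN, a simple nearest-template scorer is used so the example stays self-contained, but the argmax decision rule and the pixel-shift perturbation mirror the description in the text.

```python
import numpy as np

def shifted(spectrum, shift_px):
    """Shift a 1-D spectrum along the wavelength (pixel) axis,
    padding the vacated pixels with the edge value."""
    if shift_px == 0:
        return spectrum.copy()
    out = np.roll(spectrum, shift_px)
    if shift_px > 0:
        out[:shift_px] = spectrum[0]
    else:
        out[shift_px:] = spectrum[-1]
    return out

def classify(spectrum, templates):
    """Hypothetical stand-in for the trained CNN: score each class by
    negative Euclidean distance to a class template, then take the
    index of the largest score (the argmax decision rule)."""
    scores = [-np.linalg.norm(spectrum - t) for t in templates]
    return int(np.argmax(scores))

def accuracy_under_shift(spectra, labels, templates, shift_px):
    """Classification accuracy after shifting every test spectrum
    by `shift_px` pixels along the wavelength dimension."""
    preds = [classify(shifted(s, shift_px), templates) for s in spectra]
    return float(np.mean([p == y for p, y in zip(preds, labels)]))
```

With synthetic single-peak spectra, accuracy is perfect at zero shift and degrades once the shift moves a spectrum close to a neighboring class, which is the qualitative behavior reported for the cream/bean/flour powders.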

Figure 4 displays the classification accuracy for cream powder as a function of the wavelength shift. An accuracy of 100% was achieved when the wavelength shift was less than 2 pixels. The accuracy, however, decreased dramatically as the wavelength shift increased. When the wavelength shift was large and positive, the cream powder was misclassified as bean powder; when it was large and negative, the cream powder was misclassified as flour powder. As shown in Figure 5, the raw spectra of the three food powders mentioned above (flour, cream, and bean powder) can overlap when there is a shift in either direction: as the spectrum of cream is shifted, it overlaps more with that of bean or that of flour, respectively.

Our classification experimental results show that robust and accurate object identification cannot be achieved if there is spectral variation between the training data and test data. Consequently, it is crucial to minimize the device-to-device spectral response variation across all spectrometers for the technology to be useful in commercial applications.

3. Methods to Minimize the Device-to-Device Variation

3.1. Use of a Cuboid Prism

Figure 6(a) shows the optical system of the conventional portable spectrometer, which adopts dual mirrors and a simple prism. The mirrors are among the most error-prone components in the optical system, and a simple prism is also not robust to small rotational displacements. In contrast, in the proposed design a multiple-component cuboid prism is installed, as shown in Figure 6(b), and the troublesome mirrors are no longer used. The assembly of the prism can be accomplished easily and accurately due to its cuboid shape. The alignment of the prism and the other optical components during the assembly process is thereby improved, which reduces the device-to-device variation.

The cross-sectional view of the multiple-component cuboid prism is shown in Figure 7. The prism consists of multiple triangular prisms. Light enters from the left surface S(1) and is dispersed at the internal surfaces. The component prisms may need to be affixed with a high-refractive-index adhesive that matches the refractive index of the prism materials, to reduce reflection from the internal surfaces.

3.2. Calibration of the Spectral Response

The device-to-device spectral response variation ultimately results in a spectrum shift on the image sensor along the wavelength dimension. Therefore, it is important to compensate for this shift to standardize the devices. In a conventional optical system, even a slight rotation error during assembly of the mirrors or the prism is likely to cause a pixel shift that exceeds the calibration range of the image sensor. On the other hand, when a cuboid prism is used, the calibration process can be applied, since the pixel shift is relatively small and remains within the range of the image sensor.

First, an oversized image sensor (i.e., an image sensor that is larger than the projected spectral image) is used. This margin ensures that the spectral image is always projected within the image sensor, even in the worst case. Reference samples with known spectral responses are prescanned during assembly and testing. The spectrum obtained by a specific device is compared to the reference spectrum to estimate the amount of shift. This shift is then used as a compensation value, so that subsequently scanned spectra are corrected to match the reference spectrum. The calibration can be performed either within the spectrometer or in any application receiving the spectrum. Figure 8 illustrates the overall calibration process.
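The shift-estimation step can be sketched as follows. This is a minimal illustration under the assumption that the shift is a pure integer pixel translation; the paper does not specify the estimator, so a simple correlation search over candidate shifts is used here as one plausible realization.

```python
import numpy as np

def estimate_shift(measured, reference, max_shift=10):
    """Estimate the integer pixel shift of a measured spectrum relative
    to a prescanned reference spectrum by maximizing the inner-product
    correlation over candidate shifts in [-max_shift, max_shift]."""
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        score = float(np.dot(np.roll(measured, -s), reference))
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

def compensate(measured, shift_px):
    """Apply the stored compensation so subsequently scanned spectra
    are aligned with the reference spectrum."""
    return np.roll(measured, -shift_px)
```

In practice the estimated shift would be computed once per device during assembly and testing, stored, and then applied to every subsequent scan, either on the spectrometer itself or in the receiving application.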

Meanwhile, the responsivity of the pixels on an image sensor is not uniform: some defective pixels may have significantly higher or lower responsivity than standard pixels. Dust or scratches may also be introduced during the manufacturing process. As a result, some parts of the spectrum may not reach the image sensor correctly. To prevent the effect of such nonuniformity, each row of the image sensor is examined during the calibration process. Since the spectral response of a reference sample is known, the measured response of each row can be compared with the reference spectrum to determine whether the row is defect-free. The list of defect-free rows is stored in each spectrometer, and only spectral responses from these reliable rows are considered in subsequent scanning.

4. Experimental Results

In this section, the device-to-device variation is examined after employing the proposed method. First, we evaluate the robustness of a rotational error of the cuboid prism, compared to the conventional prism. We then illustrate the spectra measured by eight different portable spectrometers after employing the cuboid prism and calibrating the spectral responses.

4.1. Robustness to the Rotational Error of a Cuboid Prism

Figure 9 illustrates the effect of rotating an optical component in the two optical systems previously shown in Figure 6. The first mirror in the optical system of Figure 6(a) was rotated by one degree, and the positions of several wavelengths of light on the image plane were compared to the positions without the rotation. In Figure 9(a), shaded boxes represent the detector, and each wavelength is color-coded. One degree of rotation shifted the spectrum on the image plane by approximately 1.101 mm. As a result, the blue dot in the lower part of the left plot disappears in the right plot with the rotation. On the other hand, as shown in Figure 9(b), the same one-degree rotation of the three-component cuboid prism shifts the spectrum by only 0.014 mm. This experiment shows that the optical system with the properly implemented multiple-component cuboid prism is about 100 times more robust to assembly errors than the conventional optical system with mirrors and a simple prism. Increasing the number of triangular prisms in the cuboid prism can further reduce the positional variation of the spectrum introduced by the rotation error, though at the expense of higher manufacturing costs.
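The sensitivity of the mirror-based design can be understood from a lever-arm argument: a mirror tilted by an angle theta rotates the reflected beam by 2*theta, so the spot on the sensor moves by roughly the optical path length times tan(2*theta). The sketch below uses a hypothetical 30 mm mirror-to-sensor path, chosen only because it reproduces the same order of magnitude as the 1.101 mm shift reported above; the paper does not state the actual geometry.

```python
import math

def mirror_shift_mm(tilt_deg, path_mm):
    """Displacement of the spot on the sensor when a fold mirror is
    tilted by `tilt_deg`: the reflected beam rotates by twice the
    mirror tilt, amplified over the optical path length."""
    return path_mm * math.tan(math.radians(2.0 * tilt_deg))

# Hypothetical 30 mm path from mirror to sensor, 1 degree assembly error:
shift = mirror_shift_mm(1.0, 30.0)  # on the order of 1 mm
```

A refractive element such as the cuboid prism has no such angle-doubling lever arm, which is consistent with the roughly two orders of magnitude smaller shift observed for the prism rotation.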

Therefore, the multiple-component cuboid prism can reduce the device-to-device spectral variation by minimizing the effect of the rotational displacement of the prism on the positional variation of the spectrum.

4.2. Device-to-Device Variation after the Proposed Method

To examine the device-to-device variation after employing the cuboid prism and calibrating the spectral responses, the spectra of the same object measured by eight different portable spectrometers were evaluated. The resulting eight spectra are shown in Figure 10(a). Comparing the peak positions shows that the device-to-device variation is negligible. Figure 10(b) is a zoomed view of Figure 10(a) around the peak near pixel number 350 on the horizontal axis: the purple line from device #4 has its peak at pixel number 346, while the orange line from device #6 has its peak at pixel number 350.

From this experiment, we demonstrated that the proposed method can successfully minimize the device-to-device variation through the use of the cuboid prism and the calibration of the spectral response. As expected, the assembly errors were limited to within the tolerance range of ±2 pixel shifts.

5. Conclusions

In this paper, we proposed a method to minimize the device-to-device variation, one of the key issues in the emerging network-connected open environment for spectral analysis. As portable spectrometers have been developed, machine learning can be used instead of traditional analytical methods. To calculate the classification parameters, a large amount of spectral data obtained from many different devices is required, and the lower the device-to-device variation, the more consistent the training database. Consistency is equally important when classifying spectral data measured by a specific spectrometer, since the classification parameters are calculated from the spectral data of other devices. Instead of a combination of mirrors and a simple prism, which are the most error-prone components in the conventional optical system, a cuboid prism was employed in the proposed method to keep the pixel shift on the image sensor as small as possible. The spectral response of each device was then calibrated to further compensate for the pixel shift. Experiments demonstrated that the device-to-device variation could be limited to within a tolerance of ±2 pixel shifts.

Data Availability

The raw data is available at “” or from the corresponding author upon request.

Conflicts of Interest

The authors declare that there is no conflict of interest regarding the publication of this paper.


Acknowledgments

This work was supported by the Institute for Information and communications Technology Promotion (IITP) grant funded by the Korea Government (MSIT) (No. 2016-0-00080, Germanium on silicon based sensor which covers visible spectrum to short wavelength infra-red range (400–1600 nm) for materials identification and application system development).


References

4. A. J. Das, A. Wahi, I. Kothari, and R. Raskar, “Ultra-portable, wireless smartphone spectrometer for rapid, non-destructive testing of fruit ripeness,” Scientific Reports, vol. 6, no. 1, article 32504, 2016.
5. P. Edwards, C. Zhang, B. Zhang et al., “Smartphone based optical spectrometer for diffusive reflectance spectroscopic measurement of hemoglobin,” Scientific Reports, vol. 7, no. 1, article 12224, 2017.
6. A. J. S. McGonigle, T. C. Wilkes, T. D. Pering et al., “Smartphone spectrometers,” Sensors, vol. 18, no. 2, p. 223, 2018.
7. E. Elhariri, N. El-Bendary, M. Mostafa et al., “Multi-class SVM based classification approach for tomato ripeness,” in Innovations in Bio-inspired Computing and Applications, A. Abraham, P. Krömer, and V. Snášel, Eds., vol. 237 of Advances in Intelligent Systems and Computing, pp. 175–186, Springer, Cham, 2014.
8. N. El-Bendary, E. Hariri, A. Ella Hassanien, and A. Badr, “Using machine learning techniques for evaluating tomato ripeness,” Expert Systems with Applications, vol. 42, no. 4, pp. 1892–1905, 2015.
9. S. Dan, S. X. Yang, F. Tian, and L. Den, “Classification of orange growing locations based on the near-infrared spectroscopy using data mining,” Intelligent Automation & Soft Computing, vol. 22, no. 2, pp. 229–236, 2016.
10. A. Guillemain, K. Dégardin, and Y. Roggo, “Performance of NIR handheld spectrometers for the detection of counterfeit tablets,” Talanta, vol. 165, pp. 632–640, 2017.
11. O. Gupta, A. J. Das, J. Hellerstein, and R. Raskar, “Machine learning approaches for large scale classification of produce,” Scientific Reports, vol. 8, no. 1, article 5226, 2018.
12. T. Wang, J. Chen, Y. Fan, Z. Qiu, and Y. He, “SeeFruits: design and evaluation of a cloud-based ultra-portable NIRS system for sweet cherry quality detection,” Computers and Electronics in Agriculture, vol. 152, pp. 302–313, 2018.
13. H. You, Y. Kim, J. Lee, B. Jang, and S. Choi, “Food powder classification using a portable visible-near-infrared spectrometer,” Journal of Electromagnetic Engineering and Science, vol. 17, no. 4, pp. 186–190, 2017.
15. A. Krizhevsky, I. Sutskever, and G. E. Hinton, “ImageNet classification with deep convolutional neural networks,” in Proceedings of the 26th Annual Conference on Neural Information Processing Systems (NIPS ’12), pp. 1097–1105, Lake Tahoe, NV, USA, December 2012.

Copyright © 2019 Sunwoong Choi et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
