Multispectral Camera MDR report

S. Belkin, EE, A. C. Finken, EE, M. Walczak, EE


Abstract— Multi-spectral cameras capture images through special optical filters, essentially band-pass filters, which allow only a certain range of wavelengths through while blocking the rest. By carefully selecting these optical filters, the narrow range of wavelengths allowed through can reveal certain optical characteristics of the target being imaged. These optical characteristics can indicate what substances, and even what concentrations, are present within the target. The multi-spectral camera will be operated by a microcontroller system, which will position an individual filter in front of a monochrome camera so that an image can be captured and displayed to the operator.

I. INTRODUCTION

Most common CCD cameras capture images that span the entire visible spectrum that a human eye can detect [1], covering wavelengths between approximately 400 and 700 nanometers [2]. Combining the entire visible spectrum in one image masks certain visible “information” that may be present within the target being imaged, since all spectra are mixed together; capturing the spectrum in individual bands, while blocking out the others, allows finer details to emerge. This detail can reveal the spectral signature [3] of the objects captured in the image, such as the moisture content of a plant or the minerals present in soil [2].

One interesting application of hyperspectral imaging is in the field of mineralogy. Figure 1 shows the “reflectance spectra of (the mineral) pyroxene as a function of grain size. As the grain size becomes larger, more light is absorbed and the reflectance decreases. Note the trace tremolite contamination (a silicate mineral [6]) causing the narrow absorption features near 1.4 and 2.3 µm [7].” Hyperspectral imagery can be used not only to determine the size of particulates within a sample, but also if that sample is pure, or contains other substances. The field of hyperspectral imagery is growing in popularity because of its ability to capture image data of the earth’s surface with non-contact, non-disruptive, aerial or space-based imagery. Rough and otherwise dangerous terrain, as well as large surface areas, can be imaged in multiple small spectral wavelength bands, which can reveal certain spectral signatures contained within [5].

Multispectral imagery relates to hyperspectral imagery in that both capture images using optical filters that pass certain spectral bands. Hyperspectral imagery uses tight spectral bands for each image it captures, such as 10nm-wide bands, usually in a continuous sequence. Multispectral imagery tends to use somewhat wider spectral bands that are not necessarily in a continuous sequence but are defined in chunks, like the ‘visible’ spectrum or the ‘near-IR’ spectrum [4]. In essence, there is not a large distinction between these two types of spectral imagery. The ability to obtain this kind of selective spectral information from an image, nondestructively and with relative ease, is significant. Knowing what material is present simply by studying a multispectral image is an extremely powerful and useful tool for many different scientific fields, such as agriculture and mineralogy, as well as for non-professional hobbyists curious about their surroundings.

Fig. 1. Reflectance Spectra of Mineral Pyroxene as a Function of Grain Size.

Hyperspectral cameras have the resolution to identify materials such as minerals and vegetation on the surface of the earth based only on their spectral signature [3]. These spectral signatures are matched to specific materials and compiled into libraries to be used as references. This type of spectral library has been created for minerals, plants, man-made materials, vegetation stress, and many other materials [5]. Multi-spectral imaging has been in use for many years but had been relatively cost prohibitive, since the technology was new and was generally flown on aircraft, which left it primarily in the hands of government agencies such as NASA and the US Geological Survey (USGS). Its usefulness, however, has spurred the development of lower-cost camera devices, enabling many new industrial uses, though such devices are still not very affordable for non-professional hobbyists or novice scientists. The primary goal of the SDP13 Multi-Spectral Camera system is to develop a system that is very useful while also being relatively affordable. Affordability of such a powerful imaging tool will have a positive societal impact by enabling laypeople to view and learn about their surroundings in a very personal and individual way. Individuals will be better able to make informed decisions about their immediate environment, because this knowledge was previously unavailable to them.

II. DESIGN

A. Overview

Our multispectral camera system will have a USB CMOS-sensor camera attached to a filter wheel assembly, donated by Seahorse Bioscience, containing multiple interference-type, narrow-band optical filters. On the other side of the filter wheel will be a monofocal, manual-iris, manual-zoom lens assembly. The intended functionality is to capture images of geological objects from a mobile ‘Mars Rover’ type of platform, from distances between 0.3 meters and approximately 2 meters. A microcontroller device, either a Raspberry Pi or an Arduino, will command a stepper motor driver to rotate the filter wheel for the selection of the optical filters. While the camera system is positioned at one particular geological object, an image will be captured through each filter. The resulting images can then be analyzed against spectral libraries in an attempt to determine the composition of the object [14]. The block diagram of the multispectral camera is shown in Figure 2.

Fig. 2. Block Diagram of Multispectral Camera System.
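To make the intended operation concrete, the following is a minimal sketch of the planned capture loop. The helper names (wheel.move_to, camera.grab) are hypothetical placeholders for illustration, not the actual Mightex driver API:

```python
# Minimal sketch of the planned capture sequence: one image per filter.
# The camera/wheel interfaces are hypothetical placeholders.
def capture_filter_stack(camera, wheel, num_filters=8):
    """Capture one monochrome image through each filter on the wheel."""
    images = []
    for position in range(num_filters):
        wheel.move_to(position)       # rotate the next filter into the optical path
        images.append(camera.grab())  # capture a frame through that filter
    return images
```

The resulting stack of per-band images is what would later be compared against the spectral libraries.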


The optical lens and the camera form the optical portion of the project, responsible for capturing the images. The filter wheel is located between the camera and the lens; it contains the various band-pass optical filters through which the images will be taken. The microcontroller is responsible for positioning the filter wheel and controlling the camera. The general specifications for the Mightex brand camera include USB connectivity, a 1/3” CMOS imaging sensor, and a C-mount monofocal lens. The camera will be controlled from a Linux platform. The lens is a Pentax brand C-mount monofocal manual-iris lens with a 12.5mm focal length. The filter wheel contains 8 filter locations, has C-mount connections for the camera and lens, and has a stepper motor for positioning the filters. The microcontroller board must be able to support the camera and run the code for image collection.

B. Optics – Lens and Camera

The lens and camera units form the imaging component of the multi-spectral camera system. In order to choose these components correctly, certain photographic principles needed to be learned, such as f-stop, sensor size, and iris functionality. The correct combination of parameters will allow an optimum image to be taken. Optical imperfections such as geometric, color, and radiometric aberrations will need to be dealt with during image analysis; the analysis of these aberrations will be attempted with software simulation.

A digital monochromatic camera was chosen for our system because it has better spatial resolution than a color camera. The lower resolution of a color camera is due to the fact that it must interpolate (also called demosaicing) the color captured within each individual pixel with the pixels directly surrounding it, which reduces the sharpness of an image’s contrast, producing ‘color aliasing’ or ‘edge artifacts’. A Bayer filter, which is a checkerboard of red, green, and blue filters (in a 25%, 50%, 25% ratio, respectively, Figure 3), sits on top of the camera’s sensor, one color filter covering one individual pixel, and blocks out the wavelengths of the other two colors (see Figure 4). Each pixel in a monochromatic camera records the intensity of the captured light regardless of wavelength, while each pixel in a color camera (which also has a monochromatic sensor, but with a Bayer filter over the top of it) records the light intensity only of the filtered wavelengths [16].

Fig. 4. Theory of Bayer Filter Blocking the Two Non-Filter Colors.
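As a quick illustration of the Bayer ratio described above, the short Python sketch below builds the repeating 2x2 red/green/green/blue tile and confirms the 25%/50%/25% proportions; it is purely illustrative:

```python
import numpy as np

# Build an 8x8 patch of the Bayer mosaic from its repeating 2x2 RGGB tile.
tile = np.array([["R", "G"],
                 ["G", "B"]])
mosaic = np.tile(tile, (4, 4))

# Confirm the 25% red / 50% green / 25% blue coverage of the sensor.
for color in "RGB":
    print(color, np.mean(mosaic == color))  # R 0.25, G 0.5, B 0.25
```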

Snell’s Law gives the relationship between the angle of incidence and the angle of refraction for light passing between two media with different indices of refraction, and can be used to determine the index of refraction of a given lens: n1*sin(θ1) = n2*sin(θ2), where n1 and n2 are the indices of refraction and θ1 and θ2 are the angles from the normal of the incident and refracted waves, respectively [13].
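As a worked example of Snell’s law, the sketch below solves for the refraction angle and flags total internal reflection when no real solution exists:

```python
import math

def refraction_angle(theta1_deg, n1, n2):
    """Solve n1*sin(theta1) = n2*sin(theta2) for theta2, in degrees."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1.0:
        return None  # total internal reflection: no refracted ray
    return math.degrees(math.asin(s))

# Light entering glass (n2 ~ 1.5) from air (n1 ~ 1.0) at 30 degrees
# bends toward the normal: theta2 ~ 19.47 degrees.
print(refraction_angle(30.0, 1.0, 1.5))
```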

Fig. 3. Bayer Filter.


The f-stop setting is a parameter of the iris/aperture diameter within the lens assembly. An iris is a light-blocking device that determines how much light enters the lens assembly and, ultimately, reaches the camera sensor or film. A smaller f-stop number allows more light into the lens, while a larger number allows in less light. Generally, a higher f-stop (allowing in less light) produces sharper images [18].

Using a ‘ray trace’ diagram (see Figure 11), it can be shown that a bi-concave lens with a -50mm focal length would extend the focal length to a suitable distance, which would put most, if not all, of the image fully onto the bi-convex lens. This diagram also shows that a bi-convex lens with a focal length of about +18mm should place the focused image directly onto the camera’s sensor. The unknown factor is how the optical filters will affect the focal length, if at all, of the camera system. Utilizing the thin lens equation, 1/(focal length) = 1/(object distance) + 1/(image distance), where the object distance equals the 300mm minimum object distance of our Pentax lens and the focal length is the -50mm of the bi-concave lens, we determine that the image produced by the bi-concave lens should be positioned at -60mm along the optical path.

A common type of aberration is ‘transverse chromatic aberration (TCA), which occurs when red, yellow, and blue wavelengths focus at separate points in a vertical plane [9]’. Chromatic aberrations occur because the ‘index of refraction in a medium varies with the wavelength [10]’. This type of aberration can be avoided by using an achromatic lens [11], which usually consists of two or even three lenses cemented together and corrects for the misalignment of the focal points of the different wavelengths [12]. Our multispectral camera uses a monochromatic imaging sensor viewing an object through narrow-bandwidth filters; therefore, no chromatic misalignment will be present, due to the narrow spectral range used for each image.

We chose a CMOS camera instead of a CCD camera because of our specification of a USB camera for our system; the most cost-effective USB camera we found was a CMOS model from Mightex.com. CMOS cameras require less power to operate [17], which may play a role in their ability to run from USB connectivity alone. “In a CCD sensor, every pixel's charge is transferred through a very limited number of output nodes (often just one) to be converted to voltage, buffered, and sent off-chip as an analog signal. All of the pixel can be devoted to light capture, and the output's uniformity (a key factor in image quality) is high. In a CMOS sensor, each pixel has its own charge-to-voltage conversion, and the sensor often also includes amplifiers, noise-correction, and digitization circuits, so that the chip outputs digital bits. These other functions increase the design complexity and reduce the area available for light capture. With each pixel doing its own conversion, uniformity is lower. But the chip can be built to require less off-chip circuitry for basic operation [17].” An alternative to our multispectral camera would be an off-the-shelf industrial camera of the kind most commonly used for aerial imagery applications, such as detecting forest fires or observing oil traces near oil or gas drilling operations [4][5].

Interference filters (also called dichroic filters) reject certain wavelengths of light while allowing others to pass through. The rejection occurs because these filters are constructed from layers (either materials or coatings) with different indices of refraction; certain wavelengths travelling from a lower index of refraction are reflected by material with a higher index [8]. This is also termed “destructive interference” [15]. Another class of filters, called ‘absorptive filters’, absorb the light of the wavelengths they are designed to stop from passing through [8][15]. Interference filters were determined to be a better choice for our design because they almost completely reflect unwanted wavelengths, rather than relying on the less effective absorptive mechanism [8][15]. Our images will contain more accurate filtered spectral responses by using interference-type filters that block most of the wavelengths they are intended to block.

C. Filter Wheel

The chosen multispectral camera design uses a filter wheel consisting of six or more optical bandpass interference filters. Through the positioning of individual filters into the optical path, bands of the electromagnetic spectrum can be selectively acquired by the CMOS camera sensor and captured as an image. The technology used in this subsystem includes the filter wheel, a bipolar stepper motor, a stepper motor driver, and a variety of bandpass optical filters. The bandpass interference filters transmit only a certain wavelength band and block the others [19]. The filter wheel came with several unidentified filters which needed to be analyzed; through the use of a spectrometer, a dark room, and a white light source, we were able to identify the passbands of these filters [20]. The graphs produced by the spectroscopic software show the wavelength ranges of the filters; Figure 8 and Figure 9 show two different wavelength bands that the filters pass while blocking all others.

The main objective that has yet to be achieved with the filter wheel is rotating the filters into the field of view using the stepper motor. A stepper motor turns a specific number of degrees when given a sequence of voltage levels. (A linear actuator, by contrast, produces linear motion back and forth along one axis when given a voltage.) Our filter wheel has a 4-wire stepper motor, which must be bipolar, since no other configuration is possible with four wires. By doing a back-EMF test, in which leads are shorted together to check whether the motor becomes hard to turn, we were able to identify phase A and phase B [21]. Below is a basic image of a bipolar stepper motor.

Fig. 5. Picture of a Bipolar 4-Wire Stepper Motor [22].

The next step for the stepper motor is to fully explore the alternatives available to drive it: a stepper motor driver, an H-bridge, or a Darlington array [23][24][25]. A stepper motor driver can drive a bipolar stepper motor with 4, 6, or 8 wires, and must itself be controlled by an Arduino or other microcontroller. Similarly, an H-bridge needs to be driven by a microcontroller circuit, but it is an alternate way to drive the motor now that we know it is functional. An H-bridge allows the polarity of the power applied to each end of each winding to be controlled independently [24]. A Darlington array, together with a microcontroller, can also be used to drive a 4-wire stepper motor; a Darlington is a compound structure consisting of two bipolar transistors connected so that the current is amplified by both transistors [26]. These design alternatives will be further explored and adapted to the stepper motor provided to us in the filter wheel.
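To make the H-bridge alternative concrete, the sketch below lists the standard full-step (two-phase-on) polarity sequence for the two windings of a bipolar motor; each pair gives the polarity an H-bridge would apply to phase A and phase B [24]. This is illustrative only, not code that was run on the hardware:

```python
# Full-step (two-phase-on) drive sequence for a bipolar stepper motor:
# polarity applied to winding A and winding B at each step.
FULL_STEP_SEQUENCE = [
    (+1, +1),  # A+, B+
    (-1, +1),  # A-, B+
    (-1, -1),  # A-, B-
    (+1, -1),  # A+, B-
]

def winding_polarities(step_count):
    """Polarity of (phase A, phase B) after a given number of steps.

    Reversing the direction of travel corresponds to walking this
    sequence backwards.
    """
    return FULL_STEP_SEQUENCE[step_count % 4]
```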


D. Microcontroller

A microcontroller is a small computer on an integrated circuit (IC). Microcontrollers are widely used in projects because of their small size and their ability to accomplish many tasks. Using a microcontroller involves writing a program on a computer and downloading it onto the microcontroller, which can then control a robotic device. We decided to use a microcontroller because it will take care of positioning the filter wheel as well as controlling the operation of the camera. There are many types of microcontrollers available, but for the scope of our project we chose a Raspberry Pi (RPi).

Fig. 6. Picture of a Raspberry Pi Model B.

The Raspberry Pi is a 700MHz ARM-based small computer that runs a custom Debian Linux [27]. Its primary programming language is Python. As seen in Fig. 6 above, the RPi has two USB ports for a mouse and a keyboard, an HDMI port for a display, a micro-USB port for power, an Ethernet jack, and General Purpose Input/Output (GPIO) pins. We chose a Raspberry Pi because it is a small, capable computer that can do almost anything a desktop computer can do. The microcontroller block is responsible for controlling the filter wheel and the camera; it will include a safeguard that stops the filter wheel after the necessary number of steps to position each filter in the camera’s field of view. To achieve this block, we had to learn the Python programming language, and we ran a few programs on the RPi in Python to gain proficiency in the language. We then wrote Python code that should be able to turn the stepper motor of the filter wheel with the proper step sizes. To use the RPi to control the filter wheel, we had to use a stepper motor driver; we decided on one both to avoid damage from excessive current and to get easier control of stepping and direction. We ordered the EasyDriver stepper motor driver from SparkFun for its flexibility. The whole circuit was connected as shown in Fig. 7.
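Below is a minimal sketch of the kind of Python we ran on the RPi to pulse the EasyDriver. The GPIO pin numbers, the motor’s 200 steps per revolution, and the pulse timing are assumptions for illustration, as is the EasyDriver’s default eighth-step microstepping:

```python
import time
import RPi.GPIO as GPIO

STEP_PIN = 23          # GPIO pin wired to the EasyDriver STEP input (assumed)
DIR_PIN = 24           # GPIO pin wired to the EasyDriver DIR input (assumed)
STEPS_PER_REV = 1600   # 200 full steps/rev x 8 (EasyDriver default microstepping)
NUM_FILTERS = 8        # filter positions on the wheel
STEPS_PER_FILTER = STEPS_PER_REV // NUM_FILTERS

def advance_one_filter(clockwise=True):
    """Rotate the wheel by one filter position."""
    GPIO.output(DIR_PIN, GPIO.HIGH if clockwise else GPIO.LOW)
    for _ in range(STEPS_PER_FILTER):
        GPIO.output(STEP_PIN, GPIO.HIGH)  # EasyDriver steps on the rising edge
        time.sleep(0.001)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(0.001)

GPIO.setmode(GPIO.BCM)
GPIO.setup(STEP_PIN, GPIO.OUT)
GPIO.setup(DIR_PIN, GPIO.OUT)
try:
    advance_one_filter()
finally:
    GPIO.cleanup()  # release the pins even if stepping fails
```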

Fig. 7. Stepper Motor of the Filter Wheel Connected to the Stepper Driver and the GPIO of the RPi.

The stepper motor driver (the red board in Fig. 7) was connected to the stepper motor and a 5V source; the pins connected to the RPi correspond to ground, direction, and step. We tried to test this circuit with Python code in lab but encountered a technical problem: the Raspberry Pi froze. One drawback of the RPi is its very delicate GPIO pins, which are connected directly to the Broadcom chip at the heart of the RPi; there is essentially no protection on these pins, and a wrong voltage down the wrong pin can damage the board [27]. The RPi froze during testing and would not turn back on. The Raspberry Pi online troubleshooting resources advised us to leave it powered off for a while. Because we were behind schedule, we decided to use an Arduino board as a backup to the RPi. An Arduino is another type of microcontroller: an open-source, single-board microcontroller derived from the Wiring platform, built around an Atmel AVR processor [28]. We used the Arduino Uno, as it was readily available in the lab, and wrote Arduino code to control the stepper motor based on similar codes in the Arduino library, substituting the Arduino for the RPi in the circuit of Fig. 7 with the corresponding pins. Unfortunately, we were not successful in getting the filter wheel to turn and could not pinpoint the source of the error. To use the RPi to control the camera, we installed the Linux driver for the Mightex camera on the RPi; we have not been able to test it because the RPi froze.

Fig. 8. Filter Spectrum, Wavelength 510 – 525nm.

Fig. 9. Filter Spectrum, Wavelength 433 – 438nm.

III. PROJECT MANAGEMENT

Our MDR goals are shown in the Gantt chart in Figure 10. We have accomplished the goals of choosing and ordering the components needed to build our multispectral camera system. The remaining MDR goals center on integrating the different parts with the microcontroller: integrating the stepper motor driver with the microcontroller to control the wheel rotation, integrating the CMOS camera with the microcontroller through Linux, performing the optical aberration calibrations to remove as many aberrations as possible from our final images, and running camera tests to prove out our results.

The team has communicated very well. We share important files and documents through a shared Dropbox and talk regularly via text messages and email. From now on, each member of our team is working closely with the microcontroller, since the project is now heavily centered on it, and with the expertise gained from working on our individual parts, we expect to be able to produce the needed results by CDR.

IV. CONCLUSION

Our project is moving forward. We have almost all the parts needed and have done a great deal of research to choose the right parts. We did not meet all of our MDR deliverables, but we have learned a lot this semester and will double our efforts to meet our CDR goals.

Our plan for the future is to get more practical work done, as much of what we did this semester was theory and research. We are going to assemble all the parts together and make the system work. We expect some difficulties with the camera if we cannot obtain a different driver for it, and we anticipate difficulties with the delicate GPIOs of the RPi. We also expect the coding to be challenging, since we are all EEs. Despite these issues, we will get there with hard work and teamwork.


Lifelong-learning technologies used in this project include Matlab, a tool that was evaluated early in the design of the multispectral camera system and may be used extensively in the analysis and image manipulation required for the ultimate goal. Another lifelong-learning tool, really a skill, is ‘ray tracing’ as a way to determine the optical characteristics of an imaging system; once a basic familiarity is obtained, it can be used as a design tool.

ACKNOWLEDGMENT

We would like to thank Seahorse Bioscience for donating the filter wheel that we are currently using for our senior design project.

REFERENCES

[1] C. McFee, “An Introduction to CCD Operation,” mssl.ucl.ac.uk, [Online] 2012, URL http://www.mssl.ucl.ac.uk/www_detector/ccdgroup/optheory/ccdoperation.html#behaviour (Accessed: 4 Dec 2012).

[2] R. Smith, (2012, Jan), “Introduction to Hyperspectral Imaging,” URL http://www.microimages.com/documentation/Tutorials/hyprspec.pdf (Downloaded: 4 Dec 2012).

[3] Space Computer Corporation, (2007), “An Introduction to Hyperspectral Imaging Technology,” URL https://www.spacecomputer.com/documents/Introduction_to_HSI_Technology.pdf (Downloaded: 4 Dec 2012).

[4] J. Hintz, (2001, Jan), “Using Multi-spectral Imagery to Help Fight Fires,” Earth Observation Magazine [Online], 10(1), Available: http://web.archive.org/web/20081013192544/http://www.eomonline.com/Common/Archives/2001jan/01jan_hintz.html

[5] J. Ellis, (2001, Jan), “Searching for Oil Seeps & Oil-impacted Soil with Hyperspectral Imagery,” Earth Observation Magazine [Online], 10(1), Available: http://web.archive.org/web/20080723151224/http://www.eomonline.com/Common/Archives/2001jan/01jan_ellis.html

[6] Tremolite, mindat.org, [Online] 2012, URL http://www.mindat.org/min-4011.html (Accessed: 4 Dec 2012).

[7] R. Clark, et al., “Imaging Spectroscopy: Earth and Planetary Remote Sensing with the USGS Tetracorder and Expert Systems,” speclab.cr.usgs.gov, [Online] 2003, URL http://speclab.cr.usgs.gov/PAPERS/tetracorder/ (Accessed: 4 Dec 2012).

[8] Optical Filters, edmundoptics.com, [Online] 2012, URL http://www.edmundoptics.com/learning-and-support/technical/learning-center/application-notes/optics/optical-filters (Accessed: 8 Dec 2012).

[9] Chromatic and Monochromatic Optical Aberrations, edmundoptics.com, [Online] 2012, URL http://www.edmundoptics.com/learning-and-support/technical/learning-center/application-notes/optics/chromatic-and-monochromatic-optical-aberrations/ (Accessed: 8 Dec 2012).

[10] A. Roorda, (2011), “Optical Aberrations, Laboratory 7,” URL http://vision.berkeley.edu/roordalab/VS203BWebsite/Labs/VS203B_LAB7.pdf (Downloaded: 8 Dec 2012).

[11] J. Alves, “Correcting and Preventing Chromatic Aberration,” [Online] 2010, URL http://www.tutorial9.net/tutorials/photography-tutorials/correcting-and-preventing-chromatic-aberration (Accessed: 8 Dec 2012).

[12] Why Use an Achromatic Lens?, edmundoptics.com, [Online] 2012, URL http://www.edmundoptics.com/learning-and-support/technical/learning-center/application-notes/optics/why-use-an-achromatic-lens (Accessed: 8 Dec 2012).

[13] E. Weisstein, “Snell’s Law,” scienceworld.wolfram.com, [Online] 2007, URL http://scienceworld.wolfram.com/physics/SnellsLaw.html (Accessed: 8 Dec 2012).

[14] USGS Digital Spectral Library, speclab.cr.usgs.gov, [Online] 2007, URL http://speclab.cr.usgs.gov/spectral-lib.html (Accessed: 8 Dec 2012).

[15] Light Filtration, olympusmicro.com, [Online] 2012, URL http://www.olympusmicro.com/primer/lightandcolor/filter.html (Accessed: 9 Dec 2012).

[16] Fluid Imaging Technologies, (2010), “Color Versus Black & White Cameras,” URL http://www.fluidimaging.com/bae35316-08b9-4861-beb0-1c39079e41fb/download.htm (Downloaded: 9 Dec 2012).

[17] CCD vs. CMOS, teledynedalsa.com, [Online] 2012, URL http://www.teledynedalsa.com/corp/markets/CCD_vs_CMOS.aspx (Accessed: 9 Dec 2012).

[18] P. Hill, “The Easy Guide to Understanding Aperture (f-Stop),” [Online] 2010, URL http://www.redbubble.com/people/peterh111/journal/5725038-the-easy-guide-to-understanding-aperture-f-stop (Accessed: 9 Dec 2012).

[19] Interference Filter, optometrics.com, [Online] 2009, URL http://www.optometrics.com/interference_filter.html (Accessed: 9 Dec 2012).

[20] BWSpec, Nano Ram website, [Online] 2011, URL http://www.nanoram.com/bwspec-2/ (Accessed: 9 Dec 2012).

[21] Avayan, (2011, Dec), “Having Fun With Brushless Motors,” EBLDC Magazine [Online], Available: http://www.ebldc.com/?p=253

[22] M. Colin, (2010, Nov), “Stepper Motor Controller,” Talking Electronics Newsletter [Online], Available: http://www.talkingelectronics.com/projects/Stepper%20Motor%20Controller/StepperMotor.html

[23] D. Thompson, (2010, May), “EasyDriver 4.2 Tutorial,” Dan Thompson’s Blog [Online], Available: http://danthompsonsblog.blogspot.com/2010/05/easydriver-42-tutorial.html

[24] B. Stephens, (2006, July), “Stepper Motor Intro,” Northwestern University Mechatronics Design Laboratory [Online], Available: http://mechatronics.mech.northwestern.edu/design_ref/actuators/stepper_intro.html

[25] Admin, (2011, Mar), “Driving a Bipolar Stepper Motor with Arduino and ULN2803AG,” eLABZ.com [Online], Available: http://elabz.com/driving-a-bipolar-stepper-motor-with-arduino-and-uln2803ag/

[26] D. A. Hodges, (1999, Jan), “Darlington’s Contributions to Transistor Circuit Design,” UC Berkeley [Online], Available: http://andros.eecs.berkeley.edu/~hodges/DarlingtonCircuit.pdf (Downloaded: 7 Dec 2012).

[27] Raspberry Pi, [Online] 2012 (Accessed: 10 Dec 2012).

[28] “Arduino,” Wikipedia, Wikimedia Foundation, Inc., 22 July 2004, URL http://en.wikipedia.org/wiki/Arduino (Accessed: 10 Dec 2012).

Fig. 10. Gantt Chart.

Fig. 11. Ray Trace of the Multispectral Camera System.
