A distinction must be made between the "measured particle size range", e.g. 0.02 μm – 2000 μm, quoted in the specifications of various laser diffraction measurement instruments, and their "adjustable measuring range".
Besides the measured particle size range, every instrument has a quite limited adjustable measuring range, i.e. the distance between the minimum and maximum sizes measurable at one and the same setting of the optical-electronic system. This characterizes the so-called measurement dynamics.
To determine it, the parameters of the analog-to-digital converter that measures the light intensity on the laser micrometer's photovoltaic cells must be known.
Each illuminated photovoltaic cell generates an electric current that can be converted into a voltage measurable by the A/D converter.
The A/D converter is a voltage-measuring system with a specific number of measurement intervals (channels) corresponding to gradual voltage increments. The lowest interval should correspond to the photovoltaic cell's illumination by a single particle, and the highest interval to its illumination by the maximum number of particles; a larger number of particles can no longer be distinguished by the A/D converter. If there "is" one particle in the A/D converter's first size class, then in the last, i.e. highest, class there should "be" as many particles as the A/D converter has classes. This is the converter's resolution at a single measurement.
The single measurement of the maximum particle is very important, because one 2 mm particle is equivalent to 2000³, i.e. 8,000,000,000, particles of 1 μm.
A failure to measure that single maximum particle calls into question the measurement of the entire set of particles.
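The arithmetic behind this equivalence can be checked directly. The snippet below is only an illustration of the cube-law scaling; no instrument specifics are assumed:

```python
# One 2 mm particle holds as much material as (2000/1)**3 particles of
# 1 um, because 2 mm = 2000 um and volume scales with the cube of size.
d_small_um = 1.0
d_large_um = 2000.0  # 2 mm expressed in micrometers

equivalent_count = (d_large_um / d_small_um) ** 3
print(f"{equivalent_count:,.0f} equivalent 1 um particles")
# -> 8,000,000,000 equivalent 1 um particles
```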
If we adopt a minimum particle size, the question arises what the maximum particle size would be at a specific setting of the optical system and a given measuring capacity of the A/D converter.
For particles of various sizes, their volumes may be described and compared by the following formula:

n · (π/6) · d³ = (π/6) · D³

after simplification:

D = d · ∛n

where d is the minimum particle size, D the maximum particle size, and n the A/D converter's resolution. Taking the converter's average total resolution n = 100 000 and calculating from the resulting formula D ≈ 50 d for the instrument's various measuring ranges, we obtain:
- for minimum sizes: d = 0.02 μm, D = 1 μm
- for popular sizes: d = 1 μm, D = 50 μm
- for maximum sizes: d = 40 μm, D = 2000 μm
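The D ≈ 50 d rule follows from equating volumes, as a short sketch shows. The cube-root relation and n = 100 000 come from the text above; the function name is an illustrative choice:

```python
def max_particle_size(d_um: float, n_channels: int = 100_000) -> float:
    """Largest size D measurable alongside the smallest size d at one
    optical-electronic setting, assuming the volume ratio (D/d)**3
    equals the converter's resolution n."""
    return d_um * n_channels ** (1 / 3)

# The cube root of 100 000 is about 46.4, which the text rounds to 50:
for d in (0.02, 1.0, 40.0):
    print(f"d = {d:>5} um  ->  D = {max_particle_size(d):8.2f} um")
```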
When the number of particles in a set is many times greater than the converter's resolution, the measurement results are summed and averaged. The largest single particles then disappear from the result, and the instrument's effective measuring range becomes even narrower.
Why is this possible?
Because in the laser diffraction method every particle is represented by an analog signal, and only the sum of the analog signals is converted by the analog-to-digital converter into a digital signal input to a computer.
This sum of analog signals was originally the sum of light intensities on a photovoltaic cell, converted first into a current and then into a voltage measurable by the A/D converter.
Many conversions mean little accuracy, especially since this light is also reflected off the particle, and when the particle is not spherical the measurement result does not always match its actual size.
Moreover, the measurement result depends on the particle’s optical properties.
The Mie theory applies only to perfectly spherical and transparent particles. Where a particle's surface is matte and its shape irregular, the measurement result is even less accurate.
That is why sieve analysis-based measurements are hardly comparable with laser diffraction measurements, although such a comparison ought to be perfectly straightforward.
In the era of computers and electronic (digital) measurement methods, analog measurements are anachronistic, even if they are laser-based. In any case, the laser diodes now in use lack exact operating stability over time and geometric homogeneity of radiation intensity; yet this suffices for analog measurements.
Now, with very fast electronic elements, each particle may be measured separately, and a multi-million set of particles may be counted, measured, and sorted into fractions within a few minutes.
Modern measurement methods are free of laser diffraction measurement’s several significant weaknesses.
By scanning each particle separately, its shape can be determined at measurement, which enables automatic selection of the relevant algorithm for calculating its volume. With a laser micrometer, particle shape and optical properties would have to be identified prior to measurement; otherwise the measurement's poor accuracy is easy to predict.
The point is to compare what is seen under a microscope, or is measurable by any classical method, e.g. on sieves, with the result obtained using a laser micrometer.
The laser diffraction micrometers' greatest success is that many people have come to believe a measurement can be made within 20 seconds. This is indeed possible in particular cases, when the measured substance is perfectly suited to such measurement. But in 20 seconds neither all pudding stones nor all conglomerates can be split, and certainly no entire representative sample can be measured in dry condition. Actual measurements take much longer.
Where a digital measurement is used, the entire representative sample should be measured, which takes a few minutes, even though a small set of particles may be measured within as little as 10 seconds.
A major problem in laser diffraction measurement is posed by multimodal, i.e. "multi-hump", distributions. Obtaining such a distribution with a laser diffraction micrometer is quite a challenge. The commonly seen bimodal distributions typically result from an error of underrating the measurement result and from adding background and noise to the distribution. In such cases the first "hump" is generally situated at circa 1 μm and stretches below this value.
Fast voltage scanning in digital methods determines the zero level automatically. Since a single particle is always measured, the whole A/D converter range is available to it. It may be accurately measured, without interaction from any other particles, and digitally recorded in computer memory. With all particle sizes saved in memory, multimodal results no longer pose a problem.
Laser diffraction micrometers can determine one dimension only; what can digital instruments do?
Using the recorded scanning duration, at a known velocity of the particle's movement through a defined measurement space, the particle's second dimension may be determined.
Where digital technology-based instruments are used, measuring circuits in various geometric configurations may be added to the individual particle measurement in order to measure the particles three-dimensionally.
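The second-dimension idea can be sketched in a few lines. The timing and velocity values below are illustrative assumptions, not taken from any specific instrument:

```python
def second_dimension_um(scan_duration_s: float, velocity_um_per_s: float) -> float:
    """Particle extent along the direction of travel: the time the particle
    takes to cross the sensing beam multiplied by its (known) velocity."""
    return scan_duration_s * velocity_um_per_s

# Example: an impulse lasting 100 us at an assumed flow velocity of 5 m/s
print(second_dimension_um(100e-6, 5_000_000))  # -> 500.0 (um)
```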
Some selected issues in laser diffraction measurement have been described above and compared with new measurement methods involving state-of-the-art optical/electronic technologies. Those who would like a detailed, comprehensive assessment of laser diffraction micrometers are referred to the NIST (National Institute of Standards and Technology) publication NIST Recommended Practice Guide, Special Publication 960-1, "Particle Size Characterization" (http://www.msel.nist.gov/practiceguides/SP960_1.pdf).
The decisive argument in many tenders is a reference to compliance with the ISO 13320-1 standard.
That standard describes the laser diffraction measurement method only in general terms, without detailed reference to the instrument's specification, measurement mode, or accuracy.
The measurement methodology described below involves measuring and counting each individual particle using an optical/electronic system. This may be accomplished in various ways, typically by measuring radiation dispersion; sometimes diffraction may affect the dispersion. A particle's two dimensions are measured by a 12-bit or 16-bit A/D converter with a sampling frequency of up to 500 kHz. An analog impulse corresponding to the particle's size is measured by the A/D converter: the impulse's amplitude corresponds to the particle's maximum dimension, while the impulse's width corresponds to the particle's thickness. The maximum particle dimension so measured depends on the particle's mode of movement in the measurement space and may be controlled by an appropriate dispenser device.
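A minimal sketch of that impulse interpretation follows, under stated assumptions: the sampling rate comes from the text above, while the threshold, calibration constant, and flow velocity are hypothetical placeholders:

```python
SAMPLING_HZ = 500_000      # up to 500 kHz, per the text
THRESHOLD = 50             # hypothetical channel level separating signal from background
UM_PER_CHANNEL = 0.5       # hypothetical amplitude-to-size calibration
FLOW_UM_PER_S = 2_000_000  # hypothetical particle velocity (2 m/s)

def measure_impulse(samples):
    """Return (max_dimension_um, thickness_um) for one digitized impulse:
    the peak channel gives the maximum dimension, and the impulse width
    (samples above threshold) times the flow velocity gives the thickness."""
    above = [s for s in samples if s > THRESHOLD]
    if not above:
        return None  # background only, no particle detected
    max_dimension = max(above) * UM_PER_CHANNEL
    thickness = len(above) / SAMPLING_HZ * FLOW_UM_PER_S
    return max_dimension, thickness

# A fabricated 10-sample impulse from the converter:
print(measure_impulse([0, 10, 120, 800, 2100, 1900, 600, 90, 5, 0]))
```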
Results of the individual impulse measurements are in digital format and input directly to a computer, where they are saved on disk. A particle's optical properties do not affect such measurements, because for even a perfectly transparent particle to disperse light, it is enough that its specific gravity differs from that of air or water.
Importantly, the measurement is made in a parallel radiation beam, in a measurement space of determined dimensions, selected according to the sizes of the measured objects. Measurement spaces with cross-sections from a few mm² up to 60,000 mm² are available (in various instruments).
A particle's actual dimensions are obtained by analyzing its scans.
In this method air is typically used for dispensing granular materials, but suspended matter may be dispensed with water or, subject to alteration of the dispenser device, with aqueous solutions.
Measuring all dry materials in air simplifies measurement preparation and performance. Such measurement may even be applied to wet materials or materials that stick together.
The maximum grain size and the electric impulse's amplitude, as well as the minimum grain size determined by the impulse's width, are strictly interrelated. The measured and counted impulses enable an unambiguous, accurate, and repeatable determination of the grain set in electric units, i.e. in converter channels, that may be recorded in computer memory.
A set of grains saved in computer memory as a statistical distribution of quantities and sizes may, after transformation into a volume distribution, be compared with actual measurements taken by classical methods. From each such comparison, a calibration characteristic of the optical/electronic measurement instrument may be obtained.
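The count-to-volume transformation can be sketched as follows. The spherical-volume formula (π/6 · d³) is one possible shape assumption, and the function name is an illustrative choice:

```python
import math

def number_to_volume_distribution(sizes_um, counts):
    """Convert per-size particle counts into volume fractions (summing to 1),
    assuming spherical particles of volume pi/6 * d**3."""
    volumes = [n * math.pi / 6 * d ** 3 for d, n in zip(sizes_um, counts)]
    total = sum(volumes)
    return [v / total for v in volumes]

# A million 1 um grains, a thousand 10 um grains, and one 100 um grain
# each contribute the same total volume, so the fractions come out equal:
print(number_to_volume_distribution([1.0, 10.0, 100.0], [1_000_000, 1_000, 1]))
```

This is also why a single unmeasured large grain distorts the volume distribution far more than it distorts the count distribution.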
A digital micrometer's calibration is assigned to a specific measurement method or to particular grain shapes. For the same measurement, grain-size distribution results may be obtained according to spherical, sieve, or, for instance, sedimentation calibrations.
Which of these calibrations is the best? They are all good, if used consistently in industrial process control. The problem lies not in calibration but in the unambiguous and accurate measurement of grains, which is ensured by digital micrometers, i.e. instruments that measure individual grains one after another.
With measurements so made, which correspond to the actual dimensions, a sieve or densimetric analysis may be simulated 100 %.
Usually a digital micrometer's entire measuring range is divided into a number of sub-ranges for simple optimization of measurements. If no large particles are present, the measuring range may be narrowed; an appropriate measuring range may be matched to each particle size. For each such range the converter operates at its maximum resolution, which sets new standards in particle size measurement.
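The sub-range idea can be illustrated with a toy selector. The boundary values echo the d/D pairs listed earlier in the text; everything else is an assumption:

```python
# Illustrative sub-ranges (low, high) in um, echoing the d/D pairs above
SUB_RANGES_UM = [(0.02, 1.0), (1.0, 50.0), (40.0, 2000.0)]

def pick_sub_range(largest_particle_um: float):
    """Return the narrowest sub-range covering the largest particle present,
    so the converter's full resolution is spent only on sizes that occur."""
    for low, high in SUB_RANGES_UM:
        if largest_particle_um <= high:
            return low, high
    raise ValueError("particle exceeds the instrument's total measuring range")

print(pick_sub_range(30.0))  # -> (1.0, 50.0)
```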
Dr Eng. Stanisław Kamiński
P.S. Digital micrometers can also operate based on the laser light scattering method.