Semiconductor chip manufacturing cost consists of die cost, package cost, and test cost. The trends of increasing design complexity, increasing quality needs, and new process nodes and defect models are pushing test cost to the forefront. This is especially true for high-resolution data converters, whose accurate testing requires expensive instruments and is extremely time-consuming. As a result, linearity test of data converters often dominates the overall test cost of SoCs. This talk will present several recently developed techniques for reducing linearity test cost by dramatically reducing measurement time and dramatically relaxing instrumentation requirements.
The IEEE standard for ADC linearity test requires the stimulus signal to be at least 10 times more accurate than the ADC under test. To relax this stringent requirement, the SEIR (stimulus error identification and removal) algorithm was developed to accurately test high-resolution ADCs using nonlinear stimuli. Industrial demonstrations have achieved more than 16 bits of ADC test accuracy using 7-bit-linear ramps instead of the 20-bit-linear ramps required by the IEEE standard, a relaxation of well over 1000 times in the instrumentation accuracy requirement.
The biggest contributor to test cost is the long measurement time. The recently developed uSMILE (ultrafast Segmented Model Identification for Linearity Errors) algorithm dramatically reduces the measurement time needed for ADC linearity test. Using a system identification approach with a segmented model for the integral nonlinearity, the algorithm reduces test time by a factor of over 100 while achieving test accuracies superior to the standard histogram test method. This method has been extensively validated by industry and has been adopted for production test of multiple product families.
By combining the salient features of SEIR and uSMILE, the ultrafast stimulus error removal and segmented model identification of linearity errors (USER-SMILE) algorithm was developed. USER-SMILE applies two nonlinear signals to the ADC under test, one shifted by a constant voltage with respect to the other. By subtracting the two sets of output codes, the input signal is canceled, and the nonlinearity of the ADC, modeled by a segmented non-parametric INL model, is identified with the least-squares method.
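The subtract-and-solve step can be illustrated with a simplified numerical sketch. All values below (ADC resolution, segment size, shift, and the "true" INL shape) are illustrative assumptions for a toy simulation, not the published implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
N_CODES = 1024                       # illustrative 10-bit ADC
LSB = 1.0 / N_CODES

codes = np.arange(N_CODES)
inl_true = 2.0 * np.sin(2 * np.pi * codes / N_CODES)   # assumed "true" INL, in LSB

def adc(x):
    """Quantize x in [0, 1) with transition levels perturbed by the true INL."""
    t = (codes + inl_true) * LSB     # transition level of code c
    return np.clip(np.searchsorted(t, x) - 1, 0, N_CODES - 1)

M = 100_000
x = rng.uniform(0.05, 0.85, M)       # unknown (possibly nonlinear) stimulus
alpha = 0.1                          # constant voltage shift, in full-scale units
c1, c2 = adc(x), adc(x + alpha)

# Per sample: (c2 - c1) + INL(c2) - INL(c1) - alpha/LSB ~= 0 (within quantization).
# Segmented model: one INL unknown per 16-code segment, plus the unknown shift.
SEG = 16
A = np.zeros((M, N_CODES // SEG + 1))
A[np.arange(M), c2 // SEG] += 1.0
A[np.arange(M), c1 // SEG] -= 1.0
A[:, -1] = -1.0                      # coefficient of the shift (in LSB)
b = -(c2 - c1).astype(float)
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
inl_est = sol[:-1]

# Compare with the true per-segment INL on visited segments; the constant and
# linear INL components are removed, since differences cannot observe them.
seg_true = inl_true.reshape(-1, SEG).mean(axis=1)
visited = np.flatnonzero(np.abs(A[:, :-1]).sum(axis=0) > 0)
def detrend(v, i):
    return v - np.polyval(np.polyfit(i, v, 1), i)
err = np.max(np.abs(detrend(inl_est[visited], visited)
                    - detrend(seg_true[visited], visited)))
```

Even with a stimulus that is far from linear, the least-squares solution recovers the segmented INL to a small fraction of an LSB, which is the essence of the instrumentation relaxation.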
A completely on-chip ADC BIST circuit based on the USER-SMILE algorithm has been developed and demonstrated on a 28 nm CMOS automotive microcontroller. The ADC test subsystem includes a nonlinear DAC as signal generator, a built-in voltage-shift generator, a BIST computation engine, and dedicated memory cells. Silicon measurements confirm accurate test results. The INL test results are further used to correct ADC linearity errors, providing a method for reliably calibrating the ADC. Measurement results demonstrated that the BIST-based calibration achieved >10 dB THD/SFDR improvements over the existing calibration method used by industry.
The convergence of healthcare, instrumentation, and measurement technologies will transform healthcare as we know it, improving the quality of healthcare services, reducing inefficiencies, curbing costs, and improving quality of life. Smart sensors, wearable devices, Internet of Things (IoT) platforms, and big data offer new and exciting possibilities for more robust, reliable, flexible, and low-cost healthcare systems and patient care strategies. These may provide value-added information and functionality for patients, particularly those with neuro-motor impairments, and are of great importance in developed countries in the context of population ageing. This invited talk will focus on hardware and software infrastructure for neuro-motor rehabilitation, highlighting solutions for motor rehabilitation based on virtual reality and serious games. As part of these interactive environments, 3D image sensors for natural user interaction with rehabilitation scenarios and remote sensing of user movement, as well as thermographic cameras for remote evaluation of muscle activity, will be presented. Additionally, technologies for unobtrusive monitoring of patient posture, balance, and walking gait during neuro-motor rehabilitation will be presented, as well as prototypes such as smart walkers and a force platform that provide quantitative information on the outcome of the physical rehabilitation process.
Challenges related to simple and secure connectivity, signal processing, data storage, data representation, data analysis including the development of specific metrics that can be used to evaluate the progress of the patients during the rehabilitation process will be discussed.
The scientific and industrial worlds have recently renewed their interest in the basic rules for performing reliability, availability, and safety analysis and design of complex electro-mechanical systems. The main failure modes of electronic devices and sensors, as well as the main techniques for failure mode investigation, are of interest in modern system design. Statistical characterization of the main probability density functions and degradation models is mandatory to build lasting and safe products. The main reliability design techniques, such as fault tree analysis, the cut set method, the minimal path approach, and critical block analysis, are requested by companies worldwide, as is knowledge of the main failure modes and of reliability databases and handbooks such as MIL-HDBK-217, OREDA, and BELLCORE. Maintenance policies, with special attention to corrective and preventive ones, are also affected by reliability design in terms of advantages and disadvantages when applied to electro-mechanical systems. The main safety standards, such as IEC 61508, IEC 61511, EN 50129, EN 50128, and EN 50126, are usually considered in industrial design. The aim of this talk is to enable companies to develop confidence in advanced modelling techniques for reliability, availability, and safe design. In addition to traditional and well-known statistical models, innovative modelling techniques based on statistical data representation will be introduced and tailored to specific case studies in the fields of bio-instrumentation, transportation, and oil & gas.
The talk by Prof. Sergio Saponara will focus on sensor and measurement systems for new generations of vehicles with driver-assisted/autonomous capability. This is the main trend that is revolutionizing vehicles and the mobility of people and goods and is also making our cities smart.
The economic and social impacts of this application field are huge, and the borders between automotive and ICT are blurring, with major investments not only from conventional carmakers and tier-1 component providers but also from ICT and semiconductor companies.
The key enabling technologies for this scenario are sensing and measurement systems, needed for accurate vehicle positioning and navigation, vehicle context-awareness, obstacle detection and collision avoidance, and driver-assistance.
During the lecture, Prof. Saponara will outline innovation and market trends in the above domain, including new context-aware sensors (radar, lidar, cameras) and their data processing, which requires AI and high-performance computing platforms, as well as on-board sensors for positioning and navigation, including recent advances in MEMS accelerometers and gyroscopes. Finally, he will analyze trends in computing platforms, where multi-core and heterogeneous architectures and machine learning/AI (artificial intelligence) techniques are used to manage multiple heterogeneous sources of measurements in real time and take autonomous decisions.
The heart is an amazing organic engine that converts chemical energy into work. Each heartbeat begins with an electrically-released 'spark' of calcium, which triggers force development and cell shortening, at the cost of energy and oxygen, and the release of heat. We have developed new measurement systems to measure all of these processes simultaneously while subjecting isolated samples of heart tissue to realistic contraction patterns that mimic the loads experienced by the heart with each beat. These devices are effective 'dynamometers' for the heart, that allow us to measure the performance of the heart and its tissues, much in the same way that you might test the performance of your motor vehicle on a 'dyno.'
In this talk, I will overview how we have developed our own actuators, force transducers, heat sensors, and optical measurement systems to study nature's engine: heart muscle. Heart muscle force and length are measured and controlled, beat by beat, to micronewton and nanometer precision by a laser interferometer. At the same time, the muscle is scanned in the view of an optical microscope equipped with a fluorescent calcium imaging system. The changing muscle geometry is monitored in 4D by a custom-built optical coherence tomograph, and the spacing of muscle proteins is imaged in real time by transmission microscopy. Muscle heat production is measured to nanowatt precision using thermopile sensors. We combine all of these technologies with a hardware-based real-time acquisition and control environment and interpret results with the aid of a computational model.
Our dynamometer allows us to diagnose the 'performance' of heart tissue, even as it is affected by disease, exercise, drugs, and diet. By applying this 'bioengineering approach' to the study of heart tissues, we have gathered new insight into the function of the heart - information that we hope will lead to better treatment and management of the engine upon which we all rely!
Modern control algorithms in emerging power systems process information delivered mainly by distributed, synchronized measurement systems and available in data streams with different reporting rates. Beyond the existing measurement approaches currently embedded in the SCADA framework and in smart meters, high-reporting-rate synchronized measurement devices such as phasor measurement units (PMUs) are increasingly being deployed.
In several applications, synchronized data received at a high reporting rate must be used together with aggregated data from measurement equipment with a lower reporting rate (complying, for example, with power quality data aggregation standards). The questions are how adequate the energy transfer lossy information models are and how to correlate the measurement data streams.
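To make the rate-matching problem concrete, a sketch of square-mean-root aggregation of a high-rate magnitude stream into lower-rate values, in the spirit of the IEC 61000-4-30 aggregation chain, is shown below. The PMU reporting rate, signal model, and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
FS_PMU = 50                          # assumed PMU reporting rate, frames per second
T = 600                              # ten minutes of data
# Synthetic voltage-magnitude stream around a 230 V nominal value
v = 230.0 + 0.5 * rng.standard_normal(FS_PMU * T)

def aggregate_rms(x, n):
    """Square-mean-root aggregation of n consecutive reports into one value."""
    x = x[: len(x) // n * n].reshape(-1, n)
    return np.sqrt((x ** 2).mean(axis=1))

per_second = aggregate_rms(v, FS_PMU)        # 1 s aggregates from PMU frames
per_10min = aggregate_rms(per_second, 600)   # 10 min aggregate from 1 s values
```

The lower-rate values can then be aligned in time with aggregates from conventional power quality instruments for cross-stream correlation.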
The talk will address:
Medicine 4.0 is a great revolution in patient care, and new horizons are possible today: relocation of services, meaning remote monitoring and remote diagnosis without direct contact between the doctor and the patient. Hospitals are freed from routine tests that can be performed by patients at home and reviewed by doctors over the internet. Telemedicine is not WhatsApp, where an elderly person tries to chat with a doctor; it is a complete remote medical center connected to smart devices able to measure objective vital parameters. Medicine 4.0 requires new technologies for smart sensors, but artificial intelligence is also required to perform smart analysis using these smart sensors. A.I. is used both to manage intensive care rooms and to perform better and faster analyses. In this webinar, we will see how to use machine learning techniques to improve ECG analysis and leg ulcer treatment.
Keywords: Telemedicine, Artificial Intelligence, Artificial Neural Networks, Electronic Health, Smart Sensors
The full video tutorial will be available on the IEEE Learning Network in the coming weeks.
Topical administration and transdermal delivery are advantageous in comparison with systemic administration routes because complications such as first-pass metabolism, toxicity, and side effects are attenuated for the patient. Their effectiveness is evaluated by numerous studies in the laboratory. However, their performance outside the laboratory, in the context of daily clinical application, is less investigated.
After an introduction on EIS for prosthesis osseointegration diagnostics in audiology and on artificial intelligence for model definition in EIS, this tutorial presents a measurement method (in aesthetic medicine) to evaluate the amount of a substance delivered into a tissue. A second measurement method, for indirectly assessing insulin bioavailability, is then discussed: the leakage of a given amount of insulin produces a corresponding variation in the measured equivalent impedance in the administration volume.
Keywords: Electrochemical Impedance Spectroscopy, drug bioavailability assessment, transdermal delivery, personalized medicine
View the Full Video Tutorial
Low-noise current and charge sensing circuits are pivotal in a large variety of biomedical instruments, spanning from electrochemical biosensors to photodetectors for visible light and gamma radiation. In this tutorial, common electronic design challenges and guidelines for circuits detecting charge and current will be discussed, with special focus on front-end CMOS ASICs. Key aspects of the signal readout chain will be covered, from the front-end to back-end processing for charge, current, and impedance sensing. Application examples will focus on diagnostics, from the (apparently opposite) perspectives of electrochemical nano-biosensors leveraging molecular affinity (miniaturized and integrated with lab-on-chip microfluidics) for monitoring single cells, up to hospital-based scanners for multi-modal medical imaging.
Keywords: Impedance, low-noise, analog electronics, charge detection, lab-on-chip, bio-sensors.
Design of experiments (DOE) is an approach to experimental work that aims to produce the best predictions in the most economical way.
The world is noisy and multifactorial. Experiments are necessary to assess reality. But experiments can have a high cost in terms of time, delay and resources. A trade-off between costs and benefits of the experimental campaign must be done. Based on statistics, DOE offers tools for academic and industrial research. It ensures that the quality of the data is adequate to answer the experimenter’s questions.
The presentation shows with two examples how DOE is implemented and the type of insight it can bring.
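The kind of insight DOE brings can be sketched with a two-level full factorial design. The "process" below is a made-up response function with a strong main effect and a two-factor interaction, used only to show how coded runs expose effects that one-factor-at-a-time experimentation would miss:

```python
import numpy as np

# 2^3 full factorial design in coded units (-1/+1) for factors A, B, C
levels = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])

def run_experiment(a, b, c, rng):
    """Hypothetical process: strong A effect, A*B interaction, small noise."""
    return 10.0 + 3.0 * a + 0.2 * b + 1.5 * a * b + rng.normal(0, 0.1)

rng = np.random.default_rng(0)
y = np.array([run_experiment(a, b, c, rng) for a, b, c in levels])

# Main effect of each factor: mean response at +1 minus mean response at -1
effects = {name: y[levels[:, i] == 1].mean() - y[levels[:, i] == -1].mean()
           for i, name in enumerate("ABC")}
# Interaction effect of A*B, estimated from the sign of the product column
ab_effect = (y[(levels[:, 0] * levels[:, 1]) == 1].mean()
             - y[(levels[:, 0] * levels[:, 1]) == -1].mean())
```

Eight runs suffice to estimate all three main effects and the A*B interaction simultaneously, with every run contributing to every estimate; this balance is what makes factorial designs economical.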
Keywords: Design of experiments, fractional factorial design, Plackett-Burman design, experimental variance, empirical modeling, factor interactions
View Full Video Tutorial
In this tutorial, the technique of impedance and dielectric spectroscopy will be described in the simplest possible terms. The recording has been divided into two parts. It will begin by describing the expected responses for the real and imaginary impedance of a wide range of material and device types. Then the conversion to three other formalisms, known as admittance, electric modulus, and permittivity, will be used to demonstrate the detailed information that is often hidden inside partially analyzed data. Examples will be provided that help attendees not only understand the physical processes happening inside the material/device but also develop an understanding of how to control the outcome. Examples will range from materials used in insulating layers in integrated circuits and packaging to highly conducting materials used in solar cells and batteries. It will be shown that it is possible to relate the spectra obtained to certain key responses: charge storage, electronic conduction, surface adsorption, switching phenomena, and many others. Complementary techniques used to corroborate the physical assignments will also be included. The tutorial will end with examples demonstrating that this technique is exceptionally good for establishing quality control in a production environment and/or assessing the service life of electronic and non-electronic components in a non-destructive way.
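The four formalisms are algebraic transformations of one measured spectrum. A minimal sketch for an assumed parallel R-C sample with an assumed cell geometry (all component and geometry values are illustrative):

```python
import numpy as np

EPS0 = 8.8541878128e-12              # vacuum permittivity, F/m

# Hypothetical sample: parallel R-C element, with assumed cell geometry
R, C = 1e6, 1e-10                    # ohms, farads
area, thickness = 1e-4, 1e-3         # m^2, m
C0 = EPS0 * area / thickness         # empty-cell capacitance

f = np.logspace(0, 6, 61)            # 1 Hz to 1 MHz
w = 2 * np.pi * f
Z = 1.0 / (1.0 / R + 1j * w * C)     # impedance of the parallel R-C

Y = 1.0 / Z                          # admittance
M_ = 1j * w * C0 * Z                 # electric modulus: M* = j*w*C0*Z*
eps = 1.0 / M_                       # complex relative permittivity: 1/M*

# The imaginary impedance peaks at the relaxation frequency w = 1/(R*C)
f_peak = f[np.argmax(-Z.imag)]
```

Each representation emphasizes a different process: -Im(Z) reveals the R-C relaxation, while the low-frequency real permittivity reduces to the geometric ratio C/C0, which is why partially analyzed data in a single formalism can hide information visible in another.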
Keywords: dielectric spectroscopy, impedance, ieee, ims, tutorials, education, rosari gerhardt
View Part 1 of this Video Tutorial Here
View Part 2 of this Video Tutorial Here
Heart rate monitors are becoming ubiquitous and are used by both athletes and the general public to keep track of their health. They are just one example of the wearables currently available to the public; others include oxygen saturation monitors, activity monitors, and muscle activity monitors. Wearables are typically not used in a controlled environment; therefore, the quality of the collected signals may be questionable. Even in a controlled environment such as a hospital, deterioration in the quality of collected signals can lead to false alarms and a reduction in the quality of patient care. As the signals are used to inform users about their health, it is imperative that they be of acceptable quality. Signal Quality is the field of identifying and improving the quality of collected signals. It can be divided into four categories: 1) detection; 2) identification; 3) quantification; and 4) mitigation. Detection is the acknowledgement of the presence of noise in the signal. Identification is the determination of the type of noise. Quantification is the estimation of the level of the noise. Mitigation is the reduction of the noise through noise removal techniques. This tutorial will provide a high-level overview of the different techniques in each of the Signal Quality categories.
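The quantification and mitigation steps can be sketched on a synthetic signal. The sampling rate, signal model, noise level, and smoothing filter below are illustrative assumptions, not a recommended clinical pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                                    # Hz, a common wearable sampling rate
t = np.arange(0, 10, 1 / fs)
clean = np.sin(2 * np.pi * 1.2 * t)         # stand-in for a slow physiological signal
noisy = clean + 0.4 * rng.standard_normal(t.size)

def snr_db(sig, ref):
    """Quantification: signal-to-noise ratio in dB against a reference."""
    noise = sig - ref
    return 10 * np.log10((ref ** 2).mean() / (noise ** 2).mean())

# Detection/quantification: an estimated SNR below a threshold flags poor quality
snr_before = snr_db(noisy, clean)

# Mitigation sketch: moving-average smoothing (a crude low-pass filter)
kernel = np.ones(25) / 25
mitigated = np.convolve(noisy, kernel, mode="same")
snr_after = snr_db(mitigated, clean)
```

In practice the clean reference is unavailable and the SNR must itself be estimated, which is exactly why the quantification category has its own body of techniques.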
Keywords: wearables, hospital, mohamed abdelazez, ieee, ims, signal quality, tutorials, education
View Video Tutorial Here
Among various industrial tomography modalities, electrical capacitance tomography (ECT) is the most mature and has been used for many challenging applications. ECT is based on measuring very small capacitances from a multi-electrode sensor and reconstructing the permittivity distribution in a cross section of an industrial process. Compared with other tomography modalities, ECT has several advantages: it is non-radioactive, fast in response, both non-intrusive and non-invasive, able to withstand high temperature and high pressure, and low in cost. Because of the very small capacitances to be measured (much smaller than 1 pF) and the “soft-field” nature, ECT does present challenges in capacitance measurement and in solving the inverse problem. The latest AC-based ECT system can generate online images typically at 100 frames per second with a signal-to-noise ratio (SNR) of 73 dB. Examples of industrial applications include gas/oil/water flows, wet gas separation, pneumatic conveyors, cyclone separators, pharmaceutical fluidised beds, and the clean use of coal by circulating fluidised bed combustion and methanol-to-olefins conversion. During this tutorial, ECT is discussed from principle to industrial applications, together with a demonstration of an AC-based ECT system.
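The simplest reconstruction approach, linear back-projection, can be sketched as follows. The sensitivity map below is a crude geometric surrogate (sensitivity concentrated along the chord between each electrode pair); a real ECT sensitivity matrix comes from an electrostatic field solver, and the forward model here is linearized, so this is an illustration of the algebra only:

```python
import numpy as np

N_ELEC = 12
pairs = [(i, j) for i in range(N_ELEC) for j in range(i + 1, N_ELEC)]
M = len(pairs)                       # 66 independent electrode-pair capacitances

# Toy sensitivity map on a 32x32 grid (NOT a field-solver solution)
yy, xx = np.mgrid[0:32, 0:32]
theta = 2 * np.pi * np.arange(N_ELEC) / N_ELEC
ex, ey = 16 + 14 * np.cos(theta), 16 + 14 * np.sin(theta)
S = np.zeros((M, 32 * 32))
for m, (i, j) in enumerate(pairs):
    for a in np.linspace(0.0, 1.0, 20):
        px, py = ex[i] + a * (ex[j] - ex[i]), ey[i] + a * (ey[j] - ey[i])
        S[m] += np.exp(-((xx - px) ** 2 + (yy - py) ** 2) / 8.0).ravel()
S /= S.sum(axis=1, keepdims=True)

# "True" normalized permittivity: one circular inclusion in the cross section
g_true = ((xx - 20) ** 2 + (yy - 12) ** 2 < 25).astype(float).ravel()
lam = S @ g_true                     # linearized forward model: normalized capacitances

# Linear back-projection: g = S^T * lam / (S^T * 1)
g_lbp = (S.T @ lam) / (S.T @ np.ones(M))
corr = np.corrcoef(g_lbp, g_true)[0, 1]
```

The back-projected image is blurred but correlates with the true distribution; iterative methods refine this first estimate, which is where the ill-posed "soft-field" inverse problem becomes demanding.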
Keywords: electrical, capacitance, tomography, ieee, ims, wuqiang yang, tutorials, education, applications
This tutorial aims at revisiting the very basic concept of the measurement of magnetic behavior and magnetic permeability, as well as at providing a discussion of possible novel magnetic materials, even with very unusual, arbitrary B-H relationships. Starting from the spatial averaging of unobservable microscopic fields and the identification of the observable macroscopic fields as introduced by Lorentz, the measurement of magnetic permeability in composite materials is discussed from its basic definition. Particular composite resonator structures are considered, and it is shown how they can exhibit negative magnetic permeability values even at industrial frequencies.
Keywords: magnetism, composite, materials, ieee, ims, bernardo tellini, measurement, tutorials, education
Autonomous systems are nowadays pervasive in modern society. Autonomous driving cars, as well as service robots (e.g. cleaning robots, companion robots, intelligent healthcare solutions, tour guide systems), are becoming more and more popular, and a general acceptance of such systems is developing. Nonetheless, one of the major problems in building such applications lies in the capability of autonomous systems to understand their surroundings and then plan proper counteractions. The most popular solutions, which are gaining more and more attention, rely on artificial intelligence and deep learning as a means to understand structured and complex natural environments. Besides the importance of such complex tools, however, classical metrology concepts, such as uncertainty and precision, remain unavoidable for a clear and effective application of modern autonomous systems.
In this tutorial, some measurement concepts will be revised in light of the autonomous systems domain. In particular, we will cover the main concepts of the statistical approach to measurements that will then be applied to:
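One basic ingredient of that statistical approach is the GUM-style Type A evaluation of uncertainty from repeated readings. A minimal sketch with simulated range measurements follows; the sensor, the true range, and the noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated repeated range readings from a hypothetical ranging sensor (metres)
true_range = 4.00
readings = true_range + 0.02 * rng.standard_normal(50)

n = readings.size
mean = readings.mean()
s = readings.std(ddof=1)             # experimental standard deviation
u = s / np.sqrt(n)                   # Type A standard uncertainty of the mean
U = 2 * u                            # expanded uncertainty, coverage factor k = 2
```

Reporting the measured range as mean ± U gives the planner a quantitative margin to reason with, instead of treating the sensor reading as exact.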