This lecture presents a detailed focus on the use of machine vision techniques in industrial inspection applications. It provides insights into a range of inspection tasks, drawn from cutting-edge work in academia and industry, and covers practical issues of vision system integration for real-world applications.
Advanced machine vision systems may incorporate multiple imaging and/or vision modalities to provide robust solutions to complex situations and problems in industrial applications. A diverse range of industries, including aerospace, automotive, electronics, pharmaceutical, biomedical, semiconductor, food/beverage, and manufacturing, has benefited from recent advances in multi-modal inspection technologies. This lecture highlights both the advances in technologies and vision system integration for practical applications. The advances provide insight into recent progress in imaging and vision techniques for varied industrial inspection tasks, while the applications present the state of the art in imaging and vision system integration, implementation, and optimization.
Topics and features:
Bridging the gap between theoretical knowledge and engineering practice, this lecture will attract graduate students interested in imaging, machine vision, and industrial inspection. The lecture also provides an excellent reference for researchers seeking to develop innovative solutions to tackle practical challenges, and for professional engineers who will benefit from the coverage of applications at both the system and component levels.
Microwave and millimeter-wave signals span the frequency ranges of 300 MHz – 30 GHz and 30 GHz – 300 GHz, respectively. Signals at these frequencies can easily penetrate dielectric materials and composites and interact with their inner structures. At millimeter-wave frequencies, the relatively small wavelengths and wide bandwidths associated with these signals enable the production of high-spatial-resolution images. Imaging techniques can be primarily classified as either near-field or far-field, depending on the distance between the probe and the structure. Near-field imaging techniques are simple yet powerful, producing very high image resolutions with relatively inexpensive systems. On the other hand, far-field imaging coupled with synthetic aperture radar (SAR)-based 3D imaging techniques has demonstrated sub-wavelength image resolution and high image clarity. SAR 3D imaging techniques rely on constructively combining reflected signal data from many measurement points within the synthetic aperture (i.e., a multi-view technique), which greatly enhances the image quality. More importantly, SAR 3D imaging techniques can be implemented using imaging arrays that yield results in real time. Moreover, non-uniform sampling, spectrum estimation, adaptive image formation, and multi-static or multiple-input multiple-output (MIMO) imaging are some of the techniques that can be utilized to enhance 3D SAR image quality while reducing measurement time and hardware complexity.
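As an illustration of the coherent-summation principle behind SAR imaging, the following sketch (with entirely hypothetical parameters: a 30 GHz monostatic probe, a 20 cm linear aperture, a single point scatterer) backprojects simulated reflections and shows the energy focusing at the scatterer's cross-range position:

```python
import numpy as np

# Sketch of SAR backprojection: reflected signals from many aperture
# points are phase-compensated and summed coherently, so energy focuses
# at the true scatterer location. All parameters are hypothetical.

c = 3e8                      # speed of light, m/s
f = 30e9                     # 30 GHz (lower millimeter-wave band edge)
k = 2 * np.pi * f / c        # wavenumber

# Synthetic aperture: probe positions along x at a fixed standoff
aperture_x = np.linspace(-0.1, 0.1, 101)   # 20 cm aperture
target = np.array([0.02, 0.15])            # point scatterer at (x, z)

# Monostatic round-trip phase at each measurement point
r = np.hypot(aperture_x - target[0], target[1])
signal = np.exp(-1j * 2 * k * r)           # reflected signal samples

# Backprojection: for each image pixel, undo the round-trip phase and sum
img_x = np.linspace(-0.05, 0.05, 201)
r_pix = np.hypot(aperture_x[None, :] - img_x[:, None], target[1])
image = np.abs(np.sum(signal[None, :] * np.exp(1j * 2 * k * r_pix), axis=1))

peak = img_x[np.argmax(image)]
print(f"focused peak at x = {peak*100:.1f} cm")   # near the true 2 cm
```

Only the cross-range focusing is sketched here; a wideband system would add range resolution by repeating the same coherent summation over frequency.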
This talk will present the chronology of developing portable, high-resolution, 3D, real-time millimeter-wave imaging systems. These systems were developed to address the needs of several diverse and critical applications in nondestructive testing, security, radar tracking, localization, and biomedical fields. The talk will focus on the designs of the developed imaging systems, in particular:
Finally, practical considerations for designing real-time imaging arrays will be discussed and the capabilities of each imaging system will be demonstrated.
In our daily life, when we are sick, we usually go to the hospital and ask experienced doctors for diagnosis and treatment. The long-term operation of machines likewise produces "disease", that is, the so-called fault. We need the help of a machine "doctor" to diagnose the fault, predict its development trend, and then provide guidance for operation and maintenance. The new generation of artificial intelligence technology, represented by deep learning, provides these machine doctors with a new way of intelligent diagnosis. After introducing the development history of artificial neural networks, this talk introduces the concept and characteristics of deep learning, then discusses several typical deep network models and their application in the intelligent diagnosis of machines, as well as future trends in deep learning.
With the rapid development of big data and the Internet of Things, data-driven technology, especially deep learning (DL), is becoming increasingly important in intelligent maintenance. However, the "black box" nature of DL-based intelligent maintenance still seriously hinders wide application in industry, especially in safety-critical settings. In fact, before the rise of DL, the physics-driven approach, a white-box model that relies on causality to establish physical laws from first principles, was also popular, but it is often not accurate enough. As two ways of observing the laws of the physical world, data-driven and physics-driven models are not opposites but two sides of one coin, and they offer consistent insight. Therefore, integrating physics models into DL, namely physics-informed deep learning (PIDL), is a natural and promising pathway towards scientific intelligent maintenance. This talk aims to emphasize the importance of PIDL in scientific intelligent maintenance, where users can understand the operating mechanism inside the model and realize human-in-the-loop operation. Finally, some applications of PIDL are discussed to illustrate the merits of scientific intelligent maintenance.
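A minimal sketch of the PIDL idea, using an assumed first-order degradation law dx/dt = -λx as a stand-in for a physics model: the loss combines a data-fit term with a physics-residual term, so predictions that violate the governing equation are penalized even when they fit the noisy data reasonably well.

```python
import numpy as np

# Sketch of a physics-informed loss (hypothetical example): a data term
# fits noisy measurements, while a physics term penalizes violations of
# an assumed governing law dx/dt = -lam * x (exponential degradation).

lam = 0.5
t = np.linspace(0.0, 4.0, 81)
rng = np.random.default_rng(0)
data = np.exp(-lam * t) + 0.02 * rng.standard_normal(t.size)

def total_loss(x_pred, weight=1.0):
    data_loss = np.mean((x_pred - data) ** 2)
    # physics residual via finite differences: dx/dt + lam*x should be ~0
    dxdt = np.gradient(x_pred, t)
    physics_loss = np.mean((dxdt + lam * x_pred) ** 2)
    return data_loss + weight * physics_loss

good = np.exp(-lam * t)        # physically consistent prediction
bad = 1.0 - (lam / 4.0) * t    # linear trend, violates the assumed ODE

print(total_loss(good), total_loss(bad))
```

In a full PIDL setup the prediction would come from a neural network and the residual from automatic differentiation; the composite loss structure is the same.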
Modern control algorithms in emerging power systems process information delivered mainly by distributed, synchronized measurement systems and available in data streams with different reporting rates. Multiple measurement approaches are used: on one side, time-aggregated measurements are offered by currently deployed IEDs (within the SCADA framework), including smart meters and other emerging units; on the other side, high-resolution waveform-based monitoring devices such as phasor measurement units (PMUs) use high reporting rates (50 frames per second or higher) and can include fault-recorder functionality.
There are several applications where synchronized data received at a high reporting rate has to be used together with aggregated data from measurement equipment with a lower reporting rate (complying with power quality data aggregation standards), and the accompanying question is how adequate the energy transfer models are in such cases. For example, state estimators need both types of measurements: the so-called "classical" one, adapted to a de facto steady-state paradigm of the relevant quantities, and the "modern" one, i.e., with fewer embedded assumptions on the variability of the same quantities. Another example is given by the operation of emerging active distribution grids, which assumes higher variability of the energy transfer; consequently, a new model approximation for its characteristic quantities (voltages, currents) is needed. Such a model is required not only to correctly design future measurement systems but also to better assess the quality of existing "classical" measurements, still in use for power quality improvement, voltage control, frequency control, network parameter estimation, etc.
The main constraint so far is imposed by the existing standards, where several aggregation algorithms are recommended, with a specific focus on information compression. The further processing of RMS values (already the output of a filtering algorithm) results in significant signal distortion.
Presently there is a gap between (i) the level of approximation used for modeling the current and voltage waveforms implicitly assumed by most of the measurement devices deployed in power systems and (ii) the high fidelity, high accuracy, and wide range of reporting rates of the newly deployed synchronized measurement units.
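The information compression performed by standard aggregation can be sketched as follows (synthetic 50 Hz waveform; the window lengths and root-mean-square aggregation rule are assumptions in the spirit of power quality aggregation standards such as IEC 61000-4-30): a 100 ms voltage dip that is clearly visible in 10-cycle RMS values is largely averaged away after aggregation.

```python
import numpy as np

# Sketch of standard-style RMS aggregation on a synthetic 50 Hz signal:
# 10-cycle basic RMS windows are aggregated by root-mean-square of
# squares into longer intervals, smoothing out a short voltage dip.

fs = 10_000                  # samples per second (assumed)
f0 = 50                      # fundamental frequency, Hz
t = np.arange(0, 3.0, 1 / fs)
v = np.sqrt(2) * 230 * np.sin(2 * np.pi * f0 * t)
v[(t >= 1.0) & (t < 1.1)] *= 0.7      # 100 ms voltage dip to 70 %

win = fs * 10 // f0          # 10-cycle basic window (0.2 s)
n = t.size // win
rms_10c = np.sqrt(np.mean(v[: n * win].reshape(n, win) ** 2, axis=1))

# Aggregate 10-cycle values into longer intervals (here 5 windows = 1 s)
agg = np.sqrt(np.mean(rms_10c[: (n // 5) * 5].reshape(-1, 5) ** 2, axis=1))

print("10-cycle RMS minimum:", rms_10c.min())   # dip partly visible
print("aggregated minimum:  ", agg.min())       # dip mostly averaged away
```

The dip's depth is already attenuated at the 10-cycle level (it occupies half a window) and almost disappears after the second aggregation stage, which is exactly the distortion the talk refers to.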
The talk will address:
o The measurement paradigm in power systems;
o Measurement channel quality and models for energy transfer;
o Applications and challenges.
The presentation provides an overview of these techniques, with examples from worldwide measurement solutions for smart grid deployments.
Brain-Computer Interfaces (BCIs) are a novel means of human-computer interaction relying on the direct measurement of brain signals. Possible applications include replacing, restoring, improving, enhancing, and supplementing the natural outputs of the central nervous system, as well as investigating brain function. Considering daily-life constraints, researchers are exploring the possibility of providing visual stimuli by means of smart glasses or visors, which are increasingly exploited in extended reality (XR). Moreover, to detect the elicited potentials, commercial devices for electroencephalography (EEG) are taken into account. Nevertheless, such studies have been mostly application-oriented and have not dealt with a metrological characterization of the stimulation and detection equipment.
To bridge this gap, metrology was considered for a significant enhancement of BCI systems, both in terms of design and of operational understanding. It was demonstrated that, although often overlooked, applied metrology plays an important role in this field. Indeed, if the stimulation and detection equipment is not fully characterized, the measures of interest for the brain-computer interface system may result in a misleading interpretation of brain functioning. Instead, by means of the mentioned results, one can compare the measured brain signals with the behaviour of the equipment in the time and frequency domains, so as to correctly identify the contribution of the "human transducer" in the BCI measurement chain.
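As a toy illustration of such a frequency-domain comparison (entirely synthetic data; the 256 Hz sampling rate, 10 Hz flicker frequency, and noise level are assumptions), the sketch below locates the spectral component elicited at the visual stimulation frequency in a simulated EEG trace:

```python
import numpy as np

# Sketch of frequency-domain analysis of a synthetic elicited potential:
# a visually evoked component at the stimulation frequency is buried in
# noise and recovered as the dominant spectral peak.

fs = 256                       # EEG sampling rate, Hz (assumed)
f_stim = 10.0                  # flicker frequency of the visual stimulus
t = np.arange(0, 8.0, 1 / fs)
rng = np.random.default_rng(1)
eeg = 2.0 * np.sin(2 * np.pi * f_stim * t) + rng.standard_normal(t.size)

spec = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peak_freq = freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin
print(f"dominant component at {peak_freq:.2f} Hz")
```

Characterizing the stimulation equipment (e.g., the actual flicker spectrum of the smart glasses) lets one separate the equipment's contribution from the response of the "human transducer" in such spectra.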
Nowadays, when the global population is growing by more than 80 million a year, reported studies predict increasing pressure on the planet's natural resources, including food resources. The situation worsens as unpredictable meteorological events multiply in the context of great climate changes related to the global effect of anthropogenic greenhouse emissions. In this context, precision agriculture (PA), which combines technologies and practices to optimize agricultural production through specific farm management, is considered.
At the same time, PA focuses on the accuracy of operations, considering the place and time to act and the method to be applied. Agricultural operations are carried out to reach production goals using the information provided by smart sensors and instrumentation, increasing the sustainability of operations. Distributed smart sensing systems characterized by fixed and mobile nodes (the latter associated with Unmanned Aerial Vehicles (UAVs)) are used to turn farming operations into data, and to make future operations data-driven. These new technologies, including edge and cloud computing capable of running artificial intelligence algorithms, may contribute to a gradual replacement of human decisions based on accumulated experience with machine-based decisions. This new, digital way to act in agriculture, combining technologies such as smart sensors, cloud and mobile computing, and data science, is motivated by the fact that classical decisions can no longer be applied when the cultivated areas are much extended and adverse meteorological events occur frequently, leading to mismanagement and yield losses.
Using smart sensors, computation, and data analysis, the applied quantities of water and fertilizers are optimized. Weather stations can provide additional information, such as ambient temperature, relative humidity, and wind velocity, which is used together with measured soil quantities such as moisture, pH, conductivity, temperature, and macronutrient concentrations (nitrogen, potassium, calcium) to create models for farm operation optimization. Data from distributed sensing systems on the crop field can also be used to avoid plant stress phenomena (e.g., plant water stress). Data mining, associated with the analysis of massive datasets, is successfully applied in PA.
In this talk, we will examine the meaning of precision agriculture in the context of the heavy uncertainty associated with climate change. The IoT ecosystem for precision agriculture will be discussed, including multimodal sensing and artificial intelligence. Regarding sensing as part of the IoT ecosystem, both in-situ and remote sensing are considered. Agricultural UAV imagery and satellite imagery solutions, as well as the relation between data coming from smart sensors distributed in the field and images acquired using multispectral imaging techniques, will be part of the presentation. The metrological characteristics of smart sensors, as well as calibration procedures for in-situ and remote-measurement smart sensing systems, will also be covered.
Another important technology associated with innovative precision agriculture is the development of AI data-driven models for farming operations considering data coming from different sources. Examples of data-driven models for smart irrigation and nutrient delivery will be considered.
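A minimal, entirely synthetic sketch of such a data-driven model (the linear form and the coefficients of the simulated "ground truth" are assumptions for illustration only): a least-squares fit maps soil moisture and air temperature to an irrigation amount.

```python
import numpy as np

# Sketch of a data-driven irrigation model on synthetic data: fit a
# linear model mapping soil moisture and air temperature to irrigation
# amount, as a stand-in for the richer models discussed above.

rng = np.random.default_rng(2)
n = 200
moisture = rng.uniform(10, 40, n)    # volumetric soil moisture, %
temp = rng.uniform(15, 35, n)        # air temperature, °C

# Assumed "ground truth": drier soil and hotter air need more water (mm)
irrigation = 20 - 0.5 * moisture + 0.3 * temp + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), moisture, temp])
coef, *_ = np.linalg.lstsq(X, irrigation, rcond=None)

pred = X @ coef
rmse = np.sqrt(np.mean((pred - irrigation) ** 2))
print("coefficients:", coef, "RMSE:", rmse)
```

Real deployments would add more predictors (humidity, wind, macronutrients) and more expressive models, but the fit-then-predict workflow is the same.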
Challenges to the adoption of precision agriculture by regular farmers, and how agricultural operations can support the important transformation towards greater environmental sustainability and increased crop quality, will be discussed. A specific part of the talk will address climate change and how this reality will affect the adoption of smart sensing and AI technologies for PA.
Thermal imaging, or thermography, consists of measuring and imaging the thermal radiation emitted by every object above absolute zero. As this radiation is temperature-dependent, the recorded infrared images can be converted into temperature maps, or thermograms, allowing the retrieval of valuable information about the object under investigation. Thermal imaging has been known since the middle of the 20th century, and recent technological achievements in infrared imaging devices, together with the development of new procedures based on transient thermal emission measurements, have revolutionized the field. Nowadays, thermography is a method whose advantages are undisputed in engineering. It is routinely used for the non-destructive testing of materials, to investigate electronic components, or in the photovoltaic industry to detect defects in solar cells.
Despite an early interest, thermal imaging is currently rarely used for biomedical applications, and even less in clinical settings. One reason is probably the initially disappointing results obtained solely with static measurement procedures, where the sample is investigated in its steady state, using cumbersome and performance-limited first-generation infrared cameras. In addition, the retrieval of quantitative data using dynamic thermal imaging procedures often requires complex mathematical modelling of the sample, which can be demanding in biomedical applications due to the large variability intrinsic to the life sciences.
The goal of this lecture is to a) demonstrate the potential of dynamic thermal imaging for biomedical applications and b) give the audience the necessary background to successfully translate the technology to their specific biomedical applications.
As a first step, the basics of thermal radiation and thermal imaging device technology will be reviewed. Rather than giving an exhaustive description of the technology, we aim to familiarize the audience with key concepts that will allow selecting an optimal infrared camera for a specific application. In a subsequent part, we will present the foundations of thermodynamics needed to understand and mathematically model the heat transfer processes happening inside the sample under investigation and between the sample and its environment; such thermal exchanges determine the sample's surface temperature. Next, we will present in detail and compare the different procedures used in dynamic thermal imaging, in which the sample's surface thermal emission is monitored in its transient state, offering superior capabilities compared to passive thermal imaging. The thermal stimulation can be achieved with different modalities depending on the sample under investigation (lasers or flash lamps to investigate thin coatings, alternating magnetic fields to detect magnetic materials, microwaves to heat up water, or ultrasound to monitor cracks). Various procedures are possible: stepped- and pulsed-thermal imaging, and pulsed-phase and lock-in thermal imaging. Each approach exhibits specific characteristics in terms of signal-to-noise ratio or measurement duration.
As an illustration, we will demonstrate how lock-in thermal imaging can be advantageously used to build extremely sensitive instruments to detect and characterize stimuli-responsive nanoparticles (both plasmonic and magnetic) in complex environments like cell cultures, tissue, or food. In this example, we will present the research instrument in detail: the choice of the various components, the digital lock-in demodulation implemented, the mathematical modelling of the sample required to extract quantitative information, and the resulting setup performance. The goal is to allow the audience to translate the dynamic thermal imaging measurement principles to their own biomedical application.
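The digital lock-in demodulation step can be sketched as follows (synthetic frame stack; the frame rate, modulation frequency, and image size are assumptions): multiplying each pixel's time series by sine and cosine references at the stimulation frequency and averaging yields per-pixel amplitude and phase images.

```python
import numpy as np

# Sketch of digital lock-in demodulation of a thermal frame stack
# (synthetic data): references at the modulation frequency extract the
# in-phase and quadrature components, giving amplitude and phase maps.

fs = 50.0                  # camera frame rate, Hz (assumed)
f_mod = 1.0                # thermal stimulation frequency, Hz
n_frames = 500             # 10 full modulation periods
t = np.arange(n_frames) / fs

# Synthetic 32x32 stack: one "hot" pixel responds at f_mod, phase-shifted
amp_true, phase_true = 0.8, 0.6
frames = 0.05 * np.random.default_rng(3).standard_normal((n_frames, 32, 32))
frames[:, 16, 16] += amp_true * np.cos(2 * np.pi * f_mod * t - phase_true)

ref_c = np.cos(2 * np.pi * f_mod * t)
ref_s = np.sin(2 * np.pi * f_mod * t)
I = 2 * np.mean(frames * ref_c[:, None, None], axis=0)   # in-phase
Q = 2 * np.mean(frames * ref_s[:, None, None], axis=0)   # quadrature

amplitude = np.hypot(I, Q)
phase = np.arctan2(Q, I)
print(amplitude[16, 16], phase[16, 16])   # ≈ 0.8 and ≈ 0.6
```

Averaging over many modulation periods is what gives lock-in imaging its sensitivity: uncorrelated noise averages toward zero while the modulated response survives.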
Impedance spectroscopy is a measurement method used in many fields of science and technology, including chemistry, medicine, and the material sciences. The possibility of measuring the complex impedance over a wide frequency range opens interesting opportunities for separating different physical effects, for accurate measurements, and for measuring otherwise inaccessible quantities. In sensors especially, multifunctional measurement can be realized, so that more than one quantity can be measured at the same time and the measurement accuracy and reliability can be significantly improved.
In order to realize impedance spectroscopy-based solutions, several aspects should be carefully addressed, such as measurement procedures, modelling and signal processing, and parameter extraction. Developing suitable impedance models and extracting the target information by optimization techniques is one of the most widely used approaches for calculating target quantities.
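A minimal sketch of this model-based parameter extraction (the Randles-type circuit and its "true" parameter values are assumptions for illustration): a simple impedance model is evaluated over frequency, and its parameters are recovered from characteristic features of the spectrum.

```python
import numpy as np

# Sketch of impedance-model parameter extraction on synthetic data:
# a Randles-type model Z(w) = Rs + Rct / (1 + j*w*Rct*Cdl) is evaluated
# over frequency, then its parameters are recovered from the spectrum.

Rs, Rct, Cdl = 10.0, 100.0, 1e-5          # assumed "true" parameters
f = np.logspace(-1, 6, 400)                # 0.1 Hz .. 1 MHz
w = 2 * np.pi * f
Z = Rs + Rct / (1 + 1j * w * Rct * Cdl)

# Parameter extraction from the spectrum:
Rs_est = Z.real[-1]                        # high-frequency limit -> Rs
Rct_est = Z.real[0] - Rs_est               # low-frequency limit -> Rs + Rct
w_peak = w[np.argmax(-Z.imag)]             # -Im(Z) peaks at w = 1/(Rct*Cdl)
Cdl_est = 1 / (w_peak * Rct_est)

print(Rs_est, Rct_est, Cdl_est)
```

For measured (noisy, multi-process) spectra, the same idea is implemented as a nonlinear least-squares fit of the model to the complex impedance rather than a direct feature readout.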
Different presentations can be provided on specific topics to show the opportunities for applying this method in the fields of battery diagnosis, bioimpedance, sensors, and material sciences. The aim is to enable scientists to apply impedance spectroscopy in different fields of instrumentation and measurement in an adequate way.
Optical instrumentation, computer vision, and augmented reality are powerful platform technologies. In this lecture, we will discuss how these technologies can be used for medical applications. I will give an overview of the technologies and current challenges relevant to medical and surgical settings. Recent advances in image acquisition, computer vision, photonics, and instrumentation present the scientific community with the opportunity to develop new systems that impact healthcare. By leveraging an integrated design, the advantages of hardware and software approaches can be combined and their shortcomings complemented. I will present new approaches to fluorescence imaging for surgical applications. We will discuss hardware instrumentation, algorithm development, and system deployment. New developments in multimodal imaging and image registration will also be discussed. For example, a combination of real-time intraoperative optical imaging and CT-based surgical navigation represents a promising approach for clinical decision support. The integration of 3D imaging and augmented reality provides surgeons with an intuitive way to visualize surgical data. In addition to technological development, I will discuss the clinical translation of systems and cross-disciplinary collaboration. Interdisciplinary approaches to solving complex problems in surgically relevant settings will be described.
Industry 4.0 is considered the great revolution of the past few years. New technologies, the Internet of Things, and the possibility of monitoring everything from everywhere have changed both plants and approaches to industrial production. Medicine is considered a slowly changing discipline, and a model of the human body is difficult to develop. But we can identify some stages in which medicine can be compared to industry. Four major changes revolutionized medicine:
Medicine 1.0: James Watson and Francis Crick described the structure of DNA. This was the beginning of research in the field of molecular and cellular biology.
Medicine 2.0: Sequencing of the human genome. This breakthrough made it possible to find the origin of diseases.
Medicine 3.0: The convergence of biology and engineering. Now the biologist's experience can be combined with the technology of the engineers, and new approaches to new forms of analysis can be used.
Medicine 4.0: The digitalization of medicine: IoT devices and techniques, AI to perform analyses, machine learning for diagnoses, brain-computer interfaces, and smart wearable sensors.
Medicine 4.0 is definitely a great revolution in patient care, and new horizons are possible today. COVID-19 has highlighted problems that have existed for a long time. Relocation of services means remote monitoring and remote diagnoses without direct contact between the doctor and the patient. Hospitals are freed from routine tests that can be performed by patients at home and reported by doctors over the internet, and potentially dangerous conditions can be prevented. During the COVID emergency, everybody could check their condition and ask for a medical visit (swab test) only when really necessary. This is true telemedicine: not a WhatsApp chat between an elderly person and a doctor, but a smart device able to measure objective vital parameters and send them to a healthcare center. Of course, Medicine 4.0 requires new technologies for smart sensors. These devices need to be very easy to use, fast, reliable, and low cost, and they must be accepted by both patients and doctors.
In this talk we will examine the meaning of telemedicine and e-health. E-health is the key to allowing people to self-monitor their vital signs. Some devices already exist, but a new approach will allow everybody (especially older people with cognitive difficulties) to use these systems in a friendly way. Telemedicine will bring a new approach to the concept of the hospital: a virtual hospital, without any physical contact but with an objective measurement of every parameter. A final remote discussion between the doctor and the patient is still required for both to feel comfortable, but the doctor will have all the vital signs recorded, allowing a diagnosis based on reliable data.
Another important aspect of Medicine 4.0 is the possibility of using AI both to perform parameter measurements and to manage the monitoring of multiple patients. New image processing based on artificial neural networks allows doctors to perform better and faster analyses. AI algorithms are also able to manage intensive care rooms with several patients, reducing the number of doctors involved in the overall monitoring of the situation.
Medicine today has advanced technologies and new devices for diagnosis at its disposal. Telemedicine offers a new scenario that allows remote diagnosis, control, and treatment of patients at home without physical contact with the doctor. Routine checkups can be outsourced to small care facilities or even to the patient's own home. In Europe the elderly outnumber the young, yet the funds for health systems are decreasing, so the medicine paradigm must be rethought. E-health can support the delocalization of some medical services: new micro- and nano-electronic circuits, IoT for pervasive and efficient communication, and artificial intelligence to solve problems where models are not easy to apply but a lot of data is available. The ability to combine the power of AI algorithms with data from different sensors and databases can greatly increase the reliability of the final choice of the right therapy. This is the new Medicine 4.0. The digitalization of processes and the improvement of technology allow interfacing the human body with computers, and artificial intelligence makes it possible to work with large amounts of data (big data) and identify unknown correlations between parameters, enabling new diagnoses. Several new perspectives will be discussed in this presentation.
We will investigate both new technologies, showing wearable devices that can be used to monitor patients at home (a topic that proved very important during COVID-19), and artificial intelligence applied to medical image processing to perform remote diagnoses (once again used to distinguish pneumonia from lung problems due to COVID-19).
After this difficult period, Medicine 4.0 will change several aspects of the interface between doctors and patients, improving the performance of national health services and reducing unnecessary costs. The future will bring a new digital hospital and a comprehensive monitoring system that integrates the interface between patients and hospitals.