The applications of microwave technology in the biomedical realm have undergone a remarkable evolution, revolutionizing various aspects of diagnosis and treatment. From its humble beginnings to cutting-edge applications, microwave engineering has continually pushed the boundaries of medical innovation, offering new solutions for improved health services and patient care.
This talk will review the chronology of the application of microwave-based instrumentation and measurement systems for the advancement of medicine and healthcare. During our journey through this thrilling timeline, we will discuss:
• The early stages of microwave technology adoption in medicine, focusing on its use for therapeutic purposes.
• The scientific community's immediate recognition of the potential of microwaves for non-invasive measurements in different domains, such as non-invasive medical imaging.
• The subsequent evolution of microwave technology in medicine, led by the seminal development of microwave ablation systems.
• The modern exploration of new frontiers in the application of microwaves for biomedical purposes, such as microwave-induced hyperthermia for targeted drug delivery.
• The current intriguing research on microwave instruments and sensors for outpatient healthcare scenarios, envisioning sophisticated wearable and implantable devices for remote health monitoring and treatment.
Throughout this journey we will also highlight that the evolution of microwave technology in the biomedical realm has not been without challenges. We will therefore analyze some of the main concerns and their potential solutions, such as the risk of tissue heating and damage, especially in sensitive areas of the body, or the need to optimize the performance and reliability of microwave-based imaging systems for clinical use.
This talk will show that the future of microwave technology in biomedicine holds immense promise. With ongoing research and technological advancements, microwave-based techniques are expected to play an increasingly prominent role in personalized medicine, medical imaging, precision therapeutics, and remote, non-invasive patient monitoring. From early detection and diagnosis to targeted treatment and long-term management, microwave technology continues to drive innovation in healthcare, ultimately improving outcomes and quality of life for patients worldwide.
In recent years we have witnessed unprecedented advances in electronic technology across many fields of application, and particularly in the biomedical realm. Among the countless possibilities, technologies based upon the propagation of electromagnetic fields, such as microwaves or millimeter waves, have emerged as promising instruments for the non-invasive measurement of certain biomarkers.
During this lecture, we will acquaint ourselves with the fundamental operating principles of these technologies, with a particular focus on the measurement of blood glucose concentration, the quintessential marker for diabetes management. Indeed, self-measuring the blood glucose level (BGL) is part and parcel of the daily routine of diabetes care. Currently, most measuring methods are invasive and uncomfortable, often leading to fewer, intermittent measurements. The development of a reliable non-invasive method able to provide users with their BGL comfortably, with capabilities for continuous BGL measurement, is therefore highly desirable.
In this sense, we will see how the scientific community is endeavoring to develop a suitable technological solution for the long-sought non-invasive measurement of glucose concentration, leveraging the benefits of electromagnetic technologies. In reviewing this technical challenge, we will focus on:
• The scientific foundations of remote measurement by electromagnetic means, underlining the interesting properties of microwaves and millimeter waves for the particular requirements of biomedical contexts.
• The main sensing approaches, with the resonator as the overarching element of these measurement systems.
• The initial works demonstrating the measurement of glucose concentrations in aqueous and biological solutions.
• The current challenges, including path-breaking human trials, sensitivity boosting and selectivity analysis.
• The required instrumentation and driving electronics for such devices in out-of-the-lab applications.
• Other potential applications of these instrumentation and measurement systems for biomarker detection.
• The zestful future prospects and expectations in the burgeoning pursuit of reliable non-invasive BGL measurement, especially considering the arrival of modern artificial intelligence techniques.
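To make the resonator-based sensing approach concrete, here is a minimal numerical sketch: a hypothetical resonator's frequency shifts as the permittivity of the loaded sample changes with glucose concentration, and a linear calibration inverts the shift back to concentration. All model numbers (base permittivity, its glucose sensitivity, and the coupling factor) are illustrative assumptions, not measured values.

```python
import numpy as np

F0 = 2.45e9  # unloaded resonance (Hz) of a hypothetical planar resonator

def loaded_resonance(glucose_mg_dl, eps_base=76.0,
                     d_eps_per_mg_dl=-0.002, coupling=0.05):
    """Toy perturbation model: sample permittivity falls slightly with
    glucose concentration, shifting the loaded resonance upward."""
    eps = eps_base + d_eps_per_mg_dl * glucose_mg_dl
    return F0 / np.sqrt(1.0 + coupling * (eps - 1.0))

# "Calibration": record the resonance at known concentrations.
cal_conc = np.array([50.0, 100.0, 150.0, 200.0, 300.0])  # mg/dL
cal_freq = loaded_resonance(cal_conc)

# Linear inverse model: concentration as a function of resonance frequency.
slope, intercept = np.polyfit(cal_freq, cal_conc, 1)

def estimate_concentration(measured_freq):
    return slope * measured_freq + intercept

print(estimate_concentration(loaded_resonance(180.0)))  # ≈ 180 mg/dL
```

In practice the inversion is complicated by temperature, hydration and other confounders, which is precisely why the selectivity analysis listed above matters.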
All in all, we will show that the desired non-invasive, continuous BGL measurement may soon no longer be a figment of our imagination. With intriguing ongoing research and technological advancements, we will highlight the potential of current electromagnetic technologies for instrumentation and measurement purposes in the biomedical domain. This talk will allow us to gain insight into the basic working principles that inspired the pioneering early works, made possible the current thrilling advancements, and will enable the as-yet unimagined applications of the future.
Electric Vertical Take-Off and Landing (eVTOL) aircraft are poised to transform urban airspace, enabling both commercial deliveries and passenger transportation. Ensuring the safety of this future airspace necessitates highly precise health management systems that can actively predict and prevent potential failures in these vehicles. Such proactive measures are crucial not only for maintaining high safety standards but also for optimizing maintenance and enabling autonomous decision-making in Urban Air Mobility (UAM) systems.
Real-time understanding of an aircraft's health condition is key to moving from fixed maintenance schedules to a condition-based predictive maintenance model. Accurate health prediction relies on both the current health state and an understanding of future usage patterns.
Traditionally, complex system health prediction relied on either model-based or data-driven approaches. Hybrid modeling, combining the strengths of both, is gaining traction. This approach leverages existing knowledge of the system's physics-based principles with the data-driven learning capabilities of machine learning. Particularly suited to the challenges of complex, evolving electric aircraft propulsion systems, Hybrid Physics-Informed Neural Networks (H-PINNs) offer the potential for accurate and adaptable health prediction models. This hybrid approach has been successfully applied to predicting the health of electric powertrains in Unmanned Aerial Vehicles (UAVs), using deep learning to learn the uncertain, degrading parameters in the physics-based model.
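As a toy illustration of the hybrid idea (not the actual H-PINN from the talk), the sketch below keeps a known physics-based capacity-fade law and estimates its one uncertain, degrading parameter from noisy telemetry by gradient descent, the role the deep-learning component plays in the full framework. All numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

C0 = 100.0  # initial capacity, assumed known from physics/specifications

def physics_model(t, k):
    # First-principles exponential fade law; k is the uncertain,
    # degrading parameter that the hybrid model learns from data.
    return C0 * np.exp(-k * t)

# Synthetic capacity telemetry over flight cycles, with sensor noise.
t_obs = np.linspace(0.0, 500.0, 50)
k_true = 1.5e-3
c_obs = physics_model(t_obs, k_true) + rng.normal(0.0, 0.5, t_obs.size)

# Gradient descent on the data-fit loss, standing in for the
# network-based estimator of the hybrid framework.
k, lr = 1e-4, 1e-10
for _ in range(2000):
    pred = physics_model(t_obs, k)
    grad = np.mean(2.0 * (pred - c_obs) * (-t_obs) * pred)  # dL/dk
    k -= lr * grad

print(f"learned degradation rate: {k:.2e}")  # close to k_true
```

Because the physics supplies the model structure, the data only has to pin down a single interpretable parameter, which is the essential advantage hybrid modeling claims over purely data-driven prediction.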
In conclusion, this hybrid framework represents a significant advancement in monitoring and predicting the health of complex systems. This has far-reaching implications for improving safety, optimizing maintenance, and enhancing the reliability and efficiency of electric aircraft.
The Data Deluge, in which data is generated faster than it can be efficiently managed, analyzed and used to make informed decisions, is a commonly used mantra of the ICT world. From an engineering and science perspective, Big Analog Data is a term previously coined by the National Instruments company to characterize high-sample-rate, digitized measurements from sensors, which can eventually produce high-fidelity and (almost) infinitely complex digital-twin representations of the physical world.
In practice, we consider that distributed measurement systems generate large quantities of online and streaming datasets that need to be processed in real-time for decision support and/or control purposes. In many situations the resulting data cannot be used directly by intelligent algorithms and suitable data preprocessing pipelines need to be defined and implemented. Furthermore, the heterogeneous reporting rates, embedded measurement models and spatial scales at which the measurements are collected need to be aligned in a robust manner for many tasks. In particular, for (Industrial) Internet of Things systems the dynamic trade-off between high spatial and time resolution measurements and data quality from large numbers of low-cost distributed sensors has to be accounted for, in conjunction with the application (metrological) requirements. These inherent compromises can be mitigated, albeit to a limited extent, through advanced data processing methods that lead to an improved reconstruction of the original signal.
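A minimal sketch of the alignment step described above, assuming two streams of the same quantity with heterogeneous reporting rates: interpolation puts them on a common timeline, and an inverse-variance weighting fuses the accurate-but-slow reference with the fast-but-noisy low-cost sensor. The rates, noise levels and underlying signal are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def truth(t):
    # Unknown physical quantity that both sensors observe.
    return 20.0 + 2.0 * np.sin(2 * np.pi * t / 60.0)

# Two streams with heterogeneous reporting rates.
t_fast = np.arange(0.0, 60.0, 1.0)     # 1 Hz low-cost sensor
t_slow = np.linspace(0.0, 60.0, 13)    # one report every 5 s, reference-grade
fast = truth(t_fast) + rng.normal(0.0, 0.3, t_fast.size)
slow = truth(t_slow) + rng.normal(0.0, 0.05, t_slow.size)

# Step 1: align the slow stream onto the fast timeline.
slow_on_fast = np.interp(t_fast, t_slow, slow)

# Step 2: inverse-variance weighted fusion of the aligned streams.
w_fast, w_slow = 1.0 / 0.3**2, 1.0 / 0.05**2
fused = (w_fast * fast + w_slow * slow_on_fast) / (w_fast + w_slow)

print(np.abs(fused - truth(t_fast)).mean())  # below the fast-only error
```

This is exactly the kind of compromise mentioned above: the fused stream keeps the fast time resolution while inheriting most of the reference sensor's data quality, at the cost of interpolation bias between reference reports.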
The focus of the talk is thus the question: how can we best exploit increasingly available, high-quality data sources within a rigorous and robust instrumentation and measurement context, while leveraging cross-domain interactions with the computing and control technical communities?
The talk will also introduce well-established programming and scientific computing libraries and frameworks that can be used to extract information and lead to accurate characterization of the underlying dynamic processes, with replicable and computationally efficient results. Relevant case studies will focus on smart meter data in real-world scenarios, where effective labeling and classification of microscale features can lead to improved energy management and an environmentally friendly, resilient electrical grid of the future. We focus on methods and techniques to first detect and label such features as anomalies in a data processing and learning pipeline. Subsequently, the labeled datasets are used in a forecasting framework as an early-warning system for potential imbalances in the local energy network. One key novelty is the combination of features extracted using time series data mining methods, such as the matrix profile, with state-of-the-art machine learning algorithms, including deep learning, to optimize classification metrics in real time across various model/algorithm structures and hyperparameter options.
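The matrix profile mentioned above can be sketched with a naive O(n²) implementation (production pipelines would use an optimized library such as stumpy). For each subsequence it stores the z-normalized distance to its nearest non-overlapping neighbour, so the highest profile value flags the discord, here an injected anomalous consumption event in a synthetic load curve.

```python
import numpy as np

def matrix_profile(ts, m):
    """Naive O(n^2) matrix profile: for each length-m subsequence, the
    z-normalized distance to its nearest non-overlapping neighbour."""
    n = len(ts) - m + 1
    subs = np.array([ts[i:i + m] for i in range(n)])
    subs = (subs - subs.mean(axis=1, keepdims=True)) / subs.std(axis=1, keepdims=True)
    mp = np.full(n, np.inf)
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        d[max(0, i - m // 2): i + m // 2 + 1] = np.inf  # trivial-match exclusion
        mp[i] = d.min()
    return mp

# Periodic synthetic "load curve" with one injected anomaly (a discord).
t = np.arange(600)
load = np.sin(2 * np.pi * t / 50)
load[300:320] += 2.0          # anomalous consumption event

mp = matrix_profile(load, m=25)
print(int(np.argmax(mp)))      # lands inside the anomalous window
```

Repeated daily patterns have near-identical matches one period away and thus low profile values, so the profile maximum is a natural, label-free anomaly detector to seed the learning pipeline.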
Tactile sensing is important for a robot to interact with the external environment. Currently, most robots can handle a known object at a specific location, but they are vulnerable to unknown objects and/or unknown environments. Tactile sensors play an important role in the interaction between a robot hand and an unknown object, because they can provide the information necessary for touch detection and feedback control. Among the various tactile sensing techniques, capacitive sensors have gained popularity due to their simple structure, high sensitivity, low power consumption, quick response, wide dynamic range and low cost. In this lecture, I will start with the need for tactile sensors in robot applications and review existing tactile sensing technologies. Then, I will introduce the principle and implementation of a type of capacitive tactile sensor that can sense 3D force, and a dedicated digital-analogue hybrid chip design, which contains a capacitance-to-digital converter (CDC), a 32-channel multiplexer, an ARM microcontroller and a router, facilitating collaborative capacitance measurement across multiple chips. I will describe applications of the capacitive tactile sensors and the developed chip on robots with video demonstrations, and also discuss other possible applications, including the implementation of artificial skin, intelligent functions for cars, elderly care, and the possibility of developing a very low-cost electrical capacitance tomography (ECT) device.
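As a hedged sketch of how a multi-electrode capacitive element can decode 3D force (the four-electrode geometry, sensitivity matrix and noise levels below are invented for illustration and are not the lecturer's actual design), a linear calibration can be estimated by least squares and then inverted:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 4-electrode capacitive taxel: normal force changes the
# common gap while shear forces tilt it, so the four capacitance changes
# jointly encode the 3D force through a sensitivity matrix.
S_true = np.array([[ 1.0,  0.5,  2.0],
                   [-1.0,  0.5,  2.0],
                   [ 0.5, -1.0,  2.0],
                   [-0.5, -1.0,  2.0]])   # assumed dC (fF) per newton

# Calibration: apply known forces, record capacitance deltas plus noise.
F_cal = rng.uniform(-1.0, 1.0, (60, 3))
dC_cal = F_cal @ S_true.T + rng.normal(0.0, 0.02, (60, 4))

# Least-squares estimate of the sensitivity matrix from calibration data.
X, *_ = np.linalg.lstsq(F_cal, dC_cal, rcond=None)
S_est = X.T                                # shape (4, 3)

def decode_force(dC):
    # Invert the calibrated linear map: find f minimizing |S_est f - dC|.
    f, *_ = np.linalg.lstsq(S_est, dC, rcond=None)
    return f

f_applied = np.array([0.3, -0.2, 1.5])     # Fx, Fy, Fz in newtons
print(decode_force(f_applied @ S_true.T))  # recovers the applied force
```

The over-determined four-electrode reading (four capacitances for three force components) is what gives the least-squares inversion some robustness to single-channel noise.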
This lecture presents a detailed focus on the use of machine vision techniques in industrial inspection applications. The lecture will provide insights into a range of inspection tasks, drawn from cutting-edge work in academia and industry, covering practical issues of vision system integration for real-world applications.
Advanced machine vision systems may incorporate multiple imaging and/or vision modalities to provide robust solutions to complex situations and problems in industrial applications. A diverse range of industries, including aerospace, automotive, electronics, pharmaceutical, biomedical, semiconductor, food/beverage, and manufacturing, have benefited from recent advances in multi-modal inspection technologies. This lecture highlights both the advances in technologies and vision system integration for practical applications. The advances provide insight into recent progress in imaging and vision techniques for varied industrial inspection tasks, while the applications present the state of the art of imaging and vision system integration, implementation, and optimization.
Bridging the gap between theoretical knowledge and engineering practice, this lecture will attract graduate students interested in imaging, machine vision, and industrial inspection. The lecture also provides an excellent reference for researchers seeking to develop innovative solutions to tackle practical challenges, and for professional engineers who will benefit from the coverage of applications at both system and component level.
Microwave and millimeter wave signals span the frequency ranges of 300 MHz – 30 GHz and 30 GHz – 300 GHz, respectively. Signals at these frequencies can easily penetrate dielectric materials and composites and interact with their inner structures. At millimeter wave frequencies, the relatively small wavelengths and wide bandwidths associated with these signals enable the production of high-spatial-resolution images. Imaging techniques can be primarily classified as either near-field or far-field, depending on the distance between the probe and the structure. Near-field imaging techniques are simple yet powerful, producing very high image resolutions with relatively inexpensive systems. On the other hand, far-field imaging coupled with synthetic aperture radar (SAR)-based 3D imaging techniques has demonstrated sub-wavelength image resolution and high image clarity. SAR 3D imaging techniques rely on constructively combining reflected signal data from many measurement points within the synthetic aperture (i.e., a multi-view technique), which greatly enhances the image quality. More importantly, SAR 3D imaging techniques can be implemented using imaging arrays that yield imaging results in real time. Moreover, techniques such as non-uniform sampling, spectrum estimation, adaptive image formation, and multi-static or multiple-input multiple-output (MIMO) imaging are some of the methods that can be utilized to enhance image quality while reducing measurement time and hardware complexity.
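The constructive-combination principle behind SAR imaging can be sketched in a few lines of delay-and-sum back-projection: echoes recorded along a synthetic aperture are phase-compensated for each candidate pixel and summed coherently, so the image peaks at the scatterer location. This single-frequency, monostatic toy (with an assumed 30 GHz probe and point target) omits bandwidth, antenna patterns and calibration.

```python
import numpy as np

c = 3e8
f = 30e9                          # 30 GHz, wavelength 1 cm
wavelen = c / f
k = 2 * np.pi / wavelen

# Synthetic aperture: monostatic probe scanned along x at z = 0.
xa = np.arange(-0.15, 0.15, wavelen / 4)    # aperture points (m)
target = np.array([0.03, 0.10])             # point scatterer (x, z)

# Recorded reflected phasors: two-way phase delay to the scatterer.
r = np.hypot(xa - target[0], target[1])
s = np.exp(-1j * 2 * k * r)

# Back-projection (delay-and-sum): coherently re-align and sum the
# echoes for every candidate pixel in the image grid.
xg = np.linspace(-0.1, 0.1, 81)
zg = np.linspace(0.05, 0.15, 41)
img = np.zeros((zg.size, xg.size))
for iz, z in enumerate(zg):
    rr = np.hypot(xa[None, :] - xg[:, None], z)  # pixel-to-aperture ranges
    img[iz] = np.abs((s[None, :] * np.exp(1j * 2 * k * rr)).sum(axis=1))

iz, ix = np.unravel_index(img.argmax(), img.shape)
print(xg[ix], zg[iz])            # peaks at the scatterer position
```

At the true pixel every compensated echo has the same phase and the sum reaches its maximum, which is the "multi-view" constructive combination the paragraph above describes; wide signal bandwidth is what adds sharp range resolution in real systems.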
This talk will present the chronology of developing portable, high-resolution, 3D and real-time millimeter wave imaging systems. These systems were developed to address the needs of several diverse and critical applications in nondestructive testing, security, radar tracking, localization, and biomedical fields. The talk will discuss the designs of the developed imaging systems, focusing on:
Finally, practical considerations for designing real-time imaging arrays will be discussed and the capabilities of each imaging system will be demonstrated.
In our daily life, when we are sick, we usually go to the hospital and ask experienced doctors for diagnosis and treatment. The long-term operation of machines also produces "diseases", that is, the so-called faults. We need to seek the help of a machine "doctor" to diagnose the occurrence of a fault, predict its development trend, and then provide guidance for operation and maintenance. The new generation of artificial intelligence technology, represented by deep learning, provides these machine doctors with a new way of performing intelligent diagnosis. After introducing the development history of artificial neural networks, this talk presents the concept and characteristics of deep learning, and then discusses several typical deep network models and their application in the intelligent diagnosis of machines, as well as future development trends of deep learning.
With the rapid development of big data and the Internet of Things, data-driven technology, especially deep learning (DL), is becoming increasingly important in intelligent maintenance. However, the "black box" nature of DL-based intelligent maintenance still seriously hinders wide application in industry, especially in safety-critical applications. In fact, before the rise of DL, the physics-driven approach, a white-box model that relies on causality to establish physical laws from first principles, was also popular, but it is often not accurate enough. As two ways of observing the laws of the physical world, data-driven and physics-driven models are not opposites but two sides of one coin, and they offer consistent insight. Therefore, integrating physics models into DL, namely physics-informed deep learning (PIDL), is a natural and promising pathway towards scientific intelligent maintenance. This talk aims to emphasize the importance of PIDL in scientific intelligent maintenance, where users can understand the operating mechanism inside the model and realize human-in-the-loop operation. Finally, some applications of PIDL are discussed to illustrate the merits of scientific intelligent maintenance.
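A minimal sketch of the PIDL idea follows, with a linear polynomial surrogate standing in for the neural network so that the composite data-plus-physics objective reduces to a single least-squares solve. The Newtonian-cooling scenario and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Known physics: Newtonian cooling, dT/dt = -a * (T - T_env).
a, T_env = 0.5, 20.0
t = np.linspace(0.0, 10.0, 200)
T_true = T_env + 60.0 * np.exp(-a * t)

# Sparse, noisy sensor data (the data-driven side of the problem).
idx = np.arange(0, t.size, 25)
T_obs = T_true[idx] + rng.normal(0.0, 1.0, idx.size)

# Surrogate model T(t) = Phi(t) @ theta: a polynomial instead of a
# neural net, to keep the sketch linear and dependency-free.
deg = 6
Phi = np.vander(t, deg + 1, increasing=True)
dPhi = np.hstack([np.zeros((t.size, 1)),
                  Phi[:, :deg] * np.arange(1, deg + 1)])  # d/dt basis

# PIDL-style composite objective: data misfit plus the physics residual
# dT/dt + a*(T - T_env) = 0 at collocation points, solved in one
# least-squares step because the surrogate is linear in theta.
lam = 1.0
A = np.vstack([Phi[idx], np.sqrt(lam) * (dPhi + a * Phi)])
b = np.concatenate([T_obs, np.sqrt(lam) * a * T_env * np.ones(t.size)])
theta, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.abs(Phi @ theta - T_true).max())  # small even far from the data
```

The physics term constrains the fit everywhere, not just at the eight measurement points, which is the essential white-box/black-box compromise the talk advocates.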
Modern control algorithms in emerging power systems process information delivered mainly by distributed, synchronized measurement systems and available in data streams with different reporting rates. Multiple measurement approaches are used: on one side, time-aggregated measurements are offered by currently deployed IEDs (within the SCADA framework), including smart meters and other emerging units; on the other side, high-resolution waveform-based monitoring devices like phasor measurement units (PMUs) use high reporting rates (50 frames per second or higher) and can include fault-recorder functionality.
There are several applications where synchronized data received with a high reporting rate has to be used together with aggregated data from measurement equipment having a lower reporting rate (complying with power quality data aggregation standards) and the accompanying question is how adequate are the energy transfer models in such cases. For example, state estimators need both types of measurements: the so-called “classical” one, adapted for a de facto steady-state paradigm of relevant quantities, and the “modern” one, i.e. with fewer embedded assumptions on the variability of same quantities. Another example is given by emerging active distribution grids operation, which assumes higher variability of the energy transfer and consequently, a new model approximation for its characteristic quantities (voltages, currents) is needed. Such a model is required not only in order to be able to correctly design future measurement systems but also for better assessing the quality of existing “classical” measurements, still in use for power quality improvement, voltage control, frequency control, network parameters’ estimation, etc.
The main constraint so far is imposed by the existing standards, in which several aggregation algorithms are recommended with a specific focus on information compression. The further processing of RMS values (themselves already the output of a filtering algorithm) results in significant signal distortion.
Presently there is a gap between (i) the level of approximation used for modeling the current and voltage waveforms implicitly assumed by most of the measurement devices deployed in power systems and (ii) the capabilities and functionalities of the newly deployed synchronized measurement units, with their high fidelity, high accuracy and wide range of available reporting rates.
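The distortion introduced by standard-style aggregation can be illustrated numerically: a 100 ms voltage dip is clearly visible in 10-cycle RMS values but almost disappears after 150-cycle (3 s) aggregation. The sketch below follows the spirit of power-quality aggregation (square-law averaging of sub-interval RMS values); the sample rate, dip depth and timings are illustrative.

```python
import numpy as np

fs, f0 = 10_000, 50                        # sample rate (Hz), mains (Hz)
t = np.arange(0.0, 6.0, 1.0 / fs)
v = 230.0 * np.sqrt(2.0) * np.sin(2 * np.pi * f0 * t)
v[int(2.0 * fs):int(2.1 * fs)] *= 0.5      # 100 ms dip to 50 % voltage

def block_rms(x, n):
    """RMS over consecutive non-overlapping blocks of n samples."""
    x = x[: len(x) // n * n].reshape(-1, n)
    return np.sqrt((x ** 2).mean(axis=1))

rms_10c = block_rms(v, 10 * fs // f0)      # 10-cycle (200 ms) RMS values
rms_150c = block_rms(rms_10c, 15)          # 150-cycle (3 s) aggregation

print(rms_10c.min())    # ~182 V: the dip is clearly visible
print(rms_150c.min())   # ~227 V: after aggregation it nearly vanishes
```

This is the information-compression trade-off noted above: the aggregated stream is compact and standard-compliant, but short events that matter for protection and state estimation are smoothed away.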
The talk will address:
o The measurement paradigm in power systems;
o Measurement channel quality and models for energy transfer;
o Applications and challenges.
The presentation provides an overview of these techniques, with examples from worldwide measurement solutions for smart grids deployment.
Brain-Computer Interfaces (BCIs) are a novel means of human-computer interaction relying on the direct measurement of brain signals. Possible applications consist of replacing, restoring, improving, enhancing and supplementing the natural outputs of the central nervous system, as well as investigating brain functions. In view of daily-life constraints, researchers are exploring the possibility of providing visual stimuli by means of smart glasses or visors, which are increasingly exploited in extended reality (XR). Moreover, to detect the elicited potentials, commercial devices for electroencephalography (EEG) are taken into account. Nevertheless, those studies were application-oriented and did not deal with a metrological characterization of the stimulation and detection equipment.
To bridge this gap, metrology was applied to significantly enhance BCI systems, both in their design and in the understanding of their operation. It was demonstrated that, although often overlooked, applied metrology plays an important role in this field. Indeed, if the stimulation and detection equipment is not fully characterized, the measurements of interest for the brain-computer interface system may result in misleading interpretations of brain functioning. Instead, by means of the mentioned results, one can compare the measured brain signals with the behaviour of the equipment in the time and frequency domains, so as to correctly identify the contribution of the "human transducer" in the BCI measurement chain.
Nowadays, when the global population is growing by more than 80 million a year, studies predict increasing pressure on the planet's natural resources, including food resources. The situation worsens as unpredictable meteorological events multiply in the context of great climate changes related to the global effect of anthropogenic greenhouse emissions. In this context, precision agriculture (PA), which combines technologies and practices to optimize agricultural production through site-specific farm management, is considered.
At the same time, PA focuses on the accuracy of operations, considering the place, the time to act and the method to be applied. Agricultural operations are carried out to reach production goals using the information provided by smart sensors and instrumentation, increasing the sustainability of operations. A distributed smart sensing system characterized by fixed and mobile nodes (the latter associated with Unmanned Aerial Vehicles (UAVs)) is used to turn farming operations into data, and to make future operations data-driven. New technologies, including edge and cloud computing capable of running artificial intelligence algorithms, may contribute to a gradual replacement of human decisions based on accumulated experience with machine-based decisions. This new, digital way of acting in agriculture, combining technologies such as smart sensors, cloud and mobile computing, and data science, responds to the fact that classical decision-making cannot cope nowadays, when cultivated areas are much extended and adverse meteorological events occur frequently, leading to mismanagement and yield losses.
Using smart sensors, computation and data analysis, the applied quantities of water and fertilizer are optimized. Weather stations can provide additional information such as ambient temperature, relative humidity and wind velocity, used together with soil-measured quantities such as moisture, pH, conductivity, temperature and macronutrient concentrations (nitrogen, potassium, calcium) to create models for farm operation optimization. Data from distributed sensing systems on the crop field can also be used to avoid plant stress phenomena (e.g. plant water stress). Data mining is successfully applied in PA in association with the analysis of massive datasets.
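As a purely illustrative sketch of such an optimization (the thresholds and coefficients below are assumptions for demonstration, not agronomic recommendations), an irrigation decision can combine the soil-moisture deficit with a crude weather-derived evapotranspiration factor:

```python
# Hedged sketch of a sensor-driven irrigation decision; all parameters
# are illustrative assumptions, not agronomic recommendations.
def irrigation_mm(soil_moisture_pct, field_capacity_pct, temp_c, rh_pct,
                  wind_ms, root_zone_mm=300.0):
    """Water depth (mm) to bring the root zone back to field capacity,
    scaled by a crude evapotranspiration-like weather factor."""
    deficit = max(0.0, field_capacity_pct - soil_moisture_pct) / 100.0
    base = deficit * root_zone_mm
    # Simple weather modifier: hotter, drier, windier -> apply more.
    et = 1.0 + 0.01 * (temp_c - 20.0) + 0.005 * (50.0 - rh_pct) + 0.02 * wind_ms
    return base * max(et, 0.5)

# Soil near field capacity on a mild day: little water is needed.
print(irrigation_mm(28.0, 30.0, 22.0, 60.0, 1.0))
```

In a deployed system this rule would be replaced by a calibrated, data-driven model fed by the distributed soil and weather sensors described above, but the structure (deficit estimate corrected by weather conditions) is the same.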
In this talk, we will see together the meaning of precision agriculture in the context of the heavy uncertainty associated with climate change. The IoT ecosystem for precision agriculture will be discussed, including multimodal sensing and artificial intelligence. Regarding sensing as part of the IoT ecosystem, both in-situ and remote sensing are considered. Agricultural UAV imagery and satellite imagery solutions, as well as the relation between the data coming from the smart sensors distributed in the field and the images acquired using multispectral imaging techniques, will be part of the presentation. The metrological characteristics of smart sensors, as well as the calibration procedures for in-situ and remote-measurement smart sensing systems, will also be part of the talk.
Another important technology associated with innovative precision agriculture is the development of AI data-driven models for farming operations, considering data coming from different sources. Examples of data-driven models for smart irrigation and nutrient delivery will be considered.
Challenges to the adoption of precision agriculture by regular farmers, and how agricultural operations can support the important transformation toward greater environmental sustainability and increased crop quality, will be discussed. A specific part of the talk will address climate change and how this reality will affect the adoption of smart sensing and AI technologies for PA.