
Distinguished Lecturer Program

The I&M Society Distinguished Lecturer Program (DLP) is one of the most exciting programs offered to our chapters, I&M members, and IEEE members. It provides I&M chapters around the world with talks by experts on topics of interest and importance to the I&M community. Together with our conferences and publications, it is one of the principal means by which we disseminate knowledge in the I&M field. Our lecturers are among the most qualified experts in their fields, and the program offers our members a first-hand chance to interact with these experts during their lectures. The I&M Society aids chapters financially so that they can make use of this program.

All distinguished lecturers are outstanding in their fields of specialty. Collectively, they possess a broad range of expertise within the area of I&M. Chapters are therefore encouraged to use this program to make their local I&M community aware of the most recent scientific and technological trends and to enhance their member benefits. Although lectures are mainly organized to benefit existing members and chapters, they can also be effective in generating membership and encouraging new chapter formation. Interested parties are encouraged to contact the I&M DLP Chair regarding this type of activity.

Looking for a DL topic for an upcoming event that isn’t covered by our current DLs? Contact the DL Chair, Kristen Donnell, to suggest a topic or to find a DL who may be able to adapt his or her topic for your event.

To review the DL reports and view photos from past lectures, send a request to the DLP Chair.

DLP Chair: Kristen Donnell

I&M AdCom (2016-2019, 2012-2015); Distinguished Lecturer Program Chair


Distinguished Lecturers

Mihaela Albu
Distinguished Lecturer (2016 to 2019)

Talk Title: High Reporting Rate Measurements for Smart[er] Grids

Abstract: Modern control algorithms in emerging power systems process information delivered mainly by distributed, synchronized measurement systems and available in data streams with different reporting rates. Multiple measurement approaches coexist: on one hand, time-aggregated measurements are offered by currently deployed IEDs (within the SCADA framework), including smart meters and other emerging units; on the other hand, high-resolution waveform-based monitoring devices such as phasor measurement units (PMUs) use high reporting rates (50 frames per second or higher) and can include fault-recorder functionality.

There are several applications where synchronized data received at a high reporting rate have to be used together with aggregated data from measurement equipment with a lower reporting rate (complying with power quality data aggregation standards), and the accompanying question is how adequate the energy transfer models are in such cases. For example, state estimators need both types of measurements: the so-called “classical” ones, adapted to a de facto steady-state paradigm of the relevant quantities, and the “modern” ones, i.e. with fewer embedded assumptions on the variability of the same quantities. Another example is the operation of emerging active distribution grids, which involves higher variability of the energy transfer; consequently, a new model approximation for its characteristic quantities (voltages, currents) is needed. Such a model is required not only to correctly design future measurement systems but also to better assess the quality of existing “classical” measurements, still in use for power quality improvement, voltage control, frequency control, network parameter estimation, etc.

The main constraint so far is imposed by the existing standards, which recommend several aggregation algorithms with a specific focus on information compression. Further processing of rms values (themselves already the output of a filtering algorithm) results in significant signal distortion.
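As a minimal illustration of this filtering effect (our own sketch, not material from the talk; the waveform, window lengths, and dip magnitude are invented for the example), the following Python snippet computes rms series of a voltage containing a brief rapid voltage change at two reporting rates; the longer aggregation window largely averages the event away:

```python
import numpy as np

fs = 10_000          # sampling rate [Hz]
f0 = 50              # fundamental frequency [Hz]
t = np.arange(0, 1.0, 1 / fs)

# 230 V rms sine with a 10% rapid voltage change (RVC) between 0.40 s and 0.45 s
amp = 230 * np.sqrt(2) * np.where((t >= 0.40) & (t < 0.45), 0.9, 1.0)
v = amp * np.sin(2 * np.pi * f0 * t)

def rms_series(signal, window_samples):
    """Non-overlapping rms aggregation: one reported value per window."""
    n = len(signal) // window_samples
    blocks = signal[: n * window_samples].reshape(n, window_samples)
    return np.sqrt(np.mean(blocks ** 2, axis=1))

rms_fast = rms_series(v, fs // 100)  # 10 ms windows -> 100 frames/s
rms_slow = rms_series(v, fs // 5)    # 200 ms (10-cycle) windows -> 5 frames/s

print("fast min rms: %.1f V" % rms_fast.min())  # dip clearly visible (~207 V)
print("slow min rms: %.1f V" % rms_slow.min())  # dip largely averaged away
```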

Presently there is a gap between (i) the level of approximation used for modeling current and voltage waveforms that is implicitly assumed by most of the measurement devices deployed in power systems and (ii) the high fidelity, high accuracy, and wide choice of reporting rates offered by newly deployed synchronized measurement units.

The talk will address:

o The measurement paradigm in power systems

  • System inertia, real time and steady-state
  • Instrument transformers; limited knowledge on the infrastructure
  • PQ, SCADA and PMUs
  • Power system state estimation; WAMCS
  • IEDs, PMUs, microPMUs
  • Time-stamped versus synchronized measurements

o Measurement channel quality and models for energy transfer

  • Voltage and frequency variability; rate of change of frequency
  • The steady-state signal and rapid voltage changes (RVC); rms values reported at 100 frames/s
  • Measurement data aggregation; filtering properties
  • Time-aggregation algorithms in the PQ framework
  • Statistical approaches

o Applications and challenges

  • Communication channel requirements; delay assessment in WAMCS
  • Smart metering with high reporting rates (1 s)

The presentation provides an overview of these techniques, with examples from worldwide measurement solutions for smart grids deployment.

Olfa Kanoun
Distinguished Lecturer (2016 to 2019)

Talk Title: Impedance Spectroscopy for Measurement and Sensor Solutions

Impedance spectroscopy is a measurement method used in many fields of science and technology, including chemistry, medicine, and materials science. The ability to measure complex impedance over a wide frequency range opens interesting opportunities for separating different physical effects, for accurate measurements, and for measuring otherwise inaccessible quantities. In sensors especially, multifunctional measurement can be realized, so that more than one quantity is measured at the same time and the measurement accuracy and reliability are significantly improved.

To realize impedance-spectroscopy-based solutions, several aspects must be carefully addressed, such as measurement procedures, modelling and signal processing, and parameter extraction. Developing suitable impedance models and extracting the target information by optimization techniques is one of the most widely used approaches for calculating target quantities.
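As an illustrative sketch of this model-fitting approach (our own example, not from the lecture; the Randles-type model, parameter values, and noise level are assumptions), the snippet below fits Z(ω) = R_s + R_ct / (1 + jωR_ct·C) to a synthetic impedance spectrum with nonlinear least squares:

```python
import numpy as np
from scipy.optimize import least_squares

def z_model(params, omega):
    """Simplified Randles cell: series resistance plus a parallel R_ct/C branch."""
    r_s, r_ct, c = params
    return r_s + r_ct / (1 + 1j * omega * r_ct * c)

def residuals(params, omega, z_meas):
    # Stack real and imaginary parts so the optimizer sees real-valued residuals
    diff = z_model(params, omega) - z_meas
    return np.concatenate([diff.real, diff.imag])

# Synthetic "measured" spectrum: R_s = 0.05 ohm, R_ct = 0.2 ohm, C = 10 mF, plus noise
omega = 2 * np.pi * np.logspace(-1, 4, 60)  # 0.1 Hz .. 10 kHz
rng = np.random.default_rng(0)
z_meas = z_model([0.05, 0.2, 10e-3], omega)
z_meas = z_meas + 1e-4 * (rng.standard_normal(60) + 1j * rng.standard_normal(60))

fit = least_squares(residuals, x0=[0.01, 0.1, 1e-3],
                    args=(omega, z_meas), bounds=([0, 0, 0], np.inf))
print("R_s = %.4f ohm, R_ct = %.4f ohm, C = %.4f F" % tuple(fit.x))
```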

Different presentations can be provided on specific topics to show the opportunities for applying this method in the fields of battery diagnosis, bioimpedance, sensors, and materials science. The aim is to enable scientists to apply impedance spectroscopy in different fields of instrumentation and measurement in an adequate way.


Distinguished Lecturer (2016 to 2017)

Talk Title: Intelligent Production Testing and Data Processing for Six Sigma Based Manufacturing Process Improvement

Abstract: This three-part talk series deals with challenging problems in the modern high-tech manufacturing industry (electronic and memory products): (1) Screening for Reliability, (2) Detecting Systematic Defects, and (3) Test for Yield Learning. The offered solutions conform to the systematic, data-driven Six Sigma methodology, which is based on setting extremely high objectives and on collecting and deeply analysing comprehensive production data, with the aim of reducing defects to below six standard deviations between the mean and the nearest specification limit in any process. In particular, the following successfully developed, industry-originated projects will be discussed and generalised for extended implementation and application of the obtained results:

1. Eliminating the Burn-in Bottleneck in IC Manufacturing

Reliability screening is one of several types of testing performed at different stages of the IC manufacturing process. It plays an important role in controlling and ensuring the quality and consistency of integrated circuits. One of the most widely used forms of reliability testing is burn-in (i.e., accelerated testing performed under elevated temperature and other stress conditions). Burn-in is normally associated with long test times and high cost. As a result, burn-in testing is often a bottleneck of the entire IC manufacturing process, limiting its throughput. It is no surprise, therefore, that much attention and effort have been dedicated to reducing or even eliminating burn-in testing.

This presentation offers a step-by-step methodology for reducing burn-in test time by up to 90%, based on the extended use of the High-Voltage Stress Test (HVST) technique. Weibull statistical analysis is used to model the infant-mortality failure distribution.
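For readers unfamiliar with the statistical side, a minimal sketch of such a Weibull analysis (our own illustration with made-up failure times, not data from the projects) is shown below; a fitted shape parameter β < 1 indicates a decreasing hazard rate, i.e. infant mortality, and the fitted CDF estimates the fraction of early-life failures a given burn-in duration would catch:

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical early-life failure times in hours, generated for illustration
rng = np.random.default_rng(1)
failure_times = weibull_min.rvs(c=0.6, scale=50.0, size=200, random_state=rng)

# Fit a two-parameter Weibull (location fixed at zero)
beta, loc, eta = weibull_min.fit(failure_times, floc=0)
print("shape beta = %.2f (beta < 1 => infant mortality)" % beta)
print("scale eta  = %.1f hours" % eta)

# Fraction of infant-mortality failures a t-hour burn-in would be expected to catch
t = 24.0
print("F(%g h) = %.1f%%" % (t, 100 * weibull_min.cdf(t, beta, loc, eta)))
```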

2. Defect Cluster Recognition for Fabricated Semiconductor Wafers

Many systematic failures in wafer fabrication (the so-called frontend process) can only be caught during IC manufacturing (i.e., during the backend process). Thus, there is a need for a simple yet accurate system to perform wafer defect cluster analysis based on fast knowledge extraction from production test data. The talk will cover the design and development of an automation tool to carry out this task, the Automatic Defect Cluster Analysis System (ADCAS). It is aimed at supporting backend-initiated efforts such as defect root-cause identification, die-level neighbourhood analysis, and yield analysis and improvement. It is suitable for plug-and-play application on semiconductor production databases while providing an excellent trade-off between simplicity of implementation and high accuracy of analysis.
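ADCAS itself is not public, but as a rough sketch of the kind of die-level neighbourhood analysis involved, one could group failing-die coordinates on a wafer map with a density-based clustering algorithm such as DBSCAN (purely our assumption; the wafer map and parameters are invented):

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(2)

# Hypothetical wafer map: (x, y) indices of failing dies from production test data.
# A systematic edge ring plus a handful of isolated random defects.
angles = np.linspace(0, 2 * np.pi, 80)
edge_ring = np.column_stack([(15 + 14 * np.cos(angles)).astype(int),
                             (15 + 14 * np.sin(angles)).astype(int)])
random_fails = rng.integers(0, 30, size=(15, 2))
fails = np.vstack([edge_ring, random_fails])

# Dies with enough failing neighbours form a cluster; the rest are noise (-1)
labels = DBSCAN(eps=2.0, min_samples=3, metric="chebyshev").fit_predict(fails)

n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print("systematic defect clusters found:", n_clusters)
print("isolated (random) failing dies:", int(np.sum(labels == -1)))
```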

3. Automatic Media Inspection in Magnetic Memory Drive Production

In the modern high-volume hard disk drive production process, if an assembled product fails the final test it is normally not discarded; instead it is sent for so-called teardown, where it is disassembled into its constituent components. These components are thoroughly examined and retested for their individual functionality. If found to be in good operational condition, they are redeployed in new products. To retest the magnetic disk (or media), Laser Doppler Vibrometry (LDV) has traditionally been employed. Unfortunately, the LDV test is normally lengthy, causing a bottleneck in the teardown process and reducing overall manufacturing efficiency. To address the problem, manual visual inspection is often performed as a preliminary filtering step. Such an arrangement is not optimal, as it is open to human error; it can still be costly and has throughput limitations.

In this part of the talk series, the factors influencing successful and rapid image acquisition of micrometer-level defects on a specular surface are explored, namely camera spatial resolution, spectral properties, imaging-system signal-to-noise ratio, and lighting methods. A detection and classification scheme is offered for four major types of commonly occurring media defects.


Distinguished Lecturer (2016 to 2019)

Talk Title: Measurement and Instrumentation at the Tissue:Machine Interface

Abstract: Biomedical Instrumentation is designed to provide clinicians, scientists and consumers with useful information about the performance and properties of the human body in health and disease. In many applications in medicine and biology, scientific instruments and medical devices operate at the interface between the “soft” world of living organisms and biological substances and the “hard” world of measurement and actuation systems. The development of such instruments and devices requires the designer to draw from and master a wide range of techniques and tools that span biology, chemistry, materials science, optics, mechanics, mathematics, electronics, and computing.

Moreover, the characteristics of biological specimens can be very challenging to measure. Many of the properties of living organisms, and their tissues, are highly non-linear, and can vary throughout time and space. The measurement techniques required to quantify these properties usually span physical domains – electrical, chemical, mechanical, and optical – and may need to account for changes that occur temporally and spatially within the specimen under study. The complexity of the interactions between measured variables often demands that measurements be separated and interpreted with the aid of multi-scale computational models. 

In this talk, I provide a brief example of the challenges and opportunities in bioinstrumentation and measurement by reviewing the development of a unique scientific and medical instrument for studying heart muscle in health and disease.

The heart is a complex organic engine that converts chemical energy into work. Each heartbeat begins with an electrically-released pulse of calcium, which triggers force-development and cell shortening, at the cost of energy and oxygen, and the dissipation of heat. My group have developed new instrumentation systems to measure all of these processes simultaneously, while subjecting isolated samples of heart tissue to realistic contraction patterns that mimic the pressure-volume-time loops experienced by the heart with each beat. This demanding undertaking has required us to develop our own actuators, force transducers, heat sensors, and optical measurement systems. Our instruments make use of several different measurement modalities which are integrated in a robotic hardware-based real-time acquisition and control environment and interpreted with the aid of a computational model.

In this way, we can now resolve (to within a few nanowatts) the heat released by living cardiac muscle fibers as they perform work at 37 °C. Muscle force and length are controlled and measured to micronewton and nanometer precision by a laser interferometer, while the muscle is scanned in the view of an optical microscope equipped with a fluorescent calcium imaging system. Concurrently, the changing muscle geometry is monitored in 4D by a custom-built optical coherence tomograph, and the spacing of muscle proteins is imaged in real time by transmission-microscopy and laser-diffraction systems. Oxygen consumption is measured using fluorescence-quenching techniques.

Equipped with these unique capabilities, we have probed the mechano-energetics of failing hearts from rats with diabetes. We have found that the peak stress and peak mechanical efficiency of tissues from these hearts were normal, despite prolonged twitch duration. We have thus shown that the compromised mechanical performance of the diabetic heart arises from a reduced period of diastolic filling and does not reflect either diminished mechanical performance or diminished efficiency of its tissues. In another program of research, we have demonstrated that, despite claims to the contrary, dietary supplementation with fish oils has no effect on heart muscle efficiency. Neither of these insights was fully revealed until the development of this instrument.

In this talk I demonstrate that in the field of bioinstrumentation and measurement there is a need for greater use of measurement techniques and experimental protocols that allow researchers to gather functional multi-scale, multi-physics data from the same tissue or material source. In many cases, these data are best interpreted holistically, with the aid of multi-physics, sample-specific computational models.

Other measurement tools our lab has developed that also highlight the advantages of this approach include programmable multi-axis soft-tissue robots for measuring the mechanical and optical properties of skin, pericardium, the pelvic floor, and other biological tissues. We have also invented and developed a new class of devices for automated, controlled needle-free injection and extraction of fluids through skin and other biological tissues. These devices are being applied to monitoring change and managing disease in a range of human, animal, and agricultural applications.


Jacob Scharcanski
Distinguished Lecturer (2015 to 2018)

Talk Title: Computer Vision in Medical Imaging Measurements: Making Sense of Visual Data

In this talk, we discuss how computer vision can facilitate the interpretation of medical imaging data, or help make inferences based on models of such data. To illustrate the presentation, several applications of medical imaging measurement and modeling are discussed, focusing on areas such as the correction of imaging artifacts that may occlude visual information, and tumor detection, modeling, and measurement in different imaging modalities.


When interpreting medical imaging data with computer vision, we usually are trying to describe anatomic structures (or medical phenomena) using one or more images, and to reconstruct some of their properties (such as shape, texture, or color) from the imaging data. This is in fact an ill-posed problem that humans can learn to solve effortlessly, whereas computer algorithms are often prone to errors. Nevertheless, in some cases computers can surpass humans and interpret medical images more accurately, given the proper choice of models, as we will show in this talk.


Reconstructing interesting properties of real-world objects or phenomena from captured imaging data involves solving an inverse problem, in which we seek to recover some unknowns given insufficient information to specify a unique solution. Therefore, we disambiguate between possible solutions by relying on models based on physics, mathematics, or statistics. Modeling the real world in all its complexity is still an open problem. However, if we know the phenomenon or object of interest, we can construct detailed models using specialized techniques and domain-specific representations that are efficient at reliably describing the measurements (or at obtaining measurements in some cases). In this talk, we briefly overview some challenging problems in computer vision for medical imaging and measurements, with illustrations and insights about model selection and model-based prediction. Some of the applications discussed in this talk are: modeling tumor shape and size, and making inferences about future growth or shrinkage; modeling relevant details in the background of medical images to discriminate them from useless background noise; and modeling shading artifacts to minimize their influence when detecting and measuring skin lesions in standard camera images.
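To make the inverse-problem point concrete, here is a minimal example of our own (not from the talk): recovering a signal x from blurred, noisy observations b = Ax + n is ill-posed, and a Tikhonov (ridge) penalty is one standard way of disambiguating between possible solutions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100

# Forward model A: a simple moving-average blur (badly conditioned)
A = np.zeros((n, n))
for i in range(n):
    lo, hi = max(0, i - 4), min(n, i + 5)
    A[i, lo:hi] = 1.0 / (hi - lo)

x_true = np.zeros(n)
x_true[30:50] = 1.0                       # a "lesion-like" block signal
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Naive inversion amplifies noise; Tikhonov trades data fit against solution norm:
#   x_reg = argmin ||Ax - b||^2 + lam * ||x||^2 = (A^T A + lam I)^{-1} A^T b
lam = 1e-2
x_naive = np.linalg.solve(A.T @ A + 1e-12 * np.eye(n), A.T @ b)  # jitter only for numerics
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

print("error, naive    : %.2f" % np.linalg.norm(x_naive - x_true))
print("error, Tikhonov : %.2f" % np.linalg.norm(x_reg - x_true))
```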


Medical images contain a wealth of information, which makes modeling of medical images a challenging task. Therefore, medical images often are segmented into multiple elementary parts, simplifying their representation and changing the image model into something that is more meaningful, or easier to analyze and measure (e.g. by describing object boundaries by lines or curves, or image segments by their textures, colors, etc.). Nevertheless, these simpler image elements may be easy to perceive visually but difficult to describe. For example, the texture of a skin lesion may not have an identifiable texture element or a model known a priori, yet skin lesion detection must nevertheless be accurate and precise. Segmentation and analysis of medical imaging data is still an open question, and some current directions are discussed in this talk.


Computer vision and modeling are interrelated. Modeling imaging measurements often involves errors, and estimating the expected error of a model can be important in applications (e.g. estimating a tumor’s size and its potential growth, or shrinkage, in response to a treatment). This issue can be approached by adapting machine learning and pattern recognition techniques to solve problems in medical imaging measurements. Typically, a model has tuning parameters, and these tuning parameters may change the model complexity. We wish to minimize both the modeling errors and the model complexity; in other words, to get the ‘big picture’ we often sacrifice some of the small details. For example, estimating tumor growth (or shrinkage) in response to treatment requires modeling the tumor shape and size, which can be challenging for real tumors, and simplified models may be justifiable if the predictions obtained are informative (e.g. for evaluating treatment effectiveness). To conclude this talk, we outline current trends in computer vision for medical imaging measurements and discuss some open problems.

Yong Yan
Distinguished Lecturer (2015 to 2018)

Talk Title: Measurement and monitoring through electrostatic sensing

This presentation will review the recent advances in electrostatic sensors and signal processing techniques for industrial measurement and monitoring applications. The fundamental sensing principle and characteristics of electrostatic sensors will be introduced. Case studies that are covered in this presentation include pulverized fuel flow metering, on-line particle sizing, advanced flame monitoring, and linear and rotational speed measurement. Results from recent experimental and modelling studies as well as industrial trials will be reported. This presentation will focus on the following application domains.


Pulverized fuel flow metering

New techniques have been developed to tackle the well-known challenges in the measurement of coal/air, biomass/air, and biomass/coal/air mixture flows in power plant pipes. The flow parameters include velocity, concentration, and mass flow rate of particles. Electrostatic sensors in conjunction with signal processing algorithms have been developed to measure these parameters. Demonstration trials at coal- and biomass-fired power stations will be reported.


On-line particle sizing

Particle size distribution is an important input parameter in combustion optimization. On-line continuous measurement of particle size distribution has been achieved through the use of low-cost electrostatic sensors and signal processing algorithms. As particles move through the electrostatic sensing field, the resulting signal is processed to extract particle sizing information. Experimental results will be reported and discussed.


Advanced flame monitoring

Electrostatic sensors have been developed to measure a range of physical parameters of a burner flame, including oscillation frequency, speed and stability. Such characteristic parameters have been used to achieve in-depth understanding and advanced monitoring of burner flames for the improvement of combustion efficiency and reduction in pollutant emissions. 


Linear and rotational speed measurements

Low-cost, strip-type electrostatic sensors along with correlation signal processing algorithms have been used to measure the speed of a moving surface in linear motion or the speed of rotating machinery. Performance and applicability of electrostatic sensors for the condition monitoring of mechanical systems have been assessed through computational modelling and experimentation. 
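As a hedged sketch of the correlation method (our own illustration, not the authors' code; the sensor spacing, sampling rate, and signal model are invented), two electrostatic sensors mounted a known distance apart see nearly the same charge signature with a time shift, and the lag that maximizes their cross-correlation yields the speed:

```python
import numpy as np

fs = 50_000        # sampling rate [Hz]
L = 0.05           # spacing between the two electrode strips [m]
v_true = 12.5      # true surface speed [m/s] -> transit delay L / v = 4 ms

rng = np.random.default_rng(4)
n = 5_000
# Band-limited random charge signature seen by the upstream sensor
up = np.convolve(rng.standard_normal(n), np.ones(50) / 50, mode="same")

# Downstream sensor sees (nearly) the same signature, delayed, plus noise
delay = int(round(L / v_true * fs))                 # 200 samples
down = np.roll(up, delay) + 0.05 * rng.standard_normal(n)

# Cross-correlate and pick the lag with maximum correlation
xcorr = np.correlate(down - down.mean(), up - up.mean(), mode="full")
lag = np.argmax(xcorr) - (n - 1)                    # lag in samples
v_est = L / (lag / fs)
print("estimated speed: %.2f m/s (true %.2f m/s)" % (v_est, v_true))
```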


Distinguished Lecturer (2015 to 2018)

Talk Title: Automated Measuring Systems for Environmental Monitoring

Environmental monitoring can be described essentially as a set of continuous or frequent measurements of environmental parameters, which are fundamental for assessing the state of the environment, the achievement of predefined objectives, law enforcement, the detection of new environmental issues, and short- and medium-term environmental forecasting. Within the scope of environmental monitoring are the following topics:

  • Analytical measurement of the three main environmental media (air, water, and soil) and of materials such as gases, solid and liquid waste, human, plant, and animal bodies and organs, and synthetic products.
  • Instrumental methods for the measurement of environmental noise and vibration pollution.
  • Development and improvement of analytical methods for measuring environmental pollutants.
  • Development of new analytical methods, sensors and instruments for monitoring of air, water and soil quality in residential, industrial, as well as, agricultural areas.
  • Development and improvement of remote sensing methods for measuring environmental pollution.
  • Development of new platforms (e.g. satellites and aerial configurations) and image processing techniques for the detection of potential sources of contamination and the monitoring of air, water, and soil quality.
  • Development of Geographic Information Systems to perform spatial analysis of remote sensing-based and ground-based environmental assessment.
  • Evaluation and assessment of environmental data.
  • Quality assurance and quality control of environmental measurements.

Monitoring at a micro-scale involves monitoring and tracking one or more parameters in a small, limited geographical context, such as the control of gaseous emissions from a factory. At the micro-scale, environmental monitoring is generally used to control emissions of pollutants, whether gaseous or liquid. In contrast, macro-scale monitoring involves a vast geographical area, such as the control of the water quality of a lake.

Vibration, and noise in particular, are important in industrial/manufacturing environments because of their influence on structural safety and on workers' comfort and health. Notwithstanding, soil, air, and water are the more important, and thus the more monitored, environmental media, not only because of their direct impact on all fauna and flora and on human activity, but also indirectly through the influence they have on the weather and climate.

Soil is a non-renewable resource. As the interface between the earth, the air, and the water, it performs vital functions such as providing the basis for food and biomass production, storing carbon and maintaining the balance of gases in the air, providing valued habitats and sustaining biodiversity, and providing raw materials. The quantities that are most pertinent for assessing soil status, and thus most usually measured, are moisture, soil acidity (pH), carbon, total nitrogen and the carbon-to-nitrogen ratio, extractable phosphorus, extractable potassium and magnesium, potentially toxic elements, microbial biomass carbon, and earthworms.

Clean air is essential to our health and to the environment. The quality of the air we breathe has deteriorated considerably, mainly as a result of human activities: rising industrial and energy production, the burning of fossil fuels, and the rise in traffic in our cities. Carbon monoxide, nitrogen oxides, ozone, particulate matter, sulfur dioxide, hydrocarbons and volatile organic compounds, lead and heavy metals, and toxic organic micro-pollutants are the undesired air constituents most commonly monitored.

Water covers about 70% of the Earth's surface, but freshwater represents only 2.5% of the Earth's water, with 98.8% of that water being ice and groundwater. This means that freshwater, which is essential for human activity, is a scarce resource in large regions of the Earth, and its quality must thus be monitored and controlled. It is also important to monitor the quality of the salt water of seashores and rivers because of its impact on ecosystems and on human activity. The parameters that are taken into consideration depend on the use – drinking, washing, agriculture, fishing, food processing, recreation, industrial applications, etc. – but can be organized into 5 groups: (1) biological (algae, bacteria); (2) physical (temperature, turbidity and clarity, color, salinity, suspended solids, dissolved solids (sediment)); (3) chemical (pH, dissolved oxygen, biological oxygen demand, nutrients (including nitrogen and phosphorus), organic and inorganic compounds (including toxicants)); (4) aesthetic (odors, taints, color, floating matter); and (5) radioactive (alpha, beta, and gamma radiation emitters).

For occasional measurement of pertinent environmental quantities, the natural solution is either to use dedicated, manually operated instruments if the measurement is to be made on site, or to take a sample of the medium and make the measurements in an adequate laboratory. The latter solution is sometimes required because of the difficulty of measuring chemical and biological quantities on site.

For continuous environmental monitoring, automated measuring systems are needed. When monitoring has to be performed at several locations, the system must be of the distributed type. Very often, connecting the different monitoring sites by wires is impractical or even impossible, leading to the implementation of a wireless sensor network.

This talk addresses the motivation for research on environmental monitoring, highlights the basic issues underlying the choice of solutions, reviews recent advances in the development of sensing solutions, and presents selected examples of solutions for monitoring the three main environmental media – water, air, and soil – developed and implemented by the Instrumentation and Measurement Group at the Institute for Telecommunications, Lisbon, Portugal.

Reza Zoughi
Distinguished Lecturer (2014 to 2017)

Talk Title: Evolution of Microwave and Millimeter Wave Imaging for NDE Applications

Abstract: Millimeter-wave signals span the frequency range of 30 GHz to 300 GHz, corresponding to a wavelength range of 10 mm to 1 mm. Signals at these frequencies can easily penetrate dielectric materials and composites and interact with their inner structures. The relatively small wavelengths and wide bandwidths associated with these signals enable the production of high-spatial-resolution images of materials and structures. Incorporating imaging techniques such as lens-focused and near-field methods, synthetic aperture focusing, and holographical methods based on robust back-propagation algorithms into more advanced and unique millimeter-wave imaging systems has brought about a flurry of activity in this area, in particular for nondestructive evaluation (NDE) applications. These imaging systems and techniques have been successfully applied to a wide range of critical NDE-related applications.

Although near-field techniques have also been used prominently for these applications in the past, undesired issues related to changing standoff distance have led to several innovative techniques for automatic removal of standoff-distance variation. Ultimately, imaging techniques must produce high-resolution (3D) holographical images, become real-time, and be implemented in portable systems. To this end, and to expedite the imaging process while providing high-resolution images of a structure, the design and demonstration of a 6” by 6” one-shot, rapid, and portable imaging system (Microwave Camera), consisting of 576 resonant slot elements, was recently completed. Subsequently, efforts have been expended to design and implement several different variations of this imaging system to accommodate one-sided and mono-static imaging, to enable 3D image production using non-uniform rapid scanning of an object, and to increase the operating frequency into higher millimeter-wave frequencies. This presentation provides an overview of these techniques, along with illustrations of several typical examples where these imaging techniques have effectively provided viable solutions to critical NDE problems.
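To give a flavour of the synthetic aperture focusing idea, the following deliberately simplified sketch (our own; real systems are far more sophisticated) back-propagates simulated monostatic reflection data onto an image grid by delay-and-sum:

```python
import numpy as np

c = 3e8                                    # propagation speed [m/s]
fc, bw = 30e9, 10e9                        # 30 GHz centre frequency, 10 GHz bandwidth
fs = 200e9                                 # time-domain sampling rate for the simulation
t = np.arange(0, 4e-9, 1 / fs)

xa = np.linspace(-0.05, 0.05, 41)          # antenna scan positions along x [m]
target = (0.01, 0.15)                      # point scatterer at x = 1 cm, z = 15 cm

def pulse(tt):
    """Gaussian-modulated cosine pulse."""
    return np.cos(2 * np.pi * fc * tt) * np.exp(-(np.pi * bw * tt) ** 2)

# Simulate monostatic echoes: each antenna sees a round-trip delay of 2R/c
echoes = np.array([pulse(t - 2 * np.hypot(x - target[0], target[1]) / c) for x in xa])

# Delay-and-sum focusing: coherently sum each echo at the expected pixel delays
xi = np.linspace(-0.05, 0.05, 101)
zi = np.linspace(0.10, 0.20, 101)
image = np.zeros((zi.size, xi.size))
for ia, x in enumerate(xa):
    for iz, z in enumerate(zi):
        tau = 2 * np.hypot(xi - x, z) / c            # round-trip delay to each pixel
        image[iz] += np.interp(tau, t, echoes[ia])

peak = np.unravel_index(np.abs(image).argmax(), image.shape)
print("focused peak at x = %.3f m, z = %.3f m" % (xi[peak[1]], zi[peak[0]]))
```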

Distinguished Lecturer (2014 to 2017)

Talk Title: Unobtrusive Smart Sensing and Pervasive Computing for Healthcare

Abstract: The world’s population is ageing fast. According to the United Nations, the median age across all countries will rise from 28 now to 38 by 2050. It is also estimated that by 2050 the share of the population over 60 will increase worldwide from 11% to 22%, with a higher percentage (33%) in developed countries. In this context, governments and private investors, in addition to working to increase the efficiency and quality of healthcare, are searching for sustainable solutions to contain the growth in healthcare expenditure associated with the higher care demands of elderly people. As such, instrumented environments, pervasive computing, a seemingly invisible infrastructure of various wired and/or wireless communication networks, and intelligent, real-time interactions between players such as health professionals, informal caregivers, and the people being assessed are being created and developed in various research institutions and healthcare systems.

This presentation reviews recent advances in the development of sensing solutions for vital signs and daily activity monitoring. The following will be highlighted:

- Vital signs acquisition and processing by devices embedded in clothes and/or accessories (e.g. smart wrist-worn devices) or in walking aids and transportation equipment such as walkers or manual wheelchairs. The strengths and drawbacks of their cardiac and respiratory assessment capabilities will be discussed, along with studies on the accuracy of cardiac sensing and the influence of artefacts on cardiac function sensing through capacitively coupled electrocardiography, electromechanical film sensors, microwave Doppler radar ballistocardiography, and reflective photoplethysmography. Blood pressure, heart rate variability, and autonomic nervous system activity estimation based on virtual sensors included in wearable or object-embedded devices will also be presented.

- Daily activity signal acquisition and processing through microwave motion sensors, MEMS inertial measurement units, infrared multi-point sensors, and laser motion sensors. The acquisition and conditioning of signals for motion assessment, and theragames based on motion sensing and recognition, will be presented. Using a set of metrics calculated from the information delivered by the unobtrusive motion-capture sensors, an objective evaluation of the effectiveness of a rehabilitation session can be performed. Several methods for diagnosis and therapy monitoring, such as time-frequency analysis, principal component analysis, and pattern recognition of motion signals, with application to gait rehabilitation evaluation, will be described. The work under the project Electronic Health Record for Physiotherapy, promoted by Fundação para Ciência e Tecnologia, Portugal, on developing serious games for physiotherapy based on Kinect technology will be presented.

Concerning the embedded processing, communication, and interoperability requirements of smart sensing devices, a critical analysis of existing solutions and proposed innovative solutions are discussed. Special attention is given to wireless sensor networks, M2M, and IoT, as well as to ubiquitous computing, particularly smartphone apps for healthcare. A rapidly prototyped vital-signs and motor-activity monitor, as well as the use of IEEE 1451.X smart sensor standards for biomedical applications, are included in the presentation.

The creation of novel smart environments, including remote vital-signs and motor-activity monitoring devices for health monitoring and physiotherapy interventions, promotes preventive, personalized, and participative medicine, such as in-home rehabilitation, which can provide more comfort to patients, more effective treatments, shorter recovery periods, and lower healthcare costs. The use of unobtrusive smart sensing and pervasive computing for health monitoring and physiotherapy interventions allows better assessment and communication between health professionals and clients, and increases the likelihood that best practices based on recognized research-based techniques and technologies will be developed and adopted, and that knowledge and expertise will be shared.

Robert X. Gao
Distinguished Lecturer (2014 to 2017)

Talk Title: Advanced Sensing for Intelligent Manufacturing

Advanced sensing is a prerequisite for realizing intelligent manufacturing. Sensors monitor production operations in real time, often in harsh environments, and provide input for diagnosing the root cause of quality degradation and fault progression, such that subsequent corrective measures can be formulated and executed online to control a machine's deviation from its optimal state. With the increasing convergence among measurement science, information technology, wireless communication, and system miniaturization, sensing has continually expanded the contribution of mechatronics to intelligent manufacturing, enabling in-situ state monitoring and process control functionalities that were not feasible before. New sensors not only acquire higher-resolution data at faster rates, but also provide local computing resources for autonomously analyzing the acquired data for intelligent decision support.

This talk presents research on advanced sensing for improved observability in manufacturing process monitoring, using polymer injection molding and sheet metal microrolling as two examples. The design, characterization, and realization of multivariate sensing and acoustic-based wireless data transmission techniques in RF-shielded environments are first introduced. Next, computational methods for solving an ill-posed problem in data reconstruction are described. The talk highlights the significance of advanced sensing and data analytics for advancing the science base and the state of the technology to fully realize the potential of intelligent manufacturing.

Wuqiang Yang
Distinguished Lecturer (2013 to 2016)

Talk Title: Electrical capacitance tomography for imaging industrial processes

Electrical capacitance tomography (ECT) is an imaging technique for industrial applications. ECT is based on measuring capacitances from a multi-electrode capacitance sensor and reconstructing cross-sectional images, aiming to visualise the distribution of dielectric materials, such as gas/oil flows in an oil pipeline and the gas/solids distribution in a fluidised bed. This internal information is valuable for understanding complicated phenomena, verifying computational fluid dynamics (CFD) models, and for the measurement and control of industrial processes, all of which are difficult with conventional process instruments. Compared with other tomography modalities, ECT is the most mature and offers the advantages of no radiation, rapid response, non-intrusive and non-invasive sensing, tolerance of high temperature and high pressure, and low cost.

Research into ECT involves sensor and electronic circuit design, data acquisition, computer interfacing, mathematics, finite element analysis, software programming, and general knowledge of process engineering. Because of the extremely small capacitances to be measured (down to 0.0001 pF) and the soft-field nature of the sensing, ECT presents challenges in both engineering and mathematics. The University of Manchester (formerly UMIST) pioneered research into ECT. The latest ACECT system represents the state of the art: it can generate on-line images at 100 frames per second with a 73 dB signal-to-noise ratio (SNR) and has been used for many challenging industrial applications, such as gas-oil-water flows in oil pipelines, wet gas separators, pneumatic conveyors, cyclone separators, and fluidised bed dryers. It is foreseen that ECT will make major contributions to the gas/oil, pharmaceutical, power, and other industries. In this lecture, the principle of ECT, capacitance measuring circuits, image reconstruction algorithms, and some applications will be discussed, together with a demonstration of an ACECT system.
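As a toy illustration of the image-reconstruction step (our own sketch; real ECT systems use sensitivity maps computed by finite element analysis and often iterative algorithms), linear back-projection estimates a normalized permittivity image g from normalized capacitances λ via the transpose of the sensitivity matrix S:

```python
import numpy as np

rng = np.random.default_rng(5)
n_meas = 66        # independent electrode-pair capacitances for a 12-electrode sensor
n_pix = 32 * 32    # pixels in the reconstruction grid

# Stand-in sensitivity matrix; a real one comes from finite element analysis
S = np.abs(rng.standard_normal((n_meas, n_pix)))

# Normalized capacitances for a hypothetical permittivity distribution g_true
g_true = np.zeros(n_pix)
g_true[200:230] = 1.0                      # a small high-permittivity region
lam = S @ g_true
lam = lam / lam.max()                      # normalize to [0, 1]

# Linear back-projection: g = (S^T lam) / (S^T 1), then rescale to [0, 1]
g_lbp = (S.T @ lam) / (S.T @ np.ones(n_meas))
g_lbp = (g_lbp - g_lbp.min()) / (g_lbp.max() - g_lbp.min())

image = g_lbp.reshape(32, 32)              # one cross-sectional frame
print("reconstructed frame:", image.shape)
```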


Pawel Niewczas
Distinguished Lecturer (2013 to 2016)

Talk Title: Advanced Photonic Sensors for Power and Energy Industries

Optical sensors and photonic devices have matured technically to the point that they are increasingly considered as alternatives to their electronic counterparts in numerous applications across industry. In particular, the utilization of optical sensors has been considered for harsh, high-voltage, or explosive environments where conventional transducers are difficult to deploy or where their operation is compromised by electromagnetic interference.

This prospective talk will explain the motivation for research on fiber-optic sensors, highlight the basic theories underlying their operation, and present selected examples of R&D projects carried out within the Advanced Sensors Team in the Institute for Energy and Environment at the University of Strathclyde, Glasgow, UK, targeting a range of industrial applications. The goal is to highlight the great potential of optical sensors and to enrich the audience's experience in instrumentation and measurement using alternative, non-electronic methods.

Alternatively, for audiences with greater awareness of photonic sensors, the presentation can be tailored to focus solely on the most recent progress in fiber-sensing research for the power and energy industries carried out within the team. In this instance, it will highlight specific examples of measurement needs within the power and energy sectors and report on novel approaches in fiber sensing to address these needs. In particular, it will illustrate such applications as downhole and subsea electrical plant monitoring; voltage and current measurement for power system metering and protection in the context of distributed generation; force and magnetic field monitoring in the context of thermonuclear fusion research; and measurement of the loss of loading within concrete prestressing steel tendons in nuclear power plants. As potential solutions to these respective measurement needs, the talk will introduce such emerging technologies as hybrid fiber Bragg grating (FBG) voltage and current sensors; novel solid-state FBG interrogation schemes utilizing wavelength division multiplexing (WDM) and time domain multiplexing (TDM) architectures (not requiring tunable spectral filters or lasers); and novel FBG sensors and interrogation schemes utilizing promising intrinsic sensing mechanisms capable of measuring quantities such as magnetic and electric fields or bending.