
Distinguished Lecturer Program

The I&M Society Distinguished Lecturer Program (DLP) is one of the most exciting programs offered to our chapters, I&M members, and IEEE members. It provides I&M chapters around the world with talks by experts on topics of interest and importance to the I&M community. Together with our conferences and publications, it is one of the principal means by which we disseminate knowledge in the I&M field. Our lecturers are among the most qualified experts in their fields, and the program offers our members a first-hand chance to interact with these experts during their lectures. The I&M Society provides financial support so that chapters can make use of this program.

All distinguished lecturers are outstanding in their fields of specialty. Collectively, the Distinguished Lecturers possess a broad range of expertise within the area of I&M. Thus, the chapters are encouraged to use this program as a means to make their local I&M community aware of the most recent scientific and technological trends and to enhance their member benefits. Although lectures are mainly organized to benefit existing members and Chapters, they can also be effective in generating membership and encouraging new chapter formation. Interested parties are encouraged to contact the I&M DLP Chair regarding this type of activity.

To review the DL reports and see photos from past lectures, please send a request to the DLP Chair.

DLP Chair:

    I&M AdCom (2012-2015); Distinguished Lecturer Program Chair


Distinguished Lecturers

Yong Yan
Distinguished Lecturer

Talk Title: Advanced measurement and monitoring techniques for coal and biomass fired power plant optimization

Despite the growing deployment of other energy sources, coal and biomass use is increasing worldwide to meet the rising global demand for electricity, which is predicted to grow by 2.6% per annum over the next 20 years. Global fluctuations in coal price and logistic uncertainties in coal supply mean that many power stations are burning a diverse range of coals (indigenous and imported), and the type and quality of coal being fired at any moment is often unknown for various practical reasons. Although biomass can be used to generate energy in different ways, co-firing with coal at existing power stations remains a practical option available to power plant operators and is widely adopted as one of the main technologies for reducing greenhouse gas emissions from power generation. Biomass originates from a diverse range of sources in a wide variety of forms. In general, biomass has a higher moisture content and higher volatile matter than coal, but its density and calorific value are lower than those of coal. The inherent differences in combustion properties between biomass and coal, the unknown changes in the type and quality of coals, and fluctuations in electricity demand have posed significant challenges to the power generation industry. Measurement and monitoring techniques have an important part to play in tackling these challenges.

This presentation reviews the recent advances in the development and applications of measurement and monitoring techniques to optimize the operation of coal and biomass fired power plants. The techniques that are covered in this presentation include pulverized fuel flow metering, on-line particle sizing, flame imaging, flame stability monitoring, and on-line fuel tracking. Fundamental principles of the measurement and monitoring techniques along with the design and implementation of prototype sensors and instruments will be introduced. Results from recent practical evaluations on industrial-scale combustion test facilities and demonstration trials on full-scale power plants will be reported.
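
As a rough illustration of the kind of pulverized fuel flow metering mentioned above, the sketch below estimates particle velocity from the transit time between two axially spaced sensors using cross-correlation, a technique commonly used for this purpose. The sampling rate, sensor spacing, and synthetic signals are illustrative assumptions, not parameters from the talk.

```python
import numpy as np

# Two sensors a known distance apart along the fuel pipe see nearly the same
# particle-flow signal, delayed by the transit time between them.
fs = 50_000.0          # sampling rate in Hz (assumed)
spacing = 0.10         # axial sensor spacing in metres (assumed)

rng = np.random.default_rng(1)
upstream = rng.standard_normal(50_000)
delay_samples = 125                                   # simulated transit of 2.5 ms
downstream = np.roll(upstream, delay_samples) + 0.1 * rng.standard_normal(50_000)

# Transit time is taken from the lag that maximises the cross-correlation.
corr = np.correlate(downstream, upstream, mode="full")
lag = int(corr.argmax()) - (len(upstream) - 1)
transit_time = lag / fs
print(f"transit time {transit_time * 1e3:.2f} ms, velocity {spacing / transit_time:.1f} m/s")
```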

Distinguished Lecturer

Talk Title: Unobtrusive Smart Sensing and Pervasive Computing for Healthcare

Abstract: The world’s population is ageing fast. According to the United Nations, the median age across all countries will rise from 28 today to 38 by 2050. It is also estimated that by 2050 the proportion of the population over 60 years of age will increase worldwide from 11% to 22%, with an even higher share (33%) of elderly people in developed countries. In this context, governments and private investors, in addition to working to increase the efficiency and quality of healthcare, are searching for sustainable solutions to contain the increase in healthcare expenditure associated with the higher care demands of elderly people. To this end, instrumented environments, pervasive computing, seemingly invisible infrastructures of wired and/or wireless communication networks, and intelligent, real-time interactions between players such as health professionals, informal caregivers, and the people being assessed are being created and developed in research institutions and healthcare systems.

This presentation reviews recent advances in the development of sensing solutions for vital signals and daily activity monitoring. The following will be highlighted:

- Vital signal acquisition and processing by devices embedded in clothes and/or accessories (e.g. smart wrist-worn devices) or in walking aids and transportation equipment such as walkers and manual wheelchairs. The strengths and drawbacks of their cardiac and respiratory assessment capabilities will be discussed, together with studies on cardiac sensing accuracy and on the influence of artefacts on cardiac function sensing through capacitively coupled electrocardiography, electromechanical film sensors, microwave Doppler radar ballistocardiography, and reflective photoplethysmography. Blood pressure, heart rate variability, and autonomic nervous system activity estimation based on virtual sensors included in wearable or object-embedded devices will also be presented (a minimal heart-rate-variability illustration follows this list).

- Daily activity signal acquisition and processing through microwave motion sensors, MEMS inertial measurement units, infrared multi-point sensors, and laser motion sensors. The acquisition and conditioning of signals for motion assessment, and theragames based on motion sensing and recognition, will be presented. Using a set of metrics computed from the information delivered by the unobtrusive motion-capture sensors, the effectiveness of a rehabilitation session can be evaluated objectively. Several methods for diagnosis and therapy monitoring, such as time-frequency analysis, principal component analysis, and pattern recognition of motion signals, with application to gait rehabilitation evaluation, will be described. The work carried out under the project Electronic Health Record for Physiotherapy, promoted by Fundação para Ciência e Tecnologia, Portugal, on developing serious games for physiotherapy based on Kinect technology will be presented.
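
As a minimal illustration of the heart rate variability estimation mentioned in the first item above, the sketch below computes common time-domain HRV indices from a synthetic series of R-R intervals; the data and parameter values are assumptions for illustration, not results from the talk.

```python
import numpy as np

# Synthetic R-R intervals in seconds; in practice these would come from R-peak
# detection on the cardiac signal acquired by the wearable or embedded device.
rng = np.random.default_rng(2)
rr = 0.8 + 0.05 * rng.standard_normal(300)

heart_rate = 60.0 / rr.mean()                      # mean heart rate (beats per minute)
sdnn = rr.std(ddof=1) * 1e3                        # overall variability, SDNN (ms)
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2)) * 1e3   # short-term variability, RMSSD (ms)

print(f"HR {heart_rate:.1f} bpm, SDNN {sdnn:.1f} ms, RMSSD {rmssd:.1f} ms")
```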

Concerning the embedded processing, communication, and interoperability requirements of smart sensing devices, a critical analysis of existing solutions and some proposed innovative solutions are discussed. Special attention is given to wireless sensor networks, M2M, and the IoT, as well as to ubiquitous computing, particularly smartphone applications for healthcare. A rapidly prototyped vital signs and motor activity monitor, as well as the use of IEEE 1451.X smart sensor standards for biomedical applications, are also included in the presentation.

The creation of novel smart environments that include remote vital signs and motor activity monitoring devices for health monitoring and physiotherapy interventions promotes preventive, personalized, and participative medicine; in-home rehabilitation, for example, can provide more comfort to patients, more effective treatments, shorter recovery periods, and lower healthcare costs. The use of unobtrusive smart sensing and pervasive computing for health monitoring and physiotherapy interventions allows better assessment and communication between health professionals and clients, and increases the likelihood that best practice is developed and adopted, based on recognized research-based techniques and technologies and on the sharing of knowledge and expertise.

Robert X. Gao
Distinguished Lecturer

Talk Title: Advanced Sensing for Intelligent Manufacturing

Advanced sensing is a prerequisite for realizing intelligent manufacturing. Sensors monitor production operations in real time, often in harsh environments, and provide input for diagnosing the root causes of quality degradation and fault progression so that corrective measures can be formulated and executed online to control a machine’s deviation from its optimal state. With the increasing convergence of measurement science, information technology, wireless communication, and system miniaturization, sensing has continually expanded the contribution of mechatronics to intelligent manufacturing, enabling functionalities in in-situ state monitoring and process control that were not feasible before. New sensors not only acquire higher-resolution data at faster rates, but also provide local computing resources for autonomously analyzing the acquired data to support intelligent decision-making.

This talk presents research on advanced sensing for improved observability in manufacturing process monitoring, using polymer injection molding and sheet metal microrolling as two examples. The design, characterization, and realization of multivariate sensing and acoustic-based wireless data transmission techniques in RF-shielded environments are first introduced. Next, computational methods for solving an ill-posed data reconstruction problem are described. The talk highlights the significance of advanced sensing and data analytics for advancing the science base and the state of the technology to fully realize the potential of intelligent manufacturing.

Jacob Scharcanski
Distinguished Lecturer

Talk Title: Modeling in Imaging Measurements: Making Sense of Data

In this talk, modeling in imaging measurements is proposed as a way to facilitate the interpretation of phenomena based on imagery, or to make inferences based on models of such phenomena. To illustrate the presentation, several applications of imaging measurements and modeling are discussed, focusing on areas such as medicine, biometrics, pulp and paper, soil sciences, porous media, surveillance, and human-machine interfaces.

When modeling imaging measurements, we are usually trying to describe the world (or a real-world phenomenon) using one or more images and to reconstruct some of its properties, such as shape, texture, or color, from the imagery data. This is an ill-posed problem that humans learn to solve effortlessly but on which computer algorithms are often prone to errors. Nevertheless, in some cases computers can surpass humans and interpret imagery more accurately, given a proper choice of models, as we will show in this talk.

Reconstructing interesting properties of real-world objects or phenomena from captured imagery involves solving an inverse problem, in which we seek to recover unknowns from information that is insufficient to specify a unique solution. We therefore disambiguate between possible solutions by relying on models based on physics, mathematics, or statistics. Modeling the real world in all its complexity is still an open problem. However, if we know about the phenomenon or object of interest ahead of time, we can construct detailed models using specialized techniques and domain-specific representations that describe the measurements reliably and efficiently. In this talk, we give a brief overview of challenging domain-specific modeling problems and use them, in tutorial form, as illustrations of the concepts involved in modeling imaging measurements. We discuss modeling issues in 2D and 3D stochastic textures (e.g. pulp and paper, soil science, and agriculture). We also provide some insights on model selection and model-based prediction using examples in medicine (e.g. modeling tumor shape and size and making inferences about future growth or shrinkage) and biometrics (e.g. measuring the pose of a human head).
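
To make the idea of disambiguating an inverse problem with a model concrete, here is a small sketch (not from the talk) in which a poorly conditioned linear reconstruction y = Ax is stabilised with a Tikhonov (energy) prior; the matrix, noise level, and regularisation weights are arbitrary illustrative choices.

```python
import numpy as np

# Toy ill-posed reconstruction: recover x from y = A x + noise when A is badly
# conditioned. The prior knowledge ("model") enters as Tikhonov regularisation.
rng = np.random.default_rng(3)
n = 50
A = rng.standard_normal((n, n))
A[:, 1] = A[:, 0] + 1e-6 * rng.standard_normal(n)   # nearly dependent columns
x_true = np.sin(np.linspace(0.0, np.pi, n))
y = A @ x_true + 0.01 * rng.standard_normal(n)

def reconstruct(y, A, alpha):
    """Minimise ||A x - y||^2 + alpha * ||x||^2 (a simple energy prior)."""
    return np.linalg.solve(A.T @ A + alpha * np.eye(A.shape[1]), A.T @ y)

for alpha in (0.0, 1e-3, 1e-1):
    x_hat = reconstruct(y, A, alpha)
    err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
    print(f"alpha = {alpha:g}: relative error {err:.3f}")
```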

Modeling imaging measurements is challenging, especially when dealing with textures. Texture is a widespread phenomenon in several segments of industry and science (e.g. pulp and paper, soil science, oil reservoir evaluation, agriculture, etc.). It is easy to perceive visually but difficult to describe. Typically, texture depends on the scale at which it is perceived, and its pattern depends on what we are looking for in it. Texture may consist of organized patterns of regular sub-elements, but in some cases it is stochastic (e.g. paper, non-tissue materials, soil, etc.). For example, an image of a paper sample may be seen as a stochastic texture with no identifiable texture element, or as a collection of fibers forming a fiber network in which the fibers are identifiable texture elements. An important question is how we should describe and interpret a given texture. This is an open question, and prior knowledge of the application and its textural properties helps to identify effective models for texture interpretation (and classification).

In particular, the structural characterization and classification of stochastic textures of porous materials has attracted the attention of researchers in different application areas because of its great economic importance. For example, problems related to mass transfer and the retention of solids in multi-phase fluid flow through stochastic porous materials are ubiquitous in chemical engineering. Agricultural engineering has also received attention recently, mostly because of changing agricultural practices in developing and developed countries and their environmental impact; mass transfer in porous media such as soils depends strongly on the morphological aspects of the media, such as the shape and size of voids, and on their topological attributes, such as network connectivity. More recently, researchers have proposed geometrical and statistical approaches for porous media characterization. The statistical characterization and classification of stochastic porous media is essential for simulating and/or predicting the mass transfer properties of a particular stochastic medium. Much work has been done on the characterization of stochastic porous media, but discriminating between different media types from measurements remains a challenging issue.

In this talk, we discuss the application of gamma statistics to model the distribution of voids in stochastic porous media, which admits a direct statistical-geometric representation of stochastic fibrous networks and their fluid transfer properties. A related issue is the classification of stochastic textures and porous media, which we discuss by introducing a gamma manifold and an embedding of stochastic texture and porous media representations. To measure the similarity of such stochastic textures and porous media, different approaches to measuring stochastic texture similarity are reviewed. Experimental results based on porous media data obtained from tomographic images of soil and from images of fibrous materials are presented to illustrate the presentation.
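
As a toy example of the gamma modeling of void sizes described above, the following sketch fits a gamma distribution to synthetic pore radii and reports a crude goodness-of-fit measure; the data are simulated and the workflow is only a hypothetical stand-in for the image-derived measurements used in the talk.

```python
import numpy as np
from scipy import stats

# Synthetic pore (void) radii in micrometres, standing in for values that would
# be extracted from tomographic images of soil or fibrous materials.
rng = np.random.default_rng(4)
pore_radii = rng.gamma(shape=2.5, scale=8.0, size=2000)

# Fit a two-parameter gamma model (location fixed at zero) to the void sizes.
shape, loc, scale = stats.gamma.fit(pore_radii, floc=0)
print(f"fitted gamma model: k = {shape:.2f}, theta = {scale:.2f} um")

# A crude way to compare a sample with a fitted model (or two media with each
# other) is a distance between distributions, e.g. the Kolmogorov-Smirnov statistic.
d, p = stats.kstest(pore_radii, "gamma", args=(shape, loc, scale))
print(f"KS distance to the fitted model: {d:.3f}")
```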

Modeling imaging measurements often involves errors, and estimating the expected error of a model can be important in some applications (e.g. when estimating a tumor’s size and its potential growth, or shrinkage, in response to treatment). This issue is closely related to machine learning and pattern recognition, and techniques from these areas can be adapted to solve problems in imaging measurements. Typically, a model has tuning parameters, and these parameters may change the model complexity. We wish to minimize both the modeling error and the model complexity; in other words, to get the ‘big picture’ we often sacrifice some of the small details. For example, estimating tumor growth (or shrinkage) in response to treatment requires modeling the tumor shape and size, which can be challenging for real tumors, and simplified models may be justifiable if the predictions obtained are informative (e.g. for evaluating treatment effectiveness). To conclude the talk, open problems in model selection and assessment for imaging measurements are discussed in some detail, particularly in biometrics and medicine.
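
The complexity/accuracy trade-off mentioned above can be illustrated with a small hold-out experiment (not from the talk): as polynomial degree grows, training error keeps shrinking while hold-out error typically stops improving, which is the signal usually used to select model complexity. All data and degrees below are arbitrary illustrative choices.

```python
import numpy as np
from numpy.polynomial import Polynomial

# Noisy samples of a smooth function; model complexity = polynomial degree.
rng = np.random.default_rng(5)
x = np.linspace(0.0, 1.0, 60)
y = np.sin(2.0 * np.pi * x) + 0.2 * rng.standard_normal(x.size)

train, test = np.arange(0, 60, 2), np.arange(1, 60, 2)   # interleaved hold-out split

def rmse(model, idx):
    return float(np.sqrt(np.mean((model(x[idx]) - y[idx]) ** 2)))

for degree in (1, 3, 9, 15):
    model = Polynomial.fit(x[train], y[train], degree)
    print(f"degree {degree:2d}: train RMSE {rmse(model, train):.3f}, "
          f"hold-out RMSE {rmse(model, test):.3f}")
```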

Reza Zoughi
Distinguished Lecturer

Talk Title: Evolution of Microwave and Millimeter Wave Imaging for NDE Applications

Abstract: Millimeter-wave signals span the frequency range of 30 GHz to 300 GHz, corresponding to wavelengths of 10 mm to 1 mm. Signals at these frequencies easily penetrate dielectric materials and composites and interact with their inner structures. The relatively small wavelengths and wide bandwidths associated with these signals enable the production of high spatial-resolution images of materials and structures. Incorporating imaging techniques such as lens-focused and near-field techniques, synthetic aperture focusing, and holographic methods based on robust back-propagation algorithms into more advanced and unique millimeter-wave imaging systems has brought about a flurry of activity in this area, in particular for nondestructive evaluation (NDE) applications. These imaging systems and techniques have been successfully applied to a wide range of critical NDE-related applications.

Although near-field techniques have also been used prominently for these applications in the past, undesired issues related to changing standoff distance have prompted several innovative techniques for automatically removing standoff distance variation. Ultimately, imaging techniques must produce high-resolution (3D) holographic images, operate in real time, and be implemented in portable systems. To this end, and to expedite the imaging process while providing high-resolution images of a structure, the design and demonstration of a 6” by 6” one-shot, rapid, and portable imaging system (the Microwave Camera), consisting of 576 resonant slot elements, was recently completed. Subsequently, efforts have been expended to design and implement several variations of this imaging system to accommodate one-sided and monostatic imaging, to enable 3D image production using non-uniform rapid scanning of an object, and to increase the operating frequency into the higher millimeter-wave range. This presentation provides an overview of these techniques, along with several typical examples where they have provided viable solutions to critical NDE problems.
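
As a drastically simplified illustration of back-propagation-based imaging (single frequency, 1D monostatic scan, one ideal point scatterer, no noise), the sketch below refocuses simulated reflections by reapplying the conjugate round-trip phase over an image grid; the frequency, geometry, and scatterer position are assumptions, not parameters of the systems described in the talk. The image peak should appear near the simulated scatterer position.

```python
import numpy as np

c = 3.0e8
f = 75e9                                   # assumed frequency, not from the talk
k = 2.0 * np.pi * f / c                    # free-space wavenumber

# Monostatic raster scan along x at a fixed standoff height z = z0.
xs = np.linspace(-0.05, 0.05, 201)         # antenna positions (m)
z0 = 0.10                                  # standoff distance (m)
tx, tz = 0.01, 0.03                        # point scatterer location (m)

R = np.hypot(xs - tx, z0 - tz)             # antenna-to-scatterer distance
s = np.exp(-1j * 2.0 * k * R)              # ideal, noiseless round-trip phase

# Back-propagation: reapply the conjugate round-trip phase over an image grid.
xi = np.linspace(-0.05, 0.05, 101)
zi = np.linspace(0.0, 0.08, 81)
X, Z = np.meshgrid(xi, zi)
image = np.zeros_like(X, dtype=complex)
for xn, sn in zip(xs, s):
    image += sn * np.exp(1j * 2.0 * k * np.hypot(X - xn, z0 - Z))

iy, ix = np.unravel_index(np.abs(image).argmax(), image.shape)
print(f"image peak at x = {xi[ix]:.3f} m, z = {zi[iy]:.3f} m")
```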

Wuqiang Yang
Distinguished Lecturer

Talk Title: Electrical capacitance tomography for imaging industrial processes

Electrical capacitance tomography (ECT) is an imaging technique for industrial applications. ECT is based on measuring capacitances from a multi-electrode capacitance sensor and reconstructing cross-sectional images, with the aim of visualising the distribution of dielectric materials, such as gas/oil flows in an oil pipeline or the gas/solids distribution in a fluidised bed. This internal information is valuable for understanding complicated phenomena, verifying computational fluid dynamics (CFD) models, and measuring and controlling industrial processes, all of which are difficult with conventional process instruments. Compared with other tomography modalities, ECT is the most mature; it uses no radiation, responds rapidly, is non-intrusive and non-invasive, withstands high temperature and high pressure, and is low-cost.

Research into ECT involves sensor and electronic circuit design, data acquisition, computer interfacing, mathematics, finite element analysis, software programming, and general knowledge of process engineering. Because of the extremely small capacitances to be measured (down to 0.0001 pF) and the soft-field nature of the measurement, ECT presents challenges in engineering and mathematics. The University of Manchester (formerly UMIST) pioneered research into ECT. The latest ACECT system represents the state of the art: it can generate on-line images at 100 frames per second with a 73 dB signal-to-noise ratio (SNR) and has been used for many challenging industrial applications, such as gas-oil-water flows in oil pipelines, wet gas separators, pneumatic conveyors, cyclone separators, and fluidised bed dryers. It is foreseen that ECT will make major contributions to the gas/oil, pharmaceutical, power, and other industries. In this lecture, the principle of ECT, capacitance measuring circuits, image reconstruction algorithms, and some applications will be discussed, together with a demonstration of an ACECT system.
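
To give a flavour of ECT image reconstruction, here is a sketch of linear back-projection, one of the simplest reconstruction algorithms used in ECT, applied to synthetic data. In a real system the sensitivity map comes from finite element modelling of the actual sensor and the capacitances from the measuring circuit; the random values below are placeholders only.

```python
import numpy as np

# Hypothetical sizes: a 12-electrode sensor gives 12*11/2 = 66 independent
# electrode-pair capacitances; the cross-section is discretised into 32 x 32 pixels.
n_meas, n_pix = 66, 32 * 32

rng = np.random.default_rng(6)
S = rng.random((n_meas, n_pix))               # sensitivity map (from FEM in practice)
c_empty = rng.random(n_meas)                  # calibration frame, low-permittivity fill
c_full = c_empty + rng.random(n_meas) + 0.1   # calibration frame, high-permittivity fill
c_meas = c_empty + rng.random(n_meas) * (c_full - c_empty)   # measured frame

# Normalised capacitances: 0 when empty, 1 when full.
lam = (c_meas - c_empty) / (c_full - c_empty)

# Linear back-projection: g = S^T * lam, normalised by the summed sensitivities.
g = S.T @ lam / S.sum(axis=0)
image = g.reshape(32, 32)                     # grey level ~ permittivity distribution
print(image.shape, float(image.min()), float(image.max()))
```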

Pawel Niewczas
Distinguished Lecturer

Talk Title: Advanced Photonic Sensors for Power and Energy Industries

Optical sensors and photonic devices have technically matured to the point that they are increasingly considered as alternatives to their electronic counterparts in numerous applications across industry. In particular, the use of optical sensors has been considered for harsh, high-voltage, or explosive environments where conventional transducers are difficult to deploy or where their operation is compromised by electromagnetic interference.

This prospective talk will explain the motivation for research on fiber-optic sensors, highlight the basic theories underlying their operation, and present selected examples of R&D projects carried out within the Advanced Sensors Team in the Institute for Energy and Environment at the University of Strathclyde, Glasgow, UK, targeting a range of industrial applications. The goal is to highlight the great potential of optical sensors and to enrich the audience’s experience of instrumentation and measurement using alternative, non-electronic methods.

Alternatively, for audiences with greater awareness of photonic sensors, the presentation can be tailored to focus solely on the most recent progress in fiber sensing research for the power and energy industries carried out within the team. In this case, it will highlight specific measurement needs within the power and energy sectors and report on novel fiber sensing approaches to address them. In particular, it will illustrate applications such as downhole and subsea electrical plant monitoring; voltage and current measurement for power system metering and protection in the context of distributed generation; force and magnetic field monitoring in the context of thermonuclear fusion research; and measurement of the loss of loading within concrete prestressing steel tendons in nuclear power plant applications. As potential solutions to these respective measurement needs, the talk will introduce emerging technologies such as hybrid fiber Bragg grating (FBG) voltage and current sensors; novel solid-state FBG interrogation schemes utilizing wavelength division multiplexing (WDM) and time domain multiplexing (TDM) architectures (not requiring tunable spectral filters or lasers); and novel FBG sensors and interrogation schemes exploiting promising intrinsic sensing mechanisms capable of measuring quantities such as magnetic and electric fields or bend.
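
As a small illustration of how an FBG measurement is turned into an engineering quantity, the sketch below converts a Bragg wavelength shift into strain using the standard relation dλ/λ = (1 - p_e)·ε + k_T·ΔT; the coefficient values are typical textbook figures for silica fibre, not numbers from the talk.

```python
# Convert a measured Bragg wavelength shift into strain, assuming temperature is
# compensated separately. Coefficients are typical textbook values for silica
# fibre around 1550 nm, not parameters taken from the talk.
lambda_b = 1550.0e-9      # nominal Bragg wavelength (m)
p_e = 0.22                # effective photo-elastic coefficient
k_t = 6.7e-6              # combined thermo-optic and expansion coefficient (1/K)

def fbg_strain(delta_lambda_m: float, delta_t_k: float = 0.0) -> float:
    """Strain from the relation dL/L = (1 - p_e) * strain + k_t * dT."""
    return (delta_lambda_m / lambda_b - k_t * delta_t_k) / (1.0 - p_e)

# A 10 pm shift with no temperature change corresponds to roughly 8 microstrain.
print(f"{fbg_strain(10e-12) * 1e6:.1f} microstrain")
```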

Annamária R. Várkonyi-Kóczy
Distinguished Lecturer

Talk Title: “Anytime” Processing – An Effective Way to Overcome Time and Resource Limitations

Nowadays, scientists, researchers, and practising engineers face an unprecedented explosion in the richness and complexity of the problems to be solved. Besides spatial and temporal complexity, common tasks usually involve non-negligible uncertainty or even a lack of information; strict requirements concerning the timing, continuity, robustness, and reliability of outputs; and further expectations such as adaptivity and the ability to handle atypical and crisis situations efficiently.

Model-based computing plays an important role in achieving these goals because it integrates the available knowledge about the problem at hand, in a proper form, into the procedure to be executed, acting as an active component during operation. Unfortunately, classical modeling methods often fail to meet the requirements of robustness, flexibility, adaptivity, learning, and generalization. Even soft-computing-based models may fail to be effective enough because of their high (exponentially increasing) complexity. To satisfy the time, resource, and data constraints associated with a given task, hybrid methods and new approaches are needed for modeling, evaluating, and interpreting the problems and results. A possible answer to these challenges is offered by combining soft computing techniques with the novel approaches of anytime and situational modeling and operation.

Anytime processing is very flexible with respect to the available input information, computational power, and time. It is able to generalize previous input information and to provide short response times when the required reaction time is significantly shortened by failures or by an alarm in the modeled system, or when decisions must be made before sufficient information arrives or the processing can be completed. The aim of the technique is to ensure continuous operation under (dynamically) changing circumstances and to provide optimal overall performance for the whole system. In the case of a temporary shortage of computational power and/or loss of some data, operation continues with the overall performance maintained “at a lower price”: information processing based on simpler algorithms and/or models provides outputs of acceptable quality so that the complete system can keep running. The accuracy of the processing may temporarily drop, but it is usually still sufficient to produce data for qualitative evaluations and to support further decisions.
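
A minimal sketch of the anytime idea (not an algorithm from the talk): the computation below refines its estimate until a deadline expires and can be interrupted at any point, returning its current best result, so a tighter time budget simply yields a coarser answer.

```python
import time

def anytime_pi(deadline_s: float):
    """Refine an estimate of pi (Leibniz series) until the deadline expires.

    The partial result is usable at any interruption point; a larger time
    budget simply yields a more accurate answer.
    """
    t_end = time.monotonic() + deadline_s
    estimate, k, sign = 0.0, 0, 1.0
    while time.monotonic() < t_end:
        estimate += sign * 4.0 / (2 * k + 1)
        sign, k = -sign, k + 1
    return estimate, k              # current best result and how far we got

for budget in (0.001, 0.01, 0.1):
    value, terms = anytime_pi(budget)
    print(f"budget {budget * 1000:5.1f} ms -> {terms} terms, pi ~ {value:.6f}")
```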

Situational modeling has been designed for the modeling and control of complex systems where traditional cybernetic models have not proved sufficient because the characterization of the system is incomplete or ambiguous due to unique, dynamically changing, and unforeseen situations. Typical cases are alarm situations, structural failures, the starting and stopping of plants, etc. The goal of situational modeling is to handle the contradiction between the large number of possible situations and the limited number of processing strategies by grouping the possible situations into a treatable (finite) number of model classes of operational situations and by assigning certain processing algorithms to the defined processing regimes. This technique - similarly to anytime processing - offers a tradeoff between resource (including time and data) consumption and output quality.
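
A minimal sketch of the situational idea, using assumed regime names and stand-in models rather than anything from the talk: run-time conditions are grouped into a small number of operational regimes, and each regime is mapped to its own processing strategy.

```python
# Stand-in processing strategies of decreasing cost and accuracy.
def full_model(x):
    return sum(v * v for v in x) / len(x)       # "accurate but expensive" placeholder

def reduced_model(x):
    return x[0] * x[0]                          # "coarse but fast" placeholder

def safe_shutdown(x):
    return 0.0                                  # emergency strategy placeholder

def classify_situation(cpu_load: float, alarm: bool) -> str:
    """Map raw run-time conditions onto a small, treatable set of regimes."""
    if alarm:
        return "crisis"
    return "degraded" if cpu_load > 0.8 else "normal"

STRATEGY = {"normal": full_model, "degraded": reduced_model, "crisis": safe_shutdown}

readings = [0.9, 1.1, 0.95]
for load, alarm in ((0.2, False), (0.95, False), (0.5, True)):
    regime = classify_situation(load, alarm)
    print(f"{regime:8s} -> output {STRATEGY[regime](readings):.3f}")
```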

The presentation gives an overview of the basics of the anytime and situational approaches. Besides summarizing theoretical results and pointing out open questions (e.g. accuracy measures, data interpretation, transients), the author illustrates some of the possibilities offered by these new techniques through successful applications taken from the fields of signal and image processing, control and fault diagnosis of plants, analysis, and expert systems. Some of the topics discussed are:
-    Anytime Fuzzy Fast Fourier Transformation and Adaptive Anytime Fuzzy Fast Fourier Transformation: How can we determine the most important signal parameters before the signal period arrives? How can we implement fast algorithms with only negligible delay?
-    Anytime Recursive Overcomplete Signal Representations: How can we minimize the channel capacity necessary for transmitting a certain amount of information? How can we provide optimal and flexible on-going signal representations, on-going signal segmentation into stationary intervals, and on-going feature extraction for immediate use in data transmission, communication, diagnostics, or other applications when the transmission channel is overloaded, and when processing non-stationary signals for which complete signal representations can be used only with serious limitations?
-    High Dynamic Range (HDR) imaging and situational image quality improvement: How can we make the invisible details of images visible? How can we enhance the useful information of images which is significant from the point of view of further processing?
-    Anytime control and fault diagnosis of plants: How can we produce useful results and react in crisis situations very quickly in order to avoid catastrophes? How can we increase the safely available reaction time of the (slow) human supervisor by significantly decreasing the time needed for the automatic detection and diagnosis of faults?
-    CASY, an Intelligent Car Crash Analysis System: How can we build an intelligent expert system capable of reconstructing the 3D model of crashed cars autonomously (without any human interaction) using only 2D photos and, based on that model, determining characteristic features of a crash such as the energy absorbed by the car-body deformation, the direction of impact, and the pre-crash speed of the car? In what other fields can the algorithms of the system be used?

Gourab Sen Gupta
Distinguished Lecturer

Talk Title: Sensors and Measurements for Robotics

Robots have changed the way we work, play, live and unfortunately fight wars. Robots invaded the workplace many decades ago, initially for factory automation. They are increasing their presence in the home at a very rapid pace, primarily for assisted living. Wars are being fought using robots on the ground, above and below the waters and in the air. In the next decade, the world will witness the largest growth of robots in the service industry. From the days of industrial automation using monstrous robots, the world has advanced to micro and nano robots traversing the veins of a human body to deliver drugs.

What makes robots as capable and versatile as they are today? Will they ever attain the full functionality, intelligence, and versatility of human beings? Or is that wishful thinking? What will be the breakthrough technology that enables robots to make that quantum jump in their capabilities?

For the successful completion of tasks, robots have to perceive the world around them, the workspace in which they operate. At the heart of this perception are the inputs from a gamut of sensors. Accurate measurement of physical parameters and the fusion of sensory data have a profound influence on the accuracy of the perception model. While a lot of energy and resources are still being expended on research into robot locomotion and actuators for motion, it is the advancement of sensors and measurement technology that will catapult robots to the next level of versatility and acceptance. Miniaturisation of sensors and precision measurement will be the flavour of research in the next decade, making a career in instrumentation and measurement a very attractive proposition for young scientists and researchers.
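
As one small example of the sensor fusion referred to above, the sketch below uses a complementary filter, a common lightweight fusion technique, to combine a drifting gyroscope rate with a noisy accelerometer tilt estimate; all signal values and the filter coefficient are illustrative assumptions, and the estimate should settle near the simulated 10-degree pitch despite the gyro bias and accelerometer noise.

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate (smooth but drifting) with an accelerometer tilt
    angle (noisy but drift-free) into a single pitch estimate."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Toy data: constant true pitch of 10 degrees, biased gyro, noisy accelerometer.
dt, angle = 0.01, 0.0
for k in range(500):
    gyro = 0.5                                   # deg/s reading: pure bias, no rotation
    accel = 10.0 + 5.0 * math.sin(2.0 * k)       # noisy tilt estimate from accelerometer
    angle = complementary_filter(angle, gyro, accel, dt)
print(f"fused pitch estimate after 5 s: {angle:.1f} deg")
```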

This presentation will:
•    highlight the importance of sensing and measurement in the world of robotics
•    give an overview of the various sensors and sensing technologies that are in vogue in robotics
•    discuss future directions of research and development of sensors for robotics – MEMS, biological sensors etc.
•    illustrate case studies of advanced sensing and instrumentation in autonomous robots, such as a swarm of super intelligent Nano Quadrocopters, a robot to inspect plant health and growth in a laboratory, and a manually operated robot to move hospital beds.

This presentation will be informative for industry professionals and academics, and will enthuse engineers and students to take up a career in sensors and instrumentation.