Estimating the latency between network nodes in the Internet can significantly improve the performance of the many applications and services that use latency to make routing decisions. A popular example is peer-to-peer (P2P) networks, which need to build an overlay between peers in a manner that minimizes the delay of exchanging data among them. Latency between peers is therefore a critical parameter that directly affects the quality of applications such as video streaming, gaming, file sharing, content distribution, server farms, and massively multiuser virtual environments (MMVE) or massively multiplayer online games (MMOG). However, acquiring latency information requires a considerable number of measurements at each node if that node is to keep a record of its latency to all the other nodes. Moreover, the measured latency values are subject to frequent change, so the measurements need to be repeated regularly to keep up with network dynamics. This has motivated techniques that alleviate the need for a large number of empirical measurements and instead try to predict the entire network latency matrix from a small set of latency measurements. Coordinate-based approaches are the most popular solutions to this problem. The basic idea behind coordinate-based schemes is to model the latency between each pair of nodes as the virtual distance between those nodes in a virtual coordinate system.
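As an illustration of this idea, here is a minimal sketch of the spring-relaxation update popularized by Vivaldi, one of the coordinate systems covered later in this talk; the node coordinates, damping factor, and RTT values below are invented for the example. Each node repeatedly nudges its own coordinate so that the Euclidean distance to a neighbour approaches the measured round-trip time:

```python
import numpy as np

def vivaldi_update(x_i, x_j, rtt_ms, delta=0.25):
    """Move node i's coordinate so that ||x_i - x_j|| approaches the measured RTT.

    x_i, x_j : current coordinates of node i and neighbour j (numpy arrays)
    rtt_ms   : latency measured between i and j (milliseconds)
    delta    : damping factor controlling how far i moves per sample
    """
    diff = x_i - x_j
    dist = np.linalg.norm(diff) or 1e-9      # avoid division by zero
    error = rtt_ms - dist                    # positive: nodes are "too close" in the embedding
    direction = diff / dist                  # unit vector from j towards i
    return x_i + delta * error * direction   # push apart or pull together

# Toy usage: node i refines its 2-D coordinate from repeated measurements to neighbour j.
x_i, x_j = np.array([0.0, 0.0]), np.array([30.0, 40.0])   # initially 50 ms apart in the embedding
for _ in range(20):
    x_i = vivaldi_update(x_i, x_j, rtt_ms=80.0)            # measured RTT is 80 ms
print(np.linalg.norm(x_i - x_j))                           # converges towards ~80
```

Running many such updates against a handful of neighbours lets every node estimate its latency to all other nodes without ever measuring most of them directly.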
In this talk, we will cover the basics of how to measure latency in a distributed manner, without the need for a central server that would become a bottleneck. We will start with an introduction and background to the field, then briefly explain measurement approaches such as the Network Time Protocol, the Global Positioning System, and the IEEE 1588 Standard, before moving to coordinate-based measurement approaches such as GNP (Global Network Positioning), CAN (Content Addressable Network), Lighthouse, Practical Internet Coordinates (PIC), VIVALDI, and Pcoord. Finally, we propose a new decentralized coordinate-based solution with higher accuracy, mathematically proven convergence, and a locality-aware design for lower delay.
The target audience of this tutorial is practitioners, scientists, and engineers who work with networking systems and applications where there is a need to measure and estimate delay among network nodes, possibly a massive number of them (thousands, tens of thousands, or even hundreds of thousands of nodes).
A Massively Multiuser Virtual Environment (MMVE) sets out to create an environment in which thousands, tens of thousands, or even millions of users simultaneously interact with each other as in the real world. For example, Massively Multiuser Online Games (MMOGs), now a very profitable sector of the industry and the subject of academic and industrial research, are a special case of MMVEs in which hundreds of thousands of players simultaneously play games with each other. Although originally designed for gaming, MMOGs are now widely used for socializing, business, commerce, scientific experimentation, and many other practical purposes. One could say that MMOGs are the “killer app” that brings MMVEs into the realm of the eSociety. This is evident from the fact that virtual currencies such as the Linden dollar (L$) in Second Life are already being exchanged for real-world money. Similarly, virtual goods and virtual real estate are being bought and sold with real-world money. Massive numbers of users spend their time with their fellow players in online games such as EverQuest, Half-Life, World of Warcraft, and Second Life. World of Warcraft, for example, has over twelve million users, with a peak of over 500,000 players online at a given time. There is no doubt that MMOGs and MMVEs have the potential to be the cornerstone of any eSociety platform in the near future, because they bring the massiveness, awareness, and interpersonal interaction of real society into the digital realm.
In this talk, we focus on approaches for supporting the massive number of users in such environments, including scalability methods, zoning techniques, and area-of-interest management. The focus will be on networking and system support and architectures, as well as on the research challenges that remain to be solved.
The heart is a complex organic engine that converts chemical energy into work. Each heartbeat begins with an electrically released pulse of calcium, which triggers force development and cell shortening, at the cost of energy and oxygen and with the dissipation of heat. My group has developed new instrumentation systems to measure all of these processes simultaneously while subjecting isolated samples of heart tissue to realistic contraction patterns that mimic the pressure-volume-time loops experienced by the heart with each beat. These devices are, in effect, 'dynamometers' for the heart, allowing us to measure the performance of the heart and its tissues much as you might test the performance of your motor vehicle on a 'dyno.'
This demanding undertaking has required us to develop our own actuators, force transducers, heat sensors, and optical measurement systems. Our instruments make use of several different measurement modalities, which are integrated in a robotic, hardware-based real-time acquisition and control environment and interpreted with the aid of a computational model. In this way, we can now resolve (to within a few nanowatts) the heat released by living cardiac muscle fibers as they perform work at 37 °C.
Muscle force and length are controlled and measured to micronewton and nanometer precision by a laser interferometer, while the muscle is scanned in the view of an optical microscope equipped with a fluorescent calcium imaging system. Concurrently, the changing muscle geometry is monitored in 4D by a custom-built optical coherence tomograph, and the spacing of muscle proteins is imaged in real time by transmission-microscopy and laser-diffraction systems. Oxygen consumption is measured using fluorescence-quenching techniques.
Equipped with these unique capabilities, we have probed the mechano-energetics of failing hearts from rats with diabetes. We have found that the peak stress and peak mechanical efficiency of tissues from these hearts were normal, despite prolonged twitch duration. We have thus shown that the compromised mechanical performance of the diabetic heart arises from a reduced period of diastolic filling and does not reflect either diminished mechanical performance or diminished efficiency of its tissues. In another program of research, we have demonstrated that, despite claims to the contrary, dietary supplementation with fish oils has no effect on heart muscle efficiency. Neither of these insights was fully revealed until the development of this instrument.
Optical sensors and techniques are used widely in many areas of instrumentation and measurement. Optical sensors are often, conveniently, ‘non-contact’, and thus impose negligible disturbance on the parameter undergoing measurement. Valuable information can be represented and recorded in space, time, and optical wavelength. They can provide exceptionally high spatial and/or temporal resolution, high bandwidth, and wide range. Moreover, optical sensors can be inexpensive and relatively simple to use.
At the Bioinstrumentation Lab at the Auckland Bioengineering Institute, we are particularly interested in developing techniques for measuring parameters from inside and outside the body. Such measurements help us to quantify physiological performance, detect and treat disease, and develop novel medical and scientific instruments. In making such measurements we often draw upon, and develop, our own optical sensing and measurement methods – from interferometry, fluorimetry, and diffuse light imaging to area-based and volume-based optical imaging and processing techniques.
In this talk, I will give an overview of some of the interesting new optical methods that we have recently developed for use in bioengineering applications. These include: 1) diffuse optical imaging methods for monitoring the depth of a drug as it is rapidly injected through the skin without requiring a needle; 2) stretchy, soft optical sensors for measuring strains of up to several hundred percent during movement; 3) multi-camera image registration techniques for measuring the 3D shape and strain of soft tissues; 4) optical coherence tomography techniques for detecting the 3D shape of deforming muscle tissues; and 5) polarization-sensitive imaging techniques for classifying the optical and mechanical properties of biological membranes.
While these sensors and techniques have been motivated by applications in bioengineering, the underlying principles have broad applicability to other areas of instrumentation and measurement.
Nowadays, scientists, researchers, and practicing engineers face an unprecedented explosion in the richness and complexity of the problems to be solved. Besides spatial and temporal complexity, common tasks usually involve non-negligible uncertainty or even a lack of information; strict requirements concerning the timing, continuity, robustness, and reliability of outputs; and further expectations such as adaptivity and the capability to handle atypical and crisis situations efficiently.
Model-based computing plays an important role in achieving these goals because it integrates the available knowledge about the problem at hand, in a proper form, into the procedure to be executed, where it acts as an active component during operation. Unfortunately, classical modeling methods often fail to meet the requirements of robustness, flexibility, adaptivity, learning, and generalization. Even soft-computing-based models may fail to be effective enough because of their high (exponentially increasing) complexity. To satisfy the time, resource, and data constraints associated with a given task, hybrid methods and new approaches are needed for the modeling, evaluation, and interpretation of the problems and results. A possible solution to the above challenges is offered by the combination of soft computing techniques with the novel approaches of anytime and situational modeling and operation.
Anytime processing is very flexible with respect to the available input information, computational power, and time. It is able to generalize from previous input information and to provide a short response time when the required reaction time is significantly shortened by failures or an alarm in the modeled system, or when decisions have to be made before sufficient information arrives or the processing can be completed. The aim of the technique is to ensure continuous operation under (dynamically) changing circumstances and to provide optimal overall performance for the whole system. In case of a temporary shortage of computational power and/or loss of some data, operation is continued while maintaining the overall performance “at a lower price”, i.e., information processing based on algorithms and/or models of lower complexity provides outputs of acceptable quality so that the complete system can keep operating. The accuracy of the processing may become temporarily lower, but it is usually still sufficient to produce data for qualitative evaluations and to support further decisions.
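To make the core idea tangible, here is a minimal, hypothetical sketch (not taken from the talk) of an anytime computation: an iterative estimator that can be interrupted at any deadline and always returns its best result so far, with accuracy degrading gracefully as the time budget shrinks:

```python
import time

def anytime_pi(budget_s):
    """Leibniz-series estimate of pi that keeps refining until the deadline expires.

    budget_s : available computation time in seconds; a shorter budget simply
               yields a coarser (but still usable) best-so-far estimate.
    """
    deadline = time.monotonic() + budget_s
    estimate, k = 0.0, 0
    while time.monotonic() < deadline:            # interruptible main loop
        estimate += (-1) ** k * 4.0 / (2 * k + 1)
        k += 1
    return estimate, k                            # best-so-far result and amount of work done

# Under a tight deadline the answer is rougher but is still delivered on time.
print(anytime_pi(0.001))   # roughly a few thousand terms
print(anytime_pi(0.1))     # many more terms, better accuracy
```

The same contract (interrupt anywhere, always get a usable output) is what allows anytime models to keep a larger system running when computational power or data temporarily fall short.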
Situational modeling has been designed for the modeling and control of complex systems where traditional cybernetic models have not proved sufficient because the characterization of the system is incomplete or ambiguous due to unique, dynamically changing, and unforeseen situations. Typical cases are alarm situations, structural failures, the starting and stopping of plants, etc. The goal of situational modeling is to resolve the contradiction between the large number of possible situations and the limited number of processing strategies by grouping the possible situations into a treatable (finite) number of model classes of operational situations and by assigning certain processing algorithms to the defined processing regimes. This technique - similarly to anytime processing - offers a tradeoff between resource (including time and data) consumption and output quality.
The presentation gives an overview of the basics of the anytime and situational approaches. Besides summarizing theoretical results and pointing out the open questions that arise (e.g., accuracy measures, data interpretation, transients), the author illustrates some of the possibilities offered by these new techniques through successful applications taken from the fields of signal and image processing, control and fault diagnosis of plants, analysis, and expert systems. Some of the discussed topics are:
The scientific and industrial worlds have recently started to look again with interest at the basic rules for performing reliability, availability, and safety analysis and design of complex electro-mechanical systems. The main failure modes of electronic devices and sensors, as well as the main techniques for failure-mode investigation, are of interest in modern system design. Statistical characterization of the main probability density functions and of innovative degradation models is mandatory to build lasting and safe products. The main reliability design techniques, such as fault tree analysis, the cut set method, the minimal path approach, and critical block analysis for reliability, are requested by companies worldwide, as is knowledge of the main failure modes and of reliability databases and handbooks such as MIL-HDBK-217, OREDA, BELLCORE, etc. Maintenance policies, with special attention to corrective and preventive ones, are also affected by reliability design in terms of their advantages and disadvantages when applied to electro-mechanical systems. The main safety standards, such as IEC 61508, IEC 61511, EN 50129, EN 50128, and EN 50126, are usually considered in industrial design. The aim of this talk is to enable companies to develop in-house confidence in advanced modelling techniques for reliability, availability, and safe design. In this light, in addition to traditional and well-known statistical models, innovative modelling techniques based on statistical data representation will be introduced and tailored to specific case studies in the fields of bioinstrumentation, transportation, and oil & gas.
The convergence of healthcare, instrumentation and measurement technologies will transform healthcare as we know it, improving quality of healthcare services, reducing inefficiencies, curbing costs and improving quality of life. Smart sensors, wearable devices, Internet of Things (IoT) platforms, and big data offer new and exciting possibilities for more robust, reliable, flexible and low-cost healthcare systems and patient care strategies. These may provide value-added information and functionalities for patients, particularly for those with neuro-motor impairments.
In this talk the focus will be on: hardware and software infrastructure for neuro-motor rehabilitation; distributed instrumentation and communication standards; motor rehabilitation based on virtual reality and serious games; the use of cloud computing for healthcare monitoring; the use of mobile technologies for data storage and data communication related to patient care; wearable sensor network integration with unobtrusive sensing technologies; Internet of Things technologies; data processing and data presentation that may assist healthcare professionals in the objective, accurate assessment of patients’ motor activity and health status during daily activities; systems that support the personalization of healthcare; and systems that promote independent living and empower individuals and their families for self-care and healthcare management.
Technologies for the unobtrusive measurement of patient posture and balance, muscle activation, and movement characteristics during neuro-motor rehabilitation will be presented and discussed during the talk. As part of these interactive environments, 3D image sensors for natural user interaction with rehabilitation scenarios and for remote sensing of user movement, represented by the Leap Motion Controller and Kinect, as well as a thermographic camera for muscle activity evaluation, will be presented. Instrumented everyday rehabilitation equipment, such as smart walkers and crutches, force platforms, and wearable motor activity monitors based on smart sensors embedded in clothes and accessories, with electromyography (EMG), force, and acceleration measurement capabilities for muscular activity monitoring, will be presented and discussed. Sensing technologies that form part of smart tailored environments, such as piezo-resistive force sensors, e-textile EMG, microwave Doppler radar, MEMS inertial devices for motion measurement, and optical fiber sensors, will be presented in the context of IoT technologies, where RFID is used for smart object identification and localization in augmented reality scenarios for therapy. Challenges related to simple and secure connectivity, signal processing, data storage, the risk of data loss, data representation, and data analysis, including the development of specific metrics that can be used to evaluate the progress of patients during the rehabilitation process, will be discussed. Additional remote sensing technologies, including thermography for evaluating training effectiveness, will also be considered.
A network of physical things/objects, as part of a smart environment, based on sensors and embedded platforms with Internet connectivity, will collect and exchange data on monitored subjects undergoing physical rehabilitation, which may also involve the use of serious games based on virtual and augmented reality. Training using these technologies may improve patients’ rehabilitation outcomes, may allow objective evaluation of rehabilitation progress and early communication among health professionals and between health professionals and their patients, and may also support research based on the analysis of big data.
The world’s population is ageing fast. According to the United Nations, the median age across all countries will rise from 28 now to 38 by 2050. It is also estimated that by 2050 the share of the population over 60 years of age will increase worldwide from 11% to 22%, with a higher percentage (33%) of elderly population in developed countries. In this context, governments and private investors, in addition to working to increase the efficiency and quality of healthcare, are searching for sustainable solutions to prevent the increased healthcare expenditure associated with the higher care demands of elderly people. To this end, instrumented environments, pervasive computing, the deployment of a seemingly invisible infrastructure of various wired and/or wireless communication networks, and intelligent, real-time interactions between different players, such as health professionals, informal caregivers, and the people being assessed, are being created and developed in various research institutions and healthcare systems.
This presentation reviews recent advances in the development of sensing solutions for vital sign and daily activity monitoring. The following will be highlighted:
- Vital sign acquisition and processing by devices embedded in clothes and/or accessories (e.g., smart wrist-worn devices) or in walking aids and transportation equipment such as walkers or manual wheelchairs. The strengths and drawbacks regarding cardiac and respiratory assessment capabilities, and studies on cardiac sensing accuracy estimation and on the influence of artefacts on cardiac function sensing through capacitively coupled electrocardiography, electromechanical film sensors, microwave Doppler radar ballistocardiography, and reflective photoplethysmography, will be discussed. Blood pressure, heart rate variability, and autonomic nervous system activity estimation based on virtual sensors included in wearable or object-embedded devices will also be presented.
- Daily activity signal acquisition and processing through microwave motion sensors, MEMS inertial measurement units, infrared multi-point sensors, and laser motion sensors. The acquisition and conditioning of signals for motion assessment, and theragames based on motion sensing and recognition, will be presented. Using a set of metrics calculated from the information delivered by the unobtrusive motion-capture sensors, an objective evaluation of the effectiveness of a rehabilitation session can be performed. Several methods for diagnosis and therapy monitoring, such as time-frequency analysis, principal component analysis, and pattern recognition of motion signals, with application to gait rehabilitation evaluation, will be described. The work under the project Electronic Health Record for Physiotherapy, promoted by Fundação para Ciência e Tecnologia, Portugal, on developing serious games for physiotherapy based on Kinect technology will be presented.
Concerning the embedded processing, communication, and interoperability requirements for smart sensing devices, a critical analysis of existing solutions and proposed innovative solutions is presented. Special attention is given to wireless sensor networks, M2M, and IoT, as well as to ubiquitous computing, particularly smartphone applications for healthcare. The presentation also includes a fast-prototyped vital sign and motor activity monitor and the use of the IEEE 1451.X smart sensor standards for biomedical applications.
The creation of novel smart environments, including remote vital sign and motor activity monitoring devices for health monitoring and physiotherapy interventions, promotes preventive, personalized, and participative medicine, for example in-home rehabilitation, which can provide more comfort to patients, better treatment efficiency, shorter recovery periods, and lower healthcare costs. The use of unobtrusive smart sensing and pervasive computing for health monitoring and physiotherapy interventions allows better assessment and communication between health professionals and clients, and increases the likelihood of developing and adopting best practice based on recognized research-based techniques and technologies and on sharing knowledge and expertise.
The tutorial will focus on sensor and measurement systems for new generations of vehicles with driver-assistance and autonomous capabilities. This is the main trend that is revolutionizing vehicles and the mobility of people and goods, and it is also making our cities smarter. The economic and social impacts of this application field are huge. Worldwide, 90 million vehicles are sold every year, but 1.25 million people are killed in road accidents. In the US, 3.1 billion gallons of fuel are wasted due to traffic congestion. Assisted and autonomous driving aim at increasing safety, at improving fuel efficiency and our lifestyle by avoiding traffic congestion, and at ensuring mobility for elderly and disabled people (inclusivity). The interest in this research subject is demonstrated by the huge investments of companies like Google, Intel, Tesla, Uber, Ford, and GM, to name just a few, and by technology alliances, e.g., between BMW and Intel, planning autonomous cars for 2021. A convergence between the automotive and ICT/electronics industries is foreseen in the near future. An example of this convergence is the 5G Automotive Association, http://www.5gaa.org/, which includes all the main car manufacturers, telecom service providers, electronics industries, and measurement system providers (Keysight, Rohde&Schwarz).
The key enabling technologies in this scenario are the sensing and measurement systems needed for accurate vehicle positioning and navigation, for vehicle context-awareness, obstacle detection, and collision avoidance, and for driver assistance (enhanced vision, driver attention and fatigue detection).
The lecture will be divided into multiple sections.
First, in the Introduction, innovation and market trends in the field of sensor and measurement technologies applied to vehicles and smart mobility systems will be discussed, focusing on the next generation of driver-assisted/autonomous vehicles.
Then, new Radar and Lidar systems, appearing on board vehicles alongside arrays of imaging cameras, will be discussed for the measurement of obstacle position, distance, and relative speed. A trade-off has to be found between the power and size of active sensing systems like Radar and Lidar and their maximum measurement range. Moreover, in continuous-wave Radars, the limited frequency sweep range and the limited number of TX/RX channels limit the achievable resolution in distance, direction-of-arrival, and speed measurements, as summarized below. Examples of an X-band mobility surveillance Radar and a mm-wave automotive Radar will be provided.
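For context, the standard FMCW Radar relations behind these limits can be written as follows; the symbols are generic and are not parameters of the specific systems discussed in the lecture (B is the frequency sweep bandwidth, T_obs the coherent observation time, λ the carrier wavelength, N the number of receive channels of a uniform array with element spacing d):

```latex
\Delta R \approx \frac{c}{2B}, \qquad
\Delta v \approx \frac{\lambda}{2\,T_{\mathrm{obs}}}, \qquad
\Delta\theta \approx \frac{\lambda}{N\, d \cos\theta}
```

For instance, a 1 GHz sweep limits range resolution to roughly 15 cm, so improving resolution requires a wider sweep, a longer observation time, or more (virtual) receive channels.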
On the other hand, MOEMS (micro-opto-electro-mechanical systems)-based scanning systems, used to reduce the size and cost of Lidars, introduce distortions that worsen the accuracy of light-based measurements. Distortions due to fish-eye lenses, used to enlarge the field of view, also degrade the measurement performance of imaging sensors. Techniques to mitigate such artifacts will be discussed.
Practical examples of traffic and road sign recognition systems and of image mosaicking for an all-around view will be discussed. In addition, Lidar and imaging cameras suffer from decreased measurement performance under harsh operating conditions (e.g., bad weather or poor lighting).
New biometric sensing and measurement systems will also be reviewed, such as Radar-based contactless heart/breath-rate measurement and smart steering wheels for skin temperature/galvanic-response measurement or heart-rate detection, with the final aim of detecting the driver’s attention level or health status.
Concerning on-board sensors for positioning and navigation, recent advances in MEMS accelerometers and gyroscopes will be discussed. A careful analysis will be carried out of the position and navigation errors they cause, due to their bias and random-walk output noise.
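A minimal sketch of why these sensor imperfections matter (the numerical parameter values below are illustrative only, not measurements from the lecture): a constant accelerometer bias integrates twice into a position error that grows quadratically with time, while gyroscope angle random walk grows with the square root of time:

```python
import math

def dead_reckoning_errors(t_s, accel_bias_mps2=0.02, arw_deg_per_sqrt_h=0.2):
    """Rough open-loop error growth for a MEMS IMU (illustrative parameter values).

    t_s                : elapsed time since the last position fix, in seconds
    accel_bias_mps2    : uncompensated accelerometer bias (m/s^2)
    arw_deg_per_sqrt_h : gyroscope angle random walk (deg/sqrt(hour))
    """
    position_error_m = 0.5 * accel_bias_mps2 * t_s ** 2                 # double integration of a constant bias
    heading_error_deg = arw_deg_per_sqrt_h * math.sqrt(t_s / 3600.0)    # random-walk growth of heading uncertainty
    return position_error_m, heading_error_deg

# After 60 s without external aiding (e.g. GNSS), the drift already exceeds lane-level accuracy.
print(dead_reckoning_errors(60.0))   # ~36 m position error, ~0.026 deg heading uncertainty
```

This kind of back-of-the-envelope propagation explains why MEMS inertial sensors must be fused with other positioning sources in driver-assisted and autonomous vehicles.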
Finally, the lecture will analyze the trend in computing platforms, where parallel architectures and machine learning/AI (artificial intelligence) techniques will be exploited to manage, in real time, many heterogeneous sources of measurements and to make autonomous decisions.
Suggestions for future directions of interest for the I&M Society, and references to recent publications in I&M Society journals and conferences in the field of automated and connected vehicles, will be provided as a conclusion.
This three-part talk series deals with challenging problems of the modern hi-tech manufacturing industry (electronic and memory products), namely (1) Screening for Reliability, (2) Detecting Systematic Defects, and (3) Test for Yield Learning. The offered solutions conform to the systematic, data-driven Six Sigma methodology, which is based on setting extremely high objectives and on collecting and deeply analysing comprehensive production data, with the aim of driving defects down to the level at which at least six standard deviations fit between the process mean and the nearest specification limit. In particular, the following successfully developed, real-world, industry-originated projects will be discussed and generalised for extended implementation and application of the obtained results:
1. Eliminating the Burn-in Bottleneck in IC Manufacturing
Reliability screening is one of several types of testing performed at different stages of the IC manufacturing process. It plays an important role in controlling and ensuring the quality and consistency of integrated circuits. One of the most widely used forms of reliability test is burn-in testing (i.e., accelerated testing performed under elevated temperature and other stress conditions). Burn-in is normally associated with a long test time and high cost. As a result, burn-in testing is often a bottleneck of the entire IC manufacturing process, limiting its throughput. It is no surprise, therefore, that much attention and effort have been dedicated to the possible reduction or even elimination of burn-in testing.
This presentation offers a step-by-step methodology for reducing burn-in test time by up to 90%, based on the extended use of the High-Voltage Stress Test (HVST) technique. Weibull statistical analysis is used to model the infant mortality failure distribution.
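As a hedged illustration of the kind of Weibull-based reasoning involved (the parameter values below are invented for the example and are not the production data from the project): an infant-mortality population has a Weibull shape parameter below one, so its hazard rate falls with time, and the fraction of weak parts removed by a given burn-in (or equivalent stress) duration follows directly from the Weibull CDF:

```python
import math

def weibull_cdf(t, beta, eta):
    """Fraction of a Weibull(beta, eta) population that has failed by time t (t and eta in hours)."""
    return 1.0 - math.exp(-((t / eta) ** beta))

# Illustrative infant-mortality parameters: shape beta < 1 means a decreasing hazard rate.
beta, eta = 0.5, 2000.0
full_burn_in = weibull_cdf(48.0, beta, eta)      # weak-part fraction caught by a 48 h burn-in
short_burn_in = weibull_cdf(4.8, beta, eta)      # coverage retained if burn-in time is cut by 90%
print(full_burn_in, short_burn_in, short_burn_in / full_burn_in)
```

Comparing such coverage figures against the acceleration provided by an alternative stress (e.g., HVST) is the type of trade-off the methodology quantifies.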
2. Defect Cluster Recognition for Fabricated Semiconductor Wafers
Many systematic failures in wafer fabrication (the so-called frontend process) can only be caught during IC manufacturing (i.e., during the backend process). Thus, there is a need for a simple yet accurate system that performs wafer defect cluster analysis based on fast knowledge extraction from production test data. The talk will cover the design and development of an automation tool to carry out this task - the Automatic Defect Cluster Analysis System (ADCAS). It is aimed at supporting backend-initiated efforts such as defect root-cause identification, die-level neighbourhood analysis, and yield analysis and improvement. It is suitable for plug-and-play application on semiconductor production databases while providing an excellent trade-off between simplicity of implementation and high accuracy of analysis.
3. Automatic Media Inspection in Magnetic Memory Drive Production
In the modern high-volume hard disk drive production process, if an assembled product fails the final test it is normally not discarded; instead, it is sent for so-called Teardown, where it is disassembled into its constituent components. These components are thoroughly examined and retested for their individual functionality. If found to be in good operational condition, they are redeployed in new products. To retest the magnetic disk (or media), Laser Doppler Vibrometry (LDV) has traditionally been employed. Unfortunately, the LDV test is normally lengthy, causing a bottleneck in the Teardown and thus reducing the overall manufacturing efficiency. To address the problem, manual visual inspection is often performed as a preliminary filtering step. Such an arrangement is not optimal, as it is open to human error; it can still be costly and has throughput limitations.
In this part of the talk series, the factors influencing the successful and rapid image acquisition of micrometer-level defects on a specular surface are explored, namely camera spatial resolution, spectral properties, imaging system signal-to-noise ratio, and lighting methods. A detection and classification scheme is offered for four major types of commonly occurring media defects.
Advanced sensing is a prerequisite for realizing intelligent manufacturing. Sensors monitor production operations in real time, often in harsh environments, and provide input for diagnosing the root cause of quality degradation and fault progression, so that corrective measures can be formulated and executed online to control a machine’s deviation from its optimal state. With the increasing convergence among measurement science, information technology, wireless communication, and system miniaturization, sensing has continually expanded the contribution of mechatronics to intelligent manufacturing, enabling functionalities in in-situ state monitoring and process control that were not feasible before. New sensors not only acquire higher-resolution data at faster rates but also provide local computing resources for autonomously analyzing the acquired data for intelligent decision support.
This talk presents research on advanced sensing for improved observability in manufacturing process monitoring, using polymer injection molding and sheet metal micro rolling as two examples. The design, characterization, and realization of multivariate sensing and acoustic-based wireless data transmission techniques in an RF-shielded environment are first introduced. Next, computational methods for solving an ill-posed problem in data reconstruction are described. The talk highlights the significance of advanced sensing and data analytics for advancing the science base and the state of the technology to fully realize the potential of intelligent manufacturing.
Optical sensors and photonic devices have technically matured to the point that they are increasingly considered as alternatives to their electronic counterparts in numerous applications across industry. In particular, the use of optical sensors has been considered for harsh, high-voltage, or explosive environments where conventional transducers are difficult to deploy or where their operation is compromised by electromagnetic interference.
This prospective talk will explain the motivation for research on fiber-optic sensors, highlight the basic theories underlying their operation, and present selected examples of R&D projects carried out within the Advanced Sensors Team in the Institute for Energy and Environment at the University of Strathclyde, Glasgow, UK, targeting a range of industrial applications. The goal is to highlight the great potential of optical sensors and to enrich the audience’s experience in instrumentation and measurement using alternative, non-electronic methods.
Alternatively, for audiences with greater awareness of photonic sensors, the presentation can be tailored to focus solely on the most recent progress in fiber sensing research for the power and energy industries carried out within the team. In this instance, it will highlight specific examples of measurement needs within the power and energy sectors and report on novel approaches in fiber sensing to address these needs. In particular, it will illustrate such applications as downhole and subsea electrical plant monitoring; voltage and current measurement for power system metering and protection in the context of distributed generation; force and magnetic field monitoring in the context of thermonuclear fusion research; and measurement of the loss of loading within concrete prestressing steel tendons in nuclear power plant applications. As potential solutions to these measurement needs, the talk will introduce such emerging technologies as hybrid fiber Bragg grating (FBG) voltage and current sensors; novel solid-state FBG interrogation schemes utilizing wavelength-division multiplexing (WDM) and time-domain multiplexing (TDM) architectures (not requiring tunable spectral filters or lasers); and novel FBG sensors and interrogation schemes exploiting promising intrinsic sensing mechanisms capable of measuring quantities such as magnetic and electric fields or bend.
Electrical capacitance tomography (ECT) is an imaging technique for industrial applications. ECT is based on measuring capacitances from a multi-electrode capacitance sensor and reconstructing cross-sectional images, aiming to visualise the distribution of dielectric materials, such as gas/oil flows in an oil pipeline and the gas/solids distribution in a fluidised bed. This internal information is valuable for understanding complicated phenomena, verifying computational fluid dynamics (CFD) models, and measuring and controlling industrial processes, all of which are difficult with conventional process instruments. Compared with other tomography modalities, ECT is the most mature and offers the advantages of no radiation, rapid response, non-intrusive and non-invasive operation, tolerance of high temperature and high pressure, and low cost.
Research into ECT involves sensor and electronic circuit design, data acquisition, computer interfacing, mathematics, finite element analysis, software programming, and general knowledge of process engineering. Because of the extremely small capacitances to be measured (down to 0.0001 pF) and the soft-field nature of the problem, ECT presents challenges in both engineering and mathematics. The University of Manchester (formerly UMIST) pioneered research into ECT. The latest ACECT system represents the state of the art: it can generate on-line images at 100 frames per second with 73 dB signal-to-noise ratio (SNR) and has been used for many challenging industrial applications, such as gas-oil-water flows in oil pipelines, wet gas separators, pneumatic conveyors, cyclone separators, and fluidised bed dryers. It is foreseen that ECT will make major contributions to the gas/oil, pharmaceutical, power, and other industries. In this lecture, the principle of ECT, capacitance measuring circuits, image reconstruction algorithms, and some applications will be discussed, together with a demonstration of an ACECT system.
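To make the reconstruction step concrete, here is a minimal sketch of linear back-projection (LBP), the classic baseline algorithm for ECT image reconstruction; the sensitivity matrix and measurement vector below are random placeholders, whereas in practice they come from finite element modelling of the sensor and from the capacitance measuring circuit:

```python
import numpy as np

def linear_back_projection(c_norm, S):
    """Linear back-projection (LBP) image reconstruction for ECT.

    c_norm : normalised capacitance measurements, one value per electrode pair (length M)
    S      : sensitivity matrix, M electrode pairs x N image pixels
    Returns a normalised permittivity distribution over the N pixels.
    """
    g = S.T @ c_norm                       # back-project each measurement through its sensitivity map
    g /= S.T @ np.ones(len(c_norm))        # normalise by the summed sensitivity at each pixel
    return np.clip(g, 0.0, 1.0)            # keep grey levels in the physical [0, 1] range

# Placeholder data: a 12-electrode sensor gives 12*11/2 = 66 independent capacitance pairs.
M, N = 66, 32 * 32
S = np.abs(np.random.rand(M, N))           # stand-in for FEM-derived sensitivity maps
c_norm = np.random.rand(M)                 # stand-in for normalised measured capacitances
image = linear_back_projection(c_norm, S).reshape(32, 32)
print(image.shape)
```

LBP is fast enough for on-line imaging but gives blurred images; the more accurate iterative algorithms discussed in the lecture refine this initial estimate at higher computational cost.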