Search results for: location based alarm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29017

26047 Network Analysis and Sex Prediction Based on a Full Human Brain Connectome

Authors: Oleg Vlasovets, Fabian Schaipp, Christian L. Mueller

Abstract:

We conduct a network analysis and predict the sex of 1,000 participants based on a “connectome”: pairwise Pearson correlations across 436 brain parcels. We solve the non-smooth convex optimization problem known as the Graphical Lasso, where the solution includes a low-rank component. With this solution and a machine learning model for sex prediction, we explain the connectivity patterns linking brain parcels to sex.
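
The abstract includes no code; as a rough illustration, the sparse precision estimate and a downstream classifier can be sketched with scikit-learn. Sizes are reduced for speed, the features are a plain placeholder for the paper's connectome-derived features, and the low-rank component of the paper's solver is not reproduced:

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy stand-in for the data: participants x parcels (sizes reduced for speed).
rng = np.random.default_rng(0)
signals = rng.standard_normal((200, 50))
sex = rng.integers(0, 2, size=200)          # 0/1 labels

# Sparse inverse-covariance (conditional-dependence network) over parcels.
glasso = GraphicalLassoCV().fit(signals)
precision = glasso.precision_               # (50, 50) network estimate

# Plain logistic regression on the parcel signals as a placeholder for the
# paper's connectome-derived features.
scores = cross_val_score(LogisticRegression(max_iter=1000), signals, sex, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```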

Keywords: network analysis, neuroscience, machine learning, optimization

Procedia PDF Downloads 137
26046 Approach for Demonstrating Reliability Targets for Rail Transport during Low Mileage Accumulation in the Field: Methodology and Case Study

Authors: Nipun Manirajan, Heeralal Gargama, Sushil Guhe, Manoj Prabhakaran

Abstract:

In the railway industry, train sets are designed to contractual requirements (mission profile), where reliability targets are measured in terms of mean distance between failures (MDBF). However, at the beginning of revenue service, trains often do not achieve the designed mission profile distance (mileage) within the timeframe, due to infrastructure constraints, scarcity of commuters, or other operational challenges, thereby departing from the original design inputs. Since the trains do not run sufficiently and do not achieve the designed mileage within the specified time, the car builder risks failing to achieve the contractual MDBF target. This paper proposes a constant-failure-rate-based model for situations where the design mission profile mileage is not accumulated. The model provides an appropriate MDBF target to be demonstrated based on the actual accumulated mileage. A case study of rolling stock running in the field is undertaken to analyze the failure data and the MDBF target demonstration during low mileage accumulation. The results of the case study show that, with the proposed method, reliability targets are achieved under low mileage accumulation.
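
Under a constant failure rate, the textbook distance-terminated demonstration bound can be written down directly. The sketch below is that standard chi-squared bound, not necessarily the paper's exact appropriation formula:

```python
from scipy.stats import chi2

def mdbf_lower_bound(accumulated_km, failures, confidence=0.90):
    """One-sided lower confidence limit on MDBF for a constant failure rate.

    Standard distance-terminated demonstration (an assumption; the paper's
    exact formulation is not given in the abstract).
    """
    dof = 2 * failures + 2
    return 2.0 * accumulated_km / chi2.ppf(confidence, dof)

# Example: 150,000 km accumulated with 3 chargeable failures.
print(f"Demonstrated MDBF >= {mdbf_lower_bound(150_000, 3):.0f} km at 90% confidence")
```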

Keywords: mean distance between failures, mileage-based reliability, reliability target appropriations, rolling stock reliability

Procedia PDF Downloads 255
26045 Fault-Tolerant Predictive Control for Polytopic LPV Systems Subject to Sensor Faults

Authors: Sofiane Bououden, Ilyes Boulkaibet

Abstract:

In this paper, a robust fault-tolerant predictive control (FTPC) strategy is proposed for systems with linear parameter varying (LPV) models and input constraints subject to sensor faults. Generally, virtual observers are used to improve observation precision and to reduce the impact of sensor faults and uncertainties on the system. However, this type of observer lacks certain system measurements, which substantially reduces its accuracy. To deal with this issue, a real observer is designed based on the virtual observer, and consequently a real observer-based robust predictive control is designed for polytopic LPV systems. Moreover, the proposed observer ensures that all system states and sensor faults are estimated. As a result, and based on both observers, a robust fault-tolerant predictive control is established via the Lyapunov method, where sufficient conditions for stability analysis and control purposes are given in linear matrix inequality (LMI) form. Finally, simulation results are given to show the effectiveness of the proposed approach.
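
The LMI machinery behind such Lyapunov-based conditions can be shown in miniature. The sketch below checks a plain discrete-time Lyapunov LMI with CVXPY on a toy system; it is far simpler than the paper's polytopic FTPC conditions:

```python
import cvxpy as cp
import numpy as np

# Toy stable discrete-time system; the paper's LPV/FTPC LMIs are richer, but
# the Lyapunov feasibility test below shows the LMI machinery in miniature.
A = np.array([[0.9, 0.2],
              [0.0, 0.7]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),                      # P positive definite
               A.T @ P @ A - P << -eps * np.eye(n)]       # V(x) = x'Px decreases

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("LMI feasible:", prob.status == cp.OPTIMAL)
```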

Keywords: linear parameter varying systems, fault-tolerant predictive control, observer-based control, sensor faults, input constraints, linear matrix inequalities

Procedia PDF Downloads 192
26044 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering

Authors: Sara Hasani

Abstract:

This research focuses on natural sudden-onset disasters, characterised as ‘occurring with little or no warning and often causing excessive injuries far surpassing the national response capacities’. Based on a panel analysis of the historic record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to predict the human impact of a disaster (fatalities, injured, homeless) with less than 3% error. The geographical dispersion of the disasters includes every country where data were available, cross-examined from various humanitarian sources. The records were then filtered to those disasters for which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure combines pattern recognition techniques and rule-based clustering for prediction, with discriminant analysis to validate the results further. The results indicate a relationship between a disaster's human impact and the five socio-economic characteristics of the affected country mentioned above. As a result, a framework was put forward that can predict a disaster's human impact based on its severity rank in the early hours of a disaster strike. The predictions in this model are outlined as worst- and best-case scenarios, which respectively inform the lower and upper ranges of the prediction. The need for such a predictive framework is highlighted by the fact that, despite existing research in the literature, a framework for predicting the human impact and estimating needs at the time of a disaster has yet to be developed. It can further be used to allocate resources in the response phase of a disaster, when data are scarce.
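
A minimal classifier over the five predictors can be sketched with scikit-learn; the records below are invented, and the paper's actual combination of pattern recognition and rule-based clustering is not reproduced:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Hypothetical records with the paper's five predictors and a severity label.
df = pd.DataFrame({
    "disaster_type": ["flood", "earthquake", "storm", "flood"],
    "hdi": [0.71, 0.55, 0.80, 0.62],
    "dri": [5.2, 7.9, 3.1, 6.4],
    "population": [5e6, 2e7, 8e6, 1e7],
    "pop_density": [120, 340, 90, 210],
    "severity_rank": [1, 3, 0, 2],
})

prep = ColumnTransformer([("type", OneHotEncoder(), ["disaster_type"])],
                         remainder="passthrough")
model = Pipeline([("prep", prep), ("clf", RandomForestClassifier(random_state=0))])
model.fit(df.drop(columns="severity_rank"), df["severity_rank"])
```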

Keywords: disaster management, natural disaster, pattern recognition, prediction

Procedia PDF Downloads 146
26043 Heritage Landmark of Penang: Segara Ninda, a Mix of Culture

Authors: Normah Sulaiman, Yong Zhi Kang, Nor Hayati Hussain, Abdul Rehman Khalid

Abstract:

Segara Ninda was owned by Din Ku Meh, the governor of Satul province, a Malay man who played a major role liaising with Thailand. The mansion is part of the legacy he left behind, among other properties in George Town, Penang. The island's strategic geographical location placed it on important trade routes to Europe, the Middle East, India and China in the past. For this reason, various architectural styles were introduced in Penang; the Late Straits Eclectic style is one form of colonial architecture widely seen in the vernacular shophouses of George Town. Segara Ninda stands amid a mixture of nouveau-riche, historical and heritage sites on Penang Road, one of the city's most important streets, dating back to the late 18th century. This paper examines the Straits Eclectic style that Segara Ninda embodies. Acknowledging the mixture of colonial architecture in George Town, we argue that the mansion faces challenging conservation issues that must be addressed. This is reflected by analysing its spatial layout, the quality of its visual elements, and its use, through interviews with the occupants of the mansion. The focus is on understanding building form, features and functions, respecting the architectural spaces and their activity. The methodology promotes our understanding of the mix of cultures the mansion holds through documentation, observation and measuring exercises, offering a positional interpretation of that cultural mix. This conservation effort will further contribute exposure to the public and recognition in society, as the mansion's distinctive character is otherwise missing from the existing built environment.

Keywords: eclectic, heritage, spatial organization, culture

Procedia PDF Downloads 170
26042 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Model

Authors: Alam Ali, Ashok Kumar Pathak

Abstract:

Path analysis is a statistical technique used to evaluate the strength of the direct and indirect effects of variables. One or more structural regression equations are used to estimate a series of parameters in order to find a better fit to the data. Sometimes exogenous variables do not show significant direct and indirect effects when the assumptions of classical regression (ordinary least squares, OLS) are violated by the nature of the data. The main aim of this article is to investigate the efficacy of the copula-based regression approach over the classical regression approach, and to calculate the direct and indirect effects of variables when the data violate the OLS assumptions and the variables are linked through an elliptical copula. We perform this study using a well-organized numerical scheme. Finally, a real-data application is presented to demonstrate the superiority of the copula approach.
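
The classical OLS baseline that the copula approach is compared against can be sketched in a few lines: the direct effect and the product-of-coefficients indirect effect on a toy mediation path are computed below (the copula-based variant itself is not reproduced):

```python
import numpy as np
import statsmodels.api as sm

# Toy mediation path X -> M -> Y with a direct X -> Y path.
rng = np.random.default_rng(1)
x = rng.normal(size=500)
m = 0.6 * x + rng.normal(size=500)
y = 0.3 * x + 0.5 * m + rng.normal(size=500)

a = sm.OLS(m, sm.add_constant(x)).fit().params[1]            # X -> M
fit_y = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()
direct, b = fit_y.params[1], fit_y.params[2]                 # X -> Y, M -> Y

print(f"direct effect   = {direct:.3f}")
print(f"indirect effect = {a * b:.3f}")   # product-of-coefficients rule
```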

Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique

Procedia PDF Downloads 62
26041 Reed: An Approach Towards Quickly Bootstrapping Multilingual Acoustic Models

Authors: Bipasha Sen, Aditya Agarwal

Abstract:

A multilingual automatic speech recognition (ASR) system is a single entity capable of transcribing multiple languages that share a common phone space. The performance of such a system is highly dependent on the compatibility of the languages. State-of-the-art speech recognition systems are built using sequential architectures based on recurrent neural networks (RNNs), which limit computational parallelization in training. This poses a significant challenge in terms of the time taken to bootstrap and validate the compatibility of multiple languages when building a robust multilingual system. Complex architectural choices based on self-attention networks have been made to improve parallelization and thereby reduce training time. In this work, we propose Reed, a simple system based on 1D convolutions that uses very short context to improve training time. To improve the performance of our system, we use raw time-domain speech signals directly as input. This enables the convolutional layers to learn feature representations rather than relying on handcrafted features such as MFCCs. We report training and inference times improved by factors of at least 4x and 7.4x, respectively, with comparable WERs against standard RNN-based baseline systems on SpeechOcean's multilingual low-resource dataset.
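
A short-context 1D-convolutional encoder over raw waveform can be sketched as follows; the layer counts, kernel sizes and strides are illustrative, not the Reed architecture:

```python
import torch
import torch.nn as nn

# Minimal sketch of a short-context 1D-convolutional acoustic model over raw
# waveform (layer sizes are illustrative assumptions, not the paper's Reed).
class TinyConvASR(nn.Module):
    def __init__(self, n_phones=100):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=11, stride=5), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv1d(128, 128, kernel_size=3, stride=1), nn.ReLU(),
        )
        self.head = nn.Conv1d(128, n_phones, kernel_size=1)   # per-frame logits

    def forward(self, wav):                 # wav: (batch, samples)
        h = self.encoder(wav.unsqueeze(1))
        return self.head(h).transpose(1, 2)  # (batch, frames, phones) for CTC

logits = TinyConvASR()(torch.randn(2, 16000))   # one second at 16 kHz
print(logits.shape)
```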

Keywords: convolutional neural networks, language compatibility, low resource languages, multilingual automatic speech recognition

Procedia PDF Downloads 109
26040 Cryptography Based Authentication Methods

Authors: Mohammad A. Alia, Abdelfatah Aref Tamimi, Omaima N. A. Al-Allaf

Abstract:

This paper reviews a comparison study of the most commonly used authentication methods, some of which are based on cryptography. In this study, we outline the main cryptographic services and present a detailed discussion of the authentication service, since authentication methods fall into several categories. The study also gives real-life examples for each of the authentication methods, covering the simplest methods as well as the available biometric methods such as voice, iris, fingerprint, and face authentication.
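
As one concrete instance of the cryptography-based methods such a survey covers, a minimal HMAC challenge-response exchange is sketched below (an illustrative example, not drawn from the paper):

```python
import hashlib
import hmac
import secrets

# Illustrative challenge-response authentication with a pre-shared key.
shared_key = secrets.token_bytes(32)            # provisioned out of band

def make_challenge() -> bytes:                  # server side
    return secrets.token_bytes(16)

def respond(key: bytes, challenge: bytes) -> bytes:   # client side
    return hmac.new(key, challenge, hashlib.sha256).digest()

challenge = make_challenge()
response = respond(shared_key, challenge)
expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
print("authenticated:", hmac.compare_digest(response, expected))
```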

Keywords: information security, cryptography, system access control, authentication, network security

Procedia PDF Downloads 458
26039 Numerical Simulation of Axially Loaded to Failure Large Diameter Bored Pile

Authors: M. Ezzat, Y. Zaghloul, T. Sorour, A. Hefny, M. Eid

Abstract:

The ultimate capacity of large diameter bored piles is usually determined from pile loading tests, as recommended by several international codes and foundation design standards. However, loading piles of this type to apparent failure is rare in practice. In this paper, numerical analyses are carried out to simulate a load test of a large diameter bored pile performed at the location of the Alzey highway bridge project (Germany). Test results for the pile load-settlement relationship up to failure, as well as for the base and shaft resistances, are available. Apparent failure was indicated in this test by the significant increase in settlement during the last load increment applied to the pile head. Measurements from this pile load test are used to assess the quality of the numerical models investigated. Three different soil material models are implemented in the analyses: Mohr-Coulomb (MC), Soft Soil (SS), and Modified Mohr-Coulomb (MMC). Very good agreement is obtained between the field-measured settlement and the settlement calculated using the MMC model. The analysis also showed that the MMC constitutive model is superior to the MC and SS models in predicting the ultimate base and shaft resistances of the large diameter bored pile. After calibrating the numerical model, the behavior of large diameter bored piles under axial loads is discussed and the formation of the plastic zone around the pile is explored. The results show that the plastic zone below the base of the pile at failure extended laterally to about four times the pile diameter and vertically to about three times the pile diameter.

Keywords: ultimate capacity, large diameter bored piles, plastic zone, failure, pile load test

Procedia PDF Downloads 137
26038 Configuration Design and Optimization of the Movable Leg-Foot Lunar Soft-Landing Device

Authors: Shan Jia, Jinbao Chen, Jinhua Zhou, Jiacheng Qian

Abstract:

Lunar exploration is a necessary foundation for deep-space exploration. Given the functional limitations of the fixed landers in wide use today, and the unavoidable path repeatability of the wheeled rovers used to expand the detection range, a movable lunar soft-landing device based on a cantilever-type buffer mechanism and a leg-foot walking mechanism is presented. First, a 20-DoF quadruped configuration based on pushrods is proposed. The configuration has bionic characteristics such as hip, knee and ankle joints, and keeps the kinematics of the whole mechanism unchanged before and after buffering. Second, multi-function main/auxiliary buffers based on crumple energy absorption and a screw-nut mechanism are designed, together with a telescopic device to protect the plantar force sensors during the buffering process. Finally, the kinematic model of the whole mechanism is established, and the configuration of the whole mechanism is optimized against the performance requirements of slope adaptation and obstacle crossing. This research provides a technical solution integrating soft landing, large-scale inspection and material transfer for future lunar and even Mars exploration, and can also serve as the technical basis for developing reusable landers.

Keywords: configuration design, lunar soft-landing device, movable, optimization

Procedia PDF Downloads 142
26037 Streamlines: Paths of Fluid Flow through Sandstone Samples Based on Computed Microtomography

Authors: Ł. Kaczmarek, T. Wejrzanowski, M. Maksimczuk

Abstract:

The study presents numerical calculations based on high-resolution computed microtomography for the analysis of fluid flow through Miocene sandstones, on which permeability studies were performed. The Miocene samples were taken from well S-3, located in the eastern part of the Carpathian Foredeep. Two series of X-ray irradiation were performed. The first set of samples was scanned to obtain the spatial distribution of grains and pores; at this stage of the study the voxel side length was 27 µm. The second X-ray irradiation enabled recognition of microstructural components as well as petrophysical features, with a voxel side length down to 2 µm. Based on this study, the samples fell into two distinct groups: the first represents conventional reservoir deposits, as opposed to the second, unconventional type. Appropriate identification of petrophysical parameters such as the porosity and permeability of the formation is a key element in optimizing reservoir development.
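
The permeability such flow simulations report is conventionally backed out of Darcy's law; a small worked example with illustrative numbers (not the paper's data) follows:

```python
# Darcy's law, the standard relation behind permeability estimates from flow
# simulation (values below are illustrative, not the paper's measurements):
#   k = q * mu * L / (A * dP)
q = 1.0e-8     # volumetric flow rate, m^3/s
mu = 1.0e-3    # water viscosity, Pa*s
L = 0.01       # sample length, m
A = 1.0e-4     # cross-sectional area, m^2
dP = 1.0e5     # pressure drop, Pa

k = q * mu * L / (A * dP)              # permeability, m^2
print(f"k = {k:.2e} m^2 = {k / 9.869e-13:.3f} darcy")
```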

Keywords: grains, permeability, pores, pressure distribution

Procedia PDF Downloads 243
26036 Detection of Nanotoxic Material Using DNA Based QCM

Authors: Juneseok You, Chanho Park, Kuehwan Jang, Sungsoo Na

Abstract:

Sensing of nanotoxic materials is strongly important, as their engineering applications are growing and nanotoxic materials can harm human health and the environment. In the current study we report quartz crystal microbalance (QCM)-based, in situ and real-time sensing of a nanotoxic material by frequency shift. We propose the in situ detection of the nanotoxic material zinc oxide using a QCM functionalized with a target-specific DNA. Since the mass of a target species is comparable to that of an atom, the mass change caused by target binding to the DNA on the quartz electrode is so small that detecting the ions at low concentrations is practically difficult. In our study, we have demonstrated the in situ, fast detection of zinc oxide using the QCM, with detection derived from DNA hybridization at the DNA-functionalized quartz electrode. The results suggest that QCM-based detection opens a new avenue for the development of a practical water-testing sensor.
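
The abstract does not state the transduction model; the standard Sauerbrey relation for an AT-cut crystal is the usual way a QCM frequency shift is converted to adsorbed mass, sketched here with illustrative parameters:

```python
import math

# Sauerbrey relation (standard QCM mass-frequency transduction; assumed here,
# since the abstract does not give the paper's exact model):
#   df = -2 * f0^2 * dm / (A * sqrt(rho_q * mu_q))
f0 = 5.0e6       # fundamental frequency, Hz
A = 0.785        # electrode area, cm^2 (illustrative)
rho = 2.648      # quartz density, g/cm^3
mu = 2.947e11    # quartz shear modulus, g/(cm*s^2)

def mass_from_shift(df_hz: float) -> float:
    """Adsorbed mass in grams for a measured frequency shift (Hz)."""
    return -df_hz * A * math.sqrt(rho * mu) / (2.0 * f0**2)

print(f"{mass_from_shift(-10.0) * 1e9:.1f} ng adsorbed for a -10 Hz shift")
```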

Keywords: nanotoxic material, QCM, frequency, in situ sensing

Procedia PDF Downloads 414
26035 Development of a Congestion Controller of Computer Network Using Artificial Intelligence Algorithm

Authors: Mary Anne Roa

Abstract:

Congestion in a network occurs when aggregate demand exceeds the available capacity of the resources. Network congestion will increase as network speed increases, and new, effective congestion control methods are needed, especially for today's very high-speed networks. To address this global issue, the study focuses on the development of a fuzzy-based congestion control model for allocating the resources of a computer network so that the system can operate at an adequate performance level when demand exceeds, or is near, the capacity of the resources. Fuzzy logic models have proven capable of accurately representing a wide variety of processes. The model built is based on bandwidth, aggregate incoming traffic and waiting time. The theoretical analysis and simulation results show that the proposed algorithm provides not only good utilization but also low packet loss.
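
A Mamdani-style fuzzy step of the kind such a controller performs can be sketched in pure Python; the membership breakpoints and the two-rule base below are invented for illustration:

```python
# Minimal fuzzy inference step; breakpoints and rules are illustrative only.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def drop_probability(queue_occupancy, arrival_rate):
    # Fuzzify the two inputs (normalized to [0, 1]).
    q_high = tri(queue_occupancy, 0.5, 1.0, 1.5)
    r_high = tri(arrival_rate, 0.5, 1.0, 1.5)
    q_low = tri(queue_occupancy, -0.5, 0.0, 0.6)
    r_low = tri(arrival_rate, -0.5, 0.0, 0.6)
    # Two rules: (high, high) -> drop aggressively; (low, low) -> don't drop.
    w_drop = min(q_high, r_high)
    w_keep = min(q_low, r_low)
    # Weighted-average defuzzification toward crisp drop levels 0.9 / 0.0.
    total = w_drop + w_keep
    return 0.0 if total == 0 else (0.9 * w_drop) / total

print(drop_probability(0.8, 0.9))
```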

Keywords: congestion control, queue management, computer networks, fuzzy logic

Procedia PDF Downloads 383
26034 The Estimation of Human Vital Signs Complexity

Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius

Abstract:

Non-stationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with dedicated analysis methods is suitable for the observation of physiological processes. We demonstrate the possibility of using a deep physiological model-based interpretation of changes in the human body's functional states, combined with an analytical method based on matrix theory for physiological signal analysis, applied to high-risk cardiac patients. Evaluation of cardiac signal interactions shows functional changes peculiar to each individual at the onset of the hemodynamic restoration procedure. We therefore suggest that assessment of alterations in the body's functional state after patients undergo surgery can be complemented by data from the suggested approach of evaluating the interactions of functional variables.

Keywords: cardiac diseases, complex systems theory, ECG analysis, matrix analysis

Procedia PDF Downloads 333
26033 Multi-Objective Multi-Period Allocation of Temporary Earthquake Disaster Response Facilities with Multi-Commodities

Authors: Abolghasem Yousefi-Babadi, Ali Bozorgi-Amiri, Aida Kazempour, Reza Tavakkoli-Moghaddam, Maryam Irani

Abstract:

All over the world, natural disasters (e.g., earthquakes, floods, volcanoes and hurricanes) cause many deaths. Earthquakes in particular are catastrophic events, triggered by unusual phenomena and leading to great loss around the world; such disasters demand substantial long-term help and relief, which can be hard to manage. Supplies and facilities are critical challenges after any earthquake and should be prepared for the disaster regions to satisfy the demands of the people suffering from the earthquake. This paper formulates the disaster response facility allocation problem for disaster relief operations as a mathematical programming model. Earthquake victims need not only consumable commodities (e.g., food and water) but also non-consumable commodities (e.g., clothes) to protect themselves; paying attention to disaster points and people's demands is therefore essential, and both consumable and non-consumable commodities are considered in the presented model. The paper presents a multi-objective, multi-period mathematical programming model that simultaneously minimizes the average weighted response time and the total operational cost plus the penalty costs of unmet demand and unused commodities. Furthermore, a Chebycheff multi-objective solution procedure is applied as a powerful solution algorithm. Finally, to illustrate the model's applicability, a case study of the Tehran earthquake is presented, and a sensitivity analysis is carried out to validate the model.
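
The Chebycheff (weighted Chebyshev) scalarization at the heart of such a procedure minimizes the worst weighted deviation from the ideal point. A toy bi-objective LP version with SciPy is sketched below; the paper's full multi-period allocation model is much larger:

```python
import numpy as np
from scipy.optimize import linprog

# Weighted Chebyshev scalarization on a toy bi-objective LP:
#   min t  s.t.  w_i * (c_i . x - z_i) <= t,  x in a simple polytope.
c1 = np.array([2.0, 1.0])    # objective 1: weighted response time (toy)
c2 = np.array([1.0, 3.0])    # objective 2: operational + penalty cost (toy)
z = np.array([0.0, 0.0])     # ideal point (toy)
w = np.array([0.5, 0.5])

# Decision vector is (x1, x2, t); minimize t.
cost = np.array([0.0, 0.0, 1.0])
A_ub = np.array([
    [w[0] * c1[0], w[0] * c1[1], -1.0],   # w1*(c1.x - z1) <= t
    [w[1] * c2[0], w[1] * c2[1], -1.0],   # w2*(c2.x - z2) <= t
    [-1.0, -1.0, 0.0],                    # x1 + x2 >= 4 (demand coverage)
])
b_ub = np.array([w[0] * z[0], w[1] * z[1], -4.0])
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x)   # balanced compromise between the two objectives
```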

Keywords: facility location, multi-objective model, disaster response, commodity

Procedia PDF Downloads 251
26032 H.263 Based Video Transceiver for Wireless Camera System

Authors: Won-Ho Kim

Abstract:

In this paper, the design of an H.263-based wireless video transceiver is presented for a wireless camera system. It uses a standard WiFi transceiver, and the coverage area is up to 100 m. The standard H.263 video encoding technique is used for video compression, since a wireless video transmitter cannot transmit high-volume raw data in real time; the implemented system is capable of streaming NTSC 720x480 video at less than 1 Mbps.
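
The compression requirement implied by those figures is easy to check; the arithmetic below assumes 4:2:0 chroma subsampling at 30 fps, which the abstract does not state:

```python
# Back-of-envelope compression requirement (4:2:0 at 30 fps assumed).
width, height, fps = 720, 480, 30
bits_per_pixel = 12                      # YUV 4:2:0

raw_bps = width * height * fps * bits_per_pixel
budget_bps = 1_000_000
print(f"raw: {raw_bps / 1e6:.1f} Mbps -> needs ~{raw_bps / budget_bps:.0f}:1 compression")
```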

Keywords: wireless video transceiver, video surveillance camera, H.263 video encoding, digital signal processing

Procedia PDF Downloads 356
26031 Frame Camera and Event Camera in Stereo Pair for High-Resolution Sensing

Authors: Khen Cohen, Daniel Yankelevich, David Mendlovic, Dan Raviv

Abstract:

We present a 3D stereo system for high-resolution sensing in both the spatial and the temporal domains that combines a frame-based camera and an event-based camera. We establish a method to merge both devices into one unified system and introduce a calibration process, followed by a correspondence technique and an interpolation algorithm for 3D reconstruction. We further provide quantitative analysis of our system in terms of depth resolution, together with additional parameter analysis. We show experimentally how our system performs temporal super-resolution up to effectively 1 ms and can detect fast-moving objects and human micro-movements that can be used for micro-expression analysis. We also demonstrate how our method can extract colored events for an event-based camera without any degradation of the spatial resolution, compared to a colored filter array.

Keywords: DVS-CIS stereo vision, micro-movements, temporal super-resolution, 3D reconstruction

Procedia PDF Downloads 289
26030 A Modified NSGA-II Algorithm for Solving Multi-Objective Flexible Job Shop Scheduling Problem

Authors: Aydin Teymourifar, Gurkan Ozturk, Ozan Bahadir

Abstract:

NSGA-II is one of the best-known and most widely used evolutionary algorithms. In addition to its newer versions, such as NSGA-III, several modified variants of this algorithm exist in the literature. In this paper, a hybrid NSGA-II algorithm is suggested for solving the multi-objective flexible job shop scheduling problem. For a better search, new neighborhood-based crossover and mutation operators are defined. To create new generations, neighbors of the individuals selected by tournament selection are constructed. Also, at the end of each iteration, before sorting, neighbors of a certain number of good solutions are derived, except for solutions protected by elitism. The neighbors are generated using a constraint-based neural network that uses various constructs. The non-dominated sorting and crowding distance operators are the same as in the classic NSGA-II. A comparison on several multi-objective benchmarks from the literature shows the efficiency of the algorithm.
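
The crowding distance operator retained from classic NSGA-II can be written directly; the function below is the standard per-objective normalized-gap accumulation, shown here as a reference sketch:

```python
import numpy as np

def crowding_distance(front):
    """Classic NSGA-II crowding distance for one non-dominated front.

    front: (n_solutions, n_objectives) array of objective values.
    """
    n, m = front.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(front[:, j])
        dist[order[0]] = dist[order[-1]] = np.inf   # keep boundary solutions
        span = front[order[-1], j] - front[order[0], j]
        if span == 0:
            continue
        # Interior points: normalized gap between each neighbour pair.
        dist[order[1:-1]] += (front[order[2:], j] - front[order[:-2], j]) / span
    return dist

print(crowding_distance(np.array([[1.0, 4.0], [2.0, 2.5], [3.0, 1.0]])))
```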

Keywords: flexible job shop scheduling problem, multi-objective optimization, NSGA-II algorithm, neighborhood structures

Procedia PDF Downloads 218
26029 Exploring Elder Care in Different Settings in West Bengal: A Psycho-Social Study of Private Homes, Hospitals and Long-Term Care Facilities

Authors: Tulika Bhattacharyya, Suhita C. Chatterjee

Abstract:

West Bengal, one of the most rapidly ageing states in India, has an inadequate structure for elder care. There is therefore an urgent need to improve elder care, which involves focusing on the different care settings where the elderly live, such as Homes, Hospitals and Long-Term Care facilities (e.g., Old Age Homes, Hospices). The study explores various elder care settings with the intention of developing an understanding of them and thereby generating comprehensive information about the entire spectrum of elder care in Kolkata. Empirical data were collected from the elderly and their caregivers in different settings. The tools for data collection were narratives, in-depth interviews and focus group discussions, along with field observations. A mixed-method design was adopted to analyze the complexities of elder care in the different setups. The major challenges of elder care in private Homes are architecturally inadequate housing conditions, paucity of financial support and scarcity of skilled caregivers. The key factors preventing Hospitals and Long-Term Care Facilities from providing elder care services are inadequate policies and governmental standards for the care of the hospitalized elderly in the various departments of the Hospital and of the elderly residing in different kinds of Long-Term Care Facilities. The limitations in each care setting result in considerable neglect and abuse of the elderly. The major challenges for elder care in West Bengal are the lack of a continuum between different care settings, the peripheral location of private Homes within the public health framework, and inadequate state palliative policy, including narcotic regulations. The study suggests remedial measures to improve the capacity to deliver elder care in the different settings.

Keywords: elder care settings, family caregiver, home care, geriatric hospital care, long term care facility

Procedia PDF Downloads 281
26028 Development of Open Source Geospatial Certification Model Based on Geospatial Technology Competency Model

Authors: Tanzeel Ur Rehman Khan, Franz Josef Behr, Phillip Davis

Abstract:

Open source geospatial certifications are needed in the geospatial technology education and industry sectors. In parallel with proprietary software, free and open source software solutions have become important in geospatial technology research and play an important role in the growth of the geospatial industry. ESRI, GISCI (GIS Certification Institute), ASPRS (American Society for Photogrammetry and Remote Sensing), and Meta spatial offer certifications on proprietary and open source software. These are portfolio- and competency-based certifications grounded in the GIS Body of Knowledge (BoK). Analysis of these certification approaches may reveal gaps in them and open a new way to develop certifications for geospatial open source (OS) software. This new certification will examine the different geospatial competencies in terms of open source tools, helping to identify geospatial professionals and strengthen geospatial academic content. The goal of this research is to introduce a geospatial certification model based on the Geospatial Technology Competency Model (GTCM). The developed certification will not only incorporate the importance of geospatial education and the production of a geospatial competency-based workforce in universities and companies (private or public) but also describe open source solutions with their tools and technology. Job analysis, market analysis and survey analysis for this certification also open a new horizon for business.

Keywords: geospatial certification, open source, geospatial technology competency model, geoscience

Procedia PDF Downloads 545
26027 Developing an Automated Protocol for the Wristband Extraction Process Using Opentrons

Authors: Tei Kim, Brooklynn McNeil, Kathryn Dunn, Douglas I. Walker

Abstract:

To better characterize the relationship between complex chemical exposures and disease, our laboratory uses an approach that combines low-cost polydimethylsiloxane (silicone) wristband samplers, which absorb many of the chemicals we are exposed to, with untargeted high-resolution mass spectrometry (HRMS) to characterize thousands of chemicals at a time. In studies with human populations, these wristbands can provide an important measure of our environment; however, there is a need to use this approach in large cohorts to study exposures associated with disease. To facilitate the use of silicone samplers in large-scale population studies, the goal of this research project was to establish automated sample preparation methods that improve the throughput, robustness and scalability of analytical methods for silicone wristbands. Using the Opentrons OT-2 automated liquid-handling platform, which provides a low-cost, open-source framework for automated pipetting, we created two separate workflows that translate the manual wristband preparation method into a fully automated protocol requiring only minor intervention by the operator. These protocols include a sequence generation step, which defines the location of all plates and labware according to user-specified settings, and a transfer protocol that includes all necessary instrument parameters and instructions for automated solvent extraction of the wristband samplers. The protocols were written in Python and uploaded to GitHub for use by others in the research community. Results from this project show that it is possible to establish automated, open-source methods for the preparation of silicone wristband samplers to support profiling of many environmental exposures. Ongoing studies include deployment in longitudinal cohort studies to investigate the relationship between personal chemical exposure and disease.
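
A transfer protocol for the OT-2 is an ordinary Python module written against the Opentrons API; the sketch below shows the general shape, with a hypothetical deck layout, labware and volumes rather than the laboratory's published protocol:

```python
from opentrons import protocol_api

metadata = {"apiLevel": "2.13"}

# Hypothetical deck layout and volumes; illustrative of such a solvent-
# dispensing step, not the lab's GitHub protocol.
def run(protocol: protocol_api.ProtocolContext):
    tips = protocol.load_labware("opentrons_96_tiprack_300ul", 1)
    solvent = protocol.load_labware("nest_12_reservoir_15ml", 2)
    plate = protocol.load_labware("nest_96_wellplate_2ml_deep", 3)
    p300 = protocol.load_instrument("p300_single_gen2", "right", tip_racks=[tips])

    # Dispense extraction solvent onto each wristband-containing well.
    for well in plate.wells()[:24]:
        p300.transfer(900, solvent["A1"], well, new_tip="always")
```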

Keywords: bioinformatics, automation, opentrons, research

Procedia PDF Downloads 100
26026 Improvement of Camera Calibration Based on the Relationship between Focal Length and Aberration Coefficient

Authors: Guorong Sui, Xingwei Jia, Chenhui Yin, Xiumin Gao

Abstract:

In camera-based high-precision, non-contact measurement, geometric-optical aberration inevitably disturbs the measuring system. Moreover, the aberration differs with focal length, which increases the difficulty of calibrating the system. Therefore, understanding the relationship between focal length and aberration properties is a very important issue for the calibration of such measuring systems. In this study, we propose a new mathematical model, based on the plane calibration method of Zhang Zhengyou, and establish a relationship between focal length and aberration coefficient. By using this model and carefully modified compensation templates, the calibration precision of the system can be dramatically improved. The experimental results show a relative error of less than 1%. This is important for optoelectronic imaging systems used to measure, track and position targets by changing the camera's focal length.
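
Zhang's plane-based method, the baseline this model builds on, is exposed directly by OpenCV; a minimal sketch with placeholder chessboard images follows (the paper's focal-length-dependent aberration model is not reproduced):

```python
import glob
import cv2
import numpy as np

# Zhang's plane-based calibration via OpenCV; the chessboard size and the
# image glob are placeholders for a real capture set.
pattern = (9, 6)                                    # inner corners per row/col
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points, size = [], [], None
for fname in glob.glob("calib_*.png"):              # hypothetical filenames
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        size = gray.shape[::-1]

if img_points:
    # dist holds the aberration (distortion) coefficients k1, k2, p1, p2, k3.
    rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, size, None, None)
    print("reprojection RMS:", rms, "distortion:", dist.ravel())
```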

Keywords: camera calibration, aberration coefficient, vision measurement, focal length, mathematics model

Procedia PDF Downloads 350
26025 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, uncertainty in the input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation in such cases exist in the literature, and different representation approaches result in different outputs; some approaches may estimate the system response better than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenges about uncertainty quantification. Subproblem A of the challenge, the uncertainty characterization subproblem, is addressed in this study. In this subproblem, the challenge is to gather knowledge about unknown model inputs, which carry inherent aleatory and epistemic uncertainties, from the responses (outputs) of a given computational model. We approach the problem with two different methodologies. In the first, we use sampling-based uncertainty propagation with first-order error analysis; in the second, we place emphasis on Percentile-Based Optimization (PBO). The NASA Langley MUQC's subproblem A is constructed so that both aleatory and epistemic uncertainties must be managed. The challenge problem classifies each uncertain parameter as one of the following three types: (i) an aleatory uncertainty modeled as a random variable, with a fixed functional form and known coefficients; this uncertainty cannot be reduced; (ii) an epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval; this uncertainty is reducible; (iii) a parameter that is aleatory, but for which sufficient data are not available to model it adequately as a single random variable. For example, the parameters of a normal variable, e.g., the mean and standard deviation, might not be precisely known but could be assumed to lie within some intervals. This results in a distributional p-box: the physical parameter carries an aleatory uncertainty, but the parameters prescribing its mathematical model are subject to epistemic uncertainties, each an unknown element of a known interval; this uncertainty is reducible. The study shows that, owing to practical limitations and computational expense, sampling-based methodology is not exhaustive and therefore has a high probability of underestimating the output bounds. An optimization-based strategy to convert uncertainty described by interval data into a probabilistic framework is therefore necessary, and is achieved in this study by using PBO.
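
The sampling-based methodology of the first approach is essentially double-loop (nested) Monte Carlo; a toy sketch follows, with an invented model and interval, illustrating why finite outer sampling tends to understate the output bounds:

```python
import numpy as np

# Double-loop sampling for mixed uncertainty: the outer loop samples the
# epistemic interval parameter, the inner loop propagates aleatory randomness.
# The model and the interval are illustrative, not the NASA challenge's.
rng = np.random.default_rng(0)

def model(a, x):
    return a * x**2 + np.sin(x)

lo, hi = 0.5, 1.5            # epistemic interval for parameter a
p95_values = []
for _ in range(200):                         # outer: epistemic realizations
    a = rng.uniform(lo, hi)
    x = rng.normal(0.0, 1.0, size=5_000)     # inner: aleatory sampling
    p95_values.append(np.percentile(model(a, x), 95))

# The spread of the percentile across epistemic draws gives the output bounds;
# finite sampling tends to understate them, motivating the PBO approach.
print(f"95th-percentile bounds: [{min(p95_values):.3f}, {max(p95_values):.3f}]")
```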

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 229
26024 Cosmic Muon Tomography at the Wylfa Reactor Site Using an Anti-Neutrino Detector

Authors: Ronald Collins, Jonathon Coleman, Joel Dasari, George Holt, Carl Metelko, Matthew Murdoch, Alexander Morgan, Yan-Jie Schnellbach, Robert Mills, Gareth Edwards, Alexander Roberts

Abstract:

At the Wylfa Magnox power plant between 2014 and 2016, the VIDARR prototype anti-neutrino detector was deployed. It comprises extruded plastic scintillating bars measuring 4 cm × 1 cm × 152 cm and utilises wavelength-shifting fibres (WLS) and multi-pixel photon counters (MPPCs) to detect and quantify radiation. During deployment, it took cosmic muon data in accidental coincidence with the anti-neutrino measurements, with the power plant site buildings obscuring the muon sky. Cosmic muons have a significantly higher probability of being attenuated and/or absorbed by denser objects, so one-sided cosmic muon tomography was utilised to image the reactor site buildings. In order to obtain clear building outlines, a control data set with minimal occlusion of the cosmic muon flux by dense objects was taken at the University of Liverpool from 2016 to 2018. By taking the ratio of these two data sets and using GEANT4 simulations, it is possible to perform a one-sided cosmic muon tomography analysis. This analysis can discern specific buildings, building heights and features at the Wylfa reactor site, including the reactor core and its shielding, using ∼3 hours' worth of cosmic-ray detector live time. This result demonstrates the feasibility of using cosmic muon analysis to determine a segmented detector's location with respect to surrounding buildings, assisted by aerial photography or satellite imagery.
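
The ratio analysis described above reduces, schematically, to dividing direction-binned track counts from the two data sets; a NumPy sketch with placeholder track arrays follows (not the collaboration's analysis code):

```python
import numpy as np

# Placeholder (azimuth, zenith) track directions in degrees for the deployed
# site and the open-sky control; real tracks come from the reconstruction.
rng = np.random.default_rng(42)
site_tracks = rng.uniform([-60, 0], [60, 60], size=(100_000, 2))
control_tracks = rng.uniform([-60, 0], [60, 60], size=(100_000, 2))

bins = [np.linspace(-60, 60, 61), np.linspace(0, 60, 31)]
site_hist, _, _ = np.histogram2d(site_tracks[:, 0], site_tracks[:, 1], bins=bins)
ctrl_hist, _, _ = np.histogram2d(control_tracks[:, 0], control_tracks[:, 1], bins=bins)

# Per-direction transmission ratio; bins well below 1 point toward dense
# structures such as buildings or the reactor shielding.
ratio = np.where(ctrl_hist > 0, site_hist / ctrl_hist, np.nan)
```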

Keywords: anti-neutrino, GEANT4, muon, tomography, occlusion

Procedia PDF Downloads 179
26023 Architecture for QoS Based Service Selection Using Local Approach

Authors: Gopinath Ganapathy, Chellammal Surianarayanan

Abstract:

Services are growing rapidly, and generally they are aggregated into a composite service to accomplish complex business processes. Several services may offer the same required function for a particular task in a composite service, so a choice has to be made among functionally similar alternatives. Quality of Service (QoS) serves as the discriminating factor in selecting which component services should be chosen to satisfy the quality requirements of a user during service composition. There are two categories of approaches for QoS-based service selection, namely global and local. Global approaches are known to be NP-hard in time and offer poor scalability in large-scale composition. As an alternative to global methods, local selection methods reduce the search space by breaking the large, complex problem of selecting services for the workflow into independent subproblems of selecting services for individual tasks. In this paper, a distributed architecture for selecting services based on QoS using local selection is presented, with an overview of the local selection methodology. The architecture describes the core components needed to implement the local approach, namely the selection manager and the QoS manager, and their functions. The selection manager consists of two components: a constraint decomposer, which decomposes the given global (workflow-level) constraints into local (task-level) constraints, and a service selector, which selects for each task the service with maximum utility that satisfies the corresponding local constraints. The QoS manager manages QoS information at two levels, namely the service class level and the individual service level. The architecture serves as an implementation model for local selection.
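
The service selector's job per task is a constrained arg-max over candidates; the sketch below uses invented candidate data, utility weights and decomposed latency budgets to show that step in isolation:

```python
# Local QoS-based selection: per task, pick the highest-utility candidate
# that meets that task's local constraint. All data below are illustrative.
candidates = {
    "task1": [{"name": "s11", "latency": 40, "reliability": 0.99},
              {"name": "s12", "latency": 25, "reliability": 0.95}],
    "task2": [{"name": "s21", "latency": 80, "reliability": 0.97},
              {"name": "s22", "latency": 60, "reliability": 0.90}],
}
local_latency_budget = {"task1": 30, "task2": 70}   # from the constraint decomposer

def utility(svc):
    # Higher reliability and lower latency are better (weights are assumptions).
    return 0.7 * svc["reliability"] - 0.3 * (svc["latency"] / 100.0)

composite = {
    task: max((s for s in svcs if s["latency"] <= local_latency_budget[task]),
              key=utility)["name"]
    for task, svcs in candidates.items()
}
print(composite)   # {'task1': 's12', 'task2': 's22'}
```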

Keywords: architecture of service selection, local method for service selection, QoS based service selection, approaches for QoS based service selection

Procedia PDF Downloads 419
26022 Assessing the Leadership Succession Plan in Faith-Based Senior High Schools in Ghana and Its Associated Challenges

Authors: J. E. Cobbinah

Abstract:

One of the most challenging issues confronting schools is good leadership succession planning. Experts argue that, although leadership succession planning is one of the strategies that can help sustain improvement and promote continuity of good leadership, it seems to have been neglected in many schools over the years. Appointment of head teachers in senior high schools is based on long service or on one's ability to demonstrate competence in a leadership selection interview. There is no clear, well-structured leadership succession plan before a leadership position is filled, and school leadership succession planning seems to be an issue that nobody talks about. In faith-based schools the issue is even worse, because religious groups impose whomever they consider strong in the faith on schools as leaders, irrespective of the individual's competence and preparedness to take up the challenges associated with the leadership position. Therefore, the present study examined the nature and types of leadership succession plans in faith-based senior high schools and their associated challenges. A convergent mixed-method design was employed to effectively achieve the objectives of the study. The data collection strategies involved interviews, questionnaires and reviews of secondary data. The data were gathered from students, school leaders (head teachers, deputy heads and heads of department), selected parent-teacher association members, school management committee members and school governors. The results show that although governors of faith-based schools are making efforts to enhance education quality by making school leadership accountable, the absence and neglect of a clear, well-structured leadership succession plan has negative outcomes: unsustainable student academic performance, lack of support from existing staff and senior leaders, and lack of support in the implementation of the school improvement plan. It is concluded that faith-based schools should focus on leadership competence and ability in the selection of potential school leaders to achieve a good succession plan, rather than appointing leaders simply because they are affiliates of the faith.

Keywords: school leadership, succession planning, faith-based schools, school governors

Procedia PDF Downloads 431
26021 Research on Air Pollution Spatiotemporal Forecast Model Based on LSTM

Authors: JingWei Yu, Hong Yang Yu

Abstract:

At present, increasingly serious air pollution in many Chinese cities has made people pay more attention to the air quality index (AQI) of their living areas. Given this situation, it is of great significance to predict air pollution in heavily polluted areas. In this paper, based on an LSTM time series model, a spatiotemporal prediction model of PM2.5 concentration in Mianyang, Sichuan Province, is established. The model fully considers the temporal variability and spatial distribution characteristics of PM2.5 concentration. The spatial correlation of air quality at different locations is captured by using the air quality status of nearby monitoring stations, including AQI and meteorological data, to predict the air quality at a target station. The experimental results show that the method has good prediction accuracy: the fit with the actual measured data exceeds 0.7. The method can be applied to modeling and predicting the spatiotemporal distribution of regional PM2.5 concentration.
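
A station-level LSTM forecaster of the kind described can be sketched with Keras; the window length, feature count and layer sizes below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np
import tensorflow as tf

# Minimal LSTM sketch for station-level PM2.5 forecasting; dimensions are
# illustrative (24 past hours; PM2.5 plus neighbor AQI/meteorology features).
WINDOW, FEATURES = 24, 6

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, FEATURES)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1),          # next-hour PM2.5 at the target station
])
model.compile(optimizer="adam", loss="mse")

# Toy data standing in for windowed sensor history.
x = np.random.rand(512, WINDOW, FEATURES).astype("float32")
y = np.random.rand(512, 1).astype("float32")
model.fit(x, y, epochs=2, batch_size=32, verbose=0)
```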

Keywords: LSTM, PM2.5, neural networks, spatio-temporal prediction

Procedia PDF Downloads 124
26020 Delivering User Context-Sensitive Service in M-Commerce: An Empirical Assessment of the Impact of Urgency on Mobile Service Design for Transactional Apps

Authors: Daniela Stephanie Kuenstle

Abstract:

Complex industries such as banking or insurance experience slow growth in mobile sales. While today's mobile applications are sophisticated and enable location-based and personalized services, consumers prefer online or even face-to-face services to complete complex transactions. A possible reason for this reluctance is that the service provided within transactional mobile applications (apps) does not adequately correspond to users' needs. Therefore, this paper examines the impact of user context on mobile service (m-service) in m-commerce. Motivated by the potential that context-sensitive m-services hold for the future, the impact of temporal variation, as one dimension of user context, on m-service design is examined. In particular, the research question asks: does consumer urgency function as a determinant of m-service composition in transactional apps by moderating the relation between m-service type and m-service success? Thus, the aim is to explore the moderating influence of urgency on m-service types, which include Technology Mediated Service and Technology Generated Service. While mobile applications generally comprise features of both service types, this study discusses whether unexpected urgency changes customer preferences for m-service types and how this consequently impacts overall m-service success, represented by purchase intention, loyalty intention and service quality. An online experiment with a random sample of N=1311 participants was conducted. Participants were divided into four treatment groups varying in m-service type and urgency level. They were exposed to two different urgency scenarios (high/low) and two different app versions conveying either technology mediated or technology generated service. Subsequently, participants completed a questionnaire measuring the effectiveness of the manipulation as well as the dependent variables. The research model was tested for direct and moderating effects of m-service type and urgency on m-service success. Three two-way analyses of variance confirmed the significance of the main effects but demonstrated no significant moderation of urgency on m-service types; the gathered data thus did not confirm a moderating effect of urgency between m-service type and service success. Yet the findings suggest an additive-effects model, with the highest purchase and loyalty intention for Technology Generated Service under high urgency, while Technology Mediated Service under low urgency shows the strongest effect on service quality. The results also indicate an antagonistic relation between service quality and purchase intention depending on the level of urgency. Although confirmation of the significance of this finding is required, it suggests that only service convenience, as one dimension of mobile service quality, delivers conditional value under high urgency, pointing to a curvilinear pattern of service quality in e-commerce. Overall, the paper illustrates the complex interplay of technology, user variables and service design, contributing to a finer-grained understanding of the relation between m-service design and situation dependency. Moreover, the importance of delivering situational value with apps depending on user context is emphasized. Finally, the present study raises the demand to continue researching the impact of situational variables on m-service design in order to develop more sophisticated m-services.

Keywords: mobile consumer behavior, mobile service design, mobile service success, self-service technology, situation dependency, user-context sensitivity

Procedia PDF Downloads 261
26019 Acute Cartilage Defects of the Knee Treated With Chondral Restoration Procedures and Patellofemoral Stabilisation

Authors: John Scanlon, Antony Raymond, Randeep Aujla, Peter D’Alessandro, Satyen Gohil

Abstract:

Background: The incidence of significant acute chondral injuries with patella dislocation is around 10-15%. It is accepted that chondral procedures should only be performed in the presence of joint stability. Methods: Patients were identified from surgeon/hospital logs. Patient demographics, lesion size and location, surgical procedure, patient-reported outcome measures (PROMs), post-operative MR imaging, complications and patient satisfaction were recorded. Results: 20 knees (18 patients) were included. Mean age was 18.6 years (range 11-39), and mean follow-up was 16.6 months (range 2-70). The defect locations were the lateral femoral condyle (9/20; 45%), patella (9/20; 45%), medial femoral condyle (1/20; 5%) and trochlea (1/20; 5%). The mean defect size was 2.6 cm². Twelve knees were treated with cartilage fixation, five with microfracture, and three with OATS. At follow-up, the overall mean Lysholm score was 77.4 (±17.1), with no chondral regenerative procedure being statistically superior. There was no difference in Lysholm scores between patients undergoing acute medial patellofemoral ligament reconstruction and those undergoing medial soft-tissue plication (p=0.59). Five knees (25%) required re-operation (one arthroscopic arthrolysis, one patella chondroplasty, two removals of loose bodies, one implant adjustment). Overall, 90% of patients reported being satisfied with surgery. Conclusion: Our aggressive pathway to identify and treat acute cartilage defects with early operative intervention and patella stabilisation has shown high rates of satisfaction and good Lysholm scores. The full range of chondral restoration options should be considered by surgeons managing these patients.

Keywords: patella dislocation, chondral restoration, knee, patella stabilisation

Procedia PDF Downloads 113
26018 Distribution and Characterization of Thermal Springs in Northern Oman

Authors: Fahad Al Shidi, Reginald Victor

Abstract:

This study was conducted in northern Oman to assess the physical and chemical characteristics of 40 thermal springs distributed across the Al Hajar Mountains. Physical measurements of water samples were carried out in the two main seasons in Oman (winter and summer 2019). The springs were classified into three groups based on water temperature, four groups based on pH and two groups based on conductivity. Ten thermal alkaline springs originating in ophiolite (the Samail Nappe) were dominated by high pH (> 11), elevated concentrations of Cl⁻ and Na⁺ ions, and relatively low temperature and discharge ratio. The other springs, in the Hajar Supergroup massif, recorded high concentrations of Ca²⁺ and SO₄²⁻ ions, controlled by rock dominance, geochemical processes and mineralization. Only one spring had brackish water, with very high conductivity (5500 µS/cm) and total dissolved solids; it is not suitable for irrigation purposes because of the high abundance of Na⁺, Cl⁻ and Ca²⁺ ions.

Keywords: alkaline springs, geothermal, HSG, ophiolite

Procedia PDF Downloads 134