Search results for: reliable facility location model
20170 Influence of Angular Position of Unbalanced Force on Crack Breathing Mechanism
Authors: Roselyn Zaman, Mobarak Hossain
Abstract:
A new mathematical model is developed to study crack breathing behavior, considering the effect of the angular position of the unbalanced force at different crack locations. Crack breathing behavior is determined from the effectual bending angle by studying the transient change of the crack area. Different crack breathing behavior of the unbalanced shaft is observed for each combination of angular position of the unbalanced force and crack location, except at crack locations 0.3L and 0.8335L (where L is the total length of the shaft), at which the unbalanced shaft behaves exactly like the balanced shaft. Based on the combination of angular position of the unbalanced force and crack location, the stiffness of the unbalanced shaft can be divided into three regions. An unbalanced shaft is overall stiffer than a balanced shaft when the angular position of the unbalanced force is between 90° and 270° and the crack is located between 0.3L and 0.8335L, and it is overall more flexible when the crack is located outside this region. Conversely, it is overall more flexible when the angular position of the unbalanced force is between 0° and 90° or 270° and 360° and the crack is located in the middle region, and overall stiffer when the crack lies outside this region.
Keywords: cracked shaft, crack location, shaft stiffness, unbalanced force, unbalanced force orientation
Procedia PDF Downloads 270
20169 Determination of Measurement Uncertainty of the Diagnostic Meteorological Model CALMET
Authors: Nina Miklavčič, Urška Kugovnik, Natalia Galkina, Primož Ribarič, Rudi Vončina
Abstract:
Today, the need for weather predictions is deeply rooted in the everyday life of people as well as in industry. Forecasts influence final decision-making processes in multiple areas, from agriculture and the prevention of natural disasters to air traffic regulations and national-level solutions for health, security, and economic problems. In Slovenia, alongside other existing applications, weather forecasts are used for the prognosis of electrical current transmission through power lines. Meteorological parameters are among the key factors that need to be considered in estimations of the reliable supply of electrical energy to consumers. As for any other measured value, knowledge about measurement uncertainty is also critical for the secure and reliable supply of energy. The estimation of measurement uncertainty grants a more accurate interpretation of data, better quality of the end results, and even the possibility of improving weather forecast models. In this article, we focus on the estimation of the measurement uncertainty of the diagnostic microscale meteorological model CALMET. For the purposes of our research, we used a network of meteorological stations spread over the area of interest, which enables a side-by-side comparison of measured meteorological values with the values calculated with the help of CALMET and, as a final result, the estimation of measurement uncertainty.
Keywords: uncertainty, meteorological model, meteorological measurement, CALMET
Procedia PDF Downloads 81
20168 Efficient Human Motion Detection Feature Set by Using Local Phase Quantization Method
Authors: Arwa Alzughaibi
Abstract:
Human motion detection is a challenging task due to a number of factors, including variable appearance, posture, and a wide range of illumination conditions and backgrounds. The first need of such a model is therefore a reliable feature set that can discriminate between a human and a non-human form with a fair amount of confidence, even under difficult conditions. With richer representations, the classification task becomes easier and improved results can be achieved. The aim of this paper is to investigate reliable and accurate human motion detection models that are able to detect human motion accurately under varying illumination levels and backgrounds. Different sets of features are tried and tested, including Histogram of Oriented Gradients (HOG), Deformable Parts Model (DPM), Locally Decorrelated Channel Features (LDCF), and Aggregate Channel Features (ACF). We propose an efficient and reliable human motion detection approach that combines Histogram of Oriented Gradients (HOG) and Local Phase Quantization (LPQ) as the feature set and implements a search-pruning algorithm based on optical flow to reduce the number of false positives. Experimental results show that combining the local phase quantization descriptor with the histogram of oriented gradients performs well over a large range of illumination conditions and backgrounds compared with state-of-the-art human detectors. The area under the ROC curve (AUC) of the proposed method reached 0.781 for the UCF dataset and 0.826 for the CDW dataset, which indicates that it performs comparably better than the HOG, DPM, LDCF, and ACF methods.
Keywords: human motion detection, histogram of oriented gradients, local phase quantization
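As an illustration of the feature combination described above, the following Python sketch extracts HOG features with scikit-image and concatenates them with a simplified, hand-rolled LPQ descriptor (the decorrelation step of the full LPQ method is omitted); the window size, classifier settings, and the `windows`/`labels` variables are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from scipy.signal import convolve2d
from skimage.feature import hog
from sklearn.svm import SVC

def lpq_descriptor(gray, win=7):
    """Simplified LPQ: histogram of 8-bit codes built from the signs of four
    low-frequency local STFT responses (decorrelation step omitted)."""
    r = win // 2
    x = np.arange(-r, r + 1)
    a = 1.0 / win
    w0 = np.ones(win, dtype=complex)
    w1 = np.exp(-2j * np.pi * a * x)
    codes = np.zeros(gray.shape, dtype=np.int32)
    bit = 0
    for wy, wx in [(w0, w1), (w1, w0), (w1, w1), (w1, np.conj(w1))]:
        resp = convolve2d(convolve2d(gray, wy[:, None], mode="same"),
                          wx[None, :], mode="same")
        codes += (resp.real > 0).astype(np.int32) << bit
        codes += (resp.imag > 0).astype(np.int32) << (bit + 1)
        bit += 2
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / max(hist.sum(), 1)

def hog_lpq_features(gray):
    """Concatenated HOG + LPQ feature vector for one detection window."""
    h = hog(gray, orientations=9, pixels_per_cell=(8, 8),
            cells_per_block=(2, 2), feature_vector=True)
    return np.concatenate([h, lpq_descriptor(gray)])

# windows: equally sized grayscale patches; labels: 1 = human, 0 = background (assumed inputs)
# clf = SVC(kernel="rbf").fit([hog_lpq_features(w) for w in windows], labels)
```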
Procedia PDF Downloads 258
20167 Development of Digital Twin Concept to Detect Abnormal Changes in Structural Behaviour
Authors: Shady Adib, Vladimir Vinogradov, Peter Gosling
Abstract:
Digital Twin (DT) technology is a new technology that appeared in the early 21st century. A DT is defined as the digital representation of living and non-living physical assets. By connecting the physical and virtual assets, data are transmitted smoothly, allowing the virtual asset to fully represent the physical asset. Although many studies have been conducted on the DT concept, there is still limited information about the ability of DT models to monitor and detect unexpected changes in structural behaviour in real time. This is due to the large computational effort required for the analysis and the excessively large amount of data transferred from sensors. This paper aims to develop the DT concept so that it can detect abnormal changes in structural behaviour in real time using advanced modelling techniques, deep learning algorithms, and data acquisition systems, taking model uncertainties into consideration. Finite element (FE) models were first developed offline to be used with a reduced basis (RB) model order reduction technique for the construction of a low-dimensional space to speed up the analysis during the online stage. The RB model was validated against experimental test results for the establishment of a DT model of a two-dimensional truss. The established DT model and deep learning algorithms were used to identify the location of damage once it appeared during the online stage. Finally, the RB model was used again to identify the damage severity. It was found that using the RB model, constructed offline, speeds up the FE analysis during the online stage. The constructed RB model showed higher accuracy for predicting the damage severity, while deep learning algorithms were found to be useful for estimating the location of damage of small severity.
Keywords: data acquisition system, deep learning, digital twin, model uncertainties, reduced basis, reduced order model
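A minimal Python sketch of the offline/online split described above, using proper orthogonal decomposition (SVD of a snapshot matrix) to build a reduced basis; the snapshot matrix, stiffness matrix, and load vector below are random placeholders, not the authors' truss model:

```python
import numpy as np

# Offline stage: snapshots are FE solutions for sampled parameters.
# Random placeholders stand in for real FE results here.
n_dof, n_snapshots = 500, 40
snapshots = np.random.rand(n_dof, n_snapshots)            # columns = FE solutions

# Reduced basis from the leading left singular vectors (POD modes).
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.9999)) + 1               # number of modes kept
V = U[:, :k]                                               # reduced basis (n_dof x k)

# Online stage: project the full operators once, then solve small k x k systems.
K_full = np.eye(n_dof)                                      # placeholder stiffness matrix
f_full = np.ones(n_dof)                                     # placeholder load vector
K_red = V.T @ K_full @ V
f_red = V.T @ f_full
u_red = np.linalg.solve(K_red, f_red)
u_approx = V @ u_red                                        # approximation of the full solution
```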
Procedia PDF Downloads 99
20166 The Awareness of Computer Science Students Regarding the Security of Location Based Games
Authors: Jacques Barnard, Magda Huisman, Gunther R. Drevin
Abstract:
Rapid expansion and development in the mobile technology market has created an opportunity for users to participate in location based games. As a consequence of this fast-expanding market and new technology, it is important to be aware of the implications this has on security. This paper measures the security awareness of the games' participants, as well as that of university students with regard to their year of study and gamer classification. This serves to provide insight into discernible differences in awareness of the security implications of these technologies. The data were accumulated via a web questionnaire completed yearly by students from the respective year groups. The results signify a meaningful disparity in security awareness among students in the different study years. This awareness, however, does not always have an impact on gamers.
Keywords: gamer classifications, location based games, location based data, security awareness
Procedia PDF Downloads 293
20165 Identification of Location Parameters for Different User Types of the Inner-City Building Stock: An Austrian Example
Authors: Bernhard Bauer, Thomas Meixner, Amir Dini, Detlef Heck
Abstract:
The inner city building stock is characterized by different types of buildings from different decades and centuries and different types of historical construction. Depending on the natural growth of a city, those types are often located in downtown areas and the surrounding suburbs. Since the population is becoming older and social requirements are diversifying with the so-called 'Silver Society', city quarters have to be viewed differently. If an area is very attractive for young students to live in because of the busy nightlife, it might not be suitable for the older society. To identify 'Location Types A, B, C' for different user groups, qualitative interviews with 24 citizens of the city of Graz (Austria) were carried out in order to identify the most important values that make a location or city quarter 'A', 'B', or 'C'. These findings were then incorporated into a software tool for predicting the locations that are most suitable for certain user groups. Conversely, investors or owners of buildings can use the tool to determine the most suitable user group for the location of their building or construction project, in order to adapt the project or building stock to the requirements of the users.
Keywords: building stock, location parameters, inner city population, built environment
Procedia PDF Downloads 313
20164 Application of Computational Flow Dynamics (CFD) Analysis for Surge Inception and Propagation for Low Head Hydropower Projects
Authors: M. Mohsin Munir, Taimoor Ahmad, Javed Munir, Usman Rashid
Abstract:
Determination of the maximum elevation of a flowing fluid due to sudden rejection of load in a hydropower facility is of great interest to hydraulic engineers to ensure the safety of the hydraulic structures. Several mathematical models exist that employ one-dimensional modeling for the determination of surge, but none of these perfectly simulate real-time circumstances. This paper investigates surge inception and propagation for a low head hydropower project using Computational Fluid Dynamics (CFD) analysis with the FLOW-3D software package. The fluid dynamics model analyses surge by employing the Reynolds-Averaged Navier-Stokes Equations (RANSE). The CFD model is designed for a case study at Taunsa Hydropower Project in Pakistan. Various scenarios were run through the model, keeping in view the upstream boundary conditions. The prototype results were then compared with the results of physical model testing for the same scenarios. The results of the numerical model showed close agreement with the physical model testing, offer insight into phenomena that are not apparent in the physical model, and can be adopted in the future for similar low head projects, limiting the delays and costs incurred in physical model testing.
Keywords: surge, FLOW-3D, numerical model, Taunsa, RANSE
Procedia PDF Downloads 361
20163 Steady State Modeling and Simulation of an Industrial Steam Boiler
Authors: Amina Lyria Deghal Cheridi, Abla Chaker, Ahcene Loubar
Abstract:
The RELAP5 system code is one of the powerful tools used in the area of design and safety evaluation. This work aims to simulate the behavior of a radiant steam boiler under steady-state conditions using the RELAP5 system code. To perform this study, a detailed RELAP5 model is built, including all parts of the steam boiler. The control and regulation systems are also considered. To reproduce the most important parameters and phenomena with acceptable accuracy and fidelity, a thorough qualification of the facility nodalization is undertaken. It consists of comparing the code results with the available plant data in steady-state operation mode. The model qualification results at steady state are in good agreement with the steam boiler experimental data. The RELAP5 model of the steam boiler has proved satisfactory and was capable of predicting the main thermal-hydraulic steady-state conditions of the steam boiler.
Keywords: industrial steam boiler, model qualification, natural circulation, RELAP5/MOD3.2, steady state simulation
Procedia PDF Downloads 273
20162 Valuation of Cultural Heritage: A Hedonic Pricing Analysis of Housing via GIS-based Data
Authors: Dai-Ling Li, Jung-Fa Cheng, Min-Lang Huang, Yun-Yao Chi
Abstract:
The hedonic pricing model has been popularly applied to describe the economic value of environmental amenities in urban housing, but the results for cultural heritage variables remain relatively ambiguous. In this paper, integrated variables derived from GIS-based data and an existing typology of communities are used to examine how cultural heritage, environmental amenities, and disamenities affect housing prices across urban communities in Tainan, Taiwan. The developed models suggest that, although a sophisticated variable for central services is selected, the centrality of location is not fully controlled in the price models and is thus picked up by correlated peripheral and central amenities such as cultural heritage, open space, or parks. Analysis of these correlations permits us to qualify the results and present a revised set of relatively reliable estimates. Positive effects on housing prices are identified for views, various types of recreational infrastructure, and proximity to national cultural sites and significant landscapes. Negative effects are found for several disamenities, including waste yards, refuse incinerators, petrol stations, and industries. The results suggest that systematic hypothesis testing and reporting of correlations may contribute to consistent explanatory patterns in hedonic pricing estimates for cultural heritage and landscape amenities in urban areas.
Keywords: hedonic pricing model, cultural heritage, landscape amenities, housing
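A minimal sketch of a log-linear hedonic price regression of the kind described above, written in Python with statsmodels; the variable names and the synthetic data are illustrative assumptions, not the Tainan dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative transaction data; column names are placeholders, not the study's variables.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "price": rng.lognormal(mean=15, sigma=0.3, size=n),        # transaction price
    "floor_area": rng.normal(120, 30, size=n),                  # m^2
    "age": rng.integers(0, 40, size=n),                         # building age, years
    "dist_heritage": rng.uniform(0.1, 5.0, size=n),             # km to nearest cultural site
    "dist_incinerator": rng.uniform(0.5, 10.0, size=n),         # km to nearest disamenity
    "park_within_500m": rng.integers(0, 2, size=n),             # dummy variable
})

# Log-linear hedonic specification: coefficients read as approximate % price effects.
model = smf.ols(
    "np.log(price) ~ floor_area + age + np.log(dist_heritage)"
    " + np.log(dist_incinerator) + park_within_500m",
    data=df,
).fit()
print(model.summary())
```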
Procedia PDF Downloads 340
20161 The Investigation of the Impact of Process and Location Parameters in Warpage Study of Semiconductor Packages
Authors: Wheyming Song, Ssu-Ping Lin
Abstract:
The primary advantage of package-on-package (PoP) packaging is that since it has less volume, it weighs less. But this is also related to its principal drawback, which is warpage. This research investigates how PoP package warpage patterns are affected by assembling process parameters, including substrate temperature, injection speed, injection temperature, and compound forces. We also investigate how warpage patterns are affected by the location of the silicon chip. The methodologies used in this research are design of experiment and warpage simulation via ANSYS. We propose a regression model to predict the warpage value as a function of substrate temperature, injection speed, injection temperature, and compound forces. Our results show that interaction effects exist between substrate temperature and compound forces and between injection speed and injection temperature. Therefore, determining the optimal values for substrate temperature, compound forces, injection speed, and injection temperature cannot be done individually. Also, our results show that the warpage patterns based on the location of silicon chips can be classified into 11 groups, with the largest warpage occurring at the left-most and right-most sides.
Keywords: package-on-package, warpage, design of experiment, simulation
Procedia PDF Downloads 306
20160 Impact Location From Instrumented Mouthguard Kinematic Data In Rugby
Authors: Jazim Sohail, Filipe Teixeira-Dias
Abstract:
Mild traumatic brain injury (mTBI) within non-helmeted contact sports is a growing concern due to the serious risk of potential injury. Extensive research is being conducted into head kinematics in non-helmeted contact sports, utilizing instrumented mouthguards that allow researchers to record accelerations and velocities of the head during and after an impact. This does not, however, allow the location of the impact on the head, and its magnitude and orientation, to be determined. This research proposes and validates two methods to quantify impact locations from instrumented mouthguard kinematic data, one using rigid body dynamics, the other utilizing machine learning. The rigid body dynamics technique focuses on establishing and matching moments from Euler's and torque equations in order to find the impact location on the head. The methodology is validated with impact data collected from a lab test with a dummy head fitted with an instrumented mouthguard. Additionally, a Hybrid III dummy head finite element model was utilized to create synthetic kinematic data sets for impacts from varying locations to validate the impact location algorithm. The algorithm calculates accurate impact locations; however, it requires preprocessing of live data, which is currently being done by cross-referencing data timestamps to video footage. The machine learning technique focuses on eliminating the preprocessing aspect by establishing trends within time-series signals from instrumented mouthguards to determine the impact location on the head. An unsupervised learning technique is used to cluster together impacts within similar regions from an entire time-series signal. The kinematic signals established from mouthguards are converted to the frequency domain before a clustering algorithm is used to cluster together similar signals within a time series that may span the length of a game. Impacts are clustered within predetermined location bins. The same Hybrid III dummy finite element model is used to create impacts that closely replicate on-field impacts in order to create synthetic time-series datasets consisting of impacts in varying locations. These time-series data sets are used to validate the machine learning technique. The rigid body dynamics technique provides a good method to establish the accurate impact location of impact signals that have already been labeled as true impacts and filtered out of the entire time series. The machine learning technique, however, provides a method that can be implemented with long time-series signal data, although it provides the impact location only within predetermined regions on the head. Additionally, the machine learning technique can be used to eliminate false impacts captured by sensors, saving additional time for data scientists using instrumented mouthguard kinematic data, as validating true impacts with video footage would not be required.
Keywords: head impacts, impact location, instrumented mouthguard, machine learning, mTBI
Procedia PDF Downloads 217
20159 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy
Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu
Abstract:
The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on the reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine the reliability and, in turn, to understand the improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and by identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine whether the data exhibit an infant stage or have transitioned into the operational phase. The shape parameter beta plays a crucial role in identifying this stage. Additionally, determining the exact start of the operational phase and the end of the infant stage poses another challenge, since it is crucial to eliminate residual infant mortality or wear-out from the model, as these can significantly increase the total failure rate. To address this, an approach utilizing the well-established statistical Laplace test is applied to infer the behavior of sensors and to accurately ascertain the duration of the different phases in the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and wear-out phase are accurately estimated, by validating the data in individual phases with Weibull distribution curve-fitting analysis. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the commencement of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications.
Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis
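A minimal Python sketch of the two steps described above: a Laplace trend test to probe for an infant-mortality or wear-out phase, followed by a Weibull fit; the failure times are illustrative placeholders, not actual sensor data:

```python
import numpy as np
from scipy import stats

# Illustrative failure/exceedance times (days since launch); placeholders, not mission data.
failure_times = np.array([12.0, 30.0, 55.0, 140.0, 260.0, 400.0, 620.0, 880.0])
T_obs = 1000.0                        # total observation window (days)

# Laplace trend test: U << 0 suggests improvement (infant mortality fading),
# U ~ 0 a stable (homogeneous Poisson) phase, U >> 0 deterioration (wear-out).
n = failure_times.size
U = (failure_times.mean() - T_obs / 2.0) / (T_obs / np.sqrt(12.0 * n))
p_value = 2.0 * (1.0 - stats.norm.cdf(abs(U)))       # two-sided, U ~ N(0,1) under no trend
print(f"Laplace U = {U:.2f}, p = {p_value:.3f}")

# Weibull fit on the (assumed) operational-phase data.
beta, loc, eta = stats.weibull_min.fit(failure_times, floc=0)
print(f"Weibull shape (beta) = {beta:.2f}, scale (eta) = {eta:.1f} days")

# Reliability at a mission time of interest, R(t) = exp(-(t/eta)^beta)
t = 365.0
print(f"R({t:.0f} d) = {np.exp(-(t / eta) ** beta):.3f}")
```

A shape parameter well below 1 would point to residual infant mortality, a value near 1 supports treating the window as the stable operational phase, and values above 1 indicate wear-out, which is how the bathtub-curve phases are distinguished in this kind of analysis.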
Procedia PDF Downloads 65
20158 Multilayer Perceptron Neural Network for Rainfall-Water Level Modeling
Authors: Thohidul Islam, Md. Hamidul Haque, Robin Kumar Biswas
Abstract:
Floods are among the deadliest natural disasters and are very complex to model; however, machine learning is opening the door to more reliable and accurate flood prediction. In this research, a multilayer perceptron neural network (MLP) is developed to model the rainfall-water level relation in a subtropical monsoon climatic region on the Bangladesh-India border. Our experiments show promising empirical results for forecasting the water level at a 1-day lead time. Our best-performing MLP model achieves a coefficient of determination of 98.7% with lower model complexity, which surpasses previously reported results on similar forecasting problems.
Keywords: flood forecasting, machine learning, multilayer perceptron network, regression
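A minimal Python sketch of an MLP regressor trained on lagged rainfall and water-level features for a 1-day-ahead forecast; the synthetic series, lag count, and network size are illustrative assumptions, not the study's configuration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

# Synthetic daily series standing in for observed rainfall (mm) and water level (m).
rng = np.random.default_rng(1)
days = 1000
rain = rng.gamma(shape=2.0, scale=5.0, size=days)
level = 2.0 + 0.05 * np.convolve(rain, np.ones(3) / 3, mode="same") + rng.normal(0, 0.05, days)

# Features: rainfall and water level over the previous `lags` days; target: level 1 day ahead.
lags = 3
X = np.column_stack([rain[i:days - lags + i] for i in range(lags)] +
                    [level[i:days - lags + i] for i in range(lags)])
y = level[lags:]

split = int(0.8 * len(y))
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
model.fit(X[:split], y[:split])
print("R^2 on held-out days:", r2_score(y[split:], model.predict(X[split:])))
```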
Procedia PDF Downloads 173
20157 Statistical Classification, Downscaling and Uncertainty Assessment for Global Climate Model Outputs
Authors: Queen Suraajini Rajendran, Sai Hung Cheung
Abstract:
Statistical downscaling models are required to connect global climate model outputs with local weather variables for climate change impact prediction. For reliable climate change impact studies, the uncertainty associated with the model, including natural variability, uncertainty in the climate model(s), the downscaling model, model inadequacy, and the predicted results, should be quantified appropriately. In this work, a new approach is developed by the authors for statistical classification, statistical downscaling, and uncertainty assessment, and is applied to Singapore rainfall. It is a robust Bayesian uncertainty analysis methodology, with accompanying tools, based on coupling dependent modeling errors with classification and statistical downscaling models in such a way that the dependency among modeling errors impacts the results of both classification and statistical downscaling model calibration, as well as the uncertainty analysis for future prediction. Singapore data are considered here, and the uncertainty and prediction results are obtained. From the results obtained, directions of research for improvement are briefly presented.
Keywords: statistical downscaling, global climate model, climate change, uncertainty
Procedia PDF Downloads 371
20156 Influence of the Line Parameters in Transmission Line Fault Location
Authors: Marian Dragomir, Alin Dragomir
Abstract:
In this paper, two fault location algorithms are presented for transmission lines, which use the line parameters to estimate the distance to the fault. The first algorithm uses only the measurements from one end of the line and the positive and zero sequence parameters of the line, while the second one uses the measurements from both ends of the line and only the positive sequence parameters of the line. The algorithms were tested using a transmission grid implemented in MATLAB. In the first stage, a fault location baseline was established, in which the algorithms mentioned above estimate the fault locations using the exact line parameters. After that, the positive and zero sequence resistance and reactance of the line were calculated again for different ground resistivity values, and the fault locations were then re-estimated in order to compare the results with the baseline results. The results show that the algorithm that uses the zero sequence impedance of the line is the most sensitive to line parameter modifications, while the other algorithm is less sensitive to them.
Keywords: estimation algorithms, fault location, line parameters, simulation tool
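For context, a single-end impedance-based fault location of the kind referred to above can be sketched as follows in Python (simple reactance method with zero-sequence compensation); the phasors and line constants are made-up example values, not the paper's test grid:

```python
import numpy as np

# Illustrative single-end, single-phase-to-ground fault location (simple reactance method).
z1 = complex(0.03, 0.30)          # positive-sequence impedance, ohm/km (assumed)
z0 = complex(0.25, 1.00)          # zero-sequence impedance, ohm/km (assumed)
k0 = (z0 - z1) / (3 * z1)         # zero-sequence compensation factor

Va = 57_000 * np.exp(1j * np.deg2rad(-5))      # faulted-phase voltage phasor, V
Ia = 1_800 * np.exp(1j * np.deg2rad(-65))      # faulted-phase current phasor, A
Ires = 1_500 * np.exp(1j * np.deg2rad(-70))    # residual current 3*I0, A

# Apparent impedance seen by the relay, compensated for zero-sequence coupling.
Z_app = Va / (Ia + k0 * Ires)

# Reactance method: ignore fault resistance and use only the imaginary part.
distance_km = Z_app.imag / z1.imag
print(f"Estimated distance to fault: {distance_km:.1f} km")
```

The compensation factor k0 depends on the zero-sequence impedance, which is the quantity most affected by ground resistivity, consistent with the higher sensitivity reported above for the one-end algorithm.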
Procedia PDF Downloads 356
20155 Structural Damage Detection via Incomplete Model Data Using Output Data Only
Authors: Ahmed Noor Al-qayyim, Barlas Özden Çağlayan
Abstract:
Structural failure is caused mainly by damage that often occurs in structures. Many researchers focus on obtaining very efficient tools to detect damage in structures at an early stage. In the past decades, a subject that has received considerable attention in the literature is damage detection based on variations in the dynamic characteristics or response of structures. This study presents a new damage identification technique. The technique detects the damage location for an incomplete structural system using output data only. The method identifies the damage from free vibration test data by using the 'Two Points Condensation (TPC)' technique. This method creates a set of matrices by reducing the structural system to two-degree-of-freedom systems. The current stiffness matrices are obtained from optimization of the equation of motion using the measured test data. The current stiffness matrices are compared with the original (undamaged) stiffness matrices. High percentage changes in the matrix coefficients indicate the location of the damage. The TPC technique is applied to the experimental data of a simply supported steel beam model structure after inducing a thickness change in one element. Two cases are considered, and the method detects the damage and determines its location accurately in both. In addition, the results illustrate that these changes in the stiffness matrix can be a useful tool for continuous monitoring of structural safety using ambient vibration data. Furthermore, its efficiency proves that this technique can also be used for large structures.
Keywords: damage detection, optimization, signals processing, structural health monitoring, two points condensation
Procedia PDF Downloads 365
20154 A Reliable Multi-Type Vehicle Classification System
Authors: Ghada S. Moussa
Abstract:
Vehicle classification is an important task in traffic surveillance and intelligent transportation systems. Classification of vehicle images faces several problems, such as high intra-class vehicle variation, occlusion, shadow, and illumination. These problems and others must be considered to develop a reliable vehicle classification system. In this study, a reliable multi-type vehicle classification system based on the Bag-of-Words (BoW) paradigm is developed. The proposed system uses and compares four well-known classifiers, Linear Discriminant Analysis (LDA), Support Vector Machine (SVM), k-Nearest Neighbour (KNN), and Decision Tree, to classify vehicles into four categories: motorcycles, small, medium, and large. Experiments on a large dataset show that our approach is efficient and reliable in classifying vehicles, with an accuracy of 95.7%. The SVM outperforms the other classification algorithms in terms of both accuracy and robustness, alongside a considerable reduction in execution time. The innovation of the developed system is that it can serve as a framework for many vehicle classification systems.
Keywords: vehicle classification, bag-of-words technique, SVM classifier, LDA classifier, KNN classifier, decision tree classifier, SIFT algorithm
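A minimal Python sketch of a Bag-of-Words vehicle classifier of the kind described above: local descriptors are quantized against a KMeans vocabulary and the resulting histograms feed an SVM. ORB is used here as a freely available stand-in for the SIFT detector named in the keywords, and the vocabulary size and input images are assumptions:

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def bow_histogram(img_gray, detector, vocab):
    """Quantize an image's local descriptors against the visual vocabulary."""
    _, desc = detector.detectAndCompute(img_gray, None)
    hist = np.zeros(vocab.n_clusters)
    if desc is not None and len(desc) > 0:
        for w in vocab.predict(desc.astype(np.float64)):
            hist[w] += 1
        hist /= hist.sum()
    return hist

def train_bow_svm(images, labels, k=100):
    """images: grayscale vehicle crops; labels: 0..3 for motorcycle/small/medium/large."""
    detector = cv2.ORB_create()          # stand-in for the SIFT detector named in the keywords
    all_desc = []
    for img in images:
        _, d = detector.detectAndCompute(img, None)
        if d is not None:
            all_desc.append(d.astype(np.float64))
    vocab = KMeans(n_clusters=k, n_init=10, random_state=0).fit(np.vstack(all_desc))
    X = np.array([bow_histogram(img, detector, vocab) for img in images])
    clf = SVC(kernel="rbf").fit(X, labels)
    return detector, vocab, clf
```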
Procedia PDF Downloads 359
20153 Estimation of Slab Depth, Column Size and Rebar Location of Concrete Specimen Using Impact Echo Method
Authors: Y. T. Lee, J. H. Na, S. H. Kim, S. U. Hong
Abstract:
In this study, experimental research on the estimation of slab depth, column size, and rebar location of concrete specimens is conducted using the Impact Echo (IE) method, a stress-wave-based method among the non-destructive test methods. The slab specimens for depth estimation had plan dimensions of 1800×300 mm and six different depths: 150 mm, 180 mm, 210 mm, 240 mm, 270 mm, and 300 mm. The concrete column specimens were manufactured in three sizes: 300×300×300 mm, 400×400×400 mm, and 500×500×500 mm. For the rebar-location case, ∅22 mm rebar was used in a 300×370×200 mm specimen and arranged at 130 mm and 150 mm from the top surface to the top of the rebar. As a result, the error rate for slab depth was 3.1% on average, and the error rate for column size was 1.7% on average. The mean error rate for rebar location was 1.72% for the top, 1.19% for the bottom, and 1.5% overall, showing relative accuracy.
Keywords: impact echo method, estimation, slab depth, column size, rebar location, concrete
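A minimal Python sketch of the Impact Echo depth estimate, T = Cp / (2 f), applied to a synthetic response; the wave speed, sampling rate, and signal are illustrative values, not the study's measurements (in practice a shape factor of roughly 0.96 is often applied as well):

```python
import numpy as np

cp = 4000.0                      # assumed P-wave velocity in concrete, m/s
fs = 500_000                     # assumed sampling rate of the impact response, Hz

# Synthetic thickness-mode response of a 240 mm slab, for demonstration only.
t = np.arange(0, 0.002, 1 / fs)
f_true = cp / (2 * 0.24)
signal = np.sin(2 * np.pi * f_true * t) * np.exp(-t * 2000)

# Dominant frequency from the amplitude spectrum (skip the DC bin).
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, 1 / fs)
f_peak = freqs[np.argmax(spectrum[1:]) + 1]

thickness = cp / (2 * f_peak)    # Impact Echo relation T = Cp / (2 f)
print(f"Peak frequency {f_peak:.0f} Hz -> estimated depth {thickness * 1000:.0f} mm")
```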
Procedia PDF Downloads 351
20152 A Study of the Adaptive Reuse for School Land Use Strategy: An Application of the Analytic Network Process and Big Data
Authors: Wann-Ming Wey
Abstract:
With today's popularity and progress of information technology, big data sets and their analysis are no longer a major conundrum. We can now not only use relevant big data to analyse and simulate the possible status of urban development in the near future, but also provide a more comprehensive and reasonable basis for policy implementation for government units or decision-makers via the analysis and simulation results mentioned above. In this research, we take Taipei City as the research scope and use relevant big data variables (e.g., population, facility utilization, and related social policy ratings) together with the Analytic Network Process (ANP) approach to conduct in-depth research and discussion on the possible reduction of land use in primary and secondary schools of Taipei City. In addition to enhancing urban activities through urban public facility utilization, the final results of this research could help improve the efficiency of urban land use in the future. Furthermore, the assessment model and research framework established in this research also provide a good reference for the land use and adaptive reuse strategies of schools or other public facilities in the future.
Keywords: adaptive reuse, analytic network process, big data, land use strategy
Procedia PDF Downloads 204
20151 Application of Generalized Autoregressive Score Model to Stock Returns
Authors: Katleho Daniel Makatjane, Diteboho Lawrence Xaba, Ntebogang Dinah Moroke
Abstract:
The current study investigates the behaviour of time-varying parameters that are based on the score function of the predictive model density at time t. The mechanism used to update the parameters over time is the scaled score of the likelihood function. The results revealed high persistence of the time-varying parameters, as the location parameter is high, and the skewness parameter implied a departure of the scale parameter from normality, with the unconditional parameter being 1.5. The results also revealed persistence of the leptokurtic behaviour in stock returns, which implies that the returns are heavy-tailed. Prior to model estimation, the White Neural Network test showed that the stock price can be modelled by a GAS model. Finally, we propose further research, specifically to model the existence of time-varying parameters with a more detailed model that accounts for the heavy-tailed distribution of the series and computes the risk measure associated with the returns.
Keywords: generalized autoregressive score model, South Africa, stock returns, time-varying
Procedia PDF Downloads 502
20150 3rd Generation Modular Execution: A Global Breakthrough in Modular Facility Construction System
Authors: Sean Bryner S. Rey, Eric Tanjutco
Abstract:
Modular execution strategies are adopted to address the various challenges of a project and are implemented in each project phase, covering Engineering, Procurement, Fabrication, and Construction. Only in recent years was the intent to surpass the mechanical modularization approach conceptualized, to give a solution to the much greater demands of project components such as site location and adverse weather conditions, material sourcing, construction schedule, safety risks, and overall plot layout and allocation. The intent of this paper is to introduce 3rd Generation Modular Execution with an overview of its advantages for project execution, with emphasis on Engineering, Construction, Operation, and Maintenance. Most importantly, the paper presents the key differentiator of 3rd Gen modular execution against conventional project execution and the merits it bears for the industry.
Keywords: 3rd generation modular, process block, construction, operation & maintenance
Procedia PDF Downloads 475
20149 Skin-Dose Mapping for Patients Undergoing Interventional Radiology Procedures: Clinical Experimentations versus a Mathematical Model
Authors: Aya Al Masri, Stefaan Carpentier, Fabrice Leroy, Thibault Julien, Safoin Aktaou, Malorie Martin, Fouad Maaloul
Abstract:
Introduction: During an Interventional Radiology (IR) procedure, the patient's skin dose may become high enough for burns, necrosis, and ulceration to appear. In order to prevent these deterministic effects, an accurate calculation of the patient skin-dose mapping is essential. For most machines, the Dose Area Product (DAP) and fluoroscopy time are the only information available to the operator. These two parameters are a very poor indicator of the peak skin dose. We developed a mathematical model that reconstructs the magnitude (delivered dose), shape, and localization of each irradiation field on the patient's skin. If a critical dose is exceeded, the system generates warning alerts. We present the results of its comparison with clinical studies. Materials and methods: Two series of comparisons of the skin-dose mapping of our mathematical model with clinical studies were performed. 1. First, clinical tests were performed on patient phantoms. Gafchromic films were placed on the table of the IR machine under PMMA plates (thickness = 20 cm) that simulate the patient. After irradiation, the film darkening is proportional to the radiation dose received by the patient's back and reflects the shape of the X-ray field. After film scanning and analysis, the exact dose value can be obtained at each point of the mapping. Four experiments were performed, constituting a total of 34 acquisition incidences covering all possible exposure configurations. 2. Second, clinical trials were launched on real patients during real Chronic Total Occlusion (CTO) procedures, for a total of 80 cases. Gafchromic films were placed at the backs of the patients. We performed comparisons of the dose values, as well as the distribution and shape of the irradiation fields, between the skin-dose mapping of our mathematical model and the Gafchromic films. Results: The comparison between the dose values shows a difference of less than 15%. Moreover, our model shows very good geometric accuracy: all fields have the same shape, size, and location (uncertainty < 5%). Conclusion: This study shows that our model is a reliable tool to warn physicians when a high radiation dose is reached. Thus, deterministic effects can be avoided.
Keywords: clinical experimentation, interventional radiology, mathematical model, patient's skin-dose mapping
Procedia PDF Downloads 141
20148 Wireless Sensor Network for Forest Fire Detection and Localization
Authors: Tarek Dandashi
Abstract:
WSNs may provide a fast and reliable solution for the early detection of environmental events like forest fires. This is crucial for alerting and calling for fire brigade intervention. Sensor nodes communicate sensor data to a host station, which enables a global analysis and the generation of a reliable decision on a potential fire and its location. A WSN based on TinyOS and nesC is presented for the capture and transmission of a variety of sensor information with controlled source, data rates, and duration, and for the recording and display of activity traces. We propose a similarity distance (SD) between the distribution of currently sensed data and that of a reference. At any given time, a fire causes diverging opinions in the reported data, which alters the usual data distribution. Basically, SD consists of a metric on the cumulative distribution function (CDF). SD is designed to be invariant to day-to-day changes in temperature, changes due to the surrounding environment, and normal changes in weather, which preserve data locality. Evaluation shows that SD sensitivity is quadratic with respect to an increase in sensor node temperature for groups of sensors of different sizes and neighborhoods. Simulation of fire spreading, with ignition placed at random locations and some wind speed, shows that SD takes a few minutes to reliably detect fires and locate them. We also discuss the cases of false negatives and false positives and their impact on decision reliability.
Keywords: forest fire, WSN, wireless sensor network, algorithm
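The abstract does not give the exact form of SD, so the following Python sketch uses one possible CDF-based metric (the sup-norm between empirical CDFs, a Kolmogorov-Smirnov-style statistic) purely to illustrate the idea; the temperature samples and alarm threshold are invented:

```python
import numpy as np

def cdf_distance(current, reference):
    """Sup-norm distance between the empirical CDFs of two temperature samples
    (a Kolmogorov-Smirnov-style statistic), one possible CDF-based metric."""
    grid = np.sort(np.concatenate([current, reference]))
    cdf_cur = np.searchsorted(np.sort(current), grid, side="right") / current.size
    cdf_ref = np.searchsorted(np.sort(reference), grid, side="right") / reference.size
    return float(np.max(np.abs(cdf_cur - cdf_ref)))

rng = np.random.default_rng(2)
reference = rng.normal(25.0, 2.0, size=500)         # readings under normal conditions

normal_now = rng.normal(25.0, 2.0, size=50)         # ordinary reporting round
fire_now = np.concatenate([rng.normal(25.0, 2.0, size=30),
                           rng.normal(60.0, 5.0, size=20)])  # part of the cluster sees a hot spot

threshold = 0.25                                     # illustrative alarm threshold
for name, sample in [("normal", normal_now), ("fire", fire_now)]:
    d = cdf_distance(sample, reference)
    print(f"{name:>6}: SD = {d:.2f} -> {'ALARM' if d > threshold else 'ok'}")
```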
Procedia PDF Downloads 263
20147 HBTOnto: An Ontology Model for Analyzing Human Behavior Trajectories
Authors: Heba M. Wagih, Hoda M. O. Mokhtar
Abstract:
Social networks have recently played a significant role in both scientific and social communities. The growing adoption of social network applications has made them a relevant source of information nowadays. Due to this popularity, several research trends have emerged to serve the huge volume of users, including Location-Based Social Networks (LBSN), recommendation systems, sentiment analysis applications, and many others. LBSN applications are among the most highly demanded applications; they focus not only on analyzing the spatiotemporal positions in a given raw trajectory but also on understanding the semantics behind the dynamics of the moving object. LBSNs are a possible means of predicting human mobility based on users' social ties as well as their spatial preferences. LBSNs rely on the efficient representation of users' trajectories. Hence, traditional raw trajectory information is no longer convenient. In our research, we focus on studying human behavior trajectories, which are the major pillar of location recommendation systems. In this paper, we propose ontology design patterns, with their underlying description logics, to efficiently annotate human behavior trajectories.
Keywords: human behavior trajectory, location-based social network, ontology, social network
Procedia PDF Downloads 453
20146 Approach for the Mathematical Calculation of the Damping Factor of Railway Bridges with Ballasted Track
Authors: Andreas Stollwitzer, Lara Bettinelli, Josef Fink
Abstract:
The expansion of the high-speed rail network over the past decades has resulted in new challenges for engineers, including traffic-induced resonance vibrations of railway bridges. Excessive resonance-induced, speed-dependent accelerations of railway bridges during high-speed traffic can lead to negative consequences such as fatigue symptoms, distortion of the track, destabilisation of the ballast bed, and potentially even derailment. A realistic prognosis of bridge vibrations during high-speed traffic must not only rely on the right choice of an adequate calculation model for both bridge and train but, first and foremost, on the use of dynamic model parameters which reflect reality appropriately. However, comparisons between measured and calculated bridge vibrations are often characterised by considerable discrepancies, with dynamic calculations overestimating the actual responses and therefore leading to uneconomical results. This gap between measurement and calculation constitutes a complex research issue and can be traced to several causes. One major cause is found in the dynamic properties of the ballasted track, more specifically in the persisting, substantial uncertainties regarding the consideration of the ballasted track (mechanical model and input parameters) in dynamic calculations. Furthermore, the discrepancy is particularly pronounced concerning the damping values of the bridge, as conservative values have to be used in the calculations due to normative specifications and lack of knowledge. By using a large-scale test facility, the analysis of the dynamic behaviour of ballasted track has been a major research topic at the Institute of Structural Engineering/Steel Construction at TU Wien in recent years. This highly specialised test facility is designed for isolated research of the ballasted track's dynamic stiffness and damping properties, independent of the bearing structure. Several mechanical models for the ballasted track consisting of one or more continuous spring-damper elements were developed based on the knowledge gained. These mechanical models can subsequently be integrated into bridge models for dynamic calculations. Furthermore, based on measurements at the test facility, model-dependent stiffness and damping parameters were determined for these mechanical models. As a result, realistic mechanical models of the railway bridge with different levels of detail and sufficiently precise characteristic values are available for bridge engineers. Besides that, this contribution also presents another practical application of such a bridge model: based on the bridge model, determination equations for the damping factor (expressed as Lehr's damping factor) can be derived. This approach constitutes a first-time method that makes the damping factor of a railway bridge calculable. A comparison of this mathematical approach with measured dynamic parameters of existing railway bridges illustrates, on the one hand, the apparent deviation between normatively prescribed and in-situ measured damping factors. On the other hand, it is also shown that the new approach, which makes it possible to calculate the damping factor, provides results that are close to reality and thus offers potential for minimising the discrepancy between measurement and calculation.
Keywords: ballasted track, bridge dynamics, damping, model design, railway bridges
Procedia PDF Downloads 164
20145 Developing a Spatial Transport Model to Determine Optimal Routes When Delivering Unprocessed Milk
Authors: Sunday Nanosi Ndovi, Patrick Albert Chikumba
Abstract:
In Malawi, smallholder dairy farmers transport unprocessed milk to sell at Milk Bulking Groups (MBGs). MBGs store and chill the milk while awaiting collection by processors. The farmers deliver milk using various modes of transportation, such as foot, bicycle, and motorcycle. As a perishable food, milk requires timely transportation to avoid deterioration. In other instances, some farmers bypass the nearest MBGs for facilities located further away. Untimely delivery worsens quality and results in rejection at the MBG. Subsequently, these rejections lead to revenue losses for dairy farmers. Therefore, the objective of this study was to optimize routes when transporting milk by selecting the shortest route, using time as a cost attribute, in Geographic Information Systems (GIS). A spatially organized transport system impedes milk deterioration while promoting profitability for dairy farmers. A transportation system was modeled using the Route Analysis and Closest Facility network extensions. The final output was to find the quickest routes and identify the nearest milk facilities from incident locations. Face-to-face interviews targeted leaders from all 48 MBGs in the study area and 50 farmers from Namahoya MBG. During field interviews, coordinates were captured in order to create maps. Subsequently, the maps supported the selection of optimal routes based on the least travel times. The questionnaire targeted 200 respondents, of whom 182 were available. Findings showed that out of the 50 sampled farmers who supplied milk to Namahoya, only 8% were nearest to the facility, while 92% were closest to 9 different MBGs. Delivering milk to the nearest MBGs would reduce travel time and distance by 14.67 hours and 73.37 km, respectively.
Keywords: closest facility, milk, route analysis, spatial transport
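A minimal Python sketch of the closest-facility/quickest-route idea, using networkx in place of the GIS network extensions used in the study; the nodes and travel times are invented:

```python
import networkx as nx

# Road network as a graph; edge weights are travel times in minutes (illustrative values).
# Nodes "farm_*" are farms, "MBG_*" are milk bulking groups.
G = nx.Graph()
G.add_weighted_edges_from([
    ("farm_1", "junction_a", 12), ("junction_a", "MBG_north", 20),
    ("junction_a", "junction_b", 8), ("junction_b", "MBG_south", 9),
    ("farm_1", "junction_b", 25), ("farm_2", "junction_b", 6),
], weight="time")

def closest_mbg(graph, farm, facilities):
    """Closest facility: the MBG with the smallest travel time, plus the quickest route."""
    times = {m: nx.dijkstra_path_length(graph, farm, m, weight="time") for m in facilities}
    best = min(times, key=times.get)
    return best, times[best], nx.dijkstra_path(graph, farm, best, weight="time")

mbg, minutes, route = closest_mbg(G, "farm_1", ["MBG_north", "MBG_south"])
print(f"farm_1 -> {mbg} in {minutes} min via {route}")
```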
Procedia PDF Downloads 58
20144 Findings from an Access Improvement Project for Antiretroviral Therapy Uptake through Traditional Birth Attendants at Mother Theresa Hospital, Lagos, Nigeria
Authors: Daniel Afolayan, Christina Olawepo, Francis Olowookanga, Nguhemen Tingir, Olawale Fadare, John Oko
Abstract:
In Nigeria, traditional birth attendants (TBAs) can play an important role in the prevention of mother-to-child transmission of HIV. However, their role in improving access to antiretroviral therapy (ART) is unclear. Catholic Caritas Foundation of Nigeria (Caritas Nigeria) is an implementing agency supporting increased access to HIV testing and treatment services in Lagos State through health facilities, including Mother Theresa Hospital. Despite intra-facility testing and community outreaches, ART uptake at Mother Theresa Hospital, Lagos, was low, with 6 individuals on antiretroviral drugs 3 months post-activation. This study explored improving access to ART through linkages with TBAs for ART uptake at the facility. The Plan-Do-Study-Act model was used. The goal was to improve uptake of ART from 6 to 80 individuals in 5 months (the end of the project year). Scanning revealed a network of 15 TBAs with potential as satellites for HIV testing. Caritas Nigeria linked the facility with these 15 TBAs, who were provided with HIV test kits and trained on HIV testing services for provider-initiated testing and outreaches. Weekly reports and referrals of positives were received and tracked, and feedback was given on testing yield. These TBAs serve individuals of various ages and genders at their trado-medical centres. At the end of 5 months, HIV testing increased by 10,575 (78% from TBAs) and HIV positives obtained improved by 77 (44.2% from TBAs). 55 new individuals were enrolled and commenced on ART (61.8% from TBAs). There was successful linkage of all clients with escort services due to incentives. Total uptake of ART was 61 (76.3% of the target). Structured partnerships between TBAs and HIV care and treatment centers should be strengthened to improve access to ART.
Keywords: access improvement, antiretroviral therapy, traditional birth attendants, uptake
Procedia PDF Downloads 460
20143 Operating Speed Models on Tangent Sections of Two-Lane Rural Roads
Authors: Dražen Cvitanić, Biljana Maljković
Abstract:
This paper presents models for predicting operating speeds on tangent sections of two-lane rural roads, developed from continuous speed data. The data correspond to 20 drivers of different ages and driving experience, driving their own cars along an 18 km long section of a state road. The data were first used to determine the maximum operating speeds on tangents and to compare them with speeds in the middle of the tangents, i.e., the speed data used in most operating speed studies. Analysis of the continuous speed data indicated that spot speed data are not reliable indicators of the relevant speeds. After that, operating speed models for tangent sections were developed. There was no significant difference between models developed using speed data in the middle of tangent sections and models developed using maximum operating speeds on tangent sections. All developed models have a higher coefficient of determination than models developed from spot speed data. Thus, it can be concluded that the measurement method has a more significant impact on the quality of an operating speed model than the location of measurement.
Keywords: operating speed, continuous speed data, tangent sections, spot speed, consistency
Procedia PDF Downloads 452
20142 Using Knowledge Management and Visualisation Concepts to Improve Patients and Hospitals Staff Workflow
Authors: A. A. AlRasheed, A. Atkins, R. Campion
Abstract:
This paper focuses on using knowledge management and visualisation concepts to improve patients' and hospital employees' workflow. Hospital workflow is a complex and complicated process, and poor patient flow can put both patients and a hospital's reputation at risk and can threaten the facility's financial sustainability. Healthcare leaders are under increased pressure to reduce costs while maintaining or increasing patient care standards. In this paper, a framework is proposed to help improve patient experience, staff satisfaction, and operational efficiency across hospitals by using knowledge-management-based visualisation concepts. This framework uses real-time visibility to track and monitor the location and status of patients, staff, rooms, and medical equipment.
Keywords: knowledge management, improvements, visualisation, workflow
Procedia PDF Downloads 269
20141 Inverse Heat Transfer Analysis of a Melting Furnace Using Levenberg-Marquardt Method
Authors: Mohamed Hafid, Marcel Lacroix
Abstract:
This study presents a simple inverse heat transfer procedure for predicting the wall erosion and the time-varying thickness of the protective bank that covers the inside surface of the refractory brick wall of a melting furnace. The direct problem is solved using a finite-volume model. The melting/solidification process is modeled using the enthalpy method. The inverse procedure rests on the Levenberg-Marquardt method combined with the Broyden method. The effects of the location of the temperature sensors and of the measurement noise on the inverse predictions are investigated. Recommendations are made concerning the location of the temperature sensors.
Keywords: melting furnace, inverse heat transfer, enthalpy method, Levenberg-Marquardt method
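A minimal Python sketch of the inverse step: a Levenberg-Marquardt least-squares fit recovers the bank thickness from sensor temperatures, using a simple steady 1-D conduction surrogate in place of the study's finite-volume phase-change model; all property values are assumed:

```python
import numpy as np
from scipy.optimize import least_squares

# Surrogate direct problem (illustration only): steady 1-D conduction through the
# protective bank (unknown thickness b) and the refractory wall.
k_bank, k_wall = 1.5, 3.0                    # thermal conductivities, W/(m K) (assumed)
L_wall = 0.30                                 # refractory wall thickness, m (assumed)
T_hot, T_amb, h_out = 1200.0, 30.0, 15.0      # melt temperature, ambient, outer convection

sensor_depths = np.array([0.05, 0.15, 0.25])  # sensor positions inside the wall, m

def sensor_temps(bank_thickness):
    """Temperatures predicted at the sensor locations for a given bank thickness."""
    R_total = bank_thickness / k_bank + L_wall / k_wall + 1.0 / h_out
    q = (T_hot - T_amb) / R_total             # heat flux, W/m^2
    return T_hot - q * (bank_thickness / k_bank + sensor_depths / k_wall)

# Synthetic "measurements" generated with a known bank thickness plus sensor noise.
b_true = 0.08
rng = np.random.default_rng(3)
measured = sensor_temps(b_true) + rng.normal(0.0, 2.0, sensor_depths.size)

# Levenberg-Marquardt step: minimize the residual between predicted and measured temperatures.
result = least_squares(lambda b: sensor_temps(b[0]) - measured, x0=[0.02], method="lm")
print(f"Estimated bank thickness: {result.x[0] * 1000:.1f} mm (true: {b_true * 1000:.0f} mm)")
```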
Procedia PDF Downloads 324