Search results for: uncertainties of measurement
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3062

2462 Radial Distortion Correction Based on the Concept of Verifying the Planarity of a Specimen

Authors: Shih-Heng Tung, Ming-Hsiang Shih, Wen-Pei Sung

Abstract:

Because of the rapid development of digital cameras and computers, the digital image correlation (DIC) method has drawn considerable attention recently and has been applied in a variety of fields. However, image distortion is inevitable when an image is captured through a lens, and this distortion can introduce a non-negligible error into digital image correlation measurements. Many methods already exist to correct image distortion, but most of them require specific image patterns or precise control points. A new distortion correction method is proposed in this study. The proposed method is based on the fact that a flat surface should remain flat when it is measured using a three-dimensional (3D) digital image measurement technique. Lens distortion can be divided into radial distortion, decentering distortion, and thin prism distortion. Because radial distortion has a more noticeable influence than the other types, this method deals only with radial distortion. The simplified 3D digital image measurement technique is adopted to measure the surface coordinates of a flat specimen, and the gradient method is then applied to find the best correction parameters. Several experiments are carried out in this study to verify the correctness of the method. The results show that the method achieves good accuracy and is suitable for both large and small distortion conditions. Its most important advantage is that it requires neither marks with a specific pattern nor precise control points.
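
As an illustration of the planarity-based correction loop described above, the following Python sketch applies a two-parameter radial model and a finite-difference gradient step. It is not the authors' code; the function names, the `reconstruct` callback, and the step-size handling are assumptions for illustration only.

```python
import numpy as np

def undistort(points_px, center, k1, k2):
    """Two-parameter radial correction: x_u = c + (x_d - c)*(1 + k1*r^2 + k2*r^4)."""
    d = np.asarray(points_px, float) - center
    r2 = np.sum(d ** 2, axis=1, keepdims=True)
    return center + d * (1.0 + k1 * r2 + k2 * r2 ** 2)

def planarity_rms(points_3d):
    """RMS out-of-plane residual of the best-fit plane z = a*x + b*y + c."""
    p = np.asarray(points_3d, float)
    A = np.column_stack([p[:, 0], p[:, 1], np.ones(len(p))])
    coef, *_ = np.linalg.lstsq(A, p[:, 2], rcond=None)
    return np.sqrt(np.mean((p[:, 2] - A @ coef) ** 2))

def calibrate(reconstruct, points_px, center, lr, steps=100, eps=1e-9):
    """Finite-difference gradient descent on (k1, k2), using planarity as the cost.
    `reconstruct` maps corrected pixel coordinates of the flat specimen to 3D points."""
    k = np.zeros(2)
    for _ in range(steps):
        cost = planarity_rms(reconstruct(undistort(points_px, center, *k)))
        grad = np.zeros(2)
        for i in range(2):
            kp = k.copy()
            kp[i] += eps
            grad[i] = (planarity_rms(reconstruct(undistort(points_px, center, *kp))) - cost) / eps
        k -= lr * grad  # the step size lr needs tuning to the scale of the residuals
    return k
```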

Keywords: 3D DIC, radial distortion, distortion correction, planarity

Procedia PDF Downloads 551
2461 Building a Lean Construction Body of Knowledge

Authors: Jyoti Singh, Ahmed Stifi, Sascha Gentes

Abstract:

The construction process involves a high level of risk, complexity, and uncertainty, leading to cost and time overruns, customer dissatisfaction, and related problems. Lean construction is important because it is a comprehensive system of tools and concepts focused on moving closer to customer satisfaction by understanding the process, identifying waste, and eliminating it. The proposed work includes identifying knowledge areas from a lean perspective, identifying the lean tools/concepts used in lean construction, and establishing a relationship matrix between knowledge areas and lean tools/concepts, thus developing a lean construction body of knowledge (LCBOK), i.e., a guide to lean construction that aims to provide guidelines for managing individual projects and to help the construction industry minimise waste and maximise value to the customer. In this study, we identified 8 knowledge areas and 62 lean tools/concepts from a lean perspective and found that one tool can help to manage two or more knowledge areas.

Keywords: knowledge areas, lean body matrix, lean construction, lean tools

Procedia PDF Downloads 436
2460 Detection of Patient Roll-Over Using High-Sensitivity Pressure Sensors

Authors: Keita Nishio, Takashi Kaburagi, Yosuke Kurihara

Abstract:

Recent advances in medical technology have served to increase average life expectancy. However, the total time for which patients are prescribed complete bed rest has also increased. Because patients who maintain a constant lying posture for long periods are at risk of pressure ulcers (bedsores), the development of a system to detect patient roll-over becomes imperative. For this purpose, extant studies have proposed the use of cameras, and favorable results have been reported. Continuous on-camera monitoring, however, tends to violate patient privacy. We have previously proposed an unconstrained bio-signal measurement system that can detect body motion during sleep without violating the patient's privacy. In this study, we therefore propose a roll-over detection method based on the data obtained from this bio-signal measurement system. Signals recorded by the sensor are assumed to comprise respiration, pulse, body-motion, and noise components. Compared with the respiration and pulse components, the body-motion component generates large vibrations during a roll-over; thus, analysis of the body-motion component facilitates detection of roll-over events. The large vibration associated with roll-over strongly affects the Root Mean Square (RMS) value of the body-motion component, calculated over short 10 s segments of the time series. The RMS value of each segment was compared to a threshold value set in advance; if the RMS value of any segment exceeded the threshold, the corresponding data were considered to indicate the occurrence of a roll-over. In order to validate the proposed method, we conducted an experiment. A bi-directional microphone was adopted as a high-sensitivity pressure sensor and was placed between the mattress and the bed frame. Recorded signals passed through an analog band-pass filter (BPF) operating over the 0.16-16 Hz bandwidth, which allows the respiration, pulse, and body-motion components to pass while removing the noise component. The output of the BPF was A/D converted at a sampling frequency of 100 Hz, and the measurement time was 480 seconds. The numbers of subjects and data sets were 5 and 10, respectively. Subjects lay on a mattress in the supine position. During data measurement, subjects were asked, upon the investigator's instruction, to roll over into four different positions: supine to left lateral, left lateral to prone, prone to right lateral, and right lateral to supine. Recorded data were divided into 48 segments of 10 s each, and the corresponding RMS value for each segment was calculated. The system was evaluated by the agreement between the investigator's instructions and the detected segments, and an accuracy of 100% was achieved. While reviewing the time series of recorded data, segments indicating roll-over were observed to have large amplitudes; however, clear differences between the decubitus position and the roll-over motion could not be confirmed. Extant research has the disadvantage of compromising patient privacy, whereas the proposed method detects patient roll-over precisely without violating it. As a future prospect, estimation of the decubitus position before and after roll-over could be attempted. Since clear differences between the decubitus position and the roll-over motion could not be confirmed in this paper, future studies could utilize the respiration and pulse components.
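
A minimal Python sketch of the RMS-per-segment thresholding step described above (not the authors' implementation); the default threshold rule shown here is an illustrative assumption, since the paper sets the threshold in advance.

```python
import numpy as np

def detect_rollover(body_motion, fs=100, seg_len_s=10, threshold=None):
    """Flag 10 s segments whose RMS exceeds a threshold.

    body_motion : 1-D array, band-pass-filtered body-motion component
    fs          : sampling frequency in Hz (100 Hz in the paper)
    Returns (indices of segments classified as roll-over, per-segment RMS values).
    """
    seg = fs * seg_len_s
    n_seg = len(body_motion) // seg          # 480 s at 100 Hz -> 48 segments
    rms = np.array([np.sqrt(np.mean(body_motion[i * seg:(i + 1) * seg] ** 2))
                    for i in range(n_seg)])
    if threshold is None:                    # illustrative choice; the paper fixes it in advance
        threshold = rms.mean() + 2 * rms.std()
    return np.where(rms > threshold)[0], rms
```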

Keywords: bedsore, high-sensitivity pressure sensor, roll-over, unconstrained bio-signal measurement

Procedia PDF Downloads 121
2459 A Reinforcement Learning Approach for Evaluation of Real-Time Disaster Relief Demand and Network Condition

Authors: Ali Nadi, Ali Edrissi

Abstract:

Relief demand and the availability of transportation links are essential information for any natural disaster operation, but this information is not at hand once a disaster strikes. In related works, relief demand and network condition have been evaluated using prediction methods. Nevertheless, predictions tend to be over- or underestimated due to uncertainties and may lead to a failed operation. Therefore, in this paper a stochastic programming model is proposed to evaluate real-time relief demand and network condition at the onset of a natural disaster. To address the time sensitivity of the emergency response, the proposed model uses reinforcement learning to minimize the total relief assessment time. The proposed model is tested on a real-size network problem. The simulation results indicate that the proposed model performs well in collecting real-time information.
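
A generic tabular Q-learning sketch in Python, included only to illustrate the kind of learning loop the abstract refers to; the state/action encoding, the `step` contract, and all hyperparameters are assumptions, not the authors' formulation.

```python
import numpy as np

def q_learning(n_states, n_actions, step, episodes=500,
               alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    """Generic tabular Q-learning loop.

    `step(s, a)` must return (next_state, reward, done); here the reward would be
    the negative assessment/travel time of visiting a link, so maximising return
    minimises the total relief-assessment time.
    """
    rng = np.random.default_rng(seed)
    Q = np.zeros((n_states, n_actions))
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # epsilon-greedy action selection
            a = int(rng.integers(n_actions)) if rng.random() < eps else int(Q[s].argmax())
            s2, r, done = step(s, a)
            Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
            s = s2
    return Q
```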

Keywords: disaster management, real-time demand, reinforcement learning, relief demand

Procedia PDF Downloads 316
2458 Smart Books as a Supporting Tool for Developing Skills of Designing and Employing Webquest 2.0

Authors: Huda Alyami

Abstract:

The present study aims to measure the effectiveness of an interactive eBook in developing the skills of designing and employing webquests among female intern teachers. The study uses a descriptive analytical methodology as well as a quasi-experimental methodology. The sample consists of (30) female intern teachers from the Department of Special Education (in the tracks of Gifted Education and Learning Difficulties), during the first semester of the academic year 2015, at King Abdul-Aziz University in Jeddah. The sample is divided into (15) female intern teachers for the experimental group and (15) for the control group. A set of qualitative and quantitative tools was prepared and verified for the study: a list of webquest designing skills, a list of webquest employing skills, a webquest knowledge achievement test, a product rating card, an observation card, and an interactive eBook. The study reached the following results: 1. After pre-test control, there are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of the experimental and control groups in the post measurement of the webquest knowledge achievement test, in favor of the experimental group. 2. There are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of the experimental and control groups in the post measurement of the product rating card, in favor of the experimental group. 3. There are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of the experimental and control groups in the post measurement of the observation card, in favor of the experimental group. In light of these findings, the study recommends taking advantage of interactive eBooks when teaching educational courses across disciplines at the university level, and creating educational platforms for sharing interactive eBooks for various disciplines at the local and regional levels. The study also suggests conducting further qualitative studies on the effectiveness of interactive eBooks, in addition to studies on the use of Web 2.0 in webquests.

Keywords: interactive eBook, webquest, design, employing, develop skills

Procedia PDF Downloads 183
2457 Two Component Source Apportionment Based on Absorption and Size Distribution Measurement

Authors: Tibor Ajtai, Noémi Utry, Máté Pintér, Gábor Szabó, Zoltán Bozóki

Abstract:

Beyond its climate- and health-related effects, ambient light-absorbing carbonaceous particulate matter (LAC) has recently become of great scientific interest in terms of its regulation. Recent studies have experimentally demonstrated that LAC is dominated by traffic and wood-burning aerosol, particularly under wintertime urban conditions, when photochemical and biological activities are negligible. Several methods have been introduced to quantitatively apportion the aerosol fractions emitted by wood burning and traffic, but most of them require costly and time-consuming off-line chemical analysis. As opposed to chemical features, the microphysical properties of airborne particles, such as optical absorption and size distribution, can easily be measured on-line, with high accuracy and sensitivity, especially under highly polluted urban conditions. Recently, a method has been proposed for the apportionment of wood-burning and traffic aerosols based on the spectral dependence of their absorption, quantified by the Aerosol Angström Exponent (AAE). In this approach the absorption coefficient is deduced from transmission measurements on a filter-accumulated aerosol sample, and the conversion factors between the measured optical absorption and the corresponding mass concentration (the specific absorption cross sections) are determined by on-site chemical analysis. Recently developed multi-wavelength photoacoustic instruments provide a novel, in-situ approach to the reliable and quantitative characterization of carbonaceous particulate matter and therefore also open up new possibilities for source apportionment through the measurement of light absorption. In this study, we demonstrate an in-situ spectral characterization method for the ambient carbon fraction based on light absorption and size distribution measurements using our state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS) and a Scanning Mobility Particle Sizer (SMPS). The carbonaceous-particulate-selective source apportionment study was performed on ambient particulate matter in the city center of Szeged, Hungary, where the dominance of traffic and wood-burning aerosol has been experimentally demonstrated earlier. The proposed model is based on the parallel, in-situ measurement of optical absorption and size distribution. AAEff and AAEwb were deduced from the measured data using the defined correlation between the AOC(1064 nm)/AOC(266 nm) and N100/N20 ratios, while σff(λ) and σwb(λ) were determined with the help of the independently measured temporal mass concentrations in the PM1 mode. Furthermore, the proposed optical source apportionment is based on the assumption that the light-absorbing fraction of PM is exclusively related to traffic and wood burning. This assumption is indirectly confirmed here by the fact that the measured size distribution is composed of two unimodal size distributions identified as corresponding to traffic and wood-burning aerosols. The method offers the possibility of replacing laborious chemical analysis with a simple in-situ measurement of aerosol size distribution data. The results obtained with the proposed optical-absorption-based source apportionment method prove its applicability whenever measurements are performed at an urban site where traffic and wood burning are the dominant carbonaceous emission sources.
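
The following Python sketch illustrates the two basic ingredients of such an absorption-based apportionment under the usual power-law assumption b_abs ∝ λ^(-AAE): computing an Angström exponent from measurements at two wavelengths, and splitting the measured absorption into two components with prescribed exponents. The wavelengths match those cited in the abstract, but all numerical values and the exponent choices are illustrative assumptions, not the study's results.

```python
import numpy as np

def angstrom_exponent(b1, b2, lam1, lam2):
    """Absorption Angström exponent from b_abs at two wavelengths (b_abs ~ lambda^-AAE)."""
    return -np.log(b1 / b2) / np.log(lam1 / lam2)

def two_component_split(b1, b2, lam1, lam2, aae_ff, aae_wb):
    """Split b_abs(lam1) into fossil-fuel (traffic) and wood-burning parts, assuming
    each component follows its own power law with exponent aae_ff / aae_wb."""
    r_ff = (lam2 / lam1) ** (-aae_ff)
    r_wb = (lam2 / lam1) ** (-aae_wb)
    A = np.array([[1.0, 1.0], [r_ff, r_wb]])
    b_ff, b_wb = np.linalg.solve(A, np.array([b1, b2]))
    return b_ff, b_wb

# Illustrative numbers only: AAE near 1 for traffic soot, near 2 for wood smoke
print(two_component_split(b1=12.0, b2=55.0, lam1=1064, lam2=266, aae_ff=1.0, aae_wb=2.0))
```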

Keywords: absorption, size distribution, source apportionment, wood burning, traffic aerosol

Procedia PDF Downloads 227
2456 Software Assessment Using Ant Colony Optimization Algorithm

Authors: Saad M. Darwish

Abstract:

Recently, software quality issues have come to be seen as an important subject, as we see enormous growth in the number of agencies involved in the software industry. However, these agencies cannot guarantee the quality of their products, thus leaving users in uncertainty. Software certification is an extension of quality assurance in the sense that quality needs to be measured prior to the certification granting process. This research contributes to solving the problem of software assessment by proposing a model for the assessment and certification of software products that uses a fuzzy inference engine to integrate both process-driven and application-driven quality assurance strategies. The key idea of the model at hand is to improve the compactness and interpretability of the model's fuzzy rules by employing an ant colony optimization (ACO) algorithm, which searches for good rule descriptions by means of compound rules built from rules initially expressed as traditional single rules. The model has been tested in a case study, and the results demonstrate its feasibility and practicability in a real environment.
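
A minimal binary ant-colony-optimization sketch in Python, shown only to illustrate a pheromone-guided search over candidate rule subsets; the encoding, update rule, and the `score` function are assumptions and do not reproduce the authors' model.

```python
import numpy as np

def aco_rule_selection(score, n_rules, n_ants=20, n_iter=50, rho=0.1, seed=0):
    """Minimal binary ACO for picking a compact rule subset.

    `score(mask)` returns the quality of a candidate rule set (higher is better),
    e.g. the accuracy of the fuzzy inference engine penalised by the number of rules.
    """
    rng = np.random.default_rng(seed)
    tau = np.full(n_rules, 0.5)               # pheromone = probability of keeping each rule
    best_mask, best_score = None, -np.inf
    for _ in range(n_iter):
        for _ in range(n_ants):
            mask = rng.random(n_rules) < tau  # each ant samples a candidate subset
            s = score(mask)
            if s > best_score:
                best_mask, best_score = mask.copy(), s
        # evaporate, then reinforce the components of the best solution found so far
        tau = (1 - rho) * tau + rho * best_mask
    return best_mask, best_score
```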

Keywords: optimization technique, quality assurance, software certification model, software assessment

Procedia PDF Downloads 487
2455 Study of Human Upper Arm Girth during Elbow Isokinetic Contractions Based on a Smart Circumferential Measuring System

Authors: Xi Wang, Xiaoming Tao, Raymond C. H. So

Abstract:

As a convenient and noninvasive sensing approach, automatic limb girth measurement has been applied to detect the intention behind human motion from muscle deformation. The validity of this sensing approach has been established by preliminary research but still needs more fundamental study, especially for kinetic contraction modes. Based on novel fabric strain sensors, a soft and smart limb girth measurement system was developed by the authors' group, which can measure limb girth in motion. Experiments were carried out on elbow isometric flexion and elbow isokinetic flexion (isokinetic contractions of the biceps) at 90°/s, 60°/s, and 120°/s for 10 subjects (2 canoeists and 8 ordinary people). After removal of the natural circumferential increments due to elbow position, the joint torque was found not to be uniformly sensitive to the limb circumferential strains, but to decline as the elbow joint angle rises, regardless of the angular speed. Moreover, the maximum joint torque was found to be an exponential function of the joint's angular speed. This research contributes to the application of automatic limb girth measurement during kinetic contractions and is useful for predicting the contraction level of voluntary skeletal muscles.

Keywords: fabric strain sensor, muscle deformation, isokinetic contraction, joint torque, limb girth strain

Procedia PDF Downloads 337
2454 Measurement of Turbulence with PITOT Static Tube in Low Speed Subsonic Wind Tunnel

Authors: Gopikrishnan, Bharathiraja, Boopalan, Jensin Joshua

Abstract:

The Pitot static tube has proven its value and practicability in measuring fluid velocity for many years. With the aim of extending the use of such Pitot tube systems, one of the key enabling steps is to design and fabricate a highly sensitive Pitot tube for the purpose of calibrating a subsonic wind tunnel. Wind tunnel calibration is usually carried out with several different instruments measuring a variety of parameters; using too many instruments inside the tunnel may not only affect the fluid flow but also introduce drag or losses. It is therefore desirable to replace the separate systems with a single system that provides all the required information. This highly sensitive Pitot tube has been designed to ease the calibration process: it minimizes the use of different instruments, and this single system is capable of calibrating the wind tunnel test section. The Pitot static tube is fully digitalized so that velocity data can be collected directly from the instrument. Since the turbulence factors depend on velocity, the data collected from the Pitot static tube are processed and the level of turbulence in the flow is calculated. The instrument is also capable of measuring the pressure distribution inside the wind tunnel and the flow angularity of the fluid. Thus, the well-designed, highly sensitive Pitot static tube is used both to calibrate the tunnel and to measure turbulence.
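
A short Python sketch of the standard post-processing implied above: converting the logged differential pressure to velocity via v = sqrt(2Δp/ρ), then estimating turbulence intensity from the velocity fluctuations. The sample values are illustrative only, not data from this instrument.

```python
import numpy as np

def pitot_velocity(delta_p, rho=1.225):
    """Flow speed from the Pitot-static differential pressure: v = sqrt(2*dp/rho)."""
    return np.sqrt(2.0 * np.asarray(delta_p, dtype=float) / rho)

def turbulence_intensity(velocity_samples):
    """Turbulence intensity = RMS of velocity fluctuations / mean velocity."""
    v = np.asarray(velocity_samples, dtype=float)
    return np.std(v) / np.mean(v)

# Illustrative: a short time series of differential pressures (Pa) logged by the probe
dp = np.array([240.0, 255.0, 248.0, 252.0, 245.0])
v = pitot_velocity(dp)
print(v.mean(), turbulence_intensity(v))
```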

Keywords: pitot static tube, turbulence, wind tunnel, velocity

Procedia PDF Downloads 526
2453 Cultural Background as Moderator of the Association Between Personal Bonding Social Capital and Well-Being: An Association Study in a Sample of Dutch and Turkish Older Adults in the Netherlands

Authors: Marianne Simons, Sinan Kurt, Marjolein Stefens, Kai Karos, Johan Lataster

Abstract:

As cultural diversity within the older populations of European countries increases, cultural background should be taken into account in aging studies. Bonding social capital (BSC), comprising a person's socio-emotional resources, is recognised as an important ingredient of wellbeing in old age and has been found to be associated with cultural background. The current study examined the association between BSC, loneliness, and wellbeing in a sample that included older Turkish migrants with a collectivistic cultural background and native Dutch older adults, both living in the Netherlands, which is characterised by an individualistic culture. A sample of 119 Turkish migrants (64.7% male; age 65-87, M(SD)=71.13(5.04)) and 124 native Dutch adults (32.3% male; age 65-94, M(SD)=71.9(5.32)) filled out either an online or a printed questionnaire measuring BSC, psychological, social, and emotional wellbeing, loneliness, and relevant demographic covariates. Regression analysis, including the confounders age, gender, level of education, physical health, and relationship status, showed positive associations between BSC and emotional, social, and psychological wellbeing, respectively, and a negative association with loneliness in both samples. Moderation analyses showed that these associations were significantly stronger for the Turkish older migrants than for their native peers. Measurement invariance analysis indicated partial metric invariance for the measurement of BSC and loneliness and non-invariance for wellbeing, calling for caution when comparing means between samples. The results stress the importance of BSC for the wellbeing of older migrants from collectivistic cultures living in individualistic countries. Previous research shows a trend of older migrants displaying lower levels of BSC as well as of associated variables such as education, physical health, and financial income. This calls for more research into the interplay between demographic and psychosocial factors restraining the mental wellbeing of older migrant populations. The measurement invariance analyses further emphasize the importance of taking cultural background into account in positive aging studies.
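
For readers unfamiliar with the moderation analysis mentioned above, a hedged sketch using the statsmodels formula API is shown below; the file name and column names are hypothetical, and the model is a generic interaction regression rather than the authors' exact specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# df is assumed to hold one row per participant with illustrative column names:
# wellbeing, bsc, group (0 = native Dutch, 1 = Turkish migrant), age, gender,
# education, health, relationship
df = pd.read_csv("bsc_wellbeing.csv")   # hypothetical file

# Moderation: the bsc:group interaction tests whether the BSC-wellbeing association
# differs between cultural groups, controlling for the listed confounders.
model = smf.ols(
    "wellbeing ~ bsc * group + age + gender + education + health + relationship",
    data=df,
).fit()
print(model.summary())
```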

Keywords: positive aging, cultural background, wellbeing, social capital, loneliness

Procedia PDF Downloads 90
2452 Digital Phase Shifting Holography in a Non-Linear Interferometer using Undetected Photons

Authors: Sebastian Töpfer, Marta Gilaberte Basset, Jorge Fuenzalida, Fabian Steinlechner, Juan P. Torres, Markus Gräfe

Abstract:

This work introduces a combination of digital phase-shifting holography with a non-linear interferometer using undetected photons. Non-linear interferometers can be used in combination with a measurement scheme called quantum imaging with undetected photons, which allows the wavelength used for sampling an object to be separated from the wavelength detected by the imaging sensor. This method has recently received increasing attention, as it allows the use of exotic wavelengths (e.g., mid-infrared, ultraviolet) for object interaction while keeping the detection in spectral regions served by highly developed, comparatively low-cost imaging sensors. The object information, including its transmission and phase influence, is recorded in the form of an interferometric pattern. To collect these patterns, this work combines quantum imaging with undetected photons with digital phase-shifting holography using a minimal sampling of the interference. This extends the measurement capabilities of the quantum imaging scheme and brings it one step closer to application. Quantum imaging with undetected photons uses correlated photons generated by spontaneous parametric down-conversion in a non-linear interferometer to create indistinguishable photon pairs, which leads to an effect called induced coherence without induced emission. Placing an object inside the interferometer changes the interferometric pattern depending on the object's properties. Digital phase-shifting holography records multiple images of the interference with predetermined phase shifts in order to reconstruct the complete interference shape, which can afterwards be used to analyze the changes introduced by the object and infer its properties. An extensive characterization of this method was carried out using a proof-of-principle setup. The measured spatial resolution, phase accuracy, and transmission accuracy are compared for different combinations of camera exposure times and numbers of interference sampling steps. The current limits of the method are shown, together with room for further improvement. To summarize, this work presents an alternative holographic measurement method using non-linear interferometers in combination with quantum imaging, enabling new ways of measuring and motivating continued research.
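
As a reference for the phase-shifting step, the textbook four-step reconstruction is sketched below in Python; the abstract's minimal-sampling scheme may differ, so this is only the standard formula, not the authors' algorithm.

```python
import numpy as np

def four_step_phase(i0, i90, i180, i270):
    """Recover phase and modulation from four interferograms shifted by 0, 90, 180, 270 degrees.

    For I_k = A + B*cos(phi + delta_k), the textbook four-step formulas are
    phi = atan2(I270 - I90, I0 - I180) and B = 0.5*sqrt((I270 - I90)^2 + (I0 - I180)^2).
    """
    phase = np.arctan2(i270 - i90, i0 - i180)
    modulation = 0.5 * np.sqrt((i270 - i90) ** 2 + (i0 - i180) ** 2)
    return phase, modulation
```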

Keywords: digital holography, quantum imaging, quantum holography, quantum metrology

Procedia PDF Downloads 92
2451 Comparing the Efficacy of Quantitative Electroencephalogram-Based Neurofeedback Therapy Program versus Organizational Skills Training Program to Reduce the Core Symptoms among Children Group of ADHD

Authors: Radwa R. El-Saadany, Medhat Abu Zeid, Tarek Omar, Marwa S. Maqsoud

Abstract:

Attention deficit/hyperactivity disorder (ADHD) is one of the most common neurodevelopmental disorders, characterized by attention deficit, hyperactivity, and impulsivity. Neurofeedback (NF) is one of the neurotherapy treatments that induce brain wave changes. Method: The current quasi-experimental study with a pre–post-test design was conducted on a population of children with ADHD. The sample comprised (30) children selected by random sampling and assigned to two therapeutic groups and a control group: the first therapeutic group, which received a QEEG-based neurofeedback program, included (10) children; the second therapeutic group, which received an organizational skills training program, included (10) children; and the control group, which received no program, included (10) children. Results: There are significant differences between the pre- and post-assessments in the therapeutic groups in reducing the three core symptoms of ADHD, in favor of the post measurement. There are no significant differences between the post-assessment and the follow-up measurement in the therapeutic groups.

Keywords: QEEG-based neurofeedback therapy program, organizational skills training program, attention deficit hyperactivity disorder

Procedia PDF Downloads 77
2450 Introduction of Integrated Image Deep Learning Solution and How It Brought Laboratorial Level Heart Rate and Blood Oxygen Results to Everyone

Authors: Zhuang Hou, Xiaolei Cao

Abstract:

The general public and medical professionals recognized the importance of accurately measuring and storing blood oxygen levels and heart rate during the COVID-19 pandemic. The demand for accurate contactless devices was motivated by the need to reduce cross-infection and by the shortage of conventional oximeters, partially due to the global supply chain issue. This paper evaluated the heart rate (HR) and oxygen saturation (SpO2) measurements of the contactless mini program HealthyPai in comparison with other wearable devices. In the HR study of 185 samples (81 in the laboratory environment, 104 in the real-life environment), the mean absolute error (MAE) ± standard deviation was 1.4827 ± 1.7452 in the lab and 6.9231 ± 5.6426 in the real-life setting. In the SpO2 study of 24 samples, the MAE ± standard deviation of the measurement was 1.0375 ± 0.7745. Our results validated that HealthyPai, utilizing the Integrated Image Deep Learning Solution (IIDLS) framework, can accurately measure HR and SpO2, providing test quality at least comparable to that of other FDA-approved wearable devices on the market and surpassing consumer-grade and research-grade wearable standards.

Keywords: remote photoplethysmography, heart rate, oxygen saturation, contactless measurement, mini program

Procedia PDF Downloads 134
2449 The Correlation Between Epicardial Fat Pad and Coronary Artery Disease

Authors: Behnam Shakerian, Negin Razavi

Abstract:

The pathogenesis of coronary artery disease is multifactorial. The epicardial fat pad is a localized fat depot lying between the myocardium and the visceral layer of the pericardium. The mechanisms through which the epicardial fat pad can cause atherosclerosis are complex: the epicardial fat pad can surround the coronary arteries and contributes to the development and progression of coronary artery disease. Methods: We selected 50 patients who underwent coronary angiography for the evaluation of coronary artery disease and whose results were positive for coronary artery disease. All patients underwent an echocardiographic examination after coronary angiography to measure epicardial fat pad thickness. The epicardial fat pad was defined as an echo-free space between the outer wall of the myocardium and the visceral layer of the pericardium. Results: The epicardial fat pad was measured over the right ventricular apex in 46 patients. Sixty-five percent of the studied patients were male. The most common vessel with stenosis was the left anterior descending artery. A significant correlation was observed between epicardial fat pad thickness and the severity of coronary artery disease. Discussion: The epicardial fat pad provides a window on the pathophysiology of cardiovascular diseases. It directly contributes to the development and progression of coronary artery disease by causing inflammation and endothelial damage. Further investigations are needed to determine whether medical treatment can reduce the mass of the epicardial fat pad and help to improve atherosclerosis. Conclusion: Epicardial fat pad measurement could be used as an indicator of coronary atherosclerosis. Therefore, measurement of epicardial fat pad thickness in clinical practice could assist in identifying patients at risk who, if required, should undergo supplementary diagnosis with coronary angiography.

Keywords: epicardial, fat pad, coronary artery disease, echocardiography

Procedia PDF Downloads 161
2448 Feasibility Study of Particle Image Velocimetry in the Muzzle Flow Fields during the Intermediate Ballistic Phase

Authors: Moumen Abdelhafidh, Stribu Bogdan, Laboureur Delphine, Gallant Johan, Hendrick Patrick

Abstract:

This study is part of an ongoing effort to improve the understanding of phenomena occurring during the intermediate ballistic phase, such as muzzle flows. A thorough comprehension of muzzle flow fields is essential for optimizing muzzle device and projectile design. This flow characterization has heretofore been almost entirely limited to local and intrusive measurement techniques, such as pressure measurements using pencil probes. Consequently, the body of quantitative experimental data is limited, as is the number of numerical codes validated in this field. The objective of the work presented here is to demonstrate the applicability of the Particle Image Velocimetry (PIV) technique in the challenging environment of the propellant flow of a .300 Blackout weapon, in order to provide accurate velocity measurements. The key points of a successful PIV measurement are the selection of the particle tracer, the seeding technique, and the tracking characteristics of the particles. We experimentally investigated these points by evaluating the resistance, gas dispersion, laser light reflection, and response to a step change across the Mach disk for five different solid tracers using two seeding methods. To this end, an experimental setup was built consisting of a PIV system, combustion chamber pressure measurement, classical high-speed schlieren visualization, and an aerosol spectrometer; the latter is used to determine the particle size distribution in the muzzle flow. The experimental results demonstrate the ability of PIV to accurately resolve the salient features of the propellant flow, such as the underexpanded jet and vortex rings, as well as the instantaneous velocity field, with maximum centreline velocities of more than 1000 m/s. Furthermore, the unburned particles naturally present in the gas, and solid ZrO₂ particles with a nominal size of 100 nm coated on the propellant powder, are suitable as tracers. The TiO₂ particles intended to act as a tracer, however, surprisingly not only melted but also acted as a combustion accelerator and decreased the number of particles in the propellant gas.
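
A minimal Python sketch of the core PIV evaluation step (FFT-based cross-correlation of two interrogation windows followed by scaling to a velocity); the window sizes, sub-pixel peak refinement, and outlier rejection used in practice are omitted, and the function names are illustrative rather than the processing chain of this study.

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Pixel displacement between two interrogation windows via FFT cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    corr = np.fft.fftshift(corr)                      # zero displacement at the centre
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    dy = peak[0] - corr.shape[0] // 2
    dx = peak[1] - corr.shape[1] // 2
    return dx, dy

def piv_velocity(d_px, dt_s, scale_m_per_px):
    """Convert a pixel displacement into a velocity, given inter-frame time and magnification."""
    return d_px * scale_m_per_px / dt_s
```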

Keywords: intermediate ballistic, muzzle flow fields, particle image velocimetry, propellant gas, particle size distribution, under expanded jet, solid particle tracers

Procedia PDF Downloads 161
2447 Uncertainty of the Brazilian Earth System Model for Solar Radiation

Authors: Elison Eduardo Jardim Bierhals, Claudineia Brazil, Deivid Pires, Rafael Haag, Elton Gimenez Rossini

Abstract:

This study evaluated the uncertainties involved in the solar radiation projections generated by the Brazilian Earth System Model (BESM) of the Weather and Climate Prediction Center (CPTEC), which belongs to the Coupled Model Intercomparison Project Phase 5 (CMIP5), with the aim of assessing the quality of the model's solar radiation projections and thereby establishing the viability of its use. Two scenarios elaborated by the Intergovernmental Panel on Climate Change (IPCC) were evaluated: RCP 4.5 (with more optimistic boundary conditions) and RCP 8.5 (with more pessimistic initial conditions). The methods used to verify the accuracy of the model were the Nash–Sutcliffe coefficient and the statistical bias, as they better represent these atmospheric patterns. BESM showed a tendency to overestimate solar radiation in most regions of the state of Rio Grande do Sul, and under the validation methods adopted by this study, BESM did not achieve satisfactory accuracy.
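
The two verification metrics named above can be written compactly; a small Python sketch under the usual definitions is given below (illustrative only, not the study's processing scripts).

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash–Sutcliffe efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    A value of 1 is a perfect match; values <= 0 mean the model does no better than
    simply using the observed mean."""
    obs = np.asarray(observed, float)
    sim = np.asarray(simulated, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def bias(observed, simulated):
    """Mean bias; positive values indicate that the model overestimates the observations."""
    return np.mean(np.asarray(simulated, float) - np.asarray(observed, float))
```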

Keywords: climate changes, projections, solar radiation, uncertainty

Procedia PDF Downloads 250
2446 Prototype of Over Dimension Over Loading (ODOL) Freight Transportation Monitoring System Based on Arduino Mega 'Sabrang': A Case Study in Klaten, Indonesia

Authors: Chairul Fajar, Muhammad Nur Hidayat, Muksalmina

Abstract:

The issue of Over Dimension Over Loading (ODOL) in Indonesia remains a significant challenge, causing traffic accidents, disrupting traffic flow, accelerating road damage, and potentially leading to bridge collapses. Klaten Regency, located on the slopes of Mount Merapi along the Woro River in Kemalang District, has deposits of Class C excavation materials such as sand and stone. Data from the Klaten Regency Transportation Department indicate that ODOL violations account for 72% of freight vehicles, while non-violating vehicles make up only 28%. ODOL involves modifying factory-standard vehicles beyond the limits specified in the Type Test Registration Certificate (SRUT) in order to save costs and travel time. This study aims to develop a prototype 'Sabrang' monitoring system based on the Arduino Mega to control and monitor ODOL freight transportation in the mining of Class C excavation materials in Klaten Regency. The prototype is designed to automatically measure the dimensions and weight of objects using a microcontroller. The data analysis techniques used in this study are the normality test and the paired t-test, comparing the sensor measurements of scaled objects with reference values. The results indicate differences in measurement validity between room-temperature and ambient-temperature conditions. Measurements at room temperature showed that H0 was mostly accepted, meaning there was no significant difference in the measurements when the prototype was used; conversely, measurements at ambient temperature showed that H0 was mostly rejected, indicating a significant difference in the measurements when the prototype was used. In conclusion, the 'Sabrang' monitoring system prototype is effective for controlling ODOL, although the measurement results are influenced by temperature conditions. This study is expected to assist in the monitoring and control of ODOL, thereby enhancing traffic safety and protecting road infrastructure.
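
A small Python/SciPy sketch of the normality check and paired t-test workflow described above; the reference and prototype readings shown are hypothetical numbers, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical readings: reference dimensions vs. the 'Sabrang' prototype sensor (cm)
reference = np.array([30.0, 45.0, 60.0, 75.0, 90.0, 105.0])
prototype = np.array([30.4, 44.6, 60.9, 74.3, 91.2, 104.1])

# Normality check on the paired differences, then the paired t-test
diff = prototype - reference
print(stats.shapiro(diff))                 # Shapiro–Wilk normality test of the differences
t_stat, p_value = stats.ttest_rel(prototype, reference)
print(t_stat, p_value)                     # p >= 0.05 -> H0 retained: no significant difference
```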

Keywords: over dimension over loading, prototype, microcontroller, Arduino, normality test, paired t-test

Procedia PDF Downloads 34
2445 Sound Source Localisation and Augmented Reality for On-Site Inspection of Prefabricated Building Components

Authors: Jacques Cuenca, Claudio Colangeli, Agnieszka Mroz, Karl Janssens, Gunther Riexinger, Antonio D'Antuono, Giuseppe Pandarese, Milena Martarelli, Gian Marco Revel, Carlos Barcena Martin

Abstract:

This study presents an on-site acoustic inspection methodology for the quality and performance evaluation of building components. The work focuses on global and detailed sound source localisation, achieved by successively performing acoustic beamforming and sound intensity measurements. A portable experimental setup was developed, consisting of an omnidirectional broadband acoustic source, a microphone array, and a sound intensity probe. Three main acoustic indicators are of interest, namely the sound pressure distribution on the surface of components such as walls, windows, and junctions, the three-dimensional sound intensity field in the vicinity of junctions, and the sound transmission loss of partitions. The measurement data are post-processed and converted into a three-dimensional numerical model of the acoustic indicators with the help of simultaneously acquired geolocation information. The three-dimensional acoustic indicators are then integrated into an augmented reality platform that superimposes them onto a real-time visualisation of the spatial environment. The methodology thus enables a measurement-supported inspection process for buildings and the correction of errors during construction and refurbishment. Two experimental validation cases are shown. The first consists of a laboratory measurement on a full-scale mockup of a room featuring a prefabricated panel installed with controlled defects, such as missing insulation and missing joint sealing material. It is demonstrated that the combined acoustic and augmented reality tool is capable of identifying acoustic leakages arising from the building defects and of assisting in correcting them. The second validation case was performed on a prefabricated room at a near-completion stage in the factory. With the help of the measurement and visualisation tools, the homogeneity of the partition installation was evaluated and leakages from junctions and doors were identified. Furthermore, the integration of acoustic indicators together with thermal and geometrical indicators via the augmented reality platform is shown.
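
A simplified time-domain delay-and-sum beamforming sketch in Python, included only to illustrate the global localisation step; practical array processing typically works in the frequency domain with calibrated steering vectors, so this is an assumption-laden illustration rather than the instrumentation actually used.

```python
import numpy as np

def delay_and_sum_map(signals, mic_xy, fs, grid_xy, c=343.0):
    """Delay-and-sum beamforming power map over a grid of candidate source points.

    signals : (n_mics, n_samples) array of time signals
    mic_xy  : (n_mics, 2) microphone positions in metres
    grid_xy : (n_points, 2) scan points on the inspected surface
    """
    n_mics, n_samples = signals.shape
    power = np.zeros(len(grid_xy))
    for i, p in enumerate(grid_xy):
        dists = np.linalg.norm(mic_xy - p, axis=1)
        delays = np.round((dists - dists.min()) / c * fs).astype(int)
        summed = np.zeros(n_samples)
        for m in range(n_mics):
            # advance each channel by its relative delay so arrivals from p align
            summed[:n_samples - delays[m]] += signals[m, delays[m]:]
        power[i] = np.mean((summed / n_mics) ** 2)
    return power
```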

Keywords: acoustic inspection, prefabricated building components, augmented reality, sound source localization

Procedia PDF Downloads 383
2444 A Generative Pretrained Transformer-Based Question-Answer Chatbot and Phantom-Less Quantitative Computed Tomography Bone Mineral Density Measurement System for Osteoporosis

Authors: Mian Huang, Chi Ma, Junyu Lin, William Lu

Abstract:

Introduction: Bone health has attracted increasing attention recently, and an intelligent question-and-answer (QA) chatbot for osteoporosis is helpful for science popularization. With the development of Generative Pretrained Transformer (GPT) technology, we built an osteoporosis corpus dataset and then fine-tuned LLaMA, a well-known open-source GPT foundation large language model (LLM), on this self-constructed corpus. Evaluated by clinical orthopedic experts, our fine-tuned model outperforms vanilla LLaMA on the osteoporosis QA task in Chinese. Three-dimensional quantitative computed tomography (QCT)-measured bone mineral density (BMD) has come to be regarded as more accurate than DXA for BMD measurement in recent years. We developed an automatic phantom-less QCT (PL-QCT) system that is more efficient for BMD measurement since it requires no external phantom for calibration. Combined with the LLM for osteoporosis, our PL-QCT provides efficient and accurate BMD measurement for our chatbot users. Material and Methods: We built an osteoporosis corpus containing about 30,000 Chinese publications whose titles are related to osteoporosis. The whole process is automated, including crawling literature in .pdf format, localizing text/figure/table regions with a layout segmentation algorithm, and recognizing text with an OCR algorithm. We trained our model by continued pre-training with Low-Rank Adaptation (LoRA, rank=10) to adapt the LLaMA-7B model to the osteoporosis domain; the basic principle is to have the model predict the next word in the text, with the loss defined as the cross-entropy between the predicted and ground-truth words. The experiment was run on a single NVIDIA A800 GPU for 15 days. Our automatic PL-QCT BMD measurement adopts an AI-assisted region-of-interest (ROI) generation algorithm for localizing a vertebra-parallel cylinder in cancellous bone. Because there is no phantom for BMD calibration, we calibrate the ROI BMD against the CT values of the subject's own muscle and fat. Results & Discussion: Clinical orthopaedic experts were invited to design 5 osteoporosis questions in Chinese to evaluate the performance of vanilla LLaMA and our fine-tuned model. Our model outperforms LLaMA on over 80% of these questions, correctly addressing 'Expert Consensus on Osteoporosis', 'QCT for osteoporosis diagnosis', and 'Effect of age on osteoporosis'. Detailed results are shown in the appendix. Future work may involve training a larger LLM on the whole of orthopaedics with more high-quality domain data, or a multi-modal GPT that combines and understands X-ray images and medical text for orthopaedic computer-aided diagnosis. However, GPT models sometimes give unexpected outputs, such as repetitive text or seemingly normal but wrong answers (so-called 'hallucinations'), and even correct answers cannot be considered valid clinical diagnoses in place of clinical doctors. The PL-QCT BMD system provided by Bone's QCT (Bone's Technology (Shenzhen) Limited) achieves a mean absolute error of 0.1448 mg/cm² (spine) and 0.0002 mg/cm² (hip) and linear correlation coefficients of R² = 0.9970 (spine) and R² = 0.9991 (hip), compared to QCT-Pro (Mindways), on 155 patients in a three-center clinical trial in Guangzhou, China. Conclusion: This study builds a Chinese osteoporosis corpus and develops a fine-tuned, domain-adapted LLM as well as a PL-QCT BMD measurement system. Our fine-tuned GPT model shows better capability than the LLaMA model on most of the osteoporosis test questions. Combined with our PL-QCT BMD system, we look forward to providing science popularization and early screening for potential osteoporosis patients.
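
A minimal sketch of LoRA-based continued pre-training with the Hugging Face transformers/peft/datasets stack, reflecting the general recipe described above (LLaMA-7B, rank 10, next-token cross-entropy). The checkpoint name, target modules, file name, and all hyperparameters other than the rank are assumptions, and exact argument names may vary between library versions; this is not the authors' training script.

```python
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "huggyllama/llama-7b"                       # illustrative checkpoint name
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token                      # LLaMA tokenizers ship without a pad token
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA adapters of rank 10, as in the abstract; the target modules are an assumption
model = get_peft_model(model, LoraConfig(r=10, lora_alpha=20,
                                         target_modules=["q_proj", "v_proj"],
                                         task_type="CAUSAL_LM"))

corpus = load_dataset("text", data_files={"train": "osteoporosis_corpus.txt"})["train"]
corpus = corpus.map(lambda x: tok(x["text"], truncation=True, max_length=512),
                    remove_columns=["text"])

# Causal LM objective: cross-entropy between predicted and ground-truth next tokens
trainer = Trainer(
    model=model,
    args=TrainingArguments("llama-osteo-lora", per_device_train_batch_size=4,
                           num_train_epochs=1, learning_rate=2e-4),
    train_dataset=corpus,
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
)
trainer.train()
```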

Keywords: GPT, phantom-less QCT, large language model, osteoporosis

Procedia PDF Downloads 71
2443 Evolving Software Assessment and Certification Models Using Ant Colony Optimization Algorithm

Authors: Saad M. Darwish

Abstract:

Recently, software quality issues have come to be seen as an important subject, as we see enormous growth in the number of agencies involved in the software industry. However, these agencies cannot guarantee the quality of their products, thus leaving users in uncertainty. Software certification is an extension of quality assurance in the sense that quality needs to be measured prior to the certification granting process. This research contributes to solving the problem of software assessment by proposing a model for the assessment and certification of software products that uses a fuzzy inference engine to integrate both process-driven and application-driven quality assurance strategies. The key idea of the model at hand is to improve the compactness and interpretability of the model's fuzzy rules by employing an ant colony optimization (ACO) algorithm, which searches for good rule descriptions by means of compound rules built from rules initially expressed as traditional single rules. The model has been tested in a case study, and the results demonstrate its feasibility and practicability in a real environment.

Keywords: software quality, quality assurance, software certification model, software assessment

Procedia PDF Downloads 523
2442 Comparison of Number of Waves Surfed and Duration Using Global Positioning System and Inertial Sensors

Authors: João Madureira, Ricardo Lagido, Inês Sousa, Fraunhofer Portugal

Abstract:

Surfing is an increasingly popular sport, and its performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides, computing the number of waves ridden in a surfing session, the starting time of each wave, and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the smartphone's Inertial Measurement Unit (IMU) to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated against similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, their start times, and their durations. This paper shows that it is feasible to use smartphones for the quantification of performance metrics during surfing; in particular, the waves ridden and their durations can be accurately determined using the smartphone GPS and IMU.
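
A minimal Python sketch of the first, GPS-only approach (velocity thresholds with hysteresis); the threshold values and minimum duration are illustrative assumptions, not the thresholds identified in the study.

```python
import numpy as np

def detect_wave_rides(speed_mps, t_s, v_start=2.5, v_end=1.5, min_duration=3.0):
    """Detect wave rides from GPS speed with hysteresis thresholds (illustrative values).

    A ride starts when the speed rises above `v_start` and ends when it drops below
    `v_end`; rides shorter than `min_duration` seconds are discarded.
    Returns a list of (start_time, duration) pairs.
    """
    rides, riding, t0 = [], False, 0.0
    for v, t in zip(speed_mps, t_s):
        if not riding and v > v_start:
            riding, t0 = True, t
        elif riding and v < v_end:
            riding = False
            if t - t0 >= min_duration:
                rides.append((t0, t - t0))
    return rides
```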

Keywords: inertial measurement unit (IMU), global positioning system (GPS), smartphone, surfing performance

Procedia PDF Downloads 401
2441 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts

Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig

Abstract:

This study focuses on the evaluation of snow avalanche simulations, based on a survey carried out among avalanche experts. In recent decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps; this has changed rather dramatically with the application of numerical models. For safety regulations such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that previously could be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing certain quantities of the avalanche flow (e.g., pressure, velocities, flow heights, runout lengths) to be predicted. Because of the highly variable regimes of flowing snow, no uniform rheological law describing the motion of an avalanche is known; therefore, analogies to the fluid-dynamical laws of other materials are drawn. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Besides these limitations, there are large uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow model equations in an algorithm executable by a computer; this implementation is constrained by the choice of adequate numerical methods and their computational feasibility, so model development is compelled to introduce further simplifications and the related uncertainties. In the light of these issues, many questions arise about avalanche simulations: their assets and drawbacks, potential improvements, and their application in practice. To address these questions, a survey was conducted among experts in the field of avalanche science (e.g., researchers, practitioners, engineers) from various countries. In the questionnaire, special attention was drawn to the experts' opinions regarding the influence of certain variables on the simulation result, their uncertainty, and the reliability of the results. Furthermore, it was tested to what degree a simulation result influences decision making in a hazard assessment. A discrepancy was found between the large uncertainty of the simulation input parameters and the comparatively high reliability attributed to the results. This contradiction can be explained by considering how the experts employ the simulations: the credibility of the simulations results from a rather thorough simulation study in which different assumptions are tested and the results of different flow models are compared, along with the use of supplemental data such as chronicles, field observations, and silent witnesses, which are regarded as essential for the hazard assessment and for sanctioning simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on how the models are used could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations.

Keywords: expert interview, hazard management, modeling, simulation, snow avalanche

Procedia PDF Downloads 326
2440 Abilitest Battery: Presentation of Tests and Psychometric Properties

Authors: Sylwia Sumińska, Łukasz Kapica, Grzegorz Szczepański

Abstract:

Introduction: Cognitive skills are a crucial part of everyday functioning. They include perception, attention, language, memory, executive functions, and higher cognitive skills. With the aging of societies, there is an increasing percentage of people whose cognitive skills decline, and cognitive skills affect work performance. Appropriate diagnosis of a worker's cognitive skills reduces the risk of errors and accidents at work, which is particularly important for senior workers. The study aimed to prepare new cognitive tests for adults aged 20-60 and to assess their psychometric properties. The project responds to the need for reliable and accurate methods of assessing cognitive performance. Computer tests were developed to assess psychomotor performance, attention, and working memory. Method: Two hundred eighty people aged 20-60 will participate in the study, in 4 age groups. Inclusion criteria for the study were: no subjective cognitive impairment, no history of severe head injuries, and no chronic, psychiatric, or neurological diseases. The research will be conducted from February to June 2022. Cognitive tests: 1) measurement of psychomotor performance: reaction time, reaction time with a selective attention component; 2) measurement of sustained attention: visual search (dots), visual search (numbers); 3) measurement of working memory: remembering words, remembering letters. To assess validity and reliability, subjects will perform the Vienna Test System tests, i.e., the Reaction Test (reaction time), Signal Detection (sustained attention), and the Corsi Block-Tapping Test (working memory), as well as the Perception and Attention Test (TUS), the Colour Trails Test (CTT), and Digit Span, a subtest of the Wechsler Adult Intelligence Scale. Eighty people will be invited to a second session after three months in order to assess consistency over time. Results: Because the research is ongoing, detailed results from the 280 participants will be presented at the conference separately for each age group, together with the results of the correlation analysis with the Vienna Test System.

Keywords: aging, attention, cognitive skills, cognitive tests, psychomotor performance, working memory

Procedia PDF Downloads 105
2439 Evaluation of Intervention Effectiveness from the Client Perspective: Dimensions and Measurement of Wellbeing

Authors: Neşe Alkan

Abstract:

Purpose: The point that applied/clinical psychology, the practice and research discipline of the mental health field, has reached today can be summarized as the need to address people's psychological wellbeing from multiple perspectives, with the goal of raising it to a higher level. Clients' subjective assessment of their own condition and wellbeing is an integral part of evidence-based interventions. There is a need for tools through which clients can evaluate, in a valid and reliable manner, the effectiveness of the psychotherapy or intervention performed with them and its contribution to their wellbeing. The aim of this research is to meet this need by testing the reliability and validity of the index in Turkish and exploring its usability in the practice of both researchers and psychotherapists. Method: A total of 213 adults aged between 18 and 54, of whom 69.5% were employed and 29.5% were university students, were included in the study. Along with their demographic information, the participants were administered a set of scales via an online platform: wellbeing, life satisfaction, spiritual satisfaction, shopping addiction, and loneliness. The construct validity of the wellbeing scale was tested with exploratory and confirmatory factor analyses, convergent and discriminant validity were tested with two-way full and partial correlation analyses, and measurement invariance was tested with one-way analysis of variance. Results: Factor analyses showed that the scale consists of six dimensions, as in its original structure. The internal consistency of the scale was Cronbach's α = .82. Two-way correlation analyses revealed that the wellbeing scale total score was, as expected, positively correlated with general life satisfaction (r = .62) and spiritual satisfaction (r = .29), and negatively correlated with loneliness (r = -.51) and shopping addiction (r = -.15). While the scale score did not vary by gender, previous illness, or nicotine addiction, the total wellbeing scores of participants who had used antidepressant medication during the past year were lower than those of participants who had not (F(1,204) = 7.713, p = .005). Conclusion: It was concluded that the 12-item, six-dimension wellbeing scale can be used in research and health sciences practice as a valid and reliable measurement tool. Further research examining the reliability and validity of the scale in other widely used languages, such as Spanish and Chinese, is recommended.
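
For reference, the internal-consistency coefficient reported above can be computed as follows (a generic Python sketch of Cronbach's alpha, not the authors' analysis script).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)
```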

Keywords: wellbeing, intervention effectiveness, reliability and validity, effectiveness

Procedia PDF Downloads 179
2438 In-door Localization Algorithm and Appropriate Implementation Using Wireless Sensor Networks

Authors: Adeniran K. Ademuwagun, Alastair Allen

Abstract:

The dependence of RSS on distance in an enclosed environment is an important consideration because it is a factor that can influence the reliability of any localization algorithm founded on RSS. Several algorithms effectively reduce the variance of RSS to improve localization accuracy. Our proposed algorithm essentially avoids this pitfall and is consequently highly adaptable in the face of erratic radio signals. Using 3 anchors in close proximity to each other, we are able to establish that RSS can be used as a reliable indicator for localization with an acceptable degree of accuracy. Inherent in this concept is the ability of each prospective anchor to validate (guarantee) the position or proximity of the other 2 anchors involved in the localization, and vice versa. This procedure ensures that the uncertainties of radio signals due to multipath effects in enclosed environments are minimized. A major driver of this idea is the implicit topological relationship among sensors due to raw radio signal strength. The algorithm is an area-based algorithm; however, it does not trade accuracy for precision (i.e., the size of the returned area).
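
A hedged Python sketch of the generic ingredients mentioned above: a log-distance path-loss conversion from RSS to distance and a weighted-centroid estimate from three anchors. The path-loss parameters and RSS values are illustrative assumptions, and this is not the proposed algorithm itself.

```python
import numpy as np

def rss_to_distance(rss_dbm, rss_at_d0=-40.0, d0=1.0, n=2.5):
    """Log-distance path-loss model: RSS(d) = RSS(d0) - 10*n*log10(d/d0)."""
    return d0 * 10 ** ((rss_at_d0 - np.asarray(rss_dbm, dtype=float)) / (10 * n))

def weighted_centroid(anchor_xy, rss_dbm):
    """Weighted-centroid estimate: nearer anchors (stronger RSS) get larger weights."""
    w = 1.0 / rss_to_distance(rss_dbm)
    return (np.asarray(anchor_xy, dtype=float) * w[:, None]).sum(axis=0) / w.sum()

# Three anchors in close proximity, as in the paper; positions and RSS values are illustrative
anchors = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]
print(weighted_centroid(anchors, rss_dbm=[-55.0, -62.0, -60.0]))
```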

Keywords: anchor nodes, centroid algorithm, communication graph, radio signal strength

Procedia PDF Downloads 508
2437 Guided Energy Theory of a Particle: Answered Questions Arise from Quantum Foundation

Authors: Desmond Agbolade Ademola

Abstract:

This work aims to introduce a theory, called the Guided Energy Theory of a particle, that answers questions arising from quantum foundations, quantum mechanics theory, and its interpretations, such as: What is the nature of the wavefunction? Is the mathematical formalism of the wavefunction correct? Does the wavefunction collapse during measurement? Do quantum entanglement and the many-worlds interpretation really exist? In addition, is there uncertainty in the physical reality of nature, as concluded in quantum theory? We show, by the fundamental analysis presented in this work, that the way quantum mechanics theory and its interpretations describe nature is not correlated with physical reality, because we find, among other things, that: (1) The guided energy theory of a particle fundamentally provides a complete, physically observable series of quantized measurements of a particle's momentum, force, energy, etc., over a given distance and time. In contrast, the quantum mechanical wavefunction describes nature as having inherently probabilistic and indeterministic physical quantities, resulting in unobservable physical quantities that lead to the many-worlds interpretation. (2) The guided energy theory of a particle fundamentally predicts that it is mathematically possible to determine precise quantized measurements of the position and momentum of a particle simultaneously, because there is no uncertainty in nature; nature naturally guides itself against uncertainty. This is contrary to the conclusion in quantum mechanics theory that it is mathematically impossible to determine the position and the momentum of a particle simultaneously. Furthermore, we show by this theory that it is mathematically possible to determine quantized measurements of the force acting on a particle simultaneously, which is not possible on the premise of quantum mechanics theory. (3) It is shown by our theory that guided energy does not collapse but only describes the lopsided nature of a particle's behavior in motion. This offers insight into the gradual process of engagement (convergence) and disengagement (divergence) of guided energy holders, which further illustrates how wave-like behavior returns to particle-like behavior and how particle-like behavior returns to wave-like behavior, respectively, and proves that a particle's behavior in motion is oscillatory in nature. The mathematical formalism of the guided energy theory shows that nature is certain, whereas the mathematical formalism of quantum mechanics theory shows that nature is absolutely probabilistic. In addition, the nature of the wavefunction is the guided energy of the wave. In conclusion, the fundamental mathematical formalism of quantum mechanics theory is wrong.

Keywords: momentum, physical entanglement, wavefunction, uncertainty

Procedia PDF Downloads 295
2436 Structural, Magnetic, Dielectric and Electrical Properties of Gd3+ Doped Cobalt Ferrite Nanoparticles

Authors: Raghvendra Singh Yadav, Ivo Kuřitka, Jarmila Vilcakova, Jaromir Havlica, Lukas Kalina, Pavel Urbánek, Michal Machovsky, Milan Masař, Martin Holek

Abstract:

In this work, CoFe₂₋ₓGdₓO₄ (x=0.00, 0.05, 0.10, 0.15, 0.20) spinel ferrite nanoparticles are synthesized by a sonochemical method. The structural properties and cation distribution are investigated using X-ray diffraction (XRD), Raman spectroscopy, Fourier transform infrared spectroscopy, and X-ray photoelectron spectroscopy. The morphology and elemental composition are examined using field emission scanning electron microscopy (FE-SEM) and energy-dispersive X-ray spectroscopy, respectively. The particle sizes determined by FE-SEM and XRD analysis confirm the formation of nanoparticles in the range of 7-10 nm. The electrical measurements show that the Gd³⁺ doped cobalt ferrite (CoFe₂₋ₓGdₓO₄; x=0.20) exhibits an enhanced dielectric constant (277 at 100 Hz) and AC conductivity (20.17 × 10⁻⁹ S/cm at 100 Hz). The complex impedance study reveals that the impedances Z′ and Z″ decrease as the Gd³⁺ doping concentration increases. The influence of Gd³⁺ doping on the magnetic properties of the cobalt ferrite nanoparticles is examined using a vibrating sample magnetometer. The magnetic measurements reveal that the coercivity first decreases with Gd³⁺ substitution, from 234.32 Oe (x=0.00) to 12.60 Oe (x=0.05), and then increases from 12.60 Oe (x=0.05) to 68.62 Oe (x=0.20). The saturation magnetization decreases with Gd³⁺ substitution, from 40.19 emu/g (x=0.00) to 21.58 emu/g (x=0.20). This decrease follows the three-sublattice model suggested by Yafet and Kittel (Y-K). The Y-K angle increases with increasing Gd³⁺ doping in the cobalt ferrite nanoparticles.
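
For context, the Yafet-Kittel canting angle referred to above is commonly estimated from the experimental magneton number and the sublattice moments; a standard form of these relations, widely used in the spinel-ferrite literature, is sketched below. The cation distributions behind M_A(x) and M_B(x), and hence the actual angles, are not given in the abstract.

n_B^{\mathrm{exp}} = \frac{M_w \, M_s}{5585}
n_B^{\mathrm{exp}} = M_B(x)\cos\alpha_{\mathrm{YK}} - M_A(x)
\alpha_{\mathrm{YK}} = \cos^{-1}\!\left(\frac{n_B^{\mathrm{exp}} + M_A(x)}{M_B(x)}\right)

Here M_s is the saturation magnetization in emu/g, M_w the molar mass in g/mol, and M_A(x), M_B(x) the magnetic moments of the A and B sublattices for composition x.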

Keywords: sonochemical method, nanoparticles, magnetic property, dielectric property, electrical property

Procedia PDF Downloads 354
2435 Proposal of Non-Destructive Inspection Function Based on Internet of Things Technology Using Drone

Authors: Byoungjoon Yu, Jihwan Park, Sujung Sin, Junghyun Im, Minsoo Park, Sehwan Park, Seunghee Park

Abstract:

In this paper, we propose a technology for monitoring the structural soundness of bridges over the Internet using a non-destructive inspection function. Collapse accidents have occurred due to the aging of bridge structures, and it is necessary to prepare for bridge deterioration. Existing NDT/SHM systems for the maintenance of bridge structures require a large number of inspection personnel, incur high inspection costs, and require access of large, expensive equipment to the measurement points. Because current drone inspection equipment relies on cameras alone, internal damage is difficult to inspect accurately, the results of an internal damage evaluation are subjective, and it is difficult for non-specialists to interpret them. Therefore, it is necessary to develop new-concept NDT/SHM techniques for bridge maintenance that allow free movement and real-time evaluation of measurement results. This work is financially supported by the Korea Ministry of Land, Infrastructure, and Transport (MOLIT) under the 'Smart City Master and Doctor Course Grant Program' and by a grant (14SCIP-B088624-01) from the Construction Technology Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
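
The abstract leaves the data path from the drone to a monitoring server unspecified; purely as an illustration of the Internet of Things aspect, the sketch below publishes a hypothetical NDT reading over MQTT using the paho-mqtt library. The broker address, topic name, and payload fields are assumptions for illustration, not details from the paper.

import json
import time
import paho.mqtt.client as mqtt  # assumed transport; the paper does not name one

# Hypothetical NDT reading taken by a drone-mounted sensor.
reading = {
    "bridge_id": "bridge-01",
    "sensor": "impact_echo",
    "value": 0.42,
    "timestamp": time.time(),
}

client = mqtt.Client()
client.connect("broker.example.org", 1883)               # placeholder broker address
client.publish("bridge/bridge-01/ndt", json.dumps(reading), qos=1)
client.disconnect()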

Keywords: Structural Health Monitoring, SHM, non-contact sensing, nondestructive testing, NDT, Internet of Things, autonomous self-driving drone

Procedia PDF Downloads 268
2434 Environmental Exposure Assessment among Refuellers at Brussels South Charleroi Airport

Authors: Mostosi C., Stéphenne J., Kempeneers E.

Abstract:

Introduction: Refuellers at Brussels South Charleroi Airport (BSCA) expressed concerns about the risks involved in handling JET-A1 fuel. The HSE Manager of BSCA, in collaboration with the occupational physician and the industrial hygiene unit of the External Service of Occupational Medicine, decided to assess the toxicological exposure of these workers. Materials and methods: Two measurement methods were used. The first was to assay three types of urinary metabolites to assess exposure to the xylenes, toluene, and benzene present in aircraft fuels. Out of the 32 refuellers in the department, 26 participated in the sampling, and 23 samples were usable. The second method targeted the assessment of environmental exposure to certain potentially hazardous substances that refuellers are likely to breathe in work areas at the airport. Two ambient air measurement campaigns were carried out, using static samplers on the one hand and individual sensors worn by the refuellers near the respiratory tract on the other. Volatile organic compounds and diesel particles were analyzed. Results: Despite the fears that motivated these analyses, the overall results showed low levels of exposure, far below the existing limit values, both in the air measurements and in the urinary measurements. Conclusion: These results are comparable to those of a study carried out at several French airports. The staff could be reassured, and the medical surveillance was subsequently adjusted by the occupational physician. As aviation activity at BSCA develops, equipment and methods are evolving, and exposure will have to be reassessed.

Keywords: refuelling, airport, exposure, fuel, occupational health, air quality

Procedia PDF Downloads 86
2433 Performance Management in Serbian Banks: Balanced Scorecard Approach

Authors: Nela Milosevic, Sladjana Barjaktarovic Rakocevic, Sladjana Benkovic, Nemanja Milanovic

Abstract:

Nowadays, performance measurement systems play a key role in evaluating the strategic performance of an organization. In recent years, there has been a shift towards the Balanced Scorecard (BSC), which has been recognized as a valuable managerial approach. The main goal of this paper is to analyze the main performance measures of Serbian banks at the branch level, using the Balanced Scorecard framework. Although a large number of practitioners are interested in the Balanced Scorecard approach, little empirical research has been conducted on implementing the concept in service sectors such as banking, especially in developing countries. From the beginning of August to the end of September 2015, the authors conducted in-depth interviews with a number of experts from the most successful banks in Serbia. The results show that non-financial measures, especially customer-oriented and product/service-oriented indicators, are very important factors for improving not only the financial situation within the bank but also overall business performance. Additionally, the findings confirm a cause-and-effect relationship between the non-financial and financial dimensions of the Balanced Scorecard. Bearing in mind that banks still use outdated performance evaluation systems, such as annual, quarterly, and monthly reports, we hope that this paper will contribute to the knowledge of how banks in Serbia may apply the Balanced Scorecard approach to evaluate their performance in the most efficient and effective way.
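
The abstract does not list the study's actual indicator set; the following purely illustrative structure shows how branch-level BSC perspectives, and the cause-and-effect chain from non-financial to financial dimensions discussed above, might be organized. All indicator names are hypothetical examples, not findings from the paper.

# Illustrative branch-level Balanced Scorecard; indicators are hypothetical examples.
branch_scorecard = {
    "learning_and_growth": ["employee training hours", "staff turnover rate"],
    "internal_processes": ["loan approval time", "transaction error rate"],
    "customer": ["customer satisfaction index", "complaint resolution time"],
    "financial": ["branch profit", "cost-to-income ratio"],
}

# Assumed cause-and-effect chain from non-financial to financial perspectives,
# mirroring the relationship discussed in the abstract.
cause_effect_chain = [
    ("learning_and_growth", "internal_processes"),
    ("internal_processes", "customer"),
    ("customer", "financial"),
]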

Keywords: balanced scorecard approach, bank management, performance measurement systems, strategic performances

Procedia PDF Downloads 341