Search results for: measurement and verification
Paper Count: 3181

2791 Smart Irrigation System

Authors: Levent Seyfi, Ertan Akman, Tuğrul C. Topak

Abstract:

This study aims at irrigation automation with electronic sensors and its control with smartphones. Temperature and soil humidity in the irrigated area were measured with temperature and humidity sensors. A microcontroller (Arduino) was used to read these parameters and to control the proposed irrigation system, which operates automatically according to the measured values. A GSM module used together with the Arduino connects the irrigation system to smartphones, so the system can be controlled remotely. Through a dedicated Android application, users can not only observe whether the irrigation system is running but also view the temperature and humidity measurements. If desired, the irrigation system can also be started or stopped remotely and manually, regardless of the measured sensor values, via the same application. In addition to smartphones, the irrigation system can alternatively be controlled via the designed website (www.sulamadenetim.com).
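
For illustration only, the sketch below captures the threshold-based automatic/manual control logic described in the abstract in Python; the thresholds, the override interface, and the pump-control function are assumptions, and the actual system runs on an Arduino with a GSM module.

```python
# Illustrative sketch of the automatic/manual irrigation control logic
# described above. Thresholds and the actuator interface are assumptions;
# the real system runs on an Arduino with a GSM module.

SOIL_DRY_THRESHOLD = 30.0   # percent soil humidity (assumed)
MAX_TEMPERATURE = 45.0      # degrees Celsius, assumed safety cut-off

def decide_pump_state(soil_humidity, temperature, manual_override=None):
    """Return True to run the pump, False to stop it."""
    if manual_override is not None:          # remote command from app/website
        return manual_override
    if temperature > MAX_TEMPERATURE:        # avoid irrigating in extreme heat
        return False
    return soil_humidity < SOIL_DRY_THRESHOLD

# Dry soil, normal temperature -> pump on
print(decide_pump_state(soil_humidity=22.5, temperature=31.0))  # True
# A remote manual stop overrides the sensor readings
print(decide_pump_state(soil_humidity=22.5, temperature=31.0, manual_override=False))  # False
```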

Keywords: smartphone, Android operating system, sensors, irrigation system, Arduino

Procedia PDF Downloads 615
2790 Experimental Chip/Tool Temperature FEM Model Calibration by Infrared Thermography: A Case Study

Authors: Riccardo Angiuli, Michele Giannuzzi, Rodolfo Franchi, Gabriele Papadia

Abstract:

Knowledge of temperature in machining is fundamental to improving the numerical and FEM models used to study critical process aspects, such as the behavior of the workpiece material and the tool. The extreme conditions under which they operate make it impossible to use traditional measuring instruments; infrared thermography is a valid alternative for temperature measurement during metal cutting. In this study, a large experimental program on the cutting of superduplex steel (ASTM A995 gr. 5A) was carried out, and the relevant cutting temperatures were measured by infrared thermography while the cutting parameters were varied from traditional to extreme values. The measured values were used to calibrate an FEM model for predicting the residual life of the tools. During the study, the problems related to the detection of cutting temperatures by infrared thermography were analyzed, and a dedicated procedure was developed that can be used in similar processes.

Keywords: machining, infrared thermography, FEM, temperature measurement

Procedia PDF Downloads 184
2789 Open Fields' Dosimetric Verification for a Commercially-Used 3D Treatment Planning System

Authors: Nashaat A. Deiab, Aida Radwan, Mohamed Elnagdy, Mohamed S. Yahiya, Rasha Moustafa

Abstract:

This study evaluates and investigates the dosimetric performance of our institution's 3D treatment planning system, Elekta PrecisePLAN, for open 6 MV fields, including square, rectangular, varied-SSD, centrally blocked, missing-tissue, square MLC, and MLC-shaped fields, guided by the QA tests recommended in AAPM TG-53, the NCS Report 15 test packages, IAEA TRS-430, and ESTRO Booklet No. 7. The study was performed on an Elekta Precise linear accelerator designed for a clinical range of 4, 6, and 15 MV photon beams, with asymmetric jaws and a fully integrated multileaf collimator that enables high conformance to the target with sharp field edges. Seven different tests were applied to a solid water-equivalent phantom together with a 2D array dose detection system; the doses calculated with the PrecisePLAN treatment planning system were compared with the measured doses to verify that the dose calculations are accurate for the open-field configurations listed above. The QA results showed that the dosimetric accuracy of the TPS for open fields was within the specified tolerance limits. However, for the large square (25 cm x 25 cm) and rectangular (20 cm x 5 cm) fields, some points in the penumbra region were out of tolerance (11.38% and 10.9%, respectively). For the SSD-variation test, the large field produced at SSD 125 cm with a 10 cm x 10 cm field setting showed an error of 0.2% at the central axis and 1.01% in the penumbra. Overall, the differences were within the recommended tolerance levels, with large fields showing variations in the penumbra. These differences between the dose values predicted by the TPS and the measured values at the same points may result from limitations of the dose calculation, uncertainties in the measurement procedure, or fluctuations in the output of the accelerator.

Keywords: quality assurance, dose calculation, 3D treatment planning system, photon beam

Procedia PDF Downloads 517
2788 Cepstrum Analysis of Human Walking Signal

Authors: Koichi Kurita

Abstract:

In this study, we propose a real-time data collection technique for the detection of human walking motion from the charge generated on the human body. The technique is based on the detection of a sub-picoampere electrostatic induction current, generated by the motion, flowing through the electrode of a wireless portable sensor attached to the subject. An FFT analysis of the waveforms of the electrostatic induction currents generated by the walking motions showed that the currents generated under normal and restricted walking conditions were different. Moreover, we carried out a cepstrum analysis to detect differences in walking style. The results suggest that a slight difference in motion, whether due to the individual's gait or a splinted leg, is directly reflected in the electrostatic induction current generated by the walking motion. The proposed wireless portable sensor enables the detection of even subtle differences in walking motion.
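
As a minimal illustration of the cepstrum step mentioned in the abstract, the following Python sketch computes the real cepstrum of a synthetic, gait-like periodic signal; the sampling rate and signal model are assumptions, not the authors' sensor data.

```python
import numpy as np

def real_cepstrum(signal):
    """Real cepstrum: inverse FFT of the log magnitude spectrum."""
    spectrum = np.fft.fft(signal)
    log_mag = np.log(np.abs(spectrum) + 1e-12)   # small offset avoids log(0)
    return np.fft.ifft(log_mag).real

# Synthetic stand-in for an electrostatic induction current waveform:
# a periodic gait-like component plus noise (the real data come from the sensor).
rng = np.random.default_rng(0)
fs = 1000.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 5, 1 / fs)
current = np.sin(2 * np.pi * 1.8 * t) + 0.3 * np.sin(2 * np.pi * 3.6 * t) \
          + 0.05 * rng.standard_normal(t.size)

cep = real_cepstrum(current)
quefrency = np.arange(cep.size) / fs         # quefrency axis in seconds
# Peaks at non-zero quefrency reflect the periodic structure of the gait,
# which is how differences in walking style can show up in the cepstrum.
```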

Keywords: human walking motion, motion measurement, current measurement, electrostatic induction

Procedia PDF Downloads 344
2787 Calcium Silicate Bricks – Ultrasonic Pulse Method: Effects of Natural Frequency of Transducers on Measurement Results

Authors: Jiri Brozovsky

Abstract:

Modulus of elasticity is one of the important parameters of construction materials; it considerably influences their deformation properties and can also be determined by non-destructive test methods such as the ultrasonic pulse method. However, the results of ultrasonic pulse measurements are influenced by various factors, one of which is the natural frequency of the transducers. This paper presents findings on the influence of the natural frequency of the transducers (54, 82, and 150 kHz) on the ultrasonic pulse velocity and the dynamic (Young's) modulus of elasticity. For test specimens with the same smallest dimension in the direction of sounding and the same density, differences in ultrasonic pulse velocity and dynamic modulus of elasticity were found: their values decrease as the natural frequency of the transducers grows.
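
For context, the dynamic modulus of elasticity is commonly derived from the measured ultrasonic pulse velocity, the density, and Poisson's ratio; the sketch below shows that standard calculation with assumed input values, not the paper's measurements.

```python
# Illustrative calculation of the dynamic modulus of elasticity from the
# ultrasonic pulse velocity, using the standard relation for a
# three-dimensional body; the numerical values are assumptions.

def dynamic_modulus(density, pulse_velocity, poisson_ratio):
    """E_dyn = rho * v^2 * (1 + mu)(1 - 2*mu) / (1 - mu), in Pa."""
    k = (1 + poisson_ratio) * (1 - 2 * poisson_ratio) / (1 - poisson_ratio)
    return density * pulse_velocity ** 2 * k

rho = 1800.0      # kg/m^3, assumed density of a calcium silicate brick
v = 2500.0        # m/s, assumed ultrasonic pulse velocity
mu = 0.2          # assumed Poisson's ratio

E_dyn = dynamic_modulus(rho, v, mu)
print(f"Dynamic modulus of elasticity: {E_dyn / 1e9:.2f} GPa")
```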

Keywords: calcium silicate brick, ultrasonic pulse method, ultrasonic pulse velocity, dynamic modulus of elasticity

Procedia PDF Downloads 416
2786 Corrosion Inhibition of Copper in 1M HNO3 Solution by Oleic Acid

Authors: S. Nigri, R. Oumeddour, F. Djazi

Abstract:

The inhibition of the corrosion of copper in 1 M HNO3 solution by oleic acid was investigated by weight loss measurements, potentiodynamic polarization, and scanning electron microscopy (SEM). The experimental results showed that this compound provides good corrosion inhibition and that the inhibition efficiency increases with the inhibitor concentration, reaching 98%. The results also revealed that the adsorption of the inhibitor molecules onto the metal surface obeys the Langmuir adsorption isotherm. The effect of temperature on the corrosion behavior of copper in 1 M HNO3, without and with the inhibitor at different concentrations, was studied in the range from 303 to 333 K, and the kinetic activation parameters Ea, ΔHa, and ΔSa were evaluated. Tafel plot analysis revealed that oleic acid acts as a mixed-type inhibitor. SEM analysis substantiated the formation of a protective layer over the copper surface.
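
The inhibition efficiency and the Langmuir isotherm check mentioned above are typically computed as in the following sketch; the corrosion rates and concentrations used here are assumed values for illustration only.

```python
# Minimal sketch of how inhibition efficiency is typically computed from
# weight-loss measurements, and of the linear Langmuir isotherm check;
# the numbers below are assumed, not the paper's data.
import numpy as np

def inhibition_efficiency(rate_blank, rate_inhibited):
    """IE (%) = (W0 - W) / W0 * 100, from weight-loss corrosion rates."""
    return (rate_blank - rate_inhibited) / rate_blank * 100.0

# Assumed corrosion rates (mg cm^-2 h^-1) at increasing oleic acid concentration
w0 = 1.20
w_inhibited = np.array([0.55, 0.30, 0.12, 0.024])
concentration = np.array([1e-4, 5e-4, 1e-3, 5e-3])      # mol/L, assumed

ie = inhibition_efficiency(w0, w_inhibited)              # rises toward ~98 %
theta = ie / 100.0                                        # surface coverage

# Langmuir isotherm in linear form: C/theta = 1/K_ads + C
# A straight line of C/theta versus C with slope close to 1 indicates Langmuir adsorption.
slope, intercept = np.polyfit(concentration, concentration / theta, 1)
print(ie, slope, 1.0 / intercept)                         # IE (%), slope, K_ads
```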

Keywords: oleic acid, weight loss, electrochemical measurement, SEM analysis

Procedia PDF Downloads 395
2785 Modular Robotics and Terrain Detection Using Inertial Measurement Unit Sensor

Authors: Shubhakar Gupta, Dhruv Prakash, Apoorv Mehta

Abstract:

In this project, we design a modular robot capable of using and switching between multiple methods of propulsion and of classifying terrain based on input from an Inertial Measurement Unit (IMU). We wanted to make a robot that is not only intelligent in its functioning but also versatile in its physical design. The advantage of a modular robot is that it can be designed to hold several movement apparatuses, such as wheels, legs for a hexapod or quadpod setup, propellers for underwater locomotion, or any other solution that may be needed. The robot takes roughness input from the gyroscope and accelerometer in the IMU and, based on terrain classification by an artificial neural network, decides which method of propulsion would best optimize its movement. This gives the robot adaptability over a set of terrains, meaning it can optimize its locomotion on a terrain based on its roughness. A feature like this would be a great asset in autonomous exploration or research drones.
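
A minimal sketch of the terrain-classification step is given below: simple roughness features are extracted from IMU windows and fed to a small neural network; the features, terrain labels, and training data are assumptions chosen for demonstration.

```python
# Illustrative sketch of terrain classification from IMU roughness features.
# The feature choice, terrain classes, and data are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def imu_features(accel_z, gyro_mag):
    """Simple roughness features: standard deviation and peak-to-peak range."""
    return [np.std(accel_z), np.ptp(accel_z), np.std(gyro_mag)]

# Synthetic training windows for three assumed terrain classes
labels = {0: "smooth", 1: "gravel", 2: "rocky"}
X, y = [], []
for cls, noise in [(0, 0.05), (1, 0.3), (2, 0.8)]:
    for _ in range(100):
        accel_z = 9.81 + noise * rng.standard_normal(200)
        gyro = noise * rng.standard_normal(200)
        X.append(imu_features(accel_z, gyro))
        y.append(cls)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)

# A new IMU window is classified, and the result selects the propulsion mode
test_window = imu_features(9.81 + 0.7 * rng.standard_normal(200),
                           0.7 * rng.standard_normal(200))
print(labels[int(clf.predict([test_window])[0])])   # likely "rocky"
```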

Keywords: modular robotics, terrain detection, terrain classification, neural network

Procedia PDF Downloads 145
2784 Location Detection of Vehicular Accident Using Global Navigation Satellite Systems/Inertial Measurement Units Navigator

Authors: Neda Navidi, Rene Jr. Landry

Abstract:

Vehicle tracking and accident recognition are of interest to many industries, such as insurance and vehicle rental companies. The main goal of this paper is to detect the location of a car accident by combining different methods. The methods considered in this paper are Global Navigation Satellite Systems/Inertial Measurement Units (GNSS/IMU)-based navigation and vehicle accident detection algorithms. They operate on a set of raw measurements obtained from a purpose-designed integrated black box using GNSS and inertial sensors. Another concern of this paper is the definition of an accident detection algorithm based on vehicle jerk in order to identify the position of the accident. The results show that, even in GNSS blockage areas, the position of the accident can be detected by GNSS/INS integration with a 50% improvement compared to stand-alone GNSS.
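
The jerk-based detection idea can be illustrated as follows: jerk is the time derivative of acceleration, and an event is flagged when its magnitude exceeds a threshold; the sampling rate, threshold, and signal below are assumptions, not the paper's data.

```python
# Minimal sketch of jerk-based accident detection on a simulated signal.
import numpy as np

rng = np.random.default_rng(2)
fs = 100.0                               # IMU sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)

# Synthetic longitudinal acceleration: normal driving plus a sharp impact at t = 6 s
accel = 0.3 * np.sin(0.5 * t) + 0.05 * rng.standard_normal(t.size)
accel[int(6 * fs):int(6 * fs) + 5] += np.array([-40, -80, -60, -30, -10])

jerk = np.gradient(accel, 1 / fs)        # m/s^3
JERK_THRESHOLD = 400.0                   # assumed detection threshold

event_indices = np.where(np.abs(jerk) > JERK_THRESHOLD)[0]
if event_indices.size:
    # The timestamp of the first exceedance is matched to the GNSS/INS position fix
    print(f"Possible accident at t = {t[event_indices[0]]:.2f} s")
```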

Keywords: driver behavior monitoring, integration, IMU, GNSS, monitoring, tracking

Procedia PDF Downloads 234
2783 Framework Development of Carbon Management Software Tool in Sustainable Supply Chain Management of Indian Industry

Authors: Sarbjit Singh

Abstract:

This framework development explored the status of green supply chain management (GSCM) in manufacturing SMEs and concluded that there was a significant gap with respect to carbon emissions measurement in supply chain activities. The measurement of carbon emissions within supply chains is an important green initiative toward their reduction. The majority of the SMEs faced difficulties in quantifying the greenhouse gas emissions in their supply chains and in turning them into low-carbon supply chains (GSCM). The carbon management initiatives were therefore amalgamated with the supply chain activities in order to measure and reduce the carbon emissions, conforming to the GHG Protocol scopes. Hence, this work covers the development of a carbon management software (CMS) tool to quantify carbon emissions for effective carbon management. The tool is inexpensive and easy for industry to use for managing carbon emissions within the supply chain.

Keywords: carbon emissions, carbon management software, supply chain management, Indian industry

Procedia PDF Downloads 469
2782 The Methods of Customer Satisfaction Measurement and Its Statistical Analysis towards Sales and Logistic Activities in Food Sector

Authors: Seher Arslankaya, Bahar Uludağ

Abstract:

Meeting the needs and demands of customers and pleasing them are important requirements for companies in the food sector, where the growth of competition is significantly unpredictable. Customer satisfaction is one of the key concepts, driven by a wide range of customer preferences and expectations regarding the products and services delivered to them. To meet customer demands, companies in the food sector are expected to have well-managed Total Quality Management (TQM), which sets out to improve the quality of products and services, to reduce costs, and to increase customer satisfaction by restructuring traditional management practices. Achievement of these goals can be assessed with customer satisfaction surveys, which provide immediate feedback and quick responses and which also support strategic planning by helping to anticipate customers' future needs and expectations. Periodic measurement of customer satisfaction is essential: with a better understanding of customer perceptions from the questionnaires, companies can identify their own strengths and weaknesses, retain loyal customers, benchmark themselves against competitors, and map out future progress and improvement. In this study, we propose a survey-based customer satisfaction measurement method and its statistical analysis for the sales and logistics activities of food firms. Customer satisfaction is discussed in detail. After analysing the questionnaire data collected from customers with the SPSS software, the results obtained from the application are presented. By applying an ANOVA test, the study also examines whether meaningful differences exist between customer demographic groups and their perceptions. A further purpose of the study is to identify requirements that help remove the factors that decrease customer satisfaction and to produce loyal customers in the food industry. For this purpose, customer complaints are collected, and comments and suggestions are made according to the survey results, which should be useful for strategic planning in the food industry.
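
As a minimal illustration of the ANOVA step described above, the sketch below tests whether mean satisfaction scores differ between demographic groups using SciPy; the scores and groups are invented, and the study itself uses SPSS on real survey data.

```python
# Minimal sketch of a one-way ANOVA on satisfaction scores per demographic group.
from scipy import stats

# Satisfaction scores (e.g., 1-5 Likert) for three assumed age groups
group_18_30 = [4, 5, 3, 4, 4, 5, 3, 4]
group_31_45 = [3, 3, 4, 2, 3, 4, 3, 3]
group_46_plus = [4, 4, 5, 4, 3, 5, 4, 4]

f_stat, p_value = stats.f_oneway(group_18_30, group_31_45, group_46_plus)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# p < 0.05 would indicate a meaningful difference in satisfaction between groups.
```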

Keywords: customer satisfaction measurement and analysis, food industry, SPSS, TQM

Procedia PDF Downloads 250
2781 Prediction of Bubbly Plume Characteristics Using the Self-Similarity Model

Authors: Li Chen, Alex Skvortsov, Chris Norwood

Abstract:

Gas release into water occurs in many industrial situations. This process results in the formation of bubbles and in acoustic emission that depends on the bubble characteristics. If the bubble creation rate (bubble volume flow rate) is of interest, an inverse method has to be used based on the measurement of the acoustic emission. However, there is sound attenuation through the bubbly plume, which influences the measurement and should be taken into consideration in the model. The sound transmission through the bubbly plume depends on its characteristics, such as its shape and the bubble distributions. In this study, the bubbly plume shape is modelled using a self-similarity model, which is normally applied to a single-phase buoyant plume. The prediction is compared with experimental data. It was found that the model can be applied to a buoyant plume of a gas-liquid mixture. The influence of the gas flow rate and discharge nozzle size is studied.

Keywords: bubbly plume, buoyant plume, bubble acoustics, self-similarity model

Procedia PDF Downloads 287
2780 Surfactant Improved Heavy Oil Recovery in Sandstone Reservoirs by Wettability Alteration

Authors: Rabia Hunky, Hayat Kalifa, Bai

Abstract:

The wettability of carbonate reservoirs has been widely recognized as an important parameter in oil recovery by flooding technology, and many surfactants have been studied for this application. However, wettability alteration in sandstone reservoirs by surfactants has been poorly studied. In this paper, our recent study of the relationship between rock surface wettability and cumulative oil recovery for sandstone cores is reported. In our research, good agreement was found between wettability and oil recovery. The nonionic surfactants Tomadol® 25-12 and Tomadol® 45-13 are very effective in altering the wettability of the sandstone core surface from highly oil-wet to water-wet conditions. In spontaneous imbibition tests, interfacial tension measurements, and contact angle measurements, these two surfactants exhibited the highest recovery of the synthetic oil made with heavy oil. Based on these experimental results, we further conclude that contact angle measurements and imbibition tests can be used as rapid screening tools to identify better EOR surfactants for increasing heavy oil recovery from sandstone reservoirs.

Keywords: EOR, oil gas, IOR, WC, IF, oil and gas

Procedia PDF Downloads 103
2779 Improvement of Camera Calibration Based on the Relationship between Focal Length and Aberration Coefficient

Authors: Guorong Sui, Xingwei Jia, Chenhui Yin, Xiumin Gao

Abstract:

In camera-based high-precision, non-contact measurement, geometric-optical aberration inevitably disturbs the measuring system. Moreover, the aberration changes with the focal length, which increases the difficulty of calibrating the system. Therefore, understanding the relationship between the focal length and the aberration properties is a very important issue for the calibration of such measuring systems. In this study, we propose a new mathematical model based on the plane calibration method of Zhang Zhengyou and establish a relationship between the focal length and the aberration coefficient. By using this mathematical model and carefully modified compensation templates, the calibration precision of the system can be dramatically improved. The experimental results show that the relative error is less than 1%. This is important for optoelectronic imaging systems that measure, track, and position objects by changing the camera's focal length.
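
For reference, a minimal sketch of plane-based (Zhang) camera calibration with OpenCV is shown below; the image set, board geometry, and square size are assumptions, and the paper's focal-length/aberration relation itself is not reproduced here.

```python
# Minimal sketch of plane-based (Zhang) camera calibration with OpenCV,
# the method the proposed model builds on. File names, board geometry, and
# square size are assumptions.
import glob
import cv2
import numpy as np

board_cols, board_rows = 9, 6             # inner corners of the checkerboard (assumed)
square_size = 25.0                        # mm (assumed)

# 3D points of the planar target in its own coordinate frame
objp = np.zeros((board_rows * board_cols, 3), np.float32)
objp[:, :2] = np.mgrid[0:board_cols, 0:board_rows].T.reshape(-1, 2) * square_size

obj_points, img_points = [], []
for fname in glob.glob("calib_images/*.png"):     # hypothetical image set
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, (board_cols, board_rows))
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsics (including focal length) and distortion/aberration coefficients
ret, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("focal lengths (px):", camera_matrix[0, 0], camera_matrix[1, 1])
print("distortion coefficients:", dist_coeffs.ravel())
```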

Keywords: camera calibration, aberration coefficient, vision measurement, focal length, mathematics model

Procedia PDF Downloads 364
2778 Modeling of Particle Reduction and Volatile Compounds Profile during Chocolate Conching by Electronic Nose and Genetic Programming (GP) Based System

Authors: Juzhong Tan, William Kerr

Abstract:

Conching is a critical procedure in chocolate processing, in which special flavors are developed and the smooth mouthfeel of the chocolate is achieved through particle size reduction of the cocoa mass and other additives. Therefore, determination of the particle size and of the volatile compound profile of the cocoa mass is important for chocolate manufacturers to ensure the quality of chocolate products. Currently, precise particle size measurement is usually done by laser scattering, which is expensive and inaccessible to small and medium-sized chocolate manufacturers, while other alternatives, such as micrometers and microscopy, do not provide good measurements and give little information. Volatile compound analysis of cocoa during conching has similar problems due to its high cost and limited accessibility. In this study, a self-made electronic nose system consisting of gas sensors (TGS 800 and 2000 series) was fitted to a conching machine and used to monitor the volatile compound profile of the chocolate during conching. A genetic programming model was established that correlates the volatile compound profiles, together with factors including cocoa content, sugar content, and conching temperature, with the particle size of the chocolate. The model was used to predict the particle size reduction of chocolates with different cocoa mass to sugar ratios (1:2, 1:1, 1.5:1, 2:1) at 8 conching times (15 min, 30 min, 1 h, 1.5 h, 2 h, 4 h, 8 h, and 24 h), and the predictions were compared to laser scattering measurements of the same chocolate samples. 91.3% of the predictions were within ±5% of the laser scattering measurement, and 99.3% were within ±10%.

Keywords: cocoa bean, conching, electronic nose, genetic programming

Procedia PDF Downloads 255
2777 Use of In-line Data Analytics and Empirical Model for Early Fault Detection

Authors: Hyun-Woo Cho

Abstract:

Automatic process monitoring schemes are designed to give early warnings of unusual process events or abnormalities as soon as possible. To this end, various techniques have been developed and utilized in industrial processes, including multivariate statistical methods, representations in reduced spaces, and kernel-based nonlinear techniques. This work presents a nonlinear empirical monitoring scheme for batch-type production processes with incomplete process measurement data. While normal operation data are easy to obtain, fault data occur infrequently and are thus difficult to collect. In this work, noise filtering steps are added in order to enhance monitoring performance by eliminating irrelevant information from the data. The performance of the monitoring scheme was demonstrated using batch process data. The results showed that the monitoring performance was improved significantly in terms of the detection success rate for process faults.

Keywords: batch process, monitoring, measurement, kernel method

Procedia PDF Downloads 323
2776 An Application of Extreme Value Theory as a Risk Measurement Approach in Frontier Markets

Authors: Dany Ng Cheong Vee, Preethee Nunkoo Gonpot, Noor Sookia

Abstract:

In this paper, we consider the application of Extreme Value Theory (EVT) as a risk measurement tool. The Value at Risk (VaR) for a set of indices from six stock exchanges of frontier markets is calculated using the peaks-over-threshold method, and the performance of the model is evaluated index by index using coverage tests and loss functions. Our results show that 'fat-tailedness' of the data alone is not enough to justify the use of EVT as a VaR approach; the structure of the returns dynamics is also a determining factor. The approach works well in markets that have experienced extremes in the past, making the model capable of coping with upcoming extremes (the Colombo, Tunisia, and Zagreb stock exchanges). On the other hand, we find that indices with lower past than present volatility fail to deal adequately with future extremes (Mauritius and Kazakhstan). We also conclude that using EVT alone produces rather static VaR figures that do not reflect the actual dynamics of the data.
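
A minimal sketch of the peaks-over-threshold VaR calculation is given below: exceedances over a high threshold are fitted with a generalized Pareto distribution, and VaR follows from the fitted parameters; the return series is simulated, whereas the paper uses frontier-market index returns.

```python
# Minimal sketch of peaks-over-threshold (POT) VaR estimation on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
returns = rng.standard_t(df=4, size=2500) * 0.01        # fat-tailed daily returns (simulated)
losses = -returns                                        # work with losses

u = np.quantile(losses, 0.95)                            # threshold (assumed 95th percentile)
excesses = losses[losses > u] - u
n, n_u = losses.size, excesses.size

# Fit a generalized Pareto distribution to the excesses (location fixed at 0)
xi, loc, sigma = stats.genpareto.fit(excesses, floc=0)

q = 0.99                                                 # VaR confidence level
var_q = u + (sigma / xi) * ((n / n_u * (1 - q)) ** (-xi) - 1)
print(f"1-day {q:.0%} VaR: {var_q:.4f}")
```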

Keywords: extreme value theory, financial crisis 2008, value at risk, frontier markets

Procedia PDF Downloads 276
2775 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea

Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi

Abstract:

Synoptic patterns from the surface up to the tropopause are very important for forecasting weather and atmospheric conditions, and there are many tools to prepare and analyze these maps. Reanalysis data, the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in forecasting centers worldwide to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is the main issue due to the complex topography, and there are different types of climate in these areas. In this research, we used two reanalysis datasets, the ECMWF Reanalysis 5th Generation (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest ECMWF reanalysis; its temporal resolution is hourly, while that of NCEP/NCAR is six-hourly. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature were selected and analyzed, and different types of precipitation (rain and snow) were considered. The results showed that NCEP/NCAR is better able to demonstrate the intensity of the atmospheric systems, whereas ERA5 is suitable for extracting parameter values at specific points. ERA5 is also appropriate for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it; the sea surface temperature of the NCEP/NCAR product has low resolution near the coast. Nevertheless, both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS, although, due to their time lag, they are not suitable for forecast centers; their application is in research and in the verification of meteorological models. Finally, ERA5 has better resolution than the NCEP/NCAR reanalysis data, but the NCEP/NCAR data are available from 1948 and are appropriate for long-term research.

Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow

Procedia PDF Downloads 123
2774 An Experimental Study on the Measurement of Fuel to Air Ratio Using Flame Chemiluminescence

Authors: Sewon Kim, Chang Yeop Lee, Minjun Kwon

Abstract:

This study aims at establishing the relationship between the optical signal of a flame and its equivalence ratio. In the experiment, the flame optical signal in a furnace is measured using a photodiode. The combustion system is composed of a metal fiber burner and a vertical furnace, and the flame chemiluminescence is measured under various experimental conditions. In this study, the chemiluminescence of a laminar premixed flame is measured using a commercially available photodiode, and the relationship between the equivalence ratio and the photodiode signal is investigated experimentally. In addition, a combustion control strategy using the optical signal and the fuel pressure is proposed. The results showed that a definite relationship exists between the photodiode signal and the equivalence ratio, which leads to the successful application of this system for the instantaneous measurement of the equivalence ratio of the combustion system.

Keywords: flame chemiluminescence, photodiode, equivalence ratio, combustion control

Procedia PDF Downloads 397
2773 Fracture Crack Monitoring Using Digital Image Correlation Technique

Authors: B. G. Patel, A. K. Desai, S. G. Shah

Abstract:

The main objective of this paper is to develop a new measurement technique that does not require touching the object. Digital Image Correlation (DIC) is an advanced measurement technique used to measure the displacement of particles with very high accuracy; this powerful, innovative technique correlates two image segments to determine the similarity between them. For this study, nine geometrically similar beam specimens of different sizes, with fibers (steel and glass fibers) and without fibers, were tested under three-point bending in a closed-loop servo-controlled machine with crack mouth opening displacement control at an opening rate of 0.0005 mm/sec. Digital images were captured before loading (undeformed state) and at different instances of loading, and were analyzed using correlation techniques to compute the surface displacements, crack opening and sliding displacements, load-point displacement, crack length, and crack tip location. It was seen that the CMOD and the vertical load-point displacement computed using DIC analysis match well with those measured experimentally.
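
The core of DIC can be sketched as follows: a small reference subset from the undeformed image is located in the deformed image by maximizing a zero-normalized cross-correlation score; the images and subset size below are synthetic and assumed, for illustration only.

```python
# Minimal sketch of subset tracking by zero-normalized cross-correlation (ZNCC).
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return (a * b).sum() / denom if denom else 0.0

def track_subset(ref_img, def_img, top_left, size=21, search=10):
    """Return the integer-pixel displacement (du, dv) of one subset."""
    r, c = top_left
    subset = ref_img[r:r + size, c:c + size]
    best, best_dudv = -2.0, (0, 0)
    for du in range(-search, search + 1):
        for dv in range(-search, search + 1):
            cand = def_img[r + du:r + du + size, c + dv:c + dv + size]
            if cand.shape != subset.shape:
                continue
            score = zncc(subset, cand)
            if score > best:
                best, best_dudv = score, (du, dv)
    return best_dudv

# Synthetic speckle image shifted by a known amount (3 px down, 2 px right)
rng = np.random.default_rng(0)
ref = rng.random((200, 200))
deformed = np.roll(np.roll(ref, 3, axis=0), 2, axis=1)
print(track_subset(ref, deformed, top_left=(80, 80)))   # expected (3, 2)
```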

Keywords: digital image correlation, fibers, self-compacting concrete, size effect

Procedia PDF Downloads 389
2772 Assessing Perinatal Mental Illness during the COVID-19 Pandemic: A Review of Measurement Tools

Authors: Mya Achike

Abstract:

Background and Significance: Perinatal mental illness covers a wide range of conditions and has a huge influence on maternal-child health. Issues and challenges with perinatal mental health have been associated with poor pregnancy, birth, and postpartum outcomes. It is estimated that one out of five new and expectant mothers experience some degree of perinatal mental illness, which makes this a hugely significant health outcome. Certain factors increase the maternal risk for mental illness. Challenges related to poverty, migration, extreme stress, exposure to violence, emergency and conflict situations, natural disasters, and pandemics can exacerbate mental health disorders. It is widely expected that perinatal mental health is being negatively affected during the present COVID-19 pandemic. Methods: A review of studies that reported a measurement tool to assess perinatal mental health outcomes during the COVID-19 pandemic was conducted following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. PubMed, CINAHL, and Google Scholar were used to search for peer-reviewed studies published after late 2019, in accordance with the emergence of the virus. The search resulted in the inclusion of ten studies. Approach to measure health outcome: The main approach to measure perinatal mental illness is the use of self-administered, validated questionnaires, usually in the clinical setting. Summary: Widespread use of these tools has afforded the clinical and research communities the ability to identify and support women who may be suffering from mental illness disorders during a pandemic. More research is needed to validate tools in other vulnerable, perinatal populations.

Keywords: mental health during covid, perinatal mental health, perinatal mental health measurement tools, perinatal mental health tools

Procedia PDF Downloads 135
2771 Modeling and Analyzing the WAP Class 2 Wireless Transaction Protocol Using Event-B

Authors: Rajaa Filali, Mohamed Bouhdadi

Abstract:

This paper presents an incremental formal development of the Wireless Transaction Protocol (WTP) in Event-B. WTP is part of the Wireless Application Protocol (WAP) architecture and provides a reliable request-response service. To model and verify the protocol, we use the formal technique Event-B, which provides an accessible and rigorous development method. This interaction between modelling and proving reduces complexity and helps to eliminate misunderstandings, inconsistencies, and specification gaps. As a result, the verification of WTP allows us to find some deficiencies in the current specification.

Keywords: event-B, wireless transaction protocol, proof obligation, refinement, Rodin, ProB

Procedia PDF Downloads 317
2770 Dynamic Measurement System Modeling with Machine Learning Algorithms

Authors: Changqiao Wu, Guoqing Ding, Xin Chen

Abstract:

In this paper, ways of modeling dynamic measurement systems are discussed. Specifically, a linear system with a single input and single output can be modeled with a shallow neural network, and gradient-based optimization algorithms are then used to search for the proper coefficients. In addition, methods based on the normal equation and on second-order gradient descent are proposed to accelerate the modeling process, and ways of obtaining better gradient estimates are discussed. It is shown that the mathematical essence of the learning objective is maximum likelihood under Gaussian noise. For conventional gradient descent, mini-batch learning and gradient with momentum contribute to faster convergence and enhance the model's ability. Lastly, experimental results proved the effectiveness of the second-order gradient descent algorithm and indicated that optimization with the normal equation was the most suitable for linear dynamic models.
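
A minimal sketch of the normal-equation solution mentioned above, applied to a linear single-input single-output model written as y = Xw, is shown below; the simulated system, its order, and the data are assumptions for illustration.

```python
# Normal-equation fit of a linear SISO dynamic model on simulated data.
import numpy as np

rng = np.random.default_rng(0)

# Simulated SISO dynamic system: y[k] = 0.6*y[k-1] + 0.3*u[k] + noise
N = 500
u = rng.standard_normal(N)
y = np.zeros(N)
for k in range(1, N):
    y[k] = 0.6 * y[k - 1] + 0.3 * u[k] + 0.01 * rng.standard_normal()

# Regressor matrix with one past output and the current input (assumed model order)
X = np.column_stack([y[:-1], u[1:]])
t = y[1:]

# Normal equation: w = (X^T X)^{-1} X^T t  (solve() avoids forming the inverse explicitly)
w = np.linalg.solve(X.T @ X, X.T @ t)
print(w)   # close to [0.6, 0.3], the maximum-likelihood estimate under Gaussian noise
```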

Keywords: dynamic system modeling, neural network, normal equation, second order gradient descent

Procedia PDF Downloads 127
2769 Spaces of Interpretation: Personal Space

Authors: Yehuda Roth

Abstract:

In quantum theory, a system’s time evolution is predictable unless an observer performs measurement, as the measurement process can randomize the system. This randomness appears when the measuring device does not accurately describe the measured item, i.e., when the states characterizing the measuring device appear as a superposition of those being measured. When such a mismatch occurs, the measured data randomly collapse into a single eigenstate of the measuring device. This scenario resembles the interpretation process in which the observer does not experience an objective reality but interprets it based on preliminary descriptions initially ingrained into his/her mind. This distinction is the motivation for the present study in which the collapse scenario is regarded as part of the interpretation process of the observer. By adopting the formalism of the quantum theory, we present a complete mathematical approach that describes the interpretation process. We demonstrate this process by applying the proposed interpretation formalism to the ambiguous image "My wife and mother-in-law" to identify whether a woman in the picture is young or old.

Keywords: quantum-like interpretation, ambiguous image, determination, quantum-like collapse, classified representation

Procedia PDF Downloads 104
2768 Defence Ethics: A Performance Measurement Framework for the Defence Ethics Program

Authors: Allyson Dale, Max Hlywa

Abstract:

The Canadian public expects the highest moral standards from Canadian Armed Forces (CAF) members and Department of National Defence (DND) employees. The Chief, Professional Conduct and Culture (CPCC) stood up in April 2021 with the mission of ensuring that the defence culture and members’ conduct are aligned with the ethical principles and values that the organization aspires towards. The Defence Ethics Program (DEP), which stood up in 1997, is a values-based ethics program for individuals and organizations within the DND/CAF and now falls under CPCC. The DEP is divided into five key functional areas, including policy, communications, collaboration, training and education, and advice and guidance. The main focus of the DEP is to foster an ethical culture within defence so that members and organizations perform to the highest ethical standards. The measurement of organizational ethics is often complex and challenging. In order to monitor whether the DEP is achieving its intended outcomes, a performance measurement framework (PMF) was developed using the Director General Military Personnel Research and Analysis (DGMPRA) PMF development process. This evidence-based process is based on subject-matter expertise from the defence team. The goal of this presentation is to describe each stage of the DGMPRA PMF development process and to present and discuss the products of the DEP PMF (e.g., logic model). Specifically, first, a strategic framework was developed to provide a high-level overview of the strategic objectives, mission, and vision of the DEP. Next, Key Performance Questions were created based on the objectives in the strategic framework. A logic model detailing the activities, outputs (what is produced by the program activities), and intended outcomes of the program was developed to demonstrate how the program works. Finally, Key Performance Indicators were developed based on both the intended outcomes in the logic model and the Key Performance Questions in order to monitor program effectiveness. The Key Performance Indicators measure aspects of organizational ethics such as ethical conduct and decision-making, DEP collaborations, and knowledge and awareness of the Defence Ethics Code while leveraging ethics-related items from multiple DGMPRA surveys where appropriate.

Keywords: defence ethics, ethical culture, organizational performance, performance measurement framework

Procedia PDF Downloads 105
2767 Application of Groundwater Level Data Mining in Aquifer Identification

Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen

Abstract:

Investigation and research are key to the conjunctive use of surface water and groundwater resources. The hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually based on geological drill logs, the structure of wells, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built, and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system; it reflects the system response, combining the hydrogeological structure with groundwater injection and extraction. This study applies analytical tools to the observation database to develop a methodology for the identification of confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater regression curve analysis, and a decision tree. The developed methodology is then applied to groundwater layer identification in two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis uses the Fourier transform to process the time-series groundwater-level observations and analyzes the daily-frequency amplitude of the groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater regression curve, i.e., the average rate of groundwater regression, is used to analyze the internal flux in the groundwater system and the flux caused by artificial activities. The decision tree uses the information obtained from the abovementioned analytical tools and determines the best estimate of the hydrogeological structure. The developed method reaches a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. This extraordinary accuracy indicates that the developed methodology is a great tool for identifying hydrogeological structures.
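
Two of the steps described above can be sketched as follows: the amplitude of the daily (pumping-driven) component of a groundwater-level series is extracted with an FFT, and such features are fed to a decision tree that labels the aquifer type; the data, features, and class signatures are synthetic assumptions, not the study's observations.

```python
# Illustrative sketch: FFT-based daily-amplitude feature plus a decision tree
# that classifies the aquifer as confined or unconfined. All data and the
# assumed signatures of each class are synthetic, for demonstration only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def daily_amplitude(levels, samples_per_day=24):
    """Amplitude of the 1 cycle/day component of an hourly groundwater-level series."""
    spectrum = np.fft.rfft(levels - levels.mean())
    freqs = np.fft.rfftfreq(levels.size, d=1.0 / samples_per_day)   # cycles per day
    idx = np.argmin(np.abs(freqs - 1.0))
    return 2.0 * np.abs(spectrum[idx]) / levels.size

rng = np.random.default_rng(0)
X, y = [], []
for label, pump_amp in [(1, 0.30), (0, 0.02)]:    # 1 = confined (strong daily pumping signal, assumed)
    for _ in range(50):
        t = np.arange(30 * 24)                    # 30 days of hourly data
        levels = pump_amp * np.sin(2 * np.pi * t / 24) + 0.05 * rng.standard_normal(t.size)
        rain_lag_days = rng.uniform(1, 5) if label else rng.uniform(0.1, 1)  # assumed rainfall-response lag
        X.append([daily_amplitude(levels), rain_lag_days])
        y.append(label)

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[0.25, 3.0]]))                 # likely classified as confined (1)
```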

Keywords: aquifer identification, decision tree, groundwater, Fourier transform

Procedia PDF Downloads 157
2766 Performance Evaluation of the CareSTART S1 Analyzer for Quantitative Point-Of-Care Measurement of Glucose-6-Phosphate Dehydrogenase Activity

Authors: Haiyoung Jung, Mi Joung Leem, Sun Hwa Lee

Abstract:

Background & Objective: Glucose-6-phosphate dehydrogenase (G6PD) deficiency is a genetic abnormality that results in an inadequate amount of G6PD, leading to increased susceptibility of red blood cells to reactive oxygen species and hemolysis. The present study aimed to evaluate the careSTART™ S1 analyzer for measuring the G6PD activity to hemoglobin (Hb) ratio. Methods: Precision for G6PD activity and hemoglobin measurement was evaluated using control materials at two levels on five repeated runs per day for five days. The analytical performance of the careSTART™ S1 analyzer was compared with spectrophotometry in 40 patient samples. The reference ranges suggested by the manufacturer were validated in 20 healthy males and 20 healthy females. Results: The careSTART™ S1 analyzer demonstrated precision of 6.0% for the low-level (14~45 U/dL) and 2.7% for the high-level (60~90 U/dL) control in G6PD activity, and 1.4% in hemoglobin (7.9~16.3 u/g Hb). A comparison study of the G6PD to Hb ratio between the careSTART™ S1 analyzer and spectrophotometry showed an average difference of 29.1%, with a positive bias for the careSTART™ S1 analyzer. All samples from the healthy population were validated against the suggested reference ranges for males (≥2.19 U/g Hb) and females (≥5.83 U/g Hb). Conclusion: The careSTART™ S1 analyzer demonstrated good analytical performance and can replace the current spectrophotometric measurement of G6PD enzyme activity. From the perspective of clinical laboratory management, it is a reasonable option as a point-of-care analyzer, with minimal handling of samples and reagents and automatic calculation of the ratio of measured G6PD activity to Hb concentration, minimizing the clerical errors involved in manual calculation.

Keywords: POCT, G6PD, performance evaluation, careSTART

Procedia PDF Downloads 64
2765 Mending Broken Fences Policing: Developing the Intelligence-Led/Community-Based Policing Model (IP-CP) and Quality/Quantity/Crime (QQC) Model

Authors: Anil Anand

Abstract:

Despite enormous strides made during the past decade, particularly with the adoption and expansion of community policing, there remains much that police leaders can do to improve police-public relations. The urgency is particularly evident in cities across the United States and Europe, where an increasing number of police interactions over the past few years have ignited large, sometimes even national, protests against police policy and strategy, highlighting a gap between what police leaders feel they have achieved in terms of public satisfaction, support, and legitimacy and the perception of bias among many marginalized communities. The decision on which policing strategy is chosen over another, how many resources are allocated, and how strenuously the policy is applied resides primarily with the police and the units and subunits tasked with its enforcement. The scope and opportunity for police officers to influence social attitudes and social policy are important elements that cannot be overstated. How do police leaders, for instance, decide when to apply one strategy (say, community-based policing) over another, like intelligence-led policing? How do police leaders measure performance and success? Should these measures be based on quantitative preferences over qualitative, or should the preference be based on some other criteria? And how do police leaders define, allow, and control discretionary decision-making? Mending Broken Fences Policing provides police and security services leaders with a model based on social cohesion that incorporates intelligence-led and community policing (IP-CP), supplemented by a quality/quantity/crime (QQC) framework, to provide a four-step process for the articulable application of police intervention, performance measurement, and application of discretion.

Keywords: social cohesion, quantitative performance measurement, qualitative performance measurement, sustainable leadership

Procedia PDF Downloads 295
2764 Application Research of Stilbene Crystal for the Measurement of Accelerator Neutron Sources

Authors: Zhao Kuo, Chen Liang, Zhang Zhongbing, Ruan Jinlu, He Shiyi, Xu Mengxuan

Abstract:

Stilbene (C₁₄H₁₂) is well known as one of the most useful organic scintillators for the pulse shape discrimination (PSD) technique because of its good scintillation properties. An on-line acquisition system and an off-line acquisition system were developed, the former with several CAMAC standard plug-ins, NIM plug-ins, and a neutron/γ discrimination plug-in named 2160A, and the latter with a high-sampling-rate digital oscilloscope; both used stilbene crystals coupled to photomultiplier tube (PMT) detectors for accelerator neutron source measurements carried out at the China Institute of Atomic Energy. Pulse amplitude spectra and charge amplitude spectra were recorded in real time after good neutron/γ discrimination, with best PSD figures-of-merit (FoMs) of 1.756 for the D-D accelerator neutron source and 1.393 for the D-T accelerator neutron source. With the on-line acquisition system, the fraction of neutron events among all events was 80% and the neutron detection efficiency was 5.21% for the D-D accelerator neutron source, and 50% and 1.44%, respectively, for the D-T accelerator neutron source after subtracting the scattering background. Pulse waveform signals were acquired randomly by the off-line acquisition system while the on-line acquisition system was working. The PSD FoMs obtained by the off-line acquisition system, using off-line waveform-digitization processing with the charge integration method on just 1000 pulses, were 2.158 for the D-D accelerator neutron source and 1.802 for the D-T accelerator neutron source. In addition, the fractions of neutron events among all events obtained by the off-line acquisition system matched those of the on-line acquisition system very well. The pulse information recorded by the off-line acquisition system can be reused to adjust the parameters or methods of PSD research and to obtain neutron charge amplitude spectra or pulse amplitude spectra after digital analysis with a limited number of pulses. The off-line acquisition system showed equivalent or better measurement performance than the on-line system with a limited number of pulses, which indicates a feasible method based on stilbene crystal detectors for the measurement of prompt neutron sources, such as accelerator neutron sources, which emit a large number of neutrons in a short time.
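
A minimal sketch of the charge integration PSD method and the figure-of-merit used above is given below: the ratio of delayed (tail) charge to total charge separates neutrons from gammas, and FoM is the peak separation divided by the sum of the two FWHMs; the pulse shapes, gate lengths, and populations are assumptions for illustration.

```python
# Illustrative charge-integration PSD and figure-of-merit on synthetic pulses.
import numpy as np

def psd_ratio(pulse, dt, short_gate=30e-9, long_gate=300e-9):
    """Tail-to-total charge ratio from simple gated integration of one pulse."""
    n_short = int(short_gate / dt)
    n_long = int(long_gate / dt)
    total = pulse[:n_long].sum()
    tail = pulse[n_short:n_long].sum()
    return tail / total if total > 0 else 0.0

def figure_of_merit(ratios_gamma, ratios_neutron):
    """FoM = peak separation divided by the sum of the two FWHMs."""
    fwhm = lambda x: 2.355 * np.std(x)          # Gaussian approximation
    sep = abs(np.mean(ratios_neutron) - np.mean(ratios_gamma))
    return sep / (fwhm(ratios_gamma) + fwhm(ratios_neutron))

# Synthetic pulses: neutrons have a larger slow-decay component than gammas
rng = np.random.default_rng(0)
dt = 1e-9
t = np.arange(0, 400e-9, dt)

def make_pulse(slow_fraction):
    fast = np.exp(-t / 5e-9)
    slow = np.exp(-t / 120e-9)
    return (1 - slow_fraction) * fast + slow_fraction * slow + 0.002 * rng.standard_normal(t.size)

gammas = [psd_ratio(make_pulse(0.05), dt) for _ in range(1000)]
neutrons = [psd_ratio(make_pulse(0.25), dt) for _ in range(1000)]
print(f"FoM = {figure_of_merit(gammas, neutrons):.2f}")
```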

Keywords: stilbene crystal, accelerator neutron source, neutron / γ discrimination, figure-of-merits, CAMAC, waveform digitization

Procedia PDF Downloads 187
2763 Analysis of the Acoustic Performance of Vertical Internal Seals with PET Wool According to NBR 15.575-4 in the Green Towers Building-DF

Authors: Lucas Aerre, Wallesson Faria, Roberto Pimentel, Juliana Santos

Abstract:

Noise is an extremely disturbing and irritating element in the lives of people and organizations; its consequences are closely connected with human health as well as with financial and economic aspects. To improve the efficiency of buildings in Brazil in general, a performance standard, NBR 15.575, was created, in which buildings are considered in a more systemic way while following the requirements of the standard. Acoustic performance is one such requirement. Based on this, the present work was developed with the objective of evaluating, through acoustic measurements, the acoustic performance of vertical internal seals subject to airborne noise in a building in the city of Brasília-DF. A short theoretical basis is presented, after which the measurement procedures are described following the control method established by the standard, and the results are evaluated against its parameters. The measurement performed between rooms of the same unit yielded a standardized sound pressure level difference (DnT,w) equal to 40 dB, which is classified within the minimum performance level required by the standard.
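
For reference, the standardized level difference reported above is obtained in field measurements from the level difference between rooms corrected by the receiving-room reverberation time; the sketch below shows that calculation with assumed values, not the paper's measurements.

```python
# DnT = L1 - L2 + 10*log10(T/T0), with the reference reverberation time T0 = 0.5 s.
# The input levels and reverberation time below are assumed values.
import math

def standardized_level_difference(L1, L2, T, T0=0.5):
    """DnT in dB from source-room level L1, receiving-room level L2, and reverberation time T."""
    return L1 - L2 + 10.0 * math.log10(T / T0)

L1 = 94.0   # dB, average sound pressure level in the source room (assumed)
L2 = 57.0   # dB, average sound pressure level in the receiving room (assumed)
T = 0.62    # s, reverberation time of the receiving room (assumed)

print(f"DnT = {standardized_level_difference(L1, L2, T):.1f} dB")
# The single-number rating DnT,w is then obtained by fitting the ISO 717-1
# reference curve to the per-band DnT values.
```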

Keywords: airborne noise, performance standard, soundproofing, vertical seal

Procedia PDF Downloads 297
2762 Temperament and Psychopathology in Children of Patients Suffering from Schizophrenia

Authors: Rushi Naaz, Diksha Suchdeva

Abstract:

Background: Temperament is a very important aspect of functioning that needs to be understood in children of patients suffering from schizophrenia. Children of parents with mental disorders have a substantially increased risk of psychiatric illness and may exhibit a range of problems, from minor variations in temperament and adjustment to manifest psychiatric disorder. Method: A case-control study was conducted to examine the temperament characteristics and psychopathology of children of patients suffering from schizophrenia compared with those of healthy controls. Both groups were evaluated on the Temperament Measurement Schedule and the Childhood Psychopathology Measurement Schedule. Results: The results showed that children of patients suffering from schizophrenia were more withdrawing, less adaptable, less sociable, and had a lower activity level than children of healthy parents. However, on the measure of psychopathology, no significant difference was found. Conclusion: Since temperament can be identified at an early age, children at risk of later developing the disorder could be identified early enough for possible primary intervention.

Keywords: children, childhood psychopathology, parental psychopathology, psychiatric disorders, schizophrenia, temperament

Procedia PDF Downloads 372