Search results for: statistical verification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4351

4171 Association of Body Composition Parameters with Lower Limb Strength and Upper Limb Functional Capacity in Quilombola Remnants

Authors: Leonardo Costa Pereira, Frederico Santos Santana, Mauro Karnikowski, Luís Sinésio Silva Neto, Aline Oliveira Gomes, Marisete Peralta Safons, Margô Gomes De Oliveira Karnikowski

Abstract:

In Brazil, projections of population aging follow worldwide trends: the birth rate is expected to be surpassed by the mortality rate around the year 2045. Historically, the Black population of Brazil suffered for several centuries under the oppression of dominant classes. One group of Black Brazilians stands out for its territorial, historical, and social characteristics: for centuries, its members isolated themselves in small communities in order to maintain their freedom and culture. The isolation of the Quilombola communities has had consequences for both the socioeconomic conditions and the health of their members. The objective of the present study was therefore to verify the association of body composition parameters with lower limb strength and upper limb functional capacity in Quilombola remnants. The research was approved by an ethics committee (approval no. 1,771,159). Anthropometric evaluations of hip and waist circumference, body mass, and height were performed. Body composition was assessed through the relationship between stature and body mass (BM), generating the body mass index (BMI), and through dual-energy X-ray absorptiometry (DEXA). The Timed Up and Go (TUG) test was used to evaluate functional capacity, and a maximum repetition test (1MR) for knee extension and handgrip (HG) was applied to analyze strength magnitude. Statistical analysis was performed using the statistical package SPSS 22.0. The Shapiro-Wilk normality test was applied, and Pearson or Spearman correlation tests were adopted as indicated. The sample (n = 18) was composed of 66.7% female individuals with a mean age of 66.07 ± 8.95 years. The sample's body fat percentage (%BF) (35.65 ± 10.73) exceeded the recommendations for the age group, as did the anthropometric parameters of hip (90.91 ± 8.44 cm) and waist circumference (80.37 ± 17.5 cm). The relationship between height (1.55 ± 0.1 m) and body mass (63.44 ± 11.25 kg) generated a BMI of 24.16 ± 7.09 kg/m2, which was considered normal. TUG performance was 10.71 ± 1.85 s. The 1MR and HG tests yielded 46.67 ± 13.06 kg and 23.93 ± 7.96 kgf, respectively. The correlation analyses were characterized by a high frequency of significant correlations for the height, dominant arm mass (DAM), %BF, 1MR, and HG variables. In addition, correlations between HG and BM (r = 0.67, p = 0.005), height (r = 0.51, p = 0.004), and DAM (r = 0.55, p = 0.026) were observed. Lower limb strength correlated with BM (r = 0.69, p = 0.003), height (r = 0.62, p = 0.01), and DAM (r = 0.772, p = 0.001). We conclude that the simple relationship between mass and height is not the only influence on predictive parameters of strength or functionality; verification of body composition conditions is also important. For this population, height appears to be a good predictor of strength and body composition.
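
To make the decision rule in this abstract concrete (Shapiro-Wilk screening, then Pearson or Spearman as indicated), a minimal sketch in Python follows. It is an illustration of the procedure only, not the authors' SPSS workflow; the sample values are invented placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
height = rng.normal(1.55, 0.10, 18)            # hypothetical sample, n = 18
handgrip = 15 * height + rng.normal(0, 2, 18)  # hypothetical HG values (kgf)

def correlate(x, y, alpha=0.05):
    """Pearson if both variables pass Shapiro-Wilk normality, Spearman otherwise."""
    normal = stats.shapiro(x).pvalue > alpha and stats.shapiro(y).pvalue > alpha
    if normal:
        r, p = stats.pearsonr(x, y)
        return "Pearson", r, p
    r, p = stats.spearmanr(x, y)
    return "Spearman", r, p

print(correlate(height, handgrip))  # -> (test name, r, p-value)
```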

Keywords: African Continental Ancestry Group, body composition, functional capacity, strength

Procedia PDF Downloads 250
4170 Assessing the Accessibility to Primary Percutaneous Coronary Intervention

Authors: Tzu-Jung Tseng, Pei-Hsuen Han, Tsung-Hsueh Lu

Abstract:

Background: Ensuring that patients with ST-elevation myocardial infarction (STEMI) can access hospitals that perform percutaneous coronary intervention (PCI) in time is an important concern of healthcare managers. One commonly used method to assess population coverage of access to a PCI hospital is the GIS-estimated linear distance (crow's-fly distance) between the district centroid and the nearest PCI hospital. If the distance is within a given threshold (such as 20 km), the entire population of that district is considered to have appropriate access to PCI. The premise of using the district centroid to estimate the coverage of the population residing in that district is that people are evenly distributed across the district. In reality, population density is not evenly distributed within an administrative district, especially in rural districts. Fortunately, the Taiwan government recently released the basic statistical area (450 people per area on average), which provides an opportunity to estimate the coverage of population access to PCI services more accurately. Objectives: We aimed in this study to compare the population covered by a given PCI hospital according to the traditional administrative district versus the basic statistical area. We further examined whether the difference between the two geographic units is larger in rural areas than in urban areas. Method: We selected two hospitals in Tainan City for this analysis: hospital A is in an urban area, and hospital B is in a rural area. The population in each traditional administrative district and basic statistical area was obtained from the Taiwan National Geographic Information System, Ministry of Internal Affairs. Results: The estimated population living within 20 km of hospitals A and B was 1,515,846 and 323,472, respectively, according to the traditional administrative district, and 1,506,325 and 428,556 according to the basic statistical area. Conclusion: In urban areas, the estimated population with access to PCI services was similar between the two geographic units. In rural areas, however, the two units yield substantially different estimates of the access population.
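
The coverage estimate described here reduces to summing the population of every geographic unit whose centroid lies within the radius. A hedged sketch follows; the haversine great-circle distance stands in for the GIS linear distance, and all coordinates and counts are invented for illustration.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

hospital = (23.00, 120.22)  # hypothetical PCI hospital location

def covered_population(units, radius_km=20.0):
    """units: iterable of (lat, lon, population) centroids."""
    return sum(pop for lat, lon, pop in units
               if haversine_km(lat, lon, *hospital) <= radius_km)

districts = [(23.10, 120.30, 50000), (23.40, 120.50, 30000)]  # coarse units
basic_areas = [(23.08, 120.28, 450), (23.12, 120.33, 450)]    # fine units, ~450 people each
print(covered_population(districts), covered_population(basic_areas))
```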

Keywords: accessibility, basic statistical area, modifiable areal unit problem (MAUP), percutaneous coronary intervention (PCI)

Procedia PDF Downloads 430
4169 Verification of Low-Dose Diagnostic X-Ray as a Tool for Relating Vital Internal Organ Structures to External Body Armour Coverage

Authors: Natalie A. Sterk, Bernard van Vuuren, Petrie Marais, Bongani Mthombeni

Abstract:

Injuries to the internal structures of the thorax and abdomen remain a leading cause of death among soldiers. Body armour is a standard-issue piece of military equipment designed to protect the vital organs against ballistic and stab threats. When configured for maximum protection, the excessive weight and size of the armour may limit soldier mobility and increase physical fatigue and discomfort. Providing soldiers with more armour than necessary may, therefore, hinder their ability to react rapidly in life-threatening situations. The capability to determine the optimal trade-off between the amount of essential anatomical coverage and the hindrance to soldier performance may significantly enhance the design of armour systems. The current study aimed to develop and pilot a methodology for relating internal anatomical structures to actual armour plate coverage in real time using low-dose diagnostic X-ray scanning. Several pilot scanning sessions were held at the Lodox Systems (Pty) Ltd head office in South Africa. Testing involved using the Lodox eXero-dr to scan dummy trunk rigs at various measurement angles and heights, as well as human participants wearing correctly fitted body armour while positioned in supine, prone shooting, seated, and kneeling shooting postures. The sizing and metrics obtained from the Lodox eXero-dr were then confirmed against a verification board of known dimensions. Results indicated that low-dose diagnostic X-ray can clearly identify the vital internal structures of the aortic arch, heart, and lungs in relation to the position of the external armour plates. Further testing is still required to fully and accurately identify the inferior liver boundary, inferior vena cava, and spleen. The scans produced in the supine, prone, and seated postures provided superior image quality over the kneeling posture. The X-ray source and detector distances from the object must be standardised to control for possible magnification changes and to allow comparison. To account for this, specific scanning heights and angles were identified to allow parallel scanning of the relevant areas. Low-dose diagnostic X-ray provides a non-invasive, safe, and rapid technique for relating vital internal structures to external structures. This capability can be used to re-evaluate the anatomical coverage required for essential protection while optimising armour design and fit for soldier performance.

Keywords: body armour, low-dose diagnostic X-ray, scanning, vital organ coverage

Procedia PDF Downloads 96
4168 Transient Simulation Using SPACE for ATLAS Facility to Investigate the Effect of Heat Loss on Major Parameters

Authors: Suhib A. Abu-Seini, Kyung-Doo Kim

Abstract:

A heat loss model for the ATLAS facility was introduced using SPACE code predefined correlations and various dialing factors. All previous simulations were carried out with a heat-loss-free input: the facility was considered completely insulated, and the core power was reduced by the experimentally measured heat-loss values to compensate for the loss of heat. This study instead considers heat loss throughout the simulation. The new heat loss model will affect the SPACE code simulation, since heat leaking out of the system during a transient alters many parameters related to temperature and temperature difference. To that end, a Station Blackout followed by a multiple Steam Generator Tube Rupture accident will be simulated using both the insulated-system approach and the newly introduced steady-state heat loss input. Major parameters such as system temperatures, pressures, and flow rates will be compared, and various analyses will be proposed on that basis, since the experimental values will not serve as the reference for validating the expected outcome. This study will show the significance of considering heat loss in the prevention and mitigation of various incidents and of design-basis and beyond-design-basis accidents, as it gives a detailed account of the behavior of the ATLAS facility during both steady state and a major transient. It will also present a verification of how credible the acquired ATLAS data are, since the steady-state heat loss values were already mismatched between the SPACE simulation results and the ATLAS data acquisition system. Acknowledgement: This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) and the Ministry of Trade, Industry & Energy (MOTIE) of the Republic of Korea.

Keywords: ATLAS, heat loss, simulation, SPACE, station blackout, steam generator tube rupture, verification

Procedia PDF Downloads 202
4167 A Microwave Heating Model for Endothermic Reaction in the Cement Industry

Authors: Sofia N. Gonçalves, Duarte M. S. Albuquerque, José C. F. Pereira

Abstract:

Microwave technology has been gaining importance in decarbonization efforts in high-energy-demand industries. Despite the several numerical models presented in the literature, a proper verification and validation exercise is still lacking. This is important and required to evaluate the accuracy and adequacy of the physical process model. Another issue concerns impedance matching, an important mechanism used in microwave experiments to increase electromagnetic efficiency. Such a mechanism is not available in current computational tools, thus requiring an external numerical procedure. A numerical model was implemented to study the continuous processing of limestone with microwave heating. This process requires the material to be heated to a temperature that prompts a highly endothermic reaction. Both a 2D and a 3D model were built in COMSOL Multiphysics to solve the two-way coupling between the Maxwell and energy equations, along with the coupling between the heat transfer phenomena and the limestone's endothermic reaction. The 2D model was used to study and evaluate the required numerical procedure, and it also serves as a benchmark test that allows other authors to implement impedance matching procedures. To achieve this goal, a controller built in MATLAB was used to continuously match the cavity impedance and predict the energy required by the system, thus successfully avoiding energy inefficiencies. The 3D model reproduces realistic results and therefore supports the main conclusions of this work. Limestone was modeled as a continuous flow under the transport of concentrated species, with material and kinetic properties taken from the literature. Verification and validation of the coupled model were performed separately from the chemical kinetic model. The chemical kinetic model was found to describe the chosen kinetic equation correctly through comparison of numerical results with experimental data. A solution verification was made for the electromagnetic interface, where second-order and fourth-order accurate schemes were found for linear and quadratic elements, respectively, with numerical uncertainty lower than 0.03%. Regarding the coupled model, the numerical error was shown to diverge for the heat transfer interface with the mapped mesh. Results showed numerical stability for the triangular mesh, with numerical uncertainty below 0.1%. This study evaluated the influence of limestone velocity, heat transfer, and load on thermal decomposition and overall process efficiency. The velocity and heat transfer coefficient were studied with the 2D model, while different material loads were studied with the 3D model. Both models proved highly unstable when solving non-linear temperature distributions. High-velocity flows exhibited a propensity for thermal runaway, and the thermal efficiency tended to stabilize for the higher velocities and higher filling ratios. Microwave efficiency exhibited an optimal velocity for each heat transfer coefficient, indicating that electromagnetic efficiency is a consequence of the uniformity of the energy distribution. The 3D results indicated inefficient development of the electric field for low filling ratios. Thermal efficiencies higher than 90% were found for the higher loads, and microwave efficiencies of up to 75% were achieved. The 80% fill ratio proved to be the optimal load, with an associated global efficiency of 70%.
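
The external impedance-matching procedure amounts to a feedback loop that tunes a matching parameter until the reflected power approaches zero. The sketch below illustrates that loop generically in Python: `reflection` is a toy stand-in for the solver's |S11|² response, and the gradient-descent update is an assumed rule for illustration, not the authors' MATLAB controller.

```python
def reflection(stub, load=0.37):
    """Toy |S11|^2: perfectly matched when stub == load (placeholder physics)."""
    return (stub - load) ** 2 / (1 + (stub - load) ** 2)

stub, step = 0.0, 0.1
for it in range(200):
    # central-difference gradient of the reflected power w.r.t. the tuning parameter
    grad = (reflection(stub + 1e-4) - reflection(stub - 1e-4)) / 2e-4
    stub -= step * grad                  # descend on reflected power
    if reflection(stub) < 1e-8:          # matched: nearly all power absorbed
        break
print(f"matched after {it} iterations, |S11|^2 = {reflection(stub):.2e}")
```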

Keywords: multiphysics modeling, microwave heating, verification and validation, endothermic reactions modeling, impedance matching, limestone continuous processing

Procedia PDF Downloads 114
4166 Stability Design by Geometrical Nonlinear Analysis Using Equivalent Geometric Imperfections

Authors: S. Fominow, C. Dobert

Abstract:

The present article describes research dealing with the development of equivalent geometric imperfections for the stability design of steel members, considering lateral-torsional buckling. The application of these equivalent imperfections accounts for the stiffness-reducing effects of inelasticity and residual stresses, which reduce the load-carrying capacity of slender members and structures. This allows the application of a simplified design method performed in three steps: application of the equivalent geometric imperfections, determination of internal forces using geometrically non-linear analysis (GNIA), and verification of the cross-section resistance at the most unfavourable location. All three verification steps are closely related and influence the results. The derivation of the equivalent imperfections was carried out in several steps. First, reference lateral-torsional buckling resistances for various rolled I-sections, slenderness grades, load shapes, and steel grades were determined. This was done either with geometrically and materially non-linear analysis with geometrical imperfections and residual stresses (GMNIA) or, for standard cases, with the equivalent member method. The required sizes of the equivalent imperfections were then derived so that applying the design method reproduces the reference lateral-torsional buckling resistances. For this purpose, an FEM-based program was developed. Based on these results, several proposals for the specification of equivalent geometric imperfections have been developed. These differ in the shape of the applied equivalent geometric imperfection, the model of the cross-sectional resistance, and the steel grade. The proposed design methods allow a wide range of applications and a reliable calculation of lateral-torsional buckling resistances, as comparisons between the calculated resistances and the reference resistances have shown.

Keywords: equivalent geometric imperfections, GMNIA, lateral-torsional buckling, non-linear finite element analysis

Procedia PDF Downloads 130
4165 Security of Database Using Chaotic Systems

Authors: Eman W. Boghdady, A. R. Shehata, M. A. Azem

Abstract:

Database (DB) security demands permitting the actions of authorized users on the DB and the objects inside it while prohibiting those of unauthorized users and intruders. Successful organizations demand confidentiality for their DBs: they do not allow unauthorized access to their data and information, and they require assurance that their data are protected against malicious or accidental modification. DB protection and confidentiality are thus the central security concerns. There are four types of control for DB protection: access control, information flow control, inference control, and cryptographic control. Cryptographic control is considered the backbone of DB security; it secures the DB by encryption during storage and communication. Current cryptographic techniques fall into two classes: traditional classical cryptography using standard algorithms (DES, AES, IDEA, etc.) and chaos cryptography using continuous (Chua, Rössler, Lorenz, etc.) or discrete (Logistic, Hénon, etc.) systems. The defining characteristic of chaos is its extreme sensitivity to the initial conditions of the system. In this paper, DB-security systems based on chaotic algorithms are described. Pseudo-Random Number Generators (PRNGs) based on the different chaotic algorithms are implemented in Matlab, and their statistical properties are evaluated using NIST and other statistical test suites. These algorithms are then used to secure a conventional DB (plaintext), and the statistical properties of the resulting ciphertext are likewise tested. To increase the complexity of the PRNGs and to pass all the NIST statistical tests, we propose two hybrid PRNGs: one based on two chaotic Logistic maps and another based on two chaotic Hénon maps, where the two chaotic maps run side by side starting from random, independent initial conditions and parameters (the encryption keys). The resulting hybrid PRNGs passed the NIST statistical test suite.
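
A minimal sketch of the proposed hybrid construction follows: two Logistic maps iterated side by side from independent initial conditions and parameters (the keys), with their outputs combined into one keystream. The byte extraction and XOR combination shown are common choices assumed for illustration; the abstract does not specify the combining function.

```python
def hybrid_logistic_prng(x1, x2, r1=3.99, r2=3.97, n=32, burn=1000):
    """Keystream from two independent chaotic Logistic maps x <- r*x*(1-x)."""
    out = bytearray()
    for i in range(burn + n):
        x1 = r1 * x1 * (1.0 - x1)   # first map
        x2 = r2 * x2 * (1.0 - x2)   # second, independent map
        if i >= burn:               # discard the transient
            out.append((int(x1 * 256) ^ int(x2 * 256)) & 0xFF)  # XOR-combine
    return bytes(out)

# (x1, x2, r1, r2) play the role of the encryption key
key = hybrid_logistic_prng(x1=0.123456, x2=0.654321)
cipher = bytes(p ^ k for p, k in zip(b"record: balance=100", key))
```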

Keywords: algorithms and data structure, DB security, encryption, chaotic algorithms, Matlab, NIST

Procedia PDF Downloads 242
4164 Simulation and Performance Evaluation of Transmission Lines with Shield Wire Segmentation against Atmospheric Discharges Using ATPDraw

Authors: Marcio S. da Silva, Jose Mauricio de B. Bezerra, Antonio E. de A. Nogueira

Abstract:

This paper analyzes the performance of transmission lines against atmospheric discharges when the option of sectioning the shield wire is taken, and verifies whether the change is tolerable. The goal established for this work was the complete modeling of a transmission line in the ATPDraw program, with the shield wire grounded at all towers and then at only some towers. The methodology adopted for the proposed evaluation was to choose an actual transmission line to serve as a case study. After the transmission line was chosen and its full topology and materials verified, the line was completely modeled using the ATPDraw software. Several atmospheric discharges were then simulated by striking the grounded shield wires at each tower. These simulations served to identify the behavior of the existing line under atmospheric discharges. After this first analysis, the same line was re-examined with shield wire segmentation. The shield wire segmentation technique aims to reduce induced losses in shield wires and is adopted on some transmission lines in Brazil. Under the same atmospheric discharge conditions, the transmission line, this time with shield wire segmentation, was evaluated again. The results showed that similar performance against atmospheric discharges can be obtained between a line with the shield wire grounded at multiple towers and the same line with shield wire segmentation, provided some precautions are adopted: verification of the ground resistance of the segmented shield wire, adequacy of the maximum length of the segmented gap, and evaluation of the separation length of the electrodes of the insulator spark gap, among others. In conclusion, given a correct assessment and the correct adjustment criteria, a transmission line with shield wire segmentation can perform very similarly to the traditional arrangement with multiple earths. This solution contributes in a very important way to the reduction of energy losses in transmission lines.

Keywords: atmospheric discharges, ATPDraw, shield wire, transmission lines

Procedia PDF Downloads 143
4163 Statistical Analysis and Impact Forecasting of Connected and Autonomous Vehicles on the Environment: Case Study in the State of Maryland

Authors: Alireza Ansariyar, Safieh Laaly

Abstract:

Over the last decades, the vehicle industry has shown increased interest in integrating autonomous, connected, and electric technologies into vehicle design, with the primary hope of improving mobility and road safety while reducing transportation's environmental impact. Using the State of Maryland (MD) in the United States as a pilot study, this research investigates CAVs' fuel consumption and air pollutants (CO, PM, and NOx) and utilizes linear regression models to predict CAVs' environmental effects. The Maryland transportation network was simulated in VISUM software, and data on a set of variables were collected through a comprehensive survey. Pollutant quantities and fuel consumption were obtained from the macro-simulation for the interval 2010 to 2021. Four linear regression models were then proposed to predict the future amounts of CO, NOx, and PM pollutants and of fuel consumption. The results highlighted that CAVs' pollutants and fuel consumption correlate significantly with the income, age, and race of CAV customers. Furthermore, the reliability of the four statistical models was compared with the reliability of the macro-simulation outputs for the year 2030. The statistical models in SPSS predicted the three pollutants and fuel consumption with errors below 9%. This study is expected to assist researchers and policymakers with planning decisions to reduce CAV environmental impacts in Maryland.
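
A hedged sketch of the kind of per-response regression described above follows: ordinary least squares of one pollutant on demographic predictors, with a percentage-error check in the spirit of the reported sub-9% figure. All numbers are synthetic placeholders, not the Maryland survey data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 120
income = rng.normal(70, 15, n)   # k$, hypothetical CAV-customer covariate
age = rng.normal(45, 12, n)      # years, hypothetical covariate
co = 50 - 0.1 * income - 0.05 * age + rng.normal(0, 2, n)  # synthetic CO response

X = np.column_stack([np.ones(n), income, age])
beta, *_ = np.linalg.lstsq(X, co, rcond=None)   # [intercept, b_income, b_age]
pred = X @ beta
mape = np.mean(np.abs((co - pred) / co)) * 100  # mean absolute percentage error
print(beta, f"MAPE = {mape:.1f}%")
```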

Keywords: connected and autonomous vehicles, statistical model, environmental effects, pollutants and fuel consumption, VISUM, linear regression models

Procedia PDF Downloads 417
4162 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics

Authors: Farhad Asadi, Mohammad Javad Mollakazemi

Abstract:

In this paper, Bayesian online inference models for data series are constructed with a change-point algorithm, which separates the observed time series into independent segments and studies changes in the data regime through the related statistical characteristics. Variations in the statistical characteristics of time series data often represent distinct phenomena in a dynamical system, such as a change of brain state reflected in EEG signal measurements, or a change in an important data regime of many other dynamical systems. A prediction algorithm for locating change points in time series data is simulated. It is verified that the assumed distribution of the data is an important factor in obtaining simpler and smoother fluctuation of the hazard rate parameter and in better identifying change-point locations. Finally, the conditions under which the time series distribution affects the factors in this approach are explained and validated with different time series databases from dynamical systems.
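
A compact sketch of Bayesian online change-point detection in the spirit described here, with a constant hazard rate and a run-length posterior, is given below. Gaussian observations with known variance and a conjugate prior keep the predictive distribution closed-form; this is a generic textbook construction assumed for illustration, not the authors' algorithm.

```python
import numpy as np
from scipy import stats

def bocpd(data, hazard=1 / 50, mu0=0.0, var0=4.0, var_x=1.0):
    """Run-length posterior R[t, r] under a constant hazard rate."""
    R = np.zeros((len(data) + 1, len(data) + 1)); R[0, 0] = 1.0
    mu, var = np.array([mu0]), np.array([var0])    # posterior per run length
    for t, x in enumerate(data):
        pred = stats.norm.pdf(x, mu, np.sqrt(var + var_x))  # predictive density
        growth = R[t, : t + 1] * pred * (1 - hazard)        # run continues
        cp = (R[t, : t + 1] * pred * hazard).sum()          # change point now
        R[t + 1, 1 : t + 2], R[t + 1, 0] = growth, cp
        R[t + 1] /= R[t + 1].sum()
        # conjugate Gaussian update, then prepend the prior for run length 0
        var_new = 1.0 / (1.0 / var + 1.0 / var_x)
        mu = np.concatenate([[mu0], var_new * (mu / var + x / var_x)])
        var = np.concatenate([[var0], var_new])
    return R

rng = np.random.default_rng(2)
data = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
R = bocpd(data)
print(R[-1].argmax())  # most probable current run length (~100 after the shift)
```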

Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm

Procedia PDF Downloads 399
4161 Space Weather and Earthquakes: A Case Study of Solar Flare X9.3 Class on September 6, 2017

Authors: Viktor Novikov, Yuri Ruzhin

Abstract:

Studies completed to date on the relation between the Earth's seismicity and solar processes have provided fuzzy and contradictory results. To test the idea that solar flares can trigger earthquakes, we analyzed a powerful surge of solar flare activity early in September 2017, during the approach to the minimum of the 24th solar cycle, which was accompanied by significant space weather disturbances. On September 6, 2017, the sunspot group AR2673 generated a large solar flare of class X9.3, the strongest flare of the past twelve years. Its explosion produced a coronal mass ejection partially directed towards the Earth. We carried out a statistical analysis of the USGS and EMSC earthquake catalogs to determine the effect of solar flares on global seismic activity. New evidence of earthquake triggering due to the Sun-Earth interaction is demonstrated by a simple comparison of the behavior of the Earth's seismicity before and after the strong solar flare. The global number of earthquakes with magnitudes of 2.5 to 5.5 within 11 days after the solar flare increased by 30 to 100%. The possibility of electric/electromagnetic triggering of earthquakes by space weather disturbances is supported by the results of field and laboratory studies, in which earthquakes (both natural and laboratory) were initiated by the injection of electrical current into the Earth's crust. For the specific case of artificial electric earthquake triggering, the current density at the depth of the earthquake sources is comparable with estimates of the density of telluric currents induced by variations of space weather conditions due to solar flares. Acknowledgment: The work was supported by RFBR grant No. 18-05-00255.

Keywords: solar flare, earthquake activity, earthquake triggering, solar-terrestrial relations

Procedia PDF Downloads 118
4160 The Use of Multivariate Statistics and GIS for the Characterization of Groundwater Quality in the Laghouat Region, Algeria

Authors: Rouighi Mustapha, Bouzid Laghaa Souad, Rouighi Tahar

Abstract:

Due to rain shortage and population growth in recent years, well excavation and groundwater use for different purposes have increased without any planning, which is a great challenge for our country. Moreover, this scarcity of water resources in the region is unfortunately combined with rapid deterioration of freshwater quality due to salinity and contamination processes. It is therefore necessary to study groundwater quality in Algeria. This work identifies the factors that influence the water quality parameters in the Laghouat region by using the statistical techniques of Principal Component Analysis (PCA) and Hierarchical Cluster Analysis (HCA) together with a geographic information system (GIS), in an attempt to discriminate the sources of water quality variation. The results of the PCA indicate that the variables responsible for the water quality composition are mainly related to soluble salts; natural processes and the nature of the rock significantly modify the water chemistry. Inferred from the positive correlation between K+ and NO3-, the NO3- is believed to be human-induced rather than natural in origin. In this study, multivariate statistical analysis and GIS give the hydrogeologist supplementary tools for characterizing and evaluating aquifers.

Keywords: cluster analysis, GIS, groundwater, Laghouat, quality

Procedia PDF Downloads 292
4159 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis

Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee

Abstract:

In South Korea, it is difficult to obtain data for statistical pipe assessment. To address this issue, we examine how the various statistical models presented previously behave when the data are mixed with noise, and whether they are applicable in South Korea. Three major types of model are studied; where data are presented in the original paper, we add noise to the data and observe how the model response changes. Moreover, we generate data from the models in the papers and analyse the effect of noise on them. From this, the robustness of each model and its applicability in Korea can be determined.

Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences

Procedia PDF Downloads 702
4158 Statistical Analysis of Surface Roughness and Tool Life Using (RSM) in Face Milling

Authors: Mohieddine Benghersallah, Lakhdar Boulanouar, Salim Belhadi

Abstract:

Currently, a higher production rate with the required quality at low cost is the basic principle of the competitive manufacturing industry. This is mainly achieved by using high cutting speeds and feed rates. The elevated temperatures in the cutting zone under these conditions shorten tool life and adversely affect the dimensional accuracy and surface integrity of the component. It is thus necessary to find optimum cutting conditions (cutting speed, feed rate, machining environment, tool material, and geometry) that can produce components in accordance with the project requirements at a relatively high production rate. Response surface methodology is a collection of mathematical and statistical techniques useful for modelling and analysing problems in which a response of interest is influenced by several variables and the objective is to optimize this response. The work presented in this paper examines the effects of the cutting parameters (cutting speed, feed rate, and depth of cut) on surface roughness through a mathematical model developed using data gathered from a series of milling experiments.

Keywords: statistical analysis (RSM), bearing steel, coated inserts, tool life, surface roughness, end milling

Procedia PDF Downloads 402
4157 Design of PSS and SVC to Improve Power System Stability

Authors: Mahmoud Samkan

Abstract:

In this paper, the design and assessment of a new coordination between Power System Stabilizers (PSSs) and a Static Var Compensator (SVC) in a multimachine power system via a statistical method are proposed. The coordinated design problem of PSSs and SVC over a wide range of loading conditions is handled as an optimization problem. Bacterial Swarming Optimization (BSO), which synergistically couples Bacterial Foraging (BF) with Particle Swarm Optimization (PSO), is employed to seek the optimal controller parameters. By minimizing the proposed objective function, which involves the speed deviations between generators, the stability performance of the system is enhanced. To compare the capabilities of the PSS and the SVC, both are designed independently and then in a coordinated manner. Simultaneous tuning of the BSO-based coordinated controller gives robust damping performance over a wide range of operating conditions and large disturbances, compared to the optimized PSS controller based on BSO (BSOPSS) and the optimized SVC controller based on BSO (BSOSVC). Moreover, a statistical t-test is executed to validate the robustness of the coordinated controller versus the uncoordinated ones.

Keywords: SVC, PSSs, multimachine power system, coordinated design, bacteria swarm optimization, statistical assessment

Procedia PDF Downloads 353
4156 Verification of the Supercavitation Phenomena: Investigation of the Cavity Parameters and Drag Coefficients for Different Types of Cavitator

Authors: Sezer Kefeli, Sertaç Arslan

Abstract:

Supercavitation is a pressure-dependent process that offers the opportunity to eliminate wetted-surface effects on an underwater vehicle, exploiting the differences in viscosity and velocity between the liquid (freestream) and gas phases. Cavitation occurs through a rapid pressure drop or a temperature rise in the liquid phase. In this paper, pressure-based cavitation is investigated, since it is what is generally encountered underwater. These vapor-filled, pressure-based cavities are unstable and harmful to any underwater vehicle, because the cavities (bubbles or voids) produce intense shock waves as they collapse. Supercavitation, on the other hand, is a desired and stabilized phenomenon compared with ordinary pressure-based cavitation. Supercavitation offers the possibility of minimizing form drag, and interest in supercavitating vehicles has therefore revived. When the proper circumstances are established, either by increasing the operating speed of the underwater vehicle or by decreasing the pressure difference between the free stream and the artificial cavity pressure, continuous supercavitation is obtainable. There are two types of supercavitation that provide a stable, continuous cavity, called natural and artificial supercavitation. To generate natural supercavitation, various mechanical structures, called cavitators, have been devised. In the literature, many cavitator types have been studied since the 1900s, either experimentally or numerically on CFD platforms, with the intent of observing natural supercavitation. In this paper, experimental results are first compiled, and trend lines are generated for the supercavitation parameters in terms of the cavitation number (σ), the form drag coefficient (C_D), the dimensionless cavity diameter (d_m/d_c), and the dimensionless cavity length (L_c/d_c). Natural cavitation verification studies are then carried out for disk- and cone-shaped cavitators. In addition, the supercavitation parameters are numerically analyzed at different operating conditions, and the CFD results are fitted to the trend lines of the experimental results. The aims of this paper are to generate one generally accepted drag coefficient equation for disk and cone cavitators at different cavitator half-angles, and to investigate the supercavitation parameters with respect to the cavitation number. Moreover, 165 CFD analyses are performed at different cavitation numbers in FLUENT version 21R2. Five different cavitator types are modeled in SCDM with respect to the cavitator half-angles. A CFD database is then generated from the numerical results, and new trend lines are generated for the supercavitation parameters. These trend lines are compared with the experimental results. Finally, the generally accepted drag coefficient equation and the equations for the supercavitation parameters are presented.
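
For reference, the two central quantities can be sketched numerically. The linear drag law C_D(σ) = C_D0(1 + σ) is a widely used semi-empirical relation for disk cavitators (with C_D0 ≈ 0.82); treating it as a template with an angle-dependent C_D0 for cones is an assumption for illustration, not the correlation derived in the paper.

```python
def cavitation_number(p_inf, p_cavity, rho, v):
    """sigma = (p_inf - p_c) / (0.5 * rho * V^2)"""
    return (p_inf - p_cavity) / (0.5 * rho * v**2)

def drag_coefficient(sigma, cd0=0.82):
    """Semi-empirical C_D(sigma) = C_D0 * (1 + sigma); cd0 ~ 0.82 for a disk."""
    return cd0 * (1.0 + sigma)

rho, v = 998.0, 60.0                  # water density (kg/m^3), 60 m/s vehicle speed
p_inf = 101325 + rho * 9.81 * 5.0     # ambient pressure at 5 m depth (Pa)
p_vap = 2339.0                        # water vapour pressure at ~20 C (Pa)
sigma = cavitation_number(p_inf, p_vap, rho, v)
print(f"sigma = {sigma:.3f}, C_D = {drag_coefficient(sigma):.3f}")  # sigma << 1: supercavitating
```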

Keywords: cavity envelope, CFD, high speed underwater vehicles, supercavitation, supercavitating flows, supercavitation parameters, drag reduction, viscous force elimination, natural cavitation verification

Procedia PDF Downloads 107
4155 Developing an Integrated Clinical Risk Management Model

Authors: Mohammad H. Yarmohammadian, Fatemeh Rezaei

Abstract:

Introduction: Improving patient safety is one of the main priorities in healthcare systems, so clinical risk management in organizations has become increasingly significant. Although several tools have been developed for clinical risk management, each has its own limitations. Aims: This study aims to develop a comprehensive tool in which the advantages of each risk assessment and management tool compensate for the limitations of the others. Methods: The procedure was determined in two main stages: development of an initial model during meetings with professors and a literature review, followed by implementation and verification of the final model. Subjects and Methods: This is a quantitative-qualitative study. For the qualitative dimension, the focus group method with an inductive approach was used. To evaluate the results of the qualitative study, quantitative assessment of two parts of the fourth phase and of the seven phases of the research was conducted. Purposive, stratified sampling of the teams responsible for the selected process was conducted in the operating room. The final model was verified in eight phases through application of the activity breakdown structure, failure mode and effects analysis (FMEA), healthcare risk priority number (RPN), root cause analysis (RCA), fault tree (FT), and Eindhoven Classification Model (ECM) tools. The model was applied to patients admitted for surgery in a day-clinic ward of a public hospital from October 2012 to June. Statistical Analysis Used: Qualitative data were analysed through content analysis, and quantitative analysis was done through checklists and edited RPN tables. Results: After verification of the final model in eight steps, the patient admission process for surgery was broken down by the focus discussion group (FDG) members into five main phases. Then, with the adopted FMEA methodology, 85 failure modes, along with their causes, effects, and preventive capabilities, were set out in the tables. The tables developed to calculate the RPN index contain three criteria for severity, two criteria for probability, and two criteria for preventability. Three failure modes were above the determined significance threshold (RPN > 250). After a 3-month period, patient misidentification incidents were the most frequently reported events. The RPN criteria of the misidentification events were compared, and the RPN numbers of the three reported misidentification events could be checked against the scores predicted in the previous phase. The root causes identified through the fault tree were categorized with the ECM. The wrong-side surgery event was selected by the focus discussion group for a proposed improvement action. The most important causes were the lack of planning of the number and priority of surgical procedures. After prioritization of the suggested interventions, a computerized registration system within the health information system (HIS) was adopted to prepare the action plan in the final phase. Conclusion: The complexity of the healthcare industry requires risk managers to have a multifaceted vision. Applying only retrospective or only prospective risk management tools therefore does not work, and each organization must provide the conditions for the potential application of these methods. The results of this study showed that the integrated clinical risk management model can be used in hospitals as an efficient tool to improve clinical governance.
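
The RPN screening step described above multiplies the severity, probability, and preventability scores of each failure mode and flags those above the significance threshold. A minimal sketch follows; the three modes and their scores are illustrative stand-ins, not the study's 85 documented modes.

```python
failure_modes = [
    # (description, severity 1-10, probability 1-10, preventability 1-10)
    ("patient misidentification at admission", 9, 6, 5),
    ("wrong-side surgery",                     10, 3, 9),
    ("incomplete pre-op checklist",             5, 4, 3),
]

for name, sev, prob, prev in failure_modes:
    rpn = sev * prob * prev                      # risk priority number
    flag = "ACTION REQUIRED" if rpn > 250 else "monitor"
    print(f"{name:45s} RPN={rpn:4d}  {flag}")
```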

Keywords: failure modes and effective analysis, risk management, root cause analysis, model

Procedia PDF Downloads 219
4154 VeriFy: A Solution to Implement Autonomy Safely and According to the Rules

Authors: Michael Naderhirn, Marco Pavone

Abstract:

Problem statement, motivation, and aim of work: So far, control engineers have developed control algorithms so that the controller fits a specification, verified by testing. When it comes to certifying an autonomous car in highly complex scenarios, the challenge is much greater, since such a controller must mathematically guarantee that it implements the rules of the road while also guaranteeing properties like safety and real-time executability. What if this demanding problem could be solved by combining formal verification and system theory? The aim of this work is to present a workflow that solves the problem above. Summary of the presented results / main outcomes: We show the use of an English-like language to transform the rules of the road into system specifications for an autonomous car. The language-based specifications are used to define system functions and interfaces. From these, a formal model is developed that correctly models the specifications. On the other side, a mathematical model describing the system dynamics is used to calculate the system's reachability set, which is then used to determine the system input boundaries. A motion planning algorithm is then applied inside the system boundaries to find an optimized trajectory, in combination with the formal specification model, while satisfying the specifications. The result is a control strategy that can be applied in real time, independent of the scenario, with a mathematical guarantee of satisfying a predefined specification. We demonstrate the applicability of the method in simulated driving scenarios and sketch a potential certification. Originality, significance, and benefit: To the authors' best knowledge, this is the first time that an automated workflow has been shown that combines a specification in an English-like language and a mathematical model, in a formally verified way, to synthesize a controller for potential real-time applications like autonomous driving.

Keywords: formal system verification, reachability, real time controller, hybrid system

Procedia PDF Downloads 216
4153 Statistical Shape Analysis of the Human Upper Airway

Authors: Ramkumar Gunasekaran, John Cater, Vinod Suresh, Haribalan Kumar

Abstract:

The main objective of this project is to develop a statistical shape model, using principal component analysis, that can be used for analyzing the shape of the human airway. The ultimate goal of this project is to identify geometric risk factors for the diagnosis and management of Obstructive Sleep Apnoea (OSA). Anonymized CBCT scans of 25 individuals were obtained from the Otago Radiology Group. The airways were segmented between the hard palate and the aryepiglottic fold using snake active-contour segmentation. The point cloud of the segmented images was then fitted with a bi-cubic mesh, and pseudo-landmarks were placed in order to perform PCA on the segmented airway, analyze its shape, and find the relationship between the shape and OSA risk factors. From the PCA results, the first four modes of variation were found to be significant. Mode 1 was interpreted as the overall length of the airway; Mode 2 was related to the anterior-posterior width of the retroglossal region; Mode 3 was related to the lateral dimension of the oropharyngeal region; and Mode 4 was related to the anterior-posterior width of the oropharyngeal region. All these regions are subject to the risk factors of OSA.
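
The shape-model pipeline sketched in this abstract (stack each airway's pseudo-landmarks into a row vector, run PCA, interpret the leading components as modes of variation) can be illustrated briefly. The landmark data below are random placeholders for the 25 segmented airways, and the landmark count is assumed.

```python
import numpy as np
from sklearn.decomposition import PCA

n_subjects, n_landmarks = 25, 300
# one row per subject: (x, y, z) coordinates of all pseudo-landmarks
X = np.random.default_rng(0).normal(size=(n_subjects, n_landmarks * 3))

pca = PCA(n_components=4)             # the four significant modes
scores = pca.fit_transform(X)         # per-subject mode scores
print(pca.explained_variance_ratio_)  # variance captured by modes 1-4

# a new shape can be synthesised as mean + sum_k b_k * sqrt(lambda_k) * mode_k
b = np.array([2.0, 0.0, 0.0, 0.0])    # +2 SD along mode 1 (overall length)
shape = pca.mean_ + (b * np.sqrt(pca.explained_variance_)) @ pca.components_
```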

Keywords: medical imaging, image processing, FEM/BEM, statistical modelling

Procedia PDF Downloads 480
4152 Process of Analysis, Evaluation and Verification of the 'Real' Redevelopment of the Public Open Space at the Neighborhood’s Stairs: Case Study of Serres, Greece

Authors: Ioanna Skoufali

Abstract:

The present study is directed towards adaptation to climate change, closely related to the urban heat island (UHI) phenomenon. This issue is widespread and common to different urban settings, but particularly to Mediterranean cities characterized by a dense urban fabric. The attention of this work on the redevelopment of open space is focused on mitigation techniques that aim to solve local problems concerning microclimatic parameters and summer thermal comfort conditions, as related to urban morphology. This quantitative analysis, evaluation, and verification survey involves the methodological elaboration applied to the real case study of Serres, with the experimental support of the ENVI-met Pro V4.1 and BioMet software, developed: i) in two phases, covering the ante-operam state (phase a1, 2013) and the post-operam state (phase a2, 2016); ii) in scenario A (+25% green, 2017). The first study identifies the main intervention strategies, namely the application of cool pavements, the increase of green surfaces, and the creation of water surfaces and external fans; moreover, it attains the minimum results achieved by the National Program 'Bioclimatic improvement project for public open space', EPPERAA (ESPA 2007-2013), for the four environmental parameters listed here: TAir = 1.5°C, TSurface = 6.5°C, CDH = 30%, and PET = 20%. In addition, the second study proposes a greater potential for improvement than the post-operam intervention by increasing the vegetation within the district towards the SW/SE. The final objective of this in-depth design is to be transferable to homogeneous cases of urban regeneration, with clear effects on the efficiency of microclimatic mitigation and thermal comfort.

Keywords: cool pavements, microclimate parameters (TAir, Tsurface, Tmrt, CDH), mitigation strategies, outdoor thermal comfort (PET & UTCI)

Procedia PDF Downloads 171
4151 The Influence of Market Attractiveness and Core Competence on Value Creation Strategy and Competitive Advantage and Its Implication on Business Performance

Authors: Firsan Nova

Abstract:

The average Indonesian watches 5.5 hours of TV a day. With a population of 242 million people and a free-to-air (FTA) TV penetration rate of 56%, that equates to 745 million hours of television watched each day. With such potential, it is no wonder that many companies are now attempting to enter the pay-TV market. Research firm Media Partners Asia has forecast in its study that the number of Indonesian pay-television subscribers will climb from 2.4 million in 2012 to 8.7 million by 2020, with penetration scaling up from 7 percent to 21 percent. Key drivers of market growth, the study says, include macro trends built around higher disposable income and a rising middle class, with leading players continuing to invest significantly in sales, distribution, and content. New entrants, in the meantime, will boost overall prospects. This study aims to examine and analyze the effect of market attractiveness and core competence on value creation and competitive advantage, and their implication for business performance, in the pay-TV industry in Indonesia. The study uses a strategic management science approach with the census method, in which all members of the population form the sample. A verificative method is used to examine the relationships between variables. The unit of analysis in this research is the Indonesian pay-TV business unit, 19 business units in total. The unit of observation is the directors and managers of each business unit. Hypothesis testing is performed using statistical Partial Least Squares (PLS). The study concludes that market attractiveness affects business performance through value creation and competitive advantage. Appropriate value creation comes from the company's ability to optimize its core competence and exploit market attractiveness. Value creation affects competitive advantage. Competitive advantage can be determined by the company's ability to create value for customers, and competitive advantage has an impact on business performance.

Keywords: market attractiveness, core competence, value creation, competitive advantage, business performance

Procedia PDF Downloads 319
4150 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of an optimization, and this optimization is central to learning theory. Time series analysis is one approach to complex systems, in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalism of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model. The QTS model resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between the quantum statistical machine and the quantum time series via random matrix theory. It is interesting to note that the primary focus of applying QTS in the field of quantum chaos has been to find a model that explains chaotic behaviour; this model may reveal further insight into quantum chaos.

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 432
4149 A Brief Study about Nonparametric Adherence Tests

Authors: Vinicius R. Domingues, Luan C. S. M. Ozelim

Abstract:

Statistical study has become indispensable in various fields of knowledge. Geotechnics is no different: probabilistic and statistical methods have gained ground there, given their use in characterizing the uncertainties inherent in soil properties. One situation engineers constantly face is the definition of a probability distribution that significantly represents the sampled data. To be able to discard poor candidate distributions, goodness-of-fit tests are necessary. In this paper, three non-parametric goodness-of-fit tests are applied to a computationally generated data set in order to test its goodness of fit to a series of known distributions. It is shown that the use of the normal distribution does not always provide satisfactory results regarding the physical and behavioral representation of the modeled parameters.
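
A minimal sketch of the three adherence tests named in the keywords, applied to a synthetic sample against a fitted normal distribution, is shown below. Note that estimating the parameters from the same sample makes the nominal p-values approximate.

```python
import numpy as np
from scipy import stats

sample = np.random.default_rng(0).gumbel(loc=10, scale=2, size=500)  # non-normal data
mu, sigma = sample.mean(), sample.std(ddof=1)  # parameters fitted from the sample

ks = stats.kstest(sample, "norm", args=(mu, sigma))           # Kolmogorov-Smirnov
ad = stats.anderson(sample, dist="norm")                      # Anderson-Darling
cvm = stats.cramervonmises(sample, "norm", args=(mu, sigma))  # Cramer-von Mises

print(ks.pvalue, cvm.pvalue)            # small p-values: reject the normal fit
print(ad.statistic, ad.critical_values) # compare statistic against critical values
```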

Keywords: Kolmogorov-Smirnov test, Anderson-Darling test, Cramer-Von-Mises test, nonparametric adherence tests

Procedia PDF Downloads 414
4148 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges

Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch

Abstract:

Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data make it possible to gain insights into the molecular differences in tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints that specifically measure a mechanism is relatively straightforward, the interpretation of big data is more complex and benefits from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions to verify systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow methods to be benchmarked for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been conducted successfully and confirmed that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest has no gold standard, but may greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases, and services in biology and medicine. We will present the results we obtained when analyzing data with our network-based method, and introduce a datathon that will take place in Japan to encourage the analysis of the same datasets with other methods and allow conclusions to be consolidated.

Keywords: big data interpretation, datathon, systems toxicology, verification

Procedia PDF Downloads 257
4147 Wavelet-Based Classification of Myocardial Ischemia, Arrhythmia, Congestive Heart Failure and Sleep Apnea

Authors: Santanu Chattopadhyay, Gautam Sarkar, Arabinda Das

Abstract:

This paper presents wavelet-based classification of various heart diseases. Electrocardiogram signals of different heart patients have been studied, and the statistical nature of the electrocardiogram signals for different heart diseases has been compared with that of electrocardiograms from healthy persons. Four heart diseases are considered in this study: Myocardial Ischemia (MI), Congestive Heart Failure (CHF), Arrhythmia, and Sleep Apnea. The statistical nature of the electrocardiograms in each case is characterized by the kurtosis values of two types of wavelet coefficients: approximation and detail. Nine wavelet decomposition levels are considered in each case, and the kurtosis corresponding to both approximation and detail coefficients is computed for decomposition levels one to nine. Based on significant differences, a few decomposition levels were chosen and then used for classification.
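
A hedged sketch of the feature extraction described above: a 9-level discrete wavelet decomposition of an ECG trace, followed by the kurtosis of the approximation and detail coefficients at each level. The stand-in signal and the 'db4' mother wavelet are assumptions; the abstract does not name the wavelet used.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis

fs = 360                                   # common ECG sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

coeffs = pywt.wavedec(ecg, "db4", level=9)       # [A9, D9, D8, ..., D1]
approx_kurt = kurtosis(coeffs[0])                # approximation coefficients
detail_kurt = [kurtosis(d) for d in coeffs[1:]]  # one value per detail level
features = [approx_kurt] + detail_kurt           # feature vector for the classifier
```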

Keywords: arrhythmia, congestive heart failure, discrete wavelet transform, electrocardiogram, myocardial ischemia, sleep apnea

Procedia PDF Downloads 103
4146 Second Order Statistics of Dynamic Response of Structures Using Gamma Distributed Damping Parameters

Authors: Badreddine Chemali, Boualem Tiliouine

Abstract:

This article presents the main results of a numerical investigation of the uncertainty of the dynamic response of structures with statistically correlated, Gamma-distributed random damping. A computational method based on a Linear Statistical Model (LSM) is implemented to predict second-order statistics of the response of a typical industrial building structure. The significance of random damping with correlated parameters and its implications for the sensitivity of the structural peak response in the neighborhood of a resonant frequency are discussed in light of considerable ranges of damping uncertainty and correlation coefficients. The results are compared to those generated using Monte Carlo simulation techniques. The numerical results obtained show the importance of damping uncertainty and of the statistical correlation of damping coefficients for obtaining accurate probabilistic estimates of the dynamic response of structures. Furthermore, the effectiveness of the LSM in efficiently predicting uncertainty propagation for structural dynamic problems with correlated damping parameters is demonstrated.

Keywords: correlated random damping, linear statistical model, Monte Carlo simulation, uncertainty of dynamic response

Procedia PDF Downloads 242
4145 Irrigation Water Quality Evaluation Based on Multivariate Statistical Analysis: A Case Study of Jiaokou Irrigation District

Authors: Panpan Xu, Qiying Zhang, Hui Qian

Abstract:

Groundwater is the main source of water supply in the Guanzhong Basin, China. To investigate the quality of groundwater for agricultural purposes in the Jiaokou Irrigation District, located in the east of the Guanzhong Basin, 141 groundwater samples were collected and analysed for major ions (K+, Na+, Mg2+, Ca2+, SO42-, Cl-, HCO3-, and CO32-), pH, and total dissolved solids (TDS). Sodium percentage (Na%), residual sodium carbonate (RSC), magnesium hazard (MH), and potential salinity (PS) were applied for the irrigation water quality assessment. In addition, multivariate statistical techniques were used to identify the underlying hydrogeochemical processes. The results show that the TDS content mainly depends on Cl-, Na+, Mg2+, and SO42-, and that the HCO3- content is generally high except in the eastern sand area. These reflect complex hydrogeochemical processes, such as the dissolution of carbonate minerals (dolomite and calcite), gypsum, halite, and silicate minerals, cation exchange, and evaporation and concentration. The average evaluation levels of Na%, RSC, MH, and PS for irrigation water quality are doubtful, good, unsuitable, and injurious-to-unsatisfactory, respectively. It is therefore necessary for decision makers to consider the indicators comprehensively and thus evaluate the irrigation water quality reasonably.
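
The four indices can be computed directly from ion concentrations expressed in meq/L; a sketch using their standard textbook formulations follows. The sample values are placeholders, not the Jiaokou measurements.

```python
def na_percent(na, k, ca, mg):
    """Sodium percentage: 100 * (Na + K) / (Na + K + Ca + Mg)."""
    return 100.0 * (na + k) / (na + k + ca + mg)

def rsc(hco3, co3, ca, mg):
    """Residual sodium carbonate: (HCO3 + CO3) - (Ca + Mg)."""
    return (hco3 + co3) - (ca + mg)

def mh(mg, ca):
    """Magnesium hazard: 100 * Mg / (Ca + Mg)."""
    return 100.0 * mg / (ca + mg)

def ps(cl, so4):
    """Potential salinity: Cl + 0.5 * SO4."""
    return cl + 0.5 * so4

# one hypothetical groundwater sample, all values in meq/L
na, k, ca, mg, cl, so4, hco3, co3 = 8.2, 0.1, 4.5, 3.8, 6.0, 4.2, 5.1, 0.0
print(na_percent(na, k, ca, mg), rsc(hco3, co3, ca, mg), mh(mg, ca), ps(cl, so4))
```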

Keywords: irrigation water quality, multivariate statistical analysis, groundwater, hydrogeochemical process

Procedia PDF Downloads 116
4144 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation

Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski

Abstract:

Edgeworth approximation is one of the most important statistical methods, with a considerable contribution to reducing the standard deviations of the independent variables' coefficients in a quantile regression model, which estimates the conditional median or other quantiles. In this paper, we apply these approximating statistical methods to an economic problem. We create and generate a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products, using real data taken from a local business. The linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is why we used this model instead of linear regression. Our aim is to analyse in more detail the relation between the variables under study, the profit and the realized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods applied in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, a bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and results presented here identify the best approximating model for our study.
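
A hedged sketch of the bootstrap quantile regression component follows: a median (q = 0.5) regression of profit on sales, with a nonparametric pairs bootstrap of the slope to estimate its standard error. The data are synthetic stand-ins for the business records, and the bootstrap scheme is an assumed common choice.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(0)
sales = rng.uniform(10, 100, 200)
profit = 0.3 * sales + rng.standard_t(df=3, size=200)  # heavy-tailed noise

X = sm.add_constant(sales)
fit = QuantReg(profit, X).fit(q=0.5)        # median regression
print("median-regression slope:", fit.params[1])

slopes = []
for _ in range(500):                        # nonparametric pairs bootstrap
    idx = rng.integers(0, 200, 200)
    slopes.append(QuantReg(profit[idx], X[idx]).fit(q=0.5).params[1])
print("bootstrap SE of slope:", np.std(slopes, ddof=1))
```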

Keywords: bootstrap, edgeworth approximation, IID, quantile

Procedia PDF Downloads 127
4143 Introduction of Robust Multivariate Process Capability Indices

Authors: Behrooz Khalilloo, Hamid Shahriari, Emad Roghanian

Abstract:

Process capability indices (PCIs) are important concepts in statistical quality control; they measure the capability of processes and how well processes meet certain specifications. An important issue in statistical quality control is parameter estimation. Under the assumption of multivariate normality, the distribution parameters, the mean vector and the variance-covariance matrix, must be estimated when they are unknown. Classic estimation methods such as method-of-moments estimation (MME) or maximum likelihood estimation (MLE) give good estimates of the population parameters when the data are not contaminated, but when outliers exist in the data, they yield weak estimators. Estimators that perform well in the presence of outliers are therefore needed. In this work, robust M-estimators are used to estimate these parameters, and based on the robust parameter estimators, robust process capability indices are introduced. The performance of these robust estimators in the presence of outliers and their effects on the process capability indices are evaluated using real and simulated multivariate data. The results indicate that the proposed robust capability indices perform much better than the existing process capability indices.

Keywords: multivariate process capability indices, robust M-estimator, outlier, multivariate quality control, statistical quality control

Procedia PDF Downloads 250
4142 Fusion of Finger Inner Knuckle Print and Hand Geometry Features to Enhance the Performance of Biometric Verification System

Authors: M. L. Anitha, K. A. Radhakrishna Rao

Abstract:

With the advent of modern computing technology, there is an increased demand for recognition systems capable of verifying the identity of individuals. Recognition systems are required by several civilian and commercial applications to provide access to secured resources. Traditional recognition systems based on physical identities are not sufficiently reliable to satisfy the security requirements, owing to advances in forgery and identity impersonation methods. Recognizing individuals based on their unique physiological characteristics, known as biometric traits, is a reliable technique, since these traits are not transferable and cannot be stolen or lost. Since the performance of a biometric recognition system depends on the particular trait utilized, the present work proposes a fusion approach that combines the inner knuckle print (IKP) trait of the middle, ring, and index fingers with the geometrical features of the hand. The hand image captured by a digital camera is preprocessed to find the finger IKP as the region of interest (ROI) and the hand geometry features. The geometrical features are represented as the distances between different key points, and the IKP features are extracted by applying the local binary pattern descriptor to the IKP ROI. Decision-level AND fusion was adopted, which improved the performance of the combined scheme. The proposed approach is tested on a database collected at our institute. The approach is significant because both hand geometry and IKP features can be extracted from the palmar region of the hand. The fusion of these features yields a false acceptance rate of 0.75% and a false rejection rate of 0.86% in the verification tests conducted, which is lower than the results obtained using the individual traits. The results confirm the usefulness of the proposed approach and the suitability of the selected features for developing a biometric recognition system based on features from the palmar region of the hand.

Keywords: biometrics, hand geometry features, inner knuckle print, recognition

Procedia PDF Downloads 192