Search results for: the statistical measure
6343 Exploring Syntactic and Semantic Features for Text-Based Authorship Attribution
Authors: Haiyan Wu, Ying Liu, Shaoyun Shi
Abstract:
Authorship attribution extracts features from anonymous documents to identify their authors. Many previous works on authorship attribution focus on statistical style features (e.g., sentence/word length) and content features (e.g., frequent words, n-grams). Modeling these features by regression or some transparent machine learning method gives a portrait of an author's writing style. But these methods do not capture syntactic (e.g., dependency relationship) or semantic (e.g., topic) information. In recent years, some researchers have modeled syntactic trees or latent semantic information by neural networks. However, few works take them together. Moreover, predictions by neural networks are difficult to explain, which is vital in authorship attribution tasks. In this paper, we not only utilize statistical style and content features but also take advantage of both syntactic and semantic features. Different from an end-to-end neural model, feature selection and prediction are two separate steps in our method. An attentive n-gram network is utilized to select useful features, and logistic regression is applied to give predictions and an understandable representation of writing style. Experiments show that our extracted features can improve on the state-of-the-art methods on three benchmark datasets.
Keywords: authorship attribution, attention mechanism, syntactic feature, feature extraction
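The two-step pipeline the abstract describes (select features, then fit a transparent logistic regression whose weights portray writing style) can be sketched in miniature. This is an illustrative sketch only: the toy texts, the plain gradient-descent trainer, and the use of raw character-bigram counts are assumptions, not the paper's attentive n-gram network.

```python
import math
from collections import Counter

def char_ngrams(text, n=2):
    """Count character n-grams -- one of the content features mentioned."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def vectorize(texts, vocab):
    return [[c.get(g, 0) for g in vocab] for c in map(char_ngrams, texts)]

def train_logistic(X, y, lr=0.1, epochs=500):
    """Plain gradient-descent logistic regression; the weights stay
    inspectable, giving an 'understandable representation' of style."""
    w = [0.0] * (len(X[0]) + 1)               # last entry is the bias
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[-1] + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi
            for j, xj in enumerate(xi):
                w[j] -= lr * g * xj
            w[-1] -= lr * g
    return w

def predict(w, x):
    z = w[-1] + sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

# Toy corpus: author 0 repeats one phrasing habit, author 1 another.
texts = ["the cat sat on the mat", "the cat ran to the mat",
         "a dog barks at a door", "a dog naps by a door"]
labels = [0, 0, 1, 1]
vocab = sorted(set(g for t in texts for g in char_ngrams(t)))
X = vectorize(texts, vocab)
w = train_logistic(X, labels)
prob_author1 = predict(w, vectorize(["a dog sits at a door"], vocab)[0])
```

In a real system, the learned weight attached to each n-gram feature is what makes the attribution explainable.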
Procedia PDF Downloads 136
6342 Strategies to Enhance Compliance of Health and Safety Standards at the Selected Mining Industries in Limpopo Province, South Africa: Occupational Health Nurse’s Perspective
Authors: Livhuwani Muthelo
Abstract:
The health and safety of miners in the South African mining industry are guided by regulations and standards that are intended to promote a healthy work environment and prevent fatalities. It is of utmost importance for miners to comply with these regulations/standards to protect themselves from potential occupational health and safety risks, accidents, and fatalities. The purpose of this study was to develop and validate strategies to enhance compliance with health and safety standards within the mining industries of Limpopo province in South Africa. A mixed-method exploratory sequential research design was adopted. The population consisted of 5350 miners. Purposive sampling was used to select the participants in the qualitative strand and stratified random sampling in the quantitative strand. Semi-structured interviews were conducted with occupational health nurse practitioners and the health and safety team. Thematic analysis was used to generate an understanding of the interviews. In the quantitative strand, a survey was conducted using a self-administered questionnaire. Data were analysed using SPSS version 26.0. Descriptive statistical tests, including frequencies, means, and standard deviations, were used in the analysis of the data. Cronbach's alpha test was used to measure internal consistency. The integrated results revealed diverse experiences related to compliance with health and safety standards among the mineworkers. The main findings were challenges related to leadership compliance and the cost of maintaining safety, miners' behaviour-related challenges, the impact of non-compliance on the overall health of the miners, and the conflict between production and safety. Health and safety compliance is not mere compliance with regulations and standards but a culture that requires miners and the organization to take responsibility for their behaviour and actions towards health and safety, and thus for their own well-being and that of other miners.
Keywords: perceptions, compliance, health and safety, legislation, standards, miners
Procedia PDF Downloads 104
6341 A Probabilistic Theory of the Buy-Low and Sell-High for Algorithmic Trading
Authors: Peter Shi
Abstract:
Algorithmic trading is a rapidly expanding domain within quantitative finance, constituting a substantial portion of trading volumes in the US financial market. The demand for rigorous and robust mathematical theories underpinning these trading algorithms is ever-growing. In this study, the author establishes a new stock market model that integrates the Efficient Market Hypothesis and statistical arbitrage. The model, for the first time, finds probabilistic relations between the rational price and the market price in terms of conditional expectation. The theory consequently leads to a mathematical justification of the old market adage: buy low and sell high. The thresholds for “low” and “high” are precisely derived using a max-min operation on Bayes’ error. This explicit connection harmonizes the Efficient Market Hypothesis and statistical arbitrage, demonstrating their compatibility in explaining market dynamics. The amalgamation represents a pioneering contribution to quantitative finance. The study culminates in comprehensive numerical tests using historical market data, affirming that the “buy-low and sell-high” algorithm derived from this theory significantly outperforms the general market over the long term in four out of six distinct market environments.
Keywords: efficient market hypothesis, behavioral finance, Bayes' decision, algorithmic trading, risk control, stock market
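The trading rule itself is simple once the thresholds exist; the hard part the paper contributes is deriving them from Bayes' error. Below is a minimal sketch of the rule only, with the thresholds and price series as fixed illustrative inputs rather than the paper's derived values.

```python
def buy_low_sell_high(prices, low, high, cash=100.0):
    """Naive threshold rule: buy all-in at or below `low`, sell out at or
    above `high`. The paper derives `low`/`high` from a max-min operation
    on Bayes' error; here they are fixed inputs for illustration."""
    shares = 0.0
    for p in prices:
        if shares == 0.0 and p <= low:      # buy low
            shares, cash = cash / p, 0.0
        elif shares > 0.0 and p >= high:    # sell high
            cash, shares = shares * p, 0.0
    return cash + shares * prices[-1]       # mark to market at the end

final = buy_low_sell_high([10, 8, 9, 12, 7, 11], low=8, high=12)
```

On this toy series the rule buys at 8, sells at 12, buys again at 7, and finishes marked to market at 11.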
Procedia PDF Downloads 72
6340 Application of Multivariate Statistics and Hydro-Chemical Approach for Groundwater Quality Assessment: A Study on Birbhum District, West Bengal, India
Authors: N. C. Ghosh, Niladri Das, Prolay Mondal, Ranajit Ghosh
Abstract:
Groundwater quality deterioration due to human activities has become a major concern of modern life. The study aims to assess the spatial variation of groundwater quality, identify the sources of groundwater chemicals, and evaluate their impact on human health in the concerned area. Multivariate statistical techniques (cluster and principal component analysis) and hydrochemical facies have been applied to groundwater quality data on 14 parameters from 107 sites distributed randomly throughout the Birbhum district. Five factors have been extracted using Varimax rotation with Kaiser normalization. The first factor explains 27.61% of the total variance, with high positive loadings concentrated in TH, Ca, Mg, Cl and F (fluoride). In the studied region, due to the presence of the basaltic Rajmahal trap, fluoride contamination is highly concentrated, and it has an adverse impact on human health, such as fluorosis. The second factor explains 24.41% of the total variance and includes Na, HCO₃, EC, and SO₄. The last (fifth) factor explains 8.85% of the total variance and includes pH, which governs the acidic or alkaline character of the groundwater. Hierarchical cluster analysis (HCA) grouped the 107 sampling stations into two clusters, one with high pollution and another with less pollution. Moreover, hydrochemical facies diagrams, viz. the Wilcox diagram, Doneen’s chart, and the USSL diagram, reveal the suitability of the groundwater for irrigation and drinking purposes, including the permeability index of the groundwater. The Gibbs diagram depicts that the major portion of the groundwater of this region is of rock-dominated origin, as the western part of the region, characterized by the Jharkhand plateau fringe, comprises basalt, gneiss and granite rocks.
Keywords: correlation, factor analysis, hydrological facies, hydrochemistry
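The HCA step that splits stations into a polluted and a less-polluted cluster can be sketched in miniature. The single-linkage implementation and the toy (EC, fluoride) sample values below are illustrative assumptions, not the study's 107-station, 14-parameter data.

```python
def single_linkage(points, k):
    """Agglomerative clustering (single linkage), stopping at k clusters.
    Clusters hold indices into `points`."""
    clusters = [[i] for i in range(len(points))]

    def dist(a, b):  # Euclidean distance between two samples
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best                      # merge the closest pair
        clusters[i] += clusters.pop(j)
    return clusters

# Toy samples: (EC, F) pairs -- two "polluted" stations, two "clean" ones.
samples = [(1200, 2.1), (1150, 1.9), (300, 0.4), (280, 0.5)]
groups = single_linkage(samples, k=2)
```

Real applications would first standardize the 14 parameters so that large-magnitude variables such as EC do not dominate the distances.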
Procedia PDF Downloads 213
6339 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm
Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn
Abstract:
Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining and pattern recognition, employed to overcome the well-known phenomenon of the curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct
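A skeleton of the search component may clarify the hybridization. Everything below is a hedged sketch: the leap rule, parameters, and especially the toy fitness function (which merely stands in for the fuzzy-rough dependency degree) are assumptions, not the paper's B-SFLA.

```python
import random

def b_sfla(n_feats, fitness, frogs=12, memeplexes=3, iters=30, seed=1):
    """Skeleton of a binary shuffled frog leaping search. `fitness` scores
    a 0/1 feature mask; in the paper's hybrid, that role is played by the
    fuzzy-rough dependency degree (FRDD)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_feats)] for _ in range(frogs)]
    for _ in range(iters):
        pop.sort(key=fitness, reverse=True)    # rank the whole population
        for m in range(memeplexes):
            plex = pop[m::memeplexes]          # deal frogs into memeplexes
            worst = min(plex, key=fitness)
            leader = max(plex, key=fitness)
            # leap: copy each leader bit with probability 0.5
            new = [b if rng.random() < 0.5 else w
                   for b, w in zip(leader, worst)]
            if fitness(new) < fitness(worst):  # failed leap -> random frog
                new = [rng.randint(0, 1) for _ in range(n_feats)]
            idx = pop.index(worst)
            pop[idx] = max(new, worst, key=fitness)
    return max(pop, key=fitness)

# Toy FRDD stand-in: features 0-2 are informative, extras are penalized.
def toy_fitness(mask):
    return sum(mask[:3]) - 0.2 * sum(mask[3:])

best = b_sfla(8, toy_fitness)
```

The population never degrades (a frog is only replaced by something at least as fit), so the returned mask is the best frog seen in the final population.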
Procedia PDF Downloads 225
6338 Carbon Sequestration Modeling in the Implementation of REDD+ Programmes in Nigeria
Authors: Oluwafemi Samuel Oyamakin
Abstract:
The forest in Nigeria is currently estimated to extend to around 9.6 million hectares, but it used to expand over central and southern Nigeria decades ago. The forest estate is shrinking due to long-term human exploitation for agricultural development, fuelwood demand, uncontrolled forest harvesting and urbanization, amongst other factors, compounded by population growth in rural areas. Nigeria has lost more than 50% of its forest cover since 1990, and currently less than 10% of the country is forested. The current deforestation rate is estimated at 3.7%, which is one of the highest in the world. Reducing Emissions from Deforestation and forest Degradation, plus conservation, sustainable management of forests and enhancement of forest carbon stocks, constitutes what is referred to as REDD+. This study evaluated some of the existing ways of computing carbon stocks using eight indigenous tree species, namely Mansonia, Shorea, Bombax, Terminalia superba, Khaya grandifolia, Khaya senegalensis, Pines and Gmelina arborea. While these components are the essential elements of the REDD+ programme, they can be brought under a broader framework of systems analysis designed to arrive at optimal solutions for future predictions through the statistical distribution pattern of carbon sequestered by various tree species. Available data on the height and diameter of trees in Ibadan were studied, their respective carbon sequestration potentials were assessed, and the data were subjected to tests to determine the best statistical distribution describing the carbon sequestration pattern of the trees. The results suggest a reasonable statistical distribution for carbon sequestered in simulation studies, allowing planners and government to forecast resources for sustainable development, especially where experiments with real-life systems are infeasible. Sustainable management of forests can then be achieved by projecting the future condition of forests under different management regimes, thereby supporting conservation and REDD+ programmes in Nigeria.
Keywords: REDD+, carbon, climate change, height and diameter
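Choosing a "best statistical distribution" for field measurements typically means comparing maximum-likelihood fits of candidate families. A minimal sketch follows, comparing only normal vs log-normal on made-up diameter values; the study's candidate set and data are not reproduced here.

```python
import math

def normal_loglik(xs):
    """Gaussian log-likelihood evaluated at the MLE (mu, sigma^2)."""
    n, mu = len(xs), sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / n
    return -0.5 * n * (math.log(2 * math.pi * var) + 1)

def lognormal_loglik(xs):
    """Log-normal log-likelihood: fit a Gaussian to log(x), then add the
    Jacobian correction -sum(log x)."""
    logs = [math.log(x) for x in xs]
    return normal_loglik(logs) - sum(logs)

# Illustrative diameters (cm); real REDD+ work would use field inventories.
diameters = [12.0, 15.5, 18.0, 22.5, 30.0, 41.0, 55.0, 75.0]
fits = {"normal": normal_loglik(diameters),
        "lognormal": lognormal_loglik(diameters)}
best_fit = max(fits, key=fits.get)
```

For right-skewed size data like these, the log-normal fit attains the higher likelihood; information criteria (AIC/BIC) would be used when candidate families differ in parameter count.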
Procedia PDF Downloads 166
6337 Energetic and Exergetic Evaluation of Box-Type Solar Cookers Using Different Insulation Materials
Authors: A. K. Areamu, J. C. Igbeka
Abstract:
The performance of box-type solar cookers has been reported by several researchers, but little attention has been paid to the effect of the type of insulation material on the energy and exergy efficiency of these cookers. This research aimed at evaluating the energy and exergy efficiencies of box-type cookers containing different insulation materials. Energy and exergy efficiencies of five box-type solar cookers insulated with maize cob, air (control), maize husk, coconut coir and polyurethane foam, respectively, were obtained over a period of three years. The cookers were evaluated using water heating test procedures to determine the energy and exergy performance. The results were subjected to statistical analysis using ANOVA. The results show that the average energy inputs for the five solar cookers were 245.5, 252.2, 248.7, 241.5 and 245.5 J, respectively, while their respective average energy losses were 201.2, 212.7, 208.4, 189.1 and 199.8 J. The average exergy inputs for the five cookers were 228.2, 234.4, 231.1, 224.4 and 228.2 J, respectively, while their respective average exergy losses were 223.4, 230.6, 226.9, 218.9 and 223.0 J. The energy and exergy efficiencies were highest in the cooker with coconut coir (37.35% and 3.90%, respectively) in the first year but lowest for air (11% and 1.07%, respectively) in the third year. Statistical analysis showed significant differences between the energy and exergy efficiencies over the years. These results reiterate the importance of a good insulating material for a box-type solar cooker.
Keywords: efficiency, energy, exergy, heating insolation
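The efficiency definition implied by the input/loss figures can be sketched with the generic first-law and second-law ratios. Note this is an assumed textbook definition applied to the coconut-coir averages quoted above; the abstract's own 37.35%/3.90% figures come from first-year data not listed here, so the numbers below will not match them.

```python
def efficiencies(energy_in, energy_loss, exergy_in, exergy_loss):
    """First-law (energy) and second-law (exergy) efficiencies from a
    water heating test, as percentages: useful / input."""
    return (100 * (energy_in - energy_loss) / energy_in,
            100 * (exergy_in - exergy_loss) / exergy_in)

# Coconut-coir three-year averages from the abstract (J).
eta_energy, eta_exergy = efficiencies(241.5, 189.1, 224.4, 218.9)
```

This gives roughly 21.7% energy efficiency and 2.45% exergy efficiency for the three-year averages, illustrating why exergy efficiency is always the much smaller number.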
Procedia PDF Downloads 367
6336 A Game-Based Methodology to Discriminate Executive Function – a Pilot Study With Institutionalized Elderly People
Authors: Marlene Rosa, Susana Lopes
Abstract:
Few studies explore the potential of board games as a performance measure, even though they can be an interesting strategy in the context of frail populations. In fact, board games are immersive strategies that can reduce the pressure of being evaluated. This study aimed to test the ability of game-based strategies to assess executive function in an elderly population. Sixteen older participants were included: 10 with impaired executive functions (G1 – 85.30±6.00 yrs old; 10 male) and 6 with executive functions showing no clinically important changes (G2 – 76.30±5.19 yrs old; 6 male). Executive function was assessed using the Frontal Assessment Battery (FAB), a quickly applicable cognitive screening test (score < 12 indicates impairment). The board game used in this study was the TATI Hand Game, designed specifically for training rhythmic coordination of the upper limbs with multiple cognitive stimuli. This game features 1 table grid; 1 set of Single Game cards (to play with one hand); Double Game cards (to play simultaneously with two hands); 1 die to plan Single Game mode; cards to plan the Double Game mode; 1 bell; and 2 cups. Each participant played 3 Single Game cards, and the following data were collected: (i) variability in time during board game challenges (SD); (ii) number of errors; (iii) execution time (s). G1 demonstrated high variability in execution time during board game challenges (G1 – 13.0 s vs G2 – 0.5 s), a higher number of errors (1.40 vs 0.67), and longer execution times (607.80 s vs 281.83 s). These results demonstrate the potential of board games as a functional assessment strategy in geriatric care. Future studies might include larger samples and statistical methodologies to find cut-off values for impairment in executive functions during performance in the TATI game.
Keywords: board game, aging, executive function, evaluation
Procedia PDF Downloads 142
6335 Visualization and Performance Measure to Determine Number of Topics in Twitter Data Clustering Using Hybrid Topic Modeling
Authors: Moulana Mohammed
Abstract:
Topic models have been widely used in building clusters of documents for more than a decade, yet problems remain in choosing the optimal number of topics. The main problem is the lack of a stable metric for the quality of the topics obtained during the construction of topic models. From previous works, the authors observed that most models used to determine the number of topics are non-parametric, with the quality of topics assessed using perplexity and coherence measures, and concluded that these are not applicable to solving this problem. In this paper, we use a parametric method, an extension of the traditional topic model with visual access tendency, to visualize the number of topics (clusters), to complement clustering, and to choose the optimal number of topics based on the results of cluster validity indices. The developed hybrid topic models are demonstrated with different Twitter datasets on various topics, both for obtaining the optimal number of topics and for measuring the quality of clusters. The experimental results showed that the Visual Non-negative Matrix Factorization (VNMF) topic model performs well in determining the optimal number of topics with interactive visualization and in measuring the quality of clusters with validity indices.
Keywords: interactive visualization, visual non-negative matrix factorization model, optimal number of topics, cluster validity indices, Twitter data clustering
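The NMF core underlying such hybrid models factorizes a term-document matrix V into nonnegative W (topic loadings) and H (topic-term weights). A minimal sketch using the classic Lee-Seung multiplicative updates follows; the tiny matrix and parameters are illustrative, not the paper's VNMF with visual access tendency.

```python
import random

def nmf(V, k, iters=200, seed=0):
    """Lee-Seung multiplicative updates for V ~= W @ H (Frobenius norm)."""
    rng = random.Random(seed)
    n, m = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(k)]

    def matmul(A, B):
        return [[sum(a * b for a, b in zip(row, col))
                 for col in zip(*B)] for row in A]

    for _ in range(iters):
        WH = matmul(W, H)
        Wt = list(map(list, zip(*W)))
        num, den = matmul(Wt, V), matmul(Wt, WH)      # H <- H * WtV / WtWH
        H = [[H[i][j] * num[i][j] / (den[i][j] + 1e-9)
              for j in range(m)] for i in range(k)]
        WH = matmul(W, H)
        Ht = list(map(list, zip(*H)))
        num, den = matmul(V, Ht), matmul(WH, Ht)      # W <- W * VHt / WHHt
        W = [[W[i][j] * num[i][j] / (den[i][j] + 1e-9)
              for j in range(k)] for i in range(n)]
    return W, H

def frob_err(V, W, H):
    WH = [[sum(w * h for w, h in zip(row, col)) for col in zip(*H)]
          for row in W]
    return sum((v - p) ** 2 for rv, rp in zip(V, WH)
               for v, p in zip(rv, rp)) ** 0.5

# Tiny rank-2 term-document matrix with two obvious "topics".
V = [[3, 2, 0, 0], [6, 4, 0, 0], [0, 0, 4, 1], [0, 0, 8, 2]]
W, H = nmf(V, k=2)
err = frob_err(V, W, H)
```

Sweeping k and scoring each factorization with cluster validity indices is, in spirit, how an optimal number of topics is selected.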
Procedia PDF Downloads 134
6334 Blood Volume Pulse Extraction for Non-Contact Photoplethysmography Measurement from Facial Images
Authors: Ki Moo Lim, Iman R. Tayibnapis
Abstract:
According to WHO estimates, 38 out of 56 million (68%) global deaths in 2012 were due to noncommunicable diseases (NCDs). One way to avert NCDs is early detection of disease. To that end, we developed the 'U-Healthcare Mirror', which can measure vital signs such as heart rate (HR) and respiration rate without any physical contact or user awareness. To measure HR in the mirror, we utilized a digital camera. The camera records red, green, and blue (RGB) discoloration from sequences of the user's facial images. We extracted the blood volume pulse (BVP) from the RGB discoloration because the discoloration of the facial skin follows the BVP. We used blind source separation (BSS) to extract the BVP from the RGB discoloration and adaptive filters to remove noise, utilizing the singular value decomposition (SVD) method to implement both. HR was estimated from the obtained BVP. We conducted HR-measurement experiments using our method and a previous method based on independent component analysis (ICA), and compared both against HR measurements from a commercial oximeter. The experiments were conducted at distances between 30 and 110 cm and light intensities between 5 and 2000 lux; for each condition, we took measurements 7 times. The estimated HR showed a mean error of 2.25 bpm and a Pearson correlation coefficient of 0.73. The accuracy has improved compared to previous work. The optimal distance between the mirror and the user for HR measurement was 50 cm at medium light intensity, around 550 lux.
Keywords: blood volume pulse, heart rate, photoplethysmography, independent component analysis
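The final step (HR from a recovered BVP trace) can be sketched simply. The synthetic sinusoid below stands in for the output of the BSS/filtering stages, which are not reproduced here; peak counting is an assumed illustrative estimator, not necessarily the paper's.

```python
import math

def heart_rate_bpm(signal, fps):
    """Estimate pulse rate by counting strict local maxima of a cleaned
    BVP trace sampled at `fps` frames per second."""
    peaks = sum(1 for i in range(1, len(signal) - 1)
                if signal[i - 1] < signal[i] > signal[i + 1])
    return peaks / (len(signal) / fps) * 60.0

fps = 30
# Synthetic 1.2 Hz (72 bpm) pulse wave: 10 s of "green channel" samples.
trace = [math.sin(2 * math.pi * 1.2 * k / fps) for k in range(10 * fps)]
hr = heart_rate_bpm(trace, fps)
```

On real camera data, a bandpass filter around 0.7-4 Hz (42-240 bpm) would precede peak counting, or the dominant FFT bin would be used instead.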
Procedia PDF Downloads 329
6333 Geochemistry of Nutrients in the South Lagoon of Tunis, Northeast of Tunisia, Using Multivariable Methods
Authors: Abidi Myriam, Ben Amor Rim, Gueddari Moncef
Abstract:
Understanding ecosystem response to a restoration project is essential to assess its rehabilitation. Indeed, the time elapsed after restoration is a critical indicator of the real success of the restoration. The south lagoon of Tunis, a shallow Mediterranean coastal area, has suffered from several sources of pollution. To resolve this environmental problem, a large restoration project of the lagoon was undertaken. The main changes from this restoration work are decreases in the residence time of the lagoon water and in nutrient concentrations. In this paper, we evaluate the trophic state of the lagoon water to assess the risk of eutrophication almost 16 years after the restoration. To attain this objective, water quality monitoring was undertaken. Geochemical methods and multivariate statistical tools were used to identify and analyze the natural and anthropogenic factors governing the nutrient concentrations of the lagoon water. Results show that nutrients have dual sources due to the discharge of municipal wastewater from Megrine City on the south side of the lagoon. The Carlson index shows that the south lagoon of Tunis is eutrophic and may show limited summer anoxia.
Keywords: geochemistry, nutrients, statistical analysis, the south lagoon of Tunis, trophic state
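The Carlson trophic state classification mentioned above is a simple logarithmic index. A sketch of one common formulation (the total-phosphorus variant of Carlson's 1977 index) follows; the TP value is illustrative, not a measurement from the lagoon.

```python
import math

def carlson_tsi_tp(tp_ug_per_l):
    """Carlson (1977) trophic state index from total phosphorus (ug/L):
    TSI(TP) = 14.42 * ln(TP) + 4.15."""
    return 14.42 * math.log(tp_ug_per_l) + 4.15

def trophic_class(tsi):
    """Conventional class boundaries for the Carlson index."""
    if tsi < 40:
        return "oligotrophic"
    if tsi < 50:
        return "mesotrophic"
    if tsi < 70:
        return "eutrophic"
    return "hypereutrophic"

tsi = carlson_tsi_tp(30.0)   # illustrative TP value, not from the study
```

Companion formulas exist for chlorophyll-a and Secchi depth; in practice the three sub-indices are compared to diagnose what limits production.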
Procedia PDF Downloads 187
6332 The Automated Soil Erosion Monitoring System (ASEMS)
Authors: George N. Zaimes, Valasia Iakovoglou, Paschalis Koutalakis, Konstantinos Ioannou, Ioannis Kosmadakis, Panagiotis Tsardaklis, Theodoros Laopoulos
Abstract:
Advancements in technology allow the development of a new system that can continuously measure surface soil erosion. Continuous soil erosion measurements are required in order to comprehend erosional processes and to propose effective and efficient conservation measures that mitigate surface erosion. Mitigating soil erosion, especially in Mediterranean countries such as Greece, is essential in order to maintain environmental and agricultural sustainability. In this paper, we present the Automated Soil Erosion Monitoring System (ASEMS), which measures surface soil erosion along with other factors that impact the erosional process. Specifically, the system measures ground level changes (surface soil erosion), rainfall, air temperature, soil temperature and soil moisture. Another important innovation is that the data are collected by remote communication. In addition, stakeholder awareness is a key factor in reducing any environmental problem, and the different dissemination activities that were utilized are described. The overall outcome was the development of an innovative system that can measure erosion very accurately. Data from the system help in studying the process of erosion and in finding the best possible methods to reduce it. The dissemination activities enhance stakeholders' and the public's awareness of surface soil erosion problems and will lead to the adoption of more effective soil erosion conservation practices in Greece.
Keywords: soil management, climate change, new technologies, conservation practices
Procedia PDF Downloads 345
6331 The Effect of Non-Surgical Periodontal Therapy on Metabolic Control in Children
Authors: Areej Al-Khabbaz, Swapna Goerge, Majedah Abdul-Rasoul
Abstract:
Introduction: The most prevalent periodontal disease among children is gingivitis, and it usually becomes more severe in adolescence. A number of intervention studies have suggested that resolution of periodontal inflammation can improve metabolic control in patients diagnosed with diabetes mellitus. Aim: To assess the effect of non-surgical periodontal therapy on the glycemic control of children diagnosed with diabetes mellitus. Method: Twenty-eight children with an established diagnosis of diabetes mellitus for at least 1 year were recruited. Informed consent and child assent forms were obtained from children and parents prior to enrolment. The dental examination of the participants was performed in the same week, directly following their annual medical assessment. All patients had their glycosylated hemoglobin (HbA1c%) tested one week prior to their annual medical and dental visit and again 3 months following non-surgical periodontal therapy. All patients received a comprehensive periodontal examination; the periodontal assessment included clinical attachment loss, bleeding on probing, plaque score, plaque index and gingival index. All patients were referred for non-surgical periodontal therapy, which included oral hygiene instruction and motivation followed by supra-gingival and sub-gingival scaling using ultrasonic and hand instruments. Statistical Analysis: Data were entered and analyzed using the Statistical Package for the Social Sciences software (SPSS, Chicago, USA), version 18. Statistical analysis of the clinical findings was performed to detect differences between the two groups in terms of periodontal findings and HbA1c%. Binary logistic regression analysis was performed in order to examine which factors remained significant in multivariate analysis after adjusting for confounding effects. The regression model used the dependent variable 'Improved glycemic control', and the independent variables entered in the model were plaque index, gingival index, bleeding %, and plaque score. Statistical significance was set at p < 0.05. Result: A total of 28 children participated. The mean age of the participants was 13.3±1.92 years. The study participants were divided into two groups: a compliant group (received dental scaling) and a non-compliant group (received oral hygiene instructions only). No statistical difference was found between the compliant and non-compliant groups in age, gender distribution, oral hygiene practice or level of diabetes control. There was a significant difference between the compliant and non-compliant groups in terms of improvement in HbA1c% before and after periodontal therapy. Mean gingival index was the only significant variable associated with improved glycemic control. In conclusion, this study has demonstrated that non-surgical mechanical periodontal therapy can improve HbA1c% control. The results confirm that children with diabetes mellitus who are compliant with dental care and have routine professional scaling may have better metabolic control compared to diabetic children who are erratic with dental care.
Keywords: children, diabetes, metabolic control, periodontal therapy
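A before/after HbA1c comparison of this kind reduces to a paired test on within-subject change. The sketch below uses invented HbA1c% values for six hypothetical children, not the study's 28 participants, purely to show the arithmetic.

```python
import math

def paired_t(before, after):
    """Paired t statistic for within-subject change (before - after)."""
    d = [b - a for b, a in zip(before, after)]
    n = len(d)
    mean = sum(d) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in d) / (n - 1))
    return mean / (sd / math.sqrt(n))

# Illustrative HbA1c% values; the study analysed its data in SPSS.
hba1c_before = [9.1, 8.7, 10.2, 9.5, 8.9, 9.8]
hba1c_after = [8.6, 8.5, 9.6, 9.2, 8.8, 9.1]
t_stat = paired_t(hba1c_before, hba1c_after)
```

A positive t on (before - after) indicates HbA1c% fell after therapy; the p-value would then come from a t distribution with n-1 degrees of freedom.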
Procedia PDF Downloads 161
6330 Electroencephalography Correlates of Memorability While Viewing Advertising Content
Authors: Victor N. Anisimov, Igor E. Serov, Ksenia M. Kolkova, Natalia V. Galkina
Abstract:
The problem of the memorability of advertising content is closely connected with key issues in neuromarketing. The memorability of advertising content contributes to the marketing effectiveness of the promoted product. Significant directions in studying the phenomenon of memorability are the memorability of the brand (detected through the memorability of the logo) and the memorability of the product offer (detected through the memorization of dynamic audiovisual advertising content, i.e., commercials). The aim of this work is to reveal the predictors of memorization of static and dynamic audiovisual stimuli (logos and commercials). An important direction of the research was revealing differences in the psychophysiological correlates of memorability between static and dynamic audiovisual stimuli. We assumed that static and dynamic images are perceived in different ways and may differ in the memorization process. Objective methods of recording psychophysiological parameters while watching static and dynamic audiovisual materials are well suited to achieving this aim. The electroencephalography (EEG) method was employed to identify correlates of the memorability of various stimuli in the electrical activity of the cerebral cortex. All stimuli (in the static and dynamic groups separately) were divided into 2 groups, remembered and not remembered, based on the results of a questionnaire. The questionnaires were filled out by participants after viewing the stimuli, not immediately but after a time interval (to detect stimuli retained through long-term memorization). Using statistical methods, we developed a classifier (statistical model) that predicts which group (remembered or not remembered) a stimulus falls into, based on its psychophysiological perception. The result of the statistical model was compared with the results of the questionnaire. Conclusions: Predictors of the memorability of static and dynamic stimuli have been identified, which allows predicting which stimuli will have a higher probability of being remembered. A further development of this study will be the creation of a stimulus memory model capable of recognizing a stimulus as previously seen or new. Thus, in the process of remembering a stimulus, it is planned to take into account the stimulus recognition factor, which is one of the most important tasks for neuromarketing.
Keywords: memory, commercials, neuromarketing, EEG, branding
Procedia PDF Downloads 251
6329 Intra and International Collaborations as Important Factors of Organisational Innovation of Government Agencies in STI Ecosystem in ASEAN
Authors: Salinthip Thipayang, Achara Chandrachai, Rath Pichyangkura, Sukree Sinthupinyo
Abstract:
Most of the well-known frameworks and tools for measuring and comparing the organisational innovation of public or government agencies have been designed and used in developed economies such as the EU, the Nordic region, Australia, and South Korea. This project is one of the very first attempts to develop a tool to adequately measure the organisational (administrative) innovation of government agencies in the developing economies of ASEAN. Components covering the intra and international collaborations of these government agencies with other private, public and academic sectors were added to the proposed measurement framework. Questionnaires and in-depth interviews with experts and with middle to top executives of the participating public agencies in the ASEAN member states were conducted to determine the suitability of, and to develop, the indicators that should be included in the measurement model. The results showed that intra and international collaborations of these government organisations with other agencies in the public, private and academic sectors can lead to new changes and greatly impact the ways in which these government agencies in the ASEAN STI ecosystem are operated and administered. Government organisations in less developed countries in ASEAN are ready and willing to learn from their counterparts in other more advanced countries and to adjust their internal management to be more innovative and to better handle international collaborative projects and commitments.
Keywords: organisational innovation, administrative innovation, government agencies, public agencies, ASEAN science technology and innovation ecosystem, international collaborations
Procedia PDF Downloads 385
6328 Performance Evaluation and Planning for Road Safety Measures Using Data Envelopment Analysis and Fuzzy Decision Making
Authors: Hamid Reza Behnood, Esmaeel Ayati, Tom Brijs, Mohammadali Pirayesh Neghab
Abstract:
Investment projects in road safety planning can benefit from an effectiveness evaluation regarding their expected safety outcomes. The objective of this study is to develop a decision support system (DSS) to help policymakers make the right choices in road safety planning, based on the efficiency of previously implemented safety measures in a set of regions in Iran. The measures considered for each region include performance indicators on (1) police operations, (2) treated black spots, (3) freeway and highway facility supplies, (4) speed control cameras, (5) emergency medical services, and (6) road lighting projects. To this end, an inefficiency measure is calculated, defined as the proportion of fatality rates in relation to the combined measure of road safety performance indicators (i.e., road safety measures), which should be minimized. The relative inefficiency of each region is modeled with the Data Envelopment Analysis (DEA) technique. In a next step, a fuzzy decision-making system is constructed to convert the information obtained from the DEA analysis into a rule-based system that policymakers can use to evaluate the expected outcomes of alternative investment strategies in road safety.
Keywords: performance indicators, road safety, decision support system, data envelopment analysis, fuzzy reasoning
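In the degenerate single-input, single-output case, DEA's CCR efficiency collapses to each unit's output/input ratio normalised by the best ratio, which gives the flavour of the technique without the per-unit linear programme the full method requires. The regions and values below are made up for illustration.

```python
def dea_ratio_efficiency(inputs, outputs):
    """CCR efficiency for one input and one output: each unit's
    output/input ratio divided by the best ratio. General DEA with
    multiple inputs/outputs instead solves an LP per decision unit."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Illustrative regions: input = safety spending index, output = a
# "safety outcome" score where higher is safer. Values are made up.
eff = dea_ratio_efficiency(inputs=[10, 20, 15], outputs=[5, 8, 9])
```

The unit with the best ratio scores 1.0 (on the efficiency frontier); the others are scored relative to it, which is what the fuzzy rule base would then consume.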
Procedia PDF Downloads 352
6327 MAOD Is Estimated by Sum of Contributions
Authors: David W. Hill, Linda W. Glass, Jakob L. Vingren
Abstract:
Maximal accumulated oxygen deficit (MAOD), the gold standard measure of anaerobic capacity, is the difference between the oxygen cost of exhaustive severe intensity exercise and the accumulated oxygen consumption (O2; mL·kg–1). In theory, MAOD can be estimated as the sum of independent estimates of the phosphocreatine and glycolysis contributions, which we refer to as PCr+glycolysis. Purpose: The purpose was to test the hypothesis that PCr+glycolysis provides a valid measure of anaerobic capacity in cycling and running. Methods: The participants were 27 women (mean ± SD: age 22 ± 1 y, height 165 ± 7 cm, weight 63.4 ± 9.7 kg) and 25 men (age 22 ± 1 y, height 179 ± 6 cm, weight 80.8 ± 14.8 kg). They performed two exhaustive tests, one cycling and one running, at work rates and speeds that could be tolerated for ~5 min. The rate of oxygen consumption (VO2; mL·kg–1·min–1) was measured in warmups, in the tests, and during 7 min of recovery. Fingerprick blood samples obtained after exercise were analysed to determine the peak blood lactate concentration (PeakLac). The VO2 response in exercise was fitted to a model with a fast 'primary' phase followed by a delayed 'slow' component, from which the accumulated O2 and the excess O2 attributable to the slow component were calculated. The VO2 response in recovery was fitted to a model with a fast phase and a slow component sharing a common time delay. Oxygen demand (in mL·kg–1·min–1) was determined by extrapolation from the steady-state VO2 in warmups; the total oxygen cost (in mL·kg–1) was determined by multiplying this demand by the time to exhaustion and adding the excess O2; then, MAOD was calculated as the total oxygen cost minus the accumulated O2. The phosphocreatine contribution (the area under the fast phase of the post-exercise VO2) and the glycolytic contribution (converted from PeakLac) were summed to give PCr+glycolysis.
There was no interaction effect involving sex, so values for anaerobic capacity were examined using a two-way ANOVA with repeated measures across method (PCr+glycolysis vs MAOD) and mode (cycling vs running). Results: There was a significant effect only for exercise mode. There was no difference between MAOD and PCr+glycolysis: the values were 59 ± 6 mL·kg–1 and 61 ± 8 mL·kg–1 in cycling and 78 ± 7 mL·kg–1 and 75 ± 8 mL·kg–1 in running. Discussion: PCr+glycolysis is a valid measure of anaerobic capacity in cycling and running, and it is as valid for women as for men. Keywords: alactic, anaerobic, cycling, ergometer, glycolysis, lactic, lactate, oxygen deficit, phosphocreatine, running, treadmill
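The MAOD arithmetic described in the Methods can be sketched as follows. The numbers below are invented for illustration, and the 3.0 mL·kg–1 per mmol·L–1 factor for converting peak lactate into a glycolytic oxygen equivalent is a commonly cited convention, not a value stated in the abstract.

```python
def maod(demand, time_to_exhaustion, excess_o2, accumulated_o2):
    """MAOD (mL/kg) = total oxygen cost minus accumulated oxygen uptake.

    demand: oxygen demand in mL/kg/min (extrapolated from warmups)
    time_to_exhaustion: min; excess_o2, accumulated_o2: mL/kg
    """
    total_cost = demand * time_to_exhaustion + excess_o2
    return total_cost - accumulated_o2

def pcr_plus_glycolysis(pcr_area, peak_lactate, baseline_lactate=1.0,
                        o2_per_mmol=3.0):
    """Sum of the phosphocreatine contribution (area under the fast phase
    of recovery VO2, mL/kg) and the glycolytic contribution converted
    from peak blood lactate (mmol/L). Conversion factor is an assumption."""
    return pcr_area + o2_per_mmol * (peak_lactate - baseline_lactate)

# Illustrative values only
estimate_a = maod(demand=52.0, time_to_exhaustion=5.0,
                  excess_o2=5.0, accumulated_o2=205.0)
estimate_b = pcr_plus_glycolysis(pcr_area=29.0, peak_lactate=11.0)
```

With these made-up inputs the two routes give 60 and 59 mL·kg–1, the kind of close agreement the study reports between MAOD and PCr+glycolysis.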
Procedia PDF Downloads 136

6326 Cat Stool as an Additive Aggregate to Garden Bricks
Authors: Mary Joy B. Amoguis, Alonah Jane D. Labtic, Hyna Wary Namoca, Aira Jane V. Original
Abstract:
Animal waste has been increasing rapidly due to the growing animal population and the lack of innovative waste management practices. In a country like the Philippines, animal waste is rampant. This study aims to minimize animal waste by producing garden bricks using cat stool as an additive. The study analyzes different concentration levels to determine the most efficient combination in terms of the compressive strength and durability of cat stool as an additive to garden bricks. The researchers first collected the cat stool, incinerated it, and prepared the different concentrations. The first concentration is a 25% cat stool and 75% cement mixture, the second is 50% cat stool and 50% cement, and the third is 75% cat stool and 25% cement. The statistical data were analyzed using one-way ANOVA, and the analysis revealed a significant difference compared with the control. The findings show an inversely proportional relationship: the higher the concentration of the cat stool additive, the lower the compressive strength of the bricks, and the lower the concentration, the higher the compressive strength. Keywords: cat stool, garden bricks, cement, concentrations, animal wastes, compressive strength, durability, one-way ANOVA, additive, incineration, aggregates, stray cats
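The one-way ANOVA used above can be sketched in plain Python. The compressive-strength readings below are invented for illustration; the actual measurements are not reported in the abstract.

```python
def one_way_anova_f(*groups):
    """Return the F statistic for a one-way ANOVA across the groups."""
    all_values = [x for g in groups for x in g]
    grand_mean = sum(all_values) / len(all_values)
    # Between-group sum of squares: group sizes times squared deviation
    # of each group mean from the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: deviations from each group's own mean
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical compressive strengths (MPa) per cat-stool concentration
strength_25 = [9.1, 9.4, 8.9]   # 25% stool / 75% cement
strength_50 = [6.2, 6.5, 6.0]   # 50% stool / 50% cement
strength_75 = [3.1, 3.4, 2.9]   # 75% stool / 25% cement
f_stat = one_way_anova_f(strength_25, strength_50, strength_75)
```

A large F relative to the F(2, 6) critical value would indicate a significant difference between concentration levels, matching the study's conclusion.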
Procedia PDF Downloads 64

6325 Resilience in Refuge Context: The Validity Assessment Using Child and Youth Resilience Measure-28 among Afghan Young Immigrants in Iran
Authors: Baqir Rezai, Leila Heydarinasab, Rasol Roshan, Mohammad Ghulami
Abstract:
Introduction: The resilience process is a controversial and important subject for child and youth immigrants throughout the world. Positive adaptation to the environment is a consequence of resilience, which can affect quality of life and the physical and mental health of immigrants. Method: A total of 714 Afghan young immigrants (14 to 18 years old) who had lived in Iran for more than three years were enrolled in the study. A random sampling method was applied to obtain the data. The sample was divided into two groups (N1 = 360 and N2 = 354) for exploratory and confirmatory analysis. Exploratory factor analysis was applied to assess the construct validity of the CYRM-28. Results: The results showed that the scale has adequate content validity and that the sample reflects the three factors of the Child and Youth Resilience Measure-28: individual, contextual, and relational. However, of the 28 main items, only 15 could identify these factors. Discussion: The resilience process among young immigrants is mainly explained by individual, social, and cultural conditions. For instance, young immigrants trace the resilience process to the conditions that caused their immigration. In this context, some questions about security and personal development in society could identify the three main factors. Keywords: CYRM-28, factorial analysis, resilience, Afghan young immigrants
Procedia PDF Downloads 139

6324 Systematic Identification of Noncoding Cancer Driver Somatic Mutations
Authors: Zohar Manber, Ran Elkon
Abstract:
Accumulation of somatic mutations (SMs) in the genome is a major driving force of cancer development. Most SMs in the tumor's genome are functionally neutral; however, some cause damage to critical processes and provide the tumor with a selective growth advantage (termed cancer driver mutations). Current research on functional significance of SMs is mainly focused on finding alterations in protein coding sequences. However, the exome comprises only 3% of the human genome, and thus, SMs in the noncoding genome significantly outnumber those that map to protein-coding regions. Although our understanding of noncoding driver SMs is very rudimentary, it is likely that disruption of regulatory elements in the genome is an important, yet largely underexplored mechanism by which somatic mutations contribute to cancer development. The expression of most human genes is controlled by multiple enhancers, and therefore, it is conceivable that regulatory SMs are distributed across different enhancers of the same target gene. Yet, to date, most statistical searches for regulatory SMs have considered each regulatory element individually, which may reduce statistical power. The first challenge in considering the cumulative activity of all the enhancers of a gene as a single unit is to map enhancers to their target promoters. Such mapping defines for each gene its set of regulating enhancers (termed "set of regulatory elements" (SRE)). Considering multiple enhancers of each gene as one unit holds great promise for enhancing the identification of driver regulatory SMs. However, the success of this approach is greatly dependent on the availability of comprehensive and accurate enhancer-promoter (E-P) maps. To date, the discovery of driver regulatory SMs has been hindered by insufficient sample sizes and statistical analyses that often considered each regulatory element separately. 
In this study, we analyzed more than 2,500 whole-genome sequencing (WGS) samples provided by The Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium (ICGC) to identify such driver regulatory SMs. Our analyses took the combinatorial aspect of gene regulation into account by considering all the enhancers that control the same target gene as one unit, based on E-P maps from three genomics resources. The identification of candidate driver noncoding SMs is based on their recurrence: we searched for SREs of genes that are "hotspots" for SMs (that is, that accumulate SMs at a significantly elevated rate). To test the statistical significance of the recurrence of SMs within a gene's SRE, we used both global and local background mutation rates. Using this approach, we detected numerous SM "hotspots" in seven different cancer types. To support the functional significance of these recurrent noncoding SMs, we further examined their association with the expression level of their target gene, using gene expression data provided by the ICGC and TCGA for samples that were also analyzed by WGS. Keywords: cancer genomics, enhancers, noncoding genome, regulatory elements
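A minimal version of the recurrence test (is the SM count within a gene's SRE higher than expected under a background mutation rate?) can be sketched as an upper-tail binomial test. This is a simplification of the actual procedure, which uses both global and local background rates, and the counts below are invented.

```python
def binomial_upper_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the probability of seeing at
    least k somatic mutations across n position-sample pairs when each
    mutates independently with background probability p. The pmf is
    accumulated iteratively to stay within floating point for large n."""
    if k <= 0:
        return 1.0
    pmf = (1.0 - p) ** n       # P(X = 0)
    cdf = pmf                  # accumulates P(X <= k-1)
    for i in range(1, k):
        pmf *= (n - i + 1) / i * p / (1.0 - p)
        cdf += pmf
    return 1.0 - cdf

# Hypothetical SRE: 5,000 position-sample pairs of enhancer sequence,
# background rate 1e-3 per position, 15 SMs observed in the SRE
p_value = binomial_upper_tail(n=5000, k=15, p=1e-3)
```

In practice the p-values over all genes would then be corrected for multiple testing before calling a gene's SRE a "hotspot".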
Procedia PDF Downloads 104

6323 Characteristics of Cumulative Distribution Function of Grown Crack Size at Specified Fatigue Crack Propagation Life under Different Maximum Fatigue Loads in AZ31
Authors: Seon Soon Choi
Abstract:
Magnesium alloys are widely used in structures such as automobiles. It is necessary to consider the probabilistic characteristics of a structural material because the fatigue behavior of a structure involves randomness and uncertainty. The purpose of this study is to characterize the cumulative distribution function (CDF) of the grown crack size at a specified fatigue crack propagation life and to investigate statistical crack propagation in magnesium alloys. The statistical fatigue data on grown crack size are obtained through fatigue crack propagation (FCP) tests under different maximum fatigue load conditions conducted on replicated specimens of magnesium alloys. The 3-parameter Weibull distribution is used to model the CDF of grown crack size. Under the larger maximum fatigue load, the CDF of grown crack size has longer tails below the 10th percentile and above the 90th percentile. Fatigue failure occurs more easily as the tails of the CDF of grown crack size become longer. The fatigue behavior under the larger maximum fatigue load condition shows more rapid propagation and failure. Keywords: cumulative distribution function, fatigue crack propagation, grown crack size, magnesium alloys, maximum fatigue load
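The 3-parameter Weibull CDF used to describe grown crack size can be written down directly. The shape, scale, and location values below are invented for illustration; the fitted parameters are not given in the abstract.

```python
from math import exp

def weibull3_cdf(x, shape, scale, loc):
    """CDF of the 3-parameter Weibull distribution:
    F(x) = 1 - exp(-(((x - loc) / scale) ** shape)) for x > loc, else 0.
    The location parameter shifts the minimum possible crack size."""
    if x <= loc:
        return 0.0
    return 1.0 - exp(-(((x - loc) / scale) ** shape))

# Hypothetical fit for grown crack size (mm) at a specified FCP life
shape, scale, loc = 2.5, 0.8, 1.2
p10 = weibull3_cdf(1.5, shape, scale, loc)
```

Comparing fitted curves under different maximum loads, a heavier lower and upper tail (the abstract's "longer tail" below 10% and above 90%) signals a wider spread of crack sizes and hence easier fatigue failure.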
Procedia PDF Downloads 288

6322 Destination Port Detection For Vessels: An Analytic Tool For Optimizing Port Authorities Resources
Authors: Lubna Eljabu, Mohammad Etemad, Stan Matwin
Abstract:
Port authorities in congested ports face many challenges in allocating their resources to provide a safe and secure loading/unloading procedure for cargo vessels. Selecting a destination port is the decision of a vessel master and is based on many factors, such as weather, wave conditions, and changes of priorities. Access to a tool that leverages AIS messages to monitor vessels' movements and accurately predict their next destination port promotes an effective resource allocation process for port authorities. In this research, we propose a method, named Reference Route of Trajectory (RRoT), to assist port authorities in predicting inflow and outflow traffic in their local environment by monitoring Automatic Identification System (AIS) messages. Our RRoT method creates a reference route based on historical AIS messages and uses well-established trajectory similarity measures to identify the destination of a vessel from its recent movement. We evaluated five similarity measures: Discrete Fréchet Distance (DFD), Dynamic Time Warping (DTW), Partial Curve Mapping (PCM), Area between two curves (Area), and Curve length (CL). Our experiments show that our method identifies the destination port with an accuracy of 98.97% and an F-measure of 99.08% using the Dynamic Time Warping (DTW) similarity measure. Keywords: spatial temporal data mining, trajectory mining, trajectory similarity, resource optimization
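Of the five similarity measures, DTW (the best performer) is straightforward to sketch in plain Python. The two toy sequences below are invented; real inputs would be sequences of AIS positions with a geographic cost function.

```python
def dtw_distance(a, b, dist=lambda x, y: abs(x - y)):
    """Classic O(len(a)*len(b)) dynamic time warping distance between
    two sequences, with an arbitrary pointwise cost function."""
    n, m = len(a), len(b)
    INF = float("inf")
    # D[i][j] = minimal cost of aligning a[:i] with b[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(a[i - 1], b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# Toy 1-D "trajectories"; AIS data would use (lat, lon) points with a
# haversine-style cost function instead of abs(x - y)
d = dtw_distance([1, 2, 3], [2, 3, 4])
```

In an RRoT-style pipeline, a vessel's recent track would be compared against each candidate reference route, and the route with the smallest DTW cost would indicate the predicted destination port.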
Procedia PDF Downloads 121

6321 Examines the Proportionality between the Needs of Industry and Technical and Vocational Training of Male and Female Vocational Schools
Authors: Khalil Aryanfar, Pariya Gholipor, Elmira Hafez
Abstract:
This study examines the proportionality between the needs of industry and the technical and vocational training offered by male and female vocational schools. The research method was descriptive and was conducted in two parts, documentary analysis and needs assessment; the Delphi method was used in the needs assessment. The statistical population of the study included 312 employers from the industry sector, 52 of whom were selected through stratified random sampling. Data were collected from upstream documents, including the technical and vocational training development document, the 1393 Statistical Yearbook of Tehran, and the documents available in the Isfahan Planning Department. The findings indicate an approximate proportionality between the needs of industry and the vocational training of male and female vocational schools in the fields of welding, industrial electronics, electrotechnics, industrial drawing, auto mechanics, design, packaging, machine tools, metalworking, construction, accounting, computer graphics, and administrative affairs. The findings also indicate no proportionality between the needs of industry and the vocational training of male and female vocational schools in the fields of thermal-cooling systems, building electricity, building drawing, interior architecture, car electricity, and motor repair. Keywords: needs assessment, technical and vocational training, industry
Procedia PDF Downloads 454

6320 Effectiveness of Homoeopathic Medicine Conium Maculatum 200 C for Management of Pyuria
Authors: Amir Ashraf
Abstract:
Homoeopathy is an alternative system of medicine discovered by the German physician Samuel Hahnemann in 1796. It has been used by many people for various health conditions globally for more than 200 years, and in India it is considered a major system of alternative medicine. Homoeopathy is found effective in various medical conditions, including pyuria, the condition in which pus cells are found in the urine. Homoeopathy is very useful for reducing pus cells, and homeopathically potentized Conium Maculatum (Hemlock) is an important remedy commonly used for reducing pyuria. Aim: To reduce the amount of pus cells found in the urine using Conium Mac 200C. Methods: Design: small-N design. Sample: purposive sampling of 5 cases diagnosed with pyuria. Tools: Personal Data Schedule and the ICD-10 criteria for pyuria. Technique: the potentized homoeopathic medicine Conium Mac in the 200th potency was used. Statistical analysis: the statistical analyses were done using non-parametric tests. Results: A significant pre/post difference was identified. Conclusion: The homoeopathic potency Conium Mac 200C is effective in reducing the increased level of pus cells found in urine samples. Keywords: homoeopathy, alternative medicine, pyuria, Conium Mac, small N design, non-parametric tests, homeopathic physician, Ashirvad Hospital, Kannur
Procedia PDF Downloads 335

6319 Received Signal Strength Indicator Based Localization of Bluetooth Devices Using Trilateration: An Improved Method for the Visually Impaired People
Authors: Muhammad Irfan Aziz, Thomas Owens, Uzair Khaleeq uz Zaman
Abstract:
Instantaneous, spatial localization of visually impaired people in dynamically changing environments with unexpected hazards and obstacles is the most demanding and challenging issue faced by navigation systems today. Since Bluetooth cannot utilize techniques like Time Difference of Arrival (TDOA) and Time of Arrival (TOA), it uses the received signal strength indicator (RSSI) to measure Received Signal Strength (RSS). Measurements using RSSI can be improved significantly by improving the existing RSSI-based methodologies. Therefore, this paper proposes an improved trilateration method for the localization of Bluetooth devices for visually impaired people. To validate the method, Class 2 Bluetooth devices were used along with purpose-built software. Experiments were then conducted to obtain surface plots that showed the signal interference and other environmental effects. Finally, the results show the surface plots for all the Bluetooth modules used, with the strong and weak points depicted by color codes in red, yellow, and blue. It was concluded that the suggested improved method of measuring RSS using trilateration helped not only to measure signal strength effectively but also highlighted how the signal strength can be influenced by atmospheric conditions such as noise, reflections, etc. Keywords: Bluetooth, indoor/outdoor localization, received signal strength indicator, visually impaired
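The trilateration step can be sketched as follows: RSSI is first converted to a distance via the log-distance path-loss model, then the position is recovered from three beacon distances by linearizing the circle equations. The path-loss parameters and beacon layout are illustrative assumptions, not values from the paper.

```python
def rssi_to_distance(rssi, tx_power=-59.0, path_loss_exp=2.0):
    """Log-distance path-loss model: distance in metres from an RSSI
    reading (dBm), given the RSSI at 1 m (tx_power) and the path-loss
    exponent. Both parameters are environment-dependent assumptions."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exp))

def trilaterate(beacons, distances):
    """Solve for (x, y) from three beacon positions and distances by
    subtracting the first circle equation from the other two."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = distances
    # Linear system A [x, y]^T = c after the subtraction
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    x = (c1 * a22 - c2 * a12) / det
    y = (a11 * c2 - a21 * c1) / det
    return x, y

# Synthetic check: beacons at known positions, distances to (3, 4)
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [((bx - 3.0) ** 2 + (by - 4.0) ** 2) ** 0.5 for bx, by in beacons]
est = trilaterate(beacons, dists)
```

With noisy RSSI-derived distances the circles do not intersect exactly, which is where improvements to the basic method, such as the one proposed above, come in.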
Procedia PDF Downloads 134

6318 Jejunostomy and Protective Ileostomy in a Patient with Massive Necrotizing Enterocolitis: A Case Report
Authors: Rafael Ricieri, Rogerio Barros
Abstract:
Objective: This study reports a case of massive necrotizing enterocolitis in a six-month-old patient, requiring an ileostomy and protective jejunostomy as a damage-control measure during the first exploratory laparotomy for massive enterocolitis without a previous diagnosis. Methods: This study is a case report of the successful creation and closure of a protective jejunostomy. The low number of publications on this staged and risky measure of surgical resolution encouraged the team to study the indication for, and especially the correct timing of, the closure of the patient's protective jejunostomy. The main study instrument was the six-month-old patient's medical record. Results: Based on the observation of the case described, the time to closure of the described procedure (protective jejunostomy) varies according to how compromised the patient's health status is and is individual to each patient. Early closure, or failure to close, can create serious problems for the patient, since several complications can result from this closure, such as new intestinal perforations and fluid and electrolyte disturbances. Despite the risk of new perforations, we suggest closing the protective jejunostomy around the 14th day after the procedure, keeping the patient on broad-spectrum antibiotic therapy and absolute fasting, thus reducing the chances of new intestinal perforations. In association with the closure of the jejunostomy, a gastric tube for decompression is required, together with care in an intensive care unit and electrolyte replacement to maintain the stability of the case. Keywords: jejunostomy, ileostomy, enterocolitis, pediatric surgery, gastric surgery
Procedia PDF Downloads 84

6317 Characterization on Molecular Weight of Polyamic Acids Using GPC Coupled with Multiple Detectors
Authors: Mei Hong, Wei Liu, Xuemin Dai, Yanxiong Pan, Xiangling Ji
Abstract:
Polyamic acid (PAA) is the precursor of polyimide (PI) prepared by the two-step method; its molecular weight and molecular weight distribution not only play an important role during preparation and processing but also influence the final performance of the PI. However, precise characterization of the molecular weight of PAA is still a challenge because of the very complicated interactions in the solution system, including electrostatic, hydrogen-bond, and dipole-dipole interactions. Thus, it is necessary to establish a suitable strategy that completely suppresses these complex effects and gives reasonable molecular weight data. Herein, gel permeation chromatography (GPC) coupled with differential refractive index (RI) and multi-angle laser light scattering (MALLS) detectors was applied to measure the molecular weight of (6FDA-DMB) PAA using different mobile phases: LiBr/DMF, LiBr/H3PO4/THF/DMF, LiBr/HAc/THF/DMF, and LiBr/HAc/DMF. It was found that the combination of LiBr with HAc shields the above-mentioned complex interactions and is more conducive to the separation of PAA than the addition of LiBr alone in DMF. LiBr/HAc/DMF was employed for the first time as a mild mobile phase to effectively separate PAA and determine its molecular weight. After a series of conditional experiments, 0.02 M LiBr/0.2 M HAc/DMF was fixed as the optimized mobile phase to measure the relative and absolute molecular weights of the prepared (6FDA-DMB) PAA, and the Mw values obtained from GPC-MALLS and GPC-RI were 35,300 g/mol and 125,000 g/mol, respectively. Notably, this mobile phase is also applicable to other PAA samples with different structures, and the final molecular weight results are also reproducible. Keywords: polyamic acids, polyelectrolyte effects, gel permeation chromatography, mobile phase, molecular weight
Procedia PDF Downloads 54

6316 Statistical Model to Examine the Impact of the Inflation Rate and Real Interest Rate on the Bahrain Economy
Authors: Ghada Abo-Zaid
Abstract:
Introduction: Oil is one of the main income sources in Bahrain, and low oil prices influence economic growth and the investment rate. For example, economic growth was 3.7% in 2012 and fell to 2.9% in 2015. The investment rate was 9.8% in 2012 and fell to 5.9% and -12.1% in 2014 and 2015, respectively. The inflation rate peaked at 3.3% in 2013. Objectives: The objective is to build statistical models to examine the effect of the inflation rate and the real interest rate on economic growth in Bahrain from 2000 to 2018. Methods: This study is based on 18 years of data, and a multiple regression model is used for the analysis. All missing data are omitted from the analysis. Results: A regression model is used to examine the association between gross national product (GNP), the inflation rate, and the real interest rate. We found that (i) an increase in the real interest rate decreases GNP, and (ii) an increase in the inflation rate has no effect on economic growth in Bahrain, since the average inflation rate was almost 2%, which is considered low. Conclusion: The real interest rate has a significant impact on GNP in Bahrain, while the inflation rate shows no negative influence on GNP, as it was not large enough to affect the economic growth rate negatively. Keywords: gross national product, Bahrain, regression model, interest rate
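A minimal version of the multiple regression described, GNP on the inflation rate and the real interest rate, can be sketched with ordinary least squares solved from the normal equations. The data below are synthetic, generated from known coefficients, not the Bahrain series.

```python
def solve_linear(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """Ordinary least squares: solve the normal equations (X'X) b = X'y."""
    cols = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(cols)]
           for i in range(cols)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(cols)]
    return solve_linear(XtX, Xty)

# Synthetic data: growth = 4.0 - 1.5 * real_rate + 0.2 * inflation
rates = [(1.0, 1.5), (2.0, 2.5), (3.0, 1.0), (0.5, 3.0), (2.5, 2.0)]
X = [[1.0, r, inf] for r, inf in rates]   # intercept, then predictors
y = [4.0 - 1.5 * r + 0.2 * inf for r, inf in rates]
beta = ols(X, y)  # recovers [4.0, -1.5, 0.2] up to rounding
```

The sign of the fitted real-interest-rate coefficient corresponds to the study's finding (i); on real data, significance would additionally be judged from the coefficient's standard error.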
Procedia PDF Downloads 164

6315 Identifying and Quantifying Factors Affecting Traffic Crash Severity under Heterogeneous Traffic Flow
Authors: Praveen Vayalamkuzhi, Veeraragavan Amirthalingam
Abstract:
Studies of highway safety are becoming the need of the hour, as over 400 lives are lost every day in India due to road crashes. In order to evaluate the factors that lead to different levels of crash severity, it is necessary to investigate the level of safety of highways and its relation to crashes. In the present study, an attempt is made to identify the factors that contribute to road crashes and to quantify their effect on the severity of road crashes. The study was carried out on a four-lane divided rural highway in India. The variables considered in the analysis include components of the horizontal alignment of the highway, viz., straight or curved section; time of day; driveway density; presence of a median; median openings; gradient; operating speed; and annual average daily traffic. These variables were selected after a preliminary analysis. The major complexities in the study are the heterogeneous traffic and the speed variation between different classes of vehicles along the highway. To quantify the impact of each of these factors, statistical analyses were carried out using a logit model and also negative binomial regression. The output from the statistical models showed that the horizontal alignment components, driveway density, time of day, operating speed, and annual average daily traffic have a significant relation with the severity of crashes, viz., fatal as well as injury crashes. Further, annual average daily traffic has a more significant effect on severity than the other variables. The contribution of the highway's horizontal components to crash severity is also significant. Logit models can predict crashes better than negative binomial regression models.
The results of the study will help transport planners to look into these aspects at the planning stage itself in the case of highways operated under heterogeneous traffic flow conditions. Keywords: geometric design, heterogeneous traffic, road crash, statistical analysis, level of safety
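The logit model used above relates predictor values to the probability of a fatal (versus injury) crash through the logistic function; the exponential of a coefficient is the odds multiplier for a one-unit change in that predictor. The coefficients below are invented purely to illustrate the mechanics, not fitted values from the study.

```python
from math import exp

def crash_fatal_probability(coeffs, intercept, predictors):
    """Logit model: P(fatal) = 1 / (1 + exp(-(intercept + sum b_i x_i))).

    coeffs and predictors are dicts keyed by variable name."""
    z = intercept + sum(coeffs[k] * predictors[k] for k in coeffs)
    return 1.0 / (1.0 + exp(-z))

# Purely illustrative coefficients (not the study's fitted values)
coeffs = {"aadt_thousands": 0.05, "curved_section": 0.4, "night": 0.6}
intercept = -3.0

p_day_straight = crash_fatal_probability(
    coeffs, intercept,
    {"aadt_thousands": 10.0, "curved_section": 0, "night": 0})
p_night_curve = crash_fatal_probability(
    coeffs, intercept,
    {"aadt_thousands": 10.0, "curved_section": 1, "night": 1})
```

With these made-up values, a night-time crash on a curved section has higher fatal odds than a daytime crash on a straight section at the same traffic volume, mirroring the kind of comparison the fitted model supports.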
Procedia PDF Downloads 302

6314 Regional Flood-Duration-Frequency Models for Norway
Authors: Danielle M. Barna, Kolbjørn Engeland, Thordis Thorarinsdottir, Chong-Yu Xu
Abstract:
Design flood values give estimates of flood magnitude within a given return period and are essential to making adaptive decisions around land use planning, infrastructure design, and disaster mitigation. Often, design flood values are needed at locations with insufficient data. Additionally, in hydrologic applications where flood retention is important (e.g., floodplain management and reservoir design), design flood values are required for different flood durations. A statistical approach to this problem is to develop a regression model for extremes in which some of the parameters depend on flood duration in addition to being covariate-dependent. In hydrology, this is called a regional flood-duration-frequency (regional QDF) model. Typically, the underlying statistical distribution is chosen to be the generalized extreme value (GEV) distribution. However, as the support of the GEV distribution depends on both its parameters and the range of the data, special care must be taken in developing the regional model. In particular, we find that the GEV is problematic when developing a GAMLSS-type analysis due to the difficulty of proposing a link function that is independent of the unknown parameters and the observed data. We discuss these challenges in the context of developing a regional QDF model for Norway. Keywords: design flood values, Bayesian statistics, regression modeling of extremes, extreme value analysis, GEV
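The dependence of a design flood value on the GEV parameters can be sketched via the return-level formula: inverting the GEV CDF at probability 1 - 1/T gives the T-year flood. The parameter values below are invented for illustration.

```python
from math import exp, log

def gev_cdf(x, mu, sigma, xi):
    """Generalized extreme value CDF for xi != 0. Note the support
    constraint 1 + xi*(x - mu)/sigma > 0, which is the source of the
    link-function difficulties discussed above."""
    t = 1.0 + xi * (x - mu) / sigma
    if t <= 0:
        raise ValueError("x is outside the support of this GEV")
    return exp(-t ** (-1.0 / xi))

def return_level(T, mu, sigma, xi):
    """T-year return level: the quantile with exceedance probability 1/T,
    i.e. the inverse of the GEV CDF at 1 - 1/T (xi != 0)."""
    y = -log(1.0 - 1.0 / T)
    return mu + sigma / xi * (y ** (-xi) - 1.0)

# Hypothetical GEV fit for annual maximum floods (m^3/s)
mu, sigma, xi = 500.0, 120.0, 0.1
q100 = return_level(100, mu, sigma, xi)
```

In a regional QDF model, mu, sigma, and xi would themselves be functions of flood duration and catchment covariates, so a return-level curve like this would be produced per site and per duration.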
Procedia PDF Downloads 72