Search results for: statistical methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17155

17035 Main Tendencies of Youth Unemployment and the Regulation Mechanisms for Decreasing Its Rate in Georgia

Authors: Nino Paresashvili, Nino Abesadze

Abstract:

The modern world faces huge challenges. Globalization has changed the socio-economic conditions of many countries, and the current processes in the global environment affect countries with different cultures in different ways. However, alleviating poverty and improving living conditions remain the basic challenge for the majority of countries, because much of the population still lives below the official poverty threshold. It is therefore very important to stimulate youth employment. To prepare young people for the labour market, it is essential to provide them with appropriate professional skills and knowledge, and it is necessary to plan efficient activities for decreasing the unemployment rate and to develop sound mechanisms for regulating the labour market. Such planning requires a thorough study and analysis of the existing situation, as well as the development of corresponding mechanisms. Statistical analysis of unemployment is one of the main platforms for regulating the labour market's key mechanisms, and the corresponding statistical methods, such as observation, data gathering, grouping, and the calculation of generalized indicators, should be used in the study. Unemployment is one of the most severe socio-economic problems in Georgia: according to past as well as current statistics, the unemployment rate has always been the most problematic issue for policy makers to resolve. Analytical work on this problem will be the basis for the next sustainable steps towards solving it. The results of the study showed that young people's choice of profession is often driven neither by their inclinations and interests nor by labour market demand; this wrong professional orientation in most cases leads to their unemployment. At the same time, it was shown that a number of professions are in high demand in the labour market because of a deficit of the appropriate specialists. To achieve healthy competitiveness in youth employment, it is necessary to formulate regional employment programs that take the specifics of regional infrastructure into account.

Keywords: unemployment, analysis, methods, tendencies, regulation mechanisms

Procedia PDF Downloads 343
17034 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring

Authors: Hyun-Woo Cho

Abstract:

Detection of incipient abnormal events in production processes is important for improving the safety and reliability of manufacturing operations and for reducing losses caused by failures. The construction of calibration models for predicting faulty conditions is essential in deciding when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data. The calibration model is used to predict faulty conditions from historical reference data. The approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that a calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme, the prediction performance can be improved by excluding non-informative variables from the model building steps.
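
The idea of combining variable selection with a supervised probabilistic calibration model can be sketched as follows. This is a minimal illustration, not the paper's method: the synthetic process data, the t-like separation score, and the hand-rolled Gaussian naive Bayes classifier are all assumptions standing in for the (unspecified) techniques evaluated on real data.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, informative = 200, 10, 3
X_normal = rng.normal(0.0, 1.0, (n, p))
X_faulty = rng.normal(0.0, 1.0, (n, p))
X_faulty[:, :informative] += 3.0           # only 3 variables carry the fault signature
X = np.vstack([X_normal, X_faulty])
y = np.array([0] * n + [1] * n)

# Variable selection: rank variables by a t-like separation score and keep the top-k,
# excluding non-informative variables from the model building step
score = np.abs(X[y == 1].mean(0) - X[y == 0].mean(0)) / X.std(0)
keep = np.argsort(score)[-informative:]
Xs = X[:, keep]

# Supervised probabilistic model: Gaussian naive Bayes on the selected variables
mu = np.array([Xs[y == c].mean(0) for c in (0, 1)])
var = np.array([Xs[y == c].var(0) for c in (0, 1)])
loglik = -0.5 * (np.log(2 * np.pi * var[:, None, :])
                 + (Xs[None, :, :] - mu[:, None, :]) ** 2 / var[:, None, :]).sum(-1)
pred = loglik.argmax(0)                     # equal priors assumed
accuracy = (pred == y).mean()
```

With the fault confined to three variables, the score-based selection recovers exactly those three and the classifier separates normal from faulty samples almost perfectly.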

Keywords: calibration model, monitoring, quality improvement, feature selection

Procedia PDF Downloads 320
17033 Geostatistical and Geochemical Study of the Aquifer System Waters Complex Terminal in the Valley of Oued Righ-Arid Area Algeria

Authors: Asma Bettahar, Imed Eddine Nezli, Sameh Habes

Abstract:

The groundwater resources of the Oued Righ valley, like those of the rest of the eastern basin of the Algerian Sahara, are held in two superposed major aquifers: the Intercalary Continental (IC) and the Terminal Complex (TC). From a qualitative point of view, various studies have highlighted that the waters of this region show excessive mineralization, including the waters of the Terminal Complex (average EC of 5854.61 µS/cm). The present article is a statistical approach based on two complementary multivariate methods, principal component analysis (PCA) and hierarchical cluster analysis (HCA), applied to the analytical data of the multilayered Terminal Complex aquifer waters of the Oued Righ valley. The approach is to establish a correlation between the chemical composition of the water and the lithological nature of the formations of the different aquifer levels, and to predict possible connections between groundwater layers. The results show that the mineralization of the water is of geological origin, related to the composition of the layers that make up the Terminal Complex.
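
The two multivariate methods named in the abstract (PCA and hierarchical clustering) can be sketched on a made-up hydrochemical table. The sample values and the two-group structure below are illustrative assumptions, not the Oued Righ data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# 20 water samples x 5 ion concentrations (e.g. Na+, Cl-, Ca2+, SO4 2-, HCO3-),
# drawn from two hypothetical water groups of differing mineralization
group1 = rng.normal([50, 60, 20, 30, 10], 5, (10, 5))
group2 = rng.normal([150, 180, 60, 90, 15], 5, (10, 5))
X = np.vstack([group1, group2])

# PCA by SVD of the standardized data matrix
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()            # variance ratio per principal component

# Agglomerative hierarchical clustering (Ward linkage), cut into two clusters
labels = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
```

On data this well separated, the first component dominates the variance and the dendrogram cut recovers the two water groups exactly.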

Keywords: complex terminal, mineralization, oued righ, statistical approach

Procedia PDF Downloads 349
17032 Improving Road Infrastructure Safety Management Through Statistical Analysis of Road Accident Data. Case Study: Streets in Bucharest

Authors: Dimitriu Corneliu-Ioan, Gheorghe Frațilă

Abstract:

Romania has one of the highest road death rates among European Union Member States, and there is concern that the country will not meet its goal of "zero deaths" by 2050; the European Union also aims to halve the number of people seriously injured in road accidents by 2030. There is therefore a need to improve road infrastructure safety management in Romania. The aim of this study is to analyze road accident data through statistical methods in order to assess the current state of road infrastructure safety in Bucharest, to identify trends, and to make forecasts regarding serious road accidents and their consequences. The objective is to provide insights that can help prioritize measures to increase road safety, particularly in urban areas, where serious accidents and their consequences are more frequent. Databases from the Traffic Police and the Romanian Road Authority are processed in Excel using exploratory analysis and descriptive statistics, and road risks are compared with the main causes of road accidents to identify correlations. The findings show a correlation between the measures ordered by road safety auditors and the main causes of serious accidents in Bucharest. The study also reveals the significant social costs of road accidents, amounting to approximately 3% of GDP, which emphasizes the need for local and central administrations to cooperate in joint financial efforts and to allocate resources for road safety efficiently. The research further emphasizes the need for better quality and more diverse collection of road accident data for effective analysis in road infrastructure engineering, and for statistical data processing methods to substantiate managerial decisions in road infrastructure management. Overall, this research contributes to a clearer understanding of the current road infrastructure safety situation in Romania and offers insights for identifying appropriate solutions to reduce the number of serious road accidents in the future.

Keywords: road death rate, strategic objective, serious road accidents, road safety, statistical analysis

Procedia PDF Downloads 28
17031 Presenting a Model in the Analysis of Supply Chain Management Components by Using Statistical Distribution Functions

Authors: Ramin Rostamkhani, Thurasamy Ramayah

Abstract:

One of the most important topics for today's industrial organizations is the challenging issue of supply chain management, a field in which scientists and researchers have published numerous practical articles and models, especially in the last decade. In this research, to the best of our knowledge for the first time, the modeling of supply chain management component data using well-known statistical distribution functions is considered; showing the behavior of supply chain data through the characteristics of statistical distribution functions is innovative research that has not been published before. In an analytical process, describing different aspects of these functions, including the probability density, cumulative distribution, reliability, and failure functions, leads to the suitable statistical distribution function for each component of supply chain management, which can then be applied to predict the future behavior of the relevant component. Providing a model that fits the best statistical distribution function to each supply chain management component would be a major advance in describing the behavior of supply chain elements in today's industrial organizations. As a final step, the results of the proposed model are demonstrated by computing process capability indices before and after its implementation, and the approach is verified through the relevant assessment. The introduced approach can save the time and cost required to achieve organizational goals and can increase added value in the organization.
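
A minimal sketch of the idea: fit candidate statistical distributions to a supply-chain variable, pick the best fit by the Kolmogorov-Smirnov statistic, and report a process capability index. The synthetic lead-time data, the candidate families, and the specification limits are illustrative assumptions, not the paper's data, and the capability index uses a simple normal-theory formula.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical supply-chain variable: supplier lead times (days)
lead_times = stats.weibull_min.rvs(2.0, scale=5.0, size=500, random_state=rng)

# Fit several candidate distribution functions and score each by its KS statistic
candidates = {"weibull": stats.weibull_min, "lognorm": stats.lognorm, "expon": stats.expon}
ks = {}
for name, dist in candidates.items():
    params = dist.fit(lead_times)
    ks[name] = stats.kstest(lead_times, dist.cdf, args=params).statistic
best = min(ks, key=ks.get)

# Process capability index Cp against assumed spec limits (normal approximation)
usl, lsl = 12.0, 0.0
cp = (usl - lsl) / (6 * lead_times.std())
```

The family with the smallest KS statistic is retained as the behavioral model of the component; here the (true) Weibull family fits far better than an exponential.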

Keywords: analyzing, process capability indices, statistical distribution functions, supply chain management components

Procedia PDF Downloads 55
17030 Catalytic Thermodynamics of Nanocluster Adsorbates from Informational Statistical Mechanics

Authors: Forrest Kaatz, Adhemar Bultheel

Abstract:

We use an informational statistical mechanics approach to study the catalytic thermodynamics of platinum and palladium cuboctahedral nanoclusters. Nanoclusters and their adatoms are viewed as chemical graphs with a nearest-neighbor adjacency matrix. We use the Morse potential to determine bond energies between cluster atoms in a coordination-type calculation. We use adsorbate energies calculated from density functional theory (DFT) to study the adatom effects on the thermodynamic quantities, which are derived from a Hamiltonian. Oxygen radical and molecular adsorbates are studied on platinum clusters and hydrogen on palladium clusters. We calculate the entropy, free energy, and total energy as the coverage of adsorbates increases on bridge and hollow sites on the surface. Thermodynamic behavior versus adatom coverage is related to the structural distribution of adatoms on the nanocluster surfaces. The thermodynamic functions are characterized using a simple adsorption model, with linear trends as the coverage of adatoms increases. The data exhibit size effects for the measured thermodynamic properties with cluster diameters between 2 and 5 nm. Entropy and enthalpy calculations of Pt-O2 compare well with previous theoretical data for Pt(111)-O2, and our Pd-H results show similar trends as experimental measurements for Pd-H2 nanoclusters. Our methods are general and may be applied to a wide variety of nanocluster adsorbate systems.
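
The bond-energy step described above, a Morse potential evaluated over a nearest-neighbor adjacency matrix, can be sketched as follows. The parameter values (D_e, a, r_e) and the toy 4-atom cluster are illustrative assumptions, not the Pt/Pd values used in the paper.

```python
import numpy as np

def morse(r, d_e=0.7, a=1.8, r_e=2.7):
    """Morse pair energy (eV): well depth -d_e at r_e, approaching 0 at dissociation."""
    return d_e * (1.0 - np.exp(-a * (r - r_e))) ** 2 - d_e

# Toy tetrahedral cluster: every atom is every other atom's nearest neighbor
adjacency = np.ones((4, 4)) - np.eye(4)
r = 2.7                                   # all bonds at the equilibrium length (angstroms)

# Total bond energy: sum the pair energy over unique adjacent pairs
total_energy = 0.5 * (adjacency * morse(r)).sum()
```

With six bonds at equilibrium, the total is simply six times the well depth; off-equilibrium distances would raise the energy toward zero.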

Keywords: catalytic thermodynamics, palladium nanocluster adsorbates, platinum nanocluster adsorbates, statistical mechanics

Procedia PDF Downloads 114
17029 Optimization of Lead Bioremediation by Marine Halomonas sp. ES015 Using Statistical Experimental Methods

Authors: Aliaa M. El-Borai, Ehab A. Beltagy, Eman E. Gadallah, Samy A. ElAssar

Abstract:

Bioremediation technology is now used for treatment instead of traditional metal removal methods. A strain isolated from Marsa Alam, Red Sea, Egypt, showed high resistance to high lead concentrations and was identified by 16S rRNA gene sequencing as Halomonas sp. ES015. Medium optimization was carried out using a Plackett-Burman design, and the most significant factors were yeast extract, casamino acid, and inoculum size. The medium optimized by the statistical design raised the removal efficiency from 84% to 99% at an initial lead concentration of 250 ppm. Moreover, a Box-Behnken experimental design was applied to study the relationship between yeast extract concentration, casamino acid concentration, and inoculum size; the optimized medium increased the removal efficiency to 97% at an initial lead concentration of 500 ppm. Halomonas sp. ES015 cells immobilized on sponge cubes, using the optimized medium in a loop bioremediation column, showed relatively constant lead removal efficiency when reused for six successive cycles, and the metal removal efficiency was not affected by changes in flow rate. Finally, the results of this research point to the possibility of lead bioremediation by free or immobilized cells of Halomonas sp. ES015, in batch cultures or in semicontinuous cultures using column technology.
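
The Box-Behnken layout used for the three significant factors (yeast extract, casamino acid, inoculum size) can be generated in coded units as below. This is a generic sketch of the design's structure; the mapping of the coded levels to real concentrations is an assumption left to the experimenter.

```python
import itertools
import numpy as np

def box_behnken(k, center_points=3):
    """Box-Behnken design in coded units (-1, 0, +1) for k factors."""
    runs = []
    # For each pair of factors, run the 2x2 factorial at +/-1 with all others at 0
    for i, j in itertools.combinations(range(k), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k] * center_points      # replicated center runs
    return np.array(runs)

# 3 factors -> 12 edge-midpoint runs + 3 center runs = 15 experiments
design = box_behnken(3)
```

Each non-center run varies exactly two factors, which is what lets the design estimate quadratic response-surface terms with relatively few experiments.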

Keywords: bioremediation, lead, Box–Behnken, Halomonas sp. ES015, loop bioremediation, Plackett-Burman

Procedia PDF Downloads 153
17028 A Review on Water Models of Surface Water Environment

Authors: Shahbaz G. Hassan

Abstract:

Water quality models are very important for predicting changes in surface water quality for environmental management. The aim of this paper is to give an overview of water quality models and to provide directions for selecting a model in a specific situation. Water quality models include models based on a mechanistic approach, while other models simulate water quality without considering a mechanism. Mechanistic models can be widely applied and are capable of long-term simulation, but with high complexity; therefore, more space is devoted to explaining the principles of and application experience with mechanistic models. Mechanistic models make certain assumptions about rivers, lakes, and estuaries that limit the range of application of each model, and this paper introduces the principles and applications of water quality models for these three scenarios. Empirical models, on the other hand, are easier to compute and are not limited by geographical conditions, but they cannot be used with confidence to simulate long-term changes. This paper divides the empirical models into two broad categories according to their mathematical algorithms: models based on artificial intelligence and models based on statistical methods.

Keywords: empirical models, mathematical, statistical, water quality

Procedia PDF Downloads 218
17027 Evaluation of Diagnosis Performance Based on Pairwise Model Construction and Filtered Data

Authors: Hyun-Woo Cho

Abstract:

It is quite important to utilize timely and intelligent production monitoring and diagnosis of industrial processes with respect to quality and safety issues. Compared with the monitoring task, fault diagnosis is the task of finding the process variables responsible for a specific fault in the process, which can help process operators investigate and eliminate root causes more effectively and efficiently. This work focused on combining a nonlinear statistical technique with a preprocessing method in order to implement practical real-time fault identification schemes for data-rich cases. To compare its performance with existing identification schemes, a case study on a benchmark process was performed under several scenarios. The results showed that the proposed fault identification scheme produced more reliable diagnosis results than linear methods. In addition, the use of the filtering step improved the identification results for complicated processes with massive data sets.

Keywords: diagnosis, filtering, nonlinear statistical techniques, process monitoring

Procedia PDF Downloads 203
17026 Stochastic Optimization of a Vendor-Managed Inventory Problem in a Two-Echelon Supply Chain

Authors: Bita Payami-Shabestari, Dariush Eslami

Abstract:

The purpose of this paper is to develop a multi-product economic production quantity model under a vendor-managed inventory policy with restrictions including limited warehouse space, budget, number of orders, average shortage time, and maximum permissible shortage. Since the costs cannot be predicted with certainty, the data are assumed to behave under an uncertain environment. The problem is first formulated as a bi-objective multi-product economic production quantity model and then solved with three multi-objective decision-making (MODM) methods. The three methods are compared on the optimal values of the two objective functions and on central processing unit (CPU) time, using statistical analysis and multi-attribute decision-making (MADM). The results demonstrate that the augmented epsilon-constraint method performs better than global criteria and goal programming in terms of the optimal values of the two objective functions and CPU time. A sensitivity analysis is performed to illustrate the effect of parameter variations on the optimal solution. The contribution of this research is the use of random cost data in developing a multi-product economic production quantity model under a vendor-managed inventory policy with several constraints.
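
The model class above builds on the classical economic production quantity (EPQ) lot-size formula. The sketch below shows only that deterministic core; the demand, setup cost, holding cost, and production rate are illustrative numbers, not the paper's data, and the stochastic-cost, multi-product, and constraint extensions are omitted.

```python
import math

def epq(demand, setup_cost, holding_cost, production_rate):
    """Optimal lot size Q* = sqrt(2DK / (h(1 - D/P))) for the basic EPQ model (no shortages)."""
    rho = demand / production_rate          # utilization, must be < 1
    return math.sqrt(2 * demand * setup_cost / (holding_cost * (1 - rho)))

# Hypothetical single product: D=1000 units/yr, K=$200/setup, h=$4/unit/yr, P=5000 units/yr
q_star = epq(demand=1000, setup_cost=200, holding_cost=4, production_rate=5000)
```

When costs are random, as in the abstract, each cost parameter would be replaced by a random variable and the bi-objective model optimized over realizations, which is where the MODM methods come in.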

Keywords: economic production quantity, random cost, supply chain management, vendor-managed inventory

Procedia PDF Downloads 88
17025 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution

Authors: Masomeh Jamshid Nejad

Abstract:

Nowadays, statistical literacy is no longer merely a desirable skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and a deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its applications is the normal distribution, often called the bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students; undergraduate students in economics and business management need to visualize and work with data following a normal distribution. Since technology is now interwoven with education, it is important to teach statistics topics to undergraduate students in the context of tools such as Python, RStudio, and Microsoft Excel. This research endeavours to shed light on the effect of Excel-based instruction on learners' knowledge of statistics, specifically the central concept of the normal distribution. Two groups of undergraduate students from the Business Management program were compared in this study: one group underwent Excel-based instruction, while the other relied only on traditional teaching methods. We analyzed experimental data and BBA participants' responses to statistics questions focusing on the normal distribution and its key attributes, the mean and standard deviation. The results of our study indicate that exposure to Excel-based learning supports learners in comprehending statistical concepts more effectively than traditional teaching alone. In addition, students taught with Excel-based instruction showed a stronger ability to visualize and interpret data following a normal distribution.
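
The spreadsheet computations at the heart of such instruction (Excel's NORM.DIST and NORM.INV) have direct scipy equivalents. The sketch below reproduces the classic 68-95 picture of the bell curve; the mean and standard deviation are an illustrative assumption, not values from the study.

```python
from scipy.stats import norm

mu, sigma = 100.0, 15.0                     # hypothetical normally distributed scores

# Probability mass within 1 and 2 standard deviations of the mean
# Excel: =NORM.DIST(mu+sigma, mu, sigma, TRUE) - NORM.DIST(mu-sigma, mu, sigma, TRUE)
within_1sd = norm.cdf(mu + sigma, mu, sigma) - norm.cdf(mu - sigma, mu, sigma)
within_2sd = norm.cdf(mu + 2 * sigma, mu, sigma) - norm.cdf(mu - 2 * sigma, mu, sigma)

# 97.5th percentile; Excel: =NORM.INV(0.975, 100, 15)
p975 = norm.ppf(0.975, mu, sigma)
```

Seeing that about 68% and 95% of values fall within one and two standard deviations, respectively, is exactly the kind of concrete check students can replicate in either tool.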

Keywords: statistics, excel-based instruction, data visualization, pedagogy

Procedia PDF Downloads 21
17024 Next Generation Radiation Risk Assessment and Prediction Tools Generation Applying AI-Machine (Deep) Learning Algorithms

Authors: Selim M. Khan

Abstract:

Indoor air quality is strongly influenced by the presence of radioactive radon (222Rn) gas. Exposure to high 222Rn concentrations is unequivocally linked to DNA damage and lung cancer and is a worsening issue in North American and European built environments, having increased over time within newer housing stocks as a function of as yet unclear variables. Indoor air radon concentration can be influenced by a wide range of environmental, structural, and behavioral factors. As some of these factors are quantitative while others are qualitative, no single statistical model can determine indoor radon levels precisely while simultaneously considering all these variables across a complex and highly diverse dataset. The ability of AI machine (deep) learning to simultaneously analyze multiple quantitative and qualitative features makes it suitable for predicting radon with a high degree of precision. Using Canadian and Swedish long-term indoor air radon exposure data, we use artificial deep neural network models with random weights and polynomial statistical models in MATLAB to assess and predict radon health risk to humans as a function of geospatial, human behavioral, and built environmental metrics. Our initial artificial neural network with random weights, run with sigmoid activation, tested different combinations of variables and showed the highest prediction accuracy (>96%) within a reasonable number of iterations. Here, we present details of these emerging methods and discuss their strengths and weaknesses compared to the traditional artificial neural network and statistical methods commonly used to predict indoor air quality in different countries. We propose the artificial deep neural network with random weights as a highly effective method for assessing and predicting indoor radon.
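
The core of a "neural network with random weights" is that the hidden layer is drawn at random and frozen, and only the output weights are solved, typically by least squares. The toy regression target below is an assumption for illustration; the paper's radon predictors and MATLAB implementation are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(x).ravel()                       # hypothetical smooth target

hidden = 50
W = rng.normal(0, 2, (1, hidden))           # random input weights, never trained
b = rng.normal(0, 2, hidden)                # random biases, never trained
H = 1.0 / (1.0 + np.exp(-(x @ W + b)))      # sigmoid-activated hidden features

# Only the output weights are fitted, via ordinary least squares
beta, *_ = np.linalg.lstsq(H, y, rcond=None)
mse = np.mean((H @ beta - y) ** 2)
```

Because the only "training" is a linear solve, such models fit in one shot rather than by iterative backpropagation, which is part of their appeal for tabular risk-prediction data.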

Keywords: radon, radiation protection, lung cancer, AI machine deep learning, risk assessment, risk prediction, Europe, North America

Procedia PDF Downloads 60
17023 Global Developmental Delay and Its Association with Risk Factors: Validation by Structural Equation Modelling

Authors: Bavneet Kaur Sidhu, Manoj Tiwari

Abstract:

Global developmental delay (GDD) is a common pediatric condition, but its etiologies might differ in developing countries, and sporadic families have been reported in various countries over the last decade. To the authors' best knowledge, many risk factors and their correlation with the prevalence of GDD have been studied, but their statistical correlation has not been established; we therefore propose the present study targeting the risk factors and prevalence of GDD and their statistical correlation. The FMR1 gene was studied to confirm the disease and its penetrance. A complete questionnaire-based instrument was designed for the statistical analyses, covering personal, past, and present medical history along with socio-economic status. Methods: We distributed the children into four age groups at 5-year intervals and applied structural equation modeling (SEM) techniques, Spearman's rank correlation coefficient, the Karl Pearson correlation coefficient, and the chi-square test. Results: A total of 1100 families were enrolled in this study; among them, 330 were clinically and biologically confirmed for the disease (by radiological studies), of whom 204 were males (61.81%) and 126 were females (38.18%). We found that 27.87% of cases were genetic and 72.12% were sporadic; of the sporadic cases, 43.27% were from urban and 56.72% from rural localities. The mothers' literacy rate was 32.12%, and 41.21% of the mothers were working. Conclusions: There is a significant association between mother's age and GDD prevalence, followed by mother's literacy rate and mother's occupation, whereas there was no association between father's age and GDD.
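
Two of the association tests named in the abstract, Spearman's rank correlation and the chi-square test, can be sketched as below. The numbers are invented for illustration (only the 2x2 table's total of 330 echoes the study's confirmed cases); they are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical: maternal age group midpoints vs an assumed GDD rate per 1000
mother_age = np.array([22, 25, 27, 30, 33, 36, 39, 42])
gdd_rate = np.array([2.0, 2.4, 2.9, 3.1, 3.8, 4.0, 4.6, 5.1])
rho, p_rho = stats.spearmanr(mother_age, gdd_rate)

# Chi-square of independence on an assumed 2x2 table: locality vs case type
table = np.array([[103, 135],               # urban: genetic / sporadic (invented counts)
                  [40, 52]])                # rural: genetic / sporadic (invented counts)
chi2, p_chi, dof, expected = stats.chi2_contingency(table)
```

A strictly monotonic relationship gives a Spearman rho of exactly 1, and a 2x2 table has one degree of freedom, which is what the test statistic is referred to.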

Keywords: global developmental delay, FMR1 gene, Spearman's rank correlation coefficient, structural equation modeling

Procedia PDF Downloads 95
17022 Defining of the Shape of the Spine Using Moiré Method in Case of Patients with Scheuermann Disease

Authors: Petra Balla, Gabor Manhertz, Akos Antal

Abstract:

Nowadays, spinal deformities are very frequent problems among teenagers. Scheuermann disease is a one-dimensional deformity of the spine, yet it has a prevalence of over 11% among children. We used a traditional technique, the moiré method, for screening and diagnosing this type of spinal deformity. A LabVIEW program has been developed to evaluate the moiré pictures of patients with Scheuermann disease. Two different solutions were tested in this computer program: the extreme point and the inflexion point calculation methods. The two methods were compared, and according to the results, both solutions are appropriate. Statistical results showed better efficiency for the extreme point search method, where the average difference was only 6.09°.
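
The two curve-evaluation strategies compared above, locating extreme points (first-derivative zero crossings) and inflexion points (second-derivative zero crossings), can be sketched on a known profile. The Gaussian test curve is an assumption standing in for a moiré-derived spine contour, not the authors' LabVIEW pipeline.

```python
import numpy as np

x = np.linspace(-3, 3, 601)
y = np.exp(-x**2 / 2)                       # known: maximum at 0, inflexions at +/-1

dy = np.gradient(y, x)                      # numerical first derivative
d2y = np.gradient(dy, x)                    # numerical second derivative

# Extreme points: where the first derivative changes sign
extrema = x[np.where(np.diff(np.sign(dy)) != 0)[0]]
# Inflexion points: where the second derivative changes sign
inflexions = x[np.where(np.diff(np.sign(d2y)) != 0)[0]]
```

On this profile the extremum is recovered at x ≈ 0 and the inflexions at x ≈ ±1, matching the analytic answer to within the grid spacing.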

Keywords: spinal deformity, picture evaluation, Moiré method, Scheuermann disease, curve detection, Moiré topography

Procedia PDF Downloads 319
17021 Effectiveness of Technology Enhanced Learning in Orthodontic Teaching

Authors: Mohammed Shaath

Abstract:

Aims: Technological advancements in teaching and learning have made significant improvements over the past decade and have been incorporated in institutions to aid the learner's experience. This review aims to assess whether Technology Enhanced Learning (TEL) pedagogy is more effective at improving students' attitudes and knowledge retention in orthodontic training than traditional methods. Methodology: The searches comprised systematic reviews (SRs) comparing TEL and traditional teaching methods in the following databases: PubMed, SCOPUS, Medline, and Embase. One researcher performed the screening, data extraction, and analysis and assessed the risk of bias and quality using A Measurement Tool to Assess Systematic Reviews 2 (AMSTAR-2). Kirkpatrick's 4-level evaluation model was used to evaluate the educational value. Results: A total of 34 SRs was identified; after the removal of duplicates and irrelevant SRs, 4 fit the inclusion criteria. On Level 1, students responded positively to TEL methods, although the harder a platform was to use, the less favourably it was received; nonetheless, students still showed high levels of acceptability. Level 2 showed no significant overall advantage of TEL methods in knowledge gained, although one SR showed that certain aspects of study within orthodontics deliver a statistically significant improvement with TEL. Level 3 was the least reported on; results suggested that, without time restrictions, TEL methods may be advantageous. Level 4 shows that both methods are equally effective, but TEL has the potential to overtake traditional methods in the future as a form of active, student-centred approach. Conclusion: TEL has a high level of acceptability and the potential to improve learning in orthodontics. Current reviews could be improved, but the biggest aspect that needs to be addressed is the primary studies, which show a lower level of evidence and heterogeneity in their results. As it stands, the replacement of traditional methods with TEL cannot be fully supported in an evidence-based manner. The potential of TEL methods has been recognized, and there is already some evidence that TEL can be more effective in some aspects of learning, catering to a more technology-savvy generation.

Keywords: TEL, orthodontic, teaching, traditional

Procedia PDF Downloads 12
17020 A Statistical Analysis on Relationship between Temperature Variations with Latitude and Altitude regarding Total Amount of Atmospheric Carbon Dioxide in Iran

Authors: Masoumeh Moghbel

Abstract:

Nowadays, the carbon dioxide produced by human activities is considered the main factor in the occurrence of global warming. Given the role of CO2 and its ability to trap heat, the main objective of this research is to study the effect of atmospheric CO2 (as recorded at Mauna Loa) on variations of temperature parameters (daily mean, minimum, and maximum temperature) at 5 meteorological stations in Iran, selected according to latitude and altitude, over a 40-year statistical period. First, the trend of the temperature parameters was studied by regression and the non-graphical Mann-Kendall method. Then, the relation between temperature variations and CO2 was studied by correlation techniques, and the impact of the CO2 amount on temperature at different atmospheric levels (850 and 500 hPa) was analyzed. The results illustrate that the correlation coefficient between temperature variations and CO2 at low latitudes and high altitudes is more significant than in other regions. It is important to note that altitude, as one of the main geographic factors, is limited in how it affects temperature variations, so that the correlation coefficient between these two parameters at 850 hPa (r = 0.86) is more significant than at 500 hPa (r = 0.62).
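
The two tools named above, the non-graphical Mann-Kendall trend statistic and the correlation of temperature with CO2, can be sketched on short invented series; these numbers are for illustration only, not the 40-year station records.

```python
import numpy as np

def mann_kendall_s(series):
    """Mann-Kendall S statistic: sum of sign(x_j - x_i) over all pairs j > i.
    Positive S indicates an increasing trend."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            s += np.sign(series[j] - series[i])
    return int(s)

# Hypothetical annual mean temperature (degC) and Mauna Loa CO2 (ppm) series
temp = np.array([14.1, 14.3, 14.2, 14.6, 14.8, 14.7, 15.0, 15.2])
co2 = np.array([339.0, 344.0, 349.0, 354.0, 359.0, 363.0, 368.0, 372.0])

s_stat = mann_kendall_s(temp)               # trend test for the temperature series
r = np.corrcoef(co2, temp)[0, 1]            # Pearson correlation with CO2
```

For a full test, S would be compared against its null variance to get a significance level; here the strongly positive S and a correlation near 1 both reflect the built-in upward trend.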

Keywords: altitude, atmospheric carbon dioxide, latitude, temperature variations

Procedia PDF Downloads 368
17019 Smokeless Tobacco Oral Manifestation and Inflammatory Biomarkers in Saliva

Authors: Sintija Miļuna, Ričards Melderis, Loreta Briuka, Dagnija Rostoka, Ingus Skadiņš, Juta Kroiča

Abstract:

Objectives: Smokeless tobacco products in Latvia have become more available and more popular among young adults, especially students and athletes such as hockey and floorball players. The aim of the research was to detect visual mucosal changes in the oral cavity of smokeless tobacco users and to evaluate pro-inflammatory and anti-inflammatory cytokine (IL-6, IL-1, IL-8, TNF alpha) levels in their saliva. Methods: A smokeless tobacco group (n=10) and a control group of non-tobacco users (n=10) were examined intraorally for oral lesions, and 5 ml of saliva were collected. Saliva was analysed for IL-6, IL-1, IL-8, and TNF alpha using ELISA (Sigma-Aldrich). For statistical analysis, IBM SPSS Statistics 27 was used (Mann-Whitney U test, Spearman's rank correlation coefficient). This research was approved by the Ethics Committee of Rīga Stradiņš University (No. 22/28.01.2016) and has been developed with financing from the European Social Fund and the Latvian state budget within project no. 8.2.2.0/20/I/004, "Support for involving doctoral students in scientific research and studies", at Rīga Stradiņš University. Results: IL-1, IL-6, IL-8, and TNF alpha levels were higher in the smokeless tobacco group (IL-1: 83.34 pg/ml vs. 74.26 pg/ml; IL-6: 195.10 pg/ml vs. 6.16 pg/ml; IL-8: 736.34 pg/ml vs. 285.26 pg/ml; TNF alpha: 489.27 pg/ml vs. 200.9 pg/ml), but the differences between the control group and the smokeless tobacco group were not statistically significant (IL-1 p=0.190, IL-6 p=0.052, IL-8 p=0.165, TNF alpha p=0.089). There were statistically significant correlations between IL-1 and IL-6 (p=0.023), IL-6 and TNF alpha (p=0.028), and IL-8 and IL-6 (p=0.005). Conclusions: White localized lesions were detected at the sites where smokeless tobacco users placed the sachets. There are statistically significant correlations between IL-6 and IL-1, IL-6 and TNF alpha, and IL-8 and IL-6 levels in saliva, but no significant differences in inflammatory cytokine levels between the control group and the smokeless tobacco group.
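
The group comparison reported above uses the Mann-Whitney U test; the mechanics can be sketched as below. The per-subject cytokine values are invented (and deliberately well separated to show a significant result), so unlike the study itself, which found no significant group difference, this toy comparison comes out significant.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical IL-6 values (pg/ml), 10 users vs 10 controls; invented numbers
il6_users = np.array([180.2, 210.5, 150.3, 220.8, 201.7, 175.0, 190.4, 230.1, 160.9, 185.6])
il6_controls = np.array([5.1, 7.3, 4.8, 6.9, 5.5, 8.2, 6.1, 5.9, 7.7, 4.2])

# Nonparametric two-sided comparison of the two independent groups
u_stat, p_value = mannwhitneyu(il6_users, il6_controls, alternative="two-sided")
```

When every value in one group exceeds every value in the other, U reaches its maximum (n1 x n2 = 100 here) and the p-value is far below 0.05.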

Keywords: smokeless tobacco, Snus, inflammatory biomarkers, oral lesions, oral pathology

Procedia PDF Downloads 102
17018 Reliability and Validity of Determining Ventilatory Threshold and Respiratory Compensation Point by Near-Infrared Spectroscopy

Authors: Tso-Yen Mao, De-Yen Liu, Chun-Feng Huang

Abstract:

Purpose: This research investigates the reliability and validity of the ventilatory threshold (VT) and respiratory compensation point (RCP) determined from skeletal muscle hemodynamic status. Methods: One hundred healthy males (age: 22±3 yrs; height: 173.1±6.0 cm; weight: 67.1±10.5 kg) performed a graded cycling exercise test during which ventilatory and skeletal muscle hemodynamic data were collected simultaneously. VT and RCP were determined by combined V-slope (VE vs. VCO2) and ventilatory efficiency (VE/VO2 vs. VE/VCO2) methods. Pearson correlation, paired t-tests, and Bland-Altman plots were used to analyze reliability, validity, and agreement. Statistical significance was set at α = .05. Results: There were high test-retest correlations of VT and RCP for both the ventilatory and near-infrared spectroscopy (NIRS) methods (VT vs. VTNIRS: 0.95 vs. 0.94; RCP vs. RCPNIRS: 0.93 vs. 0.93, p < .05). There was a high coefficient of determination between VT and the first time point at which O2Hb decreased (R² = 0.88, p < .05), and between RCP and the second time point at which O2Hb declined (R² = 0.89, p < .05). VO2 at VT and RCP did not differ significantly between the ventilatory and NIRS methods (p > .05). Conclusion: Using the NIRS method to determine VT and RCP is reliable and valid in male individuals during graded exercise. Non-invasive monitoring of skeletal muscle hemodynamics can also be used to control training intensity in the future.
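
Bland-Altman agreement between two measurement methods is computed from the paired differences: the bias is the mean difference and the limits of agreement are bias ± 1.96 SD. A minimal sketch with made-up VO2 pairs (not the study's data):

```python
import numpy as np

def bland_altman(a, b):
    """Return (bias, lower limit, upper limit) of agreement for paired data."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical VO2 at VT (L/min) from the ventilatory and NIRS methods
vent = [2.10, 2.35, 1.98, 2.50, 2.20, 2.41, 2.05, 2.30]
nirs = [2.05, 2.40, 2.00, 2.45, 2.25, 2.38, 2.10, 2.28]
bias, lo, hi = bland_altman(vent, nirs)
print(round(bias, 3), round(lo, 3), round(hi, 3))
```

In the plot form used in the paper, each pair's mean is plotted against its difference, with horizontal lines at the bias and the two limits.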

Keywords: anaerobic threshold, exercise intensity, hemodynamic, NIRS

Procedia PDF Downloads 276
17017 A Survey of Response Generation of Dialogue Systems

Authors: Yifan Fan, Xudong Luo, Pingping Lin

Abstract:

An essential task in the field of artificial intelligence is to allow computers to interact with people through natural language. Therefore, research on virtual assistants and dialogue systems has received widespread attention from industry and academia. Response generation plays a crucial role in dialogue systems, so to push forward research on this topic, this paper surveys various methods for response generation. We sort these methods into three categories. The first includes finite state machine methods, framework methods, and instance methods. The second contains full-text indexing methods, ontology methods, vast knowledge base methods, and some others. The third covers retrieval methods and generative methods. We also discuss some hybrid methods based on knowledge and deep learning. We compare their advantages and disadvantages and point out in which ways these studies can be improved further. Our discussion covers studies published in leading conferences such as IJCAI and AAAI in recent years.

Keywords: deep learning, generative, knowledge, response generation, retrieval

Procedia PDF Downloads 92
17016 Improvement of the Q-System Using the Rock Engineering System: A Case Study of Water Conveyor Tunnel of Azad Dam

Authors: Sahand Golmohammadi, Sana Hosseini Shirazi

Abstract:

Because the status and mechanical parameters of discontinuities in the rock mass are included in the calculations, various rock engineering classification methods are often used as a starting point for the design of different types of structures. The Q-system is one of the most frequently used methods for stability analysis and determination of support systems for underground structures in rock, including tunnels. This method requires six main parameters of the rock mass: the rock quality designation (RQD), joint set number (Jn), joint roughness number (Jr), joint alteration number (Ja), joint water parameter (Jw), and stress reduction factor (SRF). In order to achieve a reasonable and optimal design, identifying the parameters that govern the stability of such structures is one of the most important goals and most necessary actions in rock engineering. It is therefore necessary to study the relationships between the parameters of a system, how they interact with each other and, ultimately, with the whole system. In this research, we have attempted to determine the most effective parameters (key parameters) among the six Q-system parameters using the rock engineering system (RES) method, in order to improve the relationships between the parameters in the calculation of the Q value. The RES is a method by which the degree of cause and effect of a system's parameters can be determined by constructing an interaction matrix. In this research, geomechanical data collected from the water conveyor tunnel of Azad Dam were used to build the interaction matrix of the Q-system. For this purpose, instead of the conventional coding methods, which are always accompanied by defects such as uncertainty, the Q-system interaction matrix was coded using a technique that is, in effect, a statistical analysis of the data: determining the correlation coefficients between the parameters.
In this way, the effect of each parameter on the system is evaluated with greater certainty. The results of this study show that the resulting interaction matrix provides a reasonable estimate of the effective parameters in the Q-system. Among the six Q-system parameters, SRF and Jr exert the maximum and minimum influence on the system (cause), respectively, while RQD and Jw are the most and least influenced by the system (effect), respectively. Therefore, by developing this method, a more accurate rock mass classification relation can be obtained by weighting the required parameters in the Q-system.
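
In the RES framework, each off-diagonal entry of the interaction matrix codes the influence of one parameter on another; a parameter's row sum is its "cause" (C), its column sum its "effect" (E), C+E its interactive intensity, and its weight is its share of the total intensity. The matrix values below are invented for illustration, not the coded values from the Azad Dam data:

```python
import numpy as np

params = ["RQD", "Jn", "Jr", "Ja", "Jw", "SRF"]

# Hypothetical 6x6 interaction matrix: M[i, j] = influence of parameter i on j
# (the diagonal is not coded and stays zero).
M = np.array([
    [0, 2, 1, 1, 0, 2],
    [3, 0, 1, 2, 1, 2],
    [1, 1, 0, 2, 1, 1],
    [2, 2, 2, 0, 1, 2],
    [1, 0, 1, 1, 0, 1],
    [3, 2, 2, 2, 2, 0],
], dtype=float)

cause = M.sum(axis=1)               # C: how strongly a parameter drives the system
effect = M.sum(axis=0)              # E: how strongly the system drives it
intensity = cause + effect          # C + E: interactive intensity
weight = 100 * intensity / intensity.sum()  # percentage weight per parameter

for name, c, e, w in zip(params, cause, effect, weight):
    print(f"{name:4s} C={c:4.1f} E={e:4.1f} weight={w:5.1f}%")
```

The paper's contribution is how these entries are coded: via correlation coefficients estimated from the tunnel data rather than expert judgement.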

Keywords: Q-system, rock engineering system, statistical analysis, rock mass, tunnel

Procedia PDF Downloads 28
17015 Review of Research on Effectiveness Evaluation of Technology Innovation Policy

Authors: Xue Wang, Li-Wei Fan

Abstract:

Technology innovation has become the driving force of social and economic development and transformation. The guidance and support of public policies are important conditions for achieving technology innovation goals, and policy effectiveness evaluation is instructive for policy learning and adjustment. This paper reviews existing studies that evaluate the effectiveness of policy-driven technological innovation, using 167 articles from the WOS and CNKI databases as samples, in order to clarify the measurement of technological innovation indicators and to analyze the classification and application of policy evaluation methods. In general, technology innovation input and technological output are the two main aspects of technological innovation index design. Patents are the focus of research: the number of patents reflects the scale of technological innovation, while patent quality reflects the value of innovation from multiple aspects. As for policy evaluation methods, statistical analysis methods are applied to the formulation, selection, and after-effect evaluation of policies, analyzing the effect of policy implementation both qualitatively and quantitatively. Bibliometric methods are mainly based on public policy texts, discerning inter-government relationships and the multi-dimensional value of the policies. Decision analysis focuses on establishing and measuring a comprehensive evaluation index system for public policy. Economic analysis methods focus on the performance and output of technological innovation to test policy effects. Finally, this paper puts forward prospects for future research directions.

Keywords: technology innovation, index, policy effectiveness, evaluation of policy, bibliometric analysis

Procedia PDF Downloads 29
17014 Comparison of Some Robust Regression Methods with the OLS Method, with an Application

Authors: Sizar Abed Mohammed, Zahraa Ghazi Sadeeq

Abstract:

The classical least squares (OLS) method is used to estimate linear regression parameters when its assumptions hold, and the resulting estimators then have good properties such as unbiasedness, minimum variance, and consistency. When the data are contaminated with outliers, alternative statistical techniques, known as robust (or resistant) methods, have been developed to estimate the parameters. In this paper, three robust methods are studied: the maximum likelihood type estimator (M-estimator), the modified maximum likelihood type estimator (MM-estimator), and the least trimmed squares estimator (LTS-estimator), and their results are compared with the OLS method. These methods were applied to real data taken from the Duhok company for manufacturing furniture, and the results were compared using the criteria of mean squared error (MSE), mean absolute percentage error (MAPE), and mean sum of absolute error (MSAE). The main conclusions of this study are as follows. In the furniture line, the outlying values detected by the four methods were few and close to the rest of the data, which indicates that the error distribution is close to normal; in the doors line, however, OLS detected fewer outlying values than the robust methods, which means that the error distribution departs markedly from normality. Another important conclusion is that, for the doors line, the parameter estimates obtained by OLS are very far from those obtained by the robust methods; the LTS-estimator gave the best results under the MSE criterion, the M-estimator under the MAPE criterion, and the MM-estimator under the MSAE criterion. The programs S-Plus (version 8.0, Professional 2007), Minitab (version 13.2) and SPSS (version 17) were used to analyze the data.
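
To illustrate why M-estimation resists outliers where OLS does not, here is a minimal, self-contained sketch of a Huber M-estimator fitted by iteratively reweighted least squares (IRLS) on synthetic data; it is not the authors' implementation, and the data are invented:

```python
import numpy as np

def huber_weights(r, c=1.345):
    """Huber weights: 1 for small standardized residuals, c/|r| beyond cutoff c."""
    a = np.abs(r)
    w = np.ones_like(a)
    w[a > c] = c / a[a > c]
    return w

def m_estimate(X, y, c=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimator via iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # start from OLS
    for _ in range(max_iter):
        r = y - X @ beta
        scale = np.median(np.abs(r - np.median(r))) / 0.6745  # MAD scale
        w = huber_weights(r / max(scale, 1e-12), c)
        Xw = X * w[:, None]
        beta_new = np.linalg.solve(X.T @ Xw, Xw.T @ y)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

rng = np.random.default_rng(0)
n = 60
x = rng.uniform(0, 10, n)
y = 2.0 + 3.0 * x + rng.normal(0, 0.5, n)   # true line: intercept 2, slope 3
idx = np.argsort(x)[-6:]                     # the six largest-x points
y[idx] -= 40.0                               # drag them far below the line
X = np.column_stack([np.ones(n), x])

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
beta_m = m_estimate(X, y)
print("OLS slope:", round(beta_ols[1], 3), "M slope:", round(beta_m[1], 3))
```

LTS and MM-estimation follow the same spirit with different objective functions; in practice one would usually reach for a library implementation (e.g. robust linear models in statsmodels or R's MASS) rather than hand-rolled IRLS.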

Keywords: robust regression, LTS-estimator, M-estimator, MSE

Procedia PDF Downloads 197
17013 Factors Influencing the Enjoyment and Performance of Students in Statistics Service Courses: A Mixed-Method Study

Authors: Wilma Coetzee

Abstract:

Statistics lecturers experience that many students taking a service course in statistics do not like statistics; students in these courses tend to struggle and do not perform well. This research takes the students' perspective, with the aim of determining how to change the teaching of statistics so that students will enjoy it more and perform better. Questionnaires were used to determine the perspectives of first-year service statistics students at a South African university. Factors addressed included motivation to study, attitude toward statistics, statistical anxiety, mathematical ability, and tendency to procrastinate. Logistic regression was used to determine what contributes to students performing badly in statistics. The results show that the factors contributing most to poor performance are statistical anxiety, lack of motivation, and having taken mathematical literacy instead of mathematics in secondary school. Two open-ended questions were included in the questionnaire: 'I will enjoy statistics more if…' and 'I will perform better in statistics if…'. The answers to these questions were analyzed using qualitative methods, and frequent themes were identified for each question. A simulation study incorporating bootstrapping was done to determine the saturation of the themes. The majority of the students indicated that they would perform better in statistics if they studied more, managed their time better, had a flair for mathematics, and if the lecturer explained difficult concepts better. To enjoy statistics more, they want an active learning experience: fun activities, more interaction with the lecturer and with one another, more computer-based problems, and more challenges. They also want a better understanding of the subject, to understand the relevance of statistics to their future career, and excellent lecturers.
These findings can be used to direct the improvement of the teaching of statistics.
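
Modelling a binary pass/fail outcome against predictors such as anxiety and motivation is a standard logistic regression task. A minimal sketch fitted by gradient descent on synthetic data (the predictors and coefficients are invented, not the study's):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Logistic regression by batch gradient descent; X includes an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))    # predicted probability of failing
        beta -= lr * X.T @ (p - y) / len(y)    # gradient of the negative log-likelihood
    return beta

rng = np.random.default_rng(1)
n = 500
anxiety = rng.normal(0, 1, n)       # standardized statistical-anxiety score
motivation = rng.normal(0, 1, n)    # standardized motivation score
logit = -0.5 + 1.5 * anxiety - 1.0 * motivation
fail = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

X = np.column_stack([np.ones(n), anxiety, motivation])
beta = fit_logistic(X, fail)
print(np.round(beta, 2))  # expect positive anxiety effect, negative motivation effect
```

The fitted coefficients recover the planted pattern: higher anxiety raises the odds of failing, higher motivation lowers them, which mirrors the direction of effects reported in the abstract.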

Keywords: active learning, performance in statistics, statistical anxiety, statistics education

Procedia PDF Downloads 113
17012 Meta-Review of Scholarly Publications on Biosensors: A Bibliometric Study

Authors: Nasrine Olson

Abstract:

With over 70,000 scholarly publications on the topic of biosensors, an overview of the field has become a challenge. To facilitate such an overview, over 700 expert reviews of publications on biosensors and related topics are currently available. This study focuses on these review papers in order to provide a meta-review of the area, offering a statistical analysis and overview of biosensor-related review papers. Comprehensive searches were conducted in the Web of Science and PubMed databases, and the resulting empirical material was analyzed using bibliometric methods and tools. The study finds that the biosensor-related review papers can be categorized into five related subgroups, broadly denoted by (i) properties of materials and particles, (ii) analysis and indicators, (iii) diagnostics, (iv) pollutants and analytical devices, and (v) treatment/application. For easy and clear access to the findings, visualizations of clusters and networks of connections are presented. The study includes a temporal dimension and identifies trends over the years, with an emphasis on the most recent developments. This paper provides useful insights for those who wish to form a better understanding of research trends in the area of biosensors.

Keywords: bibliometrics, biosensors, meta-review, statistical analysis, trends visualization

Procedia PDF Downloads 179
17011 A Study of Binding Methods and Techniques in Safavid Era Emphasizing on Iran Shahnamehs (16-18th Century AD/10-12th Century AH)

Authors: Ashrafosadat Mousavi Laer, Elaheh Moravej

Abstract:

The art of binding was simple and elementary at the beginning of Islam. This art thrived gradually and continued its development as an independent art. Identification of the binding techniques and the materials used in covers, together with investigation of the arrays, gives us indexes for better identification of the different doctrines and methods of that time. The catalogers of manuscripts usually pay attention to four items: the material, color, artistic refinement, and damage or exquisiteness of the cover. The criterion for classification of the covers is their artistic nature and material. The 15th century AD (9th century AH) was the period of the binding art's development, in which the most beautiful covers were produced by the so-called 'burning' method. In the 16th century AD (10th century AH), in the Safavid era, the art changed completely and a fundamental evolution occurred in the technique and method of binding. The greatest change in this art was the extensive use of stamps, made mostly of steel and copper, which were pressed against the leather; the resulting covers were called 'beat' covers. In this paper, the writing and bookbinding of about 32 Shahnamehs of the Safavid era available in Iranian libraries and museums are studied. An analytical-statistical study shows that four methods were used: beat, burning, mosaic, and oily. 69 percent of the covers of these copies are cardboard with a leather (goatskin) coating and were produced by the burning and beat methods. The reasons are that these two methods were common in the Safavid era and performing them was only feasible on leather; the most desirable and commonly used leather of that time was goatskin, which was the best option for the durability of the cover legend and the preservation of the book. In addition, it offered a suitable opportunity for the binding artist's creativity and innovation.

Keywords: Shahnameh, Safavid era, bookbinding, beat cover, burning cover

Procedia PDF Downloads 200
17010 Modern Trends in Foreign Direct Investments in Georgia

Authors: Rusudan Kinkladze, Guguli Kurashvili, Ketevan Chitaladze

Abstract:

Foreign direct investment is a driving force in the development of interdependent national economies, and the study and analysis of investments is an urgent problem, particularly for transitional economies such as Georgia. Consequently, the goal of the research is the study and analysis of foreign direct investments in Georgia and the identification and forecasting of modern trends; it covers the period 2006-2015. The study uses the methods of statistical observation, grouping, and analysis, the methods of analytical indicators of time series, and trend identification, and calculates predicted values; various literary and Internet sources relevant to the research were also used. The findings showed that modern investment policy in Georgia is favorable for domestic as well as foreign investors, although Georgia is still a net importer of investments. In 2015, the top 10 investing countries were led by Azerbaijan, the United Kingdom and the Netherlands, and the largest share of FDI was allocated to the transport and communication sector; the financial sector was second, followed by the health and social work sector. The same trend is expected to continue in the future.
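
Trend identification and forecasting from a short annual series is typically done by fitting a trend line and extrapolating it one step ahead. A minimal sketch with illustrative annual FDI figures (invented, not the Georgian data):

```python
import numpy as np

# Hypothetical annual FDI inflows (million USD), 2006-2015
years = np.arange(2006, 2016)
fdi = np.array([1190, 1750, 1564, 658, 814, 1117, 911, 941, 1758, 1564], float)

# Fit a linear trend fdi ≈ a * year + b and extrapolate one year ahead
a, b = np.polyfit(years, fdi, 1)
forecast_2016 = a * 2016 + b
print(f"trend slope: {a:.1f} per year, 2016 forecast: {forecast_2016:.0f}")
```

In practice one would also inspect the residuals and, for such a volatile series, compare the linear trend against moving-average or exponential smoothing forecasts before trusting the extrapolation.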

Keywords: foreign direct investments, methods, statistics, analysis

Procedia PDF Downloads 283
17009 Generating Product Description with Generative Pre-Trained Transformer 2

Authors: Minh-Thuan Nguyen, Phuong-Thai Nguyen, Van-Vinh Nguyen, Quang-Minh Nguyen

Abstract:

Research on automatically generating descriptions for e-commerce products has gained increasing attention in recent years. However, the descriptions generated by existing systems are often less informative and attractive because of a lack of training datasets or the limitations of their approaches, which often rely on templates or statistical methods. In this paper, we explore a method for generating product descriptions using the GPT-2 model. In addition, we apply text paraphrasing and task-adaptive pretraining techniques to improve the quality of the descriptions generated by the GPT-2 model. Experimental results show that our models outperform the baseline model in both automatic and human evaluation. In particular, our methods achieve promising results not only on the seen test set but also on the unseen test set.
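
Once a language model such as GPT-2 produces a logit vector over the vocabulary, the description text is produced token by token by a decoding step; temperature-scaled top-k sampling is a common choice. The sketch below shows only that decoding step, on made-up logits over a tiny vocabulary, independent of any particular model library:

```python
import numpy as np

def top_k_sample(logits, k=5, temperature=1.0, rng=None):
    """Sample a token id from the k highest-scoring logits after temperature scaling."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, float) / max(temperature, 1e-8)
    top = np.argsort(logits)[-k:]         # indices of the k best tokens
    z = logits[top] - logits[top].max()   # stabilize the softmax
    p = np.exp(z) / np.exp(z).sum()
    return int(rng.choice(top, p=p))

# Made-up logits over an 8-token vocabulary
logits = [0.1, 2.5, -1.0, 0.3, 4.0, 0.0, 1.2, -0.5]
rng = np.random.default_rng(42)
print(top_k_sample(logits, k=3, temperature=0.7, rng=rng))
print(top_k_sample(logits, k=1, rng=rng))  # k=1 reduces to greedy argmax -> 4
```

Generation repeats this step, appending each sampled token to the prompt until an end-of-text token appears; the paper's contributions (paraphrasing, task-adaptive pretraining) change what the model learns, not this decoding loop.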

Keywords: GPT-2, product description, transformer, task-adaptive, language model, pretraining

Procedia PDF Downloads 159
17008 Development of a Biomechanical Method for Ergonomic Evaluation: Comparison with Observational Methods

Authors: M. Zare, S. Biau, M. Corq, Y. Roquelaure

Abstract:

A wide variety of observational methods have been developed to evaluate ergonomic workloads in manufacturing; however, the precision and accuracy of these methods remain a subject of debate. The aims of this study were to develop biomechanical methods for evaluating ergonomic workloads and to compare them with observational methods. Two observational methods, the SCANIA Ergonomic Standard (SES) and Rapid Upper Limb Assessment (RULA), were used to assess ergonomic workloads at two simulated workstations. These included four tasks, such as tightening and loosening, attachment of tubes, and strapping, as well as other actions. Sensors (inclinometers, accelerometers, and goniometers) were also used to measure biomechanical data. Our findings showed that in the assessment of some risk factors, both RULA and SES agreed with the results of the biomechanical methods; however, there was disagreement on neck and wrist postures. In conclusion, the biomechanical approach was more precise than the observational methods, but some risk factors evaluated with the observational methods were not measurable with the biomechanical techniques developed.

Keywords: ergonomic, observational method, biomechanical methods, workload

Procedia PDF Downloads 344
17007 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

It can frequently be observed that data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of order 1 and order 2 (MQL1, MQL2) and penalized quasi-likelihood of order 1 and order 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset; therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v2.19) with varying numbers of clusters, cluster sizes, and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted using the four methods to a real-life dataset, and the performance of the test was compared for each model.

Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error

Procedia PDF Downloads 105
17006 A Sociological Study of Rural Women Attitudes toward Education, Health and Work outside Home in Beheira Governorate, Egypt

Authors: A. A. Betah

Abstract:

This research was performed to evaluate the attitudes of rural women towards education, health, and work outside the home. The study was based on a random sample of 147 rural women; Kafr-Rahmaniyah village was chosen for the study because its life expectancy at birth for females, female education, and percentage of females in the labor force were the highest in the district. The study data were collected from the rural female respondents using a face-to-face questionnaire. In addition, the study recorded several factors, such as age, main occupation, family size, monthly household income, geographic cosmopoliteness, and degree of social participation of the respondents. Using the Statistical Package for the Social Sciences (SPSS), the data were analyzed by non-parametric statistical methods. The main finding of this study was a significant relationship between each of these variables and each of the rural women's attitudes toward education, health, and work outside the home. The study concluded with some recommendations, the most important being to ensure attention to rural women's needs, requirements, and rights by raising their health awareness and education and increasing their contributions to their society.

Keywords: attitudes, education, health, rural women, work outside home

Procedia PDF Downloads 257