Search results for: statistical physics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4491

3591 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge which calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes and that the discrimination becomes more concise as the size of k-mers increases. The best-performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0 %, 80.5 %, 80.5 %, 63.6 %, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources and explainability of classification results. Overall, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
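
As a rough illustration of the k-mer representation described above (not the authors' pipeline), the following sketch counts overlapping k-mers in a set of sequences and fits a random forest classifier; the sequences, labels, and k-mer size are hypothetical placeholders.

```python
# Minimal sketch (not the authors' pipeline): k-mer counting + classification.
# Sequences, labels, and k are hypothetical placeholders.
from collections import Counter
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction import DictVectorizer

def kmer_counts(seq, k=10):
    """Count all overlapping k-mers in a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

# Hypothetical toy data: (sequence, phenotype label)
samples = [("ACGTACGTACGTAA", 0), ("TTGCATTGCATTGC", 1), ("ACGTTTGCACGTAA", 0)]

vec = DictVectorizer(sparse=True)
X = vec.fit_transform([kmer_counts(s, k=4) for s, _ in samples])  # small k for the toy data
y = [label for _, label in samples]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(X))
```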

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 159
3590 The Effect of Impinging WC-12Co Particles Temperature on Thickness of HVOF Thermally Sprayed Coatings

Authors: M. Jalali Azizpour

Abstract:

In this paper, the effect of WC-12Co particle temperature in the HVOF thermal spraying process on coating thickness has been studied. The statistical results show that the spray distance and the oxygen-to-fuel ratio are the most influential factors on particle characteristics and on the thickness of HVOF thermally sprayed coatings. A Spray Watch diagnostic system, scanning electron microscopy (SEM), X-ray diffraction and a thickness measuring system were used for this purpose.

Keywords: HVOF, temperature, thickness, velocity, WC-12Co

Procedia PDF Downloads 241
3589 Systematic Review of Quantitative Risk Assessment Tools and Their Effect on Racial Disproportionality in Child Welfare Systems

Authors: Bronwen Wade

Abstract:

Over the last half-century, child welfare systems have increasingly relied on quantitative risk assessment tools, such as actuarial or predictive risk tools. These tools are developed by performing statistical analysis of how attributes captured in administrative data are related to future child maltreatment. Some scholars argue that attributes in administrative data can serve as proxies for race and that quantitative risk assessment tools reify racial bias in decision-making. Others argue that these tools provide more “objective” and “scientific” guides for decision-making instead of subjective social worker judgment. This study performs a systematic review of the literature on the impact of quantitative risk assessment tools on racial disproportionality; it examines methodological biases in work on this topic, summarizes key findings, and provides suggestions for further work. A search of CINAHL, PsychInfo, Proquest Social Science Premium Collection, and the ProQuest Dissertations and Theses Collection was performed. Academic and grey literature were included. The review includes studies that use quasi-experimental methods and development, validation, or re-validation studies of quantitative risk assessment tools. PROBAST (Prediction model Risk of Bias Assessment Tool) and CHARMS (CHecklist for critical Appraisal and data extraction for systematic Reviews of prediction Modelling Studies) were used to assess the risk of bias and guide data extraction for risk development, validation, or re-validation studies. ROBINS-I (Risk of Bias in Non-Randomized Studies of Interventions) was used to assess for bias and guide data extraction for the quasi-experimental studies identified. Due to heterogeneity among papers, a meta-analysis was not feasible, and a narrative synthesis was conducted. 11 papers met the eligibility criteria, and each has an overall high risk of bias based on the PROBAST and ROBINS-I assessments. This is deeply concerning, as major policy decisions have been made based on a limited number of studies with a high risk of bias. The findings on racial disproportionality have been mixed and depend on the tool and approach used. Authors use various definitions for racial equity, fairness, or disproportionality. These concepts of statistical fairness are connected to theories about the reason for racial disproportionality in child welfare or social definitions of fairness that are usually not stated explicitly. Most findings from these studies are unreliable, given the high degree of bias. However, some of the less biased measures within studies suggest that quantitative risk assessment tools may worsen racial disproportionality, depending on how disproportionality is mathematically defined. Authors vary widely in their approach to defining and addressing racial disproportionality within studies, making it difficult to generalize findings or approaches across studies. This review demonstrates the power of authors to shape policy or discourse around racial justice based on their choice of statistical methods; it also demonstrates the need for improved rigor and transparency in studies of quantitative risk assessment tools. Finally, this review raises concerns about the impact that these tools have on child welfare systems and racial disproportionality.

Keywords: actuarial risk, child welfare, predictive risk, racial disproportionality

Procedia PDF Downloads 54
3588 Flocking Swarm of Robots Using Artificial Innate Immune System

Authors: Muneeb Ahmad, Ali Raza

Abstract:

A computational method inspired by the immune system (IS) is presented, leveraging the robustness, fault tolerance, scalability, and adaptability that it shares with swarm intelligence. This method aims to showcase flocking behaviors in a swarm of robots (SR). The innate part of the IS offers a variety of reactive and probabilistic cell functions alongside its self-regulation mechanism, which have been translated to enable swarming behaviors. Although the research focuses specifically on flocking behaviors in a variety of simulated environments using e-puck robots in a physics-based simulator (CoppeliaSim), the artificial innate immune system (AIIS) can exhibit other swarm behaviors as well. The effectiveness of the immuno-inspired approach has been established through extensive experimentation, for scalability and adaptability, using standard swarm benchmarks as well as the immunological regulatory functions (i.e., dendritic cells' maturity and inflammation). The AIIS-based approach has proved to be a scalable and adaptive solution for emulating the flocking behavior of SR.

Keywords: artificial innate immune system, flocking swarm, immune system, swarm intelligence

Procedia PDF Downloads 104
3587 The Incidence of Concussion across Popular American Youth Sports: A Retrospective Review

Authors: Rami Hashish, Manon Limousis-Gayda, Caitlin H. McCleery

Abstract:

Introduction: A leading cause of emergency room visits among youth in the United States is sports-related traumatic brain injury. Mild traumatic brain injuries (mTBIs), also called concussions, are caused by linear and/or angular acceleration experienced at the head and represent an increasing societal burden. Due to the developing nature of the brain in youth, there is a great risk of long-term neuropsychological deficiencies following a concussion. Accordingly, the purpose of this paper is to investigate incidence rates of concussion across gender for the five most common youth sports in the United States. These include basketball, track and field, soccer, baseball (boys), softball (girls), football (boys), and volleyball (girls). Methods: A PubMed search was performed for four search themes combined. The first theme identified the outcomes (concussion, brain injuries, mild traumatic brain injury, etc.). The second theme identified the sport (American football, soccer, basketball, softball, volleyball, track and field, etc.). The third theme identified the population (adolescence, children, youth, boys, girls). The last theme identified the study design (prevalence, frequency, incidence, prospective). Ultimately, 473 studies were surveyed, with 15 fulfilling the criteria: prospective studies presenting original data and the incidence of concussion in the relevant youth sport. The following data were extracted from the selected studies: population age, total study population, total athletic exposures (AE) and incidence rate per 1000 athletic exposures (IR/1000). Two one-way ANOVAs and a Tukey's post hoc test were conducted using SPSS. Results: From the 15 selected studies, statistical analysis revealed that the incidence of concussion per 1000 AEs across the considered sports ranged from 0.014 (girls' track and field) to 0.780 (boys' football). The average IR/1000 across all sports was 0.483 and 0.268 for boys and girls, respectively; this difference in IR was found to be statistically significant (p=0.013). Tukey's post hoc test showed that football had a significantly higher IR/1000 than boys' basketball (p=0.022), soccer (p=0.033) and track and field (p=0.026). No statistical difference was found for concussion incidence between girls' sports. Removal of football was found to lower the IR/1000 for boys without a statistical difference (p=0.101) compared to girls. Discussion: Football was the only sport showing a statistically significant difference in concussion incidence rate relative to other sports (within gender). Males were overall more likely to be concussed than females when football was included (1.8x), whereas concussion was more likely for females when football was excluded. While the significantly higher rate of concussion in football is not surprising because of the nature and rules of the sport, it is concerning that research has shown a higher incidence of concussion in practices than in games. Interestingly, the findings indicate that girls' sports are more concussive overall when football is removed. This appears to counter the common notion that boys' sports are more physically taxing and dangerous. Future research should focus on understanding the concussive mechanisms of injury in each sport to enable effective rule changes.
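
For readers unfamiliar with the incidence-rate metric used above, the sketch below shows the underlying arithmetic (IR per 1000 athletic exposures); the counts are hypothetical and are not taken from the reviewed studies.

```python
# Incidence rate per 1000 athletic exposures (hypothetical counts, for illustration only).
def incidence_rate_per_1000(concussions: int, athletic_exposures: int) -> float:
    return 1000.0 * concussions / athletic_exposures

print(incidence_rate_per_1000(39, 50_000))  # -> 0.78, the order of magnitude reported for football
```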

Keywords: gender, football, soccer, traumatic brain injury

Procedia PDF Downloads 141
3586 Creation of GaxCo1-xZnSe0.4 (x = 0.1, 0.3, 0.5) Nanoparticles Using Pulse Laser Ablation Method

Authors: Yong Pan, Li Wang, Xue Qiong Su, Dong Wen Gao

Abstract:

Nanomaterials have received extensive attention over the years because of their wide range of applications. Various nanomaterials such as nanoparticles, nanowires, nanorings, nanostars and other nanostructures have begun to be systematically studied. The preparation of these materials by chemical methods is not only costly, but also involves long cycles and high toxicity. At the same time, the preparation of nanoparticles of multi-doped composites has been limited by the special structure of these materials. In order to prepare multi-doped composites with the same structure as the macro-materials and to simplify the preparation method, GaxCo1-xZnSe0.4 (x = 0.1, 0.3, 0.5) nanoparticles are prepared by the Pulse Laser Ablation (PLA) method. The particle composition and structure are systematically investigated by X-ray diffraction (XRD) and Raman spectra, which confirm the success of the preparation and the same concentration in the nanoparticles (NPs) as in the target. The morphology of the NPs, characterized by Transmission Electron Microscopy (TEM), indicates that the prepared particles are circular in shape. Fluorescence properties are reflected by the PL spectra, which demonstrate the best performance at the concentration of Ga0.3Co0.3ZnSe0.4. Therefore, all the results suggest that PLA is a promising route for preparing multi-doped NPs, since it can modulate the performance of the NPs.

Keywords: PLA, physics, nanoparticles, multi-doped

Procedia PDF Downloads 170
3585 Disentangling the Sources and Context of Daily Work Stress: Study Protocol of a Comprehensive Real-Time Modelling Study Using Portable Devices

Authors: Larissa Bolliger, Junoš Lukan, Mitja Lustrek, Dirk De Bacquer, Els Clays

Abstract:

Introduction and Aim: Chronic workplace stress and its health-related consequences, such as mental and cardiovascular diseases, have been widely investigated. This project focuses on the sources and context of psychosocial daily workplace stress in a real-world setting. The main objective is to analyze and model real-time relationships between (1) psychosocial stress experiences within the natural work environment, (2) micro-level work activities and events, and (3) physiological signals and behaviors in office workers. Methods: An Ecological Momentary Assessment (EMA) protocol has been developed, partly building on machine learning techniques. Empatica® wristbands will be used for real-life detection of stress from physiological signals; micro-level activities and events at work will be based on smartphone registrations, further processed according to an automated computer algorithm. A field study including 100 office-based workers with high-level problem-solving tasks, such as managers and researchers, will be implemented in Slovenia and Belgium (50 in each country). Data mining and state-of-the-art statistical methods – mainly multilevel statistical modelling for repeated data – will be used. Expected Results and Impact: The project findings will provide novel contributions to the field of occupational health research. While traditional assessments provide information about the globally perceived state of chronic stress exposure, the EMA approach is expected to bring new insights into daily fluctuating work stress experiences, especially the micro-level events and activities at work that induce acute physiological stress responses. The project is therefore likely to generate further evidence on relevant stressors in a real-time working environment and hence make it possible to advise on workplace procedures and policies for reducing stress.
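
A minimal sketch of the kind of multilevel (mixed-effects) model for repeated measures mentioned above, using statsmodels; the variable names and data are hypothetical and do not reflect the project's actual protocol.

```python
# Sketch of a multilevel model for repeated EMA observations nested within workers.
# Variable names (stress, event_intensity, worker_id) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_workers, n_obs = 20, 30
df = pd.DataFrame({
    "worker_id": np.repeat(np.arange(n_workers), n_obs),
    "event_intensity": rng.normal(size=n_workers * n_obs),
})
worker_effect = rng.normal(scale=0.5, size=n_workers)[df["worker_id"]]
df["stress"] = 1.0 + 0.3 * df["event_intensity"] + worker_effect + rng.normal(scale=1.0, size=len(df))

# Random intercept per worker (repeated measures within persons)
model = smf.mixedlm("stress ~ event_intensity", df, groups=df["worker_id"]).fit()
print(model.summary())
```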

Keywords: ecological momentary assessment, real-time, stress, work

Procedia PDF Downloads 161
3584 Improved Computational Efficiency of Machine Learning Algorithm Based on Evaluation Metrics to Control the Spread of Coronavirus in the UK

Authors: Swathi Ganesan, Nalinda Somasiri, Rebecca Jeyavadhanam, Gayathri Karthick

Abstract:

The COVID-19 crisis presents a substantial and critical hazard to worldwide health. Since the occurrence of the disease in late January 2020 in the UK, the number of people confirmed to have acquired the illness has increased tremendously across the country, and the number of individuals affected is undoubtedly considerably high. The purpose of this research is to develop a predictive machine learning model that could forecast COVID-19 cases within the UK. This study concentrates on the statistical data collected from 31st January 2020 to 31st March 2021 in the United Kingdom. Information on total COVID cases registered, new cases encountered on a daily basis, total deaths registered, and deaths per day due to Coronavirus is collected from the World Health Organisation (WHO). Data preprocessing is carried out to identify any missing values, outliers, or anomalies in the dataset. The data is split in an 8:2 ratio for training and testing purposes to forecast future new COVID cases. Support Vector Machines (SVM), Random Forests, and linear regression algorithms are chosen to study the model performance in the prediction of new COVID-19 cases. From evaluation metrics such as the r-squared value and mean squared error, the statistical performance of the model in predicting the new COVID cases is evaluated. Random Forest outperformed the other two machine learning algorithms with a training accuracy of 99.47% and a testing accuracy of 98.26% when n=30. The mean square error obtained for Random Forest is 4.05e11, which is lower than that of the other predictive models used in this study. From the experimental analysis, the Random Forest algorithm performs more effectively and efficiently in predicting the new COVID cases, which could help the health sector take relevant control measures against the spread of the virus.
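
The sketch below illustrates the general workflow described above (80/20 split, Random Forest regression, R² and MSE); the data here are synthetic placeholders, not the WHO time series used in the study.

```python
# Illustrative workflow only: 80/20 split, Random Forest regression, R^2 and MSE.
# Synthetic data stands in for the WHO case counts used in the study.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(42)
days = np.arange(400).reshape(-1, 1)                                 # day index as the predictor
cases = 1000 + 50 * days.ravel() + rng.normal(scale=500, size=400)   # synthetic daily case counts

X_train, X_test, y_train, y_test = train_test_split(days, cases, test_size=0.2, shuffle=False)

model = RandomForestRegressor(n_estimators=30, random_state=0).fit(X_train, y_train)
pred = model.predict(X_test)
print("R^2:", r2_score(y_test, pred))
print("MSE:", mean_squared_error(y_test, pred))
```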

Keywords: COVID-19, machine learning, supervised learning, unsupervised learning, linear regression, support vector machine, random forest

Procedia PDF Downloads 121
3583 The Methods of Customer Satisfaction Measurement and Its Statistical Analysis towards Sales and Logistic Activities in Food Sector

Authors: Seher Arslankaya, Bahar Uludağ

Abstract:

Meeting the needs and demands of customers and pleasing them are important requirements for companies in the food sector, where the growth of competition is significantly unpredictable. Customer satisfaction is also one of the key concepts, driven mainly by a wide range of customer preferences and expectations about the products and services introduced and delivered to them. In order to meet customer demands, companies operating in the food sector are expected to have a well-managed Total Quality Management (TQM) system, which sets out to improve the quality of products and services, to reduce costs and to increase customer satisfaction by restructuring traditional management practices. It aims to increase customer satisfaction by meeting customer expectations and requirements. The achievement is determined with the help of customer satisfaction surveys, which are conducted to obtain immediate feedback and to provide quick responses. In addition, the surveys assist strategic planning, which helps to anticipate customers' future needs and expectations. Meanwhile, periodic measurement of customer satisfaction is a must because, with a better understanding of customer perceptions from the surveys (collected by questionnaires), companies can clearly identify their own strengths and weaknesses, which helps them keep their loyal customers, stand up to comparison with their competitors and map out their future progress and improvement. In this study, we propose a survey-based customer satisfaction measurement method and its statistical analysis for the sales and logistic activities of food firms. Customer satisfaction is discussed in detail. Furthermore, after analysing the data derived from the questionnaire administered to customers using the SPSS software, the various results obtained from the application are presented. By also applying an ANOVA test, the study analyses whether there are significant differences between customer demographic groups and their perceptions. The purpose of this study is also to find out the requirements that help to remove the effects that decrease customer satisfaction and to produce loyal customers in the food industry. For this purpose, customer complaints are collected. Additionally, comments and suggestions are made according to the obtained results of the surveys, which would be useful for the strategic planning process in the food industry.
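
As a minimal illustration of the ANOVA step described above (here with SciPy rather than SPSS), the sketch compares mean satisfaction scores across hypothetical demographic groups; the scores are invented for illustration.

```python
# One-way ANOVA across demographic groups (illustrative ratings, SciPy instead of SPSS).
from scipy import stats

group_a = [4.2, 3.8, 4.5, 4.0, 3.9]   # e.g. age group A (hypothetical satisfaction ratings)
group_b = [3.1, 3.4, 2.9, 3.3, 3.0]   # age group B
group_c = [4.0, 4.1, 3.7, 4.3, 3.8]   # age group C

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 suggests the group means differ
```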

Keywords: customer satisfaction measurement and analysis, food industry, SPSS, TQM

Procedia PDF Downloads 249
3582 On the Hirota Bilinearization of Fokas-Lenells Equation to Obtain Bright N-Soliton Solution

Authors: Sagardeep Talukdar, Gautam Kumar Saharia, Riki Dutta, Sudipta Nandy

Abstract:

In nonlinear optics, the Fokas-Lenells equation (FLE) is a well-known integrable equation that describes how ultrashort pulses propagate in optical fiber. It admits localized wave solutions, just like any other integrable equation. We apply the Hirota bilinearization method to obtain the soliton solution of the FLE. The proposed bilinearization makes use of an auxiliary function. We apply the method to the FLE with a vanishing boundary condition, that is, to obtain bright solitons. We have obtained bright 1-soliton and 2-soliton solutions and propose a scheme for obtaining the N-soliton solution. We have used an additional parameter which is responsible for the shift in the position of the soliton. Further analysis of the 2-soliton solution is done by asymptotic analysis. We find that the suggested bilinearization approach, which makes use of the auxiliary function, greatly simplifies the process while still producing the desired outcome. We think that the current analysis will be helpful in understanding how the FLE is used in nonlinear optics and other areas of physics.
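
For context, the Hirota D-operator underlying the bilinearization is defined as below; the specific bilinear forms and the auxiliary function used for the FLE are those derived in the paper and are not reproduced here.

```latex
% Standard Hirota bilinear D-operator (general definition, not the FLE-specific forms)
\begin{equation}
D_x^m D_t^n \,(f \cdot g) =
\left(\partial_x - \partial_{x'}\right)^m
\left(\partial_t - \partial_{t'}\right)^n
f(x,t)\, g(x',t') \Big|_{x'=x,\; t'=t}
\end{equation}
```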

Keywords: asymptotic analysis, fokas-lenells equation, hirota bilinearization method, soliton

Procedia PDF Downloads 119
3581 Ranking Theory: The Paradigm Shift in Statistical Approach to the Issue of Ranking in a Sports League

Authors: E. Gouya Bozorg

Abstract:

The issue of ranking sports teams, in particular soccer teams, is of primary importance in professional sports. However, it is still based on classical statistics and on models outside the area of mathematics. Rigorous mathematics, and then statistics, despite the expectations held of them, have not been able to engage effectively with the issue of ranking, something that calls for serious examination. The purpose of this study is to change the approach in order to get closer to mathematics proper for use in ranking. We recommend using theoretical mathematics as a good option because it can hermeneutically obtain the theoretical concepts and criteria needed for ranking from the everyday language of a league. We have proposed a framework that puts the issue of ranking into a new space, which we have applied to soccer as a case study. This is an experimental and theoretical study of the issue of ranking in a professional soccer league based on theoretical mathematics, followed by theoretical statistics. First, we give the theoretical definition of the constant Є = 1.33, the 'golden number' of a soccer league. Then, we define the 'efficiency of a team' by this number and the formula μ = (Pts / (k·Є)) – 1, in which Pts is the points obtained by a team in k games played. Moreover, the k·Є index is used to show the theoretical median line in the league table and to compare top teams and bottom teams. A theoretical coefficient σ = 1 / (1 + (Ptx / Ptxn)) is also defined for every match between teams x and xn, with respect to the ability of each team and the points of both of them, Ptx and Ptxn; it gives a performance point resulting in a special ranking for the league, and it has been particularly useful in evaluating the performance of weaker teams. The current theory has been examined against the statistical data of 4 major European leagues during the period 1998-2014. The results of this study showed that the issue of ranking depends on appropriate theoretical indicators of a league. These indicators allowed us to find different forms of ranking of teams in a league, including the 'special table' of a league. Furthermore, on this basis the issue of a team's record has been revised and amended. In addition, the theory of ranking can be used to compare and classify different leagues and tournaments. Experimental results obtained from the archival statistics of major professional leagues in the world over the past two decades have confirmed the theory. This work introduces a new theory for the ranking of a soccer league. Moreover, this theory can be used to compare different leagues and tournaments.
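
The two formulas quoted above translate directly into code; the sketch below evaluates them for a hypothetical team and match (the point values are invented).

```python
# Direct implementation of the two formulas quoted above; inputs are hypothetical.
GOLDEN_NUMBER = 1.33  # the constant Є defined in the abstract

def team_efficiency(points: float, games_played: int) -> float:
    """mu = (Pts / (k * Є)) - 1"""
    return points / (games_played * GOLDEN_NUMBER) - 1

def match_coefficient(pts_x: float, pts_xn: float) -> float:
    """sigma = 1 / (1 + (Ptx / Ptxn)) for a match between teams x and xn."""
    return 1.0 / (1.0 + pts_x / pts_xn)

print(team_efficiency(points=68, games_played=38))   # e.g. a team with 68 points after 38 games
print(match_coefficient(pts_x=45, pts_xn=30))        # hypothetical league points before the match
```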

Keywords: efficiency of a team, ranking, special table, theoretical mathematic

Procedia PDF Downloads 418
3580 Comparing Numerical Accuracy of Solutions of Ordinary Differential Equations (ODE) Using Taylor's Series Method, Euler's Method and Runge-Kutta (RK) Method

Authors: Palwinder Singh, Munish Sandhir, Tejinder Singh

Abstract:

Ordinary differential equations (ODE) represent a natural framework for the mathematical modeling of many real-life situations in the fields of engineering, control systems, physics, chemistry, astronomy, etc. Such differential equations can be solved by analytical methods or by numerical methods. If the solution is calculated using analytical methods, it is done through calculus theories and thus requires a longer time to solve. In this paper, we compare the numerical accuracy of the solutions given by the three main types of one-step initial value solvers: Taylor's Series Method, Euler's Method and the Runge-Kutta Fourth Order Method (RK4). The comparison of accuracy is obtained by comparing the solutions of ordinary differential equations given by these three methods. Furthermore, to verify the accuracy, we compare these numerical solutions with the exact solutions.
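
A minimal sketch of the comparison described above, applied to the test problem y' = y, y(0) = 1 (exact solution eˣ); the step size and test equation are chosen only for illustration and are not the problems studied in the paper.

```python
# Euler vs. classical RK4 on y' = y, y(0) = 1, compared with the exact solution e^x.
import math

def f(x, y):
    return y  # test ODE: y' = y

def euler(f, x0, y0, h, n):
    x, y = x0, y0
    for _ in range(n):
        y += h * f(x, y)
        x += h
    return y

def rk4(f, x0, y0, h, n):
    x, y = x0, y0
    for _ in range(n):
        k1 = f(x, y)
        k2 = f(x + h / 2, y + h * k1 / 2)
        k3 = f(x + h / 2, y + h * k2 / 2)
        k4 = f(x + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
    return y

h, n = 0.1, 10                      # integrate from x = 0 to x = 1
exact = math.exp(1.0)
print("Euler error:", abs(euler(f, 0, 1, h, n) - exact))
print("RK4 error:  ", abs(rk4(f, 0, 1, h, n) - exact))
```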

Keywords: Ordinary differential equations (ODE), Taylor’s Series Method, Euler’s Method, Runge-Kutta Fourth Order Method

Procedia PDF Downloads 358
3579 Feigenbaum Universality, Chaos and Fractal Dimensions in Discrete Dynamical Systems

Authors: T. K. Dutta, K. K. Das, N. Dutta

Abstract:

This paper is primarily concerned with Ricker's population model, f(x) = x e^(r(1-x/k)), where r is the control parameter and k is the carrying capacity, and some fruitful results are obtained with the following objectives: 1) determination of the bifurcation values leading to a chaotic region, 2) development of the statistical methods and analysis required for the measurement of fractal dimensions, and 3) calculation of various fractal dimensions. These results also show that the invariant probability distribution on the attractor, when it exists, provides detailed information about the long-term behavior of a dynamical system. At the end, some open problems are posed for further research.
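
A short sketch of iterating the Ricker map given above and estimating its Lyapunov exponent numerically; the parameter values are illustrative and are not the bifurcation values computed in the paper.

```python
# Iterate the Ricker map f(x) = x * exp(r * (1 - x/k)) and estimate the Lyapunov exponent.
# Parameter values are illustrative only.
import math

def ricker(x, r, k=1.0):
    return x * math.exp(r * (1 - x / k))

def ricker_derivative(x, r, k=1.0):
    # d/dx [x e^{r(1 - x/k)}] = e^{r(1 - x/k)} * (1 - r x / k)
    return math.exp(r * (1 - x / k)) * (1 - r * x / k)

def lyapunov_exponent(r, x0=0.5, n_transient=1000, n_iter=10000):
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = ricker(x, r)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(ricker_derivative(x, r)))
        x = ricker(x, r)
    return total / n_iter

for r in (1.5, 2.3, 3.0):                 # a positive exponent indicates chaos
    print(r, lyapunov_exponent(r))
```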

Keywords: Feigenbaum universality, chaos, Lyapunov exponent, fractal dimensions

Procedia PDF Downloads 302
3578 The Benefits of Regional Brand for Companies

Authors: H. Starzyczna, M. Stoklasa, K. Matusinska

Abstract:

This article deals with the benefits of regional brands for companies in the Czech Republic. The research was focused on finding out the expected and actual benefits of regional brands for companies. The data were obtained by a questionnaire survey and analysed with IBM SPSS. A representative sample of 204 companies was created. The analysis disclosed the benefits that companies expect a regional brand to bring them; the actual benefits, however, fall well short of these expectations. The statistical testing of hypotheses revealed that the benefits depend on the region of origin, which surprised both us and the regional coordinators.

Keywords: Brand, regional brands, product protective branding programs, brand benefits

Procedia PDF Downloads 345
3577 Quantification of the Erosion Effect on Small Caliber Guns: Experimental and Numerical Analysis

Authors: Dhouibi Mohamed, Stirbu Bogdan, Chabotier André, Pirlot Marc

Abstract:

The effects of erosion and wear on the performance of small caliber guns have been analyzed through numerical and experimental studies. Mainly, qualitative observations were performed. Correlations between the volume change of the chamber and the maximum pressure are limited. This paper focuses on the development of a numerical model to predict the evolution of the maximum pressure when the interior shape of the chamber changes over the weapon's life phases. To fulfill this goal, an experimental campaign, followed by a numerical simulation study, is carried out. Two test barrels, "5.56x45mm NATO" and "7.62x51mm NATO", are considered. First, a Coordinate Measuring Machine (CMM) with a contact scanning probe is used to measure the interior profile of the barrels after each 300-shot cycle until they are worn out. Simultaneously, the EPVAT (Electronic Pressure Velocity and Action Time) method with a dedicated WEIBEL radar is used to measure (i) the chamber pressure, (ii) the action time, and (iii) the bullet velocity in each barrel. Second, a numerical simulation study is carried out. A coupled interior ballistics model is developed using the dynamic finite element program LS-DYNA. In this work, two different models are elaborated: (i) a coupled Eulerian-Lagrangian model using fluid-structure interaction (FSI) techniques, and (ii) a coupled thermo-mechanical finite element model using a lumped parameter model (LPM) as a subroutine. These numerical models are validated and checked against three experimental results: (i) the muzzle velocity, (ii) the chamber pressure, and (iii) the surface morphology of fired projectiles. The results show good agreement between the experiments and the numerical simulations. Next, a comparison between the two models is conducted. The projectile motions, the dynamic engraving resistances and the maximum pressures are compared and analyzed. Finally, using the obtained database, a statistical correlation between the muzzle velocity, the maximum pressure and the chamber volume is established.

Keywords: engraving process, finite element analysis, gun barrel erosion, interior ballistics, statistical correlation

Procedia PDF Downloads 215
3576 Opportunities of an Industrial City in the Leisure Tourism

Authors: E. Happ, A. Albert Tóth

Abstract:

The aim of the research is to investigate the forms of demand for leisure tourism in a West Hungarian industrial city, Győr. Today, Győr is still a traditional industrial city; its industry is mainly based on the vehicle industry, but the role of tourism is increasing in the life of the city as well. Because of the industrial nature and the strong economy of the city, the ratio of business tourists is high. It can be stated that MICE tourism dominates in Győr. The developments of the last decade can provide the city with new tourism products to increase leisure tourism. The new types of tourism – besides business tourism – can help the providers to increase occupancy rates and weekend demand. The research presents the theoretical background of the topic and shows the present situation of tourism in Győr with secondary data. The secondary research contains statistical data from the Hungarian Statistical Office and the city council, and it is based on the providers' data. The next part of the paper shows the potential types of leisure tourism with the help of primary research. The primary research contains the results of an online questionnaire with a sample of 1000 potential customers. It is completed with 10 in-depth interviews with tourism experts, who explained their opinions about the opportunities of leisure tourism in Győr from the providers' side. The online questionnaire was filled out in spring 2017 by customers who had already stayed in Győr or planned to visit the city. At the same time, in-depth interviews were conducted with hotel managers, heads of tourism institutions and employees of the council. Based on the research, it can be stated that the tourism supply of Győr allows the ratio of leisure tourism in the city to increase. Primarily, cultural and health tourism show development potential, but the supply side of tourism services can also be developed in order to increase the number of guest nights. Tourism marketing needs to be strengthened in the city, and a marketing activity that distinguishes it from other cities is needed as well. To conclude, although Győr is an industrial city, it has a transforming industrial part, and tourism is also strongly present in its economy. Besides the leading role of business tourism, different types of leisure tourism have the opportunity to take place in the city.

Keywords: business tourism, Győr, industrial city, leisure tourism, touristic demand

Procedia PDF Downloads 279
3575 Transforming Higher Education in India

Authors: Samir Sarfraj Terdalkar

Abstract:

India needs to step into affordable higher education with more focus on skill development and employability. The general scenario of higher education in India revolves around two major branches, i.e., engineering and medical sciences. These two branches still cannot be considered affordable. Hence, the skill development of each and every student, beginning from school education, should emphasize learning skills with a special focus on physics and mathematics. In India, the Central Government initiated a survey-based process covering all higher educational institutes, universities and colleges in India: the All India Survey on Higher Education (AISHE). The focus of this process was to understand the overall state of higher education in the country. Though the growth is significant, it is necessary to propagate skill and vocational education, which would add to the employability factor. Similarly, while there has been a significant increase in the number of higher education institutes, there is a need to rethink the type of education and curriculum offered by these institutions. In this regard, vocational education has helped to build skill sets to a certain extent. There is a need to bring this vocational education into mainstream education, where it could be complementary to undergraduate and postgraduate education. The paper focuses on the different policies needed to bring in vocational and skill education.

Keywords: higher education, skill, vocational, India

Procedia PDF Downloads 108
3574 Short Life Cycle Time Series Forecasting

Authors: Shalaka Kadam, Dinesh Apte, Sagar Mainkar

Abstract:

The life cycle of products is becoming shorter and shorter due to increased competition in the market, shorter product development times and increased product diversity. Short life cycles are normal in the retail industry, the style business, entertainment media, and the telecom and semiconductor industries. The subject of accurate demand forecasting for short life cycle products is of special interest to many researchers and organizations. Due to the short life cycle of such products, the amount of historical data available for forecasting is very minimal or even absent when new or modified products are launched in the market. The companies dealing with such products want to increase the accuracy of demand forecasting so that they can utilize the full potential of the market and at the same time not oversupply. This creates the challenge of developing a forecasting model that can forecast accurately while handling large variations in data and considering the complex relationships between various parameters of the data. Many statistical models have been proposed in the literature for forecasting time series data. Traditional time series forecasting models do not work well for short life cycles due to the lack of historical data. Artificial neural network (ANN) models are also very time consuming for forecasting. We have studied the existing models that are used for forecasting and their limitations. This work proposes an effective and powerful approach for short life cycle time series forecasting. We propose an approach which takes into consideration different scenarios related to data availability for short life cycle products. We then suggest a methodology which combines statistical analysis with structured judgement. The defined approach can also be applied across domains. We then describe the method of creating a profile from analogous products. This profile can then be used for forecasting products with the historical data of analogous products. We have designed an application which combines data, analytics and domain knowledge using point-and-click technology. The forecasting results generated are compared using the MAPE, MSE and RMSE error scores. Conclusion: Based on the results, it is observed that no single approach is sufficient for short life cycle forecasting and that two or more approaches need to be combined to achieve the desired accuracy.
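
For reference, the three error scores used to compare forecasts above can be computed as in the sketch below; the actual and forecast values shown are placeholders.

```python
# MAPE, MSE and RMSE for comparing forecasts (placeholder values, for illustration only).
import math

def mape(actual, forecast):
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def mse(actual, forecast):
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    return math.sqrt(mse(actual, forecast))

actual = [120, 135, 150, 160]      # hypothetical demand
forecast = [110, 140, 145, 170]    # hypothetical forecast from an analogous-product profile
print(mape(actual, forecast), mse(actual, forecast), rmse(actual, forecast))
```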

Keywords: forecast, short life cycle product, structured judgement, time series

Procedia PDF Downloads 358
3573 Study and Simulation of a Severe Dust Storm over West and South West of Iran

Authors: Saeed Farhadypour, Majid Azadi, Habibolla Sayyari, Mahmood Mosavi, Shahram Irani, Aliakbar Bidokhti, Omid Alizadeh Choobari, Ziba Hamidi

Abstract:

In recent decades, the frequency of dust events has increased significantly in the west and south west of Iran. First, a survey of the dust events during the period 1990-2013 is carried out using historical dust data collected at 6 weather stations scattered over the west and south-west of Iran. After statistical analysis of the observational data, one of the most severe dust storm events, which occurred in the region from 3rd to 6th July 2009, is selected and analyzed. The WRF-Chem model is used to simulate the amount of PM10 and how it is transported to these areas. The initial and lateral boundary conditions for the model were obtained from GFS data with 0.5°×0.5° spatial resolution. In the simulation, two aerosol schemes (GOCART and MADE/SORGAM) with 3 options (chem_opt=106, 300 and 303) were evaluated. The results of the statistical analysis of the historical data showed that the south west of Iran has a high frequency of dust events, with Bushehr station having the highest frequency and Urmia station the lowest. Also, in the period 1990 to 2013, the years 2009 and 1998, with 3221 and 100 dust events respectively, had the highest and lowest counts; according to the monthly variation, June and July had the highest frequency of dust events and December the lowest. Besides, the model results showed that the MADE/SORGAM scheme predicted the values and trends of PM10 better than the other schemes and showed better performance in comparison with the observations. Finally, the distribution of PM10 and the surface wind maps obtained from the numerical modeling showed the formation of dust plumes in Iraq and Syria and their transport to the west and southwest of Iran. In addition, comparing the MODIS satellite image acquired on 4th July 2009 with the model output at the same time showed the good ability of WRF-Chem to simulate the spatial distribution of dust.

Keywords: dust storm, MADE/SORGAM scheme, PM10, WRF-Chem

Procedia PDF Downloads 270
3572 Dataset Quality Index: Development of Composite Indicator Based on Standard Data Quality Indicators

Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros

Abstract:

Nowadays, poor data quality is considered one of the major costs of a data project. A data project with data quality awareness devotes a considerable amount of time to data quality processes, while a data project without data quality awareness suffers negative impacts on financial resources, efficiency, productivity, and credibility. One of the processes that takes a long time is defining the expectations and measurements of data quality, because the expectations differ according to the purpose of each data project. Especially in big data projects, which may involve many datasets and stakeholders, it takes a long time to discuss and define quality expectations and measurements. Therefore, this study aimed at developing meaningful indicators that describe the overall data quality of each dataset and allow quick comparison and prioritization. The objectives of this study were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a composite indicator that can describe the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, five steps were followed to develop the Dataset Quality Index (SDQI). First, we defined standard data quality expectations. Second, we found indicators that can directly measure the data within datasets. Third, the indicators were aggregated into dimensions using factor analysis. Next, the indicators and dimensions were weighted by the effort required for the data preparation process and by usability. Finally, the dimensions were aggregated into the composite indicator. The results of these analyses showed that: (1) the developed set of useful indicators and measurements contained ten indicators; (2) based on statistical characteristics, the ten indicators could be reduced to 4 dimensions; and (3) the developed composite indicator (SDQI) can describe the overall quality of each dataset and can separate datasets into 3 levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall description of data quality within datasets and a meaningful composition. We can use the SDQI to assess all data in a data project, for effort estimation, and for prioritization. The SDQI also works well with agile methods, by using the SDQI for assessment in the first sprint. After passing the initial evaluation, more specific data quality indicators can be added in the next sprint.
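
A highly simplified sketch of the aggregation idea described above (indicators, then dimensions via factor analysis or PCA, then a weighted composite); the indicator matrix, weights, and level thresholds are invented and do not reproduce the SDQI.

```python
# Simplified aggregation: indicators -> dimensions (PCA stands in for factor analysis)
# -> weighted composite score. All numbers and weights are invented, not the SDQI.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
indicators = rng.random((500, 10))            # 500 datasets x 10 quality indicators in [0, 1]

pca = PCA(n_components=4)                     # reduce ten indicators to four dimensions
dimensions = pca.fit_transform(indicators)

weights = np.array([0.4, 0.3, 0.2, 0.1])      # hypothetical effort/usability weights
composite = dimensions @ weights              # one composite score per dataset

# Hypothetical three-level labelling by quantiles
labels = np.digitize(composite, np.quantile(composite, [1/3, 2/3]))
print(dict(zip(*np.unique(labels, return_counts=True))))
```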

Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis

Procedia PDF Downloads 139
3571 Fuzzy Gauge Capability (Cg and Cgk) through Buckley Approach

Authors: Seyed Habib A. Rahmati, Mohsen Sadegh Amalnick

Abstract:

Different aspects of statistical process control (SPC) have been sketched in the fuzzy environment. However, measurement system analysis (MSA), as a main branch of SPC, is rarely investigated in the fuzzy area. This procedure assesses the suitability of the data to be used in later stages or decisions of the SPC. Therefore, this research focuses on some important measures of MSA and, through a new method, introduces these measures in the fuzzy environment. In this method, which works based on the Buckley approach, the imprecision and vagueness of real-world measurement are considered simultaneously. To do so, fuzzy versions of the gauge capability indices (Cg and Cgk) are introduced. The method is also clearly explained through an example.
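
As background, one common crisp (non-fuzzy) convention for the gauge capability indices is sketched below; the 20%-of-tolerance and 4-sigma factors are one convention among several and are not necessarily the ones adopted in the paper, and the fuzzy (Buckley) extension is not shown.

```python
# Crisp Cg / Cgk under one common convention (K = 20% of tolerance, 4*s spread).
# These factors vary between standards; the fuzzy Buckley extension is not shown here.
import statistics

def gauge_capability(measurements, tolerance, reference_value, k_pct=0.2, spread=4.0):
    s_g = statistics.stdev(measurements)                 # repeatability of the gauge
    x_bar = statistics.fmean(measurements)
    cg = (k_pct * tolerance) / (spread * s_g)
    cgk = (k_pct / 2 * tolerance - abs(x_bar - reference_value)) / (spread / 2 * s_g)
    return cg, cgk

# Hypothetical repeated measurements of a reference part (nominal 10.00, tolerance 0.20)
meas = [10.01, 10.02, 9.99, 10.00, 10.01, 10.02, 10.00, 9.98, 10.01, 10.00]
print(gauge_capability(meas, tolerance=0.20, reference_value=10.00))
```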

Keywords: measurement, SPC, MSA, gauge capability (Cg and Cgk)

Procedia PDF Downloads 650
3570 Analysis on the Feasibility of Landsat 8 Imagery for Water Quality Parameters Assessment in an Oligotrophic Mediterranean Lake

Authors: V. Markogianni, D. Kalivas, G. Petropoulos, E. Dimitriou

Abstract:

Lake water quality monitoring in combination with the use of earth observation products constitutes a major component of many water quality monitoring programs. Landsat 8 images of Trichonis Lake (Greece) acquired on 30/10/2013 and 30/08/2014 were used in order to explore the potential of Landsat 8 to estimate water quality parameters, in particular CDOM absorption at specific wavelengths, chlorophyll-a and nutrient concentrations, in this oligotrophic freshwater body, characterized by virtually nonexistent quantitative, temporal and spatial variability. Water samples were collected at 22 different stations in late August 2014, and the satellite image of the same date was used to statistically correlate the in-situ measurements with various combinations of Landsat 8 bands in order to develop algorithms that best describe those relationships and accurately calculate the aforementioned water quality components. The optimal models were applied to the image of late October 2013, and the validation of the results was conducted through their comparison with the respective available in-situ data of 2013. Initial results indicated the limited ability of the Landsat 8 sensor to accurately estimate water quality components in an oligotrophic waterbody. As shown by the validation process, ammonium concentrations proved to be the most accurately estimated component (R = 0.7), followed by chl-a concentration (R = 0.5) and the CDOM absorption at 420 nm (R = 0.3). The in-situ nitrate, nitrite, phosphate and total nitrogen concentrations of 2014 were below the detection limit of the instrument used; hence, no statistical elaboration was conducted. On the other hand, multiple linear regression between reflectance measures and total phosphorus concentrations resulted in low and statistically insignificant correlations. Our results are in agreement with other studies in the international literature, indicating that estimates for eutrophic and mesotrophic lakes are more accurate than for oligotrophic ones, owing to the lack of suspended particles that are detectable by satellite sensors. Nevertheless, although the predictive models developed and applied to the oligotrophic Trichonis Lake are less accurate, they may still be useful indicators of its water quality deterioration.
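
A schematic of the band-versus-in-situ regression step described above; the reflectance values and chlorophyll-a concentrations below are placeholders, not the Trichonis Lake measurements.

```python
# Schematic band-reflectance vs. in-situ regression (placeholder values, not the Trichonis data).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n_stations = 22
# Hypothetical surface reflectances for three Landsat 8 bands (e.g. blue, green, red)
reflectance = rng.uniform(0.01, 0.10, size=(n_stations, 3))
# Hypothetical in-situ chlorophyll-a concentrations (ug/L)
chl_a = 0.5 + 20 * reflectance[:, 1] - 10 * reflectance[:, 2] + rng.normal(scale=0.2, size=n_stations)

model = LinearRegression().fit(reflectance, chl_a)
print("R^2 on the calibration stations:", model.score(reflectance, chl_a))
print("Band coefficients:", model.coef_)
```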

Keywords: landsat 8, oligotrophic lake, remote sensing, water quality

Procedia PDF Downloads 396
3569 Micromechanics of Stress Transfer across the Interface Fiber-Matrix Bonding

Authors: Fatiha Teklal, Bachir Kacimi, Arezki Djebbar

Abstract:

The study and application of composite materials is a truly interdisciplinary endeavor that has been enriched by contributions from chemistry, physics, materials science, mechanics and manufacturing engineering. The understanding of the interface (or interphase) in composites is the central point of this interdisciplinary effort. From the early development of composite materials of various natures, the optimization of the interface has been of major importance. Even more importantly, the ideas linking the properties of composites to the interface structure are still emerging. In our study, we need a direct characterization of the interface; the micromechanical tests we are addressing seem to meet this objective, and we chose to use two complementary tests simultaneously: the microindentation test, which can be applied to real composites, and the drop test, preferred to the pull-out test because of the theoretical possibility of studying systems with high adhesion (which is a priori the case with our systems). These two tests are complementary because of the principle of the model specimen used for both: in the first, the fiber is loaded in compression (indentation), while in the second, the drop test, the fiber is subjected to tensile stress. Comparing the results obtained by the two methods can therefore be rewarding.

Keywords: Fiber, Interface, Matrix, Micromechanics, Pull-out

Procedia PDF Downloads 118
3568 The Effectiveness of the Family-Centered Sensory and Motor Interactive Games Program on Strengthening the Developmental and Motor Skills of Children aged 12 to 24 Months Who Have a Prior History of Low Birth Weight

Authors: Seyede Soraya Alavinezhad, Gholam Ali Afrooz, Seyedsaeid Sajjadianari

Abstract:

The purpose of this study was to assess the efficacy of a family-centered sensory and motor interactive activities program in enhancing the motor and developmental abilities of infants between the ages of 12 and 24 months who have a medical history of low birth weight. The study used a mixed (qualitative and quantitative) design. The statistical population comprised infants between the ages of 12 and 24 months with a documented history of low birth weight in Tehran in 2022. The study sample comprised twenty-eight infants, ranging in age from twelve to twenty-four months, whose mothers were selected using a convenience (readily available) sampling method. The participants were allocated at random into two groups, experimental and control. The Children's Developmental Screening Scale, the third edition of the Ages and Stages Questionnaires (ASQ3TM), was utilized in both cohorts. Two sessions of the family-centered program for mothers and sixteen sessions for the children in the experimental group were conducted. The statistical analysis software SPSS version 26 was utilized to analyze the data. Initially, the descriptive analysis of the variables, the normality assumption, and the equality of the variances of the variables across groups were examined. Subsequently, univariate analysis of covariance was employed to examine the research hypotheses. The results of the covariance analysis demonstrated that the family-centered interactive sensory and motor activities program was effective: a significant difference was observed between the experimental and control groups with regard to developmental skills from pre-test to post-test (P<0.005). According to the findings of this study, the motor and developmental skills of children aged 12 to 24 months with a history of low birth weight can be enhanced through entertainment programs that incorporate a suitable structure. It is recommended that future research investigate the efficacy of this program on children of average weight and conduct longitudinal studies.

Keywords: children, developmental skills, low birth weight, sensory and motor interactive games program

Procedia PDF Downloads 20
3567 Spatiotemporal Evaluation of Climate Bulk Materials Production in Atmospheric Aerosol Loading

Authors: Mehri Sadat Alavinasab Ashgezari, Gholam Reza Nabi Bidhendi, Fatemeh Sadat Alavinasab Ashkezari

Abstract:

Atmospheric aerosol loading (AAL) from anthropogenic sources is evidence of industrial development. The accelerated trends in material consumption at the global scale in recent years demonstrate consumption paradigms sensitive to the planetary boundaries (PB). This paper takes a statistical approach to tracing the path from climate-relevant bulk materials production (CBMP) of steel, cement and plastics to AAL via an updated and validated spatiotemporal distribution. The statistical analysis used the most up-to-date regional and global databases and instrumental technologies. This corresponded to a selection of processes and areas suitable for tracking AAL over the last decade, analyzing the most validated data and exploring behavior functions or models. The results also represent a correlation, within the socio-economic metabolism framework, between the materials specified as macronutrients of society and AAL as a PB with an unknown threshold. The selected country contributors of China, India and the US and the sample country of Iran show comparable cumulative AAL values relative to the domestic extraction and production rates of bulk materials in the study period of 2012 to 2022. Generally, there is a tendency towards a gradual descent in the worldwide and regional aerosol concentration after 2015. According to our evaluation, a considerable share of the human contribution, equivalent to 20% from CBMP, is attributable to the main anthropogenic aerosol species, including sulfate, black carbon and organic particulate matter. In an innovative approach, this study also explores the potential role of AAL control mechanisms in the economic sectors, where ordered and smoothed loading trends emerge from the disordered phenomena of CBMP and aerosol precursor emissions. The envisioned equilibrium states support the well-established theory of spin glasses, applicable to physical systems like the Earth and, here, to AAL.

Keywords: atmospheric aerosol loading, material flows, climate bulk materials, industrial ecology

Procedia PDF Downloads 80
3566 3D Multimedia Model for Educational Design Engineering

Authors: Mohanaad Talal Shakir

Abstract:

This paper proposes an educational design using multimedia technology for the Engineering of Computer Technology department at Alma'ref University College in Iraq. The paper evaluates students' acceptance, cognition, and interactivity with the proposed model, using statistical relationships to determine the stage of the model. The objectives of the proposed educational design are to develop user-friendly software for educational purposes using multimedia technology and to develop an animation of a 3D model to simulate the assembling and disassembling process for high-speed flow.

Keywords: CAL, multimedia, shock tunnel, interactivity, engineering education

Procedia PDF Downloads 623
3565 The Improved Laplace Homotopy Perturbation Method for Solving Non-integrable PDEs

Authors: Noufe H. Aljahdaly

Abstract:

The Laplace homotopy perturbation method (LHPM) is an approximate method that helps to compute approximate solutions of partial differential equations (PDEs). The method has been used for solving several problems in science. It requires the initial condition, so it solves the initial value problem. In physics, when some important terms are taken into account, we may obtain non-integrable partial differential equations that do not have analytical integrals. This type of PDE does not have an exact solution; therefore, we need to compute the solution without the initial condition. In this work, we improved the LHPM to be able to solve non-integrable problems, especially damped PDEs, which are PDEs that include a damping term that makes them non-integrable. We improved the LHPM by setting a perturbation parameter and an embedding parameter as the damping parameter and by using the initial condition for the damped PDE as the initial condition for the non-damped PDE.
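
For orientation, the standard homotopy construction on which HPM-type methods are built is recalled below in general form; the specific perturbation/embedding choice tied to the damping parameter, and the Laplace-transform step, follow the authors' improved scheme and are not reproduced here.

```latex
% General homotopy of the perturbation method (background only; the improved LHPM
% choice of parameters described in the abstract is not reproduced here).
\begin{equation}
H(v,p) = (1-p)\left[L(v) - L(u_0)\right] + p\left[L(v) + N(v) - f(r)\right] = 0,
\qquad p \in [0,1],
\end{equation}
\begin{equation}
v = v_0 + p\,v_1 + p^2 v_2 + \cdots, \qquad u = \lim_{p \to 1} v .
\end{equation}
```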

Keywords: non-integrable PDEs, modified Kawahara equation, Laplace homotopy perturbation method, damping term

Procedia PDF Downloads 100
3564 Structural Behaviour of Concrete Energy Piles in Thermal Loadings

Authors: E. H. N. Gashti, M. Malaska, K. Kujala

Abstract:

The thermo-mechanical behaviour of concrete energy pile foundations incorporating different single and double U-tube shapes was analysed using the COMSOL Multiphysics package. For the analysis, a full-scale 3D numerical model of the concrete pile and the surrounding soil was simulated, taking into account the actual operation of the ground heat exchangers (GHE) and the surrounding ambient temperature. Based on the initial ground temperature profile measured in situ, the tube inlet temperature was considered to range from 6°C to 0°C (during the contraction process) over a 30-day period. The extra thermal stresses and deformations were calculated during the simulations, and the differences arising from the use of the two systems (single-tube and double-tube) were analysed. The results revealed no significant difference in the extra thermal stresses at the centre of the pile between the two systems. However, the displacements over the pile length were found to be up to 1.5-fold higher in the double-tube system than in the single-tube system.

Keywords: concrete energy piles, stresses, displacements, thermo-mechanical behaviour, soil-structure interactions

Procedia PDF Downloads 214
3563 Study of Climate Change Process on Hyrcanian Forests Using Dendroclimatology Indicators (Case Study of Guilan Province)

Authors: Farzad Shirzad, Bohlol Alijani, Mehry Akbary, Mohammad Saligheh

Abstract:

Climate change and global warming are very important issues today. The process of climate change, especially changes in temperature and precipitation, is the most important issue in the environmental sciences. Climate change means a change in the long-run averages. Iran is located in arid and semi-arid regions due to its proximity to the equator and its location in the subtropical high-pressure zone. In this respect, the Hyrcanian forest is a green necklace between the Caspian Sea and the south of the Alborz mountain range. At the forty-third session of UNESCO, it was registered as the second natural heritage site of Iran. Beech is one of the most important tree species and the most industrially used species of the Hyrcanian forests. In this research, dendroclimatology was applied using tree-ring widths and climatic data (temperature and precipitation) from the Shanderman meteorological station located in the study area. The non-parametric Mann-Kendall statistical method was used to investigate the trend of climate change over a 202-year time series of growth rings, and the Pearson method was used to correlate the growth rings of the beech trees with the climatic variables of the region. The results obtained from the time series of beech growth rings showed that the changes in the growth rings had a downward, negative trend, significant at the 5% level, indicating that climate change has occurred. The average minimum, mean, and maximum temperatures and the evaporation in the growing season had an increasing trend, and the annual precipitation had a decreasing trend. Using the Pearson method to fit the correlation of growth-ring width with the climate variables, the correlation with the average temperatures of July, August, and September was negative, the correlation with the average maximum temperature of February was positive and significant at the 95% level, and the correlation with precipitation in June was also positive and significant at the 95% level.
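
A compact sketch of the two statistical steps mentioned above: the Mann-Kendall S statistic for trend and the Pearson correlation between ring width and a climate variable. The series below are synthetic, not the Shanderman data.

```python
# Mann-Kendall S statistic (trend) and Pearson correlation (ring width vs. climate), synthetic data.
import numpy as np
from scipy import stats

def mann_kendall_s(series):
    """Mann-Kendall S: sum of signs of all pairwise later-minus-earlier differences."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        s += np.sign(series[i + 1:] - series[i]).sum()
    return s

rng = np.random.default_rng(3)
years = np.arange(202)
ring_width = 1.2 - 0.002 * years + rng.normal(scale=0.1, size=202)   # synthetic declining series
summer_temp = 20 + 0.01 * years + rng.normal(scale=0.5, size=202)    # synthetic warming series

print("Mann-Kendall S:", mann_kendall_s(ring_width))                 # negative S -> downward trend
r, p = stats.pearsonr(ring_width, summer_temp)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```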

Keywords: climate change, dendroclimatology, hyrcanian forest, beech

Procedia PDF Downloads 104
3562 Websites for Hypothesis Testing

Authors: Frantisek Mosna

Abstract:

E-learning has become an efficient and widespread means of education across all branches of human activity. Statistics is no exception. Unfortunately, the main focus in statistics teaching is usually on substituting values into formulas. Suitable web-sites can simplify and automate the calculations, allowing more attention and time for the basic principles of statistics, the mathematization of real-life situations and the subsequent interpretation of results. We introduce our own web-sites for hypothesis testing. Their didactic aspects, the technical possibilities of the individual tools used to create them, practical experience, and their advantages and disadvantages are discussed in this paper. These web-sites do not substitute for common statistical software but significantly improve the teaching of statistics at universities.

Keywords: e-learning, hypothesis testing, PHP, web-sites

Procedia PDF Downloads 422