Search results for: bivariate statistical techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10407

9987 Psychological Distress and Associated Factors among Patients Attending the Orthopedic Unit at Dilla University Referral Hospital in Ethiopia, 2022

Authors: Chalachew Kassaw, Henok Ababu, Bethelhem Sileshy, Lulu Abebe, Birhanie Mekuriaw

Abstract:

Background: Psychological distress is a state of emotional discomfort caused by everyday stressors and obligations that are difficult to manage. Orthopedic trauma has a wide range of effects on survivors' physical health, as well as a variety of mental health concerns that impede recovery. Psychiatric and behavioral conditions are 3-5 times more common in people who have undergone physical trauma, and they are a predictor of poor outcomes. Despite these facts, there is a shortage of research on the subject. Therefore, this study aimed to determine the magnitude of psychological distress and its associated factors among patients attending orthopedic treatment in the Gedeo zone, South Ethiopia, in 2022. Methods: A cross-sectional study was undertaken at Dilla University Referral Hospital from October to November 2022. The data were collected via face-to-face interviews, and the Kessler Psychological Distress Scale (K-10) was used to assess psychological distress. A total of 386 patients receiving outpatient and inpatient services at the orthopedic unit were chosen using a simple random selection technique. The Statistical Package for the Social Sciences version 21 (SPSS-21) was used to enter and evaluate the data. Bivariate and multivariate logistic regression were used to identify associated factors, and variables with a p-value of less than 0.05 were deemed statistically significant. Result: A total of 386 participants, with a response rate of 94.8%, were included in the study. Of all respondents, 114 (31.4%) had experienced psychological distress. Being female [adjusted odds ratio (AOR) = 5.8, 95% CI = 4.6-15.6], an average monthly income below 3,500 birr [AOR = 4.8, 95% CI = 2.4-9.8], a current history of substance use [AOR = 2.6, 95% CI = 1.66-4.7], strong social support [AOR = 0.4, 95% CI = 0.2-0.8], and poor sleep quality (PSQI score > 5) [AOR = 2.0, 95% CI = 1.2-2.8] were significantly associated with psychological distress. Conclusion: The prevalence of psychological distress was high. Being female, having poor social support, and having a high PSQI score were factors significantly associated with psychological distress. Clinicians should pay particular attention to orthopedic patients who are female, have poor social support, or report poor sleep quality.
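Below is a minimal sketch of the bivariable-then-multivariable logistic regression workflow described above. The study itself used SPSS-21; the synthetic data, effect sizes, variable names, and the p < 0.25 screening threshold are illustrative assumptions, not the study's data or results.

```python
# Illustrative sketch only: the study used SPSS-21. Synthetic data and thresholds are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 386
candidates = ["female", "low_income", "substance_use", "strong_support", "poor_sleep"]
df = pd.DataFrame({c: rng.integers(0, 2, n) for c in candidates})
logit = -1.5 + 1.2 * df["female"] + 0.9 * df["poor_sleep"] - 0.8 * df["strong_support"]
df["distress"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Step 1: bivariable screening, one candidate predictor at a time.
kept = [v for v in candidates
        if sm.Logit(df["distress"], sm.add_constant(df[[v]])).fit(disp=0).pvalues[v] < 0.25]

# Step 2: multivariable model with the retained predictors; report AORs with 95% CIs.
final = sm.Logit(df["distress"], sm.add_constant(df[kept])).fit(disp=0)
aor = np.exp(final.params)
ci = np.exp(final.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([aor.rename("AOR"), ci], axis=1).round(2))
```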

Keywords: psychological distress, orthopedic unit, Dilla University hospital, Dilla Town, Southern Ethiopia

Procedia PDF Downloads 88
9986 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation

Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski

Abstract:

Edgeworth approximation is one of the most important statistical methods and has made a considerable contribution to reducing the sum of the standard deviations of the independent variables' coefficients in a quantile regression model, which estimates the conditional median or other quantiles. In this paper, we have applied approximating statistical methods to an economic problem. We have created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products in real data taken from a local business. A linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is why we used this model instead of linear regression. Our aim is to analyze in more detail the relationship between the variables under study, the profit and the finalized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods we have applied in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphs and results presented here identify the best approximating model for our study.
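Below is a minimal sketch of the bootstrap step for a median (q = 0.5) regression, i.e., resampling the data to obtain the standard error of the slope. The data are synthetic, and the Edgeworth correction itself is not reproduced here.

```python
# Illustrative sketch: bootstrap standard error of the slope in a median (q = 0.5) regression.
# Synthetic data only; the paper's Edgeworth correction step is not reproduced.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
sales = rng.uniform(10, 100, n)
profit = 2.5 * sales + rng.standard_t(3, n) * 5        # heavy-tailed noise around the trend

X = sm.add_constant(sales)
fit = sm.QuantReg(profit, X).fit(q=0.5)

slopes = []
for _ in range(1000):                                   # nonparametric bootstrap over rows
    idx = rng.integers(0, n, n)
    b = sm.QuantReg(profit[idx], X[idx]).fit(q=0.5)
    slopes.append(b.params[1])
print("median-regression slope:", round(fit.params[1], 3),
      " bootstrap SE:", round(np.std(slopes, ddof=1), 4))
```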

Keywords: bootstrap, edgeworth approximation, IID, quantile

Procedia PDF Downloads 159
9985 Introduction of Robust Multivariate Process Capability Indices

Authors: Behrooz Khalilloo, Hamid Shahriari, Emad Roghanian

Abstract:

Process capability indices (PCIs) are important concepts in statistical quality control; they measure the capability of processes and how well processes meet certain specifications. An important issue in statistical quality control is parameter estimation. Under the assumption of multivariate normality, the distribution parameters, the mean vector and the variance-covariance matrix, must be estimated when they are unknown. Classic estimation methods such as the method of moments (MME) or maximum likelihood estimation (MLE) give good estimates of the population parameters when the data are not contaminated, but when outliers exist in the data, MME and MLE are weak estimators of the population parameters. Estimators that perform well in the presence of outliers are therefore needed. In this work, robust M-estimators are used to estimate these parameters, and based on the robust parameter estimators, robust process capability indices are introduced. The performance of these robust estimators in the presence of outliers and their effect on the process capability indices are evaluated using real and simulated multivariate data. The results indicate that the proposed robust capability indices perform much better than the existing process capability indices.
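The sketch below illustrates the underlying point that contamination distorts classical estimates of the mean vector and covariance matrix, while a robust estimator resists it. The Minimum Covariance Determinant estimator is used purely as a stand-in; the paper itself employs M-estimators, and no capability index is computed here.

```python
# Illustrative sketch: classical vs. robust estimates of the mean vector and covariance matrix
# under 5% contamination. MinCovDet (MCD) is a stand-in robust estimator; the paper uses
# M-estimators, which are not implemented here.
import numpy as np
from sklearn.covariance import EmpiricalCovariance, MinCovDet

rng = np.random.default_rng(1)
clean = rng.multivariate_normal([10.0, 5.0], [[1.0, 0.3], [0.3, 0.5]], size=95)
outliers = rng.multivariate_normal([20.0, 15.0], np.eye(2), size=5)   # 5% contamination
X = np.vstack([clean, outliers])

classical = EmpiricalCovariance().fit(X)
robust = MinCovDet(random_state=0).fit(X)
print("classical mean:", classical.location_)
print("robust mean:   ", robust.location_)
print("classical covariance:\n", classical.covariance_)
print("robust covariance:\n", robust.covariance_)
```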

Keywords: multivariate process capability indices, robust M-estimator, outlier, multivariate quality control, statistical quality control

Procedia PDF Downloads 283
9984 Strategic Investment in Infrastructure Development to Facilitate Economic Growth in the United States

Authors: Arkaprabha Bhattacharyya, Makarand Hastak

Abstract:

The COVID-19 pandemic is unprecedented in terms of its global reach and economic impacts. Historically, investment in infrastructure development projects has been touted as a way to boost the economic growth of a nation. The state and local governments responsible for delivering infrastructure assets work under tight budgets. Therefore, it is important to understand which infrastructure projects have the highest potential to boost economic growth in the post-pandemic era. This paper presents relationships between infrastructure projects and economic growth. Statistical relationships between investment in different types of infrastructure projects (transit, water and wastewater, highways, power, manufacturing, etc.) and indicators of economic growth are presented using historical data between 2002 and 2020 from the U.S. Census Bureau and the U.S. Bureau of Economic Analysis (BEA). The outcome of the paper is a comparison of the statistical correlations between investment in different types of infrastructure projects and indicators of economic growth, which is useful in ranking the types of infrastructure projects by their ability to influence economic prosperity. Investment in the higher-ranked infrastructure types therefore has a better chance of boosting economic growth. Once derived, the ranks can be used by decision-makers in the infrastructure investment decision-making process.
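Below is a minimal sketch of the ranking step: correlating each investment series with a growth indicator and ordering the categories. The data generated here are random placeholders, not Census Bureau or BEA figures.

```python
# Illustrative sketch: rank infrastructure categories by correlation with a growth indicator.
# Randomly generated placeholder data, not BEA/Census figures.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
years = range(2002, 2021)
categories = ["transit", "water_wastewater", "highways", "power", "manufacturing"]
df = pd.DataFrame(rng.normal(100, 15, (len(years), len(categories))),
                  columns=categories, index=years)
df["gdp_growth"] = 0.02 * df["highways"] + 0.01 * df["power"] + rng.normal(0, 1, len(df))

corr = df[categories].corrwith(df["gdp_growth"]).sort_values(ascending=False)
ranking = corr.rank(ascending=False).astype(int)
print(pd.DataFrame({"correlation": corr.round(2), "rank": ranking}))
```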

Keywords: economic growth, infrastructure development, infrastructure projects, strategic investment

Procedia PDF Downloads 171
9983 R Statistical Software Applied in Reliability Analysis: Case Study of Diesel Generator Fans

Authors: Jelena Vucicevic

Abstract:

Reliability analysis represents a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety, and monetary costs. There are established ways to calculate reliability, unreliability, failure density, and failure rate. This paper introduces another way of calculating reliability, using the R statistical software. R is a free software environment for statistical computing and graphics that compiles and runs on a wide variety of UNIX platforms, Windows, and macOS. The R programming environment is a widely used open-source system for statistical analysis and statistical programming. It includes thousands of functions for implementing both standard and new statistical methods, and it does not limit the user to these built-in functions. The program has many benefits over similar programs: it is free and, as open source, constantly updated; it has a built-in help system; and the R language is easy to extend with user-written functions. The significance of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this calculation is that no technical details are needed, and it can be applied to any component for which the time to failure must be known in order to plan appropriate maintenance, maximize usage, and minimize costs. In this case, the calculations have been made for diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with a higher-quality fan to prevent future failures. Seventy generators were studied. For each one, the number of hours of running time from its first being put into service until fan failure or until the end of the study (whichever came first) was recorded. The dataset consists of two variables: hours and status. Hours records each fan's running time, and status records the event: 1 = failed, 0 = censored. Censored data represent cases that could not be followed to failure, so the fan may have failed or survived after the study ended. Obtaining the result in R was easy and quick. The program takes the censored data into consideration and includes this in the results, which is not as easy in a hand calculation. For the purposes of the paper, results from the R program have been compared with hand calculations for two cases: censored data treated as failures and censored data treated as successes. In all three cases, the results are significantly different. If the user decides to use R for further calculations, its handling of censored data will give more precise results than hand calculation.
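The paper carries out the calculation in R; the sketch below shows the same kind of censored-data reliability (survival) estimate using Python's lifelines Kaplan-Meier estimator, with made-up values standing in for the 70-fan field data.

```python
# Illustrative sketch of a censored-data reliability estimate. The paper performs this in R;
# this equivalent uses Python's lifelines Kaplan-Meier estimator. Sample values are made up.
import pandas as pd
from lifelines import KaplanMeierFitter

data = pd.DataFrame({
    "hours":  [450, 460, 1150, 1150, 1560, 1600, 1660, 1850, 2030, 2070],
    "status": [1,   0,   1,    0,    0,    1,    0,    0,    1,    0],   # 1 = failed, 0 = censored
})

kmf = KaplanMeierFitter()
kmf.fit(durations=data["hours"], event_observed=data["status"])
print(kmf.survival_function_)                 # reliability R(t) at each observed time
print("estimated median time to failure:", kmf.median_survival_time_)
```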

Keywords: censored data, R statistical software, reliability analysis, time to failure

Procedia PDF Downloads 401
9982 Analysis of the Significance of Multimedia Channels Using Sparse PCA and Regularized SVD

Authors: Kourosh Modarresi

Abstract:

The abundance of media channels and devices has given users a variety of options to extract, discover, and explore information in the digital world. Since a typical user often ventures down a long and complicated path before taking any (significant) action (such as purchasing goods and services), it is critical to know how each node (media channel) in the user's path has contributed to the final action. In this work, the significance of each media channel is computed using statistical analysis and machine learning techniques. More specifically, regularized singular value decomposition and sparse principal component analysis have been used to compute the significance of each channel toward the final action. The results of this work are a considerable improvement over present approaches.
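Below is a minimal sketch of the two decompositions named above applied to a user-by-channel matrix. The matrix, its dimensions, and the reading of loadings as channel significance are illustrative assumptions, not the paper's attribution model.

```python
# Illustrative sketch: sparse PCA and a regularized (truncated) SVD on a user-by-channel
# touchpoint matrix. Channel counts are randomly generated placeholders.
import numpy as np
from sklearn.decomposition import SparsePCA, TruncatedSVD

rng = np.random.default_rng(2)
# rows = user journeys, columns = media channels (e.g., search, display, email, social)
X = rng.poisson(lam=[3.0, 1.0, 0.5, 2.0], size=(500, 4)).astype(float)

spca = SparsePCA(n_components=2, alpha=1.0, random_state=0).fit(X)
svd = TruncatedSVD(n_components=2, random_state=0).fit(X)

# Larger absolute loadings suggest channels that matter more for the dominant components.
print("sparse PCA loadings:\n", spca.components_.round(2))
print("truncated SVD loadings:\n", svd.components_.round(2))
print("SVD explained variance ratio:", svd.explained_variance_ratio_.round(3))
```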

Keywords: multimedia attribution, sparse principal component, regularization, singular value decomposition, feature significance, machine learning, linear systems, variable shrinkage

Procedia PDF Downloads 309
9981 Effect of Cost Control and Cost Reduction Techniques in Organizational Performance

Authors: Babatunde Akeem Lawal

Abstract:

In any organization, the primary aim is to maximize profit, but a major challenge is the increasing cost of operation. The resulting increase in the cost of production makes cost control and cost reduction schemes inevitable and makes it difficult for most organizations to operate at the cost-efficient frontier. The study aims to critically examine and evaluate the application of cost control and cost reduction to organizational performance, and also to review the budget as an effective tool of cost control and cost reduction. A descriptive survey research design was adopted. A total of 40 retrieved responses were used for the study. The collected data were analyzed with appropriate statistical tools; regression analysis in SPSS was used to test the hypotheses. Based on the findings, it was evident that cost control has a positive impact on organizational performance and that the style of management also has a positive impact on organizational performance.

Keywords: organization, cost reduction, cost control, performance, budget, profit

Procedia PDF Downloads 603
9980 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g. TA-lib, pandas-ta). The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides a road map for future development in FreqAI.
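As a rough sketch of the general idea only (this is not FreqAI's API): retrain a regressor on a sliding window of recent data and discard prediction points that fall outside the training parameter space. The feature handling, thresholds, and model choice below are illustrative assumptions.

```python
# Rough sketch of the general idea only; this is NOT FreqAI's API. A regressor is retrained on
# a sliding window of recent rows, and test rows far outside the training feature range are
# skipped as outliers. Feature names, thresholds, and the model choice are assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

def rolling_retrain(df, features, target, train_window=500, retrain_every=50):
    preds = pd.Series(index=df.index, dtype=float)
    for start in range(train_window, len(df) - retrain_every, retrain_every):
        train = df.iloc[start - train_window:start]
        test = df.iloc[start:start + retrain_every]
        model = GradientBoostingRegressor().fit(train[features], train[target])
        # crude "parameter space" guard: drop test rows outside the 1st-99th percentile range
        lo, hi = train[features].quantile(0.01), train[features].quantile(0.99)
        ok = test[features].apply(lambda col: col.between(lo[col.name], hi[col.name])).all(axis=1)
        sel = test[ok]
        if len(sel):
            preds.loc[sel.index] = model.predict(sel[features])
    return preds

# tiny synthetic usage example (random data, so the predictions carry no meaning)
rng = np.random.default_rng(0)
demo = pd.DataFrame(rng.normal(size=(1500, 3)), columns=["f1", "f2", "target"])
print(rolling_retrain(demo, ["f1", "f2"], "target").dropna().head())
```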

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 89
9979 Evaluating the Factors Controlling the Hydrochemistry of Gaza Coastal Aquifer Using Hydrochemical and Multivariate Statistical Analysis

Authors: Madhat Abu Al-Naeem, Ismail Yusoff, Ng Tham Fatt, Yatimah Alias

Abstract:

Groundwater in the Gaza Strip is increasingly exposed to anthropogenic and natural factors that seriously impact its quality. Physiochemical data on groundwater can offer important information on changes in groundwater quality, which can be useful in improving water management tactics. Integrative hydrochemical and statistical techniques (hierarchical cluster analysis (HCA) and factor analysis (FA)) have been applied to ten physiochemical parameters from 84 samples collected in 2000/2001, using the STATA, AquaChem, and Surfer software packages, to: 1) provide valuable insight into the salinization sources and the hydrochemical processes controlling the chemistry of the groundwater, and 2) differentiate the influence of natural processes from that of man-made activities. A large diversity of water facies was recorded, dominated by the Na-Cl type, which reveals a highly saline aquifer impacted by multiple complex hydrochemical processes. Based on WHO standards, only 15.5% of the wells were suitable for drinking. HCA yielded three clusters. Cluster 1 is the highest in salinity, mainly due to the impact of Eocene saline water invasion mixed with human inputs. Cluster 2 is the lowest in salinity, also due to Eocene saline water invasion but mixed with recent rainfall recharge, limited carbonate dissolution, and nitrate pollution. Cluster 3 is similar in salinity to Cluster 2 but shows a high diversity of facies due to the impact of many sources of salinity, such as seawater invasion, carbonate dissolution, and human inputs. Factor analysis yielded two factors accounting for 88% of the total variance. Factor 1 (59%) is a salinization factor demonstrating the mixing contribution of natural saline water with human inputs. Factor 2 measures hardness and pollution and explains 29% of the total variance. The negative relationship between NO3- and pH may reveal a denitrification process in a heavily polluted aquifer recharged by limited oxygenated rainfall. Multivariate statistical analysis combined with hydrochemical analysis indicates that the main factors controlling groundwater chemistry are Eocene saline invasion, seawater invasion, sewage invasion, and rainfall recharge, and that the main hydrochemical processes are base ion and reverse ion exchange with clay minerals (water-rock interactions), nitrification, carbonate dissolution, and a limited denitrification process.
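Below is a minimal sketch of the HCA and FA steps applied to standardized physiochemical variables. The ion list, the randomly generated stand-in data, and the Ward linkage choice are assumptions; the paper's own software chain (STATA, AquaChem, Surfer) is not reproduced.

```python
# Illustrative sketch: hierarchical cluster analysis and factor analysis on standardized
# physiochemical variables. Random stand-in data replace the 84 measured samples.
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

ions = ["Ca", "Mg", "Na", "K", "Cl", "SO4", "HCO3", "NO3"]
rng = np.random.default_rng(2)
df = pd.DataFrame(rng.lognormal(mean=3.0, sigma=0.6, size=(84, len(ions))), columns=ions)

Z = StandardScaler().fit_transform(df)
tree = linkage(Z, method="ward")                     # Ward linkage is an assumed choice
df["cluster"] = fcluster(tree, t=3, criterion="maxclust")

fa = FactorAnalysis(n_components=2, random_state=0).fit(Z)
loadings = pd.DataFrame(fa.components_.T, index=ions, columns=["Factor 1", "Factor 2"])
print(df["cluster"].value_counts().sort_index())
print(loadings.round(2))
```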

Keywords: dendrogram and cluster analysis, water facies, Eocene saline invasion and sea water invasion, nitrification and denitrification

Procedia PDF Downloads 365
9978 Generation of Quasi-Measurement Data for On-Line Process Data Analysis

Authors: Hyun-Woo Cho

Abstract:

To ensure the safety of a manufacturing process, one should quickly identify the assignable cause of a fault on an on-line basis. To this end, many statistical techniques, including linear and nonlinear methods, have been frequently utilized. However, such methods suffer from the major problem of small sample size, which is mostly attributable to the characteristics of the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks. Some quasi-measurement data are generated from existing data based on two indices, similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method is able to handle the insufficiency problem successfully. In addition, it is shown to be quite efficient in terms of computational speed and memory usage, so on-line implementation of the method for monitoring and diagnosis purposes is straightforward.

Keywords: data analysis, diagnosis, monitoring, process data, quality control

Procedia PDF Downloads 481
9977 Pattern Identification in Statistical Process Control Using Artificial Neural Networks

Authors: M. Pramila Devi, N. V. N. Indra Kiran

Abstract:

Control charts, predominantly in the form of the X-bar chart, are important tools in statistical process control (SPC). They are useful in determining whether a process is behaving as intended or whether there are unnatural causes of variation. A process is out of control if a point falls outside the control limits or if a series of points exhibits an unnatural pattern. In this paper, a study is carried out on four training algorithms for control chart pattern (CCP) recognition. The optimal structure is identified for each algorithm, and the algorithms are then compared in terms of type I and type II errors and generalization, with and without early stopping, and the best one is proposed.
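Below is a small sketch of training a neural-network classifier on synthetic control chart patterns with and without early stopping. The pattern generator and network size are simplified assumptions, not the paper's four training algorithms.

```python
# Illustrative sketch: a small neural-network classifier on synthetic control chart patterns,
# trained with and without early stopping. Patterns and network size are simplified assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)

def make_window(kind, n=32):
    t = np.arange(n)
    x = rng.normal(0, 1, n)              # in-control (natural) variation
    if kind == 1:
        x = x + 0.1 * t                  # upward trend pattern
    elif kind == 2:
        x = x + 2.0 * (t >= n // 2)      # sudden shift pattern
    return x

labels = rng.integers(0, 3, 3000)
X = np.array([make_window(k) for k in labels])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

for early in (False, True):
    net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000,
                        early_stopping=early, random_state=0)
    net.fit(X_tr, y_tr)
    print("early stopping:", early, "test accuracy:", round(net.score(X_te, y_te), 3))
```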

Keywords: control chart pattern recognition, neural network, backpropagation, generalization, early stopping

Procedia PDF Downloads 372
9976 Mean Monthly Rainfall Prediction at Benina Station Using Artificial Neural Networks

Authors: Hasan G. Elmazoghi, Aisha I. Alzayani, Lubna S. Bentaher

Abstract:

Rainfall is a highly non-linear phenomenon, which requires the application of powerful supervised data mining techniques for its accurate prediction. In this study, the artificial neural network (ANN) technique is used to predict the mean monthly historical rainfall data collected at the BENINA station in Benghazi over 31 years (1977-2006), and the results are compared against the observed values. The specific objective was to determine the best combination of weather variables to use as inputs for the ANN model. Several statistical parameters were calculated, and an uncertainty analysis for the results is also presented. The best ANN model is then applied to the data of one year (2007) as a case study in order to evaluate the performance of the model. Simulation results reveal that the application of the ANN technique is promising and can provide reliable estimates of rainfall.

Keywords: neural networks, rainfall, prediction, climatic variables

Procedia PDF Downloads 488
9975 Two-Stage Hospital Efficiency Analysis Including Qualitative Evidence: A Greek Case

Authors: Panos Xenos, Milton Nektarios, John Yfantopoulos

Abstract:

Background: Policy makers, professional organizations, and payers have introduced a variety of initiatives and reforms for health systems worldwide aimed at improving hospital efficiency. Their efforts concentrate on two main goals: constraining increasing healthcare costs and enhancing the quality of the services provided. Research Objectives: This study examines the efficiency of 112 Greek public hospitals for the year 2009, evaluates the importance of bootstrapping techniques, and investigates the effect of contextual factors on hospital efficiency. Furthermore, the effect of qualitative evidence on hospital efficiency is explored using data from 28 large hospitals. Methods: We applied Data Envelopment Analysis, augmented by bootstrapping techniques, to estimate efficiency scores. To measure the effect of environmental factors on hospital efficiency, we used Tobit regression analysis. The significance of our models is evaluated using statistical tests to compare distributions. Results: The Kolmogorov-Smirnov test between the original and the bootstrap-corrected efficiency scores indicates that their distributions are significantly different (p-value < 0.01). The environmental factors that seem to influence efficiency are the occupancy rate and the ratio of outpatient visits to inpatient days. The results indicate that the inclusion of the quality variable in DEA modelling generates statistically significant variations in efficiency scores (p-value < 0.05). Conclusions: The inclusion of quality variables and the use of bootstrap resampling in efficiency analysis have a statistically significant effect on the distribution of efficiency scores. As a policy conclusion, we highlight the importance of these methods for hospital efficiency analysis and, by implication, for healthcare resource allocation.
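Below is a compact sketch of an input-oriented CCR DEA score computed by linear programming, followed by the Kolmogorov-Smirnov comparison of two score distributions. The hospital inputs and outputs are synthetic, and the crude perturbation used for the second run only stands in for a proper DEA bootstrap (e.g., Simar-Wilson).

```python
# Illustrative sketch: input-oriented CCR DEA via linear programming plus a KS comparison of
# two score distributions. Synthetic inputs/outputs; the perturbation is NOT a real DEA bootstrap.
import numpy as np
from scipy.optimize import linprog
from scipy.stats import ks_2samp

def ccr_efficiency(X, Y):
    """X: (n, m) inputs, Y: (n, s) outputs; returns input-oriented efficiency per unit."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.zeros(n + 1)
        c[0] = 1.0                                         # minimize theta
        A_ub, b_ub = [], []
        for i in range(m):                                 # sum_j lam_j * x_ji <= theta * x_oi
            A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
            b_ub.append(0.0)
        for r in range(s):                                 # sum_j lam_j * y_jr >= y_or
            A_ub.append(np.concatenate(([0.0], -Y[:, r])))
            b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(None, None)] + [(0, None)] * n, method="highs")
        scores.append(res.x[0])
    return np.array(scores)

rng = np.random.default_rng(4)
X = rng.uniform(50, 200, (112, 2))                         # e.g., beds, staff
Y = X @ np.array([[0.6, 0.1], [0.2, 0.5]]) * rng.uniform(0.7, 1.0, (112, 2))
orig = ccr_efficiency(X, Y)
perturbed = ccr_efficiency(X * rng.uniform(0.95, 1.05, X.shape), Y)
print("KS test between the two score distributions:", ks_2samp(orig, perturbed))
```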

Keywords: hospitals, efficiency, quality, data envelopment analysis, Greek public hospital sector

Procedia PDF Downloads 309
9974 Review of Ultrasound Image Processing Techniques for Speckle Noise Reduction

Authors: Kwazikwenkosi Sikhakhane, Suvendi Rimer, Mpho Gololo, Khmaies Oahada, Adnan Abu-Mahfouz

Abstract:

Medical ultrasound imaging is a crucial diagnostic technique due to its affordability and non-invasiveness compared to other imaging methods. However, the presence of speckle noise, which is a form of multiplicative noise, poses a significant obstacle to obtaining clear and accurate images in ultrasound imaging. Speckle noise reduces image quality by decreasing contrast, resolution, and signal-to-noise ratio (SNR). This makes it difficult for medical professionals to interpret ultrasound images accurately. To address this issue, various techniques have been developed to reduce speckle noise in ultrasound images, which improves image quality. This paper aims to review some of these techniques, highlighting the advantages and disadvantages of each algorithm and identifying the scenarios in which they work most effectively.
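As one concrete example of the kind of technique such a review covers, the sketch below applies the classical Lee filter to a simulated speckled image; the phantom, window size, and noise model are illustrative assumptions.

```python
# Illustrative sketch: the classical Lee filter for multiplicative (speckle) noise, applied to a
# simulated phantom. Window size, noise model, and the global noise-variance estimate are assumptions.
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=7):
    """Local-statistics Lee filter: output = local mean + weight * (pixel - local mean)."""
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img * img, size)
    var = sq_mean - mean ** 2
    noise_var = np.mean(var)                         # crude global estimate of noise variance
    weight = var / (var + noise_var + 1e-12)
    return mean + weight * (img - mean)

rng = np.random.default_rng(5)
clean = np.zeros((128, 128))
clean[32:96, 32:96] = 1.0                                            # simple square phantom
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)  # multiplicative noise
filtered = lee_filter(speckled)
print("noisy std inside square:   ", speckled[40:88, 40:88].std().round(3))
print("filtered std inside square:", filtered[40:88, 40:88].std().round(3))
```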

Keywords: image processing, noise, speckle, ultrasound

Procedia PDF Downloads 110
9973 Chemical Variability in the Essential Oils from the Leaves and Buds of Syzygium Species

Authors: Rabia Waseem, Low Kah Hin, Najihah Mohamed Hashim

Abstract:

The variability in the chemical components of the essential oils of Syzygium species has been evaluated. The leaves of the Syzygium species were collected from Perak, Malaysia. The essential oils were extracted using the conventional hydro-distillation procedure and analyzed using a gas chromatography system coupled with mass spectrometry (GC-MS). Twenty-seven constituents were found in the Syzygium species, of which the major constituents include α-Pinene (3.94%), α-Thujene (2.16%), α-Terpineol (2.95%), g-Elemene (2.89%), and D-Limonene (14.59%). The aim of this study was to compare the evaluated data with the existing literature in order to substantiate the major variability through statistical analysis.

Keywords: chemotaxonomy, cluster analysis, essential oil, medicinal plants, statistical analysis

Procedia PDF Downloads 312
9972 Underrepresentation of Right Middle Cerebral Infarct: A Statistical Parametric Mapping

Authors: Wi-Sun Ryu, Eun-Kee Bae

Abstract:

Prior studies have shown that patients with right hemispheric stroke are less likely to seek medical care than those with left hemispheric stroke. However, the underlying mechanism for this phenomenon is unknown. In the present study, we generated lesion probability maps for patients with right and left middle cerebral artery infarcts and compared them statistically. We found that involvement of the precentral gyrus-Brodmann area 44, a language area in the left hemisphere, was significantly higher in patients with left hemispheric stroke. This finding suggests that language dysfunction makes left hemispheric stroke more noticeable, thereby bringing more of these patients to hospitals.

Keywords: cerebral infarct, brain MRI, statistical parametric mapping, middle cerebral infarct

Procedia PDF Downloads 338
9971 Resonant Fluorescence in a Two-Level Atom and the Terahertz Gap

Authors: Nikolai N. Bogolubov, Andrey V. Soldatov

Abstract:

Terahertz radiation occupies a range of frequencies from about 100 GHz to approximately 10 THz, just between microwaves and infrared waves. This range of frequencies holds promise for many useful applications in experimental applied physics and technology. At the same time, reliable, simple techniques for the generation, amplification, and modulation of electromagnetic radiation in this range are far from being developed enough to meet the requirements of practical usage, especially in comparison to the level of technological ability already achieved for other domains of the electromagnetic spectrum. This situation of relative underdevelopment of a potentially very important range of the electromagnetic spectrum is known as the 'terahertz gap.' Among other things, technological progress in the terahertz area has been impeded by the lack of compact, low-energy-consumption, easily controlled, and continuously radiating terahertz radiation sources. Therefore, the development of new techniques serving this purpose, as well as of various devices based on them, is an obvious necessity. No doubt, it would be highly advantageous to employ the simplest suitable physical systems as the major critical components in these techniques and devices. The purpose of the present research was to show, by means of conventional methods of non-equilibrium statistical mechanics and the theory of open quantum systems, that a thoroughly studied two-level quantum system, also known as a one-electron two-level 'atom', driven by an external classical monochromatic high-frequency (e.g., laser) field, can radiate continuously at a much lower (e.g., terahertz) frequency in the fluorescent regime if the transition dipole moment operator of this 'atom' possesses permanent, non-equal diagonal matrix elements. This assumption contradicts the conventional assumption routinely made in quantum optics that only the non-diagonal matrix elements persist. The conventional assumption is pertinent to natural atoms and molecules and stems from the spatial inversion symmetry of their eigenstates. At the same time, such an assumption is no longer justified for artificially manufactured quantum systems of reduced dimensionality, such as quantum dots, which are often nicknamed 'artificial atoms' due to the striking similarity of their optical properties to those of real atoms. Possible ways to experimentally observe and practically implement the predicted effect are also discussed.
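In illustrative notation (not taken from the paper), the assumption concerns a dipole operator of the form

```latex
% Illustrative notation, not the paper's: a two-level system with states |g>, |e> whose dipole
% operator has permanent, unequal diagonal elements alongside the usual transition element.
\hat{d} = d_{gg}\,\lvert g\rangle\langle g\rvert
        + d_{ee}\,\lvert e\rangle\langle e\rvert
        + d_{eg}\bigl(\lvert e\rangle\langle g\rvert + \lvert g\rangle\langle e\rvert\bigr),
\qquad d_{gg} \neq d_{ee}.
```

In the conventional treatment of natural atoms, spatial inversion symmetry forces the diagonal (permanent) elements to vanish; the claim above is that relaxing this assumption allows continuous fluorescent emission at a much lower (e.g., terahertz) frequency.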

Keywords: terahertz gap, two-level atom, resonant fluorescence, quantum dot

Procedia PDF Downloads 271
9970 Differential Effect of Technique Majors on Isokinetic Strength in Youth Judoka Athletes

Authors: Chungyu Chen, Yi-Cheng Chen, Po-Hsian Hsu, Hsin-Ying Chen, Yen-Po Hsiao

Abstract:

The purpose of this study was to assess the muscular strength performance of the upper and lower extremities on an isokinetic system for youth judo players, and also to compare the strength difference between major techniques. Sixteen male and 20 female judo players (age: 16.7 ± 1.6 years, training age: 4.5 ± 0.8 years) served as the volunteers for this study. Twenty-one players majored in hand techniques and 15 players in foot techniques. The Biodex S4 Pro was used to assess the concentric strength of the extensors and flexors of the elbow and knee joints under load conditions of 30 degrees/sec, 60 degrees/sec, and 120 degrees/sec. The strength parameters included the maximal torque, the normalized maximal torque, the average power, and the average maximal torque. A t-test for independent groups was used to evaluate whether the hand-major and foot-major groups differed significantly, with an alpha level of .05. The results showed that the maximal torque of the left knee extensor in foot-major players (243.5 ± 36.3 Nm) was significantly higher than in hand-major players (210.7 ± 21.0 Nm) under the 30 degrees/sec load (p < .05). There were no differences in upper extremity strength between the hand-major and foot-major players at the three loads (ps > .05). This indicates that judo players need to develop upper extremity strength overall to secure the execution of their major techniques.

Keywords: knee, elbow, power, judo

Procedia PDF Downloads 455
9969 Comparative Analysis of Edge Detection Techniques for Extracting Characters

Authors: Rana Gill, Chandandeep Kaur

Abstract:

Segmentation of images can be implemented using different fundamental algorithms such as edge detection (discontinuity-based segmentation), region growing (similarity-based segmentation), and iterative thresholding. A comprehensive literature review relevant to the study describes different techniques for vehicle number plate detection and the edge detection techniques widely used on different types of images. This research work is based on edge detection techniques and on calculating thresholds on the basis of five edge operators: Prewitt, Roberts, Sobel, LoG, and Canny. Segmentation of characters present in different types of images, such as vehicle number plates, house name plates, and sign boards, is selected as the case study in this work. The proposed methodology has seven stages. The proposed system has been implemented using MATLAB R2010a. All five operators have been compared on the basis of their performance. The results show that the Canny operator produces the best results among the operators used, and the performance of the edge operators in decreasing order is: Canny > LoG > Sobel > Prewitt > Roberts.
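Below is a minimal sketch applying the five operators to a sample image using scikit-image/SciPy implementations (the paper's own implementation is in MATLAB R2010a, and the simple threshold rule here is an assumption, not the paper's criterion).

```python
# Illustrative sketch: the five edge operators compared in the paper, applied to a sample image
# with scikit-image/SciPy. The threshold rule is a simple assumption, not the paper's method.
import numpy as np
from scipy.ndimage import gaussian_laplace
from skimage import data, feature, filters

img = data.camera().astype(float) / 255.0          # stand-in grayscale image

edges = {
    "Roberts": filters.roberts(img),
    "Prewitt": filters.prewitt(img),
    "Sobel":   filters.sobel(img),
    "LoG":     np.abs(gaussian_laplace(img, sigma=2)),
    "Canny":   feature.canny(img, sigma=2).astype(float),
}
for name, e in edges.items():
    # a simple global threshold on the response, as one possible comparison criterion
    print(name, "edge pixels:", int((e > e.mean() + 2 * e.std()).sum()))
```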

Keywords: segmentation, edge detection, text, extracting characters

Procedia PDF Downloads 426
9968 A Dynamic Equation for Downscaling Surface Air Temperature

Authors: Ch. Surawut, D. Sukawat

Abstract:

In order to utilize results from global climate models, dynamical and statistical downscaling techniques have been developed. For dynamical downscaling, a limited-area numerical model is usually used, with an associated high computational cost. This research proposes a dynamic equation for specific space-time regional climate downscaling from the Educational Global Climate Model (EdGCM) for Southeast Asia. The equation is for surface air temperature and provides downscaled values of surface air temperature at any specific location and time without running a regional climate model. In the proposed equation, surface air temperature is approximated from the ground temperature, the sensible heat flux, and the 2 m wind speed. Results from applying the equation show that its errors are smaller than the errors of direct interpolation from EdGCM.
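The abstract does not reproduce the equation itself; purely for orientation, a standard bulk-transfer relation linking the same three inputs to surface air temperature takes the form

```latex
% A standard bulk aerodynamic relation, shown only to indicate how T_a can be approximated
% from the listed inputs; it is an assumption about the general form, not the paper's equation.
H = \rho\, c_p\, C_H\, U\,(T_s - T_a)
\quad\Longrightarrow\quad
T_a \approx T_s - \frac{H}{\rho\, c_p\, C_H\, U},
```

where T_a is the surface air temperature, T_s the ground temperature, H the sensible heat flux, U the 2 m wind speed, ρ the air density, c_p the specific heat of air, and C_H a bulk transfer coefficient.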

Keywords: dynamic equation, downscaling, inverse distance weight interpolation

Procedia PDF Downloads 304
9967 Passive Retrofitting Strategies for Windows in Hot and Humid Climate Vijayawada

Authors: Monica Anumula

Abstract:

Nowadays, human beings attain their comfort zone artificially by heating, cooling, and lighting the spaces they live in; primary importance is given to the aesthetics of buildings, which are not designed to protect their occupants from the climate. Such buildings depend on artificial sources of energy, resulting in energy wastage. In order to reduce the amount of energy spent in the construction industry and to meet the Energy Package goals by 2020, new ways of constructing houses are required. The larger part of a building's energy consumption is directly related to architectural aspects; hence, nature has to be integrated into the building design to attain the comfort zone and reduce dependency on artificial sources of energy. This research develops bioclimatic design strategies and techniques for the walls and roofs of Vijayawada houses. Design strategies and techniques from various cases with a similar climate, such as Kerala and Mangalore, are studied and analyzed in this paper. Understanding the vernacular architecture and modern techniques of these cases and implementing them in the housing of Vijayawada not only decreases energy consumption but also enhances the socio-cultural values of Vijayawada. This study focuses on the comparison of vernacular techniques and modern bioclimatic building strategies to attain thermal comfort and energy reduction in a hot and humid climate. This research provides a basis for further thinking on new strategies that combine vernacular and modern bioclimatic techniques.

Keywords: bioclimatic design, energy consumption, hot and humid climates, thermal comfort

Procedia PDF Downloads 179
9966 Information Literacy Skills of Legal Practitioners in Khyber Pakhtunkhwa-Pakistan: An Empirical Study

Authors: Saeed Ullah Jan, Shaukat Ullah

Abstract:

Purpose of the study: The main theme of this study is to explore the information literacy skills of law practitioners in Khyber Pakhtunkhwa, Pakistan, under the heading "Information Literacy Skills of Legal Practitioners in Khyber Pakhtunkhwa-Pakistan: An Empirical Study." Research Method and Procedure: To conduct this quantitative study, a simple random sampling approach was used. An adapted questionnaire was distributed among 254 lawyers of Dera Ismail Khan through personal visits and electronic means, and the collected data were analyzed with SPSS (Statistical Package for the Social Sciences) software. Delimitations of the study: The study is delimited to the southern district of Khyber Pakhtunkhwa: Dera Ismail Khan. Key Findings: Most of the lawyers of District Dera Ismail Khan of Khyber Pakhtunkhwa can recognize and understand the information they need, and a large number of them are capable of presenting information in both written and electronic forms. However, they are not comfortable with the different legal databases or with using various search and keyword techniques, and they have little knowledge of Boolean operators for locating online information. Conclusion and Recommendations: Efforts should be made to arrange refresher courses and training workshops on the use of different legal databases and different search techniques for the retrieval of information sources. This practice will enhance the information literacy skills of lawyers, which will ultimately result in a better legal system in Pakistan. Practical implication(s): The findings of the study will motivate policymakers and the authorities of legal forums to restructure information literacy programs to fulfill the lawyers' information needs. Contribution to the knowledge: No significant work has been done on lawyers' information literacy skills in Khyber Pakhtunkhwa, Pakistan. This study will provide a clear picture of the information literacy skills of law practitioners and address the problems they face during the information seeking process.

Keywords: information literacy-Pakistan, information literacy-lawyers, information literacy-lawyers-KP, law practitioners-Pakistan

Procedia PDF Downloads 149
9965 Durrmeyer Type Modification of q-Generalized Bernstein Operators

Authors: Ruchi, A. M. Acu, Purshottam N. Agrawal

Abstract:

The purpose of this paper is to introduce the Durrmeyer type modification of the q-generalized Bernstein operators, which include the Bernstein polynomials in the particular case α = 0. We investigate the rate of convergence by means of the Lipschitz class and Peetre's K-functional. We also define the bivariate case of the Durrmeyer type modification of the q-generalized Bernstein operators and study the degree of approximation with the aid of the partial modulus of continuity and Peetre's K-functional. Finally, we introduce the GBS (Generalized Boolean Sum) of the Durrmeyer type modification of the q-generalized Bernstein operators and investigate the approximation of Bögel continuous and Bögel differentiable functions with the aid of the Lipschitz class and the mixed modulus of smoothness.

Keywords: Bögel continuous, Bögel differentiable, generalized Boolean sum, Peetre’s K-functional, Lipschitz class, mixed modulus of smoothness

Procedia PDF Downloads 213
9964 Evaluation of the Mechanical Behavior of a Retaining Wall Structure on a Weathered Soil through Probabilistic Methods

Authors: P. V. S. Mascarenhas, B. C. P. Albuquerque, D. J. F. Campos, L. L. Almeida, V. R. Domingues, L. C. S. M. Ozelim

Abstract:

Retaining slope structures are increasingly considered in geotechnical engineering projects due to extensive urban growth. These kinds of engineering constructions may present instabilities over time and may require reinforcement or even rebuilding of the structure. In this context, statistical analysis is an important tool for decision-making regarding retaining structures. This study addresses the failure probability of constructing a retaining wall over the debris of an old, collapsed one. The new solution will be approximately 350 m long and will be located on the margins of Lake Paranoá in Brasília, the capital of Brazil. The building process must also account for the use of the ruins as a caisson. A series of in situ and laboratory experiments defined the local soil strength parameters, and a Standard Penetration Test (SPT) defined the in situ soil stratigraphy. The parameters obtained were also verified using soil data from a collection of master's and doctoral works from the University of Brasília on soils similar to the local soil. Initial studies show that the concrete wall is the proper solution for this case, taking into account the technical, economic, and deterministic analyses. On the other hand, in order to better analyze the statistical significance of the factors of safety obtained, a Monte Carlo analysis was performed for the concrete wall and two other initial solutions. A comparison between the statistical and risk results generated for the different solutions indicated that a gabion solution would better fit the financial and technical feasibility of the project.
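Below is a minimal sketch of a Monte Carlo failure-probability estimate for a retaining wall limit state. The sliding limit state, parameter distributions, and values are hypothetical placeholders, not the project's actual model.

```python
# Illustrative sketch: Monte Carlo estimate of the probability of failure of a gravity retaining
# wall against sliding. Limit state, distributions, and values are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(6)
N = 100_000

phi = np.radians(rng.normal(30.0, 3.0, N))     # soil friction angle [deg -> rad]
gamma = rng.normal(18.0, 1.0, N)               # soil unit weight [kN/m^3]
W = 110.0                                      # wall weight per metre [kN/m], assumed deterministic
H = 4.0                                        # retained height [m]

Ka = np.tan(np.pi / 4 - phi / 2) ** 2          # Rankine active earth pressure coefficient
thrust = 0.5 * Ka * gamma * H ** 2             # active thrust per metre [kN/m]
resistance = W * np.tan(phi)                   # sliding resistance at the base [kN/m]

FS = resistance / thrust                       # factor of safety against sliding
print("mean FS:", FS.mean().round(2), " P(failure):", np.mean(FS < 1.0))
```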

Keywords: economical analysis, probability of failure, retaining walls, statistical analysis

Procedia PDF Downloads 406
9963 Discovery of the Piano Extended Techniques by Focusing on Symbols That George Crumb Used in Makrokosmos Volumes

Authors: Parham Bakhtiari

Abstract:

George Crumb's Makrokosmos volumes are considered significant pieces of twentieth-century piano music and showcase the extensive use of different tones and extended techniques on the piano. Crumb's works are known for making references, particularly to music from previous eras, and their visual, aural, and numerical characteristics are symbolic in nature. Crumb created a list of symbols and abbreviations to clarify his unique directions to those who perform his compositions. Pianists preparing to play Makrokosmos must dedicate time to studying and analyzing Crumb's markings diligently in order to capture the composer's wishes accurately. The aim of this paper is to provide a resource for pianists looking to perform George Crumb's compositions known as the Makrokosmos volumes. It describes the unconventional playing techniques and discusses the music explored by the composer.

Keywords: music, piano, Crumb, Makrokosmos, performance

Procedia PDF Downloads 47
9962 Software Quality Assurance in Network Security using Cryptographic Techniques

Authors: Sidra Shabbir, Ayesha Manzoor, Mehreen Sirshar

Abstract:

The use of network communication has imposed serious threats to the security of assets over the network. Network security is becoming more prone to active and passive attacks, which may result in serious consequences for data integrity, confidentiality, and availability. Various cryptographic techniques have been proposed in the past few years to combat this problem by ensuring quality, but in order to have a fully secured network, a framework of new cryptosystems is needed. This paper discusses certain cryptographic techniques that have shown considerable improvement in network security with enhanced quality assurance. The scope of this research paper is to cover the security pitfalls in current systems and their possible solutions based on the new cryptosystems. The development of new cryptosystem frameworks has paved the way for widespread network communication with enhanced quality in network security.

Keywords: cryptography, network security, encryption, decryption, integrity, confidentiality, security algorithms, elliptic curve cryptography

Procedia PDF Downloads 733
9961 Machine Learning Techniques in Seismic Risk Assessment of Structures

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of the seismic hazard and risk assessment of different types of structures. The first step is the development of ground-motion models, which are used to forecast ground-motion intensity measures (IMs), given the source characteristics, source-to-site distance, and local site condition for future events. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as the statistical method in ground motion prediction, such as artificial neural networks, random forests, and support vector machines. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method, with random forest in particular outperforming the other algorithms. However, the conventional method is the better tool when only limited data are available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between structural demand responses (e.g., component deformations, accelerations, internal forces, etc.) and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the probability of exceeding pre-defined damage limit states, and they therefore control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as artificial neural networks, random forests, and support vector machines are adopted and trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate predictions in a relatively shorter amount of time compared to conventional methods. Moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intense numerical response-history analyses.
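Below is a small sketch of the first comparison described above, a linear-regression ground-motion model versus a random forest evaluated by cross-validation on synthetic records; the attenuation form used to generate the data is a made-up placeholder, not a published model.

```python
# Illustrative sketch: linear-regression vs. random-forest ground-motion models compared by
# cross-validation. The synthetic attenuation relation below is a placeholder, not a real model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 2000
mag = rng.uniform(4.0, 7.5, n)                     # magnitude
dist = rng.uniform(5.0, 200.0, n)                  # source-to-site distance [km]
vs30 = rng.uniform(180.0, 760.0, n)                # site condition proxy [m/s]
ln_pga = (1.2 * mag - 1.6 * np.log(dist + 10.0)
          - 0.4 * np.log(vs30 / 760.0) + rng.normal(0, 0.5, n))   # synthetic ln(PGA)

X = np.column_stack([mag, np.log(dist + 10.0), np.log(vs30)])
for name, model in [("linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(n_estimators=200, random_state=0))]:
    r2 = cross_val_score(model, X, ln_pga, cv=5, scoring="r2").mean()
    print(name, "cross-validated R^2:", round(r2, 3))
```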

Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine

Procedia PDF Downloads 106
9960 Aerobic Bioprocess Control Using Artificial Intelligence Techniques

Authors: M. Caramihai, Irina Severin

Abstract:

This paper deals with the design of an intelligent control structure for a bioprocess of Hansenula polymorpha yeast cultivation. The objective of the process control is to produce biomass in a desired physiological state. The work demonstrates that the designed hybrid control techniques (HCT) are able to recognize specific bioprocess evolution trajectories using neural networks trained specifically for this purpose, in order to estimate the model parameters and to adjust the overall bioprocess evolution through an expert system and a fuzzy structure. The design of the control algorithm, as well as its tuning through realistic simulations, is presented. Taking into consideration the synergism of different paradigms such as fuzzy logic, neural networks, and symbolic artificial intelligence (AI), we present a fully realized intelligent control architecture with application to bioprocess control.

Keywords: bioprocess, intelligent control, neural nets, fuzzy structure, hybrid techniques

Procedia PDF Downloads 421
9959 Teaching, Learning and Evaluation Enhancement of Information Communication Technology Education in Schools through Pedagogical and E-Learning Techniques in the Sri Lankan Context

Authors: M. G. N. A. S. Fernando

Abstract:

This study uses a research framework to improve the quality of ICT education and the teaching, learning, and assessment/evaluation (TLA/TLE) process. It utilizes existing resources while improving the methodologies, along with the pedagogical techniques and e-learning approaches, used in the secondary schools of Sri Lanka. The study was carried out in two phases. Phase I focused on investigating the factors that affect the quality of ICT education. Based on the key factors identified in Phase I, Phase II focused on the design of an experimental application model with six activity levels. Each level in the activity model covers one or more levels in the revised Bloom's taxonomy. To further enhance the activity levels, other pedagogical techniques (activity-based learning, e-learning techniques, problem-solving activities, peer discussions, etc.) were incorporated into each level of the activity model as appropriate. The application model was validated by a panel of teachers, including a domain expert, and was also tested in the school environment. The validity of performance was demonstrated by testing six hypotheses and using other methodologies. The analysis shows that student performance on problem-solving activities increased by 19.5% due to the different treatment levels used. Compared with the existing process, it was also shown that the embedded techniques (a mixture of traditional and modern pedagogical methods and their applications) are more effective for the skills development of teachers and students.

Keywords: activity models, Bloom’s taxonomy, ICT education, pedagogies

Procedia PDF Downloads 163
9958 Simple Procedure for Probability Calculation of Tensile Crack Occurring in Rigid Pavement: A Case Study

Authors: Aleš Florian, Lenka Ševelová, Jaroslav Žák

Abstract:

The formation of tensile cracks in the concrete slabs of rigid pavement can be (among other things) the initiation point of other, more serious failures, which can ultimately lead to the complete degradation of the concrete slab and thus of the whole pavement. Two measures can be used for the reliability assessment of this phenomenon: the probability of failure and/or the reliability index. Different methods can be used for their calculation; the simple ones are the moment methods and simulation techniques. Two methods, the FOSM method and the simple random sampling method, are verified and compared. The influence of information about the probability distributions and statistical parameters of the input variables, as well as of the limit state function, on the calculated reliability index and failure probability is studied at three points on the lower surface of the concrete slabs of the older type of rigid pavement formerly used in the Czech Republic.
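Below is a minimal sketch of both approaches for a generic limit state g = R − S (tensile strength minus tensile stress); the distributions and parameter values are placeholders, not the pavement stress model of the paper.

```python
# Illustrative sketch: reliability index and failure probability for a generic limit state
# g = R - S by the mean-value FOSM method and by simple random sampling. The limit state,
# distributions, and values are placeholders, not the pavement stress model of the paper.
import numpy as np
from scipy.stats import norm

def g(R, S):                       # limit state: tensile strength minus tensile stress
    return R - S

mu = np.array([4.5, 3.0])          # means of R [MPa] and S [MPa]
sigma = np.array([0.45, 0.60])     # standard deviations (independent variables assumed)

# Mean-value FOSM: linearize g at the means using finite-difference gradients.
grad = np.array([(g(mu[0] + 1e-6, mu[1]) - g(*mu)) / 1e-6,
                 (g(mu[0], mu[1] + 1e-6) - g(*mu)) / 1e-6])
beta = g(*mu) / np.sqrt(np.sum((grad * sigma) ** 2))
print("FOSM: beta =", round(beta, 3), " Pf =", norm.cdf(-beta))

# Simple random sampling (Monte Carlo) with normally distributed input variables.
rng = np.random.default_rng(8)
R = rng.normal(mu[0], sigma[0], 1_000_000)
S = rng.normal(mu[1], sigma[1], 1_000_000)
print("SRS:  Pf =", np.mean(g(R, S) < 0))
```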

Keywords: failure, pavement, probability, reliability index, simulation, tensile crack

Procedia PDF Downloads 546