Search results for: bivariate statistical techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10024

9634 Energy Management Techniques in Mobile Robots

Authors: G. Gurguze, I. Turkoglu

Abstract:

Today, technological tools with limited energy resources make it necessary to use energy efficiently, and energy management techniques have emerged for this purpose. As in every field, energy management is vital for robots, which are used in many areas from industry to daily life and are expected to occupy even more in the future. In particular, effective power management in autonomous and multi-robot systems, which are becoming more complex and numerous by the day, will improve their performance and success. In this study, robot management algorithms, the use of renewable and hybrid energy sources, robot motion patterns, robot designs, workload-sharing strategies among multiple robots, and path- and mission-planning algorithms are discussed for the efficient use of energy resources by mobile robots. These techniques are evaluated in terms of the efficient use of existing energy resources and energy management in robots.

Keywords: energy management, mobile robot, robot administration, robot management, robot planning

Procedia PDF Downloads 245
9633 The Effect of Artificial Intelligence on Decoration Designs

Authors: Ayed Mouris Gad Elsayed Khalil

Abstract:

This research focuses on historical techniques associated with the Lajevardin and Haft-Rangi methods of tile production, with particular attention to identifying techniques for applying gold leaf to the surface of these historical glazed tiles. In this context, the history of the production of glazed and gilded Lajevardin ceramics from the Khwarizmanshahid and Mongol periods (11th to 13th centuries) was first reviewed in order to better understand the context and history of historical enameling methods. After a historical overview of glazed ceramic production techniques and their adoption across civilizations, we focused on the niche production methods of glazes and Lajevardin glazes, two categories of decoration commonly found on tiles. A general method for classifying the different types of gold tiles was then introduced, applicable to tiles up to the Safavid period (16th-17th centuries). These categories include gold glazed Lajevardina tiles, Haft-Rangi gold tiles, gold glazed monolithic tiles, and gold mosaic tiles.

Keywords: ethnicity, multi-cultural, jewelry, craft technique, Mycenaean, ceramic, provenance, pigment, Amorium, glass bracelets, image, Byzantine empire

Procedia PDF Downloads 33
9632 Assessment of Work-Related Stress and Its Predictors in Ethiopian Federal Bureau of Investigation in Addis Ababa

Authors: Zelalem Markos Borko

Abstract:

Work-related stress is a reaction that occurs when work pressure becomes excessive. Unless properly managed, stress leads to high employee turnover, decreased performance, illness, and absenteeism. Yet little has been reported on work-related stress and its predictors in the study area, so the objective of this study was to assess stress prevalence and its predictors there. To that effect, a cross-sectional study was conducted on 281 employees of the Ethiopian Federal Bureau of Investigation selected by stratified random sampling. Survey questionnaire scales were employed to collect data, which were analyzed using percentages, Pearson correlation coefficients, simple linear regression, multiple linear regression, independent t-tests, and one-way ANOVA. In the present study, 13.9% of participants faced high stress, 13.5% faced low stress, and the remaining 72.6% experienced moderate stress. There was no significant group difference among workers by age, gender, marital status, educational level, years of service, or police rank. The study concludes that role conflict, performance over-utilization, role ambiguity, and qualitative and quantitative role overload together predict 39.6% of work-related stress; the remaining 60.4% of the variation is explained by other factors, so further research is needed to identify them. To prevent occupational stress among police, the Ethiopian Federal Bureau of Investigation should develop stress-management strategies based on these factors.
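
As a rough illustration of the multiple-regression step described above, here is a minimal sketch in Python with statsmodels; the file and column names are hypothetical, not taken from the study.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical column names; the actual survey items in the study may differ.
df = pd.read_csv("stress_survey.csv")
predictors = ["role_conflict", "role_ambiguity",
              "qual_overload", "quant_overload", "over_utilization"]

X = sm.add_constant(df[predictors])
model = sm.OLS(df["stress_score"], X).fit()

print(model.summary())    # coefficient and p-value for each predictor
print(model.rsquared)     # share of variance explained (~0.396 in the study)
```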

Keywords: work-related stress, Ethiopian federal bureau of investigation, predictors, Addis Ababa

Procedia PDF Downloads 47
9631 Modern Pedagogy Techniques for DC Motor Speed Control

Authors: Rajesh Kumar, Roopali Dogra, Puneet Aggarwal

Abstract:

Based on a survey of second- and third-year students of the electrical engineering department at Maharishi Markandeshwar University, India, it was found that around 92% of students felt it would be better to introduce a virtual environment for laboratory experiments. Hence, a need was felt to adopt modern pedagogy techniques consisting of a virtual environment built in MATLAB/Simulink. In this paper, a virtual environment for the speed control of a DC motor is implemented in MATLAB/Simulink. The speed control methods covered include the field resistance control method and the armature voltage control method, and the performance of the DC motor under each method is analyzed.
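
The study itself uses MATLAB/Simulink; as a language-neutral sketch of what the armature voltage control experiment demonstrates, the following Python fragment Euler-integrates a standard DC motor model. All parameter values are illustrative assumptions, not figures from the paper.

```python
import numpy as np

# Assumed (illustrative) motor parameters.
R, L = 1.0, 0.5        # armature resistance (ohm) and inductance (H)
K = 0.01               # torque / back-EMF constant
J, B = 0.01, 0.1       # rotor inertia and viscous friction

def simulate(V, t_end=3.0, dt=1e-4):
    """Euler integration of the armature-controlled DC motor model."""
    i, w = 0.0, 0.0
    for _ in np.arange(0.0, t_end, dt):
        di = (V - R * i - K * w) / L      # armature circuit equation
        dw = (K * i - B * w) / J          # mechanical equation
        i, w = i + di * dt, w + dw * dt
    return w                              # near steady-state speed (rad/s)

for V in (50, 100, 150):                  # armature voltage control:
    print(V, round(simulate(V), 1))       # speed rises with applied voltage
```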

Keywords: DC Motor, field control, pedagogy techniques, speed control, virtual environment, voltage control

Procedia PDF Downloads 411
9630 Classification of Digital Chest Radiographs Using Image Processing Techniques to Aid in Diagnosis of Pulmonary Tuberculosis

Authors: A. J. S. P. Nileema, S. Kulatunga, S. H. Palihawadana

Abstract:

A computer-aided detection (CAD) system was developed for the diagnosis of pulmonary tuberculosis (PTB) from digital chest X-rays using MATLAB image processing techniques and a statistical approach. The study comprised 200 digital chest radiographs collected from the National Hospital for Respiratory Diseases, Welisara, Sri Lanka. Pre-processing was done to remove identification details. Lung fields were segmented and then divided into four quadrants (right upper, left upper, right lower, and left lower) using image processing techniques in MATLAB. Contrast, correlation, homogeneity, energy, entropy, and maximum probability texture features were extracted using the gray level co-occurrence matrix method. Descriptive statistics and normal distribution analysis were performed using SPSS. Based on the radiologists' interpretation, chest radiographs were classified manually into PTB-positive (PTBP) and PTB-negative (PTBN) classes. Features with a standard normal distribution were compared between PTBP and PTBN radiographs using an independent-sample t-test. Among the six features tested, contrast, correlation, energy, entropy, and maximum probability showed a statistically significant difference between the two classes at the 95% confidence level and could therefore be used in classifying chest radiographs for PTB diagnosis. Using the value ranges of these five texture features, a classification algorithm was defined to recognize and classify the quadrant images: if the texture feature values of a quadrant fall within the defined region, the quadrant is identified as PTBP and labeled 'Abnormal' in red with its border highlighted in red; if they fall outside the defined range, it is identified as PTBN and labeled 'Normal' in blue, with no change to the image outline. The developed classification algorithm showed a high sensitivity of 92% and a modest specificity of 70%, making it an efficient CAD system.
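
The study extracts its texture features in MATLAB; a minimal equivalent sketch in Python with scikit-image is shown below. Entropy and maximum probability are not provided by graycoprops, so they are computed directly from the normalized co-occurrence matrix. The quantization to 64 gray levels and the single distance/angle are assumptions for illustration.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(quadrant, levels=64):
    """Six GLCM texture features for one lung-field quadrant (2-D uint8 array)."""
    q = (quadrant // (256 // levels)).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    feats = {p: graycoprops(glcm, p)[0, 0]
             for p in ("contrast", "correlation", "homogeneity", "energy")}
    p = glcm[:, :, 0, 0]                      # normalized co-occurrence matrix
    feats["entropy"] = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    feats["max_probability"] = p.max()
    return feats
```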

Keywords: chest radiographs, computer aided detection, image processing, pulmonary tuberculosis

Procedia PDF Downloads 97
9629 Urbanization and Income Inequality in Thailand

Authors: Acumsiri Tantikarnpanit

Abstract:

This paper examines the relationship between urbanization and income inequality in Thailand during the period 2002-2020, using a panel of data for 76 provinces collected from Thailand's National Statistical Office (Labor Force Survey, LFS) together with geospatial data from the U.S. Air Force Defense Meteorological Satellite Program (DMSP) and the Visible Infrared Imaging Radiometer Suite Day/Night Band (VIIRS-DNB) satellite for nineteen selected years. Two different definitions are employed to identify urban areas: 1) urban areas as defined by the National Statistical Office (LFS), and 2) urban areas estimated from DMSP and VIIRS-DNB nighttime light data. The second method has two sub-categories: 2.1) delimiting urban areas from nighttime light density corresponding to a population density of 300 people per square kilometer, and 2.2) delimiting them from nighttime light density corresponding to a population density of 1,500 people per square kilometer. The empirical analysis, based on Ordinary Least Squares (OLS), fixed effects, and random effects models, reveals a consistent U-shaped relationship between income inequality and urbanization: urbanization (population density) has a significant negative impact on income inequality, while the square of urbanization has a statistically significant positive impact. There is also a negative association between log-transformed income and income inequality. The paper further proposes that future studies incorporate satellite imagery, geospatial data, and spatial econometric techniques for quantitative analysis of spatial relationships.
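
The U-shape is detected with a quadratic specification; a minimal fixed-effects sketch in Python (statsmodels, province dummies) follows. The file and column names are hypothetical, not the paper's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical province-year panel: gini, urban_share, income, province, year.
panel = pd.read_csv("thailand_panel.csv")
panel["urban_sq"] = panel["urban_share"] ** 2

# Province fixed effects via dummies; a U-shape appears as a negative
# coefficient on urban_share and a positive one on its square.
fe = smf.ols("gini ~ urban_share + urban_sq + np.log(income) + C(province)",
             data=panel).fit()
print(fe.params[["urban_share", "urban_sq"]])
```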

Keywords: income inequality, nighttime light, population density, Thailand, urbanization

Procedia PDF Downloads 54
9628 Tracing Sources of Sediment in an Arid River, Southern Iran

Authors: Hesam Gholami

Abstract:

Elevated suspended sediment loads in riverine systems resulting from accelerated erosion due to human activities are a serious threat to the sustainable management of watersheds and ecosystem services therein worldwide. Therefore, mitigation of deleterious sediment effects as a distributed or non-point pollution source in the catchments requires reliable provenance information. Sediment tracing or sediment fingerprinting, as a combined process consisting of sampling, laboratory measurements, different statistical tests, and the application of mixing or unmixing models, is a useful technique for discriminating the sources of sediments. From 1996 to the present, different aspects of this technique, such as grouping the sources (spatial and individual sources), discriminating the potential sources by different statistical techniques, and modification of mixing and unmixing models, have been introduced and modified by many researchers worldwide, and have been applied to identify the provenance of fine materials in agricultural, rural, mountainous, and coastal catchments, and in large catchments with numerous lakes and reservoirs. In the last two decades, efforts exploring the uncertainties associated with sediment fingerprinting results have attracted increasing attention. The frameworks used to quantify the uncertainty associated with fingerprinting estimates can be divided into three groups comprising Monte Carlo simulation, Bayesian approaches and generalized likelihood uncertainty estimation (GLUE). Given the above background, the primary goal of this study was to apply geochemical fingerprinting within the GLUE framework in the estimation of sub-basin spatial sediment source contributions in the arid Mehran River catchment in southern Iran, which drains into the Persian Gulf. The accuracy of GLUE predictions generated using four different sets of statistical tests for discriminating three sub-basin spatial sources was evaluated using 10 virtual sediments (VS) samples with known source contributions using the root mean square error (RMSE) and mean absolute error (MAE). Based on the results, the contributions modeled by GLUE for the western, central and eastern sub-basins are 1-42% (overall mean 20%), 0.5-30% (overall mean 12%) and 55-84% (overall mean 68%), respectively. According to the mean absolute fit (MAF; ≥ 95% for all target sediment samples) and goodness-of-fit (GOF; ≥ 99% for all samples), our suggested modeling approach is an accurate technique to quantify the source of sediments in the catchments. Overall, the estimated source proportions can help watershed engineers plan the targeting of conservation programs for soil and water resources.
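
The GLUE step can be illustrated schematically: sample candidate source proportions, score each candidate against the target sediment's tracer signature, and retain the best-scoring ("behavioural") sets. The sketch below uses fabricated tracer values and a simple likelihood proxy, not the study's actual geochemical data or its four statistical test sets.

```python
import numpy as np

rng = np.random.default_rng(0)

def glue_unmix(source_means, target, n_draws=100_000, keep=0.01):
    """source_means: (3 sources x k tracers); target: (k,) tracer vector."""
    props = rng.dirichlet(np.ones(source_means.shape[0]), size=n_draws)
    predicted = props @ source_means                    # mixture prediction
    sse = (((predicted - target) / target) ** 2).sum(axis=1)
    score = 1.0 / sse                                   # crude likelihood proxy
    behavioural = props[np.argsort(score)[-int(keep * n_draws):]]
    return behavioural.mean(axis=0), behavioural        # mean + retained sets

# Fabricated tracer means for west/central/east sub-basin sources:
sources = np.array([[12.0, 3.1, 40.0],
                    [ 9.5, 4.2, 55.0],
                    [15.0, 2.0, 30.0]])
target = np.array([13.6, 2.5, 35.0])
mean_contrib, sets = glue_unmix(sources, target)
print(mean_contrib)    # estimated proportions for the three sub-basins
```

Virtual-mixture validation, as used in the study, amounts to running the same routine on targets built from known proportions and comparing the output with RMSE/MAE.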

Keywords: sediment source tracing, generalized likelihood uncertainty estimation, virtual sediment mixtures, Iran

Procedia PDF Downloads 48
9627 Modelling the Effects of External Factors Affecting Concrete Carbonation

Authors: Abhishek Mangal, Kunal Tongaria, S. Mandal, Devendra Mohan

Abstract:

Carbonation of reinforced concrete structures has emerged as one of the major challenges for civil engineers across the world. With increasing emissions from various activities, the carbon dioxide concentration in the atmosphere has been ever-rising, enhancing its penetration into porous concrete, where it reaches the steel bars and ultimately leads to premature failure. Numerous studies have been published on the various interdependent variables related to carbonation; however, given their innumerable variability, generalizing these data is a troublesome task. This paper examines carbonation in concrete structures as driven by external variables such as relative humidity, CO2 concentration, curing period, and ambient temperature. Discussions and comparisons are presented on the basis of various studies, with the aim of predicting the depth of carbonation as a function of these multidimensional parameters using numerical and statistical modelling techniques.
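
A common starting point for such models is the square-root-of-time law x = k sqrt(t), with the carbonation coefficient k regressed on the exposure variables. The sketch below fits one such illustrative form with SciPy; the functional form, file name, and columns are assumptions for demonstration, not the paper's model.

```python
import numpy as np
from scipy.optimize import curve_fit

def carbonation_depth(X, a, b, c, d):
    """x = k * sqrt(t); k modelled from exposure variables (illustrative form)."""
    t, co2, rh, temp = X
    k = a + b * co2 + c * rh * (1 - rh) + d * temp   # RH effect peaks mid-range
    return k * np.sqrt(t)

# Columns: time (yr), CO2 (%), relative humidity (0-1), temperature (degC), depth (mm)
t, co2, rh, temp, depth = np.loadtxt("carbonation.csv",
                                     delimiter=",", unpack=True)
params, _ = curve_fit(carbonation_depth, (t, co2, rh, temp), depth)
print(params)    # fitted sensitivity of k to each exposure variable
```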

Keywords: carbonation, curing, exposure conditions, relative humidity

Procedia PDF Downloads 232
9626 Modelling High-Frequency Crude Oil Dynamics Using Affine and Non-Affine Jump-Diffusion Models

Authors: Katja Ignatieva, Patrick Wong

Abstract:

We investigate the dynamics of high-frequency energy prices, including crude oil and electricity prices. The returns of the underlying quantities are modelled using parametric specifications, such as a stochastic volatility framework with correlated jumps (SVCJ), as well as non-parametric alternatives, which are purely data-driven and do not require specification of the drift or the diffusion coefficient function. Using different statistical criteria, we investigate the ability of the considered parametric and non-parametric models to forecast price series and volatilities. Our models incorporate possible seasonalities in the underlying dynamics and utilise advanced estimation techniques for the dynamics of energy prices.
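
To make the parametric side concrete, here is a minimal Euler-discretisation sketch of a stochastic-volatility model with jumps in returns (the SV/SVCJ family). All parameter values are fabricated for illustration, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters only.
kappa, theta, xi = 3.0, 0.04, 0.3     # variance mean reversion, level, vol-of-vol
rho = -0.5                            # return/variance shock correlation
lam, mu_j, sig_j = 0.5, -0.02, 0.05   # jump intensity (per yr) and jump-size law

def simulate_sv_jumps(n_steps=10_000, dt=1 / (252 * 24)):   # hourly steps
    v, logp = theta, 0.0
    path = np.empty(n_steps)
    for k in range(n_steps):
        z1 = rng.standard_normal()
        z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal()
        jump = rng.normal(mu_j, sig_j) if rng.random() < lam * dt else 0.0
        logp += -0.5 * v * dt + np.sqrt(max(v, 0.0) * dt) * z1 + jump
        v += kappa * (theta - v) * dt + xi * np.sqrt(max(v, 0.0) * dt) * z2
        path[k] = logp
    return np.exp(path)

prices = simulate_sv_jumps()
```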

Keywords: stochastic volatility, affine jump-diffusion models, high frequency data, model specification, markov chain monte carlo

Procedia PDF Downloads 76
9625 Statistical Shape Analysis of the Human Upper Airway

Authors: Ramkumar Gunasekaran, John Cater, Vinod Suresh, Haribalan Kumar

Abstract:

The main objective of this project is to develop a statistical shape model, using principal component analysis (PCA), for analyzing the shape of the human airway; the ultimate goal is to identify geometric risk factors for the diagnosis and management of Obstructive Sleep Apnoea (OSA). Anonymous CBCT scans of 25 individuals were obtained from the Otago Radiology Group. The airways were segmented between the hard palate and the aryepiglottic fold using snake active-contour segmentation. The point cloud of the segmented images was then fitted with a bi-cubic mesh, and pseudo-landmarks were placed to perform PCA on the segmented airway, analyze its shape, and find the relationship between shape and OSA risk factors. From the PCA results, the first four modes of variation were found to be significant: Mode 1 was interpreted as the overall length of the airway, Mode 2 related to the anterior-posterior width of the retroglossal region, Mode 3 to the lateral dimension of the oropharyngeal region, and Mode 4 to the anterior-posterior width of the oropharyngeal region. All of these regions are associated with OSA risk factors.
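
A minimal sketch of the PCA step on aligned pseudo-landmark coordinates, using scikit-learn; the landmark file is hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA

# shapes: (25 subjects x m pseudo-landmarks x 3 coordinates), already aligned.
shapes = np.load("airway_landmarks.npy")          # hypothetical file
X = shapes.reshape(len(shapes), -1)               # one flattened row per subject

pca = PCA(n_components=4).fit(X)                  # first four modes significant
print(pca.explained_variance_ratio_)

# Synthesize a shape 2 standard deviations along mode 1 (overall airway length):
mode = 0
synth = pca.mean_ + 2 * np.sqrt(pca.explained_variance_[mode]) * pca.components_[mode]
synth = synth.reshape(-1, 3)
```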

Keywords: medical imaging, image processing, FEM/BEM, statistical modelling

Procedia PDF Downloads 485
9624 Co-Factors of Hypertension and Decomposition of Inequalities in Its Prevalence in India: Evidence from NFHS-4

Authors: Ayantika Biswas

Abstract:

Hypertension remains one of the most important preventable contributors to adult mortality and morbidity and a major public health challenge worldwide. Studying regional and rural-urban differences in its prevalence, and assessing the contributions of different indicators, is essential to determining the drivers of the condition. The 2015-16 National Family Health Survey data have been used for the study. Bivariate analysis, multinomial regression analysis, and concentration indices with their decomposition have been employed to assess the contribution of individual factors to inequality. An overall concentration index of 0.003 was found for the hypertensive population, indicating concentration among the richer wealth quintiles. Factors such as age 45-49 years and 5-9 years of schooling are important contributors to the inequality in hypertension occurrence. Further studies should seek approaches to prevent or delay the onset of the condition.
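
The concentration index itself is simple to compute; a minimal sketch using the convenient covariance formula C = 2 cov(y, r) / mean(y), with r the fractional wealth rank, follows on toy data (the data are fabricated, not NFHS-4 records).

```python
import numpy as np

def concentration_index(outcome, wealth):
    """C = 2*cov(y, r)/mean(y); r = fractional rank by wealth."""
    order = np.argsort(wealth)
    y = np.asarray(outcome, dtype=float)[order]
    n = len(y)
    r = (np.arange(1, n + 1) - 0.5) / n            # fractional ranks
    return 2.0 * np.cov(y, r, bias=True)[0, 1] / y.mean()

# Toy data: hypertension indicator (0/1) against a wealth score.
rng = np.random.default_rng(2)
wealth = rng.normal(size=1000)
hyper = rng.random(1000) < (0.2 + 0.02 * wealth)   # mildly pro-rich, as in the study
print(concentration_index(hyper, wealth))          # small positive value
```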

Keywords: hypertension, decomposition, inequalities, India

Procedia PDF Downloads 120
9623 Review of Ultrasound Image Processing Techniques for Speckle Noise Reduction

Authors: Kwazikwenkosi Sikhakhane, Suvendi Rimer, Mpho Gololo, Khmaies Oahada, Adnan Abu-Mahfouz

Abstract:

Medical ultrasound imaging is a crucial diagnostic technique due to its affordability and non-invasiveness compared to other imaging methods. However, the presence of speckle noise, which is a form of multiplicative noise, poses a significant obstacle to obtaining clear and accurate images in ultrasound imaging. Speckle noise reduces image quality by decreasing contrast, resolution, and signal-to-noise ratio (SNR). This makes it difficult for medical professionals to interpret ultrasound images accurately. To address this issue, various techniques have been developed to reduce speckle noise in ultrasound images, which improves image quality. This paper aims to review some of these techniques, highlighting the advantages and disadvantages of each algorithm and identifying the scenarios in which they work most effectively.
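
One widely used baseline among the despeckling techniques such a review covers is the classic Lee filter; a minimal NumPy/SciPy sketch of the standard textbook formulation follows (offered for illustration, not an algorithm proposed by the paper).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=7):
    """Classic Lee filter: adaptive blend of local mean and pixel value."""
    img = img.astype(float)
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img ** 2, size)
    var = np.maximum(sq_mean - mean ** 2, 0.0)
    noise_var = var.mean()                   # crude global noise estimate
    weight = var / (var + noise_var)         # ~0 in flat regions, ~1 at edges
    return mean + weight * (img - mean)      # smooths speckle, preserves edges
```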

Keywords: image processing, noise, speckle, ultrasound

Procedia PDF Downloads 74
9622 Count Data Regression Modeling: An Application to Spontaneous Abortion in India

Authors: Prashant Verma, Prafulla K. Swain, K. K. Singh, Mukti Khetan

Abstract:

Objective: In India, around 20,000 women die every year due to abortion-related complications. In modelling count variables, there is sometimes a preponderance of zero counts. This article concerns the estimation of various count regression models to predict the average number of spontaneous abortions among women in the Punjab state of India, and assesses the factors associated with that number. Materials and methods: The study included data on 27,173 married women of Punjab obtained from the DLHS-4 survey (2012-13). Poisson regression (PR), negative binomial (NB) regression, zero-hurdle negative binomial (ZHNB), and zero-inflated negative binomial (ZINB) models were employed to predict the average number of spontaneous abortions and to identify its determinants. Results: Statistical comparisons among the four estimation methods revealed that the ZINB model provides the best prediction for the number of spontaneous abortions. Place of antenatal care (ANC), place of residence, total children born to a woman, and the woman's education and economic status were found to be the most significant factors affecting the occurrence of spontaneous abortion. Conclusions: The study offers a practical demonstration of techniques designed to handle count variables; the ZINB model provided the best prediction and is recommended for predicting the number of spontaneous abortions. The study suggests that women should receive institutional antenatal care to attain limited parity, and advocates promoting higher education among women in Punjab, India.
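
A minimal statsmodels sketch of the model comparison (Poisson, NB, and ZINB compared by AIC; the hurdle variant is omitted for brevity). File and column names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

# Hypothetical column names for a DLHS-4 extract.
df = pd.read_csv("dlhs4_punjab.csv")
y = df["n_spontaneous_abortions"]
X = sm.add_constant(df[["anc_institutional", "urban", "children_born",
                        "education_yrs", "wealth_index"]])

poisson = sm.Poisson(y, X).fit(disp=0)
negbin = sm.NegativeBinomial(y, X).fit(disp=0)
zinb = ZeroInflatedNegativeBinomialP(y, X, exog_infl=X).fit(
    method="bfgs", maxiter=500, disp=0)

# Lower AIC indicates the better-fitting specification (ZINB in the study).
for name, m in [("Poisson", poisson), ("NB", negbin), ("ZINB", zinb)]:
    print(name, m.aic)
```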

Keywords: count data, spontaneous abortion, Poisson model, negative binomial model, zero hurdle negative binomial, zero-inflated negative binomial, regression

Procedia PDF Downloads 126
9621 Differential Effect of Technique Majors on Isokinetic Strength in Youth Judoka Athletes

Authors: Chungyu Chen, Yi-Cheng Chen, Po-Hsian Hsu, Hsin-Ying Chen, Yen-Po Hsiao

Abstract:

The purpose of this study was to assess the isokinetic muscular strength of the upper and lower extremities in youth judo players and to compare strength between major-technique groups. Sixteen male and 20 female judo players (age: 16.7 ± 1.6 years, training age: 4.5 ± 0.8 years) served as volunteers; 21 players majored in hand techniques and 15 in foot techniques. The Biodex S4 Pro was used to assess concentric extensor and flexor strength of the elbow and knee joints under loads of 30 degrees/sec, 60 degrees/sec, and 120 degrees/sec. The strength parameters included maximal torque, normalized maximal torque, average power, and average maximal torque. A t-test for independent groups was used to evaluate whether the hand-major and foot-major groups differed significantly, with an alpha level of .05. The results showed that the maximal torque of the left knee extensor was significantly higher in foot-major players (243.5 ± 36.3 Nm) than in hand-major players (210.7 ± 21.0 Nm) under the 30 degrees/sec load (p < .05). There were no differences in upper-extremity strength between the hand-major and foot-major groups at any of the three loads (ps > .05). This indicates that judo players need to develop upper-extremity strength comprehensively to secure the execution of their major techniques.
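
The group comparison reduces to an independent-samples t-test per strength parameter; a minimal SciPy sketch follows, with samples fabricated from the reported group means and standard deviations purely for illustration.

```python
import numpy as np
from scipy import stats

# Fabricated example: peak left-knee extensor torque (Nm) at 30 deg/s.
hand_major = np.random.default_rng(3).normal(210.7, 21.0, 21)
foot_major = np.random.default_rng(4).normal(243.5, 36.3, 15)

t, p = stats.ttest_ind(foot_major, hand_major, equal_var=False)  # Welch variant
print(f"t = {t:.2f}, p = {p:.4f}")   # p < .05 -> group difference, as reported
```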

Keywords: knee, elbow, power, judo

Procedia PDF Downloads 432
9620 Analysis of the Significance of Multimedia Channels Using Sparse PCA and Regularized SVD

Authors: Kourosh Modarresi

Abstract:

The abundance of media channels and devices has given users a variety of options to extract, discover, and explore information in the digital world. Since there is often a long and complicated path that a typical user traverses before taking any significant action (such as purchasing goods and services), it is critical to know how each node (media channel) in the user's path has contributed to the final action. In this work, the significance of each media channel is computed using statistical analysis and machine learning techniques; more specifically, regularized singular value decomposition and sparse principal component analysis have been used to compute the significance of each channel toward the final action. The results of this work show a considerable improvement over present approaches.
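
A schematic scikit-learn sketch of the two decompositions applied to a journey-by-channel exposure matrix; the data are fabricated and the pipeline is illustrative, not the authors' exact attribution method.

```python
import numpy as np
from sklearn.decomposition import SparsePCA, TruncatedSVD

# Rows: user journeys; columns: media channels (exposure counts). Fabricated.
rng = np.random.default_rng(5)
X = rng.poisson(1.0, size=(2000, 8)).astype(float)

spca = SparsePCA(n_components=3, alpha=1.0, random_state=0).fit(X)
svd = TruncatedSVD(n_components=3, random_state=0).fit(X)

# Sparse loadings shrink weak channels to exactly zero, so the surviving
# nonzero entries flag the channels that matter for each component.
print(np.round(spca.components_, 2))
print(svd.explained_variance_ratio_)
```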

Keywords: multimedia attribution, sparse principal component, regularization, singular value decomposition, feature significance, machine learning, linear systems, variable shrinkage

Procedia PDF Downloads 285
9619 Comparative Analysis of Edge Detection Techniques for Extracting Characters

Authors: Rana Gill, Chandandeep Kaur

Abstract:

Segmentation of images can be implemented using different fundamental algorithms, such as edge detection (discontinuity-based segmentation), region growing (similarity-based segmentation), and iterative thresholding. A comprehensive literature review describes techniques for vehicle number-plate detection and the edge detection techniques widely used on different types of images. This work is based on edge detection techniques, calculating thresholds on the basis of five edge operators: Prewitt, Roberts, Sobel, LoG, and Canny. Segmentation of characters in different types of images (vehicle number plates, house name plates, and characters on various sign boards) is taken as the case study. The proposed methodology has seven stages and has been implemented using MATLAB R2010a. Comparing the five operators on the basis of their performance, it is found that the Canny operator produces the best results, with the operators ranked in decreasing order of performance as Canny > LoG > Sobel > Prewitt > Roberts.
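
The study is implemented in MATLAB; an equivalent minimal sketch of the five operators in Python with scikit-image and SciPy is shown below (the sample image and the simple mean-plus-one-sigma threshold are stand-ins for the paper's seven-stage pipeline).

```python
import numpy as np
from scipy.ndimage import gaussian_laplace
from skimage import data, feature, filters

img = data.camera() / 255.0            # stand-in for a number-plate image

edges = {
    "prewitt": filters.prewitt(img),
    "roberts": filters.roberts(img),
    "sobel": filters.sobel(img),
    "log": np.abs(gaussian_laplace(img, sigma=2)),
    "canny": feature.canny(img, sigma=2).astype(float),
}
# A simple threshold turns each response into a candidate character mask:
masks = {name: e > e.mean() + e.std() for name, e in edges.items()}
```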

Keywords: segmentation, edge detection, text, extracting characters

Procedia PDF Downloads 410
9618 Effect of Cost Control and Cost Reduction Techniques in Organizational Performance

Authors: Babatunde Akeem Lawal

Abstract:

In any organization, the primary aim is to maximize profit, but a major challenge is the rising cost of operations: the resulting increase in production costs makes cost control and cost reduction schemes inevitable, and leaves most organizations struggling to operate at the cost-efficient frontier. The study aims to critically examine and evaluate the application of cost control and cost reduction to organizational performance, and to review the budget as an effective tool of cost control and cost reduction. A descriptive survey research design was adopted, and a total of 40 retrieved responses were used for the study. The data collected were analyzed with appropriate statistical tools; regression analysis in SPSS was used to test the hypotheses. Based on the findings, it was evident that cost control has a positive impact on organizational performance, and that the style of management also has a positive impact on organizational performance.

Keywords: organization, cost reduction, cost control, performance, budget, profit

Procedia PDF Downloads 566
9617 A Brief Study about Nonparametric Adherence Tests

Authors: Vinicius R. Domingues, Luan C. S. M. Ozelim

Abstract:

Statistical analysis has become indispensable in various fields of knowledge, and geotechnics is no different: probabilistic and statistical methods have gained ground there, given their use in characterizing the uncertainties inherent in soil properties. One situation engineers constantly face is the definition of a probability distribution that significantly represents the sampled data. To be able to discard poor candidate distributions, goodness-of-fit tests are necessary. In this paper, three non-parametric goodness-of-fit tests are applied to a computationally generated data set to test their fit to a series of known distributions. It is shown that the normal distribution does not always provide satisfactory results regarding the physical and behavioral representation of the modeled parameters.
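
All three tests named in the keywords are available in SciPy; a minimal sketch on a skewed synthetic "soil property" sample follows (the data and the choice of candidate distributions are illustrative).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
sample = rng.lognormal(mean=0.0, sigma=0.6, size=500)   # skewed synthetic data

mu, sigma = sample.mean(), sample.std(ddof=1)
# Caveat: p-values are only approximate when parameters are estimated
# from the same sample being tested.
print(stats.kstest(sample, "norm", args=(mu, sigma)))           # Kolmogorov-Smirnov
print(stats.cramervonmises(sample, "norm", args=(mu, sigma)))   # Cramer-von Mises
print(stats.anderson(sample, dist="norm"))                      # Anderson-Darling

# The same calls against a fitted lognormal show a far better fit:
shape, loc, scale = stats.lognorm.fit(sample, floc=0)
print(stats.kstest(sample, "lognorm", args=(shape, loc, scale)))
```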

Keywords: Kolmogorov-Smirnov test, Anderson-Darling test, Cramer-Von-Mises test, nonparametric adherence tests

Procedia PDF Downloads 422
9616 Passive Retrofitting Strategies for Windows in Hot and Humid Climate Vijayawada

Authors: Monica Anumula

Abstract:

Nowadays, human beings attain their comfort zone artificially, heating, cooling, and lighting the spaces they live in; primary importance is given to the aesthetics of buildings, which are not designed to protect their occupants from the climate. Occupants therefore depend on artificial sources of energy, resulting in energy wastage. In order to reduce the amount of energy spent in the construction industry and to meet the Energy Package goals by 2020, new ways of constructing houses are required. Since the larger part of a building's energy consumption is directly related to architectural aspects, nature has to be integrated into the building design to attain the comfort zone and reduce dependency on artificial sources of energy. This research develops bioclimatic design strategies and techniques for the walls and roofs of Vijayawada houses. Design strategies and techniques from cases with a similar climate, such as Kerala and Mangalore, are studied and analyzed in this paper. Understanding the vernacular architecture and modern techniques of these cases and implementing them in Vijayawada housing not only decreases energy consumption but also enhances the socio-cultural values of Vijayawada. The study focuses on comparing vernacular techniques with modern bioclimatic strategies to attain thermal comfort and energy reduction in a hot and humid climate, and points toward new strategies that combine vernacular and modern bioclimatic techniques.

Keywords: bioclimatic design, energy consumption, hot and humid climates, thermal comfort

Procedia PDF Downloads 155
9615 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training data-set and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, and adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g. TA-lib, pandas-ta). The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (Catboost, LightGBM, Sklearn, etc) as well as common Neural Network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI.
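
FreqAI's own API is not reproduced here; as a schematic of the adaptive loop the abstract describes (sliding-window retraining plus rejection of prediction points that fall outside the trained parameter space), a minimal scikit-learn sketch follows. Window sizes, the z-score outlier rule, and the model choice are all assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.preprocessing import StandardScaler

def rolling_predict(features, target, train_win=2000, retrain_every=100, k=3.0):
    """Sliding-window retraining with z-score outlier rejection (schematic).

    features: (n_samples, n_features) array; target: (n_samples,) array.
    """
    preds, scaler, model = {}, None, None
    for t in range(train_win, len(features)):
        if (t - train_win) % retrain_every == 0:       # periodic retraining
            X = features[t - train_win:t]
            scaler = StandardScaler().fit(X)
            model = GradientBoostingRegressor().fit(
                scaler.transform(X), target[t - train_win:t])
        z = scaler.transform(features[t:t + 1])
        if np.abs(z).max() > k:        # point lies outside the trained space
            continue                   # skip rather than extrapolate
        preds[t] = model.predict(z)[0]
    return preds
```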

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 70
9614 Two-Stage Hospital Efficiency Analysis Including Qualitative Evidence: A Greek Case

Authors: Panos Xenos, Milton Nektarios, John Yfantopoulos

Abstract:

Background: Policy makers, professional organizations, and payers have introduced a variety of initiatives and reforms in health systems worldwide aimed at improving hospital efficiency. Their efforts concentrate on two main goals: constraining increasing healthcare costs and enhancing the quality of services provided. Research objectives: This study examines the efficiency of 112 Greek public hospitals for the year 2009, evaluates the importance of bootstrapping techniques, and investigates the effect of contextual factors on hospital efficiency. Furthermore, the effect of qualitative evidence on hospital efficiency is explored using data from 28 large hospitals. Methods: We applied Data Envelopment Analysis (DEA), augmented by bootstrapping techniques, to estimate efficiency scores, and Tobit regression analysis to measure the effect of environmental factors on hospital efficiency. The significance of our models is evaluated using statistical tests that compare distributions. Results: The Kolmogorov-Smirnov test between the original and bootstrap-corrected efficiencies indicates that their distributions are significantly different (p-value < 0.01). The environmental factors that seem to influence efficiency are the occupancy rate and the ratio of outpatient visits to inpatient days. The results also indicate that including the quality variable in the DEA modelling generates statistically significant variations in efficiency scores (p-value < 0.05). Conclusions: The inclusion of quality variables and the use of bootstrap resampling in efficiency analysis have a statistically significant effect on the distribution of efficiency scores. As a policy conclusion, we highlight the importance of these methods for hospital efficiency analysis and, by implication, for healthcare resource allocation.
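
The DEA core is a linear program per hospital; a minimal input-oriented CCR sketch with scipy.optimize.linprog follows, on fabricated data (the bootstrap correction simply wraps this routine in resampling, and the study's actual inputs/outputs differ).

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0. X: (m x n) inputs, Y: (s x n) outputs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimise theta
    A_in = np.hstack([-X[:, [j0]], X])           # sum(lam*x) <= theta * x_j0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # sum(lam*y) >= y_j0
    b = np.concatenate([np.zeros(m), -Y[:, j0]])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                  bounds=[(0, None)] * (n + 1))
    return res.fun                               # efficiency score in (0, 1]

# Fabricated data: 2 inputs (beds, staff), 1 output (inpatient days), 10 hospitals.
rng = np.random.default_rng(7)
X = rng.uniform(50, 500, size=(2, 10))
Y = (0.6 * X.sum(axis=0) * rng.uniform(0.7, 1.0, 10)).reshape(1, 10)
print([round(ccr_efficiency(X, Y, j), 3) for j in range(10)])
```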

Keywords: hospitals, efficiency, quality, data envelopment analysis, Greek public hospital sector

Procedia PDF Downloads 279
9613 Generation of Quasi-Measurement Data for On-Line Process Data Analysis

Authors: Hyun-Woo Cho

Abstract:

To ensure the safety of a manufacturing process, one should quickly identify the assignable cause of a fault on an on-line basis. To this end, many statistical techniques, both linear and nonlinear, have been utilized; however, such methods suffer from the problem of small sample size, which is mostly attributable to the characteristics of the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks: quasi-measurement data are generated from existing data based on two indices, similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method handles the insufficiency problem successfully. In addition, it is quite efficient in terms of computational speed and memory usage, making on-line implementation straightforward for monitoring and diagnosis purposes.
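
The abstract does not define the similarity and importance indices, so the sketch below is only one plausible reading: new quasi-measurements are built as convex combinations of a seed sample and its most similar neighbors, with mixing weights biased by a similarity index. Every formula here is an assumption for illustration.

```python
import numpy as np

def generate_quasi_samples(X, n_new=100, n_neighbors=5, seed=0):
    """Quasi-measurements as convex combinations of similar existing rows."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        d = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(d)[1:n_neighbors + 1]     # most similar rows
        sim = 1.0 / (1.0 + d[nbrs])                 # similarity index (assumed form)
        w = rng.dirichlet(sim)                      # similarity-weighted mixing
        out.append(w @ X[nbrs])
    return np.vstack(out)
```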

Keywords: data analysis, diagnosis, monitoring, process data, quality control

Procedia PDF Downloads 456
9612 Wavelet-Based Classification of Myocardial Ischemia, Arrhythmia, Congestive Heart Failure and Sleep Apnea

Authors: Santanu Chattopadhyay, Gautam Sarkar, Arabinda Das

Abstract:

This paper presents wavelet-based classification of various heart diseases from electrocardiogram (ECG) signals of different heart patients. The statistical nature of the ECG signals for each disease is compared with that of ECGs from normal persons. Four heart diseases are considered: myocardial ischemia (MI), congestive heart failure (CHF), arrhythmia, and sleep apnea. For each case, the statistical nature of the ECG is characterized by the kurtosis of two types of wavelet coefficients, approximation and detail, over nine wavelet decomposition levels. Kurtosis corresponding to both coefficient types is examined from decomposition level one to level nine; based on significant differences, a few decomposition levels are chosen and then used for classification.
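
A minimal sketch of the feature extraction with PyWavelets; the wavelet family is an assumption, and for brevity the approximation coefficients are taken only at the deepest level (per-level approximations can be obtained by truncating the decomposition at each level).

```python
import numpy as np
import pywt
from scipy.stats import kurtosis

def wavelet_kurtosis(ecg, wavelet="db4", levels=9):
    """Kurtosis of detail coefficients at levels 1..9 plus the level-9 approximation.

    ecg should be a 1-D array a few thousand samples long for nine levels.
    """
    coeffs = pywt.wavedec(ecg, wavelet, level=levels)   # [cA9, cD9, ..., cD1]
    details = coeffs[1:][::-1]                          # reorder to cD1..cD9
    feats = {"A9": kurtosis(coeffs[0])}
    feats.update({f"D{i + 1}": kurtosis(d) for i, d in enumerate(details)})
    return feats    # feed the most discriminative levels to a classifier
```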

Keywords: arrhythmia, congestive heart failure, discrete wavelet transform, electrocardiogram, myocardial ischemia, sleep apnea

Procedia PDF Downloads 114
9611 Discovery of the Piano Extended Techniques by Focusing on Symbols That George Crumb Used in Makrokosmos Volumes

Authors: Parham Bakhtiari

Abstract:

George Crumb's Makrokosmos volumes are considered significant pieces in twentieth-century piano music and showcase an extensive palette of tones and extended techniques on the piano. Crumb's works are known for their references, particularly to music from previous eras, and for visual, aural, and numerical characteristics that are symbolic in nature. Crumb created a list of symbols and abbreviations to clarify his unique directions to performers, so pianists preparing to play Makrokosmos must dedicate time to studying and analyzing these markings diligently in order to capture the composer's wishes accurately. The aim of this paper is to provide a collection for pianists looking to perform the Makrokosmos volumes: the unconventional playing techniques are described, together with a discussion of the music the composer explores.

Keywords: music, piano, Crumb, Makrokosmos, performance

Procedia PDF Downloads 19
9610 Software Quality Assurance in Network Security using Cryptographic Techniques

Authors: Sidra Shabbir, Ayesha Manzoor, Mehreen Sirshar

Abstract:

The use of network communication has imposed serious threats to the security of assets on the network. Network security is increasingly prone to active and passive attacks, which may have serious consequences for data integrity, confidentiality, and availability. Various cryptographic techniques have been proposed in the past few years to combat this problem by ensuring quality, but a framework based on new cryptosystems is needed for a fully secured network. This paper discusses cryptographic techniques that have shown far better improvement in network security with enhanced quality assurance. The scope of this research paper is to cover the security pitfalls in current systems and their possible solutions based on the new cryptosystems. The development of new cryptosystem frameworks has paved the way for widespread network communications with enhanced quality in network security.
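
As a small, concrete example of the elliptic-curve techniques listed in the keywords, the following Python sketch signs and verifies a message with ECDSA using the widely available `cryptography` package. It illustrates integrity and authenticity checks generally, not the paper's specific framework.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Generate an ECC key pair on the P-256 curve.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

message = b"network payload"
signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))

# verify() raises InvalidSignature if the payload was tampered with.
public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
print("signature valid")
```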

Keywords: cryptography, network security, encryption, decryption, integrity, confidentiality, security algorithms, elliptic curve cryptography

Procedia PDF Downloads 709
9609 Mean Monthly Rainfall Prediction at Benina Station Using Artificial Neural Networks

Authors: Hasan G. Elmazoghi, Aisha I. Alzayani, Lubna S. Bentaher

Abstract:

Rainfall is a highly non-linear phenomenon, which requires the application of powerful supervised data mining techniques for its accurate prediction. In this study, the artificial neural network (ANN) technique is used to predict mean monthly rainfall from historical data collected at the BENINA station in Benghazi over 31 years (1977-2006), and the results are compared against observed values. The specific objective was to determine the best combination of weather variables to use as inputs for the ANN model. Several statistical parameters were calculated, and an uncertainty analysis of the results is also presented. The best ANN model is then applied to data for one year (2007) as a case study in order to evaluate its performance. Simulation results reveal that the ANN technique is promising and can provide reliable estimates of rainfall.
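
A minimal scikit-learn sketch of the approach; the input variables, file, and network size are assumptions for illustration, since the paper's selected variable combination is determined empirically.

```python
import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Hypothetical monthly records: [month, temperature, humidity, pressure] -> rainfall.
data = np.loadtxt("benina_monthly.csv", delimiter=",")
X, y = data[:, :-1], data[:, -1]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)
print("RMSE:", mean_squared_error(y_te, ann.predict(X_te)) ** 0.5)
```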

Keywords: neural networks, rainfall, prediction, climatic variables

Procedia PDF Downloads 462
9608 Resonant Fluorescence in a Two-Level Atom and the Terahertz Gap

Authors: Nikolai N. Bogolubov, Andrey V. Soldatov

Abstract:

Terahertz radiation occupies a range of frequencies from roughly 100 GHz to approximately 10 THz, between microwaves and infrared waves. This range holds promise for many useful applications in experimental applied physics and technology. At the same time, reliable, simple techniques for the generation, amplification, and modulation of electromagnetic radiation in this range are far from being developed enough to meet the requirements of practical use, especially in comparison to the technological abilities already achieved in other domains of the electromagnetic spectrum. This relative underdevelopment of a potentially very important range of the electromagnetic spectrum is known as the 'terahertz gap.' Among other things, technological progress in the terahertz area has been impeded by the lack of compact, low-energy-consumption, easily controlled, continuously radiating terahertz sources; the development of new techniques serving this purpose, as well as of devices based on them, is therefore clearly necessary, and it would be highly advantageous to employ the simplest suitable physical systems as the critical components of such techniques and devices. The purpose of the present research was to show, by means of conventional methods of non-equilibrium statistical mechanics and the theory of open quantum systems, that a thoroughly studied two-level quantum system, also known as a one-electron two-level 'atom', driven by an external classical monochromatic high-frequency (e.g., laser) field, can radiate continuously at a much lower (e.g., terahertz) frequency in the fluorescent regime if the transition dipole moment operator of this 'atom' possesses permanent, non-equal diagonal matrix elements. This contradicts the assumption routinely made in quantum optics that only the non-diagonal matrix elements persist. The conventional assumption is pertinent to natural atoms and molecules, stemming from the spatial inversion symmetry of their eigenstates, but it is no longer justified for artificially manufactured quantum systems of reduced dimensionality, such as quantum dots, which are often nicknamed 'artificial atoms' due to the striking similarity of their optical properties to those of real atoms. Possible routes to experimental observation and practical implementation of the predicted effect are discussed as well.
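
The central assumption can be written compactly in standard two-level notation (textbook notation, not reproduced from the paper):

```latex
\hat{d} = d_{11}\,|1\rangle\langle 1| + d_{22}\,|2\rangle\langle 2|
        + d_{12}\bigl(|1\rangle\langle 2| + |2\rangle\langle 1|\bigr),
\qquad d_{11} \neq d_{22}.
```

Conventional quantum optics keeps only the off-diagonal term d_{12}; it is the permanent-moment difference d_{11} - d_{22} that the abstract identifies as the ingredient enabling continuous low-frequency emission under the high-frequency drive.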

Keywords: terahertz gap, two-level atom, resonant fluorescence, quantum dot

Procedia PDF Downloads 245
9607 Evaluating the Factors Controlling the Hydrochemistry of Gaza Coastal Aquifer Using Hydrochemical and Multivariate Statistical Analysis

Authors: Madhat Abu Al-Naeem, Ismail Yusoff, Ng Tham Fatt, Yatimah Alias

Abstract:

Groundwater in the Gaza Strip is increasingly exposed to anthropogenic and natural factors that have seriously impacted its quality. Physiochemical data on groundwater can offer important information on changes in quality that is useful for improving water management tactics. Integrated hydrochemical and statistical techniques (hierarchical cluster analysis (HCA) and factor analysis (FA)) were applied to ten physiochemical parameters of 84 samples collected in 2000/2001, using the STATA, AquaChem, and Surfer software packages, to: 1) provide insight into the salinization sources and the hydrochemical processes controlling the chemistry of the groundwater, and 2) differentiate the influence of natural processes from that of man-made activities. The large recorded diversity in water facies, with the Na-Cl type dominant, reveals a highly saline aquifer impacted by multiple complex hydrochemical processes. Based on WHO standards, only 15.5% of the wells were suitable for drinking. HCA yielded three clusters. Cluster 1 is the highest in salinity, mainly due to the impact of Eocene saline water invasion mixed with human inputs. Cluster 2 is the lowest in salinity, also due to Eocene saline water invasion but mixed with recent rainfall recharge, limited carbonate dissolution, and nitrate pollution. Cluster 3 is similar in salinity to Cluster 2 but shows a high diversity of facies due to the impact of many salinity sources: seawater invasion, carbonate dissolution, and human inputs. Factor analysis yielded two factors accounting for 88% of the total variance: Factor 1 (59%) is a salinization factor demonstrating the mixing contribution of natural saline water with human inputs, while Factor 2 measures hardness and pollution and explains 29% of the total variance. The negative relationship between NO3- and pH may reveal a denitrification process in a heavily polluted aquifer recharged by limited oxygenated rainfall. Multivariate statistical analysis combined with hydrochemical analysis indicates that the main factors controlling groundwater chemistry are Eocene saline invasion, seawater invasion, sewage invasion, and rainfall recharge, and that the main hydrochemical processes are base and reverse ion exchange with clay minerals (water-rock interactions), nitrification, carbonate dissolution, and a limited denitrification process.
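
The two statistical steps can be sketched compactly in Python with SciPy and scikit-learn; the file is hypothetical and this generic pipeline stands in for the STATA workflow used in the study.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Hypothetical matrix: 84 wells x 10 physiochemical parameters.
X = StandardScaler().fit_transform(np.loadtxt("gaza_wells.csv", delimiter=","))

# Hierarchical cluster analysis (Ward linkage), cut into three clusters:
clusters = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")

# Two-factor model, mirroring the salinization and hardness/pollution factors:
fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
print(np.round(fa.components_, 2))   # loading of each parameter on each factor
```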

Keywords: dendrogram and cluster analysis, water facies, Eocene saline invasion and sea water invasion, nitrification and denitrification

Procedia PDF Downloads 336
9606 Aerobic Bioprocess Control Using Artificial Intelligence Techniques

Authors: M. Caramihai, Irina Severin

Abstract:

This paper deals with the design of an intelligent control structure for a bioprocess of Hansenula polymorpha yeast cultivation, where the objective of process control is to produce biomass in a desired physiological state. The work demonstrates that the designed hybrid control techniques (HCT) are able to recognize specific bioprocess evolution trajectories using neural networks trained specifically for this purpose, in order to estimate the model parameters and to adjust the overall bioprocess evolution through an expert system and a fuzzy structure. The design of the control algorithm, as well as its tuning through realistic simulations, is presented. Exploiting the synergy of different paradigms (fuzzy logic, neural networks, and symbolic artificial intelligence), we present a complete intelligent control architecture with application to bioprocess control.
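
To make the fuzzy component tangible, here is a toy Mamdani-style fragment in Python: triangular memberships over a substrate-error signal and weighted-average defuzzification. It is a generic illustration of a fuzzy structure, not the authors' controller.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def feed_adjustment(substrate_error):
    """Toy rules (valid for errors in [-2, 2]): low substrate -> increase feed."""
    low = tri(substrate_error, -2.0, -1.0, 0.0)    # substrate below setpoint
    ok = tri(substrate_error, -1.0, 0.0, 1.0)
    high = tri(substrate_error, 0.0, 1.0, 2.0)
    actions = np.array([1.0, 0.0, -1.0])           # rule consequents (feed change)
    weights = np.array([low, ok, high])
    return (weights @ actions) / weights.sum()     # centroid-style defuzzification

print(feed_adjustment(-0.4))   # slightly low substrate -> small positive feed step
```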

Keywords: bioprocess, intelligent control, neural nets, fuzzy structure, hybrid techniques

Procedia PDF Downloads 388
9605 Teaching, Learning and Evaluation Enhancement of Information Communication Technology Education in Schools through Pedagogical and E-Learning Techniques in the Sri Lankan Context

Authors: M. G. N. A. S. Fernando

Abstract:

This study uses a research framework to improve the quality of ICT education and the teaching-learning-assessment/evaluation (TLA/TLE) process, utilizing existing resources while improving the methodologies, pedagogical techniques, and e-learning approaches used in the secondary schools of Sri Lanka. The study was carried out in two phases. Phase I investigated the factors that affect the quality of ICT education; based on the key factors from Phase I, Phase II designed an experimental application model with six activity levels, each covering one or more levels of the revised Bloom's taxonomy. To further enhance the activity levels, other pedagogical techniques (activity-based learning, e-learning techniques, problem-solving activities, peer discussions, etc.) were incorporated into each level as appropriate. The application model was validated by a panel of teachers, including a domain expert, and was also tested in the school environment. The validity of its performance was established using six hypothesis tests and other methods. The analysis shows that student performance on problem-solving activities increased by 19.5% due to the different treatment levels used. Compared to the existing process, it was also shown that the embedded techniques (a mixture of traditional and modern pedagogical methods and their applications) are more effective for the skills development of teachers and students.

Keywords: activity models, Bloom’s taxonomy, ICT education, pedagogies

Procedia PDF Downloads 139