Search results for: random hough transform
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3436

3196 Convolutional Neural Network Based on Random Kernels for Analyzing Visual Imagery

Authors: Ja-Keoung Koo, Kensuke Nakamura, Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Byung-Woo Hong

Abstract:

The machine learning techniques based on a convolutional neural network (CNN) have been actively developed and successfully applied to a variety of image analysis tasks, including reconstruction, noise reduction, resolution enhancement, segmentation, motion estimation, and object recognition. Classical visual information processing, ranging from low-level tasks to high-level ones, has been widely developed in the deep learning framework. Deriving visual interpretation from high-dimensional imagery data is generally considered a challenging problem. A CNN is a class of feed-forward artificial neural network that usually consists of deep layers whose connections are established by a series of non-linear operations. The CNN architecture is known to be shift invariant due to its shared weights and translation-invariance characteristics. However, optimizing the network is often computationally intractable, particularly with a large number of convolution layers, because of the large number of unknowns to be optimized with respect to a training set that generally must be large enough for the model to generalize effectively. It is also necessary to limit the size of the convolution kernels for computational reasons, despite the recent development of effective parallel processing machinery, which leads to the use of uniformly small convolution kernels throughout the deep CNN architecture. However, it is often desirable to consider different scales when analyzing visual features at different layers of the network. Thus, we propose a CNN model in which convolution kernels of different sizes are applied at each layer, based on random projection. We apply random filters of varying sizes and associate the filter responses with scalar weights that correspond to the standard deviation of the random filters. This allows a large number of random filters at the cost of a single scalar unknown per filter. The computational cost of back-propagation does not increase with filter size, although additional cost is incurred in computing the convolutions in the feed-forward pass. The use of random kernels with varying sizes allows image features to be analyzed effectively at multiple scales, leading to better generalization. The robustness and effectiveness of the proposed CNN based on random kernels are demonstrated by numerical experiments that quantitatively compare well-known CNN architectures with our models, which simply replace the convolution kernels with random filters. The experimental results indicate that our model achieves better performance with fewer unknown weights. The proposed algorithm has high potential in a variety of visual tasks based on the CNN framework. Acknowledgement—This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by IITP, and NRF-2014R1A2A1A11051941, NRF2017R1A2B4006023.
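
As a rough illustration of the core idea (fixed random convolution filters of several sizes, each weighted by a single learnable scalar), a minimal sketch is given below. It assumes PyTorch; all names and sizes are illustrative and not the authors' code.

```python
# Sketch: multi-scale random filter banks; only one scalar per filter is trained.
import torch
import torch.nn as nn

class RandomKernelConv(nn.Module):
    def __init__(self, in_ch, filters_per_size=8, sizes=(3, 5, 7)):
        super().__init__()
        self.banks, self.scales = nn.ModuleList(), nn.ParameterList()
        for k in sizes:
            conv = nn.Conv2d(in_ch, filters_per_size, k, padding=k // 2, bias=False)
            conv.weight.requires_grad_(False)        # random filters stay fixed
            self.banks.append(conv)
            # one trainable scalar per filter: the only unknowns added per filter
            self.scales.append(nn.Parameter(torch.ones(filters_per_size)))

    def forward(self, x):
        outs = [s.view(1, -1, 1, 1) * bank(x) for bank, s in zip(self.banks, self.scales)]
        return torch.cat(outs, dim=1)                # multi-scale filter responses

x = torch.randn(2, 3, 32, 32)
print(RandomKernelConv(3)(x).shape)                  # torch.Size([2, 24, 32, 32])
```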

Keywords: deep learning, convolutional neural network, random kernel, random projection, dimensionality reduction, object recognition

Procedia PDF Downloads 259
3195 Optimal Continuous Scheduled Time for a Cumulative Damage System with Age-Dependent Imperfect Maintenance

Authors: Chin-Chih Chang

Abstract:

Many manufacturing systems suffer failures due to complex degradation processes and various environmental conditions such as random shocks. Consider an operating system that is subject to random shocks and works at random times on successive jobs. When successive jobs often result in production losses and performance deterioration, it is better to perform maintenance or replacement at a planned time. A preventive replacement (PR) policy is presented to replace the system at a continuous time T before a failure occurs. Under this policy, the failure characteristics of the system are modeled as follows. Each job causes a random amount of additive damage to the system, and the system fails when the cumulative damage exceeds a failure threshold. Suppose that the deteriorating system suffers one of two types of shocks with age-dependent probabilities: a type-I (minor) shock is rectified by a minimal repair, whereas a type-II (catastrophic) shock causes the system to fail. A corrective replacement (CR) is performed immediately when the system fails. In summary, a generalized maintenance model for scheduling the replacement of an operating system is presented: PR is carried out at time T, whereas CR is carried out when any type-II shock occurs or the total damage exceeds the failure level. The main objective is to determine the optimal continuous scheduled time of preventive replacement by minimizing the mean cost rate function. The existence and uniqueness of the optimal replacement policy are derived analytically. It can be seen that the present model is a generalization of previous models, and the policy with preventive replacement outperforms the one without it.
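
The paper derives the optimal T analytically. Purely to illustrate the cost structure, the Monte Carlo sketch below scans candidate replacement times; all parameters are hypothetical.

```python
# Toy simulation: shocks arrive randomly, each adds random damage; the system fails
# when damage exceeds K or a catastrophic shock hits. Scan T for the lowest cost rate.
import random

def mean_cost_rate(T, runs=2000, rate=1.0, dmg=1.0, K=20.0,
                   c_pr=1.0, c_cr=5.0, p_fatal=0.1):
    total_cost = total_time = 0.0
    for _ in range(runs):
        t = damage = 0.0
        while True:
            t += random.expovariate(rate)                # next shock/job arrival
            if t >= T:                                   # preventive replacement at T
                total_cost += c_pr; total_time += T; break
            damage += random.expovariate(1.0 / dmg)      # additive random damage
            if damage > K or random.random() < p_fatal:  # failure -> corrective
                total_cost += c_cr; total_time += t; break
    return total_cost / total_time

best = min((mean_cost_rate(T), T) for T in range(5, 60, 5))
print("approx optimal T:", best[1])
```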

Keywords: preventive replacement, working time, cumulative damage model, minimal repair, imperfect maintenance, optimization

Procedia PDF Downloads 330
3194 Red Blood Cells Deformability: A Chaotic Process

Authors: Ana M. Korol, Bibiana Riquelme, Osvaldo A. Rosso

Abstract:

Since erythrocyte deformability analysis is mostly qualitative, the development of quantitative nonlinear methods is crucial for restricting subjectivity in the study of cell behaviour. An electro-optic mechanical system called an erythrodeformeter has been developed and constructed in our laboratory in order to evaluate the erythrocytes' viscoelasticity. A numerical method formulated on the basis of fractal approximation for ordinary (OBM) and fractional Brownian motion (FBM), as well as wavelet transform analysis, is proposed to distinguish chaos from noise, based on the assumption that diffractometric data involve both deterministic and stochastic components and can therefore be modelled as a bounded correlated random walk. Here we report studies on 25 donors: 4 alpha-thalassaemic patients, 11 beta-thalassaemic patients, and 10 healthy controls (non-alcoholic, non-smoking individuals). The correlation coefficient, a nonlinear parameter, showed evidence of the changes in erythrocyte deformability; the wavelet entropy could quantify the differences detected by the light diffraction patterns. Such quantifiers show a good deal of promise and raise the possibility of a better understanding of the rheological aspects of erythrocytes, and could also help in clinical diagnosis.
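
A minimal sketch of a wavelet-entropy quantifier in the spirit of the abstract, assuming PyWavelets; the input below is synthetic, not diffractometric data.

```python
# Wavelet entropy: Shannon entropy of the relative wavelet energy across scales.
import numpy as np
import pywt

def wavelet_entropy(signal, wavelet="db4", levels=6):
    coeffs = pywt.wavedec(signal, wavelet, level=levels)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()          # relative energy per scale
    return -np.sum(p * np.log(p))          # Shannon entropy over scales

noisy = np.random.randn(1024)
print(wavelet_entropy(noisy))
```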

Keywords: red blood cells, deformability, nonlinear dynamics, chaos theory, wavelet transform

Procedia PDF Downloads 35
3193 The Modelling of Real Time Series Data

Authors: Valeria Bondarenko

Abstract:

We propose algorithms for estimating the parameters of fractional Brownian motion (fBm), namely the volatility and the Hurst exponent, and for approximating random time series by functionals of fBm. We prove the consistency of the estimators that constitute these algorithms, and we establish the optimal forecast of the approximated time series. The adequacy of the estimation, approximation, and forecasting algorithms is demonstrated by numerical experiments. In the course of this work, a software system with a hierarchical structure was created. A comparative analysis of the proposed algorithms with other methods gives evidence of the advantage of the approximation method. The results can be used to develop methods for the analysis and modeling of time series describing economic, physical, biological, and other processes.
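
As one concrete piece of the toolbox described here, the sketch below estimates the Hurst exponent of an fBm-like series (a standard increment-variance estimator, not necessarily the authors' algorithm), using the scaling Var[X(t+lag) - X(t)] ~ lag^(2H).

```python
# H is half the slope of log increment-variance versus log lag.
import numpy as np

def hurst_from_increments(x, lags=range(2, 64)):
    lags = np.array(list(lags))
    v = np.array([np.var(x[lag:] - x[:-lag]) for lag in lags])
    slope, _ = np.polyfit(np.log(lags), np.log(v), 1)
    return slope / 2.0

bm = np.cumsum(np.random.randn(10_000))   # ordinary Brownian motion, H ~ 0.5
print(hurst_from_increments(bm))
```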

Keywords: mathematical model, random process, Wiener process, fractional Brownian motion

Procedia PDF Downloads 323
3192 Free Vibration of Orthotropic Plate with Four Clamped Edges

Authors: Yang Zhong, Meijie Xu

Abstract:

Explicit solutions for the natural frequencies and mode shapes of an orthotropic rectangular plate with four clamped edges are presented by the double finite cosine integral transform method. In the analysis, the classical orthotropic rectangular thin plate is considered. Because only the basic dynamic elasticity equations of the orthotropic thin plate are adopted, there is no need to select a deformation function arbitrarily in advance. Therefore, the solution developed in this paper is reasonable and theoretically sound. Finally, an illustrative example is given, and the results are compared with those reported earlier. The method is found to be simpler and effective, and the results show reasonable agreement with other available results.
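
For reference, the governing free-vibration equation of a classical orthotropic thin plate, in standard notation rather than reproduced from the paper, is:

```latex
D_x \frac{\partial^4 w}{\partial x^4}
  + 2H \frac{\partial^4 w}{\partial x^2 \partial y^2}
  + D_y \frac{\partial^4 w}{\partial y^4}
  + \rho h \frac{\partial^2 w}{\partial t^2} = 0
```

where w is the deflection, D_x and D_y are the flexural rigidities, H is the effective torsional rigidity, and ρh is the mass per unit area; each clamped edge imposes w = 0 and ∂w/∂n = 0.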

Keywords: rectangular orthotropic plate, four clamped edges, natural frequencies and mode shapes, finite integral transform

Procedia PDF Downloads 545
3191 Effect of the Aluminium Concentration on the Laser Wavelength of Random Trimer Barrier AlxGa1-xAs Superlattices

Authors: Samir Bentata, Fatima Bendahma

Abstract:

We numerically investigate the effect of the aluminium concentration on the laser wavelength of random trimer barrier AlxGa1-xAs superlattices (RTBSL). Such systems consist of two different structures randomly distributed along the growth direction, with the additional constraint that barriers of one kind appear only in triplets. An explicit formula is given for evaluating the transmission coefficient of superlattices (SLs) with intentionally correlated disorder. The method is based on the Airy-function formalism and the transfer-matrix technique. We discuss the impact of the aluminium concentration, together with the structure profile, on the laser wavelengths.
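
As a hedged illustration of the transfer-matrix step, the sketch below computes the transmission coefficient of a 1-D sequence of square barriers using plane-wave matrices for the field-free case; the paper's Airy-function formalism is not reproduced.

```python
# Transmission through piecewise-constant barriers via 2x2 interface matrices.
import numpy as np

HB2M = 3.81  # hbar^2/(2m) in eV*Angstrom^2, free-electron value (an effective mass rescales it)

def transmission(E, widths, potentials):
    """Interior regions listed left to right (widths in Angstrom); outer regions at V=0."""
    ks = [np.sqrt(complex(E - V) / HB2M) for V in [0.0] + list(potentials) + [0.0]]
    M = np.eye(2, dtype=complex)
    x = 0.0
    for j in range(len(potentials) + 1):         # one matrix per interface
        if j > 0:
            x += widths[j - 1]
        k1, k2 = ks[j], ks[j + 1]
        r, s = (1 + k2 / k1) / 2, (1 - k2 / k1) / 2
        M = M @ np.array([[r * np.exp(1j * (k2 - k1) * x), s * np.exp(-1j * (k2 + k1) * x)],
                          [s * np.exp(1j * (k2 + k1) * x), r * np.exp(-1j * (k2 - k1) * x)]])
    t = 1.0 / M[0, 0]                            # incident amplitude 1, no wave from right
    return (ks[-1].real / ks[0].real) * abs(t) ** 2

# three identical barriers (trimer-like), 0.3 eV high, 20 A wide, 20 A apart
print(transmission(0.1, [20, 20, 20, 20, 20], [0.3, 0.0, 0.3, 0.0, 0.3]))
```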

Keywords: superlattices, correlated disorder, transmission coefficient, laser wavelength

Procedia PDF Downloads 308
3190 Single Imputation for Audiograms

Authors: Sarah Beaver, Renee Bryce

Abstract:

Audiograms detect hearing impairment, but missing values pose problems. This work explores imputation in an attempt to improve accuracy. It implements Linear Regression, Lasso, Linear Support Vector Regression, Bayesian Ridge, K-Nearest Neighbors (KNN), and Random Forest machine learning techniques to impute audiogram frequencies ranging from 125 Hz to 8000 Hz. The data contain patients who had, or were candidates for, cochlear implants. Accuracy is compared across two different nested cross-validation k values. Over 4000 audiograms from 800 unique patients were used. Additionally, training on combined left- and right-ear audiograms is compared with training on single-ear audiograms. The best Random Forest models achieve root mean square error (RMSE) values from 4.74 to 6.37 and R² values from .91 to .96; the best KNN models achieve RMSE values from 5.00 to 7.72 and R² values from .89 to .95. Overall, the best imputation models achieve R² between .89 and .96 with RMSE values below 8 dB. We also show that classification models built on our best imputations outperform those built on constant imputations by a two-percent increase in accuracy.
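
A minimal sketch of the single-imputation setup, assuming scikit-learn; the audiogram matrix below is synthetic, not the cochlear-implant data set.

```python
# Impute one audiogram frequency from the others with a random forest regressor.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(50, 20, size=(800, 7))           # thresholds in dB HL (synthetic)
target_col = 3                                  # pretend this frequency is missing
y, X_obs = X[:, target_col], np.delete(X, target_col, axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(X_obs, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5, "R2:", r2_score(y_te, pred))
```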

Keywords: machine learning, audiograms, data imputations, single imputations

Procedia PDF Downloads 52
3189 Two-Stage Flowshop Scheduling with Unsystematic Breakdowns

Authors: Fawaz Abdulmalek

Abstract:

The two-stage flowshop assembly scheduling problem is considered in this paper. There is more than one parallel machine at stage one and an assembly machine at stage two. Jobs are sequenced through the flowshop based on Johnson's rule and two extensions of it. A simulation model of the two-stage flowshop is constructed in which the machines at stage one are subject to random failures. Three simulation experiments are conducted to test the effect of the three job-ranking rules on the makespan. The Johnson Largest heuristic outperformed both Johnson's rule and the Johnson Smallest heuristic in two of the experiments, across all scenarios, with each experiment comprising five scenarios.
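
For reference, Johnson's rule for the classical two-machine flowshop, the base ranking rule that the extensions build on, can be sketched as follows (toy data):

```python
# Johnson's rule: jobs with p1 <= p2 first (ascending p1), the rest last (descending p2).
def johnson(jobs):
    front = sorted((j for j in jobs if jobs[j][0] <= jobs[j][1]), key=lambda j: jobs[j][0])
    back = sorted((j for j in jobs if jobs[j][0] > jobs[j][1]), key=lambda j: jobs[j][1],
                  reverse=True)
    return front + back

def makespan(seq, jobs):
    t1 = t2 = 0
    for j in seq:
        t1 += jobs[j][0]                  # machine 1 finishes job j
        t2 = max(t1, t2) + jobs[j][1]     # machine 2 starts when both are free
    return t2

jobs = {"A": (3, 6), "B": (5, 2), "C": (1, 2), "D": (7, 5)}
seq = johnson(jobs)
print(seq, makespan(seq, jobs))           # ['C', 'A', 'D', 'B'] 18
```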

Keywords: flowshop scheduling, random failures, johnson rule, simulation

Procedia PDF Downloads 306
3188 Enhancement of Pulsed Eddy Current Response Based on Power Spectral Density after Continuous Wavelet Transform Decomposition

Authors: A. Benyahia, M. Zergoug, M. Amir, M. Fodil

Abstract:

The main objective of this work is to enhance the Pulsed Eddy Current (PEC) response from aluminum structures using signal processing. Cracks and metal loss in different structures cause changes in PEC response measurements. In this paper, time-frequency analysis is used to represent the PEC response, which generates a large quantity of data, and to reduce measurement noise. Power Spectral Density after Wavelet Decomposition (PSD-WD) is proposed for defect detection. The experimental results demonstrate that surface cracks can be extracted satisfactorily by the proposed method. The validity of the proposed method is discussed.
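
A hedged sketch of the described processing chain, assuming PyWavelets and SciPy; the PEC signal below is simulated.

```python
# CWT with a Mexican-hat mother wavelet, then a PSD of one scale's coefficients.
import numpy as np
import pywt
from scipy.signal import welch

fs = 10_000.0
t = np.arange(0, 0.1, 1 / fs)
pec = np.exp(-t * 80) + 0.05 * np.random.randn(t.size)   # decaying PEC-like pulse

coeffs, _ = pywt.cwt(pec, scales=np.arange(1, 64), wavelet="mexh",
                     sampling_period=1 / fs)
f, psd = welch(coeffs[10], fs=fs, nperseg=256)           # PSD at one scale
print("dominant frequency at scale 11:", f[np.argmax(psd)], "Hz")
```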

Keywords: DT, pulsed eddy current, continuous wavelet transform, Mexican hat mother wavelet, defect detection, power spectral density

Procedia PDF Downloads 205
3187 Dimension Free Rigid Point Set Registration in Linear Time

Authors: Jianqin Qu

Abstract:

This paper proposes a rigid point set matching algorithm in arbitrary dimensions based on the idea of symmetric covariant functions. A group of functions of the points in the set is formulated using rigid invariants. Each of these functions computes a correspondence pair from the given point set. The computed correspondences are then used to recover the unknown rigid transform parameters. Each computed point can be geometrically interpreted as a weighted mean center of the point set. The algorithm is compact, fast, and dimension free, without any optimization process. It either computes the desired transform for noiseless data in linear time, or fails quickly in exceptional cases. Experimental results for synthetic data and 2D/3D real data are provided, which demonstrate potential applications of the algorithm to a wide range of problems.
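
Once correspondences are available (the paper derives them from symmetric covariant functions), the rigid transform itself is recoverable in closed form in any dimension. The sketch below shows the standard SVD-based recovery step, not the paper's own construction.

```python
# Least-squares rigid transform (R, t) from paired points, any dimension.
import numpy as np

def rigid_from_correspondences(P, Q):
    """R, t minimizing ||Q - (R @ P + t)||; P, Q are (d, n) paired point sets."""
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd((Q - cq) @ (P - cp).T)
    S = np.eye(P.shape[0]); S[-1, -1] = np.linalg.det(U @ Vt)  # keep det(R) = +1
    R = U @ S @ Vt
    return R, cq - R @ cp

P = np.random.randn(3, 10)
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta), np.cos(theta), 0],
                   [0, 0, 1]])
R, t = rigid_from_correspondences(P, R_true @ P + 1.0)
print(np.allclose(R, R_true))   # True for noiseless data
```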

Keywords: covariant point, point matching, dimension free, rigid registration

Procedia PDF Downloads 143
3186 Climate Changes in Albania and Their Effect on Cereal Yield

Authors: Lule Basha, Eralda Gjika

Abstract:

This study analyzes climate change in Albania and its potential effects on cereal yields. Initially, monthly temperatures and rainfall in Albania were studied for the period 1960-2021. Climatic variables are important when modeling cereal yield behavior, especially when significant changes in weather conditions are observed. For this purpose, in the second part of the study, linear and nonlinear models explaining cereal yield are constructed for the same period, 1960-2021. Multiple linear regression analysis and the lasso regression method are applied to relate cereal yield to each independent variable: average temperature, average rainfall, fertilizer consumption, arable land, land under cereal production, and nitrous oxide emissions. In our regression model, heteroscedasticity is not observed, the data follow a normal distribution, and there is low correlation between factors, so multicollinearity is not a problem. Machine-learning methods, such as random forest, are used to predict cereal yield responses to climatic and other variables. Random forest showed high accuracy compared to the other statistical models in predicting cereal yield. We found that changes in average temperature negatively affect cereal yield. The coefficients of fertilizer consumption, arable land, and land under cereal production are positive, indicating that these factors increase production. Our results show that the random forest method is an effective and versatile machine-learning method for cereal yield prediction compared to the other two methods.
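
A small sketch of the regression diagnostics mentioned above, assuming statsmodels; the data are synthetic stand-ins for the named variables.

```python
# Fit a multiple linear regression and check multicollinearity via VIFs.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
X = pd.DataFrame({"temp": rng.normal(12, 1, 62),          # 62 years, 1960-2021
                  "rain": rng.normal(90, 15, 62),
                  "fertilizer": rng.normal(100, 10, 62)})
y = 40 - 1.5 * X["temp"] + 0.02 * X["rain"] + 0.1 * X["fertilizer"] + rng.normal(0, 1, 62)

Xc = sm.add_constant(X)
print(sm.OLS(y, Xc).fit().params)                           # fitted coefficients
print([variance_inflation_factor(Xc.values, i)              # VIF near 1 => low
       for i in range(1, Xc.shape[1])])                     # multicollinearity
```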

Keywords: cereal yield, climate change, machine learning, multiple regression model, random forest

Procedia PDF Downloads 57
3185 Using Scale Invariant Feature Transform Features to Recognize Characters in Natural Scene Images

Authors: Belaynesh Chekol, Numan Çelebi

Abstract:

The main purpose of this work is to recognize individual characters extracted from natural scene images using scale invariant feature transform (SIFT) features as input to K-nearest neighbor (KNN), a classification learner algorithm. For this task, 1,068 and 78 images of English alphabet characters taken from the Chars74k data set are used to train and test the classifier, respectively. For each character image, we generated descriptive features using the SIFT algorithm. This set of features is fed to the learner so that it can recognize and label new images of English characters. Two types of KNN (fine KNN and weighted KNN) were trained, and the resulting classification accuracies are 56.9% and 56.5%, respectively. The training time was the same for both fine and weighted KNN.
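
A minimal sketch of the described pipeline, assuming opencv-python 4.4 or later (where SIFT is in the main package) and scikit-learn; the rendered glyphs below stand in for Chars74k crops.

```python
# SIFT descriptors pooled per image, then a distance-weighted KNN classifier.
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

sift = cv2.SIFT_create()

def sift_feature(img):
    _, desc = sift.detectAndCompute(img, None)
    # pool the variable-size descriptor set into one fixed-length vector
    return desc.mean(axis=0) if desc is not None else np.zeros(128, np.float32)

def char_image(ch):                        # synthetic stand-in for a scene crop
    img = np.zeros((64, 64), np.uint8)
    cv2.putText(img, ch, (10, 50), cv2.FONT_HERSHEY_SIMPLEX, 1.5, 255, 2)
    return img

chars = list("ABCDE") * 4
X = np.array([sift_feature(char_image(c)) for c in chars])
knn = KNeighborsClassifier(n_neighbors=3, weights="distance")   # "weighted KNN"
knn.fit(X, chars)
print(knn.predict([sift_feature(char_image("B"))]))
```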

Keywords: character recognition, KNN, natural scene image, SIFT

Procedia PDF Downloads 254
3184 Real-Time Path Planning for Unmanned Air Vehicles Using Improved Rapidly-Exploring Random Tree and Iterative Trajectory Optimization

Authors: A. Ramalho, L. Romeiro, R. Ventura, A. Suleman

Abstract:

A real-time path planning framework for Unmanned Air Vehicles, and multi-rotors in particular, is proposed. The framework is designed to provide feasible trajectories from the current UAV position to a goal state, taking into account constraints such as obstacle avoidance, problem kinematics, and vehicle limitations such as maximum speed and maximum acceleration. The framework computes feasible paths online, allowing the vehicle to avoid new, unknown, dynamic obstacles without fully re-computing the trajectory. These features are achieved using an iterative process in which the robot computes and optimizes the trajectory while performing the mission objectives. A first trajectory is computed using a modified Rapidly-Exploring Random Tree (RRT) algorithm that provides trajectories respecting a maximum curvature constraint. The trajectory optimization is accomplished using the Interior Point Optimizer (IPOPT) as a solver. The framework has proven able to compute and optimize a trajectory to a local optimum with a computational efficiency that makes it feasible for real-time operation.
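
A bare-bones 2-D RRT illustrating the first stage; the paper's modified RRT additionally enforces a maximum-curvature constraint, and the IPOPT smoothing stage is omitted here.

```python
# Grow a tree toward random samples (with goal bias), rejecting colliding nodes.
import math
import random

def rrt(start, goal, obstacles, step=0.5, iters=5000, goal_tol=0.5):
    nodes, parent = [start], {start: None}
    for _ in range(iters):
        sample = goal if random.random() < 0.1 else (random.uniform(0, 10),
                                                     random.uniform(0, 10))
        near = min(nodes, key=lambda n: math.dist(n, sample))
        d = math.dist(near, sample) or 1e-9
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        if any(math.dist(new, o) < r for o, r in obstacles):
            continue                            # reject nodes inside obstacles
        nodes.append(new); parent[new] = near
        if math.dist(new, goal) < goal_tol:     # goal reached: walk back the tree
            path = [new]
            while parent[path[-1]]:
                path.append(parent[path[-1]])
            return path[::-1]
    return None

print(rrt((0, 0), (9, 9), obstacles=[((5, 5), 1.5)]))
```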

Keywords: interior point optimization, multi-rotors, online path planning, rapidly exploring random trees, trajectory optimization

Procedia PDF Downloads 112
3183 Scintigraphic Image Coding of Region of Interest Based on SPIHT Algorithm Using Global Thresholding and Huffman Coding

Authors: A. Seddiki, M. Djebbouri, D. Guerchi

Abstract:

Medical imaging produces human body pictures in digital form. Since these imaging techniques produce prohibitive amounts of data, compression is necessary for storage and communication purposes. Many current compression schemes provide a very high compression rate but with considerable loss of quality. On the other hand, in some areas of medicine, it may be sufficient to maintain high image quality only in the region of interest (ROI). This paper discusses a contribution to lossless compression in the region of interest of scintigraphic images, based on the SPIHT algorithm and global transform thresholding, using Huffman coding.
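
For the entropy-coding stage named above, a minimal Huffman coder can be sketched as follows; the input is a toy byte string, not SPIHT output.

```python
# Classic heap-based Huffman: repeatedly merge the two lowest-frequency subtrees.
import heapq
from collections import Counter

def huffman_codes(data):
    heap = [[w, [sym, ""]] for sym, w in sorted(Counter(data).items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]    # left branch
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]    # right branch
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return {sym: code for sym, code in heap[0][1:]}

data = b"abracadabra"
codes = huffman_codes(data)
bits = "".join(codes[b] for b in data)
print(codes, len(bits), "bits vs", 8 * len(data))
```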

Keywords: global thresholding transform, huffman coding, region of interest, SPIHT coding, scintigraphic images

Procedia PDF Downloads 335
3182 Lossless Secret Image Sharing Based on Integer Discrete Cosine Transform

Authors: Li Li, Ahmed A. Abd El-Latif, Aya El-Fatyany, Mohamed Amin

Abstract:

This paper proposes a new secret image sharing method based on the integer discrete cosine transform (IntDCT). It first transforms the original image into the frequency domain (DCT coefficients) using IntDCT, applied to each block of size 8×8. It then generates shares from the DCT coefficients occupying the same position in each block: all DC components are used to generate the DC shares, the i-th AC components of each block are used to generate the i-th AC shares, and so on. The DC and AC shares with the same index are combined to generate DCT shadows. Experimental results and analyses show that the proposed method can recover the original image losslessly, unlike methods based on the traditional DCT, and is more sensitive to tiny changes in both the coefficients and the content of the image.
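
The paper's exact share construction is not reproduced here. As a toy stand-in for per-coefficient sharing, the sketch below uses a Shamir-style (k, n) threshold scheme over a prime field.

```python
# Shamir (k, n) sharing of one coefficient value: any k shares reconstruct it.
import random

P = 257  # prime just above one byte; real coefficient ranges need a larger field

def make_shares(secret, k=2, n=4):
    poly = [secret % P] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(poly)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):                  # Lagrange interpolation at x = 0
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = make_shares(secret=123)
print(reconstruct(shares[:2]))            # 123
```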

Keywords: secret image sharing, integer DCT, lossless recovery, sensitivity

Procedia PDF Downloads 373
3181 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software; because static software operates only under strictly regulated rules, it cannot aid customers beyond those limitations. The application of machine learning (ML) techniques is required for loan prediction. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required ML libraries, the models are created and evaluated using appropriate metrics. The random forest performed best, with an accuracy of 80.17%, and was implemented in the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the four biggest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper that combines model development with the Django framework and deployment to the Alibaba cloud computing platform.

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud

Procedia PDF Downloads 97
3180 ANFIS Approach for Locating Faults in Underground Cables

Authors: Magdy B. Eteiba, Wael Ismael Wahba, Shimaa Barakat

Abstract:

This paper presents a fault identification, classification, and fault location estimation method based on the discrete wavelet transform and an Adaptive Network Fuzzy Inference System (ANFIS) for medium-voltage cables in the distribution system. Different faults and locations are simulated by ATP/EMTP, and then certain selected features of the wavelet-transformed signals are used as input for training the ANFIS. An accurate fault classifier and locator algorithm was then designed, trained, and tested using current samples only. The results obtained from the ANFIS output were compared with the real output. From the results, it was found that the percentage error between the ANFIS output and the real output is less than three percent. Hence, it can be concluded that the proposed technique offers high accuracy in both fault classification and fault location.

Keywords: ANFIS, fault location, underground cable, wavelet transform

Procedia PDF Downloads 478
3179 Similarities and Differences in Values of Young Women and Their Parents: The Effect of Value Transmission and Value Change

Authors: J. Fryt, K. Pietras, T. Smolen

Abstract:

Intergenerational similarities in values may be an effect of value transmission within families or of socio-cultural trends prevailing at a specific point in time. According to the salience hypothesis, salient family values may be transmitted more frequently. On the other hand, many value studies reveal a generational shift from social values (conservation and self-transcendence) to more individualistic values (openness to change and self-enhancement), suggesting that value transmission and value change are two different processes. The first aim of our study was to describe similarities and differences in the values of young women and their parents. The second aim was to determine which value similarities may be due to transmission within families. Ninety-seven Polish women aged 19-25, together with both their mothers and fathers, filled in the Portrait Values Questionnaire. Intergenerational similarities between the women were found in a strong preference for benevolence, universalism, and self-direction, as well as a low preference for power. Similarities between the younger women and the older men were found in a strong preference for universalism and hedonism, as well as a lower preference for security and tradition. The young women differed from the older generation in a strong preference for stimulation and achievement, as well as a low preference for conformity. To identify the origin of the intergenerational similarities (whether or not they are the effect of value transmission within families), we compared correlations of values in family dyads (mother-daughter, father-daughter) with the distribution of correlations in random intergenerational dyads (random mother-daughter, random father-daughter) as well as peer dyads (random daughter-daughter). Values representing conservation (security, tradition, and conformity), as well as benevolence and power, were transmitted between mothers and daughters. Achievement, power, and security were transmitted between fathers and daughters. Similarities in openness to change (self-direction, stimulation, and hedonism) and universalism were no stronger within families than in random intergenerational and peer dyads. Taken together, our findings suggest that despite a noticeable generational shift from social to more individualistic values, we can observe the transmission of parents' salient values such as security, tradition, benevolence, and achievement.

Keywords: value transmission, value change, intergenerational similarities, differences in values

Procedia PDF Downloads 403
3178 Multi-Scaled Non-Local Means Filter for Medical Images Denoising: Empirical Mode Decomposition vs. Wavelet Transform

Authors: Hana Rabbouch

Abstract:

In recent years, there has been considerable growth in denoising techniques devoted mainly to medical imaging. This important evolution is due not only to progress in computing techniques, but also to the emergence of multi-resolution analysis (MRA) on both mathematical and algorithmic bases. In this paper, a comparative study is conducted between the two best-known MRA-based decomposition techniques: the Empirical Mode Decomposition (EMD) and the Discrete Wavelet Transform (DWT). The comparison is carried out in a multi-scale denoising framework, in which a Non-Local Means (NLM) filter is applied scale by scale to a sample of benchmark medical images. The results prove the effectiveness of multi-scale denoising, especially when the NLM filtering is coupled with the EMD.
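
A sketch of the DWT arm of the comparison, assuming PyWavelets and scikit-image; the EMD arm would require an extra package such as PyEMD and is omitted.

```python
# Decompose, NLM-filter each scale's detail bands, reconstruct.
import numpy as np
import pywt
from skimage import data, util
from skimage.restoration import denoise_nl_means

noisy = util.random_noise(data.camera(), var=0.01).astype(np.float64)
coeffs = pywt.wavedec2(noisy, "db2", level=2)
denoised = [coeffs[0]] + [                       # keep the approximation band
    tuple(denoise_nl_means(band, h=0.05, patch_size=5, patch_distance=3)
          for band in level)
    for level in coeffs[1:]                      # filter detail bands per scale
]
print(pywt.waverec2(denoised, "db2").shape)
```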

Keywords: medical imaging, non local means, denoising, multiscaled analysis, empirical mode decomposition, wavelets

Procedia PDF Downloads 114
3177 A Study of Classification Models to Predict Drill-Bit Breakage Using Degradation Signals

Authors: Bharatendra Rai

Abstract:

Cutting tools are widely used in manufacturing processes, and drilling is the most commonly used machining process. Although the drill-bits used in drilling may not be expensive, their breakage can damage the expensive workpiece being drilled and, at the same time, has a major impact on productivity. Predicting drill-bit breakage is therefore important for reducing cost and improving productivity. This study uses twenty features extracted from two degradation signals, viz. thrust force and torque. The methodology involves developing and comparing decision tree, random forest, and multinomial logistic regression models for classifying and predicting drill-bit breakage using degradation signals.

Keywords: degradation signal, drill-bit breakage, random forest, multinomial logistic regression

Procedia PDF Downloads 322
3176 Investigation of Alfa Fibers Reinforced Epoxy-Amine Composites Properties

Authors: Amar Boukerrou, Ouerdia Belhadj, Dalila Hammiche, Jean Francois Gerard, Jannick Rumeau

Abstract:

The main goal of this study is to investigate the effect of alfa fiber content, treated by an alkali treatment, on the thermal and mechanical properties of epoxy-amine matrix-based composites. The fibers were treated with a 5% sodium hydroxide solution, and their content varied between 10% and 30% by weight. Tensile, flexural, and hardness tests were carried out to investigate the mechanical properties of the composites. The results show that the composites' mechanical properties are higher than those of the neat epoxy-amine. The alkali treatment was more effective for the tensile and flexural moduli than for the tensile and flexural strengths. The decline in both the tensile and flexural behavior of all composites with increasing filler content was probably due to the random dispersion of the fibers in the epoxy resin. Fourier transform infrared (FTIR) spectroscopy was employed to analyze the chemical structure of the epoxy resin before and after curing with the amine hardener. FTIR and DSC analyses confirmed that the epoxy resin was completely cured with the amine hardener at room temperature. SEM analysis highlighted the microstructure of the epoxy matrix and its composites.

Keywords: alfa fiber, epoxy resin, alkali treatment, mechanical properties

Procedia PDF Downloads 72
3175 An Approach Based on Statistics and Multi-Resolution Representation to Classify Mammograms

Authors: Nebi Gedik

Abstract:

One of the significant and continual public health problems in the world is breast cancer. Early detection is very important for fighting the disease, and mammography has been one of the most common and reliable methods to detect it in the early stages. However, it is a difficult task, and computer-aided diagnosis (CAD) systems are needed to assist radiologists in providing an accurate and uniform evaluation of masses in mammograms. In this study, a multi-resolution statistical method for classifying digitized mammograms as normal or abnormal is used to construct a CAD system. The mammogram images are represented by the wave atom transform, and this representation is formed by certain groups of coefficients, treated independently. The CAD system is designed by calculating statistical features from each group of coefficients. The classification is performed using a support vector machine (SVM).

Keywords: wave atom transform, statistical features, multi-resolution representation, mammogram

Procedia PDF Downloads 193
3174 Improved Computational Efficiency of Machine Learning Algorithm Based on Evaluation Metrics to Control the Spread of Coronavirus in the UK

Authors: Swathi Ganesan, Nalinda Somasiri, Rebecca Jeyavadhanam, Gayathri Karthick

Abstract:

The COVID-19 crisis presents a substantial and critical hazard to worldwide health. Since the occurrence of the disease in the UK in late January 2020, the number of people confirmed to have acquired the illness has increased tremendously across the country, and the number of individuals affected is undoubtedly considerable. The purpose of this research is to develop a predictive machine learning model that can forecast COVID-19 cases within the UK. This study concentrates on statistical data collected from 31st January 2020 to 31st March 2021 in the United Kingdom. Information on total COVID cases registered, new cases encountered daily, total deaths registered, and deaths per day due to Coronavirus is collected from the World Health Organisation (WHO). Data preprocessing is carried out to identify any missing values, outliers, or anomalies in the dataset. The data are split in an 8:2 ratio for training and testing purposes to forecast future new COVID cases. Support Vector Machines (SVM), Random Forests, and linear regression algorithms are chosen to study the model performance in predicting new COVID-19 cases. The statistical performance of the models is evaluated using metrics such as the r-squared value and mean squared error. Random Forest outperformed the other two machine learning algorithms, with a training accuracy of 99.47% and a testing accuracy of 98.26% when n=30. The mean squared error obtained for Random Forest is 4.05e11, which is lower than that of the other predictive models used in this study. From the experimental analysis, the Random Forest algorithm performs more effectively and efficiently in predicting new COVID cases, which could help the health sector take relevant control measures against the spread of the virus.

Keywords: COVID-19, machine learning, supervised learning, unsupervised learning, linear regression, support vector machine, random forest

Procedia PDF Downloads 87
3173 Transform to Succeed: An Empirical Analysis of Digital Transformation in Firms

Authors: Sarah E. Stief, Anne Theresa Eidhoff, Markus Voeth

Abstract:

Despite all progress, firms face an increasing need to adapt and assimilate digital technologies to transform their business activities in order to pursue business development. By using new digital technologies, firms can implement major business improvements in order to stay competitive and foster new growth potential. The corresponding phenomenon of digital transformation has received some attention in previous literature with respect to industries such as media and publishing. Nevertheless, there is a lack of understanding of the concept and its organization within firms. With the help of twenty-three in-depth field interviews with German experts responsible for their company's digital transformation, we examined what digital transformation encompasses, how it is organized, and which opportunities and challenges arise within firms. Our results indicate that digital transformation is an inevitable task for all firms, as it bears the potential to comprehensively optimize and reshape established business activities and can thus be seen as a strategy of business development.

Keywords: business development, digitalization, digital strategies, digital transformation

Procedia PDF Downloads 385
3172 Different Sampling Schemes for Semi-Parametric Frailty Model

Authors: Nursel Koyuncu, Nihal Ata Tutkun

Abstract:

The frailty model is a survival model that accounts for unobserved heterogeneity when exploring the relationship between the survival of an individual and several covariates. In recent years, proposed survival models have become more complex, and this complexity causes convergence problems, especially in large data sets. Therefore, the selection of samples from these big data sets is very important for parameter estimation. In the sampling literature, some authors have defined new sampling schemes to estimate the parameters correctly. To this end, we examine the effect of sampling design on the semi-parametric frailty model. We conducted a simulation study in R to estimate the parameters of the semi-parametric frailty model for different sample sizes and censoring rates under classical simple random sampling and ranked set sampling schemes. In the simulation study, we used as the population a data set recording 17,260 male civil servants aged 40-64 years with complete 10-year follow-up. Time to death from coronary heart disease is treated as the survival time, and age and systolic blood pressure are used as covariates. We select 1000 samples from the population using the different sampling schemes and estimate the parameters. From the simulation study, we conclude that the ranked set sampling design performs better than simple random sampling in each scenario.
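
The study itself was run in R; purely as an illustration, the Python sketch below contrasts ranked set sampling with simple random sampling.

```python
# RSS: for each rank i, draw a set of m units, sort it, keep the i-th order statistic.
import numpy as np

def ranked_set_sample(population, m=5, cycles=40, rng=np.random.default_rng(0)):
    sample = []
    for _ in range(cycles):
        for i in range(m):
            group = rng.choice(population, size=m, replace=False)
            sample.append(np.sort(group)[i])     # keep the i-th ranked unit
    return np.array(sample)

pop = np.random.default_rng(1).exponential(scale=10.0, size=17_260)
rss = ranked_set_sample(pop)
srs = np.random.default_rng(2).choice(pop, size=rss.size, replace=False)
print("RSS mean:", rss.mean(), "SRS mean:", srs.mean(), "true:", pop.mean())
```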

Keywords: frailty model, ranked set sampling, efficiency, simple random sampling

Procedia PDF Downloads 183
3171 Linear Fractional Differential Equations for Second Kind Modified Bessel Functions

Authors: Jorge Olivares, Fernando Maass, Pablo Martin

Abstract:

Fractional derivatives have recently been considered as a way to solve different problems in engineering. In this spirit, second kind modified Bessel functions are considered here. The fractional differential equations of order α for the second kind Bessel functions, Kᵥ(x), are studied with simple initial conditions. The Laplace transform and the Caputo definition of fractional derivatives are considered. Solutions have been found for ν = 1/3, 1/2, 2/3, -1/3, -1/2, and -2/3. In these cases, the solutions are the sum of two hypergeometric functions. The fractional derivatives of order α have been considered for α = 1/3, 1/2, and 2/3, and the above values of ν. No convergence has been found for integer values of ν. Furthermore, when α is taken as a general rational number m/p, no general solution has been found. Clearly, this case is more difficult to treat than that of the first kind Bessel functions.
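
For reference, the Caputo definition used above for 0 < α < 1, together with its Laplace transform, reads:

```latex
{}^{C}\!D^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)} \int_0^t \frac{f'(\tau)}{(t-\tau)^{\alpha}} \, d\tau,
\qquad
\mathcal{L}\left\{ {}^{C}\!D^{\alpha} f \right\}(s) = s^{\alpha} F(s) - s^{\alpha-1} f(0)
```

The boundary term in the Laplace transform is what lets simple initial conditions f(0) enter the fractional equations directly.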

Keywords: Caputo, modified Bessel functions, hypergeometric, linear fractional differential equations, Laplace transform

Procedia PDF Downloads 306
3170 Vibration Imaging Method for Vibrating Objects with Translation

Authors: Kohei Shimasaki, Tomoaki Okamura, Idaku Ishii

Abstract:

We propose a vibration imaging method for high frame rate (HFR)-video-based localization of vibrating objects with large translations. When the ratio of the translation speed of a target to its vibration frequency is large, obtaining its frequency response in image intensities becomes difficult because one or no waves are observable at the same pixel. Our method can precisely localize moving objects with vibration by virtually translating multiple image sequences for pixel-level short-time Fourier transform to observe multiple waves at the same pixel. The effectiveness of the proposed method is demonstrated by analyzing several HFR videos of flying insects in real scenarios.
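
The core per-pixel operation is an ordinary short-time Fourier transform of the (virtually translated) intensity trace. A minimal sketch on a synthetic single-pixel trace, assuming SciPy:

```python
# Pixel-level STFT: locate the dominant vibration frequency in one intensity trace.
import numpy as np
from scipy.signal import stft

fps = 2000.0                                   # HFR camera frame rate
t = np.arange(4096) / fps
pixel_trace = np.sin(2 * np.pi * 120 * t) + 0.2 * np.random.randn(t.size)  # 120 Hz

f, seg_t, Z = stft(pixel_trace, fs=fps, nperseg=256)
print("peak frequency:", f[np.abs(Z).mean(axis=1).argmax()], "Hz")
```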

Keywords: HFR video analysis, pixel-level vibration source localization, short-time Fourier transform, virtual translation

Procedia PDF Downloads 82
3169 Robust and Transparent Spread Spectrum Audio Watermarking

Authors: Ali Akbar Attari, Ali Asghar Beheshti Shirazi

Abstract:

In this paper, we propose a blind and robust audio watermarking scheme based on spread spectrum in the Discrete Wavelet Transform (DWT) domain. Watermarks are embedded in the low-frequency coefficients, which are less audible. The key idea is to divide the audio signal into small frames and modify the magnitudes of the 6th-level DWT approximation coefficients based on the Direct Sequence Spread Spectrum (DSSS) technique. A psychoacoustic model is used to enhance imperceptibility, and a Savitzky-Golay filter to increase extraction accuracy. The experimental results illustrate high robustness against the most common attacks, i.e., Gaussian noise addition, low-pass filtering, resampling, requantization, and MP3 compression, without significant perceptual distortion (ODG higher than -1). The proposed scheme has a data payload of about 83 bps.
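
A hedged sketch of the embedding step, assuming PyWavelets; the frame, strength, and PN generation below are illustrative, not the paper's exact parameters.

```python
# Spread one watermark bit over a frame by nudging 6th-level DWT approximation
# magnitudes with a pseudo-noise (PN) chip sequence.
import numpy as np
import pywt

frame = np.random.default_rng(7).normal(size=4096)   # stand-in for one audio frame
key_rng = np.random.default_rng(42)                  # shared key -> same chips at decoder

coeffs = pywt.wavedec(frame, "db4", level=6)
approx = coeffs[0]
pn = key_rng.choice([-1.0, 1.0], size=approx.size)   # direct-sequence chips
bit, alpha = 1, 0.01                                 # watermark bit, embedding strength
coeffs[0] = approx + alpha * (1.0 if bit else -1.0) * pn * np.abs(approx)
watermarked = pywt.waverec(coeffs, "db4")
print(watermarked.shape)
```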

Keywords: audio watermarking, spread spectrum, discrete wavelet transform, psychoacoustic, Savitsky-Golay filter

Procedia PDF Downloads 171
3168 Integrating Process Planning, WMS Dispatching, and WPPW Weighted Due Date Assignment Using a Genetic Algorithm

Authors: Halil Ibrahim Demir, Tarık Cakar, Ibrahim Cil, Muharrem Dugenci, Caner Erden

Abstract:

Conventionally, process planning, scheduling, and due-date assignment functions are performed separately and sequentially. The interdependence of these functions requires integration. Although integrated process planning and scheduling, and scheduling with due date assignment problems are popular research topics, only a few works address the integration of these three functions. This work focuses on the integration of process planning, WMS scheduling, and WPPW due date assignment. Another novelty of this work is the use of a weighted due date assignment. In the literature, due dates are generally assigned without considering the importance of customers. However, in this study, more important customers get closer due dates. Typically, only tardiness is punished, but the JIT philosophy punishes both earliness and tardiness. In this study, all weighted earliness, tardiness, and due date related costs are penalized. As no customer desires distant due dates, such distant due dates should be penalized. In this study, various levels of integration of these three functions are tested and genetic search and random search are compared both with each other and with ordinary solutions. Higher integration levels are superior, while search is always useful. Genetic searches outperformed random searches.
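
As a small illustration, the sketch below evaluates a weighted earliness/tardiness/due-date penalty of the kind described; the exact WPPW weighting is an assumption here.

```python
# Per-job cost: customer weight times penalized earliness, tardiness, and due-date
# slack (distant due dates are penalized too). Coefficients are illustrative.
def job_cost(completion, due, weight, a=1.0, b=2.0, c=0.5):
    earliness = max(due - completion, 0)
    tardiness = max(completion - due, 0)
    return weight * (a * earliness + b * tardiness + c * due)

print(job_cost(completion=12, due=10, weight=1.5))   # tardy, important customer
```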

Keywords: process planning, weighted scheduling, weighted due-date assignment, genetic algorithm, random search

Procedia PDF Downloads 352
3167 Fusion Models for Cyber Threat Defense: Integrating Clustering, Random Forests, and Support Vector Machines to Against Windows Malware

Authors: Azita Ramezani, Atousa Ramezani

Abstract:

In the ever-escalating landscape of Windows malware, the necessity for pioneering defense strategies becomes undeniable. This study introduces an avant-garde approach fusing the capabilities of clustering, random forests, and support vector machines (SVM) to combat the intricate web of cyber threats. Our fusion model triumphs with a staggering accuracy of 98.67 and an equally formidable F1-score of 98.68, a testament to its effectiveness in the realm of Windows malware defense. By deciphering the intricate patterns within malicious code, our model not only raises the bar for detection precision but also redefines the paradigm of cybersecurity preparedness. This breakthrough underscores the potential embedded in the fusion of diverse analytical methodologies and signals a paradigm shift in fortifying against the relentless evolution of Windows malicious threats. As we traverse the dynamic cybersecurity terrain, this research serves as a beacon illuminating the path toward a resilient future in which innovative fusion models stand at the forefront of cyber threat defense.
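
One plausible reading of the fusion, hedged since the paper's exact architecture is not given here: k-means cluster membership appended as a feature, followed by a soft vote over a random forest and an SVM.

```python
# Fusion sketch on synthetic data: clustering-derived feature + RF/SVM soft voting.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
clusters = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)
Xf = np.column_stack([X, clusters])                    # fused feature set

fusion = VotingClassifier(
    [("rf", RandomForestClassifier(random_state=0)),
     ("svm", make_pipeline(StandardScaler(), SVC(probability=True, random_state=0)))],
    voting="soft")
fusion.fit(Xf[:500], y[:500])
print("holdout accuracy:", fusion.score(Xf[500:], y[500:]))
```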

Keywords: fusion models, cyber threat defense, windows malware, clustering, random forests, support vector machines (SVM), accuracy, f1-score, cybersecurity, malicious code detection

Procedia PDF Downloads 35