Search results for: hybrid PSO-GA algorithm and mutual information
14459 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm
Authors: Ameur Abdelkader, Abed Bouarfa Hafida
Abstract:
Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making capabilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when faced with large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured), and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to the algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.
Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm
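The CART principle the abstract builds on, recursively finding the split that best separates the predicted variable, can be illustrated with a minimal single-feature regression split search. This is a generic sketch of the standard variance-reduction criterion, not the authors' distributed extension:

```python
def best_split(xs, ys):
    """Exhaustively search the threshold that minimises the weighted
    variance (sum of squared errors) of the two child nodes."""
    def sse(vals):  # sum of squared errors around the mean
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best_t, best_cost = None, float("inf")
    for t in sorted(set(xs))[:-1]:          # candidate thresholds
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t, best_cost

# A step function is split exactly at the jump:
t, cost = best_split([1, 2, 3, 10, 11, 12], [5, 5, 5, 9, 9, 9])
```

Full CART repeats this search over every feature and recurses on the two children; the paper's contribution concerns making this kind of search parallelizable and distributable.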
Procedia PDF Downloads 142
14458 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines
Authors: Kamyar Tolouei, Ehsan Moosavi
Abstract:
In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem that involves constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally restricts the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important advances in long-term production scheduling and its optimization algorithms, as researchers have become highly cognizant of the issue; even so, the LTPSOP cannot be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply an estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes most mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risks from grade uncertainty in ore reserves can be evaluated and organized through a set of equally probable orebody realizations. In this paper, to synthesize grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective function of the model is to simultaneously maximize the net present value and minimize the risk of deviation from the production targets under grade uncertainty, while satisfying all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied to synthesize grade uncertainty in the strategic mine schedule and to produce a more profitable and risk-aware production schedule.
A mixture of metaheuristic procedures and mathematical methods paves the way to an appropriate solution. This paper introduces a hybrid model combining the augmented Lagrangian relaxation (ALR) method and a metaheuristic algorithm, Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty. In this study, the HHO is employed to update the Lagrange multipliers. In addition, a machine learning method, Random Forest, is applied to estimate the gold grade in a mineral deposit. The Monte Carlo method is used as the simulation method, with 20 realizations. The results indicate that the proposed versions perform considerably better than the traditional methods. The outcomes were also compared with the ALR-genetic algorithm and the ALR-subgradient method. To demonstrate the applicability of the model, a case study on an open-pit gold mining operation is implemented. The framework shows the capability to minimize risk and to improve the expected net present value and financial profitability for the LTPSOP. Considering grade uncertainty, the hybrid framework controls geological risk more effectively than the traditional procedure.
Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization
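For context, the classical ALR-subgradient baseline that the paper compares against updates each Lagrange multiplier along its constraint violation and projects it back to zero; a minimal sketch (the step size and sign convention here are standard textbook assumptions, not values from the paper, and the HHO-driven update replaces exactly this step):

```python
def update_multipliers(lams, violations, step):
    """Projected subgradient update of Lagrange multipliers: move each
    multiplier along its constraint violation, clipped at zero."""
    return [max(0.0, l + step * g) for l, g in zip(lams, violations)]

# One update: the first constraint is violated (+1.2), the second slack (-0.3).
lams = update_multipliers([0.5, 0.0], [1.2, -0.3], step=0.1)
```

Violated constraints see their penalty grow, slack constraints see it shrink toward zero, which is the mechanism the metaheuristic update tries to accelerate.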
Procedia PDF Downloads 105
14457 An Improvement of Multi-Label Image Classification Method Based on Histogram of Oriented Gradient
Authors: Ziad Abdallah, Mohamad Oueidat, Ali El-Zaart
Abstract:
Image Multi-label Classification (IMC) assigns a label or a set of labels to an image. The high demand for image annotation and archiving on the web has driven researchers to develop many algorithms for this application domain. The existing techniques for IMC have two drawbacks: the description of elementary characteristics of the image and the correlation between labels are not taken into account. In this paper, we present an algorithm (MIML-HOGLPP) that simultaneously handles these limitations. The algorithm uses the histogram of oriented gradients as feature descriptor and applies the Label Priority Power-set as the multi-label transformation to solve the problem of label correlation. The experiments show that MIML-HOGLPP performs better on several evaluation metrics compared with two existing techniques.
Keywords: data mining, information retrieval system, multi-label, problem transformation, histogram of gradients
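The power-set family of transformations mentioned above maps each distinct label combination to a single class, turning a multi-label problem into an ordinary multi-class one; a minimal sketch of the plain power-set idea (the priority weighting that distinguishes the paper's Label Priority Power-set is not reproduced here):

```python
def powerset_transform(label_sets):
    """Map each distinct set of labels to one multi-class id, turning a
    multi-label problem into a single-label one."""
    classes = {}   # frozenset of labels -> class id
    encoded = []
    for labels in label_sets:
        key = frozenset(labels)
        if key not in classes:
            classes[key] = len(classes)
        encoded.append(classes[key])
    return encoded, classes

y, classes = powerset_transform([{"sky", "sea"}, {"sky"}, {"sky", "sea"}])
```

Because whole label sets become classes, correlations between labels are preserved by construction, at the cost of one class per observed combination.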
Procedia PDF Downloads 374
14456 Collocation Method Using Quartic B-Splines for Solving the Modified RLW Equation
Authors: A. A. Soliman
Abstract:
The Modified Regularized Long Wave (MRLW) equation is solved numerically by a new algorithm based on the collocation method, using quartic B-splines at the mid-knot points as element shape functions. The fourth-order Runge-Kutta method is used for solving the resulting system of first-order ordinary differential equations, instead of a finite difference method. Test problems, including the migration and interaction of solitary waves, are used to validate the algorithm, which is found to be accurate and efficient. The three invariants of the motion are evaluated to determine the conservation properties of the algorithm. The temporal evolution of a Maxwellian initial pulse is then studied.
Keywords: collocation method, MRLW equation, quartic B-splines, solitons
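The time integrator named above is the classical fourth-order Runge-Kutta scheme; one generic step for a system y' = f(t, y) looks like the following (a textbook sketch, independent of the B-spline spatial discretization):

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# y' = y with y(0) = 1, one step of h = 0.1: result should be close to e^0.1
y = rk4_step(lambda t, y: y, 0.0, 1.0, 0.1)
```

In the paper's setting, y would be the vector of B-spline coefficients and f the spatially discretized MRLW right-hand side.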
Procedia PDF Downloads 303
14455 Influence of Parameters of Modeling and Data Distribution for Optimal Condition on Locally Weighted Projection Regression Method
Authors: Farhad Asadi, Mohammad Javad Mollakazemi, Aref Ghafouri
Abstract:
Recent research in neural network science and neuroscience on modeling complex time series data and statistical learning has focused mostly on learning from high-dimensional input spaces and signals. Local linear models are a strong choice for modeling local nonlinearity in data series. Locally weighted projection regression (LWPR) is a flexible and powerful algorithm for nonlinear approximation in high-dimensional signal spaces. In this paper, different learning scenarios with one- and two-dimensional data series of different distributions are investigated by simulation; noise is then added to the data to create disordered distributions and to evaluate the algorithm's ability to predict local nonlinearity. The performance of the algorithm is simulated, and its sensitivity to the data distribution, together with the influence of the important local-validity parameter, is explained for cases where the data are widely distributed or scarce.
Keywords: local nonlinear estimation, LWPR algorithm, online training method, locally weighted projection regression method
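LWPR builds on locally weighted linear fits; a minimal sketch of its core ingredient, a single Gaussian-weighted linear fit evaluated at a query point, is shown below. This is plain locally weighted regression; LWPR itself adds incremental projection regression and many local models on top:

```python
import math

def loess_predict(xs, ys, x0, bandwidth=1.0):
    """Weighted least-squares line fitted around x0 with Gaussian weights,
    evaluated at x0 (the core of locally weighted regression)."""
    w = [math.exp(-((x - x0) ** 2) / (2 * bandwidth ** 2)) for x in xs]
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, xs)) / sw   # weighted mean of x
    my = sum(wi * yi for wi, yi in zip(w, ys)) / sw   # weighted mean of y
    num = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, xs, ys))
    den = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, xs))
    slope = num / den if den else 0.0
    return my + slope * (x0 - mx)

# On exactly linear data y = 2x the local fit recovers the line:
pred = loess_predict([0, 1, 2, 3], [0, 2, 4, 6], 1.5)
```

The bandwidth plays the role of the local-validity parameter discussed in the abstract: it controls how far each local model's influence extends.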
Procedia PDF Downloads 502
14454 Effect of Cryogenic Treatment on Hybrid Natural Fiber Reinforced Polymer Composites
Authors: B. Vinod, L. J. Sudev
Abstract:
Natural fibers as reinforcement in polymer matrix materials have been gaining a lot of attention in recent years. Natural fibers like jute, sisal, coir, hemp, and banana have attracted substantial interest as potential structural materials because of their attractive features and good mechanical properties. Cryogenic applications of natural fiber reinforced polymer composites are gaining importance, and these materials need to possess good mechanical and physical properties at cryogenic temperatures to meet the high requirements imposed by cryogenic engineering applications. The objective of this work is to investigate the mechanical behavior of a hybrid hemp/jute fiber reinforced epoxy composite at liquid nitrogen temperature. The composite is prepared by the hand lay-up method, and test specimens are cut according to ASTM standards. These specimens are dipped in liquid nitrogen for different durations. The tensile properties, flexural properties, and impact strength of the specimens are tested immediately after removal from the liquid nitrogen container. The experimental results indicate that cryogenic treatment has a significant effect on the mechanical properties of this material: the tensile and flexural properties of the composite after exposure to liquid nitrogen temperature are higher than at room temperature, while the impact strength decreases after the treatment.
Keywords: liquid nitrogen temperature, polymer composite, tensile properties, flexural properties
Procedia PDF Downloads 40314453 Hybrid Treatment Method for Decolorization of Mixed Dyes: Rhodamine-B, Brilliant Green and Congo Red
Authors: D. Naresh Yadav, K. Anand Kishore, Bhaskar Bethi, Shirish H. Sonawane, D. Bhagawan
Abstract:
The untreated industrial wastewater discharged into the environment causes the contamination of soil, water, and air. Advanced treatment methods are attracting substantial interest as complements to the currently employed unit processes in wastewater treatment. The textile industry is one of the predominant producers of wastewater in today's industrialized world, and the dyes rejected by it need to be properly treated before discharge into water bodies. In the present investigation, a hybrid treatment process has been developed for the treatment of synthetic mixed dye wastewater. Photocatalysis and a ceramic nanoporous membrane are combined in the process integration to minimize fouling and increase flux. Commercial semiconducting powders (TiO2 and ZnO) have been used as nanophotocatalysts for the degradation of mixed dye in the hybrid system, and commercial ceramic nanoporous tubular membranes have been used for the rejection of dye and suspended catalysts. Photocatalysis alone achieved an average decolorization of 34% (RB: 32%, BG: 34%, CR: 36%), whereas ceramic nanofiltration achieved 56% (RB: 54%, BG: 56%, CR: 58%). Integrating photocatalysis and ceramic nanofiltration achieved 96% decolorization (RB: 94%, BG: 96%, CR: 98%) over 90 min of operation.
Keywords: photocatalysis, ceramic nanoporous membrane, wastewater treatment, advanced oxidation process, process integration
Procedia PDF Downloads 26414452 Robust Data Image Watermarking for Data Security
Authors: Harsh Vikram Singh, Ankur Rai, Anand Mohan
Abstract:
In this paper, we propose a secure and robust data hiding algorithm based on the DCT, the Arnold transform, and a chaotic sequence. The watermark image is scrambled by the Arnold cat map to increase its security, and the chaotic map is then used to spread the watermark signal in the middle band of the DCT coefficients of the cover image. The chaotic map serves as a pseudo-random generator for digital data hiding, increasing security and robustness. Performance evaluation of the robustness and imperceptibility of the proposed algorithm has been made using the bit error rate (BER), normalized correlation (NC), and peak signal-to-noise ratio (PSNR) for different watermark and cover images, such as Lena, Girl, and Tank, and for different gain factors. We use a binary logo image and a text image as watermarks. The experimental results demonstrate that the proposed algorithm achieves higher security and robustness against JPEG compression, as well as other attacks such as noise addition, low-pass filtering, and cropping, compared to other existing DCT-based algorithms. Moreover, the proposed algorithm does not need the original cover image to recover the watermark.
Keywords: data hiding, watermarking, DCT, chaotic sequence, Arnold transform
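The Arnold cat map scrambling step is the pixel permutation (x, y) → (x + y, x + 2y) mod N on an N×N image; iterating it shuffles the pixels, and because the map is periodic, iterating long enough restores the original. A minimal sketch on a tiny image:

```python
def arnold_cat(img):
    """One iteration of the Arnold cat map on a square image (list of rows)."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            # pixel at (x, y) moves to ((x + y) mod n, (x + 2y) mod n)
            out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
    return out

img = [[1, 2], [3, 4]]
scrambled = arnold_cat(img)
```

For a 2×2 image the map has period 3, so applying it three times returns the original; a real watermarking scheme stops after a secret number of iterations, which acts as a key.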
Procedia PDF Downloads 515
14451 Wait-Optimized Scheduler Algorithm for Efficient Process Scheduling in Computer Systems
Authors: Md Habibur Rahman, Jaeho Kim
Abstract:
Efficient process scheduling is a crucial factor in ensuring optimal system performance and resource utilization in computer systems. While various algorithms have been proposed over the years, there are still limitations to their effectiveness. This paper introduces a new Wait-Optimized Scheduler (WOS) algorithm that aims to minimize process waiting time by dividing processes into two layers and considering both process time and waiting time. The WOS algorithm is non-preemptive and prioritizes processes with the shortest WOS value. In the first layer, each process runs for a predetermined duration, and any unfinished process is subsequently moved to the second layer, resulting in a decrease in response time. Whenever the first layer is free, or the number of processes in the second layer is twice that of the first layer, the algorithm sorts all the processes in the second layer based on their remaining time minus waiting time and sends one process to the first layer to run. This ensures that all processes eventually run, optimizing waiting time. To evaluate the performance of the WOS algorithm, we conducted experiments comparing it with traditional scheduling algorithms such as First-Come-First-Serve (FCFS) and Shortest-Job-First (SJF). The results showed that the WOS algorithm outperformed the traditional algorithms in reducing the waiting time of processes, particularly in scenarios with a large number of short tasks with long wait times. Our study highlights the effectiveness of the WOS algorithm in improving process scheduling efficiency in computer systems. By reducing process waiting time, the WOS algorithm can improve system performance and resource utilization.
The findings of this study provide valuable insights for researchers and practitioners in developing and implementing efficient process scheduling algorithms.
Keywords: process scheduling, wait-optimized scheduler, response time, non-preemptive, waiting time, traditional scheduling algorithms, first-come-first-serve, shortest-job-first, system performance, resource utilization
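The FCFS and SJF baselines the paper compares against reduce to computing waiting times under a fixed, non-preemptive run order; a minimal sketch of that comparison (the two-layer WOS logic itself is not reproduced here):

```python
def avg_wait(burst_times, order):
    """Average waiting time for a non-preemptive schedule run in `order`."""
    t, waits = 0, []
    for i in order:
        waits.append(t)        # process i waits until everything before it ran
        t += burst_times[i]
    return sum(waits) / len(waits)

bursts = [6, 8, 7, 3]          # CPU burst per process, all arriving at t = 0
fcfs = avg_wait(bursts, [0, 1, 2, 3])                                   # arrival order
sjf = avg_wait(bursts, sorted(range(len(bursts)), key=lambda i: bursts[i]))  # shortest first
```

Here SJF cuts the average wait from 10.25 to 7.0 on the same workload; WOS targets the same quantity while also bounding how long a process can be starved in the second layer.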
Procedia PDF Downloads 91
14450 Hybrid Velocity Control Approach for Tethered Aerial Vehicle
Authors: Lovesh Goyal, Pushkar Dave, Prajyot Jadhav, GonnaYaswanth, Sakshi Giri, Sahil Dharme, Rushika Joshi, Rishabh Verma, Shital Chiddarwar
Abstract:
With the rising need for human-robot interaction, researchers have proposed and tested multiple models with varying degrees of success. A few of these models are performed on aerial platforms, commonly known as Tethered Aerial Systems. These aerial vehicles may be powered continuously through a tether cable, which addresses the predicament of the short battery life of quadcopters. Such systems find applications in minimizing human effort in industrial, medical, agricultural, and service uses. However, a significant challenge in employing them is that it necessitates attaining smooth and secure robot-human interaction while ensuring that the forces from the tether remain within the range comfortable for humans. To tackle this problem, a hybrid control method is implemented that switches between two control techniques: a constant control input and a steady-state solution. The constant control approach is used when the person is far from the target location and the error can be treated as effectively constant. The controller switches to the steady-state approach when the person comes within a specific range of the goal position. Both strategies take human velocity feedback into account. This hybrid technique improves the outcomes by assisting the person in reaching the desired location while reducing unwanted disturbance to the human throughout the process, thereby keeping the interaction between the robot and the subject smooth.
Keywords: unmanned aerial vehicle, tethered system, physical human-robot interaction, hybrid control
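The switching logic described above can be sketched as a one-dimensional command law: a fixed-magnitude command far from the goal, a feedback law with human-velocity feedback near it. The threshold, gains, and the linear form of the near-goal law are all assumptions for illustration; the paper's steady-state solution is not reproduced:

```python
def hybrid_velocity_cmd(error, v_human, threshold=1.0, u_const=0.5, k=0.8, kv=0.2):
    """Hybrid command: constant input when far from the goal,
    feedback (steady-state phase) when near it. Gains are illustrative."""
    if abs(error) > threshold:
        return u_const if error > 0 else -u_const   # constant-input phase
    return k * error + kv * v_human                 # near-goal feedback phase

far = hybrid_velocity_cmd(5.0, 0.0)    # far from goal: saturated command
near = hybrid_velocity_cmd(0.5, 0.0)   # near goal: proportional command
```

The point of the switch is that the command magnitude, and hence the tether force felt by the person, stays bounded while still shrinking smoothly to zero at the goal.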
Procedia PDF Downloads 98
14449 The Experience of Community-based Tourism in Yunguilla, Ecuador and Its Social-Cultural Impact
Authors: York Neudel
Abstract:
The phenomenon of tourism has been considered a tool to overcome cultural frontiers, to comprehend the other, and to cope with mutual mistrust and suspicion. Well, that has been a myth, at least when it comes to mass tourism. Other approaches, like community-based tourism, are still based on the idea of embracing the other in order to help or to understand cultural difference. In 1997, two American NGOs incentivized a tourism project in a community in the highlands of Ecuador in order to protect the cloud forest from destructive exploitation by its own inhabitants. Nineteen years later, this investigation analyzes the interactions between the Ecuadorian hosts of the mestizo community of Yunguilla and foreign tourists in their quest for "authentic life" in the Ecuadorian cloud forest. As a sort of "contemporary pilgrim", the traveller tries to find authenticity in other times and places, far away from everyday life in Europe or North America. Tourists are therefore guided by stereotypes and expectations produced by the tourist industry. The host, on the other hand, has to negotiate this pre-established imaginary. This generates a kind of theatre play with front- and backstage in organic gardens, small factories, and even private housing, since this alternative project offers to share the hosts' private space with the tourists. In order to protect their privacy, the community creates new hybrid spaces that oscillate between front- and backstage, culminating in a game of hide-and-seek, a phenomenon that promises interesting frictions for an anthropological case study.
Keywords: tourism, authenticity, community-based tourism, Ecuador, Yunguilla
Procedia PDF Downloads 284
14448 Comparison of ANFIS Update Methods Using Genetic Algorithm, Particle Swarm Optimization, and Artificial Bee Colony
Authors: Michael R. Phangtriastu, Herriyandi Herriyandi, Diaz D. Santika
Abstract:
This paper presents a comparison of the implementation of metaheuristic algorithms to train the antecedent and consequent parameters of the adaptive network-based fuzzy inference system (ANFIS). The algorithms compared are the genetic algorithm (GA), particle swarm optimization (PSO), and artificial bee colony (ABC). The objective of this paper is to benchmark these well-known metaheuristic algorithms. The algorithms are applied to several data sets of different natures, and combinations of the algorithms' parameters are tested. In all algorithms, different population sizes are tested; in PSO, combinations of velocity parameters are tested; in ABC, different values of the abandonment limit are tested. The experiments show that ABC is more reliable than the other algorithms: ABC achieves a better mean square error (MSE) than the other algorithms on all data sets.
Keywords: ANFIS, artificial bee colony, genetic algorithm, metaheuristic algorithm, particle swarm optimization
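Of the three compared optimizers, PSO has the most compact update rule: each particle's velocity is pulled toward its personal best and the swarm's global best. A minimal one-dimensional sketch minimizing f(x) = x² (inertia and acceleration constants are common textbook values, not the paper's tuned settings):

```python
import random

def pso_minimize(f, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Basic one-dimensional particle swarm optimization."""
    random.seed(0)                        # fixed seed for reproducibility
    xs = [random.uniform(-10, 10) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                         # each particle's personal best
    gbest = min(xs, key=f)                # swarm's global best
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vs[i] = (w * vs[i]
                     + c1 * r1 * (pbest[i] - xs[i])   # pull toward personal best
                     + c2 * r2 * (gbest - xs[i]))     # pull toward global best
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
                if f(xs[i]) < f(gbest):
                    gbest = xs[i]
    return gbest

best = pso_minimize(lambda x: x * x)
```

In the ANFIS setting, x would be the vector of antecedent and consequent parameters and f the training MSE.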
Procedia PDF Downloads 352
14447 A Similarity/Dissimilarity Measure to Biological Sequence Alignment
Authors: Muhammad A. Khan, Waseem Shahzad
Abstract:
Analysis of protein sequences is carried out to discover their structural and ancestral relationships. Sequence similarity determines similar protein structures and similar functions, and underpins homology detection. Biological sequences, composed of amino acid residues or nucleotides, provide significant information through sequence alignment. In this paper, we present a new similarity/dissimilarity measure for sequence alignment based on the primary structure of a protein. The approach finds the distance between two given sequences using a novel sequence alignment algorithm and a mathematical model. The algorithm runs with a time complexity of O(n²). A distance matrix is generated to construct a phylogenetic tree of different species. The new similarity/dissimilarity measure outperforms other existing methods.
Keywords: alignment, distance, homology, mathematical model, phylogenetic tree
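The abstract does not spell out its distance formula, but the classic O(n²) dynamic-programming edit distance illustrates the same complexity class for pairwise sequence comparison and is the usual starting point for such measures:

```python
def edit_distance(a, b):
    """Levenshtein distance between two sequences in O(len(a) * len(b))
    time, keeping only one previous row of the DP table."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # (mis)match
        prev = cur
    return prev[-1]

d = edit_distance("kitten", "sitting")
```

Filling such pairwise distances into a matrix over all species is exactly the input a phylogenetic tree builder (e.g. neighbor joining) consumes.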
Procedia PDF Downloads 178
14446 A Hybrid Film: NiFe₂O₄ Nanoparticles in Poly-3-Hydroxybutyrate as an Antibacterial Agent
Authors: Karen L. Rincon-Granados, América R. Vázquez-Olmos, Adriana-Patricia Rodríguez-Hernández, Gina Prado-Prone, Margarita Rivera, Roberto Y. Sato-Berrú
Abstract:
In this work, a hybrid film based on poly-3-hydroxybutyrate (P3HB) and nickel ferrite (NiFe₂O₄) nanoparticles (NPs) was obtained by a simple and reproducible methodology in order to study its antibacterial and cytotoxic properties. The motivation for this research is current antimicrobial resistance (AMR), a threat to human health and development worldwide. AMR is driven by the emergence of bacterial strains resistant to the traditional antibiotics used as treatment, hence the need to investigate new alternatives for preventing and treating bacterial infections. In this sense, metal oxide NPs have aroused great interest due to their unique physicochemical properties. However, their use is limited by their nanostructured form: they are commonly obtained by chemical and physical synthesis methods as powders or colloidal dispersions. This motivates the incorporation of nanostructured materials into polymer matrices to obtain hybrid materials that can disinfect surfaces and prevent the spread of bacteria. Accordingly, this work presents the synthesis and study of the antibacterial properties of the P3HB@NiFe₂O₄ hybrid film as a potential material to inhibit bacterial growth. The NiFe₂O₄ NPs were previously synthesized by a mechanochemical method, and the P3HB and P3HB@NiFe₂O₄ films were obtained by the solvent casting method. The films were characterized by X-ray diffraction (XRD), Raman scattering, and scanning electron microscopy (SEM). The XRD pattern showed that the NiFe₂O₄ NPs were incorporated into the P3HB polymer matrix and retained their nanometric sizes. Energy dispersive X-ray spectroscopy (EDS) showed that the NPs are homogeneously distributed in the film. The bactericidal effect of the films was evaluated in vitro using the broth surface method against two opportunistic nosocomial pathogens, Staphylococcus aureus and Pseudomonas aeruginosa.
The bacterial growth results showed that the P3HB@NiFe₂O₄ hybrid film inhibited growth by 97% for S. aureus and 96% for P. aeruginosa. Surprisingly, the plain P3HB film inhibited both bacterial strains by around 90%. The cytotoxicity of the NiFe₂O₄ NPs, the P3HB@NiFe₂O₄ hybrid film, and the P3HB film was evaluated using human skin cells, keratinocytes and fibroblasts, finding that the NPs are biocompatible. The P3HB and hybrid films, however, are cytotoxic: although P3HB is known and reported as a biocompatible polymer, under our working conditions it was cytotoxic, and its bactericidal effect could be related to this activity. Its films are bactericidal but cytotoxic to keratinocytes and fibroblasts, the first barrier of human skin. Despite this, the P3HB@NiFe₂O₄ hybrid film shows a synergistic bactericidal effect between P3HB and the NPs, increasing bacterial inhibition; in addition, the NPs decrease the cytotoxicity of P3HB toward keratinocytes. The methodology used in this work thus successfully produced hybrid films with antibacterial activity, but a future challenge is to find NP-P3HB combinations that exploit their bactericidal properties without compromising biocompatibility.
Keywords: poly-3-hydroxybutyrate, nanoparticles, hybrid film, antibacterial
Procedia PDF Downloads 82
14445 An Efficient Strategy for Relay Selection in Multi-Hop Communication
Authors: Jung-In Baik, Seung-Jun Yu, Young-Min Ko, Hyoung-Kyu Song
Abstract:
This paper proposes an efficient relaying algorithm that obtains diversity to improve the reliability of a signal. The algorithm achieves time or space diversity gain by transmitting multiple versions of the same signal through two routes. Relays are placed between the source and destination, and the routes between them are set adaptively in order to deal with different channels and noise conditions. Each route consists of one or more relays, and the source transmits its signal to the destination through these routes. The signals from the relays are combined and detected at the destination. The proposed algorithm provides better performance than conventional algorithms in terms of bit error rate (BER).
Keywords: multi-hop, OFDM, relay, relaying selection
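A common simplification for reasoning about multi-hop route selection is the bottleneck model: a route's end-to-end quality is limited by its weakest hop, so the selector picks the route whose minimum per-hop SNR is largest. This is a generic sketch of that idea, not the paper's adaptive algorithm:

```python
def select_route(route_snrs):
    """Pick the route with the best end-to-end quality, modeling each
    route's SNR as the minimum over its hops (bottleneck assumption)."""
    return max(range(len(route_snrs)), key=lambda i: min(route_snrs[i]))

# Route 0: strong first hop, weak second hop (bottleneck 7.5 dB).
# Route 1: balanced hops (bottleneck 9.0 dB) -> selected.
best = select_route([[12.0, 7.5], [9.0, 11.0]])
```

Transmitting over both routes and combining at the destination, as the abstract describes, is what converts the two copies into a diversity gain in BER.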
Procedia PDF Downloads 445
14444 RA-Apriori: An Efficient and Faster MapReduce-Based Algorithm for Frequent Itemset Mining on Apache Flink
Authors: Sanjay Rathee, Arti Kashyap
Abstract:
Extraction of useful information from large datasets is one of the most important research problems, and association rule mining is one of the best methods for this purpose. Finding possible associations between items in large transaction-based datasets (finding frequent patterns) is the most important part of association rule mining. Many algorithms exist to find frequent patterns, but the Apriori algorithm remains a preferred choice due to its ease of implementation and its natural tendency to be parallelized. Many single-machine Apriori variants exist, but the massive amount of data available these days is beyond the capacity of a single machine. Therefore, to meet the demands of this ever-growing data, there is a need for a multi-machine Apriori algorithm. For these types of distributed applications, MapReduce is a popular fault-tolerant framework. Hadoop is one of the best open-source software frameworks with the MapReduce approach for distributed storage and distributed processing of huge datasets using clusters built from commodity hardware. However, the heavy disk I/O at each iteration of a highly iterative algorithm like Apriori makes Hadoop inefficient. A number of MapReduce-based platforms have been developed for parallel computing in recent years. Among them, two platforms, Spark and Flink, have attracted a lot of attention because of their built-in support for distributed computation. Earlier, we proposed a Reduced-Apriori algorithm on the Spark platform which outperforms parallel Apriori, first because of the use of Spark and second because of the improvement we proposed to standard Apriori. This work is therefore a natural sequel of our earlier work and targets implementing, testing, and benchmarking Apriori, Reduced-Apriori, and our new algorithm, ReducedAll-Apriori, on Apache Flink, comparing them with the Spark implementation.
Flink, a streaming dataflow engine, overcomes the disk I/O bottlenecks of MapReduce, providing an ideal platform for distributed Apriori. Flink's pipelining-based structure allows starting the next iteration as soon as partial results of the earlier iteration are available, so there is no need to wait for all reducers' results before starting the next iteration. We conduct in-depth experiments to gain insight into the effectiveness, efficiency, and scalability of the Apriori and RA-Apriori algorithms on Flink.
Keywords: Apriori, Apache Flink, MapReduce, Spark, Hadoop, R-Apriori, frequent itemset mining
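The level-wise loop that all these variants distribute can be stated compactly on a single machine; a minimal sketch of standard Apriori with the candidate-join and prune steps (the Flink/Spark machinery, and the Reduced-Apriori improvements, are omitted):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Level-wise Apriori: count k-itemsets, keep frequent ones, join to k+1."""
    transactions = [frozenset(t) for t in transactions]
    items = sorted({i for t in transactions for i in t})
    frequent = {}                             # itemset -> support count
    k_sets = [frozenset([i]) for i in items]  # start with 1-itemsets
    while k_sets:
        counts = {c: sum(c <= t for t in transactions) for c in k_sets}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        # join step, with the Apriori prune: every k-subset must be frequent
        k_sets = list({a | b for a, b in combinations(survivors, 2)
                       if len(a | b) == len(a) + 1
                       and all(frozenset(s) in survivors
                               for s in combinations(a | b, len(a)))})
    return frequent

freq = apriori([{"a", "b"}, {"a", "c"}, {"a", "b", "c"}], min_support=2)
```

Each pass over `transactions` is one candidate-counting iteration; it is exactly this repeated full scan that Hadoop pays disk I/O for and that Flink's pipelining amortizes.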
Procedia PDF Downloads 294
14443 A Robust Hybrid Blind Digital Image Watermarking System Using Discrete Wavelet Transform and Contourlet Transform
Authors: Nidal F. Shilbayeh, Belal AbuHaija, Zainab N. Al-Qudsy
Abstract:
In this paper, a hybrid blind digital watermarking system using Discrete Wavelet Transform (DWT) and Contourlet Transform (CT) has been implemented and tested. The implemented combined digital watermarking system has been tested against five common types of image attacks. The performance evaluation shows improved results in terms of imperceptibility, robustness, and high tolerance against these attacks; accordingly, the system is very effective and applicable.
Keywords: discrete wavelet transform (DWT), contourlet transform (CT), digital image watermarking, copyright protection, geometric attack
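A DWT-based embedder works on the transform's approximation/detail coefficients rather than raw pixels. The simplest wavelet, the Haar transform, shows the decomposition in a few lines (Haar is chosen here only for brevity; the paper's DWT/CT combination and its embedding rule are not specified in the abstract):

```python
def haar_dwt_1d(signal):
    """One-level 1-D Haar transform: pairwise averages (approximation)
    and pairwise half-differences (detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

a, d = haar_dwt_1d([4, 2, 5, 5])
```

A watermark bit would then be embedded by slightly perturbing selected coefficients and inverting the transform; robustness comes from choosing coefficients that survive compression and filtering.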
Procedia PDF Downloads 394
14442 The Effect of Improvement Programs in the Mean Time to Repair and in the Mean Time between Failures on Overall Lead Time: A Simulation Using the System Dynamics-Factory Physics Model
Authors: Marcel Heimar Ribeiro Utiyama, Fernanda Caveiro Correia, Dario Henrique Alliprandini
Abstract:
The importance of the correct allocation of improvement programs has attracted growing interest in recent years. Because their resources are limited, companies must ensure that their financial resources are directed to the workstations where they will be most effective, in order to survive the strong competition. However, to the best of our knowledge, the literature on the allocation of improvement programs does not analyze this problem in depth when the flow shop process has two capacity constrained resources; this is the research gap studied in this work. The purpose of this work is to identify the best strategy for allocating improvement programs in a flow shop with two capacity constrained resources. Data were collected from a flow shop process with seven workstations in an industrial control and automation company, which processes 13,690 units on average per month. The data were used to conduct a simulation with the System Dynamics-Factory Physics model. The main variables considered, due to their importance for lead time reduction, were the mean time between failures and the mean time to repair, with lead time reduction as the output measure of the simulations. Ten different strategies were created: (i) focused time-to-repair improvement, (ii) focused time-between-failures improvement, (iii) distributed time-to-repair improvement, (iv) distributed time-between-failures improvement, (v) focused time-to-repair and time-between-failures improvement, (vi) distributed time-to-repair and time-between-failures improvement, (vii) hybrid time-to-repair improvement, (viii) hybrid time-between-failures improvement, (ix) a time-to-repair improvement strategy directed at the two capacity constrained resources, and (x) a time-between-failures improvement strategy directed at the two capacity constrained resources. The ten strategies tested are variations of the three main strategies for improvement programs, named focused, distributed, and hybrid.
Several comparisons of the effect of the ten strategies on lead time reduction were performed. The results indicated that, for the flow shop analyzed, the focused strategies delivered the best results. When it is not possible to make a large investment in the capacity constrained resources, companies should use hybrid approaches. An important contribution to academia is the hybrid approach, which proposes a new way to direct improvement efforts. In addition, the study of a flow shop with two heavily capacity constrained resources (more than 95% utilization) is an important contribution to the literature, as is the allocation problem with two CCRs and the possibility of floating capacity constrained resources. The results provided the best improvement strategies considering the different allocation strategies and the different positions of the capacity constrained resources. Finally, both the hybrid time-to-repair and the hybrid time-between-failures strategies delivered better results than their respective distributed strategies. The main limitations of this study concern the flow shop analyzed; future work can investigate different flow shop configurations, such as a varying number of workstations, a different number of products, or different positions of the two capacity constrained resources.
Keywords: allocation of improvement programs, capacity constrained resource, hybrid strategy, lead time, mean time to repair, mean time between failures
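The two parameters studied combine into station availability, which scales a workstation's effective capacity. The steady-state formula below is a standard Factory Physics relation used here for illustration; the paper itself relies on the full System Dynamics-Factory Physics simulation rather than this closed form:

```python
def availability(mtbf, mttr):
    """Steady-state availability: fraction of time the station is up."""
    return mtbf / (mtbf + mttr)

def effective_rate(base_rate, mtbf, mttr):
    """Throughput after discounting downtime."""
    return base_rate * availability(mtbf, mttr)

# Halving MTTR from 2 h to 1 h on a station with 40 h MTBF (illustrative numbers):
before = effective_rate(100.0, 40.0, 2.0)   # units/h
after = effective_rate(100.0, 40.0, 1.0)
```

At a capacity constrained resource, even this small availability gain matters disproportionately, because queueing delay, and hence lead time, grows sharply as utilization approaches 100%.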
Procedia PDF Downloads 12414441 An Automated Optimal Robotic Assembly Sequence Planning Using Artificial Bee Colony Algorithm
Authors: Balamurali Gunji, B. B. V. L. Deepak, B. B. Biswal, Amrutha Rout, Golak Bihari Mohanta
Abstract:
Robots play an important role in operations such as pick and place, assembly, and spot welding in manufacturing industries. Among these, assembly is a particularly important process, accounting for roughly 20% of manufacturing cost. To carry out the assembly task effectively, Assembly Sequence Planning (ASP) is required. ASP is a multi-objective, non-deterministic optimization problem; finding the optimal assembly sequence involves a huge search space and is highly complex in nature. Many researchers have applied different algorithms to the ASP problem, which have several limitations, such as convergence to local optima, huge search spaces, long execution times, and complexity of application. Keeping these limitations in mind, this paper proposes a new automated optimal robotic assembly sequence planning method using the Artificial Bee Colony (ABC) algorithm. In this method, the assembly predicates are extracted automatically through a Computer Aided Design (CAD) interface instead of manually, which reduces the time required to obtain a feasible assembly sequence. The fitness evaluation of the obtained feasible sequences is carried out using the ABC algorithm to generate the optimal assembly sequence. The proposed methodology is applied to different industrial products, and the results are compared with past literature.Keywords: assembly sequence planning, CAD, artificial bee colony algorithm, assembly predicates
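The core of an ABC search over assembly sequences can be sketched in a few lines. The example below is a minimal illustration, not the authors' implementation: assembly predicates are reduced to hypothetical precedence pairs, fitness counts violated predicates, and the onlooker phase (roulette-wheel selection) is omitted for brevity, keeping only the employed-bee and scout phases.

```python
import random

# Minimal ABC sketch for sequencing (illustrative, not the paper's method).
# Assembly predicates reduced to hypothetical precedence pairs (a, b):
# part a must be assembled before part b.
PARTS = list(range(6))
PRECEDENCE = [(0, 2), (1, 2), (2, 4), (3, 5)]

def cost(seq):
    pos = {p: i for i, p in enumerate(seq)}
    return sum(pos[a] > pos[b] for a, b in PRECEDENCE)  # violated predicates

def neighbour(seq):
    s = seq[:]                       # employed bee: swap two parts
    i, j = random.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

def abc(colony=10, limit=5, cycles=200):
    random.seed(1)
    food = [random.sample(PARTS, len(PARTS)) for _ in range(colony)]
    trials = [0] * colony
    for _ in range(cycles):
        for k in range(colony):
            cand = neighbour(food[k])
            if cost(cand) < cost(food[k]):       # greedy selection
                food[k], trials[k] = cand, 0
            else:
                trials[k] += 1
            if trials[k] > limit:                # scout phase: abandon source
                food[k], trials[k] = random.sample(PARTS, len(PARTS)), 0
    return min(food, key=cost)

best = abc()
print(best, cost(best))
```

A returned sequence with cost 0 satisfies all the toy precedence predicates, i.e., it is a feasible assembly sequence.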
Procedia PDF Downloads 23714440 Parametric Analysis of Lumped Devices Modeling Using Finite-Difference Time-Domain
Authors: Felipe M. de Freitas, Icaro V. Soares, Lucas L. L. Fortes, Sandro T. M. Gonçalves, Úrsula D. C. Resende
Abstract:
SPICE-based simulators are quite robust and widely used for the simulation of electronic circuits; their algorithms support linear and non-linear lumped components, and they can handle an expressive number of encapsulated elements. Despite their great potential in the analysis of quasi-static electromagnetic field interaction, that is, at low frequency, these SPICE-based simulators are limited when applied to microwave hybrid circuits containing both lumped and distributed elements. Usually, the spatial discretization of the FDTD (Finite-Difference Time-Domain) method is chosen according to the actual size of the element under analysis. After spatial discretization, the Courant stability criterion gives the maximum temporal discretization accepted for that spatial discretization and for the propagation velocity of the wave. This criterion guarantees the stability conditions for the leapfrogging of the Yee algorithm; however, it is known that the stability of the complete FDTD procedure depends on factors other than just the stability of the Yee algorithm, because an FDTD program needs additional algorithms in order to be useful in engineering problems. Examples of these algorithms are Absorbing Boundary Conditions (ABCs), excitation sources, subcellular techniques, lumped elements, and non-uniform or non-orthogonal meshes. In this work, the influence of the stability of the FDTD method on the modeling of lumped elements such as resistive sources, resistors, capacitors, inductors and diodes is evaluated. This paper therefore proposes the electromagnetic modeling of electronic components in order to create models that satisfy the needs of circuit simulation at ultra-wide frequencies. 
The models of the resistive source, resistor, capacitor, inductor, and diode are evaluated, among the mathematical models for lumped components in the LE-FDTD (Lumped-Element Finite-Difference Time-Domain) method, through a parametric analysis of the size of the Yee cells that discretize the lumped components. In this way, the goal is to find an ideal cell size so that the FDTD analysis agrees as closely as possible with the expected circuit behavior while maintaining the stability conditions of the method. Based on the mathematical models and the theoretical basis of the required extensions of the FDTD method, the models are implemented computationally in the Matlab® environment. The Mur boundary condition is used as the absorbing boundary of the FDTD method. The models are validated by comparing the results obtained by the FDTD method, through the electric field values and the currents in the components, with analytical results using circuit parameters.Keywords: hybrid circuits, LE-FDTD, lumped element, parametric analysis
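The coupling between the cell size chosen for a lumped component and the admissible time step can be made concrete with the standard 3D Courant (CFL) limit for the Yee scheme, dt ≤ 1 / (c·sqrt(1/dx² + 1/dy² + 1/dz²)). The sketch below sweeps a few hypothetical cell sizes, as in a parametric analysis; the specific sizes are illustrative, not those of the paper.

```python
import math

c0 = 299_792_458.0  # speed of light in vacuum (m/s)

def courant_dt(dx, dy, dz, c=c0):
    """Maximum stable time step for the 3D Yee scheme (CFL limit)."""
    return 1.0 / (c * math.sqrt(1 / dx**2 + 1 / dy**2 + 1 / dz**2))

# Hypothetical parametric sweep over uniform Yee cell sizes used to
# discretize a lumped component: halving the cell halves the time step,
# doubling the number of iterations needed to simulate the same interval.
for cell in (1e-3, 0.5e-3, 0.25e-3):
    print(f"cell {cell * 1e3:.2f} mm -> dt_max {courant_dt(cell, cell, cell):.3e} s")
```

For a uniform cubic cell this reduces to dt ≤ dx / (c·sqrt(3)), which is the usual rule of thumb for 3D FDTD grids.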
Procedia PDF Downloads 15314439 The Effect of Feature Selection on Pattern Classification
Authors: Chih-Fong Tsai, Ya-Han Hu
Abstract:
The aim of feature selection (or dimensionality reduction) is to filter out unrepresentative features (or variables) so that the classifier performs better than one without feature selection. Although there are many well-known feature selection algorithms, and different classifiers based on different selection results may perform differently, very few studies examine the effect of performing different feature selection algorithms on the classification performance of different classifiers over different types of datasets. In this paper, two widely used algorithms, the genetic algorithm (GA) and information gain (IG), are used to perform feature selection. In addition, three well-known classifiers are constructed: the CART decision tree (DT), the multi-layer perceptron (MLP) neural network, and the support vector machine (SVM). Based on 14 different types of datasets, the experimental results show that in most cases IG is a better feature selection algorithm than GA. In addition, the combinations of IG with DT and of IG with SVM perform best and second best for small and large scale datasets.Keywords: data mining, feature selection, pattern classification, dimensionality reduction
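Information gain ranks each feature by how much it reduces class entropy, IG(f) = H(class) − H(class | f). A minimal standard-library sketch of IG ranking for discrete features is shown below; the toy dataset and feature names are hypothetical, not one of the paper's 14 datasets.

```python
import math
from collections import Counter

# Stdlib sketch of information-gain (IG) feature ranking (illustrative).

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_column, labels):
    """H(class) - H(class | feature) for one discrete feature."""
    total = entropy(labels)
    n = len(labels)
    by_value = {}
    for v, y in zip(feature_column, labels):
        by_value.setdefault(v, []).append(y)
    remainder = sum(len(ys) / n * entropy(ys) for ys in by_value.values())
    return total - remainder

# Hypothetical rows: (outlook, windy) -> play
X = [("sunny", 0), ("sunny", 1), ("rain", 0), ("rain", 1),
     ("overcast", 0), ("overcast", 1)]
y = ["no", "no", "yes", "no", "yes", "yes"]

gains = {name: information_gain([row[i] for row in X], y)
         for i, name in enumerate(["outlook", "windy"])}
ranked = sorted(gains, key=gains.get, reverse=True)
print(gains, ranked)
```

A selection step would then keep the top-k features of `ranked` and hand them to the downstream classifier (DT, MLP, or SVM in the paper's setup).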
Procedia PDF Downloads 66914438 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing
Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah
Abstract:
The aim of this paper is to present a distributed implementation of the type-2 fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is implemented on an SPMD (Single Program Multiple Data) architecture based on cooperative mobile agents, following the AVPE (Agent Virtual Processing Element) model, in order to improve the processing resources needed for big data image segmentation. This work focuses on applying the algorithm to process a big data MRI (Magnetic Resonance Imaging) image of size (n x m). The image is encapsulated in the mobile agent team leader and split into (m x n) pixels, one per AVPE. Each AVPE performs the segmentation, exchanges its results, and maintains asynchronous communication with its team leader until the algorithm converges. Interesting experimental results are obtained in terms of the accuracy and efficiency of the proposed implementation, thanks to the several useful capabilities that mobile agents bring to this distributed computational model.Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing
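The SPMD decomposition can be illustrated with a much-simplified sketch: the image is split into chunks, one per worker standing in for an AVPE, and each worker computes fuzzy c-means memberships against fixed centroids before the "team leader" merges the results. This is a type-1 simplification with hypothetical centroids; the interval type-2 machinery and actual agent migration are omitted, and threads merely stand in for mobile agents.

```python
from concurrent.futures import ThreadPoolExecutor

# SPMD-style sketch (illustrative): workers stand in for AVPEs.
CENTROIDS = [30.0, 120.0, 220.0]   # hypothetical tissue-intensity centres
M = 2.0                            # fuzzifier

def memberships(pixel):
    # Fuzzy c-means membership: u_i = 1 / sum_k (d_i / d_k)^(2/(M-1))
    d = [abs(pixel - c) + 1e-9 for c in CENTROIDS]
    return [1.0 / sum((d[i] / d[k]) ** (2 / (M - 1)) for k in range(len(d)))
            for i in range(len(d))]

def segment_chunk(chunk):
    # Each worker labels its pixels with the highest-membership cluster.
    return [max(range(len(CENTROIDS)), key=lambda i: memberships(p)[i])
            for p in chunk]

image = [25, 28, 115, 130, 210, 225, 35, 118]   # flattened toy "MRI"
chunks = [image[i::4] for i in range(4)]        # 4 workers, strided split

with ThreadPoolExecutor(max_workers=4) as pool:
    parts = list(pool.map(segment_chunk, chunks))

# "Team leader" merges the partial results back into pixel order.
labels = [None] * len(image)
for a, part in enumerate(parts):
    for j, lab in enumerate(part):
        labels[a + 4 * j] = lab
print(labels)
```

Each pixel ends up labeled with its nearest hypothetical centroid, which is what the membership argmax reduces to for this type-1 simplification.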
Procedia PDF Downloads 42914437 Novel Algorithm for Restoration of Retina Images
Authors: P. Subbuthai, S. Muruganand
Abstract:
Diabetic retinopathy is a complicated disease caused by changes in the blood vessels of the retina. Retina images captured through a fundus camera sometimes suffer from poor contrast and noise, which makes the detection of blood vessels in the retina very difficult, so preprocessing is needed. In this paper, a novel algorithm is implemented to remove noisy pixels in the retina image. The proposed algorithm, an extended median filter, is applied to the green channel of the retina image because the vessels show the strongest contrast against the background in that channel. The proposed extended median filter is compared with the existing standard median filter using performance metrics such as PSNR, MSE and RMSE. Experimental results show that the proposed extended median filter gives better results than the existing standard median filter in terms of noise suppression and detail preservation.Keywords: fundus retina image, diabetic retinopathy, median filter, microaneurysms, exudates
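The baseline against which the extended filter is compared can be sketched directly: a plain 3x3 median filter plus the PSNR metric. The "extended" variant of the paper is not reproduced here; the 5x5 green-channel patch with a single impulse-noise pixel is a hypothetical toy example.

```python
import math

# Baseline sketch: plain 3x3 median filter and PSNR (not the paper's
# extended filter). Border pixels are left unfiltered for simplicity.

def median_filter(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]          # median of the 9 window values
    return out

def psnr(ref, test, peak=255.0):
    mse = sum((r - t) ** 2 for rr, tr in zip(ref, test)
              for r, t in zip(rr, tr)) / (len(ref) * len(ref[0]))
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

# Toy green-channel patch with one impulse-noise pixel in the centre.
clean = [[100] * 5 for _ in range(5)]
noisy = [row[:] for row in clean]
noisy[2][2] = 255
restored = median_filter(noisy)
print(psnr(clean, noisy), psnr(clean, restored))
```

On this toy patch the median filter removes the impulse pixel entirely, so the restored PSNR exceeds the noisy one, which is the direction of improvement the abstract measures.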
Procedia PDF Downloads 34214436 Evolving Convolutional Filter Using Genetic Algorithm for Image Classification
Authors: Rujia Chen, Ajit Narayanan
Abstract:
Convolutional neural networks (CNNs), as typically applied in deep learning, use layer-wise backpropagation (BP) to construct filters and kernels for feature extraction. Such filters are 2D or 3D groups of weights for constructing feature maps at subsequent layers of the CNN and are shared across the entire input. BP, as a gradient descent algorithm, has the well-known problem of getting stuck at local optima. The use of genetic algorithms (GAs) for evolving weights between layers of standard artificial neural networks (ANNs) is a well-established area of neuroevolution. In particular, the use of crossover techniques when optimizing weights can help to overcome the problem of local optima. However, the application of GAs to evolving the weights of filters and kernels in CNNs is not yet an established area of neuroevolution. In this paper, a GA-based filter development algorithm is proposed. The results of the proof-of-concept experiments described in this paper show that the proposed GA algorithm can find filter weights through evolutionary techniques rather than BP learning. For some simple classification tasks, such as geometric shape recognition, the proposed algorithm achieves 100% accuracy. The results for MNIST classification, while not as good as those achievable through standard filter learning with BP, show that filter and kernel evolution warrants further investigation as a new subarea of neuroevolution for deep architectures.Keywords: neuroevolution, convolutional neural network, genetic algorithm, filters, kernels
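The idea of evolving filter weights rather than learning them by BP can be shown in miniature. The sketch below is a proof-of-concept in the spirit of the abstract, not the paper's pipeline: a GA with uniform crossover and point mutation evolves a single 3x3 filter whose fitness rewards responding to a vertical-edge patch and ignoring a horizontal one. The patches and GA settings are hypothetical.

```python
import random

# Toy GA evolving one 3x3 filter (illustrative, not the paper's method).
V = [[0, 1, 0], [0, 1, 0], [0, 1, 0]]   # vertical stripe patch
H = [[0, 0, 0], [1, 1, 1], [0, 0, 0]]   # horizontal stripe patch

def response(filt, patch):
    return sum(filt[3 * r + c] * patch[r][c] for r in range(3) for c in range(3))

def fitness(filt):
    # Reward responding to V while staying quiet on H.
    return abs(response(filt, V)) - abs(response(filt, H))

def evolve(pop_size=20, gens=60):
    random.seed(0)
    pop = [[random.uniform(-1, 1) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]            # elitist selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            child = [x if random.random() < 0.5 else y   # uniform crossover
                     for x, y in zip(a, b)]
            child[random.randrange(9)] += random.gauss(0, 0.3)  # mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(round(fitness(best), 2))
```

Because selection is elitist and the fitness is deterministic, the best fitness is non-decreasing over generations; crossover and mutation do the exploration that BP's gradient steps would otherwise perform.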
Procedia PDF Downloads 18614435 Construction and Validation of a Hybrid Lumbar Spine Model for the Fast Evaluation of Intradiscal Pressure and Mobility
Authors: Dicko Ali Hamadi, Tong-Yette Nicolas, Gilles Benjamin, Faure Francois, Palombi Olivier
Abstract:
A novel hybrid model of the lumbar spine, allowing fast static and dynamic simulations of disc pressure and spine mobility, is introduced in this work. Our contribution is to combine rigid bodies, deformable finite elements, articular constraints, and springs into a single model of the spine. Each vertebra is represented by a rigid body controlling a surface mesh to model contacts on the facet joints and the spinous process. The discs are modeled using a heterogeneous tetrahedral finite element model. The facet joints are represented as elastic joints with six degrees of freedom, while the ligaments are modeled using non-linear one-dimensional elastic elements. The challenge we tackle is to make these different models interact efficiently while respecting the principles of anatomy and mechanics. The mobility, intradiscal pressure, facet joint force, and instantaneous center of rotation of the lumbar spine are validated against experimental and theoretical results from the literature in flexion, extension, lateral bending, and axial rotation. Our hybrid model greatly simplifies the modeling task and dramatically accelerates the simulation of pressure within the discs, as well as the evaluation of the range of motion and the instantaneous centers of rotation, without penalizing precision. These results suggest that for some types of biomechanical simulations, simplified models allow far easier modeling and faster simulations compared to usual full-FEM approaches without any loss of accuracy.Keywords: hybrid, modeling, fast simulation, lumbar spine
Procedia PDF Downloads 30614434 Lipid-Chitosan Hybrid Nanoparticles for Controlled Delivery of Cisplatin
Authors: Muhammad Muzamil Khan, Asadullah Madni, Nina Filipczek, Jiayi Pan, Nayab Tahir, Hassan Shah, Vladimir Torchilin
Abstract:
Lipid-polymer hybrid nanoparticles (LPHNPs) are delivery systems for controlled drug delivery at tumor sites. This system combines the superior biocompatibility of lipids with the structural advantages of polymers for controlled drug delivery. In the present study, cisplatin-loaded lipid-chitosan hybrid nanoparticles were formulated by a single-step ionic gelation method based on the ionic interaction of positively charged chitosan and negatively charged lipid. Formulations with various chitosan-to-lipid ratios were investigated to obtain the optimal particle size, encapsulation efficiency, and controlled release pattern. Transmission electron microscopy and dynamic light scattering analysis demonstrated a size range of 181-245 nm and a zeta potential range of 20-30 mV. Compatibility among the components and the stability of the formulation were demonstrated by FTIR analysis and thermal studies, respectively. The therapeutic efficacy and cellular interaction of cisplatin-loaded LPHNPs were investigated using in vitro cell-based assays in the A2780/ADR ovarian carcinoma cell line. Additionally, the cisplatin-loaded LPHNPs exhibited a low toxicity profile in rats. The in vivo pharmacokinetic study also demonstrated controlled delivery of cisplatin with an enhanced mean residence time and half-life. Our studies suggest that cisplatin-loaded LPHNPs are a promising platform for the controlled delivery of cisplatin in cancer therapy.Keywords: cisplatin, lipid-polymer hybrid nanoparticle, chitosan, in vitro cell line study
Procedia PDF Downloads 13014433 Deployment of Matrix Transpose in Digital Image Encryption
Authors: Okike Benjamin, Garba E J. D.
Abstract:
Encryption is used to conceal information from prying eyes. Presently, information and data encryption are common due to the volume of data and information in transit across the globe on a daily basis. However, image encryption has yet to receive the attention from researchers that it deserves; as a result, video and multimedia documents are exposed to unauthorized access. The authors propose image encryption using matrix transposition, and an algorithm implementing it is developed. In this proposed technique, the image to be encrypted is split into parts based on the image size, and each part is encrypted separately using matrix transposition. The actual encryption operates on the picture elements (pixels) that make up the image. After encrypting each part of the image, the positions of the encrypted parts are swapped before transmission of the image takes place. Swapping the positions of the parts is carried out to make the encrypted image harder for any cryptanalyst to decrypt.Keywords: image encryption, matrices, pixel, matrix transpose
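The two-stage scheme described (transpose each part, then swap part positions) can be sketched on a toy 4x4 "image". The block size and the particular swap pattern below (a diagonal exchange through the image centre) are illustrative choices, not the authors' specification.

```python
# Sketch of the described scheme (illustrative block size and swap pattern).

def transpose(block):
    return [list(row) for row in zip(*block)]

def split_blocks(img, bs):
    n = len(img)
    return {(r, c): [row[c:c + bs] for row in img[r:r + bs]]
            for r in range(0, n, bs) for c in range(0, n, bs)}

def assemble(blocks, n, bs):
    out = [[0] * n for _ in range(n)]
    for (r, c), b in blocks.items():
        for i in range(bs):
            for j in range(bs):
                out[r + i][c + j] = b[i][j]
    return out

def encrypt(img, bs=2):
    n = len(img)
    # Stage 1: encrypt each part's pixels by matrix transposition.
    transposed = {pos: transpose(b) for pos, b in split_blocks(img, bs).items()}
    # Stage 2: swap part positions by reflecting through the image centre.
    swapped = {(n - bs - r, n - bs - c): b for (r, c), b in transposed.items()}
    return assemble(swapped, n, bs)

img = [[ 1,  2,  3,  4],
       [ 5,  6,  7,  8],
       [ 9, 10, 11, 12],
       [13, 14, 15, 16]]
enc = encrypt(img)
print(enc)
```

With this choice of swap pattern the cipher is an involution (transposing twice and swapping twice are both identities), so applying `encrypt` again recovers the original image.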
Procedia PDF Downloads 42114432 Fabrication of Hybrid Scaffolds Consisting of Cell-laden Electrospun Micro/Nanofibers and PCL Micro-structures for Tissue Regeneration
Authors: MyungGu Yeo, JongHan Ha, Gi-Hoon Yang, JaeYoon Lee, SeungHyun Ahn, Hyeongjin Lee, HoJun Jeon, YongBok Kim, Minseong Kim, GeunHyung Kim
Abstract:
Tissue engineering is a rapidly growing interdisciplinary research area that may provide options for treating damaged tissues and organs. As a promising technique for regenerating various tissues, this technology requires biomedical scaffolds, which serve as an artificial extracellular matrix (ECM) to support neotissue growth. Electrospun micro/nanofibers have been used widely in tissue engineering because of their high surface-area-to-volume ratio and structural similarity to the extracellular matrix. However, low mechanical sustainability, low 3D shape-ability, and low cell infiltration have been major limitations to their use. In this work, we propose new hybrid scaffolds interlayered with cell-laden electrospun micro/nanofibers and poly(caprolactone) (PCL) microstructures. We also applied various concentrations of alginate and electric field strengths to determine the optimal conditions for the cell-electrospinning process. The combination of a cell-laden bioink (2 × 10^5 osteoblast-like MG63 cells/mL, 2 wt% alginate, 2 wt% poly(ethylene oxide), and 0.7 wt% lecithin) and a 0.16 kV/mm electric field showed the highest cell viability and fiber formation in this process. Using these conditions and PCL microstructures, we achieved mechanically stable hybrid scaffolds. In addition, the cells embedded in the fibrous structure were viable and proliferated. We suggest that the cell-embedded hybrid scaffolds fabricated using the cell-electrospinning process may be useful for various soft- and hard-tissue regeneration applications.Keywords: bioink, cell-laden scaffold, micro/nanofibers, poly(caprolactone)
Procedia PDF Downloads 38014431 Comparison of the Effectiveness of Tree Algorithms in Classification of Spongy Tissue Texture
Authors: Roza Dzierzak, Waldemar Wojcik, Piotr Kacejko
Abstract:
Analysis of the texture of medical images consists of determining the parameters and characteristics of the examined tissue. The main goal is to assign the analyzed area to one of two basic groups: healthy tissue or tissue with pathological changes. CT images of the thoracic and lumbar spine from 15 healthy patients and 15 with confirmed osteoporosis were used for the analysis. As a result, 120 samples with dimensions of 50x50 pixels were obtained. The set of features was derived from the histogram, gradient, run-length matrix, co-occurrence matrix, autoregressive model, and Haar wavelet. The image analysis produced 290 textural feature descriptors. The dimension of the feature space was reduced using three selection methods: the Fisher coefficient (FC), mutual information (MI), and minimization of the classification error probability combined with the average correlation coefficients between the chosen features (POE + ACC). Each method returned the ten features occupying the initial places in the ranking devised according to its own coefficient. The Fisher coefficient and mutual information selections yielded the same features, arranged in a different order. In both rankings, the 50th percentile (Perc.50%) was found in first place. The next selected features come from the co-occurrence matrix. The sets of features selected in the selection process were evaluated using six classification tree methods: decision stump (DS), Hoeffding tree (HT), logistic model trees (LMT), random forest (RF), random tree (RT) and reduced error pruning tree (REPT). 
In order to assess the accuracy of the classifiers, the following parameters were used: overall classification accuracy (ACC), true positive rate (TPR, classification sensitivity), true negative rate (TNR, classification specificity), positive predictive value (PPV) and negative predictive value (NPV). Taking the classification results into account, the best results were obtained for the Hoeffding tree and logistic model trees classifiers, using the set of features selected by the POE + ACC method. For the Hoeffding tree classifier, the highest values of three parameters were obtained: ACC = 90%, TPR = 93.3% and PPV = 93.3%. Additionally, the values of the other two parameters, TNR = 86.7% and NPV = 86.6%, were close to the maximum values obtained for the LMT classifier. For the logistic model trees classifier, the same ACC value was obtained (ACC = 90%) together with the highest values of TNR = 88.3% and NPV = 88.3%. The values of the other two parameters remained close to the highest: TPR = 91.7% and PPV = 91.6%. The results obtained in the experiment show that the use of classification trees is an effective method for the classification of texture features. This allows the condition of the spongy tissue to be identified for healthy cases and those with osteoporosis.Keywords: classification, feature selection, texture analysis, tree algorithms
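The five evaluation parameters all derive from a binary confusion matrix, and computing them is a one-liner each. The counts in the sketch below are hypothetical and are not taken from the paper's experiments.

```python
# The five evaluation parameters from a binary confusion matrix
# (tp = true positives, fn = false negatives, tn = true negatives,
#  fp = false positives). Counts below are hypothetical.

def metrics(tp, fn, tn, fp):
    return {
        "ACC": (tp + tn) / (tp + fn + tn + fp),  # overall accuracy
        "TPR": tp / (tp + fn),                   # sensitivity
        "TNR": tn / (tn + fp),                   # specificity
        "PPV": tp / (tp + fp),                   # positive predictive value
        "NPV": tn / (tn + fn),                   # negative predictive value
    }

# 60 hypothetical test samples: 30 osteoporotic (positive), 30 healthy.
m = metrics(tp=28, fn=2, tn=26, fp=4)
print({k: f"{v * 100:.1f}%" for k, v in m.items()})
```

Note that ACC alone can hide a TPR/TNR imbalance, which is why the study reports all five parameters side by side.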
Procedia PDF Downloads 17814430 Identification of Soft Faults in Branched Wire Networks by Distributed Reflectometry and Multi-Objective Genetic Algorithm
Authors: Soumaya Sallem, Marc Olivas
Abstract:
This contribution presents a method for detecting, locating, and characterizing soft faults in a complex wired network. The proposed method is based on multi-carrier reflectometry, MCTDR (Multi-Carrier Time Domain Reflectometry), combined with a multi-objective genetic algorithm. In order to ensure complete network coverage and eliminate diagnosis ambiguities, the MCTDR test signal is injected at several points on the network, and the data from the different reflectometers (sensors) distributed on the network are merged. An adapted multi-objective genetic algorithm is used to merge the data in order to obtain more accurate fault location and characterization. The performance of the proposed method is evaluated using numerical and experimental results.Keywords: wired network, reflectometry, network distributed diagnosis, multi-objective genetic algorithm
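The multi-objective selection step at the heart of such an algorithm keeps only candidate fault hypotheses that are Pareto-non-dominated across the per-sensor residuals. The sketch below shows that core mechanism only; the hypotheses, positions, and residual values are hypothetical, and the full GA (variation, generations) is omitted.

```python
# Pareto-dominance sketch of the multi-objective selection step
# (illustrative data; not the authors' algorithm or measurements).

def dominates(a, b):
    """True if objective vector a is no worse than b everywhere and
    strictly better somewhere (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    return [c for c in candidates
            if not any(dominates(o[1], c[1]) for o in candidates if o is not c)]

# (fault hypothesis: position, severity) -> (residual at sensor 1, sensor 2)
candidates = [
    (("12.0 m", 0.3), (0.10, 0.40)),
    (("12.5 m", 0.3), (0.20, 0.15)),
    (("13.0 m", 0.5), (0.25, 0.50)),   # dominated by the first candidate
    (("11.0 m", 0.2), (0.05, 0.60)),
]
front = pareto_front(candidates)
print([h for h, _ in front])
```

In a full multi-objective GA the surviving front would seed the next generation, so hypotheses that fit one sensor well but another poorly are retained until the merged evidence disambiguates them.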
Procedia PDF Downloads 194