Search results for: small baseline subset algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9154

8464 Finding Bicluster on Gene Expression Data of Lymphoma Based on Singular Value Decomposition and Hierarchical Clustering

Authors: Alhadi Bustaman, Soeganda Formalidin, Titin Siswantining

Abstract:

DNA microarray technology is used to analyze thousands of gene expression values simultaneously, a task of great importance for drug development and testing, gene function annotation, and cancer diagnosis. Various clustering methods have been used to analyze gene expression data. However, when analyzing very large and heterogeneous collections of gene expression data, conventional clustering methods often cannot produce a satisfactory solution. Biclustering algorithms have been used as an alternative approach to identifying structures in gene expression data. In this paper, we introduce a transform technique based on singular value decomposition to obtain a normalized matrix of gene expression data, followed by the Mixed-Clustering algorithm and the Lift algorithm, inspired by the node-deletion and node-addition phases proposed by Cheng and Church, based on Agglomerative Hierarchical Clustering (AHC). An experimental study on standard datasets demonstrated the effectiveness of the algorithm on gene expression data.
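As a rough illustration of the SVD step described above, the sketch below seeds a candidate bicluster from the leading singular pair of a row-centered expression matrix. This is a minimal sketch only: the toy matrix, the sign-fixing step, and the loading threshold are assumptions, and the paper's Mixed-Clustering and Lift phases are not shown.

```python
import numpy as np

def svd_bicluster_seed(X):
    """Seed a bicluster from the leading singular pair of the
    row-centered expression matrix X (genes x conditions)."""
    Xc = X - X.mean(axis=1, keepdims=True)   # normalize each gene
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    u, v = U[:, 0], Vt[0, :]
    if u[np.argmax(np.abs(u))] < 0:          # fix SVD sign ambiguity
        u, v = -u, -v
    rows = np.where(u > 1e-6)[0]             # genes loading on component 1
    cols = np.where(v > 1e-6)[0]             # conditions loading on it
    return rows, cols

# toy matrix: genes 0-2 are up-regulated in conditions 0-1
X = np.array([[5., 5., 1., 1.],
              [5., 5., 1., 1.],
              [5., 5., 1., 1.],
              [1., 1., 1., 1.],
              [1., 1., 1., 1.]])
rows, cols = svd_bicluster_seed(X)
```

In this toy case the leading component isolates the up-regulated block, which a subsequent clustering phase would then refine.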

Keywords: agglomerative hierarchical clustering (AHC), biclustering, gene expression data, lymphoma, singular value decomposition (SVD)

Procedia PDF Downloads 270
8463 Optimization of Steel Moment Frame Structures Using Genetic Algorithm

Authors: Mohammad Befkin, Alireza Momtaz

Abstract:

Structural design is a challenging aspect of every project due to limitations in dimensions, the functionality of the structure, and, more importantly, the budget allocated for construction. This research investigates optimized designs for three steel moment frame buildings with different numbers of stories using a genetic algorithm code. The number and length of spans and the height of each floor were constant in all three buildings. The structures were designed according to the AISC code within the provisions of plastic design with allowable stress values. The genetic optimization code was written in MATLAB, while the buildings were modeled in OpenSees and connected to the MATLAB code to perform the iterations of the optimization steps. Finally, the designs produced by the genetic algorithm code were compared with analyses of the buildings in ETABS. The results demonstrated that the structural elements suggested by the code utilize their full capacity, indicating the desirable efficiency of the produced code.
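The member-sizing loop described above can be caricatured with a tiny genetic algorithm that picks discrete section areas so that member stresses stay within an allowable value. This is a hedged sketch: the section catalog, member loads, allowable stress, and penalty weight are invented stand-ins, not the paper's AISC/OpenSees setup.

```python
import random

ALLOW = 150.0                                      # allowable stress (MPa), assumed
CATALOG = [500.0, 800.0, 1200.0, 2000.0, 3000.0]   # candidate areas (mm^2), assumed
LOADS = [60e3, 150e3, 240e3]                       # member axial forces (N), assumed

def fitness(ind):
    """Total section area plus a heavy penalty for overstressed members."""
    total = sum(CATALOG[i] for i in ind)
    penalty = 0.0
    for i, force in zip(ind, LOADS):
        stress = force / CATALOG[i]
        if stress > ALLOW:
            penalty += 1e6 * (stress / ALLOW - 1.0)
    return total + penalty

def ga(pop_size=40, gens=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(CATALOG)) for _ in LOADS] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)                  # elitist selection: keep best half
        elite = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, len(LOADS))  # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:              # mutation
                child[rng.randrange(len(LOADS))] = rng.randrange(len(CATALOG))
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = ga()
```

The penalty term plays the role of the allowable-stress check that a structural analysis package would perform for each candidate design.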

Keywords: genetic algorithm, structural analysis, steel moment frame, structural design

Procedia PDF Downloads 111
8462 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data

Authors: Qiuxiao Chen, Yan Hou, Ning Wu

Abstract:

As the classic Douglas-Peucker Algorithm (DPA) has deficiencies such as a high risk of mistakenly deleting key nodes, high complexity, and relatively slow execution, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data is proposed. For each curve, the basic element of linear vector data, the deletion costs of all its middle nodes are calculated, and the minimum deletion cost is compared with a pre-defined threshold. If the former is greater than or equal to the latter, all remaining nodes are retained and the curve's compression is finished. Otherwise, the node with the minimal deletion cost is deleted, the deletion costs of its two neighbors are updated, and the loop is repeated on the compressed curve until termination. Through several comparative experiments on different types of linear vector data, DPA and DCA were compared in terms of compression quality and computing efficiency. The results showed that DCA outperforms DPA in both compression accuracy and execution efficiency.
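A minimal sketch of the deletion-cost loop described above, using the perpendicular distance from a node to the chord joining its two neighbours as its deletion cost (an assumption, since the paper's exact cost function is not given here). For clarity this version recomputes every cost on each pass rather than updating only the two neighbours as the algorithm prescribes.

```python
import math

def dca_compress(curve, threshold):
    """Repeatedly delete the interior node whose removal perturbs the
    curve least, until the minimum deletion cost reaches the threshold."""
    pts = list(curve)

    def cost(i):
        # perpendicular distance from pts[i] to the chord pts[i-1]..pts[i+1]
        (x0, y0), (x1, y1), (x2, y2) = pts[i - 1], pts[i], pts[i + 1]
        cross = (x2 - x0) * (y1 - y0) - (y2 - y0) * (x1 - x0)
        chord = math.hypot(x2 - x0, y2 - y0)
        return abs(cross) if chord == 0 else abs(cross) / chord

    while len(pts) > 2:
        costs = [cost(i) for i in range(1, len(pts) - 1)]
        k = min(range(len(costs)), key=costs.__getitem__)
        if costs[k] >= threshold:   # every remaining node is significant
            break
        del pts[k + 1]              # drop the cheapest node and loop again
    return pts

# the near-collinear node is dropped; the salient peak survives
out = dca_compress([(0, 0), (1, 0.01), (2, 0), (3, 2), (4, 0)], 0.1)
```

Updating only the two neighbours after each deletion, as in the paper, turns the full recomputation into an O(1) step per deletion.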

Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost

Procedia PDF Downloads 246
8461 Subspace Rotation Algorithm for Implementing Restricted Hopfield Network as an Auto-Associative Memory

Authors: Ci Lin, Tet Yeap, Iluju Kiringa

Abstract:

This paper introduces the subspace rotation algorithm (SRA) to train the Restricted Hopfield Network (RHN) as an auto-associative memory. The subspace rotation algorithm is a gradient-free subspace tracking approach based on the singular value decomposition (SVD). In comparison with Backpropagation Through Time (BPTT) for training the RHN, SRA always converged to the optimal solution, whereas BPTT could not achieve the same performance when the model became complex and the number of patterns was large. The AUTS case study showed that the RHN model trained by SRA achieves a basin of attraction with a generally larger radius than the Hopfield Network (HNN) model trained by the Hebbian learning rule. By learning 10,000 patterns from the MNIST dataset with RHN models with different numbers of hidden nodes, we observed that several components can be adjusted to achieve a balance between recovery accuracy and noise resistance.

Keywords: Hopfield neural network, restricted Hopfield network, subspace rotation algorithm, Hebbian learning rule

Procedia PDF Downloads 110
8460 Dissemination of Knowledge on Quality Control for Upgrading Product Standards for Small and Micro Community Enterprises

Authors: Niyom Suwandej

Abstract:

This research investigated the opinions of small and micro community enterprises in Jom Pluak Subdistrict, Bangkhontee District, Samut Songkram Province towards product quality control, with the aim of disseminating knowledge on quality control for upgrading the product standards of small and micro community enterprises. The study employed both qualitative and quantitative methods with 23 participants and was divided into two steps: (1) studying the opinions of the respondents towards the community's product quality control and the upgrading of product standards; (2) creating development guidance for product quality control and the upgrading of product standards for small and micro community enterprises. The demographic findings revealed female respondents as the majority, most above 50 years of age and married, and most with more than 15 years of working experience. The education level reported by most respondents was primary school or lower, followed by secondary school and vocational certificate level. Most respondents reported the highest level of satisfaction with the existing state of product quality control knowledge management. Regarding the guidance on knowledge creation for product quality control for small and micro community enterprises, the respondents were willing to apply the knowledge to upgrading their product standards, and they rated the guidance on knowledge creation for product quality control and product standards with the highest level of satisfaction. 
Furthermore, they gained knowledge and comprehension of product quality control and product standards and could apply it to improving the quality of their production and their product standards.

Keywords: product quality control, product standards, community enterprise, marketing management

Procedia PDF Downloads 465
8459 Sequential Pattern Mining from Data of Medical Record with Sequential Pattern Discovery Using Equivalent Classes (SPADE) Algorithm (A Case Study : Bolo Primary Health Care, Bima)

Authors: Rezky Rifaini, Raden Bagus Fajriya Hakim

Abstract:

This research was conducted at the Bolo Primary Health Care (PHC) in Bima Regency. Its purpose is to find the association patterns formed in the medical record database of Bolo Primary Health Care's patients. The data used are secondary data from the PHC's medical records database, and sequential pattern mining is the analysis method used, with transaction data generated from Patient_ID, Check_Date, and diagnosis. Sequential Pattern Discovery using Equivalent Classes (SPADE) is a sequential pattern mining algorithm that finds frequent sequences in transaction data using a vertical database layout and a sequence join process. The output of the SPADE algorithm is a set of frequent sequences that are then used to form rules capturing association patterns between item combinations. Based on sequential association rule analysis with the SPADE algorithm, for a minimum support of 0.03 and a minimum confidence of 0.75, three sequential association patterns were obtained from the Patient_ID, Check_Date, and diagnosis data in the Bolo PHC.
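The vertical-layout idea behind SPADE can be sketched for the simplest case, frequent single items and frequent 2-sequences, with id-lists mapping each diagnosis to its positions within each patient's visit sequence. This is a toy illustration with invented records, not the full SPADE lattice enumeration.

```python
from collections import defaultdict

def frequent_sequences(db, min_support):
    """SPADE-style mining of frequent items and 2-sequences <a -> b>
    from a vertical layout: item -> sequence id -> occurrence positions."""
    vert = defaultdict(lambda: defaultdict(list))
    for sid, seq in db.items():
        for pos, item in enumerate(seq):
            vert[item][sid].append(pos)
    n = len(db)
    freq1 = {it for it, occ in vert.items() if len(occ) / n >= min_support}
    freq2 = set()
    for a in freq1:
        for b in freq1:
            # temporal join: some b must occur after a's first occurrence
            count = sum(1 for sid in vert[a]
                        if sid in vert[b] and vert[b][sid][-1] > vert[a][sid][0])
            if count / n >= min_support:
                freq2.add((a, b))
    return freq1, freq2

# toy visit histories: sequence id (patient) -> ordered diagnoses
db = {1: ["flu", "cough", "flu"],
      2: ["flu", "cough"],
      3: ["cough", "flu"]}
items, pairs = frequent_sequences(db, min_support=0.6)
```

Longer patterns are mined the same way: each frequent k-sequence's id-list is joined with another to test (k+1)-sequences, without rescanning the original database.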

Keywords: diagnosis, primary health care, medical record, data mining, sequential pattern mining, SPADE algorithm

Procedia PDF Downloads 395
8458 RFID Based Indoor Navigation with Obstacle Detection Based on A* Algorithm for the Visually Impaired

Authors: Jayron Sanchez, Analyn Yumang, Felicito Caluyo

Abstract:

A visually impaired individual may use a cane, a guide dog, or ask for assistance from another person. This study implemented RFID technology consisting of a low-cost RFID reader and passive RFID tag cards. The passive RFID tag cards served as checkpoints, and the visually impaired user was guided through audio output from the system while traversing the path. The study used an ultrasonic sensor to detect static obstacles; the system then generated an alternate path based on the A* algorithm to avoid them. Alternate paths were also generated in case the user strayed outside the intended path to the destination. The A* algorithm generated the shortest path to the destination by calculating the total cost of movement and selecting the smallest movement cost as the successor to the current tag card. Several trials were conducted to determine the effect of obstacles on the traversal time of the visually impaired user. A dependent-sample t-test was applied for the statistical analysis of the study. Based on the analysis, obstacles along the path introduced delays while requesting the alternate path, because of the transmission delay from the laptop to the device via ZigBee modules.
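The path planning step can be sketched as standard A* on a 4-connected grid with a Manhattan heuristic; treating the RFID checkpoint layout as a grid with marked obstacle cells is an assumption for illustration.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid; grid[r][c] == 1 marks an obstacle.
    Priority f = g (steps so far) + h (Manhattan distance to goal)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]
    best_g = {}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in best_g and best_g[node] <= g:
            continue                       # already reached node more cheaply
        best_g[node] = g
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None                            # goal unreachable

# a wall in column 1 forces a detour through the bottom row
path = astar([[0, 1, 0],
              [0, 1, 0],
              [0, 0, 0]], (0, 0), (0, 2))
```

When the ultrasonic sensor flags a cell as blocked, re-running the search with that cell marked 1 yields the alternate path.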

Keywords: A* algorithm, RFID technology, ultrasonic sensor, ZigBee module

Procedia PDF Downloads 404
8457 A Study on Analysis of Magnetic Field in Induction Generator for Small Francis Turbine Generator

Authors: Young-Kwan Choi, Han-Sang Jeong, Yeon-Ho Ok, Jae-Ho Choi

Abstract:

The purpose of this study is to verify the validity of the design by evaluating the output of the designed induction generator through finite element analysis before manufacture. The characteristics of the induction generator in its operating domain can be understood by analyzing the magnetic field according to the generator's load (rotational speed). Characteristics of the induction generator such as induced voltage, current, torque, magnetic flux density (magnetic flux saturation), and losses can be predicted from the magnetic field analysis.

Keywords: electromagnetic analysis, induction generator, small hydro power generator, small Francis turbine generator

Procedia PDF Downloads 1465
8456 A Novel Gateway Location Algorithm for Wireless Mesh Networks

Authors: G. M. Komba

Abstract:

An Internet Gateway (IGW) has greater capability than a simple Mesh Router (MR) and is responsible for routing most of the traffic from Mesh Clients (MCs) to the Internet backbone; however, IGWs are more expensive. Choosing strategic locations for IGWs in a Backbone Wireless Mesh (BWM) is critical to the Wireless Mesh Network (WMN), and good IGW placement can mitigate a number of performance-related problems. In this paper, we propose a novel algorithm, the New Gateway Location Algorithm (NGLA), which aims to decrease network cost, minimize delay, and optimize throughput capacity. Different from existing algorithms, the NGLA incrementally selects IGWs, allocates mesh routers (MRs) to the identified IGWs, and guarantees a feasible IGW placement that installs as few IGWs as possible while consistently satisfying all Quality of Service (QoS) requirements. Simulation results show that the NGLA outperforms other algorithms by a large margin in the number of IGWs, placing 40% fewer IGWs with an 80% gain in throughput. Furthermore, the NGLA is easy to implement and could be employed in BWM networks.

Keywords: wireless mesh network, gateway location algorithm, quality of service, BWM

Procedia PDF Downloads 363
8455 Clutter Suppression Based on Singular Value Decomposition and Fast Wavelet Algorithm

Authors: Ruomeng Xiao, Zhulin Zong, Longfa Yang

Abstract:

Aiming at the problem that the target signal is difficult to detect in a strong ground clutter environment, this paper proposes a clutter suppression algorithm that combines singular value decomposition with the Mallat fast wavelet algorithm. The method first performs singular value decomposition on the radar echo data matrix and achieves an initial separation of target and clutter through threshold processing of the singular values. It then applies wavelet decomposition to the echo data to locate the target, and uses a discard strategy to select the appropriate decomposition level for reconstructing the target signal, which suppresses the clutter while ensuring minimal loss of target information. Verification on measured data shows that the method is highly effective for target extraction at low SCR, that target reconstruction can be achieved without prior knowledge of the target position, and that the method also provides some enhancement of the output SCR compared with the traditional single wavelet processing method.
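The first stage, separating clutter by thresholding singular values, can be sketched as zeroing the dominant singular components of the echo matrix. This is a minimal sketch: the toy echo, the choice of a single clutter component, and the hard cutoff are assumptions, and the wavelet stage is not shown.

```python
import numpy as np

def svd_clutter_filter(echo, n_clutter=1):
    """Suppress clutter by zeroing the dominant singular values, which
    capture the strongest (ground clutter) subspace of the echo matrix."""
    U, s, Vt = np.linalg.svd(echo, full_matrices=False)
    s = s.copy()
    s[:n_clutter] = 0.0          # discard the clutter subspace
    return (U * s) @ Vt          # reconstruct from the remaining components

# strong flat clutter plus one weak point target at cell (2, 3)
clutter = 10.0 * np.ones((4, 5))
target = np.zeros((4, 5))
target[2, 3] = 1.0
filtered = svd_clutter_filter(clutter + target)
```

After removing the dominant component, most of the residual energy sits at the target cell, which the wavelet stage would then localize and reconstruct.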

Keywords: clutter suppression, singular value decomposition, wavelet transform, Mallat algorithm, low SCR

Procedia PDF Downloads 110
8454 Effects of Oral Resveratrol Supplementation on Inflammation and Quality of Life in Patients with Ulcerative Colitis

Authors: M. Samsami, A. Hekmatdoost, N. Ebrahimi Daryani, P. Rezanejad Asl

Abstract:

Ulcerative colitis (UC) is an inflammatory bowel disease in which immune and inflammatory factors are thought to play a role. Resveratrol is an antioxidant and anti-inflammatory compound. This study determined the effects of resveratrol on inflammatory factors in patients with ulcerative colitis. It was a double-blind randomized clinical trial conducted on 50 patients with UC. Subjects received one capsule daily for 6 weeks of either resveratrol (500 mg) or a placebo. Inflammatory factors, anthropometric measures, and IBDQ-9 (Inflammatory Bowel Disease Questionnaire-9) scores were assessed at baseline and at the end of the study. STATA 12 software was used for data analysis. No significant differences were found in the background variables between the two groups at baseline. The results indicated that resveratrol supplementation for 6 weeks significantly decreased plasma levels of TNF-α and hs-CRP and the activity of NF-κB compared with the placebo group (p<0.001). Significant differences remained after adjustment for vitamin C (p<0.0001). The IBDQ-9 scores increased significantly in the resveratrol group over the placebo group (p<0.001). These findings show that resveratrol supplementation can be useful in patients with ulcerative colitis.

Keywords: IBD, inflammation, resveratrol, ulcerative colitis

Procedia PDF Downloads 399
8453 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network

Authors: Abdulaziz Alsadhan, Naveed Khan

Abstract:

In recent years, intrusions on computer networks have been the major security threat; hence, it is important to impede such intrusions. Hindering them relies entirely on their detection, which is the primary concern of any security tool such as an Intrusion Detection System (IDS). It is therefore imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance, which can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques are limited by their use of the raw data set for classification: the classifier may be confused by redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Local Binary Pattern (LBP) can be applied to transform raw features into a principal feature space and select features based on their sensitivity, with eigenvalues used to determine the sensitivity. To further refine the selected features, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power; this optimal feature subset is then used for classification. For classification, a Support Vector Machine (SVM) and a Multilayer Perceptron (MLP) are used due to their proven ability in classification. The Knowledge Discovery and Data Mining (KDD'99) cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms existing approaches and has the capability to minimize the number of features and maximize the detection rate.
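The PCA stage of the proposed pipeline can be sketched as projecting raw features onto the leading eigenvectors of the covariance matrix, with the eigenvalues acting as the sensitivity scores mentioned above. The data are toy values, and the LDA/LBP, feature-search, and classifier stages are not shown.

```python
import numpy as np

def pca_reduce(X, k):
    """Project samples onto the k principal components; the eigenvalues
    of the covariance matrix serve as per-component sensitivity scores."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)          # ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]        # most sensitive components first
    return Xc @ vecs[:, order], vals[order]

# toy "traffic features": the second column is an exact multiple of the
# first, so a single principal component carries all the variance
X = np.array([[0., 0.], [1., 2.], [2., 4.], [3., 6.]])
Z, sensitivity = pca_reduce(X, 1)
```

The reduced matrix Z, rather than the redundant raw features, would then be passed to the feature-search and SVM/MLP stages.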

Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP)

Procedia PDF Downloads 359
8452 The Potential for Tourism Development in the Greater Chinhoyi Area in Zimbabwe: A Systems Approach in an Appetizer Destination

Authors: Phillip F. Kanokanga, Patrick W. Mamimine, Molline Mwando, Charity Mapingure

Abstract:

Tourism development tends to follow anchor attractions, including cities and their surroundings, while marginalizing small towns and their environs, even though small towns and their hinterlands can also offer competitive tourism products. The Zimbabwe Tourism Authority (ZTA) gathers visitor statistics for major tourist destinations only, thereby overlooking the density of tourist traffic that passes through or visits the country's small towns. Unless this problem is addressed, the tourism potential of small towns and their hinterlands will not be fully tapped for economic development. Using a qualitative research methodology, this study investigated the opportunities for tourism development in the Greater Chinhoyi area. The study revealed that the Greater Chinhoyi area has potential for heritage tourism, village tourism, urban tourism, educational tourism, dark tourism, recreational tourism, agrotourism, and nature tourism. There is a need to link the various tourism resources in the Greater Chinhoyi area to anchor attractions in the dominant resorts, and then to develop and present the tourism products of transit towns as 'appetisers' or 'appetiser attractions' before one reaches the main destination.

Keywords: anchor attractions, appetisers, heritage tourism, agrotourism, small towns, tourism corridor, systems approach, hidden treasures

Procedia PDF Downloads 58
8451 The Effects of Governmental Regulation on Technological Innovation in Korean Firms

Authors: SeungKu Ahn, Sewon Lee

Abstract:

This study examines the effects of regulatory policies on corporate R&D activities and innovation and suggests regulatory directions for enhancing corporate performance. It employs a regression model with R&D activities as dependent variables and the regulatory index as an independent variable. The results are as follows: regulation is negatively associated with both the input and output of R&D activities; regulation encourages small and medium-sized firms to invest in R&D; and regulation has a positive effect on patent applications by small and medium-sized firms.

Keywords: governmental regulation, research and development performance, small and medium-sized firms, technological innovation

Procedia PDF Downloads 259
8450 Research on Control Strategy of Differential Drive Assisted Steering of Distributed Drive Electric Vehicle

Authors: J. Liu, Z. P. Yu, L. Xiong, Y. Feng, J. He

Abstract:

Given the independence, accuracy, and controllability of the driving/braking torque of a distributed drive electric vehicle, a control strategy for differential drive assisted steering was designed. First, the assist curve under different speeds and steering wheel torques was developed, and the differential torques were distributed to the right and left front wheels. Then a steering-return assist control algorithm was designed. Finally, a joint simulation was conducted in CarSim/Simulink. The results indicated that the differential drive assisted steering algorithm provides sufficient steering assistance at low speed and improves steering portability. As vehicle speed increases, the steering assistance provided decreases; with the control algorithm, the steering stiffness of the steering system increases with speed, which preserves the driver's road feel. The control algorithm also effectively avoids understeer at low speed.

Keywords: differential assisted steering, control strategy, distributed drive electric vehicle, driving/braking torque

Procedia PDF Downloads 473
8449 Loss Minimization by Distributed Generation Allocation in Radial Distribution System Using Crow Search Algorithm

Authors: M. Nageswara Rao, V. S. N. K. Chaitanya, K. Amarendranath

Abstract:

This paper presents optimal allocation and sizing of Distributed Generation (DG) in a Radial Distribution Network (RDN) to minimize total power loss and enhance the voltage profile of the system. The study has two main parts: first, finding the optimal allocation, and second, the optimum size of the DG. The locations of the DGs are identified by analytical expressions, and the crow search algorithm (CSA) is employed to determine the optimum DG size; both single and multiple DG placements are considered. CSA is a meta-heuristic algorithm inspired by the intelligent behavior of crows: crows store their excess food in different locations and memorize those locations to retrieve the food when it is needed, and they follow each other to thieve better food sources. The analysis is tested on the IEEE 33-bus and IEEE 69-bus systems in the MATLAB environment, and the results are compared with existing methods.
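The crow search mechanics described above can be sketched as follows, with a simple quadratic stand-in for the power-loss objective; the awareness probability, flight length, bounds, and objective are illustrative assumptions, not the paper's settings.

```python
import random

def crow_search(f, dim, bounds, n_crows=20, iters=200, ap=0.1, fl=2.0, seed=0):
    """Crow Search: each crow follows a random other crow's memorized
    hiding place; with awareness probability `ap` the followed crow
    notices and the follower is sent to a random position instead."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_crows)]
    mem = [p[:] for p in pos]            # best position each crow remembers
    for _ in range(iters):
        for i in range(n_crows):
            j = rng.randrange(n_crows)   # crow i follows crow j
            if rng.random() >= ap:       # j unaware: move toward j's memory
                r = rng.random()
                new = [pos[i][d] + r * fl * (mem[j][d] - pos[i][d])
                       for d in range(dim)]
            else:                        # j aware: i wanders randomly
                new = [rng.uniform(lo, hi) for _ in range(dim)]
            new = [min(hi, max(lo, x)) for x in new]
            pos[i] = new
            if f(new) < f(mem[i]):       # update memory on improvement
                mem[i] = new
    return min(mem, key=f)

# stand-in objective: distance of the DG "size" vector from a known optimum
best = crow_search(lambda x: sum((xi - 3.0) ** 2 for xi in x),
                   dim=2, bounds=(0.0, 10.0))
```

In the paper's setting, f would be the power-loss evaluation of a load-flow solution for a candidate DG size at the analytically chosen bus.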

Keywords: analytical expression, distributed generation, crow search algorithm, power loss, voltage profile

Procedia PDF Downloads 224
8448 Resource Allocation of Small Agribusinesses and Entrepreneurship Development In Nigeria

Authors: Festus M. Epetimehin

Abstract:

Resources are essential materials required for the production of goods and services; their effective allocation can engender the success of current business activities and sustain them for future generations. This study examined the effect of resource allocation by small agribusinesses on entrepreneurship development in Southwest Nigeria. A sample size of 385 was determined using Cochran's formula, and 350 valid copies of the questionnaire were used in the analysis. To achieve the objective, descriptive and cross-sectional research designs were used to gather data through the administration of the questionnaire to respondents, and both descriptive and inferential statistics were used in the analysis. The results indicated that resource allocation by small agribusinesses had a substantial positive effect on entrepreneurship development, with a p-value of 0.0000, less than the 5.0% critical value, and a positive regression coefficient of 0.53. The implication is that the ability of entrepreneurs to deploy their resources efficiently, through adequate realization of better gross margins, can enhance business activities and development. The study recommends that business owners receive serious training and exposure on managing modern small agribusiness resources to enhance business performance. The intervention of the Agricultural Development Programme (ADP) and other agricultural institutions is needed in this regard.

Keywords: resource, resource allocation, small businesses, agriculture, entrepreneurship development

Procedia PDF Downloads 43
8447 Design and Performance Analysis of Resource Management Algorithms in Response to Emergency and Disaster Situations

Authors: Volkan Uygun, H. Birkan Yilmaz, Tuna Tugcu

Abstract:

This study focuses on the development and use of algorithms that address resource management in response to emergency and disaster situations. The presented system, named the Disaster Management Platform (DMP), takes data from the data sources of service providers and distributes incoming requests so as to both manage load balancing and minimize service time, resulting in improved user satisfaction. Three resource management algorithms, giving different levels of importance to load balancing and service time, are proposed. The first is the Minimum Distance algorithm, which assigns a request to the closest resource. The second is the Minimum Load algorithm, which assigns a request to the resource with the minimum load. The last is the Hybrid algorithm, which combines the previous two approaches. The performance of the proposed algorithms is evaluated with respect to waiting time, success ratio, and maximum load ratio, with the metrics monitored from simulations to find the optimal scheme for different loads. Two simulations are performed in the study: one time-based and the other lambda-based. The results indicate that the Minimum Load algorithm is generally the best on all metrics, whereas the Minimum Distance algorithm is the worst in all cases and on all metrics. The leading position in performance switches between the Minimum Distance and the Hybrid algorithms as lambda values change.
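The three assignment policies can be sketched directly; the resource representation and the hybrid weighting below are assumptions for illustration, not the DMP's actual scoring.

```python
def assign(request, resources, policy="hybrid", alpha=0.5):
    """Assign a request to a resource under one of the three policies.
    Each resource is a dict with 'pos' (x, y) and 'load' (pending requests)."""
    def dist(r):
        (x1, y1), (x2, y2) = request["pos"], r["pos"]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

    if policy == "distance":            # Minimum Distance: closest resource
        score = dist
    elif policy == "load":              # Minimum Load: least-loaded resource
        score = lambda r: r["load"]
    else:                               # Hybrid: blend of normalized terms
        dmax = max(dist(r) for r in resources) or 1.0
        lmax = max(r["load"] for r in resources) or 1
        score = lambda r: alpha * dist(r) / dmax + (1 - alpha) * r["load"] / lmax
    best = min(resources, key=score)
    best["load"] += 1                   # the request queues at the winner
    return best
```

Varying alpha moves the Hybrid policy between the two extremes, which mirrors the trade-off the simulations explore across lambda values.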

Keywords: emergency and disaster response, resource management algorithm, disaster situations, disaster management platform

Procedia PDF Downloads 334
8446 Innovation and Performance of Very Small Agri-Food Enterprises in Cameroon

Authors: Ahmed Moustapha Mfokeu

Abstract:

Agri-food VSEs in Cameroon are facing a succession of crises: insecurity, particularly in the Far North, South West, and North West regions, the consequences of the Covid-19 crisis, and the war in Ukraine. These multiple crises have disrupted raw material prices. Moreover, competitive pressures are exacerbated by the technological acceleration of productive systems in emerging countries, which raises the demands imposed by the markets. Cameroonian VSEs must therefore be able to meet the new challenges of international competition, especially through innovation. The objective of this research is to contribute to the knowledge of the effects of innovation on the performance of very small agribusinesses in Cameroon. Methodologically, the data were obtained from a sample of 153 companies in the cities of Douala and Yaoundé, and the research uses structural equation models with latent variables. The main results show a positive and significant link between innovation and the performance of very small agri-food companies: it is important for entrepreneurs not only to encourage and practice innovation, but also to understand and embrace this aspect of their strategic function.

Keywords: innovation, performance, very small enterprise, agrifood

Procedia PDF Downloads 100
8445 The Impact of Metacognitive Knowledge and Experience on Top Management Team Diversity and Small to Medium Enterprises Performance

Authors: Jo Rhodes, Peter Lok, Zahra Sadeghinejad

Abstract:

The aim of this study is to determine the impact of metacognition on top management team members and on firm performance, based on full team integration. A survey of 1500 small to medium enterprises (SMEs) was initiated, and 140 firms responded (a response rate of 9%). The results showed that the different metacognitive abilities of managers (knowledge and experience) can enhance team decision-making and problem solving, resulting in greater firm performance. This is a significant finding for SMEs because these organisations have small teams with owner leadership and an entrepreneurial orientation.

Keywords: metacognition, behavioural integration, top management team (TMT), performance

Procedia PDF Downloads 368
8444 Modified CUSUM Algorithm for Gradual Change Detection in a Time Series Data

Authors: Victoria Siriaki Jorry, I. S. Mbalawata, Hayong Shin

Abstract:

The main objective in a change detection problem is to develop algorithms for the efficient detection of gradual and/or abrupt changes in the parameter distribution of a process or time series. In this paper, we present a modified cumulative sum (MCUSUM) algorithm to detect the start and end of a time-varying linear drift in the mean of a time series, based on a likelihood ratio test procedure. The design, implementation, and performance of the proposed algorithm for linear drift detection are evaluated and compared to the existing CUSUM algorithm using different performance measures, and an approach to accurately approximate the threshold of the MCUSUM is also provided. The performance of the MCUSUM for gradual change-point detection is compared to that of the standard cumulative sum (CUSUM) control chart, designed for abrupt shift detection, using Monte Carlo simulations. In terms of the expected time to detection, the MCUSUM procedure is found to perform better than a standard CUSUM chart for detecting a gradual change in the mean. The algorithm is then applied to a randomly generated time series with a gradual linear trend in the mean to demonstrate its usefulness.
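The one-sided CUSUM recursion that both the standard chart and the MCUSUM build on can be sketched as follows; this is a textbook illustration with invented slack and threshold values, not the paper's modified statistic.

```python
def cusum_detect(data, target, k, h):
    """One-sided CUSUM chart: accumulate positive deviations from
    `target` minus the slack k, and signal when the statistic exceeds
    the threshold h. Returns the index of the first alarm, or None."""
    s = 0.0
    for t, x in enumerate(data):
        s = max(0.0, s + (x - target - k))   # reset to zero when negative
        if s > h:
            return t
    return None
```

On a series with a slowly growing linear drift, the statistic accumulates gradually before crossing h, which is the detection-delay behaviour the MCUSUM is designed to shorten.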

Keywords: average run length, CUSUM control chart, gradual change detection, likelihood ratio test

Procedia PDF Downloads 291
8443 Enhanced Imperialist Competitive Algorithm for the Cell Formation Problem Using Sequence Data

Authors: S. H. Borghei, E. Teymourian, M. Mobin, G. M. Komaki, S. Sheikh

Abstract:

The imperialist competitive algorithm (ICA) is a recent meta-heuristic method, inspired by social evolution, for solving NP-hard problems. The ICA is a population-based algorithm that has achieved great performance in comparison to other meta-heuristics. This study develops an enhanced ICA approach to solve the cell formation problem (CFP) using sequence data. In addition to the conventional ICA, the enhanced version, namely EICA, applies local search techniques to add more intensification aptitude and to embed the features of exploration and intensification more successfully. Suitable performance measures are used to compare the proposed algorithms with other powerful solution approaches in the literature. To check the proficiency of the algorithms, forty test problems are presented: five benchmark problems have sequence data, and the others are based on 0-1 matrices modified into sequence-based problems. Computational results elucidate the efficiency of the EICA in solving CFP instances.

Keywords: cell formation problem, group technology, imperialist competitive algorithm, sequence data

Procedia PDF Downloads 449
8442 Rethinking the Use of Online Dispute Resolution in Resolving Cross-Border Small E-Disputes in EU

Authors: Sajedeh Salehi, Marco Giacalone

Abstract:

This paper examines the role of existing online dispute resolution (ODR) mechanisms and their effect on improving access to justice – a right protected by Art. 47 of the EU Charter of Fundamental Rights – for consumers in the EU. The major focus of this study is on evaluating ODR as a means of resolving Business-to-Consumer (B2C) cross-border small claims arising from e-commerce transactions. The authors elaborate the consequences of implementing ODR methods in the context of recent developments in EU regulatory safeguards promoting consumer protection. In this analysis, both non-judicial and judicial ODR redress mechanisms are considered; however, significant consideration is given to – obligatory and non-obligatory – judicial ODR methods. For that purpose, this paper particularly investigates the impact of the EU ODR platform and of the European Small Claims Procedure (ESCP) Regulation 861/2007, and their role in accelerating access to justice for consumers in B2C e-disputes. Although a considerable volume of research has been carried out on ODR for consumer claims, rather less (or no) attention has been paid to providing a combined doctrinal and empirical evaluation of ODR's potential in resolving cross-border small e-disputes in the EU. Hence, the methodological approach taken in this study is a mixed methodology based on qualitative (interviews) and quantitative (surveys) research methods, mainly drawing on data acquired through the findings of the Small Claims Analysis Net (SCAN) project. This project contributes towards examining the implementation and efficiency of the ESCP Regulation in providing consumers with a legal watershed through using ODR for their transnational small claims. The outcomes of this research may benefit both academia and policymakers at the national and international level.

Keywords: access to justice, consumers, e-commerce, small e-disputes

Procedia PDF Downloads 125
8441 An Experimental Study on Some Conventional and Hybrid Models of Fuzzy Clustering

Authors: Jeugert Kujtila, Kristi Hoxhalli, Ramazan Dalipi, Erjon Cota, Ardit Murati, Erind Bedalli

Abstract:

Clustering is a versatile instrument in the analysis of collections of data, providing insights into the underlying structures of the dataset and enhancing modeling capabilities. The fuzzy approach to the clustering problem increases flexibility by involving the concept of partial memberships (values in the continuous interval [0, 1]) of the instances in the clusters. Several fuzzy clustering algorithms have been devised, such as FCM, Gustafson-Kessel, Gath-Geva, kernel-based FCM and PCM. Each of these algorithms has its own advantages and drawbacks, so no single algorithm performs best on all datasets. In this paper, we experimentally compare the FCM, GK and GG algorithms and a hybrid two-stage fuzzy clustering model combining the FCM and Gath-Geva algorithms. First, we theoretically discuss the advantages and drawbacks of each of these algorithms and describe the hybrid clustering model, which exploits the advantages and diminishes the drawbacks of each algorithm. Second, we experimentally evaluate the accuracy of the hybrid model by applying it to several benchmark and synthetic datasets.
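As an illustration of the baseline algorithm, a minimal pure-Python sketch of the FCM updates for one-dimensional data is given below; the Gustafson-Kessel, Gath-Geva and hybrid variants differ mainly in their distance measures and are not shown. The initialization and data are illustrative choices.

```python
def fcm(points, c=2, m=2.0, n_iters=100):
    """Return (centers, memberships) for c fuzzy clusters of 1-D points."""
    # Illustrative initialization: spread the centers across the data range.
    centers = [min(points), max(points)]
    u = [[0.0] * len(points) for _ in range(c)]
    for _ in range(n_iters):
        # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        for k, x in enumerate(points):
            dists = [abs(x - v) or 1e-12 for v in centers]  # avoid div by 0
            for i in range(c):
                u[i][k] = 1.0 / sum((dists[i] / d) ** (2 / (m - 1))
                                    for d in dists)
        # Center update: weighted mean with weights u_ik^m
        centers = [sum(u[i][k] ** m * x for k, x in enumerate(points)) /
                   sum(u[i][k] ** m for k in range(len(points)))
                   for i in range(c)]
    return centers, u

data = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]   # two well-separated groups
centers, u = fcm(data)
```

On this toy dataset the two centers settle near the group means, with each point holding a soft membership in both clusters rather than a hard assignment.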

Keywords: fuzzy clustering, fuzzy c-means algorithm (FCM), Gustafson-Kessel algorithm, hybrid clustering model

Procedia PDF Downloads 507
8440 Estimation of Optimum Parameters of Non-Linear Muskingum Model of Routing Using Imperialist Competition Algorithm (ICA)

Authors: Davood Rajabi, Mojgan Yazdani

Abstract:

The non-linear Muskingum model is an efficient method for flood routing; however, its efficiency depends on three applied parameters. Therefore, this study assessed the efficiency of the Imperialist Competition Algorithm (ICA) in evaluating the optimum parameters of the non-linear Muskingum model. In addition to ICA, the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) were also used to provide criteria against which to judge ICA. The ICA was first applied to the Wilson flood routing problem; then, the routing of two flood events of the DoAab Samsami River was investigated. For the Wilson flood, the objective function was the sum of squared deviations (SSQ) of observed and calculated discharges. For routing the two other floods, in addition to SSQ, the sum of absolute deviations (SAD) of observed and calculated discharges was also considered as an objective function. For the first flood, GA showed the best performance based on SSQ, while ICA ranked first based on SAD. For the second flood, ICA performed better on both objective functions. According to the obtained results, ICA can be used as an appropriate method to evaluate the parameters of the non-linear Muskingum model.
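The routing scheme and SSQ objective being minimized can be sketched as follows. The hydrograph and parameter values are synthetic illustrations (not the Wilson data), and a full implementation would minimize ssq over (K, x, m) with ICA, GA or PSO rather than evaluate it at fixed parameters.

```python
def route(inflow, K, x, m, dt=1.0):
    """Route an inflow hydrograph with storage S = K*(x*I + (1-x)*O)**m."""
    outflow = [inflow[0]]              # assume initial outflow equals inflow
    S = K * inflow[0] ** m             # initial storage from steady state
    for I in inflow[1:]:
        # Invert the storage relation for the outflow at this step.
        O = max(((S / K) ** (1.0 / m) - x * I) / (1.0 - x), 0.0)
        S += (I - O) * dt              # continuity: dS/dt = I - O
        outflow.append(O)
    return outflow

def ssq(observed, computed):
    """Sum of squared deviations between observed and computed discharges."""
    return sum((o - c) ** 2 for o, c in zip(observed, computed))

inflow = [10, 12, 18, 25, 30, 26, 20, 15, 12, 10]
observed = route(inflow, K=1.2, x=0.2, m=1.3)   # synthetic "observed" flows
```

With the true parameters the objective is zero by construction; any other (K, x, m) yields a positive SSQ, which is the landscape the meta-heuristics search.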

Keywords: DoAab Samsami River, genetic algorithm, imperialist competition algorithm, meta-heuristic algorithms, particle swarm optimization, Wilson flood

Procedia PDF Downloads 497
8439 New Segmentation of Piecewise Moving-Average Model by Using Reversible Jump MCMC Algorithm

Authors: Suparman

Abstract:

This paper addresses the problem of signal segmentation within a Bayesian framework by using a reversible jump MCMC algorithm. The signal is modelled by a piecewise Moving-Average (MA) model in which the number of segments, the positions of the change-points, and the order and coefficients of the MA model for each segment are unknown. The reversible jump MCMC algorithm is then used to generate samples distributed according to the joint posterior distribution of the unknown parameters. These samples allow the calculation of some interesting features of the posterior distribution. The performance of the methodology is illustrated via several simulation results.
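For intuition about the segmentation task, a drastically simplified stand-in is sketched below: an exhaustive least-squares scan for a single change-point in a piecewise-constant signal. The actual method is far more general, since reversible jump moves let it also sample the number of segments and each segment's MA order and coefficients.

```python
def best_changepoint(y):
    """Return (tau, sse): the split index minimizing within-segment error."""
    def sse(seg):
        # Squared error of a segment around its own mean.
        mu = sum(seg) / len(seg)
        return sum((v - mu) ** 2 for v in seg)
    # Scan every non-trivial split and keep the best one.
    return min(((t, sse(y[:t]) + sse(y[t:])) for t in range(1, len(y))),
               key=lambda p: p[1])

signal = [0.0] * 30 + [2.0] * 20     # noise-free step at index 30
tau, err = best_changepoint(signal)
```

This exhaustive scan is O(n^2) and handles one change-point; the appeal of the RJ-MCMC formulation is that it explores segmentations of varying dimension while quantifying posterior uncertainty over them.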

Keywords: piecewise, moving-average model, reversible jump MCMC, signal segmentation

Procedia PDF Downloads 221
8438 Algorithmic Approach to Management of Complications of Permanent Facial Filler: A Saudi Experience

Authors: Luay Alsalmi

Abstract:

Background: Facial filler is the most common type of cosmetic procedure next to botox. Permanent fillers are preferred nowadays due to the low cost brought about by non-recurring injection appointments. However, such fillers pose a higher risk of complications, with even greater adverse effects when the procedure is done using unknown dermal filler injections. Aim: This study aimed to establish an algorithm to categorize and manage patients who receive permanent fillers. Materials and Methods: Twelve participants presented to the service through the emergency department or as outpatients from November 2015 to May 2021. Demographics such as age, sex, date of injection, time of onset, and type of complication were collected. After examination, all cases were managed based on the established algorithm. FACE-Q was used to measure overall satisfaction and psychological well-being. Results: An algorithm to diagnose and manage these patients effectively, with a high satisfaction rate, was established in this study. All participants were non-smoking females with no known medical comorbidities. The algorithm determined the treatment plan when complications were faced. High appearance-related psychosocial distress was observed prior to surgery, which dropped significantly after surgery. FACE-Q established evidence of satisfactory ratings among patients before and after surgery. Conclusion: This treatment algorithm can guide the surgeon in formulating a suitable plan with fewer complications and a high satisfaction rate.

Keywords: facial filler, FACE-Q, psychosocial distress, botox, treatment algorithm

Procedia PDF Downloads 81
8437 Effect of Acceptance and Commitment Therapy on Cognitive Function among Breast Cancer Patients in an Eastern Country

Authors: Arunima Datta, Prathama Guha Chaudhuri, Ashis Mukhopadhyay

Abstract:

Background: Acceptance and commitment therapy (ACT) is one of the newer (third-wave) forms of therapy. It helps cancer patients increase their acceptance of their disease and of their present situation. Breast cancer patients are known to suffer from depression and mild cognitive impairment, both of which affect their quality of life. Objectives: The present study assessed the effect of a structured ACT intervention on cognitive function and acceptance level among breast cancer patients undergoing chemotherapy. Method: Data were collected from 123 breast cancer patients who were undergoing chemotherapy, were willing to undergo psychological treatment, and had no history of psychiatric illness. Baseline cognitive function and acceptance levels were assessed using validated tools, and the effects of sociodemographic and clinical factors on cognitive function were determined at baseline. The participants were randomly divided into two groups: experimental (ACT, 4 sessions over 2 months) and control. Cognitive function and acceptance level were measured post-intervention at the 2-month follow-up. Appropriate statistical analyses were performed to determine the effect on cognitive function and acceptance level in the two groups. Result: At baseline, the factors significantly associated with slower task performance were ER/PR/HER2 status, number of chemotherapy cycles, and treatment type (adjuvant vs. neo-adjuvant). Sociodemographic characteristics showed no significant difference between slow and fast performance. Pre- and post-intervention analysis showed that the ACT intervention resulted in significant differences in both the speed of cognitive performance and the acceptance level. Conclusion: ACT is an effective therapeutic option for treating mild cognitive impairment and improving acceptance levels among breast cancer patients undergoing chemotherapy.

Keywords: acceptance and commitment therapy, breast cancer, quality of life, cognitive function

Procedia PDF Downloads 300
8436 Commissioning of a Flattening Filter Free (FFF) Beam Using an Anisotropic Analytical Algorithm (AAA)

Authors: Safiqul Islam, Anamul Haque, Mohammad Amran Hossain

Abstract:

Aim: To compare the dosimetric parameters of flattened and flattening filter free (FFF) beams and to validate the beam data using the anisotropic analytical algorithm (AAA). Materials and Methods: All the dosimetric data (i.e., depth dose profiles, beam profiles, output factors, penumbra, etc.) required for AAA beam modeling were acquired using the Blue Phantom RFA for the 6 MV, 6 FFF, 10 MV and 10 FFF beams. The Progressive Resolution Optimizer and Dose Volume Optimizer algorithms for VMAT and IMRT were also configured in the beam model. The AAA beam models were compared with the measured data sets. Results: Due to the larger low-energy component in the 6 FFF and 10 FFF beams, their surface doses are 10 to 15% higher compared to the flattened 6 MV and 10 MV beams. An FFF beam has a lower mean energy compared to the flattened beam; the beam quality indices were 0.667 (6 MV), 0.629 (6 FFF), 0.740 (10 MV) and 0.695 (10 FFF). Gamma evaluation with 2% dose and 2 mm distance criteria for the open beam, IMRT and VMAT plans was also performed, and good agreement was found between the modeled and measured data. Conclusion: We have successfully modeled the AAA algorithm for the flattened and FFF beams and achieved good agreement between the calculated and measured values.
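The gamma evaluation mentioned above can be sketched in one dimension as follows. The profiles are illustrative numbers, the local-dose normalization is an assumption, and a clinical implementation would work on 2-D or 3-D dose grids with interpolation.

```python
import math

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose, dta=2.0, dd=0.02):
    """Gamma value for one reference point against an evaluated profile.

    dta: distance-to-agreement criterion (mm); dd: dose-difference criterion
    as a fraction of the local reference dose (an assumed normalization).
    """
    return min(math.sqrt(((p - ref_pos) / dta) ** 2 +
                         ((d - ref_dose) / (dd * ref_dose)) ** 2)
               for p, d in zip(eval_pos, eval_dose))

positions = [0.0, 1.0, 2.0, 3.0, 4.0]          # mm
measured = [100.0, 98.0, 95.0, 90.0, 84.0]     # arbitrary dose units
modeled = [100.5, 98.4, 94.8, 89.5, 84.2]      # close to the measurement

# Every reference point passes when its gamma value is <= 1.
passes = all(gamma_index(p, d, positions, modeled) <= 1.0
             for p, d in zip(positions, measured))
```

The 2%/2 mm criteria enter only through dta and dd, so tightening or loosening the acceptance test is a matter of changing those two numbers.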

Keywords: commissioning, flattening filter free (FFF), anisotropic analytical algorithm (AAA), flattened beam, dosimetric parameters

Procedia PDF Downloads 294
8435 Diesel Fault Prediction Based on Optimized Gray Neural Network

Authors: Han Bing, Yin Zhenjie

Abstract:

In order to analyze the status of a diesel engine and conduct fault prediction, a new prediction model based on a gray system is proposed in this paper, which takes advantage of a neural network and a genetic algorithm. The proposed GBPGA prediction model builds on the GM(1,5) model and uses a neural network, optimized by a genetic algorithm, to construct the error compensator. We verify the proposed model on simulated diesel fault data, and the experimental results show that GBPGA is a promising approach to fault prediction for diesel engines.
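For background on gray prediction, a minimal GM(1,1) sketch is given below. The paper's GBPGA builds on the multivariate GM(1,5) with a GA-optimized neural-network error compensator, which is not reproduced here; the growth series is an illustrative example.

```python
import math

def gm11_predict(x0):
    """Fit GM(1,1) to the series x0 and predict its next value."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                # accumulated (AGO)
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]    # background values
    y = x0[1:]
    # Least-squares fit of x0[k] = -a*z[k] + b via the 2-parameter line fit.
    zbar, ybar = sum(z) / len(z), sum(y) / len(y)
    s = (sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y)) /
         sum((zi - zbar) ** 2 for zi in z))
    a, b = -s, ybar - s * zbar
    # Time response: x1_hat(k) = (x0[0] - b/a) * exp(-a*k) + b/a
    x1_hat = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
    return x1_hat(n) - x1_hat(n - 1)                        # next x0 (IAGO)

series = [100.0, 110.0, 121.0, 133.1]   # illustrative 10% growth series
nxt = gm11_predict(series)              # close to the next term, 146.41
```

The error-compensator idea in GBPGA is to train a network on the residuals of such a gray forecast so that its systematic bias is corrected.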

Keywords: fault prediction, neural network, GM(1,5), genetic algorithm, GBPGA

Procedia PDF Downloads 297