Search results for: Information Dispersal Algorithm
5202 Implementation of Watch Dog Timer for Fault Tolerant Computing on Cluster Server
Authors: Meenakshi Bheevgade, Rajendra M. Patrikar
Abstract:
In today's new technology era, clusters have become a necessity for modern computing and data applications, since many applications take a long time (even days or months) to compute. Although parallelization speeds up computation, the time required for many applications can still be considerable. The reliability of the cluster therefore becomes a very important issue, and the implementation of a fault tolerant mechanism becomes essential. The difficulty of designing a fault tolerant cluster system grows with the variety of possible failures. Most importantly, an algorithm that handles a simple failure in the system must also tolerate more severe failures. In this paper, we implement the watchdog timer concept in a parallel environment to take care of failures. Implementing this simple algorithm in our project allows us to handle different types of failures; consequently, we found that the reliability of the cluster improves.
Keywords: Cluster, Fault tolerant, Grid, Grid Computing System, Meta-computing.
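The abstract does not give implementation details; as a rough illustration of the watchdog-timer idea it describes, a minimal sketch in Python follows. The heartbeat interval, timeout, and recovery action are hypothetical placeholders, not the paper's actual mechanism.

```python
import threading
import time

class WatchdogTimer:
    """Restartable timer: if it is not 'kicked' within `timeout`
    seconds, the supplied failure handler is invoked."""

    def __init__(self, timeout, on_expire):
        self.timeout = timeout
        self.on_expire = on_expire
        self._timer = None

    def kick(self):
        # Heartbeat from the monitored task: cancel and restart the countdown.
        if self._timer is not None:
            self._timer.cancel()
        self._timer = threading.Timer(self.timeout, self.on_expire)
        self._timer.daemon = True
        self._timer.start()

    def stop(self):
        if self._timer is not None:
            self._timer.cancel()


def recover():
    # Hypothetical recovery action: restart the failed worker or
    # migrate its task to another cluster node.
    print("watchdog expired: triggering fail-over")


if __name__ == "__main__":
    wd = WatchdogTimer(timeout=2.0, on_expire=recover)
    wd.kick()
    for step in range(3):
        time.sleep(1.0)       # simulated useful work
        wd.kick()             # heartbeat: work is still progressing
    time.sleep(3.0)           # simulated hang: no heartbeat, watchdog fires
    wd.stop()
```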
5201 An Enhanced Slicing Algorithm Using Nearest Distance Analysis for Layer Manufacturing
Authors: M. Vatani, A. R. Rahimi, F. Brazandeh, A. Sanati nezhad
Abstract:
Although the STL (stereolithography) file format is widely used as a de facto industry standard in the rapid prototyping industry, owing to its simplicity and its ability to tessellate almost all surfaces, its usage always involves defects and shortcomings, many of which are difficult to correct manually. When processing complex models, the file size and the number of defects grow considerably, so correcting STL files becomes difficult. In this paper, by optimizing the existing algorithms, the size of the files and the memory required to process them are reduced. Regardless of the type and extent of the errors in STL files, a tail-to-head searching method and an analysis of the nearest distance between tails and heads are used. As a result, STL models are sliced rapidly, and fully closed contours are produced effectively and without errors.
Keywords: Layer manufacturing, STL files, slicing algorithm, nearest distance analysis.
5200 Identifying Interactions in a Feeding System
Authors: Jan Busch, Sebastian Schneider, Konja Knüppel, Peter Nyhuis
Abstract:
In production processes, assembly conceals considerable potential for increased efficiency in terms of lowering production costs. Due to the individualisation of customer requirements, product variants have increased in recent years. Simultaneously, the share of automated production systems has increased. A challenge is to adapt the flexibility and adaptability of automated systems to these changes. The Institute for Production Systems and Logistics developed an aerodynamic orientation system for feeding technology. When changing to other components, only four parameters must be adjusted; however, setting these parameters is time-consuming. One objective is therefore to develop an optimisation algorithm for automatic parameter configuration. Know-how regarding the interaction of the four parameters and their effect on the quantities to be optimised is required in order to develop a more efficient algorithm. This article presents an analysis of the interactions between the parameters and their influence on the quality of feeding.
Keywords: Aerodynamic feeding system, design of experiments, interactions between parameters.
5199 Reduction of Search Space by Applying Controlled Genetic Operators for Weight Constrained Shortest Path Problem
Authors: A.K.M. Khaled Ahsan Talukder, Taibun Nessa, Kaushik Roy
Abstract:
The weight constrained shortest path problem (WCSPP) is one of the best-known basic problems in combinatorial optimization. Because of its importance in many application areas, such as computer science, engineering and operations research, the WCSPP has been studied extensively. This paper concentrates on reducing the total search space for finding the WCSP using an existing Genetic Algorithm (GA). For this purpose, controlled schemes of genetic operators are applied to a list chromosome representation. This approach yields a near-optimum solution in fewer elapsed generations than the classical GA technique. From further analysis, a new generalized schema theorem is also developed from the philosophy of Holland's theorem.
Keywords: Genetic Algorithm, Evolutionary Optimization, Multi Objective Optimization, Non-linear Schema Theorem, WCSPP.
5198 Variance Based Component Analysis for Texture Segmentation
Authors: Zeinab Ghasemi, S. Amirhassan Monadjemi, Abbas Vafaei
Abstract:
This paper presents a comparative analysis of a new unsupervised PCA-based technique for steel plate texture segmentation towards defect detection. The proposed scheme, called Variance Based Component Analysis (VBCA), employs PCA for feature extraction, applies a feature reduction algorithm based on the variance of eigenpictures, and classifies pixels as defective or normal. While classic PCA uses a clusterer such as K-means for pixel clustering, VBCA employs thresholding and some post-processing operations to label pixels as defective or normal. The experimental results show that the proposed VBCA algorithm is 12.46% more accurate and 78.85% faster than classic PCA.
5197 Mobile Medical Operation Route Planning
Authors: K. Somprasonk, R. Boondiskulchok
Abstract:
Medical services are usually provided in hospitals; however, in developing countries, some rural residents have fewer opportunities to access healthcare services due to transportation limitations. Therefore, in Thailand, charitable organizations provide medical treatment to these people by shifting medical services to operation sites; this is commonly known as mobile medical service. Operation routing is important for the organization, since reducing its transportation cost lets it focus more on other important activities, for instance the development of medical apparatus. The Vehicle Routing Problem (VRP) is applied to reduce the high transportation cost of the studied organization, using the search technique of the saving algorithm to find the minimum total distance of the operation route while satisfying the available time constraints of the voluntary medical staff.
Keywords: Decision Support System, Mobile Medical Service Planning, Saving Algorithm, Vehicle Routing Problem
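As an illustration of the saving (Clarke-Wright) heuristic mentioned in the abstract, a minimal single-depot sketch in Python follows. The depot location, site coordinates, demands, and vehicle capacity are illustrative assumptions, not the organization's data, and the time-window constraints are omitted.

```python
import math
from itertools import combinations

def clarke_wright(depot, sites, demand, capacity):
    """Minimal Clarke-Wright savings heuristic for a single-depot VRP.
    `depot` and `sites` are (x, y) tuples; `demand` maps site index -> load."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    n = len(sites)
    # Start with one route per site: depot -> site -> depot.
    routes = {i: [i] for i in range(n)}
    route_of = {i: i for i in range(n)}
    load = {i: demand[i] for i in range(n)}

    # Savings s(i, j) = d(0, i) + d(0, j) - d(i, j), processed in descending order.
    savings = sorted(
        ((dist(depot, sites[i]) + dist(depot, sites[j]) - dist(sites[i], sites[j]), i, j)
         for i, j in combinations(range(n), 2)),
        reverse=True,
    )

    for s, i, j in savings:
        ri, rj = route_of[i], route_of[j]
        if ri == rj:
            continue
        # Merge only if i and j sit at mergeable route ends and capacity is respected.
        if routes[ri][-1] == i and routes[rj][0] == j and load[ri] + load[rj] <= capacity:
            routes[ri].extend(routes[rj])
            load[ri] += load[rj]
            for k in routes[rj]:
                route_of[k] = ri
            del routes[rj], load[rj]
    return list(routes.values())

# Illustrative data: five operation sites around a depot, unit demands, capacity 3.
sites = [(2, 1), (3, 4), (-1, 2), (-2, -3), (4, -2)]
print(clarke_wright((0, 0), sites, {i: 1 for i in range(5)}, capacity=3))
```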
5196 Black Box Model and Evolutionary Fuzzy Control Methods of Coupled-Tank System
Authors: S. Yaman, S. Rostami
Abstract:
In this study, a black box model of the coupled-tank system is obtained by using fuzzy sets. The derived model is tested via an adaptive neuro-fuzzy inference system (ANFIS). In order to achieve better control performance, the parameters of three different controller types, a classical proportional-integral-derivative (PID) controller, a fuzzy PID and the function tuner method, are tuned by an evolutionary computation method, the genetic algorithm. All tuned controllers are applied to the fuzzy model of the coupled-tank experimental setup and analyzed under different reference input values. According to the results, the function tuner method demonstrates better robust control performance and guarantees closed-loop stability.
Keywords: Function tuner method, fuzzy modeling, fuzzy PID controller, genetic algorithm.
5195 DHT-LMS Algorithm for Sensorineural Loss Patients
Authors: Sunitha S. L., V. Udayashankara
Abstract:
Hearing impairment is the number one chronic disability, affecting many people in the world. Background noise is particularly damaging to speech intelligibility for people with hearing loss, especially for sensorineural loss patients. Several investigations of speech intelligibility have demonstrated that sensorineural loss patients need a 5-15 dB higher SNR than normal hearing subjects. This paper describes a Discrete Hartley Transform power-normalized Least Mean Square algorithm (DHT-LMS) to improve the SNR and the convergence rate of the Least Mean Square (LMS) algorithm for sensorineural loss patients. The DHT transforms n real numbers to n real numbers and has the convenient property of being its own inverse. It can be used effectively for noise cancellation with a shorter convergence time. The simulated results show its superior characteristics, improving the SNR by at least 9 dB for an input SNR of zero dB and giving a faster convergence rate (eigenvalue ratio 12) compared to the time-domain method and DFT-LMS.
Keywords: Hearing Impairment, DHT-LMS, Convergence rate, SNR improvement.
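A minimal sketch of a transform-domain, power-normalised LMS noise canceller using the DHT is given below (Python/NumPy). The block length, step size, smoothing constant, and toy signals standing in for speech and background noise are illustrative assumptions; this is not the paper's exact algorithm.

```python
import numpy as np

def dht(x):
    """Discrete Hartley Transform via the FFT: H(x) = Re(FFT) - Im(FFT)."""
    X = np.fft.fft(x)
    return X.real - X.imag

def dht_lms(d, ref, N=32, mu=0.05, eps=1e-6):
    """Transform-domain (DHT) power-normalised LMS noise canceller.
    d   : primary input (speech + noise)
    ref : reference noise input, correlated with the noise in d
    Returns the enhanced signal e = d - estimated noise."""
    w = np.zeros(N)            # adaptive weights in the Hartley domain
    p = np.full(N, eps)        # running power estimate per transform bin
    buf = np.zeros(N)
    out = np.zeros(len(d))
    for n in range(len(d)):
        buf = np.roll(buf, 1)
        buf[0] = ref[n]
        X = dht(buf)                     # transform of the reference tap vector
        y = w @ X                        # noise estimate
        e = d[n] - y                     # enhanced sample (error signal)
        p = 0.9 * p + 0.1 * X**2         # power normalisation per bin
        w += mu * e * X / p              # normalised LMS update
        out[n] = e
    return out

# Toy demo: sinusoidal "speech" corrupted by filtered white noise.
rng = np.random.default_rng(0)
t = np.arange(4000)
speech = np.sin(2 * np.pi * 0.01 * t)
noise = rng.standard_normal(4000)
d = speech + np.convolve(noise, [0.6, 0.3, 0.1], mode="same")
enhanced = dht_lms(d, noise)
```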
5194 Identification of Wideband Sources Using Higher Order Statistics in Noisy Environment
Authors: S. Bourennane, A. Bendjama
Abstract:
This paper deals with the localization of wideband sources. We develop a new approach for estimating the wideband source parameters. This method is based on higher-order statistics of the recorded data, in order to eliminate the Gaussian components from the signals received on the various hydrophones; indeed, the sea-bottom noise is regarded as Gaussian. Thanks to the coherent signal subspace algorithm, based on the cumulant matrix of the received data instead of the cross-spectral matrix, the correlated wideband sources are located accurately even in a very noisy environment. We demonstrate the performance of the proposed algorithm on real data recorded during underwater acoustics experiments.
Keywords: Higher-order statistics, high resolution array processing techniques, localization of acoustics sources, wide band sources.
5193 1G2A IMU/GPS Integration Algorithm for Land Vehicle Navigation
Authors: O. Maklouf, Ahmed Abdulla
Abstract:
A general decline in the cost, size, and power requirements of electronics is accelerating the adoption of integrated GPS/INS technologies in consumer applications such as land vehicle navigation. Researchers have been looking for ways to eliminate additional components from product designs. One possibility is to drop one or more of the relatively expensive gyroscopes from microelectromechanical system (MEMS) versions of inertial measurement units (IMUs). For land vehicle use, the most important sensors are the vertical gyro, which senses the heading of the vehicle, and two horizontal accelerometers, which determine the velocity of the vehicle. This paper presents a simplified integration algorithm for a strapdown partial IMU (ParIMU)/GPS combination, with data post-processing for the determination of the 2-D components of position (trajectory), velocity and heading. In the present approach we neglect earth rotation and gravity variations, because of the poor gyroscope sensitivities of the low-cost IMU and because of the relatively small area of the trajectory.
Keywords: GPS, ParIMU, INS, Kalman Filter.
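As a rough illustration of the reduced strapdown mechanisation the abstract describes (a vertical gyro for heading plus accelerometers for speed), a minimal 2-D dead-reckoning sketch in Python follows. A single forward accelerometer is used here for brevity, the sample rate and toy inputs are assumptions, and the GPS/Kalman-filter correction step of the paper is omitted.

```python
import numpy as np

def dead_reckon(heading_rate, forward_accel, dt, x0=0.0, y0=0.0, heading0=0.0):
    """Minimal 2-D strapdown mechanisation for a partial IMU: integrate the
    vertical-gyro rate into heading and the forward acceleration into speed,
    then project the speed onto the heading to update position."""
    x, y, heading, speed = x0, y0, heading0, 0.0
    track = []
    for wz, af in zip(heading_rate, forward_accel):
        heading += wz * dt                    # heading from the vertical gyro
        speed += af * dt                      # speed from the forward accelerometer
        x += speed * np.cos(heading) * dt
        y += speed * np.sin(heading) * dt
        track.append((x, y, heading))
    return np.array(track)

# Toy trajectory: accelerate for 10 s, then turn gently at constant speed.
dt, n = 0.1, 600
wz = np.where(np.arange(n) * dt > 10, np.deg2rad(3.0), 0.0)   # 3 deg/s turn after 10 s
af = np.where(np.arange(n) * dt <= 10, 0.5, 0.0)              # 0.5 m/s^2 for 10 s
print(dead_reckon(wz, af, dt)[-1])
```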
5192 Using Information Theory to Observe Natural Intelligence and Artificial Intelligence
Authors: Lipeng Zhang, Limei Li, Yanming Pearl Zhang
Abstract:
This paper takes a philosophical view as its axiom and reveals the relationship between information theory and Natural Intelligence and Artificial Intelligence under real-world conditions. It also derives the relationship between natural intelligence and nature. According to the communication principle of information theory, Natural Intelligence can be divided into a real part and a virtual part. Based on the information theory principle that information does not increase, the restriction mechanism of Natural Intelligence creativity is derived. This restriction mechanism of creativity reveals the limits of natural intelligence and artificial intelligence. The paper provides a new angle from which to observe natural intelligence and artificial intelligence.
Keywords: Natural intelligence, artificial intelligence, creativity, information theory.
5191 Voice Command Recognition System Based on MFCC and VQ Algorithms
Authors: Mahdi Shaneh, Azizollah Taheri
Abstract:
The goal of this project is to design a system to recognize voice commands. Most voice recognition systems contain two main modules: "feature extraction" and "feature matching". In this project, the MFCC algorithm is used to implement the feature extraction module; using this algorithm, the cepstral coefficients are calculated on the mel frequency scale. The VQ (vector quantization) method is used to reduce the amount of data and decrease the computation time. In the feature matching stage, the Euclidean distance is applied as the similarity criterion. Because of the high accuracy of the algorithms used, the accuracy of this voice command system is high. With at least five repetitions of each command in a single training session, and two repetitions in each testing session, a zero error rate in the recognition of commands is achieved.
Keywords: MFCC, Vector quantization, Vocal tract, Voice command.
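A minimal sketch of the VQ-based matching stage is given below (Python/NumPy), assuming the MFCC feature matrices have already been extracted (for example with a speech-processing library, not shown). The codebook size, number of iterations and the random stand-in features are illustrative assumptions.

```python
import numpy as np

def train_codebook(mfcc_frames, codebook_size=16, iters=20, seed=0):
    """Vector quantisation by simple k-means: mfcc_frames is (n_frames, n_coeffs)."""
    rng = np.random.default_rng(seed)
    centroids = mfcc_frames[rng.choice(len(mfcc_frames), codebook_size, replace=False)]
    for _ in range(iters):
        # Assign every frame to its nearest centroid (Euclidean distance).
        d = np.linalg.norm(mfcc_frames[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(codebook_size):
            members = mfcc_frames[labels == k]
            if len(members):
                centroids[k] = members.mean(axis=0)
    return centroids

def distortion(mfcc_frames, codebook):
    """Average Euclidean distance of each frame to its nearest codeword."""
    d = np.linalg.norm(mfcc_frames[:, None, :] - codebook[None, :, :], axis=2)
    return d.min(axis=1).mean()

def recognise(mfcc_frames, codebooks):
    """Return the command whose codebook gives the lowest distortion."""
    return min(codebooks, key=lambda cmd: distortion(mfcc_frames, codebooks[cmd]))

# Toy demo with random stand-ins for the MFCC matrices of two commands.
rng = np.random.default_rng(1)
codebooks = {
    "start": train_codebook(rng.normal(0.0, 1.0, (200, 13))),
    "stop":  train_codebook(rng.normal(2.0, 1.0, (200, 13))),
}
test = rng.normal(1.9, 1.0, (50, 13))   # unseen utterance, closer to "stop"
print(recognise(test, codebooks))
```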
5190 Optimal Relaxation Parameters for Obtaining Efficient Iterative Methods for the Solution of Electromagnetic Scattering Problems
Authors: Nadaniela Egidi, Pierluigi Maponi
Abstract:
The approximate solution of a time-harmonic electromagnetic scattering problem for inhomogeneous media is required in several application contexts, and its two-dimensional formulation is a Fredholm integral equation of the second kind. This integral equation provides a formulation for the direct scattering problem, but has to be solved several times in the numerical solution of the corresponding inverse scattering problem. The discretization of this Fredholm equation produces large and dense linear systems that are usually solved by iterative methods. To improve the efficiency of these iterative methods, we use Symmetric SOR (SSOR) preconditioning and propose an algorithm to evaluate the associated relaxation parameter. We show the efficiency of the proposed algorithm in several numerical experiments, using two Krylov subspace methods, i.e., Bi-CGSTAB and GMRES.
Keywords: Fredholm integral equation, iterative method, preconditioning, scattering problem.
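As an illustration of SSOR preconditioning inside a Krylov solver, a minimal sketch with SciPy follows. The test matrix (a 2-D Laplacian) and the relaxation parameter omega = 1.2 are illustrative assumptions, not the paper's scattering discretisation or its optimised parameter.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, gmres, spsolve_triangular

def ssor_preconditioner(A, omega=1.2):
    """Build M^{-1} for the SSOR preconditioner
    M = (D + wL) D^{-1} (D + wU) / (w (2 - w))."""
    A = A.tocsr()
    D = sp.diags(A.diagonal())
    L = sp.tril(A, k=-1, format="csr")
    U = sp.triu(A, k=1, format="csr")
    lower = (D + omega * L).tocsr()
    upper = (D + omega * U).tocsr()
    scale = omega * (2.0 - omega)

    def apply(r):
        # Two triangular solves and a diagonal scaling implement M^{-1} r.
        y = spsolve_triangular(lower, scale * r, lower=True)
        return spsolve_triangular(upper, D @ y, lower=False)

    return LinearOperator(A.shape, matvec=apply)

# Illustrative system: 2-D Laplacian on a 40 x 40 grid.
n = 40
I = sp.identity(n)
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
A = sp.kron(I, T) + sp.kron(T, I)
b = np.ones(A.shape[0])

M = ssor_preconditioner(A, omega=1.2)
x, info = gmres(A, b, M=M, restart=30)
print("converged" if info == 0 else f"gmres info = {info}")
```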
5189 A Linearization and Decomposition Based Approach to Minimize the Non-Productive Time in Transfer Lines
Authors: Hany Osman, M. F. Baki
Abstract:
In this paper we address the balancing problem of transfer lines, seeking the optimal line balance that minimizes the non-productive time. We focus on the tool change time and the face orientation change time, both of which influence the makespan. We consider machine capacity limitations and technological constraints associated with the manufacturing process of auto cylinder heads. The problem is represented by a mixed integer programming model that aims at distributing the design features to workstations and sequencing the machining processes with minimum non-productive time. The proposed model is solved by an algorithm built on linearization schemes and the Benders decomposition approach. The experiments show the efficiency of the algorithm in reaching the exact solution of small and medium problem instances in reasonable time.
Keywords: Transfer line balancing, Benders decomposition, Linearization.
5188 ARCS for Critical Information Retrieval Development
Authors: Suttipong Boonphadung
Abstract:
The research on ARCS for critical information retrieval development aimed to (1) investigate the critical information retrieval skill of Mathematics pre-service teachers before applying the ARCS model in learning activities, (2) study and analyze the development of the critical information retrieval skill of the Mathematics pre-service teachers after utilizing the ARCS model in learning activities, and (3) evaluate the Mathematics pre-service teachers' satisfaction with using the ARCS model in learning activities as a tool to develop the critical information retrieval skill. Forty-one fourth-year Mathematics pre-service teachers who had enrolled in the subject Research for Learning Development in semester 2 of 2012 were purposively selected as the research cohort. The research tools were a self-report and an interview questionnaire that were approved for content validity and reliability (IOC = .66-1.00, α = .834). The research found that the critical information retrieval skill of the research samples before using the ARCS model in learning activities was at the normal-high level. According to the in-depth interviews and focus group, however, the pre-service teachers still lacked adequate and effective knowledge in information retrieval. The critical information retrieval skill of the research cohort after applying the ARCS model in learning activities appeared to be at a high level. The results revealed that the pre-service teachers were able to explain methods of searching, extracting and selecting information, as well as evaluating the quality of information and making effective decisions about accepting information. Moreover, the research found that the pre-service teachers showed normal-high to highest levels of satisfaction with using the ARCS model in learning activities as a tool to develop their critical information retrieval skill.
Keywords: Critical information retrieval skill, ARCS model, Satisfaction.
5187 Anticipating Action Decisions of Automated Guided Vehicle in an Autonomous Decentralized Flexible Manufacturing System
Authors: Rizauddin Ramli, Jaber Abu Qudeiri, Hidehiko Yamamoto
Abstract:
Nowadays the market for industrial companies is becoming more and more globalized and highly competitive, forcing them to shorten manufacturing system development time in order to reduce the time to market. To achieve this target, the hierarchical systems used in earlier manufacturing systems are not sufficient, because they cannot deal effectively with unexpected situations. To achieve flexibility in manufacturing systems, the concept of an Autonomous Decentralized Flexible Manufacturing System (AD-FMS) is useful. In this paper, we introduce a hypothetical-reasoning-based algorithm called the Algorithm for Future Anticipative Reasoning (AFAR), which is able to decide on a conceivable next action of an Automated Guided Vehicle (AGV) that works autonomously in the AD-FMS.
Keywords: Flexible Manufacturing System, Automated Guided Vehicle, Hypothetical Reasoning, Autonomous Decentralized.
5186 Sequential Partitioning Brainbow Image Segmentation Using Bayesian
Authors: Yayun Hsu, Henry Horng-Shing Lu
Abstract:
This paper proposes a data-driven, biology-inspired neural segmentation method for 3D Drosophila Brainbow images. We use the Bayesian Sequential Partitioning algorithm for probabilistic modeling, which can be used to detect somas and to eliminate crosstalk effects. This work attempts to develop an automatic methodology for neuron image segmentation, which still lacks a complete solution due to the complexity of the images. The proposed method does not need any predetermined, risk-prone thresholds, since biological information is inherently included in the image processing procedure. Therefore, it is less sensitive to variations in neuron morphology; meanwhile, its flexibility is beneficial for tracing the intertwining structure of neurons.
Keywords: Brainbow, 3D imaging, image segmentation, neuron morphology, biological data mining, non-parametric learning.
5185 Low Complexity Hybrid Scheme for PAPR Reduction in OFDM Systems Based on SLM and Clipping
Authors: V. Sudha, D. Sriram Kumar
Abstract:
In this paper, we present a low-complexity hybrid scheme using conventional selective mapping (C-SLM) and clipping algorithms to reduce the high peak-to-average power ratio (PAPR) of the orthogonal frequency division multiplexing (OFDM) signal. In the proposed scheme, the input data sequence (X) is divided into two sub-blocks; the clipping algorithm is then applied to the first sub-block, whereas the C-SLM algorithm is applied to the second sub-block, in order to reduce both computational complexity and PAPR. The resultant time-domain OFDM signal is obtained by combining the outputs of the two sub-blocks. The simulation results show that the proposed hybrid scheme provides a 0.45 dB PAPR reduction gain at a CCDF value of 10^-2 and a 52% reduction in computational complexity compared to the C-SLM scheme, at the expense of a slight degradation in bit error rate (BER) performance.
Keywords: CCDF, Clipping, OFDM, PAPR, SLM.
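A minimal sketch of the two building blocks, amplitude clipping and conventional SLM, applied to separate halves of one OFDM symbol, is shown below (Python/NumPy). The subcarrier count, clipping ratio, number of SLM candidates, and the simple concatenation used to combine the sub-block outputs are illustrative assumptions; the paper's exact combining step is not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

def ofdm_time(symbols):
    return np.fft.ifft(symbols)

def clip(x, ratio=1.4):
    """Amplitude clipping at `ratio` times the RMS level, phase preserved."""
    limit = ratio * np.sqrt(np.mean(np.abs(x) ** 2))
    mag = np.abs(x)
    return np.where(mag > limit, limit * x / np.maximum(mag, 1e-12), x)

def slm(symbols, num_candidates=8):
    """Conventional SLM: rotate subcarriers by random {+1, -1, +j, -j} sequences
    and keep the candidate with the lowest PAPR."""
    best = None
    for _ in range(num_candidates):
        phases = rng.choice([1, -1, 1j, -1j], size=len(symbols))
        cand = ofdm_time(symbols * phases)
        if best is None or papr_db(cand) < papr_db(best):
            best = cand
    return best

# 256-subcarrier QPSK OFDM symbol split into two sub-blocks.
N = 256
qpsk = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
first, second = qpsk[: N // 2], qpsk[N // 2 :]

combined = np.concatenate([clip(ofdm_time(first)), slm(second)])
print(f"plain PAPR {papr_db(ofdm_time(qpsk)):.2f} dB, hybrid PAPR {papr_db(combined):.2f} dB")
```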
5184 Scene Adaptive Shadow Detection Algorithm
Authors: Mohammed Ibrahim M, Anupama R.
Abstract:
Robustness is one of the primary performance criteria for an Intelligent Video Surveillance (IVS) system. One of the key factors in enhancing the robustness of dynamic video analysis is providing accurate and reliable means for shadow detection. If left undetected, shadow pixels may result in incorrect object tracking and classification, as they tend to distort localization and measurement information. Most of the algorithms proposed in the literature are computationally expensive, some to the extent of equalling the computational requirement of motion detection. In this paper, the homogeneity property of shadows is explored in a novel way for shadow detection. An adaptive division-image analysis (which highlights the homogeneity property of shadows), followed by a relatively simpler projection histogram analysis for penumbra suppression, is the key novelty of our approach.
Keywords: Homogeneity, penumbra, projection histogram, shadow correction.
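As a rough illustration of the division-image idea (shadow pixels darken the background by a roughly constant, locally homogeneous factor), a minimal sketch in Python/NumPy follows. The ratio band, variance threshold and window size are illustrative assumptions, and the projection-histogram penumbra suppression stage is omitted.

```python
import numpy as np

def shadow_mask(frame, background, low=0.4, high=0.9, var_thresh=0.005, win=5):
    """Label a pixel as shadow if the frame/background ratio lies in (low, high)
    and the ratio is locally homogeneous (low variance in a win x win window)."""
    ratio = frame.astype(float) / np.maximum(background.astype(float), 1e-6)
    in_band = (ratio > low) & (ratio < high)

    # Local variance of the ratio image (homogeneity test).
    k = np.ones((win, win)) / (win * win)
    pad = win // 2
    padded = np.pad(ratio, pad, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (win, win))
    local_mean = (windows * k).sum(axis=(-1, -2))
    local_sq = (windows ** 2 * k).sum(axis=(-1, -2))
    homogeneous = (local_sq - local_mean ** 2) < var_thresh

    return in_band & homogeneous

# Toy example: a uniformly darkened square over a flat background, as a cast shadow would be.
bg = np.full((64, 64), 200.0)
frame = bg.copy()
frame[20:40, 20:40] *= 0.6
mask = shadow_mask(frame, bg)
print(mask[30, 30], mask[5, 5])   # True inside the "shadow", False outside
```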
5183 Fuzzy Clustering Analysis in Real Estate Companies in China
Authors: Jianfeng Li, Feng Jin, Xiaoyu Yang
Abstract:
This paper applies a fuzzy clustering algorithm to classify real estate companies in China according to some general financial indexes, such as income per share, share accumulation fund, net profit margin, weighted net assets yield and shareholders' equity. By constructing and normalizing the initial partition matrix, obtaining the fuzzy similarity matrix with the Minkowski metric, and computing its transitive closure, the dynamic fuzzy clustering analysis for real estate companies shows clearly that the clustering results change gradually as the threshold decreases, and that they exhibit a relationship similar to that of the companies' prices in the stock market. In this way, the approach is valuable for comparing the financial condition of real estate companies in order to identify good investment opportunities.
Keywords: Fuzzy clustering algorithm, data mining, real estate company, financial analysis.
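A minimal sketch of the procedure described above, normalised data, a Minkowski-distance-based fuzzy similarity matrix, its max-min transitive closure, and a lambda-cut into clusters, is given below in Python/NumPy. The toy financial-index matrix, the similarity definition and the threshold are illustrative assumptions.

```python
import numpy as np

def fuzzy_similarity(X, p=2):
    """Fuzzy similarity matrix from a normalised data matrix,
    r_ij = 1 - d_ij / max(d), with a Minkowski distance of order p."""
    n = len(X)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            d[i, j] = np.sum(np.abs(X[i] - X[j]) ** p) ** (1 / p)
    return 1.0 - d / (d.max() + 1e-12)

def transitive_closure(R):
    """Max-min composition until R o R == R (fuzzy equivalence matrix)."""
    while True:
        S = np.max(np.minimum(R[:, :, None], R[None, :, :]), axis=1)
        if np.allclose(S, R):
            return R
        R = S

def lambda_cut(R, lam):
    """Cluster by the lambda-cut of the fuzzy equivalence matrix."""
    labels, cluster = [-1] * len(R), 0
    for i in range(len(R)):
        if labels[i] == -1:
            for m in np.where(R[i] >= lam)[0]:
                labels[m] = cluster
            cluster += 1
    return labels

# Illustrative data: five companies described by three normalised financial indexes.
X = np.array([[0.90, 0.80, 0.70], [0.85, 0.75, 0.72], [0.20, 0.30, 0.10],
              [0.25, 0.28, 0.15], [0.55, 0.50, 0.45]])
R = transitive_closure(fuzzy_similarity(X))
print(lambda_cut(R, lam=0.8))
```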
5182 Clustering of Variables Based On a Probabilistic Approach Defined on the Hypersphere
Authors: Paulo Gomes, Adelaide Figueiredo
Abstract:
We consider n individuals described by p standardized variables, represented by points on the surface of the unit hypersphere S^(n-1). For a previous choice of n individuals, we suppose that the set of observable variables comes from a mixture of bipolar Watson distributions defined on the hypersphere. The EM and Dynamic Clusters algorithms are used for the identification of such a mixture. We obtain parameter estimates for each Watson component and then a partition of the set of variables into homogeneous groups of variables. Additionally, we present a factor analysis model in which the unobservable factors are simply the maximum likelihood estimators of the Watson directional parameters, namely the first principal component of the data matrix associated with each previously identified group. Such an alternative model yields directly interpretable solutions (simple structure), avoiding factor rotations.
Keywords: Dynamic Clusters algorithm, EM algorithm, Factor analysis model, Hierarchical Clustering, Watson distribution.
5181 Alternative Convergence Analysis for a Kind of Singularly Perturbed Boundary Value Problems
Authors: Jiming Yang
Abstract:
A kind of singularly perturbed boundary value problem is under consideration. In order to obtain its approximation, a simple upwind difference discretization is applied. We use a moving mesh iterative algorithm based on equidistributing the arc-length function of the currently computed piecewise linear solution. First, a maximum-norm a posteriori error estimate on an arbitrary mesh is derived using a method different from the one carried out by Chen [Advances in Computational Mathematics, 24(1-4) (2006), 197-212]. Then, based on the properties of the discrete Green's function and the presented a posteriori error estimate, we theoretically prove that the discrete solutions computed by the algorithm are first-order uniformly convergent with respect to the perturbation parameter ε.
Keywords: Convergence analysis, Green's function, singularly perturbed, equi-distribution, moving mesh.
5180 Using Dempster-Shafer Theory in XML Information Retrieval
Authors: F. Raja, M. Rahgozar, F. Oroumchian
Abstract:
XML is a markup language which is becoming the standard format for information representation and data exchange. A major purpose of XML is the explicit representation of the logical structure of a document. Much research has been performed to exploit the logical structure of documents in information retrieval, in order to precisely extract the user's information need from large collections of XML documents. In this paper, we describe an XML information retrieval weighting scheme that tries to find the most relevant elements in XML documents in response to a user query. We present this weighting model for information retrieval systems that utilize plausible inferences to infer the relevance of elements in XML documents. We also add to this model the Dempster-Shafer theory of evidence, to express the uncertainty in plausible inferences, and the Dempster-Shafer rule of combination, to combine evidence derived from different inferences.
Keywords: Dempster-Shafer theory, plausible inferences, XML information retrieval.
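As an illustration of the Dempster-Shafer rule of combination mentioned above, a minimal sketch in Python follows. The frame of discernment (relevant / not relevant for an XML element) and the mass values are illustrative assumptions, not the paper's actual evidence model.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.
    Each mass function maps a frozenset of hypotheses to a mass in [0, 1]."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    k = 1.0 / (1.0 - conflict)                 # normalisation factor
    return {s: k * v for s, v in combined.items()}

# Two pieces of evidence about the relevance of an XML element.
REL, NOT = frozenset({"relevant"}), frozenset({"not_relevant"})
THETA = REL | NOT                               # full frame of discernment
m1 = {REL: 0.6, THETA: 0.4}                     # evidence from one plausible inference
m2 = {REL: 0.5, NOT: 0.2, THETA: 0.3}           # evidence from another inference
print(dempster_combine(m1, m2))
```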
5179 Online Optic Disk Segmentation Using Fractals
Authors: Srinivasan Aruchamy, Partha Bhattacharjee, Goutam Sanyal
Abstract:
Optic disk segmentation plays a key role in the mass screening of individuals for diabetic retinopathy and glaucoma. An efficient hardware-based algorithm for optic disk localization and segmentation would aid in developing an automated retinal image analysis system for real-time applications. Herein, a pixel-intensity-based fractal analysis algorithm, implemented on a TMS320C6416 DSK DSP board, for automatic localization and segmentation of the optic disk is reported. The experiments were performed on color and fluorescein angiography retinal fundus images. Initially, the images were pre-processed to reduce noise and enhance quality. The retinal vascular tree of the image was then extracted using the Canny edge detection technique. Finally, pixel-intensity-based fractal analysis is performed to segment the optic disk by tracing the origin of the vascular tree. The proposed method is examined on three publicly available retinal image data sets and also on a data set obtained from an eye clinic. The average accuracy achieved is 96.2%. To the best of our knowledge, this is the first work reporting the use of a TMS320C6416 DSK DSP board and a pixel-intensity-based fractal analysis algorithm for automatic localization and segmentation of the optic disk. This will pave the way for developing devices for the detection of retinal diseases in the future.
Keywords: Color retinal fundus images, Diabetic retinopathy, Fluorescein angiography retinal fundus images, Fractal analysis.
5178 Quantum Enhanced Correlation Matrix Memories via States Orthogonalisation
Authors: Mario Mastriani, Marcelo Naiouf
Abstract:
This paper introduces a Quantum Correlation Matrix Memory (QCMM) and an Enhanced QCMM (EQCMM), which are useful for working with quantum memories. A version of the classical Gram-Schmidt orthogonalisation process in Dirac notation (called the Quantum Orthogonalisation Process: QOP) is presented to convert a non-orthonormal quantum basis, i.e., a set of non-orthonormal quantum vectors (called qudits), into an orthonormal quantum basis, i.e., a set of orthonormal quantum qudits. This work shows that it is possible to improve the performance of the QCMM thanks to the QOP algorithm. Besides, the EQCMM algorithm has many additional fields of application, e.g., steganography, as a replacement for Hopfield networks, bilevel image processing, etc. Finally, it is important to mention that the EQCMM is extremely easy to implement in any firmware.
Keywords: Quantum Algebra, correlation matrix memory, Dirac notation, orthogonalisation.
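As an illustration of the classical Gram-Schmidt process underlying the QOP described above, a minimal sketch in Python/NumPy follows. The three toy "qudit" state vectors are illustrative assumptions, and the Dirac-notation formulation of the paper is not reproduced.

```python
import numpy as np

def orthonormalise(vectors, tol=1e-12):
    """Classical Gram-Schmidt: turn a set of linearly independent (generally
    non-orthonormal) state vectors into an orthonormal basis."""
    basis = []
    for v in vectors:
        w = v.astype(complex)
        for u in basis:
            w = w - np.vdot(u, w) * u     # subtract the projection onto u
        norm = np.linalg.norm(w)
        if norm > tol:                    # skip (numerically) dependent vectors
            basis.append(w / norm)
    return np.array(basis)

# Three non-orthonormal 3-level "qudit" states written in the computational basis.
states = np.array([[1, 1, 0],
                   [1, 0, 1],
                   [0, 1, 1]], dtype=complex)
Q = orthonormalise(states)
print(np.round(Q @ Q.conj().T, 10))       # identity => the new basis is orthonormal
```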
5177 Frequent and Systematic Timing Enhancement of Congestion Window in Typical Transmission Control Protocol
Authors: Ghassan A. Abed, Akbal O. Salman, Bayan M. Sabbar
Abstract:
The Transmission Control Protocol (TCP), in both wired and wireless networks, still has a practical problem: the congestion control mechanism does not permit the data stream to obtain the complete bandwidth of the existing network links. To solve this problem, many TCP variants with high-speed performance have been introduced. Therefore, an enhanced congestion window (cwnd) for the congestion control mechanism is proposed in this article to improve the performance of TCP by increasing the number of cycles of the new window and thereby the number of transmitted packets. The proposed algorithm uses a new mechanism based on the available bandwidth of the connection to detect the capacity of the network path, in order to improve the regular clocking of the congestion avoidance mechanism. The work in this paper is based on simulating the proposed algorithm with Network Simulator 2 (NS-2).
Keywords: TCP, cwnd, Congestion Control, NS-2.
5176 Application of Artificial Intelligence to Schedule Operability of Waterfront Facilities in Macro Tide Dominated Wide Estuarine Harbour
Authors: A. Basu, A. A. Purohit, M. M. Vaidya, M. D. Kudale
Abstract:
Mumbai, traditionally the epicenter of India's trade and commerce, and its existing major ports, the Mumbai and Jawaharlal Nehru (JN) Ports situated in the Thane estuary, are developing their waterfront facilities. Various developments over the past decades in this region have changed the tidal flux entering and leaving the estuary. The intake at Pir-Pau faces a shortage of water in view of the advancement of the shoreline, while the jetty near Ulwe faces ship scheduling problems due to the shallower depths between JN Port and Ulwe Bunder. In order to solve these problems, it is essential to have information about tide levels over a long duration from field measurements. However, field measurement is a tedious and costly affair, so artificial intelligence was applied to predict water levels by training a network on the tide data measured over one lunar tidal cycle. A two-layered feed-forward Artificial Neural Network (ANN) with back-propagation training algorithms, namely Gradient Descent (GD) and Levenberg-Marquardt (LM), was used to predict the yearly tide levels at the waterfront structures at Ulwe Bunder and Pir-Pau. The tide data collected at Apollo Bunder, Ulwe and Vashi over a lunar tidal cycle (2013) were used to train, validate and test the neural networks. These trained networks, having high correlation coefficients (R = 0.998), were used to predict the tide at Ulwe and Vashi for verification against the measured tide for the years 2000 and 2013. The results indicate that the tide levels predicted by the ANN give a reasonably accurate estimate of the tide. Hence, the trained network is used to predict the yearly tide data (2015) for Ulwe. Subsequently, the yearly tide data (2015) at Pir-Pau were predicted using the neural network trained with the measured tide data (2000) of Apollo and Pir-Pau. The analysis of the measured data and the study reveal the following. The measured tidal data at Pir-Pau, Vashi and Ulwe indicate a maximum amplification of the tide of about 10-20 cm, with a phase lag of 10-20 minutes, with reference to the tide at Apollo Bunder (Mumbai). The LM training algorithm is faster than GD, and the performance of the network increases with the number of neurons in the hidden layer. The tide levels predicted by the ANN at Pir-Pau and Ulwe provide valuable information about the occurrence of high and low water levels, allowing the operation of pumping at Pir-Pau to be planned and the ship schedule at Ulwe to be improved.
Keywords: Artificial neural network, back-propagation, tide data, training algorithm.
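As a rough illustration of training a small feed-forward network to map a reference station's tide levels to a target station, a minimal sketch with scikit-learn follows. The synthetic tidal signal, the lag features and the LBFGS solver (used here because scikit-learn does not provide Levenberg-Marquardt training) are all illustrative assumptions, not the paper's data or setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for a lunar tidal cycle: tide at a reference station
# and at a target station with a small amplification and phase lag,
# mimicking the relationship described in the abstract.
t = np.arange(0, 30 * 24, 0.5)                       # half-hourly samples, 30 days
reference = 2.0 * np.sin(2 * np.pi * t / 12.42) + 0.5 * np.sin(2 * np.pi * t / 24.0)
target = 1.1 * np.roll(reference, 1) + 0.1           # ~amplified, lagged tide

# Inputs: current and two previous reference levels; output: target level.
X = np.column_stack([reference, np.roll(reference, 1), np.roll(reference, 2)])
X, y = X[2:], target[2:]

split = int(0.8 * len(X))
net = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs", max_iter=2000, random_state=0)
net.fit(X[:split], y[:split])
print("test R^2:", round(net.score(X[split:], y[split:]), 3))
```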
5175 Information Literacy among Faculty and Students of Medical Colleges of Haryana, Punjab and Chandigarh
Authors: Sanjeev Sharma, Suman Lata
Abstract:
With the availability of diverse printed and electronic literature and web sites on medical and health-related information, it is impossible for the medical professional to get the information he or she seeks in the shortest possible time. For all these problems, information literacy is the only solution; thus, information literacy is recognized as an important aspect of medical education. In the present study, an attempt has been made to assess the information literacy skills of the faculty and students at medical colleges of Haryana, Punjab and Chandigarh. The scope of the study was confined to 12 selected medical colleges of the three states (Haryana, Punjab, and Chandigarh). The findings of the study were based on data collected through 1018 questionnaires filled in by respondents at the medical colleges. It was found that online medical websites (such as WebMD, eMedicine and Mayo Clinic) were frequently used by 63.43% of the respondents of Chandigarh, which is slightly more than Haryana (61%) and Punjab (55.65%). As well, 30.86% of the respondents of Chandigarh, 27.41% of Haryana and 27.05% of Punjab were familiar with controlled vocabulary tools; 25.14% of the respondents of Chandigarh, 23.80% of Punjab and 23.17% of Haryana were familiar with Boolean operators; 33.05% of the respondents of Punjab, 28.19% of Haryana and 25.14% of Chandigarh were familiar with the use and importance of keywords while searching an electronic database; and 51.43% of the respondents of Chandigarh, 44.52% of Punjab and 36.29% of Haryana were able to make effective use of the retrieved information. For accessing information in electronic format, 47.74% of the respondents rated their skills high, while the majority of respondents (76.13%) were unfamiliar with the basic search technique, i.e., Boolean operators, used for searching information in online databases. On the basis of the findings, it was suggested that comprehensive training programs based on medical professionals' information needs should be organized frequently. Furthermore, it was also suggested that information literacy be included as a subject in the health science curriculum, so as to make medical professionals information literate and independent lifelong learners.
Keywords: Information, information literacy, medical colleges, medical professionals.
5174 SOA Embedded in BPM: A High Level View of Object Oriented Paradigm
Authors: Imran S.Bajwa
Abstract:
The design and development of information systems have undergone a variety of phases and stages. These variations have evolved due to rapid changes in user requirements and business needs. To meet these requirements and needs, a flexible and agile business solution was required that could keep up with the latest business trends and styles. Another obstacle to the agility of information systems was that two halves of the same problem, business processes and information services, were typically treated differently. After the emergence of information technology, business processes and information systems have become counterparts, yet these two business halves have been treated under totally different standards. There is a need to streamline the boundaries of both pillars, which equally share the information system's burdens and liabilities. In the last decade, object orientation has evolved into one of the major solutions for modern business needs, and now SOA is the solution for shifting business onto an electronic platform. BPM is another modern business solution that helps regularize the optimization of business processes. This paper discusses how object orientation can be used to incorporate or embed SOA in BPM for improved information systems.
Keywords: Object Oriented Business Solutions, Services for Business Processes, Mixing SOA and BPM.
5173 Person Re-Identification Using Siamese Convolutional Neural Network
Authors: Sello Mokwena, Monyepao Thabang
Abstract:
In this study, we propose a comprehensive approach to address the challenges in person re-identification models. By combining a centroid tracking algorithm with a Siamese convolutional neural network model, our method excels in detecting, tracking, and capturing robust person features across non-overlapping camera views. The algorithm efficiently identifies individuals in the camera network, while the neural network extracts fine-grained global features for precise cross-image comparisons. The approach's effectiveness is further accentuated by leveraging the camera network topology for guidance. Our empirical analysis of benchmark datasets highlights its competitive performance, particularly evident when background subtraction techniques are selectively applied, underscoring its potential in advancing person re-identification techniques.
Keywords: Camera network, convolutional neural network topology, person tracking, person re-identification, Siamese.
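As a rough illustration of a Siamese embedding branch with a contrastive objective, the kind of model the abstract describes, a minimal PyTorch sketch follows. The network architecture, embedding size, margin and random input crops are illustrative assumptions rather than the authors' model, and the centroid tracking stage is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingNet(nn.Module):
    """Small convolutional branch shared by both inputs of the Siamese model."""
    def __init__(self, embed_dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, embed_dim)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return F.normalize(self.fc(x), dim=1)     # L2-normalised global feature

def contrastive_loss(za, zb, same, margin=1.0):
    """Pull embeddings of the same person together, push different identities apart."""
    d = F.pairwise_distance(za, zb)
    return (same * d.pow(2) + (1 - same) * F.relu(margin - d).pow(2)).mean()

# Toy forward/backward pass on random crops standing in for person detections.
net = EmbeddingNet()
a, b = torch.randn(8, 3, 128, 64), torch.randn(8, 3, 128, 64)
labels = torch.randint(0, 2, (8,)).float()        # 1 = same identity, 0 = different
loss = contrastive_loss(net(a), net(b), labels)
loss.backward()
print(float(loss))
```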