Search results for: pure and hybrid automatic repeat request (ARQ).
1214 Adaptive Pulse Coupled Neural Network Parameters for Image Segmentation
Authors: Thejaswi H. Raya, Vineetha Bettaiah, Heggere S. Ranganath
Abstract:
For over a decade, Pulse Coupled Neural Network (PCNN) based algorithms have been successfully used in image interpretation applications, including image segmentation. There are several versions of PCNN-based image segmentation methods, and the segmentation accuracy of all of them is very sensitive to the values of the network parameters. Most methods treat PCNN parameters such as the linking coefficient and the primary firing threshold as global parameters and determine them by trial and error. The automatic determination of appropriate values for the linking coefficient and the primary firing threshold is a challenging problem and deserves further research. This paper presents a method for obtaining global as well as local values of the linking coefficient and the primary firing threshold for neurons directly from the image statistics. Extensive simulation results show that the proposed approach achieves excellent segmentation accuracy, comparable to the best accuracy obtainable by trial and error, for a variety of images.
Keywords: Automatic Selection of PCNN Parameters, Image Segmentation, Neural Networks, Pulse Coupled Neural Network.
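The abstract does not give the exact statistics-to-parameter mapping, so the sketch below is only a minimal illustration of the idea: a simplified PCNN iteration in Python whose linking coefficient `beta` and initial threshold `theta` are derived from the image mean and standard deviation (assumed formulas); the linking kernel, gain and decay values are also assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def pcnn_segment(img, iterations=10):
    """Simplified PCNN labelling sketch; pixels are grouped by the iteration at which they first fire."""
    s = img.astype(float) / (img.max() + 1e-9)        # normalised stimulus (feeding input)
    beta = s.std() / (s.mean() + 1e-9)                # linking coefficient from image statistics (assumption)
    theta = np.full(s.shape, s.mean() + s.std())      # primary firing threshold (assumption)
    kernel = np.array([[0.5, 1.0, 0.5],
                       [1.0, 0.0, 1.0],
                       [0.5, 1.0, 0.5]])              # linking weights to the 8-neighbourhood
    v_theta, alpha_theta = 20.0, 0.2                  # threshold gain and decay (assumed values)
    y = np.zeros_like(s)
    labels = np.zeros_like(s)
    for n in range(1, iterations + 1):
        link = convolve(y, kernel, mode="constant")   # linking input from neighbouring pulses
        u = s * (1.0 + beta * link)                   # internal activity
        y = (u > theta).astype(float)                 # pulse output
        labels[(y == 1) & (labels == 0)] = n          # record the first firing epoch
        theta = np.exp(-alpha_theta) * theta + v_theta * y
    return labels

# Example: segments = pcnn_segment(np.random.rand(64, 64))
```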
1213 An Overview of Islanding Detection Methods in Photovoltaic Systems
Authors: Wei Yee Teoh, Chee Wei Tan
Abstract:
The issue of unintentional islanding in PV grid interconnection remains a challenge in grid-connected photovoltaic (PV) systems. This paper gives an overview of popular anti-islanding detection methods practically applied in grid-connected PV systems. Anti-islanding methods can generally be classified into four major groups: passive methods, active methods, hybrid methods and communication-based methods. Active methods have been the preferred detection technique over the years due to their very small non-detection zone (NDZ) in small-scale distributed generation. Passive methods are comparatively simpler than active methods in terms of circuitry and operation; however, they suffer from a large NDZ that significantly reduces their performance. Communication-based methods inherit the advantages of active and passive methods with reduced drawbacks. The hybrid method, which evolved from the combination of active and passive methods, has been proven by many researchers to achieve accurate anti-islanding detection. For each of the studied anti-islanding methods, the operation is analysed, and the advantages and disadvantages are compared and discussed. It is difficult to pinpoint a generic method for a specific application, because most of the methods discussed are governed by the nature of the application and system-dependent elements. This study concludes that setup and operation cost is the vital factor in anti-islanding method selection in order to achieve a minimal compromise between cost and system quality.
Keywords: Active method, hybrid method, islanding detection, passive method, photovoltaic (PV), utility method.
1212 On the Analysis of Bandwidth Management for Hybrid Load Balancing Scheme in WLANs
Authors: Chutima Prommak, Airisa Jantaweetip
Abstract:
In wireless networks, bandwidth is a scarce resource and it is essential to utilize it effectively. This paper analyses the effects of using different bandwidth management techniques on the performance of Wireless Local Area Networks (WLANs) that use a hybrid load balancing scheme. In particular, we study three bandwidth management schemes, namely Complete Sharing (CS), Complete Partitioning (CP), and Partial Sharing (PS). The performance of these schemes is evaluated by simulation experiments in terms of the percentage of network association blocking. Our results show that the CS scheme can provide a relatively low blocking percentage in various network traffic scenarios, whereas the PS scheme can enhance the quality of service of the multimedia traffic at a rather small expense in the blocking percentage of the best-effort traffic.
Keywords: Bandwidth management, Load Balancing, WLANs.
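The abstract does not describe the simulation setup, so the following is only a toy sketch of how association blocking can be compared under Complete Sharing and Complete Partitioning; the arrival probabilities, holding times and channel counts are all assumed values, and Partial Sharing is omitted for brevity.

```python
import random

def simulate(scheme, n_channels=10, partition=(5, 5), steps=20000,
             p_arrival=(0.30, 0.25), mean_hold=8, seed=1):
    """Toy blocking simulation for two traffic classes (e.g. multimedia vs. best effort)."""
    rng = random.Random(seed)
    in_service = []                                   # list of (class, remaining holding time)
    blocked, offered = [0, 0], [0, 0]
    for _ in range(steps):
        in_service = [(c, t - 1) for c, t in in_service if t > 1]   # departures
        for cls in (0, 1):
            if rng.random() < p_arrival[cls]:
                offered[cls] += 1
                if scheme == "CS":                    # Complete Sharing: any free channel will do
                    admit = len(in_service) < n_channels
                else:                                 # Complete Partitioning: per-class channel pools
                    admit = sum(1 for c, _ in in_service if c == cls) < partition[cls]
                if admit:
                    in_service.append((cls, rng.randint(1, 2 * mean_hold)))
                else:
                    blocked[cls] += 1
    return [round(100.0 * b / max(o, 1), 2) for b, o in zip(blocked, offered)]

print("CS blocking % per class:", simulate("CS"))
print("CP blocking % per class:", simulate("CP"))
```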
1211 A Family of Zero Stable Block Integrator for the Solutions of Ordinary Differential Equations
Authors: A. M. Sagir
Abstract:
In this paper, a linear multistep technique using power series as the basis function is used to develop block methods suitable for generating direct solutions of special second-order ordinary differential equations with associated initial or boundary conditions. The continuous hybrid formulations enable us to differentiate and evaluate at some grid and off-grid points to obtain four different discrete schemes, each of order (5,5,5,5)^T, which are used in block form for parallel or sequential solution of the problems. The computational burden and computer time wastage involved in the usual reduction of a second-order problem into a system of first-order equations are avoided by this approach. Furthermore, a stability analysis is carried out and the efficiency of the block methods is tested on linear and non-linear ordinary differential equations; the results obtained compare favorably with the exact solutions.
Keywords: Block Method, Hybrid, Linear Multistep Method, Self-starting, Special Second Order.
1210 Perturbed-Chain Statistical Association Fluid Theory (PC-SAFT) Parameters for Propane, Ethylene, and Hydrogen under Supercritical Conditions
Authors: Ilke Senol
Abstract:
The Perturbed-Chain Statistical Association Fluid Theory (PC-SAFT) equation of state (EOS) is a modified SAFT EOS with three pure-component-specific parameters: segment number (m), diameter (σ) and energy (ε). These PC-SAFT parameters need to be determined for each component under the conditions of interest by fitting experimental data, such as vapor pressure, density or heat capacity. PC-SAFT parameters for propane, ethylene and hydrogen in the supercritical region were successfully estimated by fitting experimental density data available in the literature. The regressed PC-SAFT parameters were compared with the literature values by estimating pure-component density and calculating the average absolute deviation between the estimated and experimental density values. The PC-SAFT parameters available in the literature, especially for ethylene and hydrogen, estimated density in the supercritical region reasonably well; however, the regressed parameters performed better in the supercritical region than the parameters from the literature.
Keywords: Equation of state, perturbed-chain, PC-SAFT, supercritical.
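The abstract does not spell out the deviation metric; a commonly used definition of the average absolute deviation (AAD) between calculated and experimental densities, which the comparison presumably follows, is

\[
\mathrm{AAD}\,(\%) = \frac{100}{N}\sum_{i=1}^{N}\left|\frac{\rho_i^{\mathrm{calc}} - \rho_i^{\mathrm{exp}}}{\rho_i^{\mathrm{exp}}}\right|,
\]

where N is the number of experimental data points.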
1209 Automatic Translation of Ada-ECATNet Using Rewriting Logic
Authors: N. Boudiaf
Abstract:
One major difficulty facing developers of concurrent and distributed software is the analysis of concurrency-related faults such as deadlocks. Petri nets are used extensively in the verification of the correctness of concurrent programs. ECATNets are a category of algebraic Petri nets based on a sound combination of algebraic abstract data types and high-level Petri nets. ECATNets have 'sound' and 'complete' semantics because of their integration in rewriting logic and its programming language Maude. Rewriting logic is considered one of the most powerful logics in terms of description, verification and programming of concurrent systems. We previously proposed a method for translating Ada-95 tasking programs into the ECATNets formalism (Ada-ECATNet) and showed that the ECATNet formalism provides a more compact translation of Ada programs compared to other approaches based on simple Petri nets or colored Petri nets. We also showed previously how the ECATNet formalism offers Ada many validation and verification tools, such as simulation, model checking, reachability analysis and static analysis. In this paper, we describe the implementation of our translation of Ada programs into ECATNets.
Keywords: Ada tasking, Analysis, Automatic Translation, ECATNets, Maude, Rewriting Logic.
1208 Semi-Automatic Analyzer to Detect Authorial Intentions in Scientific Documents
Authors: Kanso Hassan, Elhore Ali, Soule-dupuy Chantal, Tazi Said
Abstract:
Information retrieval aims at studying models and building systems that allow a user to find the documents relevant to his or her information need. Information search remains a difficult problem because of the difficulty of representing and processing natural language, for example its polysemy. Intentional structures promise to be a new paradigm to extend existing document structures and to enhance the different phases of document processing, such as creation, editing, search and retrieval. Recognizing the intentions of the authors of texts can reduce the scale of this problem. In this article, we present an intention recognition system based on a semi-automatic method for extracting intentional information from a corpus of texts. This system is also able to update the ontology of intentions to enrich the knowledge base containing all possible intentions of a domain. The approach relies on the construction of a semi-formal ontology, which is considered as the conceptualization of the intentional information contained in a text. Experiments on scientific publications in the field of computer science were conducted to validate this approach.
Keywords: Information retrieval, text analysis, intentional structure, segmentation, ontology, natural language processing.
1207 Effect of Pack Aluminising Conditions on βNiAl Coatings
Authors: A. D. Chandio, P. Xiao
Abstract:
In this study, nickel aluminide coatings were deposited onto CMSX-4 single crystal superalloy and pure Ni substrates using an in-situ chemical vapour deposition (CVD) technique. The microstructural evolution and coating thickness (CT) were studied as the processing conditions, i.e. time and temperature, were varied. The results demonstrated that, under identical conditions, the coating formed on pure Ni contains no substrate entrapments and has a lower CT than the one deposited on the CMSX-4 counterpart. In addition, the interdiffusion zone (IDZ) of the Ni substrate is γ’-Ni3Al, whereas that of the CMSX-4 alloy is the βNiAl phase. The higher CT on the CMSX-4 superalloy is attributed to the presence of the γ-Ni/γ’-Ni3Al structure, which already contains ~15 at.% Al before deposition. The two main deposition parameters (time and temperature) were also studied in addition to the comparison of substrate effects. The coating formation time was found to have a profound effect on CT, whilst temperature was found to change the coating activities. In addition, the CT showed a linear trend from 800 to 1000 °C; thereafter, a reduction was observed. This was attributed to the change in coating activities.
Keywords: βNiAl, in-situ CVD, CT, CMSX-4, Ni, microstructure.
1206 Improving Performance of World Wide Web by Adaptive Web Traffic Reduction
Authors: Achuthsankar S. Nair, J. S. Jayasudha
Abstract:
The ever-increasing use of the World Wide Web results in poor performance of the existing network. Several techniques have been developed for reducing web traffic, such as compressing the size of files, saving web pages at the client side, and smoothing the bursty nature of traffic into a constant rate. No single method is adequate for accessing documents instantly through the Internet. In this paper, adaptive hybrid algorithms are developed for reducing web traffic. Intelligent agents are used for monitoring the web traffic. Depending upon the bandwidth usage, users' preferences, and server and browser capabilities, the intelligent agents use the best techniques to achieve maximum traffic reduction. Web caching, compression, filtering, optimization of HTML tags, and traffic dispersion are incorporated into this adaptive selection. Using this new hybrid technique, latency is reduced by 20–60% and the cache hit ratio is increased by 40–82%.
Keywords: Bandwidth, Congestion, Intelligent Agents, Prefetching, Web Caching.
1205 Physical and Rheological Properties of Asphalt Modified with Cellulose Date Palm Fibers
Authors: Howaidi M. Al-Otaibi, Abdulrahman S. Al-Suhaibani, Hamad A. Alsoliman
Abstract:
Fibers have been extensively used in civil engineering applications for many years. In this study, empty fruit bunches of date palm trees were used to produce cellulose fibers that were used as additives in the asphalt binder. Two sizes (coarse and fine) of cellulose fibers were pre-blended in a PG64-22 binder at contents of 1.5%, 3%, 4.5%, 6%, and 7.5% by weight of asphalt binder. The physical and rheological properties of the fiber-modified asphalt binders were evaluated using conventional tests, such as penetration, softening point and viscosity, and SHRP tests, such as the dynamic shear rheometer. The results indicated that the fiber-modified asphalt binders were higher in softening point, viscosity, and complex shear modulus, and lower in penetration, compared to pure asphalt. The fiber-modified binders showed an improvement in rheological properties, since it was possible to raise the PG of the control binder (pure asphalt) from 64 to 70 by adding 6% (by weight) of either fine or coarse fibers. Such an improvement in the stiffness of the fiber-modified binder is expected to improve pavement resistance to rutting.
Keywords: Cellulose date palm fiber, fiber modified asphalt, physical properties, rheological properties.
1204 Traffic Density Measurement by Automatic Detection of Vehicles Using Gradient Vectors from Aerial Images
Authors: Saman Ghaffarian, Ilgın Gökasar
Abstract:
This paper presents a new automatic vehicle detection method for measuring traffic density from very high resolution aerial images. The proposed method starts by extracting road regions from the image using road vector data. Then, the road image is divided into equal sections, considering the resolution of the images. Gradient vectors of the road image are computed from the edge map of the corresponding image. Within each section, the gradient vectors are divided into groups where the vectors significantly change their directions. Finally, the number of vehicles in each section is obtained by calculating the standard deviation of the gradient vectors in each group and accepting a group as a vehicle if its standard deviation is above a predefined threshold value. The proposed method was tested on four very high resolution aerial images acquired over Istanbul, Turkey, which show roads and vehicles with diverse characteristics. The results show the reliability of the proposed method in detecting vehicles, with an overall F1 score of 86%.
Keywords: Aerial images, intelligent transportation systems, traffic density measurement, vehicle detection.
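The exact grouping and threshold rules are not given in the abstract, so the sketch below only illustrates the underlying idea in Python: split a road strip into sections, measure the spread of gradient directions in each section, and flag sections whose spread exceeds an assumed threshold as containing a vehicle. Section count, the circular-spread measure and the threshold are illustrative assumptions.

```python
import numpy as np

def count_vehicle_candidates(road_img, n_sections=8, std_threshold=0.6):
    """Toy sketch: flag road sections whose gradient-direction spread exceeds a threshold."""
    gy, gx = np.gradient(road_img.astype(float))
    direction = np.arctan2(gy, gx)                 # gradient direction per pixel
    magnitude = np.hypot(gx, gy)
    sections = np.array_split(np.arange(road_img.shape[1]), n_sections)  # split along the road
    count = 0
    for cols in sections:
        d = direction[:, cols][magnitude[:, cols] > magnitude.mean()]    # keep strong-edge pixels
        if d.size == 0:
            continue
        # circular standard deviation of the directions in this section
        r = np.hypot(np.mean(np.cos(d)), np.mean(np.sin(d)))
        circ_std = np.sqrt(-2.0 * np.log(max(r, 1e-9)))
        if circ_std > std_threshold:
            count += 1                              # section likely contains a vehicle
    return count

# Example: count_vehicle_candidates(np.random.rand(40, 400))
```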
1203 Simulation Data Management Approach for Developing Adaptronic Systems – The W-Model Methodology
Authors: Roland S. Nattermann, Reiner Anderl
Abstract:
Existing process models for the development of mechatronic systems prescribe largely parallel work in the detailed development phase, carried out largely independently in the various disciplines involved. The approach presented here further develops these existing models for the development of adaptronic systems. It is based on an intermediate integration and an abstract modeling of the adaptronic system. Based on this system model, a simulation of the global system behavior due to external and internal factors or forces is developed. For the intermediate integration, a special data management system is used. According to the presented approach, this data management system has a number of functions that are not part of the "normal" PDM functionality. Therefore, a concept for a new data management system for the development of adaptronic systems is presented in this paper. This concept divides the functions into six layers. In the first layer, a system model is created, which decomposes the adaptronic system according to its components and the various technical disciplines involved. Moreover, the parameters and properties of the system are modeled and linked together with the requirements and the system model. The modeled parameters and properties result in a network, which is analyzed in the second layer. From this analysis, necessary adjustments to individual components for specific manipulation of the system behavior can be determined. The third layer contains an automatic abstract simulation of the system behavior. This simulation is a precursor to the network analysis and serves as a filter. Through the network analysis and simulation, changes to system components are examined and necessary adjustments to other components are calculated. The other layers of the concept address the automatic calculation of system reliability, the "normal" PDM functionality, and the integration of discipline-specific data into the system model. A prototype of the proposed data management, extended with automatic system development, is being implemented using the data management system ENOVIA SmarTeam V5 and the simulation system MATLAB.
Keywords: Adaptronic, Data Management, LOEWE-Centre AdRIA.
1202 Superior Performances of the Neural Network on the Masses Lesions Classification through Morphological Lesion Differences
Authors: U. Bottigli, R. Chiarucci, B. Golosio, G. L. Masala, P. Oliva, S. Stumbo, D. Cascio, F. Fauci, M. Glorioso, M. Iacomi, R. Magro, G. Raso
Abstract:
The purpose of this work is to develop an automatic classification system that could be useful to radiologists in breast cancer investigation. The software has been designed in the framework of the MAGIC-5 collaboration. In an automatic classification system, the suspicious regions with a high probability of including a lesion are extracted from the image as regions of interest (ROIs). Each ROI is characterized by a set of features based generally on morphological lesion differences. A study of the feature-space representation is made, and several classifiers are tested to distinguish the pathological regions from the healthy ones. The results, provided in terms of sensitivity and specificity, are presented through ROC (Receiver Operating Characteristic) curves. In particular, the best performances are obtained with the neural networks in comparison with the K-Nearest Neighbours and the Support Vector Machine: the Radial Basis Function network supplies the best result, with an area under the ROC curve of 0.89 ± 0.01, but similar results are obtained with the Probabilistic Neural Network and a Multi-Layer Perceptron.
Keywords: Neural Networks, K-Nearest Neighbours, Support Vector Machine, Computer Aided Detection
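As a rough illustration of the classifier-comparison workflow (not the MAGIC-5 setup: the data are synthetic stand-ins and scikit-learn's MLP stands in for the radial basis function and probabilistic networks), areas under the ROC curve could be computed as follows:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the ROI morphological features (the real data are not public here).
X, y = make_classification(n_samples=600, n_features=12, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(probability=True, random_state=0),
    "MLP": MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    score = model.predict_proba(X_te)[:, 1]           # probability of the pathological class
    print(name, "AUC =", round(roc_auc_score(y_te, score), 3))
```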
1201 A Dynamic Hybrid Option Pricing Model by Genetic Algorithm and Black-Scholes Model
Authors: Yi-Chang Chen, Shan-Lin Chang, Chia-Chun Wu
Abstract:
Unlike previous research, which paid attention only to model-driven option pricing, this study focuses extensively on the trading behavior of the option market. The Black-Scholes (B-S) model, for example, is one of the most famous option pricing models, yet its assumptions have been questioned in several reviews of pricing models. This paper therefore stresses the importance of the dynamic character of option pricing, which is also the reason for using a genetic algorithm (GA). Drawing on its natural-selection and evolutionary mechanism, this study proposes a hybrid model, the Genetic-BS model, which combines the GA and B-S to estimate the price more accurately. In the final experiments, the results show that the estimated price has a lower MAE value than the price calculated by either the B-S model or its enhanced version, the Gram-Charlier GARCH (G-C GARCH) model. Finally, this work concludes that the Genetic-BS pricing model is practical.
Keywords: genetic algorithm, Genetic-BS, option pricing model.
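The abstract does not detail how the GA interacts with the B-S formula, so the Python sketch below shows only one plausible, simplified reading of a Genetic-BS hybrid: a small GA evolves the volatility parameter of the Black-Scholes call formula to minimize the MAE against observed option prices. The population size, mutation scheme and synthetic "market" prices are all assumptions.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def ga_fit_sigma(S, K, T, r, market, pop=40, gens=60, seed=0):
    """Evolve a volatility value that minimizes MAE against observed option prices."""
    rng = np.random.default_rng(seed)
    sigmas = rng.uniform(0.05, 1.0, pop)                       # initial population
    for _ in range(gens):
        mae = np.array([np.mean(np.abs(bs_call(S, K, T, r, s) - market)) for s in sigmas])
        parents = sigmas[np.argsort(mae)[:pop // 2]]           # selection: keep the best half
        children = rng.choice(parents, pop - parents.size) + rng.normal(0, 0.02, pop - parents.size)
        sigmas = np.clip(np.concatenate([parents, children]), 0.01, 2.0)   # mutation-only offspring
    mae = np.array([np.mean(np.abs(bs_call(S, K, T, r, s) - market)) for s in sigmas])
    return sigmas[mae.argmin()], mae.min()

# Example with synthetic "market" prices generated at sigma = 0.3:
K = np.linspace(80, 120, 9)
market = bs_call(100.0, K, 0.5, 0.02, 0.3)
print(ga_fit_sigma(100.0, K, 0.5, 0.02, market))
```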
1200 Hybrid Honeypot System for Network Security
Authors: Kyi Lin Lin Kyaw
Abstract:
Nowadays, we are facing network threats that cause enormous damage to the Internet community day by day. In this situation, more and more people try to protect their networks using traditional mechanisms including firewalls, Intrusion Detection Systems, etc. Among them, the honeypot is a versatile tool for a security practitioner; honeypots are tools that are meant to be attacked or interacted with in order to gain more information about attackers, their motives and tools. In this paper, we describe the usefulness of low-interaction and high-interaction honeypots and compare them. We then propose a hybrid honeypot architecture that combines low- and high-interaction honeypots to mitigate their drawbacks. In this architecture, the low-interaction honeypot is used as a traffic filter. Activities like port scanning can be effectively detected by the low-interaction honeypot and stopped there. Traffic that cannot be handled by the low-interaction honeypot is handed over to the high-interaction honeypot. In this case, the low-interaction honeypot acts as a proxy, whereas the high-interaction honeypot offers the optimal level of realism. To protect the high-interaction honeypot from infections, a containment environment (VMware) is used.
Keywords: Low-interaction honeypot, high-interaction honeypot, VMware, proxy.
1199 Data Transformation Services (DTS): Creating Data Mart by Consolidating Multi-Source Enterprise Operational Data
Authors: J. D. D. Daniel, K. N. Goh, S. M. Yusop
Abstract:
Trends in business intelligence, e-commerce and remote access make it necessary and practical to store data in different ways on multiple systems with different operating systems. As businesses evolve and grow, they require efficient computerized solutions to perform data updates and to access data from diverse enterprise business applications. The objective of this paper is to demonstrate the capability of DTS [1] as a database solution for automatic data transfer and update in solving business problems. The DTS package is developed for the sales of a variety of plants, a business that eventually expanded into commercial supply and landscaping. Dimensional data modeling is used in the DTS package to extract, transform and load data from heterogeneous database systems such as MySQL, Microsoft Access and Oracle into a data mart residing in SQL Server. The data transfer from the various databases is scheduled to run automatically every quarter of the year to support sales analysis. Therefore, DTS is an attractive solution for automatic data transfer and update that meets today's business needs.
Keywords: Data Transformation Services (DTS), Object Linking and Embedding Database (OLE DB), Data Mart, Online Analytical Processing (OLAP), Online Transaction Processing (OLTP).
1198 Hybrid Recommender Systems using Social Network Analysis
Authors: Kyoung-Jae Kim, Hyunchul Ahn
Abstract:
This study proposes a novel hybrid social network analysis and collaborative filtering approach to enhance the performance of recommender systems. The proposed model selects subgroups of users in an Internet community through social network analysis (SNA), and then performs clustering analysis using the information about the subgroups. Finally, it makes recommendations using cluster-indexing CF based on the clustering results. This study uses the cores of the subgroups as the initial seeds for a conventional clustering algorithm: the model chooses the five cores that have the highest degree centrality from SNA, and then performs clustering analysis using these cores as initial centroids (cluster centers). The model then amplifies the impact of friends in the social network in the process of cluster-indexing CF.
Keywords: Social network analysis, Recommender systems, Collaborative filtering, Customer relationship management
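A minimal sketch of the core idea, using networkx and scikit-learn on random stand-in data (the friendship graph, the rating matrix and the simple in-cluster recommendation step are assumptions; the paper's cluster-indexing CF and friend-weighting step are not reproduced):

```python
import numpy as np
import networkx as nx
from sklearn.cluster import KMeans

# Stand-in data: a random friendship graph and a user-item rating matrix (assumptions).
rng = np.random.default_rng(0)
n_users, n_items, k = 50, 20, 5
ratings = rng.integers(0, 6, size=(n_users, n_items)).astype(float)   # 0 = unrated
graph = nx.gnp_random_graph(n_users, 0.1, seed=0)

# Pick the k users with the highest degree centrality as the "cores".
centrality = nx.degree_centrality(graph)
cores = sorted(centrality, key=centrality.get, reverse=True)[:k]

# Use the cores' rating vectors as the initial centroids for k-means.
init_centroids = ratings[cores]
kmeans = KMeans(n_clusters=k, init=init_centroids, n_init=1, random_state=0).fit(ratings)

# Recommend: within the target user's cluster, suggest items the user has not rated yet.
user = 7
members = np.where(kmeans.labels_ == kmeans.labels_[user])[0]
item_scores = ratings[members].mean(axis=0)
unseen = np.where(ratings[user] == 0)[0]
print("top recommendations:", unseen[np.argsort(item_scores[unseen])[::-1][:3]])
```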
1197 Automatic Map Simplification for Visualization on Mobile Devices
Authors: Hang Yu
Abstract:
The visualization of geographic information on mobile devices has become popular with the widespread use of the mobile Internet. The mobility of these devices brings much convenience to people's lives. Through the devices' add-on location-based services, people can access timely information relevant to their tasks. However, visual analysis of geographic data on mobile devices presents several challenges due to the small display and restricted computing resources. These limitations on screen size and resources may impair the usability of visualization applications. In this paper, a variable-scale visualization method is proposed to handle the challenge of the small mobile display. By merging multiple scales of information into a single image, the viewer is able to focus on the interesting region while keeping a good grasp of the surrounding context. This is essentially visualizing the map through a fisheye lens. However, the fisheye lens induces undesirable geometric distortion in the periphery, which renders the information there meaningless. The proposed solution is to apply map generalization, which removes excessive information around the periphery, and an automatic smoothing process that corrects the distortion while keeping the local topology consistent. The proposed method is applied to both artificial and real geographical data for evaluation.
Keywords: Map simplification, visualization, mobile devices.
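The paper's own lens and smoothing functions are not given in the abstract; as a point of reference, a classic graphical fisheye transformation in the style of Sarkar and Brown, which magnifies features near a focus point, can be sketched as follows (the distortion factor `d` is an assumed value):

```python
import numpy as np

def fisheye(points, focus, radius, d=3.0):
    """Magnify points near `focus`; magnification decays toward the lens rim of size `radius`."""
    points = np.asarray(points, dtype=float)
    focus = np.asarray(focus, dtype=float)
    vec = points - focus
    r = np.linalg.norm(vec, axis=1, keepdims=True)
    unit = vec / np.maximum(r, 1e-9)
    norm_r = np.clip(r / radius, 0.0, 1.0)
    new_norm = (d + 1) * norm_r / (d * norm_r + 1)        # fisheye distortion function
    transformed = focus + unit * new_norm * radius
    return np.where(r < radius, transformed, points)       # points outside the lens are unchanged

# Example: map vertices near the focus (0, 0) are pushed outward (magnified).
pts = np.array([[0.1, 0.0], [0.5, 0.5], [3.0, 3.0]])
print(fisheye(pts, focus=(0.0, 0.0), radius=2.0))
```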
1196 A Multi-Science Study of Modern Synergetic War and Its Information Security Component
Authors: Alexander G. Yushchenko
Abstract:
From a multi-science point of view, we analyze the threats to security resulting from the globalization of the international information space and from the information and communication aggression of Russia. A definition of Ruschism is formulated as an ideology supporting the aggressive actions of modern Russia against the Euro-Atlantic community. The stages of the hybrid war Russia is waging against Ukraine are described, including the elements of subversive activity of the special services, the activation of the military phase and the gradual shift of the focus of confrontation to the realm of information and communication technologies. We reveal the emergence of a threat to democratic states resulting from the destabilizing impact of a target state's mass media and social networks being exploited by Russian secret services under the disguise of freedom of speech. Thus, we underline the vulnerability of the cyber and information security of the network society with regard to hybrid war, and we propose to define the latter as a synergetic war. Our analysis is supported by long-term qualitative monitoring of the representation of top state officials on popular TV channels and Facebook. From the memetics point of view, we have detected a destructive psycho-information technology used by the Kremlin, a kind of information catastrophe, the essence of which is explained in detail. In the conclusion, a comprehensive plan for the information protection of the public consciousness and mentality of Euro-Atlantic citizens from the aggression of the enemy is proposed.
Keywords: Cyber and information security, psycho-information technology, hybrid war, synergetic war, WWIII, Ruschism.
1195 Simulation of an Auto-Tuning Bicycle Suspension Fork with Quick Releasing Valves
Authors: Y. C. Mao, G. S. Chen
Abstract:
Although a bicycle is not as large as a motorcycle or an automobile, it still constitutes a complicated dynamic system. People's requirements for comfort, controllability and safety grow higher as research and development technologies improve. The shock absorber strongly affects the suspension performance of the vehicle: it absorbs vibration energy and releases it at a suitable time, keeping the wheel in proper contact with the road surface and maintaining the stability of the chassis. Suspension design for mountain bicycles is more difficult than for city bikes since it encounters dynamic variations in road and loading conditions. Riders need a stiff damper as they tread hard on the pedals when climbing, and a soft damper when they descend downhill. Various switchable shock absorbers are available on the market; however, riders have to manually switch them among soft, hard and lock positions. This study proposes a novel design for a bicycle shock absorber, which provides automatic, smooth tuning of the damping coefficient from a predetermined lower bound to a theoretically unlimited upper value. An automatic quick releasing valve is included in this design so that it can release the peak pressure when the suspension fork runs into a square-wave type obstacle, protecting the chassis from damage and the rider's skeleton from injury. The design achieves the automatic tuning process with an innovative plunger valve and fluidic passage arrangement, without any electronic devices. Theoretical models of the damper and spring are established in this study, and the design parameters of the valves and fluidic passages are determined. Relations between the design parameters and the shock absorber performance are discussed in this paper. The analytical results give directions for the manufacture of the shock absorber.
Keywords: Modelling, Simulation, Bicycle, Shock Absorber, Damping, Releasing Valve
1194 A Hybrid Approach for Color Image Quantization Using K-means and Firefly Algorithms
Authors: Parisut Jitpakdee, Pakinee Aimmanee, Bunyarit Uyyanonvara
Abstract:
Color image quantization (CQ) is an important problem in computer graphics and image processing. The aim of quantization is to reduce the number of colors in an image with minimum distortion. Clustering is a widely used technique for color quantization: all colors in an image are grouped into a small number of clusters. In this paper, we propose a new hybrid approach for color quantization using the firefly algorithm (FA) and the K-means algorithm. The firefly algorithm is a swarm-based algorithm that can be used for solving optimization problems. The proposed method can overcome the drawbacks of both algorithms, such as the local-optimum convergence problem of K-means and the premature convergence of the firefly algorithm. Experiments on three commonly used images and the comparison of the results show that the proposed algorithm surpasses both the baseline K-means clustering and the original firefly algorithm.
Keywords: Clustering, Color quantization, Firefly algorithm, K-means.
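How the two algorithms are combined is not specified in the abstract; the Python sketch below shows one plausible hybrid reading, in which a minimal firefly search positions an initial color palette and K-means then refines it. The firefly parameters, the stand-in pixel data and the "FA initializes, K-means refines" split are all assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def quantization_error(pixels, palette):
    """Mean distance from each pixel to its nearest palette color."""
    d = np.linalg.norm(pixels[:, None, :] - palette[None, :, :], axis=2)
    return d.min(axis=1).mean()

def firefly_palettes(pixels, n_colors=8, n_fireflies=10, iters=30,
                     beta0=1.0, gamma=1e-4, alpha=5.0, seed=0):
    """Minimal firefly search over palettes (an illustrative FA, not the paper's exact variant)."""
    rng = np.random.default_rng(seed)
    flies = rng.uniform(0, 255, (n_fireflies, n_colors, 3))
    for _ in range(iters):
        errors = np.array([quantization_error(pixels, f) for f in flies])
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if errors[j] < errors[i]:              # j is "brighter" (lower error)
                    r2 = np.sum((flies[i] - flies[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    flies[i] += beta * (flies[j] - flies[i]) + alpha * (rng.random(flies[i].shape) - 0.5)
                    flies[i] = np.clip(flies[i], 0, 255)
        alpha *= 0.97                                   # cool down the random walk
    errors = np.array([quantization_error(pixels, f) for f in flies])
    return flies[errors.argmin()]

# Hybrid: FA supplies the initial palette, K-means refines it.
pixels = np.random.randint(0, 256, (2000, 3)).astype(float)   # stand-in for image pixels
init_palette = firefly_palettes(pixels, n_colors=8)
km = KMeans(n_clusters=8, init=init_palette, n_init=1, random_state=0).fit(pixels)
print("final quantization error:", quantization_error(pixels, km.cluster_centers_))
```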
1193 Enhanced Disk-Based Databases Towards Improved Hybrid In-Memory Systems
Authors: Samuel Kaspi, Sitalakshmi Venkatraman
Abstract:
In-memory database systems are becoming popular due to the availability and affordability of sufficiently large RAM and processors in modern high-end servers with the capacity to manage large in-memory database transactions. While fast and reliable in-memory systems are still being developed to overcome cache misses, CPU/IO bottlenecks and distributed transaction costs, disk-based data stores still serve as the primary persistence. In addition, with the recent growth in multi-tenancy cloud applications and the associated security concerns, many organisations consider the trade-offs and continue to require the fast and reliable transaction processing of disk-based database systems as an available choice. For these organizations, the only way of increasing throughput is by improving the performance of disk-based concurrency control. This warrants a hybrid database system with the ability to selectively apply enhanced disk-based data management within the context of in-memory systems to help improve overall throughput. The general view is that in-memory systems substantially outperform disk-based systems. We question this assumption and examine how a modified variation of access invariance that we call enhanced memory access (EMA) can be used to allow very high levels of concurrency in the pre-fetching of data in disk-based systems. We demonstrate how this prefetching in disk-based systems can yield performance close to that of in-memory systems, which paves the way for improved hybrid database systems. This paper proposes the novel EMA technique and presents a comparative study between disk-based EMA systems and in-memory systems running on hardware configurations of equivalent power in terms of the number of processors and their speeds. The results of the experiments conducted clearly substantiate that, when used in conjunction with all concurrency control mechanisms, EMA can increase the throughput of disk-based systems to levels quite close to those achieved by in-memory systems. The promising results of this work show that enhanced disk-based systems facilitate improved hybrid data management within the broader context of in-memory systems.
Keywords: Concurrency control, disk-based databases, in-memory systems, enhanced memory access (EMA).
1192 A Hybrid Ontology Based Approach for Ranking Documents
Authors: Sarah Motiee, Azadeh Nematzadeh, Mehrnoush Shamsfard
Abstract:
The increasing growth of information volume on the Internet creates an increasing need to develop new (semi-)automatic methods for retrieving documents and ranking them according to their relevance to the user query. In this paper, after a brief review of ranking models, a new ontology-based approach for ranking HTML documents is proposed and evaluated in various circumstances. Our approach is a combination of conceptual, statistical and linguistic methods; this combination preserves the precision of ranking without losing speed. The approach exploits natural language processing techniques to extract phrases from the documents and the query and to stem words. An ontology-based conceptual method is then used to annotate documents and expand the query. To expand a query, the spread activation algorithm is improved so that the expansion can be done flexibly and in various aspects. The annotated documents and the expanded query are processed to compute the relevance degree using statistical methods. The outstanding features of our approach are (1) combining conceptual, statistical and linguistic features of documents, (2) expanding the query with its related concepts before comparing it to documents, (3) extracting and using both words and phrases to compute the relevance degree, (4) improving the spread activation algorithm to perform the expansion based on a weighted combination of different conceptual relationships, and (5) allowing variable document vector dimensions. A ranking system called ORank was developed to implement and test the proposed model. The test results are included at the end of the paper.
Keywords: Document ranking, Ontology, Spread activation algorithm, Annotation.
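To make the query-expansion step concrete, here is a minimal weighted spread-activation sketch in Python; the toy concept graph, relation weights, decay factor and activation threshold are illustrative assumptions rather than ORank's actual configuration.

```python
def spread_activation(graph, seeds, decay=0.5, threshold=0.1, max_steps=3):
    """graph: {concept: [(neighbor, relation_weight), ...]}; seeds: initially activated concepts."""
    activation = {c: 1.0 for c in seeds}
    frontier = dict(activation)
    for _ in range(max_steps):
        next_frontier = {}
        for concept, energy in frontier.items():
            for neighbor, weight in graph.get(concept, []):
                passed = energy * weight * decay            # attenuate along each relation
                if passed >= threshold and passed > activation.get(neighbor, 0.0):
                    activation[neighbor] = passed
                    next_frontier[neighbor] = passed
        frontier = next_frontier
        if not frontier:
            break
    return activation                                        # concept -> activation score

ontology = {
    "car": [("vehicle", 0.9), ("engine", 0.7)],
    "vehicle": [("transport", 0.8)],
    "engine": [("fuel", 0.6)],
}
print(spread_activation(ontology, ["car"]))   # expanded query concepts with weights
```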
1191 Multi-objective Optimisation of Composite Laminates under Heat and Moisture Effects using a Hybrid Neuro-GA Algorithm
Authors: M. R. Ghasemi, A. Ehsani
Abstract:
In this paper, the optimum weight and cost of a laminated composite plate are sought while the plate undergoes the heaviest load prior to complete failure. Various failure criteria have been defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), an optimization technique that directly uses real variables, was employed. Since optimization via GAs is a long process and most of the time is consumed by the analysis, Radial Basis Function Neural Networks (RBFNN) were employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
Keywords: Composite Laminates, GA, Multi-objective Optimisation, Neural Networks, RBFNN.
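For reference, the Tsai-Hill criterion in its common plane-stress form for a unidirectional lamina (the paper may use an equivalent formulation) requires

\[
\frac{\sigma_1^2}{X^2} \;-\; \frac{\sigma_1 \sigma_2}{X^2} \;+\; \frac{\sigma_2^2}{Y^2} \;+\; \frac{\tau_{12}^2}{S^2} \;<\; 1,
\]

where \(\sigma_1\), \(\sigma_2\) and \(\tau_{12}\) are the in-plane stresses in the material axes and \(X\), \(Y\) and \(S\) are the longitudinal, transverse and shear strengths; failure is predicted when the left-hand side reaches 1.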
1190 The Hybrid Dimming Control System for Solar Charging Robot
Authors: Won-Yong Chae, Hyung-Nam Kim, Kyoung-Jun Lee, Hee-Je Kim
Abstract:
Renewable energy has been attracting attention as an alternative energy source due to the problems of environmental pollution and resource depletion. In particular, daylighting and PV systems are regarded as solutions. In this paper, a hybrid dimming control system supplied by a solar cell and a daylighting system was designed. The daylighting system is the main source and the PV system is the backup source. The PV system operates the LED lamp that supports the daylighting system, because the daylighting system is unstable due to variations in irradiance. In addition, the PV system charges batteries; battery charging has the benefit that the PV system can operate the LED lamp in bad weather. However, the LED lamp cannot always be turned on, which is why the dimming control system was designed. In particular, the solar charging robot was designed to check the interior irradiance intensity. These systems and the application of the solar charging robot are expected to contribute to the development of alternative energy in the near future.
Keywords: Daylighting system, PV system, LED lamp, Sun-tracking robot.
1189 Modeling and Analyzing the WAP Class 2 Wireless Transaction Protocol Using Event-B
Authors: Rajaa Filali, Mohamed Bouhdadi
Abstract:
This paper presents an incremental formal development of the Wireless Transaction Protocol (WTP) in Event-B. WTP is part of the Wireless Application Protocol (WAP) architecture and provides a reliable request-response service. To model and verify the protocol, we use the formal technique Event-B, which provides an accessible and rigorous development method. This interaction between modelling and proving reduces the complexity and helps to eliminate misunderstandings, inconsistencies and specification gaps. As a result, the verification of WTP allows us to find some deficiencies in the current specification.
Keywords: Event-B, wireless transaction protocol, refinement, proof obligation, Rodin, ProB.
1188 Hybrid Equity Warrants Pricing Formulation under Stochastic Dynamics
Authors: Teh Raihana Nazirah Roslan, Siti Zulaiha Ibrahim, Sharmila Karim
Abstract:
A warrant is a financial contract that confers the right, but not the obligation, to buy or sell a security at a certain price before expiration. The standard procedure of valuing equity warrants using call option pricing models such as the Black-Scholes model has been shown to contain many flaws, such as the assumptions of a constant interest rate and constant volatility. In fact, existing alternative models were found to focus more on demonstrating pricing techniques than on empirical testing. Therefore, a mathematical model for pricing and analyzing equity warrants that comprises a stochastic interest rate and stochastic volatility is essential to incorporate the dynamic relationships between the identified variables and reflect the real market. Here, the aim is to develop dynamic pricing formulations for hybrid equity warrants by incorporating stochastic interest rates from the Cox-Ingersoll-Ross (CIR) model along with stochastic volatility from the Heston model. The development of the model involves the derivation of the stochastic differential equations that govern the model dynamics. The resulting equations, which involve a Cauchy problem and heat equations, are then solved using partial differential equation approaches. The analytical pricing formulas obtained in this study comply with the form of the analytical expressions embedded in the Black-Scholes model and other existing pricing models for equity warrants, which facilitates the use of the proposed formula for comparison purposes and further empirical study.
Keywords: Cox-Ingersoll-Ross model, equity warrants, Heston model, hybrid models, stochastic.
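In their standard textbook forms (the paper's exact parameterization may differ), the dynamics being combined are

\[
dS_t = r_t S_t\,dt + \sqrt{v_t}\,S_t\,dW_t^{S}, \qquad
dv_t = \kappa(\theta - v_t)\,dt + \sigma_v \sqrt{v_t}\,dW_t^{v}, \qquad
dr_t = a(b - r_t)\,dt + \sigma_r \sqrt{r_t}\,dW_t^{r},
\]

where \(S_t\) is the underlying equity price, \(v_t\) the Heston stochastic variance with mean-reversion speed \(\kappa\) and long-run level \(\theta\), and \(r_t\) the CIR short rate with mean-reversion speed \(a\) and long-run level \(b\); the Brownian motions may be correlated.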
1187 Automatic Product Identification Based on Deep-Learning Theory in an Assembly Line
Authors: Fidel Lòpez Saca, Carlos Avilés-Cruz, Miguel Magos-Rivera, José Antonio Lara-Chávez
Abstract:
Automated object recognition and identification systems are widely used throughout the world, particularly in assembly lines, where they perform quality control and automatic part selection tasks. This article presents the design and implementation of an object recognition system for an assembly line. The proposed shape-and-color recognition system is based on deep learning theory and a specially designed convolutional network architecture. The methodology involves stages such as image capturing, color filtering, location of the object mass centers, detection of horizontal and vertical object boundaries, and object clipping. Once the objects are cut out, they are sent to a convolutional neural network, which automatically identifies the type of figure. The identification system works in real time. The implementation was done on a Raspberry Pi 3 system and on a Jetson Nano device. The proposal is used in an assembly course of the bachelor's degree in industrial engineering. The results presented include a study of the recognition efficiency and the processing time.
Keywords: Deep-learning, image classification, image identification, industrial engineering.
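The paper's network architecture is not given in the abstract, so the snippet below is only a minimal tf.keras stand-in for the classification stage that follows the clipping step; the input resolution, layer sizes and number of part classes are assumptions.

```python
import tensorflow as tf

NUM_CLASSES = 6          # number of part types on the assembly line (assumed)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),          # clipped object image, RGB
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()

# Training would use clipped, labelled object images:
# model.fit(train_images, train_labels, epochs=10, validation_split=0.1)
```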
1186 The Algorithm of Semi-Automatic Thai Spoonerism Words for Bi-Syllable
Authors: Nutthapat Kaewrattanapat, Wannarat Bunchongkien
Abstract:
The purposes of this research are to study and develop an algorithm for Thai spoonerism using a semi-automatic computer program; that is, in the data-input stage the syllables are already separated, and in the spoonerism stage the developed algorithm is applied. The algorithm establishes rules and mechanisms for Thai spoonerism of bi-syllable words by analyzing the elements of the syllables, namely the cluster consonant, vowel, intonation mark and final consonant. From the study, it is found that bi-syllable Thai spoonerism has one spoonerism mechanism, namely the transposition of the vowel, intonation mark and final consonant between the two syllables, while keeping the initial consonant and the consonant cluster (if any). The identified rules and mechanisms were used to develop Thai spoonerism software in PHP. The software was subjected to a performance test; it was found that the program performs bi-syllable Thai spoonerism correctly for 99% of all words used in the test, with faults for the remaining 1%, because the words obtained from spoonerism may not be spelled in conformity with Thai grammar and a Thai spoonerism may have more than one correct answer.
Keywords: Algorithm, Spoonerism, Computational Linguistics.
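A small Python sketch of the transposition rule described above (the original tool is written in PHP; the element names and the example syllables here are hypothetical, and the syllables are assumed to be already segmented into their elements, matching the semi-automatic input step):

```python
def spoonerize(syl1, syl2):
    """Swap vowel, intonation mark and final consonant; keep the initial/cluster consonant."""
    def recombine(onset, other):
        return {"onset": onset["onset"],            # initial consonant (and cluster) kept
                "vowel": other["vowel"],            # vowel transposed
                "tone": other["tone"],              # intonation mark transposed
                "final": other["final"]}            # final consonant transposed
    return recombine(syl1, syl2), recombine(syl2, syl1)

first = {"onset": "kr", "vowel": "a", "tone": "low", "final": "p"}
second = {"onset": "m", "vowel": "ee", "tone": "mid", "final": "ng"}
print(spoonerize(first, second))
```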
1185 Hybrid Method Using Wavelets and Predictive Method for Compression of Speech Signal
Authors: Karima Siham Aoubid, Mohamed Boulemden
Abstract:
The development of signal compression algorithms is making impressive progress. These algorithms are continuously improved by new tools and aim to reduce, on average, the number of bits necessary for the signal representation while minimizing the reconstruction error. This article proposes the compression of an Arabic speech signal by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on one hand, on the decomposition of the original signal by means of analysis filters, which is followed by the compression stage, and on the other hand, on the application of an order-5 linear prediction to the compressed signal coefficients. The aim of this approach is the estimation of the prediction error, which is coded and transmitted; the decoding operation is then used to reconstruct the original signal. Thus, an adequate choice of the filter bank used for the transform is necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.
Keywords: Compression, linear prediction analysis, multiresolution analysis, speech signal.
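As a rough illustration of the hybrid idea in Python (the wavelet family, decomposition level and the synthetic stand-in signal are assumptions; in the described scheme only the predictor coefficients and the coded prediction error would need to be transmitted):

```python
import numpy as np
import pywt

def lpc_residual(x, order=5):
    """Fit an order-p linear predictor by least squares and return (coefficients, residual)."""
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a, y - X @ a

fs = 8000
t = np.arange(0, 1.0, 1 / fs)
speech = np.sin(2 * np.pi * 200 * t) + 0.3 * np.sin(2 * np.pi * 700 * t)   # stand-in signal

coeffs = pywt.wavedec(speech, "db4", level=3)       # analysis filter bank (decomposition)
for band in coeffs:
    a, err = lpc_residual(band, order=5)
    # A small residual energy ratio means few bits are needed to code the prediction error.
    print(f"band length {len(band):5d}  residual energy ratio "
          f"{np.sum(err**2) / np.sum(band[5:]**2):.3f}")
```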