Search results for: time scale.
5885 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6
Authors: M. Moslehpour, S. Khorsandi
Abstract:
Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its usefulness, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it proved inefficient due to its limitations. The SEcure Neighbor Discovery (SEND) protocol was therefore proposed to automatically protect the auto-configuration process and to secure the neighbor discovery and address resolution process. To defend against threats to NDP's integrity and identity, SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography. Besides its advantages, SEND has notable disadvantages, namely the computational cost of the CGA algorithm and the sequential nature of CGA generation. In this paper, we parallelize this process across network resources in order to improve it. In addition, we compare the CGA generation time between self-computing and distributed-computing processes, focusing on the impact of malicious nodes on the CGA generation time in the network. The results show that, even when malicious nodes participate in the generation process, the CGA generation time is lower than when the address is computed on a single node. With a Trust Management System, detecting and isolating malicious nodes becomes easier.
Keywords: NDP, IPsec, SEND, CGA, Modifier, Malicious node, Self-Computing, Distributed-Computing.
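As a minimal sketch of the step the paper distributes, the loop below brute-forces the CGA modifier until the Sec condition on the hash is met (RFC 3972 style, with field layouts simplified; the key bytes and sec value are placeholders):

```python
import hashlib
import os

def find_modifier(public_key: bytes, sec: int) -> bytes:
    """Brute-force search for a CGA modifier (RFC 3972 style, simplified):
    Hash2 = SHA-1(modifier || 9 zero octets || public key) must start with
    16*sec zero bits. This loop is the costly step that the paper proposes to
    distribute across network nodes instead of computing on a single host."""
    zero_bits = 16 * sec
    while True:
        modifier = os.urandom(16)
        digest = hashlib.sha1(modifier + b"\x00" * 9 + public_key).digest()
        if zero_bits == 0:
            return modifier
        leading = int.from_bytes(digest, "big") >> (len(digest) * 8 - zero_bits)
        if leading == 0:
            return modifier

# With sec=1, roughly 2**16 hash evaluations are expected before success.
modifier = find_modifier(b"example-public-key-bytes", sec=1)
```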
5884 Transmission Lines Loading Enhancement Using ADPSO Approach
Authors: M. Mahdavi, H. Monsef, A. Bagheri
Abstract:
Discrete particle swarm optimization (DPSO) is a powerful stochastic evolutionary algorithm used to solve large-scale, discrete and nonlinear optimization problems. However, the standard DPSO algorithm exhibits premature convergence when solving a complex optimization problem such as transmission expansion planning (TEP). To resolve this problem, an advanced discrete particle swarm optimization (ADPSO) is proposed in this paper. The simulation results show that optimizing line loading in transmission expansion planning with ADPSO gives better precision than with DPSO.
Keywords: ADPSO, TEP problem, Lines loading optimization.
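For context, the baseline the paper improves on is the standard discrete (binary) particle swarm update with a sigmoid position rule; the sketch below shows that baseline on a toy objective and is not the authors' ADPSO:

```python
import numpy as np

def binary_pso(fitness, n_bits, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Standard binary/discrete PSO (sigmoid position rule); ADPSO adds its
    own modifications on top of this baseline to avoid premature convergence."""
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, (n_particles, n_bits))
    v = np.zeros((n_particles, n_bits))
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = (rng.random(x.shape) < 1.0 / (1.0 + np.exp(-v))).astype(int)
        f = np.array([fitness(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

# Toy objective: minimise the number of "built" lines (ones) in a candidate plan.
best = binary_pso(lambda bits: bits.sum(), n_bits=12)
```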
5883 The Effect of the Side-Weir Crest Height to Scour in Clay-Sand Mixed Sediments
Authors: F. Ayça Varol Saraçoğlu, Hayrullah Ağaçcıoğlu
Abstract:
Experimental studies to investigate the depth of scour were conducted at a side-weir intersection located in a 180° curved flume in the Hydraulic Laboratory of Yıldız Technical University, Istanbul, Turkey. The side weirs were located at the middle of the straight part of the main channel. Three different lengths (25, 40 and 50 cm) and three different weir crest heights (7, 10 and 12 cm) of the side weir were placed at the side-weir station. No scour occurs when the bed material is kaolin alone; therefore, in all experiments the cohesive bed was prepared by properly mixing clay material (kaolin) with 31% sand. Following a 24 h consolidation time, and in order to observe the effect of flow intensity on the scour depth, experiments were carried out for five different upstream Froude numbers in the range 0.33-0.81. As a result of this study, the relation between scour depth and upstream flow intensity as a function of time has been established. The longitudinal velocities decreased along the side weir towards the downstream end due to overflow over the side weirs. At the beginning, the scour depth increases rapidly with time and then asymptotically approaches constant values in all experiments for all side-weir dimensions, as in non-cohesive sediment; thus, the scour depth reaches equilibrium conditions. The time to equilibrium depends on the approach flow intensity and the dimensions of the side weirs. For different heights of the weir crest, dimensionless scour depths increased with increasing upstream Froude number. Equilibrium scour depths formed with the 7 cm side-weir crest height were higher than those with the 12 cm side-weir crest height; this means that when the side-weir crest height increased, the equilibrium scour depth decreased. Although the upstream side of the scour hole is almost vertical, the downstream side of the hole is inclined.
Keywords: Clay-sand mixed sediments, scour, side weir.
5882 Solitary Wave Solutions for Burgers-Fisher type Equations with Variable Coefficients
Authors: Amit Goyal, Alka, Rama Gupta, C. Nagaraja Kumar
Abstract:
We have solved Burgers-Fisher (BF) type equations, with time-dependent coefficients of the convection and reaction terms, by using the auxiliary equation method. A class of solitary wave solutions is obtained, some of which are derived for the first time. We have studied the effect of the variable coefficients on the physical parameters (amplitude and velocity) of the solitary wave solutions. In some cases, the BF equations could be solved for an arbitrary time-dependent coefficient of the convection term.
Keywords: Solitary wave solution, Variable coefficient Burgers-Fisher equation, Auxiliary equation method.
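For reference, a commonly studied form of the variable-coefficient Burgers-Fisher equation and a tanh-type solitary-wave ansatz are shown below; the exact placement of the time-dependent coefficients in the paper may differ:

\[
u_t + \alpha(t)\, u\, u_x - u_{xx} = \beta(t)\, u(1-u), \qquad
u(x,t) = \tfrac{1}{2}\Bigl[1 + \tanh\bigl(k(t)\,x - \omega(t)\bigr)\Bigr],
\]

where the wave number k(t) and phase ω(t) inherit their time dependence from α(t) and β(t) through the auxiliary equation.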
5881 Knowledge Sharing based on Semantic Nets and Mereology to Avoid Risks in Manufacturing
Authors: Ulrich Berger, Yuliya Lebedynska, Veronica Vargas
Abstract:
The right information at the right time influences enterprise and technical success. Sharing knowledge among the members of a large organization can be a complex activity, and as long as knowledge is not shared, it cannot be exploited by the organization. There are mechanisms which can trigger knowledge sharing, and this paper intends to trigger them by using semantic nets. Moreover, the intersection and overlapping of terms and sub-terms, as well as their relationships, are described through mereology for the whole knowledge-sharing system. A knowledge system is proposed to supply operators with the right information about a specific process and its possible risks, e.g. at the assembly process, at the right time in an automated manufacturing environment such as the automotive industry.
Keywords: Automated manufacturing, knowledge sharing, mereology, risk management, semantic net.
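A minimal sketch of the kind of knowledge system described, assuming a toy part-whole (mereological) structure with illustrative part and risk names that are not taken from the paper:

```python
# "part_of" edges carry the mereological (part-whole) structure of an assembly,
# and risks attached to any sub-part are propagated up to the operator of the
# whole process step.
part_of = {                      # child -> parent (sub-term -> term)
    "bolt": "door module",
    "hinge": "door module",
    "door module": "car body assembly",
}
risks = {
    "hinge": ["incorrect torque may cause misalignment"],
    "door module": ["pinch hazard during mounting"],
}

def risks_for(process):
    """Collect the risks of a process step and of every part it contains."""
    found = list(risks.get(process, []))
    for child, parent in part_of.items():
        if parent == process:
            found += risks_for(child)
    return found

print(risks_for("car body assembly"))
```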
5880 Architectural Thinking in a Time of Climate Emergency
Authors: Manoj Parmar
Abstract:
The article uses reflexivity as a research method to investigate and propose an architectural theory plan for climate change. It hypothesizes that, to discuss or formulate a discourse on "architectural paradigm and climate change", we first need to understand the modes of integration that enable architectural thinking to engage with climate change. The research intends to study the various integration modes that have evolved historically and to situate them in time. Subsequently, it analyzes the integration pattern, challenges the existing model, and finds a way towards making climate change central to architectural thinking. The study is founded on the premise that ecology and climate change scholarship has consistently outpaced architecture's asymmetrical and nonlinear knowledge, and that approaches are needed for an architecture that places less burden on the climate and on people and minimizes its impact on ecology.
Keywords: Climate change, architectural theory, reflexivity, modernity.
5879 Angular-Coordinate Driven Radial Tree Drawing
Authors: Farshad Ghassemi Toosi, Nikola S. Nikolov
Abstract:
We present a visualization technique for radial drawing of trees consisting of two slightly different algorithms. Both of them make use of node-link diagrams for visual encoding. This visualization creates clear drawings without edge crossing. One of the algorithms is suitable for real-time visualization of large trees, as it requires minimal recalculation of the layout if leaves are inserted or removed from the tree; while the other algorithm makes better utilization of the drawing space. The algorithms are very similar and follow almost the same procedure but with different parameters. Both algorithms assign angular coordinates for all nodes which are then converted into 2D Cartesian coordinates for visualization. We present both algorithms and discuss how they compare to each other.
Keywords: Radial Tree Drawing, Real-Time Visualization, Angular Coordinates, Large Trees.
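A minimal sketch of the shared angular-coordinate scheme, assuming leaves receive equal angular slices and radius equals depth; the two algorithms in the paper differ in how the angular ranges are apportioned:

```python
import math

def preorder(children, node):
    yield node
    for k in children.get(node, []):
        yield from preorder(children, k)

def radial_layout(children: dict, root: str):
    """Radial tree drawing in two steps: (1) give each leaf an equal angular
    slice of 2*pi and each internal node the mean angle of its children,
    (2) set radius = depth and convert the polar coordinates to Cartesian."""
    leaves = [n for n in preorder(children, root) if not children.get(n)]
    step = 2 * math.pi / len(leaves)
    angle, depth = {}, {}

    def visit(node, d=0):
        depth[node] = d
        kids = children.get(node, [])
        for k in kids:
            visit(k, d + 1)
        angle[node] = (leaves.index(node) * step if not kids
                       else sum(angle[k] for k in kids) / len(kids))

    visit(root)
    return {n: (depth[n] * math.cos(angle[n]), depth[n] * math.sin(angle[n]))
            for n in angle}

tree = {"root": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2", "b3"]}
print(radial_layout(tree, "root"))
```

Because only the angular coordinates of the affected subtree change when leaves are inserted or removed, a layout of this kind needs little recalculation, which is what makes the first algorithm suitable for real-time visualization of large trees.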
5878 Probabilistic Robustness Assessment of Structures under Sudden Column-Loss Scenario
Authors: Ali Y Al-Attraqchi, P. Rajeev, M. Javad Hashemi, Riadh Al-Mahaidi
Abstract:
This paper presents a probabilistic incremental dynamic analysis (IDA) of a full reinforced concrete building subjected to a column-loss scenario for the assessment of progressive collapse. The IDA is chosen to account explicitly for uncertainties in loads and system capacity. Fragility curves are developed to predict the probability of progressive collapse given the loss of one or more columns. At a broader scale, the analysis will also provide critical information needed to support the development of a new generation of design codes that attempt to explicitly quantify structural robustness.
Keywords: Incremental dynamic analysis, progressive collapse, structural engineering, pushdown analysis.
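The fragility curves can be summarised, for example, with the usual lognormal model fitted to the IDA results (the exact functional form used in the paper is not stated, so this is an assumption):

\[
P\bigl(\text{collapse} \mid IM = x\bigr) = \Phi\!\left(\frac{\ln(x/\theta)}{\beta}\right),
\]

where θ is the median intensity measure causing collapse for a given column-loss scenario, β is the lognormal dispersion, and both parameters are estimated from the IDA runs.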
5877 Disinfection of Water by Adsorption with Electrochemical Regeneration
Authors: S. N. Hussain, H. M. A. Asghar, E. P. L. Roberts, N. W. Brown
Abstract:
Arvia®, a spin-out company of the University of Manchester, UK, is commercialising a water treatment technology for the removal of low concentrations of organics from water. The technology is based on the adsorption of organics onto graphite-based adsorbents coupled with their electrochemical regeneration in a simple electrochemical cell. In this paper, the potential of the process to adsorb microorganisms present in water and to disinfect the water electrochemically has been demonstrated. Bench-scale experiments have indicated that adsorption on graphite adsorbents with electrochemical regeneration can be used effectively for water disinfection. The most likely mechanisms of disinfection through this process include direct electrochemical oxidation and electrochemical chlorination.
Keywords: Arvia, Adsorption, Electrochemical Regeneration, Nyex.
5876 An Experimental Study on Autoignition of Wood
Authors: Tri Poespowati
Abstract:
Experiments were conducted to characterize the fire properties of wood exposed to a given external heat flux over a variety of wood moisture contents. Six kinds of Indonesian wood: keruing, sono, cemara, kamper, pinus, and mahoni were exposed to radiant heat from a conical heater, resulting in the appearance of a stable flame on the wood surface caused by spontaneous ignition. A K-type thermocouple was used to measure the wood surface temperature, and temperature histories were recorded throughout each experiment at 1 s intervals using a TC-08. Data on first ignition time and temperature, end ignition time and temperature, and charring rate have been successfully collected. It was found that the ignition temperature and charring rate depend on the moisture content of the wood.
Keywords: Fire properties, moisture content, wood, charring rate.
5875 A Novel Dual-Purpose Image Watermarking Technique
Authors: Maha Sharkas, Dahlia R. ElShafie, Nadder Hamdy
Abstract:
Image watermarking has proven to be quite an efficient tool for copyright protection and authentication over the last few years. In this paper, a novel image watermarking technique in the wavelet domain is suggested and tested. To achieve more security and robustness, the proposed technique relies on two nested watermarks that are embedded into the image to be watermarked. A primary watermark in the form of a PN sequence is first embedded into an image (the secondary watermark) before the latter is embedded into the host image. The technique is implemented using Daubechies mother wavelets, where an arbitrary embedding factor α is introduced to improve invisibility and robustness. The proposed technique has been applied to several grayscale images, where a PSNR of about 60 dB was achieved.
Keywords: Image watermarking, Multimedia Security, Wavelets, Image Processing.
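A minimal sketch of the additive α-embedding step in the wavelet domain, assuming PyWavelets and embedding into one detail band; the paper's full scheme nests a PN-sequence watermark inside a secondary image before that image is embedded into the host:

```python
import numpy as np
import pywt  # PyWavelets

def embed(host: np.ndarray, watermark: np.ndarray, alpha: float = 0.05):
    """Additive alpha-embedding of a watermark into one DWT detail band of a
    grayscale host image (Daubechies wavelet). Alpha trades off invisibility
    against robustness; the nested two-watermark scheme of the paper applies
    the same operation twice."""
    cA, (cH, cV, cD) = pywt.dwt2(host.astype(float), "db2")
    wm = np.resize(watermark.astype(float), cH.shape)
    cH_marked = cH + alpha * wm
    return pywt.idwt2((cA, (cH_marked, cV, cD)), "db2")

host = np.random.rand(128, 128)
pn_sequence = np.sign(np.random.randn(64, 64))   # binary PN watermark
watermarked = embed(host, pn_sequence)
```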
5874 Transmission Expansion Planning with Economic Dispatch and N-1 Constraints
Authors: A. Charlangsut, M. Boonthienthong, N. Rugthaicharoencheep
Abstract:
This paper proposes a mathematical model for transmission expansion that employs an optimization method with a scenario analysis approach. Economic transmission planning seeks investment opportunities such that network expansions generate more economic benefits than costs. The approach can be used as a decision model for building new transmission lines added to the existing transmission system, minimizing the cost of the entire system subject to the various system constraints while considering transmission losses and N-1 contingency checking. The results show that the proposed model is efficient enough to be applied to larger-scale power system topologies.
Keywords: Transmission Expansion Planning, Economic Dispatch, Scenario Analysis, Contingency.
5873 Minimizing Examinee Collusion with a Latin-Square Treatment Structure
Authors: M. H. Omar
Abstract:
Cheating on standardized tests has been a major concern, as it potentially reduces measurement precision. One major way to reduce cheating by collusion is to administer multiple forms of a test; even with this approach, the potential for collusion remains quite large. A Latin-square treatment structure for distributing the multiple forms is proposed to further reduce the colluding potential, and an index to measure the extent of colluding potential is also proposed. Finally, with a simple algorithm, various Latin squares were explored to find the structure that keeps the colluding potential to a minimum.
Keywords: Colluding pairs, Scale for Colluding Potential, Latin-Square Structure, Minimization of Cheating.
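A minimal sketch of the idea, using a cyclic Latin square to distribute forms over a seating block and an illustrative adjacency-based index (the paper defines its own scale for colluding potential):

```python
import numpy as np

def latin_square(n: int) -> np.ndarray:
    """Cyclic n x n Latin square: every form appears once per row and column."""
    return (np.arange(n)[:, None] + np.arange(n)[None, :]) % n

def colluding_potential(seating: np.ndarray) -> int:
    """Illustrative index: the number of horizontally or vertically adjacent
    examinee pairs who received the same test form, i.e. the pairs for whom
    copying would pay off."""
    same_rows = int((seating[:, :-1] == seating[:, 1:]).sum())
    same_cols = int((seating[:-1, :] == seating[1:, :]).sum())
    return same_rows + same_cols

forms = latin_square(4)                       # 4 test forms over a 4x4 seating block
print(colluding_potential(forms))             # 0 adjacent same-form pairs

naive = np.tile(np.arange(4), (4, 1))         # one form per column, for comparison
print(colluding_potential(naive))             # many adjacent same-form pairs
```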
5872 Biodiesel Production from Palm Oil using Heterogeneous Base Catalyst
Authors: Sirichai Chantara-arpornchai, Apanee Luengnaruemitchai, Samai Jai-In
Abstract:
In this study, the transesterification of palm oil with methanol for biodiesel production was studied using CaO–ZnO as a heterogeneous base catalyst prepared by incipient-wetness impregnation (IWI) and co-precipitation (CP) methods. The reaction parameters considered were the molar ratio of methanol to oil, the amount of catalyst, the reaction temperature, and the reaction time. The optimum conditions were a 15:1 molar ratio of methanol to oil, a catalyst amount of 6 wt%, a reaction temperature of 60 °C, and a reaction time of 8 h. The effects of Ca loading, calcination temperature, and catalyst preparation method on the catalytic performance were studied. The fresh and spent catalysts were characterized by several techniques, including XRD, TPR, and XRF.
Keywords: CaO, ZnO, biodiesel, heterogeneous catalyst, trans-esterification.
5871 Decentralised Edge Authentication in the Industrial Enterprise IoT Space
Authors: C. P. Autry, A.W. Roscoe
Abstract:
Authentication protocols based on public key infrastructure (PKI) and trusted third parties (TTP) are no longer adequate for industrial-scale IoT networks, owing to issues such as low compute and power availability, the use of widely distributed and commercial off-the-shelf (COTS) systems, and the increasingly sophisticated attackers and attacks we now have to counter. For example, there is increasing concern about nation-state-based interference and future quantum computing capability. We have examined this space from first principles and have developed several approaches to group and point-to-point authentication for IoT that do not depend on the use of a centralised client-server model. We emphasise the use of quantum-resistant primitives such as strong cryptographic hashing and the use of multi-factor authentication.
Keywords: Authentication, enterprise IoT cybersecurity, public key infrastructure, trusted third party.
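As a generic illustration of hash-centred, PKI-free point-to-point authentication (not the authors' protocols), a simple HMAC challenge-response between two nodes that already share a symmetric key:

```python
import hashlib
import hmac
import os

# No PKI or trusted third party is involved: the verifier sends a fresh
# challenge and the prover answers with a keyed hash over it.
shared_key = os.urandom(32)

def respond(key: bytes, challenge: bytes) -> bytes:
    return hmac.new(key, challenge, hashlib.sha256).digest()

challenge = os.urandom(16)                        # verifier -> prover
response = respond(shared_key, challenge)         # prover -> verifier
assert hmac.compare_digest(response, respond(shared_key, challenge))
```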
5870 Search for New Design Elements in Time-Honoured Shops in Tainan—On Curriculum Practice about Culture Creative Industry
Authors: Ya-Ling Huang, Ming-Chun Tsai, Fan Hsu, Kai-Ru Hsieh
Abstract:
This paper mainly discusses the research and practice process of a laboratory curriculum by leading students to perform field investigation into time-honoured shops that have existed for more than 50 years in the downtown area of Tainan, Taiwan, and then search again for design elements and completing the design. The participants are juniors from the Department of Visual Communication Design, Kun Shan University. The duration of research and practice is two months. Operators of these shops are invited to jointly appraise the final achievements. 9 works out of 27 are chosen for final exhibition and commercialization.
Keywords: Culture creative industry, visual communication design, curriculum experimental.
5869 Production of Biodiesel from Roasted Chicken Fat and Methanol: Free Catalyst
Authors: Jorge Ramírez-Ortiz, Merced Martínez Rosales, Horacio Flores Zúñiga
Abstract:
Catalyst-free transesterification reactions between roasted chicken fat and methanol were carried out in a batch reactor at temperatures from 120 °C to 140 °C in order to produce biodiesel. Parameters related to the transesterification reactions, including temperature, time and the molar ratio of chicken fat to methanol, were also investigated. The maximum yield of 98% was obtained at 140 °C with a reaction time of 4 h and a chicken fat to methanol molar ratio of 1:31. The biodiesel thus obtained exhibited a viscosity of 6.3 mm2/s and a density of 895.9 kg/m3. The results show that this process can be a suitable choice for producing biodiesel since it does not use any catalyst; therefore, the neutralization and washing steps, which are indispensable in alkaline catalysis, are avoided.
Keywords: Biodiesel, non-catalyst, roasted chicken fat, transesterification.
5868 Proposal of Commutation Protocol in Hybrid Sensors and Vehicular Networks for Intelligent Transport Systems
Authors: Taha Bensiradj, Samira Moussaoui
Abstract:
Hybrid Sensors and Vehicular Networks (HSVN) represent a hybrid network that uses several generations of ad-hoc networks and is used especially in Intelligent Transport Systems (ITS). HSVN enables collaboration between a Wireless Sensor Network (WSN) deployed along the border of the road and the Vehicular Network (VANET). This collaboration is defined by messages exchanged between the two networks in order to inform drivers about the state of the road and to provide road safety information and further information about traffic on the road. Moreover, the collaboration created by HSVN allows one network to be used to improve the other. For example, disseminating information between sensors quickly depletes their energy, so vehicles, which have no energy constraint, can be used to disseminate information between sensors. On the other hand, to solve the disconnection problem in VANET, the sensors can be used as gateways that forward messages received from one vehicle to another. However, because of the short communication range of a sensor and its low storage and processing capacity, it is difficult to ensure the exchange of road messages between the sensor and a vehicle that may be moving at high speed during the exchange, i.e. during the time the vehicle is within communication range of the sensor. This work proposes a communication protocol between the sensors and the vehicles used in HSVN, whose purpose is to ensure the exchange of road messages within the available exchange time.
Keywords: HSVN, ITS, VANET, WSN.
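A minimal sketch of the "available time of exchange" constraint the protocol must respect, assuming a circular radio range and a straight road; the numeric values are illustrative:

```python
import math

def contact_time(radio_range_m: float, lateral_offset_m: float, speed_mps: float) -> float:
    """Time a vehicle spends inside a roadside sensor's circular radio range,
    for a straight road passing at a given lateral offset from the sensor.
    This is the window into which the road messages must fit."""
    chord = 2.0 * math.sqrt(max(radio_range_m ** 2 - lateral_offset_m ** 2, 0.0))
    return chord / speed_mps

def max_payload_bytes(radio_range_m, lateral_offset_m, speed_kmh, bitrate_bps):
    t = contact_time(radio_range_m, lateral_offset_m, speed_kmh / 3.6)
    return int(t * bitrate_bps / 8)

# e.g. a 100 m sensor range, vehicle at 130 km/h, 250 kbit/s sensor radio:
print(contact_time(100, 10, 130 / 3.6))          # roughly 5.5 s in range
print(max_payload_bytes(100, 10, 130, 250_000))  # upper bound on exchangeable data
```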
5867 Integration of Microarray Data into a Genome-Scale Metabolic Model to Study Flux Distribution after Gene Knockout
Authors: Mona Heydari, Ehsan Motamedian, Seyed Abbas Shojaosadati
Abstract:
Prediction of perturbations after genetic manipulation (especially gene knockout) is one of the important challenges in systems biology. In this paper, a new algorithm is introduced that integrates microarray data into a metabolic model. The algorithm was used to study the change in cell phenotype after knockout of the Gss gene in Escherichia coli BW25113. Implementation of the algorithm indicated that the gene deletion resulted in greater activation of the metabolic network. The growth yield was higher, and more and less regulated genes were identified for the mutant in comparison with the wild-type strain.
Keywords: Metabolic network, gene knockout, flux balance analysis, microarray data, integration.
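A minimal sketch of the flux balance analysis step underlying such predictions, on a toy three-reaction network (not the E. coli BW25113 model); a knockout is modelled by closing the corresponding reaction bounds:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix: metabolites A, B; reactions v1 (-> A),
# v2 (A -> B), v3 (B -> biomass sink).
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

def fba(knockouts=()):
    """Maximise the biomass reaction (v3) subject to steady state S.v = 0.
    A gene knockout is modelled by forcing its reaction flux to zero."""
    bounds = [(0, 10), (0, 1000), (0, 1000)]
    for r in knockouts:
        bounds[r] = (0, 0)
    res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2),
                  bounds=bounds, method="highs")
    return -res.fun

print(fba())               # wild-type growth
print(fba(knockouts=[1]))  # growth after deleting reaction v2
```

The paper's contribution is how the expression data constrain such a model; the sketch only shows the underlying linear programme.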
5866 Optimal Manufacturing Scheduling for Dependent Details Processing
Authors: Ivan C. Mustakerov, Daniela I. Borissova
Abstract:
The increasing competitiveness in the manufacturing industry is forcing manufacturers to seek effective processing schedules. The paper presents an optimization-based manufacturing scheduling approach for dependent details processing with given processing sequences and times on multiple machines. By defining the decision variables as the start and end moments of details processing, it is possible to use straightforward variable restrictions to satisfy different technological requirements and to formulate optimization tasks that are easy to understand and solve for various numbers of details and machines. A case study example is solved for seven base moldings for CNC metalworking machines processed on five different machines with a given processing order among details and machines and known processing time durations. As a result of solving the linear optimization task, the optimal manufacturing schedule minimizing the overall processing time is obtained. The manufacturing schedule defines the moments of molding delivery, thus minimizing storage costs, and ensures that mounting due times are satisfied. The proposed optimization approach is based on a real manufacturing plant problem; different processing schedule variants for different technological restrictions were defined and implemented in the practice of the Bulgarian company RAIS Ltd. The proposed approach could be generalized to other job shop scheduling problems in different applications.
Keywords: Optimal manufacturing scheduling, linear programming, metalworking machines production, dependent details processing.
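A minimal sketch of the formulation described, with start moments as decision variables, precedence and machine-order constraints, and makespan minimisation, on a two-detail, two-machine toy instance (the data are illustrative, not the RAIS Ltd. case):

```python
import numpy as np
from scipy.optimize import linprog

# Variables: start times sA, sB, sC, sD and the makespan Cmax.
# Detail 1: op A on M1 (3 h) then op B on M2 (2 h).
# Detail 2: op C on M1 (4 h) then op D on M2 (1 h); machine order A<C, B<D is given.
# Each row encodes lhs . x <= rhs.
A_ub = np.array([
    [ 1, -1,  0,  0,  0],   # sA + 3 <= sB   (detail 1 sequence)
    [ 1,  0, -1,  0,  0],   # sA + 3 <= sC   (given order on M1)
    [ 0,  0,  1, -1,  0],   # sC + 4 <= sD   (detail 2 sequence)
    [ 0,  1,  0, -1,  0],   # sB + 2 <= sD   (given order on M2)
    [ 0,  1,  0,  0, -1],   # sB + 2 <= Cmax
    [ 0,  0,  0,  1, -1],   # sD + 1 <= Cmax
], dtype=float)
b_ub = np.array([-3, -3, -4, -2, -2, -1], dtype=float)

res = linprog(c=[0, 0, 0, 0, 1], A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * 5, method="highs")
print(res.x)   # optimal start moments and the minimal overall processing time
```

Because the processing sequences are given, no integer ordering variables are needed and the task stays a plain linear programme, which is what keeps the formulation easy to understand and solve.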
5865 3D Star Skeleton for Fast Human Posture Representation
Authors: Sungkuk Chun, Kwangjin Hong, Keechul Jung
Abstract:
In this paper, we propose an improved 3D star skeleton technique, a skeletonization suitable for human posture representation that reflects the 3D information of the posture; moreover, the proposed technique is simple and can therefore be performed in real time. Existing skeleton construction techniques, such as distance transformation, Voronoi diagrams, and thinning, focus on the precision of the skeleton information and are therefore not applicable to real-time posture recognition, since they are computationally expensive and highly susceptible to boundary noise. Although a 2D star skeleton was proposed to address these problems, it also has limitations in describing the 3D information of the posture. To represent human posture effectively, the constructed skeleton should take the 3D information of the posture into account. The proposed 3D star skeleton contains the 3D data of the human body and focuses on human action and posture recognition. Our 3D star skeleton uses 8 projection maps which contain 2D silhouette information and depth data of the human surface, and the extremal points can be extracted as the features of the 3D star skeleton without searching the whole boundary of the object. Therefore, in terms of execution time, our 3D star skeleton is faster than the "greedy" 3D star skeleton that uses all the boundary points on the surface. Moreover, our method offers a more accurate skeleton of the posture than the existing star skeleton, since the 3D data of the object are taken into account. Additionally, we build a codebook, a collection of representative 3D star skeletons for 7 postures, to recognize the posture of a constructed skeleton.
Keywords: computer vision, gesture recognition, skeletonization, human posture representation.
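A minimal sketch of the classic star-skeleton step that the method builds on, extracting extremal points from one silhouette boundary; the paper extends this idea over 8 projection maps with depth data:

```python
import numpy as np

def star_extremal_points(boundary: np.ndarray, smooth: int = 5):
    """Classic 2D star-skeleton step: distance from the silhouette centroid to
    each boundary point, smoothed, then local maxima become extremal points
    (typically head, hands and feet for a human silhouette)."""
    centroid = boundary.mean(axis=0)
    d = np.linalg.norm(boundary - centroid, axis=1)
    kernel = np.ones(smooth) / smooth
    padded = np.r_[d[-smooth:], d, d[:smooth]]
    d_s = np.convolve(padded, kernel, "same")[smooth:-smooth]
    maxima = [i for i in range(len(d_s))
              if d_s[i] >= d_s[i - 1] and d_s[i] >= d_s[(i + 1) % len(d_s)]]
    return boundary[maxima], centroid

# Toy star-shaped boundary sampled on the unit circle.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
r = 1.0 + 0.4 * np.cos(5 * t)
pts, c = star_extremal_points(np.c_[r * np.cos(t), r * np.sin(t)])
```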
5864 Biospeckle Techniques in Quality Evaluation of Indian Fruits
Authors: MD Zaheer Ansari, A.K. Nirala
Abstract:
In this study, spatial-temporal speckle correlation techniques have been applied for the first time to the quality evaluation of three Indian fruits, namely apple, pear and tomato. The method is based on the analysis of variations in laser light scattered from biological samples. The results showed that the cross-correlation coefficients of the biospeckle patterns change subject to freshness and storage conditions. The biospeckle activity was determined by means of the cross-correlation functions of the intensity fluctuations, and significant changes in biospeckle activity were observed during the fruits' shelf lives. The study found that biospeckle activity decreases with shelf-life storage time and changes according to the fruits' respiration rates.
Keywords: Biospeckle, cross-correlation, respiration, shelf-life.
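A minimal sketch of the cross-correlation measure described, computed on a synthetic speckle stack; a faster decay of the coefficient indicates higher biospeckle activity:

```python
import numpy as np

def correlation_decay(frames: np.ndarray) -> np.ndarray:
    """Cross-correlation coefficient between the first speckle frame and every
    later frame; a rapid drop means high biospeckle activity (fresh tissue),
    a slow drop means low activity (ageing tissue)."""
    ref = frames[0].ravel()
    return np.array([np.corrcoef(ref, f.ravel())[0, 1] for f in frames])

# Synthetic stack: 50 frames of 64x64 speckle that gradually decorrelates.
rng = np.random.default_rng(1)
base = rng.random((64, 64))
stack = np.array([0.98 ** k * base + (1 - 0.98 ** k) * rng.random((64, 64))
                  for k in range(50)])
print(correlation_decay(stack)[[0, 10, 49]])
```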
5863 Performance Evaluation of Neural Network Prediction for Data Prefetching in Embedded Applications
Authors: Sofien Chtourou, Mohamed Chtourou, Omar Hammami
Abstract:
Embedded systems need to respect stringent real-time constraints. Various hardware components included in such systems, such as cache memories, exhibit variability and therefore affect execution time. Indeed, a cache memory access from an embedded microprocessor may result in a cache hit, where the data is available, or a cache miss, where the data must be fetched from an external memory with an additional delay. It is therefore highly desirable to predict future memory accesses during execution in order to prefetch data appropriately without incurring delays. In this paper, we evaluate the potential of several artificial neural networks for the prediction of instruction memory addresses. Neural networks have the potential to tackle the nonlinear behavior observed in memory accesses during program execution, and their numerous demonstrated hardware implementations favor this choice over traditional forecasting techniques for inclusion in embedded systems. However, embedded applications execute millions of instructions and therefore produce millions of addresses to be predicted. This very challenging problem of neural-network-based prediction of large time series is approached in this paper by evaluating various neural network architectures based on the recurrent neural network paradigm, with pre-processing based on the Self-Organizing Map (SOM) classification technique.
Keywords: Address, data set, memory, prediction, recurrent neural network.
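A minimal sketch of the prediction setup, assuming PyTorch, with address strides quantised into a small set of classes (standing in for the SOM pre-processing) and an Elman-style RNN predicting the next class; the architecture sizes are illustrative, not those evaluated in the paper:

```python
import torch
import torch.nn as nn

n_classes, hidden = 16, 32   # number of stride classes and RNN width (illustrative)

class AddressPredictor(nn.Module):
    """Next-address prediction framed as sequence classification over
    quantised address strides."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(n_classes, hidden)
        self.rnn = nn.RNN(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, seq) of stride classes
        h, _ = self.rnn(self.embed(x))
        return self.head(h[:, -1])        # logits for the next stride class

model = AddressPredictor()
seq = torch.randint(0, n_classes, (8, 20))        # toy batch of stride histories
logits = model(seq)
loss = nn.functional.cross_entropy(logits, torch.randint(0, n_classes, (8,)))
loss.backward()
```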
5862 A New Approach for Image Segmentation using Pillar-Kmeans Algorithm
Authors: Ali Ridho Barakbah, Yasushi Kiyoki
Abstract:
This paper presents a new approach to image segmentation by applying the Pillar K-means algorithm. The segmentation process includes a new mechanism for clustering the elements of high-resolution images in order to improve precision and reduce computation time. The system applies K-means clustering to the image segmentation after optimization by the Pillar algorithm. The Pillar algorithm considers the placement of pillars, which should be located as far as possible from each other to withstand the pressure distribution of a roof, as analogous to the placement of centroids within the data distribution. The algorithm is able to optimize K-means clustering for image segmentation in terms of precision and computation time. It designates the initial centroids' positions by calculating the accumulated distance metric between each data point and all previous centroids, and then selects the data points with the maximum distance as new initial centroids; in this way, all initial centroids are distributed according to the maximum accumulated distance metric. This paper evaluates the proposed approach for image segmentation by comparing it with the K-means and Gaussian Mixture Model algorithms, involving the RGB, HSV, HSL and CIELAB color spaces. The experimental results confirm the effectiveness of our approach in improving segmentation quality in terms of precision and computation time.
Keywords: Image segmentation, K-means clustering, Pillar algorithm, color spaces.
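A minimal sketch of the Pillar-style initialisation described (maximum accumulated distance to previously chosen centroids) followed by standard K-means; the outlier handling of the original Pillar algorithm is omitted:

```python
import numpy as np

def pillar_init(X: np.ndarray, k: int) -> np.ndarray:
    """Each new centroid is the data point with the maximum accumulated
    distance to all previously chosen centroids; the first centroid is the
    point farthest from the grand mean."""
    centroids = [X[np.linalg.norm(X - X.mean(axis=0), axis=1).argmax()]]
    acc = np.zeros(len(X))
    for _ in range(k - 1):
        acc += np.linalg.norm(X - centroids[-1], axis=1)
        centroids.append(X[acc.argmax()])
    return np.array(centroids)

def kmeans(X, k, iters=50):
    C = pillar_init(X, k)
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None] - C[None], axis=2).argmin(axis=1)
        C = np.array([X[labels == j].mean(axis=0) if np.any(labels == j) else C[j]
                      for j in range(k)])
    return labels, C

# For image segmentation, X would hold the per-pixel colour vectors (e.g. CIELAB).
labels, C = kmeans(np.random.rand(500, 3), k=4)
```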
5861 Removal of Heavy Metals from Rainwater in Batch Reactors with Sulphate Reducing Bacteria (SRB)
Authors: Abdulsalam I. Rafida
Abstract:
The main objective of this research was to investigate the biosorption capacity of biofilms of sulphate reducing bacteria (SRB) to remove heavy metals such as Zn, Pb and Cd from rainwater, using laboratory-scale reactors containing mixed support media. The evidence showed that biosorption contributed to the removal of heavy metals, including Zn, Pb and Cd, in the presence of SRB, and SRB were also found in the aqueous samples from the reactors. However, the SRB and the specific families (Desulfobacteriaceae and Desulfovibrionaceae) were found mainly in the biomass samples taken from all reactors at the end of the experiment. EDX analysis of the reactor solids at the end of the experiment showed that the heavy metals Zn, Pb and Cd had also accumulated in these precipitates.
Keywords: Sulphate reducing bacteria (SRB), biosorption capacity.
5860 Comparison between Separable and Irreducible Goppa Code in McEliece Cryptosystem
Authors: Thuraya M. Qaradaghi, Newroz N. Abdulrazaq
Abstract:
The McEliece cryptosystem is an asymmetric type of cryptography based on error-correcting codes. The classical McEliece scheme uses an irreducible binary Goppa code, which is still considered unbreakable, especially with the parameters [1024, 524, 101], but it suffers from a large public key matrix, which makes it difficult to use in practice. In this work, irreducible and separable Goppa codes are introduced, with flexible parameters and dynamic error vectors, and a comparison between separable and irreducible Goppa codes in the McEliece cryptosystem is carried out. For the encryption stage, to obtain a better basis for comparison, two types of test were chosen: in the first, the random message is kept constant while the parameters of the Goppa code are changed; in the second, the parameters of the Goppa code are kept constant (m=8 and t=10) while the random message is changed. The results show that the time needed to calculate the parity check matrix in the separable case is higher than in the irreducible McEliece cryptosystem, which is expected because an extra parity check matrix for g2(z) must be calculated in the decryption process of the separable type, whereas the time needed to execute the error locator in the decryption stage of the separable type is better than that of the irreducible type. The proposed implementation was done in Visual Studio C#.
Keywords: McEliece cryptosystem, Goppa code, separable, irreducible.
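For context, a toy illustration of the McEliece encryption step c = mG' + e over GF(2), with a random stand-in generator matrix; a real deployment uses a binary Goppa code such as [1024, 524] with t = 50 intentional errors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny random code purely to show the arithmetic, not a real Goppa code.
k, n, t = 4, 8, 1
G_prime = rng.integers(0, 2, (k, n))          # public, scrambled generator matrix
m = rng.integers(0, 2, k)                     # plaintext block
e = np.zeros(n, dtype=int)
e[rng.choice(n, t, replace=False)] = 1        # t intentional errors
c = (m @ G_prime + e) % 2                     # ciphertext
print(c)
```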
5859 Dam Operation Management Criteria during Floods: Case Study of Dez Dam in Southwest Iran
Authors: Ali Heidari
Abstract:
This paper presents principles for improving flood mitigation operation in multipurpose dams and maximizing reservoir performance during flood occurrence, with a focus on the real-time operation of gated spillways. The operation criteria include the safety of the dam during flood management, minimizing downstream flood risk by decreasing the flood hazard, and fulfilling water supply and the other purposes of dam operation over mid- and long-term horizons. The parameters deemed important include flood inflow, outlet capacity restrictions, downstream flood inundation damages, the economic revenue of dam operation, and environmental and sedimentation restrictions. A simulation model was used to determine the real-time release of the Dez Dam, located on the Dez River in southwest Iran, considering the gate regulation curves for the gated spillway. The results of the simulation model show that there is room to improve the procedures currently used in the real-time operation of the dam, particularly by using gate regulation curves and the results of an early flood forecasting system. The Dez Dam operation data show that, in one of the best flood control records, 17% of the total active volume and flood control pool of the reservoir was not used to decrease the downstream flood hazard, despite the availability of a flood forecasting system.
Keywords: Dam operation, flood control criteria, Dez Dam, Iran.
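A minimal sketch of the kind of gate-regulated reservoir mass balance such a simulation model performs; the operating rule, numbers and gate curve below are illustrative and are not Dez Dam data:

```python
def simulate_operation(inflows, storage0, s_max, q_safe, gate_curve, dt=3600.0):
    """Hourly mass balance of a gated reservoir during a flood: hold the
    release at the safe downstream discharge q_safe, store the excess in the
    flood control pool, and open the gates further (up to the gate regulation
    curve) only when the pool would otherwise be exceeded."""
    storage, releases = storage0, []
    for q_in in inflows:
        surplus = max(storage + (q_in - q_safe) * dt - s_max, 0.0) / dt
        q_out = min(gate_curve(storage), q_safe + surplus)
        storage = max(storage + (q_in - q_out) * dt, 0.0)
        releases.append(q_out)
    return releases

hydrograph = [500, 1500, 3000, 2500, 1200, 600]                 # inflow, m3/s
releases = simulate_operation(hydrograph, storage0=2.45e9, s_max=2.6e9,
                              q_safe=1000.0, gate_curve=lambda s: 6000.0)
print(releases)   # the peak inflow of 3000 m3/s is attenuated to q_safe
```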
5858 DWM-CDD: Dynamic Weighted Majority Concept Drift Detection for Spam Mail Filtering
Authors: Leili Nosrati, Alireza Nemaney Pour
Abstract:
Although e-mail is the most efficient and popular communication method, unwanted and mass unsolicited e-mails, also called spam mail, endanger the existence of the mail system. This paper proposes a new algorithm called Dynamic Weighted Majority Concept Drift Detection (DWM-CDD) for content-based filtering. The design purposes of DWM-CDD are, first, to improve the accuracy of previously proposed algorithms and, second, to reduce the time needed to construct the model. The results show that DWM-CDD can detect both sudden and gradual changes quickly and accurately, and that the time needed for model construction is less than that of previously proposed algorithms.
Keywords: Concept drift, Content-based filtering, E-mail, Spam mail.
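For context, a minimal sketch of the classic Dynamic Weighted Majority step that DWM-CDD builds on (weighted vote, weight decay on error, expert removal and creation); the concept-drift detection layer added by DWM-CDD is not shown, and the trivial expert class is only there to make the sketch self-contained:

```python
class MajorityExpert:
    """Trivial incremental learner standing in for a real base classifier."""
    def __init__(self): self.counts = {}
    def predict(self, x): return max(self.counts, key=self.counts.get) if self.counts else 0
    def partial_fit(self, x, y): self.counts[y] = self.counts.get(y, 0) + 1

def dwm_step(experts, weights, x, y, beta=0.5, theta=0.01):
    """One DWM step: weighted vote, multiply wrong experts' weights by beta,
    drop experts whose weight falls below theta, add a fresh expert when the
    ensemble errs, then train every expert incrementally."""
    preds = [e.predict(x) for e in experts]
    votes = {}
    for p, w in zip(preds, weights):
        votes[p] = votes.get(p, 0.0) + w
    y_hat = max(votes, key=votes.get)
    weights = [w * beta if p != y else w for p, w in zip(preds, weights)]
    pairs = [(e, w) for e, w in zip(experts, weights) if w >= theta] or [(MajorityExpert(), 1.0)]
    experts, weights = map(list, zip(*pairs))
    if y_hat != y:
        experts.append(MajorityExpert())
        weights.append(1.0)
    for e in experts:
        e.partial_fit(x, y)
    return experts, weights, y_hat

experts, weights = [MajorityExpert()], [1.0]
for x, y in [("cheap pills", 1), ("meeting at 10", 0), ("win money now", 1)]:
    experts, weights, pred = dwm_step(experts, weights, x, y)
```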
5857 String Matching using Inverted Lists
Authors: Chouvalit Khancome, Veera Boonjing
Abstract:
This paper proposes a new solution to the string matching problem. The solution constructs an inverted list representing the string pattern to be searched for and then uses a new algorithm to process an input string in a single pass. The preprocessing phase has 1) time complexity O(m) and 2) space complexity O(1), where m is the length of the pattern. The searching phase takes 1) O(m+α) time in the average case, 2) O(n/m) in the best case, and 3) O(n) in the worst case, where α is the number of comparisons leading to mismatches and n is the length of the input text.
Keywords: String matching, inverted list, inverted index, pattern, algorithm.
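One plausible way to drive single-pass matching from an inverted list of pattern character positions is sketched below; this is an illustration of the general idea, not necessarily the authors' exact algorithm or its stated complexity bounds:

```python
from collections import defaultdict

def inverted_list_match(text: str, pattern: str):
    """Single-pass matching driven by an inverted list mapping each pattern
    character to its positions: every text character votes for the alignments
    it is consistent with, and an alignment that collects len(pattern) votes
    is a full match."""
    m = len(pattern)
    inv = defaultdict(list)
    for j, ch in enumerate(pattern):        # preprocessing over the pattern
        inv[ch].append(j)
    votes = defaultdict(int)
    matches = []
    for i, ch in enumerate(text):           # one pass over the text
        for j in inv.get(ch, ()):
            start = i - j
            if start >= 0:
                votes[start] += 1
                if votes[start] == m:
                    matches.append(start)
    return matches

print(inverted_list_match("abracadabra", "abra"))   # [0, 7]
```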
5856 Possibilities of Mathematical Modelling of Explosive Substance Aerosol and Vapour Dispersion in the Atmosphere
Authors: A. Bumbová, J. Kellner, J. Navrátil, D. Pluskal, M. Kozubková, E. Kozubek
Abstract:
The paper deals with the possibilities of modelling the propagation of explosive substance vapours in the FLUENT software. Given the very low vapour tensions of explosive substances, the approach was verified experimentally using mononitrotoluene as an example. Either constant or time-variable meteorological conditions were used for the calculation. Further, it was verified that the emission source may be time-dependent, reflecting a real situation, or that the release rate may be constant. The execution of the experiment as well as its evaluation were straightforward, and the approach could also be used for modelling the vapour and aerosol propagation of selected explosive substances in the atmospheric boundary layer.
Keywords: atmospheric boundary layer, explosive substances, FLUENT software, modelling of propagation.