Search results for: Mitigation Techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2642

2042 A Normalization-based Robust Image Watermarking Scheme Using SVD and DCT

Authors: Say Wei Foo, Qi Dong

Abstract:

Digital watermarking is one of the techniques for copyright protection. In this paper, a normalization-based robust image watermarking scheme that combines singular value decomposition (SVD) and discrete cosine transform (DCT) techniques is proposed. In the proposed scheme, the host image is first normalized to a standard form and divided into non-overlapping image blocks. SVD is applied to each block. By concatenating the first singular values (SVs) of adjacent blocks of the normalized image, an SV block is obtained. DCT is then carried out on the SV blocks to produce SVD-DCT blocks. A watermark bit is embedded in the high-frequency band of an SVD-DCT block by imposing a particular relationship between two pseudo-randomly selected DCT coefficients. An adaptive frequency mask is used to adjust the local watermark embedding strength. Watermark extraction involves mainly the inverse process; the extracting method is blind and efficient. Experimental results show that the quality degradation of the watermarked image caused by the embedded watermark is visually transparent. Results also show that the proposed scheme is robust against various image processing operations and geometric attacks.

Keywords: Image watermarking, Image normalization, Singular value decomposition, Discrete cosine transform, Robustness.
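
The core embedding step, forcing an order relationship between two pseudo-randomly chosen high-frequency DCT coefficients, can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the coefficient band, margin, and key handling are assumptions.

```python
import numpy as np
from scipy.fft import dct, idct

def embed_bit(sv_block, bit, key, margin=2.0):
    """Embed one watermark bit in the high-frequency band of a 1-D SV block
    by imposing an order relationship between two DCT coefficients."""
    rng = np.random.default_rng(key)                 # shared watermark key
    coeffs = dct(sv_block, norm='ortho')
    band = np.arange(len(coeffs) // 2, len(coeffs))  # high-frequency band (assumed)
    i, j = rng.choice(band, size=2, replace=False)   # pseudo-random coefficient pair
    lo, hi = sorted((coeffs[i], coeffs[j]))
    if bit == 1:    # enforce coeffs[i] > coeffs[j] by at least `margin`
        coeffs[i], coeffs[j] = hi + margin / 2, lo - margin / 2
    else:           # enforce coeffs[i] < coeffs[j]
        coeffs[i], coeffs[j] = lo - margin / 2, hi + margin / 2
    return idct(coeffs, norm='ortho')

def extract_bit(sv_block, key):
    """Blind extraction: regenerate the same pair from the key and compare."""
    rng = np.random.default_rng(key)
    coeffs = dct(sv_block, norm='ortho')
    band = np.arange(len(coeffs) // 2, len(coeffs))
    i, j = rng.choice(band, size=2, replace=False)
    return 1 if coeffs[i] > coeffs[j] else 0
```

Because both sides derive the coefficient pair from the shared key alone, extraction needs neither the original image nor the watermark, which is what makes the method blind.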

2041 A Frugal Bidding Procedure for Replicating WWW Content

Authors: Samee Ullah Khan, C. Ardil

Abstract:

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, to certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize overall performance. A bidding mechanism asks how one can design systems so that agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions, and genetic algorithms.

Keywords: Internet, data content replication, static allocation, mechanism design, equilibrium.
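
As a rough illustration of the auction idea, not the paper's actual mechanism, the sketch below allocates a replica to the site whose agent reports the highest benefit while charging the second-highest bid (Vickrey pricing), a standard way to discourage strategic overbidding by selfish agents.

```python
def allocate_replica(bids):
    """bids: {site: reported benefit of hosting the replica}.
    Returns (winner, price). Second-price payment makes truthful
    bidding a dominant strategy, curbing the agents' selfishness."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

# Example: three site agents bid their expected latency savings.
print(allocate_replica({"siteA": 12.5, "siteB": 9.0, "siteC": 14.2}))
# -> ('siteC', 12.5): siteC wins but pays only the second-highest bid.
```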

2040 Genetic Programming: Principles, Applications and Opportunities for Hydrological Modelling

Authors: Oluwaseun K. Oyebode, Josiah A. Adeyemo

Abstract:

Hydrological modelling plays a crucial role in the planning and management of water resources, most especially in water-stressed regions where the need to effectively manage the available water resources is of critical importance. However, due to the complex, nonlinear and dynamic behaviour of hydro-climatic interactions, achieving reliable modelling of water resource systems and accurate projection of hydrological parameters are extremely challenging. Although a significant number of modelling techniques (process-based and data-driven) have been developed and adopted in that regard, the field of hydrological modelling is still considered one that has progressed sluggishly over the past decades. This is largely a result of the degree of uncertainty identified in the methodologies and results of the techniques adopted. In recent times, evolutionary computation (EC) techniques have been developed and introduced in response to the search for efficient and reliable means of providing accurate solutions to hydrological problems. This paper presents a comprehensive review of the underlying principles, methodological needs and applications of a promising evolutionary computation modelling technique, genetic programming (GP). It examines the specific characteristics of the technique which make it suitable for solving hydrological modelling problems. It discusses the opportunities inherent in the application of GP in water-related studies such as rainfall estimation, rainfall-runoff modelling, streamflow forecasting, sediment transport modelling, water quality modelling and groundwater modelling, among others. Furthermore, the means by which such opportunities could be harnessed in the near future are discussed. In all, a case is made for the full embrace of GP and its variants in hydrological modelling studies, so as to put in place strategies that would translate into meaningful progress in the modelling of water resource systems and positively influence decision-making by relevant stakeholders.

Keywords: Computational modelling, evolutionary algorithms, genetic programming, hydrological modelling.
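
To make the technique concrete, here is a minimal GP-style sketch: expression trees over a few arithmetic primitives are generated at random and scored by mean squared error against observations (e.g., runoff given rainfall). The primitive set, depth, and toy data are assumptions for illustration; a full GP run would add selection, crossover, and mutation on top of this.

```python
import random, operator

PRIMS = {'+': operator.add, '-': operator.sub, '*': operator.mul,
         '/': lambda a, b: a / b if abs(b) > 1e-9 else 1.0}  # protected division
TERMS = ['x', 1.0, 2.0]

def random_tree(depth=3):
    """Grow a random expression tree as nested (op, left, right) tuples."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    op = random.choice(list(PRIMS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return PRIMS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree, data):
    """Mean squared error of the tree over (input, target) pairs."""
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data) / len(data)

# Toy rainfall-runoff pairs; pick the best of many random trees.
data = [(1.0, 0.4), (2.0, 1.1), (3.0, 1.9), (4.0, 2.8)]
best = min((random_tree() for _ in range(5000)), key=lambda t: fitness(t, data))
print(best, fitness(best, data))
```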

2039 A Materialized View Approach to Support Aggregation Operations over Long Periods in Sensor Networks

Authors: Minsoo Lee, Julee Choi, Sookyung Song

Abstract:

The increasing interest in processing data created by sensor networks has evolved into approaches that implement sensor networks as databases. The aggregation operator, which calculates a value from a large group of data, such as an average or a sum, is an essential function that needs to be provided when implementing such sensor network databases. This work proposes adding a DURING clause to TinySQL to calculate values over a specific long period, and suggests a way to implement the aggregation service in sensor networks by applying the materialized view and incremental view maintenance techniques used in data warehouses. In sensor networks, data values are passed from child nodes to parent nodes, and an aggregation value is computed at the root node. Because such root nodes need to be memory-efficient and low-powered, recomputing aggregate values from all past and current data becomes a problem. Applying incremental view maintenance techniques can therefore reduce memory consumption and support fast computation of aggregate values.

Keywords: Aggregation, Incremental View Maintenance, Materialized view, Sensor Network.
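
A minimal sketch of the incremental-view idea for an average aggregate: instead of rescanning all past readings, the root keeps a materialized (sum, count) pair and folds each new reading in. The class and names are illustrative, not the paper's API.

```python
class MaterializedAvg:
    """Materialized view of AVG over a long period, maintained
    incrementally so the root never rescans past readings."""
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def insert(self, value):          # incremental view maintenance step
        self.total += value
        self.count += 1

    @property
    def average(self):
        return self.total / self.count if self.count else None

# Readings stream in from child nodes; root memory stays O(1).
view = MaterializedAvg()
for reading in [21.5, 22.0, 21.8, 23.1]:
    view.insert(reading)
print(view.average)   # 22.1
```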

2038 Contrast Enhancement of Color Images with Color Morphing Approach

Authors: Javed Khan, Aamir Saeed Malik, Nidal Kamel, Sarat Chandra Dass, Azura Mohd Affandi

Abstract:

Low-contrast images can result from wrong image acquisition settings or poor illumination conditions. Such images may not be visually appealing and can be difficult for feature extraction. Contrast enhancement of color images can be useful for visual inspection in the medical area. In this paper, a new technique is proposed to improve the contrast of color images. The RGB (red, green, blue) color image is transformed into normalized RGB color space. The adaptive histogram equalization (AHE) technique is applied to each of the three channels of the normalized RGB color space. The corresponding channels in the original (low-contrast) image and in the AHE-enhanced image are morphed together in proper proportions. The proposed technique is tested on seventy color images of acne patients. The results are analyzed using cumulative variance and contrast improvement factor measures, and are also compared with decorrelation stretch. Both subjective and quantitative analyses demonstrate that the proposed technique outperforms the other techniques.

Keywords: Contrast enhancement, normalized RGB, adaptive histogram equalization, cumulative variance.
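
The pipeline can be sketched with OpenCV, using CLAHE as the adaptive histogram equalization step; the normalization handling and the fixed morphing weight below are assumptions for illustration, not the paper's tuned proportions.

```python
import cv2
import numpy as np

def enhance_color(img_bgr, alpha=0.6):
    """Normalize RGB, apply adaptive histogram equalization (CLAHE)
    per channel, then morph the enhanced image with the original
    in proportion alpha (assumed weight)."""
    img = img_bgr.astype(np.float32)
    s = img.sum(axis=2, keepdims=True) + 1e-6
    norm = img / s                                    # normalized RGB channels
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    channels = []
    for c in range(3):
        ch8 = (norm[:, :, c] * 255).astype(np.uint8)  # CLAHE needs 8-bit input
        channels.append(clahe.apply(ch8).astype(np.float32) / 255)
    enhanced = np.stack(channels, axis=2) * s         # back to original scale
    morphed = (1 - alpha) * img + alpha * enhanced    # color morphing step
    return np.clip(morphed, 0, 255).astype(np.uint8)

out = enhance_color(cv2.imread("acne_sample.jpg"))    # hypothetical input file
```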

2037 Secure Hashing Algorithm and Advance Encryption Algorithm in Cloud Computing

Authors: Jaimin Patel

Abstract:

Cloud computing is one of the most significant developments across computing technologies. It provides flexibility to users, cost effectiveness, location independence, easy maintenance, multitenancy, drastic performance improvements, and increased productivity. On the other hand, there are also major issues, security among them. Because the cloud is a shared server environment, security is a major concern; it is important to protect users' private data, especially in e-commerce and social networks. In this paper, encryption algorithms such as the Advanced Encryption Standard (AES), their vulnerabilities, risk of attacks, optimal time and complexity management, and comparisons with other algorithms based on software implementation are presented. Encryption techniques to improve the performance of AES and to reduce risk are given. Secure Hash Algorithms, their vulnerabilities, software implementations, and risk of attacks are discussed, along with comparisons with other hashing algorithms and the advantages and disadvantages of hashing versus encryption.

Keywords: Cloud computing, encryption algorithm, secure hashing algorithm, brute force attack, birthday attack, plaintext attack, man-in-the-middle attack.
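
For orientation, a minimal sketch of the two primitives the paper compares, using Python's hashlib for SHA-256 and the pyca/cryptography package for AES-GCM; key handling is simplified for illustration.

```python
import os
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

data = b"user's private record"

# Integrity: a SHA-256 digest detects tampering but cannot be reversed.
digest = hashlib.sha256(data).hexdigest()

# Confidentiality: AES-256 in GCM mode both encrypts and authenticates.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)                     # never reuse a nonce with one key
ct = AESGCM(key).encrypt(nonce, data, None)
assert AESGCM(key).decrypt(nonce, ct, None) == data

print(digest[:16], len(ct))
```

The pairing illustrates the hashing-versus-encryption trade-off discussed above: the digest proves integrity but reveals nothing recoverable, while the ciphertext is recoverable only with the key.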

2036 HaskellFL: A Tool for Detecting Logical Errors in Haskell

Authors: Vanessa Vasconcelos, Mariza A. S. Bigonha

Abstract:

Understanding and using the functional paradigm is a challenge for many programmers. Looking for logical errors in code may take a lot of a developer's time when a program grows in size. To facilitate both processes, this paper presents HaskellFL, a tool that uses fault localization techniques to locate logical errors in Haskell code. The Haskell subset used in this work is sufficiently expressive for those studying functional programming to get immediate help debugging their code and to answer questions about key concepts associated with the functional paradigm. HaskellFL was tested against functional programming assignments submitted by students enrolled in the Functional Programming class at the Federal University of Minas Gerais and against exercises from the Exercism Haskell track that are publicly available on GitHub. This work also evaluated the effectiveness of two fault localization techniques, Tarantula and Ochiai, in the Haskell context. The EXAM score was chosen to evaluate the tool's effectiveness, and results showed that HaskellFL reduced the effort needed to locate an error in all tested scenarios. The results also showed that the Ochiai method was more effective than Tarantula.

Keywords: Debug, fault localization, functional programming, Haskell.
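
The two suspiciousness metrics evaluated here have standard closed forms; below is a minimal sketch of both (in Python for brevity, not the tool's Haskell implementation). For a code line, ef/ep count failing/passing tests that execute it and nf/np count those that do not.

```python
import math

def tarantula(ef, ep, nf, np_):
    """Tarantula suspiciousness of a line given spectrum counts."""
    fail = ef / (ef + nf) if ef + nf else 0.0
    pas = ep / (ep + np_) if ep + np_ else 0.0
    return fail / (fail + pas) if fail + pas else 0.0

def ochiai(ef, ep, nf):
    """Ochiai suspiciousness; tends to rank faulty lines higher."""
    denom = math.sqrt((ef + nf) * (ef + ep))
    return ef / denom if denom else 0.0

# A line executed by 4 of 5 failing tests but only 1 of 20 passing
# tests scores high under both metrics, so it is inspected first.
print(tarantula(4, 1, 1, 19), ochiai(4, 1, 1))   # ~0.94, 0.80
```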

2035 Advanced Geolocation of IP Addresses

Authors: Robert Koch, Mario Golling, Gabi Dreo Rodosek

Abstract:

Tracing and locating the geographical location of users (geolocation) is used extensively in today's Internet. Whenever we request a page from Google, for example, we are automatically forwarded - unless a specific configuration was made - to the page in the relevant language and, among other things, shown commercials specific to our identified location. Geolocation has a significant impact especially within the area of network security. Because of the way the Internet works, attacks can be executed from almost everywhere. Therefore, for attribution, knowledge of the origin of an attack - and thus geolocation - is mandatory in order to be able to trace back an attacker. In addition, geolocation can also be used very successfully to increase the security of a network during operation (i.e., before an intrusion has actually taken place). Similar to greylisting in email, geolocation makes it possible to (i) correlate detected attacks with new connections and (ii) consequently classify traffic a priori as more suspicious (in particular, allowing such traffic to be inspected in more detail). Although numerous techniques for geolocation exist, each strategy is subject to certain restrictions. Following the ideas of Endo et al., this publication tries to overcome these shortcomings with a combined solution of different methods to allow improved and optimized geolocation. Thus, we present our architecture for improved geolocation, designing a new algorithm that combines several geolocation techniques to increase accuracy.

Keywords: IP geolocation, prosecution of computer fraud, attack attribution, target-analysis.
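
One simple way to combine heterogeneous geolocation estimates, shown here as a placeholder for the paper's unspecified algorithm, is a confidence-weighted centroid, where each method's estimate is weighted by the inverse of its reported error radius.

```python
def combine_estimates(estimates):
    """estimates: list of (lat, lon, error_km) tuples from different
    geolocation techniques (e.g., WHOIS, RTT measurement, DNS hints).
    Weight each estimate by 1/error so precise methods dominate."""
    weights = [1.0 / max(err, 1e-3) for _, _, err in estimates]
    total = sum(weights)
    lat = sum(w * e[0] for w, e in zip(weights, estimates)) / total
    lon = sum(w * e[1] for w, e in zip(weights, estimates)) / total
    return lat, lon

# WHOIS suggests Munich +/- 200 km; delay measurement says +/- 30 km.
print(combine_estimates([(48.14, 11.58, 200.0), (48.35, 11.79, 30.0)]))
```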

2034 Effectiveness of Infrastructure Flood Control Due to Development Upstream Land Use: Case Study of Ciliwung Watershed

Authors: Siti Murniningsih, Evi Anggraheni

Abstract:

Various infrastructures such as dams, flood control dams, and reservoirs were developed from the 19th century through the 20th century. These infrastructures are very effective in controlling river flows and preventing inundation in urban areas prone to flooding. Flooding in urban areas often has a large impact, affecting every aspect of life as well as the environment. The Ciliwung is one of the rivers that allegedly contributes to the flooding problems in Jakarta; various engineering works have been carried out on the Ciliwung river to help control the flooding, among them the construction of the Ciawi Dam and Sukamahi Dam. In this research, the flood hydrograph is calculated with the Nakayasu method, while flood routing for the case study is computed using the level pool routine. The effectiveness of these dams can be identified by comparing a flood simulation of the existing condition with a flood simulation after dam construction. The final goal of this study is to determine the effectiveness of flood mitigation infrastructure located in the upstream area in reducing the volume of flooding in Jakarta.

Keywords: Effectiveness, flood simulation, infrastructure flooding, level pool routine.
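
The level pool routine referred to above solves the reservoir continuity equation step by step. A minimal sketch, assuming a linear reservoir S = k*O so the update has a closed form (real dams use a tabulated storage-outflow curve and the storage-indication method):

```python
def level_pool_route(inflow, k=3.0, dt=1.0, o0=0.0):
    """Route an inflow hydrograph through a reservoir (level pool method).
    With linear storage S = k*O, the continuity equation
    (I1+I2)/2 - (O1+O2)/2 = (S2-S1)/dt rearranges to
    O2 = ((I1+I2) + (2k/dt - 1)*O1) / (2k/dt + 1)."""
    out = [o0]
    c = 2.0 * k / dt
    for i1, i2 in zip(inflow, inflow[1:]):
        out.append(((i1 + i2) + (c - 1.0) * out[-1]) / (c + 1.0))
    return out

# The flood peak is attenuated and delayed by the reservoir.
hydrograph = [0, 20, 80, 60, 30, 10, 0, 0]   # m^3/s per time step (toy values)
print([round(q, 1) for q in level_pool_route(hydrograph)])
```

Comparing routed peaks with and without the dam storage is exactly the effectiveness comparison the abstract describes, only at toy scale.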

2033 Importance of Public Communication Campaigns and Art Activities in Social Education

Authors: Bilgehan Gültekin, Tuba Gültekin

Abstract:

Universities play an important role in social education in many respects. In terms of creating awareness and convincing the public about social issues, universities take a leading position. The best way to secure public support for social education is to develop public communication campaigns. The aim of this study is to present a public communication model that can guide social education practices. The study, titled "Importance of Public Communication Campaigns and Art Activities in Social Education", is based on the following topics: effects of public communication campaigns on social education; public relations techniques for education; communication strategies; steps of public relations campaigns in social education; crafting persuasive messages for public communication campaigns; and developing artistic messages and organizing art activities in social education. In addition to these topics, media planning for social education, forming a team of campaign managers, dialogue with opinion leaders in education, and preparing creative communication models for social education are taken into consideration. This study also critically examines social education case studies in Turkey. At the same time, some communicative methods and principles are given in the light of communication campaigns within the context of this study.

Keywords: Art activities in social education, Persuasive communication, Public communication campaigns, Public relations techniques for education.

2032 Carbon-Based Electrochemical Detection of Pharmaceuticals from Water

Authors: M. Ardelean, F. Manea, A. Pop, J. Schoonman

Abstract:

The presence of pharmaceuticals in the environment, and especially in water, has gained increasing attention. They belong to an emerging class of pollutants, and for most of them legal limits have not been set, either because their impact on human health and ecosystems has not been determined or because no advanced analytical method for their quantification exists. In this context, the development of advanced analytical methods for the quantification of pharmaceuticals in water is required. Electrochemical methods are known to exhibit great potential for high-performance analysis, but their performance is directly related to the electrode material and the operating techniques. In this study, two types of carbon-based electrode materials, boron-doped diamond (BDD) and carbon nanofiber (CNF)-epoxy composite electrodes, were investigated through voltammetric techniques for the detection of naproxen in water. The comparative electrochemical behavior of naproxen (NPX) on both BDD and CNF electrodes was studied by cyclic voltammetry, and a well-defined peak corresponding to NPX oxidation was found for each electrode. NPX oxidation occurred at a potential of about +1.4 V/SCE (saturated calomel electrode) on the BDD electrode and at about +1.2 V/SCE on the CNF electrode. The sensitivities for NPX detection were similar for both carbon-based electrodes; thus, the CNF electrode was superior with respect to the detection potential. Differential-pulsed voltammetry (DPV) and square-wave voltammetry (SWV) techniques were exploited to improve the electroanalytical performance for NPX detection, and the best results, a sensitivity of 9.959 µA·µM-1, were achieved using DPV. In addition, the simultaneous detection of NPX and fluoxetine, a very common antidepressant drug also present in water, was studied using the CNF electrode, and very good results were obtained. The detection potential values, which allowed a good separation of the detection signals, together with the good sensitivities, were appropriate for the simultaneous detection of both tested pharmaceuticals. These results establish the CNF electrode as a valuable tool for the individual or simultaneous detection of pharmaceuticals in water.

Keywords: Boron-doped diamond electrode, carbon nanofiber-epoxy composite electrode, emerging pollutants, pharmaceuticals.

2031 A Fast Replica Placement Methodology for Large-scale Distributed Computing Systems

Authors: Samee Ullah Khan, C. Ardil

Abstract:

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, to certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize overall performance. A bidding mechanism asks how one can design systems so that agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions, and genetic algorithms.

Keywords: Data replication, auctions, static allocation, pricing.

2030 Questions Categorization in E-Learning Environment Using Data Mining Technique

Authors: Vilas P. Mahatme, K. K. Bhoyar

Abstract:

Nowadays, education cannot be imagined without digital technologies, which broaden the horizons of teaching and learning processes. Several universities offer online courses. For evaluation purposes, e-examination systems are being widely adopted in academic environments, and multiple-choice tests are extremely popular. In the move from traditional examinations to e-examinations, Moodle is being used as a Learning Management System (LMS). Moodle logs every click that students make for attempting and navigating an e-examination. Data mining has been applied in various domains, including retail sales and bioinformatics, and in recent years there has been increasing interest in its use in e-learning environments, where it has been applied to discover, extract, and evaluate parameters related to students' learning performance. The combination of data mining and e-learning is still in its infancy. Log data generated by students during online examinations can be used to discover knowledge with the help of data mining techniques. In web-based applications, the number of right and wrong answers in a test result is not sufficient to assess and evaluate student performance, so assessment techniques must be intelligent: if a student cannot answer the question asked by the instructor, an easier question can be asked; otherwise, a more difficult question on a similar topic can be posed. To do so, it is necessary to identify the difficulty level of the questions, and the proposed work concentrates on this issue. Clustering, a specific data mining technique, is used to decide the difficulty levels of questions and categorize them as tough, easy, or moderate, so that questions can later be served to students based on their performance. The proposed experiment categorizes the question set and also groups the students based on their examination performance, which helps the instructor guide students more specifically. In short, the mined knowledge helps to support, guide, facilitate, and enhance learning as a whole.

Keywords: Data mining, e-examination, e-learning, moodle.
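
A minimal sketch of the clustering step, assuming each question is summarized by two features derived from Moodle logs (fraction of correct attempts and mean response time); the feature choice, data, and labels are illustrative, not the paper's.

```python
import numpy as np
from sklearn.cluster import KMeans

# One row per question: [fraction answered correctly, mean seconds spent].
X = np.array([[0.92, 25], [0.88, 30], [0.55, 70],
              [0.50, 80], [0.15, 120], [0.20, 110]])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

# Rank clusters by mean correctness: highest -> easy, lowest -> tough.
order = np.argsort(-km.cluster_centers_[:, 0])
names = {c: n for c, n in zip(order, ["easy", "moderate", "tough"])}
print([names[c] for c in km.labels_])
# Questions can now be served according to each student's performance.
```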

2029 The Effect of Addition of Dioctyl Terephthalate and Calcite on the Tensile Properties of Organoclay/Linear Low Density Polyethylene Nanocomposites

Authors: A. Gürses, Z. Eroğlu, E. Şahin, K. Güneş, Ç. Doğar

Abstract:

In recent years, polymer/clay nanocomposites have generated great interest in the polymer industry as a new type of composite material because of their superior properties, which include high heat deflection temperature, gas barrier performance, dimensional stability, enhanced mechanical properties, optical clarity, and flame retardancy when compared with the pure polymer or conventional composites. This study investigates the change in the tensile properties of organoclay/linear low density polyethylene (LLDPE) nanocomposites with the use of dioctyl terephthalate (DOTP) as plasticizer and calcite as filler. The synthesized composites and organoclay were characterized using techniques such as XRD, HRTEM and FTIR. The spectroscopic results indicate that the organoclay platelets were well dispersed within the polymeric matrix. The tensile properties of the composites were compared by considering the stress-strain curve drawn for each composite and for the pure polymer. It was observed that the composites prepared by adding the plasticizer at different ratios and a certain amount of calcite exhibited different tensile behaviors compared to the pure polymer.

Keywords: Linear low density polyethylene, nanocomposite, organoclay, plasticizer.

2028 Hybrid Intelligent Intrusion Detection System

Authors: Norbik Bashah, Idris Bharanidharan Shanmugam, Abdul Manan Ahmed

Abstract:

Intrusion Detection Systems are increasingly a key part of systems defense. Various approaches to intrusion detection are currently being used, but they are relatively ineffective. Artificial Intelligence plays a driving role in security services. This paper proposes a dynamic model of an intelligent Intrusion Detection System, based on a specific AI approach for intrusion detection. The techniques being investigated include neural networks and fuzzy logic with network profiling, which uses simple data mining techniques to process the network data. The proposed system is a hybrid that combines anomaly, misuse, and host-based detection. Simple fuzzy rules allow us to construct if-then rules that reflect common ways of describing security attacks. For host-based intrusion detection, we use neural networks along with self-organizing maps. Suspicious intrusions can be traced back to their original source path, and any future traffic from that particular source can be redirected back to it. Both network traffic and system audit data are used as inputs to the two detection components.

Keywords: Intrusion Detection, Network Security, Data mining, Fuzzy Logic.
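
To illustrate how simple fuzzy if-then rules can express an attack description, the sketch below encodes one such rule with piecewise-linear memberships; the features, breakpoints, and rule are invented for demonstration, not taken from the paper's rule base.

```python
def high(x, lo, hi):
    """Fuzzy membership for 'high': 0 below lo, 1 above hi, linear between."""
    return min(1.0, max(0.0, (x - lo) / (hi - lo)))

def port_scan_score(conn_rate, distinct_ports):
    """Rule: IF connection rate is high AND distinct ports is high
    THEN port-scan suspicion is high (AND realized as min)."""
    return min(high(conn_rate, 10, 100), high(distinct_ports, 5, 50))

print(port_scan_score(conn_rate=80, distinct_ports=40))   # ~0.78: suspicious
print(port_scan_score(conn_rate=12, distinct_ports=2))    # 0.0: benign
```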

2027 EEIA: Energy Efficient Indexed Aggregation in Smart Wireless Sensor Networks

Authors: Mohamed Watfa, William Daher, Hisham Al Azar

Abstract:

The main idea behind in-network aggregation is that, rather than sending individual data items from sensors to sinks, multiple data items are aggregated as they are forwarded by the sensor network. Existing sensor network data aggregation techniques assume that the nodes are preprogrammed and send data to a central sink for offline querying and analysis. This approach has two major drawbacks. First, the system behavior is preprogrammed and cannot be modified on the fly. Second, the increased energy wastage due to communication overhead decreases the overall system lifetime. Energy conservation is thus of prime consideration in sensor network protocols in order to maximize the network's operational lifetime. In this paper, we give an energy-efficient approach to query processing by implementing new optimization techniques applied to in-network aggregation. We first discuss earlier approaches to sensor data management and highlight their disadvantages. We then present our approach, "Energy Efficient Indexed Aggregation" (EEIA), and evaluate it through several simulations to prove its efficiency, competence and effectiveness.

Keywords: Sensor Networks, Database, Data Fusion, Aggregation, Indexing, Energy Efficiency.
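
The core of in-network aggregation, children forwarding partial aggregates instead of raw readings, can be sketched as follows. Decomposing AVG into a (sum, count) pair is the standard trick; the tree layout and values are illustrative.

```python
def aggregate_up(node, children, readings):
    """Post-order walk: each node merges its own reading with the
    partial (sum, count) aggregates forwarded by its children, so
    only one small tuple travels over each radio link."""
    s, c = readings[node], 1
    for child in children.get(node, []):
        cs, cc = aggregate_up(child, children, readings)
        s, c = s + cs, c + cc
    return s, c

children = {"sink": ["a", "b"], "a": ["c", "d"]}          # routing tree
readings = {"sink": 20.0, "a": 22.0, "b": 21.0, "c": 23.0, "d": 24.0}
total, count = aggregate_up("sink", children, readings)
print(total / count)   # network-wide average computed at the root: 22.0
```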

2026 Concept for Knowledge out of Sri Lankan Non-State Sector: Performances of Higher Educational Institutes and Successes of Its Sector

Authors: S. Jeyarajan

Abstract:

The concept of knowledge presented here was discovered through a study of successful competition among Sri Lankan non-state higher educational institutes. The concept emerged from knowledge management practices collected from reputable literature, such as Emerald Insight, and from the non-state higher education sector itself. A test was conducted to reveal the existence of these practices in Sri Lankan non-state higher education institutes and the reasons behind them. The absence of comparable studies and uncertainty about the number of participants available for data collection in the Sri Lankan context motivated the choice of a qualitative research method, which used attributes of the Delphi method to manage such uncertainty. Data were collected under a dramaturgical method, which supports efficient use of the Delphi method, and were gathered systematically through perspective and modified snowball sampling techniques. The data were then analyzed using grounded theory development techniques in intermixed discourses to manage the different perspectives in the data. The analysis revealed agreement between the resulting grounded theories and the findings of a comparable foreign study, even though the present study was conducted as qualitative research and the foreign study as quantitative research; the present study therefore widens the discovery of the foreign study. Further, having uncovered the reasons behind these practices, the results offer a concept of knowledge from the Sri Lankan non-state sector for managing higher educational institutes successfully.

Keywords: Adherence of snowball sampling into perspective sampling, Delphi method in qualitative method, grounded theory development in intermix discourses of analysis, knowledge management for success of higher educational institutes.

2025 Early Recognition and Grading of Cataract Using a Combined Log Gabor/Discrete Wavelet Transform with ANN and SVM

Authors: Hadeer R. M. Tawfik, Rania A. K. Birry, Amani A. Saad

Abstract:

Eyes are considered to be the most sensitive and important organs for human beings; thus, any eye disorder affects the patient in all aspects of life. Cataract is one of those eye disorders that lead to blindness if not treated correctly and quickly. This paper demonstrates a model for the automatic detection, classification, and grading of cataracts based on image processing techniques and artificial intelligence. The proposed system is developed to ease the cataract diagnosis process for both ophthalmologists and patients. The wavelet transform combined with the 2D Log Gabor wavelet transform was used as the feature extraction technique on a dataset of 120 eye images, followed by a classification process that classified the image set into three classes: normal, early, and advanced stage. A comparison between the two classifiers used, the support vector machine (SVM) and the artificial neural network (ANN), was carried out on the same dataset of 120 eye images. It was concluded that SVM gave better results than ANN: SVM achieved 96.8% accuracy, whereas ANN achieved 92.3%.

Keywords: Cataract, classification, detection, feature extraction, grading, log-gabor, neural networks, support vector machines, wavelet.
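
A skeletal version of the feature-extraction-plus-classification pipeline, using PyWavelets for the wavelet stage and scikit-learn's SVM; the Log-Gabor stage is omitted, subband energies are an assumed feature, and random arrays stand in for the real eye images.

```python
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def wavelet_features(img):
    """2-level 2D DWT; use the mean energy of each subband as a feature."""
    coeffs = pywt.wavedec2(img, "db2", level=2)
    feats = [np.mean(np.square(coeffs[0]))]          # approximation energy
    for detail in coeffs[1:]:                        # (H, V, D) per level
        feats += [np.mean(np.square(d)) for d in detail]
    return feats

# Placeholder data: 120 grayscale eye images with labels
# 0=normal, 1=early, 2=advanced (real images would be loaded here).
rng = np.random.default_rng(0)
images = rng.random((120, 64, 64))
labels = rng.integers(0, 3, 120)

X = np.array([wavelet_features(im) for im in images])
Xtr, Xte, ytr, yte = train_test_split(X, labels, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf").fit(Xtr, ytr)
print("accuracy:", clf.score(Xte, yte))
```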

2024 Using Non-Linear Programming Techniques in Determination of the Most Probable Slip Surface in 3D Slopes

Authors: M. M. Toufigh, A. R. Ahangarasr, A. Ouria

Abstract:

Among the many methods used for optimizing engineering problems, mathematical (numerical) optimization techniques are very important because they can easily be used and are consistent with most engineering problems. Many studies have addressed the stability analysis of three-dimensional (3D) slopes, the related probable slip surfaces, and the determination of factors of safety, but in most of them the force equilibrium equations, as in simplified 2D methods, are considered in only two directions; in other words, to reduce mathematical calculation and for simplicity, the force equilibrium equation in the third direction is omitted. This point is considered in only a few previous studies, and most of those give only a factor of safety without making enough effort to find the most probable slip surface. In this study, the shapes of the slip surfaces are modeled and safety factors are calculated considering the force equilibrium equations in all three directions; the moment equilibrium equation is also satisfied in the slip direction, and nonlinear programming techniques are used to determine the shape of the most probable slip surface. The model used in this study is a 3D model composed of three upper surfaces which can cover all defined and probable slip surfaces. The meshing process is done in such a way that all elements are prismatic with quadrilateral cross sections, and the safety factor is defined on the quadrilateral surface at the base of each element, which is a part of the whole slip surface. The method used to find the most probable slip surface is non-linear programming, in which the objective function to be optimized is the factor of safety, a function of the soil properties and the coordinates of the nodes on the probable slip surface. The main reason for using non-linear programming in this research is its quick convergence to the desired responses. The final results show good compatibility with the previously used classical and 2D methods, as well as a reasonable convergence speed.

Keywords: Non-linear programming, numerical optimization, slope stability, 3D analysis.
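
On a much smaller scale, treating the slip-surface search as non-linear programming can be shown with a 2D toy problem: minimize the factor of safety over the inclination of a planar (Culmann-type) slip surface using scipy. The geometry and soil parameters are invented; the paper's 3D formulation optimizes the slip-surface node coordinates instead of a single angle.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy slope: height H (m), face angle beta; soil unit weight gamma (kN/m^3),
# cohesion c (kPa), friction angle phi. All values are illustrative.
H, beta = 10.0, np.radians(50)
gamma, c, phi = 18.0, 15.0, np.radians(25)

def factor_of_safety(theta):
    """FS of a planar slip surface at inclination theta (Culmann wedge):
    resisting forces (cohesion + friction) over the driving weight component."""
    W = 0.5 * gamma * H**2 * (1/np.tan(theta) - 1/np.tan(beta))  # wedge weight
    L = H / np.sin(theta)                                        # plane length
    return (c * L + W * np.cos(theta) * np.tan(phi)) / (W * np.sin(theta))

# The NLP step: the objective is FS, the design variable the plane angle.
res = minimize_scalar(factor_of_safety, bounds=(phi + 0.05, beta - 0.05),
                      method="bounded")
print("critical plane:", round(np.degrees(res.x), 1), "deg, FS =", round(res.fun, 2))
```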

2023 An Improved k Nearest Neighbor Classifier Using Interestingness Measures for Medical Image Mining

Authors: J. Alamelu Mangai, Satej Wagle, V. Santhosh Kumar

Abstract:

The exponential increase in the volume of medical image databases has imposed new challenges on clinical routine in maintaining patient history, diagnosis, treatment, and monitoring. With the advent of data mining and machine learning techniques, it is possible to automate and/or assist physicians in clinical diagnosis. In this research, a medical image classification framework using data mining techniques is proposed. It involves feature extraction, feature selection, feature discretization, and classification. In the classification phase, the performance of the traditional kNN (k nearest neighbor) classifier is improved using a feature weighting scheme and distance-weighted voting instead of simple majority voting. Feature weights are calculated using the interestingness measures used in association rule mining. Experiments on retinal fundus images show that the proposed framework improves the classification accuracy of traditional kNN from 78.57% to 92.85%.

Keywords: Medical Image Mining, Data Mining, Feature Weighting, Association Rule Mining, k nearest neighbor classifier.
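
The two kNN modifications can be sketched directly; the interestingness-based feature weights are taken as given here (the paper derives them from association rule measures), and the data are toy values.

```python
import numpy as np
from collections import defaultdict

def weighted_knn_predict(X, y, q, w, k=5):
    """kNN with (1) feature weights w inside the distance and
    (2) distance-weighted voting instead of simple majority voting."""
    d = np.sqrt(((X - q) ** 2 * w).sum(axis=1))       # weighted Euclidean
    nearest = np.argsort(d)[:k]
    votes = defaultdict(float)
    for i in nearest:
        votes[y[i]] += 1.0 / (d[i] + 1e-9)            # closer neighbors count more
    return max(votes, key=votes.get)

X = np.array([[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.3], [0.7, 0.2]])
y = np.array([0, 0, 1, 1, 1])
w = np.array([2.0, 0.5])   # weights from interestingness measures (assumed)
print(weighted_knn_predict(X, y, np.array([0.3, 0.7]), w, k=3))   # -> 0
```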

2022 Computer Aided Diagnosis of Polycystic Kidney Disease Using ANN

Authors: Anjan Babu G, Sumana G, Rajasekhar M

Abstract:

Many inherited diseases and non-hereditary disorders are common in the development of renal cystic diseases. Polycystic kidney disease (PKD) is a disorder that develops within the kidneys, in which groups of cysts filled with a water-like fluid form. PKD is responsible for 5-10% of cases of end-stage renal failure treated by dialysis or transplantation. New experimental models and the application of molecular biology techniques have provided new insights into the pathogenesis of PKD. Researchers are showing keen interest in developing automated systems that apply computer-aided techniques to the diagnosis of diseases. In this paper, a multilayered feed-forward neural network with one hidden layer is constructed, trained, and tested by applying the back-propagation learning rule for the diagnosis of PKD, based on physical symptoms and urinalysis test results collected from individual patients. Data collected from 50 patients are used to train and test the network; 75% of the data are used for training and the remaining 25% for testing. The trained network is then applied to new samples, and the output indicates whether the patient is normal or abnormal.

Keywords: Dialysis, Hereditary, Transplantation, Polycystic, Pathogenesis.
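
A compact stand-in for the described network, one hidden layer, backpropagation training, and a 75/25 split, using scikit-learn; the synthetic feature matrix replaces the real urinalysis data, which is not available here.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# 50 patients x 6 features (symptoms + urinalysis results): synthetic
# stand-ins for the clinical data; labels: 0 = normal, 1 = abnormal.
rng = np.random.default_rng(1)
X = rng.random((50, 6))
y = (X[:, :3].sum(axis=1) > 1.5).astype(int)

# 75% train / 25% test, matching the split described above.
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=1)

# Feed-forward network with a single hidden layer, trained by backprop.
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=1)
net.fit(Xtr, ytr)
print("test accuracy:", net.score(Xte, yte))
```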

2021 Comparative Study on Swarm Intelligence Techniques for Biclustering of Microarray Gene Expression Data

Authors: R. Balamurugan, A. M. Natarajan, K. Premalatha

Abstract:

Microarray gene expression data play a vital role in biological processes, gene regulation, and disease mechanisms. A bicluster in gene expression data is a subset of genes exhibiting consistent patterns under a subset of conditions, and finding biclusters is an optimization problem. In recent years, swarm intelligence techniques have become popular because many real-world problems are increasingly large, complex, and dynamic. Given the size and complexity of such problems, it is necessary to find an optimization technique that can reach a near-optimal solution within a reasonable amount of time. In this paper, the algorithmic concepts of the Particle Swarm Optimization (PSO), Shuffled Frog Leaping (SFL), and Cuckoo Search (CS) algorithms have been analyzed on four benchmark gene expression datasets. The experimental results show that CS outperforms PSO and SFL on three datasets, while SFL gives better performance on one dataset. This work also determines the biological relevance of the biclusters with Gene Ontology in terms of function, process, and component.

Keywords: Particle swarm optimization, Shuffled frog leaping, Cuckoo search, biclustering, gene expression data.
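
Of the three techniques compared, PSO has the most compact core; here is a generic sketch of its velocity and position update on a toy objective (the biclustering-specific encoding and fitness are beyond this excerpt).

```python
import numpy as np

def pso_step(x, v, pbest, gbest, rng, w=0.7, c1=1.5, c2=1.5):
    """One PSO iteration: each particle is pulled toward its own best
    position (pbest) and the swarm's best (gbest).
    x, v, pbest: (n_particles, dim) arrays; gbest: (dim,)."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v

# Minimize a toy fitness (sphere function) with 20 particles in 5-D.
rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, (20, 5))
v = np.zeros_like(x)
pbest = x.copy()
gbest = x[np.argmin((x**2).sum(1))].copy()
for _ in range(100):
    x, v = pso_step(x, v, pbest, gbest, rng)
    better = (x**2).sum(1) < (pbest**2).sum(1)   # update personal bests
    pbest[better] = x[better]
    gbest = pbest[np.argmin((pbest**2).sum(1))]  # update swarm best
print("best fitness:", (gbest**2).sum())
```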

2020 Application of Data Mining Tools to Predicate Completion Time of a Project

Authors: Seyed Hossein Iranmanesh, Zahra Mokhtari

Abstract:

Estimating the time and cost of work completion in a project, and following them up during execution, contribute to the success or failure of a project and are very important for the project management team. Delivering on time and within the budgeted cost requires managing and controlling the project well. To deal with the complex task of controlling and modifying the baseline project schedule during execution, earned value management systems have been set up and are widely used to measure and communicate the real physical progress of a project; however, they often fail to predict the total duration of the project. In this paper, data mining techniques are used to predict the total project duration in terms of the Time Estimate At Completion, EAC(t). For this purpose, we used a project with 90 activities, updated day by day. The regular indexes from the literature are used, and the Earned Duration Method is applied to calculate the time estimate at completion; these are set as input data for prediction, and the major parameters among them are specified using Clem software. Using data mining, the parameters that affect EAC(t) and the relationships between them can be extracted, which is very useful for managing a project with minimum delay risk. As we state, this could be a simple, safe, and applicable method for predicting the completion time of a project during execution.

Keywords: Data Mining Techniques, Earned Duration Method, Earned Value, Estimate At Completion.
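
For context, a minimal earned-schedule-style computation of the time estimate at completion; the paper's indexes follow the Earned Duration Method specifically, so treat the formulas and numbers below as a generic illustration of the idea rather than its exact procedure.

```python
def earned_schedule(pv, ev_now):
    """ES: the time at which the planned value curve equals today's EV.
    pv: cumulative planned value per period (pv[0] at t=0)."""
    for t in range(len(pv) - 1):
        if pv[t] <= ev_now < pv[t + 1]:
            return t + (ev_now - pv[t]) / (pv[t + 1] - pv[t])
    return float(len(pv) - 1)

pv = [0, 10, 25, 45, 70, 90, 100]   # planned value (% complete) per month
ev_now, at, pd_ = 38.0, 4, 6        # EV at actual time AT; planned duration PD

es = earned_schedule(pv, ev_now)    # schedule earned so far, in months: 2.65
spi_t = es / at                     # time-based schedule performance index
eac_t = pd_ / spi_t                 # time estimate at completion, EAC(t)
print(round(es, 2), round(spi_t, 2), round(eac_t, 1))   # -> 2.65 0.66 9.1
```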

2019 Exploiting Machine Learning Techniques for the Enhancement of Acceptance Sampling

Authors: Aikaterini Fountoulaki, Nikos Karacapilidis, Manolis Manatakis

Abstract:

This paper proposes an innovative methodology for Acceptance Sampling by Variables, a particular category of Statistical Quality Control dealing with the assurance of product quality. Our contribution lies in the exploitation of machine learning techniques to address the complexity and remedy the drawbacks of existing approaches. More specifically, the proposed methodology exploits Artificial Neural Networks (ANNs) to aid decision making about the acceptance or rejection of an inspected sample. For any type of inspection, ANNs are trained on data from the corresponding tables of a standard's sampling plan schemes. Once trained, ANNs can give closed-form solutions for any acceptance quality level and sample size, thus automating the reading of the sampling plan tables without any need to compromise with the values of the specific standard chosen each time. The proposed methodology provides quality control engineers with enough flexibility during the inspection of their samples, allowing specific needs to be considered, while it also reduces the time and cost required for these inspections. Its applicability and advantages are demonstrated through two numerical examples.

Keywords: Acceptance Sampling, Neural Networks, Statistical Quality Control.
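
The heart of the proposal, an ANN that interpolates a standard's sampling plan tables, can be miniaturized as below; the table rows are fabricated placeholders, not values from any real standard.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Fabricated (lot size, AQL) -> (sample size n, acceptance number c)
# pairs standing in for rows of a real standard's sampling plan tables.
X = np.array([[500, 1.0], [500, 2.5], [1200, 1.0],
              [1200, 2.5], [3200, 1.0], [3200, 2.5]], dtype=float)
Y = np.array([[50, 1], [50, 3], [80, 2],
              [80, 5], [125, 3], [125, 7]], dtype=float)

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(X / X.max(axis=0), Y)                # crude input scaling

# Once trained, the net also answers for plans *between* tabulated rows.
query = np.array([[2000, 1.5]]) / X.max(axis=0)
print(net.predict(query))                    # approximate (n, c) to use
```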

2018 Constant Factor Approximation Algorithm for p-Median Network Design Problem with Multiple Cable Types

Authors: Chaghoub Soraya, Zhang Xiaoyan

Abstract:

This research presents the first constant-factor approximation algorithm for the p-median network design problem with multiple cable types. This problem had previously been addressed with a single cable type, for which a bifactor approximation algorithm exists. To the best of our knowledge, the algorithm proposed in this paper is the first constant approximation algorithm for the p-median network design problem with multiple cable types. The addressed problem is a combination of two well-studied problems: the p-median problem and the network design problem. The introduced algorithm is a random-sampling approximation algorithm of constant factor, conceived by using some random sampling techniques from the literature. It is based on a redistribution lemma from the literature and on a Steiner tree problem as a subproblem. The algorithm is simple and relies on the notions of random sampling and probability. The proposed approach gives an approximation solution with one constant ratio without violating any of the constraints, in contrast to the one proposed in the literature. This paper provides a (21 + 2)-approximation algorithm for the p-median network design problem with multiple cable types using random sampling techniques.

Keywords: Approximation algorithms, buy-at-bulk, combinatorial optimization, network design, p-median.

2017 Unconventional Composite Inorganic Membrane Fabrication for Carbon Emissions Mitigation

Authors: Ngozi Nwogu, Godson Osueke, Mamdud Hossain, Edward Gobina

Abstract:

An unconventional composite inorganic ceramic membrane capable of contributing to the reduction of carbon dioxide emissions was fabricated and tested at laboratory scale, in conformity with various environmental guidelines and with the aim of mitigating global warming. A review of the existing membrane technologies for carbon capture, including the relevant gas transport mechanisms, is presented. Single-gas permeation experiments were carried out using a silica-modified ceramic membrane (internal diameter 20 mm, outside diameter 25 mm, length 368 mm) deposited on a macroporous support, in order to investigate individual gas permeation behaviors at different pressures at room temperature. Membrane fabrication was achieved using a dip-coating method. Pure nitrogen, carbon dioxide, argon, oxygen, and methane were used to investigate individual permeation rates at various pressures. Results show that the gas flow rate increases with pressure drop. However, above a pressure of 3 bar, the ratio of CO2 permeability to that of the other gases indicated control by a more selective surface-adsorptive transport mechanism.

Keywords: Carbon dioxide composite inorganic membranes, permeability, transport mechanisms.

2016 A Design Framework for Event Recommendation in Novice Low-Literacy Communities

Authors: Yimeng Deng, Klarissa T.T. Chang

Abstract:

The proliferation of user-generated content (UGC) results in huge opportunities to explore event patterns. However, existing event recommendation systems primarily focus on advanced information technology users. Little work has been done to address novice and low-literacy users. The next billion users providing and consuming UGC are likely to include communities from developing countries who are ready to use affordable technologies for subsistence goals. Therefore, we propose a design framework for providing event recommendations to address the needs of such users. Grounded in information integration theory (IIT), our framework advocates that effective event recommendation is supported by systems capable of (1) reliable information gathering through structured user input, (2) accurate sense making through spatial-temporal analytics, and (3) intuitive information dissemination through interactive visualization techniques. A mobile pest management application is developed as an instantiation of the design framework. Our preliminary study suggests a set of design principles for novice and low-literacy users.

Keywords: Event recommendation, iconic interface, information integration, spatial-temporal clustering, user-generated content, visualization techniques.
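
Component (2), spatial-temporal sense making, might be prototyped with density-based clustering of reports in (lat, lon, time); DBSCAN and the scaling below are assumptions for illustration, not the paper's specified algorithm.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# User reports of pest sightings: (latitude, longitude, day-of-season).
reports = np.array([
    [14.60, 121.00, 3], [14.61, 121.01, 4], [14.60, 121.02, 5],   # event A
    [14.90, 121.30, 20], [14.91, 121.31, 21],                     # event B
    [15.50, 120.50, 40],                                          # isolated report
])

# Scale so ~1 km and ~2 days count as "near" on comparable terms.
scaled = reports / np.array([0.01, 0.01, 2.0])
labels = DBSCAN(eps=2.5, min_samples=2).fit_predict(scaled)
print(labels)   # e.g. [0 0 0 1 1 -1]: two events detected, one outlier
```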

2015 Real-time Target Tracking Using a Pan and Tilt Platform

Authors: Moulay A. Akhloufi

Abstract:

In recent years, there has been increasing interest in efficient tracking systems for surveillance applications. Many of the proposed techniques are designed for static camera environments. When the camera is moving, tracking moving objects becomes more difficult, and many techniques fail to detect and track the desired targets. The problem becomes more complex when we want to track a specific object in real time using a moving pan-and-tilt camera system to keep the target within the image. This type of tracking is of high importance in surveillance applications: when a target is detected in a certain zone, the ability to automatically and continuously track it, keeping it within the image until action is taken, is very important for security personnel working at sensitive sites. This work presents a real-time tracking system permitting the detection and continuous tracking of targets using a pan-and-tilt camera platform. A novel and efficient approach to dealing with occlusions is presented. A new intelligent forget factor is also introduced in order to take target shape variations into account and avoid learning undesired objects. Tests conducted in outdoor operational scenarios show the efficiency and robustness of the proposed approach.

Keywords: Tracking, surveillance, target detection, Pan and tilt.
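
The role of a forget factor in appearance tracking can be illustrated with a running-average template update; the paper's "intelligent" factor adapts the blending weight per frame (e.g., suppressing updates during occlusion), which the fixed-weight sketch below does not attempt.

```python
import numpy as np

def update_template(template, patch, alpha=0.1):
    """Blend the new target patch into the stored appearance template.
    Small alpha = long memory (robust to occlusion, slow to adapt);
    large alpha = fast adaptation to shape changes but risks drifting
    onto undesired objects -- the trade-off the forget factor manages."""
    return (1.0 - alpha) * template + alpha * patch.astype(np.float64)

template = np.zeros((32, 32))
for frame in range(5):                        # patches from the tracker
    patch = np.full((32, 32), 100.0 + frame)  # stand-in for the cropped target
    template = update_template(template, patch)
print(template.mean())
```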

2014 Surveying Earthquake Vulnerabilities of District 13 of Kabul City, Afghanistan

Authors: Mohsen Mohammadi, Toshio Fujimi

Abstract:

A high population and irregular urban development in Kabul city, Afghanistan's capital, are among the factors that increase its vulnerability to earthquake disasters, on top of its location in a highly seismic region; this can lead to widespread economic loss and casualties. This study aims to evaluate earthquake risks in Kabul's 13th district based on scientific data. The research data, which include hazard curves for Kabul, vulnerability curves, and a questionnaire survey administered by sampling in district 13, have been combined to develop risk curves. To estimate potential casualties, we used a set of M parameters in a model developed by Coburn and Spence. The results indicate that, in the worst-case scenario, more than 90% of district 13, which comprises mostly residential buildings, is exposed to high risk; for a nighttime earthquake, this may lead to nearly 1,000 million USD in economic loss and 120 thousand casualties (equal to 25.88% of the 13th district's population). To reduce these risks, we propose the reconstruction of the most vulnerable buildings, which are primarily adobe and masonry buildings. A comparison of the risk reduction achieved by reconstructing adobe versus masonry buildings indicates that rebuilding adobe buildings would be more effective.

Keywords: Earthquake risk evaluation, Kabul, mitigation, vulnerability.

2013 Harmonic Analysis and Performance Improvement of a Wind Energy Conversions System with Double Output Induction Generator

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

Wind turbines with double output induction generators can operate at variable speed, permitting conversion efficiency maximization over a wide range of wind velocities. This paper presents the performance analysis of a wind-driven double output induction generator (DOIG) operating at varying shaft speeds. A periodic transient-state analysis of a DOIG equipped with two converters is carried out using a hybrid induction machine model. Based on this hybrid model (dq-abc), the harmonic content of waveforms at various points of the drive is simulated at different speeds. The sinusoidal and trapezoidal pulse-width modulation control techniques are then used in order to improve the power factor of the machine and to weaken the low-order harmonics injected into the supply. The techniques are compared on the basis of the frequency spectrum, total harmonic distortion, distortion factor, and power factor, and finally the advantages of the sinusoidal and trapezoidal pulse-width modulation techniques are summarized.

Keywords: DOIG, Harmonic Analysis, Wind.
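
The comparison metrics are straightforward to compute from a simulated waveform; for example, total harmonic distortion can be read off an FFT as below (the 50 Hz fundamental and harmonic mix are synthetic, not the paper's simulated drive currents).

```python
import numpy as np

def thd(signal, fs, f0):
    """Total harmonic distortion: RMS of the harmonics over the
    fundamental, read from the FFT magnitude at multiples of f0."""
    spec = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    def mag(f):                      # magnitude at the bin nearest f
        return spec[np.argmin(np.abs(freqs - f))]
    harmonics = [mag(k * f0) for k in range(2, 12)]
    return np.sqrt(sum(h * h for h in harmonics)) / mag(f0)

fs, f0 = 10_000, 50.0
t = np.arange(0, 1.0, 1 / fs)
# Synthetic stator current: fundamental plus 5th and 7th harmonics.
i = np.sin(2*np.pi*f0*t) + 0.08*np.sin(2*np.pi*5*f0*t) + 0.05*np.sin(2*np.pi*7*f0*t)
print(f"THD = {100 * thd(i, fs, f0):.1f}%")   # ~9.4%
```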
