Search results for: search techniques.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3117

2217 Application of Data Mining Tools to Predict Completion Time of a Project

Authors: Seyed Hossein Iranmanesh, Zahra Mokhtari

Abstract:

Estimating the time and cost of work completion in a project, and following them up during execution, contribute to the success or failure of a project and are very important for the project management team. Delivering on time and within the budgeted cost requires well-managed and well-controlled projects. To deal with the complex task of controlling and modifying the baseline project schedule during execution, earned value management systems have been set up and are widely used to measure and communicate the real physical progress of a project. However, earned value management often fails to predict the total duration of the project. In this paper, data mining techniques are used to predict the total project duration in terms of the time estimate at completion, EAC(t). For this purpose, we used a project with 90 activities that was updated day by day. Regular indexes from the literature and the Earned Duration Method were then applied to calculate the time estimate at completion; these were used as input data for prediction, and the major parameters among them were identified using the Clem software. By using data mining, the parameters that affect EAC(t) and the relationships between them can be extracted, which is very useful for managing a project with minimum delay risk. As stated, this can be a simple, safe and applicable method for predicting the completion time of a project during execution.
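As an illustration of the quantity being forecast here, the following is a minimal Python sketch of a time estimate at completion, EAC(t), computed with an earned-duration-style formula. The formula and variable names (planned duration PD, actual duration AD, time-based schedule performance index SPI(t), performance factor PF) follow one common formulation from the earned schedule/earned duration literature and are assumptions; the paper's exact indexes may differ.

```python
def eac_t(planned_duration, actual_duration, spi_t, performance_factor=1.0):
    """Time estimate at completion, EAC(t), using an earned-duration-style
    formula (an illustrative formulation, not necessarily the paper's exact one).

    planned_duration   -- PD, baseline total duration
    actual_duration    -- AD, duration elapsed at the status date
    spi_t              -- time-based schedule performance index at the status date
    performance_factor -- PF, assumed future performance (1.0 = planned pace)
    """
    earned_duration = actual_duration * spi_t        # ED: schedule progress in time units
    remaining = planned_duration - earned_duration   # duration of work still to be earned
    return actual_duration + remaining / performance_factor

# Example: a 90-day plan, 40 days elapsed, running 10% behind schedule.
print(eac_t(planned_duration=90, actual_duration=40, spi_t=0.9))  # -> 94.0
```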

Keywords: Data Mining Techniques, Earned Duration Method, Earned Value, Estimate At Completion.

2216 Exploiting Machine Learning Techniques for the Enhancement of Acceptance Sampling

Authors: Aikaterini Fountoulaki, Nikos Karacapilidis, Manolis Manatakis

Abstract:

This paper proposes an innovative methodology for Acceptance Sampling by Variables, which is a particular category of Statistical Quality Control dealing with the assurance of product quality. Our contribution lies in the exploitation of machine learning techniques to address the complexity and remedy the drawbacks of existing approaches. More specifically, the proposed methodology exploits Artificial Neural Networks (ANNs) to aid decision making about the acceptance or rejection of an inspected sample. For any type of inspection, ANNs are trained on data from the corresponding tables of a standard's sampling plan schemes. Once trained, the ANNs can give closed-form solutions for any acceptance quality level and sample size, thus automating the reading of the sampling plan tables without any need to compromise with the values of the specific standard chosen each time. The proposed methodology provides enough flexibility to quality control engineers during the inspection of their samples, allowing the consideration of specific needs, while it also reduces the time and the cost required for these inspections. Its applicability and advantages are demonstrated through two numerical examples.
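A minimal sketch of the idea described above, assuming a scikit-learn multilayer perceptron as the ANN and a few invented sampling-plan rows (placeholders, not values from any real standard): the network learns the mapping from (acceptance quality level, lot size) to a plan (sample size n, acceptance number c) and can then be queried for combinations not present in the table.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Hypothetical training rows: (AQL %, lot size) -> (sample size n, acceptance number c)
X = np.array([[0.65, 500], [1.0, 500], [1.5, 1200], [2.5, 1200], [4.0, 3200]])
y = np.array([[50, 1], [50, 1], [80, 3], [80, 5], [125, 10]])

scaler = StandardScaler().fit(X)
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
net.fit(scaler.transform(X), y)

# Query a plan for an AQL/lot-size combination not present in the table.
n, c = net.predict(scaler.transform([[2.0, 800]]))[0]
print(f"suggested sample size ~{n:.0f}, acceptance number ~{c:.1f}")
```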

Keywords: Acceptance Sampling, Neural Networks, Statistical Quality Control.

2215 Constant Factor Approximation Algorithm for p-Median Network Design Problem with Multiple Cable Types

Authors: Chaghoub Soraya, Zhang Xiaoyan

Abstract:

This research presents the first constant-factor approximation algorithm for the p-median network design problem with multiple cable types. The problem has previously been addressed with a single cable type, for which a bifactor approximation algorithm exists. To the best of our knowledge, the algorithm proposed in this paper is the first constant approximation algorithm for the p-median network design problem with multiple cable types. The addressed problem is a combination of two well-studied problems, the p-median problem and the network design problem. The introduced algorithm is a random sampling approximation algorithm of constant factor, conceived by using random sampling techniques from the literature. It is based on a redistribution lemma from the literature and uses a Steiner tree problem as a subproblem. The algorithm is simple, and it relies on the notions of random sampling and probability. The proposed approach gives an approximate solution with one constant ratio without violating any of the constraints, in contrast to the one proposed in the literature. This paper provides a (21 + 2)-approximation algorithm for the p-median network design problem with multiple cable types using random sampling techniques.

Keywords: Approximation algorithms, buy-at-bulk, combinatorial optimization, network design, p-median.

2214 A Design Framework for Event Recommendation in Novice Low-Literacy Communities

Authors: Yimeng Deng, Klarissa T.T. Chang

Abstract:

The proliferation of user-generated content (UGC) results in huge opportunities to explore event patterns. However, existing event recommendation systems primarily focus on advanced information technology users. Little work has been done to address novice and low-literacy users. The next billion users providing and consuming UGC are likely to include communities from developing countries who are ready to use affordable technologies for subsistence goals. Therefore, we propose a design framework for providing event recommendations to address the needs of such users. Grounded in information integration theory (IIT), our framework advocates that effective event recommendation is supported by systems capable of (1) reliable information gathering through structured user input, (2) accurate sense making through spatial-temporal analytics, and (3) intuitive information dissemination through interactive visualization techniques. A mobile pest management application is developed as an instantiation of the design framework. Our preliminary study suggests a set of design principles for novice and low-literacy users.
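A minimal sketch of the spatial-temporal analytics step under stated assumptions: user reports (hypothetical coordinates and timestamps) are clustered jointly in space and time with DBSCAN to recover candidate events. The scaling factors and DBSCAN parameters are illustrative choices, not the paper's system.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Each report: (latitude, longitude, hours since a reference time) -- hypothetical values
reports = np.array([
    [1.3521, 103.8198,  2.0],
    [1.3525, 103.8190,  2.5],
    [1.3530, 103.8205,  3.0],   # these three are close in space and time -> one event
    [1.2900, 103.8500, 30.0],   # far away and much later -> noise
])

# Scale so that roughly 1 km in space and 6 hours in time count as comparable distances
# (rough assumption: 0.01 degrees is about 1 km near the equator).
scaled = reports / np.array([0.01, 0.01, 6.0])

labels = DBSCAN(eps=1.5, min_samples=2).fit_predict(scaled)
print(labels)   # e.g. [0 0 0 -1]: one recovered event cluster plus one noise point
```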

Keywords: Event recommendation, iconic interface, information integration, spatial-temporal clustering, user-generated content, visualization techniques

2213 Prediction of a Human Facial Image by ANN using Image Data and its Content on Web Pages

Authors: Chutimon Thitipornvanid, Siripun Sanguansintukul

Abstract:

Choosing the right metadata is critical, as good information (metadata) attached to an image will facilitate its visibility among a pile of other images. The image's value is enhanced not only by the quality of the attached metadata but also by the search technique. This study proposes a technique that is simple but efficient for predicting a single human image from a website using the basic image data and the embedded metadata of the image's content appearing on web pages. The result is very encouraging, with a prediction accuracy of 95%. This technique may become a great aid to librarians, researchers and many others for automatically and efficiently identifying a set of human images out of a greater set of images.

Keywords: Metadata, Prediction, Multi-layer perceptron, Human facial image, Image mining.

2212 Real-time Target Tracking Using a Pan and Tilt Platform

Authors: Moulay A. Akhloufi

Abstract:

In recent years, there has been increasing interest in efficient tracking systems for surveillance applications. Many of the proposed techniques are designed for static camera environments. When the camera is moving, tracking moving objects becomes more difficult, and many techniques fail to detect and track the desired targets. The problem becomes more complex when we want to track a specific object in real time using a moving pan and tilt camera system to keep the target within the image. This type of tracking is of high importance in surveillance applications. When a target is detected in a certain zone, the possibility of automatically tracking it continuously and keeping it within the image until action is taken is very important for security personnel working in very sensitive sites. This work presents a real-time tracking system permitting the detection and continuous tracking of targets using a pan and tilt camera platform. A novel and efficient approach for dealing with occlusions is presented. A new intelligent forget factor is also introduced in order to take into account target shape variations and avoid learning undesired objects. Tests conducted in outdoor operational scenarios show the efficiency and robustness of the proposed approach.
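The abstract does not specify the form of the intelligent forget factor, so the following is only a generic sketch of exponential forgetting in an appearance-template update, gated by a match score so that occluders and background are not learned. The function names and thresholds are assumptions.

```python
import numpy as np

def update_template(template, patch, similarity, forget_factor=0.1, min_similarity=0.6):
    """Exponentially forget old appearance only when the new patch matches well."""
    if similarity < min_similarity:
        # Likely occlusion or a wrong detection: keep the old template unchanged.
        return template
    return (1.0 - forget_factor) * template + forget_factor * patch

def ncc(a, b):
    """Normalized cross-correlation used here as the match score."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

rng = np.random.default_rng(0)
template = rng.random((32, 32))
patch = template + 0.05 * rng.standard_normal((32, 32))   # slightly changed target
template = update_template(template, patch, ncc(template, patch))
```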

Keywords: Tracking, surveillance, target detection, Pan and tilt.

2211 Visual Search Based Indoor Localization in Low Light via RGB-D Camera

Authors: Yali Zheng, Peipei Luo, Shinan Chen, Jiasheng Hao, Hong Cheng

Abstract:

Most traditional visual indoor navigation algorithms and methods only consider localization in ordinary daytime, while in this paper we focus on indoor re-localization in low light. Since RGB images are degraded in low light, the less discriminative infrared and depth image pairs captured by RGB-D cameras are taken as the input, and the most similar candidates are retrieved as the output from a database built in the bag-of-words framework. Epipolar constraints can then be used to re-localize the query infrared and depth image sequence. We evaluate our method on two datasets captured by Kinect2. The results demonstrate very promising re-localization performance for indoor navigation systems in low light environments.

Keywords: Indoor navigation, low light, RGB-D camera, vision based.

2210 Reliability Improvement with Optimal Placement of Distributed Generation in Distribution System

Authors: N. Rugthaicharoencheep, T. Langtharthong

Abstract:

This paper presents the optimal placement and sizing of distributed generation (DG) in a distribution system. The problem is to improve the reliability of the distribution system with distributed generation. The technique employed to solve the minimization problem is based on a developed Tabu search algorithm and reliability worth analysis. The developed methodology is tested on bus 2 of the Roy Billinton Test System (RBTS) distribution system. It can be seen from the case study that distributed generation can reduce the customer interruption cost and therefore improve the reliability of the system. It is expected that our proposed method will be utilized effectively by distribution system operators.
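A minimal Tabu search skeleton for the placement part of the problem, under stated assumptions: the candidate buses, tenure and the toy objective below are placeholders, and in practice the objective would be the reliability-worth evaluation (e.g. expected customer interruption cost) of the RBTS bus 2 feeder.

```python
import random

BUSES = list(range(1, 23))        # candidate buses of a small test feeder (assumption)
N_DG = 2                          # number of DG units to place
TABU_TENURE = 5
ITERATIONS = 200

def interruption_cost(placement):
    """Placeholder objective: replace with the reliability-worth evaluation of the feeder."""
    # Toy stand-in: pretend buses near the middle of the feeder help most.
    return sum((bus - 11.5) ** 2 for bus in placement)

def neighbours(placement):
    """Swap one selected bus for one unused bus."""
    for out_bus in placement:
        for in_bus in BUSES:
            if in_bus not in placement:
                yield tuple(sorted((set(placement) - {out_bus}) | {in_bus}))

def tabu_search():
    current = tuple(sorted(random.sample(BUSES, N_DG)))
    best, best_cost = current, interruption_cost(current)
    tabu = {}                                  # solution -> iteration when its tabu status expires
    for it in range(ITERATIONS):
        scored = []
        for cand in neighbours(current):
            cost = interruption_cost(cand)
            # Aspiration: a tabu move is still allowed if it beats the best so far.
            if tabu.get(cand, 0) <= it or cost < best_cost:
                scored.append((cost, cand))
        if not scored:
            break
        cost, current = min(scored)
        tabu[current] = it + TABU_TENURE       # forbid revisiting for a few iterations
        if cost < best_cost:
            best, best_cost = current, cost
    return best, best_cost

print(tabu_search())
```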

Keywords: Distributed generation, optimization technique, reliability improvement, distribution system.

2209 Harmonic Analysis and Performance Improvement of a Wind Energy Conversion System with Double Output Induction Generator

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

Wind turbines with double output induction generators can operate at variable speed, permitting conversion efficiency maximization over a wide range of wind velocities. This paper presents the performance analysis of a wind-driven double output induction generator (DOIG) operating at varying shaft speed. A periodic transient state analysis of a DOIG equipped with two converters is carried out using a hybrid induction machine model. This paper simulates the harmonic content of waveforms at various points of the drive at different speeds, based on the hybrid (dqabc) model. The sinusoidal and trapezoidal pulse-width modulation control techniques are then used in order to improve the power factor of the machine and to attenuate the low-order harmonics injected into the supply. The comparison is based on the frequency spectrum, total harmonic distortion, distortion factor and power factor. Finally, the advantages of the sinusoidal and trapezoidal pulse width modulation techniques are compared.
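As a small illustration of one of the comparison criteria, the sketch below computes the total harmonic distortion (THD) of a sampled waveform from its FFT. The synthetic waveform (a 50 Hz fundamental plus 5th and 7th harmonics) and the sampling settings are assumptions, not the simulated DOIG waveforms.

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=40):
    """THD = sqrt(sum of squared harmonic magnitudes) / fundamental magnitude."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    def mag_at(f):                       # magnitude of the bin closest to frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]
    fund = mag_at(f0)
    harmonics = [mag_at(k * f0) for k in range(2, n_harmonics + 1) if k * f0 < fs / 2]
    return np.sqrt(np.sum(np.square(harmonics))) / fund

fs, f0 = 10_000, 50.0
t = np.arange(0, 1.0, 1.0 / fs)
i = np.sin(2 * np.pi * f0 * t) + 0.2 * np.sin(2 * np.pi * 5 * f0 * t) \
                               + 0.1 * np.sin(2 * np.pi * 7 * f0 * t)
print(f"THD = {thd(i, fs, f0):.3f}")     # ~0.224 = sqrt(0.2**2 + 0.1**2)
```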

Keywords: DOIG, Harmonic Analysis, Wind.

2208 Optimization of Process Parameters Affecting on Spring-Back in V-Bending Process for High Strength Low Alloy Steel HSLA 420 Using FEA (HyperForm) and Taguchi Technique

Authors: Navajyoti Panda, R. S. Pawar

Abstract:

In this study, process parameters such as punch angle, die opening, grain direction, and the pre-bend condition of the strip for the deep drawing of high strength low alloy steel HSLA 420 are investigated. The finite element method (FEM), in association with the Taguchi and analysis of variance (ANOVA) techniques, is used to investigate the degree of importance of the process parameters in the V-bending process for HSLA 420 and St12 grade materials. From the results, it is observed that punch angle has a major influence on spring-back. Die opening also plays a very significant role in spring-back. On the other hand, grain direction has the least impact on spring-back; however, if the strip is taken from a flat sheet, it is less prone to spring-back than a strip taken from a sheet metal coil. HyperForm software is used for the FEM simulation and the experiments are designed using the Taguchi method. The percentage contribution of the parameters is obtained through the ANOVA technique.
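A minimal sketch of how percentage contributions are obtained in a Taguchi-style ANOVA: each factor's main-effect sum of squares is divided by the total sum of squares. The tiny two-factor layout and spring-back readings below are invented for illustration and are not the paper's experimental data.

```python
import numpy as np
import pandas as pd

runs = pd.DataFrame({
    "punch_angle": [85, 85, 90, 90],
    "die_opening": [6, 8, 6, 8],
    "springback":  [2.1, 2.6, 1.2, 1.6],   # degrees (hypothetical readings)
})

grand_mean = runs["springback"].mean()
ss_total = ((runs["springback"] - grand_mean) ** 2).sum()

contributions = {}
for factor in ["punch_angle", "die_opening"]:
    level_means = runs.groupby(factor)["springback"].mean()
    counts = runs.groupby(factor)["springback"].count()
    ss_factor = (counts * (level_means - grand_mean) ** 2).sum()
    contributions[factor] = 100 * ss_factor / ss_total

print(contributions)   # for this toy data, punch angle explains most of the variation
```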

Keywords: Bending, V-bending, FEM, spring-back, Taguchi, HyperForm, profile projector, HSLA 420 & St12 materials.

2207 An Effective Framework for Chinese Syntactic Parsing

Authors: Xing Li, Chengqing Zong

Abstract:

This paper presents an effective framework for Chinese syntactic parsing, which consists of two parts. The first is a parsing framework based on an improved bottom-up chart parsing algorithm, which integrates the beam search strategy of the N-best algorithm and the heuristic function of the A* algorithm for pruning, and then obtains multiple parsing trees. The second is a novel evaluation model, which integrates contextual and partial lexical information into the traditional PCFG model and defines a new score function. Using this model, the tree with the highest score is selected as the best parsing tree. Finally, contrasting experimental results are given.
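As a baseline illustration of bottom-up chart parsing with PCFG scoring, the sketch below implements a plain probabilistic CKY parser over a toy grammar in Chomsky normal form. The paper's beam-search and A*-style pruning and its contextual-lexical score function are not reproduced; the grammar and sentence are invented examples.

```python
from collections import defaultdict

# Binary rules (A -> B C) and lexical rules (A -> word) with probabilities (toy grammar)
binary_rules = {("S", ("NP", "VP")): 1.0,
                ("VP", ("V", "NP")): 1.0,
                ("NP", ("Det", "N")): 0.6}
lexical_rules = {("NP", "she"): 0.4, ("Det", "the"): 1.0,
                 ("N", "book"): 1.0, ("V", "reads"): 1.0}

def cky(words):
    n = len(words)
    chart = defaultdict(dict)                 # chart[(i, j)][symbol] = (prob, backpointer)
    for i, w in enumerate(words):
        for (sym, word), p in lexical_rules.items():
            if word == w:
                chart[(i, i + 1)][sym] = (p, w)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (a, (b, c)), p in binary_rules.items():
                    if b in chart[(i, k)] and c in chart[(k, j)]:
                        prob = p * chart[(i, k)][b][0] * chart[(k, j)][c][0]
                        if prob > chart[(i, j)].get(a, (0.0, None))[0]:
                            chart[(i, j)][a] = (prob, (k, b, c))
    return chart

chart = cky(["she", "reads", "the", "book"])
print(chart[(0, 4)].get("S"))   # probability of the best S parse and its backpointer
```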

Keywords: syntactic parsing, PCFG, pruning, evaluation model.

2206 A Comparative Analysis of Different Web Content Mining Tools

Authors: T. Suresh Kumar, M. Arthanari, N. Shanthi

Abstract:

Nowadays, the Web has become one of the most pervasive platforms for information exchange and retrieval. It collects the suitable and perfectly fitting information from the websites that one requires. Data mining is a means of extracting useful information from the data available on the internet. Web mining is one of the elements of data mining techniques and relates to various research communities such as information retrieval, database management systems and artificial intelligence. In this paper we discuss the concepts of Web mining. We have generally focused on one of the categories of Web mining, specifically Web Content Mining and its various tasks. Mining tools are essential for scanning the many images, text, and HTML documents, and their results are then used by the various search engines. We conclude by presenting a comparative table of these tools based on some pertinent criteria.

Keywords: Data Mining, Web Mining, Web Content Mining, Mining Tools, Information retrieval.

2205 Adopting Artificial Intelligence and Deep Learning Techniques in Cloud Computing for Operational Efficiency

Authors: Sandesh Achar

Abstract:

Artificial intelligence (AI) is being increasingly incorporated into many applications across various sectors such as health, education, security, and agriculture. Recently, there has been rapid development in cloud computing technology, resulting in AI's implementation into cloud computing to enhance and optimize the technology services rendered. The deployment of AI in cloud-based applications has brought about autonomous computing, whereby systems achieve stated results without human intervention. Despite the amount of research into autonomous computing, incorporating AI/ML into cloud computing to enhance its performance and resource allocation remains a fundamental challenge. This paper highlights different manifestations, roles, trends, and challenges related to AI-based cloud computing models. This work reviews and highlights investigations and progress in the domain. Future directions are suggested for leveraging AI/ML in next-generation computing for emerging computing paradigms such as cloud environments. The major findings outlined in this paper concern adopting AI-based algorithms and techniques to increase operational efficiency, achieve cost savings and automation, reduce energy consumption, and solve complex cloud computing issues.

Keywords: Artificial intelligence, AI, cloud computing, deep learning, machine learning, ML, internet of things, IoT.

2204 Role of Feedbacks in Simulation-Based Learning

Authors: Usman Ghani

Abstract:

Feedback is a vital element for improving student learning in simulation-based training, as it guides and refines learning through scaffolding. A number of studies in the literature have shown that students' learning is enhanced when feedback is provided with personalized tutoring that offers specific guidance and adapts feedback to the learner in a one-to-one environment. Thus, emulating these adaptive aspects of human tutoring in simulation provides an effective methodology to train individuals. This paper presents the results of a study that investigated the effectiveness of automating different types of feedback techniques, such as Knowledge-of-Correct-Response (KCR) and Answer-Until-Correct (AUC), in software simulation for learning basic information technology concepts. For the purpose of comparison, techniques like simulation with zero or no feedback (NFB) and traditional hands-on (HON) learning environments are also examined. The paper presents a summary of findings based on quantitative analyses, which reveal that the simulation-based instructional strategies are at least as effective as hands-on teaching methodologies for the purpose of learning IT concepts. The paper also compares the results of the study with earlier studies and recommends strategies for using feedback mechanisms to improve students' learning in designing simulation-based IT training.

Keywords: Simulation, feedback, training, hands-on, labs.

2203 Cluster Algorithm for Genetic Diversity

Authors: Manpreet Singh, Keerat Kaur, Bhavdeep Singh

Abstract:

With hardware technology advancing, the cost of storage is decreasing. Thus there is an urgent need for new techniques and tools that can intelligently and automatically assist us in transforming this data into useful knowledge. Different techniques of data mining have been developed which are helpful for handling these large databases [7]. Data mining is also finding its role in the field of biotechnology. Pedigree means the associated ancestry of a crop variety. Genetic diversity is the variation in the genetic composition of individuals within or among species. Genetic diversity depends upon the pedigree information of the varieties. Parents at lower hierarchic levels carry more weight for predicting genetic diversity than those at upper hierarchic levels; the weight decreases as the level increases. For crossbreeding, the two varieties should be as genetically diverse as possible so as to incorporate the useful characters of the two varieties in the newly developed variety. This paper discusses the searching and analyzing of different possible pairs of varieties, selected on the basis of morphological characters, climatic conditions and nutrients, so as to obtain the most optimal pair that can produce the required crossbred variety. An algorithm was developed to determine the genetic diversity between the selected wheat varieties, and the cluster analysis technique is used for retrieving the results.
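A minimal sketch of the cluster-analysis step under stated assumptions: varieties described by numeric scores for morphological, climatic and nutrient traits are grouped hierarchically, and the most distant pair is taken as a crude proxy for the most diverse cross. The variety names and scores are placeholders, not the study's data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform

varieties = ["PBW343", "HD2967", "WH1105", "DBW17"]      # hypothetical wheat varieties
traits = np.array([[7.0, 3.2, 5.1],                      # morphological / climatic / nutrient scores
                   [6.8, 3.0, 5.4],
                   [4.1, 6.5, 2.2],
                   [4.0, 6.8, 2.0]])

dist = pdist(traits, metric="euclidean")
clusters = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print(dict(zip(varieties, clusters)))                    # two clusters of similar varieties

# Pick the most dissimilar pair (a crude proxy for the most diverse cross).
d = squareform(dist)
i, j = np.unravel_index(np.argmax(d), d.shape)
print("most diverse pair:", varieties[i], "x", varieties[j])
```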

Keywords: Genetic diversity, pedigree, nutrients.

2202 Incorporating Multiple Supervised Learning Algorithms for Effective Intrusion Detection

Authors: Umar Albalawi, Sang C. Suh, Jinoh Kim

Abstract:

As the internet continues to expand its usage with an enormous number of applications, cyber-threats have significantly increased accordingly. Thus, accurate detection of malicious traffic in a timely manner is a critical concern for security in today's Internet. One approach for intrusion detection is to use Machine Learning (ML) techniques. Several methods based on ML algorithms have been introduced over the past years, but they are largely limited in terms of detection accuracy and/or time and space complexity to run. In this work, we present a novel method for intrusion detection that incorporates a set of supervised learning algorithms. The proposed technique provides high accuracy and outperforms existing techniques that simply utilize a single learning method. In addition, our technique relies on partial flow information (rather than full information) for detection, and thus it is lightweight and desirable for online operation with the property of early identification. With the publicly available mid-Atlantic CCDC intrusion dataset, we show that our proposed technique yields a high detection rate of over 99% with a very low false alarm rate (0.4%).
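A minimal sketch of combining several supervised learners for traffic classification via soft voting, under stated assumptions: the synthetic features stand in for partial-flow attributes, and the particular base learners are illustrative choices rather than the paper's exact combination scheme or the CCDC dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Stand-in for partial-flow features (first-packet sizes, inter-arrival times, ...)
X, y = make_classification(n_samples=2000, n_features=12, n_informative=8,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0,
                                          stratify=y)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("knn", KNeighborsClassifier(n_neighbors=5))],
    voting="soft")                       # average the predicted class probabilities
ensemble.fit(X_tr, y_tr)
print(classification_report(y_te, ensemble.predict(X_te), digits=3))
```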

Keywords: Intrusion Detection, Supervised Learning, Traffic Classification.

2201 Removal of Textile Dye from Industrial Wastewater by Natural and Modified Diatomite

Authors: Hakim Aguedal, Abdelkader Iddou, Abdallah Aziz, Djillali Reda Merouani, Ferhat Bensaleh, Saleh Bensadek

Abstract:

The textile industry produces a high volume of colored effluent each year. The management or treatment of these discharges depends on the techniques applied. Adsorption is one of the wastewater treatment techniques intended to treat this kind of pollution, and its performance and efficiency depend predominantly on the nature of the adsorbent used. Therefore, scientific research is directed towards the development of new materials, using different physical and chemical treatments to improve their adsorption capacities. In the same perspective, we looked at the effect of heat treatment on the effectiveness of diatomite, which is found in abundance in Algeria. The textile dye Orange Bezaktiv (SRL-150), which is used as the organic pollutant in this study, is provided by the textile company SOITEXHAM in the city of Oran (west Algeria). The effect of different physicochemical parameters on the adsorption of SRL-150 on natural and modified diatomite is studied, and the adsorption kinetics and isotherms were modeled.
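A minimal sketch of the isotherm-modelling step: fitting the Langmuir model q_e = q_max·K_L·C_e / (1 + K_L·C_e) to equilibrium data by non-linear least squares. The concentration and uptake values below are invented placeholders, not the measured SRL-150 data.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, k):
    """Langmuir isotherm: uptake as a function of equilibrium concentration."""
    return qmax * k * ce / (1.0 + k * ce)

ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])        # equilibrium concentration (mg/L), hypothetical
qe = np.array([8.2, 13.5, 19.8, 25.1, 28.4])        # dye uptake (mg/g), hypothetical

(qmax, k), _ = curve_fit(langmuir, ce, qe, p0=[30.0, 0.05])
print(f"qmax = {qmax:.1f} mg/g, K_L = {k:.3f} L/mg")
```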

Keywords: Wastewater treatment, diatomite, adsorption, dye pollution, kinetics, isotherm.

2200 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems

Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan

Abstract:

Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in terms of data-reading speed. As we ascend the hierarchy, data reading becomes faster. Thus, migrating the application's important data that will be accessed in the near future to the uppermost level will reduce the application's I/O waiting time and hence its execution elapsed time. In this research, we implement a trace-driven two-level parallel hybrid storage system prototype that consists of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses, in parallel with its on-demand requests. The important data (i.e. the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach, integrated with data mining techniques, reduces the application execution elapsed time by at least 22% when using a variety of traces.

Keywords: Data mining, hybrid storage system, recurrent neural network, support vector machine.

2199 Research on IBR-Driven Distributed Collaborative Visualization System

Authors: Yin Runmin, Song Changfeng

Abstract:

Image-Based Rendering (IBR) techniques have recently reached broad fields, which poses the critical challenge of building an IBR-driven visualization platform that meets the requirements of high performance, aggregation and concentration of distributed visualization resources over large bounds, deployment by multiple operators, and CSCW design. This paper presents a unique IBR-based visualization dataflow model that refers to the specific characteristics of IBR techniques, then discusses the prominent features of an IBR-driven distributed collaborative visualization (DCV) system, before finally proposing a novel prototype. The prototype provides three well-defined levels of modules, working as the Central Visualization Server, Local Proxy Server and Visualization Aid Environment, through which data and control for collaboration move following the previous dataflow model. With the aid of this triple-hierarchy architecture, IBR-oriented application construction becomes easy. The employed augmented collaboration strategy not only achieves convenient synchronous control by multiple users and stable processing management, but is also extendable and scalable.

Keywords: Image-Based Rendering, Distributed Collaborative Visualization, Computer Supported Cooperative Work, Model and Simulation, Modular Visualization Environment.

2198 Zero-knowledge-like Proof of Cryptanalysis of Bluetooth Encryption

Authors: Eric Filiol

Abstract:

This paper presents a protocol aiming at proving that an encryption system contains structural weaknesses without disclosing any information on those weaknesses. A verifier can check in polynomial time that a given property of the cipher system output has been effectively realized. This property has been chosen by the prover in such a way that it cannot be achieved by known attacks or exhaustive search, but only if the prover indeed knows some undisclosed weaknesses that may effectively endanger the cryptosystem's security. This protocol has been denoted a zero-knowledge-like proof of cryptanalysis. In this paper, we apply this protocol to the Bluetooth core encryption algorithm E0, used in many mobile environments, and thus we suggest that its security can seriously be put into question.

Keywords: Bluetooth encryption, Bluetooth security, Bluetooth protocol, Stream cipher, Zero-knowledge, Cryptanalysis.

2197 Application of Data Mining Techniques for Tourism Knowledge Discovery

Authors: Teklu Urgessa, Wookjae Maeng, Joong Seek Lee

Abstract:

Five implementations of three data mining classification techniques were experimented with for extracting important insights from tourism data. The aim was to find the best-performing algorithm among those compared for tourism knowledge discovery. The knowledge discovery from data process was used as a process model, and 10-fold cross validation was used for testing purposes. Various data preprocessing activities were performed to obtain the final dataset for model building. Classification models of the selected algorithms were built with different scenarios on the preprocessed dataset. The algorithm that performed best on the tourism dataset was Random Forest (76%) before applying information-gain-based attribute selection, and J48 (C4.5) (75%) after selecting the attributes most relevant to the class (target) attribute. In terms of model building time, attribute selection improves the efficiency of all algorithms, with the Artificial Neural Network (multilayer perceptron) showing the highest improvement (90%). The rules extracted from the decision tree model are presented; they reveal intricate, non-trivial knowledge and insight that would otherwise not be discovered by simple statistical analysis, despite the mediocre accuracy achieved by the classification algorithms.
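A minimal sketch of the evaluation protocol described above, under stated assumptions: 10-fold cross validation of a Random Forest before and after attribute selection, with scikit-learn standing in for Weka, mutual information standing in for the information-gain ranking, and a synthetic dataset in place of the tourism data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=600, n_features=20, n_informative=6,
                           n_classes=3, random_state=1)

rf = RandomForestClassifier(n_estimators=300, random_state=1)
print(f"all attributes : {cross_val_score(rf, X, y, cv=10).mean():.3f}")

# Keep only the attributes most related to the class, then re-evaluate.
selected = make_pipeline(SelectKBest(mutual_info_classif, k=8), rf)
print(f"top attributes : {cross_val_score(selected, X, y, cv=10).mean():.3f}")
```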

Keywords: Classification algorithms; data mining; tourism; knowledge discovery.

2196 Web Page Watermarking: XML files using Synonyms and Acronyms

Authors: Nighat Mir, Sayed Afaq Hussain

Abstract:

Advances in the field of computing have massively increased the use of web-based electronic documents. Current copyright protection laws are inadequate to prove ownership of electronic documents and do not provide strong features against copying and manipulating information from the web. This has opened many channels for securing information, and significant evolutions have been made in the area of information security. Digital watermarking has developed into a very dynamic area of research and has addressed challenging issues for digital content. Watermarking can be visible (logos or signatures) or invisible (encoding and decoding). Many visible watermarking techniques have been studied for text documents, but there are very few for web-based text. XML files are used to trade information on the internet and contain important information. In this paper, two invisible watermarking techniques using synonyms and acronyms are proposed for XML files to prove intellectual ownership and to achieve security. An analysis is made for different attacks, and the capacity that can be embedded in the XML file is also measured. A comparative analysis of capacity is also made for both methods. The system has been implemented using the C# language and all tests are made practically to get the results.
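A toy Python sketch of the synonym-substitution idea (the acronym variant is analogous): each watermark bit decides whether a candidate word in the XML text is left unchanged (0) or replaced by its synonym (1), so an extractor holding the same dictionary can recover the bits. The dictionary, the XML snippet, and the embedding rule are illustrative assumptions; the paper's actual C# system, attack analysis, and capacity measurements are not reproduced here.

```python
import xml.etree.ElementTree as ET

SYNONYMS = {"big": "large", "quick": "fast", "buy": "purchase"}   # illustrative dictionary
REVERSE = {v: k for k, v in SYNONYMS.items()}

def embed(xml_text, bits):
    """Replace a candidate word with its synonym when the next bit is 1."""
    root = ET.fromstring(xml_text)
    bit_iter = iter(bits)
    for elem in root.iter():
        if not elem.text:
            continue
        words = elem.text.split()
        for i, w in enumerate(words):
            if w.lower() in SYNONYMS and next(bit_iter, 0) == 1:
                words[i] = SYNONYMS[w.lower()]
        elem.text = " ".join(words)
    return ET.tostring(root, encoding="unicode")

def extract(xml_text):
    """Recover the bit sequence by checking which form of each candidate word appears."""
    bits = []
    for elem in ET.fromstring(xml_text).iter():
        for w in (elem.text or "").split():
            if w.lower() in SYNONYMS:
                bits.append(0)
            elif w.lower() in REVERSE:
                bits.append(1)
    return bits

doc = "<order><note>quick delivery of a big package</note></order>"
marked = embed(doc, [1, 0])
print(marked)            # "fast delivery of a big package"
print(extract(marked))   # [1, 0]
```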

Keywords: Watermarking, Extensible Markup Language (XML), Synonyms, Acronyms, Copyright protection.

2195 Identification of Author and Reviewer from Single and Double Blind Paper

Authors: Jatinderkumar R. Saini, Nikita R. Sonthalia, Khushbu A. Dodiya

Abstract:

Research leads to the development of science and technology and hence to the betterment of humankind. Journals and conferences provide a platform to receive a large number of research papers for publication and presentation before the expert and peer-level scientific community. In order to assure the quality of such papers, they are also sent to reviewers for their comments. In order to maintain good ethical standards, the research papers are sent to reviewers in such a way that authors and reviewers do not know each other's identity. This technique is called the Double-blind Review Process. It is called a Single-blind Review Process if the identity of any one party, generally the authors', is disclosed to the other. This paper presents techniques by which the identity of the author as well as the reviewer can be found even under the Double-blind Review Process. It is proposed that the characteristics and techniques presented here will help journals and conferences guard against intentional or unintentional disclosure of identity-revealing information by either party.

Keywords: Author, Conference, Double Blind Paper, Journal, Reviewer, Single Blind Paper.

2194 A New Approach for Assertions Processing during Assertion-Based Software Testing

Authors: Ali M. Alakeel

Abstract:

Assertion-based software testing has been shown to be a promising tool for generating test cases that reveal program faults. Because the number of assertions may be very large for industry-size programs, one of the main concerns about the applicability of assertion-based testing is the amount of search time required to explore a large number of assertions. This paper presents a new approach for assertion exploration during the process of assertion-based software testing. Our initial experiments with the proposed approach show that the performance of assertion-based testing may be improved, therefore making this approach more efficient when applied to programs with a large number of assertions.

Keywords: Software testing, assertion-based testing, program assertions.

2193 An Approach for Reducing the Computational Complexity of LAMSTAR Intrusion Detection System using Principal Component Analysis

Authors: V. Venkatachalam, S. Selvan

Abstract:

The security of computer networks plays a strategic role in modern computer systems. Intrusion Detection Systems (IDS) act as the 'second line of defense' placed inside a protected network, looking for known or potential threats in network traffic and/or audit data recorded by hosts. We developed an Intrusion Detection System using the LAMSTAR neural network to learn patterns of normal and intrusive activities and to classify observed system activities, and compared the performance of the LAMSTAR IDS with other classification techniques using five classes of KDDCup99 data. The LAMSTAR IDS gives better performance at the cost of high computational complexity, training time and testing time when compared to other classification techniques (Binary Tree classifier, RBF classifier, Gaussian Mixture classifier). We further reduced the computational complexity of the LAMSTAR IDS by reducing the dimension of the data using principal component analysis, which in turn reduces the training and testing time with almost the same performance.
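A minimal sketch of the dimensionality-reduction step under stated assumptions: features are standardized, enough principal components are kept to explain about 95% of the variance, and a stand-in classifier (an RBF-kernel SVM, not the LAMSTAR network) is cross-validated on the reduced data. The dataset is synthetic rather than KDDCup99.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic 5-class data with 41 features standing in for the intrusion records.
X, y = make_classification(n_samples=1000, n_features=41, n_informative=12,
                           n_classes=5, n_clusters_per_class=1, random_state=0)

reduced = make_pipeline(StandardScaler(),
                        PCA(n_components=0.95),    # keep 95% of the variance
                        SVC(kernel="rbf"))
print(f"accuracy with PCA: {cross_val_score(reduced, X, y, cv=5).mean():.3f}")
```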

Keywords: Binary Tree Classifier, Gaussian Mixture, Intrusion Detection System, LAMSTAR, Radial Basis Function.

2192 Authentication and Data Hiding Using a Reversible ROI-based Watermarking Scheme for DICOM Images

Authors: Osamah M. Al-Qershi, Khoo Bee Ee

Abstract:

In recent years, image watermarking has become an important research area in data security, confidentiality and image integrity. Many watermarking techniques have been proposed for medical images. However, medical images, unlike most images, require extreme care when embedding additional data within them, because the additional information must not affect the image quality and readability. Also, medical records, electronic or not, are bound by medical secrecy; for that reason, the records must be confidential. To fulfill those requirements, this paper presents a lossless watermarking scheme for DICOM images. The proposed fragile scheme combines two reversible techniques based on difference expansion for patient data hiding and for protecting the region of interest (ROI) with tamper detection and recovery capability. Patient data are embedded into the ROI, while recovery data are embedded into the region of non-interest (RONI). The experimental results show that the original image can be exactly extracted from the watermarked one in case of no tampering. In case of a tampered ROI, the tampered area can be localized and recovered with a high quality version of the original area.
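A minimal sketch of the underlying difference-expansion operation (Tian-style) on a single grayscale pixel pair: the difference is expanded to carry one bit and the transform is exactly invertible. Overflow/underflow handling, location maps, and the paper's ROI/RONI tamper-detection and recovery logic are omitted.

```python
def embed_pair(x, y, bit):
    """Hide one bit in a pixel pair by expanding their difference."""
    l = (x + y) // 2                 # integer average (recoverable after embedding)
    h = x - y                        # difference
    h2 = 2 * h + bit                 # expanded difference carrying the payload bit
    return l + (h2 + 1) // 2, l - h2 // 2

def extract_pair(x2, y2):
    """Recover the original pixel pair and the hidden bit."""
    l = (x2 + y2) // 2
    h2 = x2 - y2
    bit = h2 & 1
    h = h2 >> 1                      # original difference
    return l + (h + 1) // 2, l - h // 2, bit

x2, y2 = embed_pair(120, 118, 1)
print(x2, y2)                        # watermarked pair: 122 117
print(extract_pair(x2, y2))          # (120, 118, 1): exact recovery of pixels and bit
```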

Keywords: DICOM, reversible, ROI-based, watermarking.

2191 Educational Data Mining: The Case of Department of Mathematics and Computing in the Period 2009-2018

Authors: M. Sitoe, O. Zacarias

Abstract:

University education is influenced by several factors, ranging from the adoption of strategies to strengthen the whole process to the improvement of the students' own academic performance. This work uses data mining techniques to develop a predictive model to identify students with a tendency towards evasion and retention. To this end, a database of real students' data from the Department of University Admission (DAU) and the Department of Mathematics and Informatics (DMI) was used. The data comprised 388 undergraduate students admitted in the years 2009 to 2014. The Weka tool was used for model building, using three different techniques, namely k-nearest neighbor, random forest, and logistic regression. To allow for training on multiple train-test splits, a cross-validation approach was employed with a varying number of folds. To reduce bias and variance and improve the performance of the models, the ensemble methods Bagging and Stacking were used. After comparing the results obtained by the three classifiers, logistic regression using Bagging with seven folds obtained the best performance, showing results above 90% in all evaluated metrics: accuracy, true positive rate, and precision. Retention is the most common tendency.
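A minimal sketch of the best-performing setup described above (bagged logistic regression evaluated with cross validation), with scikit-learn standing in for Weka and a synthetic dataset in place of the real student records; the number of folds and the metrics mirror the description, the rest is assumption.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate

# Synthetic stand-in for the 388 student records (binary outcome: evasion vs. retention).
X, y = make_classification(n_samples=388, n_features=15, n_informative=6,
                           weights=[0.7, 0.3], random_state=7)

model = BaggingClassifier(LogisticRegression(max_iter=1000),
                          n_estimators=25, random_state=7)
scores = cross_validate(model, X, y, cv=7,
                        scoring=["accuracy", "recall", "precision"])
for metric in ["accuracy", "recall", "precision"]:
    print(metric, round(scores[f"test_{metric}"].mean(), 3))
```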

Keywords: Evasion and retention, cross validation, bagging, stacking.

2190 Utilization of Laser-Ablation Based Analytical Methods for Obtaining Complete Chemical Information of Algae

Authors: Pavel Pořízka, David Prochazka, Karel Novotný, Ota Samek, Zdeněk Pilát, Klára Procházková, and Jozef Kaiser

Abstract:

The main goal of this article is to find efficient methods for the elemental and molecular analysis of living microorganisms (algae) under defined environmental conditions and cultivation processes. The overall knowledge of chemical composition is obtained utilizing laser-based techniques: Laser-Induced Breakdown Spectroscopy (LIBS) for acquiring information about elemental composition and Raman Spectroscopy for gaining molecular information, respectively. Algal cells were suspended in liquid media and characterized using their spectra. Results obtained employing the LIBS and Raman Spectroscopy techniques will help to elucidate algae biology (nutrition dynamics depending on cultivation conditions) and to identify algal strains which have potential for applications in metal-ion absorption (bioremediation) and the biofuel industry. Moreover, bioremediation can be readily combined with the production of third-generation biofuels. In order to use algae for efficient fuel production, the optimal cultivation parameters have to be determined, leading to high production of oil in selected cells without significant inhibition of the photosynthetic activity and the culture growth rate; e.g. it is necessary to distinguish conditions for algal strains containing a high amount of highly unsaturated fatty acids. Measurements employing LIBS and Raman Spectroscopy were utilized in order to give information about the alga Trachydiscus minutus, with emphasis on the amount of lipid content inside the algal cell and on the ability of algae to withdraw nutrients from their environment for bioremediation (elemental composition), respectively. This article can serve as a reference for further efforts in describing the complete chemical composition of algal samples employing laser ablation techniques.

Keywords: Laser-Induced Breakdown Spectroscopy, Raman Spectroscopy, Algae, Algal strains, Bioremediation, Biofuels.

2189 Analytical Studies on Volume Determination of Leg Ulcer using Structured Light and Laser Triangulation Data Acquisition Techniques

Authors: M. Abdul-Rani, K. K. Chong, A. F. M. Hani, Y. B. Yap, A. Jamil

Abstract:

Imaging is defined as the process of obtaining geometric images, either two dimensional or three dimensional, by scanning or digitizing existing objects or products. In this research, it is applied to retrieve 3D information of the human skin surface in a medical application. This research focuses on analyzing and determining the volume of leg ulcers using imaging devices. Volume determination is one of the important criteria in the clinical assessment of leg ulcers. The volume and size of the leg ulcer wound give an indication of the response to treatment, whether healing or worsening. Different imaging techniques are expected to give different results (and accuracies) in generating data and images. The midpoint projection algorithm was used to reconstruct the cavity into a solid model and compute the volume. Misinterpretation of the results can affect the treatment efficacy. The objective of this paper is to compare the accuracy of two 3D data acquisition methods, laser triangulation and structured light. Using models with known volume, it was shown that the structured-light-based 3D technique produces better accuracy than the laser triangulation data acquisition method for leg ulcer volume determination.

Keywords: Imaging, Laser Triangulation, Structured Light, Volume Determination.

2188 A Combined Practical Approach to Condition Monitoring of Reciprocating Compressors using IAS and Dynamic Pressure

Authors: M. Elhaj, M. Almrabet, M. Rgeai, I. Ehtiwesh

Abstract:

A comparison and evaluation of different condition monitoring (CM) techniques, e.g. dynamic cylinder pressure and crankshaft Instantaneous Angular Speed (IAS), was carried out experimentally on a reciprocating compressor (RC) for the detection and diagnosis of valve faults in a two-stage machine, as part of a condition monitoring programme that can successfully detect and diagnose faults in the machine. Leakage in the valve plate was introduced experimentally into the two-stage reciprocating compressor. The effect of the faults on compressor performance was monitored, and the differences from the normal, healthy performance were noted as fault signatures used for the detection and diagnosis of faults. The paper concludes with what is considered to be a unique approach to condition monitoring. First, each of the two most useful techniques is used to produce a Truth Table which details the circumstances in which each method can be used to detect and diagnose a fault. The two Truth Tables are then combined into a single Decision Table to provide a unique and reliable method of detection and diagnosis of each of the individual faults introduced into the compressor. This gives accurate diagnosis of compressor faults.

Keywords: Condition Monitoring, Dynamic Pressure, Instantaneous Angular Speed, Reciprocating Compressor.
