Search results for: data integrity challenges
7666 Categorical Data Modeling: Logistic Regression Software
Authors: Abdellatif Tchantchane
Abstract:
A Matlab-based software package for logistic regression is developed to enhance the teaching of quantitative topics and to assist researchers analyzing the wide range of applications where categorical data are involved. The software offers an option of performing stepwise logistic regression to select the most significant predictors. It includes a feature to detect influential observations in the data, and investigates the effect of dropping or misclassifying an observation on a predictor variable. The input data may consist either of a set of individual responses (yes/no) with the predictor variables, or of grouped records summarizing the various categories for each unique set of predictor variables' values. Graphical displays are used to output various statistical results and to assess the goodness of fit of the logistic regression model. The software recognizes possible convergence constraints when present in the data, and the user is notified accordingly.
Keywords: Logistic regression, Matlab, Categorical data, Influential observation.
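Two of the features above — stepwise selection of significant predictors and checking the effect of dropping an observation — can be sketched in a few lines. The following is a minimal Python/statsmodels illustration on simulated data, not the authors' Matlab tool; the forward p-value rule and the 0.05 threshold are assumptions.

```python
# Minimal sketch (not the authors' Matlab tool): forward stepwise predictor
# selection by p-value, plus a leave-one-out check of how dropping each
# observation shifts a fitted coefficient.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                      # three candidate predictors
logit_p = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1])))
y = rng.binomial(1, logit_p)                       # binary (yes/no) responses

def forward_stepwise(y, X, alpha=0.05):
    """Greedily add the predictor with the smallest p-value below alpha."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        pvals = {}
        for j in remaining:
            res = sm.Logit(y, sm.add_constant(X[:, selected + [j]])).fit(disp=0)
            pvals[j] = res.pvalues[-1]             # p-value of the new term
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break
        selected.append(best)
        remaining.remove(best)
    return selected

sel = forward_stepwise(y, X)
full = sm.Logit(y, sm.add_constant(X[:, sel])).fit(disp=0)

# Influence of each observation on the first selected coefficient.
shifts = []
for i in range(len(y)):
    keep = np.arange(len(y)) != i
    res_i = sm.Logit(y[keep], sm.add_constant(X[keep][:, sel])).fit(disp=0)
    shifts.append(full.params[1] - res_i.params[1])
print("selected predictors:", sel, "| max coefficient shift:", max(np.abs(shifts)))
```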
7665 Role of Association Rule Mining in Numerical Data Analysis
Authors: Sudhir Jagtap, Kodge B. G., Shinde G. N., Devshette P. M.
Abstract:
Numerical analysis naturally finds applications in all fields of engineering and the physical sciences, but in the 21st century, the life sciences and even the arts have adopted elements of scientific computation. Numerical data analysis has become a key process in research and development across all these fields [6]. In this paper we analyze specified numerical patterns using association rule mining techniques with minimum-confidence and minimum-support mining criteria. The extracted rules and analyzed results are graphically demonstrated. Association rules are a simple but very useful form of data mining that describe the probabilistic co-occurrence of certain events within a database [7]. They were originally designed to analyze market-basket data, in which the likelihood of items being purchased together within the same transaction is analyzed.
Keywords: Numerical data analysis, Data Mining, Association Rule Mining.
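The minimum-support and minimum-confidence criteria the abstract relies on reduce to two ratios over transactions. A toy sketch follows (invented market-basket data, not the paper's numerical dataset):

```python
# Hedged sketch of the support/confidence mining criteria on toy
# market-basket transactions; thresholds are illustrative assumptions.
from itertools import combinations

transactions = [{"bread", "milk"}, {"bread", "butter"}, {"milk", "butter"},
                {"bread", "milk", "butter"}, {"bread", "milk"}]
MIN_SUPPORT, MIN_CONFIDENCE = 0.4, 0.6

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

items = set().union(*transactions)
for a, b in combinations(sorted(items), 2):
    s = support({a, b})
    if s >= MIN_SUPPORT:
        conf = s / support({a})                 # confidence of rule a -> b
        if conf >= MIN_CONFIDENCE:
            print(f"{a} -> {b}: support={s:.2f}, confidence={conf:.2f}")
```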
7664 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures
Authors: Silvina Caíno-Lores, Jesús Carretero
Abstract:
Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. Several techniques attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, which are grouped in four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, the authors include a discussion on future research lines and synergies among the former techniques.
Keywords: Co-scheduling, data-centric, data-intensive, data locality, in-memory storage, large scale.
7663 Correction of Infrared Data for Electrical Components on a Board
Authors: Seong-Ho Song, Ki-Seob Kim, Seop-Hyeong Park, Seon-Woo Lee
Abstract:
In this paper, a data correction algorithm is suggested for situations where the ambient air temperature varies. To correct the infrared data, the initial temperature or the initial infrared image data is used, so that a reference target system is not necessary. The temperature data obtained from an infrared detector show a nonlinear dependence on the surface temperature. In order to handle this nonlinearity, a Taylor series approach is adopted. It is shown that the proposed algorithm can reduce the influence of environmental temperature on the components on the board. The main advantage of this algorithm is that it uses only the initial temperature of the components on the board, rather than a separate reference device such as a black body source, to obtain reference temperatures.
Keywords: Infrared camera, temperature data compensation, environmental ambient temperature, electric component.
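The Taylor-series idea can be made concrete with a toy model: linearize an assumed nonlinear detector response around the known initial temperature and invert it after removing the ambient offset. The response function and its coefficients below are illustrative assumptions, not the paper's calibration.

```python
# Illustrative sketch of the Taylor-series correction: linearize a nonlinear
# detector response f(T) around the known initial temperature T0 so a reading
# taken under a changed ambient temperature can be corrected. The response
# model and coefficients are assumptions, not the paper's radiometric model.
import numpy as np

def f(T, T_amb):                 # assumed nonlinear detector response
    return 5e-8 * T**4 + 0.02 * (T_amb - 25.0)

T0, amb0 = 60.0, 25.0            # initial component and ambient temperature
reading0 = f(T0, amb0)

# First-order Taylor expansion around T0: f(T) ~ f(T0) + f'(T0) * (T - T0)
h = 1e-4
dfdT = (f(T0 + h, amb0) - f(T0 - h, amb0)) / (2 * h)

amb1 = 35.0                      # ambient rose by 10 degrees
reading1 = f(63.0, amb1)         # raw reading of the (slightly hotter) part

# Remove the ambient offset, then invert the linearized response.
corrected = T0 + (reading1 - 0.02 * (amb1 - amb0) - reading0) / dfdT
print(f"estimated component temperature: {corrected:.2f} C")  # ~63 C
```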
7662 Towards the Design of a GIS-Linked Agent-Based Model for the Lake Chad Basin Region: Challenges and Opportunities
Authors: Stephen Akuma, Isaac Terngu Adom, Evelyn Doofan Akuma
Abstract:
Generation after generation, humans have experienced conflicts leading to needless deaths. Usually, it begins as a minor argument that occasionally escalates into a full-fledged conflict. There has been a lingering crisis in the Lake Chad Basin (LCB) of Africa for over a decade, leading to bloodshed that has claimed thousands of lives. The terrorist group Boko Haram has claimed responsibility for these deaths. Efforts have been made by the governments in the LCB region to end the crisis through kinetic approaches, but the conflict persists. In this work, we explored non-kinetic methods used by social scientists in resolving conflicts, with a focus on computational approaches enabled by the increasing processing power of computers. Firstly, we reviewed the innovative computational methods available to researchers working on conflict, violence, and peace. Secondly, we described how an Agent-Based Model (ABM) can be linked with a Geographic Information System (GIS) to model the LCB. Finally, this research discusses the challenges and opportunities in constructing a GIS-linked Agent-Based Model of the LCB region.
Keywords: Agent-based modelling, conflict, Geographical Information Systems, Lake Chad Basin, simulation.
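One minimal way to picture the GIS-ABM linkage is to seed agents at real-world coordinates taken from a GIS layer and let them step on a discretized grid. The sketch below is hypothetical throughout — made-up coordinates standing in for LCB settlements, trivial random-walk dynamics — and is not the authors' model.

```python
# Hypothetical sketch of a GIS-ABM linkage: agents seeded at (invented)
# lon/lat points, mapped to grid cells, stepping with a toy random walk.
import random

settlements = [(13.15, 14.50), (12.98, 14.32), (13.40, 14.70)]  # illustrative lon/lat

def to_cell(lon, lat, origin=(12.5, 14.0), cell_deg=0.05):
    """Discretize GIS coordinates into ABM grid cells."""
    return (int((lon - origin[0]) / cell_deg), int((lat - origin[1]) / cell_deg))

agents = [{"id": i, "cell": to_cell(*pt)} for i, pt in enumerate(settlements)]

for step in range(3):                       # trivial stand-in dynamics
    for a in agents:
        dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        a["cell"] = (a["cell"][0] + dx, a["cell"][1] + dy)
print(agents)
```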
7661 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment
Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane
Abstract:
Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in digital investigation are not keeping up with criminal developments, and criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which will assist in developing a plausible theory that can be presented as evidence in court. This research paper aims at developing a multiagent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The proposed framework is implemented using the Java Agent Development Framework, Eclipse, a Postgres repository, and a rule engine for agent reasoning. The framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5% of the time was lost, as the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
Keywords: Artificial intelligence, computer science, criminal investigation, digital forensics.
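The case-based reasoning step can be illustrated by retrieving the most similar stored case for a new investigation. The features, labels and Jaccard similarity below are invented for illustration; the paper's agents and rule engine are far richer.

```python
# Hedged sketch of the CBR classification step: a new case is labeled by
# its most similar stored case. Case features and labels are invented.
case_base = [
    ({"deleted_files", "usb_history"}, "data exfiltration"),
    ({"encrypted_volume", "tor_artifacts"}, "anonymized browsing"),
    ({"deleted_files", "timeline_gaps", "log_wiping"}, "anti-forensics"),
]

def classify(features):
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    # Retrieve the stored case with the highest feature overlap.
    return max(case_base, key=lambda c: jaccard(features, c[0]))[1]

print(classify({"deleted_files", "log_wiping"}))   # -> "anti-forensics"
```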
7660 Structural Health Monitoring of Buildings and Infrastructure
Authors: Mojtaba Valinejadshoubi, Ashutosh Bagchi, Osama Moselhi
Abstract:
Structures such as buildings, bridges, dams, and wind turbines need to be maintained against various factors such as deterioration, excessive loads, environment, and temperature. Choosing an appropriate monitoring system is important for detecting any critical damage to a structure and addressing it to avoid adverse consequences. Structural Health Monitoring (SHM) has emerged as an effective technique to monitor the health of structures. SHM refers to an ongoing structural performance assessment using different kinds of sensors attached to or embedded in the structures to evaluate their integrity and safety, and to help engineers decide on rehabilitation measures. The ability of SHM to identify the location and severity of structural damage from changes in the characteristics of the structures, such as their frequencies, stiffness and mode shapes, helps engineers monitor the structures and take the most effective corrective actions to maintain their safety and extend their service life. The main objective of this study is to review the overall SHM process, specifically determining the natural frequency of an instrumented simply-supported concrete beam using modal testing and finite element model updating.
Keywords: Structural Health Monitoring, Natural Frequency, FFT analysis, Finite element model updating.
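As a minimal illustration of FFT-based natural-frequency identification, the dominant spectral peak of a simulated free-vibration record can be picked as follows; the 12 Hz beam frequency and the noise level are assumed values, not data from the instrumented beam.

```python
# Minimal sketch of FFT-based natural-frequency identification: pick the
# dominant spectral peak of a simulated acceleration record. The 12 Hz
# frequency, damping and noise level are assumptions for illustration.
import numpy as np

fs = 200.0                                   # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
accel = np.sin(2 * np.pi * 12.0 * t) * np.exp(-0.3 * t)   # decaying free vibration
accel += 0.1 * np.random.default_rng(1).normal(size=t.size)  # sensor noise

spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
print(f"estimated natural frequency: {freqs[spectrum.argmax()]:.2f} Hz")
```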
7659 A Generalised Relational Data Model
Authors: Georgia Garani
Abstract:
A generalised relational data model is formalised for the representation of data with nested structure of arbitrary depth. A recursive algebra for the proposed model is presented. All the operations are formally defined. The proposed model is proved to be a superset of the conventional relational model (CRM). The functionality and validity of the model is shown by a prototype implementation that has been undertaken in the functional programming language Miranda.
Keywords: Nested relations, recursive algebra, recursive nested operations, relational data model.
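The nest/unnest pair at the heart of nested relational models can be sketched on ordinary tuples; the paper's recursive algebra over arbitrary nesting depth is far more general than this single-level example.

```python
# Sketch of single-level nest/unnest; the paper's algebra is recursive
# over arbitrary depth, this only shows the basic pair of operations.
from itertools import groupby
from operator import itemgetter

flat = [("dept_a", "alice"), ("dept_a", "bob"), ("dept_b", "carol")]

def nest(rel, key=0):
    """Group tuples on one attribute, nesting the rest into a set."""
    rel = sorted(rel, key=itemgetter(key))
    return [(k, {t[1] for t in g}) for k, g in groupby(rel, key=itemgetter(key))]

def unnest(rel):
    """Inverse of nest: flatten the set-valued attribute back out."""
    return [(k, v) for k, vs in rel for v in sorted(vs)]

nested = nest(flat)          # [('dept_a', {'alice', 'bob'}), ('dept_b', {'carol'})]
assert sorted(unnest(nested)) == sorted(flat)
print(nested)
```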
7658 WiFi Data Offloading: Bundling Method in a Canvas Business Model
Authors: Majid Mokhtarnia, Alireza Amini
Abstract:
Mobile operators regard the increase in data traffic as a critical issue. As a result, a vital responsibility of the operators is to deal with this trend in a way that creates added value. This paper addresses a bundling method in a Canvas business model within a WiFi Data Offloading (WDO) strategy, by which some elements of the model may be affected. In the proposed method, a number of data packages are offered to subscribers, some of which include a complimentary volume of WiFi-offloaded data. The paper analyses this method in terms of attractiveness and profitability. The results demonstrate that the quality of implementation of the WDO strongly affects the final result and helps the decision maker make the best decision.
Keywords: Bundling, canvas business model, telecommunication, WiFi Data Offloading.
7657 Numerical Simulation of Diesel Sprays under Hot Bomb Conditions
Authors: Ishtiaq A. Chaudhry, Zia R. Tahir, F. A. Siddiqui, F. Noor, M. J. Rashid
Abstract:
It has been proved experimentally that the performance of a compression ignition (C.I.) engine is related to its spray characteristics. In modern diesel engines, spray formation and the eventual combustion process are the vital processes that offer major challenges towards enhancing engine performance. In the present work, numerical simulation has been carried out for evaporating diesel sprays using the Fluent software. For the computational fluid dynamics simulation, meshing is done using the Gambit software before the mesh is imported into Fluent. The simulation is carried out under hot bomb conditions with varying chamber conditions such as gas pressure, nozzle diameter and fuel injection pressure. For comparison purposes, the chamber conditions in the numerical simulations were kept the same as those of the experimental data. At varying chamber conditions, the spray penetration rates are compared with the existing experimental results.
Keywords: Evaporating diesel sprays, Penetration rates, Hot bomb conditions.
7656 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption
Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Moses Noel Dogonyaro
Abstract:
This paper describes the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data; it thereby addresses the need for a computational encryption model that can enhance the security of big data with respect to the privacy, confidentiality, and availability of the users. The cryptographic model applied for the computational processing of the encrypted data is the fully homomorphic encryption scheme. We contribute a theoretical presentation of high-level computational processes, based on number theory and algebra, that can easily be integrated and leveraged in cloud computing, with detailed mathematical concepts underlying fully homomorphic encryption models. This contribution supports the full implementation of a cryptographic security algorithm for big data analytics.
Keywords: Data Analytics, Security, Privacy, Bootstrapping, and Fully Homomorphic Encryption Scheme.
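Fully homomorphic schemes are too involved for a short sketch, but the core property — computing on ciphertexts without decrypting — can be shown with a toy *additively* homomorphic Paillier scheme. This is plainly not FHE (no ciphertext multiplication) and uses deliberately tiny, insecure parameters; it is for intuition only.

```python
# Toy Paillier cryptosystem: additively homomorphic only, with tiny,
# insecure parameters. It illustrates computing on ciphertexts without
# decryption; real FHE schemes extend this to arbitrary circuits.
import math, random

p, q = 1117, 1129                      # toy primes; never use sizes like this
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)    # precomputed decryption constant

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 1234, 5678
c = (encrypt(a) * encrypt(b)) % n2     # multiply ciphertexts...
print(decrypt(c))                      # ...decrypts to a + b = 6912
```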
7655 The Internet of Healthcare Things: A European Perspective and a Review of Ethical Concerns
Authors: M. Emmanouilidou
Abstract:
The Internet of Things (IoT) is a disruptive technological paradigm that is at the center of the digital evolution, integrating physical and virtual worlds and leading to the creation of extended interconnected ecosystems characterized as smart environments. The concept of the IoT has a broad range of applications in different industries, including the healthcare sector. The Internet of Healthcare Things (IoHT), a branch of the IoT, is expected to bring promising benefits to all involved stakeholders and to accelerate the revolution of the healthcare sector through a transition towards preventive and personalized medicine. The socio-economic challenges that the healthcare sector is facing further emphasize the need for a radical transformation of healthcare systems in both developed and developing countries, with the role of pervasive technological innovations, such as IoHT, recognized as key to counteracting the relevant challenges. Besides the number of potential opportunities that IoHT presents, there are fundamental ethical concerns that need to be considered and addressed in relation to the application of IoHT. This paper contributes to the discussion of the emerging topic of IoHT by providing an overview of the role and potential of IoHT, highlighting the characteristics of the current and future healthcare landscape, reporting on the up-to-date status of IoHT in Europe, and reflecting upon existing research in the ethics of IoHT by incorporating additional ethical dimensions that have so far been ignored, which can provide pathways for future research in the field.
Keywords: Ethics, Europe, healthcare, internet of things.
7654 Effect of Heat-Moisture Treatment on the Formation and Properties of Resistant Starches from Mung Bean (Phaseolus radiatus) Starches
Authors: Su-Ling Li, Qun-Yu Gao
Abstract:
Mung bean starches were subjected to heat-moisture treatment (HMT) at different moisture contents (15%, 20%, 25%, 30% and 35%) at 120 °C for 12 h. The impact on the yields of resistant starch (RS), microstructure, and physicochemical and functional properties was investigated. Compared to native starch, the RS content of heat-moisture treated starches increased significantly, and the RS level of HMT-20 was the highest of all the starches. Clear birefringence was displayed at the center of native starch granules. For HMT starches, pronounced birefringence was exhibited on the periphery of the starch granules; however, it disappeared at the centre of some granules. The shape of the HMT starches had not changed, and the integrity of the starch granules was preserved under all conditions. Concavities could be observed on HMT starches under scanning electron microscopy. After HMT, apparent amylose contents were increased and the starch macromolecules were degraded in comparison with native starch. There was a reduction in the swelling power of HMT starches, but their solubility was higher than that of native starch. Both native and HMT starches showed an A-type X-ray diffraction pattern; furthermore, the HMT starches showed higher intensities at the 15.0° and 22.9° (2θ) peaks than native starch.
Keywords: Resistant starch, mung bean (Phaseolus radiatus) starch, heat-moisture treatment, physicochemical properties.
7653 Evaluating Complexity – Ethical Challenges in Computational Design Processes
Authors: J. Partanen
Abstract:
Complexity as a theoretical background has made it easier to understand and explain the features and dynamic behavior of various complex systems. As this common theoretical background has confirmed, borrowing terminology for design from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution and adaptation are appropriate for describing the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control in the urban environment, and consequently to seek novel methods for steering development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are as such beyond human ethics. Self-organization or evolution is neither good nor bad; their mechanisms are by nature devoid of reason. They are common features of urban dynamics, and they cannot be prevented; yet their dynamics can be studied and supported. The paradigm of complexity and new design approaches has been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the (non-existent) ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being “non-human”. In this paper, the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture and dynamic urban models, and of the literature. It is suggested that ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.
Keywords: Urban planning, architecture, dynamic modeling, ethics, complexity theory.
7652 Hierarchical Clustering Algorithms in Data Mining
Authors: Z. Abdullah, A. R. Hamdan
Abstract:
Clustering is the process of grouping data objects into clusters such that objects in the same cluster are similar to one another. Clustering is one of the major areas of data mining, and clustering algorithms can be classified into partitional, hierarchical, density-based and grid-based approaches. In this paper we survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON and BIRCH. The resulting state of the art of these algorithms will help in eliminating current problems as well as in deriving more robust and scalable clustering algorithms.
Keywords: Clustering, method, algorithm, hierarchical, survey.
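A generic agglomerative (bottom-up) run — the family to which CURE, ROCK, CHAMELEON and BIRCH belong — can be sketched with SciPy; none of the four surveyed algorithms is itself implemented here.

```python
# Sketch of generic agglomerative hierarchical clustering with SciPy's
# linkage; not an implementation of CURE, ROCK, CHAMELEON or BIRCH.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])

Z = linkage(X, method="ward")          # bottom-up merge tree (dendrogram)
labels = fcluster(Z, t=2, criterion="maxclust")  # cut into two clusters
print(labels)                          # two clusters of 20 points each
```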
7651 Iterative Clustering Algorithm for Analyzing Temporal Patterns of Gene Expression
Authors: Seo Young Kim, Jae Won Lee, Jong Sung Bae
Abstract:
Microarray experiments are information rich; however, extensive data mining is required to identify the patterns that characterize the underlying mechanisms of action. For biologists, a key aim when analyzing microarray data is to group genes based on the temporal patterns of their expression levels. In this paper, we used an iterative clustering method to find temporal patterns of gene expression. We evaluated the performance of this method by applying it to real sporulation data and simulated data. The patterns obtained using the iterative clustering were found to be superior to those obtained using existing clustering algorithms.
Keywords: Clustering, microarray experiment, temporal pattern of gene expression data.
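A plain re-assign/re-estimate iteration on simulated time courses conveys the flavor of iterative clustering; the paper's method differs in detail, and the two sinusoidal expression patterns below are invented stand-ins for real profiles.

```python
# Hedged sketch of iterative clustering of temporal expression profiles:
# a Lloyd-style loop (re-assign, re-estimate) on simulated time courses.
# The two sinusoidal patterns are invented; the paper's method differs.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 10)                        # ten time points
genes = np.vstack([np.sin(2 * np.pi * t) + rng.normal(0, 0.2, (30, 10)),
                   np.cos(2 * np.pi * t) + rng.normal(0, 0.2, (30, 10))])

k = 2
centers = genes[[0, 30]].copy()                  # deterministic seed profiles
for _ in range(20):                              # iterate until (near) stable
    d = ((genes[:, None, :] - centers[None]) ** 2).sum(-1)
    assign = d.argmin(1)                         # re-assign genes to patterns
    centers = np.vstack([genes[assign == j].mean(0) for j in range(k)])
print(np.bincount(assign))                       # ~30 genes per pattern
```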
7650 Effective Software-Based Solution for Processing Mass Downstream Data in Interactive Push VOD System
Authors: Ni Hong, Wu Guobin, Wu Gang, Pan Liang
Abstract:
Interactive push VOD is a new kind of system that incorporates push technology and interactive techniques. It can push movies to users at high speed during off-peak hours for optimal network usage, so as to save bandwidth. This paper presents an effective software-based solution for processing mass downstream data at the terminals of an interactive push VOD system, where the service can download a movie according to a viewer's selection. The downstream data are divided into two categories: (1) carousel data delivered according to the DSM-CC protocol; (2) IP data delivered according to the Euro-DOCSIS protocol. In order to accelerate download speed and reduce the data loss rate at terminals, the software strategy introduces caching, multi-threading and resuming mechanisms. The experiments demonstrate the advantages of the software-based solution.
Keywords: DSM-CC, data carousel, Euro-DOCSIS, push VOD.
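The three mechanisms named above — caching, multi-threading and resuming — can be sketched on a simulated downstream source; no DSM-CC or Euro-DOCSIS specifics (which the paper implements) appear in this hypothetical sketch.

```python
# Hypothetical sketch of caching, multi-threading and resuming on a
# simulated downstream source; protocol details are deliberately omitted.
from concurrent.futures import ThreadPoolExecutor

MOVIE = bytes(range(256)) * 512                 # stand-in downstream data
CHUNK = 4096
cache = {}                                      # chunk_id -> bytes (the cache)

def fetch(chunk_id):
    if chunk_id in cache:                       # caching: skip re-download
        return chunk_id
    cache[chunk_id] = MOVIE[chunk_id * CHUNK:(chunk_id + 1) * CHUNK]
    return chunk_id

n_chunks = (len(MOVIE) + CHUNK - 1) // CHUNK
cache[3] = MOVIE[3 * CHUNK:4 * CHUNK]           # resuming: chunk 3 already held

with ThreadPoolExecutor(max_workers=8) as pool: # multi-thread the fetches
    list(pool.map(fetch, range(n_chunks)))

assert b"".join(cache[i] for i in range(n_chunks)) == MOVIE
print(f"reassembled {len(MOVIE)} bytes from {n_chunks} chunks")
```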
7649 Approaches and Schemes for Storing DTD-Independent XML Data in Relational Databases
Authors: Mehdi Emadi, Masoud Rahgozar, Adel Ardalan, Alireza Kazerani, Mohammad Mahdi Ariyan
Abstract:
The volume of XML data exchange is explosively increasing, and the need for efficient mechanisms of XML data management is vital. Many XML storage models have been proposed for storing DTD-independent XML documents in relational database systems. Benchmarking is the best way to highlight the pros and cons of the different approaches. In this study, we use a common benchmarking scheme, known as XMark, to compare the most cited and newly proposed DTD-independent methods in terms of logical reads, physical I/O, CPU time and duration. We show the effect of label paths, of extracting values into separate tables, and of the type of join needed for each method's query answering.
Keywords: XML Data Management, XPath, DTD-Independent XML Data.
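A hedged sketch of DTD-independent shredding with label paths: each text node becomes a (label_path, value) row, after which an XPath-like lookup is a plain relational query. This mirrors the general idea benchmarked in the paper, not any single surveyed method.

```python
# Sketch of DTD-independent shredding: every text node is stored as a
# (label_path, value) row; an XPath-like lookup becomes a SQL query.
import sqlite3
import xml.etree.ElementTree as ET

doc = "<site><item><name>clock</name><price>10</price></item></site>"

def shred(elem, path=""):
    path = f"{path}/{elem.tag}"
    if elem.text and elem.text.strip():
        yield (path, elem.text.strip())
    for child in elem:
        yield from shred(child, path)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE xml_edge (label_path TEXT, value TEXT)")
db.executemany("INSERT INTO xml_edge VALUES (?, ?)", shred(ET.fromstring(doc)))

# The XPath /site/item/name becomes a plain relational lookup.
print(db.execute("SELECT value FROM xml_edge WHERE label_path = '/site/item/name'").fetchall())
```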
7647 Influence of Build Orientation on Machinability of Selective Laser Melted Titanium Alloy Ti-6Al-4V
Authors: Manikandakumar Shunmugavel, Ashwin Polishetty, Moshe Goldberg, Junior Nomani, Guy Littlefair
Abstract:
Selective laser melting (SLM), a promising additive manufacturing (AM) technology, has huge potential in the fabrication of near-net-shape Ti-6Al-4V components. However, the poor surface finish of components fabricated by this technology requires secondary machining to achieve the desired accuracy and tolerance. A systematic understanding of the machinability of SLM-fabricated Ti-6Al-4V components is therefore paramount to improving productivity and product quality. Considering the significance of machining for SLM-fabricated Ti-6Al-4V components, this research aims to study the influence of build orientation on machinability characteristics by performing low-speed orthogonal cutting tests. In addition, the machinability of SLM-fabricated Ti-6Al-4V is compared with conventionally produced wrought Ti-6Al-4V to understand the influence of the SLM technology on machining. This paper provides evidence for the hypothesis that build orientation influences cutting forces, chip formation and surface integrity during orthogonal cutting of SLM Ti-6Al-4V samples. Results obtained from the low-speed orthogonal cutting tests highlight the practical importance of microstructure and build orientation for the machinability of SLM Ti-6Al-4V.
Keywords: Additive manufacturing, build orientation, machinability, titanium alloys (Ti-6Al-4V).
7646 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia
Authors: Yuyun Wabula, B. J. Dewancker
Abstract:
In the past decade, social networking apps have grown very rapidly. Geolocation data is one of the important features of social media that can attach a user's real-world location coordinates. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior, and aims to explore the relation between Twitter geolocation and the presence of people in urban areas. Firstly, the study analyzes the spread of people in particular areas within the city using Twitter social media data. Secondly, we match and categorize existing places based on visits by the same individuals. We then combine the Twitter tracking data with questionnaire data to capture the Twitter user profile. To do that, we used frequency distribution analysis to determine the percentage of visitors. To validate the hypothesis, we compare the results with local population statistics and the land use mapping released by the city planning department of the Makassar local government. The results show that there is a correlation between Twitter geolocation and questionnaire data. Thus, integrating Twitter data and survey data can reveal the profile of social media users.
Keywords: Geolocation, Twitter, distribution analysis, human mobility.
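The frequency-distribution step can be sketched by binning geotagged points into grid cells and ranking the cells by visit counts; the coordinates below are invented stand-ins, not the Makassar dataset.

```python
# Illustrative sketch of the frequency-distribution step: invented
# geotagged points are binned into grid cells and ranked by count.
from collections import Counter

tweets = [(119.412, -5.135), (119.413, -5.136), (119.450, -5.160),
          (119.412, -5.134), (119.451, -5.161), (119.412, -5.135)]

def cell(lon, lat, size=0.01):                  # ~1 km grid cells
    return (round(lon / size), round(lat / size))

counts = Counter(cell(*pt) for pt in tweets)
total = sum(counts.values())
for c, n in counts.most_common():
    print(f"cell {c}: {n} tweets ({100 * n / total:.0f}%)")
```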
7645 Database Compression for Intelligent On-board Vehicle Controllers
Authors: Ágoston Winkler, Sándor Juhász, Zoltán Benedek
Abstract:
The vehicle fleets of public transportation companies are often equipped with intelligent on-board passenger information systems. A frequently used but time- and labor-intensive way of keeping the on-board controllers up-to-date is manual update using memory cards (e.g. flash cards) or portable computers. This paper describes a compression algorithm that enables data transmission over low-bandwidth wireless radio networks (e.g. GPRS) by minimizing the amount of data traffic. In typical cases it reaches a compression ratio an order of magnitude better than that of general-purpose compressors. Compressed data can be easily expanded by the low-performance controllers, too.
Keywords: Data analysis, data compression, differential encoding, run-length encoding, vehicle control.
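The two keyword techniques compose naturally: delta-encode a sorted column (e.g. departure times), then run-length encode the deltas. The paper's domain-specific algorithm is more elaborate; this sketch only shows the principle.

```python
# Compact sketch of differential (delta) encoding followed by run-length
# encoding on a sorted column such as timetable departure times.
def delta_rle(values):
    deltas = [values[0]] + [b - a for a, b in zip(values, values[1:])]
    out, run = [], 1
    for prev, cur in zip(deltas, deltas[1:]):
        if cur == prev:
            run += 1
        else:
            out.append((prev, run))
            run = 1
    out.append((deltas[-1], run))
    return out                                   # list of (delta, run_length)

departures = [600, 615, 630, 645, 660, 690, 720, 750]   # minutes since midnight
print(delta_rle(departures))                     # [(600, 1), (15, 4), (30, 3)]
```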
7644 EUDIS-An Encryption Scheme for User-Data Security in Public Networks
Authors: S. Balaji, M. Rajaram
Abstract:
Introducing proxy interpretation for sending and receiving requests increases the capability of the server, and our approach, UDIS (User-Data Identity Security), which solves data and user authentication without extending the size of the data, performs better than a hybrid IDS (Intrusion Detection System). At the same time, requests have to pass through fewer security stages, which minimizes the response time of a request. When an anomaly is detected, the proxy extracts its identity before rejecting it to prevent it from entering the system. In the case of false anomalies, the request is reshaped and transformed into a legitimate request for further response. Finally, normal and abnormal requests are held in two different queues with their own priorities.
Keywords: IDS, Data & User authentication, UDIS.
7643 Feedrate Optimization for Ball-End Milling of Sculptured Surfaces Using Fuzzy Logic Controller
Authors: Njiri J. G., Ikua B. W., Nyakoe G. N.
Abstract:
Optimization of cutting parameters is important in precision machining with regard to efficiency and the surface integrity of the machined part. Productivity and precision in machining are usually limited by the forces emanating from the cutting process. Due to the inherently varying nature of the workpiece in terms of geometry and material composition, the peak cutting forces vary from point to point during the machining process. In order to increase productivity without compromising machining accuracy, it is important to control these cutting forces. In this paper, a fuzzy logic control algorithm is developed that can be applied to the control of peak cutting forces in the milling of spherical surfaces using ball-end mills. The controller can adaptively vary the feedrate to maintain an allowable cutting force on the tool. The control algorithm is implemented on a computer numerical control (CNC) machine. It has been demonstrated that the controller can provide stable machining and improve the performance of the CNC milling process by varying the feedrate.
Keywords: Ball-end mill, feedrate, fuzzy logic controller, machining optimization, spherical surface.
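A hand-rolled sketch of the control idea: triangular membership functions fuzzify the cutting-force error, three rules infer a feedrate change, and a weighted average defuzzifies it. The membership shapes, rule base and gains are assumptions, not the authors' tuned controller.

```python
# Hedged sketch of a fuzzy feedrate controller; membership shapes, the
# rule base and the mm/min gains are illustrative assumptions only.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if a < x < b:
        return (x - a) / (b - a)
    if b <= x < c:
        return (c - x) / (c - b)
    return 0.0

def feedrate_change(force_error):                # error = F_measured - F_allowed (N)
    neg  = tri(force_error, -200, -100, 0)       # force below the limit
    zero = tri(force_error, -100, 0, 100)        # force near the limit
    pos  = tri(force_error, 0, 100, 200)         # force above the limit
    # Rules: below limit -> speed up; near -> hold; above -> slow down.
    actions = [(neg, +20.0), (zero, 0.0), (pos, -20.0)]   # mm/min deltas
    w = sum(mu for mu, _ in actions)
    return sum(mu * a for mu, a in actions) / w if w else 0.0

print(feedrate_change(60.0))                     # force too high -> -12.0 mm/min
```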
7642 An Analysis of Genetic Algorithm Based Test Data Compression Using Modified PRL Coding
Authors: K. S. Neelukumari, K. B. Jayanthi
Abstract:
In this paper, genetic-algorithm-based test data compression is targeted at improving the compression ratio and reducing the computation time. The genetic algorithm is based on extended pattern run-length coding. The test set contains a large number of X values that can be effectively exploited to improve test data compression. In this coding method, a reference pattern is set and its compatibility is checked. For this process, a genetic algorithm is proposed to reduce the computation time of the encoding algorithm. The coding technique encodes 2n compatible or inversely compatible patterns into a single test data segment or multiple test data segments. The experimental results show that the compression ratio is improved and the computation time is reduced.
Keywords: Backtracking, test data compression (TDC), X-filling, X-propagating, genetic algorithm.
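The compatibility notion the coding scheme exploits is easy to state in code: two patterns are compatible (or inversely compatible) if they agree (or disagree) wherever neither holds an X don't-care. The GA search and the full encoder are omitted from this sketch.

```python
# Sketch of the (inverse) compatibility check for test-data patterns with
# X don't-care bits; the GA search and full encoder are not shown.
def compatible(p, q, inverse=False):
    for a, b in zip(p, q):
        if a == "X" or b == "X":
            continue                             # X absorbs any value
        if inverse and a == b:
            return False
        if not inverse and a != b:
            return False
    return True

reference = "10X1"
print(compatible("1011", reference))                # True: X absorbs the mismatch
print(compatible("01X0", reference, inverse=True))  # True: inversely compatible
```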
7641 Implementing Pro-Poor Policies for Poverty Alleviation: The Case of the White Paper on Families in South Africa
Authors: P. Mbecke
Abstract:
The role of the government in tangibly alleviating poverty and improving and sustaining the quality of people's lives remains a "work in progress" twenty-two years after the dawn of democracy in South Africa, despite a host of socio-economic programs and pro-poor policies and legislation. This paper assesses the development process and the implementation of the White Paper on Families in South Africa as one of the pro-poor policies intended to curb poverty and redress the imbalances of the apartheid regime. The paper is the result of qualitative implementation research facilitated through in-depth interviews with social work managers, complemented by literature and policy review techniques. It investigates the level of basic knowledge and understanding of the White Paper on Families, as well as the challenges of its implementation, as causes of its failure. The paper emphasizes the importance of the family-centered approach in the implementation of pro-poor policies. To facilitate the understanding of the White Paper on Families by its users, the Department of Social Development needs to take stock of the identified implementation challenges so as to facilitate its success in fostering positive family well-being, which directly contributes to the overall socio-economic development of South Africa.
Keywords: Poverty alleviation, pro-poor policy, social development, social welfare, South Africa.
7640 Parallel Vector Processing Using Multi Level Orbital DATA
Authors: Nagi Mekhiel
Abstract:
Many applications use vector operations by applying a single instruction to multiple data items that map to different locations in conventional memory. Transferring data from memory is limited by access latency and bandwidth, affecting the performance gain of vector processing. We present a memory system that makes all of its content available to processors in time, so that the processors need not access the memory: we force each location to be available to all processors at a specific time. The data move in different orbits and become available to processors in higher orbits at different times. We use this memory to apply parallel vector operations to data streams at the first orbit level. Data processed in the first level move to the upper orbit one data element at a time, allowing a processor in that orbit to apply another vector operation to deal with the serial-code limitations inherent in all parallel applications, and to interleave it with lower-level vector operations.
Keywords: Memory organization, parallel processors, serial code, vector processing.
7639 Lifelong Education for Teachers: A Tool for Achieving Effective Teaching and Learning in Secondary Schools in Benue State, Nigeria
Authors: P. I. Adzongo, O. A. Aloga
Abstract:
The purpose of the study was to examine lifelong education for teachers as a tool for achieving effective teaching and learning. Lifelong education enhances social inclusion, personal development, citizenship, employability, teaching and learning, the community and the nation. It is imperative that the teacher updates his knowledge regularly to be able to perform optimally, since he has a major role in the inculcation of desirable qualities in students; the challenges of lifelong education are also discussed. A descriptive survey design was adopted for the study. A simple random sampling technique was used to select 80 teachers as a sample from a population of 105 senior secondary school teachers in Makurdi Local Government Area of Benue State. A 20-item self-designed questionnaire, subjected to expert validation and reliability testing, was used to collect data; a reliability coefficient of 0.87 was established using Cronbach's Alpha. Mean scores and standard deviations were used to answer the 2 research questions, while chi-square was used to analyse the data for the 2 null hypotheses, which state that lifelong education for teachers is not a significant tool for achieving effective teaching, and that lifelong education for teachers does not significantly impact effective learning. The findings of the study revealed that lifelong education for teachers can be used as a tool for achieving effective teaching and learning, and the study recommended among others that government, organizations and individuals should collaborate to put lifelong education programmes for teachers on the priority list. The paper concluded that the strategic position of lifelong education for teachers in enhancing teaching, learning and the production of quality manpower in society makes it imperative for all hands to be on deck to support the programme financially and otherwise.
Keywords: Lifelong education, tool, effective teaching and learning.
7638 A Taxonomy Proposal on Criterion Structure for Evaluating Freight Village Concepts in Early-Stage Design Projects
Authors: Rıza Gürhan Korkut, Metin Çelik, Süleyman Özkaynak
Abstract:
Early-stage design and development projects for freight village initiatives require a comprehensive analysis of both qualitative and quantitative data. Based on a literature review of structural and operational management requirements, this study proposes an original taxonomy of the criterion structure used to assess freight village conceptualization, and discusses the potential challenges and uncertainties of the developed taxonomy. Besides requirement analysis, this study is also expected to contribute to forthcoming research on the benchmarking of freight villages in different regions. The methodology used in this research is a systematic review of several articles with respect to their modelling approaches, sustainability, entities and decisions, together with the uncertainties and features of their models. The major findings of the study are the categories for assessing project attributes in their environmental, socio-economic, accessibility and location aspects.
Keywords: Freight village, logistics centers, operational management, taxonomy.
7637 Adding Olive Oil into Diluents for Improving Semen Quality and Storage Ability of Roosters' Semen during Liquid Storage
Authors: Hazim J. Al-Daraji
Abstract:
The aim of this study was to investigate the effects of supplementing the diluent of roosters' semen with different levels of olive oil on the motility, viability, morphology and acrosome integrity of chicken spermatozoa after in vitro storage for up to 72 h. Semen was collected from 60 White Layer males (62 wk of age) kept in separate floor pens and randomly divided into six treatment groups (10 males in each group). The experimental groups were as follows: T1: fresh semen; T2: semen extended 1:1 with Al-Daraji 2 diluent (AD2D) alone; T3–T6: semen samples extended 1:1 with AD2D supplemented with 2 ml, 4 ml, 6 ml or 8 ml of olive oil per 100 ml of diluent, respectively. Semen samples were then stored at 5 °C for 24 h, 48 h or 72 h. There was a clear influence of diluent supplementation with olive oil on the spermatozoa motility profile; the olive oil groups (T3, T4, T5 and T6) recorded the highest scores of mass activity and individual motility during all storage periods compared to the T1 and T2 groups. In addition, the inclusion of olive oil in the semen diluent (T3, T4, T5 and T6) gave significantly higher percentages of viable spermatozoa, morphologically normal spermatozoa and intact acrosomes irrespective of storage period. These results clearly show that supplementing the diluent of roosters' semen with olive oil can improve semen quality when semen samples are stored in vitro at 5 °C for up to 72 h.
Keywords: Olive oil, diluent, liquid storage, semen quality of roosters.