Search results for: Data mining techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9246

8196 Collaborative Design System based on Object-Oriented Modeling of Supply Chain Simulation: A Case Study of Thai Jewelry Industry

Authors: Somlak Wannarumon, Apichai Ritvirool, Thana Boonrit

Abstract:

The paper proposes a new concept for developing a collaborative design system. The conceptual framework applies supply chain simulation to collaborative design, called the 'SCM-Based Design Tool'. The system is developed particularly to support design activities and to integrate all facilities together, and it aims to increase design productivity and creativity. Designers and customers can therefore collaborate through the system from the conceptual design stage onward. JAG, a Jewelry Art Generator based on artificial intelligence techniques, is integrated into the system. Moreover, the proposed system supports users as a decision tool and handles data propagation, covering the chain from raw material supply to product delivery. Data management and information sharing are presented visually to designers and customers via the user interface. The system is developed in a Web-assisted product development environment. A prototype is demonstrated for the Thai jewelry industry, but the approach is applicable to other industries.

Keywords: Collaborative design, evolutionary art, jewelry design, supply chain management.

8195 Analysis of Supply Chain Risk Management Strategies: Case Study of Supply Chain Disruptions

Authors: Marcelo Dias Carvalho, Leticia Ishikawa

Abstract:

Supply Chain Risk Management refers to a set of strategies used by companies to avoid supply chain disruption caused by damage at production facilities, natural disasters, capacity issues, inventory problems, incorrect forecasts, and delays. Many companies use the techniques of the Toyota Production System, which to some extent works against better management of supply chain risks. This paper studies key events in some multinationals to analyze the trade-off between the best supply chain risk management techniques and management policies designed to create lean enterprises. A good balance of these actions results in reduced losses, increased customer trust in the company, and better preparedness to face the general risks of a supply chain.

Keywords: Supply chain disruptions, supply chain management, supply chain resilience, just-in-time production, lean manufacturing.

8194 Issues in Spectral Source Separation Techniques for Plant-wide Oscillation Detection and Diagnosis

Authors: A.K. Tangirala, S. Babji

Abstract:

In the last few years, three multivariate spectral analysis techniques, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Non-negative Matrix Factorization (NMF), have emerged as effective tools for oscillation detection and isolation. While the first method is used in determining the number of oscillatory sources, the latter two methods are used to identify source signatures by formulating the detection problem as a source identification problem in the spectral domain. In this paper, we present a critical drawback of the underlying linear (mixing) model which strongly limits the ability of the associated source separation methods to determine the number of sources and/or identify the physical source signatures. It is shown that the assumed mixing model is only valid if each unit of the process gives equal weighting (all-pass filter) to all oscillatory components in its inputs. This is in contrast to the fact that each unit, in general, acts as a filter with non-uniform frequency response. Thus, the model can only facilitate correct identification of a source with a single frequency component, which is again unrealistic. To overcome this deficiency, an iterative post-processing algorithm that correctly identifies the physical source(s) is developed. An additional issue with the existing methods is that they lack a procedure to pre-screen non-oscillatory/noisy measurements which obscure the identification of oscillatory sources. In this regard, a pre-screening procedure is prescribed based on the notion of a sparseness index to eliminate the noisy and non-oscillatory measurements from the data set used for analysis.
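As a rough sketch of the pre-screening and separation steps described above, the snippet below computes power spectra, discards measurements with a low sparseness index (implemented here as the Hoyer measure, which is an assumption on our part), and factorizes the remaining spectra with NMF. The signal data, the 0.6 threshold, and the function names are illustrative, not taken from the paper.

```python
# Minimal sketch of spectral pre-screening + NMF-based source separation.
# Assumptions: the "sparseness index" is implemented as the Hoyer measure,
# and the 0.6 threshold is illustrative only.
import numpy as np
from sklearn.decomposition import NMF

def power_spectra(X, fs=1.0):
    """Column-wise power spectra of measurements X (samples x variables)."""
    F = np.abs(np.fft.rfft(X - X.mean(axis=0), axis=0)) ** 2
    freqs = np.fft.rfftfreq(X.shape[0], d=1.0 / fs)
    return freqs, F

def hoyer_sparseness(v):
    """Sparseness in [0, 1]; values near 1 indicate a single dominant peak."""
    n = v.size
    l1, l2 = np.abs(v).sum(), np.sqrt((v ** 2).sum())
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)

def separate_sources(X, n_sources, sparseness_threshold=0.6):
    freqs, F = power_spectra(X)
    keep = [j for j in range(F.shape[1])
            if hoyer_sparseness(F[:, j]) >= sparseness_threshold]
    model = NMF(n_components=n_sources, init="nndsvda", max_iter=500)
    W = model.fit_transform(F[:, keep])   # spectral signatures of the sources
    H = model.components_                 # contribution of each kept variable
    return freqs, keep, W, H

# Toy usage: two oscillatory variables plus one noisy one.
t = np.arange(2048)
X = np.column_stack([np.sin(0.05 * t), np.sin(0.11 * t),
                     np.random.default_rng(0).normal(size=t.size)])
freqs, keep, W, H = separate_sources(X, n_sources=2)
print("kept measurement indices:", keep)
```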

Keywords: Non-negative matrix factorization, PCA, source separation, plant-wide diagnosis.

8193 The Influence of Project-Based Learning and Outcome-Based Education: Interior Design Tertiary Students in Focus

Authors: Omneya Messallam

Abstract:

Technology has developed dramatically in most educational disciplines. For instance, the digital rendering subject, which is taught in both the Interior Design and Architecture fields, sees its software versions updated almost annually. Many students and educators have argued that there will be no need to learn manual rendering techniques. Therefore, the Interior Design Visual Presentation 1 course (ID133) was chosen from the first level of the Interior Design (ID) undergraduate program, as it has been taught continuously for six years. This time frame facilitates sound observation and critical analysis of the use of appropriate teaching methodologies. Furthermore, the researcher believes in the high value of manual rendering techniques. The course objectives are: to define the basic visual rendering principles, to recall theories and uses of various types of colours and hatches, to raise the learners' awareness of the value of studying manual rendering techniques, and to prepare them to present their work professionally. The students are female Arab learners aged between 17 and 20. At the outset of the course, the majority of them demonstrated a negative attitude, lacking both motivation and confidence in manual rendering skills. This paper is a reflective appraisal of deploying two student-centred teaching pedagogies, Project-based Learning (PBL) and Outcome-based Education (OBE), with ID133 students. The research aims to develop teaching strategies that enhance the quality of teaching in this course over an academic semester. The outcome of this research emphasized the positive influence of applying such educational methods on improving the quality of students' manual rendering skills in terms of materials, textiles, textures, lighting, and shade and shadow. Furthermore, it greatly motivated the students and raised their awareness of the importance of learning manual rendering techniques.

Keywords: Manual renders, outcome-based education, project-based learning, personal competences, and visual presentation.

8192 Architecture Based on Dynamic Graphs for the Dynamic Reconfiguration of Farms of Computers

Authors: Carmen Navarrete, Eloy Anguiano

Abstract:

In recent years, computers have increased their computing capacity, and the networks that interconnect these machines have been improved to reach today's high data-transfer rates. Programs that try to take advantage of these new technologies cannot be written using traditional programming techniques, since most algorithms were designed to be executed on a single processor in a non-concurrent form, rather than concurrently on a set of processors working and communicating through a network. This paper presents the ongoing development of a new system for the reconfiguration of farms of computers that takes these new technologies into account.

Keywords: Dynamic network topology, resource and task allocation, parallel computing, heterogeneous computing, dynamic reconfiguration.

8191 Fuzzy Processing of Uncertain Data

Authors: Petr Morávek, Miloš Šeda

Abstract:

In practice, we often come across situations where it is necessary to make decisions based on incomplete or uncertain data. In control systems this may be due to an unknown exact mathematical model, or to its excessive complexity (e.g. nonlinearity), which makes it necessary to simplify the model or to solve it using a rule base. In the case of databases, when searching data we evaluate a similarity measure between the requirements of the select query and the stored data, where both the query and the data themselves may contain vague terms, for example in the form of linguistic qualifiers. In this paper, we focus on the processing of uncertain data in databases and demonstrate it on the example of multi-criteria decision making in the selection of variants specified by a large number of technical parameters.
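A minimal sketch of the kind of fuzzy matching described above: a linguistic qualifier such as 'medium' for a price attribute is represented by a triangular membership function, and each stored record receives a degree of match with the vague query. The membership parameters, attribute names, and the aggregation by minimum are illustrative assumptions, not the paper's actual rule base.

```python
# Minimal sketch: matching a vague query against stored data with fuzzy sets.
# Triangular membership parameters and the min-aggregation are assumptions.
def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic qualifiers for a price attribute (illustrative values).
PRICE_TERMS = {
    "cheap":     lambda p: triangular(p, 0, 50, 150),
    "medium":    lambda p: triangular(p, 100, 200, 300),
    "expensive": lambda p: triangular(p, 250, 400, 10**9),
}

def match(record, query):
    """Degree to which a record satisfies a vague query (min over criteria)."""
    degrees = [PRICE_TERMS[term](record[attr]) for attr, term in query.items()]
    return min(degrees) if degrees else 0.0

records = [{"name": "A", "price": 120}, {"name": "B", "price": 210},
           {"name": "C", "price": 480}]
query = {"price": "medium"}
for r in sorted(records, key=lambda r: match(r, query), reverse=True):
    print(r["name"], round(match(r, query), 2))
```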

Keywords: Fuzzy logic, linguistic variable, multicriteria decision.

8190 Object Allocation with Replication in Distributed Systems

Authors: H. T. Barney, G. C. Low

Abstract:

The design of distributed systems involves dividing the system into partitions (or components) and then allocating these partitions to physical nodes. Several techniques have been proposed for both the partitioning and allocation processes. These existing techniques suffer from a number of limitations, including lack of support for replication. Replication is difficult to use effectively but has the potential to greatly improve the performance of a distributed system. This paper presents a new technique for allocating objects in order to improve performance in a distributed system that supports replication. The performance of the proposed technique is demonstrated and tested on an example system, and it is compared with the performance of an existing technique in order to demonstrate both the validity and superiority of the new technique when developing a distributed system that can utilise object replication.

Keywords: Allocation, Distributed Systems, Replication.

8189 Attribute Weighted Class Complexity: A New Metric for Measuring Cognitive Complexity of OO Systems

Authors: Dr. L. Arockiam, A. Aloysius

Abstract:

In general, class complexity is measured based on factors such as Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA), and so on. Researchers have developed several techniques, methods, and metrics based on different factors for calculating class complexity in Object-Oriented (OO) software. Earlier, Arockiam et al. proposed a complexity measure called Extended Weighted Class Complexity (EWCC), an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of the class and that of the derived classes. In EWCC, the cognitive weight of each attribute is taken to be 1. The main problem with the EWCC metric is that every attribute holds the same weight, whereas in general the cognitive load of understanding different types of attributes is not the same. Here, we propose a new metric called Attribute Weighted Class Complexity (AWCC), in which cognitive weights are assigned to the attributes according to the effort needed to understand their data types. Case studies and experiments show that the proposed metric is a better measure of the complexity of classes with attributes.
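To make the idea concrete, the sketch below computes an AWCC-style value under an assumed data-type weight table: each attribute contributes a weight derived from its data type rather than a flat 1, and the class complexity is that sum plus the cognitive weights of the methods. The weight table and the example class are illustrative only; the paper's actual weight assignments are not reproduced here.

```python
# Illustrative sketch of Attribute Weighted Class Complexity (AWCC).
# The data-type weight table below is an assumption for demonstration;
# the paper derives its own cognitive weights for each data type.
TYPE_WEIGHTS = {
    "int": 1, "float": 1, "bool": 1,      # primitive types: easiest to grasp
    "string": 2, "array": 3, "list": 3,   # simple composites
    "map": 4, "object": 5,                # user-defined / nested structures
}

def awcc(attributes, method_weights):
    """
    attributes     : list of data-type names of the class attributes
    method_weights : list of cognitive weights of the class methods
    Returns the sum of attribute weights (by data type) plus method weights.
    """
    attr_part = sum(TYPE_WEIGHTS.get(t, 5) for t in attributes)
    return attr_part + sum(method_weights)

# Example class: two ints, one string, one nested object, three methods.
print(awcc(["int", "int", "string", "object"], method_weights=[2, 3, 1]))  # 15
```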

Keywords: Software Complexity, Attribute Weighted Class Complexity, Weighted Class Complexity, Data Type

8188 Automated Stereophotogrammetry Data Cleansing

Authors: Stuart Henry, Philip Morrow, John Winder, Bryan Scotney

Abstract:

The stereophotogrammetry modality is gaining more widespread use in the clinical setting. Registration and visualization of these data, in conjunction with conventional 3D volumetric image modalities, provide virtual human data with textured soft tissue and internal anatomical and structural information. In this investigation, computed tomography (CT) and stereophotogrammetry data are acquired from four anatomical phantoms and registered using the trimmed iterative closest point (TrICP) algorithm. This paper fully addresses the issue of imaging artifacts around the stereophotogrammetry surface edge, using the registered CT data as a reference. Several iterative algorithms are implemented to automatically identify and remove stereophotogrammetry surface edge outliers, improving the overall visualization of the combined stereophotogrammetry and CT data. This paper shows that outliers at the surface edge of stereophotogrammetry data can be successfully removed automatically.
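A hedged sketch of the edge-cleansing idea just described: after registration, each stereophotogrammetry point is compared with its nearest point on the CT reference surface, and points whose distance lies well above the average are discarded iteratively. The mean-plus-two-standard-deviations rule and the synthetic point clouds are assumptions for illustration, not the paper's algorithm.

```python
# Sketch of iterative edge-outlier removal against a registered CT reference:
# drop stereophotogrammetry points whose nearest-CT distance is an outlier.
# The mean + 2*std rule and the synthetic point clouds are assumptions.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
ct_surface = rng.uniform(0, 100, size=(5000, 3))             # registered CT points
photo = np.vstack([ct_surface[:2000] + rng.normal(0, 0.3, (2000, 3)),
                   rng.uniform(120, 160, size=(50, 3))])      # 50 edge artifacts

tree = cKDTree(ct_surface)
keep = np.ones(len(photo), dtype=bool)
for _ in range(5):                                            # iterate until stable
    dist, _ = tree.query(photo[keep])
    threshold = dist.mean() + 2 * dist.std()
    new_keep = keep.copy()
    new_keep[keep] = dist <= threshold
    if new_keep.sum() == keep.sum():
        break
    keep = new_keep

print(f"kept {keep.sum()} of {len(photo)} points; removed {(~keep).sum()} outliers")
```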

Keywords: Data cleansing, stereophotogrammetry.

8187 The Use of S Curves in Technology Forecasting and its Application On 3D TV Technology

Authors: Gizem Intepe, Tufan Koc

Abstract:

S-curves are commonly used in technology forecasting. They show the path of product performance in relation to time or investment in R&D, and they are a useful tool for describing the inflection points and the improvement limit of a technology. Companies base their innovation strategies on this information. However, inadequate use and some limitations of this technique lead to problems in decision making. In this paper, technology forecasting and its importance for company-level strategies are discussed first. Secondly, the S-curve and its place among other forecasting techniques are introduced. Thirdly, its use in technology forecasting is discussed in terms of its advantages, disadvantages, and limitations. Finally, an application of the S-curve to 3D TV technology using patent data is presented and the results are discussed.
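To make the S-curve idea concrete, the sketch below fits a logistic curve to yearly cumulative patent counts and reports the estimated saturation limit and inflection point. The patent counts, initial guess, and logistic form are illustrative assumptions; logistic growth is only one of several S-curve models used in forecasting, and the paper's actual 3D TV patent data are not reproduced here.

```python
# Sketch: fitting a logistic S-curve to cumulative patent counts.
# The patent counts below are synthetic; a real analysis would use patent
# data for the technology of interest (e.g. 3D TV).
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """L = saturation limit, k = growth rate, t0 = inflection point (year)."""
    return L / (1.0 + np.exp(-k * (t - t0)))

years = np.arange(2000, 2013)
cum_patents = np.array([5, 9, 16, 30, 55, 95, 150, 210, 260, 295, 315, 325, 330],
                       dtype=float)

p0 = [cum_patents.max() * 1.2, 0.5, years.mean()]   # rough initial guess
(L, k, t0), _ = curve_fit(logistic, years, cum_patents, p0=p0, maxfev=10000)

print(f"estimated limit of improvement L = {L:.0f} patents")
print(f"inflection point (fastest growth) = year {t0:.1f}")
```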

Keywords: Patent analysis, technological forecasting, S-curves, 3D TV.

8186 Evaluation on Recent Committed Crypt Analysis Hash Function

Authors: A. Arul Lawrence Selvakumar, C. Suresh Ganandhas

Abstract:

This paper describes a study of cryptographic hash functions, one of the most important classes of primitives used in recent cryptographic techniques. The main aim is the development of recent cryptanalysis of hash functions. We present different approaches to defining security properties more formally and present basic attacks on hash functions. We recall the Merkle-Damgard security properties of iterated hash functions. The main aim of this paper is the development of recent techniques applicable to the cryptanalysis of hash functions, mainly from the SHA family. Recently proposed attacks on MD5 and SHA motivate a new hash function design. It is designed not only to have higher security but also to be faster than SHA-256; the performance of the new hash function is at least 30% better than that of SHA-256 in software, and it is secure against known cryptographic attacks on hash functions.

Keywords: Crypt Analysis, cryptographic.

8185 Modeling Aeration of Sharp Crested Weirs by Using Support Vector Machines

Authors: Arun Goel

Abstract:

The present paper investigates the prediction of the air entrainment rate and aeration efficiency of free overfall jets issuing from a triangular sharp-crested weir by using regression-based modelling. Empirical equations, support vector machine models (with polynomial and radial basis function kernels), and linear regression techniques were applied to the triangular sharp-crested weirs, relating the air entrainment rate and the aeration efficiency to the input parameters, namely drop height, discharge, and vertex angle. It was observed that there exists a good agreement between the measured values and the values obtained using the empirical equations, the support vector machine (polynomial and RBF) models, and the linear regression techniques. The test results demonstrated that the SVM-based (polynomial and RBF) models provided acceptable prediction of the measured values with reasonable accuracy, alongside the empirical equations and linear regression techniques, in modelling the air entrainment rate and the aeration efficiency of free overfall jets issuing from a triangular sharp-crested weir. A sensitivity analysis has also been performed to study the impact of the input parameters on the output in terms of air entrainment rate and aeration efficiency.
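As a sketch of the comparison described above, the snippet below fits support vector regression with polynomial and RBF kernels alongside ordinary linear regression and reports test-set R-squared. The data are synthetic stand-ins for (drop height, discharge, vertex angle) versus air entrainment rate; the toy target function, model hyperparameters, and train/test split are assumptions, not the paper's experimental data or settings.

```python
# Sketch of the regression comparison: SVR with polynomial and RBF kernels
# versus linear regression, on synthetic stand-in data for
# (drop height, discharge, vertex angle) -> air entrainment rate.
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
X = rng.uniform([0.2, 0.5, 20], [1.5, 5.0, 90], size=(200, 3))  # h, Q, angle
y = (0.8 * X[:, 0] ** 1.5 * X[:, 1] ** 0.5 * np.sin(np.radians(X[:, 2]))
     + rng.normal(0, 0.05, 200))                                 # toy target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVR (poly)": make_pipeline(StandardScaler(), SVR(kernel="poly", degree=2, C=10)),
    "SVR (rbf)":  make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10)),
    "Linear regression": LinearRegression(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:18s} R^2 = {r2_score(y_te, model.predict(X_te)):.3f}")
```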

Keywords: Air entrainment rate, dissolved oxygen, regression, SVM, weir.

8184 ELD79-LGD2006 Transformation Techniques Implementation and Accuracy Comparison in Tripoli Area, Libya

Authors: Jamal A. Gledan, Othman A. Azzeidani

Abstract:

During the last decade, Libya established a new geodetic datum, the Libyan Geodetic Datum 2006 (LGD2006), by using GPS, whereas ground traversing was used to establish the previous Libyan datum, the European Libyan Datum 79 (ELD79). This paper introduces ELD79-to-LGD2006 coordinate transformation techniques and compares the accuracy of transformation between multiple regression equations and the three-parameter (Bursa-Wolf) model. The results obtained show that the overall accuracy of the stepwise multiple regression equations is better than that achieved with the Bursa-Wolf transformation model.
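For readers unfamiliar with the three-parameter model, the sketch below estimates the three translation shifts (the Bursa-Wolf model reduced to translations only) from points known in both datums by least squares, then applies them to a new point. All coordinates and shift values are invented for illustration and do not represent the actual ELD79 or LGD2006 parameters.

```python
# Sketch of a three-parameter (translation-only Bursa-Wolf) datum shift:
# estimate dX, dY, dZ from common points known in both datums, then apply.
# All coordinates below are invented for illustration.
import numpy as np

# Cartesian coordinates of common control points in the source datum (ELD79-like)
src = np.array([[5001120.0, 1234567.0, 3456789.0],
                [5002345.0, 1235678.0, 3457890.0],
                [5003456.0, 1236789.0, 3458901.0]])
# The same points in the target datum (LGD2006-like), with small observation noise
dst = src + np.array([-115.8, -99.3, -152.5]) \
          + np.random.default_rng(1).normal(0, 0.05, src.shape)

# Least-squares estimate of the three shifts is simply the mean coordinate residual.
shift = (dst - src).mean(axis=0)
print("estimated shifts dX, dY, dZ [m]:", np.round(shift, 3))

# Transform a new point from the source datum to the target datum.
new_point = np.array([5004000.0, 1237000.0, 3459000.0])
print("transformed point:", new_point + shift)
```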

Keywords: Geodetic datum, horizontal control points, traditional similarity transformation model, unconventional transformation techniques.

8183 Finding Authoritative Researchers on Academic Web Sites

Authors: Dalibor Fiala, Karel Jezek, Francois Rousselot

Abstract:

In this paper, we present a methodology for finding authoritative researchers by analyzing academic Web sites. We show a case study in which we concentrate on a set of Czech computer science departments' Web sites. We analyze the relations between them via hyperlinks and find the most important ones using several common ranking algorithms. We then examine the contents of the research papers present on these sites and determine the most authoritative Czech authors.
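One common ranking algorithm used in this kind of hyperlink analysis is PageRank; the short power-iteration sketch below ranks a toy link graph. The graph, damping factor, and tolerance are illustrative choices, not the departments' actual link data, and the abstract does not specify which of the common ranking algorithms were applied.

```python
# Sketch: ranking nodes of a small hyperlink graph with PageRank power iteration.
# The link graph below is a toy example, not the actual department sites.
import numpy as np

links = {  # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C", "A"],
}
nodes = sorted(links)
n = len(nodes)
idx = {p: i for i, p in enumerate(nodes)}

# Column-stochastic transition matrix M[i, j] = prob. of moving from j to i.
M = np.zeros((n, n))
for src, outs in links.items():
    for dst in outs:
        M[idx[dst], idx[src]] = 1.0 / len(outs)

d = 0.85                                  # damping factor
r = np.full(n, 1.0 / n)                   # uniform start
for _ in range(100):
    r_new = (1 - d) / n + d * M @ r
    if np.abs(r_new - r).sum() < 1e-10:
        break
    r = r_new

for page, score in sorted(zip(nodes, r), key=lambda x: -x[1]):
    print(f"{page}: {score:.3f}")
```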

Keywords: Authorities, citation analysis, prestige, ranking algorithms, Web mining.

8182 Evaluation of Urban Development Proposals: An ANP Approach

Authors: T. Gómez-Navarro, M. García-Melón, D. Díaz-Martín, S. Acuna-Dutra

Abstract:

In this paper a new approach to prioritize urban planning projects in an efficient and reliable way is presented. It is based on environmental pressure indices and multicriteria decision methods. The paper introduces a rigorous method, of acceptable complexity, for rank ordering urban development proposals according to their environmental pressure. The technique combines the use of Environmental Pressure Indicators, the aggregation of the indicators into an Environmental Pressure Index by means of the Analytic Network Process (ANP) method, and the interpretation of the information obtained from the experts during the decision-making process. The ANP method allows the aggregation of the experts' judgments on each of the indicators into one Environmental Pressure Index. In addition, ANP is based on utility ratio functions, which are the most appropriate for the analysis of uncertain data such as experts' estimations. Finally, unlike other multicriteria techniques, ANP allows the decision problem to be modelled using the relationships among dependent criteria. The method has been applied to the proposal for urban development of La Carlota airport in Caracas (Venezuela). The Venezuelan Government would like to see a recreational project developed on the abandoned area that would mean a significant improvement for the capital. Three options, a health club, a residential area, and a theme park, are currently under evaluation. The participating experts agreed that the method proposed in this paper is useful and an improvement over traditional techniques such as environmental impact studies, life-cycle analysis, etc. They find the results obtained coherent, the process sufficiently rigorous and precise, and the use of resources significantly lower than in other methods.
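At the core of the ANP aggregation is the limit supermatrix: the weighted, column-stochastic supermatrix is raised to successive powers until it converges, and any column of the limit then gives the overall priorities. The tiny sketch below shows only that step on a made-up 4x4 supermatrix; the paper's actual network of indicators and alternatives is larger and its judgments are not reproduced here.

```python
# Sketch of the ANP limit-supermatrix step: raise the column-stochastic
# weighted supermatrix W to successive powers until it converges; any column
# of the limit matrix gives the overall priorities. W below is made up.
import numpy as np

W = np.array([[0.00, 0.30, 0.25, 0.40],
              [0.40, 0.00, 0.35, 0.20],
              [0.35, 0.45, 0.00, 0.40],
              [0.25, 0.25, 0.40, 0.00]])
assert np.allclose(W.sum(axis=0), 1.0)     # column-stochastic check

limit = W.copy()
for _ in range(200):
    nxt = limit @ W
    if np.abs(nxt - limit).max() < 1e-12:
        break
    limit = nxt

priorities = limit[:, 0]                   # all columns are (nearly) identical
labels = ["element A", "element B", "element C", "element D"]
for name, p in zip(labels, priorities):
    print(f"{name:10s} {p:.3f}")
```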

Keywords: Environmental pressure indicators, multicriteria decision analysis, analytic network process.

8181 Geochemistry of Natural Radionuclides Associated with Acid Mine Drainage (AMD) in a Coal Mining Area in Southern Brazil

Authors: Juliana A. Galhardi, Daniel M. Bonotto

Abstract:

Coal is an important non-renewable energy source and can be associated with radioactive elements. In Figueira city, Paraná state, Brazil, high uranium activity has been recorded near the coal mine that supplies a local thermoelectric power plant. In this context, the radon activity (Rn-222, produced by Ra-226 decay in the U-238 natural series) was evaluated in groundwater, river water and effluents produced by the acid mine drainage in the coal reject dumps. The samples were collected in August 2013 and in February 2014 and analyzed at LABIDRO (Laboratory of Isotope and Hydrochemistry), UNESP, Rio Claro city, Brazil, using an alpha spectrometer (AlphaGuard) adjusted to evaluate the mean radon activity concentration in five cycles of 10 minutes. No radon activity concentration exceeded 100 Bq.L-1, a critical value previously established by the World Health Organization. The average radon activity concentration in groundwater was higher than in surface water and in the effluent samples, possibly due to the accumulation of uranium and radium in the aquifer layers that favors radon trapping. The lower value in the river waters may indicate dilution, and the intermediate value in the effluents may indicate radon absorption in the coal particles of the reject dumps. The results also indicate that the radon activities in the effluents increase with sample acidification, possibly due to greater radium leaching and the subsequent radon transport to the drainage flow. The water samples of the Laranjinha River and the Ribeirão das Pedras stream, which respectively supply Figueira city and receive the mining effluent, exhibited higher pH values upstream of the mine, reflecting the acid mine drainage discharge. The transport of these radionuclides indicates the importance of monitoring their activity concentrations in natural waters, given the risks that radioactivity can pose to human health.

Keywords: Radon, radium, acid mine drainage, coal

8180 SEM and AFM Investigations of Surface Defects and Tool Wear of Multilayers Coated Carbide Inserts

Authors: Ayman M. Alaskari, Samy E. Oraby, Abdulla I. Almazrouee

Abstract:

Coated tool inserts can be considered the backbone of machining processes due to their wear and heat resistance. However, coating defects can degrade the integrity of these inserts, and the number of such defects should be minimized or, if possible, eliminated. Recently, advances in coating processes and analytical tools have opened a new era for optimizing coated tools. First, an overview is given of coating technology for cutting tool inserts. Testing techniques for the properties of coating layers, as well as the various coating defects and their assessment, are also surveyed. Second, an experimental approach is introduced to examine the possible coating defects and flaws of worn multicoated carbide inserts using two important techniques, namely scanning electron microscopy and atomic force microscopy. Finally, a simple procedure is recommended for investigating manufacturing defects and flaws of worn inserts.

Keywords: AFM, Coated inserts, Defects, SEM.

8179 Design of a Robust Controller for AGC with Combined Intelligence Techniques

Authors: R. N. Patel, S. K. Sinha, R. Prasad

Abstract:

In this work, Artificial Intelligence (AI) techniques such as fuzzy logic, Genetic Algorithms, and Particle Swarm Optimization have been used to improve the performance of the Automatic Generation Control (AGC) system. Instead of applying Genetic Algorithms and Particle Swarm Optimization independently to optimize the parameters of the conventional AGC with a PI controller, an intelligently tuned fuzzy logic controller (acting as the secondary controller in the AGC system) has been designed. The controller gives an improved dynamic performance for both hydrothermal and thermal-thermal power systems under a variety of operating conditions.
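As a sketch of the tuning step described above, the snippet below runs a basic particle swarm search over a pair of PI gains. The quadratic surrogate cost, swarm parameters, and gain bounds are assumptions for illustration; in the paper the objective comes from simulating the AGC loop, and the optimized parameters feed a fuzzy logic controller rather than a plain PI controller.

```python
# Sketch of the optimization step: particle swarm search over PI gains (Kp, Ki)
# minimizing a surrogate cost. The quadratic cost below stands in for the
# performance index of the AGC loop, which would come from a full simulation.
import numpy as np

rng = np.random.default_rng(7)

def cost(gains):
    kp, ki = gains
    return (kp - 0.8) ** 2 + (ki - 0.35) ** 2     # surrogate; optimum at (0.8, 0.35)

n_particles, n_iter = 20, 100
pos = rng.uniform(0, 2, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([cost(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                         # inertia and acceleration terms
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 2)
    vals = np.array([cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("tuned gains Kp, Ki:", np.round(gbest, 3))
```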

Keywords: Artificial intelligence, Automatic generation control, Fuzzy control, Genetic Algorithm, Particle swarm optimization, Power systems.

8178 Dynamic Load Balancing in PVM Using Intelligent Application

Authors: Kashif Bilal, Tassawar Iqbal, Asad Ali Safi, Nadeem Daudpota

Abstract:

This paper deals with dynamic load balancing using PVM. In a distributed environment, load balancing and heterogeneity are critical issues that need to be examined in depth in order to achieve optimal results and efficiency. Various techniques are used to distribute the load dynamically among different nodes and to deal with heterogeneity. These techniques follow different approaches, with process migration as the basic concept in various optimized flavors. However, process migration is not an easy job; it imposes a considerable burden and processing effort in order to track each process on the nodes. We propose a dynamic load balancing technique in which the application intelligently balances the load among different nodes, resulting in efficient use of the system without the overheads of process migration. It also provides a simple solution to the problem of load balancing in a heterogeneous environment.
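A minimal sketch of the application-level idea: split a batch of tasks among nodes in proportion to their measured relative speeds, so that no process migration is needed. The node names, speed figures, and task counts are invented for illustration; a real PVM application would obtain node capacities from benchmarking or runtime feedback.

```python
# Sketch of application-level load balancing: distribute a batch of tasks
# among heterogeneous nodes in proportion to their relative speeds.
# Node speeds and task counts are invented for illustration.
node_speed = {"node1": 2.0, "node2": 1.0, "node3": 0.5}   # relative benchmark scores
tasks = list(range(100))

total = sum(node_speed.values())
items = list(node_speed.items())
alloc, start = {}, 0
for i, (node, speed) in enumerate(items):
    # Last node takes the remainder so every task is assigned exactly once.
    count = round(len(tasks) * speed / total) if i < len(items) - 1 else len(tasks) - start
    alloc[node] = tasks[start:start + count]
    start += count

for node, chunk in alloc.items():
    print(f"{node}: {len(chunk)} tasks")
```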

Keywords: PVM, load balancing, task allocation, intelligent application.

8177 Identifying Critical Success Factors for Data Quality Management through a Delphi Study

Authors: Maria Paula Santos, Ana Lucas

Abstract:

Organizations support their operations and decision making on the data they have at their disposal, so the quality of these data is remarkably important and Data Quality (DQ) is currently a relevant issue, the literature being unanimous in pointing out that poor DQ can result in large costs for organizations. The literature review identified and described 24 Critical Success Factors (CSF) for Data Quality Management (DQM) that were presented to a panel of experts, who ordered them according to their degree of importance, using the Delphi method with the Q-sort technique, based on an online questionnaire. The study shows that the five most important CSF for DQM are: definition of appropriate policies and standards, control of inputs, definition of a strategic plan for DQ, organizational culture focused on quality of the data and obtaining top management commitment and support.
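As an illustration of how such a Delphi/Q-sort round can be aggregated, the sketch below computes the mean rank of each critical success factor and Kendall's coefficient of concordance W as a measure of expert agreement. The factor labels echo the five CSF named above, but the expert rankings are invented and W is not reported in the abstract.

```python
# Sketch of aggregating a Delphi/Q-sort round: mean rank per critical success
# factor (CSF) and Kendall's W as a measure of expert agreement.
# The expert rankings below are invented for illustration.
import numpy as np

csfs = ["Policies and standards", "Control of inputs", "Strategic plan for DQ",
        "Quality-focused culture", "Top management support"]
# ranks[e, i] = rank that expert e gave to CSF i (1 = most important)
ranks = np.array([[1, 2, 3, 4, 5],
                  [2, 1, 3, 5, 4],
                  [1, 3, 2, 4, 5],
                  [2, 1, 4, 3, 5]])
m, n = ranks.shape

mean_rank = ranks.mean(axis=0)
print("Consensus ordering (most to least important):")
for i in np.argsort(mean_rank):
    print(f"  {csfs[i]:24s} mean rank {mean_rank[i]:.2f}")

# Kendall's coefficient of concordance W in [0, 1]; 1 = perfect agreement.
R = ranks.sum(axis=0)
S = ((R - R.mean()) ** 2).sum()
W = 12 * S / (m ** 2 * (n ** 3 - n))
print(f"Kendall's W = {W:.2f}")
```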

Keywords: Critical success factors, data quality, data quality management, Delphi, Q-Sort.

8176 Secure Data Aggregation Using Clusters in Sensor Networks

Authors: Prakash G L, Thejaswini M, S H Manjula, K R Venugopal, L M Patnaik

Abstract:

Wireless sensor networks can be applied in both abominable and military environments. A primary goal in the design of wireless sensor networks is lifetime maximization, constrained by the energy capacity of batteries. One well-known method to reduce energy consumption in such networks is data aggregation. Providing efficient data aggregation while preserving data privacy is a challenging problem in wireless sensor networks research. In this paper, we present a privacy-preserving data aggregation scheme for additive aggregation functions. The Cluster-based Private Data Aggregation (CPDA) scheme leverages a clustering protocol and algebraic properties of polynomials. It has the advantage of incurring less communication overhead. The goal of our work is to bridge the gap between collaborative data collection by wireless sensor networks and data privacy. We present simulation results of our schemes and compare their performance to a typical data aggregation scheme, TAG, in which no data privacy protection is provided. The results show the efficacy and efficiency of our schemes.
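The sketch below conveys the flavor of concealed in-cluster aggregation using plain additive secret sharing: each node splits its private reading into random shares distributed to its cluster peers, so that only the cluster total can be recovered. This is a simplification of CPDA, which additionally exploits algebraic properties of polynomials; the prime modulus and readings are invented.

```python
# Simplified sketch of concealed additive aggregation inside one cluster:
# each node splits its private reading into random additive shares (mod a
# large prime), keeps one share, and sends the others to its cluster peers.
# Summing what every node ends up holding reveals only the cluster total.
# This is a simplification of CPDA, which uses polynomial-based shares.
import random

PRIME = 2_147_483_647  # large prime modulus (assumption for the sketch)

def make_shares(value, n_nodes):
    """Split `value` into n_nodes additive shares modulo PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_nodes - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

readings = [23, 31, 18, 40]                      # private sensor readings
n = len(readings)

# shares[i][j] = share produced by node i and delivered to node j
shares = [make_shares(v, n) for v in readings]

# Each node j sums the shares it received and reports only that partial sum.
partials = [sum(shares[i][j] for i in range(n)) % PRIME for j in range(n)]

# The cluster head combines the partial sums to recover the aggregate.
aggregate = sum(partials) % PRIME
print("recovered aggregate:", aggregate, "true sum:", sum(readings))
```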

Keywords: Aggregation, Clustering, Query Processing.

8175 Design and Optimization of a Microstrip Patch Antenna for Increased Bandwidth

Authors: Ankit Jain, Archana Agrawal

Abstract:

With the ever-increasing need for wireless communication and the emergence of many systems, it is important to design broadband antennas that cover a wide frequency range. The aim of this paper is to design a broadband patch antenna employing three techniques: slotting, adding directly coupled parasitic elements, and fractal EBG structures. The bandwidth is improved from 9.32% to 23.77%, and a wideband ranging from 4.15 GHz to 5.27 GHz is obtained. A comparative analysis of embedding EBG structures at different heights is also carried out. The composite effect of integrating these techniques in the design provides a simple and efficient method for obtaining a low-profile, broadband, high-gain antenna. With the addition of parasitic elements alone, the bandwidth increased to only 18.04%; embedding the EBG structures then increased it to 23.77%. The design is suitable for a variety of wireless applications such as WLAN and radar applications.
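The quoted figures can be checked with the usual fractional bandwidth formula, BW = (f_high - f_low) / f_center. The snippet below reproduces that arithmetic from the band edges given in the abstract.

```python
# Quick check of the reported fractional bandwidth from the band edges.
f_low, f_high = 4.15, 5.27                 # GHz, from the abstract
f_center = (f_low + f_high) / 2            # 4.71 GHz
bandwidth_pct = (f_high - f_low) / f_center * 100
print(f"fractional bandwidth = {bandwidth_pct:.2f}%")   # about 23.78%
```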

Keywords: Bandwidth, broadband, EBG structures, parasitic elements, Slotting.

8174 Urbanization and Income Inequality in Thailand

Authors: Acumsiri Tantiakrnpanit

Abstract:

This paper aims to examine the relationship between urbanization and income inequality in Thailand during the period 2002–2020, using a panel of data for 76 provinces collected from Thailand’s National Statistical Office (Labor Force Survey: LFS), as well as geospatial data from the U.S. Air Force Defense Meteorological Satellite Program (DMSP) and the Visible Infrared Imaging Radiometer Suite Day/Night band (VIIRS-DNB) satellite for 19 selected years. This paper employs two different definitions to identify urban areas: 1) Urban areas defined by Thailand's National Statistical Office (LFS), and 2) Urban areas estimated using nighttime light data from the DMSP and VIIRS-DNB satellite. The second method includes two sub-categories: 2.1) Determining urban areas by calculating nighttime light density with a population density of 300 people per square kilometer, and 2.2) Calculating urban areas based on nighttime light density corresponding to a population density of 1,500 people per square kilometer. The empirical analysis based on Ordinary Least Squares (OLS), fixed effects, and random effects models reveals a consistent U-shaped relationship between income inequality and urbanization. The findings from the econometric analysis demonstrate that urbanization or population density has a significant and negative impact on income inequality. Moreover, the square of urbanization shows a statistically significant positive impact on income inequality. Additionally, there is a negative association between logarithmically transformed income and income inequality. This paper also proposes the inclusion of satellite imagery, geospatial data, and spatial econometric techniques in future studies to conduct quantitative analysis of spatial relationships.
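A hedged sketch of the econometric specification described above: the inequality measure regressed on urbanization, its square, and log income, with province fixed effects implemented here as dummy variables in an OLS formula. The panel is synthetic and the variable names (gini, urban, log_income) are placeholders; the study's LFS and DMSP/VIIRS series are not reproduced.

```python
# Sketch of the fixed-effects specification: inequality ~ urbanization +
# urbanization^2 + log(income) with province dummies. The panel is synthetic;
# the study uses LFS and nighttime-light data for 76 Thai provinces.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
provinces, years = 20, 19
df = pd.DataFrame({
    "province": np.repeat(np.arange(provinces), years),
    "year": np.tile(np.arange(2002, 2002 + years), provinces),
})
df["urban"] = rng.uniform(0.05, 0.9, len(df))                 # urbanization share
df["log_income"] = rng.normal(10, 0.5, len(df))
# U-shaped data-generating process (illustrative coefficients only)
df["gini"] = (0.55 - 0.30 * df["urban"] + 0.25 * df["urban"] ** 2
              - 0.02 * df["log_income"] + rng.normal(0, 0.01, len(df)))
df["urban_sq"] = df["urban"] ** 2

model = smf.ols("gini ~ urban + urban_sq + log_income + C(province)", data=df).fit()
print(model.params[["urban", "urban_sq", "log_income"]])
```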

Keywords: Income inequality, nighttime light, population density, Thailand, urbanization.

8173 A New DIDS Design Based on a Combination Feature Selection Approach

Authors: Adel Sabry Eesa, Adnan Mohsin Abdulazeez Brifcani, Zeynep Orman

Abstract:

Feature selection has been used in many fields such as classification, data mining, and object recognition, and it has proven effective for removing irrelevant and redundant features from the original dataset. In this paper, a new design of a distributed intrusion detection system is proposed, based on a combination feature selection model that uses the Bees Algorithm and a decision tree. The Bees Algorithm is used as the search strategy to find the optimal subset of features, whereas the decision tree is used to judge the selected features. Both the produced features and the generated rules are used by the Decision Making Mobile Agent to decide whether or not there is an attack in the networks. The Decision Making Mobile Agent migrates through the networks, moving from node to node; if it finds an attack on one of the nodes, it alerts the user through the User Interface Agent or takes some action through the Action Mobile Agent. The KDD Cup 99 dataset is used to test the effectiveness of the proposed system. The results show that even if only four features are used, the proposed system gives better performance than that obtained using all 41 features.
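The sketch below captures the wrapper idea behind this feature-selection model: candidate feature subsets are scored by the cross-validated accuracy of a decision tree, and a simple single-site random neighbourhood search stands in for the Bees Algorithm, which explores many such neighbourhoods in parallel. The synthetic dataset is a stand-in for KDD Cup 99, and the iteration count and subset-initialization choices are assumptions.

```python
# Sketch of wrapper feature selection with a decision tree as the judge.
# A single-site random neighbourhood search stands in for the Bees Algorithm;
# the real system searches many sites in parallel. Toy data replace KDD Cup 99.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=600, n_features=20, n_informative=4,
                           n_redundant=4, random_state=0)
rng = np.random.default_rng(0)

def fitness(mask):
    """Cross-validated decision-tree accuracy on the selected features."""
    if not mask.any():
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=5).mean()

best = rng.random(20) < 0.5                     # random initial subset
best_score = fitness(best)
for _ in range(60):                             # neighbourhood search
    cand = best.copy()
    cand[rng.integers(20)] ^= True              # flip one feature in/out
    score = fitness(cand)
    if score > best_score:
        best, best_score = cand, score

print("selected features:", np.flatnonzero(best))
print(f"cross-validated accuracy: {best_score:.3f}")
```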

Keywords: Distributed intrusion detection system, mobile agent, feature selection, Bees Algorithm, decision tree.

8172 Manufacturing Process of S-Glass Fiber Reinforced PEKK Prepregs

Authors: Nassier A. Nassir, Robert Birch, Zhongwei Guan

Abstract:

The aim of this study is to investigate the fundamental science and technology related to novel S-glass fiber reinforced polyether-ketone-ketone (GF/PEKK) composites and to gain insight into bonding strength and failure mechanisms. Different manufacturing techniques for making this high-temperature pre-impregnated composite (prepreg) were investigated, i.e. mechanical deposition, electrostatic powder deposition, and dry powder prepregging. In general, the results of this investigation showed that it was difficult to control the distribution of the resin powder evenly on both sides of the fibers within a specific percentage. The most successful approach was dry powder prepregging, in which the fibers were coated evenly with an adhesive that served as a temporary binder to hold the resin powder in place on the glass fiber fabric.

Keywords: Dry powder technique, PEKK, S-glass, thermoplastic prepreg.

8171 Efficient Secured Lossless Coding of Medical Images – Using Modified Runlength Coding for Character Representation

Authors: S. Annadurai, P. Geetha

Abstract:

Lossless compression schemes with secure transmission play a key role in telemedicine applications, helping with accurate diagnosis and research. Traditional cryptographic algorithms for data security are not fast enough to process vast amounts of data. Hence, the novel secured lossless compression approach proposed in this paper is based on a reversible integer wavelet transform, the EZW algorithm, a new modified runlength coding for character representation, and selective bit scrambling. The use of the lifting scheme allows generating truly lossless integer-to-integer wavelet transforms. Images are compressed and decompressed by the well-known EZW algorithm. The proposed modified runlength coding greatly improves the compression performance and also increases the security level. This work employs a scrambling method which is fast, simple to implement, and provides security. The lossless compression ratios and distortion performance of the proposed method are found to be better than those of other lossless techniques.
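For readers unfamiliar with run-length coding, the sketch below shows a plain byte-oriented run-length encoder and decoder. It only illustrates the baseline idea; the modified character-representation variant proposed in the paper is not reproduced here, and the example stream is invented to mimic the long runs of identical symbols typical of EZW output.

```python
# Baseline run-length coding sketch (encode/decode of a byte sequence).
# The paper's *modified* runlength coding for character representation is
# not reproduced here; this only illustrates the baseline it builds on.
def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Return a list of (symbol, run_length) pairs."""
    runs, i = [], 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        runs.append((data[i], j - i))
        i = j
    return runs

def rle_decode(runs: list[tuple[int, int]]) -> bytes:
    return b"".join(bytes([sym]) * length for sym, length in runs)

# Illustrative stream: long runs of identical significance symbols.
stream = bytes([0] * 12 + [1] * 3 + [0] * 7 + [2])
encoded = rle_encode(stream)
print("runs:", encoded)                       # [(0, 12), (1, 3), (0, 7), (2, 1)]
assert rle_decode(encoded) == stream
```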

Keywords: EZW algorithm, lifting scheme, lossless compression, reversible integer wavelet transform, secure transmission, selective bit scrambling, modified runlength coding.

8170 A New Protocol for Concealed Data Aggregation in Wireless Sensor Networks

Authors: M. Abbasi Dezfouli, S. Mazraeh, M. H. Yektaie

Abstract:

Wireless sensor networks (WSNs) consist of many sensor nodes that are placed in unattended environments, such as military sites, in order to collect important information. It is very important to implement a secure protocol that can prevent the forwarding of forged data and the modification of aggregated data, while keeping the delay and the communication, computing, and storage overheads low. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell produce a shared key to send and receive concealed data among themselves. Because data aggregation in each cell is performed locally and a secure authentication mechanism is implemented, the data aggregation delay is very low and malicious nodes cannot produce false data in the network. To evaluate the performance of the proposed protocol, we present computational models that show its performance and low overhead.

Keywords: Wireless Sensor Networks, Security, Concealed Data Aggregation.

8169 Developing ESL Students' Writing

Authors: Esmaeil Hassannejad

Abstract:

Some of students' problems in writing stem from inadequate preparation for the writing assignment. Students should be taught how to write well when they arrive in language classes. One strategy is brainstorming: having selected a topic, the students examine and explore the theme from as large a variety of viewpoints as their background and imagination make possible. Another strategy is for the students to prepare an outline before writing the paper. The comparison between these two thought-provoking techniques was carried out between two class groups of students at Islamic Azad University of Dezful who were studying 'Writing 2' as their main course. Each class group was assigned to write five compositions separately in different periods of time. A t-test for each pair of exams between the two class groups showed that the observed t-value in each pair was greater than the critical t-value. Consequently, the first hypothesis, which states that those who use brainstorming as a thought-provoking technique in the prewriting phase are more successful than those who outline their papers before writing, was verified.
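The statistical comparison described above can be reproduced in outline with an independent two-sample t-test on each pair of composition scores, as sketched below. The scores are invented placeholders, since the study's raw data are not given in the abstract, and the 0.05 significance level is the conventional choice rather than one stated by the author.

```python
# Sketch of the per-composition comparison: an independent two-sample t-test
# between the brainstorming group and the outlining group. Scores are invented.
from scipy import stats

brainstorming = [16.5, 17.0, 15.5, 18.0, 16.0, 17.5, 18.5, 15.0, 17.0, 16.5]
outlining     = [14.0, 15.5, 13.5, 16.0, 14.5, 15.0, 13.0, 16.5, 14.0, 15.5]

t_observed, p_value = stats.ttest_ind(brainstorming, outlining)
print(f"t = {t_observed:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("difference is significant: the brainstorming group scored higher on average")
```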

Keywords: Brainstorming, Outlining, Prewriting, Thought provoking techniques

8168 CMOS Solid-State Nanopore DNA System-Level Sequencing Techniques Enhancement

Authors: Syed Islam, Yiyun Huang, Sebastian Magierowski, Ebrahim Ghafar-Zadeh

Abstract:

This paper presents system-level CMOS solid-state nanopore technique enhancements to speed up next-generation molecular recording and high-throughput channels. The discussion also considers the optimum number of base-pair (bp) measurements through the channel, which plays an important role in enhancing the potential read accuracy. Estimation of the effective power consumption suggested a suitable range of multi-channel configurations. A statistical nanopore bp extraction model could contribute higher read accuracy with longer read lengths (read length > 200). Nanopore ionic current switching with a Time Multiplexing (TM) based multichannel readout system contributed hardware savings.

Keywords: DNA, Nanopore, Amplifier, ADC, Multichannel.

8167 Evaluating Factors Influencing Information Quality in Large Firms

Authors: B. E. Narkhede, S. K. Mahajan, B. T. Patil, R. D. Raut

Abstract:

Information quality is a major performance measure for the Enterprise Resource Planning (ERP) system of any firm. This study identifies various critical success factors of information quality. The effect of critical success factors such as project management, reengineering efforts, and interdepartmental communications on information quality is analyzed using a multiple regression model. Quantitative data were collected from respondents in various firms through a structured questionnaire to assess information quality, project management, reengineering efforts, and interdepartmental communications. The validity and reliability of the data are ensured using techniques such as factor analysis and the computation of Cronbach's alpha. This study gives the relative importance of each of the critical success factors. The findings suggest that, among the various factors influencing information quality, careful reengineering efforts are the most influential. This paper gives clear insight to managers and practitioners regarding the relative importance of critical success factors influencing information quality, so that they can formulate a strategy at the beginning of an ERP system implementation.
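As an illustration of the reliability step mentioned above, the snippet below computes Cronbach's alpha for one block of questionnaire items. The Likert responses are invented, the construct label is hypothetical, and the 0.7 rule of thumb is only a common convention, not a threshold stated in the paper.

```python
# Sketch: Cronbach's alpha for one questionnaire construct (invented responses).
# alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# 8 respondents answering 4 Likert items for a hypothetical construct,
# e.g. "information quality".
responses = np.array([[4, 5, 4, 4],
                      [3, 3, 4, 3],
                      [5, 5, 5, 4],
                      [2, 3, 2, 2],
                      [4, 4, 5, 4],
                      [3, 2, 3, 3],
                      [5, 4, 4, 5],
                      [2, 2, 3, 2]])
alpha = cronbach_alpha(responses)
print(f"Cronbach's alpha = {alpha:.2f}  (0.7 is the usual rule of thumb)")
```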

Keywords: Enterprise resource planning, information systems, multiple regression, information quality.
