Search results for: Data integration
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8022

7092 A Remote Sensing Approach to Calculate Population Using Roads Network Data in Lebanon

Authors: Kamel Allaw, Jocelyne Adjizian Gerard, Makram Chehayeb, Nada Badaro Saliba

Abstract:

In developing countries such as Lebanon, demographic data are scarce because the population registry is not computerized. The aim of this study is to evaluate, using only remote sensing data, the correlations between population counts and road-network characteristics (length of primary roads, length of secondary roads, total road length, road density, percentage of roads, and number of intersections). To determine the influence of each factor on the demographic data, we studied the degree of correlation between that factor and the population count. The results show a strong correlation between population and both road density and the number of intersections.
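
A minimal sketch of this kind of factor-by-factor correlation analysis, using invented numbers rather than the study's measurements:

```python
# Sketch of the correlation analysis described above (illustrative data,
# not the study's measurements): correlate population counts with each
# road-network characteristic and rank the factors.
import numpy as np
from scipy.stats import pearsonr

# One row per administrative unit: hypothetical values.
population = np.array([12000, 45000, 8000, 103000, 27000, 66000])
factors = {
    "total road length (km)":  np.array([35.0, 120.0, 22.0, 310.0, 80.0, 150.0]),
    "road density (km/km^2)":  np.array([1.1, 3.2, 0.8, 5.9, 2.0, 3.8]),
    "number of intersections": np.array([40, 210, 25, 540, 120, 300]),
}

for name, values in factors.items():
    r, p = pearsonr(values, population)
    print(f"{name:28s} r = {r:+.3f} (p = {p:.3f})")
```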

Keywords: Population, road network, statistical correlations, remote sensing.

7091 Risk-Management by Numerical Pattern Analysis in Data-Mining

Authors: M. Kargar, R. Mirmiran, F. Fartash, T. Saderi

Abstract:

In this paper, a new method is suggested for risk management based on numerical patterns in data mining. These patterns are built from probability rules in decision trees and are required to be valid, novel, useful and understandable. Given a set of functions, the system converges toward a good pattern or better objectives. The patterns are analyzed through the produced matrices and some results are pointed out. Using the suggested method, the functional route of a system can be controlled and planning toward specific objectives improved.
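
One plausible reading of this method, sketched below under the assumption that the "numerical patterns" are leaf-level class probabilities of a decision tree; the data and feature names are hypothetical, not the authors' formulation:

```python
# Hedged sketch: derive probability rules ("numerical patterns") from a
# decision tree. Data, labels and feature names are invented.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.random((200, 3))                         # e.g. exposure, volatility, delay
y = (X[:, 0] + 0.5 * X[:, 1] > 0.9).astype(int)  # 1 = high risk

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["exposure", "volatility", "delay"]))

# Per-node class frequencies act as the probability attached to each rule.
leaf_probs = tree.tree_.value / tree.tree_.value.sum(axis=2, keepdims=True)
print("class probabilities per node:\n", leaf_probs.squeeze().round(2))
```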

Keywords: Analysis, Data-mining, Pattern, Risk Management.

7090 Wind Speed Data Analysis using Wavelet Transform

Authors: S. Avdakovic, A. Lukac, A. Nuhanovic, M. Music

Abstract:

Renewable energy systems are becoming a topic of great interest and investment worldwide. In recent years, wind power generation has developed very quickly across the world. Planning and successfully implementing good wind power plant projects requires wind potential measurements. In such projects, the effective choice of the micro-location for these measurements, the installation of the measurement station with appropriate measuring equipment, its maintenance, and the analysis of the collected wind data are all of great importance. In this paper, a wavelet transform is applied to wind speed data to gain insight into the characteristics of the wind and to select suitable locations for wind farm construction. The approach proves to be a useful tool in the investigation of wind potential.
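
A minimal sketch of such a wavelet analysis, assuming the PyWavelets package and a synthetic wind speed series in place of real measurements:

```python
# Multilevel discrete wavelet decomposition of a wind speed series.
import numpy as np
import pywt

t = np.linspace(0, 10, 1024)
wind = 8 + 2 * np.sin(2 * np.pi * 0.2 * t) \
         + np.random.default_rng(1).normal(0, 0.5, t.size)

coeffs = pywt.wavedec(wind, "db4", level=5)   # Daubechies-4, 5 levels
for i, c in enumerate(coeffs):
    label = "approximation" if i == 0 else f"detail level {6 - i}"
    print(f"{label:18s} energy = {np.sum(c**2):10.1f}")
```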

Keywords: Wind potential, Wind speed data, Wavelet transform.

7089 Simultaneous Optimization of Design and Maintenance through a Hybrid Process Using Genetic Algorithms

Authors: O. Adjoul, A. Feugier, K. Benfriha, A. Aoussat

Abstract:

In general, issues related to design and maintenance are treated independently, yet the decisions made in each domain influence the other. Design for maintenance is an opportunity to optimize the life cycle cost of a product, particularly in the nuclear and aeronautical fields, where maintenance expenses represent more than 60% of life cycle costs. The design of large-scale systems starts with the product architecture and a choice of components in terms of cost, reliability, weight and other attributes corresponding to the specifications. The design must also take maintenance into account, in particular by improving real-time monitoring of equipment through the integration of new technologies such as connected sensors and intelligent actuators. We observed that the different approaches used in Design For Maintenance (DFM) methods are limited to the simultaneous characterization of the reliability and maintainability of a multi-component system. This article proposes a DFM method that helps designers define dynamic maintenance for multi-component industrial systems. The term "dynamic" refers to the ability to integrate available monitoring data to adapt the maintenance decision in real time. The goal is to maximize the availability of the system at a given life cycle cost. The paper presents an approach for simultaneous optimization of the design and maintenance of multi-component systems. Here the design is characterized by four decision variables per component (reliability level, maintainability level, redundancy level, and level of monitoring data), and the maintenance by two decision variables (the dates of the maintenance stops and the maintenance operations to be performed on the system during these stops). The DFM model helps designers choose technical solutions for large-scale industrial products, i.e., complex multi-component systems with long life cycles, such as trains and aircraft. The method is based on a two-level hybrid algorithm that simultaneously optimizes design and maintenance using genetic algorithms. The first level selects a design solution for a given system, considering life cycle cost and reliability. The second level determines a dynamic, optimal maintenance plan to be deployed for that design solution. This level is based on the Maintenance Free Operating Period (MFOP) concept, which takes into account decision criteria such as total reliability, maintenance cost and maintenance time. Depending on the life cycle duration, the desired availability, and the desired business model (sales or rental), the tool provides visibility of overall costs and of the optimal product architecture.
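
A highly simplified sketch of the two-level idea: an outer genetic search over per-component design choices and an inner search over a maintenance interval. The cost and availability models, parameter ranges, and the grid-search inner level are stand-in assumptions, not the authors' formulation:

```python
# Toy two-level optimization: outer GA over redundancy levels,
# inner search over a maintenance interval. All models are invented.
import random

random.seed(42)
N_COMP = 3  # components; genes = redundancy level 1..3 per component

def inner_maintenance(design):
    # Inner level: pick the maintenance interval minimizing cost while
    # keeping availability above a target (stand-in for the MFOP logic).
    best = None
    for interval in range(5, 61, 5):
        avail = 1 - 0.05 / (interval * sum(design) / 10)   # toy model
        cost = sum(design) * 100 + 5000 / interval         # design + maintenance
        if avail >= 0.95 and (best is None or cost < best[0]):
            best = (cost, interval)
    return best or (float("inf"), None)

def evolve(pop_size=20, generations=40):
    pop = [[random.randint(1, 3) for _ in range(N_COMP)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda d: inner_maintenance(d)[0])
        parents = scored[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_COMP)               # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                       # mutation
                child[random.randrange(N_COMP)] = random.randint(1, 3)
            children.append(child)
        pop = parents + children
    return sorted(pop, key=lambda d: inner_maintenance(d)[0])[0]

best = evolve()
print("best design:", best, "-> (cost, interval):", inner_maintenance(best))
```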

Keywords: Availability, design for maintenance, DFM, dynamic maintenance, life cycle cost, LCC, maintenance free operating period, MFOP, simultaneous optimization.

7088 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations

Authors: Ramon Santana

Abstract:

The use of biometric identifiers in information security, access control, and authentication in ATMs and banking, among other fields, raises serious concerns about the safety of biometric data. Eight vulnerabilities have been identified in the general architecture of a biometric system, six of which allow the minutiae template to be obtained in plain text. The main consequence of a leaked minutiae template is the permanent loss of the biometric identifier. Several models have been proposed to protect minutiae templates and mitigate these vulnerabilities, but weaknesses in their cryptographic security still allow biometric data to be recovered in plain text. To increase cryptographic security while easing reversibility, a minutiae template protection model is proposed. The model provides cryptographic protection and facilitates data reversibility using two levels of security. The first is the data transformation level, which generates data invariant to rotation and translation through an irreversible transformation. The second is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model aims to mitigate the known vulnerabilities of previously proposed models, basing its security on the infeasibility of polynomial reconstruction.
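
An illustrative sketch only, not the paper's exact scheme: pairwise distances and relative angles provide rotation- and translation-invariant data, and a secret polynomial stands in for the evaluation function (coefficients and minutiae below are made up):

```python
# Invariant features from minutiae pairs, passed through a secret
# polynomial as a one-way protection step. Purely illustrative.
import math
from itertools import combinations

minutiae = [(10.0, 20.0, 0.30), (40.0, 25.0, 1.10), (33.0, 60.0, 2.00)]  # (x, y, angle)

def invariant_features(points):
    feats = []
    for (x1, y1, a1), (x2, y2, a2) in combinations(points, 2):
        dist = math.hypot(x2 - x1, y2 - y1)   # translation/rotation invariant
        dang = (a2 - a1) % (2 * math.pi)      # relative orientation
        feats.append((dist, dang))
    return feats

SECRET_COEFFS = [3.0, -1.5, 0.25]             # key-derived in a real system

def evaluate(value, coeffs=SECRET_COEFFS):
    # Horner evaluation of the protection polynomial.
    result = 0.0
    for c in coeffs:
        result = result * value + c
    return result

protected = [evaluate(d) + evaluate(a) for d, a in invariant_features(minutiae)]
print(protected)
```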

Keywords: Fingerprint, template protection, bio-cryptography, minutiae protection.

7087 SIMGraph: Simplifying Contig Graph to Improve de Novo Genome Assembly Using Next-generation Sequencing Data

Authors: Chien-Ju Li, Chun-Hui Yu, Chi-Chuan Hwang, Tsunglin Liu, Darby Tien-Hao Chang

Abstract:

De novo genome assembly is always fragmented, and fragmentation is more serious with the popular next generation sequencing (NGS) data because NGS reads are shorter than traditional Sanger sequences. As the data throughput of NGS is high, fragmented assemblies are usually not the result of missing data. On the contrary, the assembled sequences, called contigs, are often connected to more than one other contig in a complicated manner, and these complicated connections cause the fragmentation. False connections in such a network of contig connections, called a contig graph, are inevitable because of repeats and sequencing/assembly errors. Simplifying a contig graph by removing false connections directly improves genome assembly. In this work, we developed a tool, SIMGraph, to resolve ambiguous connections between contigs using NGS data. Applying SIMGraph to the assemblies of a fungus and a fish genome, we resolved 27.6% and 60.3% of the ambiguous contig connections, respectively. These results can reduce the experimental effort needed to resolve contig connections.
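
A minimal sketch of the underlying idea, assuming hypothetical per-edge support counts (e.g., from mapped read pairs); SIMGraph's actual criteria are not reproduced here:

```python
# Contig-graph simplification sketch: an ambiguous node keeps only its
# clearly best-supported connection. Support counts are invented.
contig_graph = {           # contig -> {neighbor: support}
    "c1": {"c2": 40, "c3": 2},
    "c2": {"c4": 35},
    "c3": {"c4": 3},
}

def resolve_ambiguities(graph, ratio=10):
    resolved = {}
    for contig, nbrs in graph.items():
        if len(nbrs) <= 1:
            resolved[contig] = dict(nbrs)
            continue
        ranked = sorted(nbrs.items(), key=lambda kv: kv[1], reverse=True)
        best, second = ranked[0], ranked[1]
        # Keep only the best edge when it dominates the runner-up.
        resolved[contig] = dict([best]) if best[1] >= ratio * second[1] else dict(nbrs)
    return resolved

print(resolve_ambiguities(contig_graph))  # c1 keeps only c1 -> c2
```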

Keywords: Contig graph, NGS, de novo assembly, scaffold.

7086 Acute Coronary Syndrome Prediction Using Data Mining Techniques – An Application

Authors: Tahseen A. Jilani, Huda Yasin, Madiha Yasin, C. Ardil

Abstract:

In this paper, we use data mining techniques to investigate factors that contribute significantly to the risk of acute coronary syndrome. We take the diagnosis as the dependent variable, with dichotomous values indicating the presence or absence of disease, and apply binary logistic regression to the factors affecting it. The data set was collected from two cardiac hospitals in Karachi, Pakistan. There are sixteen variables in total, of which one is the dependent variable and the remaining fifteen are independent variables. To improve the regression model's performance in predicting acute coronary syndrome, a data reduction technique, principal component analysis, is applied. Based on the data reduction results, only fourteen of the sixteen factors are retained.
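
A sketch of this modelling pipeline with synthetic stand-ins for the fifteen clinical predictors, using scikit-learn's PCA and logistic regression:

```python
# PCA for data reduction followed by binary logistic regression.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 15))                     # 15 independent variables
y = (X[:, 0] - X[:, 3] + rng.normal(0, 0.5, 300) > 0).astype(int)  # diagnosis

model = make_pipeline(StandardScaler(), PCA(n_components=14),
                      LogisticRegression(max_iter=1000))
model.fit(X, y)
print("training accuracy:", model.score(X, y))
print("variance explained:", model.named_steps["pca"].explained_variance_ratio_.sum())
```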

Keywords: Acute coronary syndrome (ACS), binary logistic regression analyses, myocardial ischemia (MI), principal component analysis, unstable angina (U.A.).

7085 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach

Authors: Elias K. Maragos, Petros E. Maravelakis

Abstract:

In Dynamic Data Envelopment Analysis (DDEA), a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this setting, as most researchers accept, there are outputs produced by a DMU in one period to be used as inputs in a future period; these outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of the input, output or intermediate data, assuming that the distribution of their virtual values does not deviate from linearity. This weakness limits the accuracy and analytical power of traditional DDEA models. In this paper, the authors propose an extended DDEA model based on the concept of piecewise linear inputs and outputs. The proposed model increases the flexibility of traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
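
For background, a sketch of a basic static CCR efficiency score via the multiplier LP (the dynamic, piecewise extension above is not reproduced); the data are hypothetical and scipy.optimize.linprog does the solving:

```python
# Input-oriented CCR (multiplier form): maximize u'y_k subject to
# v'x_k = 1 and u'y_j - v'x_j <= 0 for every DMU j, with u, v >= 0.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0]])   # inputs, one row per DMU
Y = np.array([[1.0], [1.0], [1.5]])                  # outputs

def ccr_efficiency(k):
    n_in, n_out = X.shape[1], Y.shape[1]
    c = np.concatenate([-Y[k], np.zeros(n_in)])      # minimize -u'y_k
    A_ub = np.hstack([Y, -X])                        # u'y_j - v'x_j <= 0
    b_ub = np.zeros(X.shape[0])
    A_eq = [np.concatenate([np.zeros(n_out), X[k]])] # v'x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

for k in range(X.shape[0]):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```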

Keywords: Data envelopment analysis, Dynamic DEA, Piecewise linear inputs, Piecewise linear outputs.

7084 Data Mining Determination of Sunlight Average Input for Solar Power Plant

Authors: Fl. Loury, P. Sablonière, C. Lamoureux, G. Magnier, Th. Gutierrez

Abstract:

A method is proposed for extracting faithful representative patterns from a set of observations that suffer from non-negligible fluctuations. Assuming the time interval between measurements is extremely small compared to the observation time, the method first defines a subset of intermediate time intervals characterizing coherent behavior. Projecting the data onto these intervals gives a set of curves, from which an ideal "perfect" curve is constructed by taking their supremum. Comparison with the average real curve in the corresponding interval then yields an efficiency parameter expressing the degradation caused by the fluctuations. The method is applied to sunlight data collected at a specific location, where the ideal sunlight is that resulting from direct exposure at the location's latitude over the year, and the efficiency reflects the action of meteorological parameters, mainly cloudiness, at different periods of the year. The extracted information already provides useful elements for decision-making, before being used in the analysis of plant control.
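
A numerical sketch of the envelope construction, using synthetic daily curves in place of the measured sunlight data:

```python
# Pointwise supremum over observed curves as the "perfect" curve;
# efficiency = average curve / supremum curve. Data are synthetic.
import numpy as np

hours = np.arange(24)
rng = np.random.default_rng(2)
# One noisy daily irradiance-like curve per day (random dips = cloudiness).
days = [np.clip(np.sin(np.pi * hours / 24) - rng.uniform(0, 0.6, 24), 0, None)
        for _ in range(30)]

perfect = np.max(days, axis=0)          # sup limit over observed curves
average = np.mean(days, axis=0)         # average real curve
mask = perfect > 0
efficiency = average[mask].sum() / perfect[mask].sum()
print(f"efficiency factor = {efficiency:.2f}")
```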

Keywords: Base Input Reconstruction, Data Mining, Efficiency Factor, Information Pattern Operator.

7083 An Approach for Ensuring Data Flow in Freight Delivery and Management Systems

Authors: Aurelija Burinskienė, Dalė Dzemydienė, Arūnas Miliauskas

Abstract:

This research aims at developing an approach for more effective freight delivery and transportation process management. Road congestion and the identification of its causes are important, as are the recognition and management of context information. Measuring many parameters during the transportation period and properly monitoring driver work have become challenging. In some situations, the number of vehicles passing a given point per time unit can be evaluated for drivers. The collected data are mainly used to establish new trips. The data flow is more complex in urban areas, where the movement of freight is reported in detail, down to street level. When traffic density is extremely high, as in congestion, and traffic speed is very low, data transmission reaches its peak. Different data sets are generated depending on the type of freight delivery network. There are three types of networks: long-distance delivery networks, last-mile delivery networks and mode-based delivery networks; the last includes different modes, in particular railways. When freight delivery switches from one of these network types to another, more data may be included for reporting purposes, and vice versa. In this case, a significant amount of these data is used for control operations, and the problem requires an integrated methodological approach. The paper presents an approach for providing e-services for drivers that includes assessing the multi-component infrastructure needed for freight delivery according to the network type. Such a methodology is required to evaluate data flow conditions and overloads, and to minimize the time gaps in data reporting. The results show that the proposed methodological approach can support management and decision-making processes by incorporating networking specifics and helping to minimize overloads in data reporting.

Keywords: Transportation networks, freight delivery, data flow, monitoring, e-services.

7082 Inefficiency of Data Storing in Physical Memory

Authors: Kamaruddin Malik Mohamad, Sapiee Haji Jamel, Mustafa Mat Deris

Abstract:

Memory forensics is important in digital investigation. Forensic analysis is based on the data stored in physical memory and involves memory management and processing time. However, current forensic tools do not consider efficiency in terms of storage management and processing time. This paper shows the high redundancy of data found in physical memory, which causes inefficiency in processing time and memory management. The experiment was done using the Borland C compiler on Windows XP with 512 MB of physical memory.
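
A sketch of one way to measure such redundancy, assuming a raw memory dump file (the file name is hypothetical): hash fixed-size pages and report the fraction of duplicates:

```python
# Page-level redundancy scan over a raw memory dump.
import hashlib
from collections import Counter

PAGE = 4096  # bytes

def page_redundancy(dump_path):
    counts = Counter()
    with open(dump_path, "rb") as f:
        while chunk := f.read(PAGE):
            counts[hashlib.sha256(chunk).hexdigest()] += 1
    total = sum(counts.values())
    duplicates = total - len(counts)          # pages seen more than once
    return duplicates / total if total else 0.0

# Example (hypothetical file name):
# print(f"redundant pages: {page_redundancy('memdump.raw'):.1%}")
```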

Keywords: Digital Evidence, Memory Forensics.

7081 Development of an Avionics System for Flight Data Collection of an UAV Helicopter

Authors: Nikhil Ramaswamy, S. N. Omkar, Kashyap H. Nathwani, Anil M. Vanjare

Abstract:

In the present work, an avionics system is developed for flight data collection of a Raptor 30 V2 helicopter. Both on-ground and onboard avionics systems are developed for testing a small-scale Unmanned Aerial Vehicle (UAV) helicopter. The onboard avionics record the helicopter state outputs, namely accelerations, angular rates and Euler angles, in real time, while the on-ground avionics system records the inputs given to the radio-controlled helicopter through a transmitter, also in real time. The avionics systems are designed for low weight, small size, vibration resistance, low power consumption, and easy interfacing. To mitigate the medium-frequency vibrations present on the UAV helicopter during flight, a damper is designed and its performance evaluated. A number of flight tests were carried out, and the data obtained were analyzed for accuracy and repeatability, from which conclusions are drawn.

Keywords: Data collection, Flight Testing, Onground and Onboard Avionics, UAV helicopter

7080 Unit Testing with Déjà-Vu Objects

Authors: Sharareh Afsharian, Andrea Bei, Marco Bianchi

Abstract:

In this paper, we introduce a new unit test technique called the déjà-vu object. Déjà-vu objects replace the real objects used by classes under test, allowing isolated unit tests to be executed. A déjà-vu object can observe and record the behaviour of a real object during real sessions, and then replace it during unit tests, returning the previously recorded results. Consequently, the déjà-vu object technique is useful when a bottom-up development and testing strategy is adopted: déjà-vu objects can increase test portability and test source code readability, while reducing both the time programmers spend developing test code and the risk of incompatibility when switching between déjà-vu and production code.
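
A minimal Python sketch of the concept (the paper's own implementation is not shown): a proxy that records a real collaborator's results, keyed by method name and arguments, and replays them in tests:

```python
# A record/replay proxy in the déjà-vu style.
class DejaVu:
    def __init__(self, real=None, recording=None):
        self._real = real                 # present only while recording
        self._rec = recording if recording is not None else {}

    def __getattr__(self, name):
        def call(*args):
            key = (name, args)
            if self._real is not None:    # recording mode
                self._rec[key] = getattr(self._real, name)(*args)
            return self._rec[key]         # replay (or just-recorded) result
        return call

# Recording session against the real object:
class RealRates:
    def convert(self, amount, currency):
        return amount * {"EUR": 0.9}[currency]   # imagine a remote call

recording = {}
proxy = DejaVu(RealRates(), recording)
proxy.convert(100, "EUR")

# Later, in a unit test, no real object is needed:
replay = DejaVu(recording=recording)
assert replay.convert(100, "EUR") == 90.0
```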

Keywords: Bottom-up testing approach, integration test, test portability, unit test.

7079 The Research of Fuzzy Classification Rules Applied to CRM

Authors: Chien-Hua Wang, Meng-Ying Chou, Chin-Tzong Pang

Abstract:

In an era of intense competition, understanding and satisfying customers' requirements are critical tasks for a company to make a profit. Customer relationship management (CRM) has thus become an important business issue. With the help of data mining techniques, a manager can explore and analyze large quantities of data to discover meaningful patterns and rules. Among these techniques, the well-known association rule is the most commonly used. This paper builds on the Apriori algorithm and combines it with a genetic algorithm to discover fuzzy classification rules. The mined results can be applied in CRM to help decision makers make sound business decisions for marketing strategies.
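
A sketch of the Apriori step alone, on toy transactions; the genetic/fuzzy layer described above is omitted:

```python
# Level-wise Apriori: keep itemsets whose support meets a threshold.
from itertools import combinations

transactions = [{"bread", "milk"}, {"bread", "beer"}, {"bread", "milk", "beer"},
                {"milk", "beer"}, {"bread", "milk"}]
MIN_SUP = 0.4

def apriori(db, min_sup):
    n = len(db)
    support = lambda items: sum(items <= t for t in db) / n
    singletons = {frozenset([i]) for t in db for i in t}
    level = {s for s in singletons if support(s) >= min_sup}
    frequent = set(level)
    while level:
        size = len(next(iter(level))) + 1
        candidates = {a | b for a, b in combinations(level, 2) if len(a | b) == size}
        level = {c for c in candidates if support(c) >= min_sup}
        frequent |= level
    return {tuple(sorted(s)): support(s) for s in frequent}

for itemset, sup in sorted(apriori(transactions, MIN_SUP).items()):
    print(itemset, f"support={sup:.2f}")
```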

Keywords: Customer relationship management (CRM), Data mining, Apriori algorithm, Genetic algorithm, Fuzzy classification rules.

7078 Equilibrium Modeling of Carbon Dioxide Adsorption on Zeolites

Authors: Alireza Behvandi, Somayeh Tourani

Abstract:

High-pressure adsorption of carbon dioxide on zeolite 13X was investigated in the pressure range 0 to 4 MPa at temperatures of 298, 308 and 323 K. The data were fitted with the Toth, UNILAN, Dubinin-Astakhov and virial adsorption models, which are generally used for microporous adsorbents such as zeolites. Comparison with experimental data from the literature indicated that the virial model reproduces the results best, which may be partly attributed to its flexibility: the virial model can accommodate as many constants as the data warrant.
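
As an illustration, a least-squares fit of one of the named isotherms (Toth) with scipy; the data points below are invented, not the study's measurements:

```python
# Fit the Toth isotherm q = qm*b*P / (1 + (b*P)^t)^(1/t) to CO2 loadings.
import numpy as np
from scipy.optimize import curve_fit

P = np.array([0.1, 0.5, 1.0, 2.0, 3.0, 4.0])        # pressure, MPa
q = np.array([2.1, 4.0, 4.9, 5.6, 5.9, 6.1])        # loading, mol/kg (made up)

def toth(P, qm, b, t):
    return qm * b * P / (1.0 + (b * P) ** t) ** (1.0 / t)

params, _ = curve_fit(toth, P, q, p0=[7.0, 2.0, 0.7], maxfev=10000)
qm, b, t = params
print(f"qm = {qm:.2f} mol/kg, b = {b:.2f} 1/MPa, t = {t:.2f}")
```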

Keywords: adsorption models, zeolite, carbon dioxide

7077 Application of Java-based Pointcuts in Aspect Oriented Programming (AOP) for Data Race Detection

Authors: Sadaf Khalid, Fahim Arif

Abstract:

The wide use of concurrent programming practices in developing software applications leads to various concurrency errors, of which the data race is the most important. Java provides strong support for concurrent programming through its concurrency packages. Aspect oriented programming (AOP) is a modern programming paradigm facilitating the runtime interception of events of interest, and it can be used effectively to handle concurrency problems. AspectJ, an aspect-oriented extension of Java, makes it possible to apply AOP concepts to data race detection. Volatile variables are usually considered thread safe, but they can become candidates for data races if non-atomic operations are performed on them concurrently. Various data race detection algorithms have been proposed in the past, but this issue of volatility and atomicity remains unaddressed. The aim of this research is to propose conditions for detecting data races on the volatile fields of Java programs, taking into account the atomicity support in the Java concurrency packages and making use of pointcuts. Two simple test programs demonstrate the results, which are verified on two different Java Development Kits (JDKs) for comparison.
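
The paper works in Java and AspectJ; as a language-neutral illustration of the targeted problem, the Python sketch below shows how a non-atomic read-modify-write on a shared variable loses updates under threads:

```python
# Demonstration of the race the abstract targets: a non-atomic
# read-modify-write on shared state loses increments under threads.
import threading

counter = 0           # shared, like a volatile int updated non-atomically

def worker(n):
    global counter
    for _ in range(n):
        tmp = counter     # read
        tmp += 1          # modify
        counter = tmp     # write -- interleavings here lose increments

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("expected 400000, got", counter)   # usually less: a data race
```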

Keywords: Aspect Bench Compiler (abc), Aspect Oriented Programming (AOP), AspectJ, Aspects, Concurrency packages, Concurrent programming, Cross-cutting Concerns, Data race, Eclipse, Java, Java Development Kits (JDKs), Pointcuts

7076 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency

Authors: Rania Alshikhe, Vinita Jindal

Abstract:

Digital innovation has played a crucial role in managing smart transportation. Big trajectory data collected from traveling vehicles, such as taxis equipped with global positioning system (GPS)-enabled devices, can be utilized for this purpose, offering an unprecedented opportunity to trace vehicle movements at fine spatiotemporal granularity. This paper explores big trajectory data to measure the travel efficiency of road networks using the proposed statistical travel efficiency measure (STEM) across an entire city. Further, it identifies the causes of low travel efficiency with the proposed least square approximation network-based causality exploration (LANCE). The resulting analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve traffic conditions and thus minimize the average travel time from a given point A to point B in the road network. The results show that the proposed approach outperforms the baseline algorithms for measuring road network travel efficiency.
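
The paper's exact STEM formula is not reproduced here; as a hedged stand-in, the sketch below scores each road segment by free-flow travel time divided by the average observed travel time from hypothetical GPS-derived records:

```python
# Segment-level travel efficiency from trajectory-derived travel times.
segments = {                      # segment -> (length_km, speed_limit_kmh)
    "A-B": (2.0, 60), "B-C": (1.5, 50), "C-D": (3.0, 80),
}
observed = {                      # segment -> observed travel times (min)
    "A-B": [4.0, 5.5, 6.0], "B-C": [2.0, 2.2], "C-D": [2.4, 2.6, 2.5],
}

for seg, (length, limit) in segments.items():
    free_flow = length / limit * 60                  # minutes at speed limit
    avg = sum(observed[seg]) / len(observed[seg])
    print(f"{seg}: efficiency = {free_flow / avg:.2f}")  # 1.0 = free flow
```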

Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE

7075 The Concept and Practice of Good Governance in the European Union

Authors: Robert Grzeszczak

Abstract:

The article deals with one of the most significant issues concerning the functioning of the public sector in the European Union. The objectives of good governance were formulated by the EU itself, and also by scholars, in reaction to the discussion that started a decade ago about the role of government in the 21st century, the future of integration processes, and globalization challenges in Europe. Currently, the concept of good governance is mainly associated with improving the management of public policies in the European Union, concerning both domestic and EU policies. However, it goes beyond questions of state capacity and management effectiveness. Good governance also relates to societal participation in public administration and to the verification of decisions made by public authorities (including the public administration). Indirectly, the concept and practice of good governance are connected to the societal legitimisation of public bodies in the European Union.

Keywords: Good governance, Government, European law, European Union.

7074 Actionable Rules: Issues and New Directions

Authors: Harleen Kaur

Abstract:

Knowledge Discovery in Databases (KDD) is the process of extracting previously unknown, hidden and interesting patterns from huge amounts of data stored in databases. Data mining is the stage of the KDD process that selects and applies a particular algorithm to extract interesting and useful knowledge. Data mining methods are expected to find interesting patterns in databases according to some measures, so it is vitally important to define good measures of interestingness that allow the system to discover only useful patterns. Measures of interestingness are divided into objective and subjective measures. Objective measures depend only on the structure of a pattern and can be quantified using statistical methods, while subjective measures depend on the subjectivity and understanding of the user who examines the patterns. Subjective measures are further divided into actionable, unexpected and novel. A key issue facing the data mining community is how to act on the basis of discovered knowledge. For a pattern to be actionable, the user's subjectivity is captured by providing his or her background knowledge about the domain. Here, we consider the actionability of discovered knowledge as a measure of interestingness and raise important issues that need to be addressed in order to discover actionable knowledge.

Keywords: Data Mining Community, Knowledge Discovery in Databases (KDD), Interestingness, Subjective Measures, Actionability.

7073 Model Discovery and Validation for the Qsar Problem using Association Rule Mining

Authors: Luminita Dumitriu, Cristina Segal, Marian Craciun, Adina Cocu, Lucian P. Georgescu

Abstract:

There are several approaches to solving the Quantitative Structure-Activity Relationship (QSAR) problem, based either on statistical methods or on predictive data mining. Among the statistical methods, one can consider regression analysis, pattern recognition (such as cluster analysis, factor analysis and principal components analysis) or partial least squares. Predictive data mining techniques use neural networks, genetic programming, or neuro-fuzzy knowledge. These approaches have low explanatory capability or none at all. This paper attempts to establish a new approach to solving QSAR problems using descriptive data mining, so that the relationship between the chemical properties and the activity of a substance can be comprehensibly modeled.
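
A hedged sketch of the descriptive-rule idea: discretized molecular descriptors form rule antecedents, and support/confidence link them to an activity class (descriptor names and records below are hypothetical):

```python
# Support and confidence of a descriptor -> activity rule.
records = [
    {"logP": "high", "MW": "low",  "active": True},
    {"logP": "high", "MW": "high", "active": True},
    {"logP": "low",  "MW": "low",  "active": False},
    {"logP": "high", "MW": "low",  "active": True},
    {"logP": "low",  "MW": "high", "active": False},
]

def rule_stats(antecedent, consequent=("active", True)):
    matches = [r for r in records if all(r[k] == v for k, v in antecedent.items())]
    hits = [r for r in matches if r[consequent[0]] == consequent[1]]
    support = len(hits) / len(records)
    confidence = len(hits) / len(matches) if matches else 0.0
    return support, confidence

s, c = rule_stats({"logP": "high"})
print(f"logP=high -> active: support={s:.2f}, confidence={c:.2f}")
```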

Keywords: association rules, classification, data mining, Quantitative Structure - Activity Relationship.

7072 Learning Bridge: A Reading Comprehension Platform with Rich Media

Authors: Yu-Chin Kuo, Szu-Wei Yang, Hsin-Hung Kuo

Abstract:

A Reading Comprehension (RC) Platform has been constructed and developed to facilitate children's English reading comprehension. Like a learning bridge, the RC Platform focuses on the integration of rich media and picture-book texts. The study examines the effects of the project within the RC Platform on children. Two classes of fourth graders were selected from a public elementary school in an urban area of central Taiwan. The survey findings showed that the students demonstrated high interest in the RC Platform. The students benefited greatly from, and enjoyed, reading via the technology-enhanced project within the RC Platform. The Platform is a good reading bridge that enriches students' learning experiences and enhances their performance in English reading comprehension.

Keywords: English Teaching, Multimedia-based Learning, Learning Platform, Reading Comprehension, Technology Enhanced Learning.

7071 On using PEMFC for Electrical Power Generation on More Electric Aircraft

Authors: Jenica Ileana Corcau, Liviu Dinca

Abstract:

The electrical power systems of aircraft have made serious progress in recent years because aircraft depend more and more on electricity. There is a trend in the aircraft industry to replace hydraulic and pneumatic systems with electrical systems, achieving more comfort and monitoring features and increasing energy efficiency. Thus the concept of the More Electric Aircraft was born. This paper analyzes the integration of a fuel cell into the existing electrical generation and distribution systems of an aircraft. The dynamic characteristics of fuel cell systems necessitate an adaptation of the electrical power system. The architecture studied in this paper consists of a 50 kW fuel cell, a dc-to-dc converter and several loads. The dc-to-dc converter steps the fuel cell voltage down from about 625 Vdc to 28 Vdc.
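
For orientation, assuming an ideal buck stage in continuous conduction (the paper does not specify the converter topology), the required voltage conversion ratio would be

$$ D = \frac{V_{out}}{V_{in}} = \frac{28\ \mathrm{V}}{625\ \mathrm{V}} \approx 0.045, $$

an extreme step-down that in practice usually points to a transformer-isolated or multi-stage design.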

Keywords: Electrical power system, More Electric Aircraft, Fuel Cell, dc to dc converter

7070 Mathematical Modeling of a Sub-Wet Bulb Temperature Evaporative Cooling Using Porous Ceramic Materials

Authors: Meryem Kanzari, Rabah Boukhanouf, Hatem G. Ibrahim

Abstract:

The indirect evaporative cooling process has the advantage of supplying cool air at constant moisture content. However, such a system can only supply air at temperatures above the wet bulb temperature. This paper presents a mathematical model for a sub-wet-bulb-temperature indirect evaporative cooling arrangement that overcomes this limitation and supplies cool air at temperatures approaching the dew point, without increasing its moisture content. In addition, the use of porous ceramics as the wet media offers the advantage of integration into building elements. Results of the computer model show that the proposed design is capable of cooling air to temperatures lower than the ambient wet bulb temperature, achieving a wet bulb effectiveness of about 1.17.
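
The effectiveness figure can be read against the standard definition of wet bulb effectiveness,

$$ \varepsilon_{wb} = \frac{T_{db,in} - T_{db,out}}{T_{db,in} - T_{wb,in}}, $$

so a value of about 1.17 means the supply air leaves below the inlet wet bulb temperature.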

Keywords: Indirect evaporative cooling, porous ceramic, sub-wet bulb temperature.

7069 From Modeling of Data Structures towards Automatic Programs Generating

Authors: Valentin P. Velikov

Abstract:

Automatic program generation saves time and human resources and yields syntactically clear and logically correct modules. Fourth-generation programming languages are concerned with describing the data and processes of the subject area and with obtaining a frame of the corresponding information system. The application can be separated into interface and business logic; for interactive generation of the needed system, either an existing toolkit can be used or a new one created.

Keywords: Computer science, graphical user interface, user dialog interface, dialog frames, data modeling, subject area modeling.

7068 Visual Analytics in K-12 Education - Emerging Dimensions of Complexity

Authors: Linnea Stenliden

Abstract:

The aim of this paper is to understand the learning conditions that emerge when visual analytics is implemented and used in K-12 education. To date, little attention has been paid to the role visual analytics (digital media and technology that highlight visual data communication in order to support analytical tasks) can play in education, and to the extent to which these tools can provide actionable data for young students. This study was conducted in three public K-12 schools, in four social science classes with students aged 10 to 13 years, over a period of two to four weeks at each school. Empirical data were generated using video observations and analyzed with the help of metaphors from Actor-network theory (ANT). The learning conditions are found to be distinguished by broad complexity, characterized by four dimensions that emerge from the actors' deeply intertwined relations in the activities. In relation to these dimensions, the paper argues that novel approaches to teaching and learning could benefit students' knowledge building as they work with visual analytics, analyzing visualized data.

Keywords: Analytical reasoning, complexity, data use, problem space, visual analytics, visual storytelling, translation.

7067 An Energy Efficient Cluster Formation Protocol with Low Latency In Wireless Sensor Networks

Authors: A. Allirani, M. Suganthi

Abstract:

Data gathering is an essential operation in wireless sensor network applications, and it requires energy-efficient techniques to increase the lifetime of the network. Clustering is an effective technique for improving the energy efficiency and network lifetime of wireless sensor networks. In this paper, an energy-efficient cluster formation protocol is proposed with the objective of achieving low energy dissipation and low latency without sacrificing application-specific quality. The objective is achieved by applying randomized, adaptive, self-configuring cluster formation and localized control for data transfers, together with application-specific data processing such as data aggregation or compression. The cluster formation algorithm allows each node to make independent decisions, so as to generate good clusters in the end. Simulation results show that the proposed protocol requires minimal energy and latency for cluster formation, thereby reducing the protocol's overhead.
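
A sketch of randomized, self-configuring cluster-head election in this spirit; the LEACH-style rotation threshold used below is an assumption, not necessarily the authors' exact rule:

```python
# Randomized cluster-head election with a rotation threshold
# T = P / (1 - P * (round mod 1/P)); nodes that served recently sit out.
import random

random.seed(7)
P = 0.1            # desired fraction of cluster heads per round

def elect_heads(node_ids, round_no, been_head):
    threshold = P / (1 - P * (round_no % int(1 / P)))
    heads = []
    for n in node_ids:
        if n not in been_head and random.random() < threshold:
            heads.append(n)
            been_head.add(n)
    return heads

been_head = set()
for rnd in range(3):
    print(f"round {rnd}: heads = {elect_heads(range(20), rnd, been_head)}")
```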

Keywords: Sensor networks, Low latency, Energy sorting protocol, data processing, Cluster formation.

7066 An Approach to Practical Determination of Fair Premium Rates in Crop-Hail Insurance Using Short-Term Insurance Data

Authors: Necati Içer

Abstract:

Crop-hail insurance plays a vital role in managing risks and reducing the financial consequences of hail damage to crop production. Predicting insurance premium rates from short-term data is a major challenge in many nations because of the unique characteristics of hailstorms. This study suggests a feasible approach to establishing equitable crop-hail insurance premium rates for nations with short-term insurance data. The primary goal of the rate-making process is to determine premium rates for villages with high or zero loss costs and to enhance the credibility of those rates. To do this, a technique was created using the author's practical knowledge of crop-hail insurance, and the rate-making method was developed using a range of temporal and spatial factor combinations with both hypothetical and real data, including extreme cases. The article shows how to incorporate these temporal and spatial elements into determining fair premium rates from short-term insurance data, and ends with a suggestion on the ultimate premium rates for insurance contracts.
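
A hedged sketch of one standard actuarial device for the stated goal (stabilizing rates for villages with high or zero observed loss costs): credibility-weighting each village's loss cost against a regional one. The square-root credibility rule and all numbers are illustrative, not the paper's formula:

```python
# Credibility-weighted premium rate from a short village history.
def credibility_rate(village_loss_cost, regional_loss_cost, years,
                     full_cred_years=25):
    z = min(1.0, (years / full_cred_years) ** 0.5)   # square-root credibility
    return z * village_loss_cost + (1 - z) * regional_loss_cost

# Village with 4 years of data, zero observed losses, in a 3% region:
print(f"premium rate = {credibility_rate(0.0, 0.03, 4):.4f}")
```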

Keywords: Crop-hail insurance, premium rate, short-term insurance data, spatial and temporal parameters.

7065 The Pixel Value Data Approach for Rainfall Forecasting Based on GOES-9 Satellite Image Sequence Analysis

Authors: C. Yaiprasert, K. Jaroensutasinee, M. Jaroensutasinee

Abstract:

We develop a process for extracting pixel values from satellite remote sensing image data over Thailand, an important and effective basis for rainfall forecasting. This paper presents an approach for forecasting likely rainfall areas based on pixel values from remote sensing satellite images. First, the pixel value data are automatically extracted from the satellite image sequence. Then, a data process is designed to enable the inference of correlations between pixel values and possible rainfall occurrences. The results show that days with a high average pixel value in the daily water vapor data also tend to have high daily rainfall, suggesting that the average pixel value can be used as an indicator of rain events. There are positive associations between the pixel values of the daily water vapor images and the amount of daily rainfall at rain-gauge stations throughout Thailand. The proposed approach provides meteorologists with a helpful, automated procedure for analyzing and interpreting meteorological remote sensing data for rainfall forecasting.
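
A sketch of the extraction step, assuming the water vapor images are already loaded as a (days, height, width) array; station coordinates and values are invented:

```python
# Average pixel values in a small window around each rain-gauge station.
import numpy as np

rng = np.random.default_rng(3)
images = rng.integers(0, 256, size=(30, 100, 100))     # stand-in image stack
stations = {"Nakhon": (40, 55), "Phuket": (80, 20)}    # (row, col) per station
HALF = 2                                               # 5x5 pixel window

for name, (r, c) in stations.items():
    window = images[:, r - HALF:r + HALF + 1, c - HALF:c + HALF + 1]
    daily_means = window.mean(axis=(1, 2))             # one value per day
    print(name, "mean pixel value, first 5 days:", np.round(daily_means[:5], 1))
```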

Keywords: Pixel values, satellite image, water vapor, rainfall, image processing.

7064 Multistage Data Envelopment Analysis Model for Malmquist Productivity Index Using Grey's System Theory to Evaluate Performance of Electric Power Supply Chain in Iran

Authors: Mesbaholdin Salami, Farzad Movahedi Sobhani, Mohammad Sadegh Ghazizadeh

Abstract:

Evaluation of organizational performance is among the most important measures that help organizations and entities continuously improve their efficiency. Organizations can use existing data and comparisons among the units under investigation to estimate their performance. The Malmquist Productivity Index (MPI) is an important index for evaluating overall productivity, since it considers technological development and technical efficiency at the same time. This article proposes a model based on a multistage MPI that handles limited data using Grey's system theory. The model can evaluate the performance of units in a multistage process from limited and uncertain data. It was applied by the electricity market manager to Iran's electric power supply chain (EPSC), which involves uncertain data, to evaluate the performance of its actors. Solving the model showed an improvement in the accuracy of the estimated future performance of the units under investigation when Grey's system theory is used. The model can be applied to any case study in which the MPI is used and the data are limited or uncertain.
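
For reference, the standard (crisp-data) Malmquist index between periods $t$ and $t+1$, which the Grey extension generalizes to interval data, is

$$ MPI = \left[ \frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)} \cdot \frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)} \right]^{1/2}, $$

where $D^{t}$ is the distance function measured against the period-$t$ frontier.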

Keywords: Malmquist Index, Grey's Theory, Charnes Cooper & Rhodes (CCR) Model, network data envelopment analysis, Iran electricity power chain.

7063 Key Strategies for a Competitive Supply Chain

Authors: Ajay Verma, Nitin Seth

Abstract:

In this era of competitiveness, supply chains too must become competitive enough to handle pressures such as varying customer expectations, the delivery of low-cost, high-quality products in minimum time and, most importantly, cut-throat competition on a worldwide scale. In recent years, supply chain competitiveness has therefore been accepted as one of the most important philosophies in the supply chain literature, and researchers have tried to identify its variables. This paper highlights some concepts of supply chain competitiveness and identifies selected variables on the basis of a literature review. Further, the paper highlights the importance of the identified variables in achieving supply chain competitiveness. The aim is to explore the concept and to motivate researchers to investigate the unexplored areas of this important subject domain.

Keywords: Supply Chain Competitiveness, Demand Management, Integration, Inventory Management, Flexibility, Information Technology.
