Search results for: business data type
8257 Heterogeneous Attribute Reduction in Noisy Systems Based on a Generalized Neighborhood Rough Sets Model
Authors: Siyuan Jing, Kun She
Abstract:
Neighborhood Rough Sets (NRS) have been proven to be an efficient tool for heterogeneous attribute reduction. However, most research has focused on complete and noiseless data. In fact, most information systems are noisy, i.e., filled with incomplete and inconsistent data. In this paper, we introduce a generalized neighborhood rough sets model, called VPTNRS, to deal with the problem of heterogeneous attribute reduction in noisy systems. We generalize the classical NRS model with a tolerance neighborhood relation and probabilistic theory. Furthermore, we use the neighborhood dependency to evaluate the significance of a subset of heterogeneous attributes and construct a forward greedy algorithm for attribute reduction based on it. Experimental results show that the model deals with noisy data efficiently.
Keywords: attribute reduction, incomplete data, inconsistent data, tolerance neighborhood relation, rough sets
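A minimal sketch of the forward greedy reduction loop the abstract describes, assuming numeric attributes, a plain tolerance neighborhood in place of the paper's probabilistic VPTNRS model, and hypothetical names throughout:

```python
import numpy as np

def dependency(X, y, attrs, delta=0.2):
    """Fraction of samples whose tolerance neighborhood (under `attrs`) is
    label-consistent -- a stand-in for the neighborhood dependency degree."""
    sub = X[:, attrs]
    dep = 0
    for i in range(len(sub)):
        # tolerance neighborhood: samples within delta in every selected attribute
        mask = np.all(np.abs(sub - sub[i]) <= delta, axis=1)
        if np.all(y[mask] == y[i]):          # neighborhood is consistent
            dep += 1
    return dep / len(sub)

def forward_greedy_reduct(X, y, delta=0.2, eps=1e-3):
    remaining, reduct, best = list(range(X.shape[1])), [], 0.0
    while remaining:
        gains = [(dependency(X, y, reduct + [a], delta), a) for a in remaining]
        gain, a = max(gains)
        if gain - best < eps:                # no significant improvement: stop
            break
        reduct.append(a); remaining.remove(a); best = gain
    return reduct
```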
8256 The Journey of a Malicious HTTP Request
Authors: M. Mansouri, P. Jaklitsch, E. Teiniker
Abstract:
SQL injection is a very popular kind of attack on web applications. Mechanisms such as intrusion detection systems exist to detect this attack. These strategies often rely on techniques implemented at high layers of the application, but do not consider the low level of system calls. The problem with considering only the high-level perspective is that an attacker can circumvent the detection tools using techniques such as URL encoding. One technique currently used for detecting low-level attacks on privileged processes is the tracing of system calls. System calls act as a single gate to the Operating System (OS) kernel; they allow catching the critical data at an appropriate level of detail. Our basic assumption is that any type of application, be it a system service, utility program or Web application, “speaks” the language of system calls when having a conversation with the OS kernel. At this level we can see the actual attack while it is happening. We conduct an experiment to demonstrate the suitability of system call analysis for detecting SQL injection, and we are able to detect the attack. We therefore conclude that system calls are not only powerful in detecting low-level attacks but that they also enable us to detect high-level attacks such as SQL injection.
Keywords: Linux system calls, Web attack detection, Interception.
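A hedged illustration of the system-call tracing idea: run a process under Linux's strace and flag read/write buffers containing SQL-injection-like strings. This is a naive signature match, not the paper's analysis; the pattern list is illustrative only.

```python
import re
import subprocess

SQL_PAT = re.compile(r"(union\s+select|or\s+1=1|';\s*--)", re.IGNORECASE)

def trace_and_scan(cmd):
    """Run a command under strace and flag read/write buffers that
    contain SQL-injection-like payloads."""
    proc = subprocess.run(
        ["strace", "-f", "-e", "trace=read,write", "-s", "512"] + cmd,
        capture_output=True, text=True)
    hits = []
    for line in proc.stderr.splitlines():   # strace logs to stderr
        if SQL_PAT.search(line):
            hits.append(line)
    return hits
```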
8255 A Mobile Agent-based Clustering Data Fusion Algorithm in WSN
Authors: Xiangbin Zhu, Wenjuan Zhang
Abstract:
In wireless sensor networks, mobile agent technology is used for data fusion. We design a node clustering algorithm based on node residual energy and the results of partial integration, and we optimize the mobile agent's routing strategy within each cluster to further reduce the amount of data transferred. Experiments show that using mobile agents in the intra-cluster integration process can reduce path loss to some extent.
Keywords: wireless sensor networks, data fusion, mobile agent
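A toy sketch of residual-energy-based cluster formation, the first step the abstract mentions (the node layout and field names are hypothetical):

```python
import random

def elect_cluster_heads(nodes, ratio=0.1):
    """Pick the highest-residual-energy nodes as cluster heads
    (a crude stand-in for the clustering rule in the abstract)."""
    k = max(1, int(len(nodes) * ratio))
    heads = sorted(nodes, key=lambda n: n["energy"], reverse=True)[:k]
    for n in nodes:
        # attach every node to its nearest head (squared planar distance)
        n["head"] = min(heads,
                        key=lambda h: (h["x"] - n["x"])**2 + (h["y"] - n["y"])**2)["id"]
    return [h["id"] for h in heads]

nodes = [{"id": i, "x": random.random(), "y": random.random(),
          "energy": random.random()} for i in range(50)]
print(elect_cluster_heads(nodes))
```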
8254 Collision Detection Algorithm Based on Data Parallelism
Authors: Zhen Peng, Baifeng Wu
Abstract:
Modern computing technology has entered the era of parallel computing, with a trend toward sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to follow this trend: it gathers more and more computing power by increasing the number of processor cores, without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications face the challenge of increasingly large amounts of data. Data parallel computing will be an important way to further improve the performance of these applications. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability over traditional algorithms are substantial.
Keywords: Data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability.
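A small data-parallel example in the spirit of the abstract: once complex objects are decomposed into simple ones (here, axis-aligned bounding boxes), the pairwise overlap test vectorizes naturally, which is what lets SIMD hardware process many boxes per instruction.

```python
import numpy as np

def aabb_collisions(mins_a, maxs_a, mins_b, maxs_b):
    """Pairwise axis-aligned bounding-box overlap test between two sets of
    simple objects, written in data-parallel (SIMD-friendly) NumPy style."""
    # broadcast to an (n, m, 3) comparison: boxes collide iff they
    # overlap on every axis, i.e. are separated on none
    sep = (maxs_a[:, None, :] < mins_b[None, :, :]) | \
          (maxs_b[None, :, :] < mins_a[:, None, :])
    return ~sep.any(axis=2)        # (n, m) boolean collision matrix

n, m = 4, 5
mins_a = np.random.rand(n, 3); maxs_a = mins_a + 0.5
mins_b = np.random.rand(m, 3); maxs_b = mins_b + 0.5
print(aabb_collisions(mins_a, maxs_a, mins_b, maxs_b))
```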
8253 Development of Energy Benchmarks Using Mandatory Energy and Emissions Reporting Data: Ontario Post-Secondary Residences
Authors: C. Xavier Mendieta, J. J McArthur
Abstract:
Governments are playing an increasingly active role in reducing carbon emissions, and a key strategy has been the introduction of mandatory energy disclosure policies. These policies have produced a significant amount of publicly available data, giving researchers a unique opportunity to develop location-specific energy and carbon emission benchmarks, which can then be used to develop building archetypes and to inform urban energy models. This study presents the development of such a benchmark using the public reporting data. The data from Ontario’s Ministry of Energy for Post-Secondary Educational Institutions are used to develop a series of building archetype dynamic building loads and energy benchmarks to fill a gap in the currently available building database. This paper presents the development of a benchmark for college and university residences within ASHRAE climate zone 6 areas in Ontario using the mandatory disclosure energy and greenhouse gas emissions data. The methodology presented includes data cleaning, statistical analysis, and benchmark development, and lessons learned from this investigation are presented and discussed to inform the development of future energy benchmarks from this larger data set. The key findings from this initial benchmarking study are: (1) careful data screening and outlier identification are essential to develop a valid dataset; (2) the key features used to develop a model of the data are building age, size, and occupancy schedules, and these can be used to estimate energy consumption; and (3) policy changes affecting primary energy generation significantly affected greenhouse gas emissions, and consideration of these factors was critical to evaluate the validity of the reported data.
Keywords: Building archetypes, data analysis, energy benchmarks, GHG emissions.
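One plausible form of the outlier-screening step from finding (1), using Tukey fences on energy-use intensity (the column name and fence factor are assumptions, not the authors' procedure):

```python
import pandas as pd

def screen_outliers(df, col="kwh_per_m2", k=1.5):
    """Drop records whose energy-use intensity falls outside the
    Tukey fences -- one simple form of the data screening step."""
    q1, q3 = df[col].quantile([0.25, 0.75])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return df[(df[col] >= lo) & (df[col] <= hi)]

raw = pd.DataFrame({"kwh_per_m2": [180, 210, 195, 2050, 175, 5, 220]})
print(screen_outliers(raw))   # the 2050 and 5 entries are rejected
```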
8252 Experimental Characterization of the Color Quality and Error Rate for a Red, Green, and Blue-Based Light-Emitting Diode Fixture Used in Visible Light Communications
Authors: Juan F. Gutierrez, Jesus M. Quintero, Diego Sandoval
Abstract:
An important feature of Light-Emitting Diode (LED) technology is fast on-off commutation. This allows data transmission using modulation formats such as On-Off Keying (OOK) and Color Shift Keying (CSK). Because CSK based on three color bands uses red, green, and blue monochromatic LEDs (RGB-LEDs) to define a pattern of chromaticities, this type of CSK provides poor color quality in the illuminated area. In this work, we present the design and implementation of a VLC system using RGB-based CSK with 16, 8, and 4 color points, mixed with a steady baseline from a phosphor white LED, to improve the color quality of the LED fixture. The experimental system was assessed in terms of the Symbol Error Rate (SER) and the Color Rendering Index (CRI). Good color quality performance of the LED fixture was obtained with an acceptable SER. We describe the laboratory setup used to characterize and calibrate the LED fixture.
Keywords: Color rendering index, symbol error rate, color shift keying, visible light communications.
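For reference, SER is simply the fraction of wrongly decoded symbols; a minimal computation (with a made-up 2% corruption model) looks like:

```python
import numpy as np

def symbol_error_rate(tx, rx):
    """SER = wrongly decoded symbols / total symbols."""
    tx, rx = np.asarray(tx), np.asarray(rx)
    return np.mean(tx != rx)

tx = np.random.randint(0, 16, 10000)          # 16-point CSK constellation
noise_flips = np.random.rand(10000) < 0.02    # assume 2% of symbols corrupted
rx = np.where(noise_flips, (tx + 1) % 16, tx)
print(symbol_error_rate(tx, rx))              # ~0.02
```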
8251 Mining Educational Data to Analyze the Student Motivation Behavior
Authors: Kunyanuth Kularbphettong, Cholticha Tongsiri
Abstract:
This research aims to discover knowledge for analyzing student motivation behavior in e-Learning based on data mining techniques, in the case of the Information Technology for Communication and Learning course at Suan Sunandha Rajabhat University. The data mining techniques applied in this research include association rule mining and classification techniques. The results show that data mining techniques can indicate the important variables that influence student motivation behavior in e-Learning.
Keywords: association rule mining, classification techniques, e-Learning, Moodle log, motivation behavior
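A bare-bones association rule miner over hypothetical Moodle-style activity logs, illustrating the support/confidence computation behind the technique the abstract names (the thresholds and item names are invented):

```python
from itertools import combinations

def one_to_one_rules(transactions, min_sup=0.3, min_conf=0.6):
    """Enumerate A -> B rules between single items with support and
    confidence above the thresholds (a bare-bones association miner)."""
    n = len(transactions)
    items = {i for t in transactions for i in t}
    sup = {frozenset(c): sum(1 for t in transactions if set(c) <= t) / n
           for k in (1, 2) for c in combinations(sorted(items), k)}
    rules = []
    for a, b in combinations(sorted(items), 2):
        for x, y in ((a, b), (b, a)):
            s = sup[frozenset((x, y))]
            if s >= min_sup and sup[frozenset((x,))] > 0:
                conf = s / sup[frozenset((x,))]
                if conf >= min_conf:
                    rules.append((x, y, s, conf))   # rule x -> y
    return rules

logs = [{"quiz", "forum"}, {"quiz", "video"}, {"quiz", "forum"},
        {"forum"}, {"quiz", "forum", "video"}]
print(one_to_one_rules(logs))
```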
8250 Construction of Decentralized Lifetime Maximizing Tree for Data Aggregation in Wireless Sensor Networks
Authors: Deepali Virmani, Satbir Jain
Abstract:
To meet the demands of wireless sensor networks (WSNs), where data are usually aggregated at a single source prior to transmission to any distant user, a tree structure must be established inside any given event region. In this paper, a novel technique to create one such tree is proposed. This tree preserves energy and maximizes the lifetime of event sources while they are constantly transmitting for data aggregation. The term Decentralized Lifetime Maximizing Tree (DLMT) is used to denote this tree. In DLMT, nodes with higher energy tend to be chosen as data-aggregating parents, so that the time to detect the first broken tree link can be extended and less energy is involved in tree maintenance. By constructing the tree in this way, the protocol is able to reduce the frequency of tree reconstruction, minimize the amount of data loss, minimize the delay during data collection, and preserve energy.
Keywords: branch energy, decentralized, energy level, lifetime, tree energy, wireless sensor networks.
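A crude sketch of the parent-selection idea: grow the tree from the aggregation root, expanding energy-rich neighbors first so they end up as data-aggregating parents. The data layout and tie-breaking are assumptions, and the real DLMT protocol is decentralized rather than a centralized traversal like this:

```python
def build_tree(energy, adjacency, root):
    """Grow an aggregation tree from `root`, always expanding through the
    neighbors with the highest residual energy (hypothetical data layout)."""
    parent, visited = {root: None}, {root}
    frontier = [root]
    while frontier:
        u = frontier.pop(0)
        # visit energy-rich neighbors first so they become parents sooner
        for v in sorted(adjacency[u], key=lambda n: energy[n], reverse=True):
            if v not in visited:
                parent[v] = u
                visited.add(v)
                frontier.append(v)
    return parent

energy = {0: 0.9, 1: 0.4, 2: 0.8, 3: 0.3, 4: 0.6}
links = {0: [1, 2], 1: [0, 3], 2: [0, 3, 4], 3: [1, 2], 4: [2]}
print(build_tree(energy, links, root=0))
```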
8249 Brand Placement Strategies in Turkey: The Case of “Yalan Dünya”
Authors: Burçe Boyraz
Abstract:
This study examines appearances of brand placement as an alternative communication strategy in television series, focusing on Yalan Dünya, one of the most popular television series in Turkey. The study has a descriptive research design, and the quantitative content analysis method is used to analyze frequency and time data of brand placement appearances in the first 3 seasons of Yalan Dünya, comprising 16 episodes. The analysis of brand placement practices in Yalan Dünya is organized into three categories: episode-based analysis, season-based analysis and comparative analysis. Finally, brand placement practices in Yalan Dünya are evaluated in terms of type, form, duration and legal arrangements. The study finds that brand placement plays a determinant role in Yalan Dünya content. It also finds that current legal arrangements bring brand placement closer to other traditional communication strategies rather than distinguishing it from them clearly.
Keywords: Advertising, Alternative communication strategy, Brand placement, Yalan Dünya.
8248 Interoperability in Component Based Software Development
Authors: M. Madiajagan, B. Vijayakumar
Abstract:
Interoperability is the ability of information systems to operate in conjunction with each other, encompassing communication protocols, hardware, software, application, and data compatibility layers. There has been considerable work in industry on the development of component interoperability models, such as CORBA, (D)COM and JavaBeans. These models are intended to reduce the complexity of software development and to facilitate reuse of off-the-shelf components. The focus of these models is syntactic interface specification, component packaging, inter-component communications, and bindings to a runtime environment. What these models lack is a consideration of architectural concerns: specifying systems of communicating components, explicitly representing loci of component interaction, and exploiting architectural styles that provide well-understood global design solutions. The development of complex business applications is now focused on an assembly of components available on a local area network or on the Internet. These components must be localized and identified in terms of available services and communication protocols before any request. The first part of the article introduces the base concepts of components and middleware, the following sections describe the different up-to-date models of communication and interaction, and the last section shows how different models can communicate among themselves.
Keywords: Interoperability, component packaging, communication technology, heterogeneous platform, component interface, middleware.
8247 Effects of Data Correlation in a Sparse-View Compressive Sensing Based Image Reconstruction
Authors: Sajid Abbas, Joon Pyo Hong, Jung-Ryun Lee, Seungryong Cho
Abstract:
Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better quality images. However, the effects of the sampled data's properties on the quality of the reconstructed image, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigated the effects of two data properties, sampling density and data incoherence, on the reconstructed image obtained by conventional computed laminography and a recently proposed method called the spherical sinusoidal scanning scheme. We found that in a compressive sensing based image reconstruction framework, the image quality mainly depends upon the data incoherence when the data are uniformly sampled.
Keywords: Computed tomography, Computed laminography, Compressive sensing, Low-dose.
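Data incoherence is often summarized by the mutual coherence of the sampling operator; a standard computation follows (illustrative only, not necessarily the paper's exact measure):

```python
import numpy as np

def mutual_coherence(A):
    """Largest normalized inner product between distinct columns of the
    sampling matrix -- one common scalar measure of data incoherence."""
    G = A / np.linalg.norm(A, axis=0)      # unit-norm columns
    gram = np.abs(G.T @ G)
    np.fill_diagonal(gram, 0.0)            # ignore self-products
    return gram.max()

A = np.random.randn(64, 256)               # toy sampling operator
print(mutual_coherence(A))
```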
8246 Real-Time Data Stream Partitioning over a Sliding Window in Real-Time Spatial Big Data
Authors: Sana Hamdi, Emna Bouazizi, Sami Faiz
Abstract:
In recent years, real-time spatial applications, like location-aware services and traffic monitoring, have become more and more important. Such applications result in dynamic environments where data as well as queries are continuously moving. As a result, a tremendous amount of real-time spatial data is generated every day. The growth of the data volume seems to outpace the advance of our computing infrastructure. For instance, in real-time spatial Big Data, users expect to receive the results of each query within a short time period regardless of the load on the system. But with a huge amount of real-time spatial data generated, system performance degrades rapidly, especially in overload situations. To solve this problem, we propose the use of data partitioning as an optimization technique. Traditional horizontal and vertical partitioning can increase the performance of the system and simplify data management, but they remain insufficient for real-time spatial Big Data: they cannot deal with real-time and stream queries efficiently. Thus, in this paper, we propose a novel data partitioning approach for real-time spatial Big Data named VPA-RTSBD (Vertical Partitioning Approach for Real-Time Spatial Big Data). This contribution is an implementation of the Matching algorithm for traditional vertical partitioning. We first find the optimal attribute sequence by using the Matching algorithm. Then, we propose a new cost model used for database partitioning that keeps the data amount of each partition balanced and provides parallel execution guarantees for the most frequent queries. VPA-RTSBD aims to obtain a real-time partitioning scheme and deals with stream data. It improves the performance of query execution by maximizing the degree of parallel execution. This improves QoS (Quality of Service) in real-time spatial Big Data, especially with a huge volume of stream data. The performance of our contribution is evaluated via simulation experiments. The results show that the proposed algorithm is both efficient and scalable, and that it outperforms comparable algorithms.
Keywords: Real-time spatial Big Data, Quality of Service, vertical partitioning, horizontal partitioning, Matching algorithm, Hamming distance, stream query.
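A toy illustration of the Hamming-distance machinery behind vertical partitioning: order attribute usage columns so that attributes touched by similar query sets sit together. This greedy nearest-neighbor ordering is a stand-in for the Matching algorithm, and the usage matrix is invented:

```python
import numpy as np

def hamming(u, v):
    return int(np.sum(u != v))

def order_attributes(usage):
    """Greedy nearest-neighbor ordering of attribute usage columns by
    Hamming distance -- a toy stand-in for the Matching algorithm step."""
    cols = list(range(usage.shape[1]))
    order = [cols.pop(0)]
    while cols:
        last = usage[:, order[-1]]
        nxt = min(cols, key=lambda c: hamming(last, usage[:, c]))
        order.append(nxt); cols.remove(nxt)
    return order

# rows = queries, columns = attributes; 1 means the query touches it
usage = np.array([[1, 0, 1, 0],
                  [1, 0, 1, 1],
                  [0, 1, 0, 1]])
print(order_attributes(usage))
```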
8245 Design of a Service-Enabled Dependable Integration Environment
Authors: Fuyang Peng, Donghong Li
Abstract:
The aim of information systems integration is to integrate all the data sources, applications and business flows into a new environment so that unwanted redundancies are reduced and bottlenecks and mismatches are eliminated. Two issues have to be dealt with to meet such requirements: the software architecture that supports resource integration, and the adaptor development tool that helps integration and migration of legacy applications. In this paper, a service-enabled dependable integration environment (SDIE) is presented, which has two key components, i.e., a dependable service integration platform and a legacy application integration tool. For the dependable service integration platform, the service integration bus, the service management framework, the dependable engine for service composition, and the service registry and discovery components are described. For the legacy application integration tool, its basic organization, functionalities and the dependability measures taken are presented. Due to its service-oriented integration model, its light-weight extensible container, its service component combination-oriented p-lattice structure, and other features, SDIE has advantages in openness, flexibility, performance-price ratio and feature support over commercial products, and is better than most open source integration software in functionality, performance and dependability support.
Keywords: Application integration, dependability, legacy, SOA.
8244 Incidence of Disasters and Coping Mechanism among Farming Households in South West Nigeria
Authors: Fawehinmi Olabisi Alaba, Adeniyi O. R.
Abstract:
Farming households face many disasters, which contribute to endemic poverty. Anticipated increases in extreme weather events will exacerbate this. Primary data were collected from farming households using a multi-stage random sampling technique. The results of the analysis show that the majority of the respondents (69.9%) are male, with a mean household size, years of formal education and age of 5±1.14, 6±3.41, and 51.06±10.43 respectively. The major type of disaster experienced (48.9%) is flooding. The major coping mechanism adopted is sourcing support from family and friends. Age, education, experience, access to extension agents, and mitigation control methods contribute significantly to vulnerability to disaster. The major adaptation method (62.3%) is the construction of drainage.
The study revealed that the coping mechanisms employed may become less effective as increasingly fragile livelihood systems struggle to withstand disaster shocks. Thus there is a need to train farmers on measures to adapt to and mitigate the shock from disasters.
Keywords: Adaptation, Disasters, Flooding, Vulnerability.
8243 Using Metacognitive Strategies in Reading Comprehension by EFL Students
Authors: Simin Sadeghi-Saeb
Abstract:
Metacognitive strategies consistently play important roles in reading comprehension. Metacognitive strategies involve the active monitoring and consequent regulation and orchestration of cognitive processes in relation to the cognitive objects or data on which they bear. In this paper, the effect of instruction in using metacognitive strategies on reading academic materials was studied, along with the types of metacognitive strategies mostly used by college students before and after the instruction and the level at which they used those strategies before and after the instruction. To this end, 50 female college students were chosen and divided randomly into two groups, an experimental and a control group. In the first session, students in both groups took a standard TOEFL exam. After the pre-test had been administered, the instruction began. After treatment, a post-test was taken. Note that after the pre-test and post-test the same questionnaire was handed to the students of the experimental group. The results of this research show that the instruction of metacognitive strategies has a positive effect on the students' scores in reading comprehension tests. Furthermore, the students' usage of metacognitive strategies changed between before and after the instruction. Finally, the instruction affected the students' level of metacognitive strategy usage.
Keywords: EFL students, English reading comprehension, instruction, metacognitive strategies.
8242 Static Headspace GC Method for Aldehydes Determination in Different Food Matrices
Authors: A. Mandić, M. Sakač, A. Mišan, B. Šojić, L. Petrović, I. Lončarević, B. Pajin, I. Sedej
Abstract:
Aldehydes, as secondary lipid oxidation products, are highly specific to the oxidative degradation of particular polyunsaturated fatty acids present in foods. Gas chromatographic analysis of these volatile compounds has been widely used for monitoring the deterioration of food products. A static headspace gas chromatography method using a flame ionization detector (SHS-GC-FID) was developed and applied to monitor the aldehydes present in processed foods such as bakery, meat and confectionery products.
Five selected aldehydes were determined in samples without any sample preparation, except grinding for bakery and meat products. SHS-GC analysis allows the separation of propanal, pentanal, hexanal, heptanal and octanal within 15 min. Aldehydes were quantified in fresh and stored samples; the obtained range of aldehydes in crackers was 1.62±0.05 – 9.95±0.05 mg/kg, in sausages 6.62±0.46 – 39.16±0.39 mg/kg, and in cocoa spread cream 0.48±0.01 – 1.13±0.02 mg/kg. From the obtained results it can be concluded that the proposed method is suitable for different types of samples, that the content of aldehydes varies depending on the type of sample, and that it differs between fresh and stored samples of the same type.
Keywords: Lipid oxidation, aldehydes, crackers, sausage, cocoa cream spread.
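Quantification in such methods typically runs through an external-standard calibration line; a minimal sketch with made-up peak areas (not the authors' calibration data):

```python
import numpy as np

# external-standard calibration: peak area vs. known hexanal level (made-up data)
conc = np.array([0.5, 1.0, 2.5, 5.0, 10.0])        # mg/kg
area = np.array([410, 830, 2010, 4100, 8150])      # detector counts

slope, intercept = np.polyfit(conc, area, 1)       # linear fit

def quantify(sample_area):
    """Convert a sample's peak area to mg/kg via the calibration line."""
    return (sample_area - intercept) / slope

print(round(quantify(3300), 2), "mg/kg")
```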
8241 A Data Warehouse System to Help Assist Breast Cancer Screening in Diagnosis, Education and Research
Authors: Souâd Demigha
Abstract:
Early detection of breast cancer is considered a major public health issue. Breast cancer screening is not generalized to the entire population due to a lack of resources, staff and appropriate tools. Systematic screening can produce a volume of data which cannot be managed by present computer architectures, either in terms of storage capabilities or in terms of exploitation tools. We propose in this paper to design and develop a data warehouse system in radiology-senology (DWRS). The aim of such a system is, on the one hand, to support this large volume of information coming from multiple sources of data and images, and on the other hand, to help assist breast cancer screening in diagnosis, education and research.
Keywords: Breast cancer screening, data warehouse, diagnosis, education, research.
8240 Data Security in a DApp Twitter Alike on Web 3.0 With Blockchain Based Technology
Authors: Vishal Awasthi, Tanya Soni, Vigya Awasthi, Swati Singh, Shivali Verma
Abstract:
There is a growing demand for a network that grants a high level of data security and confidentiality. For this reason, the semantic web was introduced, which allows data to be shared and reused across applications while safeguarding users' privacy, so that users regain control of their data. The earlier Web 1.0 and Web 2.0 versions were built on client-server architecture, in which there was a risk of data theft and unconsented sale of user data. A decentralized version, known as Web 3.0 and mostly built on blockchain technology, was introduced to resolve these issues. Recent research focuses on blockchain technology and deals with the privacy, security, transparency, and innovation of decentralized applications (DApps), e.g., a Twitter clone or WhatsApp clone. In this paper, the Twitter Alike built on the Ethereum blockchain replaces traditional techniques with improved latency, throughput, and data ownership. The central principle of this DApp is a smart contract implemented using Solidity, which is an object-oriented, high-level language. Consequently, this will provide better quality of service, high data security, and integrity for both present and future internet technologies.
Keywords: Blockchain, DApps, Ethereum, Semantic Web, Smart Contract, Solidity.
8239 Predicting Groundwater Areas Using Data Mining Techniques: Groundwater in Jordan as Case Study
Authors: Faisal Aburub, Wael Hadi
Abstract:
Data mining is the process of extracting useful or hidden information from a large database. Extracted information can be used to discover relationships among features, where data objects are grouped according to logical relationships, or to assign unseen objects to one of the predefined groups. In this paper, we investigate four well-known data mining algorithms for predicting groundwater areas in Jordan. These algorithms are Support Vector Machines (SVMs), Naïve Bayes (NB), K-Nearest Neighbor (kNN) and Classification Based on Association Rule (CBA). The experimental results indicate that the SVMs algorithm outperformed the other algorithms in terms of classification accuracy, precision and F1 evaluation measures, using datasets of groundwater areas that were collected from the Jordanian Ministry of Water and Irrigation.
Keywords: Classification, data mining, evaluation measures, groundwater.
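A sketch of such a comparison on stand-in data with scikit-learn (CBA has no stock implementation there, so only SVM, NB and kNN appear; the metrics and split are illustrative, not the paper's protocol):

```python
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, f1_score, precision_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# stand-in data; the paper used Ministry of Water and Irrigation records
X, y = make_classification(n_samples=400, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, clf in [("SVM", SVC()), ("NB", GaussianNB()),
                  ("kNN", KNeighborsClassifier(n_neighbors=5))]:
    pred = clf.fit(X_tr, y_tr).predict(X_te)
    print(name,
          round(accuracy_score(y_te, pred), 3),
          round(precision_score(y_te, pred), 3),
          round(f1_score(y_te, pred), 3))
```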
8238 Transformation of Industrial Policy towards Industry 4.0 and Its Impact on Firms' Competition
Authors: Arūnas Burinskas
Abstract:
Europe is on the threshold of a new industrial revolution called Industry 4.0. Many believe that it will increase the flexibility of production, the mass adaptation of products to consumers and the speed of their service; it will also improve product quality and dramatically increase productivity. However, all the benefits of Industry 4.0 come with inevitable changes and challenges. One of them is the inevitable transformation of current competition and business models. This article examines the possible results of the conversion of competition from the classic Bertrand and Cournot models to a qualitatively new competition based on innovation. The ability to deliver a new product quickly and the possibility to produce an individual design (through flexible and quickly configurable factories), while reducing equipment failures and increasing process automation and control, are highly important. This study shows that the ongoing transformation of the competition model is changing the game. This, together with the creation of complex value networks, means huge investments that are particularly difficult for small and medium-sized enterprises. In addition, the ongoing digitalization of data raises new concerns regarding legal obligations, intellectual property, and security.
Keywords: Bertrand and Cournot Competition, competition model, Industry 4.0, industrial organization, monopolistic competition.
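For readers unfamiliar with the classic benchmark, the Cournot duopoly equilibrium is a two-line computation; a worked example with made-up demand and cost parameters:

```python
# Cournot duopoly with linear inverse demand P = a - b(q1 + q2) and
# constant marginal costs c1, c2; the textbook equilibrium quantities are
#   qi* = (a - 2*ci + cj) / (3*b)
a, b, c1, c2 = 100.0, 1.0, 20.0, 30.0

q1 = (a - 2 * c1 + c2) / (3 * b)
q2 = (a - 2 * c2 + c1) / (3 * b)
price = a - b * (q1 + q2)

print(q1, q2, price)   # 30.0, 20.0, 50.0
```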
8237 Data Mining on the Router Logs for Statistical Application Classification
Authors: M. Rahmati, S.M. Mirzababaei
Abstract:
With the advance of information technology in the new era, the use of the Internet to access data resources has steadily increased and huge amounts of data have become accessible in various forms. Network providers and agencies naturally seek to prevent electronic attacks that may be harmful or related to terrorist applications, and this has led the authorities to undertake a variety of methods to protect special regions from harmful data. One of the most important approaches is to use a firewall in the network facilities. The main objective of a firewall is to stop the transfer of suspicious packets in several ways. However, because of its blind packet blocking, high processing power requirements and high price, some providers are reluctant to use a firewall. In this paper we propose a method to find a discriminant function that distinguishes between usual packets and harmful ones by statistical processing of the network router logs. By discriminating this data, an administrator may take appropriate action against the user. This method is very fast and can be used simply alongside Internet routers.
Keywords: Data mining, firewall, optimization, packet classification, statistical pattern recognition.
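One way to realize the discriminant function the abstract describes is a Fisher linear discriminant over log-derived packet features; a sketch on synthetic features (the feature choice and threshold rule are assumptions):

```python
import numpy as np

def fisher_discriminant(X_normal, X_harmful):
    """Fit a one-dimensional Fisher linear discriminant that separates
    usual packets from harmful ones in router-log feature space."""
    m0, m1 = X_normal.mean(0), X_harmful.mean(0)
    Sw = np.cov(X_normal.T) + np.cov(X_harmful.T)   # within-class scatter
    w = np.linalg.solve(Sw, m1 - m0)                # projection direction
    # midpoint of the projected class means as the decision threshold
    thr = 0.5 * (X_normal @ w).mean() + 0.5 * (X_harmful @ w).mean()
    return lambda x: x @ w > thr                    # True -> flag as harmful

rng = np.random.default_rng(0)
normal = rng.normal([200, 1.0], 0.5, size=(500, 2))   # toy (size, rate) features
harmful = rng.normal([202, 3.0], 0.5, size=(50, 2))
classify = fisher_discriminant(normal, harmful)
print(classify(np.array([201, 2.8])))                  # likely True
```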
8236 Optimal Choice and Location of Multi-Type FACTS Devices in Deregulated Electricity Market Using Evolutionary Programming Method
Authors: K. Balamurugan, R. Muralisachithanandam, V. Dharmalingam, R. Srikanth
Abstract:
This paper deals with the optimal choice and allocation of multiple FACTS devices in a deregulated power system using the Evolutionary Programming method. The objective is to achieve economic generation allocation and dispatch in a deregulated electricity market. Using the proposed method, the locations of the FACTS devices, their types and their ratings are optimized simultaneously. Different kinds of FACTS devices are simulated in this study, namely UPFC, TCSC, TCPST, and SVC. Simulation results validate the capability of this new approach to minimize the overall system cost function, which includes the investment costs of the FACTS devices and the bid offers of the market participants. The proposed algorithm is an effective and practical method for the choice and allocation of FACTS devices in a deregulated electricity market environment. The standard data of the IEEE 14-bus system have been taken into account and simulated with the aid of MATLAB, and results were obtained.
Keywords: FACTS devices, optimal allocation, deregulated electricity market, evolutionary programming, MATLAB.
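A bare-bones evolutionary programming loop of the kind the paper builds on, with a toy quadratic objective standing in for the system cost function (the population size, mutation scale and encoding are assumptions):

```python
import random

def evolutionary_programming(fitness, dim, bounds, pop=30, gens=200):
    """Bare-bones EP: Gaussian mutation plus truncation selection,
    minimizing `fitness` (a stand-in for the device cost function)."""
    lo, hi = bounds
    P = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        children = [[min(hi, max(lo, x + random.gauss(0, 0.1 * (hi - lo))))
                     for x in ind] for ind in P]
        P = sorted(P + children, key=fitness)[:pop]   # keep the best half
    return P[0]

# toy cost: pick device "ratings" that minimize a quadratic system cost
best = evolutionary_programming(lambda v: sum((x - 0.3) ** 2 for x in v),
                                dim=4, bounds=(0.0, 1.0))
print(best)
```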
8235 Improvement of Data Transfer over Simple Object Access Protocol (SOAP)
Authors: Khaled Ahmed Kadouh, Kamal Ali Albashiri
Abstract:
This paper presents an algorithm that improves the transfer of data over Simple Object Access Protocol (SOAP). The aim of this work is to establish whether using SOAP to exchange XML messages has any added advantages. The results showed that XML messages without SOAP take longer and consume more memory, especially with binary data.
Keywords: JAX-WS, SMTP, SOAP, Web service, XML.
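A quick way to see the kind of overhead being measured: compare the size and encode time of the same payload with and without a SOAP envelope. This only illustrates the measurement; it does not reproduce the paper's result:

```python
import base64
import os
import time

payload = base64.b64encode(os.urandom(50_000)).decode()   # "binary" data
plain = f"<data>{payload}</data>"
soap = ('<soap:Envelope xmlns:soap='
        '"http://schemas.xmlsoap.org/soap/envelope/">'
        f'<soap:Body><data>{payload}</data></soap:Body></soap:Envelope>')

for name, msg in (("plain XML", plain), ("SOAP", soap)):
    t0 = time.perf_counter()
    for _ in range(1000):
        encoded = msg.encode("utf-8")       # stand-in for the send path
    dt = time.perf_counter() - t0
    print(name, len(msg), "bytes,", round(dt, 4), "s")
```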
8234 Numerical Simulations of Flood and Inundation in Jobaru River Basin Using Laser Profiler Data
Authors: Hiroto Nakashima, Toshihiro Morita, Koichiro Ohgushi
Abstract:
Laser Profiler (LP) data from aerial laser surveys have been increasingly used as topographical inputs to numerical simulations of flooding and inundation in river basins. LP data have great potential for reproducing topography, but their effective usage has not yet been fully established. In this study, flooding and inundation are simulated numerically using LP data for the Jobaru River basin of Japan’s Saga Plain. The analysis shows that the topography is reproduced satisfactorily in the computational domain, with urban and agricultural areas requiring different grid sizes. A 2-D numerical simulation shows that flood flow behavior changes as the grid size is varied.
Keywords: LP data, numerical simulation, topological analysis, mesh size.
8233 Channels Splitting Strategy for Optical Local Area Networks of Passive Star Topology
Authors: Peristera Baziana
Abstract:
In this paper, we present a network configuration for WDM LANs of passive star topology in which the set of data WDM channels is split into two separate sets of channels with different access rights over them. In particular, a synchronous transmission WDMA access algorithm is adopted in order to increase the probability of successful transmission over the data channels and, consequently, to reduce the probability of data packet transmission cancellation, thereby avoiding data channel collisions. To this end, a control pre-transmission access scheme is followed over a separate control channel. An analytical Markovian model is studied and the average throughput is mathematically derived. The performance is studied for several numbers of data channels and various values of the control phase duration.
Keywords: Access algorithm, channels division, collisions avoidance, wavelength division multiplexing.
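As a baseline for what the control pre-transmission scheme is designed to beat, a Monte Carlo estimate of per-slot successes when ready nodes pick data channels at random without coordination (all parameters invented):

```python
import random

def throughput(n_nodes, n_channels, p, trials=100_000):
    """Average successful packets per slot when each ready node picks a
    data channel at random (no control-channel coordination)."""
    ok = 0
    for _ in range(trials):
        picks = [random.randrange(n_channels)
                 for _ in range(n_nodes) if random.random() < p]
        # a channel carries a success iff exactly one node picked it
        ok += sum(1 for c in set(picks) if picks.count(c) == 1)
    return ok / trials

print(throughput(n_nodes=20, n_channels=5, p=0.3))
```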
8232 Road Accidents Big Data Mining and Visualization Using Support Vector Machines
Authors: Usha Lokala, Srinivas Nowduri, Prabhakar K. Sharma
Abstract:
Useful information has been extracted from road accident data in the United Kingdom (UK), using data analytics methods, for avoiding possible accidents in rural and urban areas. This analysis makes use of several methodologies such as data integration, support vector machines (SVM), correlation machines and multinomial goodness-of-fit tests. The entire datasets have been imported from the traffic department of the UK with due permission. The information extracted from these huge datasets forms a basis for several predictions, which in turn avoid unnecessary memory lapses. Since the data are expected to grow continuously over a period of time, this work primarily proposes a new framework model which can be trained and adapts itself to new data to make accurate predictions. This work also throws some light on the use of the SVM methodology for text classifiers on the obtained traffic data. Finally, it emphasizes the uniqueness and adaptability of the SVM methodology, which is appropriate for this kind of research work.
Keywords: Road accident, machine learning, support vector machines.
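A sketch of the "adapt itself to new data" requirement using scikit-learn's incrementally trainable linear SVM (a hinge-loss SGDClassifier) on synthetic monthly batches; the features and labels are invented:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# linear SVM trained incrementally, so the model can adapt as fresh
# accident records arrive rather than being refit from scratch
clf = SGDClassifier(loss="hinge")           # hinge loss ~ linear SVM

rng = np.random.default_rng(1)
for month in range(12):                     # simulate monthly data batches
    X = rng.normal(size=(200, 6))           # toy accident features
    y = (X[:, 0] + X[:, 3] > 0).astype(int) # toy severity label
    if month == 0:
        clf.partial_fit(X, y, classes=[0, 1])
    else:
        clf.partial_fit(X, y)

X_test = rng.normal(size=(100, 6))
y_test = (X_test[:, 0] + X_test[:, 3] > 0).astype(int)
print(clf.score(X_test, y_test))
```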
8231 A Testbed for the Experiments Performed in Missing Value Treatments
Authors: Dias de J. C. Lilian, Lobato M. F. Fábio, de Santana L. Ádamo
Abstract:
The occurrence of missing values in databases is a serious problem for Data Mining tasks, responsible for degrading data quality and the accuracy of analyses. In this context, the area lacks standardization for experiments that treat missing values, which makes it difficult to compare evaluations across different studies because common parameters are not used. This paper proposes a testbed intended to facilitate the implementation of experiments and provide unbiased parameters, using available datasets and suitable performance metrics, in order to optimize the evaluation and comparison of state-of-the-art missing value treatments.
Keywords: Data imputation, data mining, missing values treatment, testbed.
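The core loop of such a testbed can be stated compactly: punch controlled holes into a complete dataset, impute, and score against the known ground truth. A minimal sketch (MCAR missingness and RMSE are one possible choice of parameters and metric, not necessarily the authors'):

```python
import numpy as np
from sklearn.impute import SimpleImputer

def evaluate_treatment(X_full, imputer, missing_rate=0.2, seed=0):
    """Punch MCAR holes into a complete dataset, impute, and report RMSE
    against the known ground truth -- the core loop of such a testbed."""
    rng = np.random.default_rng(seed)
    mask = rng.random(X_full.shape) < missing_rate
    X_miss = X_full.copy()
    X_miss[mask] = np.nan
    X_hat = imputer.fit_transform(X_miss)
    return float(np.sqrt(np.mean((X_hat[mask] - X_full[mask]) ** 2)))

X = np.random.default_rng(42).normal(size=(300, 5))
for strategy in ("mean", "median"):
    rmse = evaluate_treatment(X, SimpleImputer(strategy=strategy))
    print(strategy, round(rmse, 3))
```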
8230 Moral Reasoning and Behaviour in Adulthood
Authors: O. Matarazzo, L. Abbamonte, G. Nigro
Abstract:
This study aimed at assessing whether and to what extent moral judgment and behaviour were: 1. situation-dependent; 2. selectively dependent on cognitive and affective components; 3. influenced by gender and age; 4. reciprocally congruent. In order to achieve these aims, four different types of moral dilemmas were constructed and five types of thinking were presented for each of them, representing five possible ways to evaluate the situation. The judgment criteria included selfishness, altruism, sense of justice, and the conflict between selfishness and the two moral issues. The participants were 250 unpaid volunteers (50% male, 50% female) belonging to two age groups: young people and adults. The study entailed a 2 (gender) x 2 (age group) x 5 (type of thinking) x 4 (situation) mixed design: the first two variables were between-subjects, the others were within-subjects. Results have shown that: 1. moral judgment and behaviour are at least partially affected by the type of situation and by interpersonal variables such as gender and age; 2. moral reasoning depends in a similar manner on cognitive and affective factors; 3. there is not a gender polarity between the ethic of justice and the ethic of care/altruism; 4. moral reasoning and behaviour are perceived as reciprocally congruent, even though their congruence decreases with a more objective assessment. These results were discussed in the light of contrasting theories on morality.
Keywords: Contextual-pragmatic approach to morality, ethic of care, ethic of justice, Kohlbergian approach, moral behaviour, moral reasoning.
8229 Optical Limiting Characteristics of Core-Shell Nanoparticles
Authors: G.Vinitha, A.Ramalingam
Abstract:
TiO2 nanoparticles were synthesized by a hydrothermal method at 180°C from a TiOSO4 aqueous solution with 1 mol/L concentration. The obtained products were coated with silica by means of a seeded polymerization technique for a coating time of 1440 minutes to obtain a well-defined TiO2@SiO2 core-shell structure. The uncoated and coated nanoparticles were characterized using X-ray diffraction (XRD) and Fourier transform infrared spectroscopy (FT-IR) to study their physico-chemical properties. Evidence from the XRD and FT-IR results shows that SiO2 is homogeneously coated on the surface of the titania particles. The FT-IR spectra show that there is an interaction between TiO2 and SiO2 which results in the formation of Ti-O-Si chemical bonds at the interface of the TiO2 particles and the SiO2 coating layer. The nonlinear optical limiting properties of TiO2 and TiO2@SiO2 nanoparticles dispersed in ethylene glycol were studied at 532 nm using 5 ns Nd:YAG laser pulses. Three-photon absorption is responsible for the optical limiting characteristics of these nanoparticles, and the optical nonlinearity is enhanced in core-shell structures compared with their single counterparts. This effective three-photon-type absorption at this wavelength has potential application in fabricating optical limiting devices.
Keywords: hydrothermal method, optical limiting devices, seeded polymerization technique, three-photon type absorption
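For context, three-photon absorption is usually modeled by the cubic loss law below (linear absorption neglected), whose closed-form solution shows transmission falling with input intensity; this is the textbook model, not the authors' fitted equation:

```latex
\frac{dI}{dz} = -\gamma\, I^{3}
\qquad\Longrightarrow\qquad
I(L) = \frac{I_0}{\sqrt{1 + 2\gamma L\, I_0^{2}}}
```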
8228 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection
Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón
Abstract:
Structural inspection activities are necessary to ensure the correct functioning of infrastructures. UAV techniques have become more popular than traditional techniques; specifically, UAV photogrammetry allows time and cost savings. The development of this technology has permitted the use of low-cost thermal sensors in UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution. The direct processing of thermal images usually leads to errors and inaccurate results. In this paper, a methodology is proposed for the generation of 3D thermal models using dual sensors, which involves the application of RGB and thermal images in parallel. Hence, the RGB images are used as the basis for generating the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The resulting representations of mining/industrial facilities can be used for inspection activities.
Keywords: Aerial thermography, data processing, drone, low-cost, point cloud.
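A minimal sketch of the projection step: attach to each point of the RGB-derived cloud the temperature of its nearest thermal sample. Nearest-neighbor transfer on points is an assumption made for illustration; the paper's pipeline projects thermal images onto the model:

```python
import numpy as np
from scipy.spatial import cKDTree

def colorize_with_temperature(cloud_xyz, thermal_xyz, thermal_temp):
    """Attach to each RGB-derived point the temperature of its nearest
    thermal sample -- one simple way to project the thermal data."""
    tree = cKDTree(thermal_xyz)
    _, idx = tree.query(cloud_xyz, k=1)
    return thermal_temp[idx]                # one temperature per point

rng = np.random.default_rng(0)
cloud = rng.random((10_000, 3))             # dense geometry from RGB images
thermal_pts = rng.random((500, 3))          # sparse thermal samples
temps = 20 + 15 * rng.random(500)           # degrees C
print(colorize_with_temperature(cloud, thermal_pts, temps)[:5])
```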