Search results for: Data Accessibility
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7520

6080 Direct Measurements of Wind Data over 100 Meters above the Ground in the Site of Lendinara, Italy

Authors: A. Dal Monte, M. Raciti Castelli, G. B. Bellato, L. Stevanato, E. Benini

Abstract:

The wind resource at the Italian site of Lendinara (RO) is analyzed through a systematic anemometric campaign performed on top of the bell tower, at an altitude of over 100 m above the ground. Both the average wind speed and the Weibull distribution are computed. The resulting average wind velocity is in accordance with the numerical predictions of the Italian Wind Atlas, confirming the accuracy of the wind-data extrapolation adopted for evaluating the wind potential at altitudes higher than those of commonly placed measurement stations.
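As a rough illustration of the Weibull fitting step mentioned above, the sketch below fits a two-parameter Weibull distribution to a synthetic wind-speed record with SciPy; the data array and the fixed zero location are assumptions, not the campaign's actual measurements.

```python
import numpy as np
from math import gamma
from scipy import stats

# Synthetic stand-in for the measured wind-speed record (m/s)
rng = np.random.default_rng(0)
wind_speed = 7.0 * rng.weibull(2.0, size=5000)

# Two-parameter Weibull fit (location fixed at zero, as is common for wind data)
shape_k, loc, scale_c = stats.weibull_min.fit(wind_speed, floc=0)
mean_speed = scale_c * gamma(1 + 1 / shape_k)   # mean of the fitted distribution
print(f"k = {shape_k:.2f}, c = {scale_c:.2f} m/s, mean speed = {mean_speed:.2f} m/s")
```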

Keywords: Anemometric campaign, wind resource, Weibull distribution, wind atlas

6079 A Numerical Description of a Fibre Reinforced Concrete Using a Genetic Algorithm

Authors: Henrik L. Funke, Lars Ulke-Winter, Sandra Gelbrich, Lothar Kroll

Abstract:

This work reports an approach for the automatic adaptation of concrete formulations based on genetic algorithms (GA) to optimize a wide range of different fit functions. To achieve this goal, a method was developed that provides a numerical description of a fibre reinforced concrete (FRC) mixture regarding the production technology and the property spectrum of the concrete. In a first step, the FRC mixture with seven fixed components was characterized by varying the amounts of the components. For that purpose, ten concrete mixtures were prepared and tested. The testing procedure comprised flow spread, compressive and bending tensile strength. The analysis and approximation of the determined data were carried out by GAs. The aim was to obtain a closed mathematical expression that best describes the given seven-point cloud of FRC by applying a Gene Expression Programming with Free Coefficients (GEP-FC) strategy. The seven-parametric FRC-mixture model generated according to this method correlated well with the measured data. The developed procedure can be used to find closed mathematical expressions for concrete mixtures based on measured data.

Keywords: Concrete design, fibre reinforced concrete, genetic algorithms, GEP-FC.

6078 Applications of Stable Distributions in Time Series Analysis, Computer Sciences and Financial Markets

Authors: Mohammad Ali Baradaran Ghahfarokhi, Parvin Baradaran Ghahfarokhi

Abstract:

In this paper, we first introduce the stable distribution, the stable process and their characteristics. The α-stable distribution family has received great interest in the last decade due to its success in modeling data that are too impulsive to be accommodated by the Gaussian distribution. In the second part, we present major applications of the α-stable distribution in telecommunications, computer science (such as network delays and signal processing) and financial markets. Finally, we focus on using the stable distribution to estimate a measure of risk in stock markets and illustrate it with simulated data and statistical software.
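A minimal sketch of the risk-estimation idea, assuming synthetic returns and SciPy's levy_stable distribution (the parameter fit can be slow on large samples); it is not the authors' implementation.

```python
import numpy as np
from scipy import stats

# Synthetic daily returns drawn from a heavy-tailed stable law (stand-in for market data)
returns = stats.levy_stable.rvs(alpha=1.7, beta=0.0, loc=0.0, scale=0.01,
                                size=2000, random_state=0)

# Fit the four stable parameters from the simulated sample
alpha, beta, loc, scale = stats.levy_stable.fit(returns)

# 99% Value-at-Risk as the (negated) 1% quantile of the fitted distribution
var_99 = -stats.levy_stable.ppf(0.01, alpha, beta, loc=loc, scale=scale)
print(f"fitted alpha = {alpha:.2f}, 99% VaR = {var_99:.4f}")
```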

Keywords: Stable distribution, SαS, infinite variance, heavy tail networks, VaR.

6077 Model-Free Distributed Control of Dynamical Systems

Authors: Javad Khazaei, Rick S. Blum

Abstract:

Distributed control is an efficient and flexible approach for the coordination of multi-agent systems. One of the main challenges in designing a distributed controller is identifying the governing dynamics of the dynamical systems. Data-driven system identification is currently undergoing a revolution. With the availability of high-fidelity measurements and historical data, model-free identification of dynamical systems can facilitate the control design without tedious modeling of high-dimensional and/or nonlinear systems. This paper develops a distributed control design using consensus theory for linear and nonlinear dynamical systems using sparse identification of system dynamics. Compared with existing consensus designs that rely heavily on knowing the detailed system dynamics, the proposed model-free design can accurately capture the dynamics of the system with available measurement and input data and provide guaranteed performance in consensus and tracking problems. Heterogeneous damped oscillators are chosen as examples of dynamical systems for validation purposes.
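The sparse-identification step can be illustrated with a sequentially thresholded least-squares regression on a simulated damped oscillator; the candidate-term library, the 0.05 threshold and the Euler integration below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Simulate a damped oscillator  x'' + 2*zeta*wn*x' + wn^2*x = 0  as a first-order system
wn, zeta, dt = 2.0, 0.1, 0.01
t = np.arange(0, 10, dt)
A_true = np.array([[0.0, 1.0], [-wn**2, -2 * zeta * wn]])
X = np.zeros((len(t), 2)); X[0] = [1.0, 0.0]
for k in range(len(t) - 1):                      # simple forward-Euler integration
    X[k + 1] = X[k] + dt * (A_true @ X[k])

dX = np.gradient(X, dt, axis=0)                  # numerical derivatives from "measurements"
# Candidate library: linear and quadratic terms in the two states
Theta = np.column_stack([X[:, 0], X[:, 1], X[:, 0]**2, X[:, 0] * X[:, 1], X[:, 1]**2])

# Sequentially thresholded least squares, the core of sparse identification of dynamics
Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
for _ in range(10):
    Xi[np.abs(Xi) < 0.05] = 0.0
    for j in range(dX.shape[1]):
        big = np.abs(Xi[:, j]) >= 0.05
        if big.any():
            Xi[big, j] = np.linalg.lstsq(Theta[:, big], dX[:, j], rcond=None)[0]
print(Xi)   # nonzero entries should recover the linear oscillator dynamics
```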

Keywords: Consensus tracking, distributed control, model-free control, sparse identification of dynamical systems.

6076 Improving Classification in Bayesian Networks using Structural Learning

Authors: Hong Choon Ong

Abstract:

Naïve Bayes classifiers are simple probabilistic classifiers. Classification extracts patterns from a data file with a set of labeled training examples and is currently one of the most significant areas in data mining. However, Naïve Bayes assumes independence among the features. Structural learning among the features therefore helps in the classification problem. In this study, the use of structural learning in Bayesian networks is proposed for cases where there are relationships between the features when using Naïve Bayes. The improvement in classification using structural learning is shown when relationships exist between the features, i.e., when they are not independent.
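To see why the independence assumption matters, the hedged sketch below compares Gaussian Naïve Bayes with a classifier that models the feature covariance on strongly correlated synthetic features; LDA stands in here for the paper's Bayesian-network structural learning, and the data are invented.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2000
cov = [[1.0, 0.95], [0.95, 1.0]]                 # two strongly correlated features
X0 = rng.multivariate_normal([0, 0], cov, size=n)
X1 = rng.multivariate_normal([1, 0], cov, size=n)
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

# Naive Bayes ignores the correlation; a model of the joint structure does much better.
print("GaussianNB:", cross_val_score(GaussianNB(), X, y, cv=5).mean().round(3))
print("LDA       :", cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean().round(3))
```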

Keywords: Bayesian Network, Classification, Naïve Bayes, Structural Learning.

6075 Net Fee and Commission Income Determinants of European Cooperative Banks

Authors: Karolína Vozková, Matěj Kuc

Abstract:

Net fee and commission income is one of the key elements of a bank's core income. In the current low-interest-rate environment, this type of income is gaining importance relative to net interest income. This paper analyses the effects of bank- and country-specific determinants of net fee and commission income on a set of cooperative banks from European countries in the 2007-2014 period. To do so, dynamic panel data methods (system Generalized Method of Moments) were employed. Subsequently, alternative panel data methods were run as robustness checks of the analysis. A strong positive impact of bank concentration on the share of net fee and commission income was found, which proves that cooperative banks tend to display a higher share of fee income in less competitive markets. This is probably connected with the fact that they stick with their traditional deposit-taking and loan-providing model, and fees on these services are driven down by competitors. Moreover, compared to commercial banks, cooperatives do not expand heavily into non-traditional fee-bearing services under competition, and their overall fee income share is therefore decreasing with the increased competitiveness of the sector.

Keywords: Cooperative banking, dynamic panel data models, net fee, commission income, system GMM.

6074 Voltage Problem Location Classification Using Performance of Least Squares Support Vector Machine LS-SVM and Learning Vector Quantization LVQ

Authors: Khaled Abduesslam. M, Mohammed Ali, Basher H Alsdai, Muhammad Nizam, Inayati

Abstract:

This paper presents voltage problem location classification using the performance of Least Squares Support Vector Machine (LS-SVM) and Learning Vector Quantization (LVQ) in an electrical power system, implemented on the IEEE 39-bus New England system. The data were collected from time-domain simulation using the Power System Analysis Toolbox (PSAT). Outputs from the simulation, such as voltage, phase angle, real power and reactive power, were taken as inputs to estimate voltage stability at particular buses based on the Power Transfer Stability Index (PTSI). The simulation was carried out on the IEEE 39-bus test system by considering increased bus loads on the system. To verify the proposed LS-SVM, its performance was compared to Learning Vector Quantization (LVQ). The results showed that LS-SVM is faster and better than LVQ: LS-SVM achieved 0% misclassification whereas LVQ had 7.69% misclassification.
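A minimal NumPy sketch of the LS-SVM classifier (Suykens formulation, solved as a single linear system) on toy data; the RBF kernel width, the regularization value and the synthetic features are assumptions, not the PSAT-derived dataset.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Least Squares SVM classifier: solve one (n+1)x(n+1) linear system."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = y, y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.r_[0.0, np.ones(n)])
    return sol[0], sol[1:]                       # bias b, multipliers alpha

def lssvm_predict(X_train, y_train, b, alpha, X_new, sigma=1.0):
    K = rbf_kernel(X_new, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)

# Toy stand-in for bus features (voltage, angle, P, Q) and binary stability labels
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 4))
y = np.sign(X[:, 0] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=120))
b, alpha = lssvm_train(X[:100], y[:100])
acc = (lssvm_predict(X[:100], y[:100], b, alpha, X[100:]) == y[100:]).mean()
print("hold-out accuracy:", acc)
```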

Keywords: IEEE 39 bus, Least Squares Support Vector Machine, Learning Vector Quantization, Voltage Collapse.

6073 ANN Based Model Development for Material Removal Rate in Dry Turning in Indian Context

Authors: Mangesh R. Phate, V. H. Tatwawadi

Abstract:

This paper develops an artificial neural network (ANN) based model of material removal rate (MRR) in the turning of ferrous and nonferrous materials in an Indian small-scale industry. The MRR of the formulated model was validated with the testing data, and the ANN model was developed for the analysis and prediction of the relationship between the input and output parameters during the turning of ferrous and nonferrous materials. The input parameters of this model are the operator, work-piece, cutting process, cutting tool, machine and environment.

The ANN model consists of a three-layered feedforward back-propagation neural network. The network is trained with pairs of independent/dependent datasets generated when machining ferrous and nonferrous material. A very good performance of the neural network, in terms of agreement with experimental data, was achieved. The model may be used for testing and forecasting the complex relationship between the dependent and independent parameters in turning operations.
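As a hedged sketch of such an ANN regression model, the code below trains a small feedforward network on synthetic turning data with scikit-learn; the three cutting parameters and the synthetic MRR values are illustrative assumptions, not the paper's field data, which also cover operator, work-piece, tool, machine and environment factors.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical turning data: [cutting speed (m/min), feed (mm/rev), depth of cut (mm)]
rng = np.random.default_rng(0)
X = rng.uniform([50, 0.05, 0.5], [250, 0.4, 3.0], size=(300, 3))
mrr = X[:, 0] * X[:, 1] * X[:, 2] * 1000 * (1 + 0.05 * rng.normal(size=300))  # synthetic MRR

X_train, X_test, y_train, y_test = train_test_split(X, mrr, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(r2_score(y_test, model.predict(X_test)), 3))
```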

Keywords: Field data based model, Artificial neural network, Simulation, Conventional Turning, Material removal rate.

6072 Electricity Load Modeling: An Application to Italian Market

Authors: Giovanni Masala, Stefania Marica

Abstract:

Forecasting electricity load plays a crucial role in decision making and planning for economic purposes. Moreover, in the light of the recent privatization and deregulation of the power industry, forecasting future electricity load has turned out to be a very challenging problem. Empirical data about electricity load highlight a clear seasonal behavior (higher load during the winter season), which is partly due to climatic effects. We also emphasize the presence of load periodicity on a weekly basis (electricity load is usually lower on weekends or holidays) and on a daily basis (electricity load is clearly influenced by the hour). Finally, a long-term trend may depend on the general economic situation (for example, industrial production affects electricity load). All these features must be captured by the model. The purpose of this paper is therefore to build an hourly electricity load model. The deterministic component of the model requires non-linear regression and Fourier series, while we investigate the stochastic component through econometric tools. The calibration of the model's parameters is performed using data from the Italian market over a six-year period (2007-2012). We then perform a Monte Carlo simulation in order to compare the simulated data with the real data (both in-sample and out-of-sample inspection). The reliability of the model is confirmed by standard tests, which highlight a good fit of the simulated values.
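The deterministic component described above can be sketched as a least-squares fit of a trend plus a few Fourier harmonics; the synthetic hourly series and the number of harmonics are assumptions, and the residual would then be handed to an econometric model such as ARMA-GARCH, which is not shown.

```python
import numpy as np
import pandas as pd

# Synthetic hourly load series standing in for the Italian market data
idx = pd.date_range("2010-01-01", periods=24 * 365, freq="h")
hours = np.arange(len(idx))
load = (1000 + 150 * np.sin(2 * np.pi * hours / 24)          # daily cycle
        + 80 * np.sin(2 * np.pi * hours / (24 * 7))          # weekly cycle
        + 200 * np.sin(2 * np.pi * hours / (24 * 365))       # yearly cycle
        + 30 * np.random.default_rng(0).normal(size=len(idx)))

def fourier_terms(t, period, k):
    """Sine/cosine pairs for the first k harmonics of a given period."""
    return np.column_stack([f(2 * np.pi * i * t / period)
                            for i in range(1, k + 1) for f in (np.sin, np.cos)])

design = np.column_stack([np.ones_like(hours), hours,
                          fourier_terms(hours, 24, 2),
                          fourier_terms(hours, 24 * 7, 2),
                          fourier_terms(hours, 24 * 365, 2)])
coef, *_ = np.linalg.lstsq(design, load, rcond=None)
residual = load - design @ coef        # stochastic component, to be modelled separately
print("std of residual:", residual.std().round(2))
```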

Keywords: ARMA-GARCH process, electricity load, fitting tests, Fourier series, Monte Carlo simulation, non-linear regression.

6071 An Approach for Coagulant Dosage Optimization Using Soft Jar Test: A Case Study of Bangkhen Water Treatment Plant

Authors: Ninlawat Phuangchoke, Waraporn Viyanon, Setta Sasananan

Abstract:

The most important stage of the water treatment process is coagulation, which uses alum and poly aluminum chloride (PACL); therefore, determining the correct dosages of alum and PACL is essential. This research applies an artificial neural network (ANN), trained with the Levenberg-Marquardt algorithm, to create a mathematical model (Soft Jar Test) for predicting the chemical doses used for coagulation, namely alum and PACL, with input data consisting of turbidity, pH, alkalinity, conductivity, and oxygen consumption (OC) of the Bangkhen Water Treatment Plant (BKWTP), under the authority of the Metropolitan Waterworks Authority of Thailand. The data were collected from 1 January 2019 to 31 December 2019 in order to cover the changing seasons of Thailand. The input data of the ANN are divided into three groups: training set, test set, and validation set. The coefficient of determination and the mean absolute error are 0.73 and 3.18 for the alum model and 0.59 and 3.21 for the PACL model, respectively.

Keywords: Soft jar test, jar test, water treatment plant process, artificial neural network.

6070 Distributional Impacts of Changes in Value Added Tax Rates in the Czech Republic

Authors: Ondřej Bayer

Abstract:

The paper evaluates the ongoing reform of VAT in the Czech Republic in terms of its impacts on individual households. The main objective is to analyse the impact of the given changes on individual households. The adopted method is based on data related to household consumption by individual household quintiles; the obtained data are subjected to micro-simulation analysis. Results are discussed in terms of vertical tax justice. The analysis reveals that VAT behaves regressively, and a sole consolidation of rates at a higher level only increases the regressivity of this tax in the Czech Republic.
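A toy micro-simulation of the kind described above, assuming hypothetical quintile consumption and income figures and illustrative reduced/standard rates (10%/21%) consolidated to 17.5%; it only demonstrates how the VAT burden per quintile is computed before and after a rate change, not the study's actual data.

```python
import pandas as pd

# Hypothetical consumption by household quintile and commodity group (currency units per year)
spend = pd.DataFrame({
    "quintile":    [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "category":    ["food", "other"] * 5,
    "consumption": [60_000, 40_000, 70_000, 60_000, 80_000, 90_000,
                    90_000, 130_000, 100_000, 200_000],
    "income":      [120_000] * 2 + [180_000] * 2 + [240_000] * 2 + [320_000] * 2 + [500_000] * 2,
})
rates_before = {"food": 0.10, "other": 0.21}     # assumed reduced/standard rates
rates_after  = {"food": 0.175, "other": 0.175}   # consolidation at a single higher rate

for name, rates in [("before", rates_before), ("after", rates_after)]:
    vat = spend["consumption"] * spend["category"].map(rates)
    # VAT paid as a share of income, by quintile: a falling share signals regressivity
    burden = vat.groupby(spend["quintile"]).sum() / spend.groupby("quintile")["income"].first()
    print(name, (100 * burden).round(2).to_dict())
```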

Keywords: Consolidation of rates, household quintiles, tax impact, VAT.

6069 An Efficient Iterative Updating Method for Damped Structural Systems

Authors: Jiashang Jiang

Abstract:

Model updating is an inverse eigenvalue problem which concerns the modification of an existing but inaccurate model with measured modal data. In this paper, an efficient gradient-based iterative method for updating the mass, damping and stiffness matrices simultaneously, using a few complex measured modal data, is developed. Convergence analysis indicates that the iterative solutions always converge to the unique minimum-Frobenius-norm symmetric solution of the model updating problem when a special kind of initial matrices is chosen.

Keywords: Model updating, iterative algorithm, damped structural system, optimal approximation.

6068 Potential of Solar Energy in Zarqa Region

Authors: Ali M. Jawarneh, Ahmad S. AL-Shyyab

Abstract:

The purpose of this work is to present the potential of solar energy in the Zarqa region. The solar radiation throughout 2009 was obtained from a pyranometer, which measures the global radiation over horizontal surfaces. Solar radiation data are presented in several different forms: over 5-minute periods, hour by hour, daily and monthly. Briefly, the yearly global solar radiation in Zarqa is 7297.5 MJ/m² (2027 kWh/m²) and the annual average daily solar radiation is 20 MJ/m² (5.5 kWh/m²). More specifically, the average daily solar radiation is 12.9 MJ/m² (3.57 kWh/m²) in winter and 25 MJ/m² (7 kWh/m²) in summer.
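A hedged sketch of how 5-minute pyranometer readings can be aggregated into daily and monthly radiation totals with pandas; the synthetic irradiance series below is only a stand-in for the 2009 measurements.

```python
import numpy as np
import pandas as pd

# Synthetic 5-minute global radiation record (W/m^2) for one year
idx = pd.date_range("2009-01-01", "2009-12-31 23:55", freq="5min")
doy = idx.dayofyear.values
hour = idx.hour.values + idx.minute.values / 60
rng = np.random.default_rng(0)
ghi = (np.clip(800 * np.sin(np.pi * (hour - 6) / 12), 0, None)   # diurnal shape
       * (0.75 + 0.25 * np.sin(2 * np.pi * (doy - 80) / 365))    # seasonal modulation
       * (0.8 + 0.2 * rng.random(len(idx))))                     # cloud/noise factor
rad = pd.Series(ghi, index=idx)

# Energy per day in MJ/m^2: integrate W/m^2 over 5-minute (300 s) intervals
daily_MJ = rad.resample("D").sum() * 300 / 1e6
monthly_MJ = daily_MJ.resample("MS").sum()
print("annual total (MJ/m^2):", round(daily_MJ.sum(), 1))
print("average per day (MJ/m^2):", round(daily_MJ.mean(), 1))
```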

Keywords: Solar Energy, Pyranometer, Zarqa Region

6067 Applications of Drones in Infrastructures: Challenges and Opportunities

Authors: Jin Fan, M. Ala Saadeghvaziri

Abstract:

Unmanned aerial vehicles (UAVs), also referred to as drones, equipped with various kinds of advanced detecting or surveying systems, are effective and low-cost in data acquisition, data delivery and sharing, which can benefit the building of infrastructure. This paper gives an overview of applications of drones in the planning, design, construction and maintenance of infrastructure. The drone platform, detecting and surveying systems, and post-data processing systems are introduced, followed by cases with details of the applications. Challenges from different aspects are addressed. Opportunities for drones in infrastructure include, but are not limited to, the following. Firstly, UAVs equipped with high-definition cameras or other detecting equipment are capable of inspecting hard-to-reach infrastructure assets. Secondly, UAVs can be used as effective tools to survey and map the landscape to collect necessary information before infrastructure construction. Furthermore, a single UAV or multiple UAVs are useful in construction management. UAVs can also be used to collect road and building information by taking high-resolution photos for future infrastructure planning. UAVs can be used to provide reliable and dynamic traffic information, which is potentially helpful in building smart cities. The main challenges are: limited flight time, signal robustness, post-data analysis, multi-drone collaboration, weather conditions, and distractions to traffic caused by drones. This paper aims to help owners, designers, engineers and architects improve the building process of infrastructure for higher efficiency and better performance.

Keywords: Bridge, construction, drones, infrastructure, information.

6066 Predication Model for Leukemia Diseases Based on Data Mining Classification Algorithms with Best Accuracy

Authors: Fahd Sabry Esmail, M. Badr Senousy, Mohamed Ragaie

Abstract:

In recent years, there has been an explosion in the use of technology that helps discover diseases. For example, DNA microarrays allow us for the first time to obtain a "global" view of the cell. They have great potential to provide accurate medical diagnosis and to help in finding the right treatment and cure for many diseases. Various classification algorithms can be applied to such microarray datasets to devise methods that can predict the occurrence of leukemia. In this study, we compared the classification accuracy and response time among eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree produces better results and takes the lowest time to build a model among the tree classifiers. Among the classification rule algorithms, the nearest-neighbor-like algorithm (NNge) is the best due to its high accuracy and the lowest model-building time.

Keywords: Data mining, classification techniques, decision tree, classification rule, leukemia diseases, microarray data.

6065 A Semantic Web Based Ontology in the Financial Domain

Authors: S. Banerjee

Abstract:

The paper describes the design of an ontology in the financial domain for mutual funds. The design of this ontology consists of four steps, namely specification, knowledge acquisition, implementation and semantic query. Specification includes a description of the taxonomy, the different types of mutual funds and their scope. Knowledge acquisition involves information extraction from heterogeneous resources. Implementation describes the conceptualization and encoding of this data. Finally, semantic query permits complex queries over the integrated data by mapping database entities to ontological concepts.
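The semantic-query step can be sketched with rdflib and SPARQL; the namespace, the fund instances and the expenseRatio property below are hypothetical and do not reflect the paper's ontology.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/funds#")   # hypothetical namespace
g = Graph()
g.add((EX.AlphaGrowth, RDF.type, EX.EquityFund))
g.add((EX.AlphaGrowth, EX.expenseRatio, Literal(0.45)))
g.add((EX.SafeBond, RDF.type, EX.DebtFund))
g.add((EX.SafeBond, EX.expenseRatio, Literal(0.20)))

# A semantic query: equity funds with an expense ratio below 1.0
results = g.query("""
    PREFIX ex: <http://example.org/funds#>
    SELECT ?fund ?ratio WHERE {
        ?fund a ex:EquityFund ;
              ex:expenseRatio ?ratio .
        FILTER (?ratio < 1.0)
    }
""")
for fund, ratio in results:
    print(fund, ratio)
```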

Keywords: Ontology, Semantic Web, Mutual Funds.

6064 A Frugal Bidding Procedure for Replicating WWW Content

Authors: Samee Ullah Khan, C. Ardil

Abstract:

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, to certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while having a controlling hand over them. In essence, the proposed game-theory-based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize the overall performance. A bidding mechanism asks how one can design systems so that the agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions and genetic algorithms.

Keywords: Internet, data content replication, static allocation, mechanism design, equilibrium.

6063 Developmental Differences in the Construction of Concepts by Children from 3 to 14-Year-Olds: Perception, Language and Instruction

Authors: Mehmet Ozcan

Abstract:

This study was designed to investigate the relationship between language and children's construction of the concepts of objects, actions, and states. The participants of this study are 120 children whose ages range from 3 to 14 years. Ten children participated from each age group, and 10 adults participated as a normative group. Data were collected using 28 words which were identified and grouped according to the purpose of this study. Participants were asked the question "What is x?" for each word in a reserved room. The audio-recorded data were transcribed and coded. The data were analyzed primarily qualitatively, but quantitatively as well to support the qualitative findings. The findings reveal that younger children rely more on their perceptual experience and linguistic input, while 7-year-olds and older children rely more on instructional language in the construction of concepts related to objects, actions and states. Adults differ from all age groups in their use of metaphors to refer to objects. It has been noted that linguistic, perceptual and instructional experiences work in an interwoven way, but each one seems to be dominant at certain ages.

Keywords: Cognition, concept construction, first language acquisition, language, thought.

6062 Detailed Phenomenological Study of 14N Elastically Scattered on 12C in a wide Energy Range

Authors: Sh. Hamada, N. Burtebayev, N. Amangeldi, A. Amar

Abstract:

An experiment was performed with a 24.5 MeV 14N beam on a 12C target at the DC-60 cyclotron located in Astana, Kazakhstan, to study the elastic scattering of 14N on 12C; the scattering was also analyzed at different energies to track the phenomenon of remarkable structure at large angles. Its aims were to extend the measurements to very large angles and to attempt to uniquely identify the elastic scattering potential. Good agreement between the theoretical and experimental data has been obtained with suitable optical potential parameters. Optical model calculations with l-dependent imaginary potentials were also applied to the data, and relatively good agreement was found.

Keywords: Optical Potential Codes, Elastic Scattering, SPIVAL Code.

6061 Investigating Technical and Pedagogical Considerations in Producing Screen Recorded Videos

Authors: M. Nikafrooz, J. Darsareh

Abstract:

Due to the COVID-19 pandemic, its impact on education all over the world, and the problems arising from the use of traditional methods in education during the pandemic, it was necessary to apply alternative solutions to achieve educational goals. In this regard, electronic content production through screen recording became popular among many teachers. However, the production of screen-recorded videos requires special technical and pedagogical considerations. The purpose of this study was to extract and present the technical and pedagogical considerations for producing screen-recorded videos, to provide a useful and comprehensive guideline for e-content producers. This was applied research with a descriptive design, and data collection was carried out using a qualitative method. In order to collect the data, 524 previously produced screen-recorded videos were evaluated using an open-ended questionnaire. After collecting the data, they were categorized, and finally 83 technical and pedagogical considerations, grouped into 5 domains, were determined. Applying these considerations is expected to decrease production and editing time, increase technical and pedagogical quality, and ultimately facilitate and enhance the processes of teaching and learning.

Keywords: E-learning, e-content, screen-recorded videos, screen recording software, technical and pedagogical considerations.

6060 A World Map of Seabed Sediment Based on 50 Years of Knowledge

Authors: T. Garlan, I. Gabelotaud, S. Lucas, E. Marchès

Abstract:

Production of a global sedimentological seabed map was initiated in 1995 to provide the necessary tool for searches of aircraft and boats lost at sea, to give sedimentary information for nautical charts, and to provide input data for acoustic propagation modelling. This original approach had already been initiated one century ago, when the French hydrographic service and the University of Nancy produced maps of the distribution of marine sediments of the French coasts and then sediment maps of the continental shelves of Europe and North America. The current map of ocean sediments presented here was initiated from UNESCO's general map of the deep ocean floor. This map was adapted using a unique sediment classification to present all types of sediments: from beaches to the deep seabed and from glacial deposits to tropical sediments. In order to allow good visualization and to be adapted to the different applications, only the granularity of sediments is represented. The published seabed maps are studied; if they present an interest, the nature of the seabed is extracted from them, the sediment classification is transcribed and the resulting map is integrated into the world map. Data also come from interpretations of Multibeam Echo Sounder (MES) imagery from large deep-ocean hydrographic surveys. These allow very high-quality mapping of areas that until then were represented as homogeneous. The third and principal source of data comes from the integration of regional maps produced specifically for this project. These regional maps are produced using all the bathymetric and sedimentary data of a region. This step makes it possible to produce a regional synthesis map, with generalizations in the case of over-precise data. 86 regional maps of the Atlantic Ocean, the Mediterranean Sea, and the Indian Ocean have been produced and integrated into the world sedimentary map. This work is ongoing and allows a new digital version every two years, with the integration of some new maps. This article describes the choices made in terms of sediment classification, the scale of the source data and the zonation of the variability of the quality. This map is the final step in a system comprising the Shom Sedimentary Database, enriched by more than one million point and surface data items, and four series of coastal seabed maps at 1:10,000, 1:50,000, 1:200,000 and 1:1,000,000. This step-by-step approach makes it possible to take into account the progress in knowledge made in the field of seabed characterization during the last decades. Thus, the arrival of new classification systems for the seafloor has improved the recent seabed maps, and the compilation of these new maps with those previously published allows a gradual enrichment of the world sedimentary map. However, there is still a lot of work to do to enhance some regions, which are still based on data acquired more than half a century ago.

Keywords: Marine sedimentology, seabed map, sediment classification, World Ocean.

6059 Context-Aware Querying in Multimedia Databases – A Futuristic Approach

Authors: Nadeem Iftikhar, Zouhaib Zafar, Shaukat Ali

Abstract:

Efficient retrieval of multimedia objects has gained enormous focus in recent years. A number of techniques have been suggested for the retrieval of textual information; however, relatively little has been suggested for the efficient retrieval of multimedia objects. In this paper, we propose a generic architecture for context-aware retrieval of multimedia objects. The proposed framework combines the well-known approaches of text-based retrieval and context-aware retrieval to formulate an architecture for accurate retrieval of multimedia data.

Keywords: Context-aware retrieval, information retrieval, multimedia databases, multimedia data.

6058 Measurement of Operational and Environmental Performance of the Coal-Fired Power Plants in India by Using Data Envelopment Analysis

Authors: Vijay Kumar Bajpai, Sudhir Kumar Singh

Abstract:

In this study, performance analyses of twenty-five coal-fired power plants (CFPPs) used for electricity generation are carried out through various Data Envelopment Analysis (DEA) models. Three efficiency indices are defined and pursued. In the calculation of operational performance, energy and non-energy variables are used as inputs and net electricity produced is used as the desired output (Model-1). CO2 emitted to the environment is used as the undesired output (Model-2) in the computation of pure environmental performance, while in Model-3 CO2 emissions are considered as a detrimental input in the calculation of operational and environmental performance. Empirical results show that most of the plants are operating in the increasing-returns-to-scale region and that the Mettur plant is efficient with regard to energy use and the environment. The results also indicate that the undesirable-output effect is insignificant in the research sample. The present study will provide clues to plant operators towards raising the operational and environmental performance of CFPPs.
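A minimal sketch of an input-oriented CCR DEA model solved as a linear program with SciPy; the four-plant input/output table is invented for illustration and does not reproduce the study's 25-plant dataset or its undesirable-output models.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: rows = plants (DMUs), inputs = [coal use, capacity], output = net electricity
X = np.array([[3.2, 500], [4.1, 650], [2.8, 480], [5.0, 700]], dtype=float)   # inputs
Y = np.array([[1400], [1500], [1350], [1600]], dtype=float)                    # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o via the envelopment LP."""
    c = np.r_[1.0, np.zeros(n)]                  # minimise theta; variables = [theta, lambdas]
    # sum_j lambda_j x_ij <= theta * x_io   ->   -x_io*theta + sum_j lambda_j x_ij <= 0
    A_in = np.c_[-X[o].reshape(-1, 1), X.T]
    # sum_j lambda_j y_rj >= y_ro           ->   -sum_j lambda_j y_rj <= -y_ro
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```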

Keywords: Coal fired power plants, environmental performance, data envelopment analysis, operational performance.

6057 Identification of Coauthors in Scientific Database

Authors: Thiago M. R Dias, Gray F. Moita

Abstract:

The analysis of scientific collaboration networks has contributed significantly to improving the understanding of how the process of collaboration between researchers takes place and of how the scientific production of researchers or research groups evolves. However, the identification of collaborations in large scientific databases is not a trivial task, given the high computational cost of the methods commonly used. This paper proposes a method for identifying collaborations in a large database of researcher curricula. The proposed method has low computational cost and gives satisfactory results, proving to be an interesting alternative for the modeling and characterization of large scientific collaboration networks.
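The pairwise counting at the heart of building a co-authorship network can be sketched as below; the author lists are invented, and the paper's contribution of doing this efficiently at scale is not reproduced here.

```python
from collections import Counter
from itertools import combinations

# Hypothetical records: each entry lists the authors of one publication (e.g. parsed from curricula)
publications = [
    ["A. Silva", "B. Souza"],
    ["A. Silva", "C. Lima", "B. Souza"],
    ["B. Souza", "D. Costa"],
]

pair_counts = Counter()
for authors in publications:
    for a, b in combinations(sorted(set(authors)), 2):   # each unordered pair once per paper
        pair_counts[(a, b)] += 1

# Edge list of the collaboration network, weighted by number of joint publications
for (a, b), weight in pair_counts.most_common():
    print(f"{a} -- {b}: {weight} joint publication(s)")
```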

Keywords: Extraction and data integration, Information Retrieval, Scientific Collaboration.

6056 Performance Comparison of Cooperative Banks in the EU, USA and Canada

Authors: Matěj Kuc

Abstract:

This paper compares different profitability measures of cooperative banks from two developed regions: the European Union, and the United States of America together with Canada. We created a balanced dataset of more than 200 cooperative banks covering the 2011-2016 period. We ran a series of tests and random effects estimations on the panel data. We found that American and Canadian cooperatives are more profitable in terms of return on assets (ROA) and return on equity (ROE). There is no significant difference in net interest margin (NIM). Our results show that the North American cooperative banks have adapted better to the current market environment.
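A hedged sketch of a random effects estimation on a balanced bank panel using the linearmodels package; the variable names and the synthetic data are assumptions, not the study's dataset or full specification.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import RandomEffects

# Hypothetical balanced panel of cooperative banks, 2011-2016
rng = np.random.default_rng(0)
banks, years = [f"bank{i}" for i in range(50)], list(range(2011, 2017))
idx = pd.MultiIndex.from_product([banks, years], names=["bank", "year"])
df = pd.DataFrame({
    "size":         rng.normal(10, 1, len(idx)),                    # e.g. log total assets
    "equity_ratio": rng.normal(8, 2, len(idx)),
    "region_us":    np.repeat(rng.integers(0, 2, len(banks)), len(years)),  # US/Canada dummy
}, index=idx)
df["roa"] = (0.2 + 0.05 * df["size"] + 0.03 * df["equity_ratio"]
             + 0.3 * df["region_us"] + rng.normal(0, 0.2, len(idx)))

exog = df[["size", "equity_ratio", "region_us"]].assign(const=1.0)
res = RandomEffects(df["roa"], exog).fit()
print(res.params)
```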

Keywords: Cooperative banking, panel data, profitability measures, random effects.

6055 Road Safety in Great Britain: An Exploratory Data Analysis

Authors: Jatin Kumar Choudhary, Naren Rayala, Abbas Eslami Kiasari, Fahimeh Jafari

Abstract:

Great Britain has one of the safest road networks in the world. However, the consequences of any death or serious injury are devastating for loved ones, as well as for those who help the severely injured. This paper aims to analyse Great Britain's road safety situation and show response measures for areas where the total damage caused by accidents can be significantly and quickly reduced. Although the UK has had a good record in reducing fatalities over the past 30 years, there is still a considerable number of road deaths. The government continues to work to reduce road deaths by empowering responsible road users and by identifying and addressing the factors that make the roads less safe. This study presents an exploratory analysis that could provide policy makers with invaluable insights into how accidents happen and how they can be mitigated. We use the STATS19 data published by the UK government. Since we need more information about locations than is provided in STATS19, we first expand the features of the dataset using OpenStreetMap and Visual Crossing. This paper also provides a discussion of new road safety methods.
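A possible starting point for such an exploratory analysis, assuming the STATS19 collision CSV has been downloaded; the file name and column names are assumptions based on the published schema and should be verified against the actual data.

```python
import pandas as pd

# Assumed file name for a Department for Transport STATS19 collision extract
df = pd.read_csv("dft-road-casualty-statistics-collision-2021.csv", low_memory=False)

# Collisions by severity (assumed coding: 1 = fatal, 2 = serious, 3 = slight)
print(df["accident_severity"].value_counts().sort_index())

# Collisions by hour of day, using the 'time' field (HH:MM)
hour = pd.to_datetime(df["time"], format="%H:%M", errors="coerce").dt.hour
print(df.groupby(hour).size())

# Keep coordinates for later enrichment with OpenStreetMap / weather attributes
coords = df[["longitude", "latitude"]].dropna()
print(coords.head())
```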

Keywords: Road safety, data analysis, OpenStreetMap, feature expanding.

6054 Clinical Decision Support for Disease Classification based on the Tests Association

Authors: Sung Ho Ha, Seong Hyeon Joo, Eun Kyung Kwon

Abstract:

Until recently, researchers have developed various tools and methodologies for effective clinical decision-making. Among these, chest pain diseases have been one of the most important diagnostic issues, especially in emergency departments. To improve the diagnostic ability of physicians, many researchers have developed diagnostic intelligence using machine learning and data mining. However, most of the conventional methodologies have been based on a single classifier for disease classification and prediction, which shows only moderate performance. This study utilizes an ensemble strategy that combines multiple different classifiers to help physicians diagnose chest pain diseases more accurately than before. Specifically, the ensemble strategy integrates decision trees, neural networks, and support vector machines. The ensemble models are applied to real-world emergency data. This study shows that the performance of the ensemble models is superior to that of each single classifier.
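A hedged sketch of the ensemble idea using scikit-learn's StackingClassifier over a decision tree, a neural network and an SVM; the synthetic data stand in for the emergency-department records, and the logistic-regression meta-learner is an assumption rather than the study's exact combination scheme.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the emergency-department data (not the study's real records)
X, y = make_classification(n_samples=1000, n_features=20, n_informative=8, random_state=0)

base = [("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
        ("mlp", MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)),
        ("svm", SVC(probability=True, random_state=0))]
ensemble = StackingClassifier(estimators=base, final_estimator=LogisticRegression())

# Compare each single classifier with the stacked ensemble
for name, clf in base + [("ensemble", ensemble)]:
    print(name, cross_val_score(clf, X, y, cv=5).mean().round(3))
```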

Keywords: Diagnosis intelligence, ensemble approach, data mining, emergency department

6053 A Comparative Study of Main Memory Databases and Disk-Resident Databases

Authors: F. Raja, M. Rahgozar, N. Razavi, M. Siadaty

Abstract:

Main Memory Database systems (MMDB) store their data in main physical memory and provide very high-speed access. Conventional database systems are optimized for the particular characteristics of disk storage mechanisms. Memory-resident systems, on the other hand, use different optimizations to structure and organize data, as well as to make it reliable. This paper provides a brief overview of MMDBs and of one memory-resident system, FastDB, and compares the processing time of this system with that of a typical disk-resident database, based on the results of running the TPC benchmark environment on both.
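The in-memory versus on-disk comparison can be illustrated with SQLite, used here only as a stand-in for FastDB and the TPC environment; the workload below is a simple insert/aggregate loop, not a TPC benchmark.

```python
import os
import sqlite3
import time

def run_workload(conn, n=100_000):
    """Create a table, bulk-insert n rows and run an aggregate query; return elapsed seconds."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val REAL)")
    t0 = time.perf_counter()
    cur.executemany("INSERT INTO t (val) VALUES (?)", ((float(i),) for i in range(n)))
    conn.commit()
    cur.execute("SELECT AVG(val) FROM t")
    cur.fetchone()
    return time.perf_counter() - t0

mem = sqlite3.connect(":memory:")        # memory-resident database
disk = sqlite3.connect("bench.db")       # disk-resident database
print("in-memory:", round(run_workload(mem), 3), "s")
print("on-disk:  ", round(run_workload(disk), 3), "s")
mem.close(); disk.close(); os.remove("bench.db")
```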

Keywords: Disk-Resident Database, FastDB, Main Memory Database.

6052 Student Perceptions of Defense Acquisition University Courses: An Explanatory Data Collection Approach

Authors: Melissa C. LaDuke

Abstract:

The overarching purpose of this study was to determine the relationship between the current format of online delivery for Defense Acquisition University (DAU) courses and Air Force Acquisition (AFA) personnel participation. AFA personnel (hereafter "students") were of particular interest, as they have been mandated to take anywhere from 3 to 30 online courses to earn various DAU specialization certifications. Participants in this qualitative case study were AFA personnel who pursued DAU certifications in science and technology management, program/contract management, and other related fields. Air Force personnel were interviewed about their experiences with online courses. The data gathered were analyzed and grouped into 12 major themes. The themes tied into the theoretical framework and addressed either teacher-centered or student-centered educational practices within DAU. Based on the results of the data analysis, various factors contributed to student perceptions of DAU courses, including the online course construct and its relevance to their jobs. The analysis also found that students want to learn the information presented but would like to be able to apply it in meaningful ways.

Keywords: Educational theory, computer-based training, interview, student perceptions, online course design, teacher positionality.

6051 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores

Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay

Abstract:

Automated product recognition in retail stores is an important real-world application in the domain of computer vision and pattern recognition. In this paper, we consider the problem of automatically identifying the classes of products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following an online hard negative mining strategy for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
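A skeleton of such a two-stage pipeline with torchvision, sketched under assumptions: the ResNet-50-FPN detector weights, the 0.5 score threshold and the random rack tensor are stand-ins, since the paper fine-tunes its own Faster-RCNN detector and ResNet-18 encoder on retail data.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models import resnet18

# Stage 1: object localizer (torchvision's Faster R-CNN is a stand-in for the fine-tuned detector)
detector = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

# Stage 2: ResNet-18 encoder with the classification head removed; during fine-tuning the
# embeddings would be trained with a triplet loss so that crops of the same product cluster.
encoder = resnet18(weights="DEFAULT")
encoder.fc = torch.nn.Identity()
encoder.eval()
triplet_loss = torch.nn.TripletMarginLoss(margin=1.0)   # used only during encoder training

rack = torch.rand(3, 800, 800)                           # placeholder rack image tensor
with torch.no_grad():
    detections = detector([rack])[0]                     # dict with 'boxes', 'scores', 'labels'
    keep = detections["scores"] > 0.5
    embeddings = []
    for box in detections["boxes"][keep]:
        x1, y1, x2, y2 = box.int().tolist()
        if x2 > x1 and y2 > y1:
            crop = rack[:, y1:y2, x1:x2].unsqueeze(0)
            crop = torch.nn.functional.interpolate(crop, size=(224, 224))
            embeddings.append(encoder(crop))             # compare to gallery embeddings (e.g. cosine)
print(f"{len(embeddings)} detected regions embedded")
```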

Keywords: Retail stores, Faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition.
