Search results for: Learning Management Tool
1325 Design Transformation to Reduce Cost in Irrigation Using Value Engineering
Authors: F. S. Al-Anzi, M. Sarfraz, A. Elmi, A. R. Khan
Abstract:
Researchers are responding to the environmental challenges of Kuwait in localized, innovative, effective and economic ways. One of the most significant natural challenges is the lack of water and desertification. In this research, the project team focuses on redesigning a prototype, using Value Engineering Methodology, that provides functionality similar to the well-known Waterboxx kits while reducing capital and operational costs and simplifying manufacturing and usability for regular farmers. The design employs used tires and recycled plastic sheets as raw materials. Hence, this approach not only helps fight desertification but also helps dispose of the ever-growing tire dumps in Kuwait and avoid the hazards of tire fires, yielding a safer and friendlier environment. Several alternatives for implementing the prototype have been considered, and the best alternative in terms of value was selected after a thorough Function Analysis System Technique (FAST) exercise was developed. A prototype has been fabricated and tested in a controlled, simulated lab environment, followed by field testing in a real environment. Water and soil analyses were conducted on the experimental site to compare the composition of the soil before and after the experiment and to ensure that the prototype being tested is environmentally safe. Experimentation shows that the design was as effective as, and may exceed, the original design, with significant cost savings: an estimated total cost reduction of 43.84% over the original design using the VE approach. This cost reduction does not consider the intangible environmental value of waste recycling, which may further increase the total savings of using the alternative VE design. This case study shows that Value Engineering Methodology can be an important tool for innovating new designs that reduce costs.
Keywords: Desertification, functional analysis, scrap tires, value engineering, waste recycling, water irrigation rationing.
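For reference, the reported 43.84% figure corresponds to the standard value-engineering cost-reduction ratio; written out symbolically (with C_orig and C_VE denoting the total costs of the original and VE designs, notation introduced here for illustration only):

```latex
\text{cost reduction} \;=\; \frac{C_{\mathrm{orig}} - C_{\mathrm{VE}}}{C_{\mathrm{orig}}} \times 100\% \;=\; 43.84\%
```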
1324 Temporal Signal Processing by Inference Bayesian Approach for Detection of Abrupt Variation of Statistical Characteristics of Noisy Signals
Authors: Farhad Asadi, Hossein Sadati
Abstract:
In fields such as neuroscience, and especially in modeling the cognition of mental processes, handling uncertainty in the temporal zone of a signal is vital. In this paper, Bayesian online inference for estimating the location of change points in a signal is constructed. The method separates the observed signal into independent series and studies local changes and variations in the data regime together with the related statistical characteristics. We give conditions on simulations of the method when the data characteristics of the signals vary, and provide empirical evidence of its performance. It is verified that the correlation between the series around the change-point location and characteristics such as the signal-to-noise ratio and the mean value of the signal strongly affect the fluctuation in finding the proper location of the change point; representing these influences of the signal's statistical characteristics on finding abrupt variation is one of the main contributions of this study. Two different simulation structures are used: in the first, one abrupt change with a variable position is considered in a temporal section of the signal, and in the second, multiple variations are considered. Finally, the influence of the statistical characteristics on the location of the change point is explained in detail in the simulation results with different artificial signals.
Keywords: Time series, fluctuation in statistical characteristics, optimal learning.
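The abstract describes the estimator only in general terms; as a hedged illustration of the underlying idea, the sketch below (plain Python/NumPy, assuming a single change point, Gaussian segments with known noise level, and a flat prior, which simplifies the online multi-change-point setting of the paper) computes a posterior over the change-point location:

```python
import numpy as np

def changepoint_posterior(x, sigma=1.0):
    """Posterior P(tau | x) over a single change point tau splitting x into
    two Gaussian segments with unknown means and known noise std sigma.
    Uses the profile log-likelihood with each segment mean plugged in."""
    n = len(x)
    log_lik = np.full(n, -np.inf)
    for tau in range(1, n - 1):             # at least one sample per segment
        left, right = x[:tau], x[tau:]
        rss = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
        log_lik[tau] = -rss / (2.0 * sigma ** 2)
    log_post = log_lik - log_lik.max()      # flat prior over tau
    post = np.exp(log_post)
    return post / post.sum()

# Toy signal: mean shifts from 0 to 1 at sample 120; sigma controls the SNR.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 120), rng.normal(1.0, 1.0, 80)])
post = changepoint_posterior(x, sigma=1.0)
print("MAP change-point location:", int(np.argmax(post)))
```

Lowering the SNR (raising sigma relative to the mean shift) visibly flattens this posterior, which is the kind of sensitivity to signal characteristics the abstract reports.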
1323 Competitiveness and Pricing Policy Assessment for Resilience Surface Access System at Airports
Authors: Dimitrios J. Dimitriou
Abstract:
Worldwide, air transport is growing very fast, and many changes have taken place in its planning, management and decision-making processes. Given the complexity of airport operation, the best use of existing capacity is the key driver of efficiency and productivity. This paper presents an evaluation framework for ground access at airports, using a set of mode-choice indicators that provide key messages about an airport's ground-access performance. The application presents results for a sample of 12 European airports, illustrating recommendations to define policy and improve service for the air transport access chain.
Keywords: Air transport chain, airport ground access, airport access performance, airport policy.
1322 A Robust Implementation of a Building Resources Access Rights Management System
Authors: E. Neagoe, V. Balanica
Abstract:
A Smart Building Controller (SBC) is server software that offers secured access to a pool of building-specific resources, executes monitoring tasks and performs automatic administration of a building, thus optimizing the exploitation cost and maximizing comfort. This paper discusses the issues that arise with the secure exploitation of the SBC-administered resources and proposes a technical solution for implementing a robust secure-access system based on roles, individual rights and privileges (special rights).
Keywords: Access authorization, smart building controller, software security, access rights.
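The paper does not publish its data model; the sketch below is a minimal, hypothetical illustration (all role, permission and resource names invented) of how roles, individual rights, and privileges could be combined in a single authorization check:

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    roles: set = field(default_factory=set)               # e.g. {"operator"}
    individual_rights: set = field(default_factory=set)   # per-user grants
    privileges: set = field(default_factory=set)          # special rights on top of roles

# Role -> permitted actions on SBC resources (illustrative only).
ROLE_PERMISSIONS = {
    "operator": {"hvac:read", "lighting:read", "lighting:write"},
    "administrator": {"hvac:read", "hvac:write", "lighting:read",
                      "lighting:write", "users:manage"},
}

def is_authorized(user: User, permission: str) -> bool:
    """Grant access if the permission comes from a role, an individual right,
    or a privilege (special right)."""
    role_perms = set().union(*(ROLE_PERMISSIONS.get(r, set()) for r in user.roles))
    return permission in role_perms | user.individual_rights | user.privileges

guard = User("guard", roles={"operator"}, privileges={"doors:override"})
print(is_authorized(guard, "lighting:write"))   # True  (via role)
print(is_authorized(guard, "hvac:write"))       # False
print(is_authorized(guard, "doors:override"))   # True  (via privilege)
```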
1321 A Methodology for Automatic Diversification of Document Categories
Authors: Dasom Kim, Chen Liu, Myungsu Lim, Soo-Hyeon Jeon, Byeoung Kug Jeon, Kee-Young Kwahk, Namgyu Kim
Abstract:
Recently, numerous documents containing large volumes of unstructured data and text have been created because of the rapid increase in the use of social media and the Internet. Usually, these documents are categorized for the convenience of users. However, the accuracy of manual categorization is not guaranteed, and such categorization requires a large amount of time and incurs huge costs. Many studies on automatic categorization have been conducted to help mitigate the limitations of manual categorization. Unfortunately, most of these methods cannot be applied to complex documents with multiple topics because they work on the assumption that individual documents can be assigned to single categories only. Therefore, to overcome this limitation, some studies have attempted to categorize each document into multiple categories. However, the learning process employed in these studies involves training on a multi-categorized document set. These methods therefore cannot be applied to the multi-categorization of most documents unless multi-categorized training sets for traditional multi-categorization algorithms are provided. To overcome this limitation, in this study, we review our novel methodology for extending the category of a single-categorized document to multiple categories, and then introduce a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.
Keywords: Big data analysis, document classification, text mining, topic analysis.
1320 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network
Authors: Abdulaziz Alsadhan, Naveed Khan
Abstract:
In recent years, intrusions on computer networks have been the major security threat, so it is important to impede them. Hindering such intrusions relies entirely on their detection, which is the primary concern of any security tool such as an Intrusion Detection System (IDS). It is therefore imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance, which can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques have the limitation of using the raw dataset for classification: the classifier may get confused by redundancy, resulting in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and Local Binary Patterns (LBP) can be applied to transform the raw features into a principal-feature space and to select features based on their sensitivity, with eigenvalues used to determine the sensitivity. To refine the selected features further, greedy search, backward elimination and Particle Swarm Optimization (PSO) can be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power. This optimal feature subset is then used to perform classification. For classification, a Support Vector Machine (SVM) and a Multilayer Perceptron (MLP) are used because of their proven ability in classification. The Knowledge Discovery and Data Mining (KDD'99) Cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms existing approaches and has the capability to minimize the number of features and maximize the detection rate.
Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP).
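As a hedged sketch of the reduce-then-classify pipeline (PCA to a principal-feature space with variance/eigenvalue-based selection, followed by an SVM), using synthetic data in place of the KDD'99 records and scikit-learn in place of whatever toolchain the authors used:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for KDD'99-style connection records (41 raw features).
X, y = make_classification(n_samples=2000, n_features=41, n_informative=12,
                           n_classes=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Keep enough principal components to explain 95% of the variance
# (a simplified eigenvalue-based sensitivity selection).
model = make_pipeline(StandardScaler(), PCA(n_components=0.95), SVC(kernel="rbf"))
model.fit(X_tr, y_tr)
print("held-out detection accuracy:", round(model.score(X_te, y_te), 3))
print("principal components kept:", model.named_steps["pca"].n_components_)
```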
1319 A Weighted-Profiling Using an Ontology Base for Semantic-Based Search
Authors: Hikmat A. M. Abd-El-Jaber, Tengku M. T. Sembok
Abstract:
The amount of information on the Web is increasing tremendously. A number of search engines have been developed for searching Web information and retrieving documents that satisfy inquirers' needs. Search engines often return irrelevant documents among the search results, since the search is text-based rather than semantic-based. The information retrieval research area has presented a number of approaches and methodologies, such as profiling, feedback, query modification and human-computer interaction, for improving search results. Moreover, information retrieval has employed artificial intelligence techniques and strategies, such as machine learning heuristics, tuning mechanisms, user and system vocabularies, and logical theory, for capturing users' preferences and using them to guide the search based on semantic rather than syntactic analysis. Although valuable improvements in search results have been recorded, surveys show that search engine users are still not really satisfied with their results. Using ontologies for semantic-based searching is likely the key solution. Adopting a profiling approach and using ontology-base characteristics, this work proposes a strategy for finding the exact meaning of query terms in order to retrieve relevant information according to user needs. The evaluation of the conducted experiments shows the effectiveness of the suggested methodology, and a conclusion is presented.
Keywords: Information retrieval, user profiles, semantic Web, ontology, search engine.
1318 Synchronous Courses Attendance in Distance Higher Education: Case Study of a Computer Science Department
Authors: Thierry Eude
Abstract:
The use of videoconferencing platforms adapted to teaching offers students the opportunity to take distance education courses in much the same way as traditional in-class training. The sessions can be recorded, and they allow students the option of following the courses synchronously or asynchronously. Three typical profiles can then be distinguished: students who choose to follow the courses synchronously, students who could attend the course in synchronous mode but choose to follow the session off-line, and students who follow the course asynchronously because professional or personal constraints prevent them from attending when it is offered. Our study consists of observing attendance at all distance education courses offered in synchronous mode by the Computer Science and Software Engineering Department at Laval University during 10 consecutive semesters. The aim is to identify factors that influence students in their choice of attending the distance courses in synchronous mode. It was found that participation tends to be relatively stable over the years for any one semester (fall, winter, summer) and is similar from one course to another, although students may be increasingly familiar with synchronous distance education courses. Average participation is around 28%. There may be deviations, but they concern only a few courses during certain semesters, suggesting that these deviations occurred only because of the composition of particular cohorts during specific semesters. Furthermore, course schedules have a great influence on the attendance rate: the highest rates are all for courses scheduled outside office hours.
Keywords: Attendance, distance undergraduate education in computer science, student behavior, synchronous e-learning.
1317 Performance Optimization of Data Mining Application Using Radial Basis Function Classifier
Authors: M. Govindarajan, R. M. Chandrasekaran
Abstract:
Text data mining is a process of exploratory data analysis. Classification maps data into predefined groups or classes; it is often referred to as supervised learning because the classes are determined before examining the data. This paper describes a proposed radial basis function classifier that performs comparative cross-validation against an existing radial basis function classifier. The feasibility and benefits of the proposed approach are demonstrated by means of a data mining problem, direct marketing, which has become an important application field of data mining. Comparative cross-validation involves estimating accuracy by either stratified k-fold cross-validation or equivalent repeated random subsampling. While the proposed method may have high bias, its performance (accuracy estimation in our case) may be poor due to high variance; thus, the accuracy with the proposed radial basis function classifier was less than with the existing radial basis function classifier. However, there is a smaller improvement in runtime and a larger improvement in precision and recall. In the proposed method, classification accuracy and prediction accuracy are determined, where the prediction accuracy is comparatively high.
Keywords: Text data mining, comparative cross-validation, radial basis function, runtime, accuracy.
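The abstract does not fix the exact form of the radial basis function classifier; the sketch below assumes one common construction (a Gaussian RBF hidden layer with k-means centres and a logistic read-out) and evaluates it with stratified k-fold cross-validation on synthetic data standing in for the direct-marketing set:

```python
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

class RBFNetClassifier(BaseEstimator, ClassifierMixin):
    """Gaussian RBF hidden layer (k-means centres) + logistic read-out."""
    def __init__(self, n_centers=20, gamma=0.05):
        self.n_centers = n_centers
        self.gamma = gamma

    def _phi(self, X):
        # Gaussian activations of each sample w.r.t. each centre.
        d2 = ((X[:, None, :] - self.centers_[None, :, :]) ** 2).sum(-1)
        return np.exp(-self.gamma * d2)

    def fit(self, X, y):
        self.centers_ = KMeans(self.n_centers, n_init=10,
                               random_state=0).fit(X).cluster_centers_
        self.readout_ = LogisticRegression(max_iter=1000).fit(self._phi(X), y)
        return self

    def predict(self, X):
        return self.readout_.predict(self._phi(X))

# Synthetic stand-in for a direct-marketing response dataset.
X, y = make_classification(n_samples=1000, n_features=15, random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(RBFNetClassifier(n_centers=30, gamma=0.05), X, y, cv=cv)
print("stratified 10-fold accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```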
1316 Conflicts and Compromise at the Management of Transboundary Water Resources (The Case of Central Asia)
Authors: Sobir T. Navruzov
Abstract:
The problem of the complex use of water resources in Central Asia is considered, taking into account the sovereignty of the states and the increasing demand for water for economic purposes. A complex program with appropriate mathematical software is proposed for calculating possible variants of using the Amudarya upstream water resources so as to satisfy the incompatible requirements of the national economies for irrigation and energy generation.
Keywords: Water resources, national economies, irrigation, transboundary, energy generation, optimal solution, program complex, upstream, downstream, compensating services.
1315 Scenario and Decision Analysis for Solar Energy in Egypt by 2035 Using Dynamic Bayesian Network
Authors: Rawaa H. El-Bidweihy, Hisham M. Abdelsalam, Ihab A. El-Khodary
Abstract:
Bayesian networks are now considered a promising tool in the field of energy, with different applications. In this study, the aim was to indicate the states of a previously constructed Bayesian network related to solar energy in Egypt and the factors affecting its market share, depending on the data distribution type followed for each factor, using either the Z-distribution approach or Chebyshev's inequality theorem. The separate and conditional probabilities of the states of each factor in the Bayesian network were then derived, either from collected and scraped historical data or from estimations and past studies. Results showed that the constructed model can be used for scenario and decision analysis concerning the forecast of the total market share of solar energy in Egypt by 2035 and its use as a stable renewable source for generating any type of energy needed. It also showed that whenever the use of solar energy increases, the total costs decrease. Furthermore, we identified different scenarios, such as the best, worst, 50/50 and most likely ones, in terms of the expected changes in the solar energy market share. The best scenario showed an 85% probability that the market share of solar energy in Egypt will exceed 10% of the total energy market, while the worst scenario showed only a 24% probability. Furthermore, we applied policy analysis to check the effect of changing the states of the controllable (decision) variable, acting as different scenarios, to show how it would affect the target nodes in the model. Additionally, the best environmental and economic scenarios were developed to show how other factors would have to behave in order to affect the model positively. Additional evidence and derived probabilities were added for the dynamic weather nodes, whose states depend on time, during the process of converting the Bayesian network into a dynamic Bayesian network.
Keywords: Bayesian network, Chebyshev, decision variable, dynamic Bayesian network, Z-distribution
1314 A Study of the Lighting Control System for a Daylit Office
Authors: Chih-Jian Hu, Chung-Chih Cheng, Hsiao-Yuan Wu, Nien-Tzu Chao
Abstract:
Increasing user comfort and reducing operating costs have always been primary objectives of lighting control strategies in a building. This paper proposes an architecture for the lighting control system of a daylit office. The system consists of the lighting controller, an A/D & D/A converter, dimmable LED lights, and the lighting management software. Verification tests were conducted using the proposed system, specialized for the interior lighting of an open-plan office. The results showed that the proposed architecture would improve overall system reliability, lower the system cost, and provide ease of installation and maintenance.
Keywords: Control, dimming, LED, lighting.
1313 A Simulated Environment Approach to Investigate the Effect of Adversarial Perturbations on Traffic Sign for Automotive Software-in-Loop Testing
Authors: Sunil Patel, Pallab Maji
Abstract:
To study the effect of adversarial attacks, the environment must be controlled. Autonomous driving mainly includes five phases: sense, perceive, map, plan, and drive. Autonomous vehicles sense their surroundings with the help of different sensors such as cameras, radars, and lidars. Deep learning techniques are considered black boxes and have been found to be vulnerable to adversarial attacks. In this research, we study the effect of various known adversarial attacks with the help of an Unreal Engine-based, high-fidelity, real-time ray-traced simulated environment. The goal of this experiment is to find out whether adversarial attacks work on moving vehicles and whether an unknown network may be targeted. We discovered that the existing black-box and white-box attacks have varying effects on different traffic signs. We observed that attacks that impair detection in static scenarios do not have the same effect on moving vehicles. It was found that some adversarial attacks with hardly noticeable perturbations entirely blocked the recognition of certain traffic signs. By simulating the interplay of light on traffic signs, we also observed that the daylight condition has a substantial impact on the model's performance. Our findings closely resemble outcomes encountered in the real world.
Keywords: Adversarial attack simulation, computer simulation, ray-traced environment, realistic simulation, unreal engine.
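The paper evaluates several known white-box and black-box attacks inside the simulator without naming them here; as one standard white-box example (not necessarily one the authors used), the Fast Gradient Sign Method perturbs a traffic-sign image along the sign of the loss gradient. The model, image, and class count below are placeholders:

```python
import torch
import torch.nn as nn

def fgsm_attack(model, image, label, epsilon=0.01):
    """Return an adversarially perturbed copy of `image` (FGSM, white-box)."""
    image = image.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(image), label)
    loss.backward()
    # Step of size epsilon along the sign of the gradient, then re-clip to [0, 1].
    adv = image + epsilon * image.grad.sign()
    return adv.clamp(0.0, 1.0).detach()

# Tiny stand-in classifier; in practice this would be the traffic-sign recognizer
# under test inside the simulated environment.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 43))  # 43 GTSRB-like classes
model.eval()
image = torch.rand(1, 3, 32, 32)          # placeholder traffic-sign crop
label = torch.tensor([7])                 # placeholder ground-truth class
adv = fgsm_attack(model, image, label, epsilon=0.03)
print("prediction before:", model(image).argmax(1).item(),
      "after:", model(adv).argmax(1).item())
```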
1312 Heart Rate Variability Analysis for Early Stage Prediction of Sudden Cardiac Death
Authors: Reeta Devi, Hitender Kumar Tyagi, Dinesh Kumar
Abstract:
In the present scenario, cardiovascular problems are a growing challenge for researchers and physiologists. As heart disease has no geographic, gender or socioeconomic boundaries, detecting cardiac irregularities at an early stage followed by quick and correct treatment is very important. The electrocardiogram is the finest tool for continuous monitoring of heart activity. Heart rate variability (HRV) is used to measure the naturally occurring oscillations between consecutive cardiac cycles, and analysis of this variability is carried out using time-domain, frequency-domain and non-linear parameters. This paper presents an HRV analysis of an online dataset for normal sinus rhythm (taken as healthy subjects) and sudden cardiac death (SCD subjects) using all three methods, computing values for parameters such as the standard deviation of normal-to-normal intervals (SDNN), the square root of the mean of the squared differences between adjacent RR intervals (RMSSD) and the mean of R-to-R intervals (mean RR) in the time domain; very low frequency (VLF), low frequency (LF), high frequency (HF) and the ratio of low to high frequency (LF/HF ratio) in the frequency domain; and the Poincaré plot for non-linear analysis. To differentiate the HRV of healthy subjects from that of subjects who died of SCD, a k-nearest neighbor (k-NN) classifier was used because of its high accuracy. Results show highly reduced values for all stated parameters for SCD subjects compared to healthy ones. As the dataset used for SCD patients is a recording of their ECG signal one hour prior to death, it is verified with an accuracy of 95% that the proposed algorithm can identify a patient's mortality risk one hour before death. Identifying a patient's mortality risk at such an early stage may prevent sudden death if timely and correct treatment is given by the doctor.
Keywords: Early stage prediction, heart rate variability, linear and non-linear analysis, sudden cardiac death.
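A hedged sketch of the time-domain half of the pipeline (mean RR, SDNN, RMSSD from an RR-interval series, then a k-NN decision); the RR series are synthetic stand-ins for the PhysioNet-style recordings used in the paper, and the frequency-domain and Poincaré features are omitted:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def time_domain_hrv(rr_ms):
    """Mean RR, SDNN and RMSSD (all in ms) from an RR-interval series."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return np.array([rr.mean(), rr.std(ddof=1), np.sqrt(np.mean(diff ** 2))])

rng = np.random.default_rng(1)
# Synthetic stand-ins: healthy-like RR series (higher variability) vs.
# SCD-like series one hour pre-event (reduced variability), per the abstract.
healthy = [rng.normal(850, 60, 300) for _ in range(20)]
scd_like = [rng.normal(780, 15, 300) for _ in range(20)]
X = np.array([time_domain_hrv(r) for r in healthy + scd_like])
y = np.array([0] * 20 + [1] * 20)           # 0 = healthy, 1 = at risk

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
new_rr = rng.normal(790, 18, 300)            # unseen recording
print("predicted risk class:", knn.predict([time_domain_hrv(new_rr)])[0])
```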
1311 Vision-Based Collision Avoidance for Unmanned Aerial Vehicles by Recurrent Neural Networks
Authors: Yao-Hong Tsai
Abstract:
Thanks to sensor technology, video surveillance has become the main means of security control in every big city in the world. Surveillance is usually used by governments for intelligence gathering, the prevention of crime, the protection of a process, person, group or object, or the investigation of crime. Many surveillance systems based on computer vision technology have been developed in recent years. Moving-target tracking is the most common task for an Unmanned Aerial Vehicle (UAV): finding and tracking objects of interest in mobile aerial surveillance for civilian applications. This paper focuses on vision-based collision avoidance for UAVs by recurrent neural networks. First, images from cameras on the UAV are fused using a deep convolutional neural network. Then, a recurrent neural network is constructed to obtain high-level image features for object tracking and to extract low-level image features for noise reduction. The system distributes the computation between local and cloud platforms to efficiently perform object detection, tracking and collision avoidance based on multiple UAVs. Experiments on several challenging datasets showed that the proposed algorithm outperforms state-of-the-art methods.
Keywords: Unmanned aerial vehicle, object tracking, deep learning, collision avoidance.
1310 Through Biometric Card in Romania: Person Identification by Face, Fingerprint and Voice Recognition
Authors: Hariton N. Costin, Iulian Ciocoiu, Tudor Barbu, Cristian Rotariu
Abstract:
In this paper, three different approaches to person verification and identification, i.e. by means of fingerprints, face and voice recognition, are studied. Face recognition uses parts-based representation methods and a manifold learning approach; the assessment criterion is recognition accuracy. The techniques under investigation are: a) Local Non-negative Matrix Factorization (LNMF); b) Independent Component Analysis (ICA); c) NMF with sparse constraints (NMFsc); d) Locality Preserving Projections (Laplacianfaces). Fingerprint detection was approached by classical minutiae (small graphical patterns) matching through image segmentation, using a structural approach and a neural network as the decision block. As for voice/speaker recognition, mel cepstral and delta-delta mel cepstral analysis were used as the main methods in order to construct a supervised, speaker-dependent voice recognition system. The final decision (e.g. "accept/reject" for a verification task) is taken by applying a majority voting technique to the three biometrics. The preliminary results, obtained for medium-sized databases of fingerprints, faces and voice recordings, indicate the feasibility of our study and an overall recognition precision (about 92%) permitting the utilization of our system for a future complex biometric card.
Keywords: Biometrics, image processing, pattern recognition, speech analysis.
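The fusion step described is a plain majority vote over the three per-modality decisions; a minimal sketch (identity labels hypothetical):

```python
from collections import Counter

def fuse_decisions(face_id, fingerprint_id, voice_id):
    """Majority vote over the three biometric modalities.
    Returns the winning identity, or 'reject' if all three disagree."""
    votes = Counter([face_id, fingerprint_id, voice_id])
    identity, count = votes.most_common(1)[0]
    return identity if count >= 2 else "reject"

print(fuse_decisions("subject_12", "subject_12", "subject_40"))  # subject_12
print(fuse_decisions("subject_12", "subject_07", "subject_40"))  # reject
```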
1309 Towards an Extended SQLf: Bipolar Query Language with Preferences
Authors: L. Ludovic, R. Daniel, S-E Tbahriti
Abstract:
Database management systems that integrate user preferences promise better personalization, greater flexibility and higher-quality query responses. This paper presents tentative work that studies and investigates approaches to expressing user preferences in queries. We sketch extended capabilities of the SQLf language, which uses fuzzy set theory to define user preferences. Two essential points are considered: the first concerns the expression of user preferences in SQLf by a so-called set of commensurable fuzzy predicates; the second concerns the bipolar way in which these user preferences are expressed, as mandatory and/or optional preferences.
Keywords: Flexible query language, relational database, user preference.
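Outside any DBMS, the core SQLf idea of grading tuples by fuzzy-predicate membership rather than a crisp boolean can be sketched as below (predicate shapes and data invented for illustration; conjunction taken as the minimum of the membership degrees, a common choice in fuzzy querying):

```python
def cheap(price, full_at=100.0, zero_at=300.0):
    """Decreasing fuzzy predicate: 1 below full_at, 0 above zero_at, linear between."""
    if price <= full_at:
        return 1.0
    if price >= zero_at:
        return 0.0
    return (zero_at - price) / (zero_at - full_at)

def close(distance_km, full_at=2.0, zero_at=10.0):
    """Decreasing fuzzy predicate for distance, same trapezoidal shape."""
    if distance_km <= full_at:
        return 1.0
    if distance_km >= zero_at:
        return 0.0
    return (zero_at - distance_km) / (zero_at - full_at)

hotels = [("A", 120, 1.5), ("B", 260, 3.0), ("C", 90, 9.0)]  # (name, price, distance)
# Conjunction of fuzzy predicates = min of membership degrees; rank tuples by it.
ranked = sorted(((min(cheap(p), close(d)), name) for name, p, d in hotels),
                reverse=True)
for degree, name in ranked:
    print(f"{name}: satisfaction degree {degree:.2f}")
```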
1308 Using Satellite Images Datasets for Road Intersection Detection in Route Planning
Authors: Fatma El-zahraa El-taher, Ayman Taha, Jane Courtney, Susan Mckeever
Abstract:
Understanding road networks plays an important role in navigation applications such as self-driving vehicles and route planning for individual journeys. Intersections are essential components of road networks, and understanding the features of an intersection, from a simple T-junction to larger multi-road junctions, is critical to decisions such as crossing roads or selecting the safest routes. The identification and profiling of intersections from satellite images is a challenging task. While deep learning approaches offer state-of-the-art performance in image classification and detection, the availability of training datasets is a bottleneck for this approach. In this paper, a labelled satellite image dataset for the intersection recognition problem is presented. It consists of 14,692 satellite images of Washington DC, USA. To support other users of the dataset, an automated download and labelling script is provided for dataset replication. The challenges of construction and fine-grained feature labelling of a satellite image dataset are examined, including the issue of how to address features that are spread across multiple images. Finally, the accuracy of intersection detection in satellite images is evaluated.
Keywords: Satellite images, remote sensing images, data acquisition, autonomous vehicles, robot navigation, route planning, road intersections.
1307 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference
Authors: Hussein Alahmer, Amr Ahmed
Abstract:
Liver cancer is one of the common diseases that cause death, and early detection is important for diagnosis and for reducing mortality. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue, and the difference between the features of the two areas is used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.
Keywords: CAD system, feature difference, fuzzy c-means, liver segmentation.
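A hedged sketch of the contrasting feature-difference idea: compute the same statistics for the lesion ROI and its surrounding normal-tissue ROI, subtract them, and train a classifier on the differences. The features here are reduced to simple intensity/gradient statistics and the ROIs are synthetic, so this only illustrates the structure of the approach:

```python
import numpy as np
from sklearn.svm import SVC

def roi_features(roi):
    """Simple intensity/texture statistics for a 2-D region of interest."""
    roi = roi.astype(float)
    grad = np.abs(np.diff(roi, axis=0)).mean() + np.abs(np.diff(roi, axis=1)).mean()
    return np.array([roi.mean(), roi.std(), grad])

def contrast_descriptor(lesion_roi, surrounding_roi):
    """Lesion features minus surrounding normal-liver features."""
    return roi_features(lesion_roi) - roi_features(surrounding_roi)

rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):                        # 0 = benign-like, 1 = malignant-like (synthetic)
    for _ in range(30):
        surrounding = rng.normal(100, 5, (32, 32))
        shift = 10 if label == 0 else 35    # malignant-like ROIs contrast more strongly
        lesion = rng.normal(100 + shift, 5 + 4 * label, (32, 32))
        X.append(contrast_descriptor(lesion, surrounding))
        y.append(label)

clf = SVC(kernel="rbf").fit(np.array(X), np.array(y))
print("training accuracy on synthetic ROIs:", clf.score(np.array(X), np.array(y)))
```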
1306 MPPT Operation for PV Grid-connected System using RBFNN and Fuzzy Classification
Authors: A. Chaouachi, R. M. Kamel, K. Nagasaka
Abstract:
This paper presents a novel methodology for Maximum Power Point Tracking (MPPT) of a grid-connected 20 kW photovoltaic (PV) system using a neuro-fuzzy network. The proposed method predicts the reference PV voltage guaranteeing optimal power transfer between the PV generator and the main utility grid. The neuro-fuzzy network is composed of a fuzzy rule-based classifier and three Radial Basis Function Neural Networks (RBFNN). The inputs of the network (irradiance and temperature) are classified before being fed into the appropriate RBFNN for either training or estimation, while the output is the reference voltage. The main advantage of the proposed methodology, compared to a conventional single-neural-network approach, is its distinct generalization ability with respect to the nonlinear and dynamic behavior of a PV generator. In fact, the neuro-fuzzy network is a neural-network-based multi-model machine learning scheme that defines a set of local models emulating the complex and non-linear behavior of a PV generator under a wide range of operating conditions. Simulation results under several rapid irradiance variations show that the proposed MPPT method achieves the highest efficiency compared to a conventional single neural network.
Keywords: MPPT, neuro-fuzzy, RBFN, grid-connected, photovoltaic.
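A hedged sketch of the routing structure only: a crisp rule stands in for the fuzzy classifier, and an RBF-kernel kernel-ridge regressor per operating region stands in for each RBFNN; the data, thresholds, and voltage model are synthetic placeholders, not the paper's PV model:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def region(irradiance, temperature):
    """Crisp stand-in for the fuzzy rule-based classifier: three operating regions."""
    if irradiance < 300:
        return 0                               # low irradiance
    return 1 if temperature < 40 else 2        # high irradiance, cool / hot

rng = np.random.default_rng(0)
G = rng.uniform(50, 1000, 600)                 # irradiance (W/m^2)
T = rng.uniform(10, 70, 600)                   # cell temperature (deg C)
# Synthetic reference-voltage target loosely mimicking Vmpp behaviour.
V = 30 + 2.5 * np.log(G / 1000 + 1e-3) - 0.08 * (T - 25) + rng.normal(0, 0.2, 600)

X = np.column_stack([G, T])
models = {}
for r in (0, 1, 2):                            # one RBF-kernel regressor per region
    mask = np.array([region(g, t) == r for g, t in X])
    models[r] = KernelRidge(kernel="rbf", alpha=1e-2, gamma=1e-5).fit(X[mask], V[mask])

g, t = 650.0, 35.0                             # current operating point
v_ref = models[region(g, t)].predict([[g, t]])[0]
print("reference voltage estimate:", round(float(v_ref), 2))
```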
1305 How Stock Market Reacts to Guidance Revisions and Actual Earnings Surprises
Authors: Tero Halme, Juho Kanniainen, Markus Nordberg
Abstract:
According to the existing literature, companies manage analysts' expectations of their future earnings by issuing pessimistic earnings guidance so as to meet those expectations. Consequently, one could expect markets to price this pessimistic bias in advance and to penalize companies more for lowering guidance than they reward them for beating it. In this paper we confirm this empirically. In addition, we show that although guidance revisions have a statistically significant relation to stock returns, this is not the case for the actual earnings surprise. A reason for this could be that the annual earnings report also conveys information on future earnings power at the same time.
Keywords: Management guidance, earnings guidance, pessimistic bias
1304 CASTE: A Cloud-Based Automatic Software Test Environment
Authors: Fuyang Peng, Bo Deng, Chao Qi
Abstract:
This paper presents the design and implementation of CASTE, a cloud-based automatic software test environment. We first present the architecture of CASTE, and then its main packages and classes are described in detail. CASTE is built upon a private Infrastructure-as-a-Service platform. Through concentrated resource management of the virtualized testing environment and automatic execution control of test scripts, we obtain a better solution to the problems of testing resource utilization and test automation. Experiments on CASTE give very appealing results.
Keywords: Software testing, test environment, test script, cloud computing, IaaS, test automation.
1303 Development of a Neural Network based Algorithm for Multi-Scale Roughness Parameters and Soil Moisture Retrieval
Authors: L. Bennaceur Farah, I. R. Farah, R. Bennaceur, Z. Belhadj, M. R. Boussema
Abstract:
The overall objective of this paper is to retrieve soil surface parameters, namely roughness and the soil moisture related to the dielectric constant, by inverting the radar signal backscattered from natural soil surfaces. Because the classical description of roughness using statistical parameters such as the correlation length does not lead to satisfactory predictions of radar backscattering, we used a multi-scale roughness description based on the wavelet transform and the Mallat algorithm. In this description, the surface is considered as a superposition of a finite number of one-dimensional Gaussian processes, each having a spatial scale. A second step in this study consisted in adapting a direct model simulating radar backscattering, namely the small perturbation model, to this multi-scale surface description. We investigated the impact of this description on radar backscattering through a sensitivity analysis of the backscattering coefficient to the multi-scale roughness parameters. To perform the inversion of the small perturbation multi-scale scattering model (MLS SPM), we used a multi-layer neural network architecture trained by the backpropagation learning rule. The inversion leads to satisfactory results with a relative uncertainty of 8%.
Keywords: Remote sensing, rough surfaces, inverse problems, SAR, radar scattering, neural networks, fractals.
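A hedged sketch of the forward/inverse structure on a 1-D synthetic profile: a Mallat (PyWavelets) decomposition supplies per-scale detail energies as multi-scale roughness parameters, a toy function stands in for the SPM forward model, and an MLP (scikit-learn, plain backpropagation) learns the inversion:

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def multiscale_roughness(profile, wavelet="db2", level=3):
    """Std of the detail coefficients at each scale of a Mallat decomposition."""
    coeffs = pywt.wavedec(profile, wavelet, level=level)
    return np.array([c.std() for c in coeffs[1:]])        # skip the approximation band

def toy_backscatter(rough):
    """Stand-in forward model (NOT the SPM) mapping roughness params to 'observations'."""
    return np.array([rough.sum(), (rough ** 2).sum(), rough.max()])

rng = np.random.default_rng(0)
X_obs, Y_rough = [], []
for _ in range(500):
    profile = np.cumsum(rng.normal(0, rng.uniform(0.5, 2.0), 256))   # random rough profile
    rough = multiscale_roughness(profile)
    X_obs.append(toy_backscatter(rough) + rng.normal(0, 0.05, 3))    # noisy observation
    Y_rough.append(rough)

inverse_net = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                                         random_state=0))
inverse_net.fit(np.array(X_obs), np.array(Y_rough))
print("inversion R^2 (training fit):",
      round(inverse_net.score(np.array(X_obs), np.array(Y_rough)), 3))
```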
1302 Freedom of Media, Democracy and Gezi Park
Authors: Emine Tirali
Abstract:
This article provides a conceptual framework of the freedom of media and its correlation with democracy. In a democracy, media should serve the public's right to know and reflect human rights violations and offer options for meaningful political choices and effective participation in civic affairs. On that point, the 2013 events at Gezi Park in Turkey are a good empirical example to be discussed. During the events, when self-censorship was broadly employed by mainstream Turkish media, social media filled the important role of providing information to the public. New technologies have made information into a fundamental tool for change and growth, and as a consequence, societies worldwide have merged into a single, interdependent, and autonomous organism. For this reason, violations of human rights can no longer be considered domestic issues, but rather global ones. Only global political action is an adequate response. Democracy depends on people shaping the society they live in, and in order to accomplish this, they need to express themselves. Freedom of expression is therefore necessary in order to understand diversity and differing perspectives, which in turn are necessary to resolve conflicts among people. Moreover, freedom of information is integral to freedom of expression. In this context, the international rules and laws regarding freedom of expression and freedom of information – indispensable for a free and independent media – are examined. These were put in place by international institutions such as the United Nations, UNESCO, the Council of Europe, and the European Union, which have aimed to build a free, democratic, and pluralist world committed to human rights and the rule of law. The methods of international human rights institutions depend on effective and frequent employment of mass media to relay human rights violations to the public. Therefore, in this study, the relationship between mass media and democracy, the process of how mass media forms public opinion, the problems of mass media, the neo-liberal theory of mass media, and the use of mass media by NGOs will be evaluated.
Keywords: Freedom of expression, democracy, public opinion, self-censorship.
1301 In Search of a Suitable Neural Network Capable of Fast Monitoring of Congestion Level in Electric Power Systems
Authors: Pradyumna Kumar Sahoo, Prasanta Kumar Satpathy
Abstract:
This paper aims at finding a suitable neural network for monitoring the congestion level in electrical power systems. The input data have been framed properly to meet the target objective through a supervised learning mechanism, by defining normal and abnormal operating conditions for the system under study. The congestion level, expressed as a line congestion index (LCI), is evaluated for each operating condition and is presented to the neural network along with the bus voltages to represent the input and target data. Once training is successful, the network learns how to deal with newly presented data through a validation and testing mechanism. The crux of the results presented in this paper rests on the performance comparison of a multi-layered feed-forward neural network with eleven types of backpropagation techniques, so as to evolve the best training criteria. The proposed methodology has been tested on the standard IEEE 14-bus test system with the support of the MATLAB-based neural network toolbox. The results signify that the Levenberg-Marquardt backpropagation algorithm gives the best training performance of all eleven cases considered in this paper, thus validating the proposed methodology.
Keywords: Line congestion index, critical bus, contingency, neural network.
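The abstract does not define the line congestion index; the sketch below assumes the common choice of line loading relative to its thermal limit and trains a generic feed-forward network (scikit-learn's MLP, since Levenberg-Marquardt training is specific to the MATLAB toolbox mentioned) on synthetic IEEE-14-like operating conditions:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def line_congestion_index(flow_mva, limit_mva):
    """Assumed LCI definition: line loading relative to its thermal limit."""
    return flow_mva / limit_mva

rng = np.random.default_rng(0)
n_lines, n_buses, n_samples = 20, 14, 400        # IEEE-14-like dimensions
limits = rng.uniform(50, 150, n_lines)           # thermal limits (MVA, synthetic)

X, y = [], []
for _ in range(n_samples):
    congested = rng.random() < 0.5
    scale = 1.1 if congested else 0.7            # synthetic heavy vs. light loading
    flows = limits * scale * rng.uniform(0.8, 1.2, n_lines)
    voltages = rng.normal(1.0 - 0.05 * congested, 0.02, n_buses)
    lci = line_congestion_index(flows, limits)
    X.append(np.concatenate([voltages, lci]))    # bus voltages + LCIs as NN inputs
    y.append(int(lci.max() > 1.0))               # abnormal if any line is overloaded

clf = MLPClassifier(hidden_layer_sizes=(25,), max_iter=2000, random_state=0)
clf.fit(np.array(X), np.array(y))
print("training accuracy:", round(clf.score(np.array(X), np.array(y)), 3))
```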
1300 A Multi-layer Artificial Neural Network Architecture Design for Load Forecasting in Power Systems
Authors: Axay J. Mehta, Hema A. Mehta, T. C. Manjunath, C. Ardil
Abstract:
In this paper, the modelling and design of an artificial neural network architecture for load forecasting is investigated. The primary prerequisite for power system planning is to arrive at realistic estimates of the future demand for power, which is known as load forecasting. Short-Term Load Forecasting (STLF) helps in determining economic, reliable and secure operating strategies for the power system. The dependence of load on several factors makes load forecasting a very challenging job. An overestimation of the load may cause premature investment and unnecessary blocking of capital, whereas an underestimation may result in a shortage of equipment and circuits. It is always better to plan the system for a load slightly higher than expected so that no exigency arises. In this paper, a load-forecasting model is proposed using a multilayer neural network with an appropriately modified backpropagation learning algorithm. Once the neural network model is designed and trained, it can forecast the load of the power system 24 hours ahead on a daily basis, and can also forecast the cumulative daily load. The real load data used for training the artificial neural network were taken from the LDC, Gujarat Electricity Board, Jambuva, Gujarat, India. The results show that the load forecast of the ANN model follows the actual load pattern more accurately throughout the forecast period.
Keywords: Power system, Load forecasting, Neural Network, Neuron, Stabilization, Network structure, Load.
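A hedged sketch of the forecasting setup: lagged hourly loads as inputs and the load 24 hours ahead as the target, with a synthetic load series and scikit-learn's MLP (its Adam optimiser standing in for the modified back-propagation rule described in the paper):

```python
import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
hours = np.arange(24 * 120)                               # ~4 months of hourly data
load = (500 + 120 * np.sin(2 * np.pi * hours / 24)        # daily cycle
        + 60 * np.sin(2 * np.pi * hours / (24 * 7))       # weekly cycle
        + rng.normal(0, 15, hours.size))                  # noise (MW, synthetic)

lags, horizon = 24, 24                                    # previous day -> next day
X = np.array([load[t - lags:t] for t in range(lags, load.size - horizon)])
y = np.array([load[t + horizon - 1] for t in range(lags, load.size - horizon)])

split = int(0.8 * len(X))                                 # chronological hold-out
net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(48, 24), max_iter=3000,
                                 random_state=0))
model = TransformedTargetRegressor(regressor=net, transformer=StandardScaler())
model.fit(X[:split], y[:split])
print("hold-out R^2 for 24-hour-ahead forecast:",
      round(model.score(X[split:], y[split:]), 3))
```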
1299 Production Structure Monitoring - A Neurologic Based Approach
Authors: G. Reinhart, J. Pohl
Abstract:
Manufacturing companies are facing a broad variety of challenges caused by a dynamic production environment. To succeed in such an environment, it is crucial to minimize the time lost in triggering the adaptation process of a company's production structures. This paper presents an approach for the continuous monitoring of production structures based on neurologic principles. It enhances classical monitoring concepts, which are principally focused on reactive strategies, and enables companies to act proactively. Thereby, strategic aspects regarding the harmonization of certain life cycles are integrated into the decision-making process for triggering the reconfiguration of the production structure.
Keywords: Continuous factory planning, production structure, production management.
1298 Complexity of Operation and Maintenance in Irrigation Network Management - A Case of the Dez Scheme in the Greater Dezful, Iran
Authors: Najaf Hedayat
Abstract:
Food and fibre production in arid and semi-arid regions has emerged as one of the major challenges for various socio-economic and political reasons, such as food security and self-sufficiency, and productive use of renewable water resources has risen to the top of the decision-making agenda. For this reason, efficient operation and maintenance (O&M) of modern irrigation and drainage schemes has become an indispensable reality in the agricultural policy-making arena. The aim of this paper is to investigate the complexity of operating and maintaining such schemes, focusing mainly on the challenges that impede, and the opportunities that enhance, sustainable food and fibre production. The methodology involved using secondary data complemented by routine observations and stakeholders' views on issues that influence O&M in the Dez command area. The SPSS program was used as an analytical framework for data analysis and interpretation. Results indicate poor application efficiency in most croplands, much of which is attributed to deficient operation of the conveyance and distribution canals. These, in turn, are reportedly linked to inadequate maintenance of the pumping stations and hydraulic structures, such as turnouts, flumes and other control systems, particularly in the secondary and tertiary canals. Results show that the aforementioned deficiencies have been the major impediment to establishing regular flow toward the farm gates, which subsequently undermines application efficiency and tillage operations at the farm level. Results further show that the cumulative impact of such deficiencies has been the major cause of the poor crop yield and quality that make the production system in these croplands uneconomic, and that the present state might undermine the sustainability of the agricultural system in the command area. The overall conclusion is that present water management is unlikely to be responsive to the challenges the sector faces, and in the absence of coherent measures to shift the status quo in favour of more productive resource use, it would be hard to fulfil the objectives of the National Economic and Socio-cultural Development Plans.
Keywords: Renewable water resources, Dez scheme, irrigation and drainage, sustainable crop production, O&M.
1297 The Effect of Insurance on Foreign Direct Investments Inflow to Nigeria
Authors: Chimaobi V. Okolo, Afamefuna J. Ani, Ebere U. Okolo
Abstract:
This paper seeks to assess the implications of insurance for foreign direct investment inflow in Nigeria. A multiple linear regression technique and a correlation matrix test were employed to measure the extent to which foreign direct investment was influenced. The results showed that insurance premium (IP), the asset size of the insurance industry (AS), and the total investment of the industry (TI) had significant positive impacts on foreign direct investment inflow in Nigeria. There should be an effective risk-transfer mechanism and financial intermediation, which gives investors confidence in the risk management strength of the host country.
Keywords: Foreign direct investment, insurance.
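A hedged sketch of the estimation step described (correlation matrix plus multiple linear regression of FDI inflow on IP, AS, and TI), using statsmodels and synthetic data in place of the Nigerian series:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 30                                              # e.g. 30 annual observations (synthetic)
df = pd.DataFrame({
    "IP": rng.uniform(1, 10, n),                    # insurance premium
    "AS": rng.uniform(5, 50, n),                    # asset size of insurance industry
    "TI": rng.uniform(2, 20, n),                    # total investment of the industry
})
# Synthetic FDI series with positive dependence on the three regressors.
df["FDI"] = 1.5 * df["IP"] + 0.4 * df["AS"] + 0.8 * df["TI"] + rng.normal(0, 2, n)

print(df[["IP", "AS", "TI", "FDI"]].corr().round(2))          # correlation matrix
model = sm.OLS(df["FDI"], sm.add_constant(df[["IP", "AS", "TI"]])).fit()
print(model.summary().tables[1])                              # coefficient estimates
```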
1296 Fetal and Infant Mortality in Botucatu City, São Paulo State, Brazil: Evaluation of Maternal-Infant Health Care
Authors: Noda L. M., Salvador I. C., Parada C. M. L. G., Fonseca C. R. B.
Abstract:
In Brazil, the neonatal mortality rate is considered incompatible with the country's development conditions and has been a public health concern. Reduction of infant mortality rates has also been part of the Millennium Development Goals, a commitment made by member countries of the United Nations (UN), including Brazil. The fetal mortality rate is considered a highly sensitive indicator of health care quality. Suitable actions, such as good quality of and access to health services, may contribute positively towards reducing these fetal and neonatal rates. With appropriate antenatal follow-up and health care during gestation and delivery, some causes of death could be reduced or even prevented by means of early diagnosis and intervention, as well as changes in risk factors. Objectives: To study the quality of maternal and infant health care based on fetal and neonatal mortality, as well as the possible actions to prevent those deaths, in Botucatu (Brazil). Methods: Deaths were classified by preventability according to the International Classification of Diseases and the modified Wigglesworth classification; to evaluate adequacy, indicators of the quality of antenatal and delivery care were established by the authors. Results: Considering fetal deaths, 56.7% occurred before delivery, which reveals possible shortcomings in antenatal care, and 38.2% resulted from intra-labor changes, which could be prevented or reduced by adequate obstetric management. These findings differed from those in the group of early neonatal deaths, which were also studied. Evaluation of the adequacy of health services showed that antenatal and childbirth care was appropriate for 24% and 33.3% of pregnant women, respectively, which corroborates the preventability results and reveals that shortcomings in obstetric and antenatal care could be the causes of the deaths in this study. Early and late neonatal deaths had similar characteristics: 76% could be prevented or reduced, mainly by adequate newborn care (52.9%) and adequate health care for pregnant women (11.7%). When the adequacy of care was evaluated, childbirth and newborn care was adequate in 25.8% of cases and antenatal care in 16.1%. In conclusion, a direct relationship was found between the adequacy and quality of the care rendered to pregnant women and newborns and fetal and infant mortality. Moreover, our findings highlight that deaths could be prevented by adequate obstetric and neonatal management.
Keywords: Fetal Mortality, Infant Mortality, Maternal-Child Health Services, Program Evaluation.