Search results for: hybrid neural network.
200 Secure Low-Bandwidth Video Streaming through Reliable Multipath Propagation in MANETs
Authors: S. Mohideen Badhusha, K. Duraiswamy
Abstract:
Most of the existing video streaming protocols provide video services without considering security aspects in decentralized mobile ad-hoc networks. The security policies adapted from existing non-streaming protocols do not comply with live video streaming protocols, resulting in considerable vulnerability, high bandwidth consumption, and unreliability, which in turn cause severe security threats, low bandwidth, and error-prone transmission in video streaming applications. Therefore, a synergized methodology is required to reduce vulnerability and bandwidth consumption and to enhance the reliability of video streaming applications in MANETs. To ensure security with reduced bandwidth consumption and improved reliability of video streaming applications, a Secure Low-bandwidth Video Streaming through Reliable Multipath Propagation (SLVRMP) protocol architecture has been proposed by incorporating two algorithms, namely the Secure Low-bandwidth Video Streaming Algorithm and the Reliable Secure Multipath Propagation Algorithm, using Layered Video Coding in a non-overlapping zone routing network topology. The performance of the proposed system is compared with that of other existing secure multipath protocols, Sec-MR and SPREAD, using NS 2.34, and the simulation results show that the performance of the proposed system is considerably improved.
Keywords: Bandwidth consumption, layered video coding, multipath propagation, reliability, security threats, video streaming applications, vulnerability.
199 Event Information Extraction System (EIEE): FSM vs HMM
Authors: Shaukat Wasi, Zubair A. Shaikh, Sajid Qasmi, Hussain Sachwani, Rehman Lalani, Aamir Chagani
Abstract:
Automatic extraction of event information from social text streams (emails, social network sites, blogs, etc.) is a vital requirement for many applications, such as event planning and management systems and security applications. The key information components needed from event-related text are the event title, location, participants, date, and time. Emails differ from other social text streams in layout, format, and conversation style, and they are the most commonly used communication channel for broadcasting and planning events. Therefore, we have chosen emails as our dataset. In our work, we have employed two statistical NLP methods, namely Finite State Machines (FSM) and the Hidden Markov Model (HMM), for the extraction of event-related contextual information. An application has been developed to compare the two methods on the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input. The results are evaluated using precision, recall, and F-score. Experiments show that both methods produce high performance and accuracy; however, HMM was better for title extraction, and FSM proved to be better for venue, date, and time.
Keywords: Emails, Event Extraction, Event Detection, Finite state machines, Hidden Markov Model.
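To make the HMM side of this comparison concrete, the following Python sketch shows Viterbi decoding for a token-labelling HMM. It is only an illustrative sketch: the tag set (TITLE, VENUE, DATE, OTHER) and the probability matrices are hypothetical placeholders, not the models trained in the paper; in the actual system the emission and transition log-probabilities would be estimated from annotated email data.

    import numpy as np

    # Minimal Viterbi decoder for a token-labelling HMM (illustrative only).
    # States and probabilities are hypothetical, not taken from the paper.
    states = ["TITLE", "VENUE", "DATE", "OTHER"]

    def viterbi(obs_logprob, trans_logprob, init_logprob):
        """obs_logprob: (T, N) log P(token_t | state); trans: (N, N); init: (N,)."""
        T, N = obs_logprob.shape
        score = np.full((T, N), -np.inf)
        back = np.zeros((T, N), dtype=int)
        score[0] = init_logprob + obs_logprob[0]
        for t in range(1, T):
            for j in range(N):
                cand = score[t - 1] + trans_logprob[:, j]
                back[t, j] = int(np.argmax(cand))
                score[t, j] = cand[back[t, j]] + obs_logprob[t, j]
        path = [int(np.argmax(score[-1]))]
        for t in range(T - 1, 0, -1):
            path.append(back[t, path[-1]])
        return [states[i] for i in reversed(path)]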
198 Utilizing Biological Models to Determine the Recruitment of the Irish Republican Army
Authors: Erika Ann Schaub, Christian J Darken
Abstract:
Sociological models (e.g., social network analysis, small-group dynamics, and gang models) have historically been used to predict the behavior of terrorist groups. However, they may not be the most appropriate method for understanding the behavior of terrorist organizations, because these models were not initially intended to incorporate the violent behavior of their subjects. Rather, models that incorporate life-and-death competition between subjects, i.e., models utilized by scientists to examine the behavior of wildlife populations, may provide a more accurate analysis. This paper suggests the use of biological models to attain a more robust method for understanding the behavior of terrorist organizations than traditional methods provide. This study also describes how a biological population model incorporating predator-prey behavior factors can predict terrorist organizational recruitment behavior, for the purpose of understanding the factors that govern the growth and decline of terrorist organizations. The Lotka-Volterra model, a biological model based on a predator-prey relationship, is applied to a highly suggestive case study, that of the Irish Republican Army. This case study illuminates how a biological model can be utilized to understand the actions of a terrorist organization.
Keywords: Biological Models, Lotka-Volterra Predator-Prey Model, Terrorist Organizational Behavior, Terrorist Recruitment.
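For readers unfamiliar with the model named above, the following Python sketch integrates the classic Lotka-Volterra predator-prey equations (dx/dt = αx − βxy, dy/dt = δxy − γy) with simple Euler steps. All coefficients and initial populations are hypothetical placeholders; they are not the values fitted to the IRA case study in the paper.

    import numpy as np

    # Classic Lotka-Volterra predator-prey equations, Euler integration.
    # Coefficients and initial populations are hypothetical placeholders.
    alpha, beta, delta, gamma = 0.6, 0.02, 0.01, 0.5   # growth/interaction rates
    x, y = 40.0, 9.0                                    # "prey" pool and "predator" pressure
    dt, steps = 0.01, 20000
    history = np.zeros((steps, 2))
    for k in range(steps):
        dx = alpha * x - beta * x * y
        dy = delta * x * y - gamma * y
        x, y = x + dt * dx, y + dt * dy
        history[k] = (x, y)
    # history now holds the oscillating population trajectories over time.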
197 Ensemble Approach for Predicting Student's Academic Performance
Authors: L. A. Muhammad, M. S. Argungu
Abstract:
Educational data mining (EDM) has attracted substantial attention. Data mining techniques have been proposed, in one way or another, to uncover hidden knowledge in educational data. The results of such studies assist academic institutions in further enhancing their learning processes and the methods by which knowledge is passed to students. Consequently, student performance is boosted and educational outcomes are undoubtedly enhanced. This study adopted a student performance prediction model premised on data mining techniques with Students' Essential Features (SEF). SEF are linked to the learner's interactivity with the e-learning management system. The performance of the predictive model is assessed with a set of classifiers, viz. Bayes Network, Logistic Regression, and Reduced Error Pruning (REP) Tree. The ensemble methods of Bagging, Boosting, and Random Forest (RF) are then applied to improve the performance of these single classifiers. The study reveals a robust affinity between learners' behaviors and their academic attainment. The REP Tree and its ensembles record the highest accuracy, 83.33%, using SEF, and in terms of the Receiver Operating Characteristic (ROC) curve, the boosted REP Tree records 0.903, which is the best. These results further demonstrate the dependability of the proposed model.
Keywords: Ensemble, bagging, Random Forest, boosting, data mining, classifiers, machine learning.
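As a hedged illustration of the ensemble step described above, the following Python sketch compares a single decision tree with bagging, boosting, and a random forest using scikit-learn and 10-fold cross-validation. The feature matrix, labels, tree depth, and ensemble sizes are hypothetical placeholders; the paper uses WEKA-style BayesNet, Logistic Regression, and REP Tree classifiers, for which DecisionTreeClassifier stands in here.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier, RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # X: hypothetical Students' Essential Features (e.g., logins, forum posts, quiz attempts);
    # y: pass/fail labels. Replace with the real e-learning interaction data.
    rng = np.random.default_rng(0)
    X = rng.random((200, 6))
    y = (X[:, 0] + 0.5 * X[:, 3] + 0.1 * rng.standard_normal(200) > 0.8).astype(int)

    base = DecisionTreeClassifier(max_depth=5)   # stand-in for the REP Tree
    models = {
        "single tree": base,
        # base estimator passed positionally (keyword name differs across scikit-learn versions)
        "bagging": BaggingClassifier(DecisionTreeClassifier(max_depth=5), n_estimators=50),
        "boosting": AdaBoostClassifier(DecisionTreeClassifier(max_depth=5), n_estimators=50),
        "random forest": RandomForestClassifier(n_estimators=100),
    }
    for name, model in models.items():
        acc = cross_val_score(model, X, y, cv=10, scoring="accuracy").mean()
        print(f"{name:13s} 10-fold accuracy: {acc:.3f}")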
196 Furniko Flour: An Emblematic Traditional Food of Greek Pontic Cuisine
Authors: A. Keramaris, T. Sawidis, E. Kasapidou, P. Mitlianga
Abstract:
Although the gastronomy of the Greeks of Pontus is well-known, it has not received the same level of scientific scrutiny as another Greek local cuisine, that of Crete. As a result, we planned to concentrate our research on Greek Pontic cuisine to shed light on its distinct recipes, food products, and, ultimately, its characteristics. The Greeks of Pontus have one of Greece's most distinguished local cuisines, having lived for a long time in the northern part (Black Sea Region) of modern Turkey and now widely inhabiting northern Greece. Despite its simplicity, their cuisine contains several mouthwatering delicacies. Even though they have been in Greece for a century, their gastronomic culture remains an important part of their collective identity. As a first step toward understanding Greek Pontic cuisine, furniko flour, one of its most well-known traditional products, was investigated. For this project, we targeted residents of Western Macedonia, a province in northern Greece with a large population of descendants of Pontus Greeks who are primarily engaged in agricultural activities. In this quest, we approached a descendant of Pontus Greeks who is involved in the production of furniko flour and agreed to show us the entire process as we participated in it. Furniko flour is made from heirloom non-hybrid corn. When the moisture content of the seeds is low enough to make them suitable for roasting, they are harvested by hand. Harvesting by hand entails removing the cob from the plant and separating the husks. The harvested cobs are then roasted in a traditional wood oven for 24 hours. After that, the cobs are collected and stored in sacks. The next step is to extract the seeds by rubbing the cobs together. Ideally, the seeds should be ground in a traditional stone hand mill. The outcome of this process is aromatic, dark golden furniko flour, which is used to make havitz. Along with the furniko flour preparation, we also documented the havitz cooking process (a porridge-like corn flour dish). One of the most delectable dishes in Greek Pontic cuisine, this savory delicacy is simple to prepare. Because of the ingredients of furniko flour, havitz is a highly nutritious dish, according to the research participant. Furthermore, he claims that preparing havitz is a wonderful way to bring families together, share stories, and revisit happy memories. Finally, as an initial effort to highlight elements of Pontic Greek cuisine, this study illustrates the traditional preparation of furniko flour and its use in various traditional recipes. Our next objective would be to evaluate the nutritional value of furniko flour by analyzing its chemical components.
Keywords: Furniko flour, Greek Pontic cuisine, Havitz, traditional foods.
195 Evolutionary Multi-objective Optimization for Positioning of Residential Houses
Authors: Ayman El Ansary, Mohamed Shalaby
Abstract:
The current study describes a multi-objective optimization technique for the positioning of houses in a residential neighborhood. The main task is the placement of residential houses in a favorable configuration satisfying a number of objectives. Solving the house layout problem is a challenging task. It requires an iterative approach to satisfy design requirements (e.g., energy efficiency, sky view, daylight, road network, visual privacy, and clear access to favorite views). These design requirements vary from one project to another based on location and client preferences. In the Gulf region, the most important socio-cultural factor is visual privacy in indoor space. Hence, most of the residential houses in this region are surrounded by high fences to provide privacy, which has a direct impact on other requirements (e.g., daylight and direction to favorite views). This investigation introduces a novel technique to optimally locate and orient residential buildings to satisfy a set of design requirements. The developed technique explores the search space for possible solutions. This study considers two-dimensional house planning problems; however, it can be extended to solve three-dimensional cases.
Keywords: Evolutionary optimization, Houses planning, Urban modeling, Daylight, Visual Privacy, Residential compounds.
194 Microseismicity of the Tehran Region Based on Three Seismic Networks
Authors: Jamileh Vasheghani Farahani
Abstract:
The main purpose of this research is to show the currently active faults and active tectonics of the area using three seismic networks in the Tehran region: 1) the Tehran Disaster Mitigation and Management Organization (TDMMO), 2) the Broadband Iranian National Seismic Network Center (BIN), and 3) the Iranian Seismological Center (IRSC). In this study, we analyzed microearthquakes that occurred in Tehran city and its surroundings using the Tehran networks from 1996 to 2015, and we found some active faults and trends in the region. There is a 200-year history of historical earthquakes in Tehran. Historical and instrumental seismicity show that the east of Tehran is more active than the west. The Mosha fault in the north of Tehran is one of the active faults of the central Alborz. Other major faults in the region are the Kahrizak, Eyvanekey, Parchin, and North Tehran faults. An important seismicity region is the intersection of the Mosha and North Tehran fault systems (Kalan village in Lavasan), which shows a cluster of microearthquakes. According to the historical and microseismic events analyzed in this research, there is a seismic gap in the southeast of Tehran. An empirical relationship is used to assess Mmax based on the rupture length. There is a probability of occurrence of a strong earthquake of magnitude 7.0 to 7.5 in the region (based on the assessed capability of the major faults, such as the Parchin and Eyvanekey faults, and on historical earthquakes).
Keywords: Iran, major faults, microseismicity, Tehran.
193 Finding Pareto Optimal Front for the Multi-Mode Time, Cost, Quality Trade-off in Project Scheduling
Authors: H. Iranmanesh, M. R. Skandari, M. Allahverdiloo
Abstract:
Project managers are ultimately responsible for the overall characteristics of a project, i.e., they should deliver the project on time, with minimum cost, and with maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and they will benefit from any scientific decision support tool. Our work determines a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the preferred choice to run the project. In this paper, the project scheduling problem notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective, and the purpose is to find the Pareto optimal front of time, cost, and quality of a project (curve:quality,time,cost) whose activities belong to a start-to-finish activity relationship network (cpm) and can be executed in different possible modes (mu) which are non-continuous or discrete (disc), where each mode has a different cost, time, and quality. The project is constrained by a non-renewable resource, i.e., money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of the genetic algorithm specially adapted to multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
Keywords: FastPGA, Multi-Execution Activity Mode, Pareto Optimality, Project Scheduling, Time-Cost-Quality Trade-Off.
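The notion of a Pareto optimal front used above can be illustrated with a short Python sketch that filters a set of candidate schedules down to the non-dominated ones, assuming time and cost are minimized and quality is maximized. The candidate tuples are hypothetical placeholders, and FastPGA itself is not reproduced here; only the dominance test that any Pareto-based method relies on is shown.

    # Pareto filter for (time, cost, quality) candidates: time and cost minimized,
    # quality maximized. Candidate tuples are hypothetical placeholders.
    def dominates(a, b):
        no_worse = a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2]
        better = a[0] < b[0] or a[1] < b[1] or a[2] > b[2]
        return no_worse and better

    def pareto_front(candidates):
        return [c for c in candidates
                if not any(dominates(other, c) for other in candidates if other is not c)]

    schedules = [(120, 9500, 0.82), (110, 11000, 0.85), (130, 9000, 0.80), (125, 9800, 0.78)]
    print(pareto_front(schedules))   # non-dominated time/cost/quality trade-offs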
192 Integrated Waste-to-Energy Approach: An Overview
Authors: Tsietsi J. Pilusa, Tumisang G. Seodigeng
Abstract:
This study evaluates the benefits of advanced waste management practices in unlocking waste-to-energy opportunities within the solid waste industry. The key drivers of sustainable waste management practices, specifically with respect to packaging waste-to-energy technology options, are discussed. The success of a waste-to-energy system depends significantly on the appropriateness of available technologies, including those that are well established as well as those that are less so. There are hard and soft interventions to be considered when packaging an integrated waste treatment solution. Technology compatibility with variation in feedstock (waste) quality and quantities remains a key factor. These factors influence the technology reliability in terms of production efficiencies and product consistency, which, in turn, drive the supply and demand network. Waste treatment technologies rely on the waste material as feedstock; the feedstock varies in quality and quantity depending on several factors, and the technology can fail as a result. It is critical to design an advanced waste treatment technology in an integrated approach to minimize the possibility of technology failure due to unpredictable feedstock quality, quantities, conversion efficiencies, and inconsistent product yield or quality. An integrated waste-to-energy approach offers a secure system design that considers sustainable waste management practices.
Keywords: Emerging markets, evaluation tool, interventions, waste treatment technologies.
191 Mesoscopic Defects of Forming and Induced Properties on the Impact of a Composite Glass/Polyester
Authors: Bachir Kacimi, Fatiha Teklal, Arezki Djebbar
Abstract:
Forming processes induce residual deformations in the reinforcement and sometimes lead to mesoscopic defects, which are more recurrent than macroscopic defects during the manufacture of complex structural parts. This study deals with the influence of fabric shear and buckle defects, which appear during composite draping processes, on the impact behavior of a glass-fiber-reinforced polymer. To achieve this aim, we produced several specimens with different amplitudes of deformation (shear) and defects in the fabric using a specific bench. The specimens were manufactured using contact molding and tested at several impact energies. The results and measurements made on the tested specimens were compared with those of the healthy material. The results showed that the buckle defects have a negative effect on the elastic parameters and revealed larger damage, with a significant out-of-plane mode, relative to the healthy composite material. This effect is the consequence of local fiber impoverishment and disorganization of the fibrous network, with a reorientation of the fibers following the out-of-plane buckling of the yarns, in the area where the defects are located. For the material with calibrated shear of the reinforcement, the increased local fiber content due to the shear deformations and the contribution of the transverse yarns to stiffness led to an increase in mechanical properties.
Keywords: Defects, forming, impact, induced properties, textiles.
190 Design of Direct Power Controller for a High Power Neutral Point Clamped Converter Using Real Time Simulator
Authors: Amin Zabihinejad, Philippe Viarouge
Abstract:
In this paper, a direct power control (DPC) strategy is investigated in order to control a high-power AC/DC converter with a time-variable load. This converter is composed of a three-level, three-phase neutral point clamped (NPC) converter as a rectifier and an H-bridge four-quadrant current-controlled converter. In high-power applications, the controller must not only regulate the desired outputs but also reduce the distortion injected into the network by the converter. For this reason, and because of the nonlinearity of the power electronic converter, conventional controllers cannot achieve appropriate responses. In this research, precise mathematical analysis is employed to design an appropriate controller for the time-variable load. A DPC controller is proposed and simulated using Matlab/Simulink. In order to verify the simulation results, a real-time simulator, OPAL-RT, is employed. In this paper, the dynamic response and stability of the high-power NPC with a variable load are investigated and compared with those of conventional controllers using the real-time simulator. The results prove that the DPC controller is more stable and has more precise outputs than the conventional controller.
Keywords: Direct Power Control, Three Level Rectifier, Real Time Simulator, High Power Application.
189 Exploring Social Impact of Emerging Technologies from Futuristic Data
Authors: Heeyeul Kwon, Yongtae Park
Abstract:
Despite their highly touted benefits, emerging technologies have unleashed pervasive concerns regarding unintended and unforeseen social impacts. Thus, those wishing to create safe and socially acceptable products need to identify such side effects and mitigate them prior to market proliferation. Various methodologies in the field of technology assessment (TA), namely Delphi, impact assessment, and scenario planning, have been widely employed in such circumstances. However, the literature faces a major limitation: sole reliance on participatory workshop activities, which misses the massive untapped source of futuristic information flooding the Internet. This research thus seeks to gain insights into the use of futuristic data, i.e., future-oriented documents from the Internet, as a supplementary method to generate social impact scenarios while capturing the perspectives of experts from a wide variety of disciplines. To this end, network analysis is conducted on the social keywords extracted from the futuristic documents by text mining, which is then used as a guide to produce a comprehensive set of detailed scenarios. Our proposed approach facilitates harmonized depictions of the possible hazardous consequences of emerging technologies and thereby makes decision makers more aware of, and responsive to, broad qualitative uncertainties.
Keywords: Emerging technologies, futuristic data, scenario, text mining.
188 A Robust Approach to the Load Frequency Control Problem with Speed Regulation Uncertainty
Authors: S. Z. Sayed Hassen
Abstract:
The load frequency control problem of power systems has attracted a lot of attention from engineers and researchers over the years. Increasing and quickly changing load demand, coupled with the inclusion of more generators with high variability (solar and wind power generators) on the network, is making power systems more difficult to regulate. Frequency changes are unavoidable, but regulatory authorities require that these changes remain within certain bounds. Engineers are required to perform the tricky task of adjusting the control system to maintain the frequency within tolerated bounds. It is well known that to minimize frequency variations, a large proportional feedback gain (speed regulation constant) is desirable. However, this improvement in performance using proportional feedback comes at the expense of a reduced stability margin and also allows some steady-state error. A conventional PI controller is then included as a secondary control loop to drive the steady-state error to zero. In this paper, we propose a robust controller to replace the conventional PI controller, which guarantees performance and stability of the power system over the range of variation of the speed regulation constant. Simulation results are shown to validate the superiority of the proposed approach on a simple single-area power system model.
Keywords: Robust control, power system, integral action, minimax LQG control.
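The conventional baseline described above (primary droop plus a secondary PI loop on a single-area model) can be sketched in a few lines of Python. All model constants are hypothetical per-unit values rather than those of the paper, turbine and governor dynamics are neglected, and the minimax LQG controller that the paper proposes in place of the PI loop is not reproduced here.

    import numpy as np

    # Simplified single-area LFC: inertia M, damping D, droop R, secondary PI loop.
    # All constants are hypothetical per-unit values, not those of the paper's model.
    M, D, R = 10.0, 1.0, 0.05          # inertia, load damping, speed regulation constant
    Kp, Ki = 0.5, 0.3                  # secondary PI gains
    dP_load = 0.02                     # step load disturbance (p.u.)
    dt, steps = 0.01, 5000

    df, integ = 0.0, 0.0               # frequency deviation and PI integrator state
    freq = np.zeros(steps)
    for k in range(steps):
        integ += -df * dt                       # integral of the frequency error
        dP_c = -Kp * df + Ki * integ            # secondary (PI) control action
        dP_m = dP_c - df / R                    # governor: primary droop + secondary command
        ddf = (dP_m - dP_load - D * df) / M     # swing equation (turbine dynamics neglected)
        df += dt * ddf
        freq[k] = df
    # freq shows the transient dip and the PI loop driving the deviation back to zero.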
187 Coordination for Synchronous Cooperative Systems Based on Fuzzy Causal Relations
Authors: Luis A. Morales Rosales, Saul E. Pomares Hernandez, Gustavo Rodriguez Gomez
Abstract:
Synchronous cooperative systems (SCS) bring together users that are geographically distributed and connected through a network to carry out a task. Examples of SCS include tele-immersion and tele-conferences. In SCS, coordination is the core of the system, and it has been defined as the act of managing interdependencies between activities performed to achieve a goal. Some of the main problems that SCS present deal with the management of constraints between simultaneous activities and the execution ordering of these activities. In order to resolve these problems, orderings based on Lamport's happened-before relation have been used, namely causal, Δ-causal, and causal-total orderings. They mainly differ in the degree of asynchronous execution allowed. One of the most important orderings is the causal order, which establishes that the events must be seen in the cause-effect order in which they occur in the system. In this paper, we show that for certain SCS (e.g., videoconferences, tele-immersion) where some degradation of the system is allowed, ensuring the causal order is still too rigid, which can have negative effects on the system. We illustrate how a more relaxed ordering, which we call Fuzzy Causal Order (FCO), is useful for such systems by allowing more asynchronous execution than the causal order. The benefit of the FCO is illustrated by applying it to a particular scenario of intermedia synchronization in an audio-conference system.
Keywords: Event ordering, fuzzy causal ordering, happened-before relation, cooperative systems.
186 Metabolomics Profile Recognition for Cancer Diagnostics
Authors: Valentina L. Kouznetsova, Jonathan W. Wang, Igor F. Tsigelny
Abstract:
Metabolomics has become a rising field of research for various diseases, particularly cancer. Increases or decreases in metabolite concentrations in the human body are indicative of various cancers. Further elucidation of metabolic pathways and their significance in cancer research may greatly spur medicinal discovery. We analyzed the metabolomics profiles of lung cancer. Thirty-three metabolites were selected as significant. These metabolites are involved in 37 metabolic pathways delivered by the MetaboAnalyst software. The top pathways are the glyoxylate and dicarboxylate pathway (its hubs are formic acid and glyoxylic acid), along with the citrate cycle pathway, followed by the taurine and hypotaurine pathway (whose hubs are taurine and sulfoacetaldehyde) and the glycine, serine, and threonine pathway (whose hubs are glycine and L-serine). We studied interactions of the metabolites with the proteins involved in cancer-related signaling networks and developed an approach to the use of metabolomics biomarkers in cancer diagnostics. Our analysis showed that a significant part of the lung-cancer-related metabolites interact with the main cancer-related signaling pathways present in this network: the PI3K–mTOR–AKT pathway, the RAS–RAF–ERK1/2 pathway, and the NFKB pathway. These results can be employed for the use of metabolomics profiles in the elucidation of the related cancer protein signaling networks.
Keywords: Cancer, metabolites, metabolic pathway, signaling pathway.
185 Increase of Organization in Complex Systems
Authors: Georgi Yordanov Georgiev, Michael Daly, Erin Gombos, Amrit Vinod, Gajinder Hoonjan
Abstract:
Measures of complexity and entropy have not converged to a single quantitative description of the levels of organization of complex systems. The need for such a measure is increasingly felt in all disciplines studying complex systems. To address this problem, starting from the most fundamental principle in physics, a new measure for the quantity of organization and the rate of self-organization in complex systems, based on the principle of least (stationary) action, is applied to a model system: the central processing unit (CPU) of computers. The quantity of organization for several generations of CPUs shows a double exponential rate of change of organization with time. The exact functional dependence has a fine, S-shaped structure, revealing some of the mechanisms of self-organization. The principle of least action helps to explain the mechanism of increase of organization through quantity accumulation and through constraint and curvature minimization with an attractor: the least average sum of actions of all elements and for all motions. This approach can help describe, quantify, measure, manage, design, and predict the future behavior of complex systems to achieve the highest rates of self-organization and to improve their quality. It can be applied to other complex systems in physics, chemistry, biology, ecology, economics, cities, network theory, and other fields where complex systems are present.
Keywords: Organization, self-organization, complex system, complexification, quantitative measure, principle of least action, principle of stationary action, attractor, progressive development, acceleration, stochastic.
184 Rotorcraft Performance and Environmental Impact Evaluation by Multidisciplinary Modelling
Authors: Pierre-Marie Basset, Gabriel Reboul, Binh DangVu, Sébastien Mercier
Abstract:
Rotorcraft provide invaluable services thanks to their Vertical Take-Off and Landing (VTOL), hover, and low-speed capabilities. Yet their use is still often limited by their cost and environmental impact, especially noise and energy consumption. One of the main brakes on the expanded use of rotorcraft for urban missions is the environmental impact, and the first concern for the population is noise. In order to develop the transversal competency to assess the rotorcraft environmental footprint, a collaboration has been launched between six research departments within ONERA. The progress in terms of models and methods is capitalized in the numerical workshop C.R.E.A.T.I.O.N. “Concepts of Rotorcraft Enhanced Assessment Through Integrated Optimization Network”. A typical mission for which the environmental impact issue is of great relevance has been defined. The first milestone is to perform the pre-sizing of a reference helicopter for this mission. In a second milestone, an alternative rotorcraft concept has been defined: a tandem rotorcraft with optional propulsion. The key design trends are given for the pre-sizing of this rotorcraft, aiming at a significant reduction of the global environmental impact while still giving flight performance and safety equivalent to those of the reference helicopter. The models and methods have been improved to capture, earlier and more globally, the relative variations in environmental impact when changing the rotorcraft architecture, the pre-design variables, and the operation parameters.
Keywords: Environmental impact, flight performance, helicopter, rotorcraft pre-sizing.
183 Semantic Modeling of Management Information: Enabling Automatic Reasoning on DMTF-CIM
Authors: Fernando Alonso, Rafael Fernandez, Sonia Frutos, Javier Soriano
Abstract:
CIM is the standard formalism for modeling management information developed by the Distributed Management Task Force (DMTF) in the context of its WBEM proposal, designed to provide a conceptual view of the managed environment. In this paper, we propose the inclusion of formal knowledge representation techniques, based on Description Logics (DLs) and the Web Ontology Language (OWL), in CIM-based conceptual modeling, and then we examine the benefits of such a decision. The proposal is specified as a CIM metamodel level mapping to a highly expressive subset of DLs capable of capturing all the semantics of the models. The paper shows how the proposed mapping can be used for automatic reasoning about the management information models, as a design aid, by means of new-generation CASE tools, thanks to the use of state-of-the-art automatic reasoning systems that support the proposed logic and use algorithms that are sound and complete with respect to the semantics. Such a CASE tool framework has been developed by the authors and its architecture is also introduced. The proposed formalization is not only useful at design time, but also at run time through the use of rational autonomous agents, in response to a need recently recognized by the DMTF.
Keywords: CIM, Knowledge-based Information Models, Ontology Languages, OWL, Description Logics, Integrated Network Management, Intelligent Agents, Automatic Reasoning Techniques.
182 Development of Fuzzy Logic and Neuro-Fuzzy Surface Roughness Prediction Systems Coupled with Cutting Current in Milling Operation
Authors: Joseph C. Chen, Venkata Mohan Kudapa
Abstract:
Development of two real-time surface roughness (Ra) prediction systems for milling operations was attempted. The systems used not only cutting parameters, such as feed rate and spindle speed, but also the cutting current generated and corrected by a clamp-type energy sensor. Two different approaches were developed. First, a fuzzy inference system (FIS), in which the fuzzy logic rules are generated by experts in the milling processes, was used to conduct prediction modeling using current cutting data. Second, a neuro-fuzzy system (ANFIS) was explored. Neuro-fuzzy systems are adaptive techniques in which data are collected on the network, processed, and rules are generated by the system. The inference system then uses these rules to predict Ra as the output. Experimental results showed that the parameters of spindle speed, feed rate, depth of cut, and input current variation could predict Ra. These two systems enable the prediction of Ra during the milling operation with an average accuracy of 91.83% and 94.48% by the FIS and ANFIS systems, respectively. Statistically, the ANFIS system provided better prediction accuracy than the FIS system.
Keywords: Surface roughness, input current, fuzzy logic, neuro-fuzzy, milling operations.
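A minimal zero-order Sugeno-style fuzzy inference sketch in Python illustrates how rules over feed rate, spindle speed, and current variation could yield an Ra estimate (depth of cut is omitted here for brevity). The triangular membership functions, the rule base, and the consequent Ra values are hypothetical placeholders and are not the expert rules or the ANFIS model of the paper.

    import numpy as np

    # Zero-order Sugeno-style FIS sketch: triangular memberships, weighted-average defuzzification.
    # Membership parameters, rules, and consequents (Ra in micrometers) are hypothetical.
    def tri(x, a, b, c):
        return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

    def predict_ra(feed, speed, current_var):
        feed_low,  feed_high  = tri(feed, 0, 0.05, 0.15),    tri(feed, 0.05, 0.2, 0.3)
        speed_low, speed_high = tri(speed, 500, 1000, 2000), tri(speed, 1000, 2500, 3500)
        cur_low,   cur_high   = tri(current_var, 0, 0.5, 1.5), tri(current_var, 0.5, 2.0, 3.0)
        # Rule base: (firing strength, consequent Ra value)
        rules = [
            (min(feed_low,  speed_high, cur_low),  0.8),   # gentle cut, stable current -> smooth
            (min(feed_high, speed_low,  cur_high), 3.2),   # heavy cut, noisy current  -> rough
            (min(feed_high, speed_high, cur_low),  1.8),
            (min(feed_low,  speed_low,  cur_high), 2.4),
        ]
        w = np.array([r[0] for r in rules])
        z = np.array([r[1] for r in rules])
        return float(np.dot(w, z) / w.sum()) if w.sum() > 0 else float("nan")

    print(predict_ra(feed=0.1, speed=1800, current_var=0.8))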
181 110 MW Geothermal Power Plant Multiple Simulator, Using Wireless Technology
Authors: Guillermo Romero-Jiménez, Luis A. Jiménez-Fraustro, Mayolo Salinas-Camacho, Heriberto Avalos-Valenzuela
Abstract:
A geothermal power plant multiple simulator for operator training is presented. The simulator is designed to be installed on a wireless local area network and has the capacity to train one to six operators simultaneously, each with an independent simulation session. The sessions are supervised by a single instructor. The main parts of this multiple simulator are the instructor and operator stations. At the instructor station, the instructor controls the simulation sessions, establishes training exercises, and supervises each power plant operator individually. This station is hosted on a Main Personal Computer (NS), and its main functions are to set initial conditions, snapshots, malfunctions or faults, monitoring trends, and process and soft-panel diagrams. The operators, on the other hand, carry out their actions on the simulated power plant from the operator stations, each of which is also hosted on a PC. The main software of the instructor and operator stations is executed on the same NS and displayed on the PCs through graphical Interactive Process Diagrams (IDP). The geothermal multiple simulator has been installed in the Geothermal Simulation Training Center (GSTC) of the Comisión Federal de Electricidad (Federal Commission of Electricity, CFE), Mexico, and is being used as part of the training courses for geothermal power plant operators.
Keywords: Geothermal power plant, multiple simulator, training operator.
180 Discovering Complex Regularities: from Tree to Semi-Lattice Classifications
Authors: A. Faro, D. Giordano, F. Maiorana
Abstract:
Data mining uses a variety of techniques, each of which is useful for some particular task. It is important to have a deep understanding of each technique and to be able to perform sophisticated analysis. In this article, we describe a tool built to simulate a variation of the Kohonen network to perform unsupervised clustering and to support the entire data mining process up to results visualization. A graphical representation helps the user to find a strategy to optimize the classification by adding, moving, or deleting a neuron in order to change the number of classes. The tool is able to automatically suggest a strategy to optimize the number of classes, and it also supports both tree classifications and semi-lattice organizations of the classes, giving users the possibility of passing from one class to the ones with which it has some aspects in common. Examples of using tree and semi-lattice classifications are given to illustrate advantages and problems. The tool is applied to classify macroeconomic data that report the most developed countries' imports and exports. It is possible to classify the countries based on their economic behaviour and to use the tool to characterize the commercial behaviour of a country in a selected class from the analysis of the positive and negative features that contribute to class formation. Possible interrelationships between the classes and their meaning are also discussed.
Keywords: Unsupervised classification, Kohonen networks, macroeconomics, visual data mining, cluster interpretation.
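The underlying Kohonen self-organizing map can be sketched in a short Python training loop: each sample pulls its best-matching neuron and that neuron's grid neighbours toward it, and after training each country is assigned to the class given by its best-matching unit. The grid size, learning rate, neighbourhood schedule, and the synthetic "import/export" vectors below are hypothetical placeholders, and the tool's tree and semi-lattice organization of classes is not reproduced here.

    import numpy as np

    # Minimal Kohonen SOM: a 4x4 grid of neurons trained on hypothetical country vectors
    # (e.g., normalized import/export indicators). Each country is then assigned to its
    # best-matching unit, which plays the role of a class.
    rng = np.random.default_rng(1)
    data = rng.random((60, 5))                 # 60 "countries", 5 trade indicators (placeholder)
    grid = np.array([(i, j) for i in range(4) for j in range(4)], dtype=float)
    weights = rng.random((16, 5))

    def train(data, weights, epochs=200, lr0=0.5, sigma0=2.0):
        for epoch in range(epochs):
            lr = lr0 * (1 - epoch / epochs)
            sigma = sigma0 * (1 - epoch / epochs) + 0.5
            for x in data:
                bmu = np.argmin(((weights - x) ** 2).sum(axis=1))      # best-matching unit
                dist2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
                h = np.exp(-dist2 / (2 * sigma ** 2))                  # neighborhood function
                weights += lr * h[:, None] * (x - weights)
        return weights

    weights = train(data, weights)
    classes = [int(np.argmin(((weights - x) ** 2).sum(axis=1))) for x in data]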
179 A Simulated Environment Approach to Investigate the Effect of Adversarial Perturbations on Traffic Sign for Automotive Software-in-Loop Testing
Authors: Sunil Patel, Pallab Maji
Abstract:
To study the effect of adversarial attacks, the environment must be controlled. Autonomous driving mainly includes five phases: sense, perceive, map, plan, and drive. Autonomous vehicles sense their surroundings with the help of different sensors, such as cameras, radars, and lidars. Deep learning techniques are considered black boxes and have been found to be vulnerable to adversarial attacks. In this research, we study the effect of various known adversarial attacks with the help of an Unreal Engine-based, high-fidelity, real-time, ray-traced simulated environment. The goal of this experiment is to find out whether adversarial attacks work on moving vehicles and whether an unknown network may be targeted. We discovered that the existing black-box and white-box attacks have varying effects on different traffic signs. We observed that attacks that impair detection in static scenarios do not have the same effect on moving vehicles. It was found that some adversarial attacks with hardly noticeable perturbations entirely blocked the recognition of certain traffic signs. We observed that the daylight condition has a substantial impact on the model's performance by simulating the interplay of light on traffic signs. Our findings closely resemble outcomes encountered in the real world.
Keywords: Adversarial attack simulation, computer simulation, ray-traced environment, realistic simulation, unreal engine.
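One of the simplest white-box attacks a study like this could exercise is the fast gradient sign method (FGSM), sketched below in Python/PyTorch against a generic classifier. The model, input size, number of classes, and epsilon are hypothetical placeholders, and the Unreal Engine image capture and sign-detection pipeline described in the abstract is not shown.

    import torch
    import torch.nn.functional as F

    # Fast Gradient Sign Method (FGSM) sketch against a generic traffic-sign classifier.
    # `model`, the input shape, and epsilon are placeholders; the simulator is not shown.
    def fgsm(model, image, label, epsilon=0.03):
        image = image.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(image), label)
        loss.backward()
        adv = image + epsilon * image.grad.sign()     # perturb along the loss gradient sign
        return adv.clamp(0.0, 1.0).detach()           # keep pixels in the valid range

    # Example usage with a dummy model and a random "image":
    model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 64 * 64, 43))
    image = torch.rand(1, 3, 64, 64)
    label = torch.tensor([7])
    adv_image = fgsm(model, image, label)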
178 Alkali Silica Reaction Mitigation and Prevention Measures for Arkansas Local Aggregates
Authors: Amin Kamal Akhnoukh, Lois Zaki Kamel, Magued Mourad Barsoum
Abstract:
The objective of this research is to mitigate and prevent alkali silica reactivity (ASR) in highway construction projects. ASR is a deleterious reaction initiated when the silica content of the aggregate reacts with alkali hydroxides in cement in the presence of relatively high moisture content. ASR results in the formation of an expansive, white-colored, gel-like material which generates destructive tensile stresses inside hardened concrete. In this research, different types of local aggregates available in the State of Arkansas were mixed and mortar bars were poured according to the ASTM specifications. Mortar bar expansion was measured versus time, and aggregates with potential ASR problems were detected. Different types of supplementary cementitious materials (SCMs) were used in remixing mortar bars with highly reactive aggregates. Length changes of the remixed bars proved that different types of SCMs can be successfully used to reduce the expansive effect of ASR. The SCM percentage by weight is highly dependent on the SCM type. The results of this study will help avoid future losses due to ASR cracking in construction projects and reduce the maintenance, repair, and replacement budgets required for the highway network.
Keywords: Alkali Silica Reaction, Aggregates, Moisture, Cracks, Mortar Bar Test, supplementary cementitious materials.
177 Detection of Linkages Between Extreme Flow Measures and Climate Indices
Authors: Mohammed Sharif, Donald Burn
Abstract:
Large scale climate signals and their teleconnections can influence hydro-meteorological variables on a local scale. Several extreme flow and timing measures, including high flow and low flow measures, from 62 hydrometric stations in Canada are investigated to detect possible linkages with several large scale climate indices. The streamflow data used in this study are derived from the Canadian Reference Hydrometric Basin Network and are characterized by relatively pristine and stable land-use conditions with a minimum of 40 years of record. A composite analysis approach was used to identify linkages between extreme flow and timing measures and climate indices. The approach involves determining the 10 highest and 10 lowest values of various climate indices from the data record. Extreme flow and timing measures for each station were examined for the years associated with the 10 largest values and the years associated with the 10 smallest values. In each case, a re-sampling approach was applied to determine if the 10 values of extreme flow measures differed significantly from the series mean. Results indicate that several stations are impacted by the large scale climate indices considered in this study. The results allow the determination of any relationship between stations that exhibit a statistically significant trend and stations for which the extreme measures exhibit a linkage with the climate indices.
Keywords: Flood analysis, low-flow events, climate change, trend analysis, Canada.
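The composite analysis with a resampling significance test described above can be sketched in a few lines of Python: take the mean of the extreme flow measure in the 10 years with the highest climate-index values and compare it against means of randomly drawn 10-year subsets. The annual index and flow series below are synthetic placeholders, not the Canadian Reference Hydrometric Basin Network data used in the study.

    import numpy as np

    # Composite analysis with a resampling significance test (synthetic data).
    # `index` is an annual climate index and `flow` an annual extreme-flow measure.
    rng = np.random.default_rng(42)
    years = 50
    index = rng.standard_normal(years)
    flow = 100 + 20 * index + 10 * rng.standard_normal(years)   # synthetic linkage

    top10 = np.argsort(index)[-10:]                   # years with the 10 highest index values
    composite_mean = flow[top10].mean()

    # Null distribution: means of 10 years drawn at random, repeated many times.
    null = np.array([flow[rng.choice(years, 10, replace=False)].mean() for _ in range(5000)])
    p_value = (np.abs(null - flow.mean()) >= np.abs(composite_mean - flow.mean())).mean()
    print(f"composite mean = {composite_mean:.1f}, series mean = {flow.mean():.1f}, p = {p_value:.3f}")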
176 Data Recording for Remote Monitoring of Autonomous Vehicles
Authors: Rong-Terng Juang
Abstract:
Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars are not likely to appear in the near future. To speed the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, a self-driving vehicle (e.g., a shuttle bus) has to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. Nevertheless, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. Therefore, this paper proposes a data compression method for in-vehicle networks for the remote monitoring of autonomous vehicles. Firstly, the time-series data are rearranged into a multi-dimensional signal space. Upon arrival, for controller area network (CAN) data, the new data are mapped onto a two-dimensional time-data space associated with the specific CAN identity. Secondly, the data are sampled based on differential sampling. Finally, the whole data set is encoded using existing algorithms such as Huffman, arithmetic, and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house built autonomous vehicle. The testing results show that the amount of data can be reduced to as little as 1/7 of the raw data.
Keywords: Autonomous vehicle, data recording, remote monitoring, controller area network.
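The effect of differential sampling before entropy coding can be sketched on a single CAN signal in Python. The slowly varying signal is synthetic, and zlib's DEFLATE (LZ77 plus Huffman coding) is used here only as a convenient stand-in for the Huffman, arithmetic, and codebook encoders mentioned in the abstract; it is not the paper's implementation.

    import numpy as np
    import zlib

    # Differential sampling + entropy coding sketch for one CAN signal (synthetic data).
    rng = np.random.default_rng(0)
    signal = np.cumsum(rng.integers(-1, 2, size=10000)).astype(np.int16)   # smooth 16-bit signal

    raw = signal.tobytes()
    deltas = np.diff(signal, prepend=signal[:1]).astype(np.int16)          # differential sampling
    compressed_raw = zlib.compress(raw, level=9)
    compressed_delta = zlib.compress(deltas.tobytes(), level=9)

    print(len(raw), len(compressed_raw), len(compressed_delta))
    # Deltas concentrate values near zero, so the entropy coder achieves a much higher ratio.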
175 Offset Dependent Uniform Delay Mathematical Optimization Model for Signalized Traffic Network Using Differential Evolution Algorithm
Authors: Tahseen Al-Shaikhli, Halim Ceylan, Jonathan Weaver, Osman Nuri Çelik, Onur Gungor Sahin
Abstract:
An offset-dependent uniform delay optimization model is derived as the main objective of this study and solved using a differential evolution algorithm. The further objectives are to control the coordination problem, which mainly depends on offset selection, and to estimate the uniform delay based on the offset choice at each signalized intersection. Arrival and departure patterns are assumed to follow periodic sinusoidal functions. The cycle time is optimized at the entry links, and the optimized value is used at the non-entry links as a common cycle time. The offset optimization algorithm is used to calculate the uniform delay at each link. The results are illustrated using a case study and compared with the canonical uniform delay model derived by Webster and with the Highway Capacity Manual's model. The findings show that the derived model reduces the total uniform delay to almost half of that of the conventional models, that the mathematical objective function is robust, and that the algorithm converges quickly.
Keywords: Area traffic control, differential evolution, offset variable, sinusoidal periodic function, traffic flow, uniform delay.
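The optimization step can be sketched in Python with scipy's differential evolution solver choosing offsets that minimize a delay surrogate built from sinusoidal arrival profiles. The delay expression below is a simple proxy, not the paper's derived uniform delay model, and the cycle time and link travel times are hypothetical placeholders.

    import numpy as np
    from scipy.optimize import differential_evolution

    # Offset selection by differential evolution (sketch). The "uniform delay" is a
    # sinusoidal-arrival proxy; cycle time and link travel times are hypothetical.
    C = 90.0                                   # common cycle time (s)
    travel = np.array([22.0, 35.0, 28.0])      # link travel times between intersections (s)

    def uniform_delay(offsets):
        total = 0.0
        for k, tt in enumerate(travel):
            upstream = 0.0 if k == 0 else offsets[k - 1]
            arrival_phase = (upstream + tt - offsets[k]) % C
            # Delay proxy: the further the platoon lands from the start of green,
            # the larger the average delay over the sinusoidal arrival profile.
            total += 0.5 * C * (1 - np.cos(2 * np.pi * arrival_phase / C)) / 2
        return total

    bounds = [(0.0, C)] * len(travel)
    result = differential_evolution(uniform_delay, bounds, seed=1, tol=1e-6)
    print(result.x, result.fun)                # optimized offsets and total delay proxy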
174 A Web Oriented Spread Spectrum Watermarking Procedure for MPEG-2 Videos
Authors: Franco Frattolillo
Abstract:
In the last decade, digital watermarking procedures have become increasingly applied to implement the copyright protection of multimedia digital contents distributed on the Internet. To this end, it is worth noting that many of the watermarking procedures for images and videos proposed in the literature are based on spread spectrum techniques. However, some scepticism about the robustness and security of such watermarking procedures has arisen because of documented attacks which claim to render the inserted watermarks undetectable. On the other hand, web content providers wish to exploit watermarking procedures characterized by flexible and efficient implementations which can be easily integrated into their existing web service frameworks or platforms. This paper presents how a simple spread spectrum watermarking procedure for MPEG-2 videos can be modified to be exploited in web contexts. To this end, the proposed procedure has been made secure and robust against some well-known and dangerous attacks. Furthermore, its basic scheme has been optimized by making the insertion procedure adaptive with respect to the terminals used to open the videos and the network transactions carried out to deliver them to buyers. Finally, two different implementations of the procedure have been developed: the former is a high-performance parallel implementation, whereas the latter is a portable Java- and XML-based implementation. Thus, the paper demonstrates that a simple spread spectrum watermarking procedure, with limited and appropriate modifications to the embedding scheme, can still represent a valid alternative to many other well-known and more recent watermarking procedures proposed in the literature.
Keywords: Copyright protection, digital watermarking, intellectual property protection.
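The basic spread-spectrum idea referenced above (add a pseudorandom sequence, scaled by a strength factor, to transform coefficients and detect it by correlation) can be sketched in Python on a generic coefficient vector. The MPEG-2 embedding positions, the adaptive insertion, and the security hardening described in the paper are not shown; alpha, the detection threshold, and the coefficient statistics are placeholders.

    import numpy as np

    # Basic spread-spectrum watermarking sketch on a generic block of transform coefficients.
    rng = np.random.default_rng(7)
    coeffs = rng.normal(0, 5, size=4096)           # stand-in for mid-frequency DCT coefficients
    watermark = rng.choice([-1.0, 1.0], size=4096) # pseudorandom +/-1 sequence (the key)
    alpha = 0.8                                    # embedding strength

    marked = coeffs + alpha * watermark            # embedding

    def detect(received, watermark, threshold=0.4):
        corr = np.dot(received, watermark) / len(watermark)
        return corr, corr > threshold              # correlation detector

    print(detect(marked, watermark))               # correlation near alpha: watermark present
    print(detect(coeffs, watermark))               # correlation near zero: watermark absent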
173 Modeling and Analysis for Effective Capacity of a Cross-Layer Optimized Wireless Networks
Authors: Reham A. El-mayet, Hesham M. El-Badawy, Salwa H. Elramly
Abstract:
New-generation mobile communication networks have the ability to support triple play. To that end, Orthogonal Frequency Division Multiplexing (OFDM) access techniques have been chosen to enlarge the system's ability to support high-data-rate networks. Many cross-layer modeling and optimization schemes for the Quality of Service (QoS) and capacity of downlink multiuser OFDM systems have been proposed. In this paper, Maximum Weighted Capacity (MWC) based resource allocation at the Physical (PHY) layer is used. This resource allocation scheme provides much better QoS than the previous resource allocation schemes, while maintaining the highest or nearly highest capacity at similar complexity. In addition, Delay Satisfaction (DS) scheduling at the Medium Access Control (MAC) layer, which allows more than one connection to be served in each slot, is used. This scheduling technique is more efficient than conventional scheduling, and both the number of users and the number of subcarriers are investigated against system capacity. The system is optimized for different operational environments: both outdoor and indoor deployment scenarios are investigated, as well as different channel models. In addition, the effective capacity approach [1] is used not only to provide QoS for different mobile users, but also to increase the total throughput of the wireless network.
Keywords: Cross-layer, effective capacity, LTE, OFDM, QoS, resource allocation, wireless networks.
172 Reduction of Power Losses in Distribution Systems
Authors: Y. Al-Mahroqi, I.A. Metwally, A. Al-Hinai, A. Al-Badi
Abstract:
Loss reduction initiatives in distribution systems have been activated due to the increasing cost of supplying electricity, the shortage of fuel with an ever-increasing cost to produce more power, and global warming concerns. These initiatives have been introduced to the utilities in the shape of incentives and penalties. Recently, the electricity distribution companies in Oman have been incentivized to reduce the distribution technical and non-technical losses at an equal annual reduction rate for 6 years. In this paper, different techniques for loss reduction in the Mazoon Electricity Company (MZEC) are addressed. In this company, a high number of substations and feeders were found to be non-compliant with the Distribution System Security Standard (DSSS). Therefore, 33 projects have been suggested to bring the 29 non-complying substations and 28 feeders to the planned criteria and into compliance with the DSSS. The largest part of MZEC's network (the South Batinah region) was modeled with the ETAP software package. The model has been extended to implement the proposed projects and to examine their effects on loss reduction. Simulation results have shown that the implementation of these projects leads to a significant improvement in the voltage profile and a reduction in the active and reactive power losses. Finally, the economic analysis has revealed that the implementation of the proposed projects in MZEC leads to an annual saving of about US$ 5 million.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 9368171 Broadband PowerLine Communications: Performance Analysis
Authors: Justinian Anatory, Nelson Theethayi, M. M. Kissaka, N. H. Mvungi
Abstract:
The power line channel has been proposed as an alternative for broadband data transmission, especially in developing countries like Tanzania [1]. However, the channel is affected by stochastic attenuation and deep notches, which can limit the channel capacity and the achievable data rate. Various studies have characterized the channel without giving exactly the maximum performance and the limitation in data transfer rate, possibly because of the complexity of the channel models used. In this paper, the channel performance of medium-voltage, low-voltage, and indoor power line channels is presented. In the investigations, orthogonal frequency division multiplexing (OFDM) with phase shift keying (PSK) as the carrier modulation scheme is considered for indoor, medium-voltage, and low-voltage channels with a typical configuration of ten branches, and Golay coding is also applied to the medium-voltage channel. Deep notches are observed at various frequencies in the channel frequency responses, which can reduce the achievable data rate. However, it is observed that data rates of up to 240 Mbps are realized for a signal-to-noise ratio of about 50 dB for the indoor and low-voltage channels, whereas for the medium-voltage channel a typical link with ten branches is affected by strong multipath, and coding is required for feasible broadband data transfer.
Keywords: Powerline Communications, branched network, channel model, modulation, channel performance, OFDM.
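The kind of achievable-rate figure quoted above follows from summing per-subcarrier Shannon capacities over the channel frequency response. The Python sketch below uses a synthetic notched response and hypothetical bandwidth and SNR values purely to show how deep notches eat into the aggregate rate; it does not reproduce the paper's channel model, PSK constellation, or Golay coding.

    import numpy as np

    # Achievable-rate sketch for an OFDM system over a notched power line channel.
    # Frequency response, bandwidth, and SNR are synthetic placeholders.
    n_sub = 1024
    bandwidth = 20e6                                   # total bandwidth (Hz), hypothetical
    snr_db = 50.0                                      # transmit SNR per subcarrier (dB)

    f = np.linspace(0, 1, n_sub)
    H = np.ones(n_sub)
    for notch in (0.2, 0.45, 0.7):                     # synthetic deep notches
        H *= 1 - 0.98 * np.exp(-((f - notch) / 0.01) ** 2)

    snr_lin = 10 ** (snr_db / 10) * np.abs(H) ** 2
    rate = (bandwidth / n_sub) * np.sum(np.log2(1 + snr_lin))
    print(f"achievable rate ~ {rate / 1e6:.0f} Mbps")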