Search results for: fuzzy credibility constrained programming
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1690

160 A Scalable Media Job Framework for an Open Source Search Engine

Authors: Pooja Mishra, Chris Pollett

Abstract:

This paper explores efficient ways to implement various media-updating features like news aggregation, video conversion, and bulk email handling. All of these jobs share the property that they are periodic in nature, and they all benefit from being handled in a distributed fashion. The data for these jobs also often comes from a social or collaborative source. We isolate the class of periodic, one round map reduce jobs as a useful setting to describe and handle media updating tasks. As such tasks are simpler than general map reduce jobs, programming them in a general map reduce platform could easily become tedious. This paper presents a MediaUpdater module of the Yioop Open Source Search Engine Web Portal designed to handle such jobs via an extension of a PHP class. We describe how to implement various media-updating tasks in our system as well as experiments carried out using these implementations on an Amazon Web Services cluster.
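The abstract describes the jobs only at a high level; the following is a minimal Python sketch of the periodic, one-round map-reduce pattern it isolates. The actual MediaUpdater jobs in Yioop are written by extending a PHP class, and the class and method names below are hypothetical.

```python
class MediaJob:
    """Hypothetical base class for a periodic, one-round map-reduce job."""
    period = 3600                      # seconds between runs

    def get_tasks(self):               # produces the inputs to map over
        raise NotImplementedError

    def map(self, task):               # conceptually runs on each worker node
        raise NotImplementedError

    def reduce(self, mapped):          # a single reduce round combines results
        raise NotImplementedError


class NewsAggregatorJob(MediaJob):
    period = 900

    def get_tasks(self):
        return ["https://example.com/feed1.rss", "https://example.com/feed2.rss"]

    def map(self, feed_url):
        # a real job would fetch and parse the feed; here we just tag the URL
        return [("story", feed_url)]

    def reduce(self, mapped):
        # merge per-feed results into one de-duplicated set of stories
        return {url for batch in mapped for (_, url) in batch}


def run_once(job):
    return job.reduce([job.map(task) for task in job.get_tasks()])


print(run_once(NewsAggregatorJob()))
```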

Keywords: Distributed jobs framework, news aggregation, video conversion, email.

159 Review of Strategies for Hybrid Energy Storage Management System in Electric Vehicle Application

Authors: Kayode A. Olaniyi, Adeola A. Ogunleye, Tola M. Osifeko

Abstract:

Electric Vehicles (EVs) appear to be gaining increasing patronage as a feasible alternative to Internal Combustion Engine Vehicles (ICEVs), owing to their low emissions and high operating efficiency. EV energy storage systems are required to handle high energy and power density within the constraints of limited space, operating temperature, weight and cost. The choice of strategies for energy storage evaluation, monitoring and control remains a challenging task. This paper presents a review of various energy storage technologies and recent research in battery evaluation techniques used in EV applications. It also underscores strategies for hybrid energy storage management and control schemes for the improvement of EV stability and reliability. The study reveals that, despite the advances recorded in battery technologies, there is still no single cell which possesses both the optimum power and energy densities, among other requirements, for EV application. However, combining two or more energy storage devices into a hybrid, so that the advantageous attributes of each device can be utilized, is a promising solution. The review also reveals that State-of-Charge (SoC) is the most crucial battery quantity to estimate. The conventional method of SoC measurement is, however, questioned in the literature, and adaptive algorithms that account for disturbances are being proposed. The review further suggests that heuristic-based approaches are commonly adopted in the development of strategies for hybrid energy storage system management. The alternative, optimization-based approach is found to be more accurate, but it is memory- and computation-intensive and as such is not recommended for most real-time applications.

Keywords: Hybrid electric vehicle, hybrid energy storage, battery state estimation, state of charge, state of health.

158 Application of Single Tuned Passive Filters in Distribution Networks at the Point of Common Coupling

Authors: M. Almutairi, S. Hadjiloucas

Abstract:

The harmonic distortion of voltage is important in relation to power quality due to the interaction between the large diffusion of non-linear and time-varying single-phase and three-phase loads and power supply systems. However, harmonic distortion levels can be reduced by improving the design of polluting loads or by applying mitigation arrangements and adding filters. The application of passive filters is an effective solution for harmonic mitigation, mainly because filters offer high efficiency, simplicity, and low cost. Additionally, their different possible frequency response characteristics can be used to achieve specific harmonic filtering targets. With these ideas in mind, the objective of this paper is to determine the best size of single-tuned passive filters in distribution networks, in order to economically limit harmonic violations at a given point of common coupling (PCC). This article suggests that a single-tuned passive filter could be employed in typical industrial power systems. Furthermore, constrained optimization can be used to find the optimal sizing of the passive filter in order to reduce both harmonic voltages and harmonic currents in the power system to an acceptable level and, thus, improve the load power factor. The optimization technique minimizes the voltage total harmonic distortion (VTHD) and current total harmonic distortion (ITHD) while maintaining the power factor within a specified range. According to IEEE Standard 519, both indices are treated as constraints in the optimal passive filter design problem. The performance of this technique is discussed using numerical examples taken from previous publications.
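As a rough illustration of how a single-tuned filter might be sized (not the paper's optimization procedure, and with entirely hypothetical network values), the standard textbook relations between reactive power demand, tuning harmonic and the R-L-C values can be sketched as follows:

```python
import math

# Illustrative sizing of a single-tuned (5th-harmonic) filter; all values
# below are hypothetical and not taken from the paper.
P   = 1.0e6      # active power, W
V   = 11e3       # line-to-line voltage, V
f   = 50.0       # fundamental frequency, Hz
pf1, pf2 = 0.80, 0.95
h, Qf = 5, 30    # tuned harmonic order and filter quality factor

# Reactive power the filter must supply to raise the power factor
Qc = P * (math.tan(math.acos(pf1)) - math.tan(math.acos(pf2)))

Xc = V**2 / Qc                              # capacitive reactance at fundamental
C  = 1 / (2 * math.pi * f * Xc)             # filter capacitance
L  = 1 / ((h * 2 * math.pi * f)**2 * C)     # inductance tuned to the h-th harmonic
R  = (h * 2 * math.pi * f * L) / Qf         # damping resistance from quality factor

print(f"Qc = {Qc/1e3:.1f} kvar, C = {C*1e6:.1f} uF, L = {L*1e3:.2f} mH, R = {R:.2f} ohm")
```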

Keywords: Harmonics, passive filter, power factor, power quality.

157 Wavelet-Based Data Compression Technique for Wireless Sensor Networks

Authors: P. Kumsawat, N. Pimpru, K. Attakitmongcol, A. Srikaew

Abstract:

In this paper, we propose an efficient data compression strategy exploiting the multi-resolution characteristic of the wavelet transform. We have developed a sensor node called the “Smart Sensor Node” (SSN). The main goals of the SSN design are light weight, minimal power consumption, modular design and robust circuitry. The SSN is made up of four basic components: a sensing unit, a processing unit, a transceiver unit and a power unit. The FiOStd evaluation board is chosen as the main controller of the SSN for its low cost and high performance. The software implementation was done using Simulink models and the MATLAB programming language. The experimental results show that the proposed data compression technique recovers the signal with good quality. This technique can be applied to compress the collected data so as to reduce data communication as well as the energy consumption of the sensor, and so the lifetime of the sensor node can be extended.
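A minimal sketch of the “skipped high-pass sub-band” idea, assuming the PyWavelets library and an arbitrary wavelet and decomposition level (the paper's Simulink/MATLAB implementation details are not given in the abstract):

```python
import numpy as np
import pywt

# Compress a sensor signal by keeping only the low-pass (approximation)
# sub-band of a multi-level DWT and skipping the high-pass (detail) sub-bands.
t = np.linspace(0, 1, 256)
signal = np.sin(2 * np.pi * 3 * t) + 0.05 * np.random.randn(t.size)

coeffs = pywt.wavedec(signal, "db4", level=3)                  # [cA3, cD3, cD2, cD1]
kept = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]    # skip high-pass bands

# Only coeffs[0] would be transmitted; the sink reconstructs the signal.
recovered = pywt.waverec(kept, "db4")

ratio = signal.size / coeffs[0].size
rmse = np.sqrt(np.mean((recovered[:signal.size] - signal) ** 2))
print(f"compression ratio ~ {ratio:.1f}:1, RMSE = {rmse:.4f}")
```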

Keywords: Wireless sensor network, wavelet transform, data compression, ZigBee, skipped high-pass sub-band.

156 Data Embedding Based on Better Use of Bits in Image Pixels

Authors: Rehab H. Alwan, Fadhil J. Kadhim, Ahmad T. Al-Taani

Abstract:

In this study, a novel approach to image embedding is introduced. The proposed method consists of three main steps. First, the edges of the image are detected using Sobel mask filters. Second, the least significant bit (LSB) of each pixel is used. Finally, gray-level connectivity is applied using a fuzzy approach, and the ASCII code is used for information hiding. The bit preceding the LSB represents the edge image after gray-level connectivity, and the remaining six bits represent the original image with very little difference in contrast. The proposed method embeds three images in one image and includes, as a special case of data embedding, information hiding, and identifying and authenticating text embedded within digital images. Image embedding is considered a good compression method in terms of saving memory space. Moreover, information hiding within a digital image can be used for secure information transfer. The creation and extraction of the three embedded images, and the hiding of text information, are discussed and illustrated in the following sections.
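A simplified sketch of the bit allocation described above (Sobel edge map in bit 1, an ASCII payload in the LSB, and the top six bits of the cover retained). It omits the fuzzy gray-level connectivity step and is illustrative only, not the authors' exact scheme:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

# Sobel edge map, thresholded to one bit per pixel
edges = np.hypot(ndimage.sobel(cover.astype(float), 0),
                 ndimage.sobel(cover.astype(float), 1))
edge_bit = (edges > edges.mean()).astype(np.uint8)

# ASCII text to hide, unpacked into single bits
text = "hidden message"
bits = np.unpackbits(np.frombuffer(text.encode("ascii"), dtype=np.uint8))
payload = np.zeros(cover.size, dtype=np.uint8)
payload[:bits.size] = bits

# keep 6 MSBs of the cover, store the edge map in bit 1 and the text in bit 0
stego = (cover & 0b11111100) | (edge_bit << 1) | payload.reshape(cover.shape)

# Extraction: read the LSB plane back and repack it into ASCII characters.
recovered_bits = (stego.flatten() & 1)[:bits.size]
print(np.packbits(recovered_bits).tobytes().decode("ascii"))   # "hidden message"
```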

Keywords: Image embedding, Edge detection, gray level connectivity, information hiding, digital image compression.

155 Optimization of Solar Tracking Systems

Authors: A. Zaher, A. Traore, F. Thiéry, T. Talbert, B. Shaer

Abstract:

In this paper, an intelligent approach is proposed to optimize the orientation of continuous solar tracking systems on cloudy days. Under a clear sky, direct sunlight is more important than diffuse radiation, so the panel is always pointed towards the sun. Under an overcast sky, the direct solar beam is close to zero, and the panel is placed horizontally to receive the maximum of diffuse radiation. Under partly covered conditions, the panel must be pointed towards the source that emits the maximum of solar energy, which may lie anywhere in the sky dome. Thus, the idea of our approach is to analyze images captured by a ground-based sky camera system in order to detect the zone of the sky dome that constitutes the optimal source of energy under cloudy conditions. The proposed approach is implemented using an experimental setup developed at the PROMES-CNRS laboratory in Perpignan, France. Under overcast conditions, the results were very satisfactory, and the intelligent approach provided efficiency gains of up to 9% relative to conventional continuous sun-tracking systems.
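The zone-selection idea could look roughly like the following sketch; the grid size and the mapping from image cells to tracker angles are assumptions, not the PROMES-CNRS implementation:

```python
import numpy as np

def brightest_zone(sky_image, grid=(6, 6)):
    """Split a grayscale sky image into a coarse grid and return the
    (hypothetical) elevation/azimuth of the brightest zone."""
    h, w = sky_image.shape
    gh, gw = h // grid[0], w // grid[1]
    means = np.array([[sky_image[i*gh:(i+1)*gh, j*gw:(j+1)*gw].mean()
                       for j in range(grid[1])] for i in range(grid[0])])
    i, j = np.unravel_index(np.argmax(means), means.shape)
    elevation = 90.0 * (1 - i / (grid[0] - 1))   # illustrative cell-to-angle mapping
    azimuth = 360.0 * j / (grid[1] - 1)
    return elevation, azimuth

sky = np.random.rand(120, 120)          # stand-in for a sky camera frame
print(brightest_zone(sky))
```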

Keywords: Clouds detection, fuzzy inference systems, images processing, sun trackers.

154 Movement Optimization of Robotic Arm Movement Using Soft Computing

Authors: V. K. Banga

Abstract:

Robots now play a very promising role in industry. They are commonly used in applications involving repetitive operations or where operation by a human is either risky or not feasible. In most industrial applications, robotic arm manipulators are widely used. Robotic arm manipulators with two-link or three-link structures are common because of their low number of degrees of freedom (DOF). As the DOF of a robotic arm increases, so does its complexity. The instrumentation involved in robotics plays a very important role in interacting with the outer environment. In this work, optimal control of the movement of the various DOFs of a robotic arm using soft computing techniques is presented. We discuss different robotic structures with various DOFs of arm movement, with further emphasis on the kinematics of the arm structures, i.e. forward kinematics and inverse kinematics. Trajectory planning of robotic arms using soft computing techniques demonstrates the flexibility of this approach. The performance is optimized over all possible input values, resulting in optimized movement as the output. In conclusion, soft computing plays a very important role in achieving optimized movement of a robotic arm, and it requires only very limited knowledge of the system to implement.
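For concreteness, a planar two-link arm with hypothetical link lengths illustrates the forward and inverse kinematics that a soft-computing planner would be trained to approximate:

```python
import numpy as np

L1, L2 = 1.0, 0.8   # illustrative link lengths

def forward(theta1, theta2):
    """Forward kinematics: joint angles -> end-effector position."""
    x = L1 * np.cos(theta1) + L2 * np.cos(theta1 + theta2)
    y = L1 * np.sin(theta1) + L2 * np.sin(theta1 + theta2)
    return x, y

def inverse(x, y):
    """Closed-form inverse kinematics (elbow-down solution)."""
    c2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
    theta2 = np.arccos(np.clip(c2, -1.0, 1.0))
    theta1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(theta2),
                                           L1 + L2 * np.cos(theta2))
    return theta1, theta2

x, y = forward(0.4, 0.7)
print(inverse(x, y))   # recovers (0.4, 0.7) up to the elbow-up/down ambiguity
```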

Keywords: Artificial intelligence, kinematics, robotic arm, neural networks, fuzzy logic.

153 Implementation of Adder-Subtracter Design with Verilog HDL

Authors: May Phyo Thwal, Khin Htay Kyi, Kyaw Swar Soe

Abstract:

With the increasing density of chips, designers are trying to put as many computational and storage facilities as possible on a single chip. Along with the complexity of computational and storage circuits, designing, testing and debugging become more and more complex and expensive. Hardware designs are therefore built using a hardware description language, which is more efficient and cost effective. This paper focuses on the implementation of a 32-bit ALU design based on the Verilog hardware description language. The adder and subtracter operate correctly on both unsigned and positive numbers. In the ALU, addition takes most of the time if it uses a ripple-carry adder. The general strategy for designing fast adders is to reduce the time required to form carry signals. Adders that use this principle are called carry look-ahead adders. The carry look-ahead adder is designed as a combination of 4-bit adders. The syntax of Verilog HDL is similar to that of the C programming language. This paper proposes a unified approach to ALU design in which both simulation and formal verification can co-exist.
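The carry look-ahead principle can be modelled in software as follows: a Python sketch of the 4-bit generate/propagate equations (the paper's actual design is written in Verilog HDL):

```python
def cla4(a, b, cin=0):
    """Add two 4-bit values using the carry look-ahead equations."""
    g = [((a >> i) & 1) & ((b >> i) & 1) for i in range(4)]   # generate bits
    p = [((a >> i) & 1) ^ ((b >> i) & 1) for i in range(4)]   # propagate bits
    c0 = cin
    # carries are formed directly from g and p instead of rippling
    c1 = g[0] | (p[0] & c0)
    c2 = g[1] | (p[1] & g[0]) | (p[1] & p[0] & c0)
    c3 = g[2] | (p[2] & g[1]) | (p[2] & p[1] & g[0]) | (p[2] & p[1] & p[0] & c0)
    c4 = (g[3] | (p[3] & g[2]) | (p[3] & p[2] & g[1])
          | (p[3] & p[2] & p[1] & g[0]) | (p[3] & p[2] & p[1] & p[0] & c0))
    s = [p[i] ^ c for i, c in enumerate((c0, c1, c2, c3))]    # sum bits
    return sum(bit << i for i, bit in enumerate(s)), c4       # 4-bit sum, carry out

# exhaustive check against ordinary integer addition
assert all(cla4(a, b)[0] | (cla4(a, b)[1] << 4) == a + b
           for a in range(16) for b in range(16))
```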

Keywords: Addition, arithmetic logic unit, carry look-ahead adder, Verilog HDL.

152 Development of Mobile Application Social Guidance and Counseling for Junior High School

Authors: Suyoto, Tri Prasetyaningrum

Abstract:

In this paper, we present the development of a mobile application for Social Guidance and Counseling (GC) called “m-NingBK: Social GC”. The application is used for GC services that run on mobile devices and is designed specifically for junior high school students. The method is a combination of interactive multimedia approaches and educational psychology. The design process therefore involves three steps: digitizing the social GC service material, visualizing it appropriately, and making it interactive. This method is intended to make students not only hear and see but also “do” in a virtual setting. Five components are used in the “m-NingBK: Social GC” multimedia application: text, images/graphics, audio/sound, animation and video. The application provides four menus: potential self, social, expert system and about. It is built using the Java programming language and was tested on a smartphone running the Android operating system. Based on the test, respondents rated it as follows: 16.7% excellent, 61.1% good, 19.4% adequate, and 2.8% poor.

Keywords: Expert Systems, Guidance and Counseling, mobile application, multimedia.

151 Dynamic State Estimation with Optimal PMU and Conventional Measurements for Complete Observability

Authors: M. Ravindra, R. Srinivasa Rao

Abstract:

This paper presents a Generalized Binary Integer Linear Programming (GBILP) method for the optimal allocation of Phasor Measurement Units (PMUs) and for generating a Dynamic State Estimation (DSE) solution with complete observability. The GBILP method is formulated with Zero Injection Bus (ZIB) constraints to reduce the number of PMU placement locations in the normal case and under single-line contingency. The integration of PMU and conventional measurements is modeled in the DSE process to estimate accurate system states. To assess the dynamic behavior of the power system with the proposed method, a load change of up to 40% is considered at a bus in the network. The proposed DSE method is compared with the traditional Weighted Least Squares (WLS) state estimation method in the presence of load changes to show the impact of PMU measurements. MATLAB simulations are carried out on the IEEE 14, 30, 57, and 118 bus systems to prove the validity of the proposed approach.
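A minimal sketch of the kind of placement ILP involved, for a small hypothetical 7-bus network and without the ZIB or contingency constraints of the GBILP formulation (uses SciPy's milp, available from SciPy 1.9):

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Minimize the number of PMUs such that every bus is observed by a PMU placed
# at itself or at an adjacent bus. The 7-bus topology below is illustrative.
edges = [(0, 1), (0, 4), (1, 2), (1, 5), (2, 3), (3, 6), (4, 5), (5, 6)]
n = 7
A = np.eye(n)
for i, j in edges:
    A[i, j] = A[j, i] = 1          # a PMU at bus j observes its neighbor i

res = milp(c=np.ones(n),                                        # minimize PMU count
           constraints=LinearConstraint(A, lb=np.ones(n), ub=np.full(n, np.inf)),
           integrality=np.ones(n),                               # binary variables
           bounds=Bounds(0, 1))
print("PMU buses:", np.flatnonzero(res.x > 0.5))
```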

Keywords: Observability, phasor measurement units, PMU, state estimation, dynamic state estimation, SCADA measurements, zero injection bus.

150 Computer Proven Correctness of the Rabin Public-Key Scheme

Authors: Johannes Buchmann, Markus Kaiser

Abstract:

We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a light-weight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we obtain reliable proofs with a minimal error rate that augment the underlying database. This provides a formal basis for more computer proof constructions in this area.
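For reference, the scheme being verified is simple to state. A toy-sized Python sketch of Rabin encryption and decryption (illustrative primes; Python 3.8+ for the modular inverses):

```python
# Rabin scheme: c = m^2 mod n, with n = p*q and p, q both congruent to 3 mod 4.
p, q = 7, 11                 # toy primes, for illustration only
n = p * q

def encrypt(m):
    return (m * m) % n

def decrypt(c):
    # square roots modulo p and q (valid because p, q = 3 mod 4)
    mp = pow(c, (p + 1) // 4, p)
    mq = pow(c, (q + 1) // 4, q)
    yp = pow(p, -1, q)        # p^-1 mod q
    yq = pow(q, -1, p)        # q^-1 mod p
    r = (yq * q * mp + yp * p * mq) % n
    s = (yq * q * mp - yp * p * mq) % n
    return sorted({r, n - r, s, n - s})   # the four candidate plaintexts

c = encrypt(20)
print(c, decrypt(c))          # 20 appears among the four square roots of c
```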

Keywords: public-key encryption, Rabin public-key scheme, formal proof system, higher-order logic, formal verification.

149 Towards a New Methodology for Developing Web-Based Systems

Authors: Omer Ishag Eldai, Ahmed Hassan M. H. Ali, S. Raviraja

Abstract:

Web-based systems have become increasingly important because the Internet and the World Wide Web have become ubiquitous, surpassing all other technological developments in our history. The Internet, and especially company websites, have rapidly evolved in scope and extent of use, from being little more than fixed advertising material, i.e. a “web presence” with no particular influence on the company's business, to being one of the most essential parts of the company's core business. Traditional software engineering approaches with process models such as, for example, CMM and the Waterfall model do not work very well, since web system development differs from traditional development. The development differs in several ways: for example, there is a large gap between traditional software engineering designs and concepts and the low-level implementation model, and many web-based system development activities are business-oriented (for example, web applications are sales-oriented and intranets are content-oriented) rather than engineering-oriented. This paper introduces the Increment Iterative eXtreme Programming (IIXP) methodology for developing web-based systems. In contrast to existing methodologies, it is a combination of traditional and modern software engineering and web engineering principles.

Keywords: Web based systems, Web engineering.

148 Low Energy Method for Data Delivery in Ubiquitous Network

Authors: Tae Kyung Kim, Hee Suk Seo

Abstract:

Recent advances in wireless sensor networks have led to many routing methods designed for energy efficiency. Although many routing methods have been proposed for ubiquitous sensor networks (USNs), no single routing method can be energy-efficient if the environment of the network varies. We also consider a network security model that controls network access to the various hosts and the services they offer, rather than securing them one by one. When ubiquitous sensor networks are deployed in hostile environments, an adversary may compromise some sensor nodes and use them to inject false sensing reports. False reports can lead not only to false alarms but also to the depletion of the limited energy resources of battery-powered networks. The interleaved hop-by-hop authentication scheme detects such false reports through interleaved authentication. This paper presents an LMDD (low-energy method for data delivery) algorithm that provides energy efficiency by dynamically changing the protocols installed at the sensor nodes. The algorithm changes protocols based on the output of fuzzy logic, which gives the fitness level of each protocol for the current environment.

Keywords: Data delivery, routing, simulation.

147 The Necessity to Standardize Procedures of Providing Engineering Geological Data for Designing Road and Railway Tunneling Projects

Authors: Atefeh Saljooghi Khoshkar, Jafar Hassanpour

Abstract:

One of the main problems at the design stage of many tunneling projects is the lack of an appropriate standard for providing engineering geological data in a predefined format. This is particularly evident in highway and railroad tunnel projects, in which a number of tunnels and different professional teams are involved. In this regard, comprehensive software needs to be designed, using accepted methods, to help engineering geologists prepare standard reports that contain sufficient input data for the design stage. To address this need, an applied software tool has been designed using the macro capabilities of Microsoft Excel and the Visual Basic for Applications (VBA) programming language. In this software, all of the engineering geological input data required for designing different parts of tunnels, such as discontinuity properties, rock mass strength parameters, rock mass classification systems, boreability classification, penetration rate and so forth, can be calculated and reported in a standard format.

Keywords: Engineering geology, rock mass classification, rock mechanic, tunnel.

146 Performance Evaluation of Iris Region Detection and Localization for Biometric Identification System

Authors: Chit Su Htwe, Win Htay

Abstract:

Iris recognition technology is the most accurate, fast and least invasive biometric technique compared with others that use, for example, fingerprints, face, retina, hand geometry, voice or signature patterns. The system developed in this study has the potential to play a key role in high-risk security areas and can provide organizations with a fast and secure means of granting access to such areas only to authorized personnel. The aim of this paper is to perform iris region detection and to localize the inner and outer iris boundaries. The system was implemented on the Windows platform using the C# programming language, which is an easy and efficient tool for image processing and achieves good accuracy. The system includes two main parts: the first preprocesses the iris images using Canny edge detection and segments the iris region from the rest of the image; the second determines the location of the iris boundaries by applying the Hough transform. The proposed system was tested on 756 iris images from 60 eyes of the CASIA iris database.
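A rough equivalent of the two stages in Python with OpenCV (the paper's system is written in C#; the synthetic eye image and all Canny/Hough parameters below are illustrative, not the authors'):

```python
import cv2
import numpy as np

# Synthetic eye-like image: bright background, darker iris, dark pupil.
img = np.full((240, 240), 200, np.uint8)
cv2.circle(img, (120, 120), 70, 120, -1)          # iris (outer boundary)
cv2.circle(img, (120, 120), 30, 30, -1)           # pupil (inner boundary)

blurred = cv2.medianBlur(img, 5)
edges = cv2.Canny(blurred, 50, 100)               # preprocessing / edge map

# Hough circle transform: small radii ~ pupil, larger radii ~ limbus.
inner = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=200,
                         param1=100, param2=20, minRadius=15, maxRadius=45)
outer = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=200,
                         param1=100, param2=20, minRadius=50, maxRadius=100)
print("inner boundary:", inner)
print("outer boundary:", outer)
```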

Keywords: Canny, C#, hough transform, image preprocessing.

145 Perspectives and Outcomes of a Long and Shorter Community Mental Health Program

Authors: Danielle Klassen, Reiko Yeap, Margo Schmitt-Boshnick, Scott Oddie

Abstract:

The development of the 7-week Alberta Happiness Basics program was initiated in 2010 in response to the need for community mental health programming. This province-wide program aims to increase overall happiness and reduce negative thoughts and feelings through a positive psychology intervention. While the 7-week program has proven effective, a shortened 4-week program has also been developed to address client needs. In this study, participants were interviewed to determine whether the 4- and 7-week programs had similar success in producing lasting behavior change at 3, 6, and 9 months post-program. A health quality of life (HQOL) measure was also used to compare the two programs and examine patient outcomes. Quantitative and qualitative analyses showed significant improvements in HQOL and sustainable behavior change for both programs. The findings indicate that the shorter, patient-centered program was effective in increasing happiness and reducing negative thoughts and feelings.

Keywords: Primary care, mental health, depression, short duration.

144 Autohydrolysis Treatment of Olive Cake to Extract Fructose and Sucrose

Authors: G. Blázquez, A. Gálvez-Pérez, M. Calero, I. Iáñez-Rodríguez, M. A. Martín-Lara, A. Pérez

Abstract:

The production of olive oil is considered one of the most important agri-food industries. However, some of the by-products generated in the process are potential pollutants and cause environmental problems. Consequently, the management of these by-products is currently considered a challenge for the olive oil industry. In this context, several technologies have been developed and tested, and the autohydrolysis of these by-products could be considered a promising technique. Therefore, this study focused on autohydrolysis treatments of a solid residue from the olive oil industry called olive cake, which comes from olive pomace extraction with hexane. Firstly, water washing was carried out to eliminate the water-soluble compounds. Then, an experimental design was developed for the autohydrolysis experiments carried out in a hydrothermal pressure reactor. The studied variables were temperature (30, 60 and 90 ºC) and time (30, 60 and 90 min). Aliquots of the obtained liquid fractions were analysed by HPLC to determine their fructose and sucrose contents. Finally, the sugar contents and the yields of the different experiments were fitted to a neuro-fuzzy model and to a polynomial model.

Keywords: ANFIS, olive cake, polyols, saccharides.

143 Genetic Folding: Analyzing the Mercer's Kernels Effect in Support Vector Machine using Genetic Folding

Authors: Mohd A. Mezher, Maysam F. Abbod

Abstract:

Genetic Folding (GF), a new class of evolutionary algorithm (EA), is introduced for the first time. It is based on chromosomes composed of floating genes structurally organized in a parent form and separated by dots. The genotype/phenotype system of GF generates a kernel expression, which is the objective function of the resulting classifier. In this work, the question of satisfying the mapping rules in evolving populations is addressed by analyzing populations that either satisfy or violate Mercer's condition. The results presented here show that populations satisfying Mercer's condition practically improve model selection for the Support Vector Machine (SVM). The experiment is trained on a multi-classification problem and tested on the nonlinear Ionosphere dataset. The aim of this paper is to answer the question of whether evolving kernels that satisfy Mercer's condition, as opposed to those that do not, benefits genetic folding when applied to complicated domains and problems.
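The Mercer check at the heart of the comparison can be illustrated independently of the GF algorithm: a candidate kernel is Mercer-admissible on a sample if its Gram matrix is positive semi-definite, and such a kernel can be plugged into an SVM. A sketch, using a stand-in dataset rather than Ionosphere and a hand-written kernel expression rather than an evolved one:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer   # stand-in for Ionosphere
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def candidate_kernel(X, Y):
    # a GF-style composed expression: a polynomial term plus an RBF term
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return (X @ Y.T + 1.0) ** 2 + np.exp(-0.5 * sq)

X, y = load_breast_cancer(return_X_y=True)
X = (X - X.mean(0)) / X.std(0)

# Mercer condition on the sample: Gram matrix must be positive semi-definite.
gram = candidate_kernel(X, X)
print("Mercer condition satisfied on sample:", np.linalg.eigvalsh(gram).min() > -1e-8)

# Plug the custom kernel into an SVM and evaluate it.
scores = cross_val_score(SVC(kernel=candidate_kernel), X, y, cv=5)
print("CV accuracy:", scores.mean())
```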

Keywords: Genetic Folding, GF, Evolutionary Algorithms, Support Vector Machine, Genetic Algorithm, Genetic Programming, Multi-Classification, Mercer's Rules

142 Swarmed Discriminant Analysis for Multifunction Prosthesis Control

Authors: Rami N. Khushaba, Ahmed Al-Ani, Adel Al-Jumaily

Abstract:

One of the approaches enabling people with amputated limbs to establish some sort of interface with the real world includes the utilization of the myoelectric signal (MES) from the remaining muscles of those limbs. The MES can be used as a control input to a multifunction prosthetic device. In this control scheme, known as myoelectric control, a pattern recognition approach is usually utilized to discriminate between the MES signals that belong to different classes of forearm movements. Since the MES is recorded using multiple channels, the feature vector size can become very large. In order to reduce the computational cost and enhance the generalization capability of the classifier, a dimensionality reduction method is needed to identify an informative yet moderate-size feature set. This paper proposes a new fuzzy version of the well-known Fisher's Linear Discriminant Analysis (LDA) feature projection technique. Furthermore, based on the fact that certain muscles might contribute more to the discrimination process, a novel feature weighting scheme is also presented by employing Particle Swarm Optimization (PSO) for estimating the weight of each feature. The new method, called PSOFLDA, is tested on real MES datasets and compared with other techniques to prove its superiority.

Keywords: Discriminant Analysis, Pattern Recognition, Signal Processing.

141 A Bi-Objective Model for Location-Allocation Problem within Queuing Framework

Authors: Amirhossein Chambari, Seyed Habib Rahmaty, Vahid Hajipour, Aida Karimi

Abstract:

This paper proposes a bi-objective model for the facility location problem under a congestion system. The model is motivated by applications such as locating servers for bank automated teller machines (ATMs), communication networks, and so on. It is particularly suited to situations in which fixed service facilities are congested by stochastic demand within a queueing framework. We formulate the model from two perspectives simultaneously: (i) the customers and (ii) the service provider. The objectives are to minimize (i) the total expected travelling and waiting time and (ii) the average facility idle time. The model is a mixed-integer nonlinear programming problem belonging to the class of NP-hard problems. To solve it, two metaheuristic algorithms, the non-dominated sorting genetic algorithm (NSGA-II) and the non-dominated ranking genetic algorithm (NRGA), are proposed. Finally, to evaluate the performance of the two algorithms, numerical examples are produced and analyzed with several metrics to determine which algorithm works better.
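A sketch of how the two objectives might be evaluated for one candidate allocation under simple M/M/1 assumptions (the paper's exact queueing model and the NSGA-II/NRGA search are not reproduced here; all numbers are hypothetical):

```python
import numpy as np

def objectives(assign, travel, demand, mu):
    """assign[i] = index of the facility serving customer node i."""
    lam = np.zeros(mu.size)
    np.add.at(lam, assign, demand)                  # arrival rate per facility
    if np.any(lam >= mu):                           # unstable queue -> infeasible
        return np.inf, np.inf
    wait = 1.0 / (mu - lam)                         # M/M/1 expected time in system
    total_time = np.sum(demand * (travel[np.arange(len(assign)), assign]
                                  + wait[assign]))  # objective (i)
    idle = np.mean(1.0 - lam / mu)                  # objective (ii): idle fraction
    return total_time, idle

travel = np.array([[1., 4.], [2., 3.], [5., 1.], [4., 2.]])   # node-to-facility times
demand = np.array([2., 1., 3., 2.])                            # arrival rates
mu = np.array([6., 7.])                                        # service rates
print(objectives(np.array([0, 0, 1, 1]), travel, demand, mu))
```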

Keywords: Queuing, Location, Bi-objective, NSGA-II, NRGA

140 RS Based SCADA System for Longer Distance Powered Devices

Authors: Harkishen Singh, Gavin Mangeni

Abstract:

This project aims at building an efficient and automatic power-monitoring SCADA system capable of monitoring, in real time, the electrical parameters of high-voltage powered devices, for example RMS voltage and current, frequency, energy consumed, power factor, etc. The system uses the RS-485 serial communication interface to transfer data over longer distances. Embedded C programming is the platform used to develop two hardware modules, namely the RTU and Master Station modules, both of which use the CC2540 BLE 4.0 microcontroller configured in slave/master mode. The galvanically isolated Si8900 microchip is used to perform the ADC externally. The hardware communicates via a UART port and sends data to the user PC through the USB port. LabVIEW software is used to design a user interface that displays the current state of the power loads being monitored and logs data to an Excel spreadsheet file. An understanding of the Si8900's auto baud rate process is key to the successful implementation of this project.

Keywords: SCADA, RS485, CC2540, Labview, Si8900.

139 The Security Trade-Offs in Resource Constrained Nodes for IoT Application

Authors: Sultan Alharby, Nick Harris, Alex Weddell, Jeff Reeve

Abstract:

The concept of the Internet of Things (IoT) has received much attention over the last five years. It is predicted that the IoT will influence every aspect of our lifestyles in the near future. Wireless Sensor Networks are one of the key enablers of the operation of the IoT, allowing data to be collected from the surrounding environment. However, due to limited resources, the nature of deployment and unattended operation, a WSN is vulnerable to various types of attack. Security is paramount for reliable and safe communication between IoT embedded devices, but it does, however, come at a cost to resources. Nodes are usually equipped with small batteries, which makes energy conservation crucial to IoT devices. Nevertheless, security cost in terms of energy consumption has not been studied sufficiently. Previous research has used the security specification of 802.15.4 for IoT applications, but the energy cost of each security level and the impact on quality of service (QoS) parameters remain unknown. This research focuses on the cost of security at the IoT media access control (MAC) layer. It begins by studying the energy consumption of the IEEE 802.15.4 security levels, which is followed by an evaluation of the impact of security on data latency and throughput; it then presents the impact of transmission power on security overhead, and finally shows the effects of security on memory footprint. The results show that the security overhead in terms of energy consumption with a payload of 24 bytes ranges from 31.5% at the minimum security level to 60.4% at the top security level of the 802.15.4 specification, relative to non-secure packets. They also show that the security cost has a smaller impact for longer packets and a larger impact for smaller packet sizes. In addition, the results depict a significant impact on data latency and throughput. Overall, the maximum authentication length decreases throughput by almost 53%, and encryption plus authentication together decrease it by almost 62%.
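As a rough byte-level illustration only (the percentages above are measured energy overheads, not just extra bytes): the MIC added by each authentication level enlarges the transmitted frame. The sketch assumes a hypothetical 11-byte MAC header, 2-byte FCS and 24-byte payload, and ignores the auxiliary security header and the CPU cost of AES-CCM*:

```python
# Extra transmitted bytes per frame for the 802.15.4 MIC sizes (4/8/16 bytes).
header, fcs, payload = 11, 2, 24          # assumed frame fields, bytes
mic_bytes = {"MIC-32": 4, "MIC-64": 8, "MIC-128": 16}

base = header + payload + fcs
for level, mic in mic_bytes.items():
    overhead = 100.0 * mic / base
    print(f"{level}: frame {base + mic} B, +{overhead:.1f}% bytes over non-secure")
```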

Keywords: Internet of Things, IEEE 802.15.4, security cost evaluation, wireless sensor network, energy consumption.

138 A Study on the Factors Affecting Student Behavior Intention to Attend Robotics Courses at the Primary and Secondary School Levels

Authors: Jingwen Shan

Abstract:

In order to explore the key factors affecting school students' intention to learn robot programming, this study takes the Technology Acceptance Model (TAM) as its theoretical basis and invites 167 students from the Jiading District of Shanghai as research subjects. A model of students' learning behavior in robotics courses is constructed. By verifying the causal path relationships between the variables, it is concluded that teachers can enhance students' perceived usefulness of robotics courses by strengthening subjective norms and perceived entertainment and by reducing technical anxiety, for example by focusing on gradual progress in programming and analyzing learner characteristics. Students can improve perceived ease of use by enhancing self-efficacy. At the same time, robot hardware designers can optimize for entertainment and interactivity, which will directly or indirectly increase the intention to learn in robotics courses. By changing these factors, the learning behavior of primary and secondary school students can be made more sustainable.

Keywords: TAM, learning behavior intentions, robot courses, primary and secondary school students.

137 Modeling and Design of an Active Leg Orthosis for Tumble Protection

Authors: Eileen Chih-Ying Yang, Liang-Han Wu, Chieh-Min Chang

Abstract:

The design of an active leg orthosis for tumble protection is proposed in this paper. The orthosis would be applied to assist elders or invalids in regaining balance when they fall unexpectedly. We observe the balance-regaining motion of healthy young people and identify how it differs from that of elders or invalids. First, a physical model of the leg is established, in which leg motions are achieved through four joints (phalanx stem, ankle, knee, and hip joint) and five links (phalanges, talus, tibia, femur, and hip bone). To formulate the dynamic equations, coordinates that clearly describe the position in 3D space are first defined in accordance with human leg movement, and the kinematics and dynamics of the leg movement are then formulated based on robotics. For the purpose of assisting elders and invalids in avoiding a tumble, the posture variation during loss and recovery of balance is recorded by a motion-capture image system, and this trajectory is taken as the desired one. We then calculate the force and moment at each joint from the leg motion model using MATLAB code. The results provide primary information for the design of the active leg orthosis for tumble protection.

Keywords: Active leg orthosis, Tumble protection

136 A Review on Comparative Analysis of Path Planning and Collision Avoidance Algorithms

Authors: Divya Agarwal, Pushpendra S. Bharti

Abstract:

Autonomous mobile robots (AMRs) are expected to serve as smart tools for operations in every automation industry. Path planning and obstacle avoidance are the backbone of an AMR, as the robot has to reach its goal location while avoiding obstacles and traversing an optimized path defined according to criteria such as distance, time or energy. Path planning can be classified into global and local path planning, where environmental information is known and unknown/partially known, respectively. A number of sensors are used for data collection. Algorithms such as the artificial potential field (APF), rapidly exploring random trees (RRT), bidirectional RRT, fuzzy approaches, Pure Pursuit, the A* algorithm, the vector field histogram (VFH) and modified local path planning algorithms have been used over the last three decades for path planning and obstacle avoidance in AMRs. This paper attempts to review some of the path planning and obstacle avoidance algorithms used in the field of AMRs. The review includes a comparative analysis of simulations and mathematical computations of path planning and obstacle avoidance algorithms using MATLAB 2018a. From the review, it can be concluded that different algorithms may complete the same task (i.e. with a different set of instructions) in more or less time, space and effort.
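As one concrete example from the list above, a minimal artificial potential field (APF) planner; the gains, influence radius and toy obstacle map are illustrative:

```python
import numpy as np

def apf_path(start, goal, obstacles, k_att=1.0, k_rep=100.0, rho0=1.0,
             step=0.05, max_iter=2000):
    """Gradient-descent APF planner on attractive + repulsive potentials."""
    pos, path = np.array(start, float), [np.array(start, float)]
    goal = np.array(goal, float)
    for _ in range(max_iter):
        force = k_att * (goal - pos)                       # attractive term
        for obs in np.atleast_2d(obstacles):
            diff = pos - obs
            rho = np.linalg.norm(diff)
            if 1e-9 < rho < rho0:                          # repulsive term
                force += k_rep * (1/rho - 1/rho0) / rho**3 * diff
        pos = pos + step * force / (np.linalg.norm(force) + 1e-9)
        path.append(pos.copy())
        if np.linalg.norm(goal - pos) < 0.1:               # goal reached
            break
    return np.array(path)

path = apf_path(start=(0, 0), goal=(5, 5), obstacles=[(2.5, 2.6), (3.5, 4.0)])
print(len(path), path[-1])
```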

Keywords: Autonomous mobile robots, obstacle avoidance, path planning, and processing time.

135 Introductory Design Optimisation of a Machine Tool using a Virtual Machine Concept

Authors: Johan Wall, Johan Fredin, Anders Jönsson, Göran Broman

Abstract:

Designing modern machine tools is a complex task. A simulation tool to aid the design work, a virtual machine, has therefore been developed in earlier work. The virtual machine considers the interaction between the mechanics of the machine (including structural flexibility) and the control system. This paper exemplifies the usefulness of the virtual machine as a tool for product development. An optimisation study is conducted aiming at improving the existing design of a machine tool regarding weight and manufacturing accuracy at maintained manufacturing speed. The problem can be categorised as constrained multidisciplinary multiobjective multivariable optimisation. Parameters of the control and geometric quantities of the machine are used as design variables. This results in a mix of continuous and discrete variables, and an optimisation approach using a genetic algorithm is therefore deployed. The accuracy objective is evaluated according to international standards. The complete system model shows non-deterministic behaviour, and a strategy to handle this based on statistical analysis is suggested. The weight of the main moving parts is reduced by more than 30 per cent and the manufacturing accuracy is improved by more than 60 per cent compared to the original design, with no reduction in manufacturing speed. It is also shown that interaction effects exist between the mechanics and the control, i.e. this improvement would most likely not have been possible with a conventional sequential design approach within the same time, cost and general resource frame. This indicates the potential of the virtual machine concept for contributing to improved efficiency of both complex products and the development process for such products. Companies incorporating such advanced simulation tools in their product development could thus improve their own competitiveness as well as contribute to improved resource efficiency of society at large.

Keywords: Machine tools, Mechatronics, Non-deterministic, Optimisation, Product development, Virtual machine

134 A Study on Early Prediction of Fault Proneness in Software Modules using Genetic Algorithm

Authors: Parvinder S. Sandhu, Sunil Khullar, Satpreet Singh, Simranjit K. Bains, Manpreet Kaur, Gurvinder Singh

Abstract:

Fault-proneness of a software module is the probability that the module contains faults. To predict the fault-proneness of modules, different techniques have been proposed, including statistical methods, machine learning techniques, neural network techniques and clustering techniques. The aim of the proposed study is to explore whether metrics available in the early lifecycle (i.e. requirement metrics), metrics available in the late lifecycle (i.e. code metrics), and a combination of the two can be used to identify fault-prone modules using a Genetic Algorithm technique. The approach has been tested on real defect datasets of NASA software projects written in the C programming language. The results show that the fusion of requirement and code metrics gives the best prediction model for detecting faults, compared with the commonly used code-based model.

Keywords: Genetic Algorithm, Fault Proneness, Software Fault and Software Quality.

133 Extended Well-Founded Semantics in Bilattices

Authors: Daniel Stamate

Abstract:

One of the most used assumptions in logic programming and deductive databases is the so-called Closed World Assumption (CWA), according to which the atoms that cannot be inferred from the programs are considered to be false (i.e. a pessimistic assumption). One of the most successful semantics of conventional logic programs based on the CWA is the well-founded semantics. However, the CWA is not applicable in all circumstances when information is handled. That is, the well-founded semantics, if conventionally defined, would behave inadequately in different cases. The solution we adopt in this paper is to extend the well-founded semantics in order for it to be based also on other assumptions. The basis of (default) negative information in the well-founded semantics is given by the so-called unfounded sets. We extend this concept by considering optimistic, pessimistic, skeptical and paraconsistent assumptions, used to complete missing information from a program. Our semantics, called extended well-founded semantics, expresses also imperfect information considered to be missing/incomplete, uncertain and/or inconsistent, by using bilattices as multivalued logics. We provide a method of computing the extended well-founded semantics and show that Kripke-Kleene semantics is captured by considering a skeptical assumption. We show also that the complexity of the computation of our semantics is polynomial time.

Keywords: Logic programs, imperfect information, multivalued logics, bilattices, assumptions.

132 School Architecture of the Future Supported by Evidence-Based Design and Design Patterns

Authors: Pedro Padilha Gonçalves, Doris C. C. K. Kowaltowski, Benjamin Cleveland

Abstract:

Trends in education affect schooling and need to be incorporated into design concepts so that appropriate and stimulating environments support the desired learning processes. A design process for school architecture demands research, debate, reflection, and efficient decision-making methods. This paper presents research on evidence-based design for middle schools, based on a systematic literature review and the elaboration of a set of architectural design patterns, through a graphic translation of new concepts for classroom configurations, to support programming debates and the synthesis phase of design. The investigation resulted in nine patterns that address boundaries, flexibility, levels of openness, mindsets, neighborhoods, movement and interaction, territories, opportunities for learning, and sightlines for classrooms. The research is part of a continuing investigation of design methods in contemporary school architecture, aimed at producing an architectural pattern matrix based on scientific information translated into an insightful graphic design language.

Keywords: School architecture, design process, design patterns, evidence-based design.

131 Estimation of Real Power Transfer Allocation Using Intelligent Systems

Authors: H. Shareef, A. Mohamed, S. A. Khalid, Aziah Khamis

Abstract:

This paper presents the application of artificial intelligence (AI) techniques, namely the artificial neural network (ANN) and the adaptive neuro-fuzzy inference system (ANFIS), to estimate the real power transfer between generators and loads. Since these AI techniques adopt supervised learning, the modified nodal equations (MNE) method is first used to determine the real power contribution of each generator to the loads. The results of the MNE method and load flow information are then used to estimate the power transfer using the AI techniques. The 25-bus equivalent system of south Malaysia is used as a test system to illustrate the effectiveness of both AI methods compared with the MNE method. The mean squared errors of the ANN and ANFIS power transfer allocation estimates are 1.19E-05 and 2.97E-05, respectively. Furthermore, the ANN and ANFIS methods compute the generator contributions to loads within 20.99 and 39.37 ms, respectively, whereas the MNE method takes 360 ms for the same real power transfer allocation calculation.
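A sketch of the supervised setup only: the inputs are load-flow quantities and the targets are the generator-to-load contributions computed by the MNE method. The arrays below are random placeholders standing in for those results, and the network size is arbitrary:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((500, 12))          # e.g. generation, load and line-flow features
W = rng.random((12, 3))
y = X @ W                          # placeholder for MNE-computed contributions

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)                # supervised training on MNE-labelled data
mse = np.mean((ann.predict(X_te) - y_te) ** 2)
print("test MSE:", mse)
```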

Keywords: Artificial intelligence, Power tracing, Artificial neural network, ANFIS, Power system deregulation.
