Search results for: theoretical framework.
328 Field-Programmable Gate Array Based Tester for Protective Relay
Authors: H. Bentarzi, A. Zitouni
Abstract:
The reliability of the power grid depends on the successful operation of thousands of protective relays. The failure of a single relay to operate as intended may lead the entire power grid to blackout. In fact, major power system failures during transient disturbances may be caused by unnecessary protective relay tripping rather than by the failure of a relay to operate. Adequate relay testing provides a first line of defense against false trips of the relay and hence improves power grid stability and prevents catastrophic bulk power system failures. The goal of this research project is to design and enhance a relay tester based on a Field Programmable Gate Array (FPGA) card, the NI 7851. A PC-based tester framework has been developed using a Simulink power system model to generate signals under different conditions (faults or transient disturbances) and LabVIEW to develop the graphical user interface and configure the FPGA. In addition, an interface system has been developed for outputting and amplifying the signals without distortion. These signals should resemble those generated by the real power system and be large enough to test the relay’s functionality. The generated signals, displayed on the oscilloscope, are satisfactory. Furthermore, the proposed testing system can be used to improve the performance of protective relays.
Keywords: Amplifier class D, FPGA, protective relay, tester.
327 Modeling Residential Electricity Consumption Function in Malaysia: Time Series Approach
Authors: L. L. Ivy-Yap, H. A. Bekhet
Abstract:
As Malaysian residential electricity consumption continues to increase rapidly, effective energy policies that address the factors affecting residential electricity consumption are urgently needed. This study investigates the relationship between residential electricity consumption (EC), real disposable income (Y), price of electricity (Pe), and population (Po) in Malaysia for the 1978-2011 period. Unlike previous studies on Malaysia, the current study focuses on the residential sector, a sector that is important for the formulation of energy policy. The Phillips-Perron (P-P) unit root test is employed to infer the stationarity of each variable, while the bounds test is executed to determine the existence of a co-integration relationship among the variables, modelled in an Autoregressive Distributed Lag (ARDL) framework. The CUSUM and CUSUM-of-squares tests are applied to ensure the stability of the model. The results suggest the existence of a long-run equilibrium relationship and bidirectional Granger causality between EC and the macroeconomic variables. The empirical findings will help policy makers in Malaysia develop new standards for monitoring energy consumption. As electricity consumption is a major contributing factor in economic growth and CO2 emissions, more careful planning is needed in Malaysia to attain future targets and cut emissions.
Keywords: Co-integration, Elasticity, Granger causality, Malaysia, Residential electricity consumption.
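A rough illustration of the unit-root and Granger-causality steps described above, on synthetic annual series; this is a sketch of the workflow, not the paper's data or exact tests (the ADF test stands in for the Phillips-Perron test, and the series, lag order, and coefficients are illustrative assumptions).

import numpy as np
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

# Synthetic annual stand-ins for log electricity consumption and log income (34 obs, 1978-2011)
rng = np.random.default_rng(0)
n = 34
income = np.cumsum(rng.normal(0.03, 0.02, n))                      # trending, I(1)-like series
consumption = 0.8 * income + np.cumsum(rng.normal(0.02, 0.02, n))

# Unit-root check (ADF here as a stand-in; the paper uses the Phillips-Perron test)
print("ADF p-value, consumption level:", adfuller(consumption)[1])

# Granger causality: does income (second column) help predict consumption (first column)?
grangercausalitytests(np.column_stack([consumption, income]), maxlag=2)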
326 Online Graduate Students’ Perspective on Engagement in Active Learning in the United States
Authors: Ehi E. Aimiuwu
Abstract:
As of 2017, many researchers in educational journals were still asking whether students are effectively and efficiently engaged in active learning in the online learning environment. The goal of this qualitative single case study and narrative research is to explore whether students are actively engaged in their online learning. Seven online students in the United States, recruited through LinkedIn and residencies, were interviewed for this study. Eleven online learning techniques drawn from the literature were used as a framework. Data collection tools included a digital audiotape, an observation sheet, an interview protocol, transcription, and NVivo 12 Plus qualitative software. The data analysis process, member checking, and key themes were used to reach saturation. About 85.7% of students preferred individual grading. About 71.4% of students valued the professor interacting 2-3 times weekly, participating through posts and responses, having good internet access, and using email. Also, about 57.1% said that students log in 2-3 times weekly to daily, that the professor’s social presence helps, that work should be submitted punctually, and that they prefer assessment in the form of research, essays, and case studies. About 42.9% appreciated the usefulness of the syllabus and the professor’s expertise.
Keywords: Class facilitation, course management, online teaching, online education, student engagement.
325 An Improved Total Variation Regularization Method for Denoising Magnetocardiography
Authors: Yanping Liao, Congcong He, Ruigang Zhao
Abstract:
The application of magnetocardiography signals to detect cardiac electrical function is a technology developed in recent years. The magnetocardiography signal is detected with Superconducting Quantum Interference Devices (SQUID) and has considerable advantages over electrocardiography (ECG). Extracting the Magnetocardiography (MCG) signal, which is buried in noise, is difficult and is a critical issue to be resolved in cardiac monitoring systems and MCG applications. In order to remove the severe background noise, the Total Variation (TV) regularization method is proposed to denoise the MCG signal. The approach transforms the denoising problem into a minimization problem, and the majorization-minimization algorithm is applied to solve it iteratively. However, the traditional TV regularization method tends to cause a staircase effect and lacks constraint adaptability. In this paper, an improved TV regularization method for denoising the MCG signal is proposed to improve the denoising precision. The improvement consists of three parts. First, higher-order TV is applied to reduce the staircase effect, and the corresponding second-derivative matrix is used in place of the first-order one. Then, the positions of the non-zero elements in the second-order derivative matrix are determined based on the peak positions detected by a detection window. Finally, adaptive constraint parameters are defined to eliminate noise while preserving the signal's peak characteristics. Theoretical analysis and experimental results show that this algorithm can effectively improve the output signal-to-noise ratio and has superior performance.
Keywords: Constraint parameters, derivative matrix, magnetocardiography, regular term, total variation.
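A minimal sketch of the baseline method the abstract builds on: first-order 1-D TV denoising solved by majorization-minimization, where each iteration solves a linear system obtained by majorizing the absolute value. The regularization weight, the dense solve, and the toy step signal are illustrative assumptions; the paper's higher-order, adaptive variant is not shown.

import numpy as np

def tv_denoise_mm(y, lam=2.0, n_iter=50):
    """1-D TV denoising via majorization-minimization.
    Minimizes 0.5*||y - x||^2 + lam*||D x||_1, with D the first-difference matrix."""
    N = len(y)
    D = np.diff(np.eye(N), axis=0)          # (N-1, N) first-difference matrix
    DDT = D @ D.T
    Dy = D @ y
    x = y.copy()
    for _ in range(n_iter):
        # Majorize |Dx| at the current iterate; the small epsilon avoids a singular system
        Lam = np.diag(np.abs(D @ x) / lam + 1e-12)
        x = y - D.T @ np.linalg.solve(Lam + DDT, Dy)
    return x

# Toy usage: denoise a noisy step signal
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(100), np.ones(100)])
noisy = clean + 0.2 * rng.standard_normal(200)
denoised = tv_denoise_mm(noisy, lam=3.0)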
324 Algorithms for Computing of Optimization Problems with a Common Minimum-Norm Fixed Point with Applications
Authors: Apirak Sombat, Teerapol Saleewong, Poom Kumam, Parin Chaipunya, Wiyada Kumam, Anantachai Padcharoen, Yeol Je Cho, Thana Sutthibutpong
Abstract:
This research aims to study a two-step iteration process defined over a finite family of σ-asymptotically quasi-nonexpansive nonself-mappings. Strong convergence is guaranteed in the framework of Banach spaces with additional structural properties, including strict and uniform convexity, reflexivity, and smoothness assumptions. With a projection technique similar to that used for nonself-mappings in Hilbert spaces, we use the generalized projection to construct a point within the corresponding domain. Moreover, we introduce the duality mapping and its inverse to overcome the unavailability of the duality representation exploited by Hilbert space theorists. We then apply our results for σ-asymptotically quasi-nonexpansive nonself-mappings to solve for the ideal efficiency of vector optimization problems composed of finitely many objective functions. We also show that the solution obtained from our process is the closest to the origin. Finally, we give an illustrative numerical example to support our results.
Keywords: σ-asymptotically quasi-nonexpansive nonself-mapping, strong convergence, fixed point, uniformly convex and uniformly smooth Banach space.
323 SPA-VNDN: Enhanced Smart Parking Application by Vehicular Named Data Networking
Authors: Bassma Aldahlan, Zongming Fei
Abstract:
Recently, there has been great interest in smart parking applications. These applications are enhanced by vehicular ad-hoc networks, which help drivers find and reserve suitable parking spaces for a period of time in advance. Named Data Networking (NDN) is a future Internet architecture that benefits vehicular ad-hoc networks because of its clean-slate design and pull-based communication model. In this paper, we propose an NDN-based framework for smart parking that incorporates a fog computing architecture. The proposed application has two main directions. First, we allow drivers to query the number of parking spaces in a particular parking lot. Second, we introduce a technique that enables drivers to make intelligent reservations before their arrival time. We also introduce a “push-based” model supporting the NDN-based framework for smart parking applications. To evaluate the proposed solution’s performance, we analyze the function for finding parking lots with available parking spaces and the function for reserving a parking space. Our system shows high performance in terms of response time and push overhead. The proposed reservation application performs better than the baseline approach.
Keywords: Cloud Computing, Vehicular Named Data Networking, Smart Parking Applications, Fog Computing.
322 Masquerade and “What Comes Behind Six Is More Than Seven”: Thoughts on Art History and Visual Culture Research Methods
Authors: Osa D Egonwa
Abstract:
In the 21st century, the disciplinary boundaries of past centuries that we often create through mainstream art historical classification, techniques, and sources may have been eroded by visual culture, which seems to provide a more inclusive umbrella for the new ways artists go about the creative process and its resultant commodities. Over the past four decades, artists in Africa have resorted to new materials, techniques, and themes, which has affected how we research these artists and their art. Frontline artists such as El Anatsui, Yinka Shonibare, and Erasmus Onyishi are demonstrating that any material is suitable for artistic expression. Most of the time, these materials come with their own techniques/effects and visual syntax: a combination of materials compounds techniques, formal aesthetic indexes, halo effects, and iconography. This tends to challenge the categories we lean on to view, think, and talk about them, rendering our mainstream art historical research methods inadequate and suggesting new discursive concepts, terms, and theories. This paper proposes an Africanist eclectic method derived from the dual framework of Masquerade Theory and What Comes Behind Six is More Than Seven. The paper shares thoughts and research on art historical methods, terminological re-alignments in classification and source data, presentational format, and interpretation arising from the emergent trends in the subject. The outcome provides useful tools to mediate new thoughts and experiences in recent African art and visual culture.
Keywords: Art Historical Methods, Classifications, Concepts, Re-alignment.
321 Modeling the Effects of Type and Intensity of Selective Logging on Forests of the Amazon
Authors: Theodore N.S. Karfakis, Anna Andrade, Carolina Volkmer-Castilho, Dennis R. Valle, Eric Arets, Paul van Gardingen
Abstract:
The aim of the work presented here was to use existing forest dynamics simulation models, or calibrate a new one, within the SYMFOR framework in order to examine changes in stand-level basal area and functional composition in response to selective logging, considering trees > 10 cm d.b.h., for two areas of undisturbed Amazonian non-flooded tropical forest in Brazil and one in Peru. The biological realism of the models was evaluated for forest in the undisturbed and selectively logged states, and it was concluded that forest dynamics were realistically represented. The results of the logging simulation experiments showed that, relative to an undisturbed forest simulation subject to no harvesting intervention, there was a significant amount of change over a 90-year simulation period that was positively proportional to the intensity of logging. Areas which, in the dynamic equilibrium of undisturbed forest, had a greater proportion of the ecological guild of trees known as light hardwoods (LHWs) seemed to respond more favorably, in terms of smaller deviation, but only within a specific range of baseline forest composition, beyond which compositional diversity became more important. These findings are partially in line with practical management experience and partially with basic systematics theory, respectively.
Keywords: Amazon basin, ecological species guild, selective logging, simulation modeling.
320 Bit Model Based Key Management Scheme for Secure Group Communication
Authors: R. Varalakshmi
Abstract:
For the last decade, researchers have focused on the Multicast Group Key Management Framework. The central research challenge is secure and efficient group key distribution. The present paper proposes a bit-model-based secure multicast group key distribution scheme using the most popular absolute encoder output code, the Gray code. The focus is twofold. The first part deals with reducing computational complexity, which is achieved in our scheme by performing fewer multiplication operations during the key updating process. To optimize the number of multiplication operations, an O(1)-time algorithm to multiply two N-bit binary numbers, which can be realized on an N x N bit-model reconfigurable mesh, is used in this work. The second part aims at reducing the amount of information stored in the Group Center and the group members while performing the key update operation. A comparative analysis illustrating the performance of various key distribution schemes is presented, and it is observed that the proposed algorithm reduces the computation and storage complexity significantly. The proposed algorithm is suitable for high-performance computing environments.
Keywords: Multicast Group key distribution, Bit model, Integer Multiplications, reconfigurable mesh, optimal algorithm, Gray Code, Computation Complexity, Storage Complexity.
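As a minimal illustration of the Gray code component mentioned above (not the full bit-model key distribution scheme), the standard binary-reflected Gray code encode/decode can be sketched as follows; the function names are illustrative.

def to_gray(n: int) -> int:
    """Binary-reflected Gray code of a non-negative integer."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Invert the Gray code by folding the shifted bits back in."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

codes = [to_gray(i) for i in range(8)]            # 0, 1, 3, 2, 6, 7, 5, 4 — adjacent codes differ in one bit
assert all(from_gray(c) == i for i, c in enumerate(codes))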
319 Performance Analysis of New Types of Reference Targets Based on Spaceborne and Airborne SAR Data
Authors: Y. S. Zhou, C. R. Li, L. L. Tang, C. X. Gao, D. J. Wang, Y. Y. Guo
Abstract:
The triangular trihedral corner reflector (CR) has been widely used as a point target for synthetic aperture radar (SAR) calibration and image quality assessment. The additional “tip” of the triangular plate does not contribute to the reflector’s theoretical RCS, and if it interacts with a perfectly reflecting ground plane, it yields an increase in RCS at the radar boresight and decreases the accuracy of SAR calibration and image quality assessment. To address this problem, two types of CRs were manufactured. One was the hexagonal trihedral CR, a self-illuminating CR with relatively small plate edge length, since a large edge length usually introduces unexpected edge diffraction error. The other was the triangular trihedral CR with an extended bottom plate, which includes the effect of the “tip” in the total RCS. In order to assess the performance of the two new types of CRs, a flight campaign over the National Calibration and Validation Site for High Resolution Remote Sensors was carried out. Six hexagonal trihedral CRs and two bottom-extended trihedral CRs, as well as several traditional triangular trihedral CRs, were deployed. A KOMPSAT-5 X-band SAR image was acquired for the performance analysis of the hexagonal trihedral CRs, and C-band airborne SAR images were acquired for the performance analysis of the bottom-extended trihedral CRs. The analysis results showed that the impulse response functions of both the hexagonal trihedral CRs and the bottom-extended trihedral CRs were much closer to the ideal sinc function than those of the traditional triangular trihedral CRs. The flight campaign results validated the advantages of the new types of CRs, which may be useful in future SAR calibration missions.
Keywords: Synthetic Aperture Radar, calibration, corner reflector, KOMPSAT-5.
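For background, the theoretical peak RCS of a triangular trihedral corner reflector with inner edge (leg) length $a$ at wavelength $\lambda$ is commonly quoted as the expression below; it is included here as standard reference material, not as a value reported by the authors.

\[ \sigma_{\max} = \frac{4\pi a^{4}}{3\lambda^{2}} \]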
318 Data-driven Multiscale Tsallis Complexity: Application to EEG Analysis
Authors: Young-Seok Choi
Abstract:
This work proposes data-driven, multiscale quantitative measures to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Motivated by the fact that real EEG recordings are nonlinear and non-stationary over different frequencies or scales, an approach more suitable than conventional single-scale tools is needed for analyzing EEG data. Here, we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating entropies of the probability distributions of the intrinsic mode functions extracted by the empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings of a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of the Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the use of the multiscale Tsallis entropy leads to better discrimination of injury levels and improved correlations with the neurological deficit evaluation 72 hours after cardiac arrest, suggesting an effective metric for use as a prognostic tool.
Keywords: Electroencephalogram (EEG), multiscale complexity, empirical mode decomposition, Tsallis entropy.
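A minimal sketch of the Tsallis entropy of a discrete probability distribution, the building block applied to each intrinsic mode function in the framework above; the entropic index q, the histogram binning, the surrogate signal, and the function names are illustrative assumptions, and the EMD step itself is not shown.

import numpy as np

def tsallis_entropy(p, q=2.0):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))        # Shannon limit as q -> 1
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def signal_tsallis(x, q=2.0, bins=64):
    """Estimate the Tsallis entropy of a signal from its amplitude histogram."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    return tsallis_entropy(p, q)

# Toy usage on one surrogate "mode function"
rng = np.random.default_rng(1)
imf = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.standard_normal(2000)
print(signal_tsallis(imf, q=2.0))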
317 Intelligent Temperature Controller for Water-Bath System
Authors: Om Prakash Verma, Rajesh Singla, Rajesh Kumar
Abstract:
Conventional controllers usually require prior knowledge of a mathematical model of the process. Inaccuracy in the mathematical model degrades the performance of the controller, especially for non-linear and complex control problems. The process used here is a water-bath system, which is widely used and nonlinear to some extent. For the water-bath system, it is necessary to attain the desired temperature within a specified period of time while avoiding overshoot and absolute error and providing good temperature tracking capability; otherwise, the process is disturbed.
To overcome these difficulties, intelligent controllers, Fuzzy Logic (FL) and the Adaptive Neuro-Fuzzy Inference System (ANFIS), are proposed in this paper. The fuzzy controller is designed to work with knowledge in the form of linguistic control rules. However, the translation of these linguistic rules into the framework of fuzzy set theory depends on the choice of certain parameters, for which no formal method is known. To design the ANFIS, a fuzzy inference system is combined with the learning capability of a neural network.
The analysis shows that ANFIS is best suited for adaptive temperature control of the above system. Compared to PID and FLC, ANFIS produces a stable control signal, with much better temperature tracking capability, almost zero overshoot, and minimum absolute error.
Keywords: PID Controller, FLC, ANFIS, Non-Linear Control System, Water-Bath System, MATLAB-7.
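A minimal sketch of how linguistic control rules are evaluated in a fuzzy controller, shown as a generic Mamdani-style example with two rules; the membership functions, rule base, and input/output ranges are illustrative assumptions, not the authors' FLC or ANFIS design.

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_heater_power(error):
    """Map temperature error (setpoint - measured, deg C) to heater power (0-100%)."""
    u = np.linspace(0, 100, 501)                      # candidate output values
    w_small = tri(error, -5, 0, 5)                    # Rule 1: IF error is small THEN power is low
    w_large = tri(error, 0, 10, 20)                   # Rule 2: IF error is large THEN power is high
    # Clip each rule's output set by its firing strength, aggregate, defuzzify by centroid
    agg = np.maximum(np.minimum(w_small, tri(u, 0, 10, 30)),
                     np.minimum(w_large, tri(u, 50, 80, 100)))
    return float(np.sum(agg * u) / np.sum(agg)) if agg.sum() > 0 else 0.0

print(fuzzy_heater_power(8.0))   # larger error -> more heating power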
316 Learning to Recognize Faces by Local Feature Design and Selection
Authors: Yanwei Pang, Lei Zhang, Zhengkai Liu
Abstract:
Studies in neuroscience suggest that both global and local feature information are crucial for the perception and recognition of faces. It is widely believed that local features are less sensitive to variations caused by illumination and expression. In this paper, we focus on designing and learning local features for face recognition. We designed three types of local features: semi-global features, local patch features, and tangent shape features. The design of the semi-global feature aims at taking advantage of global-like information while avoiding suppressing the AdaBoost algorithm when boosting weak classifiers built from small local patches. The design of the local patch feature targets automatic selection of discriminative features, and thus differs from traditional approaches, in which local patches are usually selected manually to cover the salient facial components. A shape feature is also considered for frontal-view face recognition. These features are selected and combined under the framework of a boosting algorithm and a cascade structure. The experimental results demonstrate that the proposed approach outperforms the standard eigenface method and the Bayesian method. Moreover, the selected local features and the observations made in the experiments are enlightening for research on local feature design in face recognition.
Keywords: Face recognition, local feature, AdaBoost, subspace analysis.
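A minimal sketch of boosting weak classifiers over local patch features, in the spirit of the framework described above; the patch-mean feature extraction and the synthetic two-class data are stand-ins, and the use of scikit-learn's AdaBoostClassifier (whose default weak learner is a depth-1 decision stump) is a tooling assumption rather than the authors' implementation.

import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def patch_features(img, patch=8):
    """Crude local features: mean intensity of non-overlapping patches."""
    h, w = img.shape
    return np.array([img[i:i + patch, j:j + patch].mean()
                     for i in range(0, h, patch)
                     for j in range(0, w, patch)])

# Synthetic stand-in for two face classes (32x32 "images", 50 samples each)
rng = np.random.default_rng(0)
X = np.array([patch_features(rng.normal(loc=cls, size=(32, 32)))
              for cls in (0.0, 0.5) for _ in range(50)])
y = np.repeat([0, 1], 50)

clf = AdaBoostClassifier(n_estimators=50)   # boosts decision stumps over the patch features
clf.fit(X, y)
print(clf.score(X, y))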
315 Exploring the Need to Study the Efficacy of VR Training Compared to Traditional Cybersecurity Training
Authors: Shaila Rana, Wasim Alhamdani
Abstract:
Effective cybersecurity training is of the utmost importance, given the plethora of attacks that continue to increase in complexity and ubiquity. VR cybersecurity training remains a starkly understudied discipline, and studies that evaluate the effectiveness of VR cybersecurity training relative to traditional methods are required. An engaging and interactive platform can support retention of the training material; consequently, an effective form of cybersecurity training is required to support a culture of cybersecurity awareness. Measurements of effectiveness have varied across prior studies, with surveys and observations being the two most utilized forms of evaluation. Further research is needed to evaluate the effectiveness of VR cybersecurity training and traditional training, and in particular whether VR cybersecurity training is more effective than traditional methods. This paper proposes a methodology to compare the two cybersecurity training methods and their effectiveness. The proposed framework includes developing both VR and traditional cybersecurity training methods and delivering them to at least 100 users. A quiz along with a survey will be administered and statistically analyzed to determine whether there is a difference in knowledge retention and user satisfaction. The aim of this paper is to bring attention to the need to study VR cybersecurity training and its effectiveness compared to traditional training methods. This paper hopes to contribute to the cybersecurity training field by providing an effective way to train users for security awareness. If VR training is deemed more effective, this could create a new direction for cybersecurity training practices.
Keywords: Virtual reality cybersecurity training, VR cybersecurity training, traditional cybersecurity training, evaluating efficacy.
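A minimal sketch of the statistical comparison step described above, assuming quiz scores from the two training groups are compared with an independent two-sample t-test; the group sizes, score distributions, and significance threshold are illustrative assumptions, not data from the study.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
vr_scores = rng.normal(loc=82, scale=8, size=50)            # hypothetical quiz scores, VR group
traditional_scores = rng.normal(loc=76, scale=8, size=50)   # hypothetical quiz scores, traditional group

t_stat, p_value = stats.ttest_ind(vr_scores, traditional_scores, equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Knowledge-retention difference between groups is statistically significant.")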
314 Electrophoretic Deposition of p-Type Bi2Te3 for Thermoelectric Applications
Authors: Tahereh Talebi, Reza Ghomashchi, Pejman Talemi, Sima Aminorroaya
Abstract:
Electrophoretic deposition (EPD) of p-type Bi2Te3 material has been accomplished, and a high quality crack-free thick film has been achieved for thermoelectric (TE) applications. TE generators (TEG) can convert waste heat into electricity, which can potentially solve global warming problems. However, TEG is expensive due to the high cost of materials, as well as the complex and expensive manufacturing process. EPD is a simple and cost-effective method which has been used recently for advanced applications. In EPD, when a DC electric field is applied to the charged powder particles suspended in a suspension, they are attracted and deposited on the substrate with the opposite charge. In this study, it has been shown that it is possible to prepare a TE film using the EPD method and potentially achieve high TE properties at low cost. The relationship between the deposition weight and the EPD-related process parameters, such as applied voltage and time, has been investigated and a linear dependence has been observed, which is in good agreement with the theoretical principles of EPD. A stable EPD suspension of p-type Bi2Te3 was prepared in a mixture of acetone-ethanol with triethanolamine as a stabilizer. To achieve a high quality homogenous film on a copper substrate, the optimum voltage and time of the EPD process was investigated. The morphology and microstructures of the green deposited films have been investigated using a scanning electron microscope (SEM). The green Bi2Te3 films have shown good adhesion to the substrate. In summary, this study has shown that not only EPD of p-type Bi2Te3 material is possible, but its thick film is of high quality for TE applications.
Keywords: Electrical conductivity, electrophoretic deposition, p-type Bi2Te3, thermoelectric materials, thick films.
313 Analysis of the Learners’ Responses of the Adjusted Rorschach Comprehensive System: Critical Psychological Perspective
Authors: Mokgadi Moletsane-Kekae, Robert Kananga Mukuna
Abstract:
The study focused on the analysis of responses to the Adjusted Rorschach Comprehensive System (ARCS). The objective of this study is to analyse the participants’ response rates to the Adjusted Rorschach Comprehensive System from a critical psychology perspective. The use of critical psychology theory in this study was crucial because it responds to the current inadequacy of western theory and practice in the field of psychology. The study adopted a qualitative approach and a case study design, grounded in the interpretivist paradigm. The sample comprised six learners (three boys and three girls, aged 14 years) from a historically disadvantaged school in the Western Cape, South Africa. The ARCS administration procedure, biographical information, semi-structured interviews, and observation were used to collect data. Data were analysed using a thematic framework. The study found that the factors that increased response rates during the administration of the ARCS were language, seating arrangement, drawing, viewing, and describing. The study recommends that psychological test designers take into consideration the philosophy or worldviews of the local people for whom a test is designed, in order to minimize low response rates.
Keywords: Adjusted Rorschach Comprehensive System, critical psychology, learners, responses.
312 Pattern Matching Based on Regular Tree Grammars
Authors: Riad S. Jabri
Abstract:
Pattern matching based on regular tree grammars has been widely used in many areas of computer science. In this paper, we propose a pattern matcher within the framework of code generation, based on a generic and formalized approach. According to this approach, parsers for regular tree grammars are adapted to a general pattern matching solution, rather than adapting the pattern matching to their parsing behavior. Hence, we first formalize the construction of the pattern matches for input trees drawn from a regular tree grammar in the form of so-called match trees. Then, we adopt a recently developed generic parser and tightly couple its parsing behavior with this construction. In addition to its generality, the resulting pattern matcher is characterized by its soundness and efficient implementation. This is demonstrated by the proposed theory and by the algorithms derived for its implementation. A comparison with similar and well-known approaches, such as those based on tree automata and LR parsers, has shown that our pattern matcher can be applied to a broader class of grammars and achieves better approximation of pattern matches in one pass. Furthermore, its use as a machine code selector incurs minimal overhead, owing to the balanced distribution of the cost computations between static ones, at parser generation time, and dynamic ones, at parsing time.
Keywords: Bottom-up automata, Code selection, Pattern matching, Regular tree grammars, Match trees.
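As a toy illustration of tree pattern matching in a code-selection setting, the sketch below is a naive recursive matcher over expression trees with wildcard leaves; the node labels and wildcard convention are illustrative, and this is not the authors' grammar-driven construction of match trees.

from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    children: list = field(default_factory=list)

def matches(pattern: Node, tree: Node) -> bool:
    """True if `pattern` matches the root of `tree`; '_' is a wildcard leaf."""
    if pattern.label == "_":
        return True
    if pattern.label != tree.label or len(pattern.children) != len(tree.children):
        return False
    return all(matches(p, t) for p, t in zip(pattern.children, tree.children))

# IR tree for mem[base + const] and a pattern a load-with-offset instruction could cover
tree = Node("load", [Node("add", [Node("reg"), Node("const")])])
pattern = Node("load", [Node("add", [Node("_"), Node("const")])])
print(matches(pattern, tree))   # True: the pattern covers this subtree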
311 Logic Programming and Artificial Neural Networks in Pharmacological Screening of Schinus Essential Oils
Authors: José Neves, M. Rosário Martins, Fátima Candeias, Diana Ferreira, Sílvia Arantes, Júlio Cruz-Morais, Guida Gomes, Joaquim Macedo, António Abelha, Henrique Vicente
Abstract:
Some plants of the genus Schinus have been used in folk medicine as topical antiseptics, digestives, purgatives, diuretics, analgesics, or antidepressants, and also for respiratory and urinary infections. The chemical composition of the essential oils of S. molle and S. terebinthifolius has been evaluated and shows high variability according to the part of the plant studied and the geographic and climatic region. The pharmacological properties, namely antimicrobial, anti-tumoural, and anti-inflammatory activities, are conditioned by the chemical composition of the essential oils. Given the difficulty of inferring the pharmacological properties of Schinus essential oils without a demanding experimental approach, this work focuses on the development of a decision support system, in terms of its knowledge representation and reasoning procedures, under a formal framework based on Logic Programming, complemented with an approach to computing centered on Artificial Neural Networks and the respective Degree-of-Confidence that one has in such an occurrence.
Keywords: Artificial neural networks, essential oils, knowledge representation and reasoning, logic programming, Schinus molle L., Schinus terebinthifolius Raddi.
310 Effect of Clustering on Energy Efficiency and Network Lifetime in Wireless Sensor Networks
Authors: Prakash G L, Chaitra K Meti, Poojitha K, Divya R.K.
Abstract:
A Wireless Sensor Network is a multi-hop, self-configuring wireless network consisting of sensor nodes. The deployment of wireless sensor networks in many application areas, e.g., aggregation services, requires self-organization of the network nodes into clusters. An efficient way to enhance the lifetime of the system is to partition the network into distinct clusters, each with a high-energy node as cluster head. Different node clustering techniques have appeared in the literature, and they roughly fall into two families: those based on the construction of a dominating set and those based solely on energy considerations. Energy-optimized cluster formation for a set of randomly scattered wireless sensors is presented. Sensors within a cluster are expected to communicate with the cluster head only. The energy constraints and limited computing resources of the sensor nodes present the major challenges in gathering the data. In this paper we propose a framework to study how partially correlated data affect the performance of clustering algorithms. The total energy consumption and network lifetime can be analyzed by combining random geometry techniques and rate-distortion theory. We also present the relation between compression distortion and data correlation.
Keywords: Clusters, multi hop, random geometry, rate distortion.
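A minimal sketch of energy-based cluster-head selection and nearest-head assignment, illustrating the clustering idea discussed above rather than the paper's specific framework; the node count, energy model, field size, and number of clusters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_clusters = 100, 5
positions = rng.uniform(0, 100, size=(n_nodes, 2))     # nodes scattered in a 100 m x 100 m field
energy = rng.uniform(0.5, 2.0, size=n_nodes)           # residual energy (J) of each node

# Pick the highest-energy nodes as cluster heads
heads = np.argsort(energy)[-n_clusters:]

# Assign every node to its nearest cluster head (nodes communicate with the head only)
dists = np.linalg.norm(positions[:, None, :] - positions[heads][None, :, :], axis=2)
assignment = heads[np.argmin(dists, axis=1)]

for h in heads:
    members = np.where(assignment == h)[0]
    print(f"head {h}: {len(members)} member nodes")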
309 Understanding the Experience of the Visually Impaired towards a Multi-Sensorial Architectural Design
Authors: Sarah M. Oteifa, Lobna A. Sherif, Yasser M. Mostafa
Abstract:
Visually impaired people face struggles and spatial barriers in their daily lives because the built environment is often designed with an extreme focus on the visual element, causing what is called architectural visual bias or ocularcentrism. The aim of the study is to holistically understand the world of the visually impaired, as an attempt to extract the qualities of space that accommodate their needs and to show the importance of multi-sensory, holistic designs for the blind. Within the framework of existential phenomenology, common themes are reached through “intersubjectivity”: experience descriptions by blind people and blind architects, observation of how blind children learn to perceive their surrounding environment, and a personal lived blindfolded experience are analyzed. The extracted themes show how visually impaired people filter out and prioritize tactile (active, passive and dynamic touch), acoustic, and olfactory spatial qualities, respectively, and how this happened during the personal lived blindfolded experience. The themes clarify that haptic and aural inclusive designs are essential to create environments suitable for the visually impaired and to empower them towards an independent, safe, and efficient life.
Keywords: Visually impaired, architecture, multi-sensory design, architectural ocularcentrism.
308 Bio-Inspired Design Approach Analysis: A Case Study of Antoni Gaudi and Santiago Calatrava
Authors: Marzieh Imani
Abstract:
Antoni Gaudi and Santiago Calatrava have a reputation for designing bio-inspired, creative, and technical buildings. Even though they have followed different, independent approaches to design, their source of bio-inspiration seems to be common. A closer look at their projects reveals that Calatrava has been influenced by Gaudi in terms of interpreting nature and applying natural principles to the design process. This research first discusses the dialogue between biomimicry and architecture. The review also explores the human/nature discourse throughout history by focusing on how nature revealed itself to the fine arts. This is explained by introducing naturalism and the romantic style in architecture as outcomes of designers’ inclination towards nature. In reviewing the literature, both the theoretical background and practical illustrations of nature have been included. The most dominant practical aspects of imitating nature are form and function. Nature has been reflected in architectural science, resulting in different architectural styles such as organic, green, sustainable, bionic, and biomorphic architecture. By defining a set of common aspects of Gaudi’s and Calatrava’s design approaches, and by considering the biomimetic design categories (organism, ecosystem, and behaviour as the main divisions, and form, function, process, material, and construction as subdivisions), Gaudi’s and Calatrava’s projects have been analysed. This analysis explores whether their design approaches are equivalent or different. Based on this analysis, Gaudi’s architecture can be recognised as biomorphic, while Calatrava’s projects are literally biomimetic. Referring to these architects, this review suggests a new set of principles by which a bio-inspired project can be determined to be either biomorphic or biomimetic.
Keywords: Biomimicry, Calatrava, Gaudi, nature.
307 Social Media Idea Ontology: A Concept for Semantic Search of Product Ideas in Customer Knowledge through User-Centered Metrics and Natural Language Processing
Authors: Martin Häusl, Maximilian Auch, Johannes Forster, Peter Mandl, Alexander Schill
Abstract:
In order to survive in the market, companies must constantly develop improved and new products. These products are designed to serve the needs of their customers in the best possible way. The creation of new products is also called innovation and is primarily driven by a company’s internal research and development department. However, a new approach has been emerging for some years now, involving external knowledge in the innovation process. This approach is called open innovation and identifies customer knowledge as the most important source in the innovation process. This paper presents a concept for using social media posts as an external source to support the open innovation approach in its initial phase, the ideation phase. For this purpose, the social media posts are semantically structured with the help of an ontology, and their authors are evaluated using graph-theoretical metrics such as density. For the structuring and evaluation of relevant social media posts, we also use findings from Natural Language Processing, e.g., Named Entity Recognition, specific dictionaries, triple taggers, and part-of-speech taggers. The selection and evaluation of the tools used are discussed in this paper. Using our ontology and metrics to structure social media posts enables users to semantically search these posts for new product ideas and thus gain improved insight into external sources such as customer needs.
Keywords: Idea ontology, innovation management, open innovation, semantic search.
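A minimal sketch of the graph-density metric mentioned above, computed for a toy author-interaction graph with networkx; the edge list is an illustrative assumption, and density is simply the ratio of existing edges to possible edges.

import networkx as nx

# Toy graph: nodes are post authors, edges are reply/mention interactions
g = nx.Graph()
g.add_edges_from([("alice", "bob"), ("alice", "carol"),
                  ("bob", "carol"), ("carol", "dave")])

# Density = 2m / (n * (n - 1)) for an undirected graph
n, m = g.number_of_nodes(), g.number_of_edges()
print(nx.density(g), 2 * m / (n * (n - 1)))   # both give 4 edges out of 6 possible = 0.667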
306 A Framework for Early Differential Diagnosis of Tropical Confusable Diseases Using the Fuzzy Cognitive Map Engine
Authors: Faith-Michael E. Uzoka, Boluwaji A. Akinnuwesi, Taiwo Amoo, Flora Aladi, Stephen Fashoto, Moses Olaniyan, Joseph Osuji
Abstract:
The overarching aim of this study is to develop a soft-computing system for the differential diagnosis of tropical diseases. These conditions are of concern to health bodies, physicians, and the community at large because of their mortality rates, and difficulties in early diagnosis due to the fact that they present with symptoms that overlap, and thus become ‘confusable’. We report on the first phase of our study, which focuses on the development of a fuzzy cognitive map model for early differential diagnosis of tropical diseases. We used malaria as a case disease to show the effectiveness of the FCM technology as an aid to the medical practitioner in the diagnosis of tropical diseases. Our model takes cognizance of manifested symptoms and other non-clinical factors that could contribute to symptoms manifestations. Our model showed 85% accuracy in diagnosis, as against the physicians’ initial hypothesis, which stood at 55% accuracy. It is expected that the next stage of our study will provide a multi-disease, multi-symptom model that also improves efficiency by utilizing a decision support filter that works on an algorithm, which mimics the physician’s diagnosis process.
Keywords: Medical diagnosis, tropical diseases, fuzzy cognitive map, decision support filters, malaria differential diagnosis.
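A minimal sketch of how a fuzzy cognitive map can be iterated to a steady state for diagnosis, using a sigmoid squashing function; the concept names, weight matrix, initial symptom activations, and the modified-Kosko update rule shown here are illustrative assumptions, not the weights or structure of the model reported above.

import numpy as np

def fcm_infer(weights, state, n_iter=20, lam=1.0):
    """Iterate a fuzzy cognitive map: state <- sigmoid(state + weights^T @ state)."""
    for _ in range(n_iter):
        state = 1.0 / (1.0 + np.exp(-lam * (state + weights.T @ state)))
    return state

# Concepts: [fever, headache, chills, malaria]; weights[i, j] = influence of concept i on j
concepts = ["fever", "headache", "chills", "malaria"]
W = np.array([[0.0, 0.0, 0.0, 0.6],
              [0.0, 0.0, 0.0, 0.3],
              [0.0, 0.0, 0.0, 0.5],
              [0.0, 0.0, 0.0, 0.0]])

symptoms = np.array([0.9, 0.4, 0.8, 0.0])   # observed symptom activations; disease node starts at 0
activation = fcm_infer(W, symptoms)
print(dict(zip(concepts, np.round(activation, 2))))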
305 Thermo-mechanical Deformation Behavior of Functionally Graded Rectangular Plates Subjected to Various Boundary Conditions and Loadings
Authors: Mohammad Talha, B. N. Singh
Abstract:
This paper deals with the thermo-mechanical deformation behavior of shear-deformable functionally graded ceramic-metal (FGM) plates. The theoretical formulation is based on a higher-order shear deformation theory with a considerable amendment in the transverse displacement, implemented using the finite element method (FEM). The mechanical properties of the plate are assumed to be temperature-dependent and graded in the thickness direction according to a power-law distribution in terms of the volume fractions of the constituents. The temperature field is assumed to be uniformly distributed over the plate surface (XY plane) and to vary in the thickness direction only. The fundamental equations for the FGM plates are obtained using a variational approach, considering traction-free boundary conditions on the top and bottom faces of the plate. A C0-continuous isoparametric Lagrangian finite element with thirteen degrees of freedom per node has been employed to obtain the results. Convergence and comparison studies have been performed to demonstrate the efficiency of the present model. Numerical results are obtained for different thickness ratios, aspect ratios, volume fraction indices, and temperature rises, with different loadings and boundary conditions, and are provided in dimensionless tabular and graphical forms. The results show that the temperature field and the gradient in the material properties have a significant effect on the thermo-mechanical deformation behavior of the FGM plates.
Keywords: Functionally graded material, higher order shear deformation theory, finite element method, independent field variables.
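For reference, the power-law grading rule commonly used for FGM plates, under which an effective property $P$ varies through the thickness coordinate $z \in [-h/2, h/2]$ between the metal value $P_m$ and the ceramic value $P_c$ with volume fraction index $n$, can be written as below; this is the standard form of the rule the abstract refers to, not an equation quoted from the paper.

\[ P(z) = \left(P_c - P_m\right)\left(\frac{z}{h} + \frac{1}{2}\right)^{n} + P_m, \qquad V_c(z) = \left(\frac{z}{h} + \frac{1}{2}\right)^{n} \]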
304 Semantic Modeling of Management Information: Enabling Automatic Reasoning on DMTF-CIM
Authors: Fernando Alonso, Rafael Fernandez, Sonia Frutos, Javier Soriano
Abstract:
CIM is the standard formalism for modeling management information, developed by the Distributed Management Task Force (DMTF) in the context of its WBEM proposal and designed to provide a conceptual view of the managed environment. In this paper, we propose the inclusion of formal knowledge representation techniques, based on Description Logics (DLs) and the Web Ontology Language (OWL), in CIM-based conceptual modeling, and we examine the benefits of such a decision. The proposal is specified as a mapping from the CIM metamodel level to a highly expressive subset of DLs capable of capturing all the semantics of the models. The paper shows how the proposed mapping can be used for automatic reasoning about the management information models, as a design aid, by means of new-generation CASE tools, thanks to the use of state-of-the-art automatic reasoning systems that support the proposed logic and use algorithms that are sound and complete with respect to its semantics. Such a CASE tool framework has been developed by the authors, and its architecture is also introduced. The proposed formalization is useful not only at design time, but also at run time through the use of rational autonomous agents, in response to a need recently recognized by the DMTF.
Keywords: CIM, Knowledge-based Information Models, Ontology Languages, OWL, Description Logics, Integrated Network Management, Intelligent Agents, Automatic Reasoning Techniques.
303 Analysis of Transformer Reactive Power Fluctuations during Adverse Space Weather
Authors: Patience Muchini, Electdom Matandiroya, Emmanuel Mashonjowa
Abstract:
A ground-end manifestation of space weather phenomena is known as geomagnetically induced currents (GICs). GICs flow along the electric power transmission cables connecting transformers, and between the grounding points of power transformers, during significant geomagnetic storms. No study in Zimbabwe has established whether grid failures have been caused by GICs; research and monitoring are needed to investigate this possible relationship. The purpose of this paper is to characterize GICs within a power grid network. This paper analyses the collected geomagnetic data, which include the Kp index, the disturbance storm time (Dst) index, and the G-scale of geomagnetic storms, as well as power grid data, which include reactive power, relay tripping, and alarms from high-voltage substations, and then correlates the two. The analysis was first carried out theoretically, by studying geomagnetic parameters, and then experimentally, with MATLAB used as the basic software to analyze the data. The latitudes of the substations were also scrutinized to note whether location had an impact, since low-latitude areas, such as most parts of Zimbabwe, experience less severe geomagnetic variations. Based on the theoretical and graphical analysis, a slight relationship between power system failures and GICs has been shown. Further analyses can be done by installing measuring instruments to measure any currents in the grounding of high-voltage transformers when geomagnetic storms occur. Mitigation measures can then be developed to minimize the susceptibility of the power network to GICs.
Keywords: Adverse space weather, DST index, geomagnetically induced currents, Kp index, reactive power.
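A minimal sketch of the kind of correlation check described above, relating a geomagnetic index series to transformer reactive power; the series below are synthetic placeholders, not measurements from the study, and MATLAB's corrcoef would be used analogously.

import numpy as np

rng = np.random.default_rng(0)
hours = 72
kp_index = np.clip(rng.normal(3, 2, hours), 0, 9)                 # synthetic Kp-style index values
reactive_power = 50 + 4 * kp_index + rng.normal(0, 5, hours)      # synthetic Mvar readings, loosely driven by Kp

r = np.corrcoef(kp_index, reactive_power)[0, 1]
print(f"Pearson correlation between Kp and reactive power: {r:.2f}")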
302 Multi-Layer Multi-Feature Background Subtraction Using Codebook Model Framework
Authors: Yun-Tao Zhang, Jong-Yeop Bae, Whoi-Yul Kim
Abstract:
Background modeling and subtraction in video analysis has been widely used as an effective method for moving object detection in many computer vision applications. Recently, a large number of approaches have been developed to tackle different types of challenges in this field. However, dynamic backgrounds and illumination variations are the problems that occur most frequently in practice. This paper presents a two-layer model based on the codebook algorithm incorporating the local binary pattern (LBP) texture measure, targeted at handling dynamic background and illumination variation problems. More specifically, the first layer is designed as a block-based codebook combining an LBP histogram with the mean value of each RGB color channel. Because of the invariance of LBP features with respect to monotonic gray-scale changes, this layer can produce block-wise detection results with considerable tolerance of illumination variations. The pixel-based codebook is then employed to refine the output of the first layer and further eliminate false positives. As a result, the proposed approach can greatly improve accuracy under dynamic background and illumination changes. Experimental results on several popular background subtraction datasets demonstrate very competitive performance compared to previous models.
Keywords: Background subtraction, codebook model, local binary pattern, dynamic background, illumination changes.
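A minimal sketch of the basic 8-neighbor LBP code and its histogram over a block, which is the texture measure incorporated in the first layer above; the block size and the synthetic block are illustrative assumptions, and the codebook layers themselves are not shown.

import numpy as np

def lbp_image(gray):
    """Basic 3x3 local binary pattern: threshold the 8 neighbors at the center value."""
    c = gray[1:-1, 1:-1]
    neighbors = [gray[0:-2, 0:-2], gray[0:-2, 1:-1], gray[0:-2, 2:],
                 gray[1:-1, 2:],   gray[2:,   2:],   gray[2:,   1:-1],
                 gray[2:,   0:-2], gray[1:-1, 0:-2]]
    codes = np.zeros(c.shape, dtype=int)
    for bit, nb in enumerate(neighbors):
        codes |= (nb >= c).astype(int) << bit      # one bit per neighbor, 256 possible codes
    return codes

def lbp_histogram(gray_block):
    """256-bin LBP histogram of a block, normalized to sum to 1."""
    codes = lbp_image(gray_block)
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

# Toy usage on a random 16x16 block (stands in for one block of a video frame)
rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(16, 16), dtype=np.uint8)
print(lbp_histogram(block)[:8])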
301 Socio-Technical Systems: Transforming Theory into Practice
Authors: L. Ngowi, N. H. Mvungi
Abstract:
This paper critically examines the evolution of socio-technical systems theory, its practices, and its challenges in system design and development. It examines concepts put forward by researchers focusing on the application of the theory in software engineering. Various methods using socio-technical concepts based on systems engineering have been developed, without remarkable success. The main constraints are the large amount of data and the inefficient techniques used in applying the concepts in systems engineering for developing time-bound systems within a limited or controlled budget. This paper critically examines each of the methods, highlights bottlenecks, and suggests the way forward. Since socio-technical systems theory only explains what to do, not how to do it, engineers are not using the concept to save time and costs and to reduce the risks associated with new frameworks. Hence, a new framework, which can be considered a practical approach, is proposed; it borrows concepts from the soft systems method, agile systems development, and object-oriented analysis and design to bridge the gap between theory and practice. The approach will enable the development of systems using socio-technical systems theory and encourage system engineers and software developers to use it in building worthwhile information systems, avoiding fragility and hostility in the work environment.
Keywords: Socio-technical systems, human centered design, software engineering, cognitive engineering, soft systems, systems engineering.
300 The Impact of Local Decision-Making in Regional Development Schemes on the Achievement of Efficiency in EU Funds
Authors: Kuyucu Helvacioglu Asli Deniz, Tektas Arzu
Abstract:
European Union candidate status provides a strong motivation for decision-making in candidate countries when shaping regional development policy, where a transfer of power from the center to the periphery is envisioned. The process of Europeanization requires candidate countries to configure their regional institutional templates in line with the requirements of European Union policies, and it introduces new instruments of the enlargement incentive framework to be employed in regional development schemes. It has been observed that the contribution of local actors to decision-making in the design of allocation architectures enhances the efficiency of the funds and increases the positive effects of projects funded under regional development objectives. This study explores the performance of three regional development grant schemes in Turkey, established and allocated under the pre-accession process, with special emphasis on the roles of national and local actors in decision-making for regional development. Efficiency analyses have been conducted using the DEA methodology, which has proved to be a superior method for comparative efficiency and benchmarking measurements. The findings of this study, in parallel with similar international studies, show that the participation of local actors in decision-making on funding contributes to both the quality and the efficiency of the projects funded under the EU schemes.
Keywords: Efficiency, European Union Funds, Regional Development, Turkey.
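A minimal sketch of the input-oriented CCR DEA model used for this kind of efficiency measurement, solved as one linear program per decision-making unit; the toy input/output data for the grant schemes are illustrative assumptions, not the study's dataset.

import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR DEA efficiency scores.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs); returns one score in (0, 1] per DMU."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                       # variables [theta, lambda_1..lambda_n]; minimize theta
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])      # sum_j lambda_j x_ij - theta x_io <= 0
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])       # -sum_j lambda_j y_rj <= -y_ro
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Toy data: 4 schemes, inputs = (budget, staff), output = projects completed
X = np.array([[100., 8.], [120., 10.], [80., 6.], [150., 12.]])
Y = np.array([[50.], [55.], [45.], [60.]])
print(np.round(dea_ccr_input(X, Y), 3))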
299 Comparison between Pushover Analysis Techniques and Validation of the Simplified Modal Pushover Analysis
Authors: N. F. Hanna, A. M. Haridy
Abstract:
One of the main drawbacks of the Modal Pushover Analysis (MPA) is the need to perform nonlinear time-history analysis, which complicates the method and increases the analysis time. A simplified version of the MPA has been proposed based on the concept of the inelastic deformation ratio. Furthermore, the effect of the higher modes of vibration is considered by assuming linearly elastic responses, which enables the use of standard elastic response spectrum analysis. In this work, the simplified MPA (SMPA) method is applied to determine the target global drift and the inter-story drifts of a steel frame building. The effect of the higher vibration modes is considered within the framework of the SMPA. A comprehensive survey of the inelastic deformation ratio is presented; a suitable expression is then selected from the literature and implemented in the SMPA. The seismic demands estimated using the SMPA, such as the target drift, base shear, and inter-story drifts, are compared with the seismic responses determined by applying the standard MPA. The accuracy of the estimated seismic demands is validated by comparison with the results obtained from nonlinear time-history analysis using real earthquake records.
Keywords: Modal analysis, pushover analysis, seismic performance, target displacement.