Search results for: active learning strategies

403 Educational Knowledge Transfer in Indigenous Mexican Areas Using Cloud Computing

Authors: L. R. Valencia Pérez, J. M. Peña Aguilar, A. Lamadrid Álvarez, A. Pastrana Palma, H. F. Valencia Pérez, M. Vivanco Vargas

Abstract:

This work proposes a cooperative-competitive ("coopetitive") approach that allows coordinated work among the Secretary of Public Education (SEP), the Autonomous University of Querétaro (UAQ) and government funds from the National Council for Science and Technology (CONACYT) or other international organizations. The goal is an overall knowledge transfer strategy based on e-learning over the Cloud, in which experts in junior high and high school education, working in multidisciplinary teams, perform analysis, evaluation, design, production, validation and large-scale knowledge transfer on a Cloud Computing platform. This gives teachers and students all the information required to ensure nationally homologated knowledge of topics such as mathematics, statistics, chemistry, history, ethics and civics. The work will start with a pilot test in Spanish and, initially, in two indigenous languages, Otomí and Náhuatl. Otomí has more than 285,000 speakers in Querétaro and Mexico's central region; Náhuatl is the most widely spoken indigenous language in Mexico, with more than 1,550,000 speakers. Phase one of the project covers negotiations with indigenous tribes from different regions and the information and communication technologies needed to deliver the knowledge to indigenous schools in their native language. The methodology includes the following main milestones: identification of the areas where Otomí and Náhuatl are spoken; research with the SEP on the location of existing indigenous schools; analysis and inventory of current school conditions; negotiation with tribal chiefs; analysis of the communication technology required to reach the indigenous communities; identification and inventory of local teachers' technology knowledge; selection of a pilot topic; analysis of current student competence under the traditional education system; identification of local translators; design of the e-learning platform; design of the multimedia resources and the Cloud Computing storage strategy; translation of the topic into both languages; training of indigenous teachers; pilot test; course release; project follow-up; analysis of student requirements for the new technological platform; and definition of a new and improved proposal with greater reach in topics and regions. The importance of phase one is manifold: it proposes a working technological scheme and focuses on the cultural impact in Mexico, so that indigenous tribes can improve their knowledge of new forms of crop improvement, home storage technologies, proven home remedies for common diseases and ways of preparing foods containing major nutrients, disclose the strengths and weaknesses of each region, offer regional products through cloud computing platforms, and open communication spaces for inter-indigenous cultural exchange.

Keywords: Mexican indigenous tribes, education, knowledge transfer, cloud computing, Otomí, Náhuatl, language.

402 Clinical Decision Support for Disease Classification based on the Tests Association

Authors: Sung Ho Ha, Seong Hyeon Joo, Eun Kyung Kwon

Abstract:

Over the years, researchers have developed various tools and methodologies for effective clinical decision-making. Among these, chest pain diseases have been one of the most important diagnostic issues, especially in the emergency department. To improve physicians' diagnostic ability, many researchers have built diagnostic intelligence using machine learning and data mining. However, most conventional methodologies rely on a single classifier for disease classification and prediction, which yields only moderate performance. This study uses an ensemble strategy that combines multiple different classifiers to help physicians diagnose chest pain diseases more accurately. Specifically, the ensemble integrates decision trees, neural networks, and support vector machines, and the models are applied to real-world emergency department data. The study shows that the ensemble models outperform each of the single classifiers.
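
As a concrete illustration of the ensemble strategy described in this abstract, the sketch below combines a decision tree, a neural network and an SVM with soft voting in scikit-learn; the synthetic data and model settings are placeholders for the real emergency-department records, which are not available here.

```python
# Minimal sketch of the ensemble strategy: decision tree + neural network + SVM
# combined by soft voting. Synthetic data stand in for the emergency-department records.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.ensemble import VotingClassifier

X, y = make_classification(n_samples=500, n_features=20, n_classes=3,
                           n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ensemble = VotingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=5)),
                ("mlp", MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000)),
                ("svm", SVC(kernel="rbf", probability=True))],
    voting="soft")              # average the predicted class probabilities
ensemble.fit(X_tr, y_tr)
print("ensemble accuracy:", ensemble.score(X_te, y_te))
```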

Keywords: Diagnosis intelligence, ensemble approach, data mining, emergency department

401 PoPCoRN: A Power-Aware Periodic Surveillance Scheme in Convex Region using Wireless Mobile Sensor Networks

Authors: A. K. Prajapati

Abstract:

In this paper, a periodic surveillance scheme is proposed for any convex region using mobile wireless sensor nodes. A sensor network typically consists of a fixed number of sensor nodes that report measurements of sensed data such as temperature, pressure and humidity within their immediate proximity (the area within their sensing range). To sense an area of interest, an adequate number of fixed sensor nodes is required to cover the entire region; the number needed depends on the sensing range of the sensors as well as the deployment strategy employed. Here it is assumed that the sensors are mobile within the region of surveillance and can be mounted on moving bodies such as robots or vehicles. In our scheme, the surveillance time period therefore determines the number of sensor nodes required to be deployed in the region of interest. The proposed scheme comprises three algorithms: Hexagonalization, Clustering, and Scheduling. The first algorithm partitions the coverage area into fixed-size hexagons that approximate the sensing range (cell) of an individual sensor node. The clustering algorithm groups the cells into clusters, each of which will be covered by a single sensor node. The scheduling algorithm then determines a schedule for each sensor to serve its respective cluster. Each sensor node traverses all the cells belonging to its assigned cluster by oscillating between the first and the last cell for the duration of its lifetime. Simulation results show that our scheme provides full coverage within a given period of time using few sensors with minimum movement, low power consumption, and relatively low infrastructure cost.
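
The hexagonalization step can be pictured with the short sketch below, which tiles the bounding box of a convex polygon with hexagon centres spaced according to the sensing range and keeps those falling inside the region; the polygon, spacing formulas and point-in-polygon test are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch of the hexagonalization step: place hexagonal cell centres (circumradius equal to
# the sensing range) over the bounding box of a convex region and keep the ones inside it.
import math

def inside_convex(poly, p):
    """True if point p lies inside the convex polygon given as a CCW vertex list."""
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        if (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1) < 0:
            return False
    return True

def hexagon_centres(poly, sensing_range):
    xs, ys = zip(*poly)
    dx = math.sqrt(3) * sensing_range          # horizontal spacing of centres (pointy-top hexagons)
    dy = 1.5 * sensing_range                   # vertical spacing of centres
    centres, row, y = [], 0, min(ys)
    while y <= max(ys) + dy:
        x = min(xs) + (dx / 2 if row % 2 else 0.0)   # shift every second row
        while x <= max(xs) + dx:
            if inside_convex(poly, (x, y)):
                centres.append((x, y))
            x += dx
        y += dy
        row += 1
    return centres

region = [(0, 0), (10, 0), (12, 6), (5, 10), (0, 6)]   # example convex region
print(len(hexagon_centres(region, sensing_range=1.0)), "cells")
```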

Keywords: Sensor Network, Graph Theory, MSN, Communication.

400 A Neuro Adaptive Control Strategy for Movable Power Source of Proton Exchange Membrane Fuel Cell Using Wavelets

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

Movable power sources based on proton exchange membrane fuel cells (PEMFC) are an important research topic in the current fuel cell (FC) field. The control system strongly influences cell performance, and it is a difficult industrial control problem because of the imprecision, uncertainty, partial truth and intrinsically nonlinear characteristics of PEMFCs. In this paper, an adaptive PI control strategy using a neural network with adaptive Morlet wavelets is proposed. It is based on a single-layer feedforward neural network whose hidden nodes are adaptive Morlet wavelet functions, combined with an infinite impulse response (IIR) recurrent structure. The IIR filter is cascaded to the network to provide a double local structure, which improves the speed of learning. The proposed method is applied to a typical 1 kW PEMFC system, and the results show that it is more accurate than an MLP (Multi-Layer Perceptron) based method.
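
A minimal sketch of the wavelet-network idea described above: a single hidden layer whose nodes are Morlet wavelets with adaptive dilation and translation parameters. The IIR recurrent part and the training rule are omitted, and all parameter values are illustrative.

```python
# Sketch of a single-hidden-layer wavelet network with adaptive Morlet nodes.
import numpy as np

def morlet(t, w0=5.0):
    """Real-valued Morlet mother wavelet."""
    return np.cos(w0 * t) * np.exp(-0.5 * t ** 2)

def wavelet_net(x, a, b, w, bias=0.0):
    """y = bias + sum_j w_j * psi((x - b_j) / a_j) for a scalar input x."""
    z = (x - b) / a
    return bias + np.dot(w, morlet(z))

rng = np.random.default_rng(0)
a = rng.uniform(0.5, 2.0, size=8)   # dilations (adaptive parameters)
b = rng.uniform(-1.0, 1.0, size=8)  # translations (adaptive parameters)
w = rng.normal(size=8)              # output weights
print(wavelet_net(0.3, a, b, w))
```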

Keywords: Adaptive Control, Morlet Wavelets, PEMFC.

399 Visual Thing Recognition with Binary Scale-Invariant Feature Transform and Support Vector Machine Classifiers Using Color Information

Authors: Wei-Jong Yang, Wei-Hau Du, Pau-Choo Chang, Jar-Ferr Yang, Pi-Hsia Hung

Abstract:

The demand for smart visual thing recognition in various devices has increased rapidly in recent years for smart production, living and learning systems. This paper proposes a visual thing recognition system that combines the binary scale-invariant feature transform (SIFT), a bag-of-words (BoW) model, and support vector machine (SVM) classifiers using color information. Since traditional SIFT features and SVM classifiers use only gray-level information, color remains an important additional cue for visual thing recognition. With color-based SIFT features and an SVM, unreliable matching pairs can be discarded and the robustness of matching tasks increased. Experimental results show that the proposed object recognition system with the color-assisted SIFT-SVM classifier achieves a higher recognition rate than the traditional gray-level SIFT and SVM classification in various situations.
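
The pipeline described above can be sketched roughly as follows: SIFT descriptors are computed per colour channel and concatenated, quantised into a bag-of-words vocabulary with k-means, and the histograms feed a linear SVM. The image paths, labels and vocabulary size are placeholders, not the paper's actual setup.

```python
# Rough sketch of a colour-assisted SIFT / bag-of-words / SVM pipeline.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

sift = cv2.SIFT_create()

def color_sift(img_bgr):
    """Concatenate SIFT descriptors computed on each colour channel."""
    descs = []
    for ch in cv2.split(img_bgr):
        _, d = sift.detectAndCompute(ch, None)
        if d is not None:
            descs.append(d)
    return np.vstack(descs) if descs else np.empty((0, 128))

def bow_histogram(desc, kmeans):
    words = kmeans.predict(desc)
    hist, _ = np.histogram(words, bins=np.arange(kmeans.n_clusters + 1))
    return hist / max(hist.sum(), 1)

# image paths and labels are hypothetical placeholders
images = [cv2.imread(p) for p in ["obj1.jpg", "obj2.jpg"]]
labels = [0, 1]
all_desc = np.vstack([color_sift(im) for im in images])
kmeans = KMeans(n_clusters=50, n_init=10, random_state=0).fit(all_desc)
X = np.array([bow_histogram(color_sift(im), kmeans) for im in images])
clf = LinearSVC().fit(X, labels)
```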

Keywords: Color moments, visual thing recognition system, SIFT, color SIFT.

398 A Visual Analytics Tool for the Structural Health Monitoring of an Aircraft Panel

Authors: F. M. Pisano, M. Ciminello

Abstract:

Aerospace, mechanical, and civil engineering infrastructures can benefit from damage detection and identification strategies in terms of maintenance cost reduction and operational life improvement, as well as safety. The challenge is to detect so-called "barely visible impact damage" (BVID), due to low/medium energy impacts, which can progressively compromise structural integrity. The occurrence of any local change in material properties that can degrade structural performance is monitored by Structural Health Monitoring (SHM) systems, which compare the structure's states before and after damage occurs. SHM looks for any "anomalous" response collected by sensor networks and analyzed with appropriate algorithms. Independently of the specific analysis approach adopted for structural damage detection and localization, textual reports, tables and graphs describing possible outlier coordinates and damage severity are usually the artifacts from which information about the current health condition of the structure under investigation must be extracted. Visual Analytics can support the processing of monitored measurements by offering data navigation and exploration tools that leverage the native human ability to understand images faster than text and tables. Herein, the enrichment of an SHM system through the integration of a Visual Analytics component is investigated. Analytical dashboards have been created by combining worksheets, providing structural analysts with a Visual Analytics tool for exploring the structural health conditions produced by a Principal Component Analysis based algorithm.
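
A minimal sketch of the kind of PCA-based health indicator such a dashboard would visualise: PCA is fitted on baseline (undamaged) sensor readings and later measurements are flagged when their reconstruction error exceeds a threshold. The sensor counts, data and threshold rule are illustrative assumptions.

```python
# PCA-based outlier flagging on synthetic sensor data (illustrative only).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
baseline = rng.normal(size=(200, 16))          # 200 baseline samples from 16 sensors
pca = PCA(n_components=4).fit(baseline)

def reconstruction_error(X):
    X_hat = pca.inverse_transform(pca.transform(X))
    return np.linalg.norm(X - X_hat, axis=1)

err = reconstruction_error(baseline)
threshold = err.mean() + 3 * err.std()         # simple 3-sigma rule on the baseline error
new_scan = rng.normal(size=(50, 16))
outliers = np.where(reconstruction_error(new_scan) > threshold)[0]
print("possible damage at samples:", outliers)
```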

Keywords: Interactive dashboards, optical fibers, structural health monitoring, visual analytics.

397 The Study on the Conversed Remediation between Old and New Media in Case of Smart Phone and PC in South Korea

Authors: Jinhwan Yu, Jooyeon Yook

Abstract:

After Apple introduced its smartphone, the iPhone, in Korea at the end of 2009, the number of Korean smartphone users increased rapidly, and by February 2012 half of the Korean population were smartphone users. Smartphones are now positioned as a major digital medium with powerful influence in Korea, and Koreans learn new information, enjoy games and communicate with other people anytime and anywhere. As smartphone performance increased, the number of usable services grew, while adequate GUI development was required to implement various functions on smartphones. The strategy of providing familiar experiences on smartphones through features borrowed from existing media contributed greatly to smartphone popularization, together with the devices' iconic GUIs. The spread of smartphones increased mobile web access, so attempts to bring the PC web to the smartphone web are continuously being made, and mobile web GUIs provide familiar experiences to users through designs that make appropriate use of smartphone GUIs. As users have become familiar with smartphones and mobile web GUIs, the direction of remediation has started to reverse: PCs are now adapting smartphone GUIs. This study defines this phenomenon as reversed remediation and reviews cases in which PCs take on the GUI characteristics of smartphones. For this purpose, the study questions are: What is reversed remediation? What are the characteristics of the smartphone GUI? What kind of interrelationship exists between the smartphone and the PC web site? Understanding these characteristics within the changing paradigm of PC and smartphone GUI design is meaningful for forecasting future GUI changes, and will also help to establish strategies for digital device development and design.

Keywords: Graphic User Interface, Remediation, Smart Phone, South Korea, Web Site

396 Advanced Neural Network Learning Applied to Pulping Modeling

Authors: Z. Zainuddin, W. D. Wan Rosli, R. Lanouette, S. Sathasivam

Abstract:

This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify the structure of the model completely. Two different types of neural networks were applied to the pulping problem; the investigation reported here used a three-layer feedforward neural network trained with Preconditioned Conjugate Gradient (PCG) methods. Preconditioning improves convergence by lowering the condition number and increasing the clustering of the eigenvalues: the idea is to solve the modified problem M⁻¹Ax = M⁻¹b, where M is a positive-definite preconditioner closely related to A. We focus on PCG-based training methods originating from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP) and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). In the simulations, the behavior of the PCG methods proved robust against phenomena such as oscillations due to large step sizes.
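
The preconditioning idea can be illustrated with the short sketch below, which solves Ax = b through the equivalent system M⁻¹Ax = M⁻¹b using a simple Jacobi (diagonal) preconditioner; it is a generic linear-system example, not the paper's network-training code.

```python
# Jacobi-preconditioned conjugate gradient for Ax = b (illustrative example).
import numpy as np

def jacobi_pcg(A, b, tol=1e-8, max_iter=200):
    M_inv = 1.0 / np.diag(A)          # Jacobi preconditioner: M = diag(A)
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p     # new preconditioned search direction
        rz = rz_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(jacobi_pcg(A, b))               # approx. [0.0909, 0.6364]
```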

Keywords: Convergence, pulping modeling, neural networks, preconditioned conjugate gradient.

395 Dengue Disease Mapping with Standardized Morbidity Ratio and Poisson-gamma Model: An Analysis of Dengue Disease in Perak, Malaysia

Authors: N. A. Samat, S. H. Mohd Imam Ma’arof

Abstract:

Dengue is an infectious vector-borne viral disease commonly found in tropical and sub-tropical regions around the world, especially in urban and semi-urban areas, including Malaysia. There is currently no vaccine or chemotherapy for the prevention or treatment of dengue, so prevention and treatment depend on vector surveillance and control measures. Disease risk mapping has been recognized as an important tool in prevention and control strategies, and the choice of statistical model used for relative risk estimation matters because a good model subsequently produces a good disease risk map. The aim of this study is therefore to estimate the relative risk for dengue, first with the most common statistic used in disease mapping, the Standardized Morbidity Ratio (SMR), and then with one of the earliest applications of Bayesian methodology, the Poisson-gamma model. The paper begins with a review of the SMR method, which is then applied to dengue data from Perak, Malaysia, after which the Poisson-gamma model is fitted as an extension of the SMR approach. Both sets of results are displayed and compared using graphs, tables and maps. The analysis shows that the latter method gives better relative risk estimates than the SMR, and the Poisson-gamma model is demonstrated to overcome the problem the SMR has when no dengue cases are observed in certain regions. However, covariate adjustment in this model is difficult, and there is no possibility of allowing for spatial correlation between risks in adjacent areas; these drawbacks have motivated many researchers to propose alternative methods for estimating the risk.
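
The two estimators compared in the abstract can be contrasted in a few lines: the SMR is the ratio of observed to expected counts, while the Poisson-gamma posterior mean (O_i + a)/(E_i + b) shrinks towards the prior and stays well behaved when a district records zero cases. The counts and the gamma prior below are illustrative; in practice a and b are estimated from the data (e.g. by moment matching or maximum likelihood).

```python
# SMR vs. Poisson-gamma relative risk on hypothetical district counts.
import numpy as np

observed = np.array([12, 0, 7, 30, 3])          # dengue cases per district (hypothetical)
expected = np.array([9.5, 2.1, 8.0, 22.4, 4.3]) # expected cases under the reference rates

smr = observed / expected                       # collapses to 0 where no cases are observed
a, b = 2.0, 2.0                                 # gamma prior with mean a/b = 1 (the null risk)
poisson_gamma_rr = (observed + a) / (expected + b)

for o, e, s, rr in zip(observed, expected, smr, poisson_gamma_rr):
    print(f"O={o}  E={e}  SMR={s:.2f}  Poisson-gamma RR={rr:.2f}")
```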

Keywords: Dengue disease, Disease mapping, Standardized Morbidity Ratio, Poisson-gamma model, Relative risk.

394 Forecasting the Fluctuation of Currency Exchange Rate Using Random Forest

Authors: L. Basha, E. Gjika

Abstract:

The exchange rate is one of the most important economic variables, especially for a small, open economy such as Albania. Its effect is noticeable in a country's competitiveness, trade and current account, inflation, wages, domestic economic activity and bank stability. This study investigates the fluctuation of Albania's exchange rate using the monthly average Euro (EUR) to Albanian Lek (ALL) exchange rate from January 2008 to June 2021, together with the macroeconomic factors that have a significant effect on it. First, a Random Forest regression model is constructed to understand the impact of economic variables on the behavior of the monthly average exchange rate. Then the macroeconomic indicators are forecast 12 months ahead using time series models, and the predicted values are fed into the random forest model to obtain the average monthly forecast of the EUR/ALL exchange rate for the period July 2021 to June 2022.
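
A rough sketch of the modelling chain: a RandomForestRegressor maps macroeconomic indicators to the monthly average EUR/ALL rate, and forecasts of those indicators are then passed through the fitted model. The indicator names and synthetic data are placeholders, not the study's variables.

```python
# Random forest regression of the exchange rate on macro indicators (synthetic stand-in data).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 162                                              # Jan 2008 - Jun 2021, monthly
data = pd.DataFrame({
    "inflation": rng.normal(2.0, 0.8, n),
    "interest_rate": rng.normal(1.5, 0.5, n),
    "trade_balance": rng.normal(-200, 50, n),
})
eur_all = 122 + 0.5 * data["inflation"] - 1.2 * data["interest_rate"] + rng.normal(0, 0.5, n)

rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(data, eur_all)

future_macro = data.tail(12).mean().to_frame().T     # stand-in for the 12-month indicator forecasts
print("forecast EUR/ALL:", rf.predict(future_macro)[0])
```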

Keywords: Exchange rate, Random Forest, time series, Machine Learning, forecasting.

393 The Way Digitized Lectures and Film Presence Coaching Impact Academic Identity: An Expert Facilitated Participatory Action Research Case Study

Authors: Amanda Burrell, Tonia Gary, David Wright, Kumara Ward

Abstract:

This paper explores the concept of academic identity as it relates to the lecture, in particular the digitized lecture delivered to a camera in the absence of a student audience. Many academics have the performance aspect of the role thrust upon them with little or no training. In this study we look at the performance of academic identity and examine tailored film presence coaching for its contribution to academic identity, specifically in relation to feelings of self-confidence and the diminishment of discomfort or stage fright. The case is articulated through the lens of scholar-practitioners, using expert facilitated participatory action research. It shows that all of the experienced academics in our sample reported some uncertainty about presenting lectures to camera prior to coaching. We describe how power poses and the reframing of fear produced improvements in the ease and competency of all participants, show how this insight could be adapted for self-coaching by any academic called to present to a camera, and consider the relationship between this and academic identity.

Keywords: Academic identity, embodied learning, digitized lecture, performance coaching.

392 Online Brands: A Comparative Study of World Top Ranked Universities with Science and Technology Programs

Authors: Zullina H. Shaari, Amzairi Amar, Abdul Mutalib Embong, Hezlina Hashim

Abstract:

University websites are considered one of the primary brand touch points for multiple stakeholders, but most are not well designed to create favorable impressions. Elements that web designers should carefully consider include appearance, content, functionality, usability and search engine optimization, with priority placed on website simplicity and negative space. In terms of content, previous research suggests that universities should include reputation, learning environment, graduate career prospects, destination image, cultural integration, and a virtual tour on their websites. This study examines how top-200 world-ranked science and technology-based universities present their brands online and whether their websites capture these content dimensions. A content analysis of the websites revealed that the top-ranked universities capture the dimensions to varying degrees; in addition, the UK-based university gave higher priority to website simplicity and negative space than the Malaysian-based university.

Keywords: Science and technology programs, top-ranked universities, online brands, university websites.

391 AI-based Radio Resource and Transmission Opportunity Allocation for 5G-V2X HetNets: NR and NR-U networks

Authors: Farshad Zeinali, Sajedeh Norouzi, Nader Mokari, Eduard A. Jorswieck

Abstract:

The capacity of fifth-generation (5G) vehicle-to-everything (V2X) networks poses significant challenges. To address them, this paper uses New Radio (NR) and New Radio Unlicensed (NR-U) networks to develop a vehicular heterogeneous network (HetNet). We propose a framework, named joint BS assignment and resource allocation (JBSRA), for mobile V2X users, and also consider coexistence schemes based on a flexible duty cycle (DC) mechanism for the unlicensed bands. The objective is to maximize the average throughput of vehicles while guaranteeing the throughput of WiFi users. In simulations based on deep reinforcement learning (DRL) algorithms such as deep deterministic policy gradient (DDPG) and deep Q-network (DQN), the proposed framework outperforms existing solutions that rely on a fixed DC or that do not consider the unlicensed bands.
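
As a much-simplified, tabular stand-in for the DQN/DDPG agents used in the paper, the sketch below shows an epsilon-greedy agent learning which channel yields the best throughput; continuous state, BS assignment and the NR-U duty-cycle constraint are all omitted.

```python
# Toy epsilon-greedy channel selection (single-state Q-learning, i.e. a bandit update).
import numpy as np

rng = np.random.default_rng(0)
n_channels, episodes, eps, alpha = 4, 2000, 0.1, 0.1
true_rate = np.array([1.0, 2.5, 1.8, 0.7])     # hypothetical mean throughput per channel
Q = np.zeros(n_channels)                       # one state only, so Q is indexed by action alone

for _ in range(episodes):
    a = rng.integers(n_channels) if rng.random() < eps else int(np.argmax(Q))
    reward = rng.normal(true_rate[a], 0.3)     # noisy throughput observation
    Q[a] += alpha * (reward - Q[a])            # incremental value update

print("preferred channel:", int(np.argmax(Q)), Q.round(2))
```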

Keywords: Vehicle-to-everything, resource allocation, BS assignment, new radio, new radio unlicensed, coexistence NR-U and WiFi, deep deterministic policy gradient, Deep Q-network, Duty cycle mechanism.

390 Analysis of Linked in Series Servers with Blocking, Priority Feedback Service and Threshold Policy

Authors: Walenty Oniszczuk

Abstract:

The use of buffer thresholds, blocking and adequate service strategies are well-known techniques for traffic congestion control in computer networks. This motivates the study of series queues with blocking, feedback (service under a Head-of-Line (HoL) priority discipline) and finite-capacity buffers with thresholds. In this paper, the external traffic is modelled with a Poisson process and the service times with exponential distributions. We consider a three-station network with two finite buffers, for which a set of thresholds (tm1 and tm2) is defined. The network behaves as follows: a task that finishes its service at station B is sent back to station A for re-processing with probability o. When the number of tasks in the second buffer exceeds the threshold tm2 and the number of tasks in the first buffer is less than tm1, the fed-back task is served under the HoL priority discipline; otherwise, a "no two priority services in succession" procedure is applied to fed-back tasks, preventing a possible overflow of the first buffer. Using an open Markovian queuing schema with blocking, priority feedback service and thresholds, a closed-form, cost-effective analytical solution is obtained. The model of servers linked in series is very accurate: it is derived directly from a two-dimensional state graph and a set of steady-state equations, followed by calculation of the main measures of effectiveness, yielding efficient expressions of low computational cost. Based on numerical experiments and the collected results, we conclude that the proposed model with blocking, feedback and thresholds can provide accurate performance estimates of networks linked in series.
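
The computational core of such a model can be sketched generically: once the state graph is written as a generator matrix Q, the steady-state probabilities follow from pi·Q = 0 with the probabilities summing to one. The tiny M/M/1/3 example below is only a toy stand-in for the paper's three-station network with thresholds.

```python
# Steady-state probabilities of a small continuous-time Markov chain (M/M/1/3 toy example).
import numpy as np

lam, mu, K = 2.0, 3.0, 3                  # arrival rate, service rate, buffer size
Q = np.zeros((K + 1, K + 1))
for n in range(K + 1):
    if n < K:
        Q[n, n + 1] = lam                 # arrival
    if n > 0:
        Q[n, n - 1] = mu                  # service completion
    Q[n, n] = -Q[n].sum()                 # diagonal makes each row sum to zero

A = np.vstack([Q.T, np.ones(K + 1)])      # pi Q = 0 plus the normalisation constraint
b = np.append(np.zeros(K + 1), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady-state probabilities:", pi.round(4))
print("blocking probability:", pi[-1].round(4))
```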

Keywords: Blocking, Congestion control, Feedback, Markov chains, Performance evaluation, Threshold-base networks.

389 Fault-Tolerant Control Study and Classification: Case Study of a Hydraulic-Press Model Simulated in Real-Time

Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Iker Elorza, Ana Maria Macarulla

Abstract:

Society demands more reliable manufacturing processes capable of producing high-quality products in shorter production cycles. New control algorithms have been studied to satisfy this paradigm, among which Fault-Tolerant Control (FTC) plays a significant role: it is able to detect, isolate and adapt a system when a harmful or faulty situation appears. In this paper, a general overview of FTC characteristics is given, highlighting the properties a system must ensure to be considered fault-tolerant. In addition, the main FTC techniques are identified and classified, on the basis of their characteristics, into two main groups: Active Fault-Tolerant Controllers (AFTCs) and Passive Fault-Tolerant Controllers (PFTCs). AFTC encompasses techniques capable of re-configuring the process control algorithm after the fault has been detected, while PFTC comprises algorithms robust enough to bypass the fault without further modifications. The re-configuration requires two stages, one focused on detection, isolation and identification of the fault source, and the other in charge of re-designing the control algorithm through two approaches: fault accommodation and control re-design. From the algorithms studied, one has been selected and applied to a case study based on an industrial hydraulic press. The developed model has been embedded in a real-time validation platform, which allows the FTC algorithms to be tested and the system response to a fault to be analysed under conditions similar to those a machine would face on the factory floor. One AFTC approach has been chosen as the methodology the system follows in the fault-recovery process: first, the fault is detected, isolated and identified by means of a neural network; second, the control algorithm is re-configured to overcome the fault and continue working without human interaction.

Keywords: Fault-tolerant control, electro-hydraulic actuator, fault detection and isolation, control re-design, real-time.

388 A Novel Approach to Handle Uncertainty in Health System Variables for Hospital Admissions

Authors: Manisha Rathi, Thierry Chaussalet

Abstract:

Hospital staff and managers are under pressure to use and manage scarce resources effectively. Hospital admissions require many decisions that have complex and uncertain consequences for hospital resource utilization and patient flow, and it is challenging to predict a patient's risk of admission and length of stay (LOS) because of their vague nature. There is no established method to capture the vague definition of the admission of a patient, and current methods and tools used to predict patients at risk of admission fail to deal with uncertainty in unplanned admissions, LOS and patient characteristics. The main objective of this paper is to deal with uncertainty in health system variables and to handle uncertain relationships among those variables. Machine learning techniques, combined with statistical methods such as regression, are proposed as a solution approach, and a model that adapts fuzzy methods to handle uncertain data and uncertain relationships can efficiently capture the vague definition of the admission of a patient.
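
A minimal sketch of the fuzzy idea proposed here: instead of a crisp admitted/not-admitted label, a patient's risk of admission receives graded membership in overlapping fuzzy sets. The triangular membership functions and breakpoints are purely illustrative.

```python
# Graded "risk of admission" membership with triangular fuzzy sets (illustrative values).
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function rising from a to b and falling from b to c."""
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

risk_score = 0.62                         # e.g. output of a regression model on patient data
memberships = {
    "low":    triangular(risk_score, 0.0, 0.2, 0.4),
    "medium": triangular(risk_score, 0.3, 0.5, 0.7),
    "high":   triangular(risk_score, 0.6, 0.8, 1.0),
}
print(memberships)   # the patient is partly 'medium' and partly 'high' risk
```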

Keywords: Admission, Fuzzy, Regression, Uncertainty

387 Spatial Indeterminacy: Destabilization of Dichotomies in Modern and Contemporary Architecture

Authors: Adrian Lo

Abstract:

Since the advent of modern architecture, notions of the free plan and transparency have proliferated well into current trends. The movement's notion of a spatially homogeneous, open and limitless 'free plan' contrasts with the spatially heterogeneous 'series of rooms' defined by load-bearing walls, which in turn triggered new notions of transparency created by vast expanses of glazed walls. Similarly, transparency was dichotomized as something physical or optical on the one hand and something conceptual, akin to spatial organization, on the other. Rather than merely accepting the duality and possible incompatibility of these dichotomies, this paper asks how space can be both literally and phenomenally transparent, and exhibit both homogeneous and heterogeneous qualities. It explores this potential destabilization, or blurring, of spatial phenomena by dissecting the transparent layers and volumes of a series of selected case studies to investigate how different architects have devised strategies of spatial ambiguity and interpenetration. Projects by Peter Eisenman, Sou Fujimoto, and SANAA are discussed and analyzed to show how the superimposition of geometries and spaces achieves different conditions of layering, transparency, and interstitiality. These buildings are explored to reveal various innovative kinds of spatial interpenetration produced through the articulated relations of the elements of architecture, which challenge conventional perceptions of interior and exterior, whereby visual homogeneity blurs with spatial heterogeneity. The results show how spatial conceptions such as interpenetration and transparency can subvert not only inside-outside dialectics but also produce multiple degrees of interiority within complex and indeterminate spatial dimensions in constant flux, as well as present alternative forms of social interaction.

Keywords: interpenetration, literal and phenomenal transparency, spatial heterogeneity, visual homogeneity

386 Energy Supply, Demand and Environmental Analysis – A Case Study of Indian Energy Scenario

Authors: I.V. Saradhi, G.G. Pandit, V.D. Puranik

Abstract:

Increasing concerns over climate change have limited the liberal use of available energy technology options, and India faces a formidable challenge in meeting its energy needs and providing adequate energy of the desired quality, in various forms, to users in a sustainable manner at reasonable cost. This paper studies the role of various energy technology options under different scenarios, namely a baseline scenario, a high-nuclear scenario, a high-renewables scenario, and low- and high-growth-rate scenarios. The study was carried out using the Model for Energy Supply Strategy Alternatives and their General Environmental Impacts (MESSAGE), which evaluates alternative energy supply strategies with user-defined constraints on fuel availability, environmental regulations and so on. The projected electricity demand at the end of the study period (2035) is 500,490 MWyr; in the baseline scenario the model predicts this demand to be met by thermal (428,170 MWyr), hydro (40,320 MWyr), nuclear (14,000 MWyr) and wind (18,000 MWyr) generation. Coal remains the dominant fuel for electricity production during the study period, although import dependency on coal increases. In the baseline scenario the cumulative carbon dioxide emissions up to 2035 are about 11,000 million tonnes of CO2. In the high-nuclear scenario, carbon dioxide emissions are reduced by 10% when the nuclear share increases to 9%, compared with 3% in the baseline scenario; similarly, aggressive use of renewables reduces carbon dioxide emissions by 4%.

Keywords: Carbon dioxide, energy, electricity, MESSAGE.

385 Scaling up Detection Rates and Reducing False Positives in Intrusion Detection using NBTree

Authors: Dewan Md. Farid, Nguyen Huu Hoa, Jerome Darmont, Nouria Harbi, Mohammad Zahidur Rahman

Abstract:

In this paper, we present a new learning algorithm for anomaly-based network intrusion detection using an improved self-adaptive naïve Bayesian tree (NBTree), which induces a hybrid of a decision tree and naïve Bayesian classifiers. The proposed approach scales up balanced detection across different attack types while keeping false positives at an acceptable level. In complex, dynamic and large intrusion detection datasets, the detection accuracy of the naïve Bayesian classifier does not scale up as well as that of a decision tree, while it has been shown in other problem domains that the naïve Bayesian tree improves classification rates on large datasets. In an NBTree, internal nodes split as in regular decision trees, but the leaves contain naïve Bayesian classifiers. Experimental results on the KDD99 benchmark network intrusion detection dataset demonstrate that the new approach scales up the detection rates for different attack types and reduces false positives in network intrusion detection.
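
A rough sketch of the NBTree idea, using scikit-learn as a stand-in: grow a shallow decision tree, then fit a naïve Bayesian classifier on the training samples that fall into each leaf. NBTree's utility-based split selection and the KDD99 data are not reproduced here.

```python
# Shallow decision tree with naive Bayes models at the leaves (NBTree-style hybrid, simplified).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=2000, n_features=15, n_informative=6,
                           n_classes=4, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

leaf_models = {}
leaves = tree.apply(X)                      # leaf index for every training sample
for leaf in np.unique(leaves):
    idx = leaves == leaf
    if np.unique(y[idx]).size > 1:          # fit a leaf-level naive Bayes only on mixed leaves
        leaf_models[leaf] = GaussianNB().fit(X[idx], y[idx])

def predict(X_new):
    out = np.empty(len(X_new), dtype=int)
    for i, leaf in enumerate(tree.apply(X_new)):
        if leaf in leaf_models:
            out[i] = leaf_models[leaf].predict(X_new[i:i + 1])[0]
        else:                               # pure leaf: fall back to the tree's own prediction
            out[i] = tree.predict(X_new[i:i + 1])[0]
    return out

print("training accuracy:", (predict(X) == y).mean())
```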

Keywords: Detection rates, false positives, network intrusion detection, naïve Bayesian tree.

384 Artificial Intelligence Techniques Applications for Power Disturbances Classification

Authors: K. Manimala, Dr. K. Selvi, R. Ahila

Abstract:

Artificial Intelligence (AI) methods are increasingly being used for problem solving. This paper concerns the use of AI-type learning machines for the power quality problem, which is of general interest to power systems that must provide quality power to all appliances. Electrical power of good quality is essential for the proper operation of electronic equipment such as computers and PLCs; malfunction of such equipment may lead to loss of production or disruption of critical services, resulting in huge financial and other losses. It is therefore necessary that critical loads be supplied with electricity of acceptable quality. Recognizing the presence of a disturbance and classifying it into a particular type is the first step in combating the problem. In this work, two classes of AI methods for power quality data mining are studied: Artificial Neural Networks (ANNs) and Support Vector Machines (SVMs). We show that SVMs are superior to ANNs in two critical respects: they train and run an order of magnitude faster, and they give higher classification accuracy.

Keywords: back propagation network, power quality, probabilistic neural network, radial basis function support vector machine

383 Automated Heart Sound Classification from Unsegmented Phonocardiogram Signals Using Time Frequency Features

Authors: Nadia Masood Khan, Muhammad Salman Khan, Gul Muhammad Khan

Abstract:

Cardiologists perform cardiac auscultation to detect abnormalities in heart sounds. Since accurate auscultation is a crucial first step in screening patients for heart disease, there is a need for computer-aided detection/diagnosis (CAD) systems that assist cardiologists in interpreting heart sounds and provide second opinions. In this paper, different algorithms are implemented for automated heart sound classification from unsegmented phonocardiogram (PCG) signals. A support vector machine (SVM), an artificial neural network (ANN) and a cartesian genetic programming evolved artificial neural network (CGPANN) are explored, without the application of any segmentation algorithm. The signals are first pre-processed to remove unwanted frequencies, and both time- and frequency-domain features are then extracted to train the different models. The algorithms are tested in multiple scenarios and their strengths and weaknesses are discussed. Results indicate that the SVM outperforms the rest, with an accuracy of 73.64%.
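
The kind of time- and frequency-domain features mentioned above can be sketched with plain NumPy over a raw, unsegmented signal; the feature list, sampling rate and synthetic signal are illustrative only.

```python
# Simple time- and frequency-domain features from a synthetic, unsegmented PCG-like signal.
import numpy as np

fs = 2000                                        # sampling rate in Hz (assumed)
t = np.arange(0, 5, 1 / fs)
pcg = np.sin(2 * np.pi * 40 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)

def extract_features(x, fs):
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    return {
        # time domain
        "mean": x.mean(),
        "std": x.std(),
        "zero_crossing_rate": np.mean(np.sign(x[:-1]) != np.sign(x[1:])),
        # frequency domain
        "spectral_centroid": (freqs * spectrum).sum() / spectrum.sum(),
        "energy_20_150Hz": spectrum[(freqs >= 20) & (freqs <= 150)].sum() / spectrum.sum(),
    }

print(extract_features(pcg, fs))
```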

Keywords: Pattern recognition, machine learning, computer aided diagnosis, heart sound classification, and feature extraction.

382 Estimation of Real Power Transfer Allocation Using Intelligent Systems

Authors: H. Shareef, A. Mohamed, S. A. Khalid, Aziah Khamis

Abstract:

This paper presents the application of artificial intelligence (AI) techniques, namely an artificial neural network (ANN) and an adaptive neuro-fuzzy inference system (ANFIS), to estimate the real power transfer between generators and loads. Since these AI techniques use supervised learning, the modified nodal equation (MNE) method is first used to determine the real power contribution from each generator to the loads; the results of the MNE method and load flow information are then used to estimate the power transfer with the AI techniques. The 25-bus equivalent system of southern Malaysia is used as a test system to illustrate the effectiveness of both AI methods compared with the MNE method. The mean squared errors of the ANN and ANFIS power transfer allocation estimates are 1.19E-05 and 2.97E-05, respectively. Furthermore, the ANN and ANFIS methods compute the generator contributions to loads within 20.99 ms and 39.37 ms respectively, whereas the MNE method took 360 ms to calculate the same real power transfer allocation.

Keywords: Artificial intelligence, Power tracing, Artificial neural network, ANFIS, Power system deregulation.

381 Efficacy of Methyl Eugenol and Food-Based Lures in Trapping Oriental Fruit Fly Bactrocera dorsalis (Diptera: Tephritidae) on Mango Homestead Trees

Authors: Juliana Amaka Ugwu

Abstract:

The trapping efficiency of methyl eugenol and three locally made food-based lures was evaluated at three locations for trapping B. dorsalis on mango homestead trees in Ibadan, southwest Nigeria. The treatments were methyl eugenol, brewery waste, pineapple juice, orange juice, and a control (water). The experiment was laid out in a Complete Randomized Block Design (CRBD) and replicated three times in each location. Data collected were subjected to analysis of variance, and significant means were separated by Tukey's test. The results showed that B. dorsalis was recorded at all locations. Methyl eugenol trapped a significantly (P < 0.05) higher population of B. dorsalis in all the study areas, and the population density of B. dorsalis was highest during the mango ripening period at all locations. The percentages of flies trapped after 7 weeks were 77.85%-82.38% (methyl eugenol), 7.29%-8.64% (pineapple juice), 5.62%-7.62% (brewery waste), 4.41%-5.95% (orange juice), and 0.24%-0.47% (control). There were no significant differences (p > 0.05) in the population of B. dorsalis trapped across locations, and no significant differences (p > 0.05) among the food attractants; however, the three food attractants trapped significantly (p < 0.05) more flies than the control. Methyl eugenol trapped only male flies, while brewery waste and the other food-based attractants trapped both male and female flies. The food baits tested are promising attractants for trapping B. dorsalis on mango homestead trees, so increased dosages could be considered for monitoring and mass trapping as management strategies against fruit fly infestation.

Keywords: Attractants, trapping, mango, Bactrocera dorsalis.

380 Speaker Identification by Joint Statistical Characterization in the Log Gabor Wavelet Domain

Authors: Suman Senapati, Goutam Saha

Abstract:

Real-world speaker identification (SI) applications differ from ideal or laboratory conditions, and the resulting perturbations lead to a mismatch between the training and testing environments that degrades performance drastically. Many strategies have been adopted to cope with acoustic degradation; the wavelet-based Bayesian marginal model is one of them, but Bayesian marginal models cannot capture the inter-scale statistical dependencies between different wavelet scales. Simple nonlinear estimators for wavelet-based denoising assume that the wavelet coefficients at different scales are independent, whereas in fact they show significant inter-scale dependency. This paper exploits this inter-scale dependency through a Circularly Symmetric Probability Density Function (CS-PDF) related to the family of Spherically Invariant Random Processes (SIRPs) in the Log Gabor Wavelet (LGW) domain, and the corresponding joint shrinkage estimator is derived as a Maximum a Posteriori (MAP) estimator. A framework based on these elements is proposed to denoise the speech signal for automatic speaker identification. The robustness of the proposed framework is tested for text-independent speaker identification on 100 speakers of the POLYCOST and 100 speakers of the YOHO speech databases in three different noise environments. Experimental results show that the proposed estimator yields a higher improvement in identification accuracy than other estimators with a popular Gaussian Mixture Model (GMM) based speaker model and Mel-Frequency Cepstral Coefficient (MFCC) features.

Keywords: Speaker Identification, Log Gabor Wavelet, Bayesian Bivariate Estimator, Circularly Symmetric Probability Density Function, SIRP.

379 Game-Theory-Based on Downlink Spectrum Allocation in Two-Tier Networks

Authors: Yu Zhang, Ye Tian, Fang Ye, Yixuan Kang

Abstract:

The capacity of conventional cellular networks has reached its upper bound, and this can be addressed by introducing femtocells, which are low cost and easy to deploy. Spectrum interference becomes more critical as value-added multimedia services keep growing in two-tier cellular networks, and spectrum allocation is one of the effective interference mitigation techniques. This paper proposes a game-theory-based OFDMA downlink spectrum allocation scheme aimed at reducing co-channel interference in two-tier femtocell networks. The framework is formulated as a non-cooperative game in which the femto base stations are the players and the available frequency channels are the strategies. The scheme takes full account of competitive behavior and fairness among stations, and the utility function essentially reflects the interference on each channel. Focusing on co-channel interference, the paper puts forward a negative-logarithm interference function of the distance weight ratio, aimed at suppressing co-channel interference within the same network layer; this scenario is well suited to actual network deployment and the system possesses high robustness. Under the proposed mechanism, interference exists only when players use the same channel for data communication, and the spectrum allocation is implemented in a distributed fashion. Numerical results show that the signal-to-interference-and-noise ratio is clearly improved by the spectrum allocation scheme, the downlink quality of service of users is satisfied, and the average spectrum efficiency of the cellular network is significantly improved.
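
As a toy illustration of the non-cooperative game, the sketch below runs best-response dynamics in which each femto base station repeatedly switches to the channel that minimises the interference it receives from co-channel neighbours. The inverse-square-distance utility is a generic stand-in, not the paper's negative-logarithm distance-weight-ratio function.

```python
# Best-response channel selection for a handful of femto base stations (toy example).
import numpy as np

rng = np.random.default_rng(0)
n_bs, n_channels = 6, 3
pos = rng.uniform(0, 100, size=(n_bs, 2))           # femto BS positions in metres
choice = rng.integers(n_channels, size=n_bs)        # initial channel of each BS

def interference(i, ch, choice):
    """Interference BS i would receive on channel ch from co-channel neighbours."""
    others = [j for j in range(n_bs) if j != i and choice[j] == ch]
    return sum(1.0 / np.linalg.norm(pos[i] - pos[j]) ** 2 for j in others)

for _ in range(20):                                  # best-response iterations
    changed = False
    for i in range(n_bs):
        best = min(range(n_channels), key=lambda ch: interference(i, ch, choice))
        if best != choice[i]:
            choice[i], changed = best, True
    if not changed:                                  # no player wants to deviate: equilibrium
        break

print("channel assignment:", choice)
```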

Keywords: Femtocell networks, game theory, interference mitigation, spectrum allocation.

378 Mistranslation in Cross Cultural Communication: A Discourse Analysis on Former President Bush’s Speech in 2001

Authors: Lowai Abed

Abstract:

Differences between languages play a big role in cross-cultural communication. If meanings are not translated accurately, the risk can be crucial not only on an interpersonal level but also on the international and political levels. The use of metaphorical language by politicians can cause great confusion, often leading to statements being misconstrued, and in these situations it is translators who struggle to convey the intended meaning with clarity; this makes translation an important field to study and analyze in cross-cultural communication. Owing to the growing importance of language and the power of translation in politics, this research analyzes the part of President Bush's 2001 speech in which he used the word "crusade," which caused his statement to be misconstrued. The research uses a discourse analysis of the cross-cultural communication literature, providing answers supported by historical, linguistic, and communicative perspectives. The first finding is that the word 'crusade' carries a different meaning and significance in the narratives of the Western world than in the Middle East. The second is that, linguistically, maintaining cultural meanings through translation is difficult and challenging. Third, from the cross-cultural communication perspective, the common and frequent use of literal translation is a sign of the poor strategies followed in translation training. Based on the example of Bush's speech, this paper highlights the weak translation practices in cross-cultural communication that are still common across the world; translation studies have to take such issues seriously and attempt to find a solution. In every language there are words and phrases whose cultural, historical and social meanings are woven into the language, and literal translation cannot convey these meanings in the target language.

Keywords: Crusade, metaphor, mistranslation, war in terror.

377 Effective Factors Increasing the Students’ Interest in Mathematics in the Opinion of Mathematic Teachers of Zahedan

Authors: Safiyeh Khayati, Ali Payan

Abstract:

The main objective of this study was to identify the factors and conditions that motivate and encourage students towards the math class and make it an attractive, well-liked one. To this end, questionnaires consisting of 15 questions were distributed among 85 math teachers working in schools in Zahedan. Analysis of the collected questionnaires showed that student activity during math lessons and the behavior of previous math teachers had the greatest impact on encouraging students towards mathematics. Having a dedicated mathematics classroom separate from the main classroom (decorated with student-made crafts related to the math book, including articles, wall newspapers, figures and formulas), peers, the size and appearance of the math book, and first-grade teachers at each educational level (among whom elementary first-grade teachers had the greatest importance and impact) were among the most influential factors. School environment, family, conducting research related to mathematics, its application in daily life and in other courses, and studying the history of mathematics were then categorized as important factors that increase students' interest in mathematics.

Keywords: Interest, motivation, mathematical learning.

376 Democratization, Market Liberalization and the Raise of Vested Interests and Its Impacts on Anti-Corruption Reform in Indonesia

Authors: Ahmad Khoirul Umam

Abstract:

This paper investigates the role of vested interests and their impact on the anti-corruption agenda in Indonesia following the collapse of the authoritarian regime in 1998. Pervasive and rampant corruption has been seen as the main cause of the fragility of the state economy. Anti-corruption measures were therefore implemented through democratization and market liberalization, since the establishment of a consolidated democracy going hand in hand with a liberal market economy was believed to be an efficacious prescription for effective anti-corruption. The reform movement also mandated the establishment of an independent, neutral and professional anti-corruption agency, the Corruption Eradication Commission (KPK), to intensify the fight against systemic corruption. This paper examines whether these anti-corruption measures have been effective, and investigates to what extent anti-corruption efforts, especially those conducted by the KPK, have been impeded by the emergence of a nexus of vested interests as a side effect of democratization and market liberalization. Based on interviews with key stakeholders from the KPK, other law enforcement agencies, government, prominent scholars, journalists and NGOs in Indonesia, it is found that since the overthrow of Soeharto the anti-corruption movement has become more active and serious. After gradually winning the hearts of the people, the KPK successfully reached previously untouchable corruption perpetrators who had been protected by political immunity, legal protection and bureaucratic barriers. However, these changes have not necessarily reduced systemic and structural corruption. Ironically, intensive and devastating counterattacks were frequently mounted by an alignment of business actors, elites of political parties, government, and law enforcement agencies, hijacking the state's instruments to leave the KPK deflated, powerless, and forced to surrender. This paper concludes that democratization, market liberalization and the establishment of an anti-corruption agency may have helped Indonesia to reduce corruption, but it is still difficult to conclude that such measures have fostered more effective anti-corruption work in the newly democratized and weakly regulated liberal economic system.

Keywords: Vested interests, democratization, market liberalization, anti-corruption, leadership.

375 Measuring the Cognitive Abilities of Teenage Basketball Players in Singapore

Authors: Stella Y. Ng, John B. Peacock, Kay Chuan Tan

Abstract:

This paper discusses the use of a computerized test to measure the decision-making abilities of teenage basketball players in Singapore. The test has five sections: the Competitive State Anxiety Inventory-2 (CSAI-2) questionnaire, which measures a player's cognitive anxiety, somatic anxiety and self-confidence; the Corsi block-tapping task, which measures short-term spatial memory; the Situation Awareness Global Assessment Technique (SAGAT), which measures situation awareness in a basketball game; multiple-choice questions on basketball knowledge, which measure knowledge of basketball rules and concepts; and a learning test requiring participants to recall and recognize basketball set plays, which measures the ability to learn and recognize set plays. A total of 25 basketball players aged 14 to 16 years, from three secondary school teams, participated in the experiment. The results obtained by these basketball players on the cognitive test were then compared with their physical fitness and basketball performance.

Keywords: Basketball, cognitive abilities, computerized test, decision-making.

374 Estimating Spatial Disaggregation of Urban Thermal Responsiveness on Summer Diurnal Range with a Numerical Modeling Approach in Bangkok, Thailand

Authors: Manat Srivanit, Hokao Kazunori

Abstract:

Facing the public's concern for its environment and for climate change, city planners now consider the urban climate in their planning choices. Urban climate data representing different urban morphologies across the central Bangkok Metropolitan Area (BMA) are used to investigate the effects of both the composition and the configuration variables of urban morphology indicators on the summer diurnal range of the urban climate, using correlation analyses and multiple linear regressions. Results first indicate that approximately 92.6% of the variation in the average maximum daytime near-surface air temperature (Ta) is explained jointly by two composition variables of the urban morphology indicators, the open space ratio (OSR) and the floor area ratio (FAR). It was also possible to determine the membership of sample areas in local climate zones (LCZs) using these urban morphology descriptors, computed automatically from GIS and remotely sensed data. Finally, the results show that the city center was warmer than the outskirts of Bangkok, at 35.48±1.04 ºC (mean±S.D.) on average for the maximum daytime near-surface temperature and 28.27±0.21 ºC for the extreme event, and the difference can exceed 8 ºC. A spatially disaggregated map of urban thermal responsiveness would be helpful for several reasons: first, it localizes the urban areas affected by different climate behavior over summer daytime and is a good indicator of urban climate variability; second, when overlaid with a land cover map, it may help identify urban management strategies to reduce heat wave effects in the BMA.
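
The regression reported above can be sketched as an ordinary least-squares fit of Ta on OSR and FAR; the synthetic sample below only illustrates the workflow, not the Bangkok data.

```python
# Multiple linear regression of near-surface air temperature on urban morphology indicators.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
osr = rng.uniform(0.1, 0.9, 100)                  # open space ratio (synthetic)
far = rng.uniform(0.5, 8.0, 100)                  # floor area ratio (synthetic)
ta = 30 - 4.0 * osr + 0.6 * far + rng.normal(0, 0.3, 100)

X = np.column_stack([osr, far])
model = LinearRegression().fit(X, ta)
print("coefficients (OSR, FAR):", model.coef_.round(3))
print("R^2:", round(model.score(X, ta), 3))
```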

Keywords: Urban climate, Urban morphology, Local climate zone, Urban planning, GIS and remote sensing
