Search results for: reliable features
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2154

894 Voltage Stability Margin-Based Approach for Placement of Distributed Generators in Power Systems

Authors: Oludamilare Bode Adewuyi, Yanxia Sun, Isaiah Gbadegesin Adebayo

Abstract:

Voltage stability analysis is crucial to the reliable and economic operation of power systems. The power systems of developing nations are more susceptible to failures because continuously increasing load demand is not matched by increased generation or efficient transmission infrastructure. Thus, most power systems are heavily stressed, and the planning of extra generation from distributed generation sources needs to be done efficiently to ensure the security of the power system. In this paper, the performance of a relatively different approach using a line voltage stability margin indicator, which has proven to have better accuracy, is presented and compared with a conventional line voltage stability index for siting distributed generators (DGs), using the Nigerian 28-bus system. The Critical Boundary Index (CBI) for voltage stability margin estimation was deployed to identify suitable locations for DG placement, and its performance was compared with DG placement using the Novel Line Stability Index (NLSI) approach. From the simulation results, CBI and NLSI agreed closely on suitable locations for DG on the test system; while CBI identified bus 18 as the most suitable at system overload, NLSI identified bus 8 as the most suitable. Considering the effect of the DG placement at the selected buses on the voltage magnitude profile, the results show that the DG placed on bus 18, identified by CBI, improved the performance of the power system more.
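As an illustrative aside (not the authors' code), the Python sketch below shows the general workflow the abstract describes: compute a line stability indicator from load-flow results and rank candidate buses for DG placement. The index function, line data and field names are hypothetical placeholders; the CBI and NLSI formulas themselves are not reproduced here.

```python
# Hypothetical sketch: rank candidate DG buses by a line stability indicator.
# The index function is a placeholder; the paper's CBI/NLSI formulas are not reproduced.

def rank_dg_candidates(lines, index_fn):
    """lines: iterable of dicts describing each branch after a load-flow solution."""
    scored = []
    for ln in lines:
        idx = index_fn(ln)              # stability margin indicator for this line
        scored.append((idx, ln["to_bus"]))
    scored.sort()                       # smallest margin (most stressed line) first
    return [bus for _, bus in scored]

# Example with made-up data: a smaller value means a smaller stability margin here.
lines = [
    {"from_bus": 2, "to_bus": 8,  "margin": 0.21},
    {"from_bus": 5, "to_bus": 18, "margin": 0.09},
    {"from_bus": 1, "to_bus": 4,  "margin": 0.55},
]
print(rank_dg_candidates(lines, lambda ln: ln["margin"]))  # -> [18, 8, 4]
```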

Keywords: Voltage stability analysis, voltage collapse, voltage stability index, distributed generation.

893 Estimating Bridge Deterioration for Small Data Sets Using Regression and Markov Models

Authors: Yina F. Muñoz, Alexander Paz, Hanns De La Fuente-Mella, Joaquin V. Fariña, Guilherme M. Sales

Abstract:

The primary approach for estimating bridge deterioration uses Markov-chain models and regression analysis. Traditional Markov models have problems estimating the required transition probabilities when a small sample size is used. Often, reliable bridge data have not been collected over long periods, so large data sets may not be available. This study presents an important change to the traditional approach by using the Small Data Method to estimate transition probabilities. The results illustrate that the Small Data Method and the traditional approach provide similar estimates; however, the former provides results that are more conservative. That is, the Small Data Method provided slightly lower bridge condition ratings than the traditional approach. Considering that bridges are critical infrastructure, the Small Data Method, which uses more information and provides more conservative estimates, may be more appropriate when the available sample size is small. In addition, regression analysis was used to calculate bridge deterioration. Condition ratings were determined for bridge groups, and the best regression model was selected for each group. The results obtained were very similar to those obtained using Markov chains; however, it is desirable to use more data for better results.
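A minimal sketch of the Markov-chain side of this approach, assuming condition ratings are coded as integer states; the maximum-likelihood counting shown here is the textbook estimator, not the paper's Small Data Method, and the inspection histories are made up.

```python
import numpy as np

def estimate_transition_matrix(ratings, n_states):
    """Maximum-likelihood transition probabilities from observed consecutive
    condition ratings (rows = bridges, cols = inspection cycles).
    States are integers 0..n_states-1, with 0 the best condition."""
    counts = np.zeros((n_states, n_states))
    for history in ratings:
        for a, b in zip(history[:-1], history[1:]):
            counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0                    # avoid division by zero for unseen states
    return counts / rows

def forecast(p0, P, cycles):
    """Propagate an initial condition distribution p0 through P for `cycles` steps."""
    p = np.asarray(p0, float)
    for _ in range(cycles):
        p = p @ P
    return p

# Tiny made-up example with 3 condition states.
histories = [[0, 0, 1, 1], [0, 1, 1, 2], [0, 0, 0, 1]]
P = estimate_transition_matrix(histories, 3)
print(P)
print(forecast([1, 0, 0], P, 5))   # expected condition distribution after 5 cycles
```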

Keywords: Concrete bridges, deterioration, Markov chains, probability matrix.

892 Thermal Analysis of Extrusion Process in Plastic Making

Authors: S. K. Fasogbon, T. M. Oladosu, O. S. Osasuyi

Abstract:

Plastic extrusion has been an important plastic production process since the 19th century. However, in the plastic extrusion process, wide variation in temperature along the extrudate usually leads to scrap forming on the sides of finished products. To avoid this, the temperature distribution along the extrudate during extrusion needs to be understood in depth. This work developed an analytical model that predicts the temperature distribution over the billet (the polymer melt) along the extrudate during the extrusion process, with the limitation that the polymers considered do not include biopolymers such as DNA. The model was solved and simulated. Results for two different plastic materials (polyvinyl chloride and polycarbonate), obtained using a self-developed MATLAB code and commercially developed software (ANSYS), were generated and compared. It was observed that heat is transferred from the billet at the die entry down to the die exit. The plots show a natural exponential decay of temperature with time and along the die length, the temperature being 413 K and 474 K for polyvinyl chloride and polycarbonate, respectively, at the entry and 299.3 K and 328.8 K at the exit, with a surrounding temperature of 298 K. The extrusion model was validated by comparing the MATLAB simulation with a commercially available ANSYS simulation, and the results agree favourably. This work concludes that the developed mathematical model and the self-generated MATLAB code are reliable tools for predicting the temperature distribution along the extrudate in the plastic extrusion process.
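The exponential decay described above can be illustrated with a short sketch. The decay form T(x) = T_amb + (T_in − T_amb)·exp(−kx) and the end-point temperatures are taken from the abstract; the die length and the back-solved decay constant are assumptions for illustration only.

```python
import numpy as np

def die_temperature(x, L, T_in, T_exit, T_amb):
    """Exponential decay of melt temperature toward ambient along the die:
    T(x) = T_amb + (T_in - T_amb) * exp(-k x), with k chosen so that T(L) = T_exit."""
    k = -np.log((T_exit - T_amb) / (T_in - T_amb)) / L
    return T_amb + (T_in - T_amb) * np.exp(-k * x)

L = 0.5                                  # assumed die length in metres (not from the paper)
x = np.linspace(0, L, 6)
print(die_temperature(x, L, T_in=413.0, T_exit=299.3, T_amb=298.0))  # PVC case
print(die_temperature(x, L, T_in=474.0, T_exit=328.8, T_amb=298.0))  # polycarbonate case
```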

Keywords: ANSYS, extrusion process, MATLAB, plastic making, thermal analysis.

891 Comparison of Finite Difference Schemes for Water Flow in Unsaturated Soils

Authors: H. Taheri Shahraiyni, B. Ataie Ashtiani

Abstract:

Water movement in unsaturated soil can be expressed by a partial differential equation known as the Richards equation. The objective of this study is to find an appropriate implicit numerical solution for the head-based Richards equation. Some well-known finite difference schemes (fully implicit, Crank-Nicolson and Runge-Kutta) are utilized in this study. In addition, the effects of different approximations of the moisture capacity function, convergence criteria and time-stepping methods were evaluated. Two different infiltration problems were solved to investigate the performance of the different schemes; these involve vertical water flow in a wet and a very dry soil. The numerical solutions of the two problems were compared using four evaluation criteria, and the comparisons showed that the fully implicit scheme is better than the other schemes. In addition, using the standard chord-slope method to approximate the moisture capacity function, an automatic time-stepping method, and the difference between two successive iterations as the convergence criterion in the fully implicit scheme leads to better and more reliable results for simulating fluid movement in different unsaturated soils.
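A minimal sketch of two ingredients singled out above: the chord-slope approximation of the moisture capacity C = dθ/dh between successive iterates, and a convergence check based on the difference between two successive iterations. The van Genuchten retention curve and its parameter values are assumed here for illustration; they are not taken from the paper.

```python
import numpy as np

def theta_vg(h, theta_r=0.075, theta_s=0.40, alpha=0.036, n=1.56):
    """van Genuchten water-retention curve (assumed here for illustration)."""
    m = 1.0 - 1.0 / n
    h = np.asarray(h, float)
    se = np.where(h < 0, (1.0 + np.abs(alpha * h) ** n) ** (-m), 1.0)
    return theta_r + (theta_s - theta_r) * se

def chord_slope_capacity(h_new, h_old):
    """Standard chord-slope approximation of the moisture capacity C = d(theta)/dh
    between two successive iterates (guards against division by zero if they coincide)."""
    dh = h_new - h_old
    dh = np.where(np.abs(dh) < 1e-10, 1e-10, dh)
    return (theta_vg(h_new) - theta_vg(h_old)) / dh

def converged(h_new, h_old, tol=1e-4):
    """Convergence criterion: maximum difference between two successive iterations."""
    return np.max(np.abs(h_new - h_old)) < tol

# Example: capacity between two iterates of the pressure head (in cm).
h_prev = np.array([-1000.0, -500.0, -100.0])
h_curr = np.array([-950.0, -480.0, -95.0])
print(chord_slope_capacity(h_curr, h_prev))
print(converged(h_curr, h_prev))
```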

Keywords: Finite difference methods, Richards equation, fully implicit, Crank-Nicolson, Runge-Kutta.

890 Seamless Handover in Urban 5G-UAV Systems Using Entropy Weighted Method

Authors: Anirudh Sunil Warrier, Saba Al-Rubaye, Dimitrios Panagiotakopoulos, Gokhan Inalhan, Antonios Tsourdos

Abstract:

The demand for increased data transfer rates and network traffic capacity has given rise to the concept of heterogeneous networks. Heterogeneous networks are wireless networks consisting of devices that use different underlying radio access technologies (RAT). For Unmanned Aerial Vehicles (UAVs), this enhanced data rate and network capacity are even more critical, especially in applications such as medicine, delivery missions and the military. In an urban heterogeneous network environment, UAVs must be able to switch seamlessly from one base station (BS) to another to maintain a reliable link. Seamless handover in such urban environments has therefore become a major challenge. In this paper, a scheme to achieve seamless handover is developed: an algorithm based on the Received Signal Strength (RSS) criterion is used for network selection, and the Entropy Weighted Method (EWM) is implemented for decision making. Seamless handover using EWM decision making is demonstrated successfully for a UAV moving across fifth generation (5G) and long-term evolution (LTE) networks via a simulation-level analysis. Thus, a solution for UAV-5G communication, specifically the mobility challenge in heterogeneous networks, is provided, and this work could act as a step forward in making UAV-5G architecture integration a possibility.
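The Entropy Weighted Method used for the handover decision can be sketched as follows; the candidate-network criteria and values are hypothetical, but the weighting steps (column-normalise, compute entropy, convert divergence to weights) follow the standard EWM formulation.

```python
import numpy as np

def entropy_weights(X):
    """Entropy Weighted Method: derive criterion weights from a decision matrix
    X (rows = candidate base stations, cols = criteria such as RSS, load, ...)."""
    X = np.asarray(X, float)
    m = X.shape[0]
    P = X / X.sum(axis=0)                      # column-normalised proportions
    plogp = np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -plogp.sum(axis=0) / np.log(m)         # entropy of each criterion
    d = 1.0 - e                                # degree of divergence
    return d / d.sum()

def select_target(X):
    """Pick the candidate with the best entropy-weighted score (benefit criteria assumed)."""
    X = np.asarray(X, float)
    w = entropy_weights(X)
    scores = (X / X.max(axis=0)) @ w
    return int(np.argmax(scores)), scores

# Hypothetical candidates: [normalised RSS, available bandwidth, 1/latency]
X = [[0.8, 20.0, 0.10],
     [0.6, 35.0, 0.12],
     [0.9, 10.0, 0.08]]
best, scores = select_target(X)
print(best, scores)
```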

Keywords: Air to ground, A2G, fifth generation, 5G, handover, mobility, unmanned aerial vehicle, UAV, urban environments.

889 Distributed Splay Suffix Arrays: A New Structure for Distributed String Search

Authors: Tu Kun, Gu Nai-jie, Bi Kun, Liu Gang, Dong Wan-li

Abstract:

As a data structure for string processing problems, the suffix array is widely known and extensively studied. But if the string access pattern follows the "90/10" rule, a suffix array cannot take advantage of the fact that we often look for something we have just found. Although the splay tree is an efficient data structure for small documents when the access pattern follows the "90/10" rule, it requires many auxiliary structures and an excessive amount of pointer manipulation to process and search large documents efficiently. In this paper, we propose a new and conceptually powerful data structure for string search, called the splay suffix array (SSA). This data structure combines the features of splay trees and suffix arrays into a new approach which is suitable for implementation on both conventional and clustered computers.
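For context, here is a plain (non-splay) suffix array with its binary search, the baseline that SSA augments with splay-tree behaviour; this is a generic textbook sketch, not the proposed SSA structure.

```python
import bisect

def build_suffix_array(text):
    """Plain suffix array: starting positions of all suffixes in sorted order
    (simple O(n^2 log n) construction, fine for a small example)."""
    return sorted(range(len(text)), key=lambda i: text[i:])

def find_occurrences(text, sa, pattern):
    """Find all occurrences of `pattern` by binary search over the suffix array.
    The helper list of prefixes trades the pure O(m log n) search for clarity."""
    prefixes = [text[i:i + len(pattern)] for i in sa]
    lo = bisect.bisect_left(prefixes, pattern)
    hi = bisect.bisect_right(prefixes, pattern)
    return sorted(sa[lo:hi])

text = "banana"
sa = build_suffix_array(text)
print(sa)                                 # [5, 3, 1, 0, 4, 2]
print(find_occurrences(text, sa, "an"))   # [1, 3]
```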

Keywords: Suffix arrays, splay tree, string search, distributed algorithm.

888 3D Printing Technology in Housing Projects Construction

Authors: Mohammed F. Haddad, Mohammad A. Albenayyan

Abstract:

Realistically, 3-D printing as a technology has not yet reached the required maturity level to handle construction housing projects for citizens on a country scale. However, potentially, it has all of the required elements for addressing this issue. There are two main high-level elements of this technology that need to be capitalized on in order for the technology to reach its full potential: technical and logistical. This paper aims to cover how 3-D printing can be a viable technical solution for housing projects and describes the impact of 3-D printing technical features on the logistical aspects of completing a housing project. Additionally, a perspective about 3-D printing in Saudi Arabia will be presented in order to give the reader an idea of where the Kingdom stands in the deployment of this technology. Finally, a glimpse will be given regarding the potential utilization of this technology for space applications.

Keywords: Large-scale 3-D printing, additive manufacturing, D-shape, contour crafting.

887 Microservices-Based Provisioning and Control of Network Services for Heterogeneous Networks

Authors: Shameemraj M. Nadaf, Sipra Behera, Hemant K. Rath, Garima Mishra, Raja Mukhopadhyay, Sumanta Patro

Abstract:

Microservices architecture has been widely embraced for rapid, frequent, and reliable delivery of complex applications. It enables organizations to evolve their technology stack in various domains. Today, the networking domain is flooded with a plethora of devices and software solutions that address functionalities ranging from elementary operations, viz., switching, routing, firewalls, etc., to complex analytics- and insight-based intelligent services. In this paper, we attempt to bring a microservices-based approach to the agile and adaptive delivery of network services for any underlying networking technology. We discuss the life-cycle management of each individual microservice and a distributed control approach, with emphasis on dynamic provisioning, management, and orchestration in an automated fashion that can provide seamless operations in large-scale networks. We have validated the system in a lab testbed comprising traditional/legacy and software-defined wireless local area networks.

Keywords: Microservices architecture, software defined wireless networks, traditional wireless networks, automation, orchestration, intelligent networks, network analytics, seamless management, single pane control, fine-grain control.

886 The Role of Object-Oriented Simulation Modeling in Maintenance Processes

Authors: Abdulsalam A. Al-Sudairi

Abstract:

Object-oriented simulation is considered one of the most sophisticated techniques widely used in planning, designing, executing and maintaining construction projects. This technique enables the modeler to focus on objects, which is extremely important for a thorough understanding of a system. Thus, identifying an object is an essential point in building a successful simulation model. In a maintenance process, an object is a maintenance work order (MWO). This study demonstrates a maintenance simulation model for the building maintenance division of the Saudi Consolidated Electric Company (SCECO) in Dammam, Saudi Arabia. The model covers both types of maintenance process, namely (1) preventive maintenance (PM) and (2) corrective maintenance (CM). It is apparent from the findings that object-oriented simulation is a good diagnostic and experimental tool, because problems, limitations, bottlenecks and so forth are easily identified; such insights are very difficult to obtain with other tools.

Keywords: Object oriented, simulation, maintenance, process, work orders

885 Balancing Strategies for Parallel Content-based Data Retrieval Algorithms in a k-tree Structured Database

Authors: Radu Dobrescu, Matei Dobrescu, Daniela Hossu

Abstract:

The paper proposes a unified model for multimedia data retrieval which includes data representatives, content representatives, an index structure, and search algorithms. The multimedia data are defined as k-dimensional signals indexed in a multidimensional k-tree structure. The benefits of using the k-tree unified model were demonstrated by running the data retrieval application on a test-bed cluster of six networked nodes. The tests were performed with two retrieval algorithms: one that allows parallel searching using a single feature, and a second that performs a weighted cascade search for multi-feature queries. The experiments show a significant reduction of retrieval time while maintaining the quality of results.
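A rough sketch of one possible weighted cascade search, assuming Euclidean distances per feature: prune candidates on the first feature, then re-rank the survivors by a weighted sum of distances over all features. The distance measures, weights and pruning fraction are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def weighted_cascade_search(query, db, weights, keep=0.5):
    """query / db entries: tuples of per-feature vectors; weights: one weight per feature.
    Stage 1 prunes on feature 0; stage 2 re-ranks survivors on the weighted sum of
    distances over all features (illustrative sketch only)."""
    d0 = [np.linalg.norm(np.subtract(item[0], query[0])) for item in db]
    order = np.argsort(d0)
    survivors = order[:max(1, int(keep * len(db)))]
    scores = []
    for i in survivors:
        total = sum(w * np.linalg.norm(np.subtract(db[i][f], query[f]))
                    for f, w in enumerate(weights))
        scores.append((total, int(i)))
    return [i for _, i in sorted(scores)]

db = [([0.1, 0.2], [1.0]), ([0.9, 0.8], [0.2]), ([0.11, 0.19], [0.9])]
print(weighted_cascade_search(([0.1, 0.2], [1.0]), db, weights=[0.6, 0.4], keep=0.7))
```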

Keywords: Balancing strategies, multimedia databases, parallel processing, retrieval algorithms.

884 A Validity and Reliability Study of Grasha-Riechmann Student Learning Style Scale

Authors: Yaşar Baykul, Musa Gürsel, Hacı Sulak, Erhan Ertekin, Ersen Yazıcı, Osman Dülger, Yasin Aslan, Kağan Büyükkarcı

Abstract:

The reliability of the tools developed to identify learning styles is essential for determining students' learning styles trustworthily. For this purpose, the psychometric features of the Grasha-Riechmann Student Learning Style Inventory developed by Grasha were studied as a contribution to this field. The study was carried out on 6th, 7th, and 8th graders of 10 primary education schools in Konya. The inventory was applied twice with an interval of one month, and according to the data from these applications, the reliability coefficients of the six sub-dimensions posited in the theory of the inventory were found to be moderate. In addition, it was found that the inventory does not have a six-factor structure, as represented in the theory, for either Mathematics or English courses.
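As a small illustration of how a test-retest reliability coefficient for one sub-dimension can be computed from two administrations (the scores below are made up; the paper's own coefficients and method are not reproduced):

```python
import numpy as np

def test_retest_reliability(first, second):
    """Pearson correlation between two administrations of a sub-dimension score,
    one common way to report a test-retest reliability coefficient."""
    return float(np.corrcoef(first, second)[0, 1])

# Hypothetical scores of the same students on one sub-dimension, one month apart.
t1 = [3.2, 2.8, 4.1, 3.9, 2.5, 3.0]
t2 = [3.0, 3.1, 4.0, 3.6, 2.9, 3.2]
print(round(test_retest_reliability(t1, t2), 2))
```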

Keywords: Learning styles, Grasha-Riechmann, reliability, validity.

883 A New Approach to ECG Biometric Systems: A Comparative Study between LPC and WPD Systems

Authors: Justin Leo Cheang Loong, Khazaimatol S Subari, Rosli Besar, Muhammad Kamil Abdullah

Abstract:

In this paper, a novel method for a biometric system based on the ECG signal is proposed, using spectral coefficients computed through linear predictive coding (LPC). ECG biometric systems have traditionally incorporated characteristics of fiducial points of the ECG signal as the feature set. These systems have been shown to contain loopholes, so a non-fiducial system allows for tighter security. In the proposed system, incorporating non-fiducial features from the LPC spectrum produced segment and subject recognition rates of 99.52% and 100%, respectively. The proposed system outperformed a biometric system based on the wavelet packet decomposition (WPD) algorithm in terms of both recognition rate and computation time. This allows LPC to be used in a practical ECG biometric system that requires fast, stringent and accurate recognition.
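A minimal sketch of extracting LPC coefficients from a signal segment via the autocorrelation (Yule-Walker) method; the toy segment, model order and lack of preprocessing are assumptions, not the paper's exact feature pipeline.

```python
import numpy as np

def lpc_coefficients(x, order):
    """LPC coefficients via the autocorrelation method: solve the Yule-Walker
    equations R a = r, where R is the Toeplitz autocorrelation matrix."""
    x = np.asarray(x, float)
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])
    return a            # predictor: x[n] ~ sum_k a[k] * x[n-1-k]

# Toy quasi-periodic segment standing in for an ECG segment.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
segment = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 8 * t) \
          + 0.01 * rng.standard_normal(t.size)
print(lpc_coefficients(segment, order=8))
```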

Keywords: Biometrics, ECG, linear predictive coding, wavelet packet decomposition.

882 A Multi-layer Artificial Neural Network Architecture Design for Load Forecasting in Power Systems

Authors: Axay J Mehta, Hema A Mehta, T.C.Manjunath, C. Ardil

Abstract:

In this paper, the modelling and design of an artificial neural network architecture for load forecasting purposes is investigated. The primary prerequisite for power system planning is to arrive at realistic estimates of future power demand, which is known as load forecasting. Short Term Load Forecasting (STLF) helps in determining economic, reliable and secure operating strategies for the power system. The dependence of load on several factors makes load forecasting a very challenging job. An overestimation of the load may cause premature investment and unnecessary blocking of capital, whereas an underestimation may result in a shortage of equipment and circuits. It is always better to plan the system for a load slightly higher than the expected one so that no exigency may arise. In this paper, a load-forecasting model is proposed using a multilayer neural network with an appropriately modified back-propagation learning algorithm. Once the neural network model is designed and trained, it can forecast the load of the power system 24 hours ahead on a daily basis and can also forecast the cumulative daily load. The real load data used for training the artificial neural network were taken from the LDC, Gujarat Electricity Board, Jambuva, Gujarat, India. The results show that the load forecast of the ANN model follows the actual load pattern accurately throughout the forecast period.
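A compact sketch of a one-hidden-layer network trained with plain back-propagation to map one day's 24 hourly loads to the next day's. The synthetic data, architecture and learning rate are illustrative stand-ins for the paper's modified back-propagation model and the Gujarat load data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training pairs: X = 24 hourly loads of day d (normalised 0..1),
# Y = 24 hourly loads of day d+1.  Stand-in data just to make the sketch runnable.
X = rng.random((200, 24))
Y = np.roll(X, -1, axis=0)

W1 = rng.normal(0, 0.1, (24, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, 24)); b2 = np.zeros(24)
lr = 0.05

for epoch in range(500):                 # plain batch back-propagation
    H = np.tanh(X @ W1 + b1)             # hidden layer
    P = H @ W2 + b2                      # linear output layer
    err = P - Y
    gW2 = H.T @ err / len(X);  gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)     # back-propagate through tanh
    gW1 = X.T @ dH / len(X);   gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("final MSE:", float((err ** 2).mean()))
```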

Keywords: Power system, Load forecasting, Neural Network, Neuron, Stabilization, Network structure, Load.

881 STATISTICA Software: A State of the Art Review

Authors: S. Sarumathi, N. Shanthi, S. Vidhya, P. Ranjetha

Abstract:

Data mining is growing rapidly in recognition and popularity. The foremost aim of data mining methods is to extract information from a huge data set into forms that can be understood for further use. Data mining is a technology with rich potential that can support industries and businesses seeking to collect the information needed to understand their customers' behaviour. Several methods are available for extracting data, such as classification, clustering, association, discovery and visualization, each with its own diverse algorithms that attempt to fit an appropriate model to the data. STATISTICA mostly deals with very large groups of data that impose rigorous computational constraints; these challenges have driven the emergence of powerful STATISTICA Data Mining technologies. In this survey, an overview of the STATISTICA software is given along with its significant features.

Keywords: Data Mining, STATISTICA Data Miner, Text Miner, Enterprise Server, Classification, Association, Clustering, Regression.

880 Comparative Analysis of Diverse Collection of Big Data Analytics Tools

Authors: S. Vidhya, S. Sarumathi, N. Shanthi

Abstract:

Over the past era, many efforts and studies have been carried out to develop proficient tools for performing various tasks in big data. Recently, big data has received a lot of publicity, for good reason. Because the data sets involved are large and complex, they are difficult to process with traditional data processing applications. This concern makes the development of various big data tools all the more necessary. The main aim of big data analytics is to apply advanced analytic techniques to very large, heterogeneous data sets whose sizes range from terabytes to zettabytes and whose types are diverse, such as structured or unstructured and batch or streaming. Big data approaches are useful for data sets whose size or type is beyond the capability of traditional relational databases to capture, manage and process with low latency. The resulting challenges have led to the emergence of powerful big data tools. In this survey, a varied collection of big data tools is illustrated and compared in terms of their salient features.

Keywords: Big data, Big data analytics, Business analytics, Data analysis, Data visualization, Data discovery.

879 Reconfiguration of Deregulated Distribution Network for Minimizing Energy Supply Cost by using Multi-Objective BGA

Authors: H. Kazemi Karegar, S. Jalilzadeh, V. Nabaei, A. Shabani

Abstract:

In this paper, the problem of finding the optimal topological configuration of a deregulated distribution network is considered. The new features of this paper are the proposal of a multi-objective function and its application to deregulated distribution networks for finding the optimal configuration. The multi-objective function is defined to minimize total Energy Supply Costs (ESC) and energy losses subject to load flow constraints. The optimal configuration is obtained using a Binary Genetic Algorithm (BGA). The proposed method has been tested on a sample and a practical distribution network.
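A generic binary GA skeleton of the kind described above, with a weighted two-objective fitness; the cost and loss terms are placeholders standing in for the ESC and loss evaluation that would follow a load flow on the encoded configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(chrom, w_cost=0.7, w_loss=0.3):
    """Weighted multi-objective fitness (smaller is better); the two terms below are
    stand-ins for the energy-supply-cost and loss calculations of the paper."""
    cost = chrom.sum()                        # placeholder ESC term
    loss = np.abs(np.diff(chrom)).sum()       # placeholder loss term
    return w_cost * cost + w_loss * loss

def binary_ga(n_bits=16, pop_size=30, gens=50, pc=0.8, pm=0.02):
    pop = rng.integers(0, 2, (pop_size, n_bits))
    for _ in range(gens):
        f = np.array([fitness(c) for c in pop])
        # tournament selection of parents
        parents = pop[[min(rng.integers(0, pop_size, 2), key=lambda i: f[i])
                       for _ in range(pop_size)]]
        # single-point crossover
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            if rng.random() < pc:
                cut = rng.integers(1, n_bits)
                children[i, cut:], children[i + 1, cut:] = (
                    parents[i + 1, cut:].copy(), parents[i, cut:].copy())
        # bit-flip mutation
        flip = rng.random(children.shape) < pm
        pop = np.where(flip, 1 - children, children)
    f = np.array([fitness(c) for c in pop])
    return pop[int(np.argmin(f))]

print(binary_ga())
```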

Keywords: Binary Genetic Algorithm, Deregulated Distribution Network, Minimizing Cost, Reconfiguration.

878 Teachers’ Perceptions of the Negative Impact of Tobephobia on Their Emotions and Job Satisfaction

Authors: Prakash Singh

Abstract:

The aim of this study was to investigate the extent of teachers' experiences of tobephobia (TBP) in their heterogeneous classrooms and the impact this had on their emotions and job satisfaction. The expansive and continuously changing demands for quality and equal education for all students in educational organisations with limited resources mean that the negative effects of TBP cannot simply be dismissed as non-existent in the educational environment. As this quantitative study reveals, teachers who dislike their jobs, hold low expectations, lack motivation in their workplace and are pessimistic end up with low self-esteem. When there is pessimism in the workplace, the employees' self-esteem will inevitably be low, as pointed out by 97.1% of the respondents in this study. Self-esteem is a reliable indicator of whether employees are happy in their jobs, and the majority of the respondents in this study agreed that their experiences of TBP negatively impacted their self-esteem. Hence, this exploratory study strongly indicates that productivity in the workplace is directly linked to employees' expectations, self-confidence and self-esteem. It is therefore inconceivable for teachers to be productive in their regular classrooms if their genuine professional concerns, anxieties, and curriculum challenges are not adequately addressed. This empirical study contributes to our knowledge of TBP because it clearly outlines some of the teaching problems that we grapple with and constantly experience in our schools in this century. It is therefore imperative that the tobephobic experiences of teachers are not merely documented, but appropriately addressed through relevant action by every stakeholder associated with education so that our teachers' emotional and job satisfaction needs are fully taken care of.

Keywords: Demotivated teachers’ pessimism, low expectations of teachers’ job satisfaction, Self-esteem, Tobephobia.

877 A Design of Electronically Tunable Voltage-Mode Universal Filter with High Input Impedance

Authors: Surapong Siripongdee, Witthaya Mekhum

Abstract:

This article presents a voltage-mode universal biquadratic filter simultaneously performing three standard functions: low-pass, high-pass and band-pass, employing a differential difference current conveyor (DDCC) and current-controlled current conveyors (CCCIIs) as the active elements. The features of the circuit are that the quality factor and pole frequency can be tuned independently via the input bias currents, and that the circuit description is very simple, consisting of 1 DDCC, 2 CCCIIs, 2 electronic resistors and 2 grounded capacitors. Without requiring component-matching conditions, the proposed circuit is very appropriate for further development into an integrated circuit. PSPICE simulation results are depicted and agree well with the theoretical anticipation.

Keywords: Filter, DDCC, CCCII, analog circuit, voltage-mode, PSPICE.

876 Heritage Tree Expert Assessment and Classification: Malaysian Perspective

Authors: B.-Y.-S. Lau, Y.-C.-T. Jonathan, M.-S. Alias

Abstract:

Heritage trees are large individual trees of exceptional value due to their association with age, events or distinguished people. In Malaysia, there is an abundance of tropical heritage trees throughout the country. It is essential to set up a repository of heritage trees to prevent valuable trees from being cut down. In this cross-domain study, a web-based online expert system, namely the Heritage Tree Expert Assessment and Classification (HTEAC) system, is developed and deployed for the public to nominate potential heritage trees. Based on the nominations, tree care experts or arborists evaluate and verify the nominated trees as heritage trees. The expert system automatically rates the approved heritage trees according to pre-defined grades via the Delphi technique. Features and a usability test of the expert system are presented. Preliminary results are promising for the system to be used as a full-scale public system.

Keywords: Arboriculture, Delphi, expert system, heritage tree, urban forestry.

875 A Comparison of Experimental Data with Monte Carlo Calculations for Optimisation of the Source-to-Detector Distance in Determining the Efficiency of a LaBr3:Ce (5%) Detector

Authors: H. Aldousari, T. Buchacher, N. M. Spyrou

Abstract:

Cerium-doped lanthanum bromide LaBr3:Ce (5%) crystals are considered one of the most advanced scintillator materials used in PET scanning, combining a high light yield, fast decay time and excellent energy resolution. Apart from the correct choice of scintillator, it is also important to optimise the detector geometry, not least in terms of the source-to-detector distance, in order to obtain reliable measurements of efficiency. In this study, a commercially available 25 mm x 25 mm BrilLanCe™ 380 LaBr3:Ce (5%) detector was characterised in terms of its efficiency at varying source-to-detector distances. Gamma-ray spectra of 22Na, 60Co, and 137Cs were acquired separately at distances of 5, 10, 15, and 20 cm. As a result of the change in solid angle subtended by the detector, the geometric efficiency decreased with increasing distance. High efficiencies at short distances can cause pulse pile-up, when subsequent photons are detected before previously detected events have decayed. To reduce this systematic error, the source-to-detector distance should balance efficiency against pulse pile-up suppression, as otherwise pile-up corrections would be necessary at short distances. In addition to the experimental measurements, Monte Carlo simulations were carried out for the same setup, allowing a comparison of results. The advantages and disadvantages of each approach are highlighted.
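The geometric-efficiency trend mentioned above can be sketched with the standard on-axis point-source/disk-detector solid angle; the 25 mm crystal diameter is taken from the abstract, while treating the source as on-axis and the detector face as a bare disk are simplifying assumptions.

```python
import numpy as np

def geometric_efficiency(distance_cm, radius_cm):
    """Fraction of 4*pi subtended by a circular detector face of radius r as seen
    from an on-axis point source at distance d:
    Omega = 2*pi*(1 - d/sqrt(d^2 + r^2)),  efficiency = Omega / (4*pi)."""
    d, r = float(distance_cm), float(radius_cm)
    return 0.5 * (1.0 - d / np.sqrt(d ** 2 + r ** 2))

for d in (5, 10, 15, 20):                                          # distances used in the study
    print(d, "cm ->", round(geometric_efficiency(d, 1.25), 4))     # 25 mm diameter crystal
```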

Keywords: BrilLanCeTM380 LaBr3:Ce(5%), Coincidence summing, GATE simulation, Geometric efficiency

874 Pre-Service Teachers’ Assessment of Information Technology Application to Instruction

Authors: Adesanya Anuoluwapo Olusola

Abstract:

Technology has moved into the classroom, and it is becoming difficult to talk of achievement in, and attitude to, learning without mentioning it. The use of technology makes learning easy, real and practical, as it motivates learners, sustains their interest and improves their attitude to learning. This study therefore examined pre-service teachers' assessment of the application of information technology to instruction. The use of technology emphasizes and encourages active learning in the classroom. The study involved 100 pre-service teachers in two selected Colleges of Education in Nigeria. Purposive random sampling was used in selecting the participants, and an ex-post facto design, in which there is no manipulation of variables, was adopted. Two valid and reliable instruments were used for data collection: Access Point ICT Facilities and Application of ICT. The study established that pre-service teachers have little access to ICT facilities and application of ICT in the colleges, apart from those students who have access outside the college. Also, fewer pre-service teachers used ICT facilities on a weekly or monthly basis. It was concluded that student resource centres and campus-wide wireless connectivity should be established to improve and enhance students' achievement in and attitude to learning. The time and attention devoted to learning activities, strategic specialized ICT skills and requisite entrepreneurial skills should be increased so that students have easy access to information sources and are able to apply them in the teaching process.

Keywords: Computer, ICT Application, Learning Facilities, Pre-Service Teachers.

873 Stochastic Subspace Modelling of Turbulence

Authors: M. T. Sichani, B. J. Pedersen, S. R. K. Nielsen

Abstract:

Turbulence of the incoming wind field is of paramount importance to the dynamic response of civil engineering structures. Hence reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper an empirical cross spectral density function for the along-wind turbulence component over the wind field area is taken as the starting point. The spectrum is spatially discretized in terms of a Hermitian cross-spectral density matrix for the turbulence state vector which turns out not to be positive definite. Since the succeeding state space and ARMA modelling of the turbulence rely on the positive definiteness of the cross-spectral density matrix, the problem with the non-positive definiteness of such matrices is at first addressed and suitable treatments regarding it are proposed. From the adjusted positive definite cross-spectral density matrix a frequency response matrix is constructed which determines the turbulence vector as a linear filtration of Gaussian white noise. Finally, an accurate state space modelling method is proposed which allows selection of an appropriate model order, and estimation of a state space model for the vector turbulence process incorporating its phase spectrum in one stage, and its results are compared with a conventional ARMA modelling method.
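One simple, generic way to adjust a numerically non-positive-definite Hermitian cross-spectral density matrix is eigenvalue clipping, sketched below; the paper proposes its own treatments, and the example matrix here is made up.

```python
import numpy as np

def make_positive_definite(S, eps=1e-10):
    """Adjust a Hermitian cross-spectral density matrix that is numerically
    non-positive definite by clipping its eigenvalues at a small positive floor
    (one simple treatment, not necessarily the paper's)."""
    S = 0.5 * (S + S.conj().T)                  # enforce Hermitian symmetry
    w, V = np.linalg.eigh(S)
    w = np.clip(w, eps, None)
    return (V * w) @ V.conj().T                 # V diag(w) V^H

# Hypothetical 3x3 cross-spectral matrix with a negative eigenvalue.
S = np.array([[1.0, 0.9, 0.8],
              [0.9, 1.0, 0.95],
              [0.8, 0.95, 0.7]], dtype=complex)
S_pd = make_positive_definite(S)
print(np.linalg.eigvalsh(S_pd))                 # all eigenvalues now > 0
```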

Keywords: Turbulence, wind turbine, complex coherence, state space modelling, ARMA modelling.

872 Low Value Capacitance Measurement System with Adjustable Lead Capacitance Compensation

Authors: Gautam Sarkar, Anjan Rakshit, Amitava Chatterjee, Kesab Bhattacharya

Abstract:

The present paper describes the development of a low-cost, highly accurate low-capacitance measurement system that can be used over a range of 0–400 pF with a resolution of 1 pF. The range may be easily altered by a simple resistance or capacitance variation of the measurement circuit. The system uses the CD4093B quad two-input NAND Schmitt trigger for the measurement and is integrated with a PIC18F2550 microcontroller for data acquisition. The microcontroller interacts with software developed on the PC side through USB, and an attractive graphical user interface (GUI) provides the user with a real-time, online display of the capacitance under measurement. The system uses a differential mode of capacitance measurement, with reference to a trimmer capacitance, that effectively compensates lead capacitances, a notorious source of error in usual low-capacitance measurements. The hysteresis provided in the Schmitt-trigger circuits enables reliable operation of the system by greatly minimizing the possibility of false triggering due to stray interference, usually regarded as another source of significant error. Real-life testing of the proposed system showed that it produces highly accurate capacitance measurements when compared to cutting-edge, high-end digital capacitance meters.
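A rough sketch of the differential idea, assuming the Schmitt-trigger relaxation oscillator has a period proportional to RC (T ≈ kRC): measure the period with the unknown capacitor and with the reference trimmer, then subtract so that lead capacitance common to both branches cancels. The constant k, resistor value and period readings below are hypothetical.

```python
def capacitance_from_periods(t_meas_us, t_ref_us, r_ohms, k=1.1):
    """Differential estimate: subtract the reference (trimmer) branch so that lead/stray
    capacitance common to both measurements cancels.  T ~ k*R*C is assumed for a
    Schmitt-trigger RC relaxation oscillator; k would be calibrated in practice."""
    delta_t = (t_meas_us - t_ref_us) * 1e-6       # seconds
    return delta_t / (k * r_ohms)                 # farads

# Hypothetical readings: 1 MOhm timing resistor, periods measured by the microcontroller.
c_pf = capacitance_from_periods(t_meas_us=385.0, t_ref_us=165.0, r_ohms=1e6) * 1e12
print(round(c_pf, 1), "pF")
```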

Keywords: Capacitance measurement, NAND Schmitt trigger, microcontroller, GUI, lead compensation, hysteresis.

871 Importance of Hardware Systems and Circuits in Secure Software Development Life Cycle

Authors: Mir Shahriar Emami

Abstract:

Although it is impossible to ensure that a software system is completely secure, developing an acceptably secure software system on a convenient platform is not unattainable. In this paper, we analyze software development life cycle (SDLC) models from the point of view of hardware systems and circuits. To date, SDLC models pay attention to software security merely from the software perspective. In this paper, we present new features for the SDLC stages that emphasize the role of hardware systems and circuits in developing a secure software system throughout the software development stages, a point that has not been considered previously in SDLC models.

Keywords: Systems and circuits security, software security, software process engineering, SDLC, SSDLC.

870 Automatic Authentication of Handwritten Documents via Low Density Pixel Measurements

Authors: Abhijit Mitra, Pranab Kumar Banerjee, C. Ardil

Abstract:

We introduce an effective approach for automatic offline authentication of handwritten samples where the forgeries are skillfully done, i.e., the true and forged samples appear almost alike. Subtle details of the temporal information used in online verification are not available offline and are also hard to recover robustly. Thus, spatial dynamic information such as the pen-tip pressure characteristics is considered, with emphasis on the extraction of low-density pixels. These points result from the ballistic rhythm of a genuine signature, which a forgery, however skillful it may be, always lacks. Ten effective features, including these low-density points and the density ratio, are proposed to distinguish between a true and a forged sample. An adaptive decision criterion is also derived for better verification judgements.
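An illustrative sketch of extracting a low-density-pixel count and a density ratio from a binarised handwriting image; the neighbourhood size and sparsity threshold are assumptions, and the paper's full set of ten features is not reproduced.

```python
import numpy as np

def low_density_features(img, threshold=0.5, window=3):
    """Count 'low density' ink pixels: ink pixels whose local neighbourhood contains
    few other ink pixels (a proxy for faint, high-speed ballistic strokes), plus the
    density ratio of low-density to total ink pixels.  Illustrative sketch only."""
    ink = (np.asarray(img, float) > threshold).astype(int)
    pad = window // 2
    padded = np.pad(ink, pad)
    low = 0
    for i, j in zip(*np.nonzero(ink)):
        neigh = padded[i:i + window, j:j + window].sum() - 1   # exclude the pixel itself
        if neigh <= 1:                                         # sparse neighbourhood
            low += 1
    total = int(ink.sum())
    return low, (low / total if total else 0.0)

# Tiny synthetic "stroke" image: 1 = ink.
img = np.zeros((8, 8))
img[2, 1:7] = 1          # a solid stroke
img[5, 3] = 1            # an isolated faint dot
print(low_density_features(img))
```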

Keywords: Handwritten document verification, Skilled forgeries, Low density pixels, Adaptive decision boundary.

869 Critical Analysis of the Hong Kong International Convention on Ship Recycling

Authors: K. P. Jain, J. F. J. Pruyn, J. J. Hopman

Abstract:

In May 2009, the International Maritime Organization (IMO) adopted the Hong Kong International Convention for the Safe and Environmentally Sound Recycling of Ships to address the growing concerns about the environmental, occupational health and safety risks related to ship recycling. The aim of the Hong Kong Convention is to provide a legally binding instrument which ensures that the process of ship recycling does not pose risks to human health and safety or to the environment. In this paper, a critical analysis of the Hong Kong Convention has been carried out in order to study the effectiveness of the Convention in meeting its objectives. The Convention has been studied in detail, including its background, main features, major stakeholders, strengths and weaknesses. The Convention, though having several deficiencies, is a major breakthrough in not only recognizing but also dealing with the harmful practices associated with ship recycling.

Keywords: Hong Kong Convention, IMO, Ship breaking, Ship recycling.

868 Estimating 3D-Position of a Stationary Random Acoustic Source Using Bispectral Analysis of 4-Point Detected Signals

Authors: Katsumi Hirata

Abstract:

To develop a useful acoustic environment recognition system, a method for estimating the 3D position of a stationary random acoustic source using bispectral analysis of signals detected at four points is proposed. The method uses information about amplitude attenuation and propagation delay extracted from the amplitude ratios and angles of the auto- and cross-bispectra of the detected signals. Bispectral analysis is expected to be less affected by Gaussian noise than conventional power spectral analysis. In this paper, the basic principle of the method is described first, and its validity and features are then considered from the results of fundamental experiments under assumed ideal circumstances.
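A minimal sketch of a direct FFT-based cross-bispectrum estimate of the form B(f1, f2) = E[X(f1)Y(f2)Z*(f1+f2)], whose amplitude ratios and phases carry the attenuation and delay information mentioned above; the segment length, channels and delay are made up for illustration.

```python
import numpy as np

def cross_bispectrum(x, y, z, nfft=128):
    """Direct (FFT-based) estimate of a cross-bispectrum
    B(f1, f2) = E[ X(f1) * Y(f2) * conj(Z(f1 + f2)) ],
    averaged over non-overlapping segments.  Auto-bispectrum: pass x = y = z."""
    segs = len(x) // nfft
    B = np.zeros((nfft, nfft), complex)
    f1 = np.arange(nfft)[:, None]
    f2 = np.arange(nfft)[None, :]
    for s in range(segs):
        sl = slice(s * nfft, (s + 1) * nfft)
        X, Y, Z = np.fft.fft(x[sl]), np.fft.fft(y[sl]), np.fft.fft(z[sl])
        B += X[:, None] * Y[None, :] * np.conj(Z[(f1 + f2) % nfft])
    return B / segs

# Example: two detector channels carrying a delayed copy of the same random source.
rng = np.random.default_rng(0)
src = rng.standard_normal(4096)
ch1, ch2 = src, np.roll(src, 5)                   # 5-sample propagation delay
B = cross_bispectrum(ch1, ch1, ch2)
print(np.angle(B[3, 4]))                          # phase carries the delay information
```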

Keywords: 4-point detection, a stationary random acoustic source, auto- and cross-bispectra, estimation of 3D-position.

867 Facial Recognition on the Basis of Facial Fragments

Authors: Tetyana Baydyk, Ernst Kussul, Sandra Bonilla Meza

Abstract:

There are many articles that attempt to establish the role of different facial fragments in face recognition. Various approaches are used to estimate this role. Frequently, authors calculate the entropy corresponding to the fragment. This approach can only give approximate estimation. In this paper, we propose to use a more direct measure of the importance of different fragments for face recognition. We propose to select a recognition method and a face database and experimentally investigate the recognition rate using different fragments of faces. We present two such experiments in the paper. We selected the PCNC neural classifier as a method for face recognition and parts of the LFW (Labeled Faces in the Wild) face database as training and testing sets. The recognition rate of the best experiment is comparable with the recognition rate obtained using the whole face.

Keywords: Face recognition, Labeled Faces in the Wild (LFW) database, Random Local Descriptor (RLD), random features.

866 Markov Game Controller Design Algorithms

Authors: Rajneesh Sharma, M. Gopal

Abstract:

Markov games are a generalization of the Markov decision process to a multi-agent setting. The two-player zero-sum Markov game framework offers an effective platform for designing robust controllers. This paper presents two novel controller design algorithms that use ideas from the game-theory literature to produce reliable controllers that are able to maintain performance in the presence of noise and parameter variations. A more widely used approach for controller design is H∞ optimal control, which suffers from high computational demand and, at times, may be infeasible. Our approach generates an optimal control policy for the agent (controller) via a simple linear program, enabling the controller to learn about the unknown environment. The controller faces an unknown environment, which in our formulation corresponds to the behaviour rules of the noise modeled as the opponent. The proposed controller architectures attempt to improve controller reliability by a gradual mixing of algorithmic approaches drawn from the game-theory literature and the Minimax-Q Markov game solution approach, in a reinforcement-learning framework. We test the proposed algorithms on a simulated inverted pendulum swing-up task and compare their performance against standard Q-learning.
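The per-state policy step of Minimax-Q, which solves a matrix game by a simple linear program as described above, can be sketched with scipy; the Q matrix below is the matching-pennies example, not a controller's learned values.

```python
import numpy as np
from scipy.optimize import linprog

def minimax_policy(Q):
    """Solve max_pi min_o  sum_a pi[a] * Q[a, o]  as a linear program
    (the per-state policy step used in Minimax-Q).  Q: payoff matrix for the agent,
    rows = agent actions, cols = opponent actions."""
    n_a, n_o = Q.shape
    # variables: pi[0..n_a-1], v ;  objective: maximize v  ->  minimize -v
    c = np.zeros(n_a + 1); c[-1] = -1.0
    # for every opponent action o:  v - sum_a pi[a] * Q[a, o] <= 0
    A_ub = np.hstack([-Q.T, np.ones((n_o, 1))])
    b_ub = np.zeros(n_o)
    A_eq = np.array([[1.0] * n_a + [0.0]])     # probabilities sum to 1
    b_eq = [1.0]
    bounds = [(0, 1)] * n_a + [(None, None)]
    res = linprog(c, A_ub, b_ub, A_eq, b_eq, bounds=bounds, method="highs")
    return res.x[:n_a], res.x[-1]              # mixed policy and game value

# Matching pennies: the minimax policy is (0.5, 0.5) with value 0.
Q = np.array([[1.0, -1.0],
              [-1.0, 1.0]])
print(minimax_policy(Q))
```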

Keywords: Reinforcement learning, Markov Decision Process, Matrix Games, Markov Games, Smooth Fictitious play, Controller, Inverted Pendulum.

865 A Signal Driven Adaptive Resolution Short-Time Fourier Transform

Authors: Saeed Mian Qaisar, Laurent Fesquet, Marc Renaudin

Abstract:

The frequency content of non-stationary signals varies with time, so a smart time-frequency representation is necessary for proper characterization of such signals. Classically, the short-time Fourier transform (STFT) is employed for this purpose. Its limitation is its fixed time-frequency resolution. To overcome this drawback, an enhanced STFT version is devised. It is based on a signal-driven sampling scheme, termed cross-level sampling, which can adapt the sampling frequency and the window function (length plus shape) by following the local variations of the input signal. This adaptation gives the proposed technique its appealing features, namely adaptive time-frequency resolution and computational efficiency.
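A minimal sketch of signal-driven (level-crossing) sampling of the kind the abstract builds on: samples are taken only where the signal crosses fixed amplitude levels, so the active part of a non-stationary signal is sampled more densely. The levels and test signal are illustrative.

```python
import numpy as np

def level_crossing_sample(t, x, levels):
    """Signal-driven (level-crossing) sampling: a sample is taken whenever the signal
    crosses one of a set of fixed amplitude levels, so active regions of a
    non-stationary signal produce more samples than quiet regions."""
    samples = []
    for i in range(1, len(x)):
        for L in levels:
            if (x[i - 1] - L) * (x[i] - L) < 0:          # sign change => crossing
                # linear interpolation of the crossing instant
                tc = t[i - 1] + (L - x[i - 1]) * (t[i] - t[i - 1]) / (x[i] - x[i - 1])
                samples.append((tc, L))
    return samples

t = np.linspace(0, 1, 2000)
x = np.where(t < 0.5, 0.1 * np.sin(2 * np.pi * 3 * t),    # quiet half
             np.sin(2 * np.pi * 40 * t))                  # active half
s = level_crossing_sample(t, x, levels=np.linspace(-0.9, 0.9, 7))
print(len(s), "samples, most of them after t = 0.5")
```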

Keywords: Level Crossing Sampling, Activity Selection, Adaptive Resolution Analysis, Computational Complexity.
