Search results for: learning using labeled and unlabelled data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8800

6790 Artificial Neural Networks Modeling in Water Resources Engineering: Infrastructure and Applications

Authors: M. R. Mustafa, M. H. Isa, R. B. Rezaur

Abstract:

The use of artificial neural network (ANN) modeling for predicting and forecasting variables in water resources engineering is increasing rapidly. Infrastructural applications of ANN in terms of selection of inputs, network architecture, training algorithms, and selection of training parameters in different types of neural networks used in water resources engineering have been reported. ANN modeling conducted for water resources engineering variables (river sediment and discharge) published in high-impact journals from 2002 to 2011 has been examined and presented in this review. ANN is a powerful technique for developing relationships between input and output variables and is able to capture complex behavior among water resources variables such as river sediment and discharge. It can produce robust prediction results for many water resources engineering problems by learning appropriately from a set of examples. It is important to have a good understanding of the input and output variables from a statistical analysis of the data before network modeling, which can facilitate the design of an efficient network. An appropriately trained ANN model is able to capture the physical relationships between the variables and may generate more accurate results than conventional prediction techniques.
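As an illustration of the kind of workflow the review describes, the following is a minimal sketch of training a one-hidden-layer feedforward network to predict river discharge from candidate inputs. The input names, synthetic data, and network settings are placeholders, not taken from any of the reviewed studies.

```python
# Minimal sketch: a one-hidden-layer ANN for a water-resources regression task.
# Synthetic data and all settings are illustrative placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
rainfall = rng.gamma(2.0, 10.0, 500)          # hypothetical daily rainfall (mm)
upstream_flow = rng.gamma(3.0, 5.0, 500)      # hypothetical upstream discharge (m3/s)
discharge = 0.6 * rainfall + 1.2 * upstream_flow + rng.normal(0, 2, 500)

X = np.column_stack([rainfall, upstream_flow])
X_tr, X_te, y_tr, y_te = train_test_split(X, discharge, random_state=0)

scaler = StandardScaler().fit(X_tr)           # inspect and standardize inputs before modeling
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(scaler.transform(X_tr), y_tr)
print("test R^2:", model.score(scaler.transform(X_te), y_te))
```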

Keywords: ANN, discharge, modeling, prediction, sediment.

6789 Data Collection with Bounded-Sized Messages in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In this paper, we study the data collection problem in Wireless Sensor Networks (WSNs) under two interference models: the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR). The main issue of the problem is to compute schedules with the minimum number of timeslots, that is, minimum-latency schedules, such that data from every node can be collected at a sink node without any collision or interference. While existing works studied the problem with unit-sized and unbounded-sized message models, we investigate the problem with the bounded-sized message model and introduce a constant-factor approximation algorithm. To the best of our knowledge, this is the first result for the data collection problem with the bounded-sized message model under both interference models.
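For reference, the physical interference model mentioned in the abstract is commonly stated as follows (a standard textbook formulation; the exact notation and parameter choices of the paper may differ): a transmission from node $u$ to node $v$ in a timeslot is successful if

$$\mathrm{SINR}(u,v) = \frac{P \, d(u,v)^{-\alpha}}{N_0 + \sum_{w \in I} P \, d(w,v)^{-\alpha}} \;\geq\; \beta,$$

where $P$ is the transmission power, $d(\cdot,\cdot)$ the Euclidean distance, $\alpha$ the path-loss exponent, $N_0$ the ambient noise, $\beta$ the SINR threshold, and $I$ the set of nodes transmitting concurrently in the same slot.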

Keywords: Data collection, collision-free, interference-free, physical interference model, SINR, approximation, bounded-sized message model, wireless sensor networks, WSN.

6788 The Importance of Changing the Traditional Mode of Higher Education in Bangladesh: Creating Huge Job Opportunities for Home and Abroad

Authors: M. M. Shahidul Hassan, Omiya Hassan

Abstract:

Bangladesh has set its goal to reach upper middle-income country status by 2024. To attain this status, the country must satisfy the World Bank requirement of achieving a minimum Gross National Income (GNI). The number of young job seekers in the country is increasing, and university graduates are looking for decent jobs. So, the vital issue for this country is to understand how GNI and jobs can be increased. The objective of this paper is to address these issues and find ways to create more job opportunities for youths at home and abroad, which will increase the country’s GNI. The paper studies the proportions of different goods Bangladesh exports, as well as the percentage of employment in different sectors. The data used here for the purpose of analysis have been collected from the available literature. These data are then plotted and analyzed. Through these studies, it is concluded that growth in sectors such as agriculture, ready-made garments (RMG), jute industries and fisheries is declining, and that the business community is not interested in setting up capital-intensive industries. Under this situation, the country needs to explore other business opportunities for a higher economic growth rate. Knowledge can substitute for physical resources. Since the country has a large youth population, higher education will play a key role in economic development. It now needs graduates with higher-order skills and innovative quality. Such dispositions demand changes in a university’s curriculum, teaching and assessment methods that will enable young generations to act as active learners and creators. By bringing these changes to higher education, a knowledge-based society can be created. The application of such knowledge and creativity will then become a commodity of Bangladesh, which will help it reach its goal of becoming an upper middle-income country.

Keywords: Bangladesh, economic sectors, economic growth, higher education, knowledge-based economy, massification of higher education, teaching and learning, universities’ role in society.

6787 New Data Reuse Adaptive Filters with Noise Constraint

Authors: Young-Seok Choi

Abstract:

We present a new framework for data-reusing (DR) adaptive algorithms by incorporating a constraint on noise, referred to as a noise constraint. The motivation behind this work is that the use of statistical knowledge of the channel noise can contribute toward improving the convergence performance of an adaptive filter in identifying a noisy linear finite impulse response (FIR) channel. By incorporating the noise constraint into the cost function of the DR adaptive algorithms, the noise-constrained DR (NC-DR) adaptive algorithms are derived. Experimental results clearly indicate their superior performance over the conventional DR algorithms.
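For orientation, the following is a minimal sketch of a plain data-reusing LMS update for FIR channel identification, i.e. the baseline family on which the abstract builds; the noise-constrained modification of the cost function derived in the paper is not reproduced here, and all signal and parameter choices are illustrative.

```python
# Minimal data-reusing LMS (DR-LMS) sketch for identifying a noisy FIR channel.
# The noise-constrained (NC-DR) variant from the paper is NOT implemented here.
import numpy as np

rng = np.random.default_rng(0)
h_true = np.array([0.5, -0.3, 0.8, 0.1])      # unknown FIR channel (illustrative)
n_taps, n_samples, mu, reuse = len(h_true), 2000, 0.01, 3

x = rng.standard_normal(n_samples)
d = np.convolve(x, h_true, mode="full")[:n_samples] + 0.05 * rng.standard_normal(n_samples)

w = np.zeros(n_taps)
for n in range(n_taps, n_samples):
    u = x[n - n_taps + 1:n + 1][::-1]         # current input regressor
    for _ in range(reuse):                    # reuse the same (u, d[n]) pair several times
        e = d[n] - w @ u
        w = w + mu * e * u

print("estimated taps:", np.round(w, 3))
```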

Keywords: Adaptive filter, data-reusing, least-mean square (LMS), affine projection (AP), noise constraint.

6786 Mining Genes Relations in Microarray Data Combined with Ontology in Colon Cancer Automated Diagnosis System

Authors: A. Gruzdz, A. Ihnatowicz, J. Siddiqi, B. Akhgar

Abstract:

The MATCH project [1] entails the development of an automatic diagnosis system that aims to support the treatment of colon cancer by discovering mutations that occur in tumour suppressor genes (TSGs) and contribute to the development of cancerous tumours. The constitution of the system is based on a) colon cancer clinical data and b) biological information derived by data mining techniques from genomic and proteomic sources. The core mining module will consist of popular, well-tested hybrid feature extraction methods and new combined algorithms designed especially for the project. Elements of rough sets, evolutionary computing, cluster analysis, self-organizing maps and association rules will be used to discover the associations between genes and their influence on tumours [2]-[11]. The methods used to process the data have to address their high complexity, potential inconsistency and the problem of dealing with missing values. They must integrate all the useful information necessary to solve the expert's question. For this purpose, the system has to learn from data, or allow a domain specialist to interactively specify, the part of the knowledge structure it needs to answer a given query. The program should also take into account the importance/rank of the particular parts of the data it analyses and adjust the algorithms used accordingly.

Keywords: Bioinformatics, gene expression, ontology, self-organizing maps.

6785 On Methodologies for Analysing Sickness Absence Data: An Insight into a New Method

Authors: Xiaoshu Lu, Päivi Leino-Arjas, Kustaa Piha, Akseli Aittomäki, Peppiina Saastamoinen, Ossi Rahkonen, Eero Lahelma

Abstract:

Sickness absence represents a major economic and social issue. Analysis of sick leave data is a recurrent challenge to analysts because of the complexity of the data structure, which is often time-dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient and misguided. Traditional approaches do not address these problems. In this study, we discuss model methodologies in terms of statistical techniques for addressing the difficulties with sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism using, as a working example, a large register dataset from the Helsinki Health Study of municipal employees in Finland during the period 1990-1999. We present a comparative study on model selection and a critical analysis of the temporal trends, the occurrence and the degree of long-term sickness absences among municipal employees. The strengths of this working example include the large sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to select an appropriate model, to introduce a new methodology for analysing sickness absence data, and to demonstrate the model's applicability to complicated longitudinal data.
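To make the zero-clumping problem concrete, the following is a minimal sketch of one common way to handle such data, a two-part (hurdle-style) model that combines a zero/non-zero component with a skewed distribution for the positive part. This is a generic illustration, not the mixture-distribution methodology proposed in the paper, and the data are synthetic.

```python
# Minimal two-part sketch for zero-clumped, skewed outcomes (illustrative only;
# not the mix-distribution model proposed in the paper).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1000
is_absent = rng.random(n) < 0.35                        # ~35% have any sick leave
days = np.where(is_absent, rng.gamma(1.5, 8.0, n), 0)   # skewed positive part, clumped at zero

p_positive = np.mean(days > 0)                                 # part 1: probability of any absence
shape, loc, scale = stats.gamma.fit(days[days > 0], floc=0)    # part 2: gamma fit to positive days

print(f"P(absence > 0) = {p_positive:.2f}")
print(f"gamma shape = {shape:.2f}, scale = {scale:.2f}")
print(f"expected days per person = {p_positive * shape * scale:.2f}")
```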

Keywords: Sickness absence, longitudinal data, methodologies, mix-distribution model.

6784 Phase Jitter Transfer in High Speed Data Links

Authors: Tsunwai Gary Yip

Abstract:

Phase-locked loops (PLLs) in 10 Gb/s and faster data links are low phase noise devices. Characterization of their phase jitter transfer functions is difficult because the intrinsic noise of the PLLs is comparable to the phase noise of the reference clock signal. The problem is solved by using a linear model to account for the intrinsic noise. This study also introduces a novel technique for measuring the transfer function. It involves the use of the reference clock as a source of wideband excitation, in contrast to the commonly used sinusoidal excitations at discrete frequencies. The data reported here include the intrinsic noise of a PLL for 10 Gb/s links and the jitter transfer function of a PLL for 12.8 Gb/s links. The measured transfer function suggests that the PLL responded like a second-order linear system to a low-noise reference clock.
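For context, the second-order linear behaviour mentioned at the end of the abstract is usually written in the standard closed-loop form below; this is the textbook model for a second-order PLL, not necessarily the exact fitted form used in the paper:

$$H(s) = \frac{\theta_{out}(s)}{\theta_{in}(s)} = \frac{2\zeta\omega_n s + \omega_n^2}{s^2 + 2\zeta\omega_n s + \omega_n^2},$$

where $\omega_n$ is the natural frequency and $\zeta$ the damping factor; the jitter transfer magnitude is $|H(j2\pi f)|$.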

Keywords: Intrinsic phase noise, jitter in data link, PLL jitter transfer function, high speed clocking in electronic circuit

6783 A Review in Recent Development of Network Threats and Security Measures

Authors: Roza Dastres, Mohsen Soori

Abstract:

Networks are vulnerable because their basic function is to facilitate remote access and data communication. The information in networks needs to be kept secure and safe in order to provide an effective communication and sharing medium in the web of data. Because of the challenges and threats to data in networks, network security is one of the most important considerations in information technology infrastructures. As a result, security measures are applied to networks in order to decrease the probability of hackers accessing secured data. The purpose of network security is to protect the network and its components from unauthorized access and abuse in order to provide a safe and secure communication medium for users. In the present research work, a review of recent developments in network threats and security measures is presented, and future research directions are also suggested. Different attacks on networks, and the security measures against them, are discussed in order to increase security in the web of data. New ideas for network security systems can thus be identified by analyzing the published papers, in order to move the research field forward.

Keywords: Network threats, network security, security measures, firewalls.

6782 Curriculum Development of Successful Intelligence Promoting for Nursing Students

Authors: Saranya Chularee, Tawa Chularee

Abstract:

Successful intelligence (SI) is the integrated set of abilities needed to attain success in life within an individual's sociocultural context. People are successfully intelligent by recognizing their strengths and weaknesses: they find ways to strengthen their weaknesses and to maintain or even improve their strengths. SI people can shape, select, and adapt to their environments by using a balance of higher-order thinking abilities, including critical, creative, and applicative thinking. Aims: The purposes of this study were to 1) develop a curriculum that promotes SI for nursing students, and 2) study the effectiveness of the developed curriculum. Method: A research and development design was used for this study. It was divided into two phases: 1) curriculum development, which consisted of three steps (needs assessment, curriculum development and curriculum field trial), and 2) curriculum implementation. In this phase, a pre-experimental research design (one-group pretest-posttest design) was conducted. The sample consisted of 49 sophomore nursing students of Boromarajonani College of Nursing, Surin, Thailand, who enrolled in the Nursing Care of Health Problems I course in the 2011 academic year. Data were carefully collected using four instruments: 1) a modified essay question test (MEQ), 2) a nursing care plan evaluation form, 3) a group processing observation form (α = 0.74) and 4) a satisfaction evaluation form for learning (α = 0.82). Data were analyzed using descriptive statistics and content analysis. Results: The results revealed that the sample had a post-test average SI score higher than the pre-test average score (mean difference 5.03, S.D. = 2.84). Fifty-seven percent of the sample passed the MEQ post-test at the criterion of 60 percent. Students demonstrated strategies for developing a nursing care plan. Overall, students' satisfaction with teaching performance was at a high level (mean = 4.35, S.D. = 0.46). Conclusion: This curriculum can promote the characteristics of a successfully intelligent person and should be continued.

Keywords: Curriculum Development, Nursing Education, Successful Intelligence, Thinking ability.

6781 A Review on Soft Computing Technique in Intrusion Detection System

Authors: Noor Suhana Sulaiman, Rohani Abu Bakar, Norrozila Sulaiman

Abstract:

Intrusion detection systems are significant in network security. An intrusion detection system (IDS) detects and identifies intrusion behaviour or intrusion attempts in a computer system by monitoring and analyzing network packets in real time. In recent years, intelligent algorithms applied in intrusion detection systems have received increasing attention with the rapid growth of network security. An IDS deals with a huge amount of data containing irrelevant and redundant features, which causes a slow training and testing process, higher resource consumption and a poor detection rate. Since the amount of audit data that an IDS needs to examine is very large even for a small network, classification by hand is impossible. Hence, the primary objective of this review is to survey the techniques applied prior to the classification process that suit IDS data.
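As a concrete example of a pre-classification step of the kind surveyed here, the sketch below applies a standard filter-based feature selection to audit-style data before training a classifier. The dataset, feature count and choice of mutual information are illustrative assumptions, not techniques attributed to any of the reviewed papers.

```python
# Minimal sketch: filter-based feature selection before IDS classification.
# Synthetic data stands in for audit records; settings are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# 40 features, of which only 8 are informative (mimicking redundant audit features)
X, y = make_classification(n_samples=2000, n_features=40, n_informative=8,
                           n_redundant=20, random_state=0)

selector = SelectKBest(mutual_info_classif, k=10).fit(X, y)   # keep the 10 best features
X_reduced = selector.transform(X)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("all features :", cross_val_score(clf, X, y, cv=3).mean())
print("10 selected  :", cross_val_score(clf, X_reduced, y, cv=3).mean())
```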

Keywords: Intrusion Detection System, security, soft computing, classification.

6780 Organizational Involvement and Employees’ Consumption of New Work Practices in State-owned Enterprises: The Ghanaian Case

Authors: M. Aminu Sanda, K. Ewontumah

Abstract:

This paper explores the challenges faced by the management of a Ghanaian state enterprise in managing the conflicts and disturbances associated with its attempt to implement new work practices to enhance its capability to operate as a commercial entity. The purpose was to understand the extent to which organizational involvement, consistency and adaptability influence employees’ consumption of new work practices in transforming the organization’s activity system. Using self-administered questionnaires, data were collected from one hundred and eighty (180) employees and analyzed using both descriptive and inferential statistics. The results showed that constraints in organizational involvement and adaptability prevented the positive consumption of new work practices by employees in the organization. It was also found that the organization’s employees failed to consume the new practices being implemented because they perceived the process as non-involving and, as such, as not encouraging the development of employee capability, empowerment, and teamwork. The study concluded that the failure of the organization’s management to create opportunities for organizational learning constrained its ability to get employees to consume the new work practices, which could otherwise have facilitated the organization’s capability to operate as a commercial entity.

Keywords: Organizational transformation, new work practices, work practice consumption, organizational involvement, state-owned enterprise, Ghana.

6779 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene

Abstract:

Data flows and the purposes of reporting data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point, containing all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrating methodological approach must be developed to respond to this problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study uses Grubbs' outlier analysis, particularly for data cleaning and for identifying the statistical significance of data reporting event cases. The Grubbs test is often used as it tests one extreme value at a time against the bounds of the standard normal distribution. In the study area, the test has not been widely applied by authors, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select those forms of genetic algorithm construction that have more possibilities of extracting the best solution. For freight delivery management, genetic algorithm schemes are used as a more effective technique; accordingly, an adaptive genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor. The authors suggest a methodology for multi-objective analysis which evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value for the management of multi-modal transportation processes.
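To illustrate the data-validation step, the sketch below implements the classical Grubbs test for a single outlier at a 99% confidence level, as the abstract describes; the synthetic fuel-consumption figures and variable names are placeholders.

```python
# Minimal Grubbs test sketch (single most extreme value) at alpha = 0.01.
# Data are invented placeholders standing in for reported fuel-consumption values.
import numpy as np
from scipy import stats

def grubbs_outlier(values, alpha=0.01):
    """Return (is_outlier, index) for the most extreme value in `values`."""
    x = np.asarray(values, dtype=float)
    n = x.size
    idx = int(np.argmax(np.abs(x - x.mean())))
    g = abs(x[idx] - x.mean()) / x.std(ddof=1)               # Grubbs statistic
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)              # t critical value
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return g > g_crit, idx

fuel = [31.2, 29.8, 30.5, 32.1, 30.9, 45.7, 31.5, 29.9]      # one suspicious report
flag, i = grubbs_outlier(fuel)
print(f"value {fuel[i]} flagged as outlier: {flag}")
```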

Keywords: Multi-objective decision support, analysis, data validation, freight delivery, multi-modal transportation, genetic programming methods.

6778 Modeling and Simulation of Acoustic Link Using Mackenize Propagation Speed Equation

Authors: Christhu Raj M. R., Rajeev Sukumaran

Abstract:

Underwater acoustic networks have attracted great attention in the last few years because of their numerous applications. A high data rate can be achieved by efficiently modeling the physical layer in the network protocol stack. In the acoustic medium, the propagation speed of acoustic waves depends on many parameters such as temperature, salinity, density, and depth. Acoustic propagation speed cannot be modeled using standard empirical formulas such as the Urick and Thorp descriptions. In this paper, we have modeled the acoustic channel using real-time temperature, salinity, and speed data from the Bay of Bengal (Indian coastal region). We have modeled the acoustic channel by using the Mackenzie speed equation and real-time data obtained from the National Institute of Oceanography and Technology. It is found that the acoustic propagation speed varies between 1503 m/s and 1544 m/s as temperature and depth vary. The simulation results show that temperature, salinity, and depth play a major role in acoustic propagation, and that the data rate increases when appropriate data sets are substituted into the simulated model.
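For reference, the Mackenzie (1981) nine-term equation referred to in the abstract computes sound speed from temperature T (°C), salinity S (ppt) and depth D (m); the sketch below evaluates it for an illustrative profile (the sample values are not the measured Bay of Bengal data).

```python
# Mackenzie (1981) nine-term equation for sound speed in sea water.
# T in degrees C, S in parts per thousand, D in metres; result in m/s.
def mackenzie_speed(T, S, D):
    return (1448.96 + 4.591 * T - 5.304e-2 * T**2 + 2.374e-4 * T**3
            + 1.340 * (S - 35) + 1.630e-2 * D + 1.675e-7 * D**2
            - 1.025e-2 * T * (S - 35) - 7.139e-13 * T * D**3)

# Illustrative profile (placeholder values, not the measured data set)
for depth in (10, 100, 500, 1000):
    c = mackenzie_speed(T=28.0, S=33.5, D=depth)
    print(f"depth {depth:5d} m -> sound speed {c:7.1f} m/s")
```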

Keywords: Underwater Acoustics, Mackenzie Speed Equation, Temperature, Salinity.

6777 Isobaric Vapor-Liquid Equilibrium Data for Binary Mixture of 2-Methyltetrahydrofuran and Cumene

Authors: V. K. Rattan, Baljinder K. Gill, Seema Kapoor

Abstract:

Isobaric vapor-liquid equilibrium measurements are reported for the binary mixture of 2-Methyltetrahydrofuran and Cumene at 97.3 kPa. The data were obtained using a vapor-recirculating (modified Othmer) equilibrium still. The mixture shows a slight negative deviation from ideality. The system does not form an azeotrope. The experimental data obtained in this study are thermodynamically consistent according to the Herington test. The activity coefficients have been satisfactorily correlated by means of the Margules and NRTL equations. Excess Gibbs free energy has been calculated from the experimental data. The values of the activity coefficients have also been obtained by the UNIFAC group contribution method.
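For reference, the two-parameter Margules correlation named in the abstract is commonly written as follows (standard form; the fitted parameter values of the paper are not reproduced here):

$$\frac{G^E}{RT} = x_1 x_2 \left( A_{21} x_1 + A_{12} x_2 \right), \qquad
\ln\gamma_1 = x_2^2 \left[ A_{12} + 2\left(A_{21}-A_{12}\right)x_1 \right], \qquad
\ln\gamma_2 = x_1^2 \left[ A_{21} + 2\left(A_{12}-A_{21}\right)x_2 \right].$$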

Keywords: Binary mixture, 2-Methyltetrahydrofuran, Cumene, Vapor-liquid equilibrium, UNIFAC, Excess Gibbs free energy.

6776 A Keyword-Based Filtering Technique of Document-Centric XML using NFA Representation

Authors: Changwoo Byun, Kyounghan Lee, Seog Park

Abstract:

XML is becoming a de facto standard for online data exchange. Existing XML filtering techniques based on a publish/subscribe model are focused on highly structured data marked up with XML tags. These techniques are efficient for filtering data-centric XML documents but are not effective for filtering the element contents of document-centric XML. In this paper, we propose an extended XPath specification which includes a special matching character '%', as used in the LIKE operation of SQL, in order to overcome the difficulty of writing queries that adequately filter element contents with the previous XPath specification. We also present a novel technique for filtering a collection of document-centric XML documents, called Pfilter, which is able to exploit the extended XPath specification. We present several performance studies of efficiency and scalability using the multi-query processing time (MQPT).
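To illustrate the kind of content matching the extension targets, the sketch below filters element text against an XPath-like predicate containing the '%' wildcard by translating it to a regular expression. This is a toy illustration of the wildcard semantics only, not the NFA-based Pfilter technique described in the paper.

```python
# Toy sketch: matching element contents against a '%' (SQL LIKE-style) pattern.
# This only illustrates the wildcard semantics, not the paper's NFA-based Pfilter.
import re
import xml.etree.ElementTree as ET

def like_to_regex(pattern):
    """Translate a LIKE-style pattern ('%' = any sequence) to an anchored regex."""
    return "^" + ".*".join(re.escape(part) for part in pattern.split("%")) + "$"

doc = ET.fromstring(
    "<articles>"
    "<article><title>Unlabelled data in semi-supervised learning</title></article>"
    "<article><title>Graph coloring heuristics</title></article>"
    "</articles>"
)

pattern = "%semi-supervised%"                      # content predicate with wildcards
regex = re.compile(like_to_regex(pattern))
for title in doc.iter("title"):
    if regex.match(title.text):
        print("matched:", title.text)
```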

Keywords: XML Data Stream, Document-centric XML, Filtering Technique, Value-based Predicates.

6775 The Use of Project to Enhance Writing Skill

Authors: Duangkamol Thitivesa, Abigail Melad Essien

Abstract:

This paper explores the use of project work in a content-based instruction course at a Rajabhat University, a teachers' college where student teachers are prepared to perform teaching roles mainly at the basic education level. Its aim is to link theory to practice, to help language teachers maximize the full potential of project work for genuine communication, and to give real meaning to the writing activity. Two research questions are formulated to guide this study: a) What is the academic achievement of the students' writing skill against the 70% attainment target after the use of project work to enhance the skill? and b) To what degree do the students' writing skills develop during the course of the project? The sample of the study comprised 38 fourth-year English major students. The data were collected by means of an achievement test, students' written work, and project diaries. The scores in the summative achievement test were analyzed by mean score, standard deviation, and t-test. The project diary serves as the students' record of the language acquired during the project. The list of structures and vocabulary noted in the diaries shows the students' ability to attend to, recognize, and focus on meaningful patterns of language forms.

Keywords: EFL classroom, Project-Based Learning, project work, writing skill.

6774 A Mixture Model of Two Different Distributions Approach to the Analysis of Heterogeneous Survival Data

Authors: Ülkü Erişoğlu, Murat Erişoğlu, Hamza Erol

Abstract:

In this paper, we propose mixtures of two different distributions, such as Exponential-Gamma, Exponential-Weibull and Gamma-Weibull, to model heterogeneous survival data. Various properties of the proposed mixtures of two different distributions are discussed. Maximum likelihood estimates of the parameters are obtained using the EM algorithm. An illustrative example based on real data is also given.
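As a rough illustration of the estimation scheme, the sketch below runs a basic EM loop for a two-component Exponential-Gamma mixture on synthetic data; the update details (in particular the numerical M-step for the gamma component) are generic textbook choices, not the derivations given in the paper.

```python
# Minimal EM sketch for an Exponential-Gamma mixture (illustrative, synthetic data).
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
x = np.concatenate([rng.exponential(2.0, 400), rng.gamma(5.0, 1.5, 600)])

pi, lam_scale, (a, b) = 0.5, np.mean(x), (1.0, np.mean(x))   # crude initial values
for _ in range(50):
    # E-step: responsibilities of the exponential component
    f1 = stats.expon.pdf(x, scale=lam_scale)
    f2 = stats.gamma.pdf(x, a, scale=b)
    r = pi * f1 / (pi * f1 + (1 - pi) * f2)
    # M-step: closed form for the weight and exponential scale, numerical for the gamma
    pi = r.mean()
    lam_scale = np.sum(r * x) / np.sum(r)
    nll = lambda p: -np.sum((1 - r) * stats.gamma.logpdf(x, p[0], scale=p[1]))
    a, b = optimize.minimize(nll, [a, b], bounds=[(1e-3, None), (1e-3, None)]).x

print(f"weight={pi:.2f}, expon scale={lam_scale:.2f}, gamma shape={a:.2f}, scale={b:.2f}")
```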

Keywords: Exponential-Gamma, Exponential-Weibull, Gamma-Weibull, EM Algorithm, Survival Analysis.

6773 Motion Recognition Based On Fuzzy WP Feature Extraction Approach

Authors: Keun-Chang Kwak

Abstract:

This paper is concerned with motion recognition based on a fuzzy WP (Wavelet Packet) feature extraction approach applied to Vicon physical data sets. For this purpose, we use an efficient fuzzy mutual-information-based WP transform for feature extraction. This method estimates the required mutual information using a novel approach based on fuzzy membership functions. The physical action data set includes 10 normal and 10 aggressive physical actions that measure human activity. The data have been collected from 10 subjects using the Vicon 3D tracker. The experiments consist of running, seating, and walking as physical activity motions among various activities. The experimental results revealed that the presented feature extraction approach achieved good recognition performance.
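For orientation, the sketch below extracts plain wavelet packet energy features from a 1-D motion signal using PyWavelets; the fuzzy mutual-information-based selection that is the paper's actual contribution is not reproduced, and the signal and settings are placeholders.

```python
# Minimal wavelet packet (WP) energy-feature sketch for a 1-D motion signal.
# The paper's fuzzy mutual-information-based selection is NOT implemented here.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 512)
signal = (np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
          + 0.1 * rng.standard_normal(t.size))

wp = pywt.WaveletPacket(data=signal, wavelet="db4", mode="symmetric", maxlevel=3)
nodes = wp.get_level(3, order="freq")                 # 2^3 = 8 frequency-ordered sub-bands
energies = np.array([np.sum(np.square(node.data)) for node in nodes])
features = energies / energies.sum()                  # normalized sub-band energies

for node, f in zip(nodes, features):
    print(f"band {node.path}: relative energy {f:.3f}")
```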

Keywords: Motion recognition, fuzzy wavelet packet, Vicon physical data.

6772 DCBOR: A Density Clustering Based on Outlier Removal

Authors: A. M. Fahim, G. Saake, A. M. Salem, F. A. Torkey, M. A. Ramadan

Abstract:

Data clustering is an important data exploration technique with many applications in data mining. We present an enhanced version of the well-known single-link clustering algorithm, which we refer to as DCBOR. The proposed algorithm alleviates the chain effect by removing the outliers from the given dataset, so it provides outlier detection and data clustering simultaneously. The algorithm does not need to update the distance matrix, since it depends on merging the k-nearest objects in one step, and a cluster continues to grow as long as possible under a specified condition. The algorithm thus consists of two phases: in the first phase, it removes the outliers from the input dataset; in the second phase, it performs the clustering process. The algorithm discovers clusters of different shapes, sizes and densities and requires only one input parameter, which represents a threshold for outlier points. The value of this input parameter ranges from 0 to 1, and the algorithm supports the user in determining an appropriate value for it. We have tested the algorithm on different datasets that contain outliers and clusters connected by chains of density points, and the algorithm discovers the correct clusters. The results of our experiments demonstrate the effectiveness and efficiency of DCBOR.
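To make the two-phase idea concrete, the sketch below first discards low-density points (using a simple k-nearest-neighbour density score) and then runs single-link clustering on the remaining points. The density score, threshold and library calls are generic stand-ins, not the exact DCBOR procedure.

```python
# Two-phase sketch: (1) remove low-density points, (2) single-link clustering.
# Generic stand-in for the DCBOR idea; not the paper's exact algorithm.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.neighbors import NearestNeighbors
from sklearn.cluster import AgglomerativeClustering

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.6, random_state=0)
X = np.vstack([X, [[8, 8], [-8, 8], [9, -9]]])        # a few artificial outliers

# Phase 1: density = inverse of mean distance to the k nearest neighbours
k = 5
dists, _ = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
density = 1.0 / dists[:, 1:].mean(axis=1)
keep = density > 0.2 * density.max()                  # threshold expressed as a fraction of the max density

# Phase 2: single-link clustering of the retained points
labels = AgglomerativeClustering(n_clusters=3, linkage="single").fit_predict(X[keep])
print(f"removed {np.sum(~keep)} outliers, cluster sizes: {np.bincount(labels)}")
```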

Keywords: Data Clustering, Clustering Algorithms, Handling Noise, Arbitrary Shape of Clusters.

6771 Alternative to M-Estimates in Multisensor Data Fusion

Authors: Nga-Viet Nguyen, Georgy Shevlyakov, Vladimir Shin

Abstract:

We address the problem of multisensor data fusion under non-Gaussian channel noise. The advanced M-estimates are known to be a robust solution, while trading off some accuracy. In order to improve the estimation accuracy while still maintaining equivalent robustness, a two-stage robust fusion algorithm is proposed, using preliminary rejection of outliers followed by an optimal linear fusion. The numerical experiments show that the proposed algorithm is equivalent to the M-estimates in the case of uncorrelated local estimates and significantly outperforms the M-estimates when the local estimates are correlated.
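The sketch below illustrates the flavour of such a two-stage scheme: local sensor estimates are first screened with a simple median/MAD rule, and the survivors are combined by inverse-variance weighting (the optimal linear fusion for uncorrelated estimates). The rejection rule and noise model are generic assumptions, not the authors' derivation.

```python
# Two-stage fusion sketch: outlier rejection, then inverse-variance linear fusion.
# Generic illustration; not the authors' exact algorithm.
import numpy as np

rng = np.random.default_rng(0)
truth = 10.0
variances = np.array([0.5, 0.8, 0.4, 1.0, 0.6])
estimates = truth + rng.normal(0, np.sqrt(variances))
estimates[3] += 15.0                                  # one sensor hit by impulsive noise

# Stage 1: reject estimates far from the median (MAD-based rule)
med = np.median(estimates)
mad = np.median(np.abs(estimates - med))
keep = np.abs(estimates - med) < 3.0 * 1.4826 * mad

# Stage 2: optimal linear (inverse-variance) fusion of the survivors
w = 1.0 / variances[keep]
fused = np.sum(w * estimates[keep]) / np.sum(w)
print(f"kept {keep.sum()} of {keep.size} sensors, fused estimate = {fused:.2f}")
```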

Keywords: Data fusion, estimation, robustness, M-estimates.

6770 Performance Analysis of Evolutionary ANN for Output Prediction of a Grid-Connected Photovoltaic System

Authors: S.I Sulaiman, T.K Abdul Rahman, I. Musirin, S. Shaari

Abstract:

This paper presents a performance analysis of the Evolutionary Programming-Artificial Neural Network (EPANN) based technique for optimizing the architecture and training parameters of a one-hidden-layer feedforward ANN model for predicting the energy output of a grid-connected photovoltaic system. The ANN uses solar radiation and ambient temperature as its inputs, while the output is the total watt-hour energy produced by the grid-connected PV system. EP is used to optimize the regression performance of the ANN model by determining the optimum number of nodes in the hidden layer as well as the optimal momentum rate and learning rate for training. The EPANN model is tested with two types of transfer function for the hidden layer, namely the tangent sigmoid and the logarithmic sigmoid. The best transfer function, neural topology and learning parameters were selected based on the highest regression performance obtained during the ANN training and testing process. It is observed that the best transfer function configuration for the prediction model is [logarithmic sigmoid, purely linear].
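The sketch below shows the general shape of such an evolutionary search over the hidden-layer size, learning rate and momentum of a small feedforward network, scored by regression performance on synthetic PV-like data. The mutation scheme, parameter ranges and data are illustrative assumptions rather than the paper's EPANN settings.

```python
# Minimal evolutionary-programming-style search over ANN training parameters.
# Synthetic PV-like data; ranges and mutation scheme are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
radiation = rng.uniform(0, 1000, 800)                 # W/m^2 (placeholder)
temperature = rng.uniform(20, 40, 800)                # deg C (placeholder)
energy = 0.15 * radiation * (1 - 0.004 * (temperature - 25)) + rng.normal(0, 5, 800)
X = np.column_stack([radiation, temperature])
X_tr, X_te, y_tr, y_te = train_test_split(X, energy, random_state=0)

def fitness(nodes, lr, mom):
    model = MLPRegressor(hidden_layer_sizes=(nodes,), solver="sgd",
                         learning_rate_init=lr, momentum=mom,
                         max_iter=800, random_state=0)
    model.fit(X_tr, y_tr)
    return model.score(X_te, y_te)                    # R^2 as the regression performance

best = (10, 0.01, 0.9)
best_fit = fitness(*best)
for _ in range(15):                                   # mutate-and-select loop
    child = (max(2, best[0] + rng.integers(-3, 4)),
             float(np.clip(best[1] * np.exp(0.3 * rng.standard_normal()), 1e-4, 0.5)),
             float(np.clip(best[2] + 0.05 * rng.standard_normal(), 0.1, 0.99)))
    f = fitness(*child)
    if f > best_fit:
        best, best_fit = child, f
print("best (nodes, lr, momentum):", best, "R^2:", round(best_fit, 3))
```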

Keywords: Artificial neural network (ANN), Correlation coefficient (R), Evolutionary programming-ANN (EPANN), Photovoltaic (PV), logarithmic sigmoid and tangent sigmoid.

6769 Data Structures and Algorithms of Intelligent Web-Based System for Modular Design

Authors: Ivan C. Mustakerov, Daniela I. Borissova

Abstract:

In recent years, new product development has become more and more competitive and globalized, and the design phase is critical for product success. The concept of modularity can provide the necessary foundation for organizations to design products that can respond rapidly to market needs. The paper describes the data structures and algorithms of an intelligent Web-based system for modular design, taking into account module compatibility relationships and given design requirements. The system's intelligence is realized by algorithms developed for the choice of modules that reflect all system restrictions and requirements. The proposed data structures and algorithms are illustrated by a case study of personal computer configuration. The applicability of the proposed approach is tested through a prototype of the Web-based system.

Keywords: Data structures, algorithms, intelligent web-based system, modular design.

6768 Self Watermarking based on Visual Cryptography

Authors: Mahmoud A. Hassan, Mohammed A. Khalili

Abstract:

We propose a simple watermarking method based on visual cryptography. The method is based on the selection of specific pixels from the original image instead of the random selection of pixels as in Hwang's scheme [1]. Verification information is generated and used to verify the ownership of the image without the need to embed the watermark pattern into the original digital data. Experimental results show that the proposed method can recover the watermark pattern from the marked data even if some changes are made to the original digital data.

Keywords: Watermarking, visual cryptography, visual threshold.

6767 Metal Ship and Robotic Car: A Hands-On Activity to Develop Scientific and Engineering Skills for High School Students

Authors: Jutharat Sunprasert, Ekapong Hirunsirisawat, Narongrit Waraporn, Somporn Peansukmanee

Abstract:

Metal Ship and Robotic Car is one of the hands-on activities in the course, the Fundamentals of Engineering, and can be divided into three parts. In the first part, the metal ships were made using engineering drawing, physics and mathematics knowledge. In the second part, the students learned how to construct a robotic car and control it using computer programming. In the last part, the students had to combine the workings of these two objects in the final testing. The aim of this study was to investigate the effectiveness of a hands-on activity integrating Science, Technology, Engineering and Mathematics (STEM) concepts to develop scientific and engineering skills. The results showed that the majority of students felt this hands-on activity led to an increased confidence level in the integration of STEM. Moreover, 48% of all students engaged well with the STEM concepts. Students could obtain knowledge of STEM through hands-on activities covering the topics of science and mathematics, engineering drawing, engineering workshop and computer programming; most students agreed or strongly agreed with this learning process. This indicates that the hands-on activity "Metal Ship and Robotic Car" is a useful tool for integrating each aspect of STEM. Furthermore, hands-on activities positively influence students' interest, which leads to increased learning achievement and to the development of scientific and engineering skills.

Keywords: Hands-on activity, STEM education, computer programming, metal work.

6766 Estimation of Missing or Incomplete Data in Road Performance Measurement Systems

Authors: Kristjan Kuhi, Kati K. Kaare, Ott Koppel

Abstract:

Modern management in most fields is performance-based; both planning and implementation of maintenance and operational activities are driven by appropriately defined performance indicators. Continuous real-time data collection for management is becoming feasible due to technological advancements. Outdated and insufficient input data may result in incorrect decisions. When deterministic models are used, the uncertainty of the object state is not visible; thus, applying deterministic models is more likely to give a false diagnosis. Constructing structured probabilistic models of the performance indicators, taking into consideration the surrounding indicator environment, enables the trustworthiness of the indicator values to be estimated. It also helps to fill gaps in the data to improve the quality of the performance analysis and of management decisions. In this paper, the authors discuss the application of probabilistic graphical models in road performance measurement and propose a high-level conceptual model that enables more precise analysis and prediction of future pavement deterioration based on road utilization.

Keywords: Probabilistic graphical models, performance indicators, road performance management, data collection

6765 Integration of Seismic and Seismological Data Interpretation for Subsurface Structure Identification

Authors: Iftikhar Ahmed Satti, Wan Ismail Wan Yusoff

Abstract:

The structural interpretation of a part of the eastern Potwar region (Missa Keswal) has been carried out with available seismological, seismic and well data. The seismological data contain both source parameters and fault plane solution (FPS) parameters, and the seismic data contain ten seismic lines that were re-interpreted using well data. The structural interpretation depicts two broad types of fault sets, namely thrust and back-thrust faults. Together, these faults give rise to pop-up structures in the study area and are also responsible for many structural traps and for seismicity. The seismic interpretation includes time and depth contour maps of the Chorgali Formation, while the seismological interpretation includes focal mechanism solutions (FMS), depth, frequency and magnitude bar graphs, and renewal of the seismotectonic map. The focal mechanism solutions (FMS) surrounding the study area are correlated with different geological and structural maps of the area to determine the nature of the subsurface faults. The results of the structural interpretation from the seismic and seismological data show good correlation. It is hoped that the present work will help in better understanding the variations in the subsurface structure and can be a useful tool for earthquake prediction, oil field planning and reservoir monitoring.

Keywords: Focal mechanism solution (FMS), Fault plane solution (FPS), Reservoir monitoring, earthquake prediction.

6764 Detailed Mapping of Pyroclastic Flow Deposits by SAR Data Processing for an Active Volcano in the Torrid Zone

Authors: Asep Saepuloh, Katsuaki Koike

Abstract:

Field mapping of an active volcano, particularly in the Torrid Zone, is usually hampered by several problems such as steep terrain and bad atmospheric conditions. In this paper, we present a simple solution to this problem through a combination of Synthetic Aperture Radar (SAR) and geostatistical methods. With this combination, we could reduce the speckle effect in the SAR data and then estimate the roughness distribution of the pyroclastic flow deposits. The main purpose of this study is to accurately detect the spatial distribution of new pyroclastic flow deposits, termed P-zones, using the β° data from two RADARSAT-1 SAR level-0 scenes. A single scene of Hyperion data and field observations were used for cross-validation of the SAR results. Mt. Merapi in central Java, Indonesia, was chosen as the study site, and the eruptions in May-June 2006 were examined. The P-zones were found on the western and southern flanks. The area and the longest flow distance were calculated as 2.3 km2 and 6.8 km, respectively. The grain size variation of the P-zones was mapped in detail from fine to coarse deposits with respect to the C-band wavelength of 5.6 cm.

Keywords: Geostatistical Method, Mt. Merapi, Pyroclastic, RADARSAT-1.

6763 Foundation of the Information Model for Connected-Cars

Authors: Hae-Won Seo, Yong-Gu Lee

Abstract:

Recent progress in the next generation of automobile technology is geared towards incorporating information technology into cars. Collectively called smart cars, these vehicles bring intelligence to cars to provide comfort, convenience and safety. One branch of smart cars is the connected-car system. The key concept in connected cars is the sharing of driving information among cars in a decentralized manner, enabling collective intelligence. This paper proposes a foundation for the information model that is necessary to define driving information for smart cars. Road conditions are modeled through a unique data structure that unambiguously represents the time-variant traffic in the streets. Additionally, the modeled data structure is exemplified in a navigational scenario, and its usage is illustrated using UML. Optimal driving route searching under dynamically changing road conditions is also discussed using the proposed data structure.

Keywords: Connected-car, data modeling, route planning, navigation system.

6762 Fault Detection and Identification of COSMED K4b2 Based On PCA and Neural Network

Authors: Jing Zhou, Steven Su, Aihuang Guo

Abstract:

COSMED K4b2 is a portable electrical device designed to test pulmonary functions. It is ideal for many applications that need measurement of the cardio-respiratory response either in the field or in the lab, and it is capable of delivering real-time data to a sink node or a PC base station while storing the data in memory at the same time. However, the actual sensor outputs and the data received may contain errors, such as impulsive noise, which can be related to the sensors, low batteries, the environment or disturbances in the data acquisition process. These abnormal outputs might cause misinterpretations of the exercise or daily activities of the persons being monitored. In this paper, we propose an effective and feasible method to detect and identify such errors using principal component analysis (PCA) and a back-propagation (BP) neural network.
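As a rough illustration of the PCA part of such a scheme, the sketch below fits PCA on "normal" sensor records and flags new records whose reconstruction error (squared prediction error) exceeds a threshold; the data, threshold rule and omission of the BP identification stage are all simplifying assumptions.

```python
# Minimal PCA fault-detection sketch: flag records with large reconstruction error.
# Synthetic 'sensor' data; the BP network identification stage is not included.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
latent = rng.standard_normal((500, 2))
normal = latent @ rng.standard_normal((2, 6)) + 0.05 * rng.standard_normal((500, 6))

pca = PCA(n_components=2).fit(normal)                 # model of normal operation

def spe(records):
    """Squared prediction error (residual after projecting onto the PCA subspace)."""
    recon = pca.inverse_transform(pca.transform(records))
    return np.sum((records - recon) ** 2, axis=1)

threshold = np.percentile(spe(normal), 99)            # simple empirical control limit

faulty = normal[:5].copy()
faulty[:, 3] += 4.0                                   # inject an impulsive sensor error
print("faulty flagged:", spe(faulty) > threshold)
print("clean  flagged:", spe(normal[:5]) > threshold)
```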

Keywords: BP Neural Network, Exercising Testing, Fault Detection and Identification, Principal Component Analysis.

6761 Array Data Transformation for Source Code Obfuscation

Authors: S. Praveen, P. Sojan Lal

Abstract:

Obfuscation is a low-cost software protection methodology used to hinder reverse engineering and re-engineering of applications. Source code obfuscation aims at obscuring the source code to hide its functionality. This paper proposes an array data transformation to obfuscate source code that uses arrays. Applications using the proposed data structures force the programmer to obscure the logic manually. This makes the obfuscated code hard to reverse engineer while preserving its functionality.
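The sketch below shows one generic flavour of array data transformation, an index permutation that scrambles the physical layout of an array while an access helper keeps the program's behaviour unchanged. The specific permutation and helper names are invented for illustration and are not the transformation proposed in the paper.

```python
# Generic array index-permutation sketch (illustrative; not the paper's transformation).
# The data layout is scrambled, but access through the helper preserves behaviour.

N, KEY = 10, 7                                        # KEY must be coprime with N

def obf_index(i):
    """Map a logical index to its scrambled physical position."""
    return (i * KEY + 3) % N

# Original program: data[i] = i * i
plain = [i * i for i in range(N)]

# Obfuscated version: same values stored at permuted positions
scrambled = [0] * N
for i in range(N):
    scrambled[obf_index(i)] = i * i

def read(i):
    return scrambled[obf_index(i)]                    # all accesses go through the mapping

assert all(read(i) == plain[i] for i in range(N))
print("scrambled layout:", scrambled)
```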

Keywords: Reverse Engineering, Source Code Obfuscation.
