Search results for: sensor data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7926

5946 MLOps Scaling Machine Learning Lifecycle in an Industrial Setting

Authors: Yizhen Zhao, Adam S. Z. Belloum, Gonc¸alo Maia da Costa, Zhiming Zhao

Abstract:

Machine learning has evolved from an area of academic research into a field applied in the real world. This change brings challenges: gaps and differences exist between common practices in academic environments and those in production environments. Mirroring the continuous integration, development and delivery practices of software engineering, similar trends have emerged for machine learning (ML) systems under the name MLOps. In this paper we propose a framework that streamlines and introduces best practices to facilitate the ML lifecycle in an industrial setting. The framework can be used as a template and customized to implement various machine learning experiments. It is modular and can be recomposed to suit various use cases (e.g. data versioning, remote training on the Cloud). It inherits practices from DevOps and introduces others that are unique to machine learning systems (e.g. data versioning). Our MLOps practices automate the entire machine learning lifecycle and bridge the gap between development and operations.

Keywords: Cloud computing, continuous development, data versioning, DevOps, industrial setting, MLOps, machine learning.

5945 Dynamic Bayesian Networks Modeling for Inferring Genetic Regulatory Networks by Search Strategy: Comparison between Greedy Hill Climbing and MCMC Methods

Authors: Huihai Wu, Xiaohui Liu

Abstract:

Using Dynamic Bayesian Networks (DBN) to model genetic regulatory networks from gene expression data is one of the major paradigms for inferring interactions among genes. Averaging over a collection of models is preferable to relying on a single high-scoring model when predicting the network. In this paper, two kinds of model-searching approaches are compared: Greedy hill-climbing Search with Restarts (GSR) and Markov Chain Monte Carlo (MCMC) methods. GSR is preferred in many papers, but no comparative study has established which one is better for DBN models. Different types of experiments have been carried out to provide a benchmark test for these approaches. Our experimental results demonstrate that, on average, the MCMC methods outperform GSR in the accuracy of the predicted network while having comparable time efficiency. By proposing different variations of MCMC and employing a simulated annealing strategy, the MCMC methods become more efficient and stable. Apart from comparing these approaches, another objective of this study is to investigate the feasibility of using DBN modeling approaches for inferring gene networks from a few snapshots of high-dimensional gene profiles. Through both synthetic and systematic data experiments, the results reveal how the performance of these approaches is influenced as the target gene network varies in network size, data size, and system complexity.
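
To make the two search strategies concrete, the sketch below runs a Metropolis-Hastings (MCMC) walk over DBN structures scored by BIC under a simple linear-Gaussian transition model, with model averaging over the sampled structures. The score, proposal and toy data are illustrative stand-ins, not the authors' implementation; a greedy hill-climbing step would instead accept a candidate only when its score improves.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 50, 5                       # time points, genes
X = rng.normal(size=(T, n))        # toy expression data (stand-in)

def bic_score(adj):
    """BIC of a DBN whose node j at time t depends linearly on its
    parents (adj[:, j]) at time t-1, with Gaussian noise."""
    score = 0.0
    for j in range(n):
        pa = np.flatnonzero(adj[:, j])
        y = X[1:, j]
        A = np.hstack([X[:-1, pa], np.ones((T - 1, 1))])   # parents + intercept
        beta = np.linalg.lstsq(A, y, rcond=None)[0]
        resid = y - A @ beta
        sigma2 = max(resid @ resid / (T - 1), 1e-12)
        ll = -0.5 * (T - 1) * (np.log(2 * np.pi * sigma2) + 1)
        score += ll - 0.5 * (len(pa) + 1) * np.log(T - 1)  # BIC penalty
    return score

def toggle(adj):
    a = adj.copy()
    a[rng.integers(n), rng.integers(n)] ^= 1   # flip one edge (t-1 -> t)
    return a

adj = np.zeros((n, n), dtype=int)
s = bic_score(adj)
samples = []
for it in range(3000):                         # Metropolis-Hastings over structures
    cand = toggle(adj)
    s_cand = bic_score(cand)
    # Greedy hill climbing would accept only if s_cand > s; MH also accepts
    # worse structures with probability exp(s_cand - s), enabling averaging.
    if np.log(rng.random()) < s_cand - s:
        adj, s = cand, s_cand
    if it >= 1000:
        samples.append(adj.copy())             # keep post-burn-in structures

edge_prob = np.mean(samples, axis=0)           # model-averaged edge probabilities
print(np.round(edge_prob, 2))
```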

Keywords: Genetic regulatory network, Dynamic Bayesian network, GSR, MCMC.

5944 Design and Implementation of Secure Electronic Payment System (Client)

Authors: Pyae Pyae Hun

Abstract:

A secure electronic payment system is presented in this paper. The system is intended to be secure for clients such as customers and shop owners. Its security architecture is designed around the RC5 encryption/decryption algorithm, which eliminates the fraud that occurs today with stolen credit card numbers. The symmetric-key cryptosystem RC5 can protect conventional transaction data such as account numbers, amounts and other information. This processing is done electronically using an RC5 encryption/decryption program written in Microsoft Visual Basic 6.0, so there is no danger of data sent within the system being intercepted and replaced. The alternative is to use the existing network and to encrypt all data transmissions. The system with encryption is acceptably secure, but the level of encryption has to be stepped up as computing power increases. To secure the system, communication between modules is encrypted using the symmetric-key cryptosystem RC5. When the user first enters the system, it uses a simple user name, password, user ID, user type and cipher authentication mechanism for identification, which is the most common method of authentication in most computer systems.
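
For illustration, here is a minimal sketch of RC5 key expansion and single-block encryption in Python, using the conventional RC5-32/12/16 parameterization from Rivest's specification; the paper's implementation is a Visual Basic 6.0 program whose exact parameters are not stated, so this is an assumption.

```python
# Minimal RC5 sketch with the conventional RC5-32/12/16 parameters
# (32-bit words, 12 rounds, 16-byte key), after Rivest's specification.
W, R = 32, 12
P32, Q32 = 0xB7E15163, 0x9E3779B9
MASK = (1 << W) - 1

def rotl(x, y):
    y %= W
    return ((x << y) | (x >> (W - y))) & MASK

def expand_key(key: bytes):
    c = max(1, len(key) // 4)
    L = [int.from_bytes(key[4*i:4*i+4], "little") for i in range(c)]
    t = 2 * (R + 1)
    S = [(P32 + i * Q32) & MASK for i in range(t)]        # magic-constant fill
    A = B = i = j = 0
    for _ in range(3 * max(t, c)):                        # mix key into S
        A = S[i] = rotl((S[i] + A + B) & MASK, 3)
        B = L[j] = rotl((L[j] + A + B) & MASK, A + B)
        i, j = (i + 1) % t, (j + 1) % c
    return S

def encrypt_block(block: bytes, S):
    A = (int.from_bytes(block[:4], "little") + S[0]) & MASK
    B = (int.from_bytes(block[4:], "little") + S[1]) & MASK
    for r in range(1, R + 1):                             # 12 mixing rounds
        A = (rotl(A ^ B, B) + S[2 * r]) & MASK
        B = (rotl(B ^ A, A) + S[2 * r + 1]) & MASK
    return A.to_bytes(4, "little") + B.to_bytes(4, "little")

S = expand_key(b"0123456789abcdef")                       # 128-bit session key
print(encrypt_block(b"ACCT0001", S).hex())                # one 8-byte data block
```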

Keywords: 128-bit block cipher, Microsoft Visual Basic 6.0, RC5 encryption/decryption algorithm, TCP/IP protocol.

5943 Bootstrap Confidence Intervals and Parameter Estimation for Zero Inflated Strict Arcsine Model

Authors: Y. N. Phang, E. F. Loh

Abstract:

The zero-inflated strict arcsine model is a newly developed model found to be appropriate for modeling overdispersed count data. In this study, the maximum likelihood estimation method is used to estimate the parameters of the zero-inflated strict arcsine model. Bootstrapping is then employed to compute confidence intervals for the estimated parameters.
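
A sketch of the estimate-then-bootstrap workflow: since the strict arcsine pmf is not reproduced in the abstract, a zero-inflated Poisson is used here as a stand-in likelihood, and percentile intervals stand in for the BCa intervals mentioned in the keywords.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

rng = np.random.default_rng(1)
# Stand-in sample from a zero-inflated Poisson (pi = extra-zero probability).
pi_true, lam_true = 0.3, 2.5
y = np.where(rng.random(500) < pi_true, 0, rng.poisson(lam_true, 500))

def negloglik(theta, sample):
    pi, lam = theta
    p_zero = pi + (1 - pi) * poisson.pmf(0, lam)       # zeros from either source
    p_pos = (1 - pi) * poisson.pmf(sample, lam)        # ordinary counts
    return -np.sum(np.where(sample == 0, np.log(p_zero),
                            np.log(np.maximum(p_pos, 1e-300))))

def fit(sample):
    res = minimize(negloglik, x0=[0.5, sample.mean() + 0.5], args=(sample,),
                   bounds=[(1e-6, 1 - 1e-6), (1e-6, None)])
    return res.x

theta_hat = fit(y)
boot = np.array([fit(rng.choice(y, size=y.size, replace=True))
                 for _ in range(200)])                 # nonparametric bootstrap
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)      # 95% percentile intervals
print("estimates:", theta_hat, "CI low:", lo, "CI high:", hi)
```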

Keywords: overdispersed count data, maximum likelihood estimation, simulated annealing, BCa confidence intervals.

5942 One Hour Ahead Load Forecasting Using Artificial Neural Network for the Western Area of Saudi Arabia

Authors: A. J. Al-Shareef, E. A. Mohamed, E. Al-Judaibi

Abstract:

Load forecasting has become in recent years one of the major areas of research in electrical engineering. Most traditional forecasting models and artificial-intelligence neural network techniques have been tried on this task. Artificial neural networks (ANN) have lately received much attention, and a great number of papers have reported successful experiments and practical tests. This article presents the development of an ANN-based short-term load forecasting model with an improved generalization technique for the Regional Power Control Center of the Saudi Electricity Company, Western Operation Area (SEC-WOA). The proposed ANN is trained with weather-related data and historical electric load data from the calendar years 2001 through 2004. The model was tested for one week in each of five different seasons, namely winter, spring, summer, Ramadan and fall, and the mean absolute percentage error for one-hour-ahead load forecasting was found to be 1.12%.
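
A minimal sketch of a one-hour-ahead forecaster of this kind, using scikit-learn's MLPRegressor on lagged load, temperature and hour-of-day features; the synthetic data, feature set and network size are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 24 * 365
hour = np.arange(n)
temp = 30 + 8 * np.sin(2 * np.pi * hour / n) + rng.normal(0, 1, n)   # synthetic weather
load = 500 + 10 * temp + 80 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 20, n)

t = np.arange(24, n)   # build lagged features for one-hour-ahead prediction
X = np.column_stack([load[t - 1], load[t - 2], load[t - 3],   # recent loads
                     load[t - 24],                            # same hour yesterday
                     temp[t], t % 24])                        # weather + hour-of-day
y = load[t]

split = int(0.8 * len(t))
scaler = StandardScaler().fit(X[:split])
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(scaler.transform(X[:split]), y[:split])

pred = model.predict(scaler.transform(X[split:]))
mape = np.mean(np.abs((pred - y[split:]) / y[split:])) * 100
print(f"one-hour-ahead MAPE = {mape:.2f}%")
```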

Keywords: Artificial neural networks, short-term load forecasting, back propagation.

5941 A Proposal for a Secure and Interoperable Data Framework for Energy Digitalization

Authors: Hebberly Ahatlan

Abstract:

The process of digitizing energy systems involves transforming traditional energy infrastructure into interconnected, data-driven systems that enhance efficiency, sustainability, and responsiveness. As smart grids become increasingly integral to the efficient distribution and management of electricity from both fossil and renewable energy sources, the energy industry faces strategic challenges associated with digitalization and interoperability — particularly in the context of modern energy business models, such as virtual power plants (VPPs). The critical challenge in modern smart grids is to seamlessly integrate diverse technologies and systems, including virtualization, grid computing and service-oriented architecture (SOA), across the entire energy ecosystem. Achieving this requires addressing issues like semantic interoperability, Information Technology (IT) and Operational Technology (OT) convergence, and digital asset scalability, all while ensuring security and risk management. This paper proposes a four-layer digitalization framework to tackle these challenges, encompassing persistent data protection, trusted key management, secure messaging, and authentication of IoT resources. Data assets generated through this framework enable AI systems to derive insights for improving smart grid operations, security, and revenue generation. Furthermore, this paper also proposes a Trusted Energy Interoperability Alliance as a universal guiding standard in the development of this digitalization framework to support more dynamic and interoperable energy markets.

Keywords: Digitalization, IT/OT convergence, semantic interoperability, TEIA alliance, VPP.

5940 Needs of Omani Children in First Grade during Their Transition from Kindergarten to Primary School: An Ethnographic Study

Authors: Zainab Algharibi, Julie McAdam, Catherine Fagan

Abstract:

The purpose of this paper is to shed light on how Omani children in the first grade experience their needs during their transition to primary school. Theoretically, the paper builds on two perspectives: Dewey's concept of continuity of experience and the boundary objects introduced by Vygotsky (CHAT). The methodology of the study rests on the crucial role of children's agency, an important educational tool for enhancing children's participation in the learning process and developing their ability to face various issues in their lives. Data were obtained from 45 children in grade one from four different primary schools using drawing and visual narrative activities, in addition to researcher observations during the first weeks of the academic year. As the study dealt with children, all necessary ethical requirements were followed. This paper is considered original since it deals with children's transition from kindergarten to primary school not only in Oman but also in the Arab region; it is therefore expected to fill an important gap in this field and to present a proposal that opens the door for later researchers to enter this field. The analysis of the drawings and visual narratives was performed according to the social semiotics approach in two phases. The first reads out the surface message, the "denotation"; the second goes in depth via the symbolism obtained from the children as they talked and drew letters and signs, known as the "signified". A video was recorded of each child talking about their drawing and expressing themselves, and the data were then organised and classified according to a cross-data network. The researcher observations were analysed according to grounded theory, comparing the data collected from observations with data previously encoded by the other methods (the children's drawings alongside the visual narratives) in order to identify similarities and differences, clarify the meaning of the derived categories, identify sub-categories, and describe possible links between them; this constitutes a form of triangulation in data collection. The study produced a set of findings, the most vital being that children's greatest interest goes to their social and psychological needs, such as friends, their teacher, and playing. Their biggest fears are a new place, a new teacher, and not having friends, while they showed less concern for their need for educational knowledge and skills.

Keywords: Children’s academic needs, children’s social needs, children’s transition, primary school.

5939 Probability Distribution of Rainfall Depth at Hourly Time-Scale

Authors: S. Dan'azumi, S. Shamsudin, A. A. Rahman

Abstract:

Rainfall data at fine resolution, and knowledge of their characteristics, play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. This paper studies the statistical distribution of hourly rainfall depth for 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data covering periods of 10 to 22 years were collected and their statistical characteristics estimated. Three probability distributions, namely the Generalized Pareto, Exponential and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling and Chi-Squared tests, were used to evaluate their fit. Results indicate that the east coast of the Peninsula receives a higher depth of rainfall than the west coast; however, the rainfall frequency is found to be irregular. Results from the goodness-of-fit tests show that all three models fit the rainfall data at the 1% level of significance; the Generalized Pareto fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
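
A sketch of the fitting and goodness-of-fit step with SciPy: fit the three candidate distributions to (stand-in) hourly depths and compare Kolmogorov-Smirnov statistics; SciPy's Anderson-Darling test covers only a few distributions, so it is omitted here.

```python
import numpy as np
from scipy import stats

# Stand-in sample of nonzero hourly rainfall depths (mm); station records
# would be used in practice.
depth = stats.genpareto.rvs(c=0.2, scale=3.0, size=2000, random_state=0)

candidates = {"Generalized Pareto": stats.genpareto,
              "Exponential": stats.expon,
              "Gamma": stats.gamma}
for name, dist in candidates.items():
    params = dist.fit(depth, floc=0)                  # location fixed at zero
    ks = stats.kstest(depth, dist.cdf, args=params)   # Kolmogorov-Smirnov test
    print(f"{name:18s} KS stat = {ks.statistic:.4f}, p = {ks.pvalue:.3f}")
```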

Keywords: Goodness-of-fit test, Hourly rainfall, Malaysia, Probability distribution.

5938 Classifying and Predicting Efficiencies Using Interval DEA Grid Setting

Authors: Yiannis G. Smirlis

Abstract:

The classification and prediction of efficiencies in Data Envelopment Analysis (DEA) is an important issue, especially in large-scale problems or when new units frequently enter the set under assessment. In this paper, we contribute to the subject by proposing a grid structure based on interval segmentations of the range of values for the inputs and outputs. Such intervals, combined, define hyper-rectangles that partition the space of the problem. This structure, exploited by interval DEA models and a dominance relation, acts as a DEA pre-processor, enabling the classification and prediction of efficiency scores without applying any DEA models.
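
A sketch of the dominance pre-screening idea: a grid cell dominates another if its worst corner (largest inputs, smallest outputs) is still at least as good as the other's best corner. The cell representation and this particular rule are inferred from the abstract for illustration, not the paper's exact model.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Cell:
    """One hyper-rectangle of the grid: interval bounds per input/output."""
    in_lo: np.ndarray
    in_hi: np.ndarray
    out_lo: np.ndarray
    out_hi: np.ndarray

def dominates(a: Cell, b: Cell) -> bool:
    # Cell `a` dominates cell `b` when, even at its worst corner (largest
    # inputs, smallest outputs), it is at least as good as b's best corner.
    return bool(np.all(a.in_hi <= b.in_lo) and np.all(a.out_lo >= b.out_hi))

a = Cell(np.array([1.0]), np.array([2.0]), np.array([8.0]), np.array([9.0]))
b = Cell(np.array([3.0]), np.array([4.0]), np.array([5.0]), np.array([6.0]))
print(dominates(a, b))   # True: any unit falling in `a` outperforms any in `b`
```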

Keywords: Data envelopment analysis, interval DEA, efficiency classification, efficiency prediction.

5937 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients

Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi

Abstract:

Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time until death. A frailty model is a random-effect model for time-to-event data in which the random effect has a multiplicative influence on the baseline hazard function. This article investigates the use of a gamma frailty model with concomitant variables in order to identify the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: During the one-year study period (May 2008-May 2009), data were drawn from the recorded information of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years in Imam Khomeini Hospital in Iran. To determine the factors affecting cirrhotic patients' survival in the presence of latent variables, the gamma frailty distribution was applied, and parametric models, namely the exponential and Weibull distributions, were considered for survival time. Data analysis was performed using R software, and an error level of 0.05 was used for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. By the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was hepatitis B (23%), followed by cryptogenic cirrhosis (22.6%) as the second factor. Overall, mean survival over the seven-year follow-up was 28.44 months; for deceased and censored patients it was 19.33 and 31.79 months, respectively. Exponential and Weibull survival models incorporating the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients. Conclusion: To investigate the factors affecting the time of death of patients with liver cirrhosis in the presence of latent variables, a gamma frailty model with parametric distributions seems desirable.

Keywords: Frailty model, latent variables, liver cirrhosis, parametric distribution.

5936 Operational Risk – Scenario Analysis

Authors: Milan Rippel, Petr Teply

Abstract:

This paper focuses on operational risk measurement techniques and on economic capital estimation methods. A data sample of operational losses provided by an anonymous Central European bank is analyzed using several approaches; the Loss Distribution Approach and the scenario analysis method are considered. Custom plausible loss events defined in a particular scenario are merged with the original data sample, and their impact on capital estimates and on the financial institution is evaluated. Two main questions are assessed: what is the most appropriate statistical method to measure and model the operational loss data distribution, and what is the impact of hypothetical plausible events on the financial institution? The g&h distribution was found to be the most suitable one for operational risk modeling. The method based on combining historical loss event modeling with scenario analysis provides reasonable capital estimates and allows the impact of extreme events on banking operations to be measured.
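
A minimal sketch of the Loss Distribution Approach side of the paper: simulate annual loss counts and severities, aggregate them, and read the economic capital off a high quantile. The Poisson and lognormal choices and all parameter values are illustrative stand-ins (the paper itself finds the g&h distribution most suitable for severities); scenario analysis would merge hypothetical plausible losses into the severity sample before this step.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 50_000            # simulated years of aggregate losses
lam = 25                    # mean number of loss events per year (assumed)
mu, sigma = 10.0, 2.0       # lognormal severity parameters (assumed)

counts = rng.poisson(lam, n_years)
annual = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])

var_999 = np.quantile(annual, 0.999)      # 99.9% quantile of annual loss
capital = var_999 - annual.mean()         # unexpected loss = economic capital
print(f"VaR(99.9%) = {var_999:,.0f}  economic capital = {capital:,.0f}")
```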

Keywords: operational risk, scenario analysis, economic capital, loss distribution approach, extreme value theory, stress testing

5935 Performance Evaluation of Data Mining Techniques for Predicting Software Reliability

Authors: Pradeep Kumar, Abdul Wahid

Abstract:

Accurate software reliability prediction not only enables developers to improve the quality of software but also provides useful information to help them plan valuable resources. This paper examines the performance of three well-known data mining techniques (CART, TreeNet and Random Forest) for predicting software reliability. We evaluate and compare the performance of the proposed models with a Cascade Correlation Neural Network (CCNN) using sixteen empirical databases from the Data and Analysis Center for Software. The goal of our study is to help project managers concentrate their testing efforts so as to minimize software failures and improve the reliability of software systems. Two performance measures, Normalized Root Mean Squared Error (NRMSE) and Mean Absolute Error (MAE), illustrate that the CART model is more accurate than the Random Forest, TreeNet and CCNN models across all datasets used in our study. Finally, we conclude that such methods can help in reliability prediction using real-life failure datasets.
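
A sketch of the comparison protocol on synthetic data, with scikit-learn's DecisionTreeRegressor standing in for CART and GradientBoostingRegressor standing in for the proprietary TreeNet (the CCNN baseline is omitted); the metrics follow the paper's NRMSE and MAE.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=400, n_features=8, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "CART": DecisionTreeRegressor(max_depth=5, random_state=0),
    "Random Forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "TreeNet (GBM stand-in)": GradientBoostingRegressor(random_state=0),
}
for name, m in models.items():
    pred = m.fit(X_tr, y_tr).predict(X_te)
    rmse = np.sqrt(np.mean((y_te - pred) ** 2))
    nrmse = rmse / (y_te.max() - y_te.min())       # normalized by target range
    print(f"{name:24s} NRMSE={nrmse:.3f} MAE={mean_absolute_error(y_te, pred):.2f}")
```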

Keywords: Classification, Cascade Correlation Neural Network, Random Forest, Software reliability, TreeNet.

5934 Effective Collaboration in Product Development via a Common Sharable Ontology

Authors: Sihem Mostefai, Abdelaziz Bouras, Mohamed Batouche

Abstract:

To achieve competitive advantage nowadays, most industrial companies consider that success is sustained by great product development, that is, managing the product throughout its entire lifetime, from design and manufacture to operation and disposal. Achieving this goal requires tight collaboration between partners from a wide variety of domains, resulting in various product data types and formats as well as different software tools. So far, the lack of a meaningful unified representation of product data semantics has slowed down efficient product development. This paper proposes an ontology-based approach to enable such semantic interoperability. A generic and extendible product ontology is described, gathering the main concepts pertaining to the mechanical field and the relations that hold among them. The ontology is not exhaustive; nevertheless, it shows that such a unified representation is possible and easily exploitable. This is illustrated through a case study with an example product and some semantic requests to which the ontology responds quite easily. The study proves the efficiency of ontologies as a support for product data exchange and information sharing, especially in product development environments where collaboration is not just a choice but a mandatory prerequisite.

Keywords: Information exchange, product lifecycle management, product ontology, semantic interoperability.

5933 Biometric Technology in Securing the Internet Using Large Neural Network Technology

Authors: B. Akhmetov, A. Doszhanova, A. Ivanov, T. Kartbayev, A. Malygin

Abstract:

The article examines methods of protecting citizens' personal data on the Internet using biometric identity authentication technology. It highlights the potential danger posed by the loss of biometric template databases. To eliminate the threat of compromised biometric templates, it is proposed to use large and extra-large neural networks, which on the one hand authenticate a person by his biometrics in a highly reliable way, and on the other hand make the person's biometrics unavailable for observation and understanding. The article also describes in detail the transformation of personal biometric data into an access code, and formulates the requirements for the biometrics-to-code converter regarding its behaviour on images of the "Insider", a "Stranger", and all "Strangers". The effect of the dimension of the neural network on the quality of the converters that keep the biometrics secret within the access code is analyzed.

Keywords: Biometric security technologies, conversion of personal biometric data into an access code, electronic signature, large neural networks, quality of biometrics-to-code converters, e-government.

5932 Study of the Process of Climate Change According to Data Simulation Using LARS-WG Software during 2010-2030: Case Study of Semnan Province

Authors: Leila Rashidian

Abstract:

Temperature rise on Earth has had harmful effects on the Earth's surface and has led to changes in precipitation patterns around the world. The present research studies the process of climate change using simulated data for the future, and compares these parameters with the current situation at the studied stations in Semnan province, namely Garmsar, Shahrood and Semnan. For this purpose, the LARS-WG software, the HADCM3 model and the A2 scenario were used for the 2010-2030 period. In this model, daily climatic parameters such as maximum and minimum temperature, precipitation and radiation were used. The results indicate that there will be a 4.4% increase in precipitation in Semnan province compared with the observed data and, in general, a 1.9% increase in temperature. This temperature rise has a significant impact on precipitation patterns: most precipitation will fall as rain (torrential in some cases). According to the results, from west to east, the country will experience a greater temperature rise and will become warmer.

Keywords: Climate change, Semnan province, LARS-WG model, climate parameters, HADCM3 model.

5931 Estimation of Human Absorbed Dose Using Compartmental Model

Authors: M. Mousavi-Daramoroudi, H. Yousefnia, F. Abbasi-Davani, S. Zolghadri

Abstract:

Dosimetry is an indispensable and precious factor in patient treatment planning, used to minimize the absorbed dose in vital tissues. In this study, a compartmental model was used to estimate the human absorbed dose of 177Lu-DOTATOC from biodistribution data in wild-type rats. For this purpose, 177Lu-DOTATOC was prepared under optimized conditions and its biodistribution was studied in male Syrian rats for up to 168 h. The compartmental model was applied to describe mathematically the behaviour of the drug in tissue at different times. Dosimetric estimation of the complex was performed using the Radiation Absorbed Dose Assessment Resource (RADAR). The biodistribution data showed high accumulation in the adrenal glands and pancreas, the major expression sites for the somatostatin receptor (SSTR). While the kidneys, as the major route of excretion, receive 0.037 mSv/MBq, the pancreas and adrenal glands receive 0.039 and 0.028 mSv/MBq, respectively. Using this method increased the number of accumulated-activity data points and provided further information on tissue uptake, which leads to improved precision in the dosimetric calculations.
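
A sketch of the compartmental step: fit a two-compartment (biexponential) washout model to time-activity data for one organ and integrate the fitted curve analytically to obtain the cumulated activity that dose calculations need. The model form and all numbers are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np
from scipy.optimize import curve_fit

# Time-activity data for one organ: % injected activity at sampling times (assumed).
t = np.array([1, 4, 24, 48, 96, 168], dtype=float)          # hours
a = np.array([8.0, 6.5, 4.1, 2.9, 1.6, 0.8])                # %IA

def biexp(t, A1, k1, A2, k2):
    """Two-compartment washout: fast (k1) and slow (k2) clearance terms."""
    return A1 * np.exp(-k1 * t) + A2 * np.exp(-k2 * t)

popt, _ = curve_fit(biexp, t, a, p0=[5, 0.5, 3, 0.01],
                    bounds=(0, np.inf), maxfev=10000)
A1, k1, A2, k2 = popt

# Cumulated activity per unit injected: integral from 0 to infinity of the
# fitted curve, including physical decay of 177Lu (half-life ~6.65 days).
lam_phys = np.log(2) / (6.65 * 24)                           # 1/h
cumulated = A1 / (k1 + lam_phys) + A2 / (k2 + lam_phys)
print(f"fit: {np.round(popt, 4)}, cumulated activity = {cumulated:.1f} %IA*h")
```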

Keywords: Compartmental modeling, human absorbed dose, 177Lu-DOTATOC, Syrian rats.

5930 Improvement of Gas Turbine Performance Test in Combine Cycle

Authors: M. Khosravy-el-Hossani, Q. Dorosti

Abstract:

One of the important applications of gas turbines is their use with heat recovery steam generators in combined-cycle technology. Exhaust flow and exhaust energy are two key parameters for determining heat recovery steam generator performance, and they are mainly determined from the performance data of the main gas turbine components. For this reason a method for determining the exhaust energy was introduced in the new edition of ASME PTC 22. The result of this investigation shows that the standard's method has considerable error; therefore, this paper presents a new method for modifying the performance calculation. The modified method is based on exhaust gas constituent analysis and combustion calculations. The case study presented here uses design data for two types of General Electric gas turbines to validate the methodologies. The results show that the modified method is more precise than the ASME PTC 22 method: the deviation of the calculated exhaust flow from design data is 1.5-2% with the ASME PTC 22 method, while the deviation with the modified method is 0.3-0.5%. Given the precision of analyzer instruments, the method can be a suitable alternative for the standard gas turbine performance test. In addition, two procedures, based on known and unknown fuel composition, are proposed within the modified method. The results show that the difference between the two procedures is below 0.02%. Given the reasonable result of the second procedure (unknown fuel composition), the method can be applied to gas turbine performance evaluation while reducing measurement cost and data gathering.

Keywords: Gas turbine, Performance test code, Combined cycle.

5929 PRO-Teaching – Sharing Ideas to Develop Capabilities

Authors: Steve J. Drew, Christopher J. Klopper

Abstract:

In this paper, the action-research-driven design of a context-relevant, developmental peer review of teaching model, its implementation strategy and its impact at an Australian university are presented. PRO-Teaching realizes an innovative process that triangulates contemporaneous teaching quality data from a range of stakeholders, including students, discipline academics, learning and teaching expert academics, and teacher reflection, to create reliable evidence of teaching quality. Data collected over multiple classroom observations allow objective reporting on development differentials in constructive alignment and in peer and student evaluations. Further innovation is realized in the application of this highly structured developmental process to provide summative evidence of sufficient validity to support claims for professional advancement and learning and teaching awards. Design decision points and contextual triggers are described within the operating domain. Academics and developers seeking to introduce structured peer review of teaching into their organization will find this paper a useful reference.

Keywords: Development loop, Multiple data sources, Objective reporting, Peer review of teaching.

5928 Enhanced Bidirectional Selection Sort

Authors: Jyoti Dua

Abstract:

An algorithm is a well-defined procedure that takes some input in the form of values, processes them, and gives the desired output. Sorting forms the basis of many other algorithms such as searching, pattern matching and digital filters, and finds applications in database systems, data statistics and processing, data communications and pattern matching. This paper introduces the "Enhanced Bidirectional Selection" sort, which is bidirectional and stable. It is said to be bidirectional as it selects two values per pass, the smallest from the front and the largest from the rear, and assigns them to their appropriate locations, thus reducing the number of passes to half the total number of elements as compared to selection sort.
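
As a concrete illustration of the pass structure described above, here is a minimal sketch of a bidirectional selection sort; note that this plain swap-based version is not stable, and the paper's exact procedure may handle placement differently.

```python
def bidirectional_selection_sort(a):
    """Each pass scans the unsorted middle section once, selecting both its
    minimum and its maximum, and moves them to the two ends, halving the
    number of passes relative to plain selection sort."""
    lo, hi = 0, len(a) - 1
    while lo < hi:
        imin = imax = lo
        for i in range(lo, hi + 1):          # one scan finds both extremes
            if a[i] < a[imin]:
                imin = i
            if a[i] > a[imax]:
                imax = i
        a[lo], a[imin] = a[imin], a[lo]      # place the minimum at the front
        if imax == lo:                       # the maximum was just moved away
            imax = imin
        a[hi], a[imax] = a[imax], a[hi]      # place the maximum at the rear
        lo, hi = lo + 1, hi - 1
    return a

print(bidirectional_selection_sort([5, 1, 4, 2, 8, 0, 3]))
```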

Keywords: Bubble sort, cocktail sort, selection sort, heap sort.

5927 Decision Support System for Flood Crisis Management using Artificial Neural Network

Authors: Muhammad Aqil, Ichiro Kita, Akira Yano, Nishiyama Soichi

Abstract:

This paper presents an alternative approach that uses an artificial neural network to simulate flood level dynamics in a river basin. The algorithm was developed in a decision support system environment in order to enable users to process the data. The decision support system is found to be useful due to its interactive nature, flexibility of approach and evolving graphical features, and can be adopted for any similar situation to predict the flood level. The main data processing includes gauging station selection, input generation, lead-time selection/generation, and length of prediction. The program enables users to process the flood level data, to train/test the model using various inputs and to visualize results. The program code consists of a set of files, which can be modified to match other purposes. The running results indicate that the decision support system applied to the flood level reaches encouraging results for the river basin under examination. The comparison of the model predictions with the observed data was satisfactory; the model is able to forecast the flood level up to 5 hours in advance with reasonable prediction accuracy. Finally, the program may also serve as a tool for real-time flood monitoring and process control.

Keywords: Decision Support System, Neural Network, Flood Level

5926 Attenuation in Transferred RF Power to a Biomedical Implant due to the Misalignment of Coils

Authors: Batel Noureddine, Mehenni Mohamed, Dekar Lies

Abstract:

In the biomedical implant field, a new formula is given for studying the attenuation of radio-frequency power caused by the simultaneous effects of lateral and angular misalignment of the supply/data-transfer coils. A comparison with practical measurements taken inside a Faraday cage allowed the theoretical results to be verified. DC supply systems with no material connection, and the data transmitters used in biomedical implants, can be properly dimensioned by taking into account the possible power attenuation due to misalignment of the transfer coils.

Keywords: Biomedical implant field, misalignment of coils, power attenuation, transmitter and receiver coils.

5925 Extracting Attributes for Twitter Hashtag Communities

Authors: Ashwaq Alsulami, Jianhua Shao

Abstract:

Various organisations often need to understand discussions on social media, such as what the trending topics are and the characteristics of the people engaged in the discussion. A number of approaches have been proposed to extract attributes that would characterise a discussion group. However, these approaches are largely based on supervised learning, and as such they require a large amount of labelled data. In this paper we propose an approach that does not require labelled data but relies on lexical sources to detect meaningful attributes for online discussion groups. Our findings show an acceptable level of accuracy in detecting attributes for Twitter discussion groups.

Keywords: Attributed community, attribute detection, community, social network.

5924 On the Performance of Information Criteria in Latent Segment Models

Authors: Jaime R. S. Fonseca

Abstract:

Notwithstanding the widespread application of finite mixture models in segmentation, finite mixture model selection is still an important issue. In fact, the selection of an adequate number of segments is a key issue in deriving latent segment structures, and it is desirable that the selection criteria used for this end are effective. In order to select among several information criteria which may support the selection of the correct number of segments, we conduct a simulation study. In particular, this study is intended to determine which information criteria are more appropriate for mixture model selection when considering data sets with only categorical segmentation base variables. The generation of mixtures of multinomial data supports the proposed analysis. As a result, we establish a relationship between the level of measurement of the segmentation variables and the performance of eleven information criteria. The criterion AIC3 shows the best performance (it indicates the correct number of segments in the simulated structure most often) for mixtures of multinomial segmentation base variables.

Keywords: Quantitative Methods, Multivariate Data Analysis, Clustering, Finite Mixture Models, Information Theoretical Criteria, Simulation experiments.

5923 A Growing Neural Gas Approach for Evaluating Quality of Software Modules

Authors: Parvinder S. Sandhu, Sandeep Khimta, Kiranpreet Kaur

Abstract:

The prediction of software quality during the development life cycle of a software project helps the development organization make efficient use of available resources to produce a product of the highest quality. A "whether a module is faulty or not" approach can be used to predict the quality of a software module. A number of software quality prediction models described in the literature are based upon genetic algorithms, artificial neural networks and other data mining algorithms. One promising approach to quality prediction is based on clustering techniques. Most quality prediction models based on clustering use the K-means, Mixture-of-Gaussians, Self-Organizing Map, Neural Gas or fuzzy K-means algorithm. All these techniques require a predefined structure, that is, the number of neurons or clusters must be known before the clustering process starts. In the case of Growing Neural Gas, however, there is no need to predetermine the number of neurons or the topology of the structure: it starts with a minimal neuron structure that is incremented during training until it reaches a user-defined maximum number of clusters. Hence, in this work we use Growing Neural Gas as the underlying clustering algorithm, producing an initial set of labeled clusters from the training data set; this set of clusters is then used to predict the quality of the software modules in the test data set. The best testing results show 80% accuracy in evaluating the quality of software modules. The proposed technique can therefore be used by programmers to evaluate the quality of modules during software development.
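
Since the growth mechanism is the paper's key point, here is a compact sketch of the Growing Neural Gas algorithm (Fritzke, 1995) on toy 2-D data; the parameter values are standard defaults, and the fault-labeling step that the paper applies afterwards is omitted.

```python
import numpy as np

def gng(X, max_nodes=15, eps_b=0.05, eps_n=0.006, age_max=50,
        lam=100, alpha=0.5, d=0.995, steps=20000, seed=0):
    """Growing Neural Gas: grows a cluster structure from two nodes, so the
    number of clusters need not be fixed in advance."""
    rng = np.random.default_rng(seed)
    W = {0: X[rng.integers(len(X))].astype(float),
         1: X[rng.integers(len(X))].astype(float)}
    err = {0: 0.0, 1: 0.0}
    age = {}                                          # edge (i, j), i < j -> age
    nid = 2

    def key(a, b): return (min(a, b), max(a, b))

    for step in range(1, steps + 1):
        x = X[rng.integers(len(X))]
        ids = list(W)
        d2 = {i: float(np.sum((x - W[i]) ** 2)) for i in ids}
        s1, s2 = sorted(ids, key=d2.get)[:2]          # two nearest nodes
        err[s1] += d2[s1]
        W[s1] += eps_b * (x - W[s1])                  # move winner toward x
        for i in ids:                                 # age edges, move neighbors
            if key(s1, i) in age:
                W[i] += eps_n * (x - W[i])
                age[key(s1, i)] += 1
        age[key(s1, s2)] = 0                          # refresh or create edge
        for e in [e for e, a in age.items() if a > age_max]:
            del age[e]                                # drop stale edges
        for i in [i for i in list(W) if not any(i in e for e in age)]:
            del W[i]; del err[i]                      # drop isolated nodes
        if step % lam == 0 and len(W) < max_nodes:
            q = max(err, key=err.get)                 # node with largest error
            nbrs = [j for e in age if q in e for j in e if j != q]
            f = max(nbrs, key=err.get)
            W[nid] = 0.5 * (W[q] + W[f])              # insert node between q, f
            del age[key(q, f)]
            age[key(q, nid)] = age[key(f, nid)] = 0
            err[q] *= alpha; err[f] *= alpha
            err[nid] = err[q]
            nid += 1
        for i in err:
            err[i] *= d                               # global error decay
    return W, age

# Toy usage: three Gaussian blobs; the learned nodes approximate the clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, size=(200, 2)) for c in ([0, 0], [4, 0], [2, 3])])
W, age = gng(X)
print(len(W), "nodes;", len(age), "edges")
```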

Keywords: Growing Neural Gas, data clustering, fault prediction.

5922 Prediction of Dissolved Oxygen in Rivers Using a Wang-Mendel Method – Case Study of Au Sable River

Authors: Mahmoud R. Shaghaghian

Abstract:

The amount of dissolved oxygen in a river has a great direct effect on aquatic macroinvertebrates, and this in turn indirectly influences the region's ecosystem. In this paper we predict dissolved oxygen in rivers by employing a simple fuzzy logic modeling technique, the Wang-Mendel method. This model uses only previous records to estimate upcoming values. For this purpose, daily and hourly records of eight stations in the Au Sable watershed in Michigan, United States, are employed over periods of 12 years and 50 days, respectively. Calculations indicate that for long-period prediction it is better to increase the input intervals, but for filling missing data it is advisable to decrease the interval. Increasing the partitioning of input and output features influences accuracy only slightly but makes the model very time consuming. Increasing the number of inputs acts like increasing the partitioning. A large amount of training data does not essentially improve accuracy, so an optimal training length should be selected.
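
A compact sketch of the Wang-Mendel procedure: partition each variable's domain into overlapping triangular fuzzy regions, generate one rule per data pair, resolve conflicting rules by their degree, and predict by centroid defuzzification. The partition counts, lag features and synthetic series are illustrative assumptions.

```python
import numpy as np

def centers(lo, hi, n):
    return np.linspace(lo, hi, n)        # centers of n triangular regions

def memberships(x, c):
    step = c[1] - c[0]
    return np.clip(1 - np.abs(x - c) / step, 0, 1)   # triangular memberships

def wm_train(X, y, cx, cy):
    """One rule per sample; keep the highest-degree rule per antecedent."""
    rules = {}                           # antecedent regions -> (degree, consequent)
    for xi, yi in zip(X, y):
        mus = [memberships(xi[k], cx[k]) for k in range(X.shape[1])]
        ante = tuple(int(np.argmax(m)) for m in mus)
        mu_y = memberships(yi, cy)
        deg = np.prod([m.max() for m in mus]) * mu_y.max()
        cons = int(np.argmax(mu_y))
        if ante not in rules or rules[ante][0] < deg:
            rules[ante] = (deg, cons)    # conflict resolution by rule degree
    return rules

def wm_predict(x, rules, cx, cy):
    num = den = 0.0
    for ante, (_, cons) in rules.items():
        w = np.prod([memberships(x[k], cx[k])[j] for k, j in enumerate(ante)])
        num += w * cy[cons]              # centroid defuzzification
        den += w
    return num / den if den > 0 else np.nan

# Toy usage: predict next-hour dissolved oxygen from two previous readings.
rng = np.random.default_rng(0)
do = 8 + np.sin(np.arange(300) / 10) + rng.normal(0, 0.1, 300)   # synthetic DO
X = np.column_stack([do[:-2], do[1:-1]]); y = do[2:]
cx = [centers(do.min(), do.max(), 7)] * 2
cy = centers(do.min(), do.max(), 7)
rules = wm_train(X, y, cx, cy)
print(len(rules), "rules; prediction:", wm_predict(X[-1], rules, cx, cy))
```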

Keywords: Dissolved oxygen, Au Sable, fuzzy logic modeling, Wang Mendel.

5921 Influential Parameters in Estimating Soil Properties from Cone Penetrating Test: An Artificial Neural Network Study

Authors: Ahmed G. Mahgoub, Dahlia H. Hafez, Mostafa A. Abu Kiefa

Abstract:

The Cone Penetration Test (CPT) is a common in-situ test which generally investigates a much greater volume of soil, more quickly, than is possible with sampling and laboratory tests. It therefore has the potential both to realize cost savings and to assess soil properties rapidly and continuously. The principal objective of this paper is to demonstrate the feasibility and efficiency of using artificial neural networks (ANNs) to predict the soil angle of internal friction (Φ) and the soil modulus of elasticity (E) from CPT results, considering the uncertainties and non-linearities of the soil. In addition, ANNs are used to study the influence of different parameters and to recommend which should be included as inputs to improve the prediction. Neural networks discover relationships in the input data sets through the iterative presentation of the data and the intrinsic mapping characteristics of neural topologies. The General Regression Neural Network (GRNN), one of the more powerful neural network architectures, is utilized in this study. A large amount of field and experimental data, including CPT results, plate load tests, direct shear box tests, grain size distributions and calculated overburden pressures, was obtained from a large project in the United Arab Emirates and used for the training and validation of the neural network. A comparison was made between the results obtained from the ANN approach, some common traditional correlations that predict Φ and E from CPT results, and the actual measurements. The results show that the ANN is a very powerful tool: very good agreement was obtained between the results estimated by the ANN and the actual measured results, in comparison with other correlations available in the literature. The study recommends some easily available parameters that should be included in the estimation of the soil properties to improve the prediction models. It is shown that the use of friction ratio in the estimation of Φ and the use of fines content in the estimation of E considerably improve the prediction models.
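
A sketch of the GRNN at the core of this approach: it is a Gaussian-kernel-weighted average of the training targets (a Nadaraya-Watson estimator), with the bandwidth sigma as its only tuning parameter. The CPT feature names and synthetic data below are illustrative assumptions.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.3):
    """General Regression Neural Network (Specht, 1991): the prediction is a
    Gaussian-kernel-weighted average of the training targets."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# Toy usage: predict friction angle Phi from normalized CPT features
# (cone resistance qc, friction ratio Rf) on synthetic stand-in data.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))
phi = 25 + 15 * X[:, 0] - 5 * X[:, 1] + rng.normal(0, 0.5, 200)
Xq = np.array([[0.6, 0.3]])
print(grnn_predict(X, phi, Xq))
```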

Keywords: Angle of internal friction, Cone penetrating test, General regression neural network, Soil modulus of elasticity.

5920 Direct Measurements of Wind Data over 100 Meters above the Ground in the Site of Lendinara, Italy

Authors: A. Dal Monte, M. Raciti Castelli, G. B. Bellato, L. Stevanato, E. Benini

Abstract:

The wind resource at the Italian site of Lendinara (RO) is analyzed through a systematic anemometric campaign performed on the top of the bell tower, at an altitude of over 100 m above the ground. Both the average wind speed and the Weibull distribution are computed. The resulting average wind velocity is in accordance with the numerical predictions of the Italian Wind Atlas, confirming the accuracy of the extrapolation of wind data adopted for evaluating wind potential at altitudes greater than those of commonly placed measurement stations.
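
A sketch of the two computations the campaign feeds: the mean wind speed and a Weibull fit of the speed sample; the synthetic 10-minute averages stand in for real logger data.

```python
import numpy as np
from scipy import stats

# Stand-in 10-min average wind speeds (m/s); a real campaign would use
# anemometer logger data.
v = stats.weibull_min.rvs(c=2.0, scale=6.5, size=5000, random_state=0)

v_mean = v.mean()
k, loc, c = stats.weibull_min.fit(v, floc=0)   # shape k, scale c (location fixed)
print(f"mean speed = {v_mean:.2f} m/s, Weibull shape k = {k:.2f}, scale c = {c:.2f} m/s")
```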

Keywords: Anemometric campaign, wind resource, Weibull distribution, wind atlas

5919 A Numerical Description of a Fibre Reinforced Concrete Using a Genetic Algorithm

Authors: Henrik L. Funke, Lars Ulke-Winter, Sandra Gelbrich, Lothar Kroll

Abstract:

This work reports an approach for the automatic adaptation of concrete formulations based on genetic algorithms (GA) to optimize a wide range of different fitness functions. In order to achieve this goal, a method was developed which provides a numerical description of a fibre reinforced concrete (FRC) mixture regarding the production technology and the property spectrum of the concrete. In a first step, an FRC mixture with seven fixed components was characterized by varying the amounts of the components. For that purpose, ten concrete mixtures were prepared and tested. The testing procedure comprised flow spread, compressive strength and bending tensile strength. The analysis and approximation of the measured data were carried out by GAs. The aim was to obtain a closed mathematical expression which best describes the given seven-point cloud of FRC properties by applying a Gene Expression Programming with Free Coefficients (GEP-FC) strategy. The seven-parametric FRC-mixture model generated according to this method correlated well with the measured data. The developed procedure can be used for finding closed mathematical expressions for concrete mixtures, based on measured data.

Keywords: Concrete design, fibre reinforced concrete, genetic algorithms, GEP-FC.

5918 Applications of Stable Distributions in Time Series Analysis, Computer Sciences and Financial Markets

Authors: Mohammad Ali Baradaran Ghahfarokhi, Parvin Baradaran Ghahfarokhi

Abstract:

In this paper, we first introduce stable distributions, stable processes and their characteristics. The α-stable distribution family has received great interest in the last decade due to its success in modeling data which are too impulsive to be accommodated by the Gaussian distribution. In the second part, we present major applications of the alpha-stable distribution in telecommunications, in computer science (such as network delays and signal processing) and in financial markets. At the end, we focus on using stable distributions to estimate measures of risk in stock markets and show simulated data with statistical software.
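
A sketch of the final application: simulate heavy-tailed returns with SciPy's levy_stable and read the Value-at-Risk off the empirical quantile; the parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import levy_stable

# Alpha-stable daily returns: alpha < 2 gives heavy tails (alpha = 2 is Gaussian).
alpha, beta = 1.7, 0.0
r = levy_stable.rvs(alpha, beta, loc=0.0, scale=0.01, size=100_000, random_state=0)

var_99 = -np.quantile(r, 0.01)     # 99% one-day Value-at-Risk (loss quantile)
print(f"VaR(99%) = {var_99:.4f}")
```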

Keywords: stable distribution, SaS, infinite variance, heavy tail networks, VaR.

5917 Model-Free Distributed Control of Dynamical Systems

Authors: Javad Khazaei, Rick S. Blum

Abstract:

Distributed control is an efficient and flexible approach for the coordination of multi-agent systems. One of the main challenges in designing a distributed controller is identifying the governing dynamics of the dynamical systems. Data-driven system identification is currently undergoing a revolution: with the availability of high-fidelity measurements and historical data, model-free identification of dynamical systems can facilitate control design without tedious modeling of high-dimensional and/or nonlinear systems. This paper develops a distributed control design for linear and nonlinear dynamical systems, using consensus theory together with sparse identification of the system dynamics. Compared with existing consensus designs that rely heavily on knowing the detailed system dynamics, the proposed model-free design can accurately capture the dynamics of the system from available measurement and input data and provide guaranteed performance in consensus and tracking problems. Heterogeneous damped oscillators are chosen as example dynamical systems for validation purposes.
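
A sketch of the sparse-identification step (SINDy-style sequentially thresholded least squares) on a single damped oscillator; the candidate library, threshold and simulation settings are illustrative choices, not the authors' setup.

```python
import numpy as np

# Simulate a damped oscillator x'' = -omega^2 x - 2*zeta*omega*x' as data.
dt, omega, zeta = 0.01, 2.0, 0.1
t = np.arange(0, 10, dt)
X = np.zeros((len(t), 2)); X[0] = [1.0, 0.0]          # state: [x, v]
for k in range(len(t) - 1):                           # simple Euler integration
    x, v = X[k]
    X[k + 1] = [x + dt * v, v + dt * (-omega**2 * x - 2 * zeta * omega * v)]

dX = np.gradient(X, dt, axis=0)                       # numerical derivatives

# Candidate library of nonlinear terms: [1, x, v, x^2, x*v, v^2].
x, v = X[:, 0], X[:, 1]
Theta = np.column_stack([np.ones_like(x), x, v, x * x, x * v, v * v])

def stlsq(Theta, dX, thresh=0.05, iters=10):
    """Sequentially thresholded least squares: refit, zero small coefficients."""
    Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(iters):
        Xi[np.abs(Xi) < thresh] = 0.0
        for j in range(dX.shape[1]):
            big = np.abs(Xi[:, j]) >= thresh
            if big.any():
                Xi[big, j] = np.linalg.lstsq(Theta[:, big], dX[:, j], rcond=None)[0]
    return Xi

Xi = stlsq(Theta, dX)
print(np.round(Xi, 3))   # recovers x' = v and v' = -omega^2 x - 2 zeta omega v
```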

Keywords: Consensus tracking, distributed control, model-free control, sparse identification of dynamical systems.
