Search results for: network information criterion
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14744

13904 Study of ANFIS and ARIMA Model for Weather Forecasting

Authors: Bandreddy Anand Babu, Srinivasa Rao Mandadi, C. Pradeep Reddy, N. Ramesh Babu

Abstract:

This paper briefly presents a comparative investigation of Auto-Regressive Integrated Moving Average (ARIMA) and Adaptive Network-Based Fuzzy Inference System (ANFIS) models for weather forecasting. The weather data are taken from the University of Waterloo and comprise relative humidity, ambient air temperature, barometric pressure, and wind direction. The performances of the ARIMA and ANFIS models are analysed and compared using the sum of average errors. The ANFIS model is implemented with the Fuzzy Logic Toolbox in MATLAB, while the ARIMA model is developed using XLSTAT.
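
As a rough illustration of the ARIMA half of this comparison, the sketch below fits an ARIMA model to an hourly temperature series and scores a one-day hold-out forecast; the file name, column names, and (p, d, q) order are assumptions, not the authors' settings.

```python
# Minimal sketch (not the authors' code): fit ARIMA to an hourly temperature
# series and score a one-day hold-out forecast with mean absolute error.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from sklearn.metrics import mean_absolute_error

# "weather.csv", "timestamp" and "temperature" are illustrative assumptions.
df = pd.read_csv("weather.csv", parse_dates=["timestamp"], index_col="timestamp")
series = df["temperature"].asfreq("H").interpolate()

train, test = series[:-24], series[-24:]            # hold out the last 24 hours
model = ARIMA(train, order=(2, 1, 2)).fit()          # (p, d, q) chosen for illustration
forecast = model.forecast(steps=len(test))

print("MAE over the hold-out day:", mean_absolute_error(test, forecast))
```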

Keywords: ARIMA, ANFIS, fuzzy inference toolbox, weather forecasting, MATLAB

Procedia PDF Downloads 402
13903 Navigating Government Finance Statistics: Effortless Retrieval and Comparative Analysis through Data Science and Machine Learning

Authors: Kwaku Damoah

Abstract:

This paper presents a methodology and software application (App) designed to empower users in accessing, retrieving, and comparatively exploring data within the hierarchical network framework of the Government Finance Statistics (GFS) system. It explores the ease of navigating the GFS system and identifies the gaps filled by the new methodology and App. The GFS embodies a complex Hierarchical Network Classification (HNC) structure, encapsulating institutional units, revenues, expenses, assets, liabilities, and economic activities. Navigating this structure demands specialized knowledge, experience, and skill, posing a significant challenge for effective analytics and fiscal policy decision-making. Many professionals encounter difficulties deciphering these classifications, hindering confident utilization of the system. This accessibility barrier obstructs a vast number of professionals, students, policymakers, and the public from leveraging the abundant data and information within the GFS. Leveraging the R programming language, data science analytics and machine learning, an efficient methodology was developed that enables users to access, navigate, and conduct exploratory comparisons. The machine learning Fiscal Analytics App (FLOWZZ) democratizes access to advanced analytics through its user-friendly interface, breaking down expertise barriers.

Keywords: data science, data wrangling, drilldown analytics, government finance statistics, hierarchical network classification, machine learning, web application

Procedia PDF Downloads 49
13902 A Collective Intelligence Approach to Safe Artificial General Intelligence

Authors: Craig A. Kaplan

Abstract:

If AGI proves to be a “winner-take-all” scenario where the first company or country to develop AGI dominates, then the first AGI must also be the safest. The safest, and fastest, path to Artificial General Intelligence (AGI) may be to harness the collective intelligence of multiple AI and human agents in an AGI network. This approach has roots in seminal ideas from four of the scientists who founded the field of Artificial Intelligence: Allen Newell, Marvin Minsky, Claude Shannon, and Herbert Simon. Extrapolating key insights from these founders of AI, and combining them with the work of modern researchers, results in a fast and safe path to AGI. The seminal ideas discussed are: 1) Society of Mind (Minsky), 2) Information Theory (Shannon), 3) Problem Solving Theory (Newell & Simon), and 4) Bounded Rationality (Simon). Society of Mind describes a collective intelligence approach that can be used with AI and human agents to create an AGI network. Information theory helps address the critical issue of how an AGI system will increase its intelligence over time. Problem Solving Theory provides a universal framework that AI and human agents can use to communicate efficiently, effectively, and safely. Bounded Rationality helps us better understand not only the capabilities of SuperIntelligent AGI but also how humans can remain relevant in a world where the intelligence of AGI vastly exceeds that of its human creators. Each key idea can be combined with recent work in the fields of Artificial Intelligence, Machine Learning, and Large Language Models to accelerate the development of a working, safe, AGI system.

Keywords: AI Agents, Collective Intelligence, Minsky, Newell, Shannon, Simon, AGI, AGI Safety

Procedia PDF Downloads 68
13901 Smart Forms and Intelligent Transportation Network Patterns: An Integrated Spatial Approach to Smart Cities and Intelligent Transport Systems in Indian Cities

Authors: Geetanjli Rani

Abstract:

The physical form and network pattern of the city are expected to be enhanced with the advancement of technology, as the era of virtualisation brings the digital urban realm into convergence with physical development. By means of comparative spatial graphics and visuals of cities, the present paper revisits the very basis of efficient physical forms and patterns to synchronise with the emergence of virtual activities. The present approach to integrating the spatial smartness of cities and Intelligent Transportation Systems is thus a brief assessment of smart forms and intelligent transportation network patterns against the dualism of physical and virtual urban activities. The research finds that grid-iron, radial, ring-radial, and orbital patterns are more efficient, effective, economical and transit-friendly for users, for resource optimisation, and for compact urban and regional systems. Moreover, this paper concludes that the ideas of flow and contiguity embedded in such smart forms and intelligent transportation network patterns suit the layering, deployment, installation and development of Intelligent Transportation Systems in Smart Cities, including their infrastructure, facilities and services.

Keywords: smart form, smart infrastructure, intelligent transportation network pattern, physical and virtual integration

Procedia PDF Downloads 143
13900 The Integration Challenges of Women Refugees in Sweden from Socio-Cultural Perspective

Authors: Khadijah Saeed Khan

Abstract:

One of the major current societal issues of Swedish society is to integrate newcomer refugees well into the host society. Cultural integration is one of the less debated topics in the literature, and this study intends to address this gap from the Swedish perspective. The purpose of this study is to explore the role and types of cultural landscapes of refugee women in Sweden and how these landscapes help or hinder the settlement process. The cultural landscapes are understood as a set of multiple cultural activities or practices which refugees perform in a specific context and circumstances (i.e., being in a new country) to seek, share or use relevant information for their settlement. Information plays a vital role in various aspects of newcomers' lives in a new country. This article intends to highlight the importance of multiple cultural landscapes as a source of information (regarding employment, language learning, finding accommodation, immigration matters, health concerns, school and education, family matters, and other everyday matters) for refugees settling down in Sweden. Relevant theories, such as information landscapes and socio-cultural theories, are considered in this study. A qualitative research design is employed, including semi-structured in-depth interviews and participatory observation with 20 participants. The initial findings show that the refugee women encounter many information-related and integration-related challenges in Sweden and have built a network of cultural landscapes in which they practice various co-ethnic cultural and religious activities at different times of the year. These landscapes help them to build a sense of belonging with people from their own or similar lands and assist them in seeking and sharing relevant information in everyday life in Sweden.

Keywords: cultural integration, cultural landscapes, information, women refugees

Procedia PDF Downloads 126
13899 Quantifying Stability of Online Communities and Its Impact on Disinformation

Authors: Victor Chomel, Maziyar Panahi, David Chavalarias

Abstract:

Misinformation has taken an increasingly worrying place in social media. Propagation patterns are closely linked to the structure of communities. This study proposes a method of community analysis based on a combination of centrality indicators for the network and its main communities. The objective is to establish a link between the stability of the communities over time, the social ascension of members within them, and the propagation of information in the community. To this end, data from the debates about global warming and political communities on Twitter have been collected, and several tens of millions of tweets and retweets have helped us better understand the structure of these communities. The quantification of this stability allows for the study of the propagation of information of any kind, including disinformation. Our results indicate that the most stable communities over time are the ones that enable the establishment of nodes capturing a large part of the information and broadcasting their opinions. Conversely, communities with a high turnover and social ascendancy only stabilize strongly in the face of adversity and external events but seem to offer a greater diversity of opinions most of the time.
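
As a toy illustration of the community-and-centrality viewpoint described above (not the authors' pipeline), the sketch below detects communities in a small retweet graph and measures how much of each community's attention is captured by its most central node; the edge list and the concentration measure are assumptions.

```python
# Illustrative sketch: per-community in-degree concentration on a retweet graph.
import networkx as nx

edges = [("a", "b"), ("c", "b"), ("d", "b"), ("e", "f"), ("f", "e"), ("g", "e")]
G = nx.DiGraph(edges)  # edge u -> v means "u retweeted v"

communities = nx.community.louvain_communities(G.to_undirected(), seed=0)
for i, members in enumerate(communities):
    if len(members) < 2:
        continue
    sub = G.subgraph(members)
    indeg = nx.in_degree_centrality(sub)
    hub = max(indeg, key=indeg.get)
    concentration = indeg[hub] / max(sum(indeg.values()), 1e-9)
    print(f"community {i}: size={len(members)}, hub={hub}, "
          f"share of in-degree captured by hub={concentration:.2f}")
```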

Keywords: community analysis, disinformation, misinformation, Twitter

Procedia PDF Downloads 125
13898 Ontology-Based Backpropagation Neural Network Classification and Reasoning Strategy for NoSQL and SQL Databases

Authors: Hao-Hsiang Ku, Ching-Ho Chi

Abstract:

Big data applications have become an imperative for many fields, and many researchers have been devoted to increasing correct classification rates and reducing time complexity. Hence, this study designs and proposes an ontology-based backpropagation neural network classification and reasoning strategy for NoSQL big data applications, called ON4NoSQL. ON4NoSQL is responsible for enhancing the performance of classification in NoSQL and SQL databases in order to build mass behavior models. Mass behavior models are built with MapReduce techniques and the Hadoop Distributed File System on a Hadoop service platform. The reference engine of ON4NoSQL is the ontology-based backpropagation neural network classification and reasoning strategy. Simulation results indicate that ON4NoSQL can efficiently construct a high-performance environment for data storing, searching, and retrieving.

Keywords: Hadoop, NoSQL, ontology, backpropagation neural network, Hadoop distributed file system

Procedia PDF Downloads 247
13886 Comparative Study on MANET Using Soft Computing Techniques

Authors: Amarjit Singh, Tripatdeep Singh Dua, Vikas Attri

Abstract:

A Mobile Ad-hoc Network (MANET) is a collection of nodes that dynamically form a network without relying on any base infrastructure. In such a network, mobile nodes depend on one another to send data: a mobile host can pick up data and forward it along the path to its destination. The usefulness of a MANET largely depends on the Quality of Service (QoS) it offers to users, and improving QoS is therefore a key requirement, one that soft computing techniques are increasingly used to address. Depending on the specific requirement, various protocols and algorithms based on soft computing can be considered. In this paper, we provide a comparative review of existing work on MANETs using various kinds of soft computing techniques, organised around the specific protocol or algorithm each work uses to address QoS needs. We first discuss the various routing protocols used in MANETs. The second section clarifies the concepts and types of soft computing. The third section reviews prior work on MANETs using different soft computing techniques. The fourth section examines the QoS requirements that arise in MANETs and compares the protocols used in earlier work, and we conclude with the metrics relevant to combining MANETs with soft computing techniques.

Keywords: mobile ad-hoc network, fuzzy improved genetic approach, neural network, routing protocol, wireless mesh network

Procedia PDF Downloads 332
13896 Information Needs and Information Usage of Older Person Clubs' Members in Bangkok

Authors: Siriporn Poolsuwan

Abstract:

This research aims to explore the information needs, information usage, and problems of information usage of older person club members in Dusit District, Bangkok. There are 12 clubs and 746 club members in this district, and the research results are intended to support services for older people there. Data were gathered from 252 club members using questionnaires. A quantitative approach was used, with data analysed by percentage, mean, and standard deviation. The results are as follows: (1) the older people need information for entertainment, occupational, and academic purposes, particularly in the areas of short stories, computer work, and religion and morality; (2) the participants use information from various sources; (3) the main problem of information usage is language skill, owing to literacy problems among the older people.

Keywords: information behavior, older person, information seeking, knowledge discovery and data mining

Procedia PDF Downloads 255
13895 A Failure Criterion for Unsupported Boreholes in Poorly Cemented Granular Formations

Authors: Sam S. Hashemi

Abstract:

The breakage of bonding between sand particles and their dislodgment from the borehole wall are among the main factors resulting in a borehole failure in poorly cemented granular formations. The grain debonding usually precedes the borehole failure and it can be considered as a sign that the onset of the borehole collapse is imminent. Detecting the bonding breakage point and introducing an appropriate failure criterion will play an important role in borehole stability analysis. To study the influence of different factors on the initiation of sand bonding breakage at the borehole wall, a series of laboratory tests was designed and conducted on poorly cemented sand samples. The total absorbed strain energy per volume of material up to the point of the observed particle debonding was computed. The results indicated that the particle bonding breakage point at the borehole wall was reached both before and after the peak strength of the thick-walled hollow cylinder specimens depending on the stress path and cement content. Three different cement contents and two borehole sizes were investigated to study the influence of the bonding strength and scale on the particle dislodgment. Test results showed that the stress path has a significant influence on the onset of the sand bonding breakage. It was shown that for various stress paths, there is a near linear relationship between the absorbed energy and the normal effective mean stress.

Keywords: borehole stability, experimental studies, poorly cemented sands, total absorbed strain energy

Procedia PDF Downloads 192
13894 Jurisdictional Federalism and Formal Federalism: Levels of Political Centralization on American and Brazilian Models

Authors: Henrique Rangel, Alexandre Fadel, Igor De Lazari, Bianca Neri, Carlos Bolonha

Abstract:

This paper promotes a comparative analysis of the American and Brazilian models of federalism, taking their levels of political centralization as the main criterion. The central problem faced herein is the Brazilian tendency toward a unitary regime. Despite the hegemony of the federative form after 1989, Brazil has a historical pattern of political centralization that remains under the 1988 constitutional regime. Meanwhile, the United States framed a federalism in which the states retain significant authority. The hypothesis holds that the amount of alternative criteria for federalization, which can generate political centralization, and the way they are upheld on judicial review, are crucial to understanding the levels of political centralization achieved in each model. To test this hypothesis, the research follows a methodology temporally delimited to the 1994-2014 period. Three paradigmatic precedents of the U.S. Supreme Court were selected: United States vs. Morrison (2000), on gender-motivated violence; Gonzales vs. Raich (2005), on medical use of marijuana; and United States vs. Lopez (1995), on firearm possession in school zones. These most relevant federalism cases in the recent activity of the Supreme Court indicate a determinant parameter of deliberation: the commerce clause. After observing the criteria used to permit or prohibit political centralization in America, the Brazilian normative context is presented. In this sense, it is possible to identify the legal treatment these controversies could receive in that country. The decision-making reveals deliberative parameters which characterize each federative model. The precedents of the Rehnquist Court promote a broad revival of the federalism debate, establishing the commerce clause as a secure criterion for upholding or not the necessity of centralization, even with decisions considered conservative. By contrast, Brazilian federalism resolves such controversies in a formalist fashion, through numerous and comprehensive, sometimes casuistic, normative devices oriented toward intense centralization. The aim of this work is to indicate how the jurisdictional federalism found in the United States can preserve a consistent model with robustly autonomous states, while Brazil gives preference to normative mechanisms designed to start from centralization.

Keywords: constitutional design, federalism, U.S. Supreme Court, legislative authority

Procedia PDF Downloads 502
13893 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutron Sources

Authors: Mustafa Alhamdi

Abstract:

Industrial application of deep machine learning to classify gamma rays and neutron events is investigated in this study. Identification using a convolutional neural network and a recursive neural network has shown significant improvement in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on feature extraction methods, followed by classification. The features extracted from the spectrum profiles try to find patterns and relationships to represent the actual spectrum energy in a low-dimensional space. Increasing the level of separation between classes in feature space improves the possibility of enhancing classification accuracy. Feature extraction by a neural network is nonlinear and involves a variety of transformations and mathematical optimization, while principal component analysis depends on linear transformations to extract features and subsequently improve classification accuracy. In this paper, the isotope spectrum information has been preprocessed by finding the frequency components over time and using them as a training dataset. The Fourier transform implementation used to extract the frequency components has been optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal have been simulated using Geant4. The readout electronic noise has been simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, by combining the votes of many models, managed to improve the classification accuracy of the neural networks. A single prediction approach using deep machine learning has shown high accuracy in discriminating gamma and neutron events. The paper's findings show the ability to improve classification accuracy by applying the spectrogram preprocessing stage to the gamma and neutron spectra of different isotopes. Tuning the deep machine learning models by hyperparameter optimization of the neural network models enhanced the separation in the latent space and provided the ability to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to improving the final prediction.
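
A minimal sketch of the preprocessing stage described above is given below; the sampling rate, pulse model, and window settings are illustrative assumptions rather than the authors' configuration.

```python
# Sketch: turn a simulated detector pulse into a windowed time-frequency
# representation that can be fed to a classifier as a feature vector.
import numpy as np
from scipy.signal import spectrogram

fs = 1_000_000                      # assumed sampling rate in Hz
t = np.arange(0, 0.01, 1 / fs)
pulse = np.exp(-t * 2e3) * np.sin(2 * np.pi * 50e3 * t)   # toy pulse shape
noisy = pulse + np.random.normal(0.0, 0.01, size=t.size)  # assumed readout noise

# Hann window keeps spectral leakage low; nperseg/noverlap are illustrative.
f, times, Sxx = spectrogram(noisy, fs=fs, window="hann", nperseg=256, noverlap=128)
features = np.log1p(Sxx).flatten()  # one training vector per simulated event
print(features.shape)
```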

Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification

Procedia PDF Downloads 135
13892 Multilayer Perceptron Neural Network for Rainfall-Water Level Modeling

Authors: Thohidul Islam, Md. Hamidul Haque, Robin Kumar Biswas

Abstract:

Floods are one of the deadliest natural disasters and very complex to model; however, machine learning is opening the door to more reliable and accurate flood prediction. In this research, a multilayer perceptron neural network (MLP) is developed to model the rainfall-water level relation in a subtropical monsoon climatic region of the Bangladesh-India border. Our experiments show promising empirical results for forecasting the water level with a 1-day lead time. Our best performing MLP model achieves a 98.7% coefficient of determination with lower model complexity, which surpasses previously reported results on similar forecasting problems.
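
The sketch below shows one way such a rainfall-to-water-level MLP could be set up in Python (not the authors' model); the file name, column names, lag choices, and network size are assumptions.

```python
# Sketch: an MLP mapping recent rainfall and water-level lags to the next-day
# water level, scored by the coefficient of determination on held-out data.
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

df = pd.read_csv("station_data.csv")          # hypothetical daily records
for lag in (1, 2, 3):
    df[f"rain_lag{lag}"] = df["rainfall"].shift(lag)
    df[f"level_lag{lag}"] = df["water_level"].shift(lag)
df = df.dropna()

X = df[[c for c in df.columns if "lag" in c]]
y = df["water_level"]                          # 1-day-ahead target given lagged inputs
X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.2)

mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(X_tr, y_tr)
print("R^2 on held-out data:", r2_score(y_te, mlp.predict(X_te)))
```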

Keywords: flood forecasting, machine learning, multilayer perceptron network, regression

Procedia PDF Downloads 157
13891 Minimization of Denial of Service Attacks in Vehicular Ad Hoc Networking by Applying Different Constraints

Authors: Amjad Khan

Abstract:

The security of vehicular ad hoc networking is of great importance, as it involves serious threats to life. Thus, to provide secure communication amongst vehicles on the road, conventional security systems are not enough. It is necessary to prevent network resources from being wasted and to protect them against malicious nodes, so as to ensure data and bandwidth availability to the legitimate nodes of the network. This work provides a non-conventional security scheme by introducing constraints that minimize denial of service (DoS), especially of data and bandwidth. The data packets received by a node in the network pass through a number of tests, and if any of the tests fails, the node drops those packets and does not forward them any further. Also, if a node claims to be the nearest node for forwarding emergency messages, the sender can effectively verify the truth or falsity of the claim using these constraints. Consequently, the DoS attack is minimized by the instant availability of data without wasting network resources.
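
The toy sketch below (our own illustration, not the authors' protocol) shows the general idea of a packet passing a series of constraint tests before being forwarded, with a drop on any single failure; the field names and the specific constraints are assumptions.

```python
# Toy packet filter: forward only if every constraint test passes.
MAX_HOPS = 10
MAX_PAYLOAD = 512          # bytes
NEIGHBOUR_TABLE = {"veh_17", "veh_23", "veh_41"}   # assumed known neighbours

def constraint_tests(packet: dict) -> bool:
    tests = [
        packet.get("hop_count", 0) <= MAX_HOPS,          # limit flooding
        len(packet.get("payload", b"")) <= MAX_PAYLOAD,  # limit bandwidth use
        packet.get("sender") in NEIGHBOUR_TABLE,         # sender must be a known neighbour
        packet.get("timestamp", 0) > 0,                  # reject malformed timestamps
    ]
    return all(tests)

def handle_packet(packet: dict) -> str:
    return "forward" if constraint_tests(packet) else "drop"

print(handle_packet({"sender": "veh_17", "hop_count": 3, "payload": b"x" * 40, "timestamp": 12.5}))
print(handle_packet({"sender": "veh_99", "hop_count": 3, "payload": b"x" * 40, "timestamp": 12.5}))
```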

Keywords: black hole attack, grey hole attack, intransient traffic tampering, networking

Procedia PDF Downloads 268
13890 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is not far-fetched, but proper classification of this textual information in a given context has been very difficult. As a result, we conducted a systematic review of previous literature on sentiment classification and the AI-based techniques that have been used, in order to gain a better understanding of how to design and develop a robust and more accurate sentiment classifier that can correctly classify social media textual information of a given context as hate speech or inverted compliments with a high level of accuracy, by assessing different artificial intelligence techniques. We evaluated over 250 articles from digital sources like ScienceDirect, ACM, Google Scholar, and IEEE Xplore and whittled the number of studies down to 31. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of performance accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources like Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques like CNN+LSTM, CNN+GRU, and CNN+BERT outperformed single deep learning techniques and machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development due to its simplicity and AI-based library functionalities. Based on some of the important findings from this study, we make recommendations for future research.

Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text

Procedia PDF Downloads 102
13889 Oil Reservoir Asphaltene Precipitation Estimation during CO2 Injection

Authors: I. Alhajri, G. Zahedi, R. Alazmi, A. Akbari

Abstract:

In this paper, an Artificial Neural Network (ANN) was developed to predict Asphaltene Precipitation (AP) during the injection of carbon dioxide into crude oil reservoirs. Experimental data from six different oil fields were collected. Seventy percent of the data was used to develop the ANN model, and different ANN architectures were examined. A network trained with the trainlm algorithm was found to be the best network for estimating the AP. To check the validity of the proposed model, it was used to predict the AP for the remaining thirty percent of the data, which had not been used in training. The Mean Square Error (MSE) of the prediction was 0.0018, which confirms the excellent prediction capability of the proposed model. In the second part of this study, the ANN model predictions were compared with modified Hirschberg model predictions, and the ANN was found to provide more accurate estimates. Finally, the proposed model was employed to examine the effect of different operating parameters during gas injection on the AP. It was found that the AP is most sensitive to the reservoir temperature. Furthermore, increasing the carbon dioxide concentration in the liquid phase increases the AP.
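
The sketch below mirrors this workflow in outline, under assumed feature names and data (not the field data or the exact network of the paper): a 70/30 split, an MSE check on the held-out data, and a simple one-at-a-time sensitivity probe on temperature.

```python
# Sketch: train/test split, MSE evaluation, and a temperature sensitivity probe.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

df = pd.read_csv("asphaltene_experiments.csv")       # hypothetical data set
features = ["temperature", "pressure", "co2_mole_fraction", "oil_gravity"]
X_tr, X_te, y_tr, y_te = train_test_split(
    df[features], df["asphaltene_wt_percent"], train_size=0.7, random_state=1)

ann = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=1)
ann.fit(X_tr, y_tr)
print("test MSE:", mean_squared_error(y_te, ann.predict(X_te)))

# One-at-a-time sensitivity check: vary temperature around average conditions.
probe = X_te.mean().to_frame().T
for temp in np.linspace(df["temperature"].min(), df["temperature"].max(), 5):
    probe["temperature"] = temp
    print(f"T={temp:.1f}: predicted AP={ann.predict(probe[features])[0]:.3f}")
```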

Keywords: artificial neural network, asphaltene, CO2 injection, Hirschberg model, oil reservoirs

Procedia PDF Downloads 353
13888 Resilience with Spontaneous Volunteers in Disasters: Coordination Using an IT System

Authors: Leo Latasch, Mario Di Gennaro

Abstract:

Introduction: The goal of this project was to increase the resilience of the population as well as of rescue organizations, to achieve both quality and time-related improvements in handling crises. A helper network was created for this purpose. Methods: Social questions regarding the structure and purpose of helper networks were considered, specifically with regard to helper motivation, the level of commitment, and collaboration between the population and agencies. The exchange of information, the coordinated use of volunteers, and the distribution of available resources will be ensured through defined communication and cooperation routines. Helper smartphones will also be used to provide a picture of the situation on the ground. Results: The helper network was established and deployed based on the RESIBES information technology system. It consists of a service platform, a web portal, and a smartphone app. The service platform is the central element for collaboration between the various rescue organizations, as well as for persons, associations, and companies from the population offering voluntary aid. The platform was used for registering helpers and resources and then requesting and assigning them in case of a disaster. These services allow the population's resources to be organized. The service platform also allows for a secure data exchange between services and external systems. Conclusions: The social and technical work priorities have allowed us to cover a full cycle of advance structural work, gaining an overview, damage management, evaluation, and feedback on experiences. This cycle allows experiences gained while handling a crisis to feed back into the cycle and improve preparations and management strategies.

Keywords: coordination, disaster, resilience, volunteers

Procedia PDF Downloads 121
13887 Optimizing the Public Policy Information System under the Environment of E-Government

Authors: Qian Zaijian

Abstract:

E-government is one of the hot issues in current academic research on public policy and management. As the organic integration of information and communication technology (ICT) and public administration, e-government is one of the most important areas in the contemporary information society. The policy information system is a basic subsystem of the public policy system; its operation affects the overall effect of the policy process and may even exert a direct impact on the operation of a public policy and its success or failure. The basic principle of its operation is information collection, processing, analysis, and release for a specific purpose. The function of e-government for the public policy information system lies in promoting public access to policy information resources, information transmission through e-participation, e-consultation in the process of policy analysis and information processing, and electronic services for stored policy information, so as to promote the optimization of policy information systems. However, due to many factors, the ability of e-government to promote the optimization of policy information systems has practical limits. In building e-government in our country, we should follow such paths as adhering to the principle of freedom of information, eliminating the information divide (gap), expanding e-consultation, and breaking down information silos, so as to promote the optimization of public policy information systems.

Keywords: China, e-consultation, e-democracy, e-government, e-participation, ICTs, public policy information systems

Procedia PDF Downloads 839
13886 Improving the Statistics Nature in Research Information System

Authors: Rajbir Cheema

Abstract:

An integrated research information system provides scientific institutions with the necessary information on research activities and research results in assured quality. Since problems such as duplication, missing values, incorrect formatting, and inconsistencies can arise when research data are collected in different research information systems, with a wide range of negative effects on data quality, the subject of data quality should be treated with care. This paper examines the data quality problems in research information systems and presents new techniques that enable organizations to improve the quality of their research information.
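
As an illustration of the kinds of cleansing steps named above (duplicates, missing values, inconsistent formatting), the sketch below applies them with pandas to a hypothetical export of publication records; the file name and column names are assumptions.

```python
# Sketch: normalise formatting, remove duplicates, and report remaining gaps.
import pandas as pd

records = pd.read_csv("ris_export.csv", dtype=str)        # assumed file name

# 1. Normalise formatting before comparing records.
records["doi"] = records["doi"].str.strip().str.lower()
records["year"] = pd.to_numeric(records["year"], errors="coerce")

# 2. Remove exact duplicates, then DOI-based duplicates from multiple sources
#    (rows without a DOI are kept as-is rather than collapsed together).
records = records.drop_duplicates()
has_doi = records["doi"].notna()
records = pd.concat([records[has_doi].drop_duplicates(subset=["doi"]),
                     records[~has_doi]])

# 3. Flag missing values that need manual curation rather than silent deletion.
missing_report = records[["title", "doi", "year"]].isna().sum()
print("fields still missing values:\n", missing_report)
```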

Keywords: Research information systems (RIS), research information, heterogeneous sources, data quality, data cleansing, science system, standardization

Procedia PDF Downloads 139
13885 Decision Support System for Diagnosis of Breast Cancer

Authors: Oluwaponmile D. Alao

Abstract:

In this paper, two models have been developed to ascertain the best network for the diagnosis of breast cancer. Breast cancer is a disease that requires the attention of medical practitioners, and experience has shown that misdiagnosis of the disease has been a major challenge in the medical field. Therefore, designing a system with adequate performance will help make diagnosis of the disease faster and more accurate. Two models, a backpropagation neural network and a support vector machine, have been developed. Their performance is also compared with that of previously reported algorithms to ascertain the best one.
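
A minimal sketch of such a comparison is shown below, using the public scikit-learn breast cancer dataset as a stand-in for the authors' data; the architectures and hyperparameters are illustrative, not those of the paper.

```python
# Sketch: compare a backpropagation neural network with an SVM on the same split.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

models = {
    "backpropagation NN": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(30,), max_iter=2000, random_state=42)),
    "support vector machine": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {model.score(X_te, y_te):.3f}")
```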

Keywords: breast cancer, data mining, neural network, support vector machine

Procedia PDF Downloads 325
13884 Profit-Based Artificial Neural Network (ANN) Trained by Migrating Birds Optimization: A Case Study in Credit Card Fraud Detection

Authors: Ashkan Zakaryazad, Ekrem Duman

Abstract:

A typical classification technique ranks the instances in a data set according to the likelihood of belonging to one (positive) class. A credit card (CC) fraud detection model ranks the transactions in terms of the probability of being fraudulent. In fact, this approach is often criticized, because firms do not care about fraud probability but about the profitability or costliness of detecting a fraudulent transaction. The key contribution of this study is to focus on profit maximization in the model building step. The artificial neural network proposed in this study works based on profit maximization instead of minimizing the error of prediction. Moreover, some studies have shown that the back propagation algorithm, like other gradient-based algorithms, usually gets trapped in local optima, and that swarm-based algorithms are more successful in this respect. In this study, we train our profit maximization ANN using Migrating Birds Optimization (MBO), which was recently introduced to the literature.
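
The toy sketch below (our own illustration, not the authors' MBO-trained network) contrasts an error-based objective with a profit-based one on the same set of fraud scores: the threshold that maximizes money saved is generally not the one that minimizes squared error. The cost figures are invented for the example.

```python
# Toy comparison of a profit-based objective versus a squared-error objective.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
is_fraud = rng.random(n) < 0.05
amounts = rng.uniform(10, 2000, size=n)                            # transaction values
scores = np.clip(is_fraud * 0.6 + rng.normal(0.3, 0.2, n), 0, 1)   # imperfect model scores

investigation_cost = 20.0            # assumed cost of checking one alert

def profit(threshold):
    flagged = scores >= threshold
    saved = amounts[flagged & is_fraud].sum()        # fraud losses prevented
    return saved - investigation_cost * flagged.sum()

def sse(threshold):
    return np.sum(((scores >= threshold).astype(float) - is_fraud) ** 2)

thresholds = np.linspace(0.05, 0.95, 19)
best_by_profit = thresholds[np.argmax([profit(t) for t in thresholds])]
best_by_error = thresholds[np.argmin([sse(t) for t in thresholds])]
print("threshold maximising profit:", best_by_profit)
print("threshold minimising squared error:", best_by_error)
```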

Keywords: neural network, profit-based neural network, sum of squared errors (SSE), MBO, gradient descent

Procedia PDF Downloads 457
13883 A Type-2 Fuzzy Model for Link Prediction in Social Network

Authors: Mansoureh Naderipour, Susan Bastani, Mohammad Fazel Zarandi

Abstract:

Predicting links that may occur in the future, as well as missing links, in social networks is an attractive problem in social network analysis. Granular computing can help us model the relationships between human-based systems and the social sciences in this field. In this paper, we present a model based on a granular computing approach and Type-2 fuzzy logic to predict links based on nodes' activity and the relationship between two nodes. Our model is tested on collaboration networks. It is found that the accuracy of prediction is significantly higher than that of the Type-1 fuzzy and crisp approaches.

Keywords: social network, link prediction, granular computing, type-2 fuzzy sets

Procedia PDF Downloads 308
13882 Fault Detection of Pipeline in Water Distribution Network System

Authors: Shin Je Lee, Go Bong Choi, Jeong Cheol Seo, Jong Min Lee, Gibaek Lee

Abstract:

Water pipe networks are installed underground, and once in place it is difficult to recognize the state of the pipes when a leak or burst happens. Accordingly, post-fault management is often delayed after a fault occurs. Therefore, a systematic fault management system for the water pipe network is required to prevent accidents and minimize losses. In this work, we develop an online fault detection system for a water pipe network using pipe data such as flow rate or pressure. A transient model describing water flow in pipelines is presented and simulated using Matlab. Fault situations such as leaks or bursts can also be simulated, and the flow rate or pressure data at the time of the fault are collected. Faults are detected using the statistical methods of the fast Fourier transform and the discrete wavelet transform, and the two are compared to find which method shows the better fault detection performance.
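
The sketch below applies the two transforms named above to a simulated pressure trace with a synthetic burst; the signal model, sampling rate, and wavelet choice are assumptions for illustration only.

```python
# Sketch: FFT and discrete wavelet transform of a pressure trace with a step fault.
import numpy as np
import pywt

fs = 100                                   # samples per second (assumed)
t = np.arange(0, 60, 1 / fs)
pressure = 5.0 + 0.02 * np.random.randn(t.size)
pressure[t > 30] -= 0.5                    # sudden drop mimicking a burst

# Fast Fourier transform: a fault changes the frequency content of the signal.
spectrum = np.abs(np.fft.rfft(pressure - pressure.mean()))
freqs = np.fft.rfftfreq(pressure.size, d=1 / fs)
print("dominant frequency component:", freqs[np.argmax(spectrum)], "Hz")

# Discrete wavelet transform: detail coefficients spike at the step change.
cA, cD = pywt.dwt(pressure, "db4")
fault_index = np.argmax(np.abs(cD))
print("largest detail coefficient near t =", fault_index * 2 / fs, "s")
```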

Keywords: fault detection, water pipeline model, fast Fourier transform, discrete wavelet transform

Procedia PDF Downloads 495
13881 Identifying Critical Links of a Transport Network When Affected by a Climatological Hazard

Authors: Beatriz Martinez-Pastor, Maria Nogal, Alan O'Connor

Abstract:

During the last years, the number of extreme weather events has increased. A variety of extreme weather events, including river floods, rain-induced landslides, droughts, winter storms, wildfires, and hurricanes, have threatened and damaged many different regions worldwide. These events have a devastating impact on critical infrastructure systems, resulting in high social, economic, and environmental costs, and they have a huge impact on transport systems, since transport networks are completely exposed to every kind of climatological perturbation and their performance is closely related to these events. When a traffic network is affected by a climatological hazard, the quality of its service is threatened, and the level of traffic conditions usually decreases. With the aim of understanding this process, the concept of resilience has become popular in the area of transport. Transport resilience analyses the behavior of a traffic network when a perturbation takes place. This holistic concept studies the complete process, from the beginning of the perturbation until the total recovery of the system once the perturbation has finished. Many concepts are included in the definition of resilience, such as vulnerability, redundancy, adaptability, and safety. Once the resilience of a transport network can be evaluated, in this case with a dynamic equilibrium-restricted assignment model that allows the quantification of the concept, the next step is its improvement. By improving resilience, it will be possible to create transport networks that are able to withstand climatological hazards and perform better in their presence. Analyzing the impact of a perturbation on a traffic network, it is observed that the responses of the different links that form the network can be completely different from one another. Consequently, many questions arise: what makes a link more critical before an extreme weather event, and how is it possible to identify these critical links? With this aim, and knowing that the owners or managers of transport systems usually have limited resources, the identification of the critical links of a transport network before extreme weather events becomes a crucial objective. Using the available resources in the areas that will generate the greatest improvement in resilience will contribute to the overall development of the network. Therefore, this paper analyzes what kind of characteristics make a link critical when an extreme weather event damages a transport network, and finally identifies those links.
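
A deliberately simplified sketch of one way to rank critical links is given below (it uses repeated edge removal on a toy graph, not the dynamic equilibrium-restricted assignment model referenced above); the network and weights are invented.

```python
# Sketch: rank links by the drop in network efficiency when each one is removed.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([              # toy road network, weights = travel time
    ("A", "B", 4), ("B", "C", 3), ("A", "D", 7),
    ("D", "C", 2), ("C", "E", 5), ("D", "E", 6),
])

baseline = nx.global_efficiency(G)
impact = {}
for u, v, w in list(G.edges(data="weight")):
    G.remove_edge(u, v)
    impact[(u, v)] = baseline - nx.global_efficiency(G)
    G.add_edge(u, v, weight=w)           # restore the link before testing the next one

for link, drop in sorted(impact.items(), key=lambda kv: kv[1], reverse=True):
    print(f"link {link}: efficiency loss if disrupted = {drop:.3f}")
```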

Keywords: critical links, extreme weather events, hazard, resilience, transport network

Procedia PDF Downloads 270
13880 A General Iterative Nonlinear Programming Method to Synthesize Heat Exchanger Network

Authors: Rupu Yang, Cong Toan Tran, Assaad Zoughaib

Abstract:

This work provides an iterative nonlinear programming method to synthesize a heat exchanger network by manipulating the trade-offs between the heat load of process heat exchangers (HEs) and utilities. We consider two cases for the synthesis problem: the first without fixed costs for HEs and the second with fixed costs. For the problem without fixed costs, the nonlinear programming (NLP) model with all the potential HEs is optimized to obtain the global optimum. For the case with fixed costs, the NLP model is iterated by adding/removing HEs. The method was applied to five case studies and demonstrated its effectiveness quite well. Among these, the approach reaches the lowest TAC (2,904,026 $/year) compared with the best recorded result for the famous aromatics plant problem. It also locates a slightly better design than those reported in the literature for a 10-stream case without fixed costs, with only 1/9 of the computational time. Moreover, compared to the traditional mixed-integer nonlinear programming approach, the iterative NLP method opens the possibility of considering constraints (such as controllability or dynamic performance) that require the structure of the network to be known before they can be calculated.

Keywords: heat exchanger network, synthesis, NLP, optimization

Procedia PDF Downloads 146
13879 Scaling Siamese Neural Network for Cross-Domain Few Shot Learning in Medical Imaging

Authors: Jinan Fiaidhi, Sabah Mohammed

Abstract:

Cross-domain learning in the medical field is a research challenge, as many conditions, as in oncology imaging, involve different imaging modalities. Moreover, in most medical learning applications, the training sample size is relatively small. Although few-shot learning (FSL) through the use of a Siamese neural network can be trained on a small sample with remarkable accuracy, FSL fails to be effective across multiple domains because its convolution weights are set for task-specific applications. In this paper, we address this problem by enabling FSL to shift across domains, designing a two-layer FSL network that can learn individually from each domain and produce a shared feature map with extra modulation to be used at the second layer, which can recognize important targets from mixed domains. Our initial experiments based on mixed medical datasets like Medical-MNIST reveal promising results. We aim to continue this research and perform full-scale analytics for testing our cross-domain FSL learning.
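
For reference, a minimal PyTorch sketch of the underlying Siamese idea (a shared encoder whose embedding distance scores similarity, trained with a contrastive loss) is given below; it does not reproduce the two-layer cross-domain modulation proposed in the paper, and the tensor sizes are placeholders.

```python
# Sketch: a shared-weight Siamese encoder scored by embedding distance.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(), nn.Linear(32 * 7 * 7, 64),
        )

    def forward(self, a, b):
        # Both branches share the same weights; distance encodes (dis)similarity.
        za, zb = self.features(a), self.features(b)
        return F.pairwise_distance(za, zb)

model = SiameseEncoder()
x1 = torch.randn(8, 1, 28, 28)   # placeholder 28x28 grayscale image pairs
x2 = torch.randn(8, 1, 28, 28)
same = torch.randint(0, 2, (8,)).float()
dist = model(x1, x2)
# Contrastive loss with margin 1.0: pull same-class pairs together, push others apart.
loss = (same * dist.pow(2) + (1 - same) * F.relu(1.0 - dist).pow(2)).mean()
loss.backward()
print("contrastive loss:", loss.item())
```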

Keywords: Siamese neural network, few-shot learning, meta-learning, metric-based learning, thick data transformation and analytics

Procedia PDF Downloads 38
13878 Optimization of a Convolutional Neural Network for the Automated Diagnosis of Melanoma

Authors: Kemka C. Ihemelandu, Chukwuemeka U. Ihemelandu

Abstract:

The incidence of melanoma has been increasing rapidly over the past two decades, making melanoma a current public health crisis. Unfortunately, even as screening efforts continue to expand in an effort to ameliorate the death rate from melanoma, there is a need to improve diagnostic accuracy and decrease misdiagnosis. Artificial intelligence (AI), a new frontier in patient care, has the ability to improve the accuracy of melanoma diagnosis. The convolutional neural network (CNN), a form of deep neural network most commonly applied to analyze visual imagery, has been shown to outperform the human brain in pattern recognition. However, there are noted limitations in the accuracy of CNN models. Our aim in this study was the optimization of convolutional neural network algorithms for the automated diagnosis of melanoma. We hypothesized that optimal selection of the momentum and batch hyperparameters increases model accuracy. The most successful model developed during this study showed that a momentum of 0.25 and a batch size of 2 led to superior performance and a faster model training time, with an accuracy of ~83% after nine hours of training. We did notice a lack of diversity in the dataset used, with a noted class imbalance favoring lighter vs. darker skin tones. Training set image transformations did not result in superior model performance in our study.
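
The sketch below shows where the two hyperparameters studied above enter a training loop (momentum in the optimizer, batch size in the data loader); the tiny CNN and the random tensors stand in for the dermoscopy images and are not the authors' architecture or data.

```python
# Sketch: momentum and batch size as tunable hyperparameters of a CNN training loop.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

images = torch.randn(64, 3, 64, 64)                 # placeholder RGB lesion crops
labels = torch.randint(0, 2, (64,))                 # 0 = benign, 1 = melanoma
loader = DataLoader(TensorDataset(images, labels), batch_size=2, shuffle=True)

cnn = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
    nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
    nn.Flatten(), nn.Linear(16 * 4 * 4, 2),
)
optimizer = torch.optim.SGD(cnn.parameters(), lr=0.01, momentum=0.25)
criterion = nn.CrossEntropyLoss()

for x, y in loader:                                  # one pass for illustration
    optimizer.zero_grad()
    loss = criterion(cnn(x), y)
    loss.backward()
    optimizer.step()
print("final batch loss:", loss.item())
```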

Keywords: melanoma, convolutional neural network, momentum, batch hyperparameter

Procedia PDF Downloads 90
13877 A Neural Network Control for Voltage Balancing in Three-Phase Electric Power System

Authors: Dana M. Ragab, Jasim A. Ghaeb

Abstract:

The three-phase power system suffers from different challenging problems, e.g. voltage unbalance conditions at the load side. The voltage unbalance usually degrades the power quality of the electric power system. Several techniques can be considered for load balancing including load reconfiguration, static synchronous compensator and static reactive power compensator. In this work an efficient neural network is designed to control the unbalanced condition in the Aqaba-Qatrana-South Amman (AQSA) electric power system. It is designed for highly enhanced response time of the reactive compensator for voltage balancing. The neural network is developed to determine the appropriate set of firing angles required for the thyristor-controlled reactor to balance the three load voltages accurately and quickly. The parameters of AQSA power system are considered in the laboratory model, and several test cases have been conducted to test and validate the proposed technique capabilities. The results have shown a high performance of the proposed Neural Network Control (NNC) technique for correcting the voltage unbalance conditions at three-phase load based on accuracy and response time.

Keywords: three-phase power system, reactive power control, voltage unbalance factor, neural network, power quality

Procedia PDF Downloads 177
13876 User Selections on Social Network Applications

Authors: C. C. Liang

Abstract:

MSN used to be the most popular application for communicating within social networks, but Facebook chat is now the most popular. Facebook and MSN have similar characteristics, including usefulness, ease of use, and a similar core function, namely exchanging information with friends, and Facebook outperforms MSN in these areas. However, the adoption of Facebook and abandonment of MSN have occurred for other reasons. Functions can be improved, but users' willingness to use an application does not depend only on functionality. Flow status has been established to be crucial to users' adoption of cyber applications and to affect users' adoption of software applications. If users experience flow in using a software application, they will enjoy using it frequently, and may even change their preferred application from an old one to a new one. However, no investigation has examined choice behavior related to switching from MSN to Facebook based on a consideration of flow experiences and functions. This investigation discusses the flow experiences and functions of social-networking applications. Flow experience is found to affect perceived ease of use and perceived usefulness; perceived ease of use influences information exchange with friends and perceived usefulness; information exchange influences perceived usefulness, but information exchange has no effect on flow experience.

Keywords: consumer behavior, social media, technology acceptance model, flow experience

Procedia PDF Downloads 339
13875 Factors Associated with Weight Loss Maintenance after an Intervention Program

Authors: Filipa Cortez, Vanessa Pereira

Abstract:

Introduction: The main challenge of obesity treatment is long-term weight loss maintenance. The 3 phases method is a weight loss program that combines a low-carb and moderately high-protein diet, food supplements, and a weekly one-to-one consultation with a certified nutritionist. Sustained weight control is the ultimate goal of phase 3. The success criterion was a minimum loss of 10% of initial weight and its maintenance after 12 months. Objective: The aim of this study was to identify factors associated with successful weight loss maintenance 12 months after the end of the 3 phases method. Methods: The study included 199 subjects who achieved their weight loss goal (phase 3). Weight and body mass index (BMI) were obtained at baseline and every week until the end of the program. Therapeutic adherence was measured weekly on a Likert scale from 1 to 5. Subjects were considered in compliance with the nutritional recommendations and supplementation when their classification was ≥ 4. Twelve months after the method, the current weight and the number of previous weight-loss attempts were collected by telephone interview. Statistical significance was assumed at p-values < 0.05, and statistical analyses were performed using SPSS software v. 21. Results: 65.3% of subjects met the success criterion. The factors that significantly predicted weight loss maintenance were a greater initial percentage weight loss during the weight loss intervention (OR=1.44) and a higher number of consultations in phase 3 (OR=1.10). Conclusion: These findings suggest that the percentage weight loss during the intervention and the number of consultations in phase 3 may facilitate maintenance of weight loss after the 3 phases method.
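
As a sketch of the kind of analysis behind such odds ratios (our own illustration on simulated data, not the study data), the code below fits a logistic regression of 12-month success on initial percentage weight loss and the number of phase-3 consultations and exponentiates the coefficients to obtain odds ratios.

```python
# Sketch: logistic regression with odds ratios from exponentiated coefficients.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 199
df = pd.DataFrame({
    "pct_weight_loss": rng.normal(12, 3, n),         # % lost during intervention (simulated)
    "phase3_consults": rng.poisson(6, n),            # consultations in phase 3 (simulated)
})
logit_p = -4 + 0.35 * df["pct_weight_loss"] + 0.10 * df["phase3_consults"]
df["maintained"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["pct_weight_loss", "phase3_consults"]])
fit = sm.Logit(df["maintained"], X).fit(disp=0)
print(np.exp(fit.params))        # odds ratios, analogous to the reported OR values
```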

Keywords: obesity, weight maintenance, low-carbohydrate diet, dietary supplements

Procedia PDF Downloads 136