Search results for: Bayesian neural network
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5508

2988 Coal Mining Safety Monitoring Using WSN

Authors: Somdatta Saha

Abstract:

The main purpose was to provide an implementable design scenario for underground coal mines using wireless sensor networks (WSNs), the main reason being that, given the intricacies of the physical structure of a coal mine, only low-power WSN nodes can produce accurate surveillance and accident detection data. The work concentrated on designing and simulating alternative scenarios for a typical mine and comparing them on the basis of the obtained results to arrive at a final design. In the era of embedded technology, Zigbee protocols are used in more and more applications. The rapid development of sensors, microcontrollers, and network technology has provided a reliable technological basis for automatic real-time monitoring of coal mines. The underground system collects temperature, humidity, and methane values through sensor nodes in the mine; it also counts the personnel inside the mine with the help of an IR sensor, and then transmits the data to an ARM-based information processing terminal.
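The abstract does not include an implementation; as a rough illustration of the kind of aggregation logic such an ARM-based terminal could run, the following Python sketch decodes a hypothetical sensor packet and raises a methane alarm. The packet format, field names, and threshold are assumptions for illustration, not taken from the paper.

import json
from dataclasses import dataclass

# Hypothetical methane alarm threshold (% volume); not taken from the paper.
METHANE_ALARM_THRESHOLD = 1.0

@dataclass
class SensorReading:
    node_id: int
    temperature_c: float
    humidity_pct: float
    methane_pct: float
    personnel_count: int

def process_packet(raw):
    """Decode a JSON packet forwarded by the Zigbee coordinator (assumed format)."""
    payload = json.loads(raw.decode("utf-8"))
    return SensorReading(**payload)

def check_alarms(reading):
    """Return a list of alarm messages for a single reading."""
    alarms = []
    if reading.methane_pct >= METHANE_ALARM_THRESHOLD:
        alarms.append(f"Node {reading.node_id}: methane {reading.methane_pct:.2f}% above limit")
    return alarms

if __name__ == "__main__":
    packet = b'{"node_id": 3, "temperature_c": 29.5, "humidity_pct": 81.0, "methane_pct": 1.4, "personnel_count": 12}'
    for alarm in check_alarms(process_packet(packet)):
        print(alarm)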

Keywords: ARM, embedded board, wireless sensor network (Zigbee)

Procedia PDF Downloads 340
2987 A Study of Adult Lifelong Learning Consulting and Service System in Taiwan

Authors: Wan Jen Chang

Abstract:

Background: Taiwan's current adult lifelong learning services have expanded from vocational training to universal lifelong learning. However, both the professional knowledge training for learning guidance and consulting services and the provision of an adult online learning consulting service system still need to be established. Purpose: The purposes of this study are as follows: 1. Analyze the professional training mechanism for cultivating adult lifelong learning consultation and coaching; 2. Explore the feasibility of constructing a system that uses network technology to provide adult learning consultation services. Research design: This study conducts a literature analysis of counseling and coaching policy reports on lifelong learning in European countries and the United States. Two focus group discussions were conducted with 15 lifelong learning scholars, experts and practitioners as research subjects, covering the following two topics: 1. The current situation, needs and professional ability training mechanism of "Adult Lifelong Learning Consulting and Services"; 2. Strategies for establishing an "Adult Lifelong Learning Consulting and Service Internet System". Conclusions: 1. Based on adult lifelong learning consulting and service needs, plan a professional knowledge training and certification system. 2. Adult lifelong learning consulting and service training should include the use of network technology to provide consulting service skills. 3. To establish an adult lifelong learning consultation and service system, the Ministry of Education should promulgate policies and measures at the central level and entrust local governments or private organizations to implement them. 4. The adult lifelong learning consulting and service system can combine the national qualifications framework, the private sector and NPOs to expand learning consulting service partners.

Keywords: adult lifelong learning, professional knowledge, consulting and service, network system

Procedia PDF Downloads 67
2986 Presenting a Job Scheduling Algorithm Based on Learning Automata in Computational Grid

Authors: Roshanak Khodabakhsh Jolfaei, Javad Akbari Torkestani

Abstract:

As a cooperative environment for problem-solving, grids must develop efficient job scheduling patterns with regard to their goals, domains and structure. Since Grid environments facilitate distributed calculations, job scheduling appears as a critical problem for the management of Grid resources that severely influences the efficiency of the whole Grid environment. Due to characteristics such as the dynamicity of resources and network conditions in the Grid, the algorithm should be adjustable and scalable as the network grows. For this purpose, this paper presents a job scheduling algorithm based on learning automata in the computational Grid, whose performance was compared with the FPSO algorithm (Fuzzy Particle Swarm Optimization) and the GJS algorithm (Grid Job Scheduling). The obtained numerical results indicated the superiority of the suggested algorithm over FPSO and GJS, with FPSO and GJS ranking second and third, respectively.

Keywords: computational grid, job scheduling, learning automata, dynamic scheduling

Procedia PDF Downloads 343
2985 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to reduce the inaccuracies, weaknesses, and biases of any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
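The abstract describes, but does not include, the ensembling procedure; the following Python sketch shows one way such a scheme could look with scikit-learn, averaging the predicted probabilities of a logistic regression, a random forest, and a small neural network, and ranking predictors by permutation importance as a proxy for mean decrease in accuracy. The synthetic features and settings are illustrative assumptions, not the authors' code.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for timing, weather, and past-pollutant features.
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "neural_network": MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0),
}

probas = []
for name, model in models.items():
    model.fit(X_train, y_train)
    probas.append(model.predict_proba(X_test)[:, 1])
    print(f"{name}: accuracy = {model.score(X_test, y_test):.3f}")

# Ensemble: average the three models' predicted probabilities, then threshold.
ensemble_pred = (np.mean(probas, axis=0) >= 0.5).astype(int)
print("ensemble accuracy =", np.mean(ensemble_pred == y_test))

# Permutation importance on one member as a proxy for "mean decrease in accuracy".
imp = permutation_importance(models["random_forest"], X_test, y_test, n_repeats=10, random_state=0)
print("top predictor index:", int(np.argmax(imp.importances_mean)))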

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 127
2984 Genome-Wide Expression Profiling of Cicer arietinum Heavy Metal Toxicity

Authors: B. S. Yadav, A. Mani, S. Srivastava

Abstract:

Chickpea (Cicer arietinum L.) is an annual, self-pollinating, diploid (2n = 2x = 16) pulse crop that ranks second in world legume production after common bean (Phaseolus vulgaris). ICC 4958 flowers approximately 39 days after sowing under peninsular Indian conditions, and the crop matures in less than 90 days in rainfed environments. The estimated collective yield losses due to abiotic stresses (6.4 million t) have been significantly higher than those due to biotic stresses (4.8 million t). Most legumes are known to be salt sensitive, and therefore it is becoming increasingly important to produce cultivars tolerant to high salinity, in addition to other abiotic and biotic stresses, for sustainable chickpea production. Our aim was to identify the genes involved in the defence mechanism against heavy metal toxicity in chickpea and to establish the biological network of heavy metal toxicity in chickpea. The ICC4958 variety of chickpea was grown under normal conditions and under 150 µM concentrations of different heavy metal salts (CdCl₂, K₂Cr₂O₇, NaAsO₂). On the 15th day, leaf samples were collected and stored in RNAlater solution, and a microarray was performed to examine differential gene expression patterns. Our studies revealed that 111 common genes involved in the defense mechanism were up-regulated and 41 genes were commonly down-regulated during treatment with 150 µM CdCl₂, K₂Cr₂O₇, and NaAsO₂. The biological network study shows that the differentially expressed genes are highly connected and have high betweenness and centrality.

Keywords: abiotic stress, biological network, chickpea, microarray

Procedia PDF Downloads 197
2983 FPGA Implementation of a Marginalized Particle Filter for Delineation of P and T Waves of ECG Signal

Authors: Jugal Bhandari, K. Hari Priya

Abstract:

The ECG signal provides important clinical information which can be used to predict diseases related to the heart. Accordingly, delineation of the ECG signal is an important task, and delineation of the P and T waves in particular is a complex one. This paper deals with the study of the ECG signal and its analysis by means of a Verilog design of efficient filters together with the MATLAB tool. It includes generation and simulation of the ECG signal from real-time ECG data, and ECG signal filtering and processing through the analysis of different algorithms and techniques. In this paper, we design a basic particle filter which generates a dynamic model depending on the present and past input samples and then produces the desired output. Afterwards, the output is processed in MATLAB to get the actual shape and accurate values of the ranges of the P-wave and T-wave of the ECG signal. QuestaSim, a Mentor Graphics tool, is used for simulation and functional verification. The same design is verified again using Xilinx ISE, which is also used for synthesis, mapping and bit-file generation. A Xilinx FPGA board is used for implementation of the system, and the final FPGA results are verified with ChipScope Pro, where the output data can be observed.
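The paper's filter is implemented in Verilog and post-processed in MATLAB; as a language-neutral reference for what a basic particle (bootstrap) filter does, the Python sketch below tracks a noisy 1-D signal by propagating particles, reweighting them against each observation, and resampling. The dynamics, noise levels, and synthetic signal are assumptions, not the paper's design.

import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(observations, n_particles=500, process_std=0.05, obs_std=0.1):
    """Minimal bootstrap particle filter tracking a slowly varying 1-D signal."""
    particles = rng.normal(0.0, 1.0, n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for z in observations:
        # Propagate: random-walk dynamics built from the past state (the "dynamic model").
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Reweight by the likelihood of the new observation.
        weights *= np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights += 1e-300
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))
        # Resample when the effective sample size collapses.
        if 1.0 / np.sum(weights ** 2) < n_particles / 2:
            idx = rng.choice(n_particles, size=n_particles, p=weights)
            particles = particles[idx]
            weights.fill(1.0 / n_particles)
    return np.array(estimates)

# Noisy synthetic baseline as a stand-in for real ECG samples.
t = np.linspace(0, 1, 400)
clean = 0.3 * np.sin(2 * np.pi * 3 * t)
filtered = bootstrap_particle_filter(clean + rng.normal(0, 0.1, t.size))
print(filtered[:5])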

Keywords: ECG, MATLAB, Bayesian filtering, particle filter, Verilog hardware description language

Procedia PDF Downloads 367
2982 Regional Flood-Duration-Frequency Models for Norway

Authors: Danielle M. Barna, Kolbjørn Engeland, Thordis Thorarinsdottir, Chong-Yu Xu

Abstract:

Design flood values give estimates of flood magnitude for a given return period and are essential to making adaptive decisions around land use planning, infrastructure design, and disaster mitigation. Often design flood values are needed at locations with insufficient data. Additionally, in hydrologic applications where flood retention is important (e.g., floodplain management and reservoir design), design flood values are required at different flood durations. A statistical approach to this problem is the development of a regression model for extremes where some of the parameters are dependent on flood duration in addition to being covariate-dependent. In hydrology, this is called a regional flood-duration-frequency (regional-QDF) model. Typically, the underlying statistical distribution is chosen to be the Generalized Extreme Value (GEV) distribution. However, as the support of the GEV distribution depends on both its parameters and the range of the data, special care must be taken with the development of the regional model. In particular, we find that the GEV is problematic when developing a GAMLSS-type analysis due to the difficulty of proposing a link function that is independent of the unknown parameters and the observed data. We discuss these challenges in the context of developing a regional QDF model for Norway.
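As background for the GEV-based modelling discussed above, the following Python sketch fits a GEV distribution to a sample of annual maxima with SciPy and derives a design flood value as the (1 - 1/T) quantile for a chosen return period T. The synthetic series and the 100-year return period are illustrative assumptions, not values from the study.

import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Synthetic annual maximum floods (m^3/s) standing in for a gauged series.
annual_maxima = genextreme.rvs(c=-0.1, loc=300.0, scale=80.0, size=60, random_state=rng)

# Fit the GEV by maximum likelihood (SciPy's c is the negated shape parameter).
c_hat, loc_hat, scale_hat = genextreme.fit(annual_maxima)

# Design flood for a T-year return period: the (1 - 1/T) quantile of the fitted GEV.
T = 100
design_flood = genextreme.ppf(1.0 - 1.0 / T, c_hat, loc=loc_hat, scale=scale_hat)
print(f"estimated {T}-year design flood: {design_flood:.1f} m^3/s")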

Keywords: design flood values, Bayesian statistics, regression modeling of extremes, extreme value analysis, GEV

Procedia PDF Downloads 72
2981 Census and Mapping of Oil Palms Over Satellite Dataset Using Deep Learning Model

Authors: Gholba Niranjan Dilip, Anil Kumar

Abstract:

Accurate and reliable mapping of oil palm plantations and a census of individual palm trees is a huge challenge. This study addresses this challenge and develops an optimized solution implementing deep learning techniques on remote sensing data. The oil palm is a very important tropical crop; to improve its productivity and land management, it is imperative to have an accurate census over large areas. Since manual census is costly and prone to approximation, a methodology for automated census using panchromatic images from the Cartosat-2, SkySat and WorldView-3 satellites is demonstrated. Two different study sites in Indonesia were selected. A customized set of training data and ground-truth data was created for this study from Cartosat-2 images. The pre-trained Single Shot MultiBox Detector (SSD) Lite MobileNet V2 Convolutional Neural Network (CNN) from the TensorFlow Object Detection API was subjected to transfer learning on this customized dataset. The SSD model is able to generate bounding boxes for each oil palm and count the palms with good accuracy on the panchromatic images. The detection yielded an F-score of 83.16% on seven different images. The detections are buffered and dissolved to generate polygons demarcating the boundaries of the oil palm plantations. This provided the area under the plantations and maps of their location, thereby completing the automated census with a fairly high accuracy (≈100%). The trained CNN was found competent enough to detect oil palm crowns from images obtained from multiple satellite sensors and of varying temporal vintage, and it helped to estimate the increase in oil palm plantations from 2014 to 2021 in the study area. The study proved that high-resolution panchromatic satellite images can successfully be used to undertake a census of oil palm plantations using CNNs.
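The buffer-and-dissolve step that turns per-tree detections into plantation polygons can be illustrated in a few lines; the sketch below uses Shapely on hypothetical bounding boxes and an assumed buffer radius, and is not the authors' GIS workflow.

from shapely.geometry import box
from shapely.ops import unary_union

# Hypothetical detector output: (xmin, ymin, xmax, ymax) boxes in metres.
detections = [(0, 0, 8, 8), (10, 2, 18, 10), (13, 12, 21, 20), (60, 60, 68, 68)]

BUFFER_RADIUS_M = 6.0  # assumed crown-spacing buffer, not from the paper

# Buffer each detection around its centroid, then dissolve overlapping buffers
# into plantation polygons.
buffers = [box(*d).centroid.buffer(BUFFER_RADIUS_M) for d in detections]
plantations = unary_union(buffers)

# unary_union returns a Polygon or MultiPolygon; report count and total area.
polygons = getattr(plantations, "geoms", [plantations])
print("plantation polygons:", len(list(polygons)))
print("total planted area (m^2): %.1f" % plantations.area)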

Keywords: object detection, oil palm tree census, panchromatic images, single shot multibox detector

Procedia PDF Downloads 160
2980 Estimation of Endogenous Brain Noise from Brain Response to Flickering Visual Stimulation Magnetoencephalography Visual Perception Speed

Authors: Alexander N. Pisarchik, Parth Chholak

Abstract:

Intrinsic brain noise was estimated via magnetoencephalograms (MEG) recorded during perception of flickering visual stimuli with frequencies of 6.67 and 8.57 Hz. First, we measured the mean phase difference between the flicker signal and the steady-state event-related field (SSERF) in the occipital area, where the brain response at the flicker frequencies and their harmonics appeared in the power spectrum. Then, we calculated the probability distribution of the phase fluctuations in the regions of frequency locking and computed its kurtosis. Since kurtosis is a measure of the distribution’s sharpness, we suppose that inverse kurtosis is related to intrinsic brain noise. In our experiments, the kurtosis value varied among subjects from K = 3 to K = 5 for 6.67 Hz and from 2.6 to 4 for 8.57 Hz. The majority of subjects demonstrated leptokurtic distributions (K > 3), i.e., distribution tails that approach zero more slowly than a Gaussian. In addition, we found a strong correlation between kurtosis and brain complexity measured as the correlation dimension, such that the MEGs of subjects with higher kurtosis exhibited lower complexity. The obtained results are discussed in the framework of nonlinear dynamics and complex network theories. Specifically, in a network of coupled oscillators, phase synchronization is mainly determined by two antagonistic factors: noise and the coupling strength. While noise worsens phase synchronization, coupling improves it. If we assume that each neuron and each synapse contribute to brain noise, a larger neuronal network should have stronger noise, and therefore phase synchronization should be worse, which results in smaller kurtosis. The described method for brain noise estimation can be useful for diagnostics of some brain pathologies associated with abnormal brain noise.
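As a rough illustration of the phase-fluctuation analysis (not the authors' pipeline), the Python sketch below extracts instantaneous phases with the Hilbert transform, computes the stimulus-response phase difference, and reports the kurtosis of its distribution; the synthetic signals and parameters are assumptions.

import numpy as np
from scipy.signal import hilbert
from scipy.stats import kurtosis

rng = np.random.default_rng(1)

fs, f_stim, duration = 600.0, 6.67, 30.0  # assumed sampling rate and flicker frequency
t = np.arange(0, duration, 1.0 / fs)

flicker = np.sin(2 * np.pi * f_stim * t)
# Synthetic SSERF: phase-locked response plus "intrinsic noise".
sserf = np.sin(2 * np.pi * f_stim * t + 0.3) + 0.5 * rng.standard_normal(t.size)

# Instantaneous phases via the analytic signal, then the wrapped phase difference.
phase_diff = np.angle(hilbert(sserf)) - np.angle(hilbert(flicker))
phase_diff = np.angle(np.exp(1j * phase_diff))  # wrap to (-pi, pi]

# Pearson kurtosis (Gaussian gives approximately 3), as in the abstract.
K = kurtosis(phase_diff, fisher=False)
print(f"kurtosis of phase fluctuations: {K:.2f}")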

Keywords: brain, flickering, magnetoencephalography, MEG, visual perception, perception time

Procedia PDF Downloads 148
2979 The Friendship Network Stability of Preschool Children during One Pedagogical Season

Authors: Yili Wang, Jarmo Kinos, Tuire Palonen, Tarja-Riitta Hurme

Abstract:

This longitudinal study aims to examine how five- and six-year-old children’s peer relationships are formed and fostered during one preschool year in a southwestern Finnish preschool. All 16 kindergarteners participated in the study (at dyad level N=240; i.e., 16 x 15 relationships among the children). The children were divided into four daily groups, based on the table order during the daily routines, and four intervention groups, based on the teachers’ pedagogical plan. During the intervention, one iPad was given to each group in order to stimulate interaction among peers and, thus, enable the children to form new peer relationships. In the data gathering, sociometric nomination techniques were used to investigate the nature (i.e., stability and mutuality) of the peer relationships. The data was collected five times during the year to see what kind of peer relationship changes occurred at the dyad level and the group level, i.e., in establishing and losing friendship ties among the children. Social network analyses were used to analyze the data. The results indicate that the children’s preference for gender segregation was strong compared to age preference and intervention. In all, the number of reciprocal friendship ties and the mutual absence of friendship ties increased towards the end of the year, whereas the number of unilateral friendship ties decreased. This indicates that children’s nominations narrow down; thus, the group structure becomes more crystalized. Instead of extending their friendship networks, children seek stable and mutual relationships with their peers in their middle childhood years. The intervention only had a slightly negative influence on children’s peer relationships.
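Reciprocity and dyad counts of the kind reported above are straightforward to compute with NetworkX; the sketch below does so on a tiny hypothetical nomination digraph (the edge list is invented, not the study's data).

import networkx as nx

# Hypothetical friendship nominations: an edge u -> v means "u nominated v".
nominations = [("A", "B"), ("B", "A"), ("A", "C"), ("C", "D"), ("D", "C"), ("E", "A")]
G = nx.DiGraph(nominations)

# Overall reciprocity: fraction of directed edges that are reciprocated.
print("reciprocity:", nx.reciprocity(G))

# Count mutual, unilateral, and absent dyads over all node pairs.
mutual = sum(1 for u, v in G.edges if u < v and G.has_edge(v, u))
unilateral = sum(1 for u, v in G.edges if not G.has_edge(v, u))
n = G.number_of_nodes()
absent = n * (n - 1) // 2 - mutual - unilateral
print("mutual:", mutual, "unilateral:", unilateral, "absent dyads:", absent)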

Keywords: intervention study, peer relationship, preschool education, social network analysis, sociometric ratings

Procedia PDF Downloads 273
2978 Generalized Rough Sets Applied to Graphs Related to Urban Problems

Authors: Mihai Rebenciuc, Simona Mihaela Bibic

Abstract:

As a branch of modern mathematics, graphs are instruments for optimization and for solving practical applications in various fields such as economic networks, engineering, network optimization, the geometry of social action and, more generally, complex systems, including contemporary urban problems (path or transport efficiencies, biourbanism, etc.). This paper studies the interconnection of urban networks, which leads to the problem of simulating one digraph through another. The simulation is univocal or, more generally, multivocal. The concepts of fragment and atom are very useful in the study of connectivity in the simulating digraph, including an alternative evaluation of k-connectivity. The rough set approach to (bi)digraphs, which is proposed for the first time in this paper, significantly improves the evaluation of k-connectivity. This rough set approach is based on generalized rough sets, whose basic facts are presented in the paper.

Keywords: (bi)digraphs, rough set theory, systems of interacting agents, complex systems

Procedia PDF Downloads 243
2977 Fake Accounts Detection in Twitter Based on Minimum Weighted Feature Set

Authors: Ahmed ElAzab, Amira M. Idrees, Mahmoud A. Mahmoud, Hesham Hefny

Abstract:

Social networking sites such as Twitter and Facebook attract over 500 million users across the world, and for those users, their social life, and even their practical life, has become interrelated with them. Their interaction with social networking has affected their lives forever. Accordingly, social networking sites have become among the main channels responsible for the vast dissemination of different kinds of information during real-time events. This popularity of social networking has led to different problems, including the possibility of exposing users to incorrect information through fake accounts, which results in the spread of malicious content during live events. This situation can cause huge damage in the real world to society in general, including citizens, business entities, and others. In this paper, we present a classification method for detecting fake accounts on Twitter. The study determines the minimized set of the main factors that influence the detection of fake accounts on Twitter; the determined factors are then applied using different classification techniques, the results of these techniques are compared, and the most accurate algorithm is selected according to the accuracy of the results. The study has been compared with different recent research in the same area, and this comparison has proved the accuracy of the proposed study. We claim that this study can be continuously applied on the Twitter social network to automatically detect fake accounts; moreover, the study can be applied to different social network sites such as Facebook with minor changes according to the nature of the social network, which are discussed in this paper.
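The pipeline described (select a minimized feature set, compare classification techniques, keep the most accurate) can be sketched with scikit-learn as follows; the synthetic data, feature selector, and candidate classifiers are illustrative assumptions rather than the authors' exact setup.

from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for account features (followers, tweet rate, profile fields, ...).
X, y = make_classification(n_samples=1500, n_features=20, n_informative=6, random_state=0)

# Keep a minimized feature set ranked by mutual information with the fake/real label.
X_min = SelectKBest(mutual_info_classif, k=6).fit_transform(X, y)

candidates = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "naive_bayes": GaussianNB(),
    "svm": SVC(),
}

# Compare techniques by cross-validated accuracy and pick the best one.
scores = {name: cross_val_score(clf, X_min, y, cv=5).mean() for name, clf in candidates.items()}
best = max(scores, key=scores.get)
print(scores)
print("selected classifier:", best)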

Keywords: fake accounts detection, classification algorithms, twitter accounts analysis, features based techniques

Procedia PDF Downloads 416
2976 Scheduling in a Single-Stage, Multi-Item Compatible Process Using Multiple Arc Network Model

Authors: Bokkasam Sasidhar, Ibrahim Aljasser

Abstract:

The problem of finding optimal schedules for each piece of equipment in a production process is considered, where the process consists of a single stage of manufacturing and can handle different types of products, and where changeover from handling one type of product to another incurs certain costs. The machine capacity is determined by the upper limit for the quantity that can be processed for each of the products in a set-up. The changeover costs increase with the number of set-ups, and hence, to minimize the costs associated with product changeover, the planning should be such that similar types of products are processed successively so that the total number of changeovers, and in turn the associated set-up costs, are minimized. The problem of cost minimization is equivalent to the problem of minimizing the number of set-ups, or equivalently maximizing the capacity utilization between every set-up, or maximizing the total capacity utilization. Further, production is usually planned against customers’ orders, and generally different customers’ orders are assigned one of two priorities – “normal” or “priority”. The problem of production planning in such a situation can be formulated as a Multiple Arc Network (MAN) model and can be solved sequentially using the algorithm for maximizing flow along a MAN and the algorithm for maximizing flow along a MAN with priority arcs. The model aims to provide an optimal production schedule with the objective of maximizing capacity utilization, so that customer-wise delivery schedules are fulfilled, keeping in view the customer priorities. Algorithms are presented for solving the MAN formulation of production planning with customer priorities. The application of the model is demonstrated through numerical examples.
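The paper's MAN algorithms are not reproduced here, but the underlying maximal-flow computation can be illustrated with NetworkX on a toy capacity network; node names and capacities below are invented for illustration only.

import networkx as nx

# Toy network: source -> set-ups -> products -> sink, with capacities as set-up limits.
G = nx.DiGraph()
G.add_edge("source", "setup1", capacity=100)
G.add_edge("source", "setup2", capacity=80)
G.add_edge("setup1", "productA", capacity=60)
G.add_edge("setup1", "productB", capacity=50)
G.add_edge("setup2", "productB", capacity=70)
G.add_edge("productA", "sink", capacity=60)
G.add_edge("productB", "sink", capacity=90)

# The maximum flow corresponds to the maximum capacity utilization across set-ups.
flow_value, flow_dict = nx.maximum_flow(G, "source", "sink")
print("maximum utilization:", flow_value)
print("flow on setup1 -> productA:", flow_dict["setup1"]["productA"])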

Keywords: scheduling, maximal flow problem, multiple arc network model, optimization

Procedia PDF Downloads 402
2975 Power Aware Modified I-LEACH Protocol Using Fuzzy IF Then Rules

Authors: Gagandeep Singh, Navdeep Singh

Abstract:

Due to the limited battery of sensor nodes, energy efficiency is found to be the main constraint in WSNs. Therefore, the main focus of the present work is to find ways to minimize energy consumption, which results in enhancement of the network stability period and lifetime. Many researchers have proposed different kinds of protocols to further enhance the network lifetime. This paper evaluates issues which have been neglected in the field of WSNs. WSNs are composed of multiple unattended, ultra-small, limited-power sensor nodes. Sensor nodes are deployed randomly in the area of interest. Sensor nodes have limited processing, wireless communication and power resource capabilities, and they send sensed data to a sink or Base Station (BS). I-LEACH provides an adaptive clustering mechanism which deals very efficiently with energy conservation. This paper ends with the shortcomings of various adaptive-clustering-based WSN protocols.

Keywords: WSN, I-Leach, MATLAB, sensor

Procedia PDF Downloads 275
2974 Microbiological Analysis, Cytotoxic and Genotoxic Effects from Material Captured in PM2.5 and PM10 Filters Used in the Aburrá Valley Air Quality Monitoring Network (Colombia)

Authors: Carmen E. Zapata, Juan Bautista, Olga Montoya, Claudia Moreno, Marisol Suarez, Alejandra Betancur, Duvan Nanclares, Natalia A. Cano

Abstract:

This study aims to evaluate the diversity of microorganisms in PM2.5 and PM10 filters and to determine the genotoxic and cytotoxic activity of the complex mixture present in PM2.5 filters used in the Aburrá Valley Air Quality Monitoring Network (Colombia). The research results indicate that the PM2.5 particulate matter from the different monitoring stations contains bacteria; however, this detection of bacteria and their phylogenetic relationships is not complete evidence to connect the microorganisms with pathogenic activities or with the degradation of compounds present in the air. Additionally, damage induced by the particulate material was demonstrated in the cell membrane, in lysosomal and endosomal membranes, and in mitochondrial metabolism; this damage was independent of the PM2.5 concentrations in almost all cases.

Keywords: cytotoxic, genotoxic, microbiological analysis, PM10, PM2.5

Procedia PDF Downloads 348
2973 Multivariate Control Chart to Determine Efficiency Measurements in Industrial Processes

Authors: J. J. Vargas, N. Prieto, L. A. Toro

Abstract:

Control charts are commonly used to monitor processes involving either variable or attribute quality characteristics, and determining the control limits is a critical task for quality engineers seeking to improve processes. Nonetheless, in some applications it is necessary to include an estimation of efficiency. In this paper, the ability to define the efficiency of an industrial process was added to a control chart by incorporating a data envelopment analysis (DEA) approach. In depth, a Bayesian estimation was performed to calculate the posterior probability distribution of parameters such as the means and the variance-covariance matrix. This technique allows the data set to be analysed without the need for the hypothetically large sample implied in the problem, treating it as an approximation to the finite sample distribution. A rejection simulation method was carried out to generate random variables from the parameter functions. Each resulting vector was used by the stochastic DEA model during several cycles to establish the distribution of each efficiency measure for each DMU (decision-making unit). A control limit was calculated with the obtained model, and if a condition of low efficiency of a DMU is present, the system efficiency is out of control. A global optimum was reached in the efficiency calculation, which ensures model reliability.

Keywords: data envelopment analysis, DEA, multivariate control chart, rejection simulation method

Procedia PDF Downloads 374
2972 Integration of Wireless Sensor Networks and Radio Frequency Identification (RFID): An Assessment

Authors: Arslan Murtaza

Abstract:

RFID (Radio Frequency Identification) and WSN (wireless sensor network) are two significant wireless technologies that have an extensive diversity of applications and provide limitless future potential. RFID is used to identify the existence and location of objects, whereas WSN is used to sense and monitor the environment. Incorporating RFID with WSN not only provides the identity and location of an object but also provides information regarding the condition of the object carrying the sensor-enabled RFID tag. It can be widely used in stock management, asset tracking, asset counting, security, military applications, environmental monitoring and forecasting, healthcare, intelligent homes, intelligent transport vehicles, warehouse management, and precision agriculture. This assessment presents a brief introduction to RFID, WSN, and the integration of WSN and RFID, and then applications related to both RFID and WSN. This assessment also discusses the status of projects on RFID technology carried out in different computing groups and projects to be undertaken on WSN and RFID technology.

Keywords: wireless sensor network, RFID, embedded sensor, Wi-Fi, Bluetooth, integration, time saving, cost efficient

Procedia PDF Downloads 334
2971 Message Framework for Disaster Management: An Application Model for Mines

Authors: A. Baloglu, A. Çınar

Abstract:

Different tools and technologies have been implemented for Crisis Response and Management (CRM), generally using the available network infrastructure for information exchange. Depending on the type of disaster or crisis, the network infrastructure may be affected and unable to provide reliable connectivity; thus, any tool or technology that depends on connectivity would be unable to fulfill its functions. As a solution, a new message exchange framework has been developed. The framework provides an offline/online information exchange platform for CRM Information Systems (CRMIS); it uses XML compression and packet prioritization algorithms and is based on open source web technologies. By introducing offline capabilities to web technologies, the framework is able to perform message exchange on unreliable networks. Experiments done in a simulation environment provide promising results on low-bandwidth networks (56 kbps and 28.8 kbps) with up to 50% packet loss, with the solution successfully transferring all the information on these low-quality networks where traditional 2- and 3-tier applications failed.

Keywords: crisis response and management, XML messaging, web services, XML compression, mining

Procedia PDF Downloads 339
2970 Distribution Planning with Renewable Energy Units Based on Improved Honey Bee Mating Optimization

Authors: Noradin Ghadimi, Nima Amjady, Oveis Abedinia, Roza Poursoleiman

Abstract:

This paper proposes an Improved Honey Bee Mating Optimization (IHBMO) for a network-upgrade planning paradigm. The proposed technique is a new meta-heuristic algorithm inspired by the mating of the honey bee. The paradigm is able to select, among several equi-cost choices, the one assuring the optimum in terms of voltage profile, considering various scenarios of DG penetration and load demand. Distributed generation (DG) has created a challenge and an opportunity for developing various novel technologies in power generation. DG provides a multitude of services to utilities and consumers, including standby generation, peak-shaving capacity, and base-load generation. The proposed algorithm is applied to a 30-line, 28-bus power system. The achieved results demonstrate the good efficiency of DG using the proposed technique in different scenarios.

Keywords: distributed generation, IHBMO, renewable energy units, network upgrade

Procedia PDF Downloads 487
2969 Design and Implementation of Security Middleware for Data Warehouse Signature Framework

Authors: Mayada Al Meghari

Abstract:

Recently, grid middlewares have provided large-scale integrated use of network resources, such as shared data and CPUs, to form a virtual supercomputer. In this work, we present the design and implementation of the middleware for the Data Warehouse Signature (DWS) framework. The aim of using the middleware in our DWS framework is to achieve high performance through parallel computing. This middleware is developed on the Alchemi.NET framework to increase security among the network nodes through an authentication and group-key distribution model. This model achieves key security and prevents any intermediate attacks in the middleware. This paper presents the flow process structures of the middleware design. In addition, the paper ensures the implementation of security for the DWS middleware, enhanced with the authentication and group-key distribution model. Finally, from the analysis of other middleware approaches, the developed middleware of the DWS framework is the optimal solution for a complete coverage of security issues.

Keywords: middleware, parallel computing, data warehouse, security, group-key, high performance

Procedia PDF Downloads 119
2968 Personalizing Human Physical Life Routines Recognition over Cloud-based Sensor Data via AI and Machine Learning

Authors: Kaushik Sathupadi, Sandesh Achar

Abstract:

Pervasive computing is a growing research field that aims to recognize human physical life routines (HPLR) based on body-worn sensors such as MEMS-based technologies. The use of these technologies for human activity recognition is progressively increasing. On the other hand, personalizing human life routines using numerous machine-learning techniques has always been an intriguing topic. Various methods have demonstrated the ability to recognize basic movement patterns; however, they still need to be improved to anticipate the dynamics of human living patterns. This study introduces state-of-the-art techniques for recognizing static and dynamic patterns and forecasting those challenging activities from multi-fused sensors. Furthermore, numerous MEMS signals are extracted from one self-annotated IM-WSHA dataset and two benchmarked datasets. First, the acquired raw data is filtered with z-normalization and denoising methods. Then, statistical, local binary pattern, auto-regressive model, and intrinsic time-scale decomposition features are extracted from different domains. Next, the acquired features are optimized using maximum relevance and minimum redundancy (mRMR). Finally, an artificial neural network is applied to analyze the whole system's performance. As a result, we attained a 90.27% recognition rate for the self-annotated dataset, while HARTH and KU-HAR achieved 83% on nine living activities and 90.94% on 18 static and dynamic routines. Thus, the proposed HPLR system outperformed other state-of-the-art systems when evaluated against other methods in the literature.
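A compressed view of the described pipeline (z-normalization, window-level statistical features, mRMR-style selection, then a neural classifier) is sketched below in Python; the windowing scheme, feature set, and selector are simplified assumptions, not the study's implementation.

import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def z_normalize(signal):
    """Zero-mean, unit-variance scaling of one sensor channel."""
    return (signal - signal.mean()) / (signal.std() + 1e-12)

def window_features(window):
    """A few simple statistical features per window (stand-in for the full feature set)."""
    return [window.mean(), window.std(), skew(window), kurtosis(window), np.ptp(window)]

# Synthetic MEMS-like data: 200 windows of 128 samples, two activity classes.
windows = rng.standard_normal((200, 128)) + np.repeat([0.0, 0.8], 100)[:, None]
labels = np.repeat([0, 1], 100)

X = np.array([window_features(z_normalize(w)) for w in windows])

# Mutual-information-based selection as a lightweight stand-in for mRMR.
X_sel = SelectKBest(mutual_info_classif, k=3).fit_transform(X, labels)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_sel, labels)
print("training accuracy:", clf.score(X_sel, labels))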

Keywords: artificial intelligence, machine learning, gait analysis, local binary pattern (LBP), statistical features, micro-electro-mechanical systems (MEMS), maximum relevance and minimum redundancy (mRMR)

Procedia PDF Downloads 20
2967 Data Compression in Ultrasonic Network Communication via Sparse Signal Processing

Authors: Beata Zima, Octavio A. Márquez Reyes, Masoud Mohammadgholiha, Jochen Moll, Luca de Marchi

Abstract:

This document presents an approach using compressed sensing for signal encoding and information transfer within a guided wave sensor network composed of specially designed frequency-steerable acoustic transducers (FSATs). Wave propagation in a damaged plate was simulated using the commercial FEM-based software COMSOL. Guided waves were excited by means of FSATs, characterized by the special shape of their electrodes, and modeled using PIC255 piezoelectric material. The special shape of the FSAT allows wave energy to be focused in a certain direction according to the frequency components of its actuation signal, which makes a larger monitored area available. The process begins when an FSAT detects and records a reflection from damage in the structure; this signal is then encoded and prepared for transmission using a combined approach based on Compressed Sensing Matching Pursuit and Quadrature Amplitude Modulation (QAM). After the signal has been encoded into binary characters, the information is transmitted between the nodes in the network. The message reaches the last node, where it is finally decoded and processed to be used for damage detection and localization purposes. The main aim of the investigation is to determine the location of detected damage using the reconstructed signals. The study demonstrates that the special steerable capabilities of FSATs not only facilitate the detection of damage but also permit transmitting the damage information to a chosen area in a specific direction of the investigated structure.
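To make the encoding chain more concrete, the sketch below performs a small compressed-sensing step with scikit-learn's Orthogonal Matching Pursuit and then maps the quantized sparse coefficients onto 4-QAM (QPSK) symbols; the dictionary, quantization, and modulation details are illustrative assumptions, not the paper's exact codec.

import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n_atoms, sparsity, n_meas = 128, 6, 48

# Random dictionary and a sparse representation of the recorded reflection.
D = rng.standard_normal((n_meas, n_atoms))
coef_true = np.zeros(n_atoms)
coef_true[rng.choice(n_atoms, sparsity, replace=False)] = rng.standard_normal(sparsity)
measurement = D @ coef_true

# Matching-pursuit step: recover a sparse coefficient vector from the measurement.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity).fit(D, measurement)
coef_hat = omp.coef_

# Quantize the nonzero coefficients to 8 bits and unpack them into a bitstream.
nonzero = coef_hat[coef_hat != 0]
scaled = np.clip((nonzero / np.abs(nonzero).max() + 1.0) * 127.5, 0, 255).astype(np.uint8)
bits = np.unpackbits(scaled)

# 4-QAM mapping: each pair of bits becomes one complex symbol.
pairs = bits.reshape(-1, 2)
symbols = (2 * pairs[:, 0] - 1) + 1j * (2 * pairs[:, 1] - 1)
print("bits:", bits.size, "symbols:", symbols.size)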

Keywords: data compression, ultrasonic communication, guided waves, FEM analysis

Procedia PDF Downloads 124
2966 Self in Networks: Public Sphere in the Era of Globalisation

Authors: Sanghamitra Sadhu

Abstract:

A paradigm shift from capitalism to information technology is discerned in the era of globalisation. The idea of the public sphere, which was theorized in terms of its decline in the wake of the rise of commercial mass media, has now emerged as a transnational or global sphere, with the discourse being dominated by the ‘network society’. In other words, the dynamic of globalisation has brought about ‘a spatial turn’ in the social and political sciences which is also manifested in the public sphere, especially the global public sphere. The paper revisits the Habermasian concept of the public sphere and focuses on the various social networking sites with their potential to create a virtual global public sphere. Situating Habermas’s notion of the bourgeois public sphere in the present context of the global public sphere, it considers the changing dimensions of the public sphere across time and examines the concept of the ‘public’ with its shifting transformation from a concrete collective to a fluid ‘imagined’ category. The paper addresses the problematic of multimodal self-portraiture on social networking sites as well as in various online diaries/journals, with an attempt to explore the nuances of the networked self.

Keywords: globalisation, network society, public sphere, self-fashioning, identity, autonomy

Procedia PDF Downloads 416
2965 Blockchain’s Feasibility in Military Data Networks

Authors: Brenden M. Shutt, Lubjana Beshaj, Paul L. Goethals, Ambrose Kam

Abstract:

Communication security is of particular interest to military data networks. A relatively novel approach to network security is blockchain, a cryptographically secured distributed ledger with a decentralized consensus mechanism for data transaction processing. Recent advances in blockchain technology have proposed new techniques for both data validation and trust management, as well as different frameworks for managing dataflow. The purpose of this work is to test the feasibility of different blockchain architectures as applied to military command and control networks. Various architectures are tested through discrete-event simulation, and feasibility is determined based upon a blockchain design’s ability to maintain long-term stable performance at industry standards of throughput, network latency, and security. This work proposes a consortium blockchain architecture with a computationally inexpensive consensus mechanism, one that leverages a Proof-of-Identity (PoI) concept and a reputation management mechanism.

Keywords: blockchain, consensus mechanism, discrete-event simulation, fog computing

Procedia PDF Downloads 138
2964 Design an Architectural Model for Deploying Wireless Sensor Network to Prevent Forest Fire

Authors: Saurabh Shukla, G. N. Pandey

Abstract:

Fires have become the most serious disasters for forest resources and the human environment. In recent years, due to climate change, human activities and other factors, the frequency of forest fires has increased considerably. The monitoring and prevention of forest fires have now become a global concern for forest fire prevention organizations. Currently, methods for forest fire prevention largely consist of patrols and observation from watch towers. Thus, systems such as the deployment of wireless sensor networks to prevent forest fires are being developed to obtain better estimates of temperature and humidity. Nowadays, wireless sensor networks are beginning to be deployed at an accelerated pace. It is not unrealistic to expect that in the coming years the world will be covered with wireless sensor networks. This new technology has unlimited potential and can be used for numerous application areas including environmental, medical, military, transportation, entertainment, crisis management, homeland defense, and smart spaces.

Keywords: deployment, sensors, wireless sensor networks, forest fires

Procedia PDF Downloads 436
2963 Network Impact of a Social Innovation Initiative in Rural Areas of Southern Italy

Authors: A. M. Andriano, M. Lombardi, A. Lopolito, M. Prosperi, A. Stasi, E. Iannuzzi

Abstract:

In accordance with the scientific debate on the definition of Social Innovation (SI), the present paper identifies SI as new ideas (products, services, and models) that simultaneously meet social needs and create new social relationships or collaborations. This concept offers important tools to unravel the difficult condition of the agricultural sector in marginalized areas, characterized by the abandonment of activities, a low level of farmer education, and low generational renewal, hampering new territorial strategies aimed at an integrated and sustainable development. Models of SI in agriculture, starting from a bottom-up approach or from the community, are considered to represent the driving force of an ecological and digital revolution. A system based on SI may be able to grasp and satisfy individual and social needs and to promote new forms of entrepreneurship. In this context, Vazapp ('Go Hoeing') is an emerging SI model in southern Italy that promotes solutions for satisfying the needs of farmers and facilitates their relationships (creation of a network). The Vazapp initiative considered in this study is the 'Contadinners' ('farmers' dinners'), dinners held at a farmer's house where stakeholders living in the surrounding area get to know each other and are able to build a network for possible future professional collaborations. The aim of the paper is to identify the evolution of farmers’ relationships, both quantitatively and qualitatively, as a result of the Contadinners initiative organized by Vazapp. To this end, the study adopts the Social Network Analysis (SNA) methodology, using UCINET (Version 6.667) software to analyze the relational structure. Data collection was realized through a questionnaire distributed to 387 participants in the twenty 'Contadinners' held from February 2016 to June 2018. The response rate to the survey was about 50% of farmers. The data elaboration focused on different aspects: a) the measurement of relational reciprocity among the farmers using the symmetrize method of answers; b) the measurement of answer reliability using the dichotomize method; c) the description of the evolution of social capital using the cohesion method; d) the clustering of the Contadinners' participants into followers and non-followers of Vazapp to evaluate its impact on the local social capital. The results concern the effectiveness of this initiative in generating trustworthy relationships within a rural area of southern Italy typically affected by individualism and mistrust. The number of relationships represents the quantitative indicator used to define the dimension of network development, while the typologies of relationships (from simple friendship to formal collaborations for branding new cooperation initiatives) represent the qualitative indicator that offers a diversified perspective on the network impact. From the analysis carried out, the Vazapp initiative surely represents a virtuous SI model for catalyzing relationships within rural areas and developing entrepreneurship based on the real needs of the community.

Keywords:

Procedia PDF Downloads 111
2962 Interaction between Space Syntax and Agent-Based Approaches for Vehicle Volume Modelling

Authors: Chuan Yang, Jing Bie, Panagiotis Psimoulis, Zhong Wang

Abstract:

Modelling and understanding vehicle volume distribution over the urban network are essential for urban design and transport planning. The space syntax approach has been widely applied as the main conceptual and methodological framework for contemporary vehicle volume models, with the help of the statistical method of multiple regression analysis (MRA). However, the MRA model with space syntax variables shows limitations in predicting vehicle volume when accounting for the crossed effect of urban configurational characters and socio-economic factors. The aim of this paper is to construct models that account for the combined impact of the street network structure and socio-economic factors. In this paper, we present a multilevel linear (ML) and an agent-based (AB) vehicle volume model at an urban scale, interacting with the space syntax theoretical framework. The ML model allows random effects of urban configurational characteristics in different urban contexts, and the AB model was developed by incorporating transformed space syntax components of the MRA models into the agents’ spatial behaviour. The three models were implemented in the same urban environment. The ML model exhibits superiority over the original MRA model in identifying the relative impacts of the configurational characters and macro-scale socio-economic factors that shape vehicle movement distribution over the city. Compared with the ML model, the suggested AB model demonstrated the ability to estimate vehicle volume in the urban network considering the combined effects of configurational characters and land-use patterns at the street segment level.
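A multilevel linear model with random effects across urban contexts, of the kind described, can be fitted in Python with statsmodels; the sketch below uses an invented data frame whose column names (volume, integration, land_use, zone) are assumptions, not variables from the study.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400

# Hypothetical street-segment records grouped by urban zone.
data = pd.DataFrame({
    "zone": rng.integers(0, 8, n),        # grouping factor (urban context)
    "integration": rng.normal(0, 1, n),   # space syntax configurational measure
    "land_use": rng.normal(0, 1, n),      # socio-economic / land-use indicator
})
zone_effect = rng.normal(0, 0.5, 8)[data["zone"]]
data["volume"] = 2.0 + 1.5 * data["integration"] + 0.8 * data["land_use"] + zone_effect + rng.normal(0, 1, n)

# Multilevel (mixed-effects) model: fixed effects plus a random intercept per zone.
model = smf.mixedlm("volume ~ integration + land_use", data, groups=data["zone"]).fit()
print(model.summary())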

Keywords: space syntax, vehicle volume modeling, multilevel model, agent-based model

Procedia PDF Downloads 145
2961 An Analytic Network Process Approach towards Academic Staff Selection

Authors: Nasrullah Khan

Abstract:

Today's business environment is very dynamic, and most organizations are in tough competition for added value and a sustainable hold in the market. To achieve such objectives, organizations must have dynamic and creative people and optimized processes. To get these people, there should be a strong human resource management system in organizations. Multiple approaches have been devised in the literature to hire more job-relevant and more suitable people. This study proposes an ANP (Analytic Network Process) approach to hire faculty members for a university system. The study consists of two parts. In the first part, a thorough literature survey and university interviews are conducted in order to find the common criteria for the selection of academic staff. In the second part, the available candidates are prioritized on the basis of the relative values of these criteria. According to the results, the GRE and foreign language scores, GPA and research paper writing were the most important factors for the selection of academic staff.
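The priority-weighting step at the core of ANP (and AHP) can be illustrated with a single pairwise comparison matrix; the Python sketch below derives criterion weights from the principal eigenvector. The comparison values are invented for illustration, and a full ANP would additionally build a supermatrix over the network of dependencies, which is not shown here.

import numpy as np

# Hypothetical pairwise comparison matrix (Saaty scale) for three criteria:
# GRE/foreign language, GPA, research paper writing (values are illustrative).
A = np.array([
    [1.0, 3.0, 2.0],
    [1/3, 1.0, 1/2],
    [1/2, 2.0, 1.0],
])

# The principal eigenvector gives the local priority weights of the criteria.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()
print("criterion weights:", np.round(w, 3))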

Keywords: creative people, ANP, academic staff, business environment

Procedia PDF Downloads 415
2960 Digital Customer Relationship Management on Service Delivery Performance

Authors: Reuben Kinyuru Njuguna, Martin Mabuya Njuguna

Abstract:

Digital platforms, such as the Internet, and the advent of digital marketing strategies have led to many changes in the marketing of goods and services. These have resulted in improved service quality, enhanced customer relations, productivity gains, marketing transaction cost reductions, improved customer service and flexibility in fulfilling customers’ changing needs and lifestyles. Consequently, the purpose of this study was to determine the effect of digital marketing practices on the financial performance of mobile network operators in the telecommunications industry in Kenya. The objective of the study was to establish the effect of digital customer relationship management strategies on the performance of mobile network operators in Kenya. The study used an explanatory cross-sectional survey research design, while the target population was made up of the 4 major mobile network operators in Kenya, namely Safaricom Limited, Airtel Networks Kenya Limited, Finserve Africa Limited and Telkom Kenya Limited. The sampling strategy was stratified sampling with a sample size of 97 respondents. Digital customer relationship strategies were seen to influence firm performance by enhancing convenience, building trust, encouraging growth in market share through creating sustainable relationships, building commitment with customers, and enhancing customer retention and customer satisfaction. Digital customer relationship management was seen to maximize gross profits by increasing customer satisfaction, loyalty and retention. The study recommended upscaling the use of digital customer relationship management strategies to further enhance firm performance, given their great potential in this regard.

Keywords: customer relationship management, customer service delivery, performance, customer satisfaction

Procedia PDF Downloads 238
2959 Bit Error Rate (BER) Performance of Coherent Homodyne BPSK-OCDMA Network for Multimedia Applications

Authors: Morsy Ahmed Morsy Ismail

Abstract:

In this paper, the structure of a coherent homodyne receiver for a Binary Phase Shift Keying (BPSK) Optical Code Division Multiple Access (OCDMA) network is introduced, based on the Multi-Length Weighted Modified Prime Code (ML-WMPC), for multimedia applications. The Bit Error Rate (BER) of this homodyne detection is evaluated as a function of the number of active users and the signal-to-noise ratio for different code lengths, according to the multimedia application (audio, voice, or video). Besides, a Mach-Zehnder interferometer is used as an external phase modulator in the homodyne detection. Furthermore, Multiple Access Interference (MAI) and the receiver noise in a shot-noise-limited regime are taken into consideration in the BER calculations.
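For orientation, the single-user coherent BPSK baseline (no MAI) follows the standard relation BER = 0.5·erfc(sqrt(Eb/N0)); the short Python sketch below evaluates it over a range of SNR values and serves only as a reference curve, not the paper's ML-WMPC/MAI analysis.

import numpy as np
from scipy.special import erfc

# Single-user coherent BPSK in AWGN: BER = 0.5 * erfc(sqrt(Eb/N0)).
ebn0_db = np.arange(0, 11, 2)
ebn0 = 10.0 ** (ebn0_db / 10.0)
ber = 0.5 * erfc(np.sqrt(ebn0))

for snr_db, p in zip(ebn0_db.tolist(), ber):
    print(f"Eb/N0 = {snr_db:2d} dB  ->  BER = {p:.2e}")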

Keywords: OCDMA networks, bit error rate, multiple access interference, binary phase-shift keying, multimedia

Procedia PDF Downloads 175