Search results for: Neural Networks Model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18792

16602 Comparative Analysis of Data Gathering Protocols with Multiple Mobile Elements for Wireless Sensor Network

Authors: Bhat Geetalaxmi Jairam, D. V. Ashoka

Abstract:

Wireless sensor networks are used in many applications to collect sensed data from different sources. Sensed data have to be delivered through the sensors' wireless interface using multi-hop communication towards the sink. Data collection in wireless sensor networks consumes energy, and energy consumption is the major constraint in WSNs; reducing energy consumption while the amount of generated data increases is a great challenge. In this paper, we implement two data gathering protocols with multiple mobile sinks/elements to collect data from sensor nodes. The first is Energy-Efficient Data Gathering with Tour Length-Constrained Mobile Elements in Wireless Sensor Networks (EEDG), in which mobile sinks use a vehicle routing protocol to collect data. The second is An Intelligent Agent-based Routing Structure for Mobile Sinks in WSNs (IAR), in which mobile sinks use Prim's algorithm to collect data, as sketched below. We implement the concepts common to both protocols, such as deployment of mobile sinks, generation of the visiting schedule, and collection of data from the cluster members. We compare the performance of both protocols using statistics based on performance parameters such as delay, packet drop, packet delivery ratio, energy available, and control overhead. We conclude by showing that EEDG is more efficient than the IAR protocol, but with a few limitations, including unaddressed issues such as redundancy removal, idle listening, and the mobile sink's pause/wait state at a node. In future work, we plan to concentrate on these limitations to develop a new energy-efficient protocol that will help improve the lifetime of the WSN.
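
As an illustration of the IAR routing step, the sketch below builds a minimum spanning tree over cluster-head coordinates with Prim's algorithm, from which a mobile sink's visiting structure could be derived. The coordinates and the Euclidean metric are assumptions for illustration, not the paper's simulation setup.

```python
# Hypothetical sketch of the Prim's-algorithm step an IAR-style protocol
# could use to build a mobile sink's visiting structure over cluster heads.
import heapq
import math

def prim_mst(coords):
    """Return MST edges over nodes given as (x, y) coordinates."""
    n = len(coords)
    dist = lambda a, b: math.hypot(coords[a][0] - coords[b][0],
                                   coords[a][1] - coords[b][1])
    visited = {0}
    edges = []
    heap = [(dist(0, j), 0, j) for j in range(1, n)]
    heapq.heapify(heap)
    while len(visited) < n:
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue                      # stale entry, node already joined
        visited.add(v)
        edges.append((u, v, round(w, 1)))
        for j in range(n):
            if j not in visited:
                heapq.heappush(heap, (dist(v, j), v, j))
    return edges

cluster_heads = [(0, 0), (40, 10), (15, 35), (60, 40), (30, 70)]
print(prim_mst(cluster_heads))  # edges a sink visiting tour can be derived from
```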

Keywords: aggregation, consumption, data gathering, efficiency

Procedia PDF Downloads 476
16601 Mobile Network Users Amidst Ultra-Dense Networks in 5G Using an Improved Coordinated Multipoint (CoMP) Technology

Authors: Johnson O. Adeogo, Ayodele S. Oluwole, O. Akinsanmi, Olawale J. Olaluyi

Abstract:

In 5G networks, very high traffic density in densely populated areas is one of the key requirements. Radiation reduction becomes one of the major concerns for securing the future life of mobile network users in ultra-dense network areas, using an improved coordinated multipoint technology. Coordinated Multi-Point (CoMP) is based on transmission and/or reception at multiple separated points with improved coordination among them to actively manage interference for the users. Small cells have two major objectives. First, they provide good coverage and/or performance: network users can maintain a good-quality signal by connecting directly to the cell. Second, with CoMP, multiple base stations (MBS) cooperate by transmitting and/or receiving at the same time in order to reduce the possibility of an increase in electromagnetic radiation. Therefore, the influence of a screen guard with a rubber condom on mobile transceivers, as one major piece of equipment radiating electromagnetic radiation, was investigated for mobile network users amidst ultra-dense networks in 5G. The results were compared with the same mobile transceivers without screen guards and rubber condoms under the same network conditions. A distance of 5 cm from the mobile transceivers was measured with the help of a ruler, and the intensity of radio frequency (RF) radiation was measured using an RF meter. The results show that the intensity of radiation from the various mobile transceivers without screen guards and condoms was higher than from the mobile transceivers with screen guards and condoms when a call conversation was active at both ends.

Keywords: ultra-dense networks, mobile network users, 5G, coordinated multi-point

Procedia PDF Downloads 78
16600 R Software for Parameter Estimation of Spatio-Temporal Model

Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan

Abstract:

In this paper, we propose an application package to estimate the parameters of spatiotemporal models based on multivariate time series analysis using the open-source R software. We build packages mainly to estimate the parameters of the Generalized Space Time Autoregressive (GSTAR) model. GSTAR is a combination of time series and spatial models whose parameters vary per location. We use the method of Ordinary Least Squares (OLS) and the Mean Absolute Percentage Error (MAPE) to fit the model to real spatiotemporal phenomena; a sketch of the estimation step follows below. As case studies, we use oil production data from the volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. R is user-friendly, makes the calculations easier, and processes the data accurately and quickly. A limitation is that the R scripts built for estimating the parameters of the spatiotemporal GSTAR model are still restricted to stationary time series models. The R programs can therefore be developed further for both theoretical studies and applications.
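
A minimal sketch of the OLS estimation step for a GSTAR(1;1) model, z_t = diag(phi0) z_{t-1} + diag(phi1) W z_{t-1} + e_t, with one parameter pair per location. The paper's package is written in R; this Python version, with an invented weight matrix W and synthetic data, only illustrates the per-location regression and the MAPE fit criterion.

```python
# GSTAR(1;1) estimation by per-location OLS; W and the series are invented.
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 3
W = np.array([[0, .5, .5], [.5, 0, .5], [.5, .5, 0]])   # row-normalised weights
Z = 50 + rng.standard_normal((T, N)).cumsum(axis=0)      # placeholder series

phi = np.zeros((N, 2))
for i in range(N):
    # regressors: own lag and spatial lag (weighted neighbours)
    X = np.column_stack([Z[:-1, i], (Z[:-1] @ W.T)[:, i]])
    y = Z[1:, i]
    phi[i], *_ = np.linalg.lstsq(X, y, rcond=None)

fitted = Z[:-1] * phi[:, 0] + (Z[:-1] @ W.T) * phi[:, 1]
mape = np.mean(np.abs((Z[1:] - fitted) / Z[1:])) * 100   # fit criterion from the paper
print(phi, mape)
```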

Keywords: GSTAR Model, MAPE, OLS method, oil production, R software

Procedia PDF Downloads 226
16599 Forecasting Models for Steel Demand Uncertainty Using Bayesian Methods

Authors: Watcharin Sangma, Onsiri Chanmuang, Pitsanu Tongkhow

Abstract:

A forecasting model for steel demand uncertainty in Thailand is proposed. It consists of trend, autocorrelation, and outlier components in a hierarchical Bayesian framework. The proposed model uses a cumulative Weibull distribution function, latent first-order autocorrelation, and binary selection to account for trend, time-varying autocorrelation, and outliers, respectively. Gibbs-sampling Markov chain Monte Carlo (MCMC) is used for parameter estimation. The proposed model is applied to steel demand index data in Thailand. The root mean square error (RMSE), mean absolute percentage error (MAPE), and mean absolute error (MAE) criteria are used for model comparison (see the sketch below). The study reveals that the proposed model is more appropriate than the exponential smoothing method.
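
For reference, the three comparison criteria named above can be computed as follows; the series are placeholders, not the Thai steel demand index.

```python
# RMSE, MAE, and MAPE on a toy forecast; values are illustrative only.
import numpy as np

def rmse(y, yhat): return float(np.sqrt(np.mean((y - yhat) ** 2)))
def mae(y, yhat):  return float(np.mean(np.abs(y - yhat)))
def mape(y, yhat): return float(np.mean(np.abs((y - yhat) / y)) * 100)

y    = np.array([102.0, 98.5, 110.2, 105.7])   # observed index (placeholder)
yhat = np.array([100.3, 99.1, 108.8, 107.0])   # model forecast (placeholder)
print(rmse(y, yhat), mae(y, yhat), mape(y, yhat))
```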

Keywords: forecasting model, steel demand uncertainty, hierarchical Bayesian framework, exponential smoothing method

Procedia PDF Downloads 338
16598 Developing Fuzzy Logic Model for Reliability Estimation: Case Study

Authors: Soroor K. H. Al-Khafaji, Manal Mohammad Abed

Abstract:

The aim of this paper is to evaluate the reliability of a complex engineering system and to design a fuzzy model for reliability estimation. The designed model has been applied to a vegetable oil purification system (neutralization system) to help the specialist user, based on the concept of FMEA (Failure Mode and Effect Analysis), estimate the reliability of this repairable system in the vegetable oil industry. The fuzzy model has been used to predict the system's reliability over a future time period, depending on a historical database covering the two past years. The model can help to identify the system's malfunctions and to predict its reliability during a future period, with more accurate and reasonable results than those obtained by the traditional method of reliability estimation.

Keywords: fuzzy logic, reliability, repairable systems, FMEA

Procedia PDF Downloads 595
16597 Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments

Authors: Skyler Kim

Abstract:

An early diagnosis of leukemia has always been a challenge for doctors and hematologists. Worldwide, approximately 350,000 new cases were reported in 2012, and diagnosing leukemia remains time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnostic tools has increased and a large volume of high-quality data has been produced, there is an urgent need for more advanced data analysis methods. One of these is the AI approach, which has become a major trend in recent years, and several research groups have been working on developing such diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger datasets remains complex. Leukemia is a major hematological malignancy that results in mortality and morbidity across different ages. We selected acute lymphocytic leukemia to develop our diagnostic system, since it is the most common type of leukemia, accounting for 74% of all children diagnosed with leukemia; the results from this development work can be applied to all other types of leukemia. To develop our model, the Kaggle dataset was used, which consists of 15135 total images; 8491 of these are images of abnormal cells, and 5398 images are normal. In this paper, we design and implement a leukemia diagnostic system in a real clinical environment based on deep learning approaches with larger datasets. The proposed diagnostic system detects and classifies leukemia. Unlike other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50. Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture employing transfer learning techniques to extract features from each input image; a sketch of this fusion appears below. In our approach, the features fused from specific abstraction layers can be treated as auxiliary features and lead to a further improvement in classification accuracy. Features extracted from the lower levels are combined into higher-dimensional feature maps to help improve the discriminative capability of intermediate features and also to overcome the problem of vanishing or exploding network gradients. By comparing VGG19, ResNet50, and the proposed hybrid model, we concluded that the hybrid model has a significant advantage in accuracy. The detailed results of each model's performance and their pros and cons will be presented at the conference.
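
A minimal Keras sketch of the fusion idea, assuming a 224x224 input and an invented classifier head: features from pretrained VGG19 and ResNet50 backbones are concatenated and fed to a binary classifier. This illustrates the technique, not the authors' exact architecture, and per-backbone preprocessing is omitted for brevity.

```python
# Transfer-learning feature fusion: frozen VGG19 + ResNet50, concatenated.
import tensorflow as tf

inp = tf.keras.Input(shape=(224, 224, 3))
vgg = tf.keras.applications.VGG19(include_top=False, weights="imagenet",
                                  pooling="avg")
res = tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                     pooling="avg")
vgg.trainable = res.trainable = False      # freeze backbones for transfer learning

fused = tf.keras.layers.Concatenate()([vgg(inp), res(inp)])   # 512 + 2048 features
x = tf.keras.layers.Dense(256, activation="relu")(fused)      # assumed head size
out = tf.keras.layers.Dense(1, activation="sigmoid")(x)       # abnormal vs. normal cell

model = tf.keras.Model(inp, out)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```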

Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning

Procedia PDF Downloads 176
16596 Developing a Systems Dynamics Model for Security Management

Authors: Kuan-Chou Chen

Abstract:

This paper demonstrates a simulation model of an information security system using the system dynamics approach. The relationships in the system model are designed to be simple and functional and do not necessarily represent any particular information security environment. The paper aims to develop a generic system dynamics information security model with implications for information security research. The interrelated and interdependent relationships of five primary sectors in the system dynamics model are presented. The integrated information security systems model includes (1) information security characteristics, (2) users, (3) technology, (4) business functions, and (5) policy and management. Environments, attacks, government, and social culture are defined as the external sector. The interactions within each of these sectors are depicted by system loop maps as well. The proposed system dynamics model will not only provide a conceptual framework for information security analysts and designers but also allow information security managers to remove the incongruity between the management of risk incidents and the management of knowledge, and it will give information security managers and decision makers a foundation for managerial actions and policy decisions.

Keywords: systems thinking, information security systems, security management, simulation

Procedia PDF Downloads 408
16595 Effectiveness of New Digital Tools on Implementing Quality Management System: An Exploratory Study of French Companies

Authors: Takwa Belwakess

Abstract:

With the wave of digitization that is sweeping the modern world, communication tools have taken their place in the world of business. For organizations, being part of the digital era necessarily involves an evolution of management style, mainly in process management, also known as the quality management system (QMS). For more than 50 years, quality management standards have been adopted by organizations to demonstrate their operational and financial performance. We believe that achieving a high level of communication can lead to better quality management and greater customer satisfaction, which is essential to ensure long-term competitiveness. In this paper, a questionnaire survey was developed to investigate the use of collaboration tools such as content management systems and social networks. Data from more than 100 companies based in France were analyzed; the results show that adopting new digital communication tools while applying quality management practices over a reasonable period contributed to a better implementation of the QMS and better business performance.

Keywords: communication tools, content management system, digital, effectiveness, French companies, quality management system, quality management practices, social networks

Procedia PDF Downloads 247
16594 A Sociocybernetics Data Analysis Using Causality in Tourism Networks

Authors: M. Lloret-Climent, J. Nescolarde-Selva

Abstract:

The aim of this paper is to propose a mathematical model to determine invariant sets, set covering, orbits and, in particular, attractors in the set of tourism variables. Analysis was carried out based on a pre-designed algorithm and by applying our interpretation of chaos theory developed in the context of General Systems Theory. This article sets out the causal relationships associated with tourist flows in order to enable the formulation of appropriate strategies. Our results can be applied to numerous cases. For example, in the analysis of tourist flows, these findings can be used to determine whether the behaviour of certain groups affects that of other groups and to analyse tourist behaviour in terms of the most relevant variables. Unlike statistical analyses that merely provide information on current data, our method uses orbit analysis to forecast, if attractors are found, the behaviour of the tourist variables in the immediate future; the toy example below illustrates the idea.
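
A toy illustration of orbit analysis of the kind described above: iterate a transition map over qualitative states until the orbit closes, and report the closing cycle as the attractor. The map itself is invented, not derived from the paper's tourism variables.

```python
# Orbit and attractor detection by iterating a (toy) transition map.
def orbit(step, start, max_iter=100):
    seen, path = {}, []
    state = start
    for i in range(max_iter):
        if state in seen:                    # first repeat closes the cycle
            return path, path[seen[state]:]  # full orbit, attractor cycle
        seen[state] = i
        path.append(state)
        state = step(state)
    return path, None                        # no cycle found within the budget

toy_map = {"low": "medium", "medium": "high", "high": "medium"}
print(orbit(toy_map.get, "low"))
# orbit ['low', 'medium', 'high'], attractor ['medium', 'high']
```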

Keywords: attractor, invariant set, tourist flows, orbits, social responsibility, tourism, tourist variables

Procedia PDF Downloads 497
16593 Location Quotients Model in Turkey’s Provinces and NUTS II Regions

Authors: Semih Sözer

Abstract:

One of the most common issues in economic systems is understanding the characteristics of economic activities in cities and regions. Although economic base models face conceptual and empirical criticism, they are useful tools for examining the economic structure of a nation, its regions, or its cities. This paper uses one of the methodologies of economic base models, namely the location quotients model (illustrated below). Data for the model comprise employment numbers for provinces and NUTS II regions in Turkey, with a time series covering the years 1990, 2000, 2003, and 2009. The aim of this study is to find which sectors are export-base and which are import-base in the provinces and regions. The model results show that large provinces or powerful regions (in population, size, etc.) mostly have basic sectors in their economic systems. However, the results also reveal interesting patterns in different sectors across provinces and regions.
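
The location quotient underlying the model is LQ = (e_i/e) / (E_i/E), a sector's regional employment share over its national share, with LQ > 1 marking an export-base (basic) sector. A minimal sketch with made-up figures:

```python
# Location quotients per sector; LQ > 1 flags export-base sectors.
def location_quotient(region_emp, national_emp):
    e_r, E = sum(region_emp.values()), sum(national_emp.values())
    return {s: (region_emp[s] / e_r) / (national_emp[s] / E)
            for s in region_emp}

region   = {"manufacturing": 120_000, "agriculture": 30_000, "services": 250_000}
national = {"manufacturing": 4_000_000, "agriculture": 2_500_000,
            "services": 12_000_000}

lq = location_quotient(region, national)
basic = [s for s, q in lq.items() if q > 1]   # export-base sectors
print(lq, basic)
```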

Keywords: economic base, location quotients model, regional economics, regional development

Procedia PDF Downloads 414
16592 Local Directional Encoded Derivative Binary Pattern Based Coral Image Classification Using Weighted Distance Gray Wolf Optimization Algorithm

Authors: Annalakshmi G., Sakthivel Murugan S.

Abstract:

This paper presents a local directional encoded derivative binary pattern (LDEDBP) feature extraction method that can be applied to the classification of submarine coral reef images. The classification of coral reef images using texture features is difficult due to the dissimilarities within class samples. In coral reef image classification, texture features are extracted using the proposed LDEDBP method. The proposed approach extracts the complete structural arrangement of the local region using the local binary pattern (LBP) and also extracts edge information using the local directional pattern (LDP) from the edge response available in a particular region, thereby achieving an extra discriminative feature value. Typically, the LDP extracts the edge details in all eight directions. Integrating the edge responses with the local binary pattern yields a more robust texture descriptor than the other descriptors used in texture feature extraction. Finally, the proposed features are fed to an extreme learning machine (ELM), in which the input weights and biases of the single-hidden-layer feed-forward neural network (SLFN) are optimized by a meta-heuristic algorithm, the weighted distance grey wolf optimizer (GWO); a bare-bones ELM is sketched below. In the empirical results, ELM-WDGWO demonstrated better performance in terms of accuracy on all coral datasets, namely RSMAS, EILAT, EILAT2, and MLC, compared with other state-of-the-art algorithms. The proposed method achieves the highest overall classification accuracy of 94% compared to the other state-of-the-art methods.
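
A bare-bones extreme learning machine for context: input weights and biases of the single hidden layer are set randomly here (these are the quantities the weighted-distance GWO would tune in the paper), and only the output weights are solved in closed form with a pseudo-inverse. The feature matrix is a random stand-in for LDEDBP features.

```python
# Minimal ELM: random hidden layer, closed-form output weights.
import numpy as np

rng = np.random.default_rng(1)

def elm_fit(X, T, n_hidden=50):
    W = rng.standard_normal((X.shape[1], n_hidden))   # GWO would tune these
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                            # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                      # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

X = rng.standard_normal((200, 16))                    # stand-in for LDEDBP features
T = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)
W, b, beta = elm_fit(X, T)
acc = np.mean((elm_predict(X, W, b, beta) > 0.5) == T)
print(acc)
```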

Keywords: feature extraction, local directional pattern, ELM classifier, GWO optimization

Procedia PDF Downloads 147
16591 Media Richness Perspective on Web 2.0 Usage for Knowledge Creation: The Case of the Cocoa Industry in Ghana

Authors: Albert Gyamfi

Abstract:

Cocoa plays a critical role in the socio-economic development of Ghana. Meanwhile, smallholder farmers, most of whom are illiterate, dominate the industry. According to the cocoa-based agricultural knowledge and information system (AKIS) model, knowledge is created and transferred to the industry between three key actors: cocoa researchers, extension experts, and cocoa farmers. Drawing on the SECI model, media richness theory (MRT), and the AKIS model, a conceptual model of a web 2.0-based AKIS (AKIS 2.0) is developed and used to assess the possible effects of social media usage on knowledge creation in the Ghanaian cocoa industry. A mixed-methods approach with a survey questionnaire was employed, and a second-order multi-group structural equation model (SEM) was used to analyze the data. The study concludes that the use of web 2.0 applications for knowledge creation would lead to sustainable interactions among the key knowledge actors for effective knowledge creation in the cocoa industry in Ghana.

Keywords: agriculture, cocoa, knowledge, media, web 2.0

Procedia PDF Downloads 314
16590 Vehicle Routing Problem Considering Alternative Roads under Triple Bottom Line Accounting

Authors: Onur Kaya, Ilknur Tukenmez

Abstract:

In this study, we consider vehicle routing problems on networks with alternative direct links between nodes, and we analyze a multi-objective problem covering financial, environmental, and social objectives in this context. In real life, several alternative direct roads may exist between two nodes, and these roads might differ in length and duration. For example, a road might be shorter than another but require more time due to traffic and speed limits. Similarly, some toll roads might be shorter or faster but require additional payment, leading to higher costs. We consider such alternative links in our problem and develop a mixed integer linear programming model that determines which alternative link to use between two nodes, in addition to determining the optimal routes for the different vehicles, depending on the model objectives and constraints (a toy version of the link-choice step is sketched below). We consider minimum-cost routing as the financial objective for the company, minimizing CO2 emissions and fuel usage as the environmental objectives, and optimizing drivers' working conditions/hours and minimizing accident risks as the social objectives. With these objective functions, we aim to determine which routes and which alternative links should be used, in addition to the speed choice on each link. We discuss the results of the developed vehicle routing models and compare them across different system parameters.
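
A toy PuLP sketch of the alternative-link choice alone, assuming a fixed route: exactly one of the parallel roads is selected per leg, minimizing a weighted sum of cost and CO2 subject to a driver-hours budget. All data, weights, and the solver setup are invented; the paper's full MILP also determines the vehicle routes themselves.

```python
# Alternative-link selection as a tiny MILP (invented data).
import pulp

# (leg, alternative) -> (cost, co2_kg, hours)
links = {("AB", "toll"): (30, 8, 1.0), ("AB", "free"): (12, 11, 1.8),
         ("BC", "short"): (20, 9, 1.2), ("BC", "fast"): (26, 12, 0.9)}
legs = {"AB", "BC"}

prob = pulp.LpProblem("alt_link_choice", pulp.LpMinimize)
x = pulp.LpVariable.dicts("use", list(links), cat="Binary")

# objective: cost plus an assumed 2.0 weight per kg of CO2
prob += pulp.lpSum(x[k] * (links[k][0] + 2.0 * links[k][1]) for k in links)
for leg in legs:                       # pick exactly one alternative per leg
    prob += pulp.lpSum(x[k] for k in links if k[0] == leg) == 1
prob += pulp.lpSum(x[k] * links[k][2] for k in links) <= 2.5   # driver-hours cap

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([k for k in links if x[k].value() == 1])   # e.g. toll + short under the cap
```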

Keywords: vehicle routing, alternative links between nodes, mixed integer linear programming, triple bottom line accounting

Procedia PDF Downloads 391
16589 NFResNet: Multi-Scale and U-Shaped Networks for Deblurring

Authors: Tanish Mittal, Preyansh Agrawal, Esha Pahwa, Aarya Makwana

Abstract:

Multi-scale and U-shaped networks are widely used in various image restoration problems, including deblurring. Keeping in mind the wide range of applications, we present a comparison of these architectures and their effects on image deblurring. We also introduce a new block called NFResBlock, which consists of a Fast Fourier Transform layer and a series of modified Nonlinear Activation Free (NAF) blocks. Based on these architectures and additions, we introduce NFResNet and NFResNet+, which are modified multi-scale and U-Net architectures, respectively. We use three different loss functions to train these architectures: Charbonnier loss, edge loss, and frequency reconstruction loss (the first and last are sketched below). Extensive experiments on the Deep Video Deblurring dataset, along with ablation studies for each component, are presented in this paper. The proposed architectures achieve a considerable increase in Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM) values.
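
PyTorch sketches of two of the three losses, under assumed settings: Charbonnier loss is a smooth L1, and a frequency reconstruction term can compare FFT spectra. The epsilon and the exact frequency formulation are assumptions, not necessarily the authors' settings.

```python
# Charbonnier and a frequency-reconstruction loss on placeholder batches.
import torch

def charbonnier_loss(pred, target, eps=1e-3):
    # smooth L1: sqrt((x - y)^2 + eps^2)
    return torch.mean(torch.sqrt((pred - target) ** 2 + eps ** 2))

def frequency_reconstruction_loss(pred, target):
    # L1 distance between 2D FFT spectra
    return torch.mean(torch.abs(torch.fft.fft2(pred) - torch.fft.fft2(target)))

x = torch.rand(2, 3, 64, 64)   # restored batch (placeholder)
y = torch.rand(2, 3, 64, 64)   # sharp ground truth (placeholder)
print(charbonnier_loss(x, y).item(),
      frequency_reconstruction_loss(x, y).item())
```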

Keywords: multi-scale, U-Net, deblurring, FFT, ResBlock, NAFBlock, NFResNet, Charbonnier, edge, frequency reconstruction

Procedia PDF Downloads 109
16588 Deep Learning Application for Object Image Recognition and Robot Automatic Grasping

Authors: Shiuh-Jer Huang, Chen-Zon Yan, C. K. Huang, Chun-Chien Ting

Abstract:

Since vision systems are in intense demand for autonomous applications in industrial environments, image recognition has become an important research topic. Here, a deep learning algorithm is employed in an image system to recognize industrial objects, integrated with a 7A6-series manipulator for automatic object-gripping tasks. A PC and a graphics processing unit (GPU) were chosen to construct the 3D vision recognition system. A depth camera (Intel RealSense SR300) is employed to extract images for object recognition and coordinate derivation. The YOLOv2 scheme is adopted as the convolutional neural network (CNN) structure for object classification and center-point prediction. Additionally, an image processing strategy is used to find the object contour for calculating the object's orientation angle. The specified object location and orientation information are then sent to the robotic controller (the back-projection step is sketched below). Finally, a six-axis manipulator can grasp the specified object in a random environment based on the user command and the extracted image information. The experimental results show that YOLOv2 successfully detects the object location and category with confidence near 0.9 and a 3D position error of less than 0.4 mm. This is useful for future intelligent robotic applications in Industry 4.0 environments.
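
The coordinate-derivation step can be illustrated with the standard pinhole back-projection: a detected pixel centre plus the depth reading gives camera-frame XYZ. The intrinsics below are placeholders, not the SR300's calibrated values, and this is a generic sketch rather than the authors' exact pipeline.

```python
# Back-project a detection's pixel centre and depth to 3D camera coordinates.
def pixel_to_camera(u, v, depth_m, fx, fy, cx, cy):
    """Pinhole model: pixel (u, v) with depth (metres) -> camera-frame XYZ."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return x, y, depth_m

# e.g. a YOLOv2 box centre at (410, 260) with a 0.62 m depth reading
print(pixel_to_camera(410, 260, 0.62, fx=615.0, fy=615.0, cx=320.0, cy=240.0))
```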

Keywords: deep learning, image processing, convolutional neural network, YOLOv2, 7A6 series manipulator

Procedia PDF Downloads 222
16587 Predicting Trapezoidal Weir Discharge Coefficient Using Evolutionary Algorithm

Authors: K. Roushanger, A. Soleymanzadeh

Abstract:

Weirs are structures often used in irrigation, sewer networks, and flood protection. However, the hydraulic behavior of side weirs is complex and difficult to predict accurately. An accurate flow prediction over a weir depends mainly on a proper estimation of the discharge coefficient. In this study, the Gene Expression Programming (GEP) approach was used to predict the discharge coefficients of trapezoidal and rectangular sharp-crested side weirs; a stand-in sketch of the symbolic regression step is given below. Three different performance indices were used as comparison criteria for the evaluation of the models' performance. The obtained results confirm the capability of GEP in predicting the discharge coefficients of trapezoidal and rectangular side weirs. The results also reveal the influence of the downstream Froude number for the trapezoidal weir, and of the upstream Froude number for the rectangular weir, on the prediction of the discharge coefficient for both side weirs.
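
As an accessible stand-in for GEP, the sketch below uses gplearn's tree-based genetic programming (a close relative of gene expression programming, not GEP itself) to evolve a formula for the discharge coefficient from dimensionless inputs. The training data are synthetic placeholders, not the study's measurements.

```python
# Symbolic regression of Cd on [Froude number, head ratio] (fabricated data).
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(0)
X = rng.uniform([0.1, 0.2], [0.9, 1.0], size=(300, 2))   # [Froude, head ratio]
y = 0.6 - 0.15 * X[:, 0] + 0.05 * X[:, 1]                # invented Cd relation

gp = SymbolicRegressor(population_size=500, generations=15,
                       function_set=("add", "sub", "mul", "div"),
                       random_state=0)
gp.fit(X, y)
print(gp._program)     # the evolved symbolic expression for Cd
```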

Keywords: discharge coefficient, gene expression programming, trapezoidal weir

Procedia PDF Downloads 375
16586 Requirement Engineering for Intrusion Detection Systems in Wireless Sensor Networks

Authors: Afnan Al-Romi, Iman Al-Momani

Abstract:

Applying Software Engineering (SE) processes is of vital importance and a key feature in critical, complex, large-scale systems, for example, safety systems, security service systems, and network systems. Inevitably, such systems carry risks, such as system vulnerabilities and security threats. The probability of those risks increases in unsecured environments, such as wireless networks in general and Wireless Sensor Networks (WSNs) in particular. A WSN is a self-organizing network of sensor nodes connected by wireless links. WSNs consist of hundreds to thousands of low-power, low-cost, multi-function sensor nodes that are small in size and communicate over short ranges. The distribution of sensor nodes in an open environment that may be unattended, in addition to resource constraints in processing, storage, and power, places such networks under stringent limitations on, for example, lifetime (i.e., period of operation) and security. The importance of WSN applications in many military and civilian domains has drawn the attention of many researchers to WSN security. To address this important issue and overcome one of the main challenges of WSNs, security solutions have been developed by researchers in the form of software-based network Intrusion Detection Systems (IDSs). However, those IDSs have proven neither secure enough nor accurate enough to detect all malicious behaviours of attacks. The problem is the lack of coverage of all malicious behaviours in the proposed IDSs, leading to unpleasant results such as delays in the detection process, low detection accuracy, or, even worse, detection failure, as illustrated in previous studies. Another problem is the energy that IDSs consume in WSNs. In other words, not all requirements are implemented and then traced; moreover, not all requirements are identified or satisfied, as some requirements have been compromised. The drawbacks of current IDSs are due to researchers and developers not following structured software development processes when developing them, resulting in inadequate requirements management and inadequate validation and verification of requirements quality. Unfortunately, the WSN and SE research communities have been mostly impermeable to each other. Integrating SE and WSNs is a subject that will expand as technology evolves and spreads in industrial applications. Therefore, this paper studies the importance of Requirements Engineering when developing IDSs. It also studies a set of existing IDSs and illustrates the absence of Requirements Engineering and its effects. Conclusions are then drawn regarding the application of requirements engineering to systems to deliver the required functionality, with respect to operational constraints, at an acceptable level of performance, accuracy, and reliability.

Keywords: software engineering, requirement engineering, Intrusion Detection System, IDS, Wireless Sensor Networks, WSN

Procedia PDF Downloads 309
16585 Modeling Heat-Related Mortality Based on Greenhouse Emissions in OECD Countries

Authors: Anderson Ngowa Chembe, John Olukuru

Abstract:

Greenhouse gas emissions from human activities are known to irreversibly increase global temperatures through the greenhouse effect. This study proposes a mortality model with sensitivity to heat-change effects as one of its underlying parameters. The study sought to establish the relationship between greenhouse emissions and mortality indices in five OECD countries (USA, UK, Japan, Canada, and Germany). Upon establishing the relationship using correlation analysis, an additional parameter accounting for the sensitivity of mortality rates to heat changes was incorporated in the Lee-Carter model (sketched below). Based on the proposed model, new parameter estimates were calculated using iterative optimization algorithms. Finally, the goodness of fit of the original Lee-Carter model and the proposed model was compared using deviance. The proposed model provides a better fit to mortality rates, especially in the USA, UK, and Germany, where the mortality indices have a strong positive correlation with the level of greenhouse emissions. The results of this study are of particular importance to actuaries, demographers, and climate-risk experts who seek better mortality-modeling techniques in the wake of heat effects caused by increased greenhouse emissions.
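
A compact numpy sketch of the classical Lee-Carter decomposition, ln m(x,t) = a_x + b_x k_t, fitted by SVD; where the proposed extension would enter is indicated in a comment. The mortality matrix is a random placeholder, and the extension's exact form is an assumption.

```python
# Lee-Carter fit via SVD on a placeholder log-mortality matrix (ages x years).
import numpy as np

rng = np.random.default_rng(0)
log_m = rng.normal(-4.0, 0.3, size=(20, 40))       # stand-in for ln m(x, t)

a = log_m.mean(axis=1)                              # a_x: average age profile
U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
b = U[:, 0] / U[:, 0].sum()                         # b_x, normalised to sum to 1
k = s[0] * Vt[0] * U[:, 0].sum()                    # k_t: period mortality index

# The proposed extension would add a term such as g_x * e_t, with e_t the
# greenhouse-emission level, giving ln m = a_x + b_x k_t + g_x e_t (assumed form).
recon = a[:, None] + np.outer(b, k)
print(np.mean((log_m - recon) ** 2))                # first-order fit error
```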

Keywords: climate risk, greenhouse emissions, Lee-Carter model, OECD

Procedia PDF Downloads 325
16584 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia

Authors: Carol Anne Hargreaves

Abstract:

A key issue in stock investment is how to select representative features for stock selection. The first objective of this paper is to determine whether an automated stock investment system using machine learning techniques can identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock's price is likely to go up, along with the most important factors and their contribution to predicting that likelihood. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks likely to go up in price (portfolio 1). Next, principal component analysis was used to select stocks rated high on components one and two (portfolio 2). Thirdly, a supervised machine learning technique, logistic regression, was used to select stocks with a high probability of their price going up (portfolio 3); the three routes are sketched below. The predictive models were validated with metrics such as sensitivity (recall), specificity, and overall accuracy, and all accuracy measures were above 70%. All portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three portfolios and traded in the market for one month, after which each portfolio's return was computed and compared with the stock market index return. The returns were 23.87% for the principal component analysis portfolio, 11.65% for the logistic regression portfolio, and 8.88% for the k-means cluster portfolio, while the stock market returned 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top-performing stock portfolios that outperform the stock market.
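
A scikit-learn sketch of the three selection routes, on synthetic features standing in for technical indicators such as momentum or moving-average ratios; thresholds and portfolio sizes are illustrative assumptions.

```python
# Three stock-selection routes: k-means cluster, PCA scores, logistic regression.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 8))                       # 500 stocks x 8 features
up = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, .5, 500) > 0).astype(int)

clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
scores = PCA(n_components=2).fit_transform(X)           # component 1 and 2 ratings
proba = LogisticRegression(max_iter=1000).fit(X, up).predict_proba(X)[:, 1]

rising_cluster = np.bincount(clusters[up == 1]).argmax()
portfolio1 = np.where(clusters == rising_cluster)[0]    # cluster likely to rise
portfolio2 = np.argsort(scores[:, 0] + scores[:, 1])[-3:]   # top PCA-rated
portfolio3 = np.argsort(proba)[-3:]                         # highest P(price up)
print(len(portfolio1), portfolio2, portfolio3)
```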

Keywords: machine learning, stock market trading, logistic regression, cluster analysis, factor analysis, decision trees, neural networks, automated stock investment system

Procedia PDF Downloads 140
16583 Design of a Channel Non-Persistent CSMA MAC Protocol Model for Complex Wireless Systems Based on SoC

Authors: Ibrahim A. Aref, Tarek El-Mihoub, Khadiga Ben Musa

Abstract:

This paper presents a Carrier Sense Multiple Access (CSMA) communication model based on an SoC design methodology. Such a model can be used to support the modelling of complex wireless communication systems; the use of such a communication model is therefore an important technique in the construction of high-performance communication. SystemC was chosen because it provides a homogeneous design flow for complex designs (i.e., SoC and IP-based design). We use a swarm system to validate the designed CSMA model and to show the advantages of incorporating communication early in the design process. The wireless communication is created through the modelling of the CSMA protocol, which can be used to achieve communication between all the agents and to coordinate access to the shared medium (channel); the channel-access rule is sketched below.
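
The channel-access rule of non-persistent CSMA, shown as a plain-Python toy rather than the paper's SystemC model: if the channel is sensed busy, the station reschedules sensing after a random backoff instead of listening continuously. The slot structure and constants are invented.

```python
# Non-persistent CSMA over a slotted toy channel (True = busy slot).
import random

def nonpersistent_csma(channel, max_backoff=8):
    """Return the slot at which transmission starts, or None."""
    t = 0
    while t < len(channel):
        if not channel[t]:
            return t                          # idle: transmit immediately
        t += random.randint(1, max_backoff)   # busy: back off, don't persist
    return None

random.seed(3)
channel = [True] * 10 + [False] * 5
print(nonpersistent_csma(channel))
```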

Keywords: SystemC, modelling, simulation, CSMA

Procedia PDF Downloads 408
16582 A Deep Learning Based Integrated Model for Spatial Flood Prediction

Authors: Vinayaka Gude Divya Sampath

Abstract:

This research introduces an integrated prediction model to assess the susceptibility of roads in a future flooding event. The model consists of a deep learning algorithm for forecasting gauge-height data and the Flood Inundation Mapper (FIM) for spatial flooding. An optimal architecture for a long short-term memory (LSTM) network was identified for the gauge located on the Tangipahoa River at Robert, LA. Dropout was applied to the model to evaluate the uncertainty associated with the predictions (see the sketch below). The estimates are then used along with FIM to identify the spatial flooding. Further geoprocessing in ArcGIS provides the susceptibility values for different roads. The model was validated against the devastating flood of August 2016. The paper discusses the challenges of generalizing the methodology to other locations and to various types of flooding. The developed model can be used by transportation departments and other emergency response organizations for effective disaster management.
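
A Keras sketch of the forecasting component under assumed window and layer sizes: an LSTM over past gauge heights, with dropout kept active at prediction time (Monte Carlo dropout) so repeated forward passes expose the prediction spread.

```python
# LSTM gauge-height forecaster with Monte Carlo dropout for uncertainty.
import numpy as np
import tensorflow as tf

window = 24                               # assumed look-back length
model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1),             # next gauge height
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(1, window, 1).astype("float32")    # placeholder history
samples = np.stack([model(x, training=True).numpy()   # training=True keeps dropout on
                    for _ in range(50)])
print(samples.mean(), samples.std())                  # forecast and its spread
```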

Keywords: deep learning, disaster management, flood prediction, urban flooding

Procedia PDF Downloads 128
16581 Analysis of Nanoscale Materials and Devices for Future Communication and Telecom Networks in the Gas Refinery

Authors: Mohamad Bagher Heidari, Hefzollah Mohammadian

Abstract:

New discoveries in materials on the nanometer length scale are expected to play an important role in addressing ongoing and future challenges in the field of communication. Devices and systems for ultra-high-speed short- and long-range communication links, portable and power-efficient computing devices, high-density memory and logic, ultra-fast interconnects, and autonomous and robust energy-scavenging devices for accessing ambient intelligence and needed information will depend critically on the success of next-generation emerging nanomaterials and devices. This article presents some exciting recent developments in nanomaterials that have the potential to play a critical role in the development and transformation of future intelligent communication and telecom networks in the gas refinery. The industry is benefiting from nanotechnology advances, with numerous applications including smarter sensors, logic elements, computer chips, memory storage devices, and optoelectronics.

Keywords: nanomaterials, intelligent communication, nanoscale, nanophotonics, telecom

Procedia PDF Downloads 317
16580 Model of Transhipment and Routing Applied to the Cargo Sector in Small and Medium Enterprises of Bogotá, Colombia

Authors: Oscar Javier Herrera Ochoa, Ivan Dario Romero Fonseca

Abstract:

This paper presents the design of a model for planning a distribution logistics operation. The significance of this work lies in its applicability to the analysis of small and medium enterprises (SMEs) of dry freight in Bogotá. The implementation consists of two stages: in the first, optimal planning is achieved through a hybrid model developed with mixed integer programming, which treats the transshipment operation, based on a combined load-allocation model, as a classic transshipment model; the second is the specific routing of that operation through the Clarke and Wright savings heuristic (sketched below). As a result, an integral model is obtained to carry out the step-by-step planning of the distribution of dry freight for SMEs in Bogotá. In this manner, optimal assignments to transshipment centers are established, with the purpose of determining the specific routing based on the shortest distance traveled.
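
The savings computation at the heart of the Clarke and Wright heuristic, s(i,j) = d(0,i) + d(0,j) - d(i,j) for depot 0, ranked in descending order to decide which customer pairs to merge onto one route. Distances below are invented:

```python
# Clarke-Wright savings list over a toy symmetric distance table (depot = 0).
import itertools

d = {(0, 1): 10, (0, 2): 12, (0, 3): 9,
     (1, 2): 7,  (1, 3): 14, (2, 3): 6}
dist = lambda i, j: d[(min(i, j), max(i, j))]

savings = sorted(
    ((dist(0, i) + dist(0, j) - dist(i, j), i, j)
     for i, j in itertools.combinations([1, 2, 3], 2)),
    reverse=True)
print(savings)   # merge pairs in this order, subject to capacity checks
```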

Keywords: transshipment model, mixed integer programming, savings algorithm, dry freight transportation

Procedia PDF Downloads 207
16579 A Model for Predicting Organic Compounds Concentration Change in Water Associated with Horizontal Hydraulic Fracturing

Authors: Ma Lanting, S. Eguilior, A. Hurtado, Juan F. Llamas Borrajo

Abstract:

Horizontal hydraulic fracturing is a technology used to increase natural gas flow and improve productivity in low-permeability formations. During the drilling operation, tons of flowback and produced water, which contain many organic compounds, return to the surface, with a potential risk to the surrounding environment and human health. A mathematical model is urgently needed to represent the transport behavior of organic compounds in water and their concentration change over time throughout the life cycle of a hydraulic fracturing operation. A comprehensive model combining an organic matter transport dynamic model with a two-compartment first-order rate constant (TFRC) model has been established to quantify the organic compound concentrations. The model is composed of two transport parts separated in time. For the fast part, a curve-fitting technique is applied using flowback water data from Marcellus shale gas site fracturing (see the sketch below), and the coefficients of determination (R2) for all analyzed compounds demonstrate the numerical model's experimental feasibility. Furthermore, over a decade of drilling, the concentration-ratio curves were estimated with the slow part of the model. The results show that the larger a chemical's Koc value, the later its maximum concentration in water is reached, and that all maximum concentrations would reach up to 90% of the initial concentration from the shale formation over a sufficiently long period.
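
A curve-fitting sketch for the fast part, assuming a two-compartment first-order form C(t) = A e^(-k1 t) + B e^(-k2 t) and synthetic stand-ins for the Marcellus flowback data; the functional form is our reading of the TFRC model, not a quotation of it.

```python
# Fit a two-compartment first-order decay to synthetic flowback concentrations.
import numpy as np
from scipy.optimize import curve_fit

def two_compartment(t, A, k1, B, k2):
    return A * np.exp(-k1 * t) + B * np.exp(-k2 * t)

t = np.linspace(0, 30, 15)                                 # days
c_obs = two_compartment(t, 80, 0.5, 20, 0.05) \
        + np.random.default_rng(0).normal(0, 1, t.size)    # fabricated data

popt, _ = curve_fit(two_compartment, t, c_obs, p0=[50, 0.3, 10, 0.01])
resid = c_obs - two_compartment(t, *popt)
r2 = 1 - resid @ resid / ((c_obs - c_obs.mean()) @ (c_obs - c_obs.mean()))
print(popt, r2)     # fitted A, k1, B, k2 and the coefficient of determination
```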

Keywords: model, shale gas, concentration, organic compounds

Procedia PDF Downloads 209
16578 Unified Structured Process for Health Analytics

Authors: Supunmali Ahangama, Danny Chiang Choon Poo

Abstract:

Health analytics (HA) is used in healthcare systems for effective decision-making, management, and planning of healthcare and related activities. However, user resistance, the unique nature of medical data content and structure (including heterogeneous and unstructured data), and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management, and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose an HA process model with features from the Rational Unified Process (RUP) model and agile methodology.

Keywords: agile methodology, health analytics, unified process model, UML

Procedia PDF Downloads 490
16577 Deep Learning Based Text to Image Synthesis for Accurate Facial Composites in Criminal Investigations

Authors: Zhao Gao, Eran Edirisinghe

Abstract:

The production of an accurate sketch of a suspect based on a verbal description obtained from a witness is an essential task in most criminal investigations. Criminal investigation systems employ specially trained professional artists to manually draw a facial image of the suspect according to the description of an eyewitness for subsequent identification. With the advancement of deep learning, Recurrent Neural Networks (RNNs) have shown great promise in Natural Language Processing (NLP) tasks, and Generative Adversarial Networks (GANs) have proven very effective in image generation. In this study, a trained GAN, conditioned on textual features such as keywords automatically encoded from a verbal description of a human face using an RNN, is used to generate photo-realistic facial images for criminal investigations. The intention of the proposed system is to map corresponding features onto text generated from verbal descriptions. With this, it becomes possible to generate many reasonably accurate alternatives which the witness can use to identify a suspect. This reduces subjectivity in decision making by both the eyewitness and the artist, while giving the witness an opportunity to evaluate and reconsider decisions. Furthermore, the proposed approach benefits law enforcement agencies by reducing the time taken to physically draw each potential sketch, thus improving response times and mitigating potentially malicious human intervention. With the publicly available 'CelebFaces Attributes Dataset' (CelebA), augmented with verbal descriptions as training data, the proposed architecture is able to produce facial structures from given text effectively. Word embeddings are learnt by applying the RNN architecture to perform semantic parsing, the output of which is fed into the GAN for synthesizing photo-realistic images. Rather than grid search, a metaheuristic search based on genetic algorithms is applied to evolve the network, with the intent of achieving optimal hyperparameters in a fraction of the time of a typical brute-force approach. Beyond the CelebA training database, further novel test cases are supplied to the network for evaluation: witness reports describing criminals from Interpol or other law enforcement agencies are sampled, and, using the descriptions provided, samples are generated and compared with the ground-truth images of the criminals in order to calculate similarities. Two factors are used for performance evaluation: the Structural Similarity Index (SSIM) and the Peak Signal-to-Noise Ratio (PSNR), computed as sketched below. High scores on these metrics would demonstrate the accuracy of the approach and support the case that it can be an effective tool for law enforcement agencies. The proposed approach to criminal facial image generation has the potential to increase the proportion of criminal cases that can ultimately be resolved using eyewitness information.
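
The two evaluation factors can be computed with scikit-image as below; the arrays are random placeholders for a generated sample and its ground-truth photograph.

```python
# PSNR and SSIM between a generated sample and a ground-truth image.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

rng = np.random.default_rng(0)
truth  = rng.random((128, 128))                          # placeholder ground truth
sample = np.clip(truth + rng.normal(0, 0.05, truth.shape), 0, 1)

print(peak_signal_noise_ratio(truth, sample, data_range=1.0))
print(structural_similarity(truth, sample, data_range=1.0))
```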

Keywords: RNN, GAN, NLP, facial composition, criminal investigation

Procedia PDF Downloads 144
16576 In situ Polymerization and Properties of Biobased Polyurethane/Epoxy Interpenetrating Network Nanocomposites

Authors: Aiswarea Mathew, Smita Mohanty, Jr., S. K. Nayak

Abstract:

Polyurethane networks based on castor oil (CO) as a renewable-resource polyol were synthesized. Polyurethane/epoxy resin interpenetrating network nanocomposites containing modified montmorillonite organoclay (C30B-PU/EP nanocomposites) were prepared by an in situ intercalation method. Spectroscopic characterization of the synthesized samples using FT-IR confirms the existence of the proposed castor oil-based PU structure and also shows that strong interactions exist between C30B and the EP/PU matrix. The degree of dispersion of C30B in the EP/PU matrix was characterized by X-ray diffraction (XRD). Scanning electron microscopy analysis showed that the interpenetrating process of PU and EP increases the exfoliation degree of C30B and improves the compatibility and phase structure of the polyurethane/epoxy resin interpenetrating polymer networks (PU/EP IPNs). Thermal stability improves compared to the polyurethane alone when the PU/EP IPN is formed. Mechanical properties, including Young's modulus and tensile strength, showed marked improvement with the addition of C30B.

Keywords: castor oil, epoxy, montmorillonite, polyurethane

Procedia PDF Downloads 384
16575 Frequency Distribution and Assertive Object Theory: An Exploration of the Late Bronze Age Italian Ceramic Landscape

Authors: Sara Fioretti

Abstract:

In the 2nd millennium BCE, maritime networks became essential to the Mediterranean lifestyle, creating an interconnected world. This phenomenon of interconnected cultures has often been misinterpreted as an “effect” of Mycenaean “influence” without considering the complexity and role of regional and cross-cultural exchanges. This paper explores the socio-economic relationships, in both cross-cultural and potentially inter-regional settings, present within the archaeological repertoire of the southern Italian Late Bronze Age (LBA, 1600-1140 BCE). The emergence of economic relations within the connectivity of the regional settlements is explored through ceramic contexts found at the case-study sites of Punta di Zambrone, Broglio di Trebisacce, and Nuraghe Antigori. This work-in-progress research is situated in the shifting theoretical views of the last ten years that discuss Late Bronze Age connectivity through social networks, entanglement, and assertive objects, combined with a comparative statistical study of ceramic frequency distribution. Applying these theoretical frameworks with a quantitative approach reveals the specific regional economic relationships that shaped the cultural interactions of the Late Bronze Age. Through this intersection of theory and statistical analysis, the case studies establish that only a small percentage of the pottery was imported, whilst assertive productions occur in relatively higher quantities; the majority still adheres to regional Italian traditions. We can therefore dissect the rhizomatic relationships cultivated by the Italian coasts and the Mycenaeans, and their roles within their networks, through the intersection of theoretical and statistical analysis. This research offers a new perspective on the connectivity of Late Bronze Age relational structures.

Keywords: late bronze age, mediterranean archaeology, exchanges and trade, frequency distribution of ceramic assemblages

Procedia PDF Downloads 25
16574 Analysis of the Temperature Dependence of Local Avalanche Compact Model for Bipolar Transistors

Authors: Robert Setekera, Ramses van der Toorn

Abstract:

We present an extensive analysis of the temperature dependence of the local avalanche model used in most modern compact models for bipolar transistors. This local avalanche model uses Chynoweth's empirical law for the ionization coefficient to define the generation of the avalanche current in terms of the local electric field (see the sketch below). We carry out the model analysis using DC measurements taken on both Si and advanced SiGe bipolar transistors. For the advanced industrial SiGe HBTs, we consider both high-speed and high-power devices (both NPN and PNP transistors). The limitations of the local avalanche model in modeling the temperature dependence of the avalanche current, mostly in the weak avalanche region, are demonstrated. In addition, the model's avalanche parameters are analyzed to check whether they agree with semiconductor device physics.
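
A numerical sketch of the ingredients: Chynoweth's law alpha(E) = A exp(-B/E), integrated across the depletion region, gives the ionization integral from which a simple multiplication factor follows. The coefficients and the linear field profile are textbook-style placeholders, not fitted model parameters.

```python
# Ionization integral from Chynoweth's law over an assumed field profile.
import numpy as np

A_n, B_n = 7.03e5, 1.23e6            # 1/cm and V/cm, silicon-like illustration

def chynoweth(E):
    return A_n * np.exp(-B_n / E)    # ionization coefficient alpha(E)

x = np.linspace(1e-6, 1e-4, 2000)    # position across the depletion region, cm
E = 3e5 * (1 - x / x[-1]) + 1e4      # assumed linearly decaying field, V/cm
alpha = chynoweth(E)
ion_integral = float(np.sum(0.5 * (alpha[1:] + alpha[:-1]) * np.diff(x)))
M = 1.0 / (1.0 - ion_integral)       # simple multiplication estimate
print(ion_integral, M)               # avalanche current scales like I_C * (M - 1)
```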

Keywords: avalanche multiplication, avalanche current, bipolar transistors, compact modeling, electric field, impact ionization, local avalanche

Procedia PDF Downloads 608
16573 Special Case of Trip Distribution Model and Its Use for Estimation of Detailed Transport Demand in the Czech Republic

Authors: Jiri Dufek

Abstract:

The national transport model of the Czech Republic has been modified to yield detailed travel demand at the municipality level (cities and villages with over 300 inhabitants). The technique used for this detailed modelling is a three-dimensional procedure for calibrating gravity models. Besides zone production and attraction, which are usual in gravity models, an additional parameter for trip distribution was introduced, usually called the "third dimension"; in this model, it is the demand between regions. The distribution procedure involves calculating appropriate skim matrices and multiplying them by three sets of coefficients obtained by iteratively balancing production, attraction, and the third dimension, as sketched below. This trip distribution was processed in R, and the results were used in the Czech Republic transport model created in PTV Vision. The process generated more precise results at the local level of the model (towns and villages).
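
A numpy sketch of the three-way balancing under invented data: a seed matrix is scaled iteratively so that row sums match zone production, column sums match attraction, and region-to-region blocks match the inter-regional demand (the third dimension).

```python
# Three-dimensional iterative balancing of a toy trip matrix.
import numpy as np

seed = np.array([[4., 2., 1.], [2., 5., 2.], [1., 2., 6.]])  # deterrence-weighted seed
production = np.array([70., 90., 60.])
attraction = np.array([80., 75., 65.])
region_of = np.array([0, 0, 1])                        # zone -> region mapping
region_demand = np.array([[115., 45.], [40., 20.]])    # consistent block targets

T = seed.copy()
for _ in range(100):
    T *= (production / T.sum(axis=1))[:, None]         # balance rows (production)
    T *= (attraction / T.sum(axis=0))[None, :]         # balance columns (attraction)
    for r in range(2):                                 # balance region blocks
        for s in range(2):
            block = np.ix_(region_of == r, region_of == s)
            T[block] *= region_demand[r, s] / T[block].sum()
print(T.round(1))   # trip matrix honouring all three constraint sets (approximately)
```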

Keywords: trip distribution, third dimension, transport model, municipalities

Procedia PDF Downloads 110