Search results for: On Demand Distance Vector Routing

5964 Metric Dimension on Line Graph of Honeycomb Networks

Authors: M. Hussain, Aqsa Farooq

Abstract:

Let G = (V, E) be a connected graph, and let the distance d(a, b) between any two vertices a and b in G be the length of an a–b geodesic (shortest path). A set of vertices W resolves a graph G if each vertex is uniquely determined by its vector of distances to the vertices in W. The metric dimension of G is the minimum cardinality of a resolving set of G. In this paper, the line graph of the honeycomb network is derived, and the metric dimension of this line graph is then computed.
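
As a minimal illustration of the definition above, the following sketch computes the metric dimension of a small graph by brute force from networkx distances; the honeycomb line graph itself is not constructed here.

```python
from itertools import combinations
import networkx as nx

def resolves(G, W, dist):
    # W resolves G if every vertex has a unique vector of distances to W
    signatures = {tuple(dist[w][v] for w in W) for v in G.nodes}
    return len(signatures) == G.number_of_nodes()

def metric_dimension(G):
    dist = dict(nx.all_pairs_shortest_path_length(G))
    nodes = list(G.nodes)
    for k in range(1, len(nodes) + 1):        # smallest resolving set wins
        for W in combinations(nodes, k):
            if resolves(G, W, dist):
                return k, W
    return len(nodes), tuple(nodes)

# the 6-cycle has metric dimension 2
print(metric_dimension(nx.cycle_graph(6)))
```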

Keywords: Resolving set, Metric dimension, Honeycomb network, Line graph

Procedia PDF Downloads 149
5963 Route Planning for a Sharing System (Scooter Sharing-Public Transportation) with the Hybrid Optimization Approach PSO-GA

Authors: Mohammad Ali Farrokhpour

Abstract:

In the current decade of sustainable transportation systems, scooter sharing has attracted widespread attention as an environmentally friendly means of transportation that can help develop public transport. Combining scooters with the subway can provide many opportunities for improving access to public transportation. Among the challenges raised by the implementation of a scooter-subway system to replace personal vehicles is the issue of routing, which is the main subject of the present paper. Because routing involves multiple factors such as time, cost, traffic, and green spaces, the problem is a multi-objective NP-hard optimization problem. For this purpose, the hybrid optimization approach PSO-GA is put forward so that the obtained answers are of higher accuracy and validity than those of standard optimization methods. The results obtained from modeling and solving the problem for the case study in MATLAB indicate the efficiency and desirability of the model and the proposed approach.
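
A minimal sketch of such a PSO-GA hybrid is given below on a toy, weighted-sum objective; the cost function, parameter values, and the way the GA step replaces the worst particle are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def route_cost(x):
    # hypothetical scalarized cost standing in for time, cost, traffic, green space
    return np.sum(x ** 2) + 0.5 * np.sum(np.abs(x - 1))

def pso_ga(dim=10, pop=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    X = rng.uniform(-5, 5, (pop, dim))
    V = np.zeros((pop, dim))
    pbest = X.copy()
    pcost = np.array([route_cost(x) for x in X])
    gbest = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, pop, dim))
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)   # PSO velocity update
        X = X + V
        # GA step: crossover the two fittest particles, mutate, replace the worst
        order = np.argsort([route_cost(x) for x in X])
        X[order[-1]] = 0.5 * (X[order[0]] + X[order[1]]) + rng.normal(0, 0.1, dim)
        cost = np.array([route_cost(x) for x in X])
        better = cost < pcost
        pbest[better], pcost[better] = X[better], cost[better]
        gbest = pbest[pcost.argmin()].copy()
    return gbest, pcost.min()

print(pso_ga()[1])
```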

Keywords: route planning, scooter sharing, public transportation, sharing system

Procedia PDF Downloads 52
5962 Secure Distance Bounding Protocol on Ultra-WideBand Based Mapping Code

Authors: Jamel Miri, Bechir Nsiri, Ridha Bouallegue

Abstract:

Ultra-WideBand impulse radio (UWB-IR) physical layer technology has seen great development during the last decade, which makes it a promising candidate for short-range wireless communications, as it brings considerable benefits in terms of connectivity and mobility. However, like all wireless communications, it suffers from security vulnerabilities because of the open nature of the radio channel. To face these attacks, distance bounding protocols are the most popular countermeasures. In this paper, we present a protocol based on distance bounding to thwart the most popular attacks: distance fraud, mafia fraud, and terrorist fraud. In our work, we study how to adapt the best secure distance bounding protocols to the mapping code of time-hopping ultra-wideband (TH-UWB) radios. Indeed, to improve the security performance of the protocol in TH-UWB, we combine the modified protocol with ultra-wideband impulse radio (IR-UWB) technology. The security and the different merits of the protocols are analyzed.
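
As a minimal sketch of the core idea of distance bounding, the timed rapid-bit-exchange phase can be outlined as follows; commitments, keys, and the UWB mapping-code adaptation discussed in the paper are omitted.

```python
import secrets
import time

C = 299_792_458.0          # speed of light, m/s

def distance_bound(prover, rounds=32, processing_delay=0.0):
    """Upper-bound the prover's distance from the fastest challenge/response round trip."""
    rtts = []
    for _ in range(rounds):
        challenge = secrets.randbits(1)
        t0 = time.perf_counter()
        prover(challenge)                       # prover must answer immediately
        rtts.append(time.perf_counter() - t0)
    return (min(rtts) - processing_delay) * C / 2.0

# toy prover that simply echoes the challenge bit
print(distance_bound(lambda bit: bit), "m")
```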

Keywords: distance bounding, mapping code ultrawideband, terrorist fraud, physical layer technology

Procedia PDF Downloads 258
5961 Selecting the Best Sub-Region Indexing the Images in the Case of Weak Segmentation Based on Local Color Histograms

Authors: Mawloud Mosbah, Bachir Boucheham

Abstract:

The color histogram is considered the oldest method used by CBIR systems for indexing images. Global histograms, however, do not include spatial information, which is why later techniques have attempted to overcome this limitation by involving a segmentation task as a preprocessing step. Weak segmentation is employed by local histograms, while other methods, such as the CCV (Color Coherence Vector), are based on strong segmentation. Indexing based on local histograms consists of splitting the image into N overlapping blocks or sub-regions and then computing the histogram of each block. The dissimilarity between two images is therefore reduced to computing the distances between the N local histograms of the two images, resulting in N*N values; generally, the lowest value is taken into account to rank images, which means that the lowest value designates which sub-region is used to index the images of the collection being queried. In this paper, we examine the local histogram indexing method and compare its results against those obtained with the global histogram. We also address another noteworthy issue when relying on local histograms, namely which value, among the N*N values, to trust when comparing images; in other words, on which sub-region among the N*N pairs we should base the image index. Based on the results achieved here, it seems that relying on local histograms, which imposes an extra overhead on the system by involving another preprocessing step, namely segmentation, does not necessarily produce better results. In addition, we propose some ideas for selecting the local histogram used to encode the image, rather than relying on the local histogram having the lowest distance to the query histograms.
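
A minimal sketch of the local-histogram scheme described above is shown below, assuming a non-overlapping grid of blocks and Euclidean distance; the paper's blocks overlap, and its sub-region selection ideas are not reproduced.

```python
import numpy as np

def block_histograms(img, n=3, bins=8):
    """Split an HxWx3 image into an n*n grid and return one color histogram per block."""
    h, w, _ = img.shape
    hists = []
    for i in range(n):
        for j in range(n):
            block = img[i * h // n:(i + 1) * h // n, j * w // n:(j + 1) * w // n]
            hist, _ = np.histogramdd(block.reshape(-1, 3),
                                     bins=(bins, bins, bins), range=[(0, 256)] * 3)
            hists.append(hist.ravel() / hist.sum())
    return np.array(hists)                       # shape (n*n, bins**3)

def min_block_distance(hists_a, hists_b):
    """All N*N pairwise Euclidean distances; the smallest one ranks the image pair."""
    d = np.linalg.norm(hists_a[:, None, :] - hists_b[None, :, :], axis=-1)
    return d.min(), np.unravel_index(d.argmin(), d.shape)   # distance, (block_a, block_b)

a = np.random.randint(0, 256, (90, 120, 3))
b = np.random.randint(0, 256, (90, 120, 3))
print(min_block_distance(block_histograms(a), block_histograms(b)))
```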

Keywords: CBIR, color global histogram, color local histogram, weak segmentation, Euclidean distance

Procedia PDF Downloads 333
5960 An Energy Holes Avoidance Routing Protocol for Underwater Wireless Sensor Networks

Authors: A. Khan, H. Mahmood

Abstract:

In Underwater Wireless Sensor Networks (UWSNs), sensor nodes close to the water surface (the final destination) are often preferred as forwarders. However, their frequent selection depletes their limited battery power. In consequence, these nodes die during the early stage of network operation and create energy holes where no forwarders are available for packet forwarding. These holes severely affect network throughput, and system performance significantly degrades as a result. In this paper, a routing protocol is proposed to avoid energy holes during packet forwarding. The proposed protocol does not require the conventional position information (localization) of holes to avoid them; localization is cumbersome, energy inefficient, and difficult to achieve in the underwater environment, where sensor nodes change their positions with water currents. Forwarders with the lowest water pressure level and the maximum number of neighbors are preferred for forwarding packets. Together, these two parameters minimize packet drop by following the paths where the most forwarders are available. To avoid interference along the paths with the maximum number of forwarders, a packet holding time is defined for each forwarder. Simulation results reveal the superior performance of the proposed scheme over the counterpart technique.
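
A minimal sketch of this forwarder-selection rule is given below; the node fields, tie-breaking order, and the omitted holding-time mechanism are illustrative assumptions rather than the paper's exact protocol.

```python
# Prefer the neighbor with the lowest water-pressure level (closest to the surface)
# and, on ties, the largest neighbor count.
def select_forwarder(sender, nodes, comm_range):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a["pos"], b["pos"])) ** 0.5

    neighbors = [n for n in nodes if n is not sender and dist(sender, n) <= comm_range]
    if not neighbors:
        return None                            # energy hole: no forwarder available

    def neighbor_count(n):
        return sum(1 for m in nodes if m is not n and dist(n, m) <= comm_range)

    return min(neighbors, key=lambda n: (n["pressure"], -neighbor_count(n)))

nodes = [
    {"pos": (0, 0, 100), "pressure": 100},
    {"pos": (10, 0, 60), "pressure": 60},
    {"pos": (0, 10, 40), "pressure": 40},
]
print(select_forwarder(nodes[0], nodes, comm_range=120))
```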

Keywords: energy holes, interference, routing, underwater

Procedia PDF Downloads 375
5959 A Selection Approach: Discriminative Model for Nominal Attributes-Based Distance Measures

Authors: Fang Gong

Abstract:

Distance measures are an indispensable part of many instance-based learning (IBL) and machine learning (ML) algorithms. The value difference metric (VDM) and the inverted specific-class distance measure (ISCDM) are among the top-performing distance measures that address nominal attributes. VDM performs well in some domains owing to its simplicity and poorly in others that contain missing values and non-class attribute noise; ISCDM, however, typically works better than VDM on such domains. To maximize their advantages and avoid their disadvantages, this paper proposes a selection approach: a discriminative model for nominal attribute-based distance measures. More concretely, VDM and ISCDM are built independently on a training dataset at the training stage, and the more credible of the two is recorded for each training instance. At the test stage, the nearest neighbor of each test instance is first found by either VDM or ISCDM, and the model recorded as more reliable for that nearest neighbor is then used to predict the class label. The approach is simply denoted as a discriminative distance measure (DDM). Experiments are conducted on 34 University of California at Irvine (UCI) machine learning repository datasets, and the results show that DDM retains the interpretability and simplicity of VDM and ISCDM but significantly outperforms the original VDM and ISCDM, as well as other state-of-the-art competitors, in terms of accuracy.
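
As a minimal sketch of the VDM component mentioned above, the following estimates class-conditional value probabilities from training data and sums their differences per attribute; ISCDM and the DDM selection logic are not reproduced.

```python
from collections import defaultdict

def fit_vdm(X, y):
    """Estimate P(class | attribute value) for every attribute from training data."""
    counts = [defaultdict(lambda: defaultdict(int)) for _ in X[0]]
    for row, label in zip(X, y):
        for a, v in enumerate(row):
            counts[a][v][label] += 1
    classes = sorted(set(y))
    probs = []
    for a in range(len(X[0])):
        probs.append({v: {c: cc[c] / sum(cc.values()) for c in classes}
                      for v, cc in counts[a].items()})
    return probs, classes

def vdm_distance(probs, classes, x1, x2, q=2):
    d = 0.0
    for a, (v1, v2) in enumerate(zip(x1, x2)):
        for c in classes:
            p1 = probs[a].get(v1, {}).get(c, 0.0)
            p2 = probs[a].get(v2, {}).get(c, 0.0)
            d += abs(p1 - p2) ** q
    return d

X = [("red", "small"), ("red", "large"), ("blue", "small"), ("blue", "large")]
y = ["yes", "yes", "no", "no"]
probs, classes = fit_vdm(X, y)
print(vdm_distance(probs, classes, ("red", "small"), ("blue", "small")))
```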

Keywords: distance measure, discriminative model, nominal attributes, nearest neighbor

Procedia PDF Downloads 85
5958 Production and Leftovers Usage Policies to Minimize Food Waste under Uncertain and Correlated Demand

Authors: Esma Birisci, Ronald McGarvey

Abstract:

One of the common problems in the food service industry is demand uncertainty. This research presents a multi-criteria optimization approach to identify the efficient frontier of points lying between the minimum-waste and minimum-shortfall solutions in an uncertain demand environment. It also addresses correlation across demands for items (e.g., hamburgers are often demanded with french fries). Reducing overproduction food waste (and its corresponding environmental impacts) and avoiding shortfalls (leaving some customers hungry) must be treated as two contradictory objectives in an all-you-care-to-eat food service operation. We identify optimal production adjustments relative to demand forecasts, demand thresholds for the utilization of leftovers, and percentages of demand to be satisfied by leftovers, considering two alternative metrics for overproduction waste: mass and greenhouse gas emissions. Demand uncertainty and demand correlations are addressed using a kernel density estimation approach. A statistical analysis of the changes in decision variable values across each of the efficient frontiers can then be performed to identify the key variables that could be modified to reduce the amount of wasted food with minimal increase in shortfalls. We illustrate our approach with an application to empirical data from Campus Dining Services operations at the University of Missouri.
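
A minimal sketch of the waste/shortfall trade-off under correlated, uncertain demand is given below, with a Gaussian KDE fitted to synthetic two-item history; the data, production rule, and scaling grid are illustrative assumptions, not the paper's model.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# synthetic correlated demand history for two menu items (rows = items)
history = rng.multivariate_normal([100, 80], [[100, 60], [60, 80]], size=200).T
kde = gaussian_kde(history)                    # captures the demand correlation
forecast = history.mean(axis=1)

scenarios = kde.resample(5000)                 # shape (2 items, 5000 demand scenarios)
for scale in np.linspace(0.8, 1.2, 9):         # production adjustment relative to forecast
    production = scale * forecast
    waste = np.maximum(production[:, None] - scenarios, 0).sum(axis=0).mean()
    shortfall = np.maximum(scenarios - production[:, None], 0).sum(axis=0).mean()
    print(f"scale={scale:.2f}  expected waste={waste:6.1f}  expected shortfall={shortfall:6.1f}")
```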

Keywords: environmental studies, food waste, production planning, uncertain and correlated demand

Procedia PDF Downloads 340
5957 Developing High-Definition Flood Inundation Maps (HD-FIMs) Using Raster Adjustment with Scenario Profiles (RASP™)

Authors: Robert Jacobsen

Abstract:

Flood inundation maps (FIMs) are an essential tool in communicating flood threat scenarios to the public as well as in floodplain governance. With an increasing demand for online raster FIMs, the FIM State-of-the-Practice (SOP) is rapidly advancing to meet the dual requirements for high resolution and high accuracy, or High Definition. Importantly, today's technology also enables the resolution of problems of local (neighborhood-scale) bias errors that often occur in FIMs, even with the use of SOP two-dimensional flood modeling. To facilitate the development of HD-FIMs, a new GIS method, Raster Adjustment with Scenario Profiles (RASP™), is described for adjusting kernel raster FIMs to match refined scenario profiles. With RASP™, flood professionals can prepare HD-FIMs for a wide range of scenarios with available kernel rasters, including kernel rasters prepared from vector FIMs. The paper provides detailed procedures for RASP™, along with an example of applying RASP™ to prepare an HD-FIM for the August 2016 flood in Louisiana using both an SOP kernel raster and a kernel raster derived from an older vector-based flood insurance rate map. The accuracy of the HD-FIMs achieved with the application of RASP™ to the two kernel rasters is evaluated.

Keywords: hydrology, mapping, high-definition, inundation

Procedia PDF Downloads 25
5956 Agricultural Extension Workers’ Education in Indonesia - Roles of Distance Education

Authors: Adhi Susilo

Abstract:

This paper addresses the roles of distance education in the education of agricultural extension workers. Agriculture plays an important role in both poverty reduction and economic growth. The technology of agriculture in the developing world should change continuously to keep pace with rising populations and rapidly changing social, economic, and environmental conditions. Therefore, agricultural extension workers should have several competencies in order to carry out their duties properly. One of the essential competencies they must possess is the professional competency directly related to their duties in carrying out extension activities. Such competency can be acquired through studying at Universitas Terbuka (UT); with its distance learning system, agricultural extension workers can study at UT without leaving their duties. This paper presents a sociological analysis and lessons learnt from the specific context of Indonesia. Diversity in the geographic, demographic, socio-cultural, and economic conditions of the country provides specific challenges for its distance education practice and for the process of social transformation to which distance education can contribute. Extension officers used distance education for personal benefit and increased professional productivity. An increase in awareness is important for the further adoption of distance learning for extension purposes. Organizations in both the public and private sectors must work to increase knowledge of ICTs for the benefit of stakeholders. The use of ICTs can increase productivity for extension officers and expand educational opportunities for learners. The use of distance education by extension services to disseminate educational materials around the world is widespread, and increasing awareness and use of distance learning can lead to more productive relationships between extension officers and agricultural stakeholders.

Keywords: agricultural extension, demographic and geographic condition, distance education, ICTs

Procedia PDF Downloads 479
5955 Issues in Travel Demand Forecasting

Authors: Huey-Kuo Chen

Abstract:

Travel demand forecasting, which comprises four travel choices, i.e., trip generation, trip distribution, modal split, and traffic assignment, constitutes the core of transportation planning. In its current application, travel demand forecasting is associated with three important issues, i.e., interface inconsistencies among the four travel choices, inefficiency of commonly used solution algorithms, and undesirable multiple-path solutions. In this paper, each of the three issues is elaborated extensively. An ideal unified framework for a combined model consisting of the four travel choices and variable demand functions is also suggested. A few remarks are then provided at the end of the paper.
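
As a minimal illustration of one of the four travel choices named above, the sketch below runs a doubly-constrained gravity model for trip distribution, balanced by iterative proportional fitting; the zone totals, cost matrix, and deterrence parameter are illustrative, not from the paper.

```python
import numpy as np

def gravity_model(origins, destinations, cost, beta=0.1, iters=50):
    F = np.exp(-beta * cost)                    # deterrence function
    A = np.ones(len(origins))
    B = np.ones(len(destinations))
    for _ in range(iters):                      # balance row and column totals
        A = origins / (F * B).sum(axis=1)
        B = destinations / (F.T * A).sum(axis=1)
    return A[:, None] * B[None, :] * F          # trip matrix T_ij

O = np.array([1000.0, 500.0, 800.0])            # trips produced per zone
D = np.array([900.0, 700.0, 700.0])              # trips attracted per zone
C = np.array([[5.0, 10.0, 15.0], [10.0, 5.0, 10.0], [15.0, 10.0, 5.0]])
T = gravity_model(O, D, C)
print(T.round(1), T.sum())
```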

Keywords: travel choices, B algorithm, entropy maximization, dynamic traffic assignment

Procedia PDF Downloads 417
5954 A Reinforcement Learning Approach for Evaluation of Real-Time Disaster Relief Demand and Network Condition

Authors: Ali Nadi, Ali Edrissi

Abstract:

Relief demand and the availability of transportation links are the essential pieces of information needed for every natural disaster operation, yet this information is not at hand once a disaster strikes. In related works, relief demand and network condition have been evaluated with prediction methods. Nevertheless, predictions tend to be over- or underestimated due to uncertainties and may lead to a failed operation. Therefore, in this paper a stochastic programming model is proposed to evaluate real-time relief demand and network condition at the onset of a natural disaster. To address the time sensitivity of the emergency response, the proposed model uses reinforcement learning to optimize the total relief assessment time. The proposed model is tested on a real-size network problem. The simulation results indicate that the proposed model performs well in collecting real-time information.
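
A minimal tabular Q-learning sketch of the assessment-time idea is given below: an agent on a toy three-node network learns to visit every node as quickly as possible. The network, reward, and parameters are illustrative; the paper's stochastic programming model is not reproduced.

```python
import random

travel_time = {(0, 1): 2, (1, 0): 2, (1, 2): 3, (2, 1): 3, (0, 2): 6, (2, 0): 6}
nodes = frozenset({0, 1, 2})
Q = {}                                   # Q[(state, action)] with state = (position, visited)

def actions(node):
    return [b for (a, b) in travel_time if a == node]

alpha, gamma, eps = 0.1, 0.95, 0.2
for _ in range(3000):
    pos, visited = 0, frozenset({0})
    while visited != nodes:              # episode ends when every node has been assessed
        state = (pos, visited)
        acts = actions(pos)
        a = random.choice(acts) if random.random() < eps else \
            max(acts, key=lambda x: Q.get((state, x), 0.0))
        reward = -travel_time[(pos, a)]  # minimizing travel time = maximizing reward
        nxt = (a, visited | {a})
        future = 0.0 if nxt[1] == nodes else \
            gamma * max(Q.get((nxt, x), 0.0) for x in actions(a))
        old = Q.get((state, a), 0.0)
        Q[(state, a)] = old + alpha * (reward + future - old)
        pos, visited = nxt

start = (0, frozenset({0}))
print(max(actions(0), key=lambda x: Q.get((start, x), 0.0)))   # expected: 1 (shorter first leg)
```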

Keywords: disaster management, real-time demand, reinforcement learning, relief demand

Procedia PDF Downloads 269
5953 Hybrid Anomaly Detection Using Decision Tree and Support Vector Machine

Authors: Elham Serkani, Hossein Gharaee Garakani, Naser Mohammadzadeh, Elaheh Vaezpour

Abstract:

Intrusion detection systems (IDS) are the main components of network security. These systems analyze network events for intrusion detection. An IDS is designed by training on normal or attack traffic data, and machine learning methods are among the best ways to design IDSs. In the method presented in this article, the pruning algorithm of the C5.0 decision tree is used to reduce the features of the traffic data, and the IDS is trained with the least-squares support vector machine algorithm (LS-SVM). The remaining features are then arranged according to the predictor importance criterion, and the least important features are eliminated in that order. The features remaining at this stage, which produce the highest accuracy in the LS-SVM, are selected as the final features. Compared to other similar articles which have examined feature selection with the least-squares support vector machine model, the features obtained are better in terms of accuracy, true positive rate, and false positive rate. The results are tested on the UNSW-NB15 dataset.
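
A minimal sketch of the two-stage idea, assuming tree-based importances and a standard SVM as stand-ins for C5.0 pruning and LS-SVM, is shown below; the synthetic data and scikit-learn components are illustrative, not the UNSW-NB15 pipeline.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, n_informative=6, random_state=0)

# rank features by decision-tree importance (most important first)
importances = DecisionTreeClassifier(random_state=0).fit(X, y).feature_importances_
order = np.argsort(importances)[::-1]

best_score, best_k = 0.0, len(order)
for k in range(1, len(order) + 1):             # keep the k most important features
    score = cross_val_score(SVC(), X[:, order[:k]], y, cv=5).mean()
    if score > best_score:
        best_score, best_k = score, k
print(best_k, round(best_score, 3))
```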

Keywords: decision tree, feature selection, intrusion detection system, support vector machine

Procedia PDF Downloads 228
5952 Numerical Investigation of Poling Vector Angle on Adaptive Sandwich Plate Deflection

Authors: Alireza Pouladkhan, Mohammad Yavari Foroushani, Ali Mortazavi

Abstract:

This paper presents a finite element model for a sandwich plate containing a piezoelectric core. A sandwich plate with a piezoelectric core is constructed using the shear mode of piezoelectric materials. The orientation of the poling vector has a significant effect on the deflection and stress induced in the piezo-actuated adaptive sandwich plate. In the present study, the influence of this factor for a clamped-clamped-free-free and a simply supported-simply supported-free-free square sandwich plate is investigated using the Finite Element Method. The study uses ABAQUS (v.6.7) software to derive the finite element model of the sandwich plate. By using this model, the study gives the influence of the poling vector angle on the response of the smart structure and determines the maximum transverse displacement and the maximum induced stress.

Keywords: finite element method, sandwich plate, poling vector, piezoelectric materials, smart structure, electric enthalpy

Procedia PDF Downloads 208
5951 Treating On-Demand Bonds as Cash-In-Hand: Analyzing the Use of “Unconscionability” as a Ground for Challenging Claims for Payment under On-Demand Bonds

Authors: Asanga Gunawansa, Shenella Fonseka

Abstract:

On-demand bonds, also known as unconditional bonds, are commonplace in the construction industry as a means of safeguarding the employer from any potential non-performance by a contractor. On-demand bonds may be obtained from commercial banks, and they serve as an undertaking by the issuing bank to honour payment on demand without questioning and/or considering any dispute between the employer and the contractor in relation to the underlying contract. Thus, whether or not a breach had occurred under the underlying contract, which triggers the demand for encashment by the employer, is not a question the bank needs to be concerned with. As a result, an unconditional bond allows the beneficiary to claim the money almost without any condition. Thus, an unconditional bond is as good as cash-in-hand. In the past, establishing fraud on the part of the employer, of which the bank had knowledge, was the only ground on which a bank could dishonour a claim made under an on-demand bond. However, recent jurisprudence in common law countries shows that courts are beginning to consider unconscionable conduct on the part of the employer in claiming under an on-demand bond as a ground that contractors could rely on to prevent the banks from honouring such claims. This has created uncertainty in connection with on-demand bonds and their liquidity. This paper analyzes recent judicial decisions in four common law jurisdictions, namely, England, Singapore, Hong Kong, and Sri Lanka, to identify the scope of using the concept of “unconscionability” as a ground for preventing unreasonable claims for encashment of on-demand bonds. The objective of this paper is to argue that on-demand bonds have lost their effectiveness as “cash-in-hand” and that this is, in fact, an advantage and not an impediment to international commerce, as the purpose of such bonds should not be to provide for illegal and unconscionable conduct by the beneficiaries.

Keywords: fraud, performance guarantees, on-demand bonds, unconscionability

Procedia PDF Downloads 66
5950 A Corpus-Based Study on the Lexical, Syntactic and Sequential Features across Interpreting Types

Authors: Qianxi Lv, Junying Liang

Abstract:

Among the various modes of interpreting, simultaneous interpreting (SI) is regarded as a ‘complex’ and ‘extreme condition’ of cognitive tasks, while consecutive interpreters (CI) do not have to share processing capacity between tasks. Given that SI exerts great cognitive demand, it makes sense to posit that the output of SI may be more compromised than that of CI in its linguistic features. The bulk of the research has stressed the varying cognitive demand and processes involved in different modes of interpreting; however, related empirical research is sparse. In keeping with our interest in investigating the quantitative linguistic factors discriminating between SI and CI, the current study examines potential lexical simplification, syntactic complexity, and sequential organization mechanisms with a self-built inter-modal corpus of transcribed simultaneous and consecutive interpretation, translated speech, and original speech texts, with a total running word count of 321,960. The lexical features are extracted in terms of lexical density, list head coverage, hapax legomena, and type-token ratio, as well as core vocabulary percentage. Dependency distance, an index of syntactic complexity reflective of processing demand, is employed. The frequency motif, a non-grammatically-bound sequential unit, is also used to visualize the local function distribution of the interpreting output. While SI is generally regarded as multitasking with a high cognitive load, our findings show that CI may impose a heavier cognitive load, or tax cognitive resources differently, and hence yields more lexically and syntactically simplified output. In addition, the sequential features show that SI and CI organize sequences from the source text into the output in different ways, in order to minimize the respective cognitive load. We interpret the results within the framework that cognitive demand is exerted on both the maintenance and coordination components of working memory. On the one hand, the information maintained in CI is inherently larger in volume compared to SI. On the other hand, time constraints directly influence the sentence reformulation process. The temporal pressure from the input in SI makes the interpreters keep only a small chunk of information in the focus of attention; thus, SI interpreters usually produce the output by largely retaining the source structure so as to release the information from working memory immediately after it is formulated in the target language. Conversely, CI interpreters receive at least a few sentences before reformulation, when they are more self-paced, and may thus tend to retain and generate the information in a way that lessens the demand. In other words, interpreters cope with the high demand in the reformulation phase of CI by generating output with densely distributed function words, more content words of higher frequency values and fewer variations, simpler structures, and more frequently used language sequences. We consequently propose a revised effort model based on these results for a better illustration of cognitive demand during both interpreting types.
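
A minimal sketch of three of the indices named above (type-token ratio, lexical density, and mean dependency distance) is given below; the toy sentence, POS tags, and head indices are illustrative, and the paper computes these measures over a large transcribed corpus.

```python
tokens = ["the", "interpreter", "renders", "the", "speech", "quickly"]
tags = ["DET", "NOUN", "VERB", "DET", "NOUN", "ADV"]
heads = [2, 3, 0, 5, 3, 3]                 # 1-based head index per token, 0 = root
content_tags = {"NOUN", "VERB", "ADJ", "ADV"}

type_token_ratio = len(set(tokens)) / len(tokens)
lexical_density = sum(t in content_tags for t in tags) / len(tags)
# dependency distance = |position of head - position of dependent|, roots excluded
dep_distances = [abs(h - (i + 1)) for i, h in enumerate(heads) if h != 0]
mean_dependency_distance = sum(dep_distances) / len(dep_distances)

print(type_token_ratio, lexical_density, mean_dependency_distance)
```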

Keywords: cognitive demand, corpus-based, dependency distance, frequency motif, interpreting types, lexical simplification, sequential units distribution, syntactic complexity

Procedia PDF Downloads 139
5949 Advantages and Disadvantages of Distance Learning in Comparison with Full-time Teaching from the Perspective of Chinese University Students

Authors: Daniel Ecler

Abstract:

The aim of this paper was to find out how Chinese university students perceive distance learning compared with full-time teaching, to reveal its advantages and disadvantages, and to identify which elements could be implemented in regular full-time teaching in order to make it more effective. Recent events have shown that online teaching has a significant role to play in the field of education and needs to be given increased attention and scrutiny. For this purpose, a research survey was conducted using semi-structured questionnaires, which aimed to determine the attitudes of Chinese university students towards the phenomenon of distance learning. The results of this survey revealed that most students prefer distance learning to full-time teaching, mainly because it gives them more freedom to participate in teaching, regardless of the environment in which they are currently located. In conclusion, the possibility to participate virtually in teaching from anywhere is a huge advantage that could become part of regular teaching in the future. However, further research into this issue will be necessary.

Keywords: distance learning, full-time teaching, Chinese college students, cultural background

Procedia PDF Downloads 147
5948 Optimized Cluster Head Selection Algorithm Based on LEACH Protocol for Wireless Sensor Networks

Authors: Wided Abidi, Tahar Ezzedine

Abstract:

Low-Energy Adaptive Clustering Hierarchy (LEACH) is considered one of the effective hierarchical routing algorithms that optimize energy and prolong the lifetime of the network. Since the selection of the Cluster Head (CH) in LEACH is carried out randomly, in this paper we propose an approach for electing the CH based on the LEACH protocol. In other words, we present a formula for calculating the threshold responsible for CH election. In fact, we adopt three principal criteria: the remaining energy of the node, the number of neighbors within cluster range, and the distance between the node and the CH. Simulation results show that our proposed approach outperforms the LEACH protocol in terms of prolonging the network lifetime and saving residual energy.
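
A minimal sketch of such a weighted election threshold is given below; the standard LEACH threshold T(n) is used as the base, and the way the three criteria are combined is an illustrative assumption rather than the paper's formula.

```python
import random

def leach_threshold(p, r):
    """Standard LEACH threshold for round r with desired CH fraction p."""
    return p / (1 - p * (r % int(1 / p)))

def modified_threshold(node, p, r, e_max, n_max, d_max):
    base = leach_threshold(p, r)
    weight = (node["energy"] / e_max                  # favor high residual energy
              + node["neighbors"] / n_max             # favor well-connected nodes
              + (1 - node["dist_to_ch"] / d_max)) / 3  # favor nodes close to the previous CH
    return base * weight

node = {"energy": 0.8, "neighbors": 12, "dist_to_ch": 30}
t = modified_threshold(node, p=0.05, r=3, e_max=1.0, n_max=20, d_max=100)
print(t, random.random() < t)    # the node becomes CH if its random draw falls below t
```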

Keywords: wireless sensors networks, LEACH protocol, cluster head election, energy efficiency

Procedia PDF Downloads 294
5947 An Efficient Robot Navigation Model in a Multi-Target Domain amidst Static and Dynamic Obstacles

Authors: Michael Ayomoh, Adriaan Roux, Oyindamola Omotuyi

Abstract:

This paper presents an efficient robot navigation model in a multi-target domain amidst static and dynamic workspace obstacles. The problem is that of developing an optimal algorithm to minimize the total travel time of a robot as it visits all target points within its task domain amidst unknown workspace obstacles and finally returns to its initial position. In solving this problem, a classical algorithm was first developed to compute the optimal number of paths to be travelled by the robot amidst the network of paths. The principle of the shortest distance between the robot and the targets was used to compute the target point visitation order amidst workspace obstacles. An algorithm premised on the standard polar coordinate system was developed to determine the length of obstacles encountered by the robot, hence giving room for a geometrical estimation of the total surface area occupied by an obstacle, especially when classified as a relevant obstacle, i.e., an obstacle that lies between the robot and its potential visitation point. A stochastic model was developed and used to estimate the likelihood of a dynamic obstacle bumping into the robot's navigation path. Finally, the navigation/obstacle avoidance algorithm was hinged on the hybrid virtual force field (HVFF) method. Significant modelling constraints herein include the choice of navigation path to selected target points, the possible presence of static obstacles along a desired navigation path, the likelihood of encountering a dynamic obstacle along the robot's path, and the chance of it remaining at that position as a static obstacle, hence resulting in a case of re-routing after routing. The proposed algorithm demonstrated a high potential for optimal solutions in terms of efficiency and effectiveness.
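
As a minimal sketch of the shortest-distance visitation-order rule described above, the following greedily heads for the nearest unvisited target and then returns to the start; obstacle classification and the HVFF avoidance step are not reproduced.

```python
import math

def visitation_order(start, targets):
    order, pos, remaining, total = [], start, list(targets), 0.0
    while remaining:
        nxt = min(remaining, key=lambda t: math.dist(pos, t))   # nearest unvisited target
        total += math.dist(pos, nxt)
        order.append(nxt)
        remaining.remove(nxt)
        pos = nxt
    total += math.dist(pos, start)      # return to the initial position
    return order, total

targets = [(4, 1), (1, 5), (6, 6), (2, 2)]
print(visitation_order((0, 0), targets))
```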

Keywords: multi-target, mobile robot, optimal path, static obstacles, dynamic obstacles

Procedia PDF Downloads 259
5946 ADP Approach to Evaluate the Blood Supply Network of Ontario

Authors: Usama Abdulwahab, Mohammed Wahab

Abstract:

This paper presents the application of the uncapacitated facility location problem (UFLP) and the 1-median problem to support decision making in blood supply chain networks. A plethora of factors make blood supply chain networks a complex yet vital problem for the regional blood bank: rapidly increasing demand, criticality of the product, strict storage and handling requirements, and the vastness of the theater of operations. As in the UFLP, facilities can be opened at any of m predefined locations with given fixed costs, and clients have to be allocated to the open facilities. In classical location models, the allocation cost is the distance between a client and an open facility; in this model, the costs are the allocation cost, transportation costs, and inventory costs. In order to address this problem, the median algorithm is used to analyze inventory, evaluate supply chain status, monitor performance metrics at different levels of granularity, and detect potential problems and opportunities for improvement. Euclidean distance data for some Ontario cities (demand nodes) are used to test the developed algorithm. Sitation software, a Lagrangian relaxation algorithm, and branch-and-bound heuristics are used to solve this model. Computational experiments confirm the efficiency of the proposed approach. Compared to existing modeling and solution methods, the median algorithm approach not only provides a more general modeling framework but also leads to efficient solution times in general.
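
A minimal sketch of the UFLP component named above, formulated as an integer program, is shown below using the open-source PuLP solver; the coordinates, fixed costs, and single combined assignment cost are illustrative assumptions (the paper also includes transportation and inventory cost terms and uses Sitation, Lagrangian relaxation, and branch-and-bound).

```python
import math
import pulp

clients = [(0, 0), (2, 1), (5, 4), (6, 0), (1, 6)]          # demand nodes
sites = [(1, 1), (5, 2), (3, 5)]                            # candidate facilities
fixed_cost = [10.0, 12.0, 9.0]
c = [[math.dist(i, j) for j in sites] for i in clients]     # allocation costs

prob = pulp.LpProblem("UFLP", pulp.LpMinimize)
y = [pulp.LpVariable(f"open_{j}", cat="Binary") for j in range(len(sites))]
x = [[pulp.LpVariable(f"assign_{i}_{j}", cat="Binary") for j in range(len(sites))]
     for i in range(len(clients))]

prob += (pulp.lpSum(fixed_cost[j] * y[j] for j in range(len(sites)))
         + pulp.lpSum(c[i][j] * x[i][j]
                      for i in range(len(clients)) for j in range(len(sites))))
for i in range(len(clients)):
    prob += pulp.lpSum(x[i][j] for j in range(len(sites))) == 1   # each client served once
    for j in range(len(sites)):
        prob += x[i][j] <= y[j]                                   # only by an open facility

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([int(v.value()) for v in y], pulp.value(prob.objective))
```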

Keywords: approximate dynamic programming, facility location, perishable product, inventory model, blood platelet, P-median problem

Procedia PDF Downloads 478
5945 Joint Optimization of Carsharing Stations with Vehicle Relocation and Demand Selection

Authors: Jiayuan Wu, Lu Hu

Abstract:

With the development of the sharing economy and mobile technology, carsharing has become more popular. In this paper, we focus on the joint optimization of one-way station-based carsharing systems. We model the problem as an integer linear program with six elements: station locations, station capacity, fleet size, initial vehicle allocation, vehicle relocation, and demand selection. A greedy-based heuristic is proposed to address the model. First, an initialization based on relaxing the location variables is carried out using the Gurobi solver. Then, according to the profit margin and demand satisfaction of each station, the number of stations is downsized iteratively. The method is applied to real taxi data from Chengdu, Sichuan, and is efficient when dealing with a large number of candidate stations. The results show that, with vehicle relocation and demand selection, the profit and demand satisfaction of carsharing systems are increased.

Keywords: one-way carsharing, location, vehicle relocation, demand selection, greedy algorithm

Procedia PDF Downloads 95
5944 Using AI to Advance Factory Planning: A Case Study to Identify Success Factors of Implementing an AI-Based Demand Planning Solution

Authors: Ulrike Dowie, Ralph Grothmann

Abstract:

Rational planning decisions are based upon forecasts, so precise forecasting has a central role in business; the prediction of customer demand is a prime example. This paper introduces recurrent neural networks to model customer demand and combines the forecast with uncertainty measures to derive decision support for the demand planning department. It identifies and describes the keys to the successful implementation of an AI-based solution: bringing together data with business knowledge, AI methods, and user experience, and applying agile software development practices.

Keywords: agile software development, AI project success factors, deep learning, demand forecasting, forecast uncertainty, neural networks, supply chain management

Procedia PDF Downloads 137
5943 Machine Learning Techniques for Estimating Ground Motion Parameters

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site condition. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudo-spectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially in subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, such as the Artificial Neural Network, Random Forest, and Support Vector Machine. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitudes 3 to 5.8, recorded over the hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The main reason for choosing this database stems from the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states. Accuracy of the models in predicting intensity measures, generalization capability of the models for future data, and usability of the models are discussed in the evaluation process. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data are available.
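
A minimal sketch of the Random Forest alternative is given below, with synthetic magnitude, distance, and site-condition features standing in for the ground-motion database; the feature construction and functional form of the synthetic target are illustrative assumptions, and the random-effects treatment of event and site terms is not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000
magnitude = rng.uniform(3.0, 5.8, n)
distance = rng.uniform(4.0, 500.0, n)            # hypocentral distance, km
vs30 = rng.uniform(200.0, 800.0, n)              # site-condition proxy
# synthetic log-PGA with magnitude scaling, geometric spreading, and a site term
log_pga = (1.1 * magnitude - 1.6 * np.log10(distance)
           - 0.4 * np.log10(vs30 / 760.0) + rng.normal(0, 0.3, n))

X = np.column_stack([magnitude, distance, vs30])
X_tr, X_te, y_tr, y_te = train_test_split(X, log_pga, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
```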

Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine

Procedia PDF Downloads 91
5942 Economic Stability in a Small Open Economy with Income Effect on Leisure Demand

Authors: Yu-Shan Hsu

Abstract:

This paper studies a two-sector growth model with a technology of social constant returns and with a utility that features either a zero or a positive income effect on the demand for leisure. The purpose is to investigate how the existence of aggregate instability or equilibrium indeterminacy depends on both the intensity of the income effect on the demand for leisure and the value of the labor supply elasticity. The main finding is that when there is a factor intensity reversal between the private perspective and the social perspective, indeterminacy arises even if the utility has a positive income effect on leisure demand. Moreover, we find that a smaller value of the labor supply elasticity increases the range of the income effect on leisure demand and thus increases the possibility of equilibrium indeterminacy. JEL classification: E3; O41

Keywords: indeterminacy, non-separable preferences, income effect, labor supply elasticity

Procedia PDF Downloads 141
5941 Investigation and Monitoring Method of Vector Density in Kaohsiung City

Authors: Chiu-Wen Chang, I-Yun Chang, Wei-Ting Chen, Hui-Ping Ho, Chao-Ying Pan, Joh-Jong Huang

Abstract:

Dengue is a ‘community disease’ or ‘environmental disease’: as long as the environment provides suitable containers (natural or artificial) for mosquito breeding, an invasion of the virus can lead to a dengue epidemic. Surveillance of vector density is critical to effective infectious disease control and plays an important role in monitoring the dynamics of mosquitoes in the community, such as mosquito species, density, and distribution area. The objective of this study was to examine the vector density surveys (Breteau index, adult index, house index, container index, and larvae index) from 2014 to 2016 in Kaohsiung City and to evaluate the effects of introducing the Breeding Elimination and Appraisal Team (hereinafter referred to as BEAT) as an intervention measure for eliminating dengue vector breeding sites, starting from May 2016. Around people suspected of contracting dengue fever, a surrounding area measuring 50 meters by 50 meters was demarcated as the emergency prevention and treatment zone. BEAT performed weekly vector mosquito inspections, as well as inspections in regions with a high Gravitrap index, and assigned a risk assessment index to each region. These indices, as well as the prevention and treatment results, were immediately reported to epidemic prevention-related units every week. The results indicated that the vector indices from 2014 to 2016 showed no statistically significant differences in the Breteau index, adult index, and house index (p > 0.05), but statistically significant differences in the container index and larvae index (p < 0.05). After executing the integrated elimination work, the container index and larvae index differed significantly across 2014 to 2016 (p < 0.05). A post hoc test indicated that the container index of 2014 (M = 12.793) was significantly higher than that of 2016 (M = 7.631), and that the larvae index of 2015 (M = 34.065) was significantly lower than that of 2014 (M = 66.867). The results revealed that effective vector density surveillance can highlight focal breeding sites so that immediate control action (BEAT) can be implemented, which successfully decreased the vector density and the risk of a dengue epidemic.
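
As a minimal illustration of the indices named above, the sketch below computes the standard house, container, and Breteau indices from toy inspection counts; the figures are illustrative, and the adult and larvae indices depend on survey details not reproduced here.

```python
def house_index(positive_houses, inspected_houses):
    """Percentage of inspected houses with at least one positive container."""
    return 100.0 * positive_houses / inspected_houses

def container_index(positive_containers, inspected_containers):
    """Percentage of inspected water-holding containers that are positive."""
    return 100.0 * positive_containers / inspected_containers

def breteau_index(positive_containers, inspected_houses):
    """Number of positive containers per 100 inspected houses."""
    return 100.0 * positive_containers / inspected_houses

houses, pos_houses = 200, 18
containers, pos_containers = 540, 41
print(house_index(pos_houses, houses),
      container_index(pos_containers, containers),
      breteau_index(pos_containers, houses))
```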

Keywords: Breteau index, dengue control, monitoring method, vector density

Procedia PDF Downloads 152
5940 A Hybrid Algorithm Based on Greedy Randomized Adaptive Search Procedure and Chemical Reaction Optimization for the Vehicle Routing Problem with Hard Time Windows

Authors: Imen Boudali, Marwa Ragmoun

Abstract:

The Vehicle Routing Problem with Hard Time Windows (VRPHTW) is a basic distribution management problem that models many real-world problems. The objective of the problem is to serve a set of customers with known demands on minimum-cost vehicle routes while satisfying vehicle capacity and hard time windows for the customers. In this paper, we propose to deal with this optimization problem by using a new hybrid stochastic algorithm based on two metaheuristics: Chemical Reaction Optimization (CRO) and the Greedy Randomized Adaptive Search Procedure (GRASP). The first method is inspired by the natural process of chemical reactions, which enables the transformation of unstable substances with excessive energy into stable ones. During this process, the molecules interact with each other through a series of elementary reactions to reach the minimum energy for their existence. This property is embedded in CRO to solve the VRPHTW. In order to enhance population diversity throughout the search process, we integrated GRASP into our method. Simulation results on the basis of Solomon's benchmark instances show the very satisfactory performance of the proposed approach.

Keywords: benchmark problems, combinatorial optimization, vehicle routing problem with hard time windows, meta-heuristics, hybridization, GRASP, CRO

Procedia PDF Downloads 372
5939 Performance Evaluation of a Prioritized, Limited Multi-Server Processor-Sharing System that Includes Servers with Various Capacities

Authors: Yoshiaki Shikata, Nobutane Hanayama

Abstract:

We present a prioritized, limited multi-server processor sharing (PS) system in which each server has a different capacity and N (≥2) priority classes are allowed in each PS server. In each prioritized, limited server, a different service ratio is assigned to each class of request, and the number of requests to be processed is limited to less than a certain number. Routing strategies for such prioritized, limited multi-server PS systems that take into account the capacity of each server are also presented, and a performance evaluation procedure for these strategies is discussed. Practical performance measures of these strategies, such as loss probability, mean waiting time, and mean sojourn time, are evaluated via simulation. In the PS server, at the arrival (or departure) of a request, the extension (shortening) of the remaining sojourn time of each request receiving service can be calculated using the number of requests of each class and the priority ratio. Utilising a simulation program which executes these events and calculations, the performance of the proposed prioritized, limited multi-server PS rule can be analyzed. From the evaluation results, the most suitable routing strategy for the loss or waiting system is clarified.

Keywords: processor sharing, multi-server, various capacity, N-priority classes, routing strategy, loss probability, mean sojourn time, mean waiting time, simulation

Procedia PDF Downloads 303
5938 Determination of Safety Distance Around Gas Pipelines Using Numerical Methods

Authors: Omid Adibi, Nategheh Najafpour, Bijan Farhanieh, Hossein Afshin

Abstract:

Energy transmission pipelines are among the most vital infrastructure of each country, and several strict laws have been enacted to enhance the safety of these lines and their vicinity. One of these laws concerns the safety distance around high-pressure gas pipelines. The safety distance refers to the minimum distance from the pipeline beyond which people and equipment are not exposed to serious damage. In the present study, the safety distance around high-pressure gas transmission pipelines was determined using numerical methods. For this purpose, gas leakage from a cracked pipeline and the resulting jet fire were simulated as continuously ignited, three-dimensional, unsteady, and turbulent cases. The numerical simulations were based on the finite volume method, and flow turbulence was modeled using the k-ω SST model. The combustion of the natural gas and air mixture was modeled using the eddy dissipation method. The results show that, due to the high pressure difference between the pipeline and the environment, the flow chokes at the crack and the velocity of the escaping gas reaches the speed of sound. Analysis of the incident radiation results shows that the safety distances around a 42-inch high-pressure natural gas pipeline, based on the 5 and 15 kW/m² criteria, are 205 and 272 meters, respectively.

Keywords: gas pipelines, incident radiation, numerical simulation, safety distance

Procedia PDF Downloads 297
5936 Applying ARIMA Data Mining Techniques to ERP to Generate Sales Demand Forecasting: A Case Study

Authors: Ghaleb Y. Abbasi, Israa Abu Rumman

Abstract:

This paper models sales history archived from 2012 to 2015, binned into monthly buckets, for five products of a medical supply company in Jordan. Consistent patterns extracted from the sales demand history in the Enterprise Resource Planning (ERP) system were used to generate sales demand forecasts with the time series analysis technique called Auto-Regressive Integrated Moving Average (ARIMA). This was used to model and estimate realistic sales demand patterns, predict future demand, and decide the best model for each of the five products. The analysis revealed that the current replenishment system led to inventory overstocking.
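
A minimal sketch of the ARIMA fit-and-forecast step on a toy monthly series is shown below using the statsmodels package (the keywords mention R code; Python is used here purely for illustration); the series, order (p, d, q), and horizon are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# toy monthly demand for one product, 2012-2015 (48 observations)
rng = np.random.default_rng(0)
index = pd.date_range("2012-01-01", periods=48, freq="MS")
demand = pd.Series(200 + 2 * np.arange(48) + rng.normal(0, 15, 48), index=index)

model = ARIMA(demand, order=(1, 1, 1)).fit()    # p=1, d=1, q=1 chosen for illustration
forecast = model.forecast(steps=12)             # 12-month-ahead sales demand forecast
print(forecast.round(1))
```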

Keywords: ARIMA models, sales demand forecasting, time series, R code

Procedia PDF Downloads 352
5936 A Data-Mining Model for Protection of FACTS-Based Transmission Line

Authors: Ashok Kalagura

Abstract:

This paper presents a data-mining model for fault-zone identification of flexible AC transmission system (FACTS)-based transmission lines including a thyristor-controlled series compensator (TCSC) and a unified power-flow controller (UPFC), using ensemble decision trees. Given the randomness in the ensemble of decision trees stacked inside the random forests model, it provides an effective decision for fault-zone identification. Half-cycle post-fault current and voltage samples from the fault inception are used as the input vector, with a binary target output indicating whether the fault lies before or after the TCSC/UPFC. The algorithm is tested on simulated fault data with wide variations in the operating parameters of the power system network, including a noisy environment, providing a reliability measure of 99% with a fast response time (3/4 of a cycle from fault inception). The results of the presented approach using the RF model indicate reliable identification of the fault zone in FACTS-based transmission lines.

Keywords: distance relaying, fault-zone identification, random forests, RFs, support vector machine, SVM, thyristor-controlled series compensator, TCSC, unified power-flow controller, UPFC

Procedia PDF Downloads 399
5935 The Role of Interactive White Boards towards Achieving Transactional Learning in the Context of Open Distance Learning

Authors: M. Van Zyl, M. H. A. Combrinck, E. J. Spamer

Abstract:

Due to the need for higher education in South Africa, the country is experiencing rapid growth in open distance learning, especially in rural areas, as it is difficult for people to enrol full-time at contact universities owing to work and financial constraints. The Unit for Open Distance Learning (UODL) at the North-West University (NWU), Potchefstroom campus, South Africa, was established in 2013 with the main function of delivering open distance learning programmes to 30,000 students from the Faculties of Education Sciences, Theology and Health Sciences. With the use of interactive whiteboards (IWBs), the NWU and UODL are now able to deliver lectures to students concurrently at 60 regional open learning centres across Southern Africa as well as to an unlimited number of individuals with Internet access worldwide. Although IWBs are not new, our initiative is to use them more extensively in order to create more contact between lecturers and students. To ensure and enhance quality education, it is vital to determine students' perceptions of the delivery of programmes by means of IWBs. Therefore, the aim of the study is to explore students' perceptions of the use of IWBs in the delivery of programmes in terms of Moore's Theory of Transactional Distance.

Keywords: interactive white board, open distance learning, technology, transactional learning

Procedia PDF Downloads 424