Search results for: Time series model.

8829 Physical Properties and Resistant Starch Content of Rice Flour Residues Hydrolyzed by α-Amylase

Authors: Waranya Pongpaiboon, Warangkana Srichamnong, Supat Chaiyakul

Abstract:

Enzymatic modification of rice flour can produce highly functional derivatives for use in the food industry. This study aimed to evaluate the physical properties and resistant starch content of rice flour residues hydrolyzed by α-amylase. Rice flour hydrolyzed by α-amylase (60 and 300 U/g) for 1, 24 and 48 hours was investigated. Increasing enzyme concentration and hydrolysis time decreased the lightness (L*) of the rice flour residues but increased their redness (a*) and yellowness (b*). The resistant starch content and peak viscosity increased with hydrolysis time. Pasting temperature, trough viscosity, breakdown, final viscosity, setback and peak time of the hydrolyzed flours were not significantly different (p>0.05). Native flour granules were polygonal with sharp angles and edges and had smooth surfaces without observable pores. After hydrolysis, however, granules with slightly rough and porous surfaces were observed, and surface roughness and porosity increased with hydrolysis time. The X-ray diffraction pattern of the native flour showed an A-type configuration, whereas the hydrolyzed flour showed almost 0% crystallinity, indicating that the amorphous and crystalline structures of the starch were hydrolyzed simultaneously by α-amylase.

Keywords: α-Amylase, Enzymatic hydrolysis, Pasting properties, Resistant starch

8828 The Research and Application of M/M/1/N Queuing Model with Variable Input Rates, Variable Service Rates and Impatient Customers

Authors: Quanru Pan

Abstract:

How a business should set its service speed to maximize profit is a problem worth studying, and it is discussed in this paper using queuing theory. An M/M/1/N queuing model with variable input rates, variable service rates and impatient customers is established, and the following results are derived: the stationary distribution of the model; the relationship between the stationary distribution and the probability that n customers remain in the system when a customer leaves (not counting the departing customer); the busy period of the system; the average operating cycle; the loss probability for customers who do not enter the system on arrival; the mean number of customers who leave the system out of impatience; the loss probability for customers who cannot join the queue because of the limited capacity of the system; and several other indicators. The paper also shows that the intuitive conclusion that the more customers the business serves, the more profit it makes is not correct, and finally identifies the service speed the business should maintain to maximize its profit.

Keywords: variable input rates, impatient customers, variable service rates, profit maximization.
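
As a point of reference (not the paper's model, which adds variable rates and impatience), the classical M/M/1/N queue with constant arrival rate λ and constant service rate μ has the well-known stationary distribution

\[
\pi_n = \frac{(1-\rho)\,\rho^{n}}{1-\rho^{N+1}}, \qquad \rho = \frac{\lambda}{\mu} \neq 1, \qquad n = 0,1,\dots,N,
\]

from which quantities such as the loss probability follow; the paper generalizes such expressions to state-dependent rates and impatient customers.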

8827 Software Maintenance Severity Prediction with Soft Computing Approach

Authors: E. Ardil, Erdem Uçar, Parvinder S. Sandhu

Abstract:

Since the majority of faults are found in only a few modules, there is a need to investigate the modules that are affected more severely than others, and proper maintenance needs to be done on time, especially for critical applications. In this paper, we have applied different predictor models to NASA's public-domain defect dataset, coded in the Perl programming language. Different machine learning algorithms belonging to the different learner categories of the WEKA project, including a Mamdani-based fuzzy inference system and a neuro-fuzzy based system, have been evaluated for modeling maintenance severity, or the impact of fault severity. The results are recorded in terms of accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). They show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models and hence can be used for software maintenance severity prediction.

Keywords: Software Metrics, Fuzzy, Neuro-Fuzzy, Software Faults, Accuracy, MAE, RMSE.
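
For reference, the two error measures reported above are the standard definitions; for n modules with observed severity y_i and predicted severity \hat{y}_i,

\[
\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|y_i-\hat{y}_i\right|,
\qquad
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i-\hat{y}_i\right)^{2}}.
\]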

8826 Face Detection in Color Images using Color Features of Skin

Authors: Fattah Alizadeh, Saeed Nalousi, Chiman Savari

Abstract:

Because of increasing demands for security in today's society, and also because of growing attention to machine vision, biometric research, pattern recognition and data retrieval in color images, face detection has found ever wider application. In this article we present a scientific approach for modeling human skin color and offer an algorithm that detects faces within color images by combining skin features with thresholds determined from the model. The proposed model is based on statistical data in different color spaces. Using specified color thresholds, the algorithm first divides image pixels into two groups, skin pixels and non-skin pixels, and then decides which areas belong to a face based on geometric features of the face. Two main results emerge from this research: first, the proposed model can easily be applied to different databases and color spaces to establish proper thresholds; second, the algorithm can adapt itself to runtime conditions, and its results demonstrate desirable progress in comparison with similar approaches.

Keywords: face detection, skin color modeling, color images, face recognition.
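
A minimal sketch of the skin-pixel thresholding step described above, using OpenCV in Python. The YCrCb bounds are commonly cited illustrative values, not the statistically derived thresholds of the paper, and the geometric face checks are omitted:

```python
import cv2
import numpy as np

def skin_mask(bgr_image):
    """Binary mask of candidate skin pixels via fixed YCrCb thresholds."""
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)     # illustrative bounds
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    # Morphological opening removes small speckles before any geometric face checks.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```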

8825 Multiple Object Tracking using Particle Swarm Optimization

Authors: Chen-Chien Hsu, Guo-Tang Dai

Abstract:

This paper presents a particle swarm optimization (PSO) based approach to multiple object tracking based on histogram matching. To start with, gray-level histograms are calculated to establish a feature model for each of the target objects. The difference between the gray-level histogram corresponding to each particle in the search space and that of the target object is used as the fitness value. Multiple swarms are created, one for each target object under tracking. Because of the efficiency and simplicity of the PSO algorithm for global optimization, target objects can be tracked as iterations continue. Experimental results confirm that the proposed PSO algorithm converges rapidly, allowing real-time tracking of each target object. When the objects being tracked move outside the tracking range, the global search capability of the PSO resumes to re-acquire the target objects.

Keywords: multiple object tracking, particle swarm optimization, gray-level histogram, image
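
A minimal sketch of the histogram-matching fitness that each PSO particle would minimize. The fixed window size, 32 gray-level bins and L1 distance are illustrative assumptions not specified in the abstract:

```python
import numpy as np

def histogram_fitness(frame_gray, particle_xy, template_hist, win=(32, 32)):
    """Distance between the gray-level histogram of the window anchored at a
    particle's position and the target's template histogram; each swarm
    minimises this value against its own target's template."""
    x, y = int(particle_xy[0]), int(particle_xy[1])
    h, w = win
    patch = frame_gray[y:y + h, x:x + w]
    hist, _ = np.histogram(patch, bins=32, range=(0, 256), density=True)
    return float(np.abs(hist - template_hist).sum())
```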

8824 Finite Element Modeling of Rotating Mixing of Toothpaste

Authors: Inamullah Bhatti, Ahsanullah Baloch, Khadija Qureshi

Abstract:

The objective of this research is to examine the shear-thinning behaviour of the mixing flow of a non-Newtonian fluid, such as toothpaste, in a dissolution container with a rotating stirrer. The problem under investigation is related to the chemical industry. Mixing is performed in a cylindrical container with a rotating stirrer that is eccentrically placed on the lid of the container. For simulation purposes, the associated motion of the fluid is treated as rotation of the container with a static stirrer. For numerical prediction, a time-stepping finite element algorithm in a cylindrical polar coordinate system is adopted, based on a semi-implicit Taylor-Galerkin/pressure-correction scheme. Numerical solutions are obtained for non-Newtonian fluids employing the power-law model. Variations in flow structure and pressure drop with the power-law index have been analysed.

Keywords: finite element simulation, mixing fluid, rheology, rotating flow, toothpaste
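
The power-law (Ostwald-de Waele) model referred to above relates apparent viscosity to shear rate as

\[
\eta(\dot{\gamma}) = K\,\dot{\gamma}^{\,n-1},
\]

where K is the consistency index and n the power-law index; n < 1 gives the shear-thinning behaviour typical of toothpaste, while n = 1 recovers a Newtonian fluid.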

8823 Fast and Robust Long-term Tracking with Effective Searching Model

Authors: Thang V. Kieu, Long P. Nguyen

Abstract:

Kernelized Correlation Filter (KCF) based trackers have gained a lot of attention recently because of their accuracy and computational speed. However, the algorithm is not robust in cases where the object is lost through a sudden change of direction, occlusion, or leaving the field of view. In order to improve KCF performance in long-term tracking, this paper proposes an anomaly detection method that warns of target loss by analyzing the response map of each frame, and a random-fern classifier that provides a reliable target re-location mechanism. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the precision and success rate of the proposed algorithm were 2.92 and 2.61 times higher than those of the original KCF algorithm, respectively. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.

Keywords: Correlation filter, long-term tracking, random fern, real-time tracking.
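
A minimal sketch of one common way to flag target loss from a correlation-filter response map, the peak-to-sidelobe ratio (PSR); the paper's exact anomaly criterion and threshold are assumptions here:

```python
import numpy as np

def peak_to_sidelobe_ratio(response, exclude=11):
    """PSR of a response map: a low value suggests the target is occluded or
    lost, which would trigger the re-detection stage."""
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    half = exclude // 2
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, py - half):py + half + 1, max(0, px - half):px + half + 1] = False
    sidelobe = response[mask]
    return float((peak - sidelobe.mean()) / (sidelobe.std() + 1e-12))
```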

8822 Knowledge Discovery and Data Mining Techniques in Textile Industry

Authors: Filiz Ersoz, Taner Ersoz, Erkin Guler

Abstract:

This paper addresses issues and techniques for the textile industry using data mining. Data mining was applied to stitching data for garment products obtained from a textile company. The techniques applied to the data were the CHAID algorithm, the CART algorithm, regression analysis and artificial neural networks. Classification-based analyses were used, and a decision model relating production per person to the variables affecting production was obtained by this method. The results show that as the daily working time increases, the production per person decreases; in other words, the relationship between total daily working time and production per person is negative, and it is the strongest relationship observed.

Keywords: Data mining, textile production, decision trees, classification.

8821 An Adaptive Model for Blind Image Restoration using Bayesian Approach

Authors: S.K. Satpathy, S.K. Nayak, K. K. Nagwanshi, S. Panda, C. Ardil

Abstract:

Image restoration involves the elimination of noise. Filtering techniques have been adopted to restore images for the last five decades. In this paper, we consider the problem of restoring an image degraded by a blur function and corrupted by random noise. A method for reducing additive noise in images by explicit analysis of local image statistics is introduced and compared to other noise reduction methods. The proposed method, which makes use of an a priori noise model, has been evaluated on various types of images. Bayesian-based algorithms and image processing techniques have been described and substantiated through experimentation using MATLAB.

Keywords: Image Restoration, Probability Density Function (PDF), Neural Networks, Bayesian Classifier.

8820 Developing a Model for the Relation between Heritage and Place Identity

Authors: A. Arjomand Kermani, N. Charbgoo, M. Alalhesabi

Abstract:

In a situation of greatly accelerating change and the need for new developments in cities on the one hand, and conservation and regeneration approaches on the other, place identity and its relation to the heritage context have taken on new importance. This relation is generally a mutual and complex one. The significant point is that the process of identifying something as heritage, rather than merely a historical phenomenon, brings what may be inherited into the realm of identity. In planning and urban design, as well as in environmental psychology and phenomenology, place identity and its attributes and components have been studied and discussed. However, the relation between the physical environment (especially heritage) and identity has been neglected in the planning literature. This article aims to review the knowledge in this field and develop a model of the influence and relation between these two major concepts (heritage and identity). To build this conceptual model, we draw on the available literature on place identity and the heritage environment in environmental psychology as well as planning, using a descriptive-analytical methodology to understand how they can inform planning strategies and governance policies. A cross-disciplinary analysis is essential to understand the nature of place identity and the heritage context and to develop a more holistic model of their relationship to be employed in the planning process and decision making. Moreover, this broader and more holistic perspective would enable both social scientists and planners to learn from one another's expertise for a fuller understanding of community dynamics. The result indicates that a combination of these perspectives can provide a richer understanding, not only of how planning shapes our experience of place, but also of how place identity can inform community planning and development.

Keywords: Heritage, inter-disciplinary study, place identity, planning.

8819 Using Speech Emotion Recognition as a Longitudinal Biomarker for Alzheimer’s Disease

Authors: Yishu Gong, Liangliang Yang, Jianyu Zhang, Zhengyu Chen, Sihong He, Xusheng Zhang, Wei Zhang

Abstract:

Alzheimer’s disease (AD) is a progressive neurodegenerative disorder that affects millions of people worldwide and is characterized by cognitive decline and behavioral changes. People living with Alzheimer’s disease often find it hard to complete routine tasks. However, there are limited objective assessments that aim to quantify the difficulty of certain tasks for AD patients compared to non-AD people. In this study, we propose to use speech emotion recognition (SER), especially the frustration level as a potential biomarker for quantifying the difficulty patients experience when describing a picture. We build an SER model using data from the IEMOCAP dataset and apply the model to the DementiaBank data to detect the AD/non-AD group difference and perform longitudinal analysis to track the AD disease progression. Our results show that the frustration level detected from the SER model can possibly be used as a cost-effective tool for objective tracking of AD progression in addition to the Mini-Mental State Examination (MMSE) score.

Keywords: Alzheimer’s disease, Speech Emotion Recognition, longitudinal biomarker, machine learning.

8818 Optimization of Wire EDM Parameters for Fabrication of Micro Channels

Authors: Gurinder Singh Brar, Sarbjeet Singh, Harry Garg

Abstract:

Wire Electric Discharge Machining (WEDM) is a thermal machining process capable of machining electrically conductive materials irrespective of their hardness. WEDM is widely used to machine micro-scale parts with high dimensional accuracy and surface finish. The objective of this paper is to optimize the process parameters of wire EDM to fabricate micro channels, and to measure the surface finish and material removal rate of the micro channels fabricated. The material used is aluminium 6061 alloy. The experiments were performed on a CNC wire-cut electric discharge machine. The effects of various WEDM parameters, namely pulse on time (TON) at levels 100, 150 and 200, pulse off time (TOFF) at levels 25, 35 and 45, and current (IP) at levels 105, 110 and 115, were investigated with respect to the output parameters, i.e. surface roughness and material removal rate (MRR). Each experiment was conducted under different conditions of pulse on time, pulse off time and peak current. For material removal rate, TON and IP were the most significant process parameters: MRR increases with increasing TON and IP and decreases with increasing TOFF. For surface roughness, TON and IP have the greatest effect, while TOFF was found to be less influential.

Keywords: Micro Channels, Wire Electric Discharge Machining (WEDM), Material Removal Rate (MRR), Surface Finish.

8817 Error Correction Method for 2D Ultra-Wideband Indoor Wireless Positioning System Using Logarithmic Error Model

Authors: Phornpat Chewasoonthorn, Surat Kwanmuang

Abstract:

Indoor positioning technologies have evolved rapidly. They augment the Global Positioning System (GPS), which requires line-of-sight to the sky, in tracking the location of people or objects. In this study, we developed an error correction method for an indoor real-time location system (RTLS) based on an ultra-wideband (UWB) sensor from Decawave. Multiple stationary nodes (anchors) were installed throughout the workspace. The distance between a stationary node and a moving node (tag) can be measured using a two-way-ranging (TWR) scheme. The results show that the uncorrected ranging error from the sensor system can be as large as 1 m. To reduce the ranging error and thus increase positioning accuracy, we present an online correction algorithm using the Kalman filter. Experimental results show that the system can reduce the ranging error to 5 cm.

Keywords: Indoor positioning, ultra-wideband, error correction, Kalman filter.
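
A minimal sketch of an online scalar Kalman filter over a stream of UWB range measurements. The noise variances and the static-range process model are illustrative assumptions, and the paper's logarithmic error model is not reproduced here:

```python
import numpy as np

def kalman_filter_ranges(measurements, q=1e-3, r=0.04):
    """Filter noisy TWR range readings online. q: process noise variance,
    r: measurement noise variance (both illustrative)."""
    x, p = float(measurements[0]), 1.0   # initial estimate and covariance
    out = []
    for z in measurements:
        p += q                  # predict: range assumed locally constant
        k = p / (p + r)         # Kalman gain
        x += k * (z - x)        # correct with the new measurement
        p *= (1.0 - k)
        out.append(x)
    return np.array(out)
```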

8816 A New Approach for Counting Passersby Utilizing Space-Time Images

Authors: A. Elmarhomy, S. Karungaru, K. Terada

Abstract:

Understanding the number of people and the flow of persons is useful for efficient promotion of institution management and improvement of a company's sales. This paper introduces an automated method for counting passersby using virtual-vertical measurement lines. A passerby is recognized from an image sequence obtained from a USB camera. A space-time image represents the human regions, which are processed by segmentation. To handle the problem of mismatching, template matching is performed in different color spaces, and the best match is chosen automatically to determine passerby direction and speed. A relation between passerby speed and the human-pixel area is used to distinguish one passerby from two. In the experiment, the camera is fixed at the entrance door of the hall in a side-viewing position. Finally, experimental results verify the effectiveness of the presented method, which correctly detects passersby and counts them by direction with an accuracy of 97%.

Keywords: counting passersby, virtual-vertical measurement line, passerby speed, space-time image

8815 Target Concept Selection by Property Overlap in Ontology Population

Authors: Seong-Bae Park, Sang-Soo Kim, Sewook Oh, Zooyl Zeong, Hojin Lee, Seong Rae Park

Abstract:

An ontology is widely used in many kinds of applications as a knowledge representation tool for domain knowledge. However, even though an ontology schema is well prepared by domain experts, it is tedious and cost-intensive to add instances to the ontology. The most confident and trustworthy way to add instances to the ontology is to gather them from tables in related Web pages. In automatic population of instances, the primary task is to find the most appropriate concept among all possible concepts within the ontology for a given table. This paper proposes a novel method for this problem by defining the similarity between the table and the concept using the overlap of their properties. Across a series of experiments, the proposed method achieves 76.98% accuracy. This implies that the proposed method is a plausible way to populate ontologies automatically from Web tables.

Keywords: Ontology population, domain knowledge consolidation, target concept selection, property overlap.
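
A minimal sketch of selecting a target concept by property overlap, assuming the ontology is given as a mapping from concept names to property-name lists and using a plain Jaccard score (the paper's exact similarity definition may weight properties differently; the example data are hypothetical):

```python
def property_overlap(table_columns, concept_properties):
    """Jaccard-style overlap between a table's column headers and a concept's properties."""
    cols = {c.strip().lower() for c in table_columns}
    props = {p.strip().lower() for p in concept_properties}
    return len(cols & props) / len(cols | props) if cols and props else 0.0

def select_target_concept(table_columns, ontology):
    """Return the concept whose properties overlap most with the table columns."""
    return max(ontology, key=lambda name: property_overlap(table_columns, ontology[name]))

# Hypothetical example
ontology = {"Person": ["name", "birth date", "nationality"],
            "Film": ["title", "director", "release year"]}
print(select_target_concept(["Title", "Director", "Release Year"], ontology))  # Film
```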

8814 Nigerian Football System: Examining Micro-Level Practices against a Global Model for Integrated Development of Mass and Elite Sport

Authors: I. Derek Kaka’an, P. Smolianov, S. Dion, C. Schoen, J. Norberg, C. G. Iortimah

Abstract:

This study examines the current state of football in Nigeria to identify the country's practices, which could be useful internationally, and to determine areas for improvement. Over 200 sources of literature on sport delivery systems in successful sports nations were analyzed to construct a globally applicable model of elite football integrated with mass participation, comprising the following three levels: the macro level (socio-economic, cultural, legislative, and organizational), the meso level (infrastructures, personnel, and services enabling sports programs) and the micro level (operations, processes, and methodologies for the development of individual athletes). The model has received scholarly validation and has been shown to be a framework for program analysis that is not culturally bound. It has recently been utilized for further understanding such sports systems as US rugby, tennis, soccer, swimming, and volleyball, as well as Dutch and Russian swimming. A questionnaire was developed using the above-mentioned model. Survey questions were validated by 12 experts including academicians, executives from sports governing bodies, football coaches, and administrators. To identify best practices and determine areas for improvement of football in Nigeria, 116 coaches completed the questionnaire. Useful exemplars and possible improvements were further identified through semi-structured discussions with 10 Nigerian football administrators and experts. Finally, a content analysis of the Nigeria Football Federation's website and organizational documentation was conducted. This paper focuses on the micro level of Nigerian football delivery, particularly talent search and development as well as advanced athlete preparation and support. Results suggested that Nigeria could share such progressive practices as the provision of football programs in all schools and full-time coaches paid by governments based on the level of coach education. Nigerian football administrators and coaches could provide better football services affordable for all, where success in mass and elite sports is guided by science focused on athletes' needs. International best practices could be better implemented, such as lifelong guidelines for health and excellence for everyone and the integration of fitness tests into player development and ranking, as done in the best Dutch, English, French, Russian, Spanish, and other European clubs; the integration of educational and competitive events for elite and developing athletes as well as fans, as done at the 2018 World Cup in Russia; and academies with multi-stage athlete nurturing, as done by Ajax in Africa as well as by Barcelona FC and other top clubs expanding across the world. The methodical integration of these practices into the balanced development of mass and elite football will help contribute to international sports success as well as national health, education, crime control, and social harmony in Nigeria.

Keywords: Football, high performance, mass participation, Nigeria, sport development.

8813 Analytical Design of IMC-PID Controller for Ideal Decoupling Embedded in Multivariable Smith Predictor Control System

Authors: Le Hieu Giang, Truong Nguyen Luan Vu, Le Linh

Abstract:

In this paper, analytical tuning rules for the IMC-PID controller are presented for the multivariable Smith predictor involving ideal decoupling. Accordingly, the decoupler is first introduced into the multivariable Smith predictor control system by the well-known approach of ideal decoupling, which is compactly extended for general n×n multivariable processes, and the multivariable Smith predictor controller is then obtained in terms of multiple single-loop Smith predictor controllers. The tuning rules for the PID controller in series with a filter are found by using the Maclaurin approximation. Several multivariable industrial processes are employed to demonstrate the simplicity and effectiveness of the presented method. The simulation results show the superior performance of the presented method compared with other methods.

Keywords: Ideal decoupler, IMC-PID controller, multivariable Smith predictor, Maclaurin approximation.

8812 Constant Order Predictor Corrector Method for the Solution of Modeled Problems of First Order IVPs of ODEs

Authors: A. A. James, A. O. Adesanya, M. R. Odekunle, D. G. Yakubu

Abstract:

This paper examines the development of a one-step, five-hybrid-point method for the solution of first order initial value problems. We adopted the method of collocation and interpolation of a power series approximate solution to generate a continuous linear multistep method. The continuous linear multistep method was evaluated at selected grid points to give the discrete linear multistep method. The method was implemented using a constant-order predictor of order seven over an overlapping interval. The basic properties of the derived corrector were investigated and found to be zero-stable, consistent and convergent. The region of absolute stability was also investigated. The method was tested on some numerical experiments and found to compete favorably with existing methods.

Keywords: Interpolation, Approximate Solution, Collocation, Differential system, Half step, Converges, Block method, Efficiency.

8811 Robust Image Transmission Over Time-varying Channels using Hierarchical Joint Source Channel Coding

Authors: Hatem Elmeddeb, Noureddine Hamdi, Ammar Bouallègue

Abstract:

In this paper, a joint source-channel coding (JSCC) scheme for time-varying channels is presented. The proposed scheme uses a hierarchical framework for both the source encoder and transmission via QAM modulation. Hierarchical joint source-channel codes with hierarchical QAM constellations are designed to track channel variations, which yields a higher throughput by adapting certain parameters of the receiver to the channel variation. We consider the problem of still image transmission over time-varying channels with channel state information (CSI) available at 1) the receiver only and 2) both the transmitter and the receiver. We describe an algorithm that optimizes hierarchical source codebooks by minimizing the distortion due to the source quantizer and channel impairments. Simulation results, based on image representation, show that the proposed hierarchical system outperforms conventional schemes based on a single modulator and channel-optimized source coding.

Keywords: Channel-optimized VQ (COVQ), joint optimization, QAM, hierarchical systems.

8810 Simulating Dynamics of Thoracolumbar Spine Derived from Life MOD under Haptic Forces

Authors: K. T. Huynh, I. Gibson, W. F. Lu, B. N. Jagdish

Abstract:

In this paper, the construction of a detailed spine model is presented using the LifeMOD Biomechanics Modeler. The detailed spine model is obtained by refining spine segments in cervical, thoracic and lumbar regions into individual vertebra segments, using bushing elements representing the intervertebral discs, and building various ligamentous soft tissues between vertebrae. In the sagittal plane of the spine, constant force will be applied from the posterior to anterior during simulation to determine dynamic characteristics of the spine. The force magnitude is gradually increased in subsequent simulations. Based on these recorded dynamic properties, graphs of displacement-force relationships will be established in terms of polynomial functions by using the least-squares method and imported into a haptic integrated graphic environment. A thoracolumbar spine model with complex geometry of vertebrae, which is digitized from a resin spine prototype, will be utilized in this environment. By using the haptic technique, surgeons can touch as well as apply forces to the spine model through haptic devices to observe the locomotion of the spine which is computed from the displacement-force relationship graphs. This current study provides a preliminary picture of our ongoing work towards building and simulating bio-fidelity scoliotic spine models in a haptic integrated graphic environment whose dynamic properties are obtained from LifeMOD. These models can be helpful for surgeons to examine kinematic behaviors of scoliotic spines and to propose possible surgical plans before spine correction operations.

Keywords: Haptic interface, LifeMOD, spine modeling.
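
A minimal sketch of the least-squares polynomial fit of the displacement-force relationship described above; the sample values and the cubic degree are hypothetical, standing in for the data recorded from LifeMOD:

```python
import numpy as np

force = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])     # applied force, N (hypothetical)
displacement = np.array([0.0, 1.2, 2.1, 2.8, 3.3, 3.6])   # sagittal displacement, mm (hypothetical)

coeffs = np.polyfit(force, displacement, deg=3)   # least-squares cubic fit
model = np.poly1d(coeffs)

# The haptic environment can then evaluate displacement for any force the surgeon applies.
print(model(25.0))
```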

8809 Examining Effects of Electronic Market Functions on Decrease in Product Unit Cost and Response Time to Customer

Authors: Maziyar Nouraee

Abstract:

Electronic markets have contributed remarkably to business transactions in recent decades. Many organizations consider traditional ways of trading uneconomical and therefore trade only through electronic markets. There are different categorizations of electronic market functions. In one classification, the functions of electronic markets are categorized into three classes: information, transactions, and value added. In the present paper, the effects of the three classes on two major elements of supply chain management are measured. The two elements are the decrease in product unit cost and the reduction in response time to the customer. The results of the current research show that, among the nine minor elements related to the three classes of electronic market functions, six factors influence the reduction of product unit cost and three factors influence the reduction of response time to the customer.

Keywords: Electronic Commerce, Electronic Market, B2B Trade, Supply Chain Management.

8808 Enhancing Performance of Bluetooth Piconets Using Priority Scheduling and Exponential Back-Off Mechanism

Authors: Dharmendra Chourishi “Maitraya”, Sridevi Seshadri

Abstract:

Bluetooth is a personal wireless communication technology that is being applied in many scenarios. It is an emerging standard for short-range, low-cost, low-power wireless access technology. Current MAC (Medium Access Control) scheduling schemes only provide best-effort service for all master-slave connections. It is very challenging to provide QoS (Quality of Service) support for different connections because of the master-driven TDD (Time Division Duplex) feature, and no existing solution supports both the delay and bandwidth guarantees required by real-time applications. This paper addresses the issue of how to enhance QoS support in a Bluetooth piconet. The Bluetooth specification proposes a Round Robin scheduler as a possible solution for scheduling the transmissions in a Bluetooth piconet. We propose an algorithm that reduces bandwidth waste and enhances the efficiency of the network. We define token counters to estimate the traffic of real-time slaves. To increase bandwidth utilization, a back-off mechanism is then presented for best-effort slaves to decrease the frequency of polling idle slaves. Simulation results demonstrate that our scheme achieves better performance than Round Robin scheduling.

Keywords: Piconet, Medium Access Control, Polling algorithm, Scheduling, QoS, Time Division Duplex (TDD).
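
A minimal sketch of an exponential back-off rule for polling best-effort slaves, with illustrative minimum and maximum intervals; the paper's token-counter bookkeeping for real-time slaves is not shown:

```python
def next_poll_interval(current_interval, slave_had_data, t_min=2, t_max=64):
    """Polling interval (in slots) for a best-effort slave: reset when the slave
    had data in the last poll, otherwise double it up to a ceiling."""
    if slave_had_data:
        return t_min
    return min(current_interval * 2, t_max)

# Example: an idle slave is polled less and less often.
interval = 2
for _ in range(6):
    interval = next_poll_interval(interval, slave_had_data=False)
    print(interval)   # 4, 8, 16, 32, 64, 64
```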

8807 Understanding Factors Influencing E-Government Implementation in Saudi Arabia from an Organizational Perspective

Authors: M. Alassim, M. Alfayad, E. Abbott-Halpin

Abstract:

The purpose of this paper is to explore the organizational factors influencing the implementation of the e-government project within the public sector in Saudi Arabia. This project (also known as the Yesser programme) was established in Saudi Arabia in 2005 to control the e-government transformation process. The aims of the project are to provide a collaborative environment for government organizations to implement e-government and increase effectiveness and efficiency within the public sector. This paper sheds light on the organizational factors that have delayed implementation and achievement of the government’s vision and plans for Yesser. A qualitative approach was employed to understand those factors, by conducting a series of interviews with government officials for the data collection required. The analysis of the data uncovered seven organizational factors that are needed to advance implementation of the e-government project in Saudi Arabia and other similar states.

Keywords: E-government, e-transformation, ICT, Saudi Arabia, Yesser.

8806 Mean Shift-based Preprocessing Methodology for Improved 3D Buildings Reconstruction

Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour

Abstract:

In this work, we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data, acquired from airborne scanners, from densely built urban areas. On one hand, high resolution image data corrupted by noise caused by lossy compression techniques are appropriately smoothed while at the same time preserving the optical edges and, on the other, low resolution LiDAR data in the form of normalized Digital Surface Map (nDSM) is upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and upsampling capabilities using synthetic RGB-z data show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology for 3D reconstruction of buildings of a pilot region of Athens, Greece results in a significant visual improvement of the 3D building block model.

Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift.
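
A minimal sketch of the edge-preserving smoothing step using OpenCV's mean shift filtering; the file names and the spatial/colour radii are illustrative, and the joint mean shift upsampling of the nDSM is not shown:

```python
import cv2

img = cv2.imread("ortho_tile.png")                  # hypothetical noisy RGB tile
smoothed = cv2.pyrMeanShiftFiltering(img, 10, 20)   # spatial radius 10 px, colour radius 20
cv2.imwrite("ortho_tile_smoothed.png", smoothed)
```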

8805 A Game-Based Product Modelling Environment for Non-Engineer

Authors: Guolong Zhong, Venkatesh Chennam Vijay, Ilias Oraifige

Abstract:

In the last 20 years, Knowledge Based Engineering (KBE) has shown its advantages in product development in different engineering areas such as automation, mechanical, civil and aerospace engineering in terms of digital design automation and cost reduction by automating repetitive design tasks through capturing, integrating, utilising and reusing the existing knowledge required in various aspects of the product design. However, in primary design stages, the descriptive information of a product is discrete and unorganized, while knowledge is in various forms instead of pure data. Thus, it is crucial to have an integrated product model which can represent the entire product information and its associated knowledge at the beginning of the product design. One of the shortcomings of existing product models is a lack of required knowledge representation in various aspects of product design and its mapping to an interoperable schema. To overcome the limitations of the existing product models and methodologies, two key factors are considered. First, the product model must have well-defined classes that can represent the entire product information and its associated knowledge. Second, the product model needs to be represented in an interoperable schema to ensure a steady data exchange between different product modelling platforms and CAD software. This paper introduces a method to provide a general product model as a generative representation of a product, which consists of geometric and non-geometric information, through a product modelling framework. The proposed method for capturing knowledge from the designers through a knowledge file provides a simple and efficient way of collecting and transferring knowledge. Further, the knowledge schema provides a clear view of and format for the data that need to be gathered in order to achieve a unified knowledge exchange between different platforms. This study used a game-based platform to make the product modelling environment accessible to non-engineers. Further, the paper goes on to test a use case based on the proposed game-based product modelling environment to validate its effectiveness among non-engineers.

Keywords: Game-based learning, knowledge based engineering, product modelling, design automation.

8804 An Exploratory Study in Nursing Education: Factors Influencing Nursing Students’ Acceptance of Mobile Learning

Authors: R. Abdulrahman, A. Eardley, A. Soliman

Abstract:

The proliferation in the development of mobile learning (m-learning) has played a vital role in the rapidly growing electronic learning market. This relatively new technology can help to encourage the development of learning and to aid knowledge transfer in a number of areas, by familiarizing students with innovative information and communications technologies (ICT). M-learning plays a substantial role in the deployment of learning methods for nursing students by using the Internet and portable devices to access learning resources ‘anytime and anywhere’. However, acceptance of m-learning by students is critical to the successful use of m-learning systems. Thus, there is a need to study the factors that influence students’ intention to use m-learning. This paper addresses this issue. It outlines the outcomes of a study that evaluates the unified theory of acceptance and use of technology (UTAUT) model as applied to the subject of user acceptance in relation to m-learning activity in nurse education. The model integrates the significant components across eight prominent user acceptance models. Therefore, a standard measure is introduced with core determinants of user behavioural intention. The research model extends the UTAUT in the context of m-learning acceptance by modifying and adding individual innovativeness (II) and quality of service (QoS) to the original structure of UTAUT. The paper goes on to add the factors of previous experience (of using mobile devices in similar applications) and the nursing students’ readiness (to use the technology) as influences on their behavioural intentions to use m-learning. This study uses a technique called ‘convenience sampling’, which involves student volunteers as participants, in order to collect numerical data. A quantitative method of data collection was selected and involves an online survey using a questionnaire form. This form contains 33 questions to measure the six constructs, using a 5-point Likert scale. A total of 42 respondents participated, all from the Nursing Institute at the Armed Forces Hospital in Saudi Arabia. The gathered data were then tested using a research model that employs structural equation modelling (SEM), including confirmatory factor analysis (CFA). The results of the CFA show that the UTAUT model has the ability to predict student behavioural intention and to adapt m-learning activity to specific learning activities. It also demonstrates satisfactory, dependable and valid scales of the model constructs. This suggests further analysis to confirm the model as a valuable instrument for evaluating user acceptance of m-learning activity.

Keywords: Mobile learning, nursing institute, unified theory of acceptance and use of technology model.

8803 An Approach for Coagulant Dosage Optimization Using Soft Jar Test: A Case Study of Bangkhen Water Treatment Plant

Authors: Ninlawat Phuangchoke, Waraporn Viyanon, Setta Sasananan

Abstract:

The most important process in a water treatment plant is coagulation, which uses alum and poly aluminum chloride (PACL); determining the dosage of alum and PACL is therefore the most important factor to be prescribed. This research applies an artificial neural network (ANN) trained with the Levenberg–Marquardt algorithm to create a mathematical model (Soft Jar Test) for predicting the chemical doses used for coagulation, namely alum and PACL, with input data consisting of turbidity, pH, alkalinity, conductivity, and oxygen consumption (OC) at the Bangkhen Water Treatment Plant (BKWTP), under the authority of the Metropolitan Waterworks Authority of Thailand. The data were collected from 1 January 2019 to 31 December 2019 in order to cover the changing seasons of Thailand. The input data for the ANN are divided into three groups: a training set, a test set, and a validation set. The coefficient of determination and mean absolute error are 0.73 and 3.18 for the alum model, and 0.59 and 3.21 for the PACL model, respectively.

Keywords: Soft jar test, jar test, water treatment plant process, artificial neural network.
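
A minimal structural sketch of a dose-prediction model with the five listed inputs; synthetic data stands in for the BKWTP records, and scikit-learn's MLPRegressor trains with Adam or L-BFGS rather than the Levenberg–Marquardt algorithm used in the paper:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((500, 5))          # turbidity, pH, alkalinity, conductivity, OC (synthetic)
y = X @ np.array([3.0, 1.0, 0.5, 0.2, 2.0]) + rng.normal(0, 0.1, 500)   # synthetic alum dose

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))   # coefficient of determination on held-out data
```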

8802 Integration of Asian Stock Markets

Authors: Noor A. Auzairy, Rubi Ahmad, Catherine S.F. Ho, Ros Z. Z. Sapian

Abstract:

This paper explores the relationship and the level of stock market integration of Asian countries, primarily Malaysia, Thailand, Indonesia, and South Korea, with the world market from January 1997 to December 2009. The degrees of short-run and long-run stock market integration of those Asian countries are analyzed in order to determine the significance of a series of regional and world financial crises, liberalization policies and other financial reforms in influencing the level of stock market integration. To test for cointegration, this paper applies correlation coefficients, univariate regression analyses, cointegration tests, and vector autoregressive models (VAR), using the main indices of the four Asian stock markets and the MSCI World index. The empirical findings reveal that there is no long-run stock market integration between the four countries and the world market; however, there is short-run integration.

Keywords: Asia, integration, relationship, stock market.
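
A minimal sketch of a pairwise Engle–Granger cointegration test between a national index and the world index using statsmodels; random walks stand in for the actual index series, and the paper additionally uses correlation, regression and VAR analyses:

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(1)
national_index = np.cumsum(rng.normal(size=500))   # stand-in for, e.g., a Malaysian index
world_index = np.cumsum(rng.normal(size=500))      # stand-in for the MSCI World index

t_stat, p_value, _ = coint(national_index, world_index)
print(p_value)   # a large p-value gives no evidence of long-run integration
```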

8801 Enhanced Shell Sorting Algorithm

Authors: Basit Shahzad, Muhammad Tanvir Afzal

Abstract:

Many algorithms are available for sorting unordered elements. The most important of them are bubble sort, heap sort, insertion sort and Shell sort. These algorithms have their own pros and cons. Shell sort, which is an enhanced version of insertion sort, reduces the number of swaps of the elements being sorted to minimize complexity and time as compared to insertion sort. Shell sort improves the efficiency of insertion sort by quickly shifting values to their destination. Average sort time is O(n^1.25), while worst-case time is O(n^1.5). It performs several iterations; in each iteration it swaps some elements of the array in such a way that in the last iteration, when the value of h is one, the number of swaps is reduced. Donald L. Shell invented a formula to calculate the value of h. This work focuses on identifying an improvement to the conventional Shell sort algorithm: the 'Enhanced Shell Sort algorithm' is an improvement in the way the value of h is calculated. It has been observed that by applying this algorithm, the number of swaps can be reduced by up to 60 percent as compared to the existing algorithm. In some other cases this enhancement was found to be faster than the existing algorithms available.

Keywords: Algorithm, Computation, Shell, Sorting.
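
For context, a minimal sketch of the conventional Shell sort with the original halving gap sequence; the paper's modified formula for h is not reproduced here:

```python
def shell_sort(a):
    """In-place Shell sort using Shell's original h = h // 2 gap sequence."""
    n = len(a)
    h = n // 2
    while h > 0:
        # Gapped insertion sort: shift elements h apart toward their final position.
        for i in range(h, n):
            key = a[i]
            j = i
            while j >= h and a[j - h] > key:
                a[j] = a[j - h]
                j -= h
            a[j] = key
        h //= 2
    return a

print(shell_sort([23, 5, 42, 1, 17, 8]))   # [1, 5, 8, 17, 23, 42]
```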

8800 Convergence Analysis of an Alternative Gradient Algorithm for Non-Negative Matrix Factorization

Authors: Chenxue Yang, Mao Ye, Zijian Liu, Tao Li, Jiao Bao

Abstract:

Non-negative matrix factorization (NMF) is a useful computational method for finding basis information of multivariate non-negative data. A popular approach to solving the NMF problem is the multiplicative update (MU) algorithm, but it has some defects, so the column-wisely alternating gradient (cAG) algorithm was proposed. In this paper, we analyze the convergence of the cAG algorithm and show its advantages over the MU algorithm. The stability of the equilibrium point is used to prove the convergence of the cAG algorithm. A classic model is used to obtain the equilibrium point, and invariant sets are constructed to guarantee the integrity of the stability. Finally, the convergence conditions of the cAG algorithm are obtained, which help reduce the evaluation time and are confirmed in the experiments. Using the same method, we verify that the MU algorithm has a zero divisor and is convergent at zero. In addition, the convergence conditions of the MU algorithm at zero are similar to those of the cAG algorithm at non-zero points. However, it is meaningless to discuss convergence at zero, which is not usually the result we want from NMF. Thus, we theoretically illustrate the advantages of the cAG algorithm.

Keywords: Non-negative matrix factorizations, convergence, cAG algorithm, equilibrium point, stability.
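
For comparison with the cAG algorithm analyzed above, a minimal sketch of the Lee-Seung multiplicative update (MU) baseline; the small epsilon guards the zero-divisor issue mentioned in the abstract, and the data are synthetic:

```python
import numpy as np

def nmf_mu(V, rank, iters=200, eps=1e-9):
    """Factor V ~ W @ H with non-negative W, H via multiplicative updates (MU)."""
    m, n = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H

V = np.random.default_rng(1).random((20, 30))
W, H = nmf_mu(V, rank=5)
print(np.linalg.norm(V - W @ H))   # reconstruction error decreases over the iterations
```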
