Search results for: Gaussian processes
Paper Count: 1874

1454 Ambient Intelligence in the Production and Retail Sector: Emerging Opportunities and Potential Pitfalls

Authors: Carsten Röcker

Abstract:

This paper provides an introduction to the evolution of information and communication technology and illustrates its usage in the work domain. The paper is sub-divided into two parts. The first part gives an overview of the different phases of information processing in the work domain. It starts by charting the past and present usage of computers in work environments and shows current technological trends that are likely to influence future business applications. The second part begins by briefly describing how the usage of computers changed business processes in the past, and then presents the first Ambient Intelligence applications based on identification and localization information, which are already used in the production and retail sector. Based on current systems and prototype applications, the paper gives an outlook on how Ambient Intelligence technologies could change business processes in the future.

Keywords: Ambient Intelligence, Ubiquitous Computing, Business Applications, Radio Frequency Identification (RFID)

1453 Developing a Web-Based Tender Evaluation System Based on Fuzzy Multi-Attributes Group Decision Making for Nigerian Public Sector Tendering

Authors: Bello Abdullahi, Yahaya M. Ibrahim, Ahmed D. Ibrahim, Kabir Bala

Abstract:

Public sector tendering has traditionally been conducted using manual paper-based processes, which are known to be inefficient, less transparent and more prone to manipulation and error. The advent of the Internet and the World Wide Web has led to the development of numerous e-Tendering systems that address some of the problems associated with the manual paper-based tendering system. However, most of these systems rarely support the evaluation of tenders, and where they do, it is mostly based on a single decision maker, which is not suitable in public sector tendering, where, for the sake of objectivity, transparency and fairness, the evaluation must be conducted through a tender evaluation committee. Currently, in Nigeria, the public tendering process in general, and the evaluation of tenders in particular, are largely conducted using manual paper-based processes. Automating these manual processes into digital processes can help enhance the proficiency of public sector tendering in Nigeria. This paper is part of a larger study to develop an electronic tendering system that supports the whole tendering lifecycle based on Nigerian procurement law. Specifically, this paper presents the design and implementation of the part of the system that supports group evaluation of tenders based on a technique called fuzzy multi-attributes group decision making. The system was developed using object-oriented methodologies and the Unified Modelling Language and hypothetically applied to the evaluation of technical and financial proposals submitted by bidders. The system was validated by professionals with extensive experience in public sector procurement. The results of the validation showed that the system, called NPS-eTender, has an average rating of 74% with respect to correct and accurate modelling of the existing manual tendering domain and an average rating of 67.6% with respect to its potential to enhance the proficiency of public sector tendering in Nigeria. Thus, based on the results of the validation, automating the evaluation process to support the tender evaluation committee is achievable and can lead to a more proficient public sector tendering system.

Keywords: e-Tendering, e-Procurement, public tendering, tender evaluation, tender evaluation committee, web-based group decision support system.

1452 VaR Forecasting in Times of Increased Volatility

Authors: Ivo Jánský, Milan Rippel

Abstract:

The paper evaluates several hundred one-day-ahead VaR forecasting models over the period 2004 to 2009 on data from six world stock indices: DJI, GSPC, IXIC, FTSE, GDAXI and N225. The models describe the mean using ARMA processes with up to two lags and the variance with one of the GARCH, EGARCH or TARCH processes with up to two lags. The models are estimated on data from the in-sample period, and their forecasting accuracy is evaluated on the out-of-sample data, which are more volatile. The main aim of the paper is to test whether a model estimated on data with lower volatility can be used in periods of higher volatility. The evaluation is based on the conditional coverage test and is performed on each stock index separately. The primary result of the paper is that the volatility is best modelled using a GARCH process and that an ARMA pattern cannot be found in the analyzed time series.
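
For orientation only (not the authors' code), the sketch below shows one-day-ahead Gaussian VaR forecasts from a GARCH(1,1) volatility recursion together with a simple violation-rate check; the parameters omega, alpha and beta are placeholder values that would normally be estimated on the in-sample period, and the paper's conditional coverage test additionally checks the independence of violations.

```python
# Illustrative sketch: one-day-ahead Gaussian VaR from a GARCH(1,1) recursion,
# with a simple out-of-sample violation-rate check. Parameters are placeholders.
import numpy as np
from scipy.stats import norm

def garch11_var(returns, omega=1e-6, alpha=0.08, beta=0.90, level=0.01):
    """Return a series of one-day-ahead VaR forecasts at the given tail level."""
    sigma2 = np.var(returns)                 # initialise conditional variance
    var_forecasts = []
    for r in returns:
        var_forecasts.append(norm.ppf(level) * np.sqrt(sigma2))  # next-day VaR
        sigma2 = omega + alpha * r**2 + beta * sigma2            # GARCH(1,1) update
    return np.array(var_forecasts)

# Toy usage with simulated returns standing in for an index
rng = np.random.default_rng(0)
rets = rng.normal(0, 0.01, 1500)
var_1pct = garch11_var(rets[1000:])          # "out-of-sample" block
violations = (rets[1000:] < var_1pct).mean() # should stay close to 1%
print(f"observed violation rate: {violations:.3%}")
```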

Keywords: VaR, risk analysis, conditional volatility, GARCH, EGARCH, TARCH, moving average process, autoregressive process

1451 Smart Surveillance using PDA

Authors: Basem Mustafa Abd. Amer, Syed Abdul Rahman Al-Attas

Abstract:

The aim of this research is to develop a fast and reliable surveillance system based on a personal digital assistant (PDA) device, extending to the PDA the moving-object detection capability already available on personal computers, and to compare the performance of the Background Subtraction (BS) and Temporal Frame Differencing (TFD) techniques to determine which is more suitable for the PDA platform. In order to reduce noise and to prepare frames for the moving object detection stage, each frame is first converted to a gray-scale representation and then smoothed using a Gaussian low-pass filter. Two moving object detection schemes, BS and TFD, have been analyzed. The background frame is updated using an Infinite Impulse Response (IIR) filter so that it adapts to varying illumination conditions and geometry settings. In order to reduce the effect of noise pixels resulting from frame differencing, morphological erosion and dilation filters are applied. In this research, it has been found that the TFD technique is more suitable than BS for motion detection in terms of speed; on average, TFD is approximately 170 ms faster than the BS technique.
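
As an illustration only (not the authors' PDA implementation), the following sketch contrasts the two detection schemes with OpenCV: Gaussian smoothing of gray-scale frames, temporal frame differencing against the previous frame, background subtraction against an IIR-updated background, and erosion/dilation to suppress noise pixels. The video path, threshold, blur size and IIR learning rate are assumed values.

```python
# Minimal sketch of TFD vs. BS with an IIR-updated background (OpenCV, not PDA code).
import cv2
import numpy as np

cap = cv2.VideoCapture("video.avi")   # placeholder video source
ok, frame = cap.read()
gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (5, 5), 0)
prev = gray.copy()
background = gray.astype(np.float32)  # IIR-updated background model
kernel = np.ones((3, 3), np.uint8)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (5, 5), 0)

    # Temporal Frame Differencing (TFD)
    tfd = cv2.absdiff(gray, prev)
    prev = gray.copy()

    # Background Subtraction (BS) with IIR background update
    bs = cv2.absdiff(gray, cv2.convertScaleAbs(background))
    cv2.accumulateWeighted(gray, background, 0.05)   # IIR filter, alpha = 0.05

    for diff in (tfd, bs):
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        mask = cv2.dilate(cv2.erode(mask, kernel), kernel)  # remove noise pixels
```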

Keywords: Surveillance, PDA, Motion Detection, Image Processing, Background Subtraction.

1450 Environmental Potentials within the Production of Asphalt Mixtures

Authors: Florian Gschösser, Walter Purrer

Abstract:

The paper shows examples for the (environmental) optimization of production processes for asphalt mixtures applied in typical road pavements in Austria and Switzerland. The conducted “from-cradle-to-gate” LCA firstly analyzes the production of one cubic meter of asphalt and secondly all material production processes for exemplary highway pavements applied in Austria and Switzerland. It is shown that environmental impacts can be reduced by the application of reclaimed asphalt pavement (RAP) and by the optimization of specific production characteristics, e.g. the reduction of the initial moisture of the mineral aggregate and the reduction of the mixing temperature by the application of low-viscosity and foam bitumen. The results of the LCA study demonstrate reduction potentials per cubic meter of asphalt of up to 57% (Global Warming Potential, GWP) and 77% (Ozone Depletion Potential, ODP). The analysis per square meter of asphalt pavement determined environmental potentials of up to 40% (GWP) and 56% (ODP).

Keywords: Asphalt mixtures, environmental potentials, life cycle assessment, material production.

1449 Automatic Musical Genre Classification Using Divergence and Average Information Measures

Authors: Hassan Ezzaidi, Jean Rouat

Abstract:

Recently, much research has been conducted to retrieve pertinent parameters and adequate models for automatic music genre classification. In this paper, two measures based upon information theory concepts are investigated for mapping the feature space to the decision space. A Gaussian Mixture Model (GMM) is used as a baseline and reference system. Various strategies are proposed for training and testing sessions with matched or mismatched conditions: long training and long testing, and long training and short testing. For all experiments, the file sections used for testing were never used during training. With matched conditions, all examined measures yield the best and similar scores (almost 100%). With mismatched conditions, the proposed measures yield better scores than the GMM baseline system, especially for the short testing case. It is also observed that the average discrimination information measure is most appropriate for music category classification, while the divergence measure is more suitable for music subcategory classification.
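
A minimal sketch of the kind of GMM baseline referred to above, assuming pre-computed audio feature vectors (e.g. MFCCs) per genre; the genres and random "features" below are placeholders, and the paper's information-theoretic measures are not reproduced.

```python
# One GMM per genre; a test excerpt is assigned to the genre with the
# highest average log-likelihood under its GMM.
import numpy as np
from sklearn.mixture import GaussianMixture

def train_gmm_per_genre(features_by_genre, n_components=8):
    """Fit one diagonal-covariance GMM per genre on its training features."""
    return {g: GaussianMixture(n_components, covariance_type="diag",
                               random_state=0).fit(X)
            for g, X in features_by_genre.items()}

def classify(models, X_test):
    """Pick the genre whose GMM gives the highest average log-likelihood."""
    scores = {g: m.score(X_test) for g, m in models.items()}
    return max(scores, key=scores.get)

# Toy usage with random stand-in "features"
rng = np.random.default_rng(0)
train = {"rock": rng.normal(0, 1, (500, 13)), "jazz": rng.normal(2, 1, (500, 13))}
models = train_gmm_per_genre(train)
print(classify(models, rng.normal(2, 1, (100, 13))))   # expected: "jazz"
```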

Keywords: Audio feature, information measures, music genre.

1448 The Potential Use of Nanofilters to Supply Potable Water in Persian Gulf and Oman Sea Watershed Basin

Authors: Sara Zamani, Mojtaba Fazeli, Abdollah Rashidi Mehrabadi

Abstract:

In a world worried about water resources, with the shadow of drought and famine looming all around, the quality of water is as important as its quantity. The source of all concerns is the constant reduction of per capita quality water for different uses. With an average annual precipitation of 250 mm compared to the 800 mm world average, Iran is considered a water-scarce country, and the disparity in rainfall distribution, the limitations of renewable resources and the population concentration on the margins of deserts and water-scarce areas have intensified the problem. The shortage of per capita renewable freshwater and its poor quality in large areas of the country, which have saline, brackish or hard water resources, and the profusion of natural and artificial pollutants have caused the deterioration of water quality. Among the methods for treating and using these waters is the application of membrane technologies, which have come into focus in recent years due to their great advantages. This process is quite efficient in eliminating multivalent ions, and due to the possibility of production at different capacities, application as a treatment process at points of use, and the need for less energy in comparison to Reverse Osmosis processes, it can revolutionize the water and wastewater sector in the years to come. The article studies the different capacities of water resources in the Persian Gulf and Oman Sea watershed basins and assesses the possibility of using the nanofiltration process to treat brackish and non-conventional waters in these basins.

Keywords: Membrane processes, saline waters, brackish waters, hard waters, zoning water quality in the Persian Gulf and the Oman Sea Watershed area, nanofiltration.

1447 Time Comparative Simulator for Distributed Process Scheduling Algorithms

Authors: Nazleeni Samiha Haron, Anang Hudaya Muhamad Amin, Mohd Hilmi Hasan, Izzatdin Abdul Aziz, Wirdhayu Mohd Wahid

Abstract:

In any distributed system, process scheduling plays a vital role in determining the efficiency of the system. Process scheduling algorithms are used to ensure that the components of the system maximize their utilization and complete all assigned processes within a specified period of time. This paper focuses on the development of a comparative simulator for distributed process scheduling algorithms. The objectives of the work carried out include the development of the comparative simulator as well as a comparative study between three distributed process scheduling algorithms: sender-initiated, receiver-initiated and hybrid sender-receiver-initiated algorithms. The comparative study was based on the Average Waiting Time (AWT) and Average Turnaround Time (ATT) of the processes involved. The simulation results show that the performance of the algorithms depends on the number of nodes in the system.
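
For illustration only, the two evaluation metrics can be computed as below for a toy single-queue, arrival-order schedule; this is not the simulator itself, and the sender-, receiver- and hybrid-initiated policies are not reproduced.

```python
# Average Waiting Time (AWT) and Average Turnaround Time (ATT) for processes
# served in arrival order on a single node.
def awt_att(processes):
    """processes: list of (arrival_time, burst_time) tuples."""
    clock, waits, turnarounds = 0, [], []
    for arrival, burst in sorted(processes):
        start = max(clock, arrival)
        waits.append(start - arrival)            # waiting time
        clock = start + burst
        turnarounds.append(clock - arrival)      # turnaround = wait + burst
    n = len(processes)
    return sum(waits) / n, sum(turnarounds) / n

awt, att = awt_att([(0, 5), (1, 3), (2, 8)])
print(f"AWT = {awt:.2f}, ATT = {att:.2f}")
```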

Keywords: Distributed Systems, Load Sharing, Process Scheduling, AWT and ATT

1446 Proteolysis in Serbian Traditional Dry Fermented Sausage Petrovská klobása as Influenced by Different Ripening Processes

Authors: P. M. Ikonić, T. A. Tasić, L. S. Petrović, S. B. Škaljac, M. R. Jokanović, V. M. Tomović, B. V. Šojić, N. R. Džinić, A. M. Torbica, B. B. Ikonić

Abstract:

The aim of the study was to determine how different ripening processes (traditional vs. industrial) influence proteolysis in the traditional Serbian dry-fermented sausage Petrovská klobása. The obtained results indicated a more intensive pH decline (0.7 units after 9 days) in industrially ripened products (I), which had a positive impact on the drying process and proteolytic changes in these samples. Thus, the moisture content in I sausages was lower at each sampling time, amounting to 24.7% at the end of the production period (90 days). Likewise, the process of proteolysis was more pronounced in I samples, resulting in higher contents of non-protein nitrogen (NPN) and free amino acid nitrogen (FAAN), as well as in faster and more intensive degradation of myosin (≈220 kDa), actin (≈45 kDa) and other polypeptides during processing. Consequently, the appearance and accumulation of several protein fragments were registered.

Keywords: Dry-fermented sausage, Petrovská klobása, proteolysis, ripening process.

1445 Parametric Modeling Approach for Call Holding Times for IP based Public Safety Networks via EM Algorithm

Authors: Badarch Tuyatsetseg

Abstract:

This paper presents parametric probability density models for call holding times (CHTs) into an emergency call center, based on actual data collected over a week in the public Emergency Information Network (EIN) in Mongolia. When a set of chosen candidates from the Gamma distribution family is fitted to the call holding time data, it is observed that the whole area of the CHT empirical histogram is underestimated due to spikes of higher probability and long tails of lower probability in the histogram. Therefore, we provide a Gaussian-based parametric model, a mixture of lognormal distributions with explicit analytical expressions, for modeling the CHTs of public safety networks (PSNs). Finally, we show that the CHTs for PSNs are fitted reasonably well by a mixture of lognormal distributions via simulation of the expectation-maximization (EM) algorithm. This result is significant as it provides a useful mathematical tool in the explicit form of a mixture of lognormal distributions.
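
As a hedged illustration of the modelling idea (not the paper's EIN data or code): because a lognormal variable is the exponential of a Gaussian one, a mixture of lognormals can be fitted by running scikit-learn's EM-based Gaussian mixture on the logarithm of the holding times.

```python
# Fit a 2-component lognormal mixture to synthetic call holding times by
# applying EM (GaussianMixture) in the log domain.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic holding times (seconds): a short-call and a long-call population
cht = np.concatenate([rng.lognormal(3.0, 0.4, 4000),
                      rng.lognormal(4.5, 0.6, 1000)])

gmm = GaussianMixture(n_components=2, random_state=0).fit(np.log(cht).reshape(-1, 1))
for w, mu, var in zip(gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()):
    print(f"weight={w:.2f}  log-mean={mu:.2f}  log-sd={np.sqrt(var):.2f}")
```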

Keywords: A mixture of lognormal distributions, modeling call holding times, public safety network.

1444 Considering Aerosol Processes in Nuclear Transport Package Containment Safety Cases

Authors: Andrew Cummings, Rhianne Boag, Sarah Bryson, Gordon Turner

Abstract:

Packages designed for the transport of radioactive material must satisfy rigorous safety regulations specified by the International Atomic Energy Agency (IAEA). Higher Activity Waste (HAW) transport packages have to maintain containment of their contents during normal and accident conditions of transport (NCT and ACT). To ensure the containment criteria are satisfied, these packages are required to be leak-tight in all transport conditions so as to meet allowable activity release rates. Package design safety reports are the safety cases that provide the claims, evidence and arguments to demonstrate that packages meet the regulations; once approved by the competent authority (in the UK this is the Office for Nuclear Regulation), a licence to transport radioactive material is issued for the package(s). The standard approach to demonstrating containment in the RWM transport safety case is set out in BS EN ISO 12807. In this document, a method for measuring a leak rate from the package is explained by way of a small interspace test volume situated between two O-ring seals on the underside of the package lid. The interspace volume is pressurised and a pressure drop is measured. A small interspace test volume makes the method more sensitive, enabling the measurement of smaller leak rates. By ascertaining the activity of the contents, identifying a releasable fraction of material and treating that fraction of material as a gas, allowable leak rates for NCT and ACT are calculated. This adherence to the basic safety principles in ISO 12807 is very pessimistic and reflects current practice in the demonstration of transport safety, which is accepted by the UK regulator. It is UK government policy that management of HAW will be through geological disposal. It is proposed that the intermediate level waste be transported to the geological disposal facility (GDF) in large cuboid packages. This poses a challenge for containment demonstration because such packages will have long seals and therefore large interspace test volumes. There is also uncertainty regarding the releasable fraction of material within the package ullage space, because the waste may be in many different forms, which makes it difficult to define the fraction of material released by the waste package. Additionally, because of the large interspace test volume, measuring the calculated leak rates may not be achievable. For this reason, a justification for a lower releasable fraction of material is sought. This paper considers the use of aerosol processes to reduce the releasable fraction for both NCT and ACT. It reviews the basic coagulation and removal processes and applies the dynamic aerosol balance equation. The proposed solution includes only the most well understood physical processes, namely Brownian coagulation and gravitational settling. Other processes have been eliminated either on the basis that they would serve to further reduce the release to the environment (pessimistically, in keeping with the essence of nuclear transport safety cases) or that they are not credible in the conditions of transport considered.

Keywords: Aerosol processes, Brownian coagulation, gravitational settling, transport regulations.

1443 An Intelligent System Framework for Generating Activity List of a Project Using WBS Mind map and Semantic Network

Authors: H. Iranmanesh, M. Madadi

Abstract:

The Work Breakdown Structure (WBS) is one of the most vital planning processes of project management, since it is considered the foundation of other processes such as scheduling, controlling and assigning responsibilities. In fact, the WBS, or activity list, is the heart of a project, and the omission of a single task can lead to an irrecoverable result. There are several tools for generating a project WBS. One of the most powerful is mind mapping, which is the basis of this article. A mind map is a method for thinking together that helps a project manager stimulate the minds of project team members to generate the project WBS. Here we generate the WBS of a sample building construction project with the aid of a mind map and an artificial intelligence (AI) programming language. Since the mind map structure cannot represent data in a computerized way, we convert it to a semantic network that can be used by the computer, and then extract the final WBS from the semantic network using the Prolog programming language. This method yields a comprehensive WBS and decreases the probability of omitting project tasks.

Keywords: Expert System, Mind map, Semantic network, Work breakdown structure

1442 Redesigning Business Processes: A Method Based on Simulation and Process Mining Techniques

Authors: Zahra Mohammadnazari, Fateme Rostambeygi, Fatemeh Dehrouyeh, Hwang Ki-Soon, Amir Aghsami

Abstract:

Corporations have always prioritized efforts to examine and improve their processes. Various metrics, such as the cost and time required to implement a process, can be specified in this regard, and process improvement can be defined as an improvement of these indicators. This is accomplished by looking at prospective adjustments to the current executive process model or to the resources allotted to it. The research reported in this paper aims to improve the procurement process and to explore assessment prospects in the project using a combination of process mining and simulation (benefiting from Play-In and Play-Out methodologies). To run the simulation, the control flow diagram, institution settings, resource settings, and activity settings must be completed. Mining the event logs yields the process control flow; however, both the entry of institutions and the distribution of resources must be modeled. The rate of admission of institutions and the distribution of time for the implementation of activities are determined in the next step.

Keywords: Business reengineering, Petri net, process-based simulation, process mining.

1441 Finite Element Analysis of the Blanking and Stamping Processes of Nuclear Fuel Spacer Grids

Authors: R. O. Santos, L. P. Moreira, M. C. Cardoso

Abstract:

The spacer grid assembly supporting the nuclear fuel rods is an important concern in the design of structural components of a Pressurized Water Reactor (PWR). The spacer grid is composed of springs and dimples, which are formed from a strip sheet by means of blanking and stamping processes. In this paper, the blanking process and tooling parameters are evaluated by means of a 2D plane-strain finite element model in order to evaluate the punch load and the quality of the sheared edges of Inconel 718 strips used for nuclear spacer grids. A 3D finite element model is also proposed to predict the tooling loads resulting from the stamping process of a preformed Inconel 718 strip and to analyse the residual stress effects upon the spring and dimple design geometries of a nuclear spacer grid.

Keywords: Blanking process, damage model, finite element modelling, Inconel 718, spacer grids, stamping process.

1440 Optimal Feature Extraction Dimension in Finger Vein Recognition Using Kernel Principal Component Analysis

Authors: Amir Hajian, Sepehr Damavandinejadmonfared

Abstract:

In this paper, the issue of dimensionality reduction in finger vein recognition systems is investigated using kernel Principal Component Analysis (KPCA). One aspect of KPCA is finding the most appropriate kernel function for finger vein recognition, as there are several kernel functions that can be used within PCA-based algorithms. In this paper, however, another side of PCA-based algorithms, particularly KPCA, is investigated: the dimension of the feature vector, which is of importance especially in real-world applications of such algorithms. A fixed dimension of the feature vector has to be set to reduce the dimension of the input and output data and to extract the features from them; a classifier is then applied to classify the data and make the final decision. We analyze KPCA with polynomial, Gaussian, and Laplacian kernels in detail and investigate the optimal feature extraction dimension in finger vein recognition using KPCA.
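
A sketch of such a dimension sweep, assuming scikit-learn's KernelPCA with polynomial and RBF (Gaussian) kernels followed by a nearest-neighbour classifier; the digits dataset stands in for finger vein features, and the Laplacian kernel examined in the paper would require a callable or precomputed kernel matrix.

```python
# Sweep the KPCA feature dimension and report cross-validated accuracy.
from sklearn.datasets import load_digits
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)          # stand-in for finger vein features

for kernel in ("poly", "rbf"):
    for n_comp in (10, 20, 40, 80):
        pipe = make_pipeline(KernelPCA(n_components=n_comp, kernel=kernel),
                             KNeighborsClassifier(n_neighbors=3))
        acc = cross_val_score(pipe, X, y, cv=3).mean()
        print(f"kernel={kernel:4s}  dim={n_comp:3d}  accuracy={acc:.3f}")
```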

Keywords: Biometrics, finger vein recognition, Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA).

1439 A Norm-based Approach for Profiling Business Knowledge

Authors: Nazmona Mat Ali, Kecheng Liu

Abstract:

Knowledge is a key asset for any organisation to sustain competitive advantage, but it is difficult to identify and represent the knowledge needed to perform activities in business processes. Effective knowledge management and support for relevant business activities have a considerable impact on the performance of the organisation as a whole, because knowledge has the functions of directing, coordinating and controlling actions within business processes. The study introduces organisational morphology, a norm-based approach applying semiotic theories, which emphasises the representation of knowledge in norms. This approach is concerned with the classification of activities into three categories: substantive, communication and control activities. All activities are directed by norms; hence three types of norms exist, each associated with a category of activities. The paper describes the approach briefly and illustrates its application through a case study of academic activities in higher education institutions. The results of the study show that the approach provides an effective way to profile business knowledge, and that the resulting profile enables the understanding and specification of the business requirements of an organisation.

Keywords: Business knowledge, Business process, Norms, Semiotics, Organisational morphology

1438 Morphing Human Faces: Automatic Control Points Selection and Color Transition

Authors: Stephen Karungaru, Minoru Fukumi, Norio Akamatsu

Abstract:

In this paper, we propose a morphing method by which face color images can be freely transformed. The main focus of this work is the transformation of one face image into another. The method is fully automatic in that it can morph two face images by automatically detecting all the control points necessary to perform the morph. A face detection neural network, edge detection and median filters are employed to detect the face position and features. Five control points, for both the source and target images, are then extracted based on the facial features. A triangulation method is then used to match and warp the source image to the target image using the control points. Finally, color interpolation is done using a color Gaussian model that calculates the color for each particular frame depending on the number of frames used. A real-coded genetic algorithm is used in both the image warping and color blending steps to assist in step-size decisions and to speed up the morphing. This method results in very smooth morphs and is fast to process.

Keywords: color transition, genetic algorithms, morphing, warping

1437 Optimized Fuzzy Control by Particle Swarm Optimization Technique for Control of CSTR

Authors: Saeed Vaneshani, Hooshang Jazayeri-Rad

Abstract:

Fuzzy logic control (FLC) systems have been tested in many technical and industrial applications as a useful modeling tool that can handle the uncertainties and nonlinearities of modern control systems. The main drawback of FLC methodologies in the industrial environment is the difficulty of selecting the optimum tuning parameters. In this paper, a method is proposed for finding the optimum membership functions of a fuzzy system using the particle swarm optimization (PSO) algorithm. A combined algorithm of fuzzy logic control and PSO is used to design a controller for a continuous stirred tank reactor (CSTR) with the aim of achieving accurate and acceptable results. To exhibit the effectiveness of the proposed algorithm, it is used to optimize the Gaussian membership functions of the fuzzy model of a nonlinear CSTR system as a case study. It is clearly shown that the optimized membership functions (MFs) provide better performance than a fuzzy model of the same system whose MFs were heuristically defined.
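
A minimal PSO sketch under stated assumptions (not the paper's CSTR model or control loop): particles encode the centres and widths of three Gaussian membership functions, and the fitness is the squared error of a simple weighted-average fuzzy map against a stand-in target curve.

```python
# Global-best PSO tuning Gaussian membership function parameters.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)
target = np.tanh(2 * x)                      # stand-in for the desired response

def fuzzy_map(params, consequents=(-1.0, 0.0, 1.0)):
    c, s = params[:3], np.abs(params[3:]) + 1e-3
    mu = np.exp(-((x[:, None] - c) ** 2) / (2 * s ** 2))    # Gaussian MFs
    return (mu * consequents).sum(axis=1) / mu.sum(axis=1)  # weighted-average defuzzification

def fitness(params):
    return np.mean((fuzzy_map(params) - target) ** 2)

n_particles, dim = 30, 6                     # 3 centres + 3 widths
pos = rng.uniform(-1, 1, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("optimised centres/widths:", np.round(gbest, 3), " MSE:", round(fitness(gbest), 4))
```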

Keywords: continuous stirred tank reactor (CSTR), fuzzy logic control (FLC), membership function (MF), particle swarm optimization (PSO)

1436 Welding Process Selection for Storage Tank by Integrated Data Envelopment Analysis and Fuzzy Credibility Constrained Programming Approach

Authors: Rahmad Wisnu Wardana, Eakachai Warinsiriruk, Sutep Joy-A-Ka

Abstract:

Selecting the most suitable welding process usually depends on experience or on common practice in similar companies. However, this approach generally ignores many criteria that can affect the selection of a suitable welding process. Therefore, knowledge automation through knowledge-based systems can significantly improve the decision-making process. This research proposes an integrated data envelopment analysis (DEA) and fuzzy credibility constrained programming approach for identifying the best welding process for a stainless steel storage tank in the food and beverage industry. The proposed approach uses the fuzzy concept and a credibility measure to deal with uncertain data from experts' judgment. Furthermore, 12 parameters are used to determine the most appropriate welding process among six competitive welding processes.

Keywords: Welding process selection, data envelopment analysis, fuzzy credibility constrained programming, storage tank.

1435 Formalizing a Procedure for Generating Uncertain Resource Availability Assumptions Based On Real Time Logistic Data Capturing with Auto-ID Systems for Reactive Scheduling

Authors: Lars Laußat, Manfred Helmus, Kamil Szczesny, Markus König

Abstract:

As one result of the project “Reactive Construction Project Scheduling using Real Time Construction Logistic Data and Simulation”, a procedure has been developed for using data about uncertain resource availability assumptions in reactive scheduling processes. Prediction data about resource availability is generated in a formalized way using real-time monitoring data, e.g. from auto-ID systems on the construction site and in the supply chains. The paper focuses on the formalization of the procedure for monitoring construction logistic processes, for the detection of disturbances, and for generating new, uncertain scheduling assumptions for the reactive resource-constrained simulation procedure that is described further in other papers.

Keywords: Auto-ID, Construction Logistic, Fuzzy, Monitoring, RFID, Scheduling.

1434 Automated Buffer Box Assembly Cell Concept for the Canadian Used Fuel Packing Plant

Authors: Dimitrie Marinceu, Alan Murchison

Abstract:

The Canadian Used Fuel Container (UFC) is a mid-size, hemispherical-headed, copper-coated steel container measuring 2.5 meters in length and 0.5 meters in diameter and containing 48 used fuel bundles. The contained used fuel produces significant gamma radiation, requiring automated processes to complete the assembly. The design throughput of 2,500 UFCs per year places constraints on equipment and hot cell design for repeatability, speed of processing, robustness and recovery from upset conditions. After UFC assembly, the UFC is inserted into a Buffer Box (BB). The BB is made from adequately pre-shaped blocks (a lower and an upper block) and Highly Compacted Bentonite (HCB) material; after assembly, the blocks effectively sandwich the UFC between them. This paper identifies one possible approach for the BB automatic assembly cell and its processes. Automation of the BB assembly will have a significant positive impact on nuclear safety, quality, productivity, and reliability.

Keywords: Used fuel packing plant, automatic assembly cell, used fuel container, buffer box, deep geological repository.

1433 Optimal Power Allocation for the Proposed Asymmetric Turbo Code for 3G Systems

Authors: K. Ramasamy, B. Balamuralithara, Mohammad Umar Siddiqi

Abstract:

In [7], we proposed a new class of asymmetric turbo encoder for 3G systems that performs well in both the “waterfall” and “error floor” regions. In this paper, a modified (optimal) power allocation scheme for the different bits of the new class of asymmetric turbo encoder is investigated to enhance the performance. The simulation results and performance bound for the proposed asymmetric turbo code with the modified Unequal Power Allocation (UPA) scheme, for frame length N=400 and code rate r=1/3 with a Log-MAP decoder over an Additive White Gaussian Noise (AWGN) channel, are obtained and compared with the system with typical UPA and without UPA. The performance tests are extended over the AWGN channel for different frame sizes to verify the feasibility of implementing the modified UPA scheme for the proposed asymmetric turbo code. From the performance results, it is observed that the proposed asymmetric turbo code with modified UPA performs better than the systems without UPA and with typical UPA, providing a coding gain of 0.4 to 0.52 dB.

Keywords: Asymmetric turbo code, Generator polynomial, Interleaver, UPA, WCDMA, cdma2000.

1432 Weigh-in-Motion Data Analysis Software for Developing Traffic Data for Mechanistic Empirical Pavement Design

Authors: M. A. Hasan, M. R. Islam, R. A. Tarefder

Abstract:

Currently, there are few user-friendly Weigh-in-Motion (WIM) data analysis software packages available that can produce the traffic input data required by the recently developed AASHTOWare pavement Mechanistic-Empirical (ME) design software. However, these packages have only rudimentary Quality Control (QC) processes and therefore cannot properly deal with erroneous WIM data. As pavement performance is highly sensitive to the quality of WIM data, it is highly recommended to apply a more refined QC process to raw WIM data to obtain good results. This study develops a user-friendly software package that can produce traffic input for the ME design software. The software takes the raw data (class and weight data) collected from the WIM station and processes it with a sophisticated QC procedure. Traffic data such as traffic volume, traffic distribution, axle load spectra, etc. can be obtained from this software and can be used directly in the ME design software.

Keywords: Weigh-in-motion, software, axle load spectra, traffic distribution, AASHTOWare.

1431 On the Efficient Implementation of a Serial and Parallel Decomposition Algorithm for Fast Support Vector Machine Training Including a Multi-Parameter Kernel

Authors: Tatjana Eitrich, Bruno Lang

Abstract:

This work deals with aspects of support vector machine learning for large-scale data mining tasks. Based on a decomposition algorithm for support vector machine training that can be run in serial as well as in shared-memory parallel mode, we introduce a transformation of the training data that allows for the usage of an expensive generalized kernel without additional cost. We present experiments for the Gaussian kernel, but other kernel functions can be used as well. In order to further speed up the decomposition algorithm, we analyze the critical problem of working set selection for large training data sets. In addition, we analyze the influence of the working set sizes on the scalability of the parallel decomposition scheme. Our tests and conclusions led to several modifications of the algorithm and to an improvement of overall support vector machine learning performance. Our method allows for extensive parameter search methods to optimize classification accuracy.
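
The following is only a compact stand-in, not the paper's parallel decomposition solver: scikit-learn's SVC with a Gaussian (RBF) kernel is combined with a parallel grid search to illustrate the kind of extensive parameter search that the reported speed-ups are meant to enable.

```python
# Gaussian-kernel SVM training with a parallel parameter search.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1]},
                    n_jobs=-1, cv=3)           # parallel search over (C, gamma)
grid.fit(X_tr, y_tr)
print("best params:", grid.best_params_, " test accuracy:", grid.score(X_te, y_te))
```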

Keywords: Support Vector Machine Training, Multi-Parameter Kernels, Shared Memory Parallel Computing, Large Data

1430 Evolution of Prostate Cancer Stem Cells in Prostate Duct

Authors: Zachariah Sinkala

Abstract:

A systems-approach model for prostate cancer in the prostate duct, as a sub-system of the organism, is developed. This is accomplished in two steps. The work starts with a nonlinear system of coupled Fokker-Planck equations which models continuous processes of the system, such as the motion of cells. This is then extended to PDEs that include discontinuous processes such as cell mutations, proliferation and death, which are modeled using Poisson intensity processes. The model incorporates the features of the prostate duct: the spatial coordinate of the PDE system runs along the proximal-distal axis, and the parameters depend on features of the duct. The movement of cells is biased towards the distal region, and mutation of prostate cancer cells is localized in the proximal region. Numerical solutions of the full system of equations are provided and exhibit traveling wave front phenomena. This motivates the use of the standard transformation to derive a canonically related system of ODEs for traveling wave solutions. The results show the persistence of prostate cancer, since the non-negative cone for the traveling wave system is time invariant. It is also proved that the traveling waves have a unique global attractor. Biologically, the global attractor verifies that the evolution of prostate cancer stem cells exhibits avascular tumor growth. The numerical solutions show that altering prostate stem cell movement or the mutation of prostate cancer cells leads to an avascular tumor. The paper concludes with comments on the clinical implications of the model.

Keywords: Fokker-Planck equations, global attractor, stem cell.

1429 General Purpose Graphic Processing Units Based Real Time Video Tracking System

Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai

Abstract:

Real-time video tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by the background detection and elimination process. Local regions of the image frame contain vital information about background and foreground. However, pixel-level processing of local regions consumes a considerable amount of computational time and memory space with traditional approaches. In our approach, we have explored the concurrent computational ability of General Purpose Graphic Processing Units (GPGPU) to address this problem. A Gaussian Mixture Model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernels are influenced by local regions and are updated by inter-frame variations of the corresponding regions. The proposed system has been tested with GPU devices such as the GeForce GTX 280 and Quadro K2000. The results are encouraging, with a maximum speed-up of 10X compared to the sequential approach.
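
For orientation only, the per-pixel Gaussian mixture background model can be exercised with OpenCV's CPU implementation (MOG2), as sketched below; the paper's custom GPGPU kernels are not reproduced, and "input.mp4" is a placeholder path.

```python
# GMM background subtraction followed by simple noise suppression and
# connected-component labelling of the foreground.
import cv2

cap = cv2.VideoCapture("input.mp4")                     # placeholder video path
mog2 = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                          detectShadows=False)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = mog2.apply(frame, learningRate=-1)        # -1: automatic model update
    fg_mask = cv2.medianBlur(fg_mask, 5)                # suppress isolated noise pixels
    n, labels = cv2.connectedComponents(fg_mask)        # foreground regions to track
```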

Keywords: Connected components, Embrace threads, Local weighted kernel, Structuring element.

1428 Diversity and Public Decision Making

Authors: Karin Hansson, Göran Cars, Mats Danielson, Love Ekenberg, Aron Larsson

Abstract:

Within the realm of e-government, development has moved towards testing new means for democratic decision-making, such as e-panels, electronic discussion forums, and polls. Although such new developments seem promising, they are not problem-free, and the outcomes are seldom used in the subsequent formal political procedures. Nevertheless, process models offer promising potential when it comes to structuring and supporting the transparency of decision processes in order to facilitate the integration of the public into decision-making procedures in a reasonable and manageable way. Based on real-life cases of urban planning processes in Sweden, we present an outline for an integrated framework for public decision making to: a) provide tools for citizens to organize discussion and form opinions; b) enable governments, authorities, and institutions to better analyse these opinions; and c) enable governments to account for this information in planning and societal decision making by employing a process model for structured public decision making.

Keywords: Negotiation games, Agenda setting, Multi-criteria decision analysis, Elicitation method.

1427 Treatment of Chrome Tannery Wastewater by Biological Process - A Mini Review

Authors: Supriyo Goswami, Debabrata Mazumder

Abstract:

Chrome tannery wastewater poses a serious environmental hazard due to its high pollution potential. As a result, rigorous treatment is necessary for the abatement of pollution from this type of wastewater. There are many research studies on chrome tannery wastewater treatment using physical, chemical, and biological methods. In general, biological treatment processes are found to be ineffective for direct application because of the adverse effects of toxic chromium, sulphide, chloride, etc. However, biological methods have been employed mainly for a few sub-processes that generate a significant amount of organic matter and are free of chromium, chlorides, etc. In this context, the present paper reviews the characteristic features and pollution potential of wastewater generated from chrome tannery units and the treatment of the same. The different biological processes used earlier, and their chronological development for the treatment of chrome tannery wastewater, are thoroughly reviewed. In this regard, the scope of the hybrid bioreactor, an advanced technology option, is also explored, as this kind of treatment is well suited for wastewater containing inhibitory substances.

Keywords: Composite tannery wastewater, biological treatment, Hybrid bioreactor, Organic removal

1426 Quality Estimation of Video Transmitted over an Additive WGN Channel based on Digital Watermarking and Wavelet Transform

Authors: Mohamed S. El-Mahallawy, Attalah Hashad, Hazem Hassan Ali, Heba Sami Zaky

Abstract:

This paper presents an evaluation of a wavelet-based digital watermarking technique used to estimate the quality of video sequences transmitted over an Additive White Gaussian Noise (AWGN) channel in terms of a classical objective metric, such as Peak Signal-to-Noise Ratio (PSNR), without the need for the original video. In this method, a watermark is embedded into the Discrete Wavelet Transform (DWT) domain of the original video frames using a quantization method. The degradation of the extracted watermark can be used to estimate the video quality in terms of PSNR with good accuracy. We calculated the PSNR for video frames contaminated with AWGN and compared the values with those estimated using the watermarking-DWT based approach. It is found that the calculated and estimated quality measures of the video frames are highly correlated, suggesting that this method can provide a good quality measure for video frames transmitted over an AWGN channel without the need for the original video.
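
A hedged sketch of the general idea, not the paper's exact embedding scheme: a binary watermark is embedded in the LL sub-band by quantisation (QIM), the frame passes through an AWGN channel, and the watermark bit-error rate is compared with the true PSNR; the quantisation step and noise level are assumed values, and the frame is synthetic.

```python
# DWT-domain quantisation watermarking over an AWGN channel, with watermark
# degradation (BER) reported next to the true PSNR.
import numpy as np
import pywt

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (256, 256)).astype(float)   # stand-in for a video frame
wm = rng.integers(0, 2, (128, 128))                       # binary watermark
delta = 32.0                                              # quantisation step (assumed)

LL, (LH, HL, HH) = pywt.dwt2(frame, "haar")
LL_wm = delta * np.floor(LL / delta) + delta / 4 + wm * delta / 2   # QIM embedding
marked = pywt.idwt2((LL_wm, (LH, HL, HH)), "haar")

noisy = marked + rng.normal(0, 5, marked.shape)           # AWGN channel

LL_rx, _ = pywt.dwt2(noisy, "haar")
wm_rx = ((LL_rx % delta) >= delta / 2).astype(int)        # QIM extraction
ber = np.mean(wm_rx != wm)                                # watermark degradation
psnr = 10 * np.log10(255**2 / np.mean((noisy - marked) ** 2))
print(f"watermark BER = {ber:.3f},  true PSNR = {psnr:.1f} dB")
```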

Keywords: AWGN, DWT, PSNR, Watermarking, Video Quality.

1425 Social Business Process Management and Business Process Management Maturity

Authors: Dalia Suša Vugec, Vesna Bosilj Vukšić, Ljubica Milanović Glavan

Abstract:

Business process management (BPM) is a well-known holistic discipline focused on managing business processes with the intention of achieving a higher level of BPM maturity and better organizational performance. In recent years, traditional BPM has faced some limitations, such as the model-reality divide and lost innovation. Following the latest trends, and as an attempt to overcome the issues of traditional BPM, the principles of social software have been applied to managing business processes, which has led to the development of social BPM. However, there are not many authors or studies dealing with this topic, so this study aims to fill that literature gap and to examine the link between the level of BPM maturity and the usage of social BPM. To meet these objectives, a survey of companies with more than 50 employees has been conducted. The results reveal that the usage of social BPM is higher within companies that have achieved a higher level of BPM maturity. This paper provides an overview, analysis and discussion of the collected data regarding BPM maturity and social BPM within the observed companies and identifies the main social BPM principles.

Keywords: Business process management, BPM maturity, process performance index, social BPM.
