Search results for: and processing time.
5872 Multi-Rate Exact Discretization based on Diagonalization of a Linear System - A Multiple-Real-Eigenvalue Case
Authors: T. Sakamoto, N. Hori
Abstract:
A multi-rate discrete-time model, whose response agrees exactly with that of a continuous-time original at all sampling instants for any sampling periods, is developed for a linear system, which is assumed to have multiple real eigenvalues. The sampling rates can be chosen arbitrarily and individually, so that their ratios can even be irrational. The state space model is obtained as a combination of a linear diagonal state equation and a nonlinear output equation. Unlike the usual lifted model, the order of the proposed model is the same as the number of sampling rates, which is less than or equal to the order of the original continuous-time system. The method is based on a nonlinear variable transformation, which can be considered as a generalization of the linear similarity transformation, which cannot be applied to systems with multiple eigenvalues in general. An example and its simulation result show that the proposed multi-rate model gives exact responses at all sampling instants.
Keywords: Multi-rate discretization, linear systems, triangularization, similarity transformation, diagonalization, exponential transformation, multiple eigenvalues
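As a minimal illustration of the exact-sampling idea only (the paper's nonlinear transformation for repeated eigenvalues is not reproduced here), the sketch below propagates a diagonal autonomous system with two distinct real eigenvalues, each state sampled at its own period, and checks the result against the continuous-time solution; all numerical values are assumptions.

```python
import numpy as np

# Distinct real eigenvalues and per-state sampling periods (illustrative values).
lam = np.array([-1.0, -3.0])
h = np.array([0.1, 0.1 * np.sqrt(2)])   # the two rates may even be irrationally related
phi = np.exp(lam * h)                   # exact one-step transition, per component

x = np.array([1.0, 2.0])                # x(0)
for k in range(1, 5):
    x = phi * x                                           # discrete-time recursion
    x_true = np.array([1.0, 2.0]) * np.exp(lam * k * h)   # continuous solution at each state's own instant
    assert np.allclose(x, x_true)
print("discrete model matches the continuous response at all sampling instants")
```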
5871 Dynamic Traffic Simulation for Traffic Congestion Problem Using an Enhanced Algorithm
Authors: Wong Poh Lee, Mohd. Azam Osman, Abdullah Zawawi Talib, Ahmad Izani Md. Ismail
Abstract:
Traffic congestion has become a major problem in many countries. One of its main causes is road merging: vehicles tend to move slower when they reach the merging point. In this paper, an enhanced algorithm for traffic simulation based on the fluid-dynamic algorithm and kinematic wave theory is proposed. The enhanced algorithm is used to study traffic congestion at a road merge. This paper also describes the development of a dynamic traffic simulation tool which is used for scenario planning and for forecasting the traffic congestion level at a given time based on defined parameter values. The tool incorporates the enhanced algorithm as well as the two original algorithms. Outputs from the three algorithms are measured in terms of traffic queue length, travel time and the total number of vehicles passing through the merging point. This paper also suggests an efficient way of reducing traffic congestion at a road merge by analyzing the traffic queue length and travel time.
Keywords: Dynamic, fluid-dynamic, kinematic wave theory, simulation, traffic congestion.
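For readers unfamiliar with kinematic wave theory, the sketch below is a standard Godunov/cell-transmission discretization of the LWR model with a Greenshields fundamental diagram, in which a reduced-capacity cell stands in for the merge point and the congested queue length is read off the density profile. It is not the authors' enhanced algorithm, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Greenshields fundamental diagram: q = v_f * rho * (1 - rho/rho_j)
v_f, rho_j = 30.0, 0.2          # free-flow speed (m/s), jam density (veh/m); assumed values
rho_c = rho_j / 2               # critical density
q_max = v_f * rho_c * (1 - rho_c / rho_j)

def demand(rho):   # sending capacity of a cell
    return np.where(rho < rho_c, v_f * rho * (1 - rho / rho_j), q_max)

def supply(rho):   # receiving capacity of a cell
    return np.where(rho < rho_c, q_max, v_f * rho * (1 - rho / rho_j))

dx, dt, n_cells, n_steps = 100.0, 2.0, 50, 600      # CFL: v_f*dt/dx = 0.6 < 1
rho = np.full(n_cells, 0.04)                        # initial density
cap = np.full(n_cells + 1, q_max)
cap[35] = 0.5 * q_max                               # reduced capacity stands in for the merge point

for _ in range(n_steps):
    # Godunov / cell-transmission flux at each cell interface
    flux = np.minimum(np.minimum(demand(np.r_[rho[0], rho]),
                                 supply(np.r_[rho, 0.0])), cap)
    rho = rho + dt / dx * (flux[:-1] - flux[1:])

queue_len = dx * np.sum(rho > rho_c)                # length of the over-critical (congested) section
print(f"queue length upstream of the bottleneck: {queue_len:.0f} m")
```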
5870 Coupling Time-Domain Analysis for Dynamic Positioning during S-Lay Installation
Authors: Sun Li-ping, Zhu Jian-xun, Liu Sheng-nan
Abstract:
In order to study the performance of the dynamic positioning system during S-lay operations, the dynamic positioning system is simulated together with the hull-stinger-pipe coupling effect. The stinger rollers are simulated using the generalized elastic contact theory, the stinger is composed of Morison members, and the force on the pipe is calculated by the lumped mass method. A time-domain analysis of the fully coupled barge model is carried out in combination with a PID controller, a Kalman filter, and thrust allocation using the Sequential Quadratic Programming method. The effect of the hull wave-frequency motion on the pipe-stinger coupling force and on the dynamic positioning system is also analyzed, as is the way S-lay operations affect dynamic positioning accuracy. The simulation results are validated by checking the pipe stress against the API criterion. The effects of heave and yaw motion on the hull-stinger-pipe coupling force and the dynamic positioning system cannot be ignored. It is important to decrease the barge's pitch motion and to lay pipe in head seas in order to improve the safety of the S-lay installation and dynamic positioning.
Keywords: S-lay operation, dynamic positioning, coupling motion, time domain, allocation of thrust.
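As a small illustration of just the PID piece of a dynamic positioning loop (the Kalman filter, thrust allocation, and coupled hydrodynamics described above are not shown), here is a minimal discrete PID holding a one-degree-of-freedom surge position against a constant drift force; the gains, mass, and loads are invented toy values.

```python
class PID:
    """Minimal discrete PID controller; only one component of a DP system."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy 1-D surge model: hold position 0 m against a constant environmental load.
pid = PID(kp=2e5, ki=1e4, kd=8e5, dt=0.5)
pos, vel, mass = 5.0, 0.0, 1e6
for _ in range(400):
    thrust = pid.update(0.0, pos)
    vel += (thrust + 5e4) / mass * 0.5      # 5e4 N of constant drift force
    pos += vel * 0.5
print(f"surge offset after 200 s: {pos:.2f} m")
```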
5869 Segmentation of Noisy Digital Images with Stochastic Gradient Kernel
Authors: Abhishek Neogi, Jayesh Verma, Pinaki Pratim Acharjya
Abstract:
Image segmentation and edge detection are fundamental tasks in image processing. For noisy images, edge detection is much less effective if conventional spatial filters such as Sobel, Prewitt, LoG and Laplacian are used. To overcome this problem, we propose the use of a stochastic gradient mask instead of spatial filters for generating gradient images. The present study has shown that the resultant images obtained by applying stochastic gradient masks appear much clearer and sharper as far as edge detection is concerned.
Keywords: Image segmentation, edge detection, noisy images, spatial filters, stochastic gradient kernel.
5868 Recursive Algorithms for Image Segmentation Based on a Discriminant Criterion
Authors: Bing-Fei Wu, Yen-Lin Chen, Chung-Cheng Chiu
Abstract:
In this study, a new criterion for determining the number of classes into which an image should be segmented is proposed. This criterion is based on discriminant analysis for measuring the separability among the segmented classes of pixels. Based on the new discriminant criterion, two algorithms for recursively segmenting the image into the determined number of classes are proposed. The proposed methods can automatically and correctly segment objects with various illuminations into separate images for further processing. Experiments on the extraction of text strings from complex document images demonstrate the effectiveness of the proposed methods.
Keywords: image segmentation, multilevel thresholding, clustering, discriminant analysis
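As background, the best-known single-threshold instance of such a discriminant criterion is Otsu's between-class variance; the sketch below implements that classic criterion only, not the authors' recursive multi-class procedure, and uses a random array as a stand-in image.

```python
import numpy as np

def otsu_threshold(gray):
    """Classic discriminant criterion: pick the threshold maximizing the
    between-class variance (a single-split special case, not the paper's
    recursive multi-class procedure)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                      # class-0 probability up to t
    mu = np.cumsum(p * np.arange(256))        # first moment up to t
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b = np.nan_to_num(sigma_b)
    return int(np.argmax(sigma_b))

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in image
t = otsu_threshold(img)
segmented = (img > t).astype(np.uint8)
print("chosen threshold:", t)
```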
5867 Effect of Fermentation Time on Xanthan Gum Production from Sugar Beet Molasses
Authors: Marzieh Moosavi-Nasab, Safoora Pashangeh, Maryam Rafsanjani
Abstract:
Xanthan gum is a microbial polysaccharide of great commercial significance. The purpose of this study was to select the optimum fermentation time for xanthan gum production by Xanthomonas campestris (NRRL-B-1459) using 10% sugar beet molasses as a carbon source. The pre-heating of sugar beet molasses and the supplementation of the medium were investigated in order to improve xanthan gum production. Maximum xanthan gum production in fermentation media (9.02 g/l) was observed after 4 days shaking incubation at 25°C and 240 rpm agitation speed. A solution of 10% sucrose was used as a control medium. Results indicated that the optimum period for xanthan gum production in this condition was 4 days.
Keywords: Biomass, Molasses, Xanthan gum, Xanthomonas campestris
5866 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems
Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan
Abstract:
Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in terms of data reading speed performance. As we ascend in the hierarchy, data reading speed becomes faster. Thus, migrating the application's important data that will be accessed in the near future to the uppermost level will reduce the application's I/O waiting time and hence its execution elapsed time. In this research, we implement a trace-driven two-level parallel hybrid storage system prototype that consists of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses in parallel with its on-demand requests. The important data (i.e. the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach integrated with data mining techniques reduces the application execution elapsed time by at least 22% across a variety of traces.
Keywords: Data mining, hybrid storage system, recurrent neural network, support vector machine.
5865 Dynamic Analysis of Porous Media Using Finite Element Method
Authors: M. Pasbani Khiavi, A. R. M. Gharabaghi, K. Abedi
Abstract:
The mechanical behavior of porous media is governed by the interaction between its solid skeleton and the fluid existing inside its pores. The interaction occurs through the interface of grains and fluid. The traditional analysis methods for porous media, based on the effective stress and Darcy's law, are unable to account for these interactions. For an accurate analysis, the porous medium is represented as a fluid-filled porous solid on the basis of the Biot theory of wave propagation in poroelastic media. In the Biot formulation, the equations of motion of the soil mixture are coupled with the global mass balance equations to describe the realistic behavior of porous media. Because of irregular geometry, the domain is generally treated as an assemblage of finite elements. In this investigation, the numerical formulation for the field equations governing the dynamic response of fluid-saturated porous media is analyzed and employed for the study of transient wave motion. A finite element model is developed and implemented into a computer code called DYNAPM for dynamic analysis of porous media. The weighted residual method with 8-node elements is used for developing the finite element model, and the analysis is carried out in the time domain considering dynamic excitation and gravity loading. The Newmark time integration scheme, an unconditionally stable implicit method, is employed to solve the time-discretized equations. Finally, some numerical examples are presented to show the accuracy and capability of the developed model for a wide variety of behaviors of porous media.
Keywords: Dynamic analysis, Interaction, Porous media, time domain
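To make the time-stepping concrete, the sketch below applies the standard Newmark-beta scheme (average acceleration, beta = 1/4, gamma = 1/2, hence unconditionally stable) to a tiny two-degree-of-freedom system standing in for the assembled finite element matrices; the matrices, damping model and load are invented for illustration and are not taken from DYNAPM.

```python
import numpy as np

# Newmark-beta time integration for M*a + C*v + K*u = f.
M = np.diag([2.0, 1.0])
K = np.array([[400.0, -200.0], [-200.0, 200.0]])
C = 0.02 * K                                  # stiffness-proportional damping (assumed)
beta, gamma, dt, nsteps = 0.25, 0.5, 0.01, 500

u = np.zeros(2)
v = np.zeros(2)
f = np.array([0.0, 10.0])                     # constant load (gravity-type, assumed)
a = np.linalg.solve(M, f - C @ v - K @ u)     # consistent initial acceleration

K_eff = K + gamma / (beta * dt) * C + M / (beta * dt**2)
for _ in range(nsteps):
    rhs = (f
           + M @ (u / (beta * dt**2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
           + C @ (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                  + dt * (gamma / (2 * beta) - 1) * a))
    u_new = np.linalg.solve(K_eff, rhs)
    a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
    v = v + dt * ((1 - gamma) * a + gamma * a_new)
    u, a = u_new, a_new

print("displacement approaches the static solution:", u, np.linalg.solve(K, f))
```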
5864 A Constitutive Model for Time-Dependent Behavior of Clay
Authors: T. N. Mac, B. Shahbodaghkhan, N. Khalili
Abstract:
A new elastic-viscoplastic (EVP) constitutive model is proposed for the analysis of the time-dependent behavior of clay. The proposed model is based on bounding surface plasticity and the viscoplastic consistency framework to establish a continuous transition from plasticity to rate-dependent viscoplasticity. Unlike overstress-based models, this model meets the consistency condition in formulating the constitutive equation for the EVP model. The procedure for deriving the constitutive relationship is also presented. Simulation results and comparisons with experimental data are then presented to demonstrate the performance of the model.
Keywords: Bounding surface, consistency theory, constitutive model, viscosity.
5863 Performance Enhancement of Motion Estimation Using SSE2 Technology
Authors: Trung Hieu Tran, Hyo-Moon Cho, Sang-Bock Cho
Abstract:
Motion estimation is the most computationally intensive part of video processing. Many fast motion estimation algorithms have been proposed to decrease the computational complexity by reducing the number of candidate motion vectors. However, these studies target the search algorithms themselves, while most image and video compression is implemented in software. The timing constraints for running these motion estimation algorithms therefore not only challenge the video codec but also overwhelm some processors. In this paper, the performance of motion estimation is enhanced by using Intel's Streaming SIMD Extensions 2 (SSE2) technology on an Intel Pentium 4 processor.
Keywords: Motion Estimation, Full Search, Three Step Search, MMX/SSE/SSE2 Technologies, SIMD.
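The computational core being accelerated is the sum of absolute differences (SAD) evaluated for every candidate block in a full search; the sketch below is a plain NumPy version of that kernel on synthetic frames, not the SSE2-intrinsics implementation the paper describes, but it shows the data-parallel computation that SIMD instructions speed up.

```python
import numpy as np

def full_search(cur_block, ref, top, left, search_range=7):
    """Exhaustive full search: return the motion vector minimizing SAD.
    The inner SAD is what SSE2 integer intrinsics compute 16 pixels at a time."""
    n = cur_block.shape[0]
    best, best_mv = np.inf, (0, 0)
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + n > ref.shape[0] or x + n > ref.shape[1]:
                continue
            cand = ref[y:y + n, x:x + n].astype(np.int32)
            sad = np.abs(cand - cur_block.astype(np.int32)).sum()
            if sad < best:
                best, best_mv = sad, (dy, dx)
    return best_mv, best

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, (2, -3), axis=(0, 1))            # current frame = reference shifted by (2, -3)
mv, sad = full_search(cur[16:32, 16:32], ref, 16, 16)
print("estimated motion vector:", mv, "SAD:", sad)
```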
5862 Online Partial Discharge Source Localization and Characterization Using Non-Conventional Method
Authors: Ammar Anwar Khan, Nissar R. Wani, Nazar Malik, Abdulrehman Al-Arainy, and Saad Alghuwainem
Abstract:
Power cables are vulnerable to failure due to aging or to defects that occur with the passage of time under continuous operation and loading stresses. Partial discharge (PD) detection and characterization provide information on the location, nature, form and extent of the degradation. As a result, PD monitoring has become an important part of condition based maintenance (CBM) programs among power utilities. Online PD localization of defect sources in a power cable system is possible using the time-of-flight method. The time difference between the main and reflected pulses, together with the cable length, can be used to locate the partial discharge source along the cable. However, if the length of the cable is not known, or the defect source is located at the extreme ends or in the middle of the cable, then a double-ended measurement is required to indicate the location of the PD source. The use of multiple sensors can also help in discriminating cable PD from local or external PD. This paper presents the experience and results from online partial discharge measurements conducted in the laboratory and the challenges in partial discharge source localization.
Keywords: Power cables, partial discharge localization, HFCT, condition based monitoring.
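For orientation, the single-ended time-of-flight relation works as in the short worked example below: the PD pulse reaches the measuring end directly after x/v and its far-end reflection after (2L - x)/v, so x = L - v*dt/2. The cable length, propagation velocity and measured delay used here are assumed values, not the paper's measurements.

```python
# Single-ended time-of-flight PD location (standard TDR-style relation).
v = 170.0     # propagation velocity in m/us, typical of XLPE cable (assumed)
L = 1200.0    # cable length in m (assumed known; otherwise a double-ended measurement is needed)
dt = 4.7      # measured delay between direct and far-end-reflected pulse, in us (assumed)

x = L - v * dt / 2
print(f"PD source located about {x:.0f} m from the measuring end")   # ~800 m for these values
```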
5861 Improving Decision Support for Organ Transplant
Authors: I. McCulloh, A. Placona, D. Stewart, D. Gause, K. Kiernan, M. Stuart, C. Zinner, L. Cartwright
Abstract:
We find in our data that an alarming number of viable deceased donor kidneys are discarded every year in the US, while waitlisted candidates are dying every day. We observe that as many as 85% of transplanted organs were refused at least once for a patient who scored higher on the match list. There are hundreds of clinical variables involved in making a clinical transplant decision, and there is rarely an ideal match. Decision makers exhibit an optimism bias whereby they may refuse an organ offer assuming a better match is imminent. We propose a semi-parametric Cox proportional hazards model, augmented by an accelerated failure time model based on patient-specific suitable organ supply and demand, to estimate a time-to-next-offer. Performance is assessed with Cox-Snell residuals and decision curve analysis, demonstrating improved decision support for up to a 5-year outlook. Providing clinical decision-makers with quantitative evidence of likely patient outcomes (e.g., time to next offer and the mortality associated with waiting) may improve decisions and reduce optimism bias, thus reducing discarded organs and matching more patients on the waitlist.
Keywords: Decision science, KDPI, optimism bias, organ transplant.
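As a hedged illustration of the semi-parametric Cox piece only (the accelerated failure time component and the real organ supply/demand covariates are not reproduced), the sketch below fits a Cox proportional hazards model to a small hypothetical offer-history table with the lifelines package; the column names and values are invented.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical waitlist snapshot: days until the next suitable offer, whether an
# offer was observed, and two illustrative covariates (not the paper's data).
df = pd.DataFrame({
    "days_to_offer": [30, 120, 45, 300, 90, 15, 210, 60],
    "offer_received": [1, 1, 0, 1, 1, 1, 0, 1],
    "cpra": [0, 80, 95, 20, 50, 10, 98, 0],        # sensitization (%), assumed covariate
    "blood_type_O": [1, 0, 1, 0, 1, 0, 1, 0],
})

cph = CoxPHFitter(penalizer=0.1)                   # small penalty for this tiny toy sample
cph.fit(df, duration_col="days_to_offer", event_col="offer_received")
cph.print_summary()

# Relative hazard of receiving an offer for each candidate; higher means sooner.
print(cph.predict_partial_hazard(df))
```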
5860 Design and Implementation of an Image Based System to Enhance the Security of ATM
Authors: Seyed Nima Tayarani Bathaie
Abstract:
In this paper, an image-receiving system was designed and implemented through optimization of object detection algorithms using Haar features. The optimized algorithm performed face detection and eye detection separately; cascading them led to a clear image of the user. Utilizing this feature brought about higher security by preventing fraud, since services are given to the user only on condition that a clear image of his face has already been captured, which excludes the inappropriate person. In order to expedite processing and eliminate unnecessary computations, the input image was compressed, a motion detection function was included in the program, and the detection window size was confined.
Keywords: Face detection algorithm, Haar features, Security of ATM.
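A minimal sketch of this cascaded Haar-feature detection, using OpenCV's stock pre-trained cascades (face first, then eyes restricted to the face region), is shown below; the input file name, the detection thresholds, and the "both eyes visible" acceptance rule are assumptions for illustration, not the authors' implementation.

```python
import cv2

# Stock OpenCV Haar cascades for face and eye detection.
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

frame = cv2.imread("atm_camera_frame.jpg")            # hypothetical captured frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

faces = face_cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5, minSize=(80, 80))
for (x, y, w, h) in faces:
    roi = gray[y:y + h, x:x + w]                      # restrict the eye search to the face window
    eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=4)
    if len(eyes) >= 2:                                # both eyes visible: treat the capture as usable
        cv2.imwrite("clear_user_image.jpg", frame[y:y + h, x:x + w])
```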
5859 Block-Based 2D to 3D Image Conversion Method
Authors: S. Sowmyayani, V. Murugan
Abstract:
With the advent of three-dimensional (3D) technology, there has been much research on converting 2D images to 3D images. The main difference between 2D and 3D is the visual illusion of depth in 3D images. In recent years, many depth estimation techniques have been proposed. The objective of this paper is to convert 2D images to 3D images with less computation time. For this, the input image is divided into blocks from which the depth information is obtained. From the depth information, a depth map is generated. Then the 3D image is warped using the original image and the depth map. The proposed method is tested on the Make3D dataset and the NYU-V2 dataset, and the experimental results are compared with other recent methods. The proposed method works with less computation time and good accuracy.
Keywords: Depth map, 3D image warping, image rendering, bilateral filter, minimum spanning tree.
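To make the warping step concrete, the sketch below performs a very simple depth-image-based rendering: each pixel is shifted horizontally by a disparity proportional to how near it is according to the depth map, and disoccluded holes are left unfilled. The toy image and depth map are synthetic, and the bilateral filtering and hole handling implied by the keywords are not reproduced.

```python
import numpy as np

def warp_to_right_view(img, depth, max_disparity=16):
    """Shift each pixel horizontally by a disparity that grows as depth shrinks
    (nearer pixels move more). Holes are left as zeros in this sketch."""
    h, w = depth.shape
    disparity = (max_disparity * (1 - depth / depth.max())).astype(int)
    right = np.zeros_like(img)
    xs = np.arange(w)
    for y in range(h):
        new_x = xs - disparity[y]
        valid = (new_x >= 0) & (new_x < w)
        right[y, new_x[valid]] = img[y, xs[valid]]
    return right

rng = np.random.default_rng(1)
img = rng.integers(0, 256, (120, 160), dtype=np.uint8)
depth = np.tile(np.linspace(0.2, 1.0, 160), (120, 1))   # toy depth map; block-wise in the paper
right_view = warp_to_right_view(img, depth)
```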
5858 New Approaches on Stability Analysis for Neural Networks with Time-Varying Delay
Authors: Qingqing Wang, Shouming Zhong
Abstract:
Utilizing the Lyapunov functional method and combining linear matrix inequality (LMI) techniques with the integral inequality approach (IIA) to analyze the global asymptotic stability of delayed neural networks (DNNs), a new sufficient criterion ensuring the global stability of DNNs is obtained. The criteria are formulated in terms of a set of linear matrix inequalities, which can be checked efficiently by use of standard numerical packages. Numerical examples are considered in order to show that the stability condition given in this paper yields much less conservative results than those in the literature.
Keywords: Neural networks, Globally asymptotic stability, LMI approach, IIA approach, Time-varying delay.
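For readers who have not seen such a stability certificate verified numerically, the sketch below checks a much simpler, delay-free quadratic-Lyapunov condition by solving a Lyapunov equation directly; the paper's LMI criterion for delayed neural networks has additional decision matrices and integral-inequality terms and would be checked with an LMI/SDP solver instead, but the idea of certifying stability through a positive definite matrix P is the same.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Delay-free analogue: x_dot = A x is globally asymptotically stable iff
# A'P + PA = -Q (with Q > 0) admits a symmetric positive definite solution P.
A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
Q = np.eye(2)

P = solve_continuous_lyapunov(A.T, -Q)      # solves A'P + PA = -Q
P = (P + P.T) / 2                           # enforce symmetry against round-off
eigs = np.linalg.eigvalsh(P)

print("P =", P)
print("stability certified:", bool(np.all(eigs > 0)))
```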
5857 A Study of the Trade-off Energy Consumption-Performance-Schedulability for DVFS Multicore Systems
Authors: Jalil Boudjadar
Abstract:
Dynamic Voltage and Frequency Scaling (DVFS) multicore platforms are promising execution platforms that enable high computational performance, lower energy consumption and flexibility in scheduling the system processes. However, the resulting interleaving and memory interference, together with per-core frequency tuning, make real-time guarantees hard to deliver. Besides, energy consumption represents a strong constraint for the deployment of such systems in energy-limited settings. Identifying the system configurations that would achieve high performance and consume less energy while guaranteeing the system schedulability is a complex task in the design of modern embedded systems. This work studies the trade-off between energy consumption, core utilization and memory bottleneck, and their impact on the schedulability of DVFS multicore time-critical systems with a hierarchy of shared memories. We build a model-based framework using the Parametrized Timed Automata of UPPAAL to analyze the mutual impact of performance, energy consumption and schedulability of DVFS multicore systems, and demonstrate the trade-off on an actual case study.
Keywords: Time-critical systems, multicore systems, schedulability analysis, performance, memory interference, energy consumption.
5856 Visual-Graphical Methods for Exploring Longitudinal Data
Authors: H. W. Ker
Abstract:
Longitudinal data typically have the characteristics of changes over time, nonlinear growth patterns, between-subjects variability, and within errors exhibiting heteroscedasticity and dependence. Their exploration is more complicated than that of cross-sectional data. The purpose of this paper is to organize and integrate various visual-graphical techniques for exploring longitudinal data. By applying the proposed methods, investigators can answer research questions that include characterizing or describing the growth patterns at both the group and individual levels, identifying the time points where important changes occur and detecting unusual subjects, selecting suitable statistical models, and suggesting possible within-error variance structures.
Keywords: Data exploration, exploratory analysis, HLMs/LMEs, longitudinal data, visual-graphical methods.
5855 Using Gaussian Process in Wind Power Forecasting
Authors: Hacene Benkhoula, Mohamed Badreddine Benabdella, Hamid Bouzeboudja, Abderrahmane Asraoui
Abstract:
Wind is a random variable that is difficult to master; for this reason, we developed mathematical and statistical methods to model and forecast wind power. Gaussian Processes (GP) are one of the most widely used families of stochastic processes for modeling dependent data observed over time, over space, or over time and space; a GP describes an underlying process that is used to solve the problem at hand. The purpose of this paper is to present how to forecast wind power by using GP. The Gaussian process method for forecasting is presented. To validate the presented approach, a simulation in the MATLAB environment has been carried out.
Keywords: Forecasting, Gaussian process, modeling, wind power.
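Although the paper works in MATLAB, the idea of GP-based forecasting can be sketched in a few lines of Python with scikit-learn, as below: a GP with an RBF-plus-noise kernel is fitted to a hypothetical hourly wind-power series and then queried for a predictive mean and uncertainty band over the next hours. The data, kernel choice and horizon are assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical hourly wind-power record (MW); real data would come from SCADA logs.
rng = np.random.default_rng(0)
t = np.arange(48.0).reshape(-1, 1)
power = 20 + 8 * np.sin(2 * np.pi * t.ravel() / 24) + rng.normal(0, 1.5, 48)

kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, power)

# Forecast the next 12 hours with a predictive mean and a 95% uncertainty band.
t_future = np.arange(48.0, 60.0).reshape(-1, 1)
mean, std = gp.predict(t_future, return_std=True)
print(np.c_[t_future.ravel(), mean.round(1), (1.96 * std).round(1)])
```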
5854 A New Scheduling Algorithm Based on Traffic Classification Using Imprecise Computation
Authors: Farzad Abtahi, Sahar Khanmohamadi, Bahram Sadeghi Bigham
Abstract:
Wireless channels are characterized by more serious bursty and location-dependent errors. Many packet scheduling algorithms have been proposed for wireless networks to guarantee fairness and delay bounds. However, most existing schemes do not consider the differences in traffic nature among packet flows, which causes the delay-weight coupling problem. In particular, serious queuing delays may be incurred for real-time flows. In this paper, a scheduling algorithm is proposed that takes the traffic types of flows into consideration when scheduling packets; it also provides scheduling flexibility by trading off video quality to meet the playback deadline.
Keywords: Data communication, Real-time, Scheduling, Video transport.
5853 Consumer Load Profile Determination with Entropy-Based K-Means Algorithm
Authors: Ioannis P. Panapakidis, Marios N. Moschakis
Abstract:
With the continuous increase of smart meter installations across the globe, the need for processing the load data is evident. Clustering-based load profiling is built upon the utilization of unsupervised machine learning tools for the purpose of formulating the typical load curves or load profiles. The most commonly used algorithm in the load profiling literature is K-means. While the algorithm has been successfully tested in a variety of applications, its drawback is the strong dependence on the initialization phase. This paper proposes a novel modified form of K-means that addresses the aforementioned problem. Simulation results indicate the superiority of the proposed algorithm compared to the standard K-means.
Keywords: Clustering, load profiling, load modeling, machine learning, energy efficiency and quality.
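The baseline workflow this paper improves on can be sketched as below: normalized daily load curves are clustered with ordinary K-means (scikit-learn's default k-means++ initialization, not the entropy-based initialization proposed here), and the cluster centers are taken as the typical load profiles. The smart-meter data are synthetic stand-ins.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical smart-meter data: 200 consumers x 24 hourly readings for one day.
rng = np.random.default_rng(0)
hours = np.arange(24)
morning = 1 + 0.8 * np.exp(-(hours - 8) ** 2 / 8)     # morning-peaking consumers
evening = 1 + 1.2 * np.exp(-(hours - 19) ** 2 / 8)    # evening-peaking consumers
profiles = np.vstack([morning + 0.1 * rng.standard_normal((100, 24)),
                      evening + 0.1 * rng.standard_normal((100, 24))])
profiles /= profiles.max(axis=1, keepdims=True)        # normalize to each consumer's peak

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
typical_load_curves = km.cluster_centers_               # one representative profile per cluster
print("consumers per cluster:", np.bincount(km.labels_))
```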
5852 Relative Radiometric Correction of Cloudy Multitemporal Satellite Imagery
Authors: Seema Biday, Udhav Bhosle
Abstract:
Repeated observation of a given area over time yields potential for many forms of change detection analysis. These repeated observations are confounded in terms of radiometric consistency due to changes in sensor calibration over time, differences in illumination, observation angles and variation in atmospheric effects. This paper demonstrates the applicability of an empirical relative radiometric normalization method to a set of multitemporal cloudy images acquired by the Resourcesat-1 LISS III sensor. The objective of this study is to detect and remove cloud cover and normalize an image radiometrically. Cloud detection is achieved by using the Average Brightness Threshold (ABT) algorithm. The detected cloud is removed and replaced with data from other images of the same area. After cloud removal, the proposed normalization method is applied to reduce the radiometric influence caused by non-surface factors. This process identifies landscape elements whose reflectance values are nearly constant over time, i.e. the subset of non-changing pixels is identified using a frequency based correlation technique. The quality of radiometric normalization is statistically assessed by the R2 value and the mean square error (MSE) between each pair of analogous bands.
Keywords: Correlation, Frequency domain, Multitemporal, Relative Radiometric Correction
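In its simplest form, such an empirical relative normalization fits a band-wise linear gain and offset on pixels assumed to be unchanged between dates and reports the R2 and MSE quality measures; the sketch below does exactly that on synthetic arrays, taking the invariant-pixel mask as given rather than deriving it with the frequency-based correlation technique used in the paper.

```python
import numpy as np

def relative_normalization(subject, reference, invariant_mask):
    """Fit reference = gain*subject + offset on (assumed) non-changing pixels
    and apply it to the whole subject band."""
    x = subject[invariant_mask].astype(float)
    y = reference[invariant_mask].astype(float)
    gain, offset = np.polyfit(x, y, 1)
    normalized = gain * subject.astype(float) + offset
    # Quality measures quoted in the abstract.
    r2 = 1 - np.sum((y - (gain * x + offset)) ** 2) / np.sum((y - y.mean()) ** 2)
    mse = np.mean((normalized[invariant_mask] - reference[invariant_mask]) ** 2)
    return normalized, gain, offset, r2, mse

rng = np.random.default_rng(0)
reference = rng.integers(40, 200, (100, 100)).astype(float)
subject = 0.8 * reference + 15 + rng.normal(0, 2, (100, 100))   # different illumination/atmosphere
mask = rng.random((100, 100)) < 0.2                             # stand-in invariant-pixel subset
_, gain, offset, r2, mse = relative_normalization(subject, reference, mask)
print(f"gain={gain:.2f} offset={offset:.1f} R2={r2:.3f} MSE={mse:.2f}")
```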
5851 An Approach for Reducing the Computational Complexity of LAMSTAR Intrusion Detection System using Principal Component Analysis
Authors: V. Venkatachalam, S. Selvan
Abstract:
The security of computer networks plays a strategic role in modern computer systems. Intrusion Detection Systems (IDS) act as the 'second line of defense' placed inside a protected network, looking for known or potential threats in network traffic and/or audit data recorded by hosts. We developed an Intrusion Detection System using the LAMSTAR neural network to learn patterns of normal and intrusive activities and to classify observed system activities, and compared the performance of the LAMSTAR IDS with other classification techniques using 5 classes of the KDDCup99 data. The LAMSTAR IDS gives better performance at the cost of high computational complexity, training time and testing time, when compared to the other classification techniques (binary tree classifier, RBF classifier, Gaussian mixture classifier). We further reduced the computational complexity of the LAMSTAR IDS by reducing the dimension of the data using principal component analysis, which in turn reduces the training and testing time while retaining almost the same performance.
Keywords: Binary Tree Classifier, Gaussian Mixture, Intrusion Detection System, LAMSTAR, Radial Basis Function.
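The dimensionality-reduction step can be sketched as below with scikit-learn: the features are standardized and projected onto the principal components that explain most of the variance, and the reduced matrix is then fed to whatever classifier follows. The synthetic data and the 95% variance-retention threshold are assumptions; the abstract does not state the exact criterion used, only that PCA is applied before the LAMSTAR classifier.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the 41 KDDCup99 features (correlated on purpose so
# that a handful of components carries most of the variance).
rng = np.random.default_rng(0)
latent = rng.standard_normal((5000, 8))
X_train = latent @ rng.standard_normal((8, 41)) + 0.1 * rng.standard_normal((5000, 41))

scaler = StandardScaler().fit(X_train)
pca = PCA(n_components=0.95).fit(scaler.transform(X_train))   # keep 95% of the variance (assumed threshold)
X_reduced = pca.transform(scaler.transform(X_train))

print("features kept:", pca.n_components_, "of", X_train.shape[1])
# X_reduced would then be fed to the LAMSTAR (or any other) classifier.
```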
5850 The Consumer Private Space: What is and How it can be Approached without Affecting the Consumer's Privacy
Authors: Calin Veghes
Abstract:
The concept of privacy, seen in connection with the consumer's private space and personalization, has recently gained greater importance as a consequence of the increasing marketing efforts of organizations based on the capturing, processing and usage of consumers' personal data. The paper intends to provide a definition of the consumer's private space based on the types of personal data the consumer is willing to disclose, to assess the attitude toward personalization, and to identify the means preferred by consumers to control their personal data and defend their private space. Several implications generated through the definition of the consumer's private space are identified and weighted from both the consumers' and the organizations' perspectives.
Keywords: Consumer private space, personalization, privacy.
5849 Discrete Vector Control for Induction Motor Drives with the Rotor Time Constant Update
Authors: A. Larabi, M. S. Boucherit
Abstract:
In this paper, we investigate the vector control of an induction machine, taking into account the discretization problems of the command, with the purpose of showing how to include the current control in a discrete model with a rotor time constant update. The simulation results obtained are very satisfactory. This was possible thanks to a good choice of the values of the regulator parameters, which confirms the soundness of the method used for choosing the parameters of the discrete regulators. The simulation results are presented at the end of this paper.
Keywords: Induction motor, discrete vector control, PI regulator, Park transformation, PWM.
5848 An Approximation Method for Exact Boundary Controllability of Euler-Bernoulli System
Authors: Abdelaziz Khernane, Naceur Khelil, Leila Djerou
Abstract:
The aim of this work is to study the numerical implementation of the Hilbert Uniqueness Method for the exact boundary controllability of the Euler-Bernoulli beam equation. This study may be difficult, depending on the problem under consideration (geometry, control and dimension) and the numerical method used. Knowledge of the asymptotic behaviour of the control governing the system at time T may be useful for its calculation, and this idea is developed in the present study. As a first step, we characterize the solution by a minimization principle; we then propose a method for its resolution in order to approximate the control steering the considered system to rest at time T.
Keywords: Boundary control, exact controllability, finite difference methods, functional optimization.
5847 About the Case Portfolio Management Algorithms and Their Applications
Authors: M. Chumburidze, N. Salia, T. Namchevadze
Abstract:
This work deals with case processing problems in business. The task of strategic credit requirements management for a case portfolio is discussed. An information model of credit requirements in a binary tree diagram is considered. Algorithms to solve the problem of prioritizing clusters of cases in business have been investigated. An implementation of priority queues to support case management operations is presented, and the corresponding pseudocodes for the programming application have been constructed. The tools applied in this development are based on binary tree ordering algorithms, optimization theory, and business management methods.
Keywords: Credit network, case portfolio, binary tree, priority queue, stack.
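As a concrete illustration of the priority-queue support mentioned above, the short sketch below orders a toy case portfolio with Python's heapq (a binary-heap priority queue), serving the case with the largest credit exposure first; the field names and amounts are invented, and the ordering rule is only an assumed example of a prioritization criterion.

```python
import heapq

portfolio = []   # the heap; entries are (negated exposure, case id)

def push_case(case_id, exposure):
    # Negate the exposure so the standard min-heap behaves as a max-priority queue.
    heapq.heappush(portfolio, (-exposure, case_id))

def pop_most_urgent():
    neg_exposure, case_id = heapq.heappop(portfolio)
    return case_id, -neg_exposure

for cid, amount in [("C-101", 25000), ("C-102", 120000), ("C-103", 8000)]:
    push_case(cid, amount)

while portfolio:
    print(pop_most_urgent())        # C-102 first, then C-101, then C-103
```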
5846 Inventory Control for a Joint Replenishment Problem with Stochastic Demand
Authors: Bassem Roushdy, Nahed Sobhy, Abdelrhim Abdelhamid, Ahmed Mahmoud
Abstract:
Most papers model the Joint Replenishment Problem (JRP) as a (kT, S) policy, where kT is a multiple of a common review period T and S is a predefined order-up-to level. In general, the (T, S) policy is characterized by a long out-of-control period, which requires a large amount of safety stock compared to the (R, Q) policy. In this paper, a probabilistic model is built in which the item with the shortest time between orders (T), call it item (i), is modeled under an (R, Q) policy with its inventory continuously reviewed, while the rest of the items (j) are periodically reviewed at a definite time corresponding to item (i).
Keywords: Inventory management, Joint replenishment, policy evaluation, stochastic process
5845 Development of an Adhesive from Prosopis africana Seed Endosperm (Okpeyi)
Authors: Nwangwu Florence Chinyere, Ene Rosemary Ndidiamaka
Abstract:
This research work is an experimental study of the development of an adhesive from Prosopis africana endosperm. The Prosopis seeds for this work were obtained from Enugu State in the South East part of Nigeria. The seeds were prepared by separating the endosperm from the seed coat and cotyledon, using three separation methods: the acid method, the roasting method and the boiling method. 20 g of seeds were treated with different concentrations (25, 40, 55, 70, and 85% w/w) at 100°C for a constant time (30 minutes), under continuous stirring with a magnetic stirrer. Also, 20 g of seeds were treated with sulphuric acid at a concentration of 40% w/w at 100°C for different times (10, 15, 20, 25, 30 minutes), under continuous stirring with a magnetic stirrer. Finally, 20 g of seeds were treated with sulphuric acid at a concentration of 40% w/w at different temperatures (20°C, 40°C, 60°C, 80°C, and 100°C) for a constant time (30 minutes), under continuous stirring with a magnetic stirrer. The whole endosperm extracted was the adhesive. The physical properties of the adhesive were determined (appearance, odour, taste, solubility, pH, size, and binding strength). The percentage adhesive yield makes the commercialization of the seed in Nigeria possible and profitable. The very high viscosity attained at low concentrations makes the Prosopis adhesive an excellent thickener in the food industry.
Keywords: Endosperm, adhesive, ethanol, Prosopis africana seed.
5844 Affecting Factors of the Mechanical Properties to Phenolic/Fiber Composite
Authors: Thirapat Kitinirunkul, Nattawat Winya, Komson Prapunkarn
Abstract:
The influences of the amount of phenolic, the curing temperature and the curing time on the mechanical properties of a phenolic/fiber composite were investigated using a two-level factorial design, which was used to determine the effects of those factors on the mechanical properties. The purpose of this study was to investigate the effects of the amount of phenolic, curing temperature and curing time of the composite in order to determine the best condition for the mechanical properties according to MIL-I-24768, namely a tensile strength of more than 103 MPa.
Keywords: Phenolic Resin, Composite, Fiber Composite, Affecting Factors.
5843 A Novel Dual-Purpose Image Watermarking Technique
Authors: Maha Sharkas, Dahlia R. ElShafie, Nadder Hamdy
Abstract:
Image watermarking has proven to be quite an efficient tool for the purposes of copyright protection and authentication over the last few years. In this paper, a novel image watermarking technique in the wavelet domain is suggested and tested. To achieve more security and robustness, the proposed technique relies on using two nested watermarks that are embedded into the image to be watermarked. A primary watermark in the form of a PN sequence is first embedded into an image (the secondary watermark) before being embedded into the host image. The technique is implemented using Daubechies mother wavelets, where an arbitrary embedding factor α is introduced to improve the invisibility and robustness. The proposed technique has been applied on several gray scale images, where a PSNR of about 60 dB was achieved.
Keywords: Image watermarking, Multimedia Security, Wavelets, Image Processing.
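A minimal sketch of the wavelet-domain embedding idea, using the PyWavelets package, is given below: a +/-1 PN sequence is added to one detail subband with an embedding factor alpha and later detected by correlation. Only the primary-watermark step is shown, the nesting into a secondary watermark image is omitted, and the wavelet, subband, alpha value and stand-in host image are assumptions rather than the authors' settings.

```python
import numpy as np
import pywt

rng = np.random.default_rng(42)
host = rng.integers(0, 256, (256, 256)).astype(float)    # stand-in for the host image
alpha = 2.0                                               # embedding factor (assumed value)

cA, (cH, cV, cD) = pywt.dwt2(host, "db4")                 # Daubechies decomposition
pn = rng.choice([-1.0, 1.0], size=cH.shape)               # primary watermark (PN sequence)
cH_marked = cH + alpha * pn                               # additive embedding in one detail subband
watermarked = pywt.idwt2((cA, (cH_marked, cV, cD)), "db4")

# Detection: correlate the suspect image's subband with the known PN sequence.
_, (cH_test, _, _) = pywt.dwt2(watermarked, "db4")
score = np.sum(cH_test * pn) / pn.size
print(f"correlation score: {score:.2f} (close to alpha when the watermark is present)")
```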