Search results for: filtered Poisson process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5542

5482 A Novel Machining Signal Filtering Technique: Z-notch Filter

Authors: Nuawi M. Z., Lamin F., Ismail A. R., Abdullah S., Wahid Z.

Abstract:

A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove noise nuisance from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, the motor and the machine environment. By correlating the noise components with the measured machining signal, the components of interest in the measured machining signal, which were less interfered with by noise, could be extracted. Thus, the filtered signal is more reliable for analysis in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal. A larger scattering space and a higher value of Z∞ indicated that the signal was strongly corrupted by noise. The method can therefore be used as a proactive tool for evaluating the noise content of a signal. Evaluating and eliminating noise content is particularly important for machining fault diagnosis. The Z-notch filtering technique was reliable in extracting the noise components from the measured machining signal with high efficiency. Even though the measured signal was exposed to strong noise disruption, the signal generated by the interaction between the cutting tool and the workpiece could still be acquired. Therefore, noise interference that could alter the original signal features and consequently degrade the useful sensory information can be eliminated.
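
As a minimal illustration of notch filtering in this spirit (a generic IIR notch removing one known machine-noise tone, not the authors' Z-notch algorithm; the sampling rate, noise frequency and signal are assumed for the example):

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 10_000.0                       # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)

# Synthetic "machining" signal: cutting component at 350 Hz
# plus machine noise (e.g. hydraulic/motor hum) at 50 Hz.
cutting = np.sin(2 * np.pi * 350 * t)
noise = 0.8 * np.sin(2 * np.pi * 50 * t)
measured = cutting + noise

# Notch filter centred on the identified noise frequency.
b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)
filtered = filtfilt(b, a, measured)

# Crude check: amplitude remaining at the noise frequency.
def tone_amplitude(x, f):
    return np.abs(np.dot(x, np.exp(-2j * np.pi * f * t))) / len(x)

print("50 Hz component before:", tone_amplitude(measured, 50.0))
print("50 Hz component after: ", tone_amplitude(filtered, 50.0))
```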

Keywords: Digital signal filtering, I-kaz method, Machining monitoring, Noise cancelling, Sound

5481 Object-Centric Process Mining Using Process Cubes

Authors: Anahita Farhang Ghahfarokhi, Alessandro Berti, Wil M.P. van der Aalst

Abstract:

Process mining provides ways to analyze business processes. Common process mining techniques consider the process as a whole. However, in real-life business processes, different behaviors exist that make the overall process too complex to interpret. Process comparison is a branch of process mining that isolates the different behaviors of a process from each other by using process cubes. Process cubes organize event data along different dimensions. Each cell contains a set of events that can be used as input for process mining techniques. Existing work on process cubes assumes a single case notion. However, in real processes, several case notions (e.g., order, item, package) are intertwined. Object-centric process mining is a new branch of process mining that addresses multiple case notions in a process. To bridge object-centric process mining and process comparison, we propose a process cube framework that supports process cube operations such as slice and dice on object-centric event logs. To facilitate the comparison, the framework is integrated with several object-centric process discovery approaches.
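
To make the slice and dice operations concrete, here is a toy sketch of a process cube over a flattened object-centric event log using pandas; the dimensions, column names and events are invented for illustration and do not reflect the framework's own API:

```python
import pandas as pd

# Hypothetical object-centric event log, flattened to one row per (event, object)
events = pd.DataFrame([
    {"event": "create order", "object_type": "order",   "region": "EU", "month": "2021-01"},
    {"event": "pick item",    "object_type": "item",    "region": "EU", "month": "2021-01"},
    {"event": "pack items",   "object_type": "package", "region": "US", "month": "2021-02"},
    {"event": "ship package", "object_type": "package", "region": "US", "month": "2021-02"},
])

# "Slice": fix one dimension value; "dice": restrict several dimensions at once
slice_eu = events[events["region"] == "EU"]
dice = events[events["region"].isin(["EU", "US"]) & (events["object_type"] != "item")]

# Each cell of the cube is a group of events, ready for process discovery
cube = events.groupby(["region", "object_type"])
for cell, cell_events in cube:
    print(cell, "->", len(cell_events), "events")
```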

Keywords: Process mining, multidimensional process mining, multi-perspective business processes, OLAP, process cubes, process discovery.

5480 Application of an Analytical Model to Obtain Daily Flow Duration Curves for Different Hydrological Regimes in Switzerland

Authors: Ana Clara Santos, Maria Manuela Portela, Bettina Schaefli

Abstract:

This work assesses the performance of an analytical model framework for generating daily flow duration curves, FDCs, based on the climatic characteristics of the catchments and on their streamflow recession coefficients. In the analytical framework, precipitation is considered a stochastic process, modeled as a marked Poisson process, and recession is considered deterministic, with parameters that can be computed with different models. The framework was tested on three case studies with different hydrological regimes located in Switzerland: pluvial, snow-dominated and glacier. For that purpose, five time intervals were analyzed (the four meteorological seasons and the civil year) and two developments of the model were tested: one considering a linear recession model and the other adopting a nonlinear recession model. Those developments were combined with recession coefficients obtained from two different approaches: forward and inverse estimation. The performance of the analytical framework with forward parameter estimation is poor in comparison with inverse estimation, for both the linear and the nonlinear model. For the pluvial catchment, the inverse estimation shows exceptionally good results, especially for the nonlinear model, clearly suggesting that the model is able to describe FDCs. For the snow-dominated and glacier catchments, the seasonal results are better than the annual ones, suggesting that the model can describe streamflows under those conditions and that future efforts should focus on improving and combining seasonal curves instead of considering single annual ones.
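
A minimal numerical sketch of this kind of model: storm arrivals as a marked Poisson process, deterministic linear recession between storms, and an empirical FDC built from the simulated daily flows. All parameter values below are placeholders, not calibrated to the Swiss catchments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder parameters (illustrative only)
lam = 0.3          # storm arrival rate (events/day), Poisson process
mean_depth = 10.0  # mean storm depth (mm), exponential marks
k = 0.05           # linear recession coefficient (1/day): dQ/dt = -k*Q between events
days = 3650

q = np.zeros(days)
for d in range(1, days):
    q[d] = q[d - 1] * np.exp(-k)                          # deterministic linear recession
    n_events = rng.poisson(lam)                           # number of storms in the day
    q[d] += rng.exponential(mean_depth, n_events).sum()   # exponential marks

# Empirical daily flow duration curve: flow versus exceedance probability
flows = np.sort(q)[::-1]
exceedance = (np.arange(1, days + 1) - 0.5) / days        # plotting positions
for p in (0.05, 0.5, 0.95):
    print(f"flow exceeded {p:.0%} of the time: {np.interp(p, exceedance, flows):.1f}")
```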

Keywords: Analytical streamflow distribution, stochastic process, linear and non-linear recession, hydrological modelling, daily discharges.

5479 On Four Models of a Three Server Queue with Optional Server Vacations

Authors: Kailash C. Madan

Abstract:

We study four models of a three-server queueing system with Bernoulli-schedule optional server vacations. Customers arriving at the system one by one in a Poisson process are provided identical exponential service by three parallel servers according to a first-come, first-served queue discipline. In model A, all three servers may be allowed a vacation at one time; in model B, at most two of the three servers may be allowed a vacation at one time; in model C, at most one server is allowed a vacation; and in model D, no server is allowed a vacation. We study the steady state behavior of the four models and obtain steady state probability generating functions for the queue size at a random point of time for all states of the system. In model D, a known result for a three-server queueing system without server vacations is derived.
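
For the vacation-free case (model D), the classical M/M/3 steady-state distribution that the paper recovers can be computed directly; a small sketch with placeholder arrival and service rates:

```python
from math import factorial

def mm_c_probs(lam, mu, c, n_max=50):
    """Steady-state probabilities p_n of an M/M/c queue (here c = 3 servers)."""
    rho = lam / (c * mu)
    assert rho < 1, "queue is unstable"
    # p0 from the standard M/M/c normalisation
    s = sum((lam / mu) ** n / factorial(n) for n in range(c))
    s += (lam / mu) ** c / (factorial(c) * (1 - rho))
    p0 = 1.0 / s
    probs = []
    for n in range(n_max + 1):           # truncated at n_max for the printout below
        if n < c:
            probs.append(p0 * (lam / mu) ** n / factorial(n))
        else:
            probs.append(p0 * (lam / mu) ** n / (factorial(c) * c ** (n - c)))
    return probs

p = mm_c_probs(lam=2.0, mu=1.0, c=3)     # placeholder rates
print("P(empty system) =", p[0])
print("mean number in system (truncated) =", sum(n * pn for n, pn in enumerate(p)))
```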

Keywords: A three server queue, Bernoulli schedule server vacations, queue size distribution at a random epoch, steady state.

5478 Methods for Material and Process Monitoring by Characterization of (Second and Third Order) Elastic Properties with Lamb Waves

Authors: R. Meier, M. Pander

Abstract:

In accordance with the Industry 4.0 concept, manufacturing process steps as well as the materials themselves will be increasingly digitalized in the coming years. The "digital twin", representing the simulated and measured dataset of the (semi-finished) product, can be used to control and optimize the individual processing steps and helps to reduce costs and time expenditure in product development, manufacturing, and recycling. In the present work, two material characterization methods based on Lamb waves were evaluated and compared. Both methods were demonstrated on a standard industrial product: copper ribbons, often used in photovoltaic modules as well as in high-current microelectronic devices. By numerically fitting the Rayleigh-Lamb dispersion model to measured phase velocities, the second order elastic constants (Young's modulus, Poisson's ratio) were determined. Furthermore, the effective third order elastic constants were evaluated by applying elastic, non-destructive mechanical stress to the samples. In this way, small microstructural variations due to mechanical preconditioning could be detected for the first time. Both methods were compared with respect to precision and inline application capabilities. The microstructure of the samples was systematically varied by mechanical loading and annealing. Changes in the elastic ultrasound transport properties were correlated with results from microstructural analysis and mechanical testing. In summary, monitoring the elastic material properties of plate-like structures using Lamb waves is valuable for inline, non-destructive material characterization and manufacturing process control. Second order elastic constant analysis is robust over a wide range of environmental and sample conditions, whereas the effective third order elastic constants greatly increase the sensitivity to small microstructural changes. Both Lamb wave based characterization methods fit well into the Industry 4.0 concept.
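
For reference, the Rayleigh-Lamb frequency equations underlying such a phase-velocity fit are, in their standard textbook form for a traction-free plate of half-thickness h with bulk longitudinal and transverse velocities c_L and c_T (this is the generic relation, not the authors' specific numerical scheme):

```latex
% Wavenumbers: k = \omega / c_{ph}, \quad p^2 = \omega^2/c_L^2 - k^2, \quad q^2 = \omega^2/c_T^2 - k^2
\frac{\tan(qh)}{\tan(ph)} = -\frac{4k^{2}pq}{\left(q^{2}-k^{2}\right)^{2}}
\quad \text{(symmetric modes)}, \qquad
\frac{\tan(qh)}{\tan(ph)} = -\frac{\left(q^{2}-k^{2}\right)^{2}}{4k^{2}pq}
\quad \text{(antisymmetric modes)}
```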

Keywords: Lamb waves, Industry 4.0, process control, elasticity, acoustoelasticity.

5477 Simulation of CO2 Capture Process

Authors: K. Movagharnejad, M. Akbari

Abstract:

The carbon dioxide capture process has been simulated and studied under different process conditions. It has been shown that several process parameters, such as the lean amine temperature, the number of absorber stages, the number of stripper stages and the stripper pressure, affect process outputs such as carbon dioxide removal and reboiler duty. It may be concluded that simulation of the carbon dioxide capture process can help to estimate the best process conditions.

Keywords: Absorption, carbon dioxide capture, desorption, process simulation.

5476 Real Time Compensation of Machining Errors for NC Machine Tools Based on Systematic Dispersion

Authors: M. Rahou, A. Cheikh, F. Sebaa

Abstract:

Manufacturing tolerancing is intended to determine the intermediate geometrical and dimensional states of the part during its manufacturing process. These manufacturing dimensions serve to satisfy not only the functional requirements given in the definition drawing, but also the manufacturing constraints, for example geometrical defects of the machine, vibration and wear of the cutting tool. In this paper, an experimental study of the influence of cutting tool wear (systematic dispersions) is presented. The study was carried out in three stages. The first stage consists of machining without elimination of dispersions (random and systematic), giving the manufacturing tolerances according to the total dispersions. In the second stage, the results of the first stage are filtered so as to obtain the tolerances according to the random dispersions alone. Finally, from the two previous stages, the systematic dispersions are derived. The objective of this study is to model, by the least squares method, the manufacturing error due to the systematic dispersion. Finally, an approach for optimizing the manufacturing tolerances was developed for machining on a CNC machine tool.
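
As a minimal sketch of the least-squares step (fitting a drift in a measured dimension against the part number as a proxy for tool wear; the data values are made up for illustration and are not the paper's measurements):

```python
import numpy as np

# Hypothetical measured diameters (mm) for successive parts; the slow drift
# plays the role of the systematic dispersion caused by tool wear.
part_no = np.arange(1, 21)
measured = 40.00 + 0.0025 * part_no + np.random.default_rng(1).normal(0, 0.004, 20)

# Least-squares linear model of the systematic error
slope, intercept = np.polyfit(part_no, measured, 1)
systematic = intercept + slope * part_no
random_residual = measured - systematic

print(f"wear drift per part ~ {slope * 1000:.2f} um")
print(f"residual (random) std ~ {random_residual.std(ddof=1) * 1000:.2f} um")
# An NC compensation table could then offset the programmed dimension
# by the negative of the fitted systematic drift.
```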

Keywords: Dispersions, compensation, modeling, manufacturing tolerance, machine tool.

5475 Critical Assessment of Scoring Schemes for Protein-Protein Docking Predictions

Authors: Dhananjay C. Joshi, Jung-Hsin Lin

Abstract:

Protein-protein interactions (PPI) play a crucial role in many biological processes such as cell signalling, transcription, translation, replication, signal transduction, and drug targeting. Structural information about protein-protein interactions is essential for understanding the molecular mechanisms of these processes. Structures of protein-protein complexes are still difficult to obtain by biophysical methods such as NMR and X-ray crystallography, and therefore protein-protein docking computation is considered an important approach for understanding protein-protein interactions. However, reliable prediction of protein-protein complexes is still under way. In the past decades, several grid-based docking algorithms based on the Katchalski-Katzir scoring scheme were developed, e.g., FTDock, ZDOCK, HADDOCK, RosettaDock, and HEX. However, the success rate of protein-protein docking prediction is still far from ideal. In this work, we first propose a more practical measure for evaluating the success of protein-protein docking predictions, the rate of first success (RFS), which is similar to the concept of mean first passage time (MFPT). Accordingly, we have assessed the ZDOCK bound and unbound benchmarks 2.0 and 3.0. We also created a new benchmark set for protein-protein docking predictions, in which the complexes have experimentally determined binding affinity data. We performed free energy calculations based on the solution of the non-linear Poisson-Boltzmann equation (nlPBE) to improve the binding mode prediction. We used the well-studied barnase-barstar system to validate the parameters for the free energy calculations. In addition, the nlPBE-based free energy calculations were conducted for the cases badly predicted by ZDOCK and ZRANK. We found that direct molecular mechanics energetics cannot be used to discriminate the native binding pose from the decoys. Our results indicate that nlPBE-based calculations appear to be one of the promising approaches for improving the success rate of binding pose predictions.
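
A sketch of how a "rate of first success" style statistic could be tallied from ranked docking poses; the success criterion, threshold and data below are hypothetical placeholders, not the authors' benchmark definition:

```python
import numpy as np

def first_success_rank(pose_rmsds, threshold=2.5):
    """Rank (1-based) of the first near-native pose, or None if absent."""
    for rank, rmsd in enumerate(pose_rmsds, start=1):
        if rmsd <= threshold:
            return rank
    return None

# Hypothetical interface-RMSD lists (Angstrom) for ranked poses of three complexes
benchmark = {
    "complex_A": [8.1, 3.4, 1.9, 7.2],
    "complex_B": [12.0, 9.4, 8.8],
    "complex_C": [1.2, 5.0, 6.3],
}

first_ranks = [first_success_rank(r) for r in benchmark.values()]
hits = [r for r in first_ranks if r is not None]
# Analogy with a mean first passage time: the average rank at which success first occurs
print("mean first-success rank:", np.mean(hits))
print("fraction of cases with any success:", len(hits) / len(first_ranks))
```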

Keywords: Protein-protein docking, protein-protein interaction, molecular mechanics energetics, Poisson-Boltzmann calculations.

5474 Performance Analysis of Software Reliability Models using Matrix Method

Authors: RajPal Garg, Kapil Sharma, Rajive Kumar, R. K. Garg

Abstract:

This paper presents a computational methodology based on matrix operations for a computer-based solution to the problem of performance analysis of software reliability models (SRMs). A set of seven comparison criteria has been formulated to rank the various non-homogeneous Poisson process software reliability models proposed during the past 30 years for estimating software reliability measures such as the number of remaining faults, the software failure rate, and the software reliability. Selection of the optimal SRM for use in a particular case has been an area of interest for researchers in the field of software reliability. Tools and techniques for software reliability model selection found in the literature cannot be used with a high level of confidence, as they rely on a limited number of model selection criteria. A real data set from a middle-sized software project, taken from published papers, has been used to demonstrate the matrix method. The result of this study is a ranking of SRMs based on the permanent of the criteria matrix formed for each model from the comparison criteria. The software reliability model with the highest value of the permanent is ranked number 1, and so on.
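
As an illustration of the matrix permanent used for the ranking (brute-force expansion, which is fine for the small criteria matrices involved; the example matrix values are made up):

```python
from itertools import permutations
import numpy as np

def permanent(m):
    """Permanent of a square matrix by direct expansion (O(n!), small n only)."""
    n = len(m)
    return sum(
        np.prod([m[i][p[i]] for i in range(n)])
        for p in permutations(range(n))
    )

# Hypothetical 3x3 criteria matrix for one software reliability model
criteria_matrix = np.array([
    [1.0, 0.6, 0.8],
    [0.4, 1.0, 0.7],
    [0.2, 0.3, 1.0],
])
print("Permanent =", permanent(criteria_matrix))
# Models would then be ranked in decreasing order of this value.
```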

Keywords: Matrix method, Model ranking, Model selection, Model selection criteria, Software reliability models.

5473 A New Approach Defining Angular DMD Using Near Field Aperturing

Authors: S. Al-Sowayan, K. L. Lear

Abstract:

A new technique to quantify the differential mode delay (DMD) in multimode fiber (MMF) is presented. The technique measures DMD based on angular launch and measurement of the difference in modal delay using variable apertures at the fiber face. The angular spatial filtering revealed less excitation of higher order modes when the laser beam was filtered at higher angles. This result indicates that DMD profiles would exhibit a data pattern dependency.

Keywords: Fiber measurements, Fiber optic communications

5472 Methods for Business Process Simulation Based on Petri Nets

Authors: K. Shoylekova, K. Grigorova

Abstract:

Petri nets are the first standard for business process modeling. This is most probably one of the core reasons why all new standards created afterwards have to be transformed so that they can be mapped onto Petri nets. The paper presents a business process repository based on a universal database. The repository allows the data about a given process to be stored in three different ways. The business process repository is developed so that a given model can be transformed into a Petri net and easily simulated. Two different techniques for business process simulation based on Petri nets, Yasper and Woflan, are discussed. Their advantages and drawbacks are outlined. The way of simulating business process models stored in the business process repository is shown.
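
For intuition, the token game of a place/transition net can be simulated in a few lines; this toy net and its naming are purely illustrative and are not the repository's schema or the Yasper/Woflan tools:

```python
# Minimal place/transition net: each transition lists input and output places with weights.
net = {
    "register": ({"start": 1},      {"registered": 1}),
    "approve":  ({"registered": 1}, {"approved": 1}),
    "archive":  ({"approved": 1},   {"done": 1}),
}
marking = {"start": 1, "registered": 0, "approved": 0, "done": 0}

def enabled(t):
    pre, _ = net[t]
    return all(marking[p] >= w for p, w in pre.items())

def fire(t):
    pre, post = net[t]
    for p, w in pre.items():
        marking[p] -= w
    for p, w in post.items():
        marking[p] += w

# Play the token game until no transition is enabled.
while True:
    candidates = [t for t in net if enabled(t)]
    if not candidates:
        break
    fire(candidates[0])
    print(candidates[0], "->", marking)
```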

Keywords: Business process repository, Petri nets, Simulation, Woflan, Yasper.

5471 A Goal-Oriented Social Business Process Management Framework

Authors: Mohammad Ehson Rangiha, Bill Karakostas

Abstract:

Social Business Process Management (SBPM) promises to overcome limitations of traditional BPM by allowing flexible process design and enactment through the involvement of users from a social community. This paper proposes a meta-model and architecture for socially driven business process management systems. It discusses the main facets of the architecture, such as goal-based role assignment that combines social recommendations with user profiles, and process recommendation, through a real example of a charity organization.

Keywords: Business Process Management, Goal-Based Modelling, Process Recommendation, Social Collaboration, Social BPM.

5470 Fuzzy Estimation of Parameters in Statistical Models

Authors: A. Falsafain, S. M. Taheri, M. Mashinchi

Abstract:

Using a set of confidence intervals, we develop a common approach to constructing a fuzzy set as an estimator for unknown parameters in statistical models. We investigate a method to derive the explicit and unique membership function of such fuzzy estimators. The proposed method has been used to derive the fuzzy estimators of the parameters of a Normal distribution and of some functions of the parameters of two Normal distributions, as well as the parameters of the Exponential and Poisson distributions.
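
One common construction of this kind (where the α-cut of the fuzzy estimator equals the (1-α) two-sided confidence interval) gives, for a Normal mean with known σ, the membership function sketched below. This is a generic illustration of stacking confidence intervals, not necessarily the exact derivation in the paper:

```python
import numpy as np
from scipy.stats import norm

def fuzzy_mean_membership(theta, xbar, sigma, n):
    """Membership of theta in the fuzzy estimator of a Normal mean.

    Built by stacking confidence intervals: the alpha-cut of the fuzzy
    number equals the (1 - alpha) two-sided confidence interval, so
    mu(theta) = 2 * (1 - Phi(|theta - xbar| * sqrt(n) / sigma)).
    """
    z = np.abs(theta - xbar) * np.sqrt(n) / sigma
    return np.clip(2.0 * (1.0 - norm.cdf(z)), 0.0, 1.0)

xbar, sigma, n = 10.0, 2.0, 25
for theta in (10.0, 10.4, 10.784, 11.5):
    print(theta, "->", round(float(fuzzy_mean_membership(theta, xbar, sigma, n)), 3))
# theta = xbar has membership 1; the 95% CI endpoints (xbar +/- 1.96*sigma/sqrt(n))
# have membership 0.05.
```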

Keywords: Confidence interval, fuzzy number, fuzzy estimation.

5469 A Practical Approach for Testing the Process Quality

Authors: Mou-Yuan Liao, Chien-Wei Wu, Chien-Hua Lin

Abstract:

The process capability index Cpk is the most widely used index in making managerial decisions, since it provides bounds on the process yield for normally distributed processes. However, existing methods for assessing process performance that are constructed by statistical inference may not lead to reliable results, because uncertainties exist in most real-world applications. Thus, this study adopts fuzzy inference to deal with testing of Cpk. A brief score is obtained for assessing a supplier's process instead of a severe evaluation.
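
For reference, the crisp Cpk statistic being tested has the familiar form below; the specification limits and data are placeholders, and the fuzzy-inference layer from the paper is not reproduced here:

```python
import numpy as np

def cpk(sample, lsl, usl):
    """Classical process capability index Cpk = min(USL - mu, mu - LSL) / (3*sigma)."""
    mu = np.mean(sample)
    sigma = np.std(sample, ddof=1)
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

rng = np.random.default_rng(7)
sample = rng.normal(loc=5.02, scale=0.03, size=100)   # hypothetical measurements
print("Cpk =", round(cpk(sample, lsl=4.90, usl=5.10), 3))
```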

Keywords: Process capability analysis, quality control.

5468 Process Capability Analysis by Using Statistical Process Control of Rice Polished Cylinder Turning Practice

Authors: S. Bangphan, P. Bangphan, T. Boonkang

Abstract:

Quality control helps industries improve their product quality and productivity. Statistical Process Control (SPC) is one of the tools used to control product quality; here it is applied to bring a turning process in a department of industrial engineering under control. In this research, the process control of rice polished cylinders turned on workshop machines is studied. The varying measurements were recorded for a number of samples of a rice polished cylinder obtained from a number of trials of the turning practice. The SPC technique was adopted, the process was finally brought under control, and the process capability was improved.
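
A typical SPC computation of X-bar/R control limits for such turning measurements might look like the sketch below; the subgroup size of 5 and the data are assumed for illustration, and A2, D3, D4 are the standard Shewhart factors for that subgroup size:

```python
import numpy as np

# Hypothetical diameter measurements (mm): 20 subgroups of size 5
rng = np.random.default_rng(3)
data = rng.normal(loc=22.30, scale=0.02, size=(20, 5))

xbar = data.mean(axis=1)                     # subgroup means
ranges = data.max(axis=1) - data.min(axis=1) # subgroup ranges
xbarbar, rbar = xbar.mean(), ranges.mean()

A2, D3, D4 = 0.577, 0.0, 2.114               # Shewhart constants for n = 5
ucl_x, lcl_x = xbarbar + A2 * rbar, xbarbar - A2 * rbar
ucl_r, lcl_r = D4 * rbar, D3 * rbar

print(f"X-bar chart: LCL={lcl_x:.4f}  CL={xbarbar:.4f}  UCL={ucl_x:.4f}")
print(f"R chart:     LCL={lcl_r:.4f}  CL={rbar:.4f}  UCL={ucl_r:.4f}")
print("out-of-control subgroups:",
      np.where((xbar > ucl_x) | (xbar < lcl_x))[0].tolist())
```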

Keywords: Rice polished cylinder, statistical process control, control charts, process capability.

5467 Business Process Orientation: Case of Croatia

Authors: Ljubica Milanović Glavan

Abstract:

Because of increasing business pressures, companies must be adaptable and flexible in order to withstand them. Inadequate business processes and a low level of business process orientation, which at its core accentuates business processes as opposed to business functions and focuses on process performance and customer satisfaction, hinder the ability to adapt to a changing environment. Previous studies have shown that companies which have reached a higher business process maturity level consistently outperform those that have not. The aim of this paper is to provide a basic understanding of the business process orientation concept and the business process maturity model. The paper also presents the state of business process orientation in Croatia, as captured in a study conducted in 2013. Based on the results, some practical implications and guidelines for managers are given.

Keywords: Business process orientation, business process maturity, Croatia, maturity score.

5466 A New Divide and Conquer Software Process Model

Authors: Hina Gull, Farooque Azam, Wasi Haider Butt, Sardar Zafar Iqbal

Abstract:

A software system goes through a number of stages during its life, and a software process model gives a standard format for planning, organizing and running a project. This article presents a new software development process model, named the "Divide and Conquer Process Model", based on the idea of first dividing the work into simple pieces and then gathering them to get the whole work done. The article begins with the background of different software process models and the problems in these models. This is followed by the new divide and conquer process model, an explanation of its different stages, and finally its edge over other models.

Keywords: Process Model, Waterfall, divide and conquer, Requirements.

5465 Effect of Pectinase on the Physico-Chemical Properties of Juice from Pawpaw (Carica papaya) Fruits

Authors: Idoko J. O., Achusi N.

Abstract:

A procedure for the preparation of clarified pawpaw juice was developed. About 750 ml of pawpaw pulp was measured into two 1-litre measuring cylinders, A and B, heated to 40°C and then cooled to 20°C. 30 ml of pectinase was added to cylinder A, while 30 ml of distilled water was added to cylinder B. The enzyme-treated sample (A) was allowed to digest for 5 hours, after which it was heated to 90°C for 15 minutes to inactivate the enzyme. The heated sample was cooled and, with the aid of a muslin cloth, the pulp was filtered to obtain the clarified pawpaw juice. The juice was filled into 100 ml plastic bottles, pasteurized at 95°C for 45 minutes, cooled and stored at room temperature. The sample treated with 30 ml of distilled water underwent the same process. The freshly pasteurized samples were analyzed for specific gravity, titratable acidity, pH, sugars and ascorbic acid. The remaining samples were then stored for 2 weeks and the above analyses repeated. There were differences between the freshly pasteurized and stored samples in pH and ascorbic acid levels; in addition, the sample treated with pectinase yielded a higher volume of juice than that treated with distilled water.

Keywords: Juice, pawpaw, pectinase.

5464 Design and Characterization of a CMOS Process Sensor Utilizing Vth Extractor Circuit

Authors: Rohana Musa, Yuzman Yusoff, Chia Chieu Yin, Hanif Che Lah

Abstract:

This paper presents the design and characterization of a low power Complementary Metal Oxide Semiconductor (CMOS) process sensor. The design is targeted for implementation in Silterra's 180 nm CMOS process technology. The proposed process sensor employs a threshold voltage (Vth) extractor architecture for detecting variations in the fabrication process. The process sensor generates output voltages in the range of 401 mV (fast-fast corner) to 443 mV (slow-slow corner) at nominal conditions. The power dissipation of this process sensor is 6.3 µW at a supply voltage of 1.8 V, with a silicon area of 190 µm × 60 µm. Preliminary results from the fabricated process sensor indicate a close resemblance between test and simulated results.

Keywords: CMOS Process sensor, Process, Voltage and Temperature (PVT) sensor, threshold extractor circuit, Vth extractor circuit.

5463 A Quantitative Approach to Strategic Design of Component-Based Business Process Models

Authors: Eakong Atiptamvaree, Twittie Senivongse

Abstract:

A new paradigm for software design and development models software by its business process, translates the model into a process execution language, and has it run by a supporting execution engine. This process-oriented paradigm promotes modeling of software by less technical users or business analysts, as well as rapid development. Since business process models may be shared by different organizations, and sometimes even by different business domains, it is interesting to apply a technique used in traditional software component technology to design reusable business processes. This paper discusses an approach that applies a technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, with the aim that the process components be reusable in different process-based software models. The approach is quantitative because the quality of a process component design is measured from technical features of the process components. The approach is also strategic because the measured quality is evaluated against business-oriented component management goals. A software tool has been developed to measure how good a process component design is, according to the required managerial goals and in comparison with other designs. We also discuss how we benefit from reusable process components.

Keywords: Business process model, process component, component management goals, measurement

5462 Schema and Data Migration of a Relational Database RDB to the Extensible Markup Language XML

Authors: Alae El Alami, Mohamed Bahaj

Abstract:

This article discusses the migration of an RDB to XML documents (schema and data) based on metadata and semantic enrichment, which takes the RDB out of its flattened shape and enriches it with the object concept. The integration and exploitation of the object concept in XML uses a syntax that allows the conformity of the XML document to be verified during its creation. The information extracted from the RDB is therefore analyzed and filtered in order to fit the structure of the XML files and the associated object model. The elements implemented in the XML document through an SQL query are built dynamically. A prototype was implemented to realize automatic migration, which proves the effectiveness of this particular approach.
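
A minimal sketch of the data-migration step, serializing relational rows to XML with Python's standard library; the table, columns and nesting are invented for illustration and do not reflect the paper's mapping rules or semantic enrichment:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Tiny in-memory RDB standing in for the source database
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
con.executemany("INSERT INTO customer VALUES (?, ?, ?)",
                [(1, "Alice", "Rabat"), (2, "Omar", "Casablanca")])

# Build the XML document dynamically from an SQL query
root = ET.Element("customers")
for rid, name, city in con.execute("SELECT id, name, city FROM customer"):
    cust = ET.SubElement(root, "customer", attrib={"id": str(rid)})
    ET.SubElement(cust, "name").text = name
    ET.SubElement(cust, "city").text = city

print(ET.tostring(root, encoding="unicode"))
```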

Keywords: RDB, XML, DTD, semantic enrichment.

5461 Measuring Process Component Design on Achieving Managerial Goals

Authors: Eakong Atiptamvaree, Twittie Senivongse

Abstract:

Process-oriented software development is a new software development paradigm in which a software design is modeled by a business process, which is in turn translated into a process execution language for execution. The building blocks of this paradigm are software units that are composed together to work according to the flow of the business process. This new paradigm still exhibits the characteristics of applications built with traditional software component technology. This paper discusses an approach that applies a traditional technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, and these process components can be reused to design the business processes of other application domains. The decomposition considers five managerial goals, namely cost effectiveness, ease of assembly, customization, reusability, and maintainability. The paper presents how to design, or decompose, process components from a business process model and how to measure technical features of the design that would affect the managerial goals. A comparison between the measured values of different designs indicates which process component design is more appropriate for the managerial goals that have been set. The proposed approach can be applied in a Web Services environment, which accommodates process-oriented software development.

Keywords: Business Process Model, Managerial Goals, Process Component.

5460 Propagation of Nonlinear Surface Waves in Relativistically Degenerate Quantum Plasma Half-Space

Authors: Swarniv Chandra, Parthasona Maji, Basudev Ghosh

Abstract:

The nonlinear self-interaction of an electrostatic surface wave on a semi-bounded quantum plasma with relativistic degeneracy is investigated by using the quantum hydrodynamic (QHD) model and Poisson's equation with appropriate boundary conditions. It is shown that a part of the second harmonic generated through self-interaction does not have a true surface wave character but propagates obliquely away from the plasma-vacuum interface into the bulk of the plasma.

Keywords: Harmonic Generation, Quantum Plasma, Quantum Hydrodynamic Model, Relativistic Degeneracy, Surface waves.

5459 High Speed Video Transmission for Telemedicine using ATM Technology

Authors: J. P. Dubois, H. M. Chiu

Abstract:

In this paper, we study the statistical multiplexing of VBR video in ATM networks. ATM promises to provide high speed, real-time, multi-point to central video transmission for telemedicine applications in rural hospitals and in emergency medical services. Video coders are known to produce variable bit rate (VBR) signals, and the effects of aggregating these VBR signals need to be determined in order to design a telemedicine network infrastructure capable of carrying them. We first model the VBR video signal and simulate it using a generic continuous-data autoregressive (AR) scheme. We carry out the queueing analysis with the Fluid Approximation Model (FAM) and the Markov Modulated Poisson Process (MMPP). The study has shown a trade-off: multiplexing VBR signals reduces burstiness and improves resource utilization; however, the buffer size needs to be increased, with an associated economic cost. We also show that the MMPP model and the fluid approximation model fit best, respectively, the cell region and the burst region. Therefore, a hybrid of MMPP and FAM completely characterizes the overall performance of the ATM statistical multiplexer. The ramifications of this technology are clear: speed, reliability (lower loss rate and jitter), and increased capacity in video transmission for telemedicine. With migration to full IP-based networks still a long way from achieving both high speed and high quality of service, the proposed ATM architecture will remain of significant use for telemedicine.
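
A toy version of the source-plus-multiplexer experiment: an AR(1) VBR bit-rate model feeding a simple fluid buffer. The AR coefficients, number of sources, link rate and buffer size below are invented placeholders, not the paper's fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(42)

def ar1_vbr_trace(frames, mean_rate, a=0.9, noise_std=0.2):
    """AR(1) model of a VBR video bit-rate trace (Mbps per frame interval)."""
    r = np.empty(frames)
    r[0] = mean_rate
    for t in range(1, frames):
        r[t] = mean_rate + a * (r[t - 1] - mean_rate) + rng.normal(0, noise_std)
    return np.clip(r, 0, None)

n_sources, frames = 8, 5000
aggregate = sum(ar1_vbr_trace(frames, mean_rate=1.5) for _ in range(n_sources))

link_rate, buffer_size = 13.0, 4.0        # Mbps and Mbit (placeholders)
backlog, lost = 0.0, 0.0
for rate in aggregate:                    # fluid buffer, one step per frame interval
    backlog = max(0.0, backlog + rate - link_rate)
    if backlog > buffer_size:
        lost += backlog - buffer_size
        backlog = buffer_size

print("mean aggregate rate (Mbps):", round(aggregate.mean(), 2))
print("loss ratio:", lost / aggregate.sum())
```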

Keywords: ATM, multiplexing, queueing, telemedicine, VBR.

5458 The Multi-objective Optimization for the SLS Process Parameters Based on Analytic Hierarchy Process

Authors: Yang Laixia, Deng Jun, Li Dichen, Bai Yang

Abstract:

The forming process parameters of Selective Laser Sintering (SLS) directly affect forming efficiency and forming quality. Therefore, determining reasonable process parameters is particularly important. In this paper, the weight of each quality and efficiency target is first calculated with the Analytic Hierarchy Process. Then the value of each target is measured by orthogonal experiments. Finally, the weighted sum of the targets is compared across the process parameter groups to obtain the optimal forming process parameters.
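
A compact sketch of the AHP weighting step: the weights are taken from the principal eigenvector of a pairwise comparison matrix, with a consistency check. The matrix entries and target names are illustrative judgments, not the paper's data:

```python
import numpy as np

# Hypothetical pairwise comparisons of three targets:
# dimensional accuracy, surface roughness, build time
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (RI = 0.58 is the standard random index for n = 3)
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
cr = ci / 0.58
print("weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))

# Each parameter group's overall score is then the weighted sum of its
# measured (normalised) target values; the best-scoring group is selected.
```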

Keywords: Analytic Hierarchy Process, Multi-objective optimization, Orthogonal test, Selective Laser Sintering

5457 The Importance of 3D Mesh Generation for Large Eddy Simulation of Gas-Solid Turbulent Flows in Fluidized Beds

Authors: G. González-Silva, E. M. Matos, W. P. Martignoni, M. Mori

Abstract:

The objective of this work is to present a procedure for mesh generation in a fluidized bed for large eddy simulation (LES) with a filtered two-fluid model. The experimental data were obtained by [1] in a laboratory fluidized bed. The results show that it is possible to use a mesh with fewer cells than required by a RANS turbulence model with the kinetic theory of granular flow (KTGF). Moreover, the numerical results agree with the experimental data near the wall of the bed, which cannot be predicted by the RANS model.

Keywords: LES, Mesh, Gas-Solid, Fluidized bed

5456 Software Engineering Interoperable Environment for University Process Workflow and Document Management

Authors: Bekim Fetaji, Majlinda Fetaji, Mirlinda Ebibi

Abstract:

The research focused on the design, development and evaluation of a sustainable web-based network system to be used as an interoperable environment for university process workflow and document management. In this manner, most of the process workflows in universities can be realized entirely electronically, promoting an integrated university. Defining the most frequently used university process workflows made it possible to create electronic workflows and execute them on standard workflow execution engines. Defining or reengineering the workflows increased work efficiency and helped standardize processes across different faculties. The concept, the process definitions, and the solution applied as a case study are evaluated, and the findings are reported.

Keywords: Design process workflows, workflow and document management, business process, software engineering.

5455 Adaptive Sliding Mode Observer for a Class of Systems

Authors: D.Elleuch, T.Damak

Abstract:

In this paper, the performance of two adaptive observers applied to interconnected systems is studied. The nonlinearity of the systems can be written in a fractional form. The first adaptive observer is an adaptive sliding mode observer for a Lipschitz nonlinear system, and the second is an adaptive sliding mode observer having a filtered error as the sliding surface. After comparing their performance on the inverted pendulum mounted on a cart, it is shown that the second one is more robust for estimating the state.

Keywords: Adaptive observer, Lipschitz system, interconnected fractional nonlinear system, sliding mode.

5454 Control-flow Complexity Measurement of Processes and Weyuker's Properties

Authors: Jorge Cardoso

Abstract:

Process measurement is the task of empirically and objectively assigning numbers to the properties of business processes in such a way as to describe them. Desirable attributes to study and measure include complexity, cost, maintainability, and reliability. In our work we will focus on investigating process complexity. We define process complexity as the degree to which a business process is difficult to analyze, understand or explain. One way to analyze a process's complexity is to use a process control-flow complexity measure. In this paper, an attempt has been made to evaluate the control-flow complexity measure in terms of Weyuker's properties. Weyuker's properties must be satisfied by any complexity measure to qualify as a good and comprehensive one.
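
The control-flow complexity measure discussed here is commonly computed as a sum over the split nodes of the process model; a small sketch under that standard definition (the example process is invented):

```python
def control_flow_complexity(splits):
    """Control-flow complexity (CFC) of a process model.

    `splits` is a list of (split_type, fan_out) pairs, one per split node:
      XOR split -> contributes fan_out          (exactly one branch is chosen)
      OR  split -> contributes 2**fan_out - 1   (any non-empty subset of branches)
      AND split -> contributes 1                (all branches are taken)
    """
    total = 0
    for kind, fan_out in splits:
        if kind == "XOR":
            total += fan_out
        elif kind == "OR":
            total += 2 ** fan_out - 1
        elif kind == "AND":
            total += 1
        else:
            raise ValueError(f"unknown split type: {kind}")
    return total

# Hypothetical process with one XOR split (3 branches), one OR split (2 branches)
# and one AND split (2 branches): CFC = 3 + 3 + 1 = 7
print(control_flow_complexity([("XOR", 3), ("OR", 2), ("AND", 2)]))
```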

Keywords: Business process measurement, workflow, complexity.

5453 Optimal Performance of Plastic Extrusion Process Using Fuzzy Goal Programming

Authors: Abbas Al-Refaie

Abstract:

This study optimized the performance of the plastic extrusion process for drip irrigation pipes using fuzzy goal programming. Two main responses were of interest: roll thickness and hardness. Four main process factors were studied. The L18 array was used for the experimental design. Individual-moving range control charts were used to assess the stability of the process, while the process capability index was used to assess process performance. Confirmation experiments were conducted at the combination of optimal factor settings obtained by fuzzy goal programming. The results revealed that the process capability was improved significantly, from -1.129 to 0.8148 for roll thickness and from 0.0965 to 0.714 for hardness. Such improvements result in considerable savings in production and quality costs.

Keywords: Fuzzy goal programming, extrusion process, process capability, irrigation plastic pipes.
