Search results for: process selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16843

16423 A Design of the Organic Rankine Cycle for the Low Temperature Waste Heat

Authors: K. Fraňa, M. Müller

Abstract:

This paper presents the design of an Organic Rankine Cycle (ORC) with heat regeneration and super-heating processes. The maximum temperature level in the ORC is considered to be 110°C, and the maximum pressure varies up to 2.5 MPa. The selection of appropriate working fluids and the thermal design and calculation of the cycle and its components are described. With respect to safety, toxicity, flammability, price and thermal cycle efficiency, the working fluid selected is R134a. As a particular example, the thermal design of the condenser used for an ORC engine with a theoretical thermal power of 179 kW is introduced. The minimal heat transfer area for complete condensation was determined to be approximately 520 m².
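
The condenser sizing described above follows the standard relation A = Q / (U · ΔT_lm). A minimal sketch is given below; the overall heat transfer coefficient U and the terminal temperature differences are illustrative assumptions, not values taken from the paper.

```python
from math import log

def lmtd(dt_in: float, dt_out: float) -> float:
    """Log-mean temperature difference between the streams at the two ends."""
    if abs(dt_in - dt_out) < 1e-9:
        return dt_in
    return (dt_in - dt_out) / log(dt_in / dt_out)

def condenser_area(q_w: float, u: float, dt_in: float, dt_out: float) -> float:
    """Required heat transfer area from A = Q / (U * LMTD)."""
    return q_w / (u * lmtd(dt_in, dt_out))

# 179 kW thermal power as in the paper; U and the temperature
# differences below are assumed for illustration only.
area_m2 = condenser_area(q_w=179e3, u=70.0, dt_in=8.0, dt_out=3.0)
```

With these assumed values the required area lands in the same order of magnitude as the roughly 520 m² reported, which is plausible given the small temperature differences available in a low-grade waste heat cycle.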

Keywords: organic Rankine cycle, thermal efficiency, working fluids, environmental engineering

Procedia PDF Downloads 454
16422 Bridging the Gap between Different Interfaces for Business Process Modeling

Authors: Katalina Grigorova, Kaloyan Mironov

Abstract:

The paper focuses on the benefits of business process modeling. Although the discipline has been developing for many years, there is still a need to create new opportunities to meet ever-increasing user needs. Because one of these needs is the conversion of business process models from one standard to another, the authors have developed a converter between the BPMN and EPC standards that uses workflow patterns as an intermediate representation. Nowadays, there are many systems for business process modeling, and the variety of output formats is almost as large as the number of systems themselves. This diversity additionally hampers the conversion of models. The presented study discusses the problems caused by differences in the output formats of various modeling environments.
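
The idea of using workflow patterns as an intermediate representation can be sketched as two mapping tables: BPMN elements are first lifted to patterns, which are then lowered to EPC elements. The element names below are simplified illustrations; the real BPMN and EPC vocabularies, and the converter itself, are much richer.

```python
# Simplified, illustrative vocabularies only.
BPMN_TO_PATTERN = {
    "task": "activity",
    "parallelGateway": "AND-split/join",
    "exclusiveGateway": "XOR-split/join",
}
PATTERN_TO_EPC = {
    "activity": "function",
    "AND-split/join": "AND connector",
    "XOR-split/join": "XOR connector",
}

def bpmn_to_epc(elements):
    """Convert BPMN element types to EPC types via workflow patterns."""
    return [PATTERN_TO_EPC[BPMN_TO_PATTERN[e]] for e in elements]
```

Routing the conversion through a neutral pattern layer is what lets one converter serve several modeling notations instead of one pairwise translator per format.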

Keywords: business process modeling, business process modeling standards, workflow patterns, converting models

Procedia PDF Downloads 578
16421 Optimization of Element Type for FE Model and Verification of Analyses with Physical Tests

Authors: Mustafa Tufekci, Caner Guven

Abstract:

In the automotive industry, sliding door systems, which also serve as body closures, are safety-relevant components. Extreme product tests are carried out to prevent failures during the design process, but performing these tests experimentally results in high costs. Finite element analysis is an effective tool for the design process: analyses are performed before a prototype is produced in order to validate the design against customer requirements, saving a substantial amount of time and cost. A finite element model is created for geometries designed in 3D CAD programs. Different element types, such as bar, shell and solid elements, can be used to create the mesh model. A cheaper model can be created through the selection of element type, but the combination of element types used in the model, the number and geometry of elements, and the degrees of freedom all affect the analysis results. The sliding door system is a good example for applying these methods. Structural analyses of the sliding door mechanism were performed using FE models, and physical tests with the same boundary conditions as the FE models were carried out. A comparison study of the element types was conducted with respect to the test and analysis results, and the optimum combination was identified.

Keywords: finite element analysis, sliding door mechanism, element type, structural analysis

Procedia PDF Downloads 324
16420 Fault Diagnosis by Thresholding and Decision Tree with a Neuro-Fuzzy System

Authors: Y. Kourd, D. Lefebvre

Abstract:

The monitoring of industrial processes is required to ensure the operating condition of industrial systems through automatic detection and isolation of faults. This paper proposes a fault diagnosis method based on a hybrid neuro-fuzzy structure that combines threshold selection and a decision tree. The method is validated on the DAMADICS benchmark. In the first phase, a model representing the normal state of the system is constructed for fault detection. Fault signatures are obtained through residual analysis and the selection of appropriate thresholds; these signatures yield groups of non-separable faults. In the second phase, faulty models are built for the faults that cannot be isolated in the first phase. In the final phase, a decision tree that isolates these faults is constructed.
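
The first phase, thresholding residuals into fault signatures, can be sketched as follows. The residual values, thresholds and signature table are hypothetical, not taken from the DAMADICS benchmark.

```python
def fault_signature(residuals, thresholds):
    """Binary signature: 1 where the residual magnitude exceeds its threshold."""
    return tuple(int(abs(r) > t) for r, t in zip(residuals, thresholds))

# Hypothetical signature table; non-separable faults share a signature
# and must be handed to the decision tree of the later phases.
SIGNATURES = {
    (1, 0, 0): "fault F1",
    (0, 1, 1): "fault F2 or F3 (non-separable)",
}

def diagnose(residuals, thresholds):
    """Map a residual vector to a fault label via its signature."""
    sig = fault_signature(residuals, thresholds)
    return SIGNATURES.get(sig, "no fault / unknown signature")
```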

Keywords: decision tree, residuals analysis, ANFIS, fault diagnosis

Procedia PDF Downloads 623
16419 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data

Authors: K. Sathishkumar, V. Thiagarasu

Abstract:

Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world take advantage of this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians in understanding pathophysiological mechanisms, in diagnosis and prognosis, and in choosing treatment plans. DNA microarray technology has made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which are essential in the data mining process for revealing natural structures and identifying interesting patterns in the underlying data. This work presents an analysis of several clustering algorithms proposed to deal with gene expression data effectively. Existing algorithms such as Support Vector Machines (SVM), the K-means algorithm and evolutionary algorithms are analyzed thoroughly to identify their advantages and limitations, and their performance is evaluated to determine the best approach. In order to improve the classification performance of the best approach in terms of accuracy, convergence behavior and processing time, a hybrid clustering-based optimization approach is proposed.
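
As an illustration of the clustering step, a minimal K-means over expression profiles might look like this. It is a toy sketch with made-up data, not the hybrid optimization approach the paper proposes.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain K-means: alternate nearest-center assignment and mean update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels, centers

# Toy "expression profiles": two clearly separated groups of samples.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
labels, _ = kmeans(X, k=2)
```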

Keywords: microarray technology, gene expression data, clustering, gene selection

Procedia PDF Downloads 321
16418 Laboratory Investigation of Alkali-Surfactant-Alternate Gas (ASAG) Injection – a Novel EOR Process for a Light Oil Sandstone Reservoir

Authors: Vidit Mohan, Ashwin P. Ramesh, Anirudh Toshniwal

Abstract:

Alkali-Surfactant-Alternate-Gas (ASAG) injection, a novel EOR process, has the potential to improve displacement efficiency over Surfactant-Alternate-Gas (SAG) injection by addressing the problem of surfactant adsorption by clay minerals in the rock matrix. A detailed laboratory investigation of the ASAG injection process was carried out with encouraging results. To enhance recovery over the WAG injection process, SAG injection was first investigated at laboratory scale but yielded only marginal incremental displacement efficiency over WAG. On investigation, it was found that clay minerals in the rock matrix adsorbed the surfactants and were detrimental to the SAG process. Hence, ASAG injection was conceptualized, using alkali as a clay stabilizer. An ASAG injection experiment with a surfactant concentration of 5000 ppm and an alkali concentration of 0.5 weight% yielded an incremental displacement efficiency of 5.42% over the WAG process. ASAG injection is a new process and has the potential to enhance the efficiency of the WAG/SAG injection processes.

Keywords: alkali surfactant alternate gas (ASAG), surfactant alternate gas (SAG), laboratory investigation, EOR process

Procedia PDF Downloads 476
16417 The Acquisition of Case in Biological Domain Based on Text Mining

Authors: Shen Jian, Hu Jie, Qi Jin, Liu Wei Jie, Chen Ji Yi, Peng Ying Hong

Abstract:

To address the problem of acquiring cases in the biological domain related to design problems, a biological instance acquisition method based on text mining is presented. Through the construction of a corpus text vector space and knowledge mining, feature selection, similarity measurement and case retrieval methods for texts in the field of biology are studied. First, we establish a vector space model of the corpus in the biological field and complete the preprocessing steps. Then, the corpus is queried using the vector space model combined with functional keywords to obtain biological domain examples related to the design problems. Finally, we verify the validity of the method with a text example.
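
The retrieval step rests on the classic TF-IDF vector space model with cosine similarity, which can be sketched in a few lines. The corpus below is a toy illustration; the paper's actual preprocessing and keyword handling are more involved.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """One sparse TF-IDF vector (dict) per whitespace-tokenized document."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d.split()))
    return [{t: c * math.log(n / df[t]) for t, c in Counter(d.split()).items()}
            for d in docs]

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = ["gecko foot adhesion surface",
        "lotus leaf surface self cleaning",
        "gear train transmission design"]
vecs = tfidf_vectors(docs)
```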

Keywords: text mining, vector space model, feature selection, biologically inspired design

Procedia PDF Downloads 258
16416 An Evaluation on the Methodology of Manufacturing High Performance Organophilic Clay at the Most Efficient and Cost Effective Process

Authors: Siti Nur Izati Azmi, Zatil Afifah Omar, Kathi Swaran, Navin Kumar

Abstract:

Organophilic clays, also known as organoclays, are used as viscosifiers in oil-based drilling fluids. Most often, organophilic clays are produced from modified sodium- and calcium-based bentonite. Many studies and data show that organophilic clays based on hectorite provide the best yield and good fluid loss properties in an oil-based drilling fluid, at a higher cost. In terms of manufacturing, the two common methods of producing organophilic clays are a wet process and a dry process. The wet process is known to produce a better-performing product at a higher cost, while the dry process shortens the production time. Hence, the purpose of this study is to evaluate various organophilic clay formulations and their performance versus cost, and to determine the most efficient and cost-effective method of manufacturing organophilic clays.

Keywords: organophilic clay, viscosifier, wet process, dry process

Procedia PDF Downloads 222
16415 Decomposition of the Discount Function Into Impatience and Uncertainty Aversion. How Neurofinance Can Help to Understand Behavioral Anomalies

Authors: Roberta Martino, Viviana Ventre

Abstract:

Intertemporal choices are choices under conditions of uncertainty in which the consequences are distributed over time. The Discounted Utility Model is the essential reference for describing the individual in the context of intertemporal choice. The model is based on the idea that the individual selects the alternative with the highest utility, calculated by multiplying the cardinal utility of the outcome, as if its reception were instantaneous, by a discount function that decreases the utility value according to how far the actual reception of the outcome lies from the moment the choice is made. Initially, the discount function was assumed to have an exponential form, whose decrease over time is constant, in line with the profile of the rational investor described by classical economics. Empirical evidence instead called for the formulation of alternative, hyperbolic models that better represent the actual behavior of investors. Attitudes that do not comply with the principles of classical rationality are termed anomalous, i.e., difficult to rationalize and describe through normative models. The development of behavioral finance, which describes investor behavior through cognitive psychology, has shown that deviations from rationality are due to the bounded rationality of human beings: when a choice is made in a very difficult and information-rich environment, the brain strikes a compromise between the cognitive effort required and the selection of an alternative. Moreover, the evaluation and selection of alternatives and the collection and processing of information are conditioned by systematic distortions of the decision-making process, the behavioral biases involving the individual's emotional and cognitive systems. In this paper, we present an original decomposition of the discount function to investigate the psychological principles of hyperbolic discounting. The curve can be decomposed into two components: the first is responsible for the decrease in the value of the outcome as time increases and is related to the individual's impatience; the second relates to the change in the direction of the tangent vector to the curve and indicates how strongly the individual perceives the indeterminacy of the future, i.e., his or her aversion to uncertainty. This decomposition allows interesting conclusions to be drawn with respect to the concept of impatience and the emotional drives involved in decision-making. The contribution that neuroscience can make to decision theory and intertemporal choice theory is vast, as it allows the decision-making process to be described as the interplay between the individual's emotional and cognitive factors. Neurofinance is a discipline that uses a multidisciplinary approach to investigate how the brain influences decision-making. Indeed, considering that the decision-making process is linked to the activity of the prefrontal cortex and the amygdala, neurofinance can help determine the extent to which anomalous attitudes respect the principles of rationality.
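
The contrast between exponential and hyperbolic discounting can be made concrete: under exponential discounting the one-period implied discount rate (a measure of impatience) is constant, while under hyperbolic discounting it declines with delay. A small numerical check, using standard textbook forms with illustrative parameter values:

```python
import math

def exponential(t, r=0.10):
    """Exponential discount function: constant decrease rate over time."""
    return math.exp(-r * t)

def hyperbolic(t, k=0.10):
    """Simple hyperbolic discount function 1 / (1 + k t)."""
    return 1.0 / (1.0 + k * t)

def implied_rate(f, t):
    """One-period implied discount rate between t and t + 1."""
    return math.log(f(t)) - math.log(f(t + 1))
```

The declining implied rate of the hyperbolic curve is what the impatience component of the decomposition captures, while the changing curvature relates to the uncertainty-aversion component.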

Keywords: impatience, intertemporal choice, neurofinance, rationality, uncertainty

Procedia PDF Downloads 126
16414 Ionometallurgy for Recycling Silver in Silicon Solar Panel

Authors: Emmanuel Billy

Abstract:

This work is part of the CABRISS project (H2020), which aims at developing innovative, cost-effective methods for the extraction of materials from the different sources of PV waste: Si-based panels, thin-film panels and water-diluted Si slurries. Aluminum, silicon, indium and silver will in particular be extracted from these wastes in order to constitute a materials feedstock that can later be used in a closed-loop process. The extraction of metals from silicon solar cells is often an energy-intensive process: it requires either smelting or leaching at elevated temperature, or the use of large quantities of strong acids or bases that require energy to produce. The energy input equates to a significant cost and an associated CO2 footprint, both of which it would be desirable to reduce, so there is a need to develop more energy-efficient and environmentally compatible processes. 'Ionometallurgy' could offer such a set of environmentally benign processes for metallurgy. This work demonstrates that ionic liquids provide one such method, since they can be used to dissolve and recover silver; the overall process combines leaching, recovery and the possibility of reusing the solution in a closed loop. This study evaluates and compares different ionic liquids for leaching and recovering silver. An electrochemical analysis is first carried out to define the best system for Ag dissolution, and the effects of temperature, concentration and oxidizing agent are evaluated by this approach. Further, a comparative study of leaching efficiency between conventional media (nitric acid, thiourea) and the ionic liquids (Cu and Al) is conducted. Particular attention has been paid to the selection of the ionic liquids: electrolytes composed of chelating anions (Cl⁻, Br⁻, I⁻) are used to facilitate lixiviation and to avoid the solubility problems of metallic species and of classical additional ligands. This approach reduces the cost of the process and facilitates the reuse of the leaching medium. To identify the most suitable ionic liquids, electrochemical experiments were carried out to evaluate the oxidation potential of the silver included in the crystalline solar cells. Chemical dissolution of metals from crystalline solar cells was then performed for the most promising ionic liquids. After chemical dissolution, electrodeposition was performed to recover the silver in metallic form.

Keywords: electrodeposition, ionometallurgy, leaching, recycling, silver

Procedia PDF Downloads 244
16413 Exploring the Importance of Different Product Cues on the Selection for Chocolate from the Consumer Perspective

Authors: Ezeni Brzovska, Durdana Ozretic-Dosen

Abstract:

The purpose of this paper is to deepen the understanding of the product cues that influence the purchase decision for a specific product category, chocolate, and to identify demographic differences in buying behavior. ANOVA was employed to analyze the significance level of nine product cues, and the survey showed statistically significant differences among age and gender groups and between respondents with different levels of education. From the theoretical perspective, the study adds to existing knowledge by contributing research results from a new environment (Southeast Europe, Macedonia), which has been neglected so far. Establishing the level of significance of the product cues that affect buying behavior in the chocolate consumption context can help managers improve marketing decision-making and better meet consumer needs by identifying opportunities for packaging innovations and/or personalization toward different target groups.
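
The significance testing mentioned above is a one-way ANOVA, whose F statistic can be computed directly. The toy data below stands in for ratings of a single cue across hypothetical demographic groups; the paper's data and design are of course richer.

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group vs. within-group variance."""
    all_x = [x for g in groups for x in g]
    n, k = len(all_x), len(groups)
    grand = sum(all_x) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F relative to the F distribution's critical value for (k-1, n-k) degrees of freedom indicates that the groups rate the cue differently.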

Keywords: chocolate consumption context, chocolate selection, demographic characteristics, product cues

Procedia PDF Downloads 247
16412 Integrated Design in Additive Manufacturing Based on Design for Manufacturing

Authors: E. Asadollahi-Yazdi, J. Gardan, P. Lafon

Abstract:

Nowadays, manufacturers face the production of different versions of products due to quality, cost and time constraints. On the other hand, Additive Manufacturing (AM), as a production method based on a CAD model, disrupts the design and manufacturing cycle with new parameters. To address these issues, researchers have applied the Design for Manufacturing (DFM) approach to AM, but until now there has been no integrated approach for the design and manufacture of products through AM. This paper therefore aims to provide a general methodology for managing the different production issues as well as supporting interoperability between the AM process and different Product Life Cycle Management tools. The problem is that the models of Systems Engineering used for managing complex systems cannot support product evolution and its impact on the product life cycle. It therefore seems necessary to provide a general methodology for managing the product diversity created by using AM. This methodology must consider manufacturing and assembly during product design as early as possible in the design stage. The latest DFM approach, as a methodology for analyzing the system comprehensively, integrates manufacturing constraints upstream in the numerical model. DFM for AM is thus used to import the characteristics of AM into the design and manufacturing process of a hybrid product and to manage the criteria coming from AM. The research also presents an integrated design method that takes into account the knowledge of layer manufacturing technologies. For this purpose, an interface model based on the skin and skeleton concepts is provided: usage and manufacturing skins represent the functional surfaces of the product, while the material flow and the links between the skins are represented by usage and manufacturing skeletons. This integrated approach is therefore a helpful methodology for designers and manufacturers in decisions such as material and process selection, as well as in the evaluation of product manufacturability.

Keywords: additive manufacturing, 3D printing, design for manufacturing, integrated design, interoperability

Procedia PDF Downloads 312
16411 Supply Chain Risk Management (SCRM): A Simplified Alternative for Implementing SCRM for Small and Medium Enterprises

Authors: Paul W. Murray, Marco Barajas

Abstract:

Recent changes in supply chains, especially globalization and collaboration, have created new risks for enterprises of all sizes. A variety of complex frameworks, often based on enterprise risk management strategies, have been presented under the heading of Supply Chain Risk Management (SCRM). The literature promotes the benefits of a robust SCRM strategy; however, implementing SCRM is difficult and resource-demanding for Large Enterprises (LEs), and essentially out of reach for Small and Medium Enterprises (SMEs). This research debunks the idea that SCRM is necessary for all enterprises and instead proposes a simple and effective Vendor Selection Template (VST). Empirical testing and a survey of supply chain practitioners provide a measure of validation for the VST. The resulting VST is a valuable contribution because it is easy to use, provides practical results, and is sufficiently flexible to be universally applied to SMEs.
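
A vendor selection template of this kind typically reduces to a weighted scoring sheet. The criteria and weights below are hypothetical placeholders, not the fields of the paper's actual VST:

```python
# Hypothetical criteria and weights; a real template would calibrate
# these against practitioner input, e.g. via a survey or regression.
WEIGHTS = {"quality": 0.4, "delivery": 0.3, "price": 0.2, "financial": 0.1}

def vendor_score(ratings):
    """Weighted sum of 1-5 criterion ratings; higher means lower supply risk."""
    return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)

def rank_vendors(vendors):
    """Sort (name, ratings) pairs from best to worst score."""
    return sorted(vendors, key=lambda v: vendor_score(v[1]), reverse=True)
```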

Keywords: multiple regression analysis, supply chain management, risk assessment, vendor selection

Procedia PDF Downloads 461
16410 Optical Variability of Faint Quasars

Authors: Kassa Endalamaw Rewnu

Abstract:

The variability properties of a quasar sample, spectroscopically complete to magnitude J = 22.0, are investigated on a time baseline of 2 years using three different photometric bands (U, J and F). The original sample was obtained using a combination of different selection criteria: colors, slitless spectroscopy and variability, based on a time baseline of 1 yr. The main goals of this work are two-fold: first, to derive the percentage of variable quasars on a relatively short time baseline; secondly, to search for new quasar candidates missed by the other selection criteria and thus to estimate the completeness of the spectroscopic sample. In order to achieve these goals, we have extracted all the candidate variable objects from a sample of about 1800 stellar or quasi-stellar objects with limiting magnitude J = 22.50 over an area of about 0.50 deg². We find that > 65% of all the objects selected as possible variables are either confirmed quasars or quasar candidates on the basis of their colors. This percentage increases even further if we exclude from our lists of variable candidates a number of objects equal to that expected on the basis of 'contamination' induced by our photometric errors. The percentage of variable quasars in the spectroscopic sample is also high, reaching about 50%. On the basis of these results, we can estimate that the incompleteness of the original spectroscopic sample is < 12%. We conclude that variability analysis of data with small photometric errors can be successfully used as an efficient and independent (or at least auxiliary) selection method in quasar surveys, even when the time baseline is relatively short. Finally, when corrected for the different intrinsic time lags corresponding to a fixed observed time baseline, our data do not show a statistically significant correlation between variability and either absolute luminosity or redshift.
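
The core of such a variability selection is a comparison of each object's observed magnitude scatter with the scatter expected from photometric errors alone. A minimal sketch follows; the threshold and error model are illustrative, not the survey's actual criterion.

```python
import statistics

def is_variable(mags, sigma_phot, n_sigma=3.0):
    """Flag an object whose magnitude RMS exceeds n_sigma times the
    typical photometric error, i.e., scatter unlikely to be noise alone."""
    return statistics.pstdev(mags) > n_sigma * sigma_phot

def select_variables(catalog, sigma_phot):
    """Return the IDs of candidate variables from {id: [magnitudes]}."""
    return sorted(i for i, m in catalog.items() if is_variable(m, sigma_phot))
```

Note how the criterion degrades as sigma_phot grows: with large photometric errors, genuine low-amplitude variables fall below the threshold, which is why small-error data make variability an efficient selection method.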

Keywords: nuclear activity, galaxies, active quasars, variability

Procedia PDF Downloads 75
16409 Metal-Oxide-Semiconductor-Only Process Corner Monitoring Circuit

Authors: Davit Mirzoyan, Ararat Khachatryan

Abstract:

A process corner monitoring circuit (PCMC) is presented in this work. The circuit generates a signal whose logical value depends on the process corner only. The signal can be used in both digital and analog circuits for testing and for compensation of process variations (PV). The presented circuit uses only metal-oxide-semiconductor (MOS) transistors, which allows increased detection accuracy and decreased power consumption and area. Due to its simplicity, the presented circuit can easily be modified to monitor parametric variations of only n-type or p-type MOS (NMOS and PMOS, respectively) transistors, resistors, as well as their combinations. Post-layout simulation results prove the correct functionality of the proposed circuit, i.e., its ability to monitor the process corner (equivalently, die-to-die variations) even in the presence of within-die variations.

Keywords: detection, monitoring, process corner, process variation

Procedia PDF Downloads 518
16408 A Novel Epitope Prediction for Vaccine Designing against Ebola Viral Envelope Proteins

Authors: Manju Kanu, Subrata Sinha, Surabhi Johari

Abstract:

Ebola viruses are among the best-studied viruses; however, no effective prevention against EBOV has been developed. Epitope-based vaccines provide a new strategy for the prophylactic and therapeutic application of pathogen-specific immunity. A critical requirement of this strategy is the identification and selection of T-cell epitopes that act as vaccine targets. This study describes current methodologies for this selection process, with Ebola virus as a model system; a great challenge in the field of Ebola virus research is to design a universal vaccine. A combination of publicly available bioinformatics algorithms and computational tools is used to screen and select antigen sequences as potential T-cell epitopes for supertype Human Leukocyte Antigen (HLA) alleles. The MUSCLE and MOTIF tools were used to find the most conserved peptide sequences of the viral proteins, and immunoinformatics tools were used to predict immunogenic peptides of the viral proteins in Zaire strains of Ebola virus. Putative epitopes for the viral proteins (VP) were predicted from their conserved peptide sequences. Three tools (NetCTL 1.2, BIMAS and SYFPEITHI) were used to predict putative Class I epitopes, while three tools (ProPred, IEDB-SMM-align and NetMHCII 2.2) were used to predict putative Class II epitopes. B-cell epitopes were predicted with BCPREDS 1.0. Immunogenic peptides were identified and selected manually from the putative epitopes predicted by the individual tools for both MHC classes. Finally, the sequences of the predicted peptides for both MHC classes were examined for a common region, which was selected as the common immunogenic peptide. The immunogenic peptides found for the viral proteins of Ebola virus were the epitopes FLESGAVKY and SSLAKHGEY. These predicted peptides could be promising candidates as targets for vaccine design.
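
The manual step of keeping only peptides that several tools agree on amounts to an intersection over per-tool prediction sets. The tool names below match those cited above, but the peptide lists are invented for illustration (except FLESGAVKY, one of the reported epitopes).

```python
def consensus_epitopes(predictions, min_tools=2):
    """Peptides reported by at least `min_tools` of the prediction tools."""
    counts = {}
    for peptides in predictions.values():
        for p in set(peptides):
            counts[p] = counts.get(p, 0) + 1
    return sorted(p for p, c in counts.items() if c >= min_tools)

# Illustrative per-tool output; real tool output includes binding
# scores/affinities rather than bare peptide lists.
class_i = {
    "NetCTL": ["FLESGAVKY", "AAALLWWFF"],
    "BIMAS": ["FLESGAVKY"],
    "SYFPEITHI": ["KKKPLWAAY"],
}
```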

Keywords: epitope, B cell, immunogenicity, Ebola

Procedia PDF Downloads 308
16407 GIS Pavement Maintenance Selection Strategy

Authors: Mekdelawit Teferi Alamirew

Abstract:

As a practical tool, Geographical Information Systems (GIS) are used for data integration, collection, management, analysis and output presentation in pavement management systems. There are many GIS techniques that improve maintenance activities, such as dynamic segmentation and weighted overlay analysis, which supports a multi-criteria decision-making process. The results indicated that the developed MPI model works sufficiently well and yields adequate output for providing accurate decisions when multiple criteria are considered in prioritizing pavement sections for maintenance. Because GIS maps can express the position, extent and severity of pavement distress features more effectively than manual approaches, the paper also offers digitized distress maps that can help agencies in their decision-making processes.
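
Weighted overlay analysis reduces, per raster cell, to a weighted sum of normalized criterion layers. A small sketch with two hypothetical 2x2 layers (e.g., distress severity and traffic), with weights summing to 1:

```python
def weighted_overlay(layers, weights):
    """Cell-wise weighted sum of equally sized raster layers (lists of rows)."""
    rows, cols = len(layers[0]), len(layers[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for layer, w in zip(layers, weights):
        for i in range(rows):
            for j in range(cols):
                out[i][j] += w * layer[i][j]
    return out

severity = [[1.0, 0.2], [0.6, 0.0]]   # hypothetical normalized distress
traffic = [[0.5, 0.9], [0.1, 0.3]]    # hypothetical normalized traffic
priority = weighted_overlay([severity, traffic], [0.7, 0.3])
```

Cells with the highest combined score are the sections an agency would schedule for maintenance first.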

Keywords: pavement, flexible, maintenance, index

Procedia PDF Downloads 57
16406 Comprehensive Assessment of Energy Efficiency within the Production Process

Authors: S. Kreitlein, N. Eder, J. Franke

Abstract:

The importance of energy efficiency within the production process increases steadily. Unfortunately, no tools for a comprehensive assessment of energy efficiency within the production process have existed so far. Therefore, the Institute for Factory Automation and Production Systems of the Friedrich-Alexander-University Erlangen-Nuremberg has developed two methods with the goal of achieving transparency and a quantitative assessment of energy efficiency: the EEV (Energy Efficiency Value) and the EPE (Energetic Process Efficiency). This paper describes the basics and state of the art as well as the developed approaches.

Keywords: energy efficiency, energy efficiency value, energetic process efficiency, production

Procedia PDF Downloads 726
16405 Avoidance of Brittle Fracture in Bridge Bearings: Brittle Fracture Tests and Initial Crack Size

Authors: Natalie Hoyer

Abstract:

Bridges in both roadway and railway systems depend on bearings to ensure extended service life and functionality. These bearings enable proper load distribution from the superstructure to the substructure while permitting controlled movement of the superstructure. The design of bridge bearings according to Eurocode DIN EN 1337 and the relevant sections of DIN EN 1993 increasingly requires the use of thick plates, especially for long-span bridges. However, these plate thicknesses exceed the limits specified in the national annex of DIN EN 1993-2. Furthermore, compliance with the DIN EN 1993-1-10 regulations regarding material toughness and through-thickness properties necessitates further modifications. Consequently, these standards cannot be directly applied to the selection of bearing materials without supplementary guidance and design rules. In this context, a recommendation was developed in 2011 to regulate the selection of appropriate steel grades for bearing components. Prior to the initiation of the research project underlying this contribution, this recommendation had only been available as a technical bulletin; since July 2023, it has been integrated into guideline 804 of the German railway. However, recent findings indicate that certain bridge-bearing components are exposed to high fatigue loads, which must be considered in structural design, material selection, and calculations. The German Centre for Rail Traffic Research therefore commissioned a research project with the objective of proposing an extension of the current standards so that steel for bridge bearings can be chosen to avoid brittle fracture, even for thick plates and components subjected to specific fatigue loads. The results obtained from theoretical considerations, such as finite element simulations and analytical calculations, are validated through large-scale component tests.
Additionally, experimental observations are used to calibrate the calculation models and modify the input parameters of the design concept. Within the large-scale component tests, a brittle failure is artificially induced in a bearing component. For this purpose, an artificially generated initial defect is introduced at the previously defined hotspot into the specimen using spark erosion. Then, a dynamic load is applied until the crack initiation process occurs to achieve realistic conditions in the form of a sharp notch similar to a fatigue crack. This initiation process continues until the crack length reaches a predetermined size. Afterward, the actual test begins, which requires cooling the specimen with liquid nitrogen until a temperature is reached where brittle fracture failure is expected. In the next step, the component is subjected to a quasi-static tensile test until failure occurs in the form of a brittle failure. The proposed paper will present the latest research findings, including the results of the conducted component tests and the derived definition of the initial crack size in bridge bearings.

Keywords: bridge bearings, brittle fracture, fatigue, initial crack size, large-scale tests

Procedia PDF Downloads 38
16404 Towards Incorporating Context Awareness into Business Process Management

Authors: Xiaohui Zhao, Shahan Mafuz

Abstract:

Context-aware technologies provide system applications with awareness of environmental conditions, customer behaviour, object movements, etc. With such capability, applications can intelligently adapt their responses to changing conditions. For business operations, this promises that business processes can run more intelligently, adaptively and flexibly, thereby improving customer experience, enhancing the reliability of service delivery, or lowering operational cost, making the business more competitive and sustainable. Aiming to realize such context-aware business process management, this paper first explores its potential benefits and then identifies gaps between current business process management support and what is expected. In addition, preliminary solutions are discussed, covering context definition, rule-based process execution, run-time process evolution, etc. A framework is also presented, giving a conceptual architecture of a context-aware business process management system to guide system implementation.
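Rule-based process execution of the kind mentioned above can be sketched as an ordered list of context predicates mapped to activities. The context attributes, rule conditions and activity names here are hypothetical illustrations, not elements of the paper's framework:

```python
# Each rule maps a context predicate to the next activity to execute.
# Rules are evaluated in order; the first match wins, with a default fallback.
RULES = [
    (lambda ctx: ctx["temperature_c"] > 30, "reroute_cold_chain"),
    (lambda ctx: ctx["customer_tier"] == "premium", "priority_handling"),
    (lambda ctx: ctx["queue_length"] > 100, "spawn_extra_worker"),
]

def next_activity(context, default="standard_handling"):
    """Pick the next process activity based on the current execution context."""
    for predicate, activity in RULES:
        if predicate(context):
            return activity
    return default

ctx = {"temperature_c": 22, "customer_tier": "premium", "queue_length": 12}
print(next_activity(ctx))  # the premium-customer rule fires first here
```

Run-time process evolution would then amount to adding, removing or reordering entries in `RULES` while instances are executing.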

Keywords: business process adaptation, business process evolution, business process modelling, context awareness

Procedia PDF Downloads 407
16403 Experience Report about the Inclusion of People with Disabilities in the Process of Testing an Accessible System for Learning Management

Authors: Marcos Devaner, Marcela Alves, Cledson Braga, Fabiano Alves, Wilton Bezerra

Abstract:

This article discusses the inclusion of people with disabilities in the process of testing an accessible system for distance education. The accessible system, the team profile, and the methodologies and techniques covered in the testing process are presented. The testing process shown in this paper was designed from experience with users and emerged from lessons learned in past projects; the end user is present at all stages of the tests. Lessons learned are also reported, showing how the team and its methods matured, resulting in a simple, productive and effective process.

Keywords: experience report, accessible systems, software testing, testing process, systems, e-learning

Procedia PDF Downloads 390
16402 A Framework for Evaluating the QoS and Cost of Web Services Based on Its Functional Performance

Authors: M. Mohemmed Sha, T. Manesh, A. Ahmed Mohamed Mustaq

Abstract:

In the corporate world, the technology of Web services has grown rapidly, and its significance for the development of web-based applications rises steadily over time. The success of business-to-business integration relies on finding suitable partners and their services in a global business environment. However, the selection of the most suitable Web service from a list of services with identical functionality is even more vital. The satisfaction of the customer and the reputation of the Web service provider depend primarily on the extent to which the service meets the customer's requirements. In many cases, the customer feels that he is paying for a service that is not delivered, because the real functionality of the Web service falls short of expectations; this leads to frequent changes of service. In this paper, a framework is proposed to evaluate the Quality of Service (QoS) and the cost of a Web service, establishing an optimal correlation between the two. This work also proposes management decisions to address functional deviations from what was guaranteed at the time of selection.
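Ranking functionally identical services by QoS relative to cost can be sketched as a weighted utility score. The attribute names, weights and candidate values below are hypothetical and are not part of the proposed framework:

```python
# Hypothetical QoS attributes, each normalized to [0, 1] where higher is better.
WEIGHTS = {"availability": 0.4, "reliability": 0.3, "response": 0.3}

def utility(service):
    """Weighted QoS score divided by cost: higher means more value per unit cost."""
    qos = sum(WEIGHTS[attr] * service[attr] for attr in WEIGHTS)
    return qos / service["cost"]

candidates = [
    {"name": "svcA", "availability": 0.99, "reliability": 0.95, "response": 0.80, "cost": 3.0},
    {"name": "svcB", "availability": 0.90, "reliability": 0.90, "response": 0.90, "cost": 2.0},
    {"name": "svcC", "availability": 0.99, "reliability": 0.99, "response": 0.99, "cost": 5.0},
]
best = max(candidates, key=utility)
print(best["name"])  # the mid-priced service wins on value per unit cost
```

A management decision against functional deviancy could then be triggered whenever the QoS measured after delivery drops below the values used in this score at selection time.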

Keywords: web service, service level agreement, quality of a service, cost of a service, QoS, CoS, SOA, WSLA, WsRF

Procedia PDF Downloads 412
16401 Effective Solvents for Proteins Recovery from Microalgae

Authors: Win Nee Phong, Tau Chuan Ling, Pau Loke Show

Abstract:

From an industrial perspective, the exploitation of microalgae as a protein source is of great economic and commercial interest due to their numerous attractive characteristics. Nonetheless, the release of protein from microalgae is limited by the multiple layers of the rigid, thick cell wall, which generally contains a large proportion of cellulose; thus, an efficient cell disruption process is required to rupture it. The conventional downstream processing methods, which typically involve several unit operations such as disruption, isolation, extraction, concentration and purification, are energy-intensive and costly. To reduce the overall cost and establish a feasible technology for successful large-scale production, the microalgal industry today demands a more cost-effective and eco-friendly downstream processing technique. This study aims to provide guidance on selecting an efficient solvent to facilitate protein release during the cell disruption process. The effects of solvent type (methanol, ethanol, 1-propanol and water) on rupturing the microalgae cell wall were studied. Interestingly, water proved to be the most effective solvent for recovering proteins from microalgae, and it is also the cheapest of the solvents tested.

Keywords: green, microalgae, protein, solvents

Procedia PDF Downloads 256
16400 AI Peer Review Challenge: Standard Model of Physics vs 4D GEM EOS

Authors: David A. Harness

Abstract:

The natural evolution of automated theorem proving (ATP) cognitive systems is to meet AI peer-review standards. The ATP process of axiom selection from Mizar to prove a conjecture would be further refined, as in all human and machine learning, by solving the real-world problem of the proposed AI peer-review challenge: determine which conjecture forms the higher-confidence-level constructive proof, the Standard Model of Physics SU(n) lattice gauge group operation or the present non-standard 4D GEM EOS SU(n) lattice gauge group spatially extended operation, in which the photon and electron are the first two trace angular momentum invariants of a gravitoelectromagnetic (GEM) energy momentum density tensor wavetrain integration spin-stress pressure-volume equation of state (EOS), initiated via 32 lines of Mathematica code. The resulting gravitoelectromagnetic spectrum ranges from compressive through rarefactive of the central cosmological constant vacuum energy density in units of pascals. Said self-adjoint group operation operates exclusively on the stress energy momentum tensor of the Einstein field equations, introducing quantization directly at the 4D spacetime level, essentially reformulating the Yang-Mills virtual superpositioned particle compounded lattice gauge group quantization of the vacuum into a single hyper-complex multi-valued GEM U(1) × SU(1,3) lattice gauge group Planck spacetime mesh quantization of the vacuum. Thus the Mizar corpus already contains all of the axioms required for relevant DeepMath premise selection and unambiguous formal natural-language parsing in context deep learning.

Keywords: automated theorem proving, constructive quantum field theory, information theory, neural networks

Procedia PDF Downloads 176
16399 Development of a New Ecological Cleaning Process for Metal Sheets

Authors: L. M. López López, J. V. Montesdeoca Contreras, A. R. Cuji Fajardo, L. E. Garzón Muñoz, J. I. Fajardo Seminario

Abstract:

In this article, a new cleaning process for the metal sheets used in household appliances is developed, based on low-pressure cold plasma. In this context, the research analyzes the results of cleaning metal sheets with plasma and compares them with the pickling process, in order to determine the efficiency of each process and the level of contamination produced. Surface cleanliness was evaluated by measuring the contact angle with deionized water, diiodomethane and ethylene glycol, and the surface free energy was calculated by means of the Fowkes and Wu theories. The results show that low-pressure cold plasma is highly efficient both in cleaning performance and in environmental impact.
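The contact-angle-based surface free energy calculation can be sketched with the two-liquid geometric-mean (Fowkes/Owens-Wendt) approach. The surface-tension components for water and diiodomethane are standard literature reference values, while the measured angles below are illustrative, not the paper's data:

```python
import math

# Reference liquid surface tension components (mJ/m^2): total, dispersive, polar.
WATER = (72.8, 21.8, 51.0)
DIIODOMETHANE = (50.8, 50.8, 0.0)

def surface_free_energy(theta_water_deg, theta_dim_deg):
    """Solve gamma_l*(1 + cos theta) = 2*(sqrt(gs_d*gl_d) + sqrt(gs_p*gl_p))
    for the solid's dispersive (gs_d) and polar (gs_p) components."""
    rows = []
    for (gl, gl_d, gl_p), theta in ((WATER, theta_water_deg),
                                    (DIIODOMETHANE, theta_dim_deg)):
        rhs = gl * (1.0 + math.cos(math.radians(theta))) / 2.0
        rows.append((math.sqrt(gl_d), math.sqrt(gl_p), rhs))
    # 2x2 linear system in x = sqrt(gs_d), y = sqrt(gs_p), solved by Cramer's rule.
    (a1, b1, c1), (a2, b2, c2) = rows
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x * x, y * y  # dispersive and polar parts of the solid surface energy

gs_d, gs_p = surface_free_energy(70.0, 40.0)  # hypothetical measured angles
print(f"dispersive {gs_d:.1f}, polar {gs_p:.1f}, total {gs_d + gs_p:.1f} mJ/m^2")
```

A drop in contact angle after plasma treatment translates into a higher polar component, which is the usual signature of a cleaned, activated surface.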

Keywords: efficient use of plasma, ecological impact of plasma, metal sheets cleaning means, plasma cleaning process

Procedia PDF Downloads 348
16398 Selection of Developmental Stages of Bovine in vitro-Derived Blastocysts Prior to Vitrification and Embryo Transfer: Implications for Cattle Breeding Programs

Authors: Van Huong Do, Simon Walton, German Amaya, Madeline Batsiokis, Sally Catt, Andrew Taylor-Robinson

Abstract:

Identification of the most suitable stage of bovine in vitro-derived blastocysts (early, expanded or hatching) prior to vitrification is a straightforward process that informs the decision as to which blastocyst stage to use for the transfer of fresh and vitrified embryos. Research on in vitro evaluation of suitable stages has shown that more advanced developmental stages are recommended for fresh embryo transfer, while earlier stages are proposed for embryo transfer following vitrification. There is, however, limited information on blastocyst stages using in vivo assessment. Hence, the aim of the present study was to determine the optimal blastocyst stage for vitrification and embryo transfer through a two-step procedure of embryo transfer followed by pregnancy testing at 35, 60 and 90 days of pregnancy. 410 good-quality oocytes aspirated by the ovum pick-up technique from 8 donor cows were subjected to in vitro embryo production; good-quality embryos were then selected for vitrification and embryo transfer. Subsequently, 77 vitrified embryos at different blastocyst stages were transferred to synchronized recipient cows. The overall cleavage and blastocyst rates of the oocytes were 68.8% and 41.7%, respectively. In addition, the fertility and blastocyst production of the 6 bulls used for in vitro fertilization were examined and shown to differ statistically (P<0.05). Results of the ongoing pregnancy trials conducted at 35, 60 and 90 days will be discussed; preliminary data indicate that individual bulls demonstrate distinctly different fertility performance in vitro. Findings on conception rates would provide a useful tool to aid the selection of bovine in vitro-derived embryos for vitrification and embryo transfer in commercial settings.

Keywords: blastocyst, embryo transfer, in vitro-derived embryos, ovum pick-up, vitrification

Procedia PDF Downloads 302
16397 Application of Simulated Annealing to Threshold Optimization in Distributed OS-CFAR System

Authors: L. Abdou, O. Taibaoui, A. Moumen, A. Talib Ahmed

Abstract:

This paper proposes an application of simulated annealing to optimize the detection threshold in an ordered statistics constant false alarm rate (OS-CFAR) system. Conventional optimization methods, such as the conjugate gradient, can converge to a local optimum and miss the global optimum. Moreover, for a system with three or more sensors, finding this optimum analytically is difficult or impossible; hence the need for other methods, such as meta-heuristics. Among the variety of meta-heuristic techniques is simulated annealing (SA), inspired by a process used in metallurgy. The technique is based on selecting an initial solution and randomly generating a nearby solution in order to improve the criterion being optimized. In this work, two parameters are subjected to such optimization: the statistical order (k) and the scaling factor (T). Two fusion rules, "AND" and "OR", were considered in the case where the signals are independent from sensor to sensor. The results show that the proposed method is effective for solving such optimization problems in a distributed system. Its advantage is that it allows the entire solution space to be explored and, in theory, avoids stagnation of the optimization process in a region of local minima.
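The annealing loop described above, with moves over the statistical order k and the scaling factor T, can be sketched as follows. The toy objective standing in for the detection criterion, the parameter bounds and the cooling schedule are all illustrative assumptions:

```python
import math
import random

def objective(k, t):
    """Toy stand-in for the detection criterion to minimize; the real system
    would evaluate detection/false-alarm probabilities for the pair (k, T)."""
    return 0.1 * (k - 16) ** 2 + (t - 1.5) ** 2

def anneal(k0=1, t0=0.1, temp=10.0, cooling=0.95, steps=500, seed=0):
    rng = random.Random(seed)
    k, t = k0, t0
    cost = objective(k, t)
    for _ in range(steps):
        # Neighbour move: nudge the order statistic and the scaling factor.
        k_new = min(24, max(1, k + rng.choice((-1, 0, 1))))
        t_new = max(0.01, t + rng.uniform(-0.1, 0.1))
        cost_new = objective(k_new, t_new)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if cost_new < cost or rng.random() < math.exp((cost - cost_new) / temp):
            k, t, cost = k_new, t_new, cost_new
        temp *= cooling  # geometric cooling schedule
    return k, t, cost

k, t, cost = anneal()
print(k, round(t, 2), round(cost, 3))
```

Early in the run the high temperature lets the search escape local minima; as the temperature decays, the loop degenerates into greedy descent around the best region found.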

Keywords: distributed system, OS-CFAR system, independent sensors, simulated annealing

Procedia PDF Downloads 494
16396 Diffusion Adaptation Strategies for Distributed Estimation Based on the Family of Affine Projection Algorithms

Authors: Mohammad Shams Esfand Abadi, Mohammad Ranjbar, Reza Ebrahimpour

Abstract:

This work addresses the distributed estimation problem in a diffusion network based on the adapt-then-combine (ATC) and combine-then-adapt (CTA) selective partial update normalized least mean squares (SPU-NLMS) algorithms. We also extend this approach to the dynamic selection affine projection algorithm (DS-APA), establishing ATC-DS-APA and CTA-DS-APA. The purpose of the ATC-SPU-NLMS and CTA-SPU-NLMS algorithms is to reduce computational complexity by updating only selected blocks of weight coefficients at each iteration. In CTA-DS-APA and ATC-DS-APA, the number of input vectors is selected dynamically. Diffusion cooperation strategies based on these algorithms have been shown to provide good performance, which is illustrated with various experimental results.
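An adapt-then-combine diffusion step can be sketched with a plain (full-update) NLMS adaptation followed by uniform neighbour averaging. The network topology, step size and signal model below are illustrative, and the selective-partial-update and dynamic-selection refinements of the paper are omitted for brevity:

```python
import random

W_TRUE = [0.5, -1.0, 2.0]                          # vector all nodes estimate
NEIGHBOURS = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}  # line network with self-loops

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def atc_diffusion_nlms(iters=400, mu=0.5, eps=1e-6, seed=1):
    rng = random.Random(seed)
    w = {k: [0.0, 0.0, 0.0] for k in NEIGHBOURS}
    for _ in range(iters):
        psi = {}
        for k in NEIGHBOURS:                       # adapt step at every node
            u = [rng.gauss(0.0, 1.0) for _ in range(3)]
            d = dot(u, W_TRUE) + rng.gauss(0.0, 0.01)
            e = d - dot(u, w[k])
            g = mu * e / (eps + dot(u, u))         # normalized step
            psi[k] = [wi + g * ui for wi, ui in zip(w[k], u)]
        for k, nbrs in NEIGHBOURS.items():         # combine step: uniform weights
            w[k] = [sum(psi[n][i] for n in nbrs) / len(nbrs) for i in range(3)]
    return w

w = atc_diffusion_nlms()
msd = max(sum((wi - ti) ** 2 for wi, ti in zip(w[k], W_TRUE)) for k in NEIGHBOURS)
print(f"worst-node squared deviation: {msd:.2e}")
```

The SPU variant would replace the adapt step with an update of only the selected coefficient blocks, trading a small convergence penalty for lower per-iteration cost.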

Keywords: selective partial update, affine projection, dynamic selection, diffusion, adaptive distributed networks

Procedia PDF Downloads 701
16395 Concept Drifts Detection and Localisation in Process Mining

Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa

Abstract:

Process mining provides methods and techniques for analyzing event logs recorded in modern information systems that support real-world operations. While analyzing an event log, state-of-the-art process mining techniques treat the operational process as a static (stationary) entity. This is often not the case, due to a phenomenon called concept drift: during execution, the process can evolve with respect to any of its associated perspectives, exhibiting various patterns of change at different paces. The work presented in this paper discusses the main aspects to consider when addressing the concept drift phenomenon and proposes a method for detecting and localizing sudden concept drifts in the control-flow perspective of a process, using features extracted by processing the traces in the process log. Our experimental results are promising for efficiently detecting and localizing concept drift in the context of the process mining research discipline.
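A minimal sketch of sudden control-flow drift detection compares the directly-follows relations in two adjacent windows of traces. The window size, Jaccard distance measure and synthetic log are illustrative assumptions, not the paper's actual features:

```python
def directly_follows(traces):
    """Set of directly-follows activity pairs observed in a window of traces."""
    pairs = set()
    for trace in traces:
        pairs.update(zip(trace, trace[1:]))
    return pairs

def detect_sudden_drift(traces, window=10, threshold=0.5):
    """Return the first trace index where the control-flow behaviour before
    and after differs by more than the Jaccard-distance threshold."""
    for i in range(window, len(traces) - window + 1):
        ref = directly_follows(traces[i - window:i])
        cur = directly_follows(traces[i:i + window])
        jaccard = len(ref & cur) / len(ref | cur)
        if 1.0 - jaccard > threshold:
            return i
    return None

# Synthetic log: the activity order changes from A,B,C to A,C,B after trace 50.
log = [["A", "B", "C"]] * 50 + [["A", "C", "B"]] * 50
print(detect_sudden_drift(log))
```

The returned index localizes the drift point; richer feature sets over the traces would expose more gradual or recurring drift patterns.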

Keywords: abrupt drift, concept drift, sudden drift, control-flow perspective, detection and localization, process mining

Procedia PDF Downloads 343
16394 Relay Node Selection Algorithm for Cooperative Communications in Wireless Networks

Authors: Sunmyeng Kim

Abstract:

IEEE 802.11a/b/g standards support multiple transmission rates. Even though the use of multiple transmission rates increases WLAN capacity, this feature leads to the performance anomaly problem. Cooperative communication was introduced to relieve the performance anomaly problem: data packets are delivered to the destination much faster through a high-rate relay node than through direct transmission at a low rate. In legacy cooperative protocols, a source node chooses a relay node based only on the transmission rate. They are therefore not well suited to multi-flow environments, since they do not consider the effect of other flows. To alleviate this effect, we propose a new relay node selection algorithm based on both the transmission rate and the channel contention level. Performance evaluation is conducted through simulation and shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and delay.
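The selection criterion can be sketched by scoring each candidate path with its effective two-hop rate discounted by the observed contention level. The rate values, contention estimates and scoring formula below are hypothetical illustrations, not the paper's exact metric:

```python
def two_hop_rate(r_src_relay, r_relay_dst):
    """Effective rate of relaying: the two hops share the channel in time,
    so the per-hop rates combine harmonically."""
    return 1.0 / (1.0 / r_src_relay + 1.0 / r_relay_dst)

def score(path_rate, contention):
    """Discount the nominal rate by the fraction of airtime lost to contention."""
    return path_rate * (1.0 - contention)

def pick_path(direct_rate, direct_contention, relays):
    """Choose direct transmission or the relay with the best discounted rate."""
    best_name, best_score = "direct", score(direct_rate, direct_contention)
    for name, r1, r2, contention in relays:
        s = score(two_hop_rate(r1, r2), contention)
        if s > best_score:
            best_name, best_score = name, s
    return best_name

# Hypothetical candidates: (name, src->relay rate, relay->dst rate, contention).
relays = [("relayA", 11.0, 11.0, 0.9), ("relayB", 5.5, 11.0, 0.2)]
print(pick_path(2.0, 0.1, relays))
```

Note how the heavily contended 11 Mbps relay loses to a slower but idle one, which is exactly the multi-flow effect that rate-only selection ignores.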

Keywords: cooperative communications, MAC protocol, relay node, WLAN

Procedia PDF Downloads 327