Search results for: welding process selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17044

16414 Exploring the Importance of Different Product Cues on the Selection for Chocolate from the Consumer Perspective

Authors: Ezeni Brzovska, Durdana Ozretic-Dosen

Abstract:

The purpose of this paper is to deepen the understanding of the product cues that influence the purchase decision for a specific product category – chocolate – and to identify demographic differences in buying behavior. ANOVA was employed for analyzing the significance level of nine product cues, and the survey showed statistically significant differences among different age and gender groups, and between respondents with different levels of education. From the theoretical perspective, the study adds to the existing knowledge by contributing research results from a new environment (Southeast Europe, Macedonia), which has so far been neglected. Establishing the level of significance for the product cues that affect buying behavior in the chocolate consumption context might help managers to improve marketing decision-making and better meet consumer needs by identifying opportunities for packaging innovations and/or personalization toward different target groups.
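As a rough illustration of the analysis described above, the sketch below computes a one-way ANOVA F statistic by hand for a single product cue rated across three age groups. The ratings, group labels, and degrees-of-freedom critical value are invented for illustration; they are not the study's data.

```python
# Hypothetical data: importance ratings (1-5) of one product cue
# (e.g., packaging) from respondents in three age groups.
ratings_by_age_group = {
    "18-25": [4.1, 3.8, 4.5, 4.0, 3.9],
    "26-40": [3.2, 3.5, 3.0, 3.6, 3.3],
    "41-60": [2.8, 3.1, 2.9, 3.0, 2.7],
}

groups = list(ratings_by_age_group.values())
k = len(groups)                          # number of groups
n = sum(len(g) for g in groups)          # total observations
grand_mean = sum(sum(g) for g in groups) / n

# Between-group and within-group sums of squares
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)

# F = (between-group mean square) / (within-group mean square)
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
# Critical value of F(2, 12) at alpha = 0.05 is about 3.89
significant = f_stat > 3.89
```

A significant F statistic indicates that at least one age group rates this cue differently, which is the kind of demographic difference the survey analysis tests for.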

Keywords: chocolate consumption context, chocolate selection, demographic characteristics, product cues

Procedia PDF Downloads 252
16413 Supply Chain Risk Management (SCRM): A Simplified Alternative for Implementing SCRM for Small and Medium Enterprises

Authors: Paul W. Murray, Marco Barajas

Abstract:

Recent changes in supply chains, especially globalization and collaboration, have created new risks for enterprises of all sizes. A variety of complex frameworks, often based on enterprise risk management strategies, have been presented under the heading of Supply Chain Risk Management (SCRM). The literature promotes the benefits of a robust SCRM strategy; however, implementing SCRM is difficult and resource-demanding for Large Enterprises (LEs), and essentially out of reach for Small and Medium Enterprises (SMEs). This research debunks the idea that SCRM is necessary for all enterprises and instead proposes a simple and effective Vendor Selection Template (VST). Empirical testing and a survey of supply chain practitioners provide a measure of validation for the VST. The resulting VST is a valuable contribution because it is easy to use, provides practical results, and is sufficiently flexible to be universally applied to SMEs.
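A minimal sketch of what a vendor-selection template of this kind might look like: each vendor is scored on a few risk criteria, the scores are weighted, and vendors are ranked. The criteria, weights, and scores below are hypothetical, not those of the paper's VST.

```python
# Hypothetical risk criteria and weights (weights sum to 1.0)
weights = {"financial_stability": 0.3, "delivery_record": 0.3,
           "geographic_risk": 0.2, "quality_certification": 0.2}

# Hypothetical vendor scores on a 1-5 scale (higher = lower risk)
vendors = {
    "Vendor A": {"financial_stability": 4, "delivery_record": 5,
                 "geographic_risk": 3, "quality_certification": 4},
    "Vendor B": {"financial_stability": 3, "delivery_record": 2,
                 "geographic_risk": 5, "quality_certification": 3},
}

def vst_score(scores):
    # Weighted sum of criterion scores
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(vendors, key=lambda v: vst_score(vendors[v]), reverse=True)
```

The appeal for an SME is exactly this simplicity: a spreadsheet-sized weighted ranking rather than a full enterprise risk management framework.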

Keywords: multiple regression analysis, supply chain management, risk assessment, vendor selection

Procedia PDF Downloads 465
16412 Bridging the Gap between Different Interfaces for Business Process Modeling

Authors: Katalina Grigorova, Kaloyan Mironov

Abstract:

The paper focuses on the benefits of business process modeling. Although this discipline has been developing for many years, there is still a need to create new opportunities to meet ever-increasing user needs. Because one of these needs is the conversion of business process models from one standard to another, the authors have developed a converter between the BPMN and EPC standards, using workflow patterns as an intermediate representation. Nowadays there are many systems for business process modeling, and the variety of output formats is almost as great as the variety of systems themselves. This diversity further hampers the conversion of models. The presented study discusses the problems caused by differences in the output formats of various modeling environments.
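The idea of converting through an intermediate pattern vocabulary can be sketched as follows: each notation maps its elements to shared workflow patterns, so any pair of notations needs only two mappings instead of one direct converter per pair. The element names below are heavily simplified stand-ins, not the full BPMN or EPC standards and not the authors' converter.

```python
# Simplified element-to-pattern mappings (illustrative subset only)
BPMN_TO_PATTERN = {"exclusive_gateway": "exclusive_choice",
                   "parallel_gateway": "parallel_split",
                   "task": "activity"}
PATTERN_TO_EPC = {"exclusive_choice": "XOR_connector",
                  "parallel_split": "AND_connector",
                  "activity": "function"}

def bpmn_to_epc(elements):
    # Translate BPMN elements to EPC via the shared pattern vocabulary
    return [PATTERN_TO_EPC[BPMN_TO_PATTERN[e]] for e in elements]

converted = bpmn_to_epc(["task", "exclusive_gateway", "task"])
print(converted)  # → ['function', 'XOR_connector', 'function']
```

With n notations, the pattern layer needs 2n mappings instead of n(n-1) direct converters, which is the structural advantage of an intermediate representation.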

Keywords: business process modeling, business process modeling standards, workflow patterns, converting models

Procedia PDF Downloads 585
16411 Optical Variability of Faint Quasars

Authors: Kassa Endalamaw Rewnu

Abstract:

The variability properties of a quasar sample, spectroscopically complete to magnitude J = 22.0, are investigated on a time baseline of 2 years using three different photometric bands (U, J and F). The original sample was obtained using a combination of different selection criteria: colors, slitless spectroscopy and variability, based on a time baseline of 1 year. The main goals of this work are two-fold: first, to derive the percentage of variable quasars on a relatively short time baseline; second, to search for new quasar candidates missed by the other selection criteria, and thus to estimate the completeness of the spectroscopic sample. In order to achieve these goals, we have extracted all the candidate variable objects from a sample of about 1800 stellar or quasi-stellar objects with limiting magnitude J = 22.50 over an area of about 0.50 deg². We find that > 65% of all the objects selected as possible variables are either confirmed quasars or quasar candidates on the basis of their colors. This percentage increases even further if we exclude from our lists of variable candidates a number of objects equal to that expected on the basis of 'contamination' induced by our photometric errors. The percentage of variable quasars in the spectroscopic sample is also high, reaching about 50%. On the basis of these results, we estimate that the incompleteness of the original spectroscopic sample is < 12%. We conclude that variability analysis of data with small photometric errors can be successfully used as an efficient and independent (or at least auxiliary) selection method in quasar surveys, even when the time baseline is relatively short. Finally, when corrected for the different intrinsic time lags corresponding to a fixed observed time baseline, our data do not show a statistically significant correlation between variability and either absolute luminosity or redshift.
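A much-simplified sketch of variability selection of this kind: an object is flagged as a candidate variable when the rms scatter of its magnitudes over the epochs exceeds a multiple of its photometric error. The threshold, magnitudes, and error value below are illustrative, not the survey's actual selection parameters.

```python
import math

def rms_scatter(mags):
    # Root-mean-square scatter of a magnitude light curve
    mean = sum(mags) / len(mags)
    return math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))

def is_variable_candidate(mags, phot_error, threshold=3.0):
    # Flag objects whose scatter exceeds threshold x photometric error
    return rms_scatter(mags) > threshold * phot_error

quasar_like = [21.8, 22.1, 21.6, 22.3]       # J magnitudes at four epochs
steady_star = [21.90, 21.92, 21.89, 21.91]

print(is_variable_candidate(quasar_like, phot_error=0.05))  # True
print(is_variable_candidate(steady_star, phot_error=0.05))  # False
```

The 'contamination' correction mentioned above corresponds to estimating how many non-variable objects would exceed such a threshold purely through photometric noise.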

Keywords: nuclear activity, galaxies, active quasars, variability

Procedia PDF Downloads 80
16410 Decomposition of the Discount Function Into Impatience and Uncertainty Aversion. How Neurofinance Can Help to Understand Behavioral Anomalies

Authors: Roberta Martino, Viviana Ventre

Abstract:

Intertemporal choices are choices under conditions of uncertainty in which the consequences are distributed over time. The Discounted Utility Model is the essential reference for describing the individual in the context of intertemporal choice. The model is based on the idea that the individual selects the alternative with the highest utility, calculated by multiplying the cardinal utility of the outcome, as if its receipt were instantaneous, by a discount function that decreases the utility value according to how far the actual receipt of the outcome lies from the moment the choice is made. Initially, the discount function was assumed to have an exponential form, whose rate of decrease over time is constant, in line with the profile of a rational investor described by classical economics. Empirical evidence instead called for the formulation of alternative, hyperbolic models that better represent the actual actions of investors. Attitudes that do not comply with the principles of classical rationality are termed anomalous, i.e., difficult to rationalize and describe through normative models. The development of behavioral finance, which describes investor behavior through cognitive psychology, has shown that deviations from rationality are due to the bounded rationality of human beings: when a choice is made in a very difficult and information-rich environment, the brain strikes a compromise between the cognitive effort required and the quality of the selected alternative. Moreover, the evaluation and selection of alternatives, and the collection and processing of information, are conditioned by systematic distortions of the decision-making process: the behavioral biases involving the individual's emotional and cognitive systems. In this paper we present an original decomposition of the discount function to investigate the psychological principles of hyperbolic discounting.
It is possible to decompose the curve into two components: the first component is responsible for the smaller decrease in the outcome's value as time increases and is related to the individual's impatience; the second component relates to the change in direction of the tangent vector to the curve and indicates how strongly the individual perceives the indeterminacy of the future, i.e., his or her aversion to uncertainty. This decomposition allows interesting conclusions to be drawn about the concept of impatience and the emotional drives involved in decision-making. The contribution that neuroscience can make to decision theory and intertemporal choice theory is vast, as it allows the decision-making process to be described as the relationship between the individual's emotional and cognitive factors. Neurofinance is a discipline that uses a multidisciplinary approach to investigate how the brain influences decision-making. Indeed, considering that the decision-making process is linked to the activity of the prefrontal cortex and amygdala, neurofinance can help determine the extent to which anomalous attitudes respect the principles of rationality.
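The contrast between the two discount functions named above can be sketched numerically. The code below compares the local discount rate -D'(t)/D(t) of the two forms: constant for exponential discounting, declining with delay for hyperbolic discounting, which is the signature of the impatience behind present bias. The functional forms and parameter values are standard textbook choices, not the paper's own decomposition.

```python
import math

def exponential_discount(t, r=0.1):
    # Classical rational profile: D(t) = e^(-r t)
    return math.exp(-r * t)

def hyperbolic_discount(t, k=0.1):
    # Behavioral profile: D(t) = 1 / (1 + k t)
    return 1.0 / (1.0 + k * t)

def local_rate(discount, t, dt=1e-6):
    # Per-period discount rate -D'(t)/D(t) via a forward difference
    return -(discount(t + dt) - discount(t)) / (dt * discount(t))

print(round(local_rate(exponential_discount, 1), 3),
      round(local_rate(exponential_discount, 10), 3))  # both ~0.1
print(round(local_rate(hyperbolic_discount, 1), 3),
      round(local_rate(hyperbolic_discount, 10), 3))   # ~0.091 vs ~0.05
```

The declining local rate of the hyperbolic curve is what makes near-term outcomes feel disproportionately valuable, producing the preference reversals that the exponential model cannot accommodate.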

Keywords: impatience, intertemporal choice, neurofinance, rationality, uncertainty

Procedia PDF Downloads 129
16409 A Novel Epitope Prediction for Vaccine Designing against Ebola Viral Envelope Proteins

Authors: Manju Kanu, Subrata Sinha, Surabhi Johari

Abstract:

The Ebola virus is one of the best studied viruses; however, no effective prevention against EBOV has been developed. Epitope-based vaccines provide a new strategy for prophylactic and therapeutic application of pathogen-specific immunity. A critical requirement of this strategy is the identification and selection of T-cell epitopes that act as vaccine targets. This study describes current methodologies for the selection process, with the Ebola virus as a model system; a great challenge in the field of Ebola virus research is to design a universal vaccine. A combination of publicly available bioinformatics algorithms and computational tools was used to screen and select antigen sequences as potential T-cell epitopes of supertype Human Leukocyte Antigen (HLA) alleles. The MUSCLE and MOTIF tools were used to find the most conserved peptide sequences of the viral proteins, and immunoinformatics tools were used to predict immunogenic peptides of viral proteins in Zaire strains of the Ebola virus. Putative epitopes for the viral proteins (VP) were predicted from their conserved peptide sequences. Three tools, NetCTL 1.2, BIMAS and SYFPEITHI, were used to predict the Class I putative epitopes, while three tools, ProPred, IEDB SMM-align and NetMHCII 2.2, were used to predict the Class II putative epitopes. B-cell epitopes were predicted with BCPREDS 1.0. Immunogenic peptides were identified and selected manually from the putative epitopes predicted by the online tools for both MHC classes. Finally, the sequences of the predicted peptides for both MHC classes were examined for a common region, which was selected as the common immunogenic peptide. The immunogenic peptides found for the viral proteins of the Ebola virus were the epitopes FLESGAVKY and SSLAKHGEY. These predicted peptides could be promising candidates as targets for vaccine design.
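The final manual step described above, searching the Class I and Class II predictions for a common region, can be sketched programmatically. The Class I peptide below is one of the reported epitopes; the Class II peptide is an invented stand-in, not an actual tool output from the study.

```python
def common_regions(class_i_peptides, class_ii_peptides, min_len=5):
    """Substrings of length >= min_len that appear in peptides of both classes."""
    hits = set()
    for p1 in class_i_peptides:
        for p2 in class_ii_peptides:
            for i in range(len(p1) - min_len + 1):
                for j in range(min_len, len(p1) - i + 1):
                    sub = p1[i:i + j]
                    if sub in p2:
                        hits.add(sub)
    return hits

class_i = ["FLESGAVKY"]   # reported Class I epitope
class_ii = ["ESGAVKYLL"]  # hypothetical Class II prediction
shared = common_regions(class_i, class_ii)
print(sorted(shared))
```

In the study this intersection step was done by hand; automating it simply enumerates overlapping substrings above a minimum length and keeps those found in predictions from both MHC classes.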

Keywords: epitope, B cell, immunogenicity, Ebola

Procedia PDF Downloads 314
16408 Integrated Design in Additive Manufacturing Based on Design for Manufacturing

Authors: E. Asadollahi-Yazdi, J. Gardan, P. Lafon

Abstract:

Nowadays, manufacturers face the production of different versions of products due to quality, cost and time constraints. On the other hand, Additive Manufacturing (AM), a production method based on a CAD model, disrupts the design and manufacturing cycle with new parameters. To address these issues, researchers have applied the Design for Manufacturing (DFM) approach to AM, but until now there has been no integrated approach for the design and manufacture of products through AM. This paper therefore aims to provide a general methodology for managing the different production issues, as well as supporting interoperability between the AM process and different Product Life Cycle Management tools. The problem is that the models of Systems Engineering used for managing complex systems cannot support product evolution and its impact on the product life cycle. It therefore seems necessary to provide a general methodology for managing the product diversity created by using AM. This methodology must consider manufacture and assembly as early as possible in the design stage. The latest DFM approach, as a methodology to analyze the system comprehensively, integrates manufacturing constraints into the numerical model upstream. DFM for AM is thus used to import the characteristics of AM into the design and manufacturing process of a hybrid product, in order to manage the criteria coming from AM. The research also presents an integrated design method that takes into account knowledge of layer manufacturing technologies. For this purpose, an interface model based on the skin and skeleton concepts is provided: usage and manufacturing skins represent the functional surfaces of the product, while the material flow and the links between the skins are represented by usage and manufacturing skeletons.
This integrated approach is a helpful methodology for designers and manufacturers in decisions such as material and process selection, as well as in the evaluation of product manufacturability.

Keywords: additive manufacturing, 3D printing, design for manufacturing, integrated design, interoperability

Procedia PDF Downloads 316
16407 Ionometallurgy for Recycling Silver in Silicon Solar Panel

Authors: Emmanuel Billy

Abstract:

This work is part of the CABRISS project (an H2020 project), which aims at developing innovative, cost-effective methods for the extraction of materials from the different sources of PV waste: Si-based panels, thin-film panels, and Si water-diluted slurries. Aluminum, silicon, indium, and silver will be extracted from these wastes in order to constitute a materials feedstock that can later be used in a closed-loop process. The extraction of metals from silicon solar cells is often an energy-intensive process: it requires either smelting or leaching at elevated temperature, or the use of large quantities of strong acids or bases that require energy to produce. The energy input equates to a significant cost and an associated CO2 footprint, both of which it would be desirable to reduce, so there is a need to develop more energy-efficient and environmentally compatible processes. 'Ionometallurgy' could offer such a set of environmentally benign processes for metallurgy. This work demonstrates that ionic liquids provide one such method, since they can be used to dissolve and recover silver. The overall process combines leaching, recovery, and the possibility of reusing the solution in a closed-loop process. This study aims to evaluate and compare different ionic liquids for leaching and recovering silver. An electrochemical analysis is first implemented to define the best system for Ag dissolution; the effects of temperature, concentration and oxidizing agent are evaluated by this approach. Further, a comparative study of leaching efficiency between the conventional approach (nitric acid, thiourea) and the ionic liquids (Cu and Al) is conducted. Specific attention has been paid to the selection of the ionic liquids: electrolytes composed of chelating anions (Cl, Br, I) are used to facilitate lixiviation and to avoid solubility problems of the metallic species and of classical additional ligands.
This approach reduces the cost of the process and facilitates reuse of the leaching medium. To define the most suitable ionic liquids, electrochemical experiments were carried out to evaluate the oxidation potential of the silver contained in the crystalline solar cells. Chemical dissolution of metals from crystalline solar cells was then performed with the most promising ionic liquids. After the chemical dissolution, electrodeposition was performed to recover the silver in metallic form.

Keywords: electrodeposition, ionometallurgy, leaching, recycling, silver

Procedia PDF Downloads 246
16406 A Framework for Evaluating the QoS and Cost of Web Services Based on Its Functional Performance

Authors: M. Mohemmed Sha, T. Manesh, A. Ahmed Mohamed Mustaq

Abstract:

In the corporate world, Web services technology has grown rapidly, and its significance for the development of web-based applications rises steadily over time. The success of Business-to-Business integration relies on finding novel partners and their services in a global business environment, but the selection of the most suitable Web service from a list of services with identical functionality is even more vital. The satisfaction level of the customer and the reputation of the Web service provider depend primarily on the degree to which the service meets the customer's requirements. In many cases the customer of a Web service feels that he is paying for a service that is not delivered, because the real functionality of the web service is not achieved; this leads to frequent changes of service. In this paper, a framework is proposed to evaluate the Quality of Service (QoS) and its cost, establishing the optimal correlation between the two. This research work also proposes management decisions against functional deviance of a web service from what was guaranteed at the time of selection.
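As a hedged sketch of the kind of QoS-versus-cost correlation such a framework might compute, the code below scores each service by the fraction of its guaranteed QoS that is actually delivered, per unit of cost. The attributes, numbers, and scoring form are invented for illustration and are not the paper's model.

```python
def qos_ratio(delivered, guaranteed):
    """Mean fraction of each guaranteed QoS attribute actually delivered (capped at 1)."""
    fracs = [min(delivered[a] / guaranteed[a], 1.0) for a in guaranteed]
    return sum(fracs) / len(fracs)

def value_for_cost(delivered, guaranteed, cost):
    # Delivered-QoS fraction per unit cost: higher is a better deal
    return qos_ratio(delivered, guaranteed) / cost

# Hypothetical SLA guarantees and measured performance of two services
guaranteed = {"availability": 0.999, "throughput_rps": 100}
service_a = {"availability": 0.998, "throughput_rps": 95}
service_b = {"availability": 0.990, "throughput_rps": 60}

# Service A delivers closer to its SLA, so at equal cost it scores higher
print(value_for_cost(service_a, guaranteed, cost=10.0))
print(value_for_cost(service_b, guaranteed, cost=10.0))
```

A metric of this shape makes the customer's complaint above measurable: a service whose delivered QoS falls well short of its guarantee gets a low value even at a low price.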

Keywords: web service, service level agreement, quality of a service, cost of a service, QoS, CoS, SOA, WSLA, WsRF

Procedia PDF Downloads 419
16405 GIS Pavement Maintenance Selection Strategy

Authors: Mekdelawit Teferi Alamirew

Abstract:

As a practical tool, the Geographical Information System (GIS) was used for data integration, collection, management, analysis, and output presentation in pavement management systems. There are many GIS techniques that improve maintenance activities, such as dynamic segmentation and weighted overlay analysis, the latter supporting a Multi-Criteria Decision-Making process. The results indicated that the developed MPI model works sufficiently well and yields adequate output for making accurate decisions when multiple criteria are considered in prioritizing pavement sections for maintenance. Because GIS maps can express the position, extent, and severity of pavement distress features more effectively than manual approaches, the paper also offers digitized distress maps that can help agencies in their decision-making processes.
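A weighted-overlay prioritization of this kind can be sketched in a few lines: each pavement section carries normalized criterion scores, which are combined with weights into a single priority index used for ranking. The criteria, weights, and scores below are hypothetical, not the paper's MPI model.

```python
# Hypothetical criteria with weights summing to 1.0
weights = {"distress_severity": 0.4, "traffic_volume": 0.3,
           "pavement_age": 0.2, "drainage_condition": 0.1}

# Normalized (0-1) criterion scores for two pavement sections
sections = {
    "S1": {"distress_severity": 0.9, "traffic_volume": 0.8,
           "pavement_age": 0.6, "drainage_condition": 0.4},
    "S2": {"distress_severity": 0.3, "traffic_volume": 0.9,
           "pavement_age": 0.5, "drainage_condition": 0.2},
}

def priority_index(scores):
    # Weighted overlay: sum of weight x normalized score
    return sum(weights[c] * scores[c] for c in weights)

priority_order = sorted(sections, key=lambda s: priority_index(sections[s]),
                        reverse=True)
```

In a GIS, the same arithmetic is applied cell by cell or segment by segment across raster or dynamically segmented layers, which is what makes the resulting priority map spatially explicit.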

Keywords: pavement, flexible, maintenance, index

Procedia PDF Downloads 62
16404 Laboratory Investigation of Alkali-Surfactant-Alternate Gas (ASAG) Injection – a Novel EOR Process for a Light Oil Sandstone Reservoir

Authors: Vidit Mohan, Ashwin P. Ramesh, Anirudh Toshniwal

Abstract:

Alkali-Surfactant-Alternate-Gas (ASAG) injection, a novel EOR process, has the potential to improve displacement efficiency over Surfactant-Alternate-Gas (SAG) injection by addressing the problem of surfactant adsorption by clay minerals in the rock matrix. A detailed laboratory investigation of the ASAG injection process was carried out, with encouraging results. To enhance recovery over the WAG injection process, SAG injection was first investigated at laboratory scale but yielded only marginal incremental displacement efficiency over the WAG process. On investigation, it was found that clay minerals in the rock matrix adsorbed the surfactants and were detrimental to the SAG process. Hence, ASAG injection was conceptualized, using alkali as a clay stabilizer. The ASAG injection experiment, with a surfactant concentration of 5000 ppm and an alkali concentration of 0.5 weight %, yielded an incremental displacement efficiency of 5.42% over the WAG process. ASAG injection is a new process and has the potential to enhance the efficiency of the WAG/SAG injection process.

Keywords: alkali surfactant alternate gas (ASAG), surfactant alternate gas (SAG), laboratory investigation, EOR process

Procedia PDF Downloads 479
16403 Avoidance of Brittle Fracture in Bridge Bearings: Brittle Fracture Tests and Initial Crack Size

Authors: Natalie Hoyer

Abstract:

Bridges in both roadway and railway systems depend on bearings to ensure extended service life and functionality. These bearings enable proper load distribution from the superstructure to the substructure while permitting controlled movement of the superstructure. The design of bridge bearings, according to Eurocode DIN EN 1337 and the relevant sections of DIN EN 1993, increasingly requires the use of thick plates, especially for long-span bridges. However, these plate thicknesses exceed the limits specified in the national annex of DIN EN 1993-2. Furthermore, compliance with the DIN EN 1993-1-10 regulations regarding material toughness and through-thickness properties necessitates further modifications. Consequently, these standards cannot be directly applied to the selection of bearing materials without supplementary guidance and design rules. In this context, a recommendation was developed in 2011 to regulate the selection of appropriate steel grades for bearing components. Prior to the initiation of the research project underlying this contribution, this recommendation had only been available as a technical bulletin; since July 2023, it has been integrated into guideline 804 of the German railway. However, recent findings indicate that certain bridge-bearing components are exposed to high fatigue loads, which must be considered in structural design, material selection, and calculations. Therefore, the German Centre for Rail Traffic Research commissioned a research project with the objective of proposing an expansion of the current standards, so as to enable a suitable choice of steel for bridge bearings that avoids brittle fracture, even for thick plates and components subjected to specific fatigue loads. The results obtained from theoretical considerations, such as finite element simulations and analytical calculations, are validated through large-scale component tests.
Additionally, experimental observations are used to calibrate the calculation models and modify the input parameters of the design concept. Within the large-scale component tests, a brittle failure is artificially induced in a bearing component. For this purpose, an artificially generated initial defect is introduced at the previously defined hotspot into the specimen using spark erosion. Then, a dynamic load is applied until the crack initiation process occurs to achieve realistic conditions in the form of a sharp notch similar to a fatigue crack. This initiation process continues until the crack length reaches a predetermined size. Afterward, the actual test begins, which requires cooling the specimen with liquid nitrogen until a temperature is reached where brittle fracture failure is expected. In the next step, the component is subjected to a quasi-static tensile test until failure occurs in the form of a brittle failure. The proposed paper will present the latest research findings, including the results of the conducted component tests and the derived definition of the initial crack size in bridge bearings.

Keywords: bridge bearings, brittle fracture, fatigue, initial crack size, large-scale tests

Procedia PDF Downloads 44
16402 An Evaluation on the Methodology of Manufacturing High Performance Organophilic Clay at the Most Efficient and Cost Effective Process

Authors: Siti Nur Izati Azmi, Zatil Afifah Omar, Kathi Swaran, Navin Kumar

Abstract:

Organophilic clays, also known as organoclays, are used as viscosifiers in oil-based drilling fluids. Most often, organophilic clays are produced from modified sodium- and calcium-based bentonite. Many studies and data show that organophilic clay based on hectorite provides the best yield and good fluid-loss properties in an oil-based drilling fluid, at a higher cost. In terms of manufacturing, the two common methods of producing organophilic clays are a wet process and a dry process. The wet process is known to produce a better-performing product at a higher cost, while the dry process shortens the production time. Hence, the purpose of this study is to evaluate various formulations of organophilic clay and their performance versus cost, and to determine the most efficient and cost-effective method of manufacturing organophilic clays.

Keywords: organophilic clay, viscosifier, wet process, dry process

Procedia PDF Downloads 226
16401 AI Peer Review Challenge: Standard Model of Physics vs 4D GEM EOS

Authors: David A. Harness

Abstract:

The natural evolution of automated theorem proving (ATP) cognitive systems is to meet AI peer review standards. The ATP process of axiom selection from Mizar to prove a conjecture would be further refined, as in all human and machine learning, by solving the real-world problem of the proposed AI peer review challenge: determine which conjecture forms the higher-confidence constructive proof, the Standard Model of Physics SU(n) lattice gauge group operation or the present non-standard 4D GEM EOS SU(n) lattice gauge group spatially extended operation, in which the photon and electron are the first two trace angular momentum invariants of a gravitoelectromagnetic (GEM) energy-momentum density tensor wavetrain integration spin-stress pressure-volume equation of state (EOS), initiated via 32 lines of Mathematica code. The resulting gravitoelectromagnetic spectrum ranges from compressive through rarefactive of the central cosmological constant vacuum energy density, in units of pascals. Said self-adjoint group operation operates exclusively on the stress-energy-momentum tensor of the Einstein field equations, introducing quantization directly at the 4D spacetime level, essentially reformulating the Yang-Mills virtual superpositioned-particle compounded lattice gauge group quantization of the vacuum into a single hyper-complex multi-valued GEM U(1) × SU(1,3) lattice gauge group Planck spacetime mesh quantization of the vacuum. Thus the Mizar corpus already contains all of the axioms required for relevant DeepMath premise selection and unambiguous formal natural language parsing in context deep learning.

Keywords: automated theorem proving, constructive quantum field theory, information theory, neural networks

Procedia PDF Downloads 179
16400 Diffusion Adaptation Strategies for Distributed Estimation Based on the Family of Affine Projection Algorithms

Authors: Mohammad Shams Esfand Abadi, Mohammad Ranjbar, Reza Ebrahimpour

Abstract:

This work presents a solution to the distributed estimation problem in a diffusion network based on the adapt-then-combine (ATC) and combine-then-adapt (CTA) selective-partial-update normalized least mean squares (SPU-NLMS) algorithms. We also extend this approach to the dynamic-selection affine projection algorithm (DS-APA), establishing ATC-DS-APA and CTA-DS-APA. The purpose of the ATC-SPU-NLMS and CTA-SPU-NLMS algorithms is to reduce computational complexity by updating only selected blocks of weight coefficients at each iteration. In CTA-DS-APA and ATC-DS-APA, the number of input vectors is selected dynamically. Diffusion cooperation strategies have been shown to provide good performance based on these algorithms, and the good performance of the introduced algorithms is illustrated with various experimental results.
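The adapt-then-combine pattern underlying these algorithms can be sketched with plain NLMS at each node (the paper's selective-partial-update and affine-projection variants refine this same two-step loop). The network topology, data model, and step size below are illustrative, not the paper's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)
w_true = np.array([1.0, -0.5, 0.25])              # unknown system to estimate
neighbors = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}  # small line topology
w = {k: np.zeros(3) for k in neighbors}           # per-node estimates
mu, eps = 0.5, 1e-6                               # NLMS step size, regularizer

for _ in range(200):
    psi = {}
    for k in neighbors:                           # adapt step: local NLMS update
        u = rng.standard_normal(3)                # regressor at node k
        d = u @ w_true + 0.01 * rng.standard_normal()  # noisy measurement
        e = d - u @ w[k]
        psi[k] = w[k] + mu * e * u / (eps + u @ u)
    for k, nbrs in neighbors.items():             # combine step: average neighbors
        w[k] = sum(psi[j] for j in nbrs) / len(nbrs)

print(np.round(w[0], 2))  # close to w_true after cooperation
```

Selective partial updating would replace the full NLMS update with an update of only the selected coefficient block, cutting per-iteration cost while the combine step still diffuses information across the network.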

Keywords: selective partial update, affine projection, dynamic selection, diffusion, adaptive distributed networks

Procedia PDF Downloads 707
16399 Metal-Oxide-Semiconductor-Only Process Corner Monitoring Circuit

Authors: Davit Mirzoyan, Ararat Khachatryan

Abstract:

A process corner monitoring circuit (PCMC) is presented in this work. The circuit generates a signal whose logical value depends on the process corner only. The signal can be used in both digital and analog circuits for testing and compensation of process variations (PV). The presented circuit uses only metal-oxide-semiconductor (MOS) transistors, which allows increased detection accuracy and decreased power consumption and area. Due to its simplicity, the presented circuit can be easily modified to monitor parametric variations of only n-type or p-type MOS (NMOS and PMOS, respectively) transistors, resistors, as well as their combinations. Post-layout simulation results prove the correct functionality of the proposed circuit, i.e., its ability to monitor the process corner (equivalently, die-to-die variations) even in the presence of within-die variations.

Keywords: detection, monitoring, process corner, process variation

Procedia PDF Downloads 525
16398 Effective Solvents for Proteins Recovery from Microalgae

Authors: Win Nee Phong, Tau Chuan Ling, Pau Loke Show

Abstract:

From an industrial perspective, the exploitation of microalgae as a protein source is of great economic and commercial interest due to numerous attractive characteristics. Nonetheless, the release of protein from microalgae is limited by the multiple layers of the rigid, thick cell wall, which generally contains a large proportion of cellulose. Thus an efficient cell disruption process is required to rupture the cell wall. Conventional downstream processing methods, which typically involve several unit operations such as disruption, isolation, extraction, concentration and purification, are energy-intensive and costly. To reduce the overall cost and establish a feasible technology for successful large-scale production, the microalgal industry today demands a more cost-effective and eco-friendly downstream processing technique. One of the main challenges in extracting proteins from microalgae is the presence of this rigid cell wall. This study aims to provide guidance on the selection of an efficient solvent to facilitate protein release during the cell disruption process. The effects of solvent type (methanol, ethanol, 1-propanol and water) on rupturing the microalgae cell wall were studied. Interestingly, water proved to be the most effective solvent for recovering proteins from microalgae, and it is also the cheapest of the solvents tested.

Keywords: green, microalgae, protein, solvents

Procedia PDF Downloads 258
16397 Selection of Developmental Stages of Bovine in vitro-Derived Blastocysts Prior to Vitrification and Embryo Transfer: Implications for Cattle Breeding Programs

Authors: Van Huong Do, Simon Walton, German Amaya, Madeline Batsiokis, Sally Catt, Andrew Taylor-Robinson

Abstract:

Identification of the most suitable stage of bovine in vitro-derived blastocysts (early, expanded or hatching) prior to vitrification is a straightforward process that facilitates the decision as to which blastocyst stage to use for the transfer of fresh and vitrified embryos. Research based on in vitro evaluation of suitable stages has shown that the more advanced developmental stage of blastocysts is recommended for fresh embryo transfer, while the earlier stage is proposed for embryo transfer following vitrification. There is, however, limited information on blastocyst stages using in vivo assessment. Hence, the aim of the present study was to determine the optimal blastocyst stage for vitrification and embryo transfer through a two-step procedure of embryo transfer followed by pregnancy testing at 35, 60 and 90 days of pregnancy. In total, 410 good-quality oocytes, aspirated by the ovum pick-up technique from 8 donor cows, were subjected to in vitro embryo production. Good-quality embryos were selected and vitrified, and 77 vitrified embryos at different blastocyst stages were subsequently transferred to synchronised recipient cows. The overall cleavage and blastocyst rates of the oocytes were 68.8% and 41.7%, respectively. In addition, the fertility and blastocyst production of the 6 bulls used for in vitro fertilization were examined and shown to be statistically different (P<0.05). Results of the ongoing pregnancy trials conducted at 35, 60 and 90 days will be discussed; preliminary data indicate that individual bulls demonstrate distinctly different fertility performance in vitro. Findings on conception rates will provide a useful tool to aid selection of bovine in vitro-derived embryos for vitrification and embryo transfer in commercial settings.

Keywords: blastocyst, embryo transfer, in vitro-derived embryos, ovum pick-up, vitrification

Procedia PDF Downloads 306
16396 Comprehensive Assessment of Energy Efficiency within the Production Process

Authors: S. Kreitlein, N. Eder, J. Franke

Abstract:

The importance of energy efficiency within the production process is steadily increasing. So far, however, no tools exist for a comprehensive assessment of energy efficiency within the production process. The Institute for Factory Automation and Production Systems of the Friedrich-Alexander-University Erlangen-Nuremberg has therefore developed two methods aimed at achieving transparency and a quantitative assessment of energy efficiency: the Energy Efficiency Value (EEV) and the Energetic Process Efficiency (EPE). This paper describes the basics and state of the art as well as the two developed approaches.

Keywords: energy efficiency, energy efficiency value, energetic process efficiency, production

Procedia PDF Downloads 733
16395 Relay Node Selection Algorithm for Cooperative Communications in Wireless Networks

Authors: Sunmyeng Kim

Abstract:

IEEE 802.11a/b/g standards support multiple transmission rates. Even though the use of multiple transmission rates increases the WLAN capacity, this feature leads to the performance anomaly problem. Cooperative communication was introduced to relieve the performance anomaly problem: data packets are delivered to the destination much faster through a high-rate relay node than through direct transmission to the destination at a low rate. In legacy cooperative protocols, a source node chooses a relay node based only on the transmission rate. They are therefore not well suited to multi-flow environments, since they do not consider the effect of other flows. To alleviate this effect, we propose a new relay node selection algorithm based on both the transmission rate and the channel contention level. Performance evaluation is conducted by simulation and shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and delay.
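The rate-plus-contention selection rule can be sketched as follows; the scoring formula and the contention weighting are illustrative assumptions, not the paper's exact metric:

```python
# Illustrative relay selection sketch: combine the effective two-hop rate with
# a hypothetical contention penalty (both the penalty and the data are made up).

def effective_rate(r_sr, r_rd):
    """Effective rate of source->relay->destination for a fixed-size packet:
    the total airtime is 1/r_sr + 1/r_rd, so the effective rate is the
    harmonic combination of the two hop rates."""
    return 1.0 / (1.0 / r_sr + 1.0 / r_rd)

def select_relay(direct_rate, candidates):
    """candidates: list of (relay_id, r_sr, r_rd, contention) tuples, where
    contention in [0, 1] is an assumed normalized channel-contention level.
    Returns the best relay id, or None if direct transmission is faster."""
    best_id, best_score = None, direct_rate
    for relay_id, r_sr, r_rd, contention in candidates:
        # Penalize busy relays: a contended relay defers longer before sending.
        score = effective_rate(r_sr, r_rd) * (1.0 - contention)
        if score > best_score:
            best_id, best_score = relay_id, score
    return best_id

# Direct link at 2 Mb/s; relay A offers 11/11 Mb/s hops but is 80% contended,
# relay B offers 5.5/5.5 Mb/s hops and is idle: B wins despite its lower rate.
choice = select_relay(2.0, [("A", 11.0, 11.0, 0.8), ("B", 5.5, 5.5, 0.0)])
```

A rate-only protocol would pick relay A here; weighting by contention is what makes the choice flow-aware.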

Keywords: cooperative communications, MAC protocol, relay node, WLAN

Procedia PDF Downloads 331
16394 Towards Incorporating Context Awareness into Business Process Management

Authors: Xiaohui Zhao, Shahan Mafuz

Abstract:

Context-aware technologies provide system applications with awareness of environmental conditions, customer behaviour, object movements, etc. With such capability, applications can intelligently adapt their responses to changing conditions. For business operations, this promises that business processes can run more intelligently, adaptively and flexibly, thereby improving customer experience, enhancing the reliability of service delivery, or lowering operational cost, and so making the business more competitive and sustainable. Aiming to realize such context-aware business process management, this paper first explores its potential benefits and then identifies gaps between current business process management support and these expectations. Some preliminary solutions are also discussed, covering context definition, rule-based process execution, run-time process evolution, etc. A framework is presented that gives a conceptual architecture of a context-aware business process management system to guide system implementation.
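The idea of rule-based process execution can be illustrated with a minimal sketch; the rules, context fields, and adaptation actions below are hypothetical examples, not the paper's framework:

```python
# Minimal sketch of rule-based process adaptation: each rule pairs a predicate
# over the current context with an adaptation of the running process.

rules = [
    # (predicate over the context dict, adaptation to apply)
    (lambda ctx: ctx.get("warehouse_stock", 0) == 0,
     "insert task: reorder from supplier"),
    (lambda ctx: ctx.get("customer_tier") == "premium",
     "replace task: standard delivery -> express delivery"),
]

def adapt_process(context):
    """Return the adaptations triggered by the current context, in rule order."""
    return [action for predicate, action in rules if predicate(context)]

# A premium customer ordering an out-of-stock item fires both rules.
actions = adapt_process({"warehouse_stock": 0, "customer_tier": "premium"})
```

At run time such rules would be evaluated at each process step, which is where the run-time process evolution mentioned above comes in.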

Keywords: business process adaptation, business process evolution, business process modelling, and context awareness

Procedia PDF Downloads 412
16393 Application of Simulated Annealing to Threshold Optimization in Distributed OS-CFAR System

Authors: L. Abdou, O. Taibaoui, A. Moumen, A. Talib Ahmed

Abstract:

This paper proposes an application of simulated annealing to optimize the detection threshold in an ordered statistics constant false alarm rate (OS-CFAR) system. Conventional optimization methods, such as the conjugate gradient, can become trapped in a local optimum and miss the global one. Moreover, for a system with three or more sensors, it is difficult or impossible to find this optimum analytically; hence the need for other methods, such as meta-heuristics. Among the variety of meta-heuristic techniques is simulated annealing (SA), inspired by a process used in metallurgy. This technique is based on selecting an initial solution and randomly generating a nearby solution in order to improve the criterion to be optimized. In this work, two parameters are subject to such optimization: the statistical order (k) and the scaling factor (T). Two fusion rules, “AND” and “OR”, were considered for the case where the signals are independent from sensor to sensor. The results showed that the proposed method efficiently resolves such optimization problems in a distributed system. The advantage of this method is that it allows the entire solution space to be explored and, in theory, avoids stagnation of the optimization process in a local minimum.
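The accept/reject loop described above can be sketched generically; the objective function below is a made-up stand-in for the detection criterion of the fused OS-CFAR system, and the neighbourhood moves and cooling schedule are illustrative choices:

```python
import math
import random

def objective(k, T):
    # Hypothetical criterion to maximize (peak at k = 12, T = 5.0);
    # the paper would use the detection performance of the fused system.
    return -((k - 12) ** 2) - (T - 5.0) ** 2

def neighbour(k, T, n=16):
    # Nearby random solution: step the order k by +/-1 (clamped to 1..n)
    # and jitter the scaling factor T slightly.
    k2 = min(max(k + random.choice((-1, 1)), 1), n)
    return k2, T + random.uniform(-0.2, 0.2)

def anneal(k, T, temp=10.0, cooling=0.95, steps=500):
    best, cur = (k, T), (k, T)
    for _ in range(steps):
        cand = neighbour(*cur)
        delta = objective(*cand) - objective(*cur)
        # Always accept improvements; accept worse moves with probability
        # exp(delta / temp), which is what lets SA escape local optima.
        if delta > 0 or random.random() < math.exp(delta / temp):
            cur = cand
        if objective(*cur) > objective(*best):
            best = cur
        temp *= cooling  # geometric cooling schedule
    return best

random.seed(0)
k_opt, T_opt = anneal(k=4, T=1.0)
```

As the temperature drops, the acceptance probability for worse moves shrinks toward zero, so the search gradually turns from exploration into hill climbing.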

Keywords: distributed system, OS-CFAR system, independent sensors, simulated annealing

Procedia PDF Downloads 497
16392 Experience Report about the Inclusion of People with Disabilities in the Process of Testing an Accessible System for Learning Management

Authors: Marcos Devaner, Marcela Alves, Cledson Braga, Fabiano Alves, Wilton Bezerra

Abstract:

This article discusses the inclusion of people with disabilities in the process of testing an accessible system for distance education. The accessible system, the team profile, and the methodologies and techniques covered in the testing process are presented. The testing process described in this paper was designed from experience with users and emerged from lessons learned in past projects; the end user is present at all stages of the tests. Lessons learned are reported, showing how the team and the methods matured, resulting in a simple, productive and effective process.

Keywords: experience report, accessible systems, software testing, testing process, systems, e-learning

Procedia PDF Downloads 396
16391 Site Selection in Adaptive Reuse Architecture for Social Housing in Johannesburg, South Africa

Authors: Setapo Moloi, Jun-Ichiro Giorgos Tsutsumi

Abstract:

South Africa’s need for the provision of housing within its major city centres, specifically in Gauteng Province (GP), is a major concern. Initiatives are currently underway to convert misused or unused buildings into suitable housing for residents who work in the city as well as for prospective citizens. One current need is the repossession and repurposing of these buildings into communities of quality, low-cost, mixed-density housing, with minimal strain on existing infrastructure through energy savings, emission reduction, etc. Unfortunately, 2017 estimates claim that 700 buildings in Johannesburg, the country’s economic capital, lie unused or misused due to issues that will be discussed in this paper; these then become hubs for illegal activity and an unacceptable form of shelter. It can be argued that the provision of inner-city social housing is lacking not because of the unavailability of funding or of usable land and buildings, but because these assets are not being used appropriately or to their full potential. Currently, the GP government has mandated the repurposing of all buildings that meet its criteria (structural stability, feasibility, adaptability, etc.), with the intention of inviting interested parties to propose conversions of the buildings into densified social housing. The proposed focus going forward is the creation of social housing communities within existing buildings, which may be retrofitted with sustainable technologies and green design strategies and principles, aiming for the finished buildings to achieve ‘Net-Zero/Positive’ status. A Net-Zero building, according to the Green Building Council of South Africa (GBCSA), is a building which produces the resources it needs to function and reduces wastage, emissions and demand for these resources during its lifespan. The categories the GBCSA considers are carbon, water, waste and ecology; these may include material selection, construction methods, etc.

Keywords: adaptive reuse, conversion, net-zero, social housing, sustainable communities

Procedia PDF Downloads 138
16390 Analysis and Identification of Different Factors Affecting Students’ Performance Using a Correlation-Based Network Approach

Authors: Jeff Chak-Fu Wong, Tony Chun Yin Yip

Abstract:

The transition from secondary school to university seems exciting for many first-year students but can be more challenging than expected. Enabling instructors to know students’ learning habits and styles enhances their understanding of the students’ learning backgrounds, allows teachers to provide better support for their students, and therefore has high potential to improve teaching quality and learning, especially in mathematics-related courses. The aim of this research is to collect students’ data using online surveys, to analyze student factors using learning analytics and educational data mining, and to discover the characteristics of students at risk of falling behind in their studies, based on the students’ previous academic backgrounds and the collected data. In this paper, we use correlation-based distance methods and mutual information to measure relationships between student factors. We then build a factor network using the Minimum Spanning Tree method and consider, as further study, analyzing the topological properties of these networks using social network analysis tools. Under the framework of mutual information, two graph-based feature filtering methods, i.e., unsupervised and supervised infinite feature selection algorithms, are used to rank and select appropriate subsets of features, yielding effective results in identifying the factors affecting students at risk of failing. This discovered knowledge may help both students and instructors enhance educational quality by finding possible under-performers at the beginning of the first semester and giving them special attention in order to support their learning process and improve their learning outcomes.
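The factor-network construction can be sketched as follows: pairwise correlations are turned into the standard correlation distance d = sqrt(2·(1 − ρ)) and a Minimum Spanning Tree is extracted with Kruskal's algorithm. The factor names and correlation values below are illustrative, not the study's data:

```python
import math

def corr_distance(rho):
    # Standard correlation-based distance: 0 for rho = 1, max 2 for rho = -1.
    return math.sqrt(2.0 * (1.0 - rho))

def minimum_spanning_tree(nodes, corr):
    """corr maps frozenset({a, b}) -> correlation; returns the MST edge list."""
    parent = {n: n for n in nodes}

    def find(n):  # union-find with path halving
        while parent[n] != n:
            parent[n] = parent[parent[n]]
            n = parent[n]
        return n

    tree = []
    for edge in sorted(corr, key=lambda e: corr_distance(corr[e])):
        a, b = tuple(edge)
        ra, rb = find(a), find(b)
        if ra != rb:          # keep the edge only if it joins two components
            parent[ra] = rb
            tree.append(edge)
    return tree

factors = ["GPA", "study_hours", "sleep", "attendance"]
corr = {
    frozenset({"GPA", "study_hours"}): 0.7,
    frozenset({"GPA", "attendance"}): 0.5,
    frozenset({"GPA", "sleep"}): 0.2,
    frozenset({"study_hours", "attendance"}): 0.4,
    frozenset({"study_hours", "sleep"}): 0.1,
    frozenset({"sleep", "attendance"}): 0.3,
}
tree = minimum_spanning_tree(factors, corr)  # 3 edges connecting 4 factors
```

Because the distance is monotone decreasing in ρ, the MST keeps the strongest correlations that still connect every factor, which is what makes its topology interpretable.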

Keywords: students' academic performance, correlation-based distance method, social network analysis, feature selection, graph-based feature filtering method

Procedia PDF Downloads 129
16389 Modifying Assessment Modes in the Science Classroom as a Solution to Examination Malpractice

Authors: Catherine Omole

Abstract:

Examination malpractice includes acts that tamper with the collection of accurate results during the conduct of an examination, thereby giving a student undue advantage over his or her colleagues. Even though examination malpractice has been a lingering problem, examinations may not be easy to do away with completely, as they are an important feedback tool in the learning process with several other functions, e.g., for the purposes of selection, placement, certification and promotion. Examination malpractice has created many problems, such as reliance on a workforce selected on the basis of false assessment results. The question is why this problem persists despite the measures taken over the years to curb this ugly trend. This opinion paper identifies modifications that could help relieve students of examination stress, thus increasing their effort towards effective learning and discouraging examination malpractice in the long run.

Keywords: assessment, examination malpractice, learning, science classroom

Procedia PDF Downloads 260
16388 Development of a New Ecological Cleaning Process for Metal Sheets

Authors: L. M. López López, J. V. Montesdeoca Contreras, A. R. Cuji Fajardo, L. E. Garzón Muñoz, J. I. Fajardo Seminario

Abstract:

In this article, a new method for cleaning the metal sheets used in household appliances was developed, using low-pressure cold plasma. In this context, this research analyzes the results of the plasma cleaning process and compares them with the pickling process to determine the efficiency of each process and the level of contamination produced. Surface cleanliness was evaluated by measuring the contact angle with deionized water, diiodomethane and ethylene glycol, and then calculating the surface free energy by means of the theories of Fowkes and of Wu. The results show that low-pressure cold plasma is very efficient both in the cleaning process and in terms of environmental impact.
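The contact-angle evaluation can be sketched with the Owens-Wendt extension of the Fowkes approach, fitting the solid's dispersive and polar surface-energy components from the three probe liquids. The liquid components below are common literature values (in mN/m) and the example angles are made up, not the article's measurements:

```python
import math

LIQUIDS = {
    # name: (total, dispersive, polar) surface tension components in mN/m
    "water": (72.8, 21.8, 51.0),
    "diiodomethane": (50.8, 50.8, 0.0),
    "ethylene_glycol": (48.0, 29.0, 19.0),
}

def surface_energy(angles_deg):
    """Least-squares fit of g_L(1 + cos t)/(2 sqrt(gd_L)) = x + y sqrt(gp_L/gd_L),
    where x = sqrt(gd_S) and y = sqrt(gp_S) for the solid.
    Returns (dispersive, polar) components of the solid surface energy."""
    n = sv = svv = sb = svb = 0.0
    for name, theta in angles_deg.items():
        g, gd, gp = LIQUIDS[name]
        b = g * (1.0 + math.cos(math.radians(theta))) / (2.0 * math.sqrt(gd))
        v = math.sqrt(gp / gd)
        n += 1.0
        sv += v
        svv += v * v
        sb += b
        svb += v * b
    det = n * svv - sv * sv            # 2x2 normal equations (Cramer's rule)
    x = (sb * svv - svb * sv) / det
    y = (n * svb - sv * sb) / det
    return x * x, y * y

# Lower contact angles (e.g., after plasma treatment) translate into a higher
# surface free energy, i.e., a cleaner, more wettable sheet.
gd_s, gp_s = surface_energy(
    {"water": 71.2, "diiodomethane": 57.5, "ethylene_glycol": 36.6})
```

With three liquids and two unknowns the fit is overdetermined, which is why a least-squares solve rather than a direct two-equation inversion is used.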

Keywords: efficient use of plasma, ecological impact of plasma, metal sheet cleaning, plasma cleaning process

Procedia PDF Downloads 354
16387 Concept Drifts Detection and Localisation in Process Mining

Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa

Abstract:

Process mining provides methods and techniques for analyzing the event logs recorded in modern information systems that support real-world operations. When analyzing an event log, state-of-the-art process mining techniques treat the operational process as a static (stationary) entity. This is often not the case, owing to a phenomenon called concept drift: during execution, the process can evolve with respect to any of its associated perspectives, exhibiting various patterns of change at different paces. The work presented in this paper discusses the main aspects to consider when addressing the concept drift phenomenon and proposes a method for detecting and localizing sudden concept drifts in the control-flow perspective of the process, using features extracted by processing the traces in the process log. Our experimental results are promising for efficiently detecting and localizing concept drift in the context of the process mining research discipline.
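Sudden control-flow drift detection can be illustrated with a simple window-comparison sketch (not the paper's exact feature set): each trace is summarized by its directly-follows pairs, and a large dissimilarity between two adjacent windows of traces flags a candidate drift point:

```python
from collections import Counter

def follows(trace):
    # Directly-follows pairs of a trace, e.g. ["A","B","C"] -> (A,B), (B,C).
    return Counter(zip(trace, trace[1:]))

def profile(window):
    # Normalized directly-follows frequencies over a window of traces.
    total = Counter()
    for trace in window:
        total.update(follows(trace))
    mass = sum(total.values())
    return {pair: count / mass for pair, count in total.items()}

def dissimilarity(win_a, win_b):
    """Total-variation distance between the two windows' profiles (0..1)."""
    pa, pb = profile(win_a), profile(win_b)
    pairs = set(pa) | set(pb)
    return 0.5 * sum(abs(pa.get(p, 0.0) - pb.get(p, 0.0)) for p in pairs)

def detect_drift(log, window=50, threshold=0.3):
    """Return trace indices where adjacent windows differ sharply."""
    return [i for i in range(window, len(log) - window + 1, window)
            if dissimilarity(log[i - window:i], log[i:i + window]) > threshold]

# 100 traces of A->B->C, then the process suddenly swaps B and C.
log = [["A", "B", "C"]] * 100 + [["A", "C", "B"]] * 100
change_points = detect_drift(log)
```

A sudden drift shows up as a single sharp spike in the window dissimilarity, whereas a gradual drift would smear across several windows; that distinction is what localization hinges on.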

Keywords: abrupt drift, concept drift, sudden drift, control-flow perspective, detection and localization, process mining

Procedia PDF Downloads 345
16386 3D Classification Optimization of Low-Density Airborne Light Detection and Ranging Point Cloud by Parameters Selection

Authors: Baha Eddine Aissou, Aichouche Belhadj Aissa

Abstract:

Light detection and ranging (LiDAR) is an active remote sensing technology used in several applications, and airborne LiDAR is becoming an important technology for acquiring highly accurate, dense point clouds. Classifying an airborne laser scanning (ALS) point cloud is a very important task that still remains a real challenge for many scientists. The support vector machine (SVM) is one of the most widely used kernel-based statistical learning algorithms. SVM is a non-parametric method, recommended in cases where the data distribution cannot be well modeled by a standard parametric probability density function. Using a kernel, it performs robust non-linear classification of samples: data are rarely linearly separable, and SVMs implicitly map them into a higher-dimensional space in which they become linearly separable, while the kernel allows all computations to be performed in the original space. This is one of the main reasons SVMs are well suited to high-dimensional classification problems. Only a few training samples, called support vectors, are required. SVM has also shown its potential to cope with uncertainty in data caused by noise and fluctuation, and it is computationally efficient compared to several other methods. Such properties are particularly suited to remote sensing classification problems and explain the recent adoption of SVMs. In this poster, SVM classification of ALS LiDAR data is proposed. First, connected component analysis is applied to cluster the point cloud. Second, the resulting clusters are fed into the SVM classifier. A radial basis function (RBF) kernel is used because of its small number of parameters (C and γ), which decreases computation time. To optimize the classification rates, parameter selection is explored: it consists of finding the parameters (C and γ) leading to the best overall accuracy using grid search and 5-fold cross-validation.
The LiDAR point cloud used is provided by the German Society for Photogrammetry, Remote Sensing, and Geoinformation. The ALS data are characterized by a low density (4-6 points/m²) and cover an urban area located in residential parts of the city of Vaihingen in southern Germany. The ground class and three classes of roof superstructure are considered, i.e., a total of 4 classes. The training and test sets were selected randomly several times. The results demonstrate that parameter selection can restrict the search to a narrow interval of (C and γ) worth further exploration, but does not systematically lead to the optimal rates. The SVM classifier with tuned hyper-parameters is compared with the classifiers most used in the literature for LiDAR data: random forest, AdaBoost, and decision tree. The comparison shows the superiority of the SVM classifier with parameter selection for LiDAR data over the other classifiers.
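The (C, γ) grid search with 5-fold cross-validation described above can be sketched with scikit-learn; the synthetic four-class data below merely stands in for the per-cluster features of the ALS point cloud:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Stand-in features: 4 classes, mimicking ground plus three roof-superstructure
# classes; real input would be the features of the connected-component clusters.
X, y = make_classification(n_samples=300, n_features=8, n_informative=6,
                           n_classes=4, random_state=0)

# Exhaustive grid over (C, gamma), scored by 5-fold cross-validated accuracy.
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5, scoring="accuracy")
search.fit(X, y)

best_params = search.best_params_  # the (C, gamma) pair with best mean accuracy
best_score = search.best_score_    # its cross-validated overall accuracy
```

In practice the grid values are often spaced logarithmically, as here, and the winning cell can then be refined with a finer grid over the narrowed interval.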

Keywords: classification, airborne LiDAR, parameter selection, support vector machine

Procedia PDF Downloads 147
16385 Study of Drawing Characteristics due to Friction between the Materials by FEM

Authors: Won Jin Ryu, Mok Tan Ahn, Hyeok Choi, Joon Hong Park, Sung Min Kim, Jong Bae Park

Abstract:

Pipes for offshore plants require specifications that satisfy both high strength and high corrosion resistance; therefore, clad pipes are currently used in offshore plants. Clad pipes can be made either by overlay welding or from clad plates. The present study used FEM to investigate the effect of friction between the two materials, with the aim of making clad pipes by heterogeneous material drawing instead of the two methods mentioned above. The FEM simulations were conducted with all variables fixed except the friction. The results showed that the pullout force increases as the friction in the boundary layer increases.

Keywords: clad pipe, FEM, friction, pullout force

Procedia PDF Downloads 494