Search results for: Hidden Markov chain with a bi-dimensional observed process
5925 Memory Leak Detection in Distributed System
Authors: Roohi Shabrin S., Devi Prasad B., Prabu D., Pallavi R. S., Revathi P.
Abstract:
Due to memory leaks, valuable system memory often gets wasted and denied to other processes, thereby degrading computational performance. If an application's memory usage exceeds the virtual memory size, it can lead to a system crash. Current memory leak detection techniques for clusters are reactive and display memory leak information only after the execution of the process (they detect a memory leak only after it occurs). This paper presents a Dynamic Memory Monitoring Agent (DMMA) technique. The DMMA framework performs dynamic memory leak detection, identifying leaks while the application is still in its execution phase. When DMMA identifies a memory leak in any process in the cluster, it informs the end users so that they can take corrective action, and it also submits the affected process to a healthy node in the system, thus providing a reliable service to the user. DMMA maintains information about the memory consumption of executing processes; based on this information and on critical states, DMMA can improve the reliability and effectiveness of cluster computing.
Keywords: Dynamic Memory Monitoring Agent (DMMA), Cluster Computing, Memory Leak, Fault Tolerant Framework, Dynamic Memory Leak Detection (DMLD).
5924 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling
Authors: Florin Leon, Silvia Curteanu
Abstract:
Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and sometimes the chemical and physical phenomena of mixtures involving polymers are poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge of its particular physical and chemical laws. Therefore, they are useful for modeling complex processes, such as the free radical polymerization of methyl methacrylate achieved in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, numerical average molecular weight and gravimetrical average molecular weight. This process is associated with non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values have a higher variation. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
Keywords: Adaptive sampling, batch bulk methyl methacrylate polymerization, large margin nearest neighbor regression, machine learning.
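As an illustration of the adaptive sampling idea described in this abstract, placing extra samples where the output varies most, the following Python sketch repeatedly bisects the interval with the largest local variation. The conversion curve, the function name adaptive_sample and the sample counts are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def adaptive_sample(x, y, n_initial=10, n_total=40):
    """Select n_total sample indices from a dense curve (x, y),
    placing extra samples where consecutive values vary the most."""
    idx = set(np.linspace(0, len(x) - 1, n_initial, dtype=int))
    while len(idx) < n_total:
        ordered = sorted(idx)
        # variation of y over each interval between already-selected points
        gaps = [(abs(y[b] - y[a]), a, b) for a, b in zip(ordered, ordered[1:]) if b - a > 1]
        if not gaps:
            break
        _, a, b = max(gaps)          # interval with the largest variation
        idx.add((a + b) // 2)        # bisect it
    return np.array(sorted(idx))

# Hypothetical conversion-versus-time curve with a sharp gel-effect-like region
t = np.linspace(0.0, 10.0, 500)
conversion = 1.0 / (1.0 + np.exp(-2.5 * (t - 6.0)))
samples = adaptive_sample(t, conversion)
print(len(samples), "samples selected; they cluster where the conversion rises sharply")
```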
5923 Comparison of different Channel Modeling Techniques used in the BPLC Systems
Authors: Justinian Anatory, Nelson Theethayi
Abstract:
The paper compares different channel models used for modeling Broadband Power-Line Communication (BPLC) systems. The models compared are Zimmermann and Dostert, Philipps, Anatory et al., and the Anatory et al. generalized Transmission Line (TL) model. The validity of each model was compared in the time domain with the ATP-EMTP software, which uses a transmission line approach. It is found that for a power-line network with a minimum number of branches all the models give similar signal/pulse time responses compared with the ATP-EMTP software; however, the Zimmermann and Dostert model indicates the same amplitude but a different time delay. It is observed that when the number of branches is increased, only the generalized TL theory approach gives results comparable with the ATP-EMTP results. The Multi-Carrier Spread Spectrum (MC-SS) system was also applied to check the implication of such behavior on the modulation schemes. It is observed that using the Philipps model on an underground cable can predict a performance up to 25 dB better than the other channel models, which can misrepresent the actual performance of the system. Also, the modified Zimmermann and Dostert model under multipath can predict a performance about 5 dB better than that predicted by the generalized TL theory. It is therefore suggested that, for realistic BPLC system design and analysis, the model based on generalized TL theory be used.
Keywords: Broadband power-line channel models, load impedance, branched network.
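For reference, the Zimmermann and Dostert model mentioned above is commonly written as an echo (multipath) model, H(f) = Σ_i g_i · e^{-(a0 + a1 f^k) d_i} · e^{-j2πf d_i / v_p}. The sketch below evaluates a response of that general form in Python; the path weights, lengths and attenuation constants are hypothetical and do not correspond to the networks analysed in the paper.

```python
import numpy as np

# Hypothetical path parameters (weights g_i and lengths d_i in metres);
# a0, a1, k are attenuation parameters and v_p the propagation speed.
g = np.array([0.64, 0.38, -0.15])
d = np.array([200.0, 222.4, 244.8])
a0, a1, k, v_p = 0.0, 7.8e-10, 1.0, 1.5e8

def H(f):
    """Multipath (echo-model) frequency response of the Zimmermann-Dostert form:
    a sum of attenuated, delayed path contributions."""
    f = np.atleast_1d(f)[:, None]
    att = np.exp(-(a0 + a1 * f**k) * d)          # cable attenuation per path
    delay = np.exp(-2j * np.pi * f * d / v_p)    # propagation delay per path
    return (g * att * delay).sum(axis=1)

freqs = np.linspace(1e6, 30e6, 1000)             # 1-30 MHz band
gain_db = 20 * np.log10(np.abs(H(freqs)))
print("Channel gain range: %.1f to %.1f dB" % (gain_db.min(), gain_db.max()))
```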
5922 Handwriting Velocity Modeling by Artificial Neural Networks
Authors: Mohamed Aymen Slim, Afef Abdelkrim, Mohamed Benrejeb
Abstract:
Handwriting is a physical demonstration of a complex cognitive process learnt by humans since childhood. People with disabilities or suffering from various neurological diseases face many difficulties resulting from problems located at the level of the muscle stimuli (EMG) or the signals from the brain (EEG), which arise at the writing stage. The handwriting velocity of the same writer or of different writers varies according to different criteria: age, attitude, mood, writing surface, etc. Therefore, it is interesting to build an experimental database of records taking, as the primary reference, the writing speed of different writers, which would allow studying the global system during the handwriting process. This paper deals with a new approach to modeling the handwriting system based on the velocity criterion through the concepts of artificial neural networks, more precisely Radial Basis Functions (RBF) neural networks. The obtained simulation results show a satisfactory agreement between the responses of the developed neural model and the experimental data for various letters and forms, and hence the efficiency of the proposed approach.
Keywords: ElectroMyoGraphic (EMG) signals, Experimental approach, Handwriting process, Radial Basis Functions (RBF) neural networks, Velocity Modeling.
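A minimal sketch of the kind of RBF regression named in this abstract is given below: Gaussian basis functions centred on a subset of the data, with output weights obtained by linear least squares. The EMG-like inputs, the target velocity and all parameter values are hypothetical, not the authors' experimental data or network.

```python
import numpy as np

def rbf_design(X, centers, sigma):
    """Gaussian RBF design matrix: one column per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma**2))

# Hypothetical training data: two EMG-like input features -> pen velocity
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
v = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.normal(size=200)

centers = X[rng.choice(len(X), 20, replace=False)]   # 20 RBF centers taken from the data
Phi = rbf_design(X, centers, sigma=0.4)
w, *_ = np.linalg.lstsq(Phi, v, rcond=None)          # output-layer weights

v_hat = rbf_design(X, centers, 0.4) @ w
print("training RMSE:", np.sqrt(np.mean((v - v_hat) ** 2)))
```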
5921 Influence of Seasons on Honeybee Wooden Hives Attack by Termites in Port Harcourt, Nigeria
Authors: A. A. Aiyeloja, G.A. Adedeji, S. L. Larinde
Abstract:
Termites have been observed to be major pre-colonisation and post-colonisation insect pests of honeybees’ wooden hives in Nigeria. However, pest studies in modern beekeeping have largely been directed towards the pests that affect the honeybees rather than the biological structure (wood) which houses them, or the influence of seasons on the pests’ activities against the hives. This study therefore investigated the influence of seasons on the intensity of hive attacks by termites over 2 years at the University of Port Harcourt, Rivers State, using visual inspection. The experimental apiary was established with 15 Kenyan top bar hives made of Triplochiton scleroxylon wood that were strategically placed and observed within the Department of Forestry and Wildlife Management arboretum. The colonized hives consistently showed comparatively lower termite infestation levels in the dry season and, consequently, also lower attacks. The results indicated the rainy season as a distinct period of more destructive termite activity on the hives, strongly associated with the dryness of the hives. Since a previous study and observations have linked colonization with the dry season, coupled with minimal attacks on colonized hives, the non-colonised hives should be removed from the field at the onset of the rainy season and returned two weeks prior to the dry season to reduce hive degradation by pests.
Keywords: Attack, hives degradation, Nigeria, seasons, termites.
5920 On the Parameter Optimization of Fuzzy Inference Systems
Authors: Erika Martinez Ramirez, Rene V. Mayorga
Abstract:
Nowadays, more and more engineering systems use some kind of Artificial Intelligence (AI) in the development of their processes. Some well-known AI techniques include artificial neural nets, fuzzy inference systems, and neuro-fuzzy inference systems, among others. Furthermore, many decision-making applications base their intelligent processes on Fuzzy Logic, due to the capability of Fuzzy Inference Systems (FIS) to deal with problems that are based on user knowledge and experience. Also, knowing that users have a wide variety of characteristics and generally provide uncertain data, this information can be used and properly processed by a FIS. To properly consider uncertainty and inexact system input values, FIS normally use Membership Functions (MF) that represent a degree of user satisfaction with certain conditions and/or constraints. In order to define the parameters of the MFs, the knowledge of experts in the field is very important. This knowledge defines the MF shape used to process the user inputs, and through fuzzy reasoning and inference mechanisms the FIS can provide an “appropriate” output. However, an important issue immediately arises: How can it be assured that the obtained output is the optimum solution? How can it be guaranteed that each MF has an optimum shape? A viable solution to these questions is the optimization of the MF parameters. In this paper a novel parameter optimization process is presented. The process for FIS parameter optimization consists of five simple steps that can be easily realized off-line. The proposed process of FIS parameter optimization is demonstrated by its implementation in an Intelligent Interface section dealing with the on-line customization/personalization of internet portals applied to e-commerce.
Keywords: Artificial Intelligence, Fuzzy Logic, Fuzzy Inference Systems, Nonlinear Optimization.
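As a toy illustration of MF parameter optimization of the kind discussed above, the sketch below tunes the three parameters of a triangular membership function off-line against a few expert-rated satisfaction values. The reference points, the search ranges and the function tri_mf are hypothetical; the paper's five-step process is not reproduced here.

```python
import numpy as np

def tri_mf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Hypothetical expert-rated satisfaction degrees for a few input values
x_ref = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
mu_ref = np.array([0.0, 0.3, 0.9, 1.0, 0.4, 0.0])

# Off-line parameter search: pick the (a, b, c) that best fits the expert ratings
best, best_err = None, np.inf
for b in np.linspace(2.0, 8.0, 25):
    for half_width in np.linspace(1.0, 6.0, 25):
        a, c = b - half_width, b + half_width
        err = np.mean((tri_mf(x_ref, a, b, c) - mu_ref) ** 2)
        if err < best_err:
            best, best_err = (a, b, c), err

print("optimized MF parameters (a, b, c):", best, "MSE:", round(best_err, 4))
```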
5919 An Experimental Investigation on the Effect of Deep cold Rolling Parameters on Surface Roughness and Hardness of AISI 4140 Steel
Authors: P. R. Prabhu, S. M. Kulkarni, S. S. Sharma
Abstract:
Deep cold rolling (DCR) is a cold working process which easily produces a smooth and work-hardened surface by plastic deformation of surface irregularities. In the present study, the influence of the main deep cold rolling process parameters on the surface roughness and the hardness of AISI 4140 steel was studied using a fractional factorial design of experiments. The surface integrity aspects of the work material were assessed in terms of identifying the predominant factor amongst the selected parameters, their order of significance, and the levels of the factors for minimizing surface roughness and/or maximizing surface hardness. It was found that the ball diameter, rolling force, initial surface roughness and number of tool passes are the most pronounced parameters, which have great effects on the workpiece's surface during the deep cold rolling process. A simple, inexpensive and newly developed DCR tool, with an interchangeable collet for using different ball diameters, was used throughout the experimental work presented in this paper.
Keywords: Deep cold rolling, design of experiments, surface hardness, surface roughness
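A minimal sketch of how main effects are estimated from a two-level fractional factorial design, of the kind used in this study, is shown below. The 2^(4-1) design with generator D = ABC, the coded factors and the roughness values are hypothetical, not the study's actual design or measurements.

```python
import numpy as np
from itertools import product

# Coded two-level settings for ball diameter (A), rolling force (B) and
# initial roughness (C); the fourth factor, number of passes (D), is
# aliased with the generator D = ABC (a 2^(4-1) fractional factorial).
base = np.array(list(product([-1, 1], repeat=3)))
design = np.column_stack([base, base.prod(axis=1)])

# Hypothetical measured surface roughness Ra (um) for the 8 runs
Ra = np.array([1.9, 1.4, 1.7, 1.1, 1.8, 1.2, 1.6, 0.9])

# Main effect of each factor = mean(Ra at +1) - mean(Ra at -1)
effects = {name: Ra[design[:, j] == 1].mean() - Ra[design[:, j] == -1].mean()
           for j, name in enumerate("ABCD")}
print("main effects:", {k: round(v, 3) for k, v in effects.items()})
```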
5918 Rock Slope Stabilization and Protection for Roads and Multi-Storey Structures in Jabal Omar, Saudi Arabia
Authors: Ibrahim Abdel Gadir Malik, Dafalla Siddig Dafalla, Abdelazim Ibrahim
Abstract:
Jabal Omar is located on the western side of Makkah city in Saudi Arabia. The proposed Jabal Omar Development project includes several multi-storey buildings, roads, bridges and below-ground structures founded at various depths. In this study, geological mapping and site inspection covering pre-selected areas were carried out within the easily accessed parts. Geological features, including rock types, structures, degree of weathering and geotechnical hazards, were observed, analyzed with dedicated software and documented in the form of photographs. The presence of joints and fractures in the area makes the rock blocks small and weak. The site is heavily jointed; it was observed that the northern side consists of 3 to 4 jointing systems with 2 random fractures associated with dykes. The southern part is affected by 2 to 3 jointing systems with minor fault and shear zones. From the field measurements and observations, it was concluded that Jabal Omar is intruded by andesitic and basaltic dykes of different thickness and orientation. These dykes made the outcrop weak and highly deformed and made the rock masses sensitive to weathering.
Keywords: Rock, slope, stabilization, protection, Makkah.
5917 Improvement of Reaction Technology of Decalin Halogenation
Authors: Dmitriy Yu. Korulkin, Ravshan M. Nuraliev, Raissa A. Muzychkina
Abstract:
In this research paper, the main regularities of the radical bromination reaction of decalin were investigated. The effects of temperature, reaction duration, number of process repetitions, ratio of initial components, and type and amount of initiator on the degree of decalin bromination were studied. Optimum conditions for the synthesis of perbromodecalin by decalin bromination were specified. The technological flowchart for producing perbromodecalin and the mass balance of the process for the first and subsequent loadings of components were developed. The results of research on the antibacterial and antifungal activity of the synthesized bromo derivatives are presented.
Keywords: Decalin, optimum technology, perbromodecalin, radical bromination.
5916 Mathematical Modeling to Predict Surface Roughness in CNC Milling
Authors: Ab. Rashid M.F.F., Gan S.Y., Muhammad N.Y.
Abstract:
Surface roughness (Ra) is one of the most important requirements in the machining process. In order to obtain better surface roughness, the proper setting of cutting parameters is crucial before the process takes place. This research presents the development of a mathematical model for surface roughness prediction before milling in order to evaluate the fitness of the machining parameters: spindle speed, feed rate and depth of cut. 84 samples were run in this study using a FANUC CNC milling machine α-T14iE. Those samples were randomly divided into two data sets: a training set (m = 60) and a testing set (m = 24). ANOVA analysis showed that at least one of the population regression coefficients was not zero. The Multiple Regression Method was used to determine the correlation between a criterion variable and a combination of predictor variables. It was established that the surface roughness is most influenced by the feed rate. Using the Multiple Regression Method equation, the average percentage deviation was 9.8% for the testing set and 9.7% for the training data set. This showed that the statistical model could predict the surface roughness with about 90.2% accuracy for the testing data set and 90.3% accuracy for the training data set.
Keywords: Surface roughness, regression analysis.
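A minimal sketch of the modeling approach described above, a multiple linear regression of Ra on spindle speed, feed rate and depth of cut evaluated by the average percentage deviation on a 60/24 split, is given below. The data are synthetic and the coefficients are hypothetical; only the procedure mirrors the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical machining data: spindle speed (rpm), feed rate (mm/min), depth of cut (mm)
n = 84
X = np.column_stack([rng.uniform(1000, 4000, n),
                     rng.uniform(100, 400, n),
                     rng.uniform(0.2, 1.0, n)])
Ra = 0.2 + 0.004 * X[:, 1] - 0.0001 * X[:, 0] + 0.3 * X[:, 2] + rng.normal(0, 0.05, n)

# Random split into training (60) and testing (24) sets, as in the study
idx = rng.permutation(n)
train, test = idx[:60], idx[60:]

A = np.column_stack([np.ones(n), X])                 # design matrix with intercept
coef, *_ = np.linalg.lstsq(A[train], Ra[train], rcond=None)

def avg_pct_dev(y, y_hat):
    """Average percentage deviation between measured and predicted Ra."""
    return 100.0 * np.mean(np.abs((y - y_hat) / y))

print("test deviation: %.1f%%" % avg_pct_dev(Ra[test], A[test] @ coef))
print("train deviation: %.1f%%" % avg_pct_dev(Ra[train], A[train] @ coef))
```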
5915 Distribution of Phospholipids, Cholesterol and Carotenoids in Two-Solvent System during Egg Yolk Oil Solvent Extraction
Authors: Aleksandrs Kovalcuks, Mara Duma
Abstract:
Egg yolk oil is a concentrated source of egg bioactive compounds, such as fat-soluble vitamins, phospholipids, cholesterol, carotenoids and others. To extract lipids and other fat-soluble nutrients from liquid egg yolk, a two-step extraction process involving polar (ethanol) and non-polar (hexane) solvents was used. This extraction technique is based on the polarities of egg yolk bioactive compounds, where non-polar compounds are extracted into the non-polar hexane, and polar compounds into the polar alcohol/water phase. But many egg yolk bioactive compounds are not strongly polar or non-polar. Egg yolk phospholipids, cholesterol and pigments are amphipathic (they have both polar and non-polar regions) and their behavior in the ethanol/hexane solvent system is not clear. The aim of this study was to clarify the behavior of phospholipids, cholesterol and carotenoids during the extraction of egg yolk oil with ethanol and hexane and to determine the loss of these compounds in egg yolk oil. Egg yolks and egg yolk oil were analyzed for phospholipid (phosphatidylcholine (PC) and phosphatidylethanolamine (PE)), cholesterol and carotenoid (lutein, zeaxanthin, canthaxanthin and β-carotene) content using GC-FID and HPLC methods. PC and PE are polar lipids and were extracted into the polar ethanol phase. The concentration of PC in ethanol was 97.89% and that of PE 99.81% of the total egg yolk phospholipids. Due to the partial extraction of cholesterol into ethanol, the cholesterol content in egg yolk oil was reduced in comparison to its total content present in egg yolk lipids. The highest amounts of lutein and zeaxanthin were concentrated in the ethanol extract. The opposite situation was observed with canthaxanthin and β-carotene, which became the main pigments of the egg yolk oil.
Keywords: Cholesterol, egg yolk oil, lutein, phospholipids, solvent extraction.
5914 A Study of the Variables in the Optimisation of a Platinum Precipitation Process
Authors: Tebogo Phetla, Edison Muzenda, M Belaid
Abstract:
This study investigated possible ways to improve the efficiency of the platinum precipitation process using ammonium chloride by reducing the platinum content reporting to the effluent. The ore treated consists of five platinum group metals, namely ruthenium, rhodium, iridium, platinum and palladium, and a precious metal, gold. Gold, ruthenium, rhodium and iridium were extracted prior to the platinum precipitation process. Temperature, reducing agent, flow rate and potential difference were the variables controlled to determine the operating conditions for optimum platinum precipitation efficiency. Hydrogen peroxide was added as the oxidizing agent at a temperature of 85–90°C, and a potential difference of 700–850 mV was used to check the oxidizing state of the platinum. The platinum was further purified at a temperature between 60–65°C, a potential difference above 700 mV and an ammonium chloride addition of 200 l; at these conditions the platinum content reporting to the effluent was reduced to less than 300 ppm, resulting in optimum platinum precipitation efficiency and a purity of 99.9%.
Keywords: Platinum Group Metals (PGM), potential difference, precipitation, redox reactions.
5913 Process Development of Safe and Ready-to-eat Raw Oyster Meat by Irradiation Technology
Authors: Pattama Ratana-Arporn, Pongtep Wilaipun
Abstract:
White scar oyster (Crassostrea belcheri) is often eaten raw and is a leading vehicle for foodborne disease, especially Salmonella Weltevreden, which was found to be the most prominent and the most resistant to radiation. Gamma irradiation at a low dose of 1 kGy was enough to eliminate S. Weltevreden contaminating oyster meat at a level of up to 5 log CFU/g, while the meat still retained its raw characteristics and a sensory quality equivalent to the non-irradiated one. Process development of ready-to-eat chilled oyster meat was conducted by shucking the meat, packing it individually in plastic bags, subjecting it to 1 kGy gamma radiation under chilled conditions and then storing it at a refrigerated temperature of 4°C. Microbiological determination showed the absence of S. Weltevreden (from an initial inoculation of 5 log CFU/g) over the whole storage time of 30 days. Sensory evaluation indicated decreasing sensory scores along the storage time, determining the product shelf life to be 18 days compared to 15 days for the non-irradiated one. The main advantage of the developed process is that it provides safe raw oyster to consumers while retaining sensory quality and offering a 3-day extension of shelf life.
Keywords: Decontamination, food safety, irradiation, oyster, Salmonella Weltevreden.
5912 Simultaneous Saccharification and Fermentation (SSF) of Sugarcane Bagasse - Kinetics and Modeling
Authors: E.Sasikumar, T.Viruthagiri
Abstract:
Simultaneous Saccharification and Fermentation (SSF) of sugarcane bagasse by cellulase and Pachysolen tannophilus MTCC *1077 was investigated in the present study. Important process variables for ethanol production from pretreated bagasse were optimized using Response Surface Methodology (RSM) based on central composite design (CCD) experiments. A 2³ five-level CCD with central and axial points was used to develop a statistical model for the optimization of process variables such as incubation temperature (25–45°C) X1, pH (5.0–7.0) X2 and fermentation time (24–120 h) X3. Data obtained from RSM on ethanol production were subjected to analysis of variance (ANOVA) and analyzed using a second-order polynomial equation, and contour plots were used to study the interactions among the three relevant variables of the fermentation process. The fermentation experiments were carried out using an online-monitored modular fermenter of 2 L capacity. The processing parameter setup reaching a maximum response for ethanol production was obtained when applying the optimum values for temperature (32°C), pH (5.6) and fermentation time (110 h). A maximum ethanol concentration of 3.36 g/l was obtained from 50 g/l pretreated sugarcane bagasse at the optimized process conditions in aerobic batch fermentation. Kinetic models such as the Monod, modified logistic, modified logistic incorporated Luedeking–Piret and modified logistic incorporated modified Luedeking–Piret models were evaluated and their constants were predicted.
Keywords: Sugarcane bagasse, ethanol, optimization, Pachysolen tannophilus.
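As an illustration of the kinetic modeling step, the sketch below fits a logistic growth model (one of the model families named above) to hypothetical batch fermentation data with SciPy's curve_fit. The data points and the fitted constants are illustrative only, not the study's results.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, x0, xmax, mu):
    """Logistic growth model often used as a kinetic model for batch fermentation
    (written in a numerically stable form)."""
    return x0 * xmax / (x0 + (xmax - x0) * np.exp(-mu * t))

# Hypothetical biomass measurements (g/L) over fermentation time (h)
t = np.array([0, 12, 24, 48, 72, 96, 120], dtype=float)
x = np.array([0.2, 0.45, 0.9, 2.4, 3.6, 4.1, 4.2])

params, _ = curve_fit(logistic, t, x, p0=[0.2, 4.0, 0.05])
x0, xmax, mu = params
print("fitted x0=%.2f g/L, xmax=%.2f g/L, mu=%.3f 1/h" % (x0, xmax, mu))
```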
5911 Unpacking Chilean Preservice Teachers’ Beliefs on Practicum Experiences through Digital Stories
Authors: Claudio Díaz, Mabel Ortiz
Abstract:
An EFL teacher education programme in Chile takes five years to train a future teacher of English. Preservice teachers are prepared to learn an advanced level of English and to teach the language from 5th to 12th grade in the Chilean educational system. In the context of their first EFL Methodology course in year four, preservice teachers have to create a five-minute digital story that starts from a critical incident they have experienced as teachers-to-be during their observations or interventions in the schools. A critical incident can be defined as a happening, a specific incident or event either observed by them or involving them. The happening sparks their thinking and may make them subsequently think differently about the particular event. When they create their digital stories, preservice teachers put technology, teaching practice and theory together to narrate a story that is complemented by still images, moving images, text, sound effects and music. The story should be told as a personal narrative which explains the critical incident. This presentation will focus on the creation process of 50 Chilean preservice teachers’ digital stories, highlighting the critical incidents from which they started their stories. It will also unpack preservice teachers’ beliefs and reflections when approaching their teaching practices in schools. These beliefs will be coded and categorized through content analysis to evidence preservice teachers’ most deeply rooted conceptions about English teaching and learning in Chilean schools. The findings seem to indicate that preservice teachers’ beliefs are strongly mediated by contextual and affective factors.
Keywords: Beliefs, digital stories, preservice teachers, practicum.
5910 The Impact of the Cell-Free Solution of Lactic Acid Bacteria on Cadaverine Production by Listeria monocytogenes and Staphylococcus aureus in Lysine-Decarboxylase Broth
Authors: Fatih Özogul, Nurten Toy, Yesim Özogul
Abstract:
The influences of cell-free solutions (CFSs) of lactic acid bacteria (LAB) on the production of cadaverine and other biogenic amines by Listeria monocytogenes and Staphylococcus aureus were investigated in lysine decarboxylase broth (LDB) using HPLC. Cell-free solutions were prepared from Lactococcus lactis subsp. lactis, Leuconostoc mesenteroides subsp. cremoris, Pediococcus acidilactici and Streptococcus thermophilus. Two different concentrations, 50% and 25% CFS, and a control without CFS were prepared. Significant variations in biogenic amine production were observed in the presence of L. monocytogenes and S. aureus (P < 0.05). The effect of CFS on biogenic amine production by the foodborne pathogens varied depending on the strain and the specific amine. Cadaverine formation by L. monocytogenes and S. aureus in the control was 500.9 and 948.1 mg/L, respectively, while the CFSs of LAB induced 4-fold lower cadaverine production by L. monocytogenes and 7-fold lower cadaverine production by S. aureus. The CFSs resulted in strong decreases in cadaverine and putrescine production by L. monocytogenes and S. aureus, although remarkable increases were observed for histamine, spermidine, spermine, serotonin, dopamine, tyramine and agmatine in the presence of LAB in lysine decarboxylase broth.
Keywords: Cell-free solution, lactic acid bacteria, cadaverine, food borne-pathogen.
5909 Pin type Clamping Attachment for Remote Setup of Machining Process
Authors: Afzeri, R. Muhida, Darmawan, A. N. Berahim
Abstract:
Sharing a manufacturing facility through remote operation and monitoring of a machining process is a challenge for effective use of the production facility. Several automation tools, in terms of both hardware and software, are necessary for successful remote operation of a machine. This paper presents a prototype workpiece-holding attachment for remote operation of a milling process through self-configuration of the workpiece setup. The prototype is designed with a mechanism to reorient the work surface into the machining spindle direction with high positioning accuracy. A variety of part geometries can be held by the attachment to perform single-setup machining. Pins in an array pattern additionally clamp the workpiece surface from two opposite directions to increase the machining rigidity. The optimum pin configuration for conforming to the workpiece geometry with minimum deformation is determined through hybrid algorithms combining Genetic Algorithms (GA) and Particle Swarm Optimization (PSO). The prototype with this intelligent optimization technique is able to hold a variety of workpiece geometries, which makes it suitable for machining low-volume repetitive production in remote operation.
Keywords: Optimization, remote machining, Genetic Algorithms, machining fixture.
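A highly simplified sketch of the optimization idea is shown below: a plain genetic algorithm choosing which pins of an array to activate under a hypothetical deformation-versus-effort cost. The cost function, pin count and GA settings are assumptions for illustration; the paper's hybrid GA–PSO formulation and its real deformation model are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
N_PINS = 16                       # pins available in a 4x4 array

def cost(mask):
    """Hypothetical objective: fewer active pins -> more deformation,
    plus a small penalty per active pin (clamping effort)."""
    active = mask.sum()
    return 1.0 / (active + 1e-9) + 0.02 * active

def ga(pop_size=30, generations=60, p_mut=0.05):
    pop = rng.integers(0, 2, size=(pop_size, N_PINS))        # binary pin masks
    for _ in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]  # truncation selection
        cut = rng.integers(1, N_PINS, size=pop_size // 2)    # one-point crossover
        children = np.array([np.concatenate([parents[i % len(parents)][:c],
                                             parents[(i + 1) % len(parents)][c:]])
                             for i, c in enumerate(cut)])
        mutate = rng.random(children.shape) < p_mut           # bit-flip mutation
        children = np.where(mutate, 1 - children, children)
        pop = np.vstack([parents, children])
    best = pop[np.argmin([cost(ind) for ind in pop])]
    return best, cost(best)

best_mask, best_cost = ga()
print("active pins:", int(best_mask.sum()), "cost: %.3f" % best_cost)
```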
5908 Destination of the Solid Waste Generated at the Agricultural Products Wholesale Market in Brazil
Authors: C de Almeida, I. M. Dal Fabbro
Abstract:
The Brazilian Agricultural Products Wholesale Market fits well as an example of a residue-generating system, reaching 750 metric tons per month of total residues, of which 600 metric tons are organic material and 150 metric tons are recyclable materials. The organic material is basically composed of fruit, vegetable and flower leftovers from the commercialization of the products. The recyclable compounds are generated from packing material employed in the commercialization process. This research work carried out a quantitative analysis of the residues generated at the agricultural enterprise with respect to their final destination. The data survey followed the directions implemented by the Residues Management Program issued by the agricultural enterprise. The analysis showed the necessity of changing the logistics applied to the recyclable material collection process. Composting was chosen as the destination of the organic compounds, which is considered adequate for a material whose percentage of organic matter is far higher than its wood, cardboard and plastics contents.
Keywords: Composting, environment, recycling, solid waste.
5907 Specialized Reduced Models of Dynamic Flows in 2-Stroke Engines
Authors: S. Cagin, X. Fischer, E. Delacourt, N. Bourabaa, C. Morin, D. Coutellier, B. Carré, S. Loumé
Abstract:
The complexity of scavenging by ports and its impact on engine efficiency create the need to understand and to model it as realistically as possible. However, there are few empirical scavenging models and these are highly specialized. In a design optimization process, they appear very restricted and their field of use is limited. This paper presents a comparison of two methods to establish and reduce a model of the scavenging process in 2-stroke diesel engines. To address the lack of scavenging models, a CFD model has been developed and is used as the reference case. However, its large size requires a reduction. Two techniques have been tested depending on their fields of application: the NTF method and neural networks. They both appear highly appropriate, drastically reducing the model’s size (over 90% reduction) with a low relative error rate (under 10%). Furthermore, each method produces a reduced model which can be used in a distinct specialized field of application: the distribution of a quantity (mass fraction for example) in the cylinder at each time step (pseudo-dynamic model) or the qualification of scavenging at the end of the process (pseudo-static model).
Keywords: Diesel engine, Design optimization, Model reduction, Neural network, NTF algorithm, Scavenging.
5906 The Synthetic T2 Quality Control Chart and its Multi-Objective Optimization
Authors: Francisco Aparisi, Marco A. de Luna
Abstract:
In some real applications of Statistical Process Control it is necessary to design a control chart that does not react to small process shifts but keeps a good performance in detecting moderate and large shifts in quality. In this work we develop a new quality control chart, the synthetic T2 control chart, that can be designed to meet this objective. A multi-objective optimization is carried out employing Genetic Algorithms, finding the Pareto-optimal front of non-dominated solutions for this optimization problem.
Keywords: Multi-objective optimization, quality control, SPC, synthetic T2 control chart.
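The core of the multi-objective step mentioned above is keeping only non-dominated designs. The sketch below filters a set of hypothetical candidate chart designs (two objectives, both minimized) down to their Pareto front; the objective values are random placeholders, not results from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical chart designs: objective 1 = out-of-control ARL (minimize),
# objective 2 = false-alarm rate (minimize). Each row is one candidate design.
objs = np.column_stack([rng.uniform(2, 20, 60), rng.uniform(0.001, 0.01, 60)])

def pareto_front(points):
    """Return the indices of non-dominated points (both objectives minimized)."""
    keep = []
    for i, p in enumerate(points):
        # p is dominated if some other point is <= in all objectives and < in at least one
        dominated = np.any(np.all(points <= p, axis=1) & np.any(points < p, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

front = pareto_front(objs)
print(len(front), "non-dominated designs out of", len(objs))
```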
5905 The Sign in the Communication Process
Authors: S. Pesina, T. Solonchak
Abstract:
In the process of information transmission (concept verbalization) we deal mostly with the substance (contents), and only then pay attention to the form. Recalling events from the remote past, we often cannot exactly reproduce the specific words heard or pronounced, nor the syntactic structures. We remember events, feelings, images; we recall the general contents of the discourse. The thought takes a specific language form only during the concept verbalization phase. With minimal time for pondering, and depending on the level of language competence, the grammatical and syntactic shaping often occurs automatically with the use of well-known models and stereotypes. This means that the language form adapts itself to the consciousness, and not vice versa.
Keywords: Lexical eidos, phenomenology, noema, polysemantic word, semantic core.
5904 Application of Whole Genome Amplification Technique for Genotype Analysis of Bovine Embryos
Authors: S. Moghaddaszadeh-Ahrabi, S. Farajnia, Gh. Rahimi-Mianji, A. Nejati-Javaremi
Abstract:
In recent years, there has been an increasing interest in the use of genotyped bovine embryos for commercial embryo transfer programs. Biopsy of a few cells at the morula stage is essential for preimplantation genetic diagnosis (PGD). The low amount of DNA has limited the number of molecular analyses that can be performed within PGD. Whole genome amplification (WGA) promises to eliminate this problem. We evaluated the possibility and performance of an improved primer extension preamplification (I-PEP) method with a range of starting bovine genomic DNA from 1–8 cells in the WGA reaction. We optimized a short and simple I-PEP (ssI-PEP) procedure (~3 h). This optimized WGA method was assessed by six locus-specific polymerase chain reactions (PCRs), including restriction fragment length polymorphism (RFLP). The optimized WGA procedure possesses enough sensitivity for molecular genetic analyses from a few input cells. This opens a new era for generating characterized bovine embryos at the preimplantation stage.
Keywords: Whole genome amplification (WGA), genotyping, bovine, preimplantation genetic diagnosis (PGD).
5903 A Contribution to 3D Modeling of Manufacturing Tolerance Optimization
Authors: F. Sebaa, A. Cheikh, M. Rahou
Abstract:
The study of the defects generated on manufactured parts shows the difficulty of maintaining parts in their positions during the machining process and of estimating these defects during pre-process planning. This work presents a contribution to the development of 3D models for the optimization of manufacturing tolerances. An experimental study allows the measurement of the part positioning defects for the determination of ε and the choice of an optimal setup of the part. A 3D tolerancing approach based on the small displacements method permits the determination of the manufacturing errors upstream. A developed tool allows automatic generation of the tolerance intervals along the three axes.
Keywords: Manufacturing tolerances, 3D modeling, optimization, errors.
5902 Placement of Implants in Palatum of a Teenager without Maxillary Incisor Teeth
Authors: Luan Mavriqi, Ilma Robo, Emin Kuzimi, Egresa Baca
Abstract:
The process of skeletal growth in an adolescent significantly affects the displacement of implants placed in the palatine suture. The problems caused by this process have an impact on the dental function and aesthetics of the affected area. If fixed prostheses supported by implants are placed, the whole structure would impede maxillary growth. This is the significant difference between the maxilla and the mandible, as the lower jaw has no growth process that affects the movement of implants, nor do the implants inhibit the growth of the jaw. A teenage patient suffered an accident accompanied by the loss of the maxillary central incisors. The patient's main complaints are aesthetics and phonetics. The patient's dental history refers to the presence of a Maryland bridge that was accompanied by dissatisfaction on the part of the patient. Implant placement is not indicated, as growth of the jaw may lead to displacement of the implant. The treatment plan includes the placement of implants in the palatum, where the bone thickness allows this procedure. In this article, only the first stage of treatment is presented. Implant treatment is ongoing and will be followed by the second phase of treatment when the patient reaches the age of 18 years.
Keywords: Implants, palatum, adolescent, primary incisor teeth.
5901 An Approach for Coagulant Dosage Optimization Using Soft Jar Test: A Case Study of Bangkhen Water Treatment Plant
Authors: Ninlawat Phuangchoke, Waraporn Viyanon, Setta Sasananan
Abstract:
The most important process in a water treatment plant is coagulation, which uses alum and poly aluminum chloride (PACL). Therefore, determining the dosage of alum and PACL is the most important factor to be prescribed. This research applies an artificial neural network (ANN), trained with the Levenberg–Marquardt algorithm, to create a mathematical model (Soft Jar Test) for predicting the chemical doses used for coagulation, such as alum and PACL, with input data consisting of turbidity, pH, alkalinity, conductivity and oxygen consumption (OC) of the Bangkhen Water Treatment Plant (BKWTP), under the authority of the Metropolitan Waterworks Authority of Thailand. The data were collected from 1 January 2019 to 31 December 2019 in order to cover the changing seasons of Thailand. The input data of the ANN are divided into three groups: training set, test set, and validation set. The coefficient of determination and the mean absolute error are 0.73 and 3.18 for the alum model, and 0.59 and 3.21 for the PACL model, respectively.
Keywords: Soft jar test, jar test, water treatment plant process, artificial neural network.
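A minimal sketch of the approach is given below: a tiny one-hidden-layer network whose weights are fitted by a Levenberg–Marquardt least-squares solver (SciPy's least_squares with method='lm'), standing in for the ANN described above. The water-quality inputs, the dose values and the network size are hypothetical, not the BKWTP data or the authors' model.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(4)

# Hypothetical raw-water quality records: turbidity, pH, alkalinity, conductivity, OC
X = rng.uniform(0, 1, size=(120, 5))
alum = 20 + 15 * X[:, 0] + 5 * X[:, 4] + rng.normal(0, 1, 120)   # hypothetical dose (mg/L)

H = 4                                    # hidden neurons
def unpack(w):
    W1 = w[:5 * H].reshape(5, H)
    b1 = w[5 * H:5 * H + H]
    W2 = w[5 * H + H:5 * H + 2 * H]
    b2 = w[-1]
    return W1, b1, W2, b2

def predict(w, X):
    W1, b1, W2, b2 = unpack(w)
    return np.tanh(X @ W1 + b1) @ W2 + b2

def residuals(w):
    return predict(w, X) - alum

w0 = rng.normal(0, 0.1, size=5 * H + 2 * H + 1)
fit = least_squares(residuals, w0, method="lm")     # Levenberg-Marquardt on the residuals
pred = predict(fit.x, X)
ss_res = np.sum((alum - pred) ** 2)
ss_tot = np.sum((alum - alum.mean()) ** 2)
print("R^2 = %.2f, MAE = %.2f mg/L" % (1 - ss_res / ss_tot, np.abs(alum - pred).mean()))
```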
5900 Slow, Wet and Catalytic Pyrolysis of Fowl Manure
Authors: Renzo Carta, Mario Cruccu, Francesco Desogus
Abstract:
This work presents the experimental results obtained at a pilot plant which works with a slow, wet and catalytic pyrolysis process of dry fowl manure. This kind of process mainly consists of the cracking of the organic matrix and the subsequent reaction of carbon with water, which is either already contained in the organic feed or added, to produce carbon monoxide and hydrogen. Reactions are conducted in a rotating reactor maintained at a temperature of 500°C; the required amount of water is about 30% of the dry organic feed. This operation yields a gas containing about 59% (on a volume basis) hydrogen and 17% carbon monoxide, with other products such as light hydrocarbons (methane, ethane, propane) and carbon dioxide in lesser amounts. The gas coming from the reactor can be used to produce not only electricity, through internal combustion engines, but also heat, through direct combustion in industrial boilers. Furthermore, as the produced gas is devoid of both solid particles and pollutant species (such as dioxins and furans), the process (in this case applied to fowl manure) can be considered an optimal way for the disposal and the simultaneous energetic valorization of organic materials, in a way that is not damaging to the environment.
Keywords: Brushwood, fowl manure, kenaf, pilot plant, pyrolysis, pyrolysis gas.
5899 Signal Driven Sampling and Filtering a Promising Approach for Time Varying Signals Processing
Authors: Saeed Mian Qaisar, Laurent Fesquet, Marc Renaudin
Abstract:
Mobile systems are powered by batteries. Reducing the system power consumption is key to increasing their autonomy. It is known that such systems mostly deal with time-varying signals. Thus, we aim to achieve power efficiency by smartly adapting the system processing activity in accordance with the local characteristics of the input signal. This is done by completely rethinking the processing chain and adopting signal-driven sampling and processing. In this context, a signal-driven filtering technique based on level-crossing sampling is devised. It adapts the sampling frequency and the filter order by analysing the local variations of the input signal. Thus, it correlates the processing activity with the signal variations. This leads to a drastic computational gain of the proposed technique compared to the classical one.
Keywords: Level Crossing Sampling, Activity Selection, Adaptive Rate Filtering, Computational Complexity.
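A minimal sketch of level-crossing sampling, the mechanism named above, is given below: a sample is kept only when the signal crosses a level of a uniform grid, so the sample count naturally follows the signal activity. The test signal and the quantum q are hypothetical.

```python
import numpy as np

def level_crossing_sample(t, x, q=0.1):
    """Keep a sample each time the signal crosses a level of the grid {k*q}."""
    samples = [(t[0], x[0])]
    last_level = np.floor(x[0] / q)
    for ti, xi in zip(t[1:], x[1:]):
        level = np.floor(xi / q)
        if level != last_level:          # a level of the grid was crossed
            samples.append((ti, xi))
            last_level = level
    return np.array(samples)

# Hypothetical time-varying signal: quiet segment followed by a burst of activity
t = np.linspace(0, 2, 4000)
x = np.where(t < 1, 0.05 * np.sin(2 * np.pi * t), np.sin(2 * np.pi * 20 * t))
s = level_crossing_sample(t, x)
print("kept", len(s), "of", len(t), "samples; most of them fall in the active segment")
```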
5898 Integration Methods and Processes of Product Design and Flexible Production for Direct Production within the iCIM 3000 System
Authors: Roman Ružarovský, Radovan Holubek, Daynier Rolando Delgado Sobrino
Abstract:
Production engineering is currently characterized by the integration of industrial automation and robotics and by a very rapid turnaround in manufacturing products. The production range is continuously changing and expanding, and producers have to be flexible in this regard. This means they need to offer production possibilities which can respond to quick change. Engineering product development is supported by CAD software; such systems are mainly used for product design. For manufacturers to be competitive, the procured machines should be capable of responding with output flexibility. A response to this problem is the development of flexible manufacturing systems, consisting of various automated subsystems. The integration of flexible manufacturing systems and their subunits together with product design and engineering is a possible solution for this issue. Integration is possible through the implementation of CIM systems. Such a solution, and finding a link between CAD and the iCIM 3000 production system from Festo Co., is the subject of the research project and of this contribution. Through the development of methods and processes of integration, products can be designed in CAD systems and the manufacturing process can be followed from order to shipping; this improves support for product design parameters by monitoring the production process and by creating programs for production using the CAD data, and therefore accelerates the total process from design to implementation.
Keywords: CAD- Computer Aided Design, CAM- Computer Aided Manufacturing, CIM- Computer integrated manufacturing, iCIM 3000, integration, direct production from CAD.
5897 Retrieving Extended High Dynamic Range from Digital Negative Image - An Experiment on Architectural Photo Imaging
Authors: See Zi Siang, Khairul Hazrin Hashim, Harold Thwaites, Lee Xia Sheng, Ooi Wooi Har
Abstract:
The paper explores the development and optimization of a method and apparatus for retrieving extended high dynamic range from a digital negative image. Architectural photo imaging can benefit from the high dynamic range imaging (HDRI) technique for preserving and presenting sufficient luminance in the shadow and highlight clipping areas of an image. The HDRI technique that requires multiple exposure images as the source of HDRI rendering may not be effective in terms of time efficiency during the acquisition process and the post-processing stage, considering that it has numerous potential imaging variables and technical limitations during the multiple exposure process. This paper explores an experimental method and apparatus that aims to expand the dynamic range from a digital negative image in an HDRI environment. The method and apparatus explored are based on a single-source RAW image acquisition for use in HDRI post-processing. They cater for optimization in order to avoid and minimize the conventional HDRI photographic errors caused by different physical conditions during the photographing process and by the misalignment of multiple-exposure image sequences. The study observes the characteristics and capabilities of the RAW image format as a digital negative used for retrieving extended high dynamic range in an HDRI environment.
Keywords: High Dynamic Range Image, Photography Workflow Optimization, Digital Negative Image, Architectural Image
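As a rough illustration of working from a single digital negative, the sketch below generates bracketed pseudo-exposures from a hypothetical linear RAW-like array and fuses them with simple well-exposedness weights. This is an assumption-laden toy, not the authors' method or apparatus.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical linear "digital negative" data (normalized sensor values)
raw = rng.uniform(0.0, 1.0, size=(64, 64)) ** 2.2

def pseudo_exposure(raw, ev):
    """Push/pull the linear data by ev stops and clip like a rendered exposure."""
    return np.clip(raw * 2.0 ** ev, 0.0, 1.0)

def fuse(exposures, sigma=0.2):
    """Weighted average favouring mid-tone (well-exposed) pixels in each frame."""
    stack = np.stack(exposures)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2)) + 1e-6
    return (weights * stack).sum(axis=0) / weights.sum(axis=0)

brackets = [pseudo_exposure(raw, ev) for ev in (-2, 0, +2)]
fused = fuse(brackets)
print("fused range: %.3f to %.3f" % (fused.min(), fused.max()))
```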
5896 An Automatic Model Transformation Methodology Based on Semantic and Syntactic Comparisons and the Granularity Issue Involved
Authors: Tiexin Wang, Sebastien Truptil, Frederick Benaben
Abstract:
Model transformation, as a pivotal aspect of model-driven engineering, attracts more and more attention both from researchers and practitioners. Many domains (enterprise engineering, software engineering, knowledge engineering, etc.) use model transformation principles and practices to address their domain-specific problems; furthermore, model transformation can also be used to bridge the gap between different domains by sharing and exchanging knowledge. Since model transformation has been widely used, a new requirement has emerged: to define the transformation process effectively and efficiently and to reduce the manual effort involved in it. This paper presents an automatic model transformation methodology based on semantic and syntactic comparisons, and focuses particularly on the granularity issue that exists in the transformation process. Compared to traditional model transformation methodologies, this methodology serves a general, cross-domain purpose. Semantic and syntactic checking measurements are combined into a refined transformation process, which solves the granularity issue. Moreover, the semantic and syntactic comparisons are supported by a software tool, so that manual effort is replaced in this way.
Keywords: Automatic model transformation, granularity issue, model-driven engineering, semantic and syntactic comparisons.
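As a toy illustration of combining semantic and syntactic comparisons for element matching, the sketch below scores candidate correspondences with a weighted sum of a synonym-table check and a character-level similarity. The synonym table, element names and weights are hypothetical; the methodology's actual measurements are not reproduced.

```python
from difflib import SequenceMatcher

# Hypothetical synonym table standing in for a semantic resource (e.g. a thesaurus)
SYNONYMS = {("client", "customer"), ("order", "purchase")}

def syntactic(a, b):
    """Character-level similarity of the two element names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic(a, b):
    """1.0 if the names are equal or listed as synonyms, else 0.0."""
    pair = tuple(sorted((a.lower(), b.lower())))
    return 1.0 if a.lower() == b.lower() or pair in SYNONYMS else 0.0

def match_score(a, b, w_sem=0.6, w_syn=0.4):
    """Combined score used to decide whether two model elements correspond."""
    return w_sem * semantic(a, b) + w_syn * syntactic(a, b)

source_elements = ["Client", "Order", "Address"]
target_elements = ["Customer", "Purchase", "Addr"]
for s in source_elements:
    best = max(target_elements, key=lambda t: match_score(s, t))
    print(f"{s} -> {best} (score {match_score(s, best):.2f})")
```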