Search results for: backward-facing step
2554 Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter
Authors: Ananya Das, Arun Kumar, Gazala Habib, Vivekanandan Perumal
Abstract:
Regulatory bodies have proposed limits on Particulate Matter (PM) concentration in air; however, these limits do not explicitly incorporate the toxic effects of the constituents of PM. This study aimed to provide a structured approach for incorporating the toxic effects of components when developing regulatory limits on PM. A four-step human health risk assessment framework was used, consisting of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health), (2) exposure assessment (parameters: concentrations of PM and constituents; information on the size and shape of PM; fate and transport of PM and constituents in the respiratory system), (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents), and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at each step were then obtained from the literature. Using this information, an attempt was made to determine limits on PM using component-specific information. An example calculation was conducted for exposures to PM2.5 and its metal constituents in the Indian ambient environment to determine limiting PM values. The identified data gaps were: (1) concentrations of PM and its constituents and their relationship with sampling regions, and (2) the relationship of PM toxicity with its components.
Keywords: air, component-specific toxicity, human health risks, particulate matter
Procedia PDF Downloads 311

2553 Finite Volume Method for Flow Prediction Using Unstructured Meshes
Authors: Juhee Lee, Yongjun Lee
Abstract:
In designing low-energy-consuming buildings, heat transfer through large glass panes or walls becomes critical. Multiple layers of window glass and wall materials are employed for high insulation. The gravity-driven air flow between window glasses or wall layers is a natural heat convection phenomenon that is key to this heat transfer. As a first step of the natural heat transfer analysis, this study presents the development and application of a finite volume method for the numerical computation of viscous incompressible flows. It will become part of a natural convection analysis with a high-order scheme, multi-grid method, and dual time step in the future. A finite volume method based on a fully implicit second-order scheme is used to discretize and solve the fluid flow on unstructured grids composed of arbitrary-shaped cells. The governing equations are integrated in the finite volume manner using a collocated arrangement of variables. The convergence of the SIMPLE segregated algorithm for the solution of the coupled nonlinear algebraic equations is accelerated by using a sparse matrix solver such as BiCGSTAB. The method used in the present study is verified by applying it to flows for which either the numerical solution is known or the solution can be obtained using another numerical technique available in other studies. The accuracy of the method is assessed through grid refinement.
Keywords: finite volume method, fluid flow, laminar flow, unstructured grid
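Where the abstract mentions accelerating the SIMPLE segregated algorithm with a sparse solver such as BiCGSTAB, a minimal sketch of that building block is shown below; the coefficient matrix and right-hand side are illustrative placeholders, not the authors' assembled finite volume system.

```python
# Illustrative sketch (not the authors' code): solving one sparse linear system
# of a SIMPLE-type segregated step with BiCGSTAB from SciPy.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import bicgstab

n = 1000
# Hypothetical 1D Laplacian-like coefficient matrix standing in for a
# pressure-correction (or momentum) system assembled by the finite volume method.
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)                      # assembled right-hand side (placeholder)

x, info = bicgstab(A, b)            # info == 0 means the solver converged
residual = np.linalg.norm(A @ x - b)
print(f"converged: {info == 0}, residual norm: {residual:.2e}")
```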
Procedia PDF Downloads 286

2552 A Stepwise Approach for Piezoresistive Microcantilever Biosensor Optimization
Authors: Amal E. Ahmed, Levent Trabzon
Abstract:
Due to the low concentration of analytes in biological samples, the use of Biological Microelectromechanical System (Bio-MEMS) biosensors for biomolecule detection results in a minuscule output signal that is not good enough for practical applications. In response to this, a need has arisen for an optimized biosensor capable of giving a high output signal in response to the detection of a few analytes in the sample; the ultimate goal is to convert the attachment of a single biomolecule into a measurable quantity. For this purpose, MEMS microcantilever-based biosensors have emerged as a promising sensing solution because they are simple, cheap, very sensitive and, more importantly, do not need optical labeling of the analytes (label-free). Among the different microcantilever transducing techniques, piezoresistive microcantilever biosensors have become more prominent because they work well in liquid environments and have an integrated readout system. However, the design of piezoresistive microcantilevers is not a straightforward problem due to coupling between the design parameters, constraints, process conditions, and performance. It was found that the parameters that can be optimized to enhance the sensitivity of piezoresistive microcantilever-based sensors are: cantilever dimensions, cantilever material, cantilever shape, piezoresistor material, piezoresistor doping level, piezoresistor dimensions, piezoresistor position, and the shape and position of the Stress Concentration Region (SCR). After a systematic analysis of the effect of each design and process parameter on the sensitivity, a step-wise optimization approach was developed in which almost all these parameters were varied one at a time while fixing the others, to obtain the maximum possible sensitivity at the end. At each step, the goal was to optimize the parameter so that it maximizes and concentrates the stress in the piezoresistor region for the same applied force and thus gives higher sensitivity. Using this approach, an optimized sensor with 73.5x higher electrical sensitivity (ΔR⁄R) than the starting sensor was obtained. In addition, this piezoresistive microcantilever biosensor is more sensitive than other similar sensors previously reported in the open literature. The mechanical sensitivity of the final sensor is -1.5×10⁻⁸ Ω/Ω⁄pN, which means that for each 1 pN (10⁻¹⁰ g) of biomolecules attached to this biosensor, the piezoresistor resistance shows a relative decrease of 1.5×10⁻⁸. Throughout this work, COMSOL Multiphysics 5.0, a commercial Finite Element Analysis (FEA) tool, has been used to simulate the sensor performance.
Keywords: biosensor, microcantilever, piezoresistive, stress concentration region (SCR)
Procedia PDF Downloads 571

2551 Multi-Criteria Evaluation of IDS Architectures in Cloud Computing
Authors: Elmahdi Khalil, Saad Enniari, Mostapha Zbakh
Abstract:
Cloud computing promises to increase innovation and the velocity with which applications are deployed, all while helping any enterprise meet most IT service needs at a lower total cost of ownership and a higher return on investment. As the march of the cloud continues, it brings both new opportunities and new security challenges. To take advantage of those opportunities while minimizing risks, we consider Intrusion Detection Systems (IDS) integrated in the cloud to be one of the best existing solutions in the field. The concept of intrusion detection has been known for a long time and was first proposed by the well-known researcher Anderson in the 1980s. Since that time, IDSs have kept evolving. Although several efforts have been made in the area of intrusion detection systems for cloud computing environments, many attacks still prevail. Therefore, the work presented in this paper proposes a multi-criteria analysis and a comparative study of several IDS architectures designed to work in cloud computing environments. To achieve this objective, we first survey the state of the art of several consistent IDS architectures designed to work in a cloud environment. In a second step, we establish the criteria that will be useful for the evaluation of the architectures. Then, using the multi-criteria decision analysis approach MACBETH (Measuring Attractiveness by a Categorical Based Evaluation Technique), we evaluate the criteria and assign each one an appropriate weight according to its importance in the field of IDS architectures in cloud computing. The last step is to evaluate the architectures against the criteria and to collect the results of the model constructed in the previous steps.
Keywords: cloud computing, cloud security, intrusion detection/prevention system, multi-criteria decision analysis
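The final aggregation described above, weighting criteria and scoring architectures against them, can be sketched as below; the weights, scores and the four candidate architectures are invented placeholders, and the MACBETH pairwise-judgement procedure that produces the weights is not reproduced.

```python
import numpy as np

# Hypothetical criterion weights (in the study these come from the MACBETH judgements)
weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])

# Hypothetical 0-100 scores of four IDS architectures against the five criteria
scores = np.array([
    [80, 60, 70, 90, 50],   # architecture A
    [65, 85, 75, 60, 70],   # architecture B
    [90, 55, 60, 70, 80],   # architecture C
    [70, 70, 80, 65, 60],   # architecture D
])

overall = scores @ weights            # weighted additive aggregation
ranking = np.argsort(overall)[::-1]   # best architecture first
for rank, idx in enumerate(ranking, start=1):
    print(f"{rank}. architecture {'ABCD'[idx]}: {overall[idx]:.1f}")
```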
Procedia PDF Downloads 472

2550 Hydrodynamics and Heat Transfer Characteristics of a Solar Thermochemical Fluidized Bed Reactor
Authors: Selvan Bellan, Koji Matsubara, Nobuyuki Gokon, Tatsuya Kodama, Hyun Seok-Cho
Abstract:
In the concentrated solar thermal industry, fluidized-bed technology has been used to produce hydrogen by two-step thermochemical water-splitting cycles, and synthesis gas by gasification of coal coke. Recently, a couple of fluidized bed reactors have been developed and tested at Niigata University, Japan, for two-step thermochemical water-splitting cycles and coal coke gasification using a Xe-light solar simulator. The hydrodynamic behavior of the gas-solid flow plays a vital role in the aforementioned fluidized bed reactors. Thus, in order to study the dynamics of dense gas-solid flow, a CFD-DEM model has been developed in which the contact forces between the particles are calculated by the spring-dashpot model, based on the soft-sphere method. The heat transfer and hydrodynamics of a solar thermochemical fluidized bed reactor filled with ceria particles have been studied numerically and experimentally for a beam-down solar concentrating system. An experimental visualization of the particle circulation pattern and mixing of the two-tower fluidized bed system is presented. Simulation results have been compared with experimental data to validate the CFD-DEM model. The results indicate that the model can predict the particle-fluid flow of the two-tower fluidized bed reactor. Using this model, the key operating parameters can be optimized.
Keywords: solar reactor, CFD-DEM modeling, fluidized bed, beam-down solar concentrating system
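The spring-dashpot contact model mentioned above can be illustrated with a minimal sketch of the normal contact force between two soft spheres; the stiffness and damping values are illustrative, not those used in the reactor simulations.

```python
import numpy as np

def normal_contact_force(x_i, x_j, v_i, v_j, r_i, r_j, k_n=1.0e4, eta_n=5.0):
    """Soft-sphere spring-dashpot normal force on particle i from particle j.
    k_n: spring stiffness, eta_n: damping coefficient (illustrative values)."""
    rel = x_i - x_j
    dist = np.linalg.norm(rel)
    overlap = (r_i + r_j) - dist          # positive only while particles overlap
    if overlap <= 0.0:
        return np.zeros(3)
    n = rel / dist                        # unit normal pointing towards particle i
    v_rel_n = np.dot(v_i - v_j, n)        # normal relative velocity
    # Spring term pushes the particles apart; dashpot term dissipates energy.
    return (k_n * overlap - eta_n * v_rel_n) * n

# Two overlapping 1 mm particles approaching each other
f = normal_contact_force(np.array([0.0, 0.0, 0.0]), np.array([0.0018, 0.0, 0.0]),
                         np.array([0.1, 0.0, 0.0]), np.array([-0.1, 0.0, 0.0]),
                         r_i=0.001, r_j=0.001)
print(f)
```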
Procedia PDF Downloads 197

2549 Economic Valuation of Environmental Services Sustained by Flamboyant Park in Goiania-Go, Brazil
Authors: Brenda R. Berca, Jessica S. Vieira, Lucas G. Candido, Matheus C. Ferreira, Paulo S. A. Lopes Filho, Rafaella O. Baracho
Abstract:
This study aims to estimate the economic value of the environmental services sustained by the Flamboyant Lourival Louza Municipal Park in Goiânia, Goiás, Brazil. The Flamboyant Park is one of the most relevant urban parks, and it is located near a stadium, a shopping center, and two supercenters. In order to define the methods used for the valuation of Flamboyant Park, the first step was to carry out bibliographical research with a view to better understanding which methods are most feasible for valuing the Park. Thus, the following direct methods were selected: travel cost, hedonic pricing, and contingent valuation. In addition, an indirect method (replacement cost) was applied at Flamboyant Park. The second step was creating and applying two surveys. The first survey targeted the visitors of the park, addressing socio-economic issues, the use of the Park, its importance, and the visitors' willingness to pay for its existence. The second survey was directed at the vendors operating in the Park, in order to collect data on the profits they obtain. In the end, the profile of the visitors was characterized and the contingent valuation, travel cost, replacement cost and hedonic pricing methods were applied, thus monetarily valuing the various ecosystem services sustained by the park. Some services were not valued due to difficulties encountered during the process.
Keywords: contingent valuation, ecosystem services, economic environmental valuation, hedonic pricing, travel cost
Procedia PDF Downloads 227

2548 Continuous Production of Prebiotic Pectic Oligosaccharides from Sugar Beet Pulp in a Continuous Cross Flow Membrane Bioreactor
Authors: Neha Babbar, S. Van Roy, W. Dejonghe, S. Sforza, K. Elst
Abstract:
Pectic oligosaccharides (a class of prebiotics) are non-digestible carbohydrates which benefit the host by stimulating the growth of healthy gut microflora. Production of prebiotic pectic oligosaccharides (POS) from pectin-rich agricultural residues involves cutting the long-chain pectin polymer into pectin oligomers while avoiding the formation of monosaccharides. The objective of the present study is to develop a two-step continuous biocatalytic membrane reactor (MER) for the continuous production of POS (from sugar beet pulp) in which conversion is combined with separation. Optimization of the POS/monosaccharide ratio, stability and productivity of the process was done by testing various residence times (RT) in the reactor vessel with diluted (10 RT, 20 RT, and 30 RT) and undiluted (30 RT, 40 RT and 60 RT) substrate. The results show that the most stable processes (steady state) were 20 RT and 30 RT for the diluted substrate and 40 RT and 60 RT for the undiluted substrate. The highest volumetric and specific productivities of 20 g/L/h and 11 g/gE/h, and 17 g/L/h and 9 g/gE/h, were obtained with 20 RT (diluted substrate) and 40 RT (undiluted substrate), respectively. Under these conditions, the permeate of the reactor test at 20 RT (diluted substrate) consisted of 80% POS fractions, while that at 40 RT (undiluted substrate) resulted in 70% POS fractions. A two-step continuous biocatalytic MER looks very promising for the continuous production of tailor-made POS. Although both processes, i.e. 20 RT (diluted substrate) and 40 RT (undiluted substrate), gave the best results, for an industrial application it is preferable to use an undiluted substrate.
Keywords: pectic oligosaccharides, membrane reactor, residence time, specific productivity, volumetric productivity
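The volumetric and specific productivities quoted above follow from simple ratios of the POS output rate to the reactor volume and the enzyme loading; the sketch below uses invented numbers to show the calculation, not the measured data.

```python
def productivities(pos_out_g_per_h, reactor_volume_l, enzyme_mass_g):
    """Volumetric productivity (g POS per litre per hour) and specific
    productivity (g POS per gram of enzyme per hour)."""
    volumetric = pos_out_g_per_h / reactor_volume_l
    specific = pos_out_g_per_h / enzyme_mass_g
    return volumetric, specific

# Hypothetical steady-state values for a membrane reactor run
qv, qs = productivities(pos_out_g_per_h=10.0, reactor_volume_l=0.5, enzyme_mass_g=0.9)
print(f"volumetric: {qv:.1f} g/L/h, specific: {qs:.1f} g/gE/h")
```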
Procedia PDF Downloads 440

2547 Optimization of Hepatitis B Surface Antigen Purifications to Improving the Production of Hepatitis B Vaccines on Pichia pastoris
Authors: Rizky Kusuma Cahyani
Abstract:
Hepatitis B is an inflammatory liver disease caused by the hepatitis B virus (HBV). The infection can be prevented by vaccination with a vaccine that contains the HBV surface protein (sHBsAg). However, vaccine supply is limited. Several attempts have been made to produce local sHBsAg, but the degree of purity and the protein yield are still inadequate. Therefore, optimization of the HBsAg purification steps is required to obtain a high yield with a better purification fold. In this study, purification was optimized in two steps: precipitation using varying concentrations of NaCl (0.3 M, 0.5 M, 0.7 M) and PEG (3%, 5%, 7%), and ion exchange chromatography (IEC) using elution buffers with NaCl concentrations of 300-500 mM. The bicinchoninic acid assay (BCA) and an enzyme-linked immunosorbent assay (ELISA) were used to determine the HBsAg protein, and SDS-PAGE analysis was used to visualize it. Based on the quantitative analysis, the optimal condition at the precipitation step was 0.3 M NaCl with 3% PEG, while at the ion exchange chromatography step the optimum was obtained when the protein was eluted with 500 mM NaCl. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) analysis indicates the presence of HBsAg protein with molecular weights of 25 kDa (monomer) and 50 kDa (dimer). The optimum purification conditions for sHBsAg produced in Pichia pastoris gave a yield of 47% and a purification fold of 17x, which would make the production of hepatitis B vaccine more optimal.
Keywords: hepatitis B virus, HBsAg, hepatitis B surface antigen, Pichia pastoris, purification
Procedia PDF Downloads 151

2546 Computer-Aided Detection of Simultaneous Abdominal Organ CT Images by Iterative Watershed Transform
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
Interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver, spleen and kidneys is regarded as a major primary step in the computer-aided diagnosis of abdominal organ diseases. In this paper, a semi-automated method for abdominal organ segmentation of medical image data using mathematical morphology is presented. Our proposed method is based on hierarchical segmentation and the watershed algorithm. In our approach, a powerful technique has been designed to suppress over-segmentation, based on the mosaic image and on the computation of the watershed transform. Our algorithm proceeds in two parts. In the first, we seek to improve the quality of the gradient-mosaic image by applying an anisotropic diffusion filter followed by morphological filters. Thereafter, we proceed to the hierarchical segmentation of the liver, spleen and kidney. To validate the proposed segmentation technique, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with manual segmentation performed by an expert. The experimental results are described in the last part of this work.
Keywords: anisotropic diffusion filter, CT images, morphological filter, mosaic image, simultaneous organ segmentation, the watershed algorithm
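A marker-controlled watershed on a smoothed gradient image, as described above, can be sketched with scikit-image; the sample image, the Gaussian smoothing (standing in for anisotropic diffusion) and the marker choice are illustrative and do not reproduce the authors' pipeline.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import data, filters, segmentation

image = data.coins()                          # stand-in for an abdominal CT slice
smoothed = filters.gaussian(image, sigma=2)   # placeholder for anisotropic diffusion
gradient = filters.sobel(smoothed)            # gradient image to be flooded

# Markers: low- and high-intensity regions chosen by a simplified Otsu-based rule
thr = filters.threshold_otsu(smoothed)
markers = np.zeros_like(image, dtype=int)
markers[smoothed < thr * 0.5] = 1             # background seeds
markers[smoothed > thr * 1.2] = 2             # object seeds
labels = segmentation.watershed(gradient, ndi.label(markers)[0])
print("number of regions:", labels.max())
```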
Procedia PDF Downloads 441

2545 Sediment Wave and Cyclic Steps as Mechanism for Sediment Transport in Submarine Canyons Thalweg
Authors: Taiwo Olusoji Lawrence, Peace Mawo Aaron
Abstract:
Seismic analysis of bedforms has proven to be one of the best ways to study deepwater sedimentary features. Canyons are known to be sediment transportation conduits. Sediment waves are large-scale depositional bedforms found in various parts of the world's oceans, formed predominantly by suspended-load transport. These undulating features usually have wavelengths of tens of meters to a few kilometers and heights of several meters. Cyclic steps are long-wave, upstream-migrating bedforms confined by internal hydraulic jumps. They usually occur in regions with high gradients and slope breaks. Cyclic steps and migrating sediment waves are the most common bedforms on the seafloor. Cyclic steps and related sediment wave bedforms are significant to the morpho-dynamic evolution of the architectural elements of deep-water depositional systems, especially those located along tectonically active margins with high gradients and slope breaks that can promote internal hydraulic jumps in turbidity currents. This report examined sedimentary activity and sediment transport in submarine canyons and provides distinctive insight into the factors that created a complex seabed canyon system in the Ceara Fortaleza basin, Brazilian Equatorial Margin (BEM). The growing importance of cyclic steps made it imperative to understand the parameters leading to their formation, migration, and architecture, as well as their controls on sediment transport in the canyon thalweg. We extracted the parameters of the observed bedforms and evaluated their aspect ratio and asymmetry. We developed a relationship between the magnitude of the hydraulic jump, the depth of the hydraulic fall and the length of the cyclic step. It was found that an increase in the height of the cyclic step increases the magnitude of the hydraulic jump and thereby increases the rate of deposition on the preceding stoss side, whereas an increase in the length of the cyclic step reduces the magnitude of the hydraulic jump and reduces the rate of deposition at the stoss side. Therefore, a flat stoss side was noticed at most of the preceding cyclic steps and sediment waves.
Keywords: Ceara Fortaleza, submarine canyons, cyclic steps, sediment wave
Procedia PDF Downloads 114

2544 A Comparison of Inverse Simulation-Based Fault Detection in a Simple Robotic Rover with a Traditional Model-Based Method
Authors: Murray L. Ireland, Kevin J. Worrall, Rebecca Mackenzie, Thaleia Flessa, Euan McGookin, Douglas Thomson
Abstract:
Robotic rovers designed to work in extra-terrestrial environments present a unique challenge in terms of the reliability and availability of systems throughout the mission. Should some fault occur, with the nearest human potentially millions of kilometres away, detection and identification of the fault must be performed solely by the robot and its subsystems. Faults in the system sensors are relatively straightforward to detect through the residuals produced by comparing the system output with that of a simple model. However, faults in the input, that is, the actuators of the system, are harder to detect. A step change in the input signal, caused potentially by the loss of an actuator, can propagate through the system, resulting in complex residuals in multiple outputs. These residuals can be difficult to isolate or distinguish from residuals caused by environmental disturbances. While a more complex fault detection method or additional sensors could be used to solve these issues, an alternative is presented here. Using inverse simulation (InvSim), the inputs and outputs of the mathematical model of the rover system are reversed. Thus, for a desired trajectory, the corresponding actuator inputs are obtained. A step fault near the input then manifests itself as a step change in the residual between the system inputs and the input trajectory obtained through inverse simulation. This approach avoids the need for additional hardware on a mass- and power-critical system such as the rover. The InvSim fault detection method is applied to a simple four-wheeled rover in simulation. Additive system faults and an external disturbance force are applied to the vehicle in turn, such that the dynamic response and sensor output of the rover are affected. Basic model-based fault detection is then employed to provide output residuals, which may be analysed to provide information on the fault/disturbance. InvSim-based fault detection is then employed, similarly providing input residuals which give further information on the fault/disturbance. The input residuals are shown to provide clearer information on the location and magnitude of an input fault than the output residuals. Additionally, they allow faults to be more clearly discriminated from environmental disturbances.
Keywords: fault detection, ground robot, inverse simulation, rover
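The input-residual idea can be illustrated with a toy discrete model: invert a simple first-order plant to recover the input that produced the measured output and compare it with the commanded input. The model, gains and fault size below are invented for illustration only and are not the rover model used in the paper.

```python
import numpy as np

# Toy first-order discrete model y[k+1] = a*y[k] + b*u[k] standing in for the rover dynamics
a, b, n = 0.9, 0.5, 100
u_cmd = np.ones(n)                      # commanded wheel input
u_act = u_cmd.copy()
u_act[60:] -= 0.4                       # step fault in the actuator from sample 60

y = np.zeros(n + 1)
for k in range(n):                      # forward simulation with the faulty input
    y[k + 1] = a * y[k] + b * u_act[k]

# Inverse simulation: recover the input that must have produced the measured output
u_inv = (y[1:] - a * y[:-1]) / b
input_residual = u_cmd - u_inv          # jumps to ~0.4 at the fault onset
print("max residual before fault:", np.abs(input_residual[:60]).max())
print("mean residual after fault:", input_residual[60:].mean())
```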
Procedia PDF Downloads 308

2543 Repeated Suicidal Attempts in Foster Teenagers: Breaking the Cycle Using a Stepped Care Approach
Authors: Mathilde Blondon, Salla Aicha Dieng, Catherine Pfister
Abstract:
In a paradoxical way, teenagers nowadays seem to use suicide attempts to elaborate on their traumatic abuse and regain some kind of control over their lives. As their behavior becomes life-threatening, the hospital offers a variety of expertise to address their needs, with Child Protective Services also joining in, to the point where teenagers can feel they are losing control of their lives, which results in them making more suicide attempts. Our goal here is to walk with these foster teenagers long enough to first step therapy up and then, once their mental health has recovered enough, to step therapy down in a way that is secure and gives them their life back. This would prevent them from making suicide attempts in order to feel in control of their life. We present the clinical case of a 14-year-old girl named Sofia, who was suffering from parental deprivation, an identity disorder, and a severe depressive disorder. Our intervention took place in January 2024, after Sofia had undergone four hospitalizations, including a two-month period in a specialized clinic. In a stepping-up effort, a substantial setting had been built around Sofia. She was coming three days a week to therapeutic activities at the Child Psychiatry Day Hospital, she had one psychotherapy session a week at the Medical-Psychological Center, and she was meeting with the adolescent psychiatrist on a regular basis. However, the frequency of her suicide attempts continued to increase, to the point where she could not stay more than four days outside the hospital unit without harming herself and being brought back to the Emergency Unit. We were all stuck in a kind of medical deadlock, writing to clinics that had no room for her while social workers were calling foster homes that would not accept her either. At some point, a clinical decision was made by the psychiatrist to stop what appeared to be a global movement of traumatic repetition, which involved Sofia's family, the medical team and the social workers as one. This decision to step therapy down created a surprise and put an end to the cycle. It provided a new path, a new solution where Sofia could securely settle without being unfaithful to her family. Her suicide attempts stopped for four weeks. She had one relapse, and has not made another attempt since. There is a fine line between too little and too much, a pathway with the right amount of care and support. We believe it is not a steady line but rather a path up and down the hill. It is about identifying the moment when medication and mental processes have improved the subject's condition enough to allow the medical team to step therapy down and give more control back to the subject. These needed variations used to come from a change of hospital or medical team. Stepped care avoids any breaking of bonds and appears to be decisive in stopping teenagers' suicide attempts.
Keywords: child protection, adolescent psychiatry, teenager suicidal attempt, foster teenagers, parental deprivation, stepped care
Procedia PDF Downloads 36

2542 Computer Network Applications, Practical Implementations and Structural Control System Representations
Authors: El Miloudi Djelloul
Abstract:
The computer network plays an important role in the practical implementation of different systems. To implement a system on a network, it is above all necessary to know all the configurations that will be part of the system, and to provide adequate information and solutions in real time. So if we want to implement such a system, for example in a school or a relevant institution, the first step is to analyze the types of models that need to be configured, and another important step is to organize the work in terms of devices, as part of the general system. An important point before configuration is often the description and documentation of all the work in the respective process, which is then organized from a problem-solving perspective. The computer network, as critical infrastructure, is very specific, so the paper presents effective solutions from a structured point of view on one side; on the other side, the paper reflects the positive aspects of modeling and block-schema presentations as a better alternative for solving specific problems caused by continual distortions of the system arising from the devices, programs and signals, or by packet collisions moving from one computer node to other nodes.
Keywords: local area networks, LANs, block schema presentations, computer network system, computer node, critical infrastructure packed collisions, structural control system representations, computer network, implementations, modeling structural representations, companies, computers, context, control systems, internet, software
Procedia PDF Downloads 365

2541 Inulinase Immobilization on Functionalized Magnetic Nanoparticles Prepared with Soy Protein Isolate Conjugated Bovine Serum Albumin for High Fructose Syrup Production
Authors: Homa Torabizadeh, Mohaddeseh Mikani
Abstract:
Inulinase from Aspergillus niger was covalently immobilized on magnetic nanoparticles (MNPs/Fe3O4) covered with soy protein isolate (SPI/Fe3O4) functionalized by bovine serum albumin (BSA) nanoparticles. MNPs are promising enzyme carriers because they separate easily under external magnetic fields and enhance the reusability of the immobilized enzyme. As MNPs aggregate easily, a surface coating strategy was employed. SPI functionalized by BSA was a suitable candidate for nanomagnetite coating due to its superior biocompatibility and hydrophilicity. Fe3O4@SPI-BSA nanoparticles were synthesized as a novel carrier with a narrow particle size distribution. Step-by-step monitoring of the fabrication of the Fe3O4@SPI-BSA nanoparticles was performed using field emission scanning electron microscopy and dynamic light scattering. The results illustrated that the nanomagnetite, with spherical morphology, was well monodispersed, with a diameter of about 35 nm. The average size of the SPI-BSA nanoparticles was 80 to 90 nm, and their zeta potential was around −34 mV. Finally, the mean diameter of the fabricated Fe3O4@SPI-BSA NPs was less than 120 nm. Inulinase from Aspergillus niger was successfully immobilized covalently on the Fe3O4@SPI-BSA nanoparticles through glutaraldehyde. Fourier transform infrared spectra and field emission scanning electron microscopy images provided sufficient proof of the enzyme immobilization on the nanoparticles, with 80% enzyme loading.
Keywords: high fructose syrup, inulinase immobilization, functionalized magnetic nanoparticles, soy protein isolate
Procedia PDF Downloads 299

2540 A Two-Stage Process for the Sustainable Production of Aliphatic Polyesters
Authors: A. Douka, S. Vouyiouka, L. M. Papaspyridi, D. Korres, C. Papaspyrides
Abstract:
A "green" process was studied for the preparation of partially renewable aliphatic polyesters based on 1,4-butanediol and 1,8-octanediol with various diacids and derivatives, namely diethyl succinate, adipic acid, sebacic acid, 1,12-dodecanedioic acid and 1,14-tetradecanedioic acid. A first step of enzymatic prepolymerization was carried out in the presence of two different solvents, toluene and diphenyl ether, applying molecular sieves and vacuum, respectively, to remove the polycondensation by-products. Poly(octylene adipate) (PE 8.6), poly(octylene dodecanate) (PE 8.12) and poly(octylene tetradecanate) (PE 8.14) were first produced enzymatically in toluene using molecular sieves, giving, however, low-molecular-weight products. Thereafter, the synthesis of PE 8.12 and PE 8.14 was examined under optimized conditions using diphenyl ether as the solvent and a more vigorous by-product removal step, namely the application of vacuum. Apart from these polyesters, the optimized process was also implemented for the production of another long-chain polyester, poly(octylene sebacate) (PE 8.10), and a short-chain polyester, poly(butylene succinate) (PE 4.4). Subsequently, bulk post-polymerization in the melt or solid state was performed. The SSP runs involved the absence of biocatalyst and reaction temperatures (T) in the vicinity of the prepolymer melting point (Tm - T varied between 15.5 and 4°C). Focusing on PE 4.4 and PE 8.12, SSP took place under vacuum or flowing nitrogen, leading to an increase of the molecular weight and an improvement of the physical appearance and thermal properties of the end product.
Keywords: aliphatic polyester, enzymatic polymerization, solid state polymerization, Novozym 435
Procedia PDF Downloads 324

2539 Effects of Soaking of Maize on the Viscosity of Masa and Tortilla Physical Properties at Different Nixtamalization Times
Authors: Jorge Martínez-Rodríguez, Esther Pérez-Carrillo, Diana Laura Anchondo Álvarez, Julia Lucía Leal Villarreal, Mariana Juárez Dominguez, Luisa Fernanda Torres Hernández, Daniela Salinas Morales, Erick Heredia-Olea
Abstract:
Maize tortillas are a staple food in Mexico and are mostly made by nixtamalization, which includes the cooking and steeping of maize kernels under alkaline conditions. The cooking step in nixtamalization demands a lot of energy and also generates nejayote, a water pollutant, at the end of the process. The aim of this study was to reduce the cooking time by adding a maize soaking step before nixtamalization while maintaining the quality properties of masa and tortillas. Maize kernels were soaked for 36 h to increase their moisture up to 36%. Then, the effect of different cooking times (0, 5, 10, 15, 20, 25, 30, 35, 45-control and 50 minutes) was evaluated on the viscosity profile (RVA) of the masa in order to select the treatments with a profile similar or equal to the control. All treatments were left steeping overnight and had the same milling conditions. The treatments selected were the 20- and 25-minute cooking times, which had values for pasting temperature (79.23°C and 80.23°C), maximum viscosity (105.88 cP and 96.25 cP) and final viscosity (188.5 cP and 174 cP) similar to those of the 45-minute control (77.65°C, 110.08 cP, and 186.70 cP, respectively). Afterward, tortillas were produced from the chosen treatments (20 and 25 min) and from the control, and were then analyzed for texture, damaged starch, colorimetry, thickness, and average diameter. Colorimetric analysis of the tortillas only showed significant differences for the yellow/blue coordinate (b* parameter) at 20 min (0.885), unlike the 25-minute treatment (1.122). Luminosity (L*) and the red/green coordinate (a*) showed no significant differences between the treatments and the control (69.912 and 1.072, respectively); however, the 25-minute treatment was closer in both parameters (73.390 and 1.122) than the 20-minute treatment (74.08 and 0.884). For the color difference (ΔE), the 25-minute value (3.84) was the most similar to the control. For tortilla thickness and diameter, however, the 20-minute treatment, with 1.57 mm and 13.12 cm respectively, was closer to the control (1.69 mm and 13.86 cm), although smaller than it. On the other hand, the 25-minute treatment tortilla was smaller than both the 20-minute treatment and the control, with 1.51 mm thickness and 13.590 cm diameter. According to the texture analyses, there was no difference in terms of stretchability (8.803-10.308 gf) or distance to break (95.70-126.46 mm) among the treatments. However, for the breaking point, all treatments (317.1 gf and 276.5 gf for the 25- and 20-minute treatments, respectively) were significantly different from the control tortilla (392.2 gf). The results suggest that by adding a soaking step and reducing the cooking time by 25 minutes, masa and tortillas with functional and textural properties similar to the traditional nixtamalization process can be obtained.
Keywords: tortilla, nixtamalization, corn, lime cooking, RVA, colorimetry, texture, masa rheology
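The color difference ΔE quoted above is commonly the CIE76 Euclidean distance between L*a*b* coordinates; the short check below uses the reported L* and a* values together with an assumed b* for the control, so the result is illustrative rather than the measured ΔE.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triplets."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

control = (69.912, 1.072, 1.0)    # L*, a* from the abstract; b* is an assumed placeholder
treat_25 = (73.390, 1.122, 1.122)
print(f"delta E = {delta_e_cie76(control, treat_25):.2f}")
```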
Procedia PDF Downloads 177

2538 Simulation-Based Parametric Study for the Hybrid Superplastic Forming of AZ31
Authors: Fatima Ghassan Al-Abtah, Naser Al-Huniti, Elsadig Mahdi
Abstract:
As the lightest structural metal on earth, magnesium alloys offer excellent potential for weight reduction in the transportation industry, and some magnesium alloys have been observed to exhibit superior ductility and superplastic behavior at high temperatures. The main limitation of superplastic forming (SPF) is the low production rate, since a long forming time is needed for each part. In this study, an SPF process that starts with a mechanical pre-forming stage is developed to promote formability and reduce forming time. A two-dimensional finite element model is used to simulate the process. The forming process consists of two steps. In the pre-forming step (deep drawing), the sheet is drawn into the die to a preselected level using a mechanical punch, and in the second step (SPF) a pressurized gas is applied at a controlled rate. It is shown that a significant reduction in forming time and improved final thickness uniformity can be achieved when the hybrid forming technique is used, with the process achieving a fully formed part at 400°C. The impact of the different forming process parameters was investigated by comparing the forming time and the distribution of final thickness obtained from the simulation analysis. Maximum thinning decreased from over 67% to less than 55%, forming time decreased significantly by more than 6 minutes, and the required gas pressure profile was predicted for the optimum forming process parameters based on the target constant strain rate of 0.001/sec within the sheet.
Keywords: magnesium, plasticity, superplastic forming, finite element analysis
Procedia PDF Downloads 156

2537 Comparative Evaluation of Pharmacologically Guided Approaches (PGA) to Determine Maximum Recommended Starting Dose (MRSD) of Monoclonal Antibodies for First Clinical Trial
Authors: Ibraheem Husain, Abul Kalam Najmi, Karishma Chester
Abstract:
First-in-human (FIH) studies are a critical step in the clinical development of any molecule that has shown therapeutic promise in preclinical evaluations, since the translation of preclinical research and safety studies into clinical development is a crucial step for the successful development of monoclonal antibodies for the treatment of human diseases and for guidance in the pharmaceutical industry. Therefore, comparisons between the USFDA approach and nine pharmacologically guided approaches (PGA) (simple allometry, maximum life span potential, brain weight, rule of exponent (ROE), two-species methods and one-species methods) were made to determine the maximum recommended starting dose (MRSD) for first-in-human clinical trials using four drugs, namely Denosumab, Bevacizumab, Anakinra and Omalizumab. In our study, the predicted pharmacokinetic (PK) parameters and the estimated first-in-human doses of the antibodies were compared with the observed human values. The study indicated that the clearance and volume of distribution of antibodies can be predicted in humans with reasonable accuracy, and that a good estimate of the first human dose can be obtained from the predicted human clearance and volume of distribution. A pictorial method-evaluation chart was also developed, based on fold errors, for the simultaneous evaluation of the various methods.
Keywords: clinical pharmacology (CPH), clinical research (CRE), clinical trials (CTR), maximum recommended starting dose (MRSD), clearance and volume of distribution
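A minimal sketch of the simple-allometry step is shown below: scaling an animal clearance to human with a 0.75 exponent, converting an animal dose to a human-equivalent dose with body-surface-area scaling, and dividing by a safety factor to get an MRSD. The animal species, body weights and dose values are invented placeholders, not the data for the four antibodies studied.

```python
def human_clearance(cl_animal_ml_min, bw_animal_kg, bw_human_kg=70.0, exponent=0.75):
    """Simple allometry: CL_human = CL_animal * (BW_human / BW_animal) ** 0.75."""
    return cl_animal_ml_min * (bw_human_kg / bw_animal_kg) ** exponent

def mrsd_mg_per_kg(noael_mg_per_kg, bw_animal_kg, bw_human_kg=70.0, safety_factor=10.0):
    """Human-equivalent dose via body-surface-area scaling, then a safety factor."""
    hed = noael_mg_per_kg * (bw_animal_kg / bw_human_kg) ** 0.33
    return hed / safety_factor

# Hypothetical monkey data (not the values of Denosumab/Bevacizumab/Anakinra/Omalizumab)
print(f"CL_human = {human_clearance(2.0, bw_animal_kg=3.5):.1f} mL/min")
print(f"MRSD     = {mrsd_mg_per_kg(50.0, bw_animal_kg=3.5):.2f} mg/kg")
```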
Procedia PDF Downloads 374

2536 Evaluation of Vehicle Classification Categories: Florida Case Study
Authors: Ren Moses, Jaqueline Masaki
Abstract:
This paper addresses the need for an accurate and updated vehicle classification system through a thorough evaluation of vehicle class categories, identifying the errors arising from the existing system and proposing modifications. Data collected from two permanent traffic monitoring sites in Florida were used to evaluate the performance of the existing vehicle classification table. The vehicle data were collected and classified by an automatic vehicle classifier (AVC), and a video camera was used to obtain ground truth data. The Federal Highway Administration (FHWA) vehicle classification definitions were used to define vehicle classes from the video and to compare them with the data generated by the AVC in order to identify the sources of misclassification. Six types of errors were identified. Modifications were made to the classification table to improve the classification accuracy. The results of this study include the development of an updated vehicle classification table with a reduction in total error of 5.1%, a step-by-step procedure for the evaluation of vehicle classification studies, and recommendations to improve the FHWA 13-category rule set. The recommendations for the FHWA 13-category rule set indicate the need for the vehicle classification definitions in this scheme to be updated to reflect the distribution of current traffic. The presented results will be of interest to state transportation departments and consultants, researchers, engineers, designers, and planners who require accurate vehicle classification information for the planning, design and maintenance of transportation infrastructure.
Keywords: vehicle classification, traffic monitoring, pavement design, highway traffic
Procedia PDF Downloads 181

2535 Design and Simulation of Low Cost Boost-Half-Bridge Microinverter with Grid Connection
Authors: P. Bhavya, P. R. Jayasree
Abstract:
This paper presents a low-cost, transformer-isolated boost half-bridge micro-inverter for a single-phase grid-connected PV system. Since the output voltage of a single PV panel is as low as 20~50 V, a high-voltage-gain inverter is required for the PV panel to connect to the single-phase grid. The micro-inverter has two stages: an isolated dc-dc converter stage and an inverter stage with a dc link. To achieve MPPT and to step up the PV voltage to the dc link voltage, a transformer-isolated boost half-bridge dc-dc converter is used. To deliver a synchronised sinusoidal current with unity power factor to the grid, a pulse-width-modulated full-bridge inverter with an LCL filter is used. A variable step size Maximum Power Point Tracking (MPPT) method is adopted such that both fast tracking and high MPPT efficiency are obtained. An AC voltage meeting the grid requirements is obtained at the output of the inverter. A high power factor (>0.99) is obtained at both heavy and light loads. This paper gives the results of a computer simulation of a grid-connected solar PV system using MATLAB/Simulink and the SimPowerSystems tool.
Keywords: boost-half-bridge, micro-inverter, maximum power point tracking, grid connection, MATLAB/Simulink
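A hedged sketch of a variable step size perturb-and-observe MPPT update is given below, where the voltage step is scaled by |dP/dV| so tracking is fast far from the MPP and fine near it; the gain, step limit and sample values are illustrative and are not the controller parameters used in the paper.

```python
def mppt_variable_step(v, p, v_prev, p_prev, k=0.05, step_max=0.5):
    """Return the next PV voltage reference (perturb & observe, variable step)."""
    dv = v - v_prev
    dp = p - p_prev
    if dv == 0:
        return v + 0.01                       # small nudge to keep perturbing
    step = min(k * abs(dp / dv), step_max)    # step shrinks as dP/dV -> 0 near the MPP
    direction = 1.0 if dp / dv > 0 else -1.0  # climb the P-V curve
    return v + direction * step

# One update: power rose after a positive voltage perturbation -> keep increasing V
v_ref = mppt_variable_step(v=30.2, p=151.0, v_prev=30.0, p_prev=150.0)
print(f"next voltage reference: {v_ref:.3f} V")
```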
Procedia PDF Downloads 341

2534 Identification and Characterization of Antimicrobial Peptides Isolated from Entophytic Bacteria and Their Activity against Multidrug-Resistance Gram-Negative Bacteria in South Korea
Authors: Maryam Beiranvand
Abstract:
Multi-drug resistance in various microorganisms has increased globally in many healthcare facilities, and the reduced effectiveness of antimicrobial drug therapies for infection control has become a problem. Since 1980, no new type of antimicrobial drug has been identified, even though combinations of antibiotic drugs have been introduced almost every decade. Between 1981 and 2006, over 70% of novel pharmaceuticals and chemical agents came from natural sources, and microorganisms have yielded almost 22,000 natural compounds. The identification of antimicrobial components from endophytic bacteria could therefore help overcome the threat posed by multi-drug-resistant strains. This project aims to analyze and identify antimicrobial peptides isolated from endophytic bacteria and their activity against multidrug-resistant Gram-negative bacteria in South Korea. Endophytic Paenibacillus polymyxa 4G3, isolated from the plant Gynura procumbery, exhibited considerable antimicrobial activity against methicillin-resistant Staphylococcus aureus and Escherichia coli. Rapid Annotations using Subsystems Technology showed that the total size of the draft genome was 5,739,603 bp, containing 5178 genes with 45.8% G+C content. Genome annotation using antiSMASH version 6.0.0 was performed, which predicted the most common types of non-ribosomal peptide synthetase (NRPS) and polyketide synthase (PKS). In this study, diethylaminoethyl cellulose (DEAEC) resin was used as the first step in purifying the unknown peptides, and the target protein was then identified using hydrophilic and hydrophobic solutions, optimal pH, and step-by-step tests for antimicrobial activity. The crude extract was subjected to C18 chromatography and eluted with 0, 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, 90%, and 100% methanol, respectively. Only the fractions eluted with 20%-60% methanol demonstrated good antimicrobial activity against MDR E. coli. The concentration of the active fraction was measured by the Bradford test and by A280 protein measurement (Thermo Fisher Scientific), and purity was confirmed by examining an SDS-PAGE resolving gel with 10% acrylamide. Our study showed that, based on the combined results of the analysis and purification, P. polymyxa 4G3 has high potential for producing polymyxin E and bacitracin with novel functions against bacterial pathogens.
Keywords: endophytic bacteria, antimicrobial activity, antimicrobial peptide, whole genome sequencing analysis, multi-drug resistance gram negative bacteria
Procedia PDF Downloads 77

2533 Estimation of the Length and Location of Ground Surface Deformation Caused by the Reverse Faulting
Authors: Nader Khalafian, Mohsen Ghaderi
Abstract:
Field observations have revealed many examples of structures that were damaged due to ground surface deformation caused by faulting. In this paper, efforts were made to estimate the length and location of the ground surface over which large displacements are created due to reverse faulting. This research was conducted in two steps. (1) In the first step, a 2D explicit finite element model was developed using the ABAQUS software. A subroutine for the Mohr-Coulomb failure criterion with a strain-softening model was developed by the authors in order to properly model the stress-strain behavior of the soil in the fault rupture zone. The results of the numerical analysis were verified against the results of available centrifuge experiments, and reasonable agreement was found between the numerical and experimental data. (2) In the second step, the effects of the fault dip angle (δ), depth of the soil layer (H), dilation and friction angles of the sand (ψ and φ) and the amount of fault offset (d) on the soil surface displacement and fault rupture path were investigated. An artificial neural network (ANN)-based model, as a powerful prediction tool, was developed to generate a general model for predicting faulting characteristics. A properly sized database was created to train and test the network. It was found that the length and location of the zone of displaced ground surface can be accurately estimated using the proposed model.
Keywords: reverse faulting, surface deformation, numerical, neural network
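A minimal sketch of the kind of ANN surrogate described above is shown below using scikit-learn, with synthetic placeholder data standing in for the finite-element database; the input ranges and target formulas are invented purely so the example runs, and they do not represent the study's results.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic stand-in for the FE database: columns are dip angle, soil depth,
# dilation angle, friction angle, fault offset.
X = rng.uniform([30, 5, 0, 25, 0.1], [90, 40, 20, 45, 3.0], size=(500, 5))
# Placeholder targets (length and location of the deformed zone); in the study
# these come from the verified finite element analyses, not from a formula.
y = np.column_stack([0.8 * X[:, 1] + 0.5 * X[:, 4], 0.3 * X[:, 0] + 0.1 * X[:, 1]])

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), y)
print("R^2 on training data:", model.score(scaler.transform(X), y))
```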
Procedia PDF Downloads 421

2532 Tomato-Weed Classification by RetinaNet One-Step Neural Network
Authors: Dionisio Andujar, Juan López-Correa, Hugo Moreno, Angela Ri
Abstract:
The increased number of weeds in tomato crops greatly lowers yields, and weed identification with the aid of machine learning is important for carrying out site-specific control. The latest advances in computer vision are a powerful tool for facing this problem. The analysis of RGB (Red, Green, Blue) images through artificial neural networks has developed rapidly in the past few years, providing new methods for weed classification. The development of algorithms for crop and weed species classification aims at a real-time classification system using object detection algorithms based on convolutional neural networks. The study site was located in commercial corn fields. The classification system has been tested, and the procedure can detect and classify weed seedlings in tomato fields. The input to the neural network was a set of 10,000 RGB images with a natural infestation of Cyperus rotundus L., Echinochloa crus-galli L., Setaria italica L., Portulaca oleracea L., and Solanum nigrum L. The validation process was done with a random selection of RGB images containing the aforementioned species. The mean average precision (mAP) was established as the metric for object detection. The results showed agreements higher than 95%. The system will provide the input for an online spraying system. Thus, this work plays an important role in site-specific weed management by reducing herbicide use in a single step.
Keywords: deep learning, object detection, cnn, tomato, weeds
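Detection metrics such as mAP are built on matching predicted and ground-truth boxes by intersection over union; the toy boxes below are invented, and the snippet only illustrates that building block, not the RetinaNet training itself.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection over union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Invented prediction and ground truth for one image (one weed seedling each)
pred = np.array([12, 10, 58, 64], dtype=float)
gt = np.array([10, 12, 60, 60], dtype=float)
score = iou(pred, gt)
print(f"IoU = {score:.2f} -> true positive at the usual 0.5 threshold: {score >= 0.5}")
```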
Procedia PDF Downloads 104

2531 Novel Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
In this paper, we propose a novel inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. This algorithm can simultaneously derive both the posterior distribution of a latent function and estimators of the hyper-parameters in a multi-class Gaussian process classification model. Our algorithm is based on the Laplace approximation (LA) technique and the variational EM framework. It is performed in two steps, called the expectation and maximization steps. First, in the expectation step, using Bayes' formula and the LA technique, we derive an approximation to the posterior distribution of the latent function indicating the probability that each observation belongs to a certain class in the Gaussian process classification model. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimators of the hyper-parameters of the covariance matrix needed to define the prior distribution of the latent function. These two steps are repeated iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to a human action classification problem using a public database, namely the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
Keywords: bayesian rule, gaussian process classification model with multiclass, gaussian process prior, human action classification, laplace approximation, variational EM algorithm
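For the simpler binary case, the Laplace step amounts to a Newton iteration for the posterior mode of the latent function under a logistic likelihood; the sketch below follows that standard recipe with a toy RBF kernel and does not reproduce the paper's multi-class model or its variational-EM hyper-parameter updates.

```python
import numpy as np

def laplace_mode(K, y, n_iter=20):
    """Newton iteration for the posterior mode of a GP binary classifier
    (labels y in {-1, +1}, logistic likelihood), following the standard
    Laplace-approximation recipe."""
    n = len(y)
    f = np.zeros(n)
    for _ in range(n_iter):
        pi = 1.0 / (1.0 + np.exp(-f))
        grad = (y + 1) / 2 - pi                 # d log p(y|f) / d f
        W = pi * (1 - pi)                       # negative Hessian (diagonal)
        B = np.eye(n) + (np.sqrt(W)[:, None] * K) * np.sqrt(W)[None, :]
        b = W * f + grad
        a = b - np.sqrt(W) * np.linalg.solve(B, np.sqrt(W) * (K @ b))
        f = K @ a
    return f

# Toy 1D data with an RBF kernel
x = np.linspace(-2, 2, 30)
y = np.where(x > 0, 1.0, -1.0)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2) + 1e-6 * np.eye(30)
f_hat = laplace_mode(K, y)
print("latent mode at the two ends:", f_hat[0], f_hat[-1])
```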
Procedia PDF Downloads 334

2530 Modeling the Acquisition of Expertise in a Sequential Decision-Making Task
Authors: Cristóbal Moënne-Loccoz, Rodrigo C. Vergara, Vladimir López, Domingo Mery, Diego Cosmelli
Abstract:
Our daily interaction with computational interfaces is plagued with situations in which we go from inexperienced users to experts through self-motivated exploration of the same task. In many of these interactions, we must learn to find our way through a sequence of decisions and actions before obtaining the desired result. For instance, when drawing cash from an ATM, choices are presented in a step-by-step fashion so that a specific sequence of actions must be performed in order to produce the expected outcome. But as they become experts in the use of such interfaces, do users adopt specific search and learning strategies? Moreover, if so, can we use this information to follow the process of expertise development and, eventually, predict future actions? This would be a critical step towards building truly adaptive interfaces that can facilitate interaction at different moments of the learning curve. Furthermore, it could provide a window into potential mechanisms underlying decision-making behavior in real-world scenarios. Here we tackle this question using a simple game interface that instantiates a 4-level binary decision tree (BDT) sequential decision-making task. Participants have to explore the interface and discover an underlying concept-icon mapping in order to complete the game. We develop a Hidden Markov Model (HMM)-based approach whereby a set of stereotyped, hierarchically related search behaviors act as hidden states. Using this model, we are able to track the decision-making process as participants explore, learn and develop expertise in the use of the interface. Our results show that partitioning the problem space into such stereotyped strategies is sufficient to capture a host of exploratory and learning behaviors. Moreover, using the modular architecture of stereotyped strategies as a mixture of experts, we are able to simultaneously ask the experts about the user's most probable future actions. We show that for those participants who learn the task, it becomes possible to predict their next decision, above chance, approximately halfway through the game. Our long-term goal is, on the basis of a better understanding of real-world decision-making processes, to inform the construction of interfaces that can establish dynamic conversations with their users in order to facilitate the development of expertise.
Keywords: behavioral modeling, expertise acquisition, hidden markov models, sequential decision-making
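A minimal numpy sketch of how such an HMM can be queried for the most probable next action is given below: the forward algorithm yields a belief over hidden strategies, which is then propagated one step through the transition and emission matrices. The three-strategy model and its matrices are invented placeholders, not the ones fitted in the study.

```python
import numpy as np

# Invented 3-strategy HMM: transition A, emission B over 2 possible actions, prior pi
A = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])
B = np.array([[0.9, 0.1],     # strategy 0 mostly emits action 0
              [0.5, 0.5],
              [0.2, 0.8]])
pi = np.array([0.6, 0.3, 0.1])

obs = [0, 0, 1, 1]                       # sequence of observed user actions

alpha = pi * B[:, obs[0]]                # forward algorithm (unnormalised)
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]

belief = alpha / alpha.sum()             # P(hidden strategy | observations)
next_action_probs = (belief @ A) @ B     # one-step-ahead action distribution
print("predicted next action:", int(np.argmax(next_action_probs)), next_action_probs)
```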
Procedia PDF Downloads 252

2529 Second Generation Biofuels: A Futuristic Green Deal for Lignocellulosic Waste
Authors: Nivedita Sharma
Abstract:
The global demand for fossil fuels is very high, but their use is not sustainable, since their reserves are declining. Additionally, fossil fuels are responsible for the accumulation of greenhouse gases. The emission of greenhouse gases from the transport sector can be reduced by substituting fossil fuels with biofuels. Thus, renewable fuels capable of sequestering carbon dioxide are in high demand. Second-generation biofuels, which require lignocellulosic biomass as a substrate and ultimately yield ethanol, fall largely into this category. Bioethanol is a favorable and near carbon-neutral renewable biofuel, leading to a reduction in tailpipe pollutant emissions and improving ambient air quality. Lignocellulose consists of three main components, cellulose, hemicellulose and lignin, which can be converted to ethanol with the help of microbial enzymes. Enzymatic hydrolysis of lignocellulosic biomass in the first step is considered one of the most efficient and least polluting methods for generating fermentable hexose and pentose sugars, which are subsequently fermented to power alcohol by yeasts in the second step of the process. In the present technology, a complete bioconversion process is to be presented in detail: microorganisms producing the potential hydrolytic enzymes, i.e. cellulase and xylanase, were isolated from different niches, screened for enzyme production, and identified using phenotyping and genotyping; the enzymes were then produced, purified and applied for the saccharification of different lignocellulosic biomass, followed by fermentation of the hydrolysate to ethanol with high yield.
Keywords: cellulase, xylanase, lignocellulose, bioethanol, microbial enzymes
Procedia PDF Downloads 98

2528 Implementation of a Paraconsistent-Fuzzy Digital PID Controller in a Level Control Process
Authors: H. M. Côrtes, J. I. Da Silva Filho, M. F. Blos, B. S. Zanon
Abstract:
In modern society, the increasing demand for quality in industrial production calls for new techniques of control and machinery automation. In this context, this work presents the implementation of a Paraconsistent-Fuzzy Digital PID controller. The controller is based on the treatment of inconsistencies both in Paraconsistent Logic and in Fuzzy Logic. Paraconsistent analysis is performed on the signals applied to the system inputs using concepts from the Paraconsistent Annotated Logic with annotation of two values (PAL2v). The signals resulting from the paraconsistent analysis are two values, defined as Dc (Degree of Certainty) and Dct (Degree of Contradiction), which are then treated according to Fuzzy Logic theory; the resulting output of the logic actions is a single value, called the crisp value, which is used to control the dynamic system. The application of the proposed model was demonstrated through an example. Initially, the Paraconsistent-Fuzzy Digital PID controller was built and tested in an isolated MATLAB environment and then compared to the equivalent Digital PID function of this software for a standard step excitation. After this step, a level control plant was modeled to execute the controller function on a physical model, making the tests closer to actual conditions. For this, the control parameters (proportional, integral and derivative) were determined for the configuration of the conventional Digital PID controller and of the Paraconsistent-Fuzzy Digital PID, and the control loops were assembled in MATLAB with the respective transfer function of the plant. Finally, the results of the comparison of the level control process between the Paraconsistent-Fuzzy Digital PID controller and the conventional Digital PID controller were presented.
Keywords: fuzzy logic, paraconsistent annotated logic, level control, digital PID
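As a point of reference for the conventional half of the comparison, a minimal sketch of a positional digital PID acting on a toy first-order level process is given below; the gains, sampling time and tank model are invented for illustration, and the paraconsistent-fuzzy treatment of Dc and Dct is not reproduced.

```python
def run_digital_pid(kp=2.0, ki=0.8, kd=0.1, dt=0.1, n_steps=300, setpoint=1.0):
    """Positional digital PID driving a toy first-order level process."""
    level, integral, prev_error = 0.0, 0.0, 0.0
    history = []
    for _ in range(n_steps):
        error = setpoint - level
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative
        prev_error = error
        # Toy tank: level rises with inflow u and drains proportionally to level
        level += dt * (0.5 * u - 0.3 * level)
        history.append(level)
    return history

levels = run_digital_pid()
print(f"level after step response: {levels[-1]:.3f} (setpoint 1.0)")
```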
Procedia PDF Downloads 284

2527 Modeling of Turbulent Flow for Two-Dimensional Backward-Facing Step Flow
Authors: Alex Fedoseyev
Abstract:
This study investigates a simplified model based on the generalized hydrodynamic equations (GHE) for the simulation of turbulent flow over a two-dimensional backward-facing step (BFS) at Reynolds number Re = 132000. The GHE were derived from the generalized Boltzmann equation (GBE), which was obtained from first principles from the chain of Bogolubov kinetic equations and considers particles of finite dimensions. The GHE has additional terms compared to the Navier-Stokes equations (NSE), namely temporal and spatial fluctuations. These terms have a timescale multiplier τ, and the GHE becomes the NSE when τ is zero. The nondimensional τ is the product of the Reynolds number and the squared length-scale ratio, τ = Re·(l/L)², where l is the apparent Kolmogorov length scale and L is a hydrodynamic length scale. BFS flow modeling results obtained by 2D calculations cannot match the experimental data for Re > 450. One or two additional equations of a turbulence model are then required to be added to the NSE, which typically has two to five parameters to be tuned for specific problems. It is shown that the GHE does not require an additional turbulence model, while the turbulent velocity results are in good agreement with the experimental results. A review of several studies on the simulation of flow over the BFS from 1980 to 2023 is provided; most of these studies used different turbulence models when Re > 1000. In this study, the 2D turbulent flow over a BFS with height H = L/3 (where L is the channel height) at Reynolds number Re = 132000 was investigated using numerical solutions of the GHE (by a finite element method) and compared to the solutions from the Navier-Stokes equations, the k–ε turbulence model, and experimental results. The comparison included the velocity profiles at X/L = 5.33 (near the end of the recirculation zone, available from the experiment), the recirculation zone length, and the velocity flow field. The mean velocity of the NSE was obtained by averaging the solution over the number of time steps. The solution with a standard k–ε model shows a velocity profile at X/L = 5.33 which has no backward flow. A standard k–ε model underpredicts the experimental recirculation zone length X/L = 7.0 ± 0.5 by a substantial amount of 20-25%, and a more sophisticated turbulence model is needed for this problem. The obtained data confirm that the GHE results are in good agreement with the experimental results for turbulent flow over a two-dimensional BFS. A turbulence model was not required in this case. The computations were stable. The solution time for the GHE is the same as or less than that for the NSE and significantly less than that for the NSE with a turbulence model. The proposed approach was limited to 2D and only one Reynolds number. Further work will extend this approach to 3D flow and higher Re.
Keywords: backward-facing step, comparison with experimental data, generalized hydrodynamic equations, separation, reattachment, turbulent flow
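The role of the timescale multiplier can be seen with a one-line calculation of τ = Re·(l/L)²; the length-scale ratio used below is an assumed placeholder, since the abstract does not report its value for this case.

```python
def tau(reynolds, l_over_L):
    """Nondimensional timescale multiplier of the GHE: tau = Re * (l/L)**2."""
    return reynolds * l_over_L ** 2

# Re = 132000 as in the BFS case; l/L is an assumed placeholder value here
print(f"tau = {tau(132000, 0.002):.3f}   (tau -> 0 recovers the Navier-Stokes equations)")
```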
Procedia PDF Downloads 61

2526 Tool for Analysing the Sensitivity and Tolerance of Mechatronic Systems in Matlab GUI
Authors: Bohuslava Juhasova, Martin Juhas, Renata Masarova, Zuzana Sutova
Abstract:
The article deals with a tool in Matlab GUI form designed to analyse the sensitivity and tolerance of a mechatronic system. In the analysed mechatronic system, torque is transferred from the drive to the load through a coupling containing flexible elements. Different methods of control system design are used. The classic form of feedback control is proposed using the Naslin method, the modulus optimum criterion and the inverse dynamics method. The cascade form of control is proposed based on a combination of the modulus optimum criterion and the symmetric optimum criterion. The sensitivity is analysed on the basis of the absolute and relative sensitivity of the system function to a change in the value of a chosen parameter of the mechatronic system, as well as of the control subsystem. The tolerance is analysed by determining the range of allowed relative changes of selected system parameters within the region of system stability. The tool allows analysing the influence of the torsion stiffness, torsion damping, the moments of inertia of the motor and the load, and the controller parameters. The sensitivity and tolerance are monitored in terms of the impact of a parameter change on the response, in the form of the system step response and the system frequency-response logarithmic characteristics. The Symbolic Math Toolbox was used to express the final form of the analysed system functions. The sensitivity and tolerance are graphically represented as a 2D graph of the sensitivity or tolerance of the system function and as 3D/2D static/interactive graphs of the step/frequency response.
Keywords: mechatronic systems, Matlab GUI, sensitivity, tolerance
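The relative-sensitivity computation described above, S_p = (∂T/∂p)·(p/T), can be sketched symbolically; the second-order system function and the stiffness parameter below are invented stand-ins for the tool's actual drive-coupling-load model, and the example uses Python's sympy rather than the Matlab Symbolic Math Toolbox.

```python
import sympy as sp

s, k, b, J = sp.symbols('s k b J', positive=True)

# Invented example system function (a second-order transfer function)
T = k / (J * s**2 + b * s + k)

# Relative sensitivity of T with respect to the stiffness parameter k
S_k = sp.simplify(sp.diff(T, k) * k / T)
print(S_k)   # should simplify to s*(J*s + b)/(J*s**2 + b*s + k)
```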
Procedia PDF Downloads 433

2525 Multi Response Optimization in Drilling Al6063/SiC/15% Metal Matrix Composite
Authors: Hari Singh, Abhishek Kamboj, Sudhir Kumar
Abstract:
This investigation proposes a grey-based Taguchi method to solve multi-response problems. The grey-based Taguchi method is based on Taguchi's design of experiments and adopts Grey Relational Analysis (GRA) to transform multi-response problems into single-response problems. In this investigation, an attempt has been made to optimize the drilling process parameters considering weighted output response characteristics using grey relational analysis. The output response characteristics considered are surface roughness, burr height and hole diameter error under the experimental conditions of cutting speed, feed rate, step angle, and cutting environment. The drilling experiments were conducted using an L27 orthogonal array. A combination of orthogonal array, design of experiments and grey relational analysis was used to ascertain the best possible drilling process parameters that give minimum surface roughness, burr height and hole diameter error. The results reveal that the combination of Taguchi design of experiments and grey relational analysis improves the surface quality of the drilled hole.
Keywords: metal matrix composite, drilling, optimization, step drill, surface roughness, burr height, hole diameter error
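The GRA step can be sketched as follows: smaller-the-better normalisation of each response, grey relational coefficients with a distinguishing coefficient ζ = 0.5, and a weighted grey relational grade per run. The four runs, response values and weights below are invented, not the measured L27 data.

```python
import numpy as np

# Invented responses for 4 runs: surface roughness (um), burr height (mm), diameter error (mm)
responses = np.array([[1.8, 0.30, 0.06],
                      [2.4, 0.22, 0.04],
                      [1.5, 0.35, 0.08],
                      [2.0, 0.25, 0.05]])
weights = np.array([0.4, 0.3, 0.3])          # assumed importance of the three responses
zeta = 0.5                                   # distinguishing coefficient

# Smaller-the-better normalisation, then deviation sequences
norm = (responses.max(axis=0) - responses) / (responses.max(axis=0) - responses.min(axis=0))
delta = 1.0 - norm
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grg = grc @ weights                          # grey relational grade per run
print("best run:", int(np.argmax(grg)) + 1, "grades:", np.round(grg, 3))
```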
Procedia PDF Downloads 319