Search results for: response spectrum method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 23557

19597 Pressure-Detecting Method for Estimating Levitation Gap Height of Swirl Gripper

Authors: Kaige Shi, Chao Jiang, Xin Li

Abstract:

The swirl gripper is an electrically activated noncontact handling device that uses swirling airflow to generate a lifting force. This force can be used to pick up a workpiece placed underneath the swirl gripper without any contact. It is applicable, for example, in semiconductor wafer production lines, where contact must be avoided during the handling and moving of a workpiece to minimize damage. When a workpiece levitates underneath a swirl gripper, the gap height between them is crucial for safe handling. Therefore, in this paper, we propose a method to estimate the levitation gap height by detecting the pressure at two points. The method is based on a theoretical model of the swirl gripper and has been experimentally verified. Furthermore, the force between the gripper and the workpiece can also be estimated using the detected pressure. As a result, the nonlinear relationship between force and gap height can be linearized by adjusting the rotating speed of the fan in the swirl gripper according to the estimated force and gap height. The linearized relationship is expected to enhance the handling stability of the workpiece.
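The inversion step of such a method can be sketched in a few lines: given a monotonic pressure-gap relation, a measured pressure maps back to a gap height by interpolation. The calibration table below is hypothetical, standing in for the paper's theoretical model.

```python
import numpy as np

# Hypothetical monotonic calibration: pressure at a fixed sensing point as a
# function of gap height h (values are illustrative, not from the paper).
h_cal = np.array([0.1, 0.2, 0.3, 0.4, 0.5])             # gap height, mm
p_cal = np.array([-120.0, -80.0, -55.0, -40.0, -30.0])  # gauge pressure, Pa

def estimate_gap(p_measured):
    """Invert the monotonic pressure-gap curve by linear interpolation."""
    # np.interp requires increasing x values; p_cal is increasing here
    return float(np.interp(p_measured, p_cal, h_cal))

gap_mm = estimate_gap(-60.0)  # falls between the -80 Pa and -55 Pa samples
```

With a second pressure point, the same lookup idea would also yield the lifting force, enabling the fan-speed linearization described in the abstract.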

Keywords: swirl gripper, noncontact handling, levitation, gap height estimation

Procedia PDF Downloads 119
19596 The Influence of Water on the Properties of Cellulose Fibre Insulation

Authors: Pablo Lopez Hurtado, Antroine Rouilly, Virginie Vandenbossche

Abstract:

Cellulose fibre insulation is an eco-friendly building material made from recycled paper fibres, treated with borates for fungal and fire resistance. It is comparable in thermal and acoustic performance to mineral wool insulation and other insulation materials based on non-renewable resources. The main method of application consists of separating and blowing the fibres into attics or closed wall cavities. Another method, known as the “wet spray method”, is gaining interest. With this method, the fibres are projected together with pulverized water so that they stick to the wall cavities. The issue with the wet spray technique is that the water dosage can be difficult to control. A high water dosage implies not only a longer drying time, depending on ambient conditions, but also a change in the performance of the material itself. In our work, we studied the thermal and mechanical properties of wet-sprayed cellulose insulation in order to understand how water dosage affects these properties. The material was first characterized to determine the chemical and physical properties of the fibres. Then, representative samples of wet-sprayed cellulose with varying applied water dosages were subjected to thermal conductivity and compression testing in order to better understand how the changes in the fibres induced by drying affect these properties.

Keywords: cellulose fibre, recycled paper, moisture sorption, thermal insulation

Procedia PDF Downloads 290
19595 The Philosophical Hermeneutics Contribution to Form a Highly Qualified Judiciary in Brazil

Authors: Thiago R. Pereira

Abstract:

Philosophical hermeneutics can change the Brazilian judiciary because of its understanding of the characteristics of the human being. It is impossible for a human invested with the function of judge to make absolutely neutral decisions, but philosophical hermeneutics can assist the judge in making impartial decisions based on the federal constitution. Normative legal positivism imagined a neutral judge, one able to decide without any preconceived ideas and without allowing his or her background to exert influence. When a judge decides based on clear legal rules, the problem is smaller; but when there are no clear legal rules and the judge must decide based on principles, there is a risk that the decision rests on what the judge personally believes, and this solipsism gives the issue a huge dimension. Today the Brazilian judiciary is independent, but it needs greater knowledge of philosophy and the philosophy of law, partly because the bigger problem is the unpredictability of its decisions. At present, when a lawsuit is filed, the result of the judgment is absolutely unpredictable; it is almost a gamble. There must be at least minimal legal certainty and predictability of judicial decisions, so that people with similar cases do not receive opposite sentences. Relativism, since classical antiquity, has accepted the possibility of multiple answers. From the Greeks in the sixth century before Christ, through the Germans in the eighteenth century, and even today, the constitution has been established as the great law, the Grundnorm, and thus the relativism of life can be greatly reduced when the hermeneut uses the Constitution as an interpretive north, with all interpretation passing through the constitutional hermeneutic filter. For a current philosophy of law, within a legal system with a Federal Constitution there is a single correct answer to a specific case; the challenge is how to find this right answer.
The obvious answer is to use the constitutional principles. But in many cases principles will collide, and to resolve the collision the judge or hermeneut may take a solipsistic path, choosing what he or she personally believes to be right. For obvious reasons, that conduct is not safe. Thus, a theory of decision is necessary in the search for justice, and hermeneutic philosophy and the linguistic turn will be necessary to find the right answer, that is, the constitutionally most appropriate response. The constitutionally appropriate response will not always be the answer that individuals would prefer, but we must put aside our preferences and defend the answer that the Constitution gives us. Therefore, hermeneutics applied to law, in search of the constitutionally appropriate response, is the safest way to avoid individual judicial decisions. The aim of this paper is to present the science of law starting from the linguistic turn and philosophical hermeneutics, moving away from legal positivism. The methodology is qualitative, academic and theoretical: philosophical hermeneutics applied with the mission of proposing a new way of thinking about the science of law. The research sought to demonstrate the difficulty the Brazilian courts have in departing from the secular influence of legal positivism, and the need to think the science of law within a contemporary perspective, in which the linguistic turn and philosophical hermeneutics are the surest way to conduct the science of law in the present century.

Keywords: hermeneutic, right answer, solipsism, Brazilian judiciary

Procedia PDF Downloads 331
19594 Analysis of Heat Exchanger Area of Two Stage Cascade Refrigeration System Using Taguchi Methodology

Authors: A. D. Parekh

Abstract:

The present work describes the relative contributions of operating parameters to the required heat transfer areas of the three heat exchangers, viz. the evaporator, condenser and cascade condenser, of a two-stage R404A-R508B cascade refrigeration system using the Taguchi method. The operating parameters considered in the present study include (1) the condensing temperatures of the high-temperature and low-temperature cycles, (2) the evaporating temperature of the low-temperature cycle, (3) the degree of superheating in the low-temperature cycle and (4) the refrigerating effect. The heat transfer areas of the three heat exchangers are studied under variation of the above operating parameters, and the optimum working level of each operating parameter for minimum heat transfer area of each heat exchanger has been obtained using the Taguchi method. The analysis reveals that the evaporating temperature of the low-temperature cycle and the refrigerating effect contribute most to the evaporator area. The condenser area is mainly influenced by both the condensing temperature of the high-temperature cycle and the refrigerating effect. The cascade condenser area is mainly affected by the refrigerating effect, while the effects of the other operating parameters are minimal.
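The core of a Taguchi analysis of this kind can be sketched in a few lines. For a "smaller-the-better" response such as heat transfer area, the signal-to-noise ratio is S/N = -10 log10(mean(y²)), and the preferred level of each factor is the one with the higher mean S/N. The factor layout and response values below are invented for illustration, not the paper's data.

```python
import numpy as np

# Hypothetical 2-factor, 2-level layout with replicated responses
# (heat transfer area, m^2); all numbers are illustrative only.
levels = [(1, 1), (1, 2), (2, 1), (2, 2)]            # (A, B) level per run
y = np.array([[2.1, 2.0], [2.6, 2.5], [1.8, 1.9], [2.3, 2.2]])

# Smaller-the-better signal-to-noise ratio per run
sn = -10.0 * np.log10((y ** 2).mean(axis=1))

# Main effect of factor A: average S/N at each of its levels
A = np.array([lv[0] for lv in levels])
sn_A1 = sn[A == 1].mean()
sn_A2 = sn[A == 2].mean()
best_A = 1 if sn_A1 > sn_A2 else 2   # higher S/N means smaller area
```

The same level-averaging, run per factor, gives the optimum parameter combination; an ANOVA on the S/N values then apportions each factor's relative contribution.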

Keywords: cascade refrigeration system, Taguchi method, heat transfer area, ANOVA, optimal solution

Procedia PDF Downloads 369
19593 Computing Transition Intensity Using Time-Homogeneous Markov Jump Process: Case of South African HIV/AIDS Disposition

Authors: A. Bayaga

Abstract:

This research provides a technical account of estimating transition probabilities using a time-homogeneous Markov jump process applied to South African HIV/AIDS data from Statistics South Africa. It employs a maximum likelihood estimation (MLE) model to explore the possible influence of transition probabilities on mortality cases, the data being actual Statistics South Africa records. This was conducted via an integrated demographic and epidemiological model of the South African HIV/AIDS epidemic, fitted to age-specific HIV prevalence data and recorded death data using the MLE model. Although previous model results suggest that HIV prevalence in South Africa declined and AIDS mortality rates fell over 2002-2013, our results differ markedly from the generally accepted HIV models (Spectrum/EPP and ASSA2008) for South Africa. Supplementary research is needed to refine the demographic parameters in the model and to apply it to each of the nine provinces of South Africa.
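For a time-homogeneous Markov jump process, transition probabilities over a horizon t follow from the intensity matrix Q as P(t) = exp(Qt). A minimal sketch, with a made-up three-state intensity matrix rather than the paper's fitted parameters:

```python
import numpy as np

# Illustrative 3-state intensity matrix Q (rates per year, invented);
# states could be e.g. HIV-negative, HIV-positive, dead. Rows sum to zero.
Q = np.array([[-0.05,  0.05,  0.00],
              [ 0.00, -0.20,  0.20],
              [ 0.00,  0.00,  0.00]])

def expm_series(A, terms=40):
    """Matrix exponential via truncated Taylor series (adequate for small ||A||)."""
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

# Transition probability matrix over t = 5 years: P(t) = exp(Q t)
t = 5.0
P = expm_series(Q * t)
```

Each row of P is a probability distribution (rows sum to one), and the MLE of the off-diagonal rates of Q is, per transition, the observed transition count divided by the total time spent in the originating state.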

Keywords: AIDS mortality rates, epidemiological model, time-homogeneous Markov jump process, transition probability, Statistics South Africa

Procedia PDF Downloads 476
19592 Synthesis and Characterization of Renewable Resource Based Green Epoxy Coating

Authors: Sukanya Pradhan, Smita Mohanty, S. K Nayak

Abstract:

Plant oils are a great renewable source and a reliable starting material for new products with a wide spectrum of structural and functional variations. Even though petroleum products might render the same, they would also impose a high risk of environmental and health hazards. Since epoxidized vegetable oils are easily available, eco-compatible, non-toxic and renewable, they have drawn much attention in the polymer industry, especially for the development of eco-friendly coating materials. In this study, a waterborne epoxy coating was prepared from epoxidized soybean oil using triethanolamine. Because of its hydrophobic nature, making it hydrophilic was a tough and tedious task. The hydrophobic bio-based epoxy was modified into a waterborne epoxy with the help of a plant-based anhydride as curing agent. Physico-mechanical tests, chemical resistance tests and thermal analysis of the green coating material were carried out, showing good physico-mechanical and chemical resistance properties as well as environmental friendliness. The final material was fully characterized in terms of scratch hardness, gloss, impact resistance, adhesion and bend tests.

Keywords: epoxidized soybean oil, waterborne, curing agent, green coating

Procedia PDF Downloads 530
19591 Modeling of Crack Propagation Path in Concrete with Coarse Trapezoidal Aggregates by Boundary Element Method

Authors: Chong Wang, Alexandre Urbano Hoffmann

Abstract:

The interaction between a crack and a trapezoidal aggregate in a single-edge-notched concrete beam is simulated using the boundary element method with an automatic crack-extension program. The stress intensity factors of the growing crack are obtained from the J-integral. Three crack extension paths (deflecting around the particulate, growing along the interface, and penetrating into the particulate) are obtained depending on the mismatch between the mechanical characteristics of the matrix and the particulate. The toughening is also quantified by the ratio of stress intensity factors. The results reveal that stress shielding occurs, and toughening is obtained, when the crack approaches a stiff and strong aggregate weakly bonded to a relatively soft matrix. The present work is intended to aid the design of aggregate-reinforced concretes.

Keywords: aggregate concrete, boundary element method, two-phase composite, crack extension path, crack/particulate interaction

Procedia PDF Downloads 416
19590 Clinical Outcomes of Neonates Born to COVID-19 Positive Mothers in a Tertiary Level Private Hospital

Authors: Patricia Abigail B. Miranda, Imelda A. Luna

Abstract:

Introduction: COVID-19 infection is a novel viral illness that began as a local epidemic in December 2019 in Wuhan, China, and emerged into a pandemic by February 2020. The virus causes a spectrum of signs and symptoms, ranging from mild upper respiratory symptoms to acute respiratory distress syndrome, which may lead to death. Children and neonates afflicted with the disease may present asymptomatically or with mild symptoms. To date, there has been limited local data describing the outcomes of the growing number of COVID-19 cases, specifically in neonates. Methods: A cross-sectional study was conducted to determine the outcomes of neonates born to COVID-19-positive mothers from March 2020 until June 2022. The prevalence of COVID-19 among these neonates was also determined. Results: The prevalence of positive tests after 24 hours of life was 8%, while the prevalence after 48 hours, among those who still underwent testing, was 13.51%. Among COVID-19-negative neonates with symptoms, the most common presentation was tachypnea (5.7%). The prevalence of complications among COVID-19-negative neonates delivered to COVID-19-positive mothers was 22.7%. Conclusion: Neonates born to COVID-19-positive mothers who yielded positive COVID-19 results were generally asymptomatic, and there were no associated mortalities among those who tested positive.

Keywords: COVID-19, neonates, outcomes, clinical profile

Procedia PDF Downloads 63
19589 A Prenylflavanoid, HME5 with Antiproliferative Activity in Human Ovarian Cancer Cells

Authors: Mashitoh Abd Rahman, Najihah Mohd Hashim, Faiqah Ramli, Syam Mohan, Noraziah Nordin, Hamed Karimian, Hapipah Mohd Ali

Abstract:

Ovarian cancer is among the most lethal gynecological malignancies. HME5, a prenylflavanoid, has been isolated from a local medicinal plant. This compound has been reported to possess a broad spectrum of biological activities, including anticancer properties. However, the potential of HME5 as an antiproliferative and cytotoxic agent in ovarian cancer cells had not yet been investigated. In the present study, we examined the antiproliferative and cytotoxic effects of HME5 on the Caov-3 (human ovarian adenocarcinoma) cell line using the 3-[4,5-dimethylthiazol-2-yl]-2,5-diphenyltetrazolium bromide (MTT) assay, acridine orange/propidium iodide (AO/PI) staining and cell cycle analysis. HME5 inhibited Caov-3 in a time-dependent manner, with IC50 values of 5 µg/ml, 2 µg/ml and 1 µg/ml after 24 h, 48 h and 72 h of treatment, respectively. Morphological observation from the AO/PI analysis showed that HME5 induced apoptosis at 24 and 48 h post-treatment. Furthermore, flow cytometry cell cycle profiling indicated cell cycle arrest at the G1 phase. In conclusion, HME5 inhibited the proliferation of Caov-3 cells through the induction of apoptosis and G1 cell cycle arrest.

Keywords: apoptosis, prenylflavanoid, ovarian cancer, HME5

Procedia PDF Downloads 442
19588 Parametric Analysis and Optimal Design of Functionally Graded Plates Using Particle Swarm Optimization Algorithm and a Hybrid Meshless Method

Authors: Foad Nazari, Seyed Mahmood Hosseini, Mohammad Hossein Abolbashari, Mohammad Hassan Abolbashari

Abstract:

The present study is concerned with the optimal design of functionally graded plates using the particle swarm optimization (PSO) algorithm. The meshless local Petrov-Galerkin (MLPG) method is employed to obtain the natural frequencies of the functionally graded (FG) plate. The effects of two parameters, the thickness-to-height ratio and the volume fraction index, on the natural frequencies and total mass of the plate are studied using the MLPG results. Then, the first natural frequency of the plate, for conditions where MLPG data are not available, is predicted by an artificial neural network (ANN) trained with the back-error propagation (BEP) technique. The ANN predictions are in good agreement with the actual data. To maximize the first natural frequency and minimize the mass of the FG plate simultaneously, the weighted-sum optimization approach and the PSO algorithm are used. The proposed optimization process can provide designers of FG plates with useful data.
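A minimal global-best PSO minimizing a weighted-sum objective can be sketched as below. The surrogate objective merely stands in for the paper's ANN-predicted frequency and mass; its form, the bounds and all coefficients are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weighted-sum objective: f1 is the negated first natural
# frequency (so minimizing it maximizes frequency), f2 a stand-in for mass.
def objective(x, w=(0.5, 0.5)):
    thickness_ratio, vf_index = x
    f1 = -(10.0 - (thickness_ratio - 0.1) ** 2)   # maximize frequency
    f2 = 5.0 * thickness_ratio + vf_index         # minimize mass
    return w[0] * f1 + w[1] * f2

def pso(f, bounds, n=20, iters=100, wi=0.7, c1=1.5, c2=1.5):
    """Minimal global-best particle swarm optimizer."""
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, (n, len(lo)))          # particle positions
    v = np.zeros_like(x)                           # particle velocities
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()                # global best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = wi * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

# Bounds for (thickness-to-height ratio, volume fraction index), invented
best_x, best_val = pso(objective, bounds=[(0.05, 0.5), (0.0, 5.0)])
```

In the paper's setting, `objective` would call the trained ANN rather than a closed-form surrogate; the swarm mechanics are unchanged.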

Keywords: optimal design, natural frequency, FG plate, hybrid meshless method, MLPG method, ANN approach, particle swarm optimization

Procedia PDF Downloads 352
19587 Analysis of Ancient and Present Lightning Protection Systems of Large Heritage Stupas in Sri Lanka

Authors: J.R.S.S. Kumara, M.A.R.M. Fernando, S. Venkatesh, D.K. Jayaratne

Abstract:

Protection of heritage monuments against lightning has become extremely important as far as their historical value is concerned. When such structures are large and tall, the risk from lightning initiated from both cloud and ground can be high. This paper presents a lightning risk analysis of three giant stupas of the Anuradhapura era (fourth century BC onwards) in Sri Lanka: Jethawaaramaya (269-296 AD), Abayagiriya (88-76 BC) and Ruwanweliseya (161-137 BC), the third, fifth and seventh largest ancient structures in the world. These stupas are solid brick structures consisting of a base, a near-hemispherical dome and a conical spire on top. The hypothesis for their original lightning protection technique is that the ancient stupas were constructed with a dielectric crystal on top, connected to the ground through a conducting material. At present, however, all three stupas are protected with Franklin-rod type air termination systems located on top of the spire. First, a risk analysis was carried out according to IEC 62305 by considering the isokeraunic level of the area and the height of the stupas. Then the standard protective angle method and the rolling sphere method were used to locate the possible touching points on the surface of the stupas. The study was extended to estimate the critical current that could strike the unprotected areas of the stupas. The equations proposed by Uman (2001) and Cooray (2007) were used to find the striking distances. A modified version of the rolling sphere method was also applied to examine the effects of upward leaders. All these studies were carried out for two scenarios: with the original (i.e., ancient) lightning protection system and with the present (i.e., new) air termination system. The field distribution on the surface of the stupa in the presence of a downward leader was obtained using the finite element based commercial software COMSOL Multiphysics for further investigation of lightning risks.
The obtained results were analyzed and compared with each other to evaluate the performance of the ancient and new lightning protection methods and to identify suitable methods for designing lightning protection systems for stupas. According to IEC standards, all three stupas, with either the new or the ancient lightning protection system, have Level IV protection as per the protective angle method. However, according to the rolling sphere method applied with Uman's equation, the protection level is III. The same method applied with Cooray's equation always indicates a higher risk than with Uman's equation. It was found that there is a risk of lightning strikes on the dome and the square chamber of the stupa, and the corresponding critical current values differ according to the equations used in the rolling sphere method and the modified rolling sphere method.
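The rolling sphere calculation can be illustrated with the widely used electrogeometric relation r = 10·I^0.65 (r in metres, I in kA) from IEC 62305. Note that the paper uses the Uman and Cooray expressions, which differ from this one, so the numbers below are indicative only.

```python
# IEC 62305 electrogeometric relation: striking distance r (m) for a return
# stroke of peak current I (kA). Not the Uman/Cooray forms used in the paper.
def striking_distance(peak_current_kA):
    return 10.0 * peak_current_kA ** 0.65

# Each protection level fixes the rolling-sphere radius; the minimum current
# the method reliably intercepts follows by inverting the relation.
def critical_current(sphere_radius_m):
    return (sphere_radius_m / 10.0) ** (1 / 0.65)

r_lpl3 = 45.0                      # sphere radius for protection level III, m
i_min = critical_current(r_lpl3)   # ~10 kA minimum intercepted current
```

Smaller currents have shorter striking distances, so a sphere of the corresponding radius can "roll into" recesses such as the dome-to-square-chamber junction, which is why critical current values pinpoint the exposed surfaces.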

Keywords: stupa, heritage, lightning protection, rolling sphere method, protection level

Procedia PDF Downloads 227
19586 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit

Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic

Abstract:

Over the years, High Purity Germanium (HPGe) detectors have proved to be an excellent practical tool and, as such, are in wide use today in low-background γ-spectrometry. One of the advantages of gamma-ray spectrometry is its easy sample preparation, as chemical processing and separation of the studied subject are not required; thus, a single measurement can simultaneously provide both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra where they would otherwise superimpose within a single energy peak, potentially compromising the analysis and producing wrongly assessed results. Naturally, this feature is of great importance when radionuclides and their activity concentrations are being identified and high precision is a necessity. In measurements of this nature, in order to reproduce good and trustworthy results, one first has to perform an adequate full-energy peak (FEP) efficiency calibration of the equipment. However, experimental determination of the response, i.e., the efficiency curve for a given detector-sample configuration and geometry, is not always easy and requires a set of reference calibration sources in order to cover the broader energy ranges of interest. With the goal of overcoming these difficulties, many researchers have turned towards software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), which has proven time and time again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established specifications of the detector.
Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially with older equipment. Deterioration of these parameters decreases the active volume of the crystal and can affect the efficiencies by a large margin if not properly taken into account. In this study, the optimisation of models of two HPGe detectors through the Geant4 toolkit developed at CERN is described, with the goal of improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead layer thicknesses, inner crystal void dimensions, etc.). The detectors on which the optimisation procedures were carried out were a standard traditional coaxial extended range detector (XtRa HPGe, CANBERRA) and a broad energy range planar detector (BEGe, CANBERRA). The optimised models were verified by comparison with experimentally obtained data from measurements of a set of point-like radioactive sources. The results for both detectors displayed good agreement with the experimental data, within an average statistical uncertainty of ∼4.6% for the XtRa and ∼1.8% for the BEGe detector, over the energy ranges 59.4-1836.1 keV and 59.4-1212.9 keV, respectively.
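The experimental FEP efficiency against which such simulations are verified is, in essence, the ratio of net peak counts to photons emitted during the live time. A sketch with illustrative source values (not the paper's measurements):

```python
# Full-energy-peak efficiency from a point-source measurement: net counts in
# the peak divided by the number of photons emitted during the live time.
def fep_efficiency(net_counts, activity_bq, emission_prob, live_time_s):
    emitted = activity_bq * emission_prob * live_time_s
    return net_counts / emitted

# Hypothetical Cs-137 check source: 661.7 keV line, P_gamma ~ 0.851;
# activity and counts below are invented for illustration.
eff = fep_efficiency(net_counts=150_000, activity_bq=5_000,
                     emission_prob=0.851, live_time_s=3600)
```

In a Geant4 model, the same ratio is formed from the number of events depositing the full photon energy versus the number of primaries generated, and the detector geometry (dead layers, crystal-to-window gap) is tuned until the simulated curve matches the measured one.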

Keywords: HPGe detector, γ spectrometry, efficiency, Geant4 simulation, Monte Carlo method

Procedia PDF Downloads 102
19585 Improving Detection of Illegitimate Scores and Assessment in Most Advantageous Tenders

Authors: Hao-Hsi Tseng, Hsin-Yun Lee

Abstract:

The Most Advantageous Tender (MAT) has been criticized for its susceptibility to dictatorial situations and for its handling of same-score, same-rank issues. This study applies the four criteria of Arrow's Impossibility Theorem to construct a mechanism for revealing illegitimate scores in scoring methods. While commonly used to mitigate problems resulting from extreme scores, ranking methods hide significant defects that adversely affect selection fairness. To address these shortcomings, this study relies mainly on the overall evaluated score method, using standardized scores plus a normal cumulative distribution function conversion to calculate the evaluation of vendor preference. This allows free score evaluations, which reduces the influence of dictatorial behavior and avoids same-score, same-rank issues. Large-scale simulations confirm that this method outperforms currently used methods under the criteria of the Impossibility Theorem.
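The standardized-score-plus-normal-CDF conversion can be sketched directly with the Python standard library. The raw scores below are invented; the point of the mapping is that extreme (potentially dictatorial) scores are compressed toward the ends of the (0, 1) interval rather than dominating the aggregate.

```python
from statistics import NormalDist, mean, stdev

# Convert one evaluator's raw scores: standardize to z-scores, then map to
# (0, 1) with the standard normal CDF. Raw scores here are illustrative.
def cdf_scores(raw):
    mu, sigma = mean(raw), stdev(raw)
    z = [(x - mu) / sigma for x in raw]
    return [NormalDist().cdf(v) for v in z]

raw = [62, 75, 75, 90, 98]
converted = cdf_scores(raw)   # monotone, bounded in (0, 1)
```

Summing the converted values across evaluators then gives the overall evaluated score for each vendor, without the fixed integer steps of a pure ranking method.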

Keywords: Arrow’s impossibility theorem, cumulative normal distribution function, most advantageous tender, scoring method

Procedia PDF Downloads 453
19584 Study of Electron Cyclotron Resonance Acceleration by Cylindrical TE₀₁₁ Mode

Authors: Oswaldo Otero, Eduardo A. Orozco, Ana M. Herrera

Abstract:

In this work, we present results from analytical and numerical studies of electron acceleration by a TE₀₁₁ cylindrical microwave mode in a static homogeneous magnetic field under the electron cyclotron resonance (ECR) condition. The stability of the orbits is analyzed using particle orbit theory. In order to better understand the wave-particle interaction, we decompose the azimuthal electric field component as the superposition of right- and left-hand circularly polarized standing waves. The trajectory, energy and phase shift of the electron are found through a numerical solution of the relativistic Newton-Lorentz equation using the Boris finite-difference method. It is shown that an electron longitudinally injected with an energy of 7 keV at a radial position r = Rc/2, Rc being the cavity radius, is accelerated up to an energy of 90 keV by an electric field strength of 14 kV/cm at a frequency of 2.45 GHz. This energy can be used to produce X-rays for medical imaging. These results can serve as a starting point for studying the acceleration of electrons in a magnetic field changing slowly in time (GYRAC), which has important applications such as the electron cyclotron resonance ion proton accelerator (ECR-IPAC) for cancer therapy and the control of plasma bunches with relativistic electrons.
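The Boris scheme referred to above can be sketched for a single electron. The field values below are illustrative (a static 0.0875 T field matches the 2.45 GHz ECR condition for electrons) and the microwave mode structure is omitted; in a purely magnetic field the scheme conserves |u|, and hence kinetic energy, to machine precision.

```python
import numpy as np

# Minimal relativistic Boris push for one electron in static uniform E and B
# fields (SI units). Fields are illustrative, not the paper's TE011 mode.
q, m, c = -1.602e-19, 9.109e-31, 2.998e8

def boris_step(u, E, B, dt):
    """Advance u = gamma*v by one time step with the Boris rotation scheme."""
    u_minus = u + (q * dt / (2 * m)) * E              # first half electric kick
    gamma = np.sqrt(1 + np.dot(u_minus, u_minus) / c**2)
    t = (q * dt / (2 * m * gamma)) * B                # rotation vector
    s = 2 * t / (1 + np.dot(t, t))
    u_prime = u_minus + np.cross(u_minus, t)          # magnetic rotation,
    u_plus = u_minus + np.cross(u_prime, s)           # exactly norm-preserving
    return u_plus + (q * dt / (2 * m)) * E            # second half electric kick

# Pure magnetic field: |u| (kinetic energy) must stay constant
u = np.array([1.0e7, 0.0, 0.0])    # gamma*v, m/s
B = np.array([0.0, 0.0, 0.0875])   # ~ECR field for 2.45 GHz, tesla
for _ in range(1000):
    u = boris_step(u, np.zeros(3), B, dt=1e-12)
```

Adding the time-dependent cavity fields of the TE₀₁₁ mode at the particle position each step would reproduce the kind of ECR acceleration study described in the abstract.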

Keywords: Boris method, electron cyclotron resonance, finite difference method, particle orbit theory, X-ray

Procedia PDF Downloads 147
19583 Simplified Modeling of Post-Soil Interaction for Roadside Safety Barriers

Authors: Charly Julien Nyobe, Eric Jacquelin, Denis Brizard, Alexy Mercier

Abstract:

The performance of roadside safety barriers depends largely on the dynamic interaction between post and soil, which plays a key role in the response of barriers to crash testing. In the literature, soil-post interaction is modeled in crash test simulations using three approaches. Many researchers initially used the finite element approach, in which the post is embedded in a continuum soil modeled with solid finite elements. This is the most comprehensive and detailed approach, employing a mesh-based continuum to model the soil's behavior and its interaction with the post. Although it takes all soil properties into account, it is very costly in terms of simulation time. In the second approach, all the points of the post located at a predefined depth are fixed. Although this reduces CPU time, it overestimates the soil-post stiffness. The third approach models the post as a beam supported by a set of nonlinear springs in the horizontal directions; for support in the vertical direction, the post is constrained at a node at ground level. This approach is less costly, but the literature does not provide a simple procedure to determine the constitutive law of the springs. The aim of this study is to propose a simple and low-cost procedure to obtain the constitutive law of the nonlinear springs that model the soil-post interaction. To this end, we first present a procedure to obtain the constitutive law of the springs from the simulation of a soil compression test. The test consists of compressing the soil contained in a tank with a rigid solid, up to a vertical displacement of 200 mm. The resultant force exerted by the ground on the rigid solid and its vertical displacement are extracted, and a force-displacement curve is determined. The proposed procedure for replacing the soil with springs must be tested against a reference model.
The reference model consists of a wooden post embedded in the ground and struck by an impactor. Two simplified spring models are studied. In the first, called the Kh-Kv model, springs are attached to the post in both the horizontal and vertical directions. The second, the Kh model, is the one described in the literature. The two simplified models are compared with the reference model according to several criteria: the vertical and horizontal displacements of a node located at the top of the post, the displacement of the post's center of rotation, and the impactor velocity. The results given by both simplified models are very close to those of the reference model. The Kh-Kv model is slightly better than the Kh model, and it is also more attractive because it involves fewer arbitrary conditions. The simplified models also reduce the simulation time by a factor of 4. The Kh-Kv model can therefore be used as a reliable tool to represent the soil-post interaction in future research and development of road safety barriers.
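The proposed procedure can be sketched as follows: sample the force-displacement curve produced by the compression-test simulation, then use it as a piecewise-linear constitutive law for each spring. The curve below is a made-up hardening law, not the paper's simulation output.

```python
import numpy as np

# Sampled force-displacement response of the simulated soil compression test
# (0 to 200 mm of vertical displacement). The hardening law is invented.
disp_mm = np.linspace(0.0, 200.0, 21)
force_kN = 0.5 * disp_mm + 0.004 * disp_mm ** 2   # illustrative hardening

def spring_force(u_mm):
    """Piecewise-linear nonlinear-spring constitutive law for one soil spring."""
    return float(np.interp(u_mm, disp_mm, force_kN))
```

In a beam-on-springs barrier model, each horizontal (and, in the Kh-Kv variant, vertical) support node would evaluate such a law at its current deflection, scaled by its tributary embedment depth.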

Keywords: crash tests, nonlinear springs, soil-post interaction modeling, constitutive law

Procedia PDF Downloads 5
19582 Comparative Parametric Analysis on the Dynamic Response of Fibre Composite Beams with Debonding

Authors: Indunil Jayatilake, Warna Karunasena

Abstract:

Fiber Reinforced Polymer (FRP) composites enjoy an array of applications ranging from aerospace, marine and military to automobile, recreational and civil industry due to their outstanding properties. A structural glass fiber reinforced polymer (GFRP) composite sandwich panel made from E-glass fiber skins and a modified phenolic core has been manufactured in Australia for civil engineering applications. One of the major damage mechanisms in FRP composites is skin-core debonding. The presence of debonding is of great concern not only because it severely affects strength but also because it modifies the dynamic characteristics of the structure, including natural frequencies and vibration modes. This paper investigates the dynamic characteristics of a GFRP beam with single and multiple debonding through finite-element-based numerical simulation and analysis using the STRAND7 finite element (FE) software package. Three-dimensional computer models were developed and numerical simulations performed to assess the dynamic behavior. The FE model was validated against published experimental, analytical and numerical results for fully bonded as well as debonded beams. A comparative analysis was then carried out based on a comprehensive parametric investigation. It is observed that the natural frequency is reduced more by a single debonding than by equally sized multiple debonding regions located symmetrically about the single debonding position. Thus, a large single debonded area causes more damage, in terms of natural frequency reduction, than isolated small debonded zones of equivalent total area in the GFRP beam. Furthermore, the extent of the natural frequency shifts appears mode-dependent and does not increase monotonically with mode number.

Keywords: debonding, dynamic response, finite element modelling, novel FRP beams

Procedia PDF Downloads 106
19581 The Analysis of Own Signals of PM Electrical Machines – Example of Eccentricity

Authors: Marcin Baranski

Abstract:

This article presents a vibration diagnostic method designed for permanent-magnet (PM) traction motors, which are commonly used in the traction drives of electric vehicles. The method exploits a specific structural property of machines excited by permanent magnets: the electromotive force (EMF) generated by vibrations. This work presents a field-circuit model together with the results of static tests, calculations and simulations.

Keywords: electrical vehicle, permanent magnet, traction drive, vibrations, electrical machine, eccentricity

Procedia PDF Downloads 611
19580 Enhancing Wire Electric Discharge Machining Efficiency through ANOVA-Based Process Optimization

Authors: Rahul R. Gurpude, Pallvita Yadav, Amrut Mulay

Abstract:

In recent years, there has been a growing focus on advanced manufacturing processes, and one such process is wire electric discharge machining (WEDM). WEDM is a precision machining process designed for cutting electrically conductive materials with exceptional accuracy; it removes material from the workpiece by spark erosion driven by electricity. Initially developed for the precision machining of hard materials, WEDM has seen significant advances in recent times, with numerous studies and techniques based on electrical discharge phenomena being proposed. These research efforts and methods in the field of EDM cover a wide range of applications, including mirror-like finish machining, surface modification of mold dies, machining of insulating materials, and manufacturing of micro products. WEDM has found particularly extensive use in the high-precision machining of complex workpieces of varying hardness and intricate shape. During cutting, a wire with a diameter of 0.18 mm is employed. EDM performance is typically evaluated on two critical factors: material removal rate (MRR) and surface roughness (SR). To comprehensively assess the impact of the machining parameters on the quality characteristics of the EDM process, an analysis of variance (ANOVA) was conducted. This statistical analysis determined the significance of the machining parameters and their relative contributions in controlling the response of the EDM process. From this analysis, optimal levels of the machining parameters were identified to achieve the desired material removal rate and surface roughness.
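The ANOVA decomposition used to rank parameter contributions can be sketched for a single factor: the F statistic compares between-level variance to within-level variance, and the sum-of-squares ratio gives the factor's percent contribution. The MRR values below are invented for illustration, not measurements from the study.

```python
def one_way_anova(groups):
    """One-way ANOVA: returns (F, ss_between, ss_within).
    Each element of `groups` holds the responses (e.g. MRR)
    measured at one level of a machining parameter."""
    all_y = [y for g in groups for y in g]
    grand = sum(all_y) / len(all_y)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((y - sum(g) / len(g)) ** 2 for g in groups for y in g)
    df_b = len(groups) - 1
    df_w = len(all_y) - len(groups)
    F = (ss_between / df_b) / (ss_within / df_w)
    return F, ss_between, ss_within

# Hypothetical MRR (mm^3/min) at three pulse-on-time levels
groups = [[1.2, 1.3, 1.1], [1.8, 1.9, 1.7], [2.4, 2.5, 2.3]]
F, ssb, ssw = one_way_anova(groups)
contribution = ssb / (ssb + ssw)  # fraction of total variance explained
```

A large F (relative to the F distribution with the stated degrees of freedom) marks the parameter as significant; the contribution fraction is how factors are ranked against each other.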

Keywords: WEDM, MRR, optimization, surface roughness

Procedia PDF Downloads 62
19579 Antibiotic Potential of Bioactive Compounds from a Marine Streptomyces Isolated from South Pacific Sediments

Authors: Ilaisa Kacivakanadina, Samson Viulu, Brad Carte, Katy Soapi

Abstract:

Two bioactive compounds, namely vulgamycin (also known as enterocin A) and 5-deoxyenterocin, were purified from marine bacterial strain 1903, which was isolated from marine sediments collected in the Solomon Islands. Morphological features of strain 1903 show that it belongs to the genus Streptomyces. The two secondary metabolites were extracted with EtOAc and purified by chromatographic methods using EtOAc and hexane solvents. Mass spectra and NMR data of the pure compounds were used to elucidate the chemical structures. In this study, both compounds were strongly active against wild-type Staphylococcus aureus (WTSA) (MIC < 1 µg/mL) and in the brine shrimp assay (BSA) (MIC < 1 µg/mL). 5-deoxyenterocin was also active against rifamycin-resistant Staphylococcus aureus (RRSA) (MIC 250 µg/mL), while vulgamycin showed bioactivity against methicillin-resistant Staphylococcus aureus (MRSA) (MIC 250 µg/mL). To the best of our knowledge, this is the first study to show bioactivity of 5-deoxyenterocin, and the first time that vulgamycin has been reported to be active in a BSA. No mechanism-of-action studies have been carried out for these two compounds against pathogens, which warrants further investigation of their mechanisms of action against microbial pathogens.

Keywords: 5-deoxyenterocin, bioactivity, brine shrimp assay (BSA), vulgamycin

Procedia PDF Downloads 170
19578 Integrating Artificial Neural Network and Taguchi Method on Constructing the Real Estate Appraisal Model

Authors: Mu-Yen Chen, Min-Hsuan Fan, Chia-Chen Chen, Siang-Yu Jhong

Abstract:

In recent years, real estate prediction and valuation have been topics of discussion in many developed countries. Improper hype created by investors leads to fluctuating real estate prices, affecting many consumers who wish to purchase their own homes. Scholars from various countries have therefore conducted research on real estate valuation and prediction. Combining the back-propagation neural network, which has become popular in recent years, with the orthogonal array of the Taguchi method, this study searched the parameter combinations defined by the levels of the orthogonal array to find the combination with which the artificial neural network obtained the most accurate results. The experimental results demonstrated that the proposed method outperformed traditional machine learning. Finally, the model proposed in this study had the best predictive performance and significantly reduced the time cost of simulation runs: the best predictive results could be found more efficiently with fewer experiments. Users could thus predict real estate transaction prices that are not far from current actual prices.
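The orthogonal-array idea can be sketched as follows: an L4(2³) array covers three two-level factors in only four runs instead of all 2³ = 8 combinations, while keeping every pair of columns balanced. The hyperparameter names, levels and stand-in scoring function below are hypothetical illustrations, not the study's actual network settings.

```python
# L4(2^3) orthogonal array: 4 runs for 3 two-level factors such that
# every pair of columns contains each level combination exactly once.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

# Hypothetical two-level settings for three ANN hyperparameters
levels = {
    "hidden_units":  [8, 16],
    "learning_rate": [0.01, 0.1],
    "epochs":        [100, 500],
}

def score(cfg):
    """Stand-in for training the network and returning validation
    error (lower is better); a real study would train the ANN here."""
    return (cfg["hidden_units"] * 0.01 + cfg["learning_rate"] * 2
            + 100.0 / cfg["epochs"])

names = list(levels)
runs = []
for row in L4:
    cfg = {n: levels[n][lv] for n, lv in zip(names, row)}
    runs.append((score(cfg), cfg))
best_error, best_cfg = min(runs)  # best of only 4 runs, not all 8
```

The balanced columns are what let per-factor level averages (and hence the best level of each factor) be estimated from so few runs.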

Keywords: artificial neural network, Taguchi method, real estate valuation model, investors

Procedia PDF Downloads 468
19577 Distributed Acoustic Sensing Signal Model under Static Fiber Conditions

Authors: G. Punithavathy

Abstract:

The research proposes a statistical model for distributed acoustic sensor interrogation units, which launch a laser pulse into an optical fiber; interactions within the fiber give rise to light reflections known as backscatter, whose properties are modulated by localized acoustic energy. The backscattered signal's amplitude and phase can be calculated using explicit equations. The model makes predictions for the amplitude signal spectrum and autocorrelation that are confirmed by experimental findings. Phase signal characteristics useful for studying optical time domain reflectometry (OTDR) sensing applications are provided and examined, showing good agreement with experiment. The simulations were implemented in Python. With this model, each component of the distributed acoustic sensing (DAS) system can be analyzed separately. The model assumes that the fiber is in a static condition, meaning that no external force or vibration is applied to the cable, i.e., no external acoustic disturbance is present. The backscattered signal then consists of a random noise component, caused by the intrinsic imperfections of the fiber, and a coherent component, due to the laser pulse interacting with the fiber.
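As a minimal sketch of the static-fiber picture described above, the backscattered field at one fiber position can be modeled as a deterministic coherent component plus the sum of many weak, randomly phased reflections from intrinsic imperfections (Rayleigh-like speckle). All parameter values below are illustrative, not fitted to any instrument.

```python
import cmath
import math
import random

def backscatter_amplitude(n_scatterers=500, coherent=0.5, seed=1):
    """Static-fiber backscatter model: the field at one gauge position
    is a coherent component plus many randomly phased reflections from
    intrinsic fiber imperfections. Returns (amplitude, phase)."""
    rng = random.Random(seed)
    field = complex(coherent, 0.0)  # deterministic coherent component
    for _ in range(n_scatterers):
        amp = rng.random() / math.sqrt(n_scatterers)  # weak scatterers
        phase = rng.uniform(0.0, 2.0 * math.pi)       # uniform random phase
        field += amp * cmath.exp(1j * phase)
    return abs(field), cmath.phase(field)

# An ensemble of fiber realizations gives the amplitude statistics
amplitudes = [backscatter_amplitude(seed=s)[0] for s in range(200)]
mean_amp = sum(amplitudes) / len(amplitudes)
```

Histogramming such an ensemble reproduces the heavy-tailed amplitude statistics characteristic of coherent OTDR traces on an undisturbed fiber.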

Keywords: distributed acoustic sensing, optical fiber devices, optical time domain reflectometry, Rayleigh scattering

Procedia PDF Downloads 60
19576 Efficient Principal Components Estimation of Large Factor Models

Authors: Rachida Ouysse

Abstract:

This paper proposes a constrained principal components (CnPC) estimator for efficient estimation of large-dimensional factor models when errors are cross-sectionally correlated and the number of cross-sections (N) may be larger than the number of observations (T). Although the principal components (PC) method is consistent for any path of the panel dimensions, it is inefficient because the errors are treated as homoskedastic and uncorrelated. The new CnPC exploits the assumption of bounded cross-sectional dependence, which defines Chamberlain and Rothschild’s (1983) approximate factor structure, as an explicit constraint, and solves a constrained PC problem. The CnPC method is computationally equivalent to the PC method applied to a regularized form of the data covariance matrix. Unlike maximum-likelihood-type methods, the CnPC method does not require inverting a large covariance matrix and is thus valid for panels with N ≥ T. The paper derives a convergence rate and an asymptotic normality result for the CnPC estimators of the common factors. We provide feasible estimators and show in a simulation study that they are more accurate than the PC estimator, especially for panels with N larger than T, and than the generalized PC-type estimators, especially for panels with N almost as large as T.
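As a minimal sketch of the PC step that CnPC builds on: factor loadings are the leading eigenvectors of the sample covariance matrix (for CnPC, of a regularized form of it), which power iteration recovers. The 2×2 covariance below is a toy example with one dominant factor, not data from the paper.

```python
def leading_eigenvector(S, iters=200):
    """Power iteration for the leading eigenpair of a symmetric matrix
    S (here, a sample covariance). In PC factor estimation, the factor
    loadings are the leading eigenvectors of the (possibly regularized)
    data covariance matrix."""
    n = len(S)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(S[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    eigval = sum(v[i] * sum(S[i][j] * v[j] for j in range(n)) for i in range(n))
    return eigval, v

# Toy covariance with a single dominant factor along (1, 1)/sqrt(2)
S = [[2.0, 1.0],
     [1.0, 2.0]]
lam, v = leading_eigenvector(S)
```

No matrix inversion is needed at any point, which mirrors why the CnPC approach remains valid for panels with N ≥ T.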

Keywords: high dimensionality, unknown factors, principal components, cross-sectional correlation, shrinkage regression, regularization, pseudo-out-of-sample forecasting

Procedia PDF Downloads 137
19575 A Study on Consumer Awareness, Safety Perceptions and Practices About Food Preservatives and Flavouring Agents Used in Packed / Canned Foods from South India

Authors: Harsha Kumar H. N., Anshu Kumar Jha, Khushboo Kamal Taneja, Krishan Kabra, Mohamed Hafeez Sadiq

Abstract:

Introduction: The increasing use of preservatives and flavouring agents has the potential to cause health problems among people. There are no published studies from India exploring the awareness, safety perceptions, and practices regarding Food Preservatives (FPs) and Flavouring Agents (FAs). This study was therefore conducted with the objectives of assessing the awareness, safety perceptions and practices regarding FPs and FAs in commonly purchased packed food items. Materials & method: This cross-sectional study was conducted in January 2012. A sample size of 126 was computed using the formula for an infinite population. People who bought packed food items in malls were approached and requested to fill in a pre-tested semi-structured questionnaire. The questionnaire explored awareness, safety perceptions and practices regarding FPs and FAs. Data were then analyzed using SPSS version 10.0. The chi-square test was used to assess whether observed differences were statistically significant; a ‘p’ value < 0.05 was considered significant. Results: In total, 123 people (males 48.8% and females 51.2%) participated (response rate of 97.6%). The majority of the people were aware of the presence of FPs (91.7%) and FAs (84.9%), though their knowledge was inadequate. The breakup of the study subjects by level of awareness of FPs was as follows (%): Good (37.4), Satisfactory (40.6), Poor (22); and of FAs (%): Good (49.6), Satisfactory (36), Poor (14). The distribution by type of practice for FPs was as follows (%): Favourable (14), Unfavourable (86); and for FAs (%): Favourable (30.5), Unfavourable (69.5). There was a gap between knowledge and practices. Conclusion: Though there was awareness, knowledge was inadequate, and unfavourable practices were observed. The gaps in knowledge and the unhealthy practices need to be addressed by a public awareness campaign.

Keywords: food preservatives, flavouring agents, knowledge and practices, general population

Procedia PDF Downloads 496
19574 The Effects of Adding Vibrotactile Feedback to Upper Limb Performance during Dual-Tasking and Response to Misleading Visual Feedback

Authors: Sigal Portnoy, Jason Friedman, Eitan Raveh

Abstract:

Introduction: Sensory substitution is possible because of the capacity of our brain to adapt to information transmitted by a synthetic receptor via an alternative sensory system. Practical sensory substitution systems are being developed to increase the functionality of individuals with sensory loss, e.g. amputees. For upper-limb prosthesis users, the loss of tactile feedback compels them to allocate visual attention to their prosthesis. The effect of adding vibrotactile feedback (VTF) of the applied force has been studied; however, its effect on the allocation of visual attention during dual-tasking, and the response to misleading visual feedback, have not. We hypothesized that in healthy individuals using a robotic hand, VTF would improve performance and reduce visual attention during dual-task assignments, and would improve performance in a standardized functional test despite the presence of misleading visual feedback. Methods: For the dual-task paradigm, twenty healthy subjects were instructed to toggle two keyboard arrow keys with the left hand to keep a moving virtual car on a road on a screen. During the game, instructions for various activities, e.g. mixing the sugar in the glass with a spoon, appeared on the screen. The subject performed these tasks with a robotic hand attached to the right hand. The robotic hand was controlled by the activity of the flexors and extensors of the right wrist, recorded using surface EMG electrodes. Pressure sensors attached to the tips of the robotic hand induced VTF through vibrotactile actuators attached to the subject's right arm. An eye-tracking system tracked the subject's visual attention during the trials. The trials were repeated twice, with and without VTF. Additionally, the subjects performed the modified Box and Blocks test, hidden from eyesight, in a motion laboratory. A misleading visual feedback was presented virtually on a screen, such that twice during the trial the virtual block fell while the physical block was still held by the subject. Results: This is an ongoing study, whose current results are detailed below; we are continuing these trials with transradial myoelectric prosthesis users. In the healthy group, VTF did not reduce visual attention or improve performance during dual-tasking for tasks of the transfer-to-target type, e.g. placing the eraser on the shelf. An improvement was observed for other tasks. For example, the average ± standard deviation of the time to complete the sugar-mixing task was 13.7±17.2 s and 19.3±9.1 s with and without VTF, respectively. Also, the number of gaze shifts from the screen to the hand during this task was 15.5±23.7 and 20.0±11.6 with and without VTF, respectively. The subjects' response to the misleading visual feedback did not differ between the two conditions, i.e. with and without VTF. Conclusions: Our interim results suggest that the performance of certain activities of daily living may be improved by VTF. The substitution of visual sensory input by tactile feedback might require a long training period so that brain plasticity can occur and allow adaptation to the new condition.

Keywords: prosthetics, rehabilitation, sensory substitution, upper limb amputation

Procedia PDF Downloads 326
19573 Plot Scale Estimation of Crop Biophysical Parameters from High Resolution Satellite Imagery

Authors: Shreedevi Moharana, Subashisa Dutta

Abstract:

The present study focuses on the estimation of crop biophysical parameters, such as crop chlorophyll, nitrogen and water stress, at plot scale in crop fields. To achieve this, we used high-resolution LISS IV satellite imagery. A new methodology is proposed in this work: the spectral shape function of the paddy crop is employed to identify the wavelengths most sensitive to the paddy crop parameters. From the shape functions, regression index models were established relating the critical wavelength to the minimum and maximum wavelengths of the multi-spectral high-resolution LISS IV data, and these functional relationships were used to develop the index models. From these index models, the crop biophysical parameters were estimated and mapped from LISS IV imagery at plot scale within crop fields. The results showed that the nitrogen content of the paddy crop varied from 2-8%, chlorophyll from 1.5-9% and water content from 40-90%. It was observed that the variability in the rice agriculture system in India was purely a function of field topography.
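An index model of the general kind described can be sketched as a two-band normalized difference fed into a linear regression. The band choices, reflectance values and regression coefficients below are hypothetical illustrations, not the paper's fitted shape-function models.

```python
def normalized_difference_index(r_a, r_b):
    """Generic two-band normalized difference index,
    (r_a - r_b) / (r_a + r_b), of the kind used to relate band
    reflectances to crop parameters via regression."""
    return (r_a - r_b) / (r_a + r_b)

def crop_parameter(index, slope, intercept):
    """Hypothetical linear index model: parameter = slope*index + intercept."""
    return slope * index + intercept

# Hypothetical reflectances (NIR and red bands) for one paddy plot
nir, red = 0.45, 0.09
ndi = normalized_difference_index(nir, red)
nitrogen_pct = crop_parameter(ndi, slope=6.0, intercept=1.0)
```

Applied pixel by pixel, the same two steps turn an index image into the plot-scale parameter maps the abstract reports.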

Keywords: crop parameters, index model, LISS IV imagery, plot scale, shape function

Procedia PDF Downloads 153
19572 Ethanol Chlorobenzene Dosimeter Usage for Measuring Dose of the Intraoperative Linear Electron Accelerator System

Authors: Mojtaba Barzegar, Alireza Shirazi, Saied Rabi Mahdavi

Abstract:

Intraoperative radiation therapy (IORT) is an innovative treatment modality in which a large single dose of radiation is delivered to the tumor bed during surgery. The success of radiotherapy depends on the absorbed dose delivered to the tumor. Better accuracy in patient treatment depends upon the dose measured by a standard dosimeter such as an ionization chamber, but because of the high density of electric charge per pulse produced by the accelerator in the ionization chamber volume, the standard correction factor for ion recombination, Ksat, calculated with the classic two-voltage method is overestimated; the use of dose-per-pulse-independent dosimeters, such as the chemical Fricke and ethanol chlorobenzene (ECB) dosimeters, has therefore been suggested. Dose is usually calculated and calibrated at Zmax. Ksat was calculated by comparing the ion chamber response with the ECB dosimeter at each applicator angle, size and dose. The relative output factors (OFs) for the IORT applicators were calculated and compared with experimentally determined values and with results simulated by Monte Carlo software. The absorbed doses were calculated and measured with statistical uncertainties of less than 0.7% and 2.5%, respectively. The relative differences between calculated and measured OFs were up to 2.5%; for the major OFs the agreement was better. Under these conditions, together with the relative absorbed dose calculations, the OFs can be considered an indication that the IORT electron beams have been well simulated. These investigations demonstrate that a full Monte Carlo simulation of the accelerator head, combined with the ECB dosimeter, allows detailed information on clinical IORT beams to be obtained.
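The two-voltage method mentioned above fits a quadratic in the ratio of the chamber readings at two polarizing voltages. This is a sketch of the arithmetic only: the coefficients depend on the voltage ratio V1/V2 and are taken from dosimetry protocol tables, and the values used below are placeholders, not protocol values. In IORT beams the very high dose per pulse pushes this formula outside its validity range, which is why the abstract turns to dose-per-pulse-independent ECB dosimetry.

```python
def ksat_two_voltage(m1, m2, a0, a1, a2):
    """Classic two-voltage recombination correction for pulsed beams:
    ksat = a0 + a1*(M1/M2) + a2*(M1/M2)^2, where M1 and M2 are chamber
    readings at the normal and reduced polarizing voltages. The
    coefficients a0, a1, a2 must come from a dosimetry protocol table
    for the chosen voltage ratio; placeholders are used here."""
    ratio = m1 / m2
    return a0 + a1 * ratio + a2 * ratio ** 2

# Placeholder readings and coefficients (illustrative only)
ks = ksat_two_voltage(m1=10.00, m2=9.80, a0=1.0, a1=-0.5, a2=0.5)
```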

Keywords: intra operative radiotherapy, ethanol chlorobenzene, ksat, output factor, monte carlo simulation

Procedia PDF Downloads 462
19571 Recirculated Sedimentation Method to Control Contamination for Algal Biomass Production

Authors: Ismail S. Bostanci, Ebru Akkaya

Abstract:

Microalgae-derived production of biodiesel, fertilizer or industrial chemicals from wastewater has great potential. Water from a municipal wastewater treatment plant is, in particular, a very important nutrient source for biofuel production. Open pond systems are the lower-cost culture systems for microalgae biomass production, but many hurdles remain for commercial algal biomass production at large scale. One of the important technical bottlenecks for microalgae production in open systems is culture contamination. Algal culture contaminants can generally be described as invading organisms that can cause a pond crash; these invaders can be competitors, parasites and predators. Contamination is unavoidable in open systems, and potential contaminant organisms are inoculated from the outset if wastewater is utilized for algal biomass cultivation. It is therefore important to keep contaminants at an acceptable level in order to reach the true potential of algal biofuel production. Several contamination management methods exist in the algae industry, ranging from mechanical, chemical and biological interventions to changes in growth conditions; however, none of them is accepted as a fully suitable contamination control method. This work describes an innovative contamination control method, the 'Recirculated Sedimentation Method', which manages contamination to avoid pond crash. The method can be used for the production of algal biofuel and fertilizer and for algal wastewater treatment. To evaluate the performance of the method on an algal culture, an experiment was conducted for 90 days in a lab-scale raceway reactor (60 L) using non-sterilized, non-filtered wastewater (secondary effluent and centrate of anaerobic digestion).
Application of the method achieved the following: removal of contaminants (predators and diatoms) and other debris from the reactor without discharging the culture (with microscopic evidence); an increase in the raceway tank's suspended solids holding capacity (770 mg L-1); an increase in the ammonium removal rate (29.83 mg L-1 d-1); a decrease in algal and microbial biofilm formation on the inner walls of the reactor; and washout of the nitrifiers generated in the reactor, preventing ammonium consumption.
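A volumetric ammonium removal rate of the kind reported (mg L-1 d-1) follows from a simple mass balance over the reactor. The influent and effluent concentrations and retention time below are hypothetical, chosen only to give a number of magnitude comparable to the reported rate, and are not the study's operating data.

```python
def volumetric_removal_rate(c_in, c_out, hrt_days):
    """Volumetric ammonium removal rate (mg L^-1 d^-1) from influent
    and effluent concentrations (mg L^-1) and hydraulic retention
    time (days), assuming steady-state flow-through operation."""
    return (c_in - c_out) / hrt_days

# Hypothetical operating point for a lab-scale raceway
rate = volumetric_removal_rate(c_in=125.0, c_out=5.7, hrt_days=4.0)
```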

Keywords: contamination control, microalgae culture contamination, pond crash, predator control

Procedia PDF Downloads 188
19570 Revealing the Risks of Obstructive Sleep Apnea

Authors: Oyuntsetseg Sandag, Lkhagvadorj Khosbayar, Naidansuren Tsendeekhuu, Densenbal Dansran, Bandi Solongo

Abstract:

Introduction: Obstructive sleep apnea (OSA) is a common disorder affecting at least 2% to 4% of the adult population. It is estimated that nearly 80% of men and 93% of women with moderate to severe sleep apnea are undiagnosed. A number of screening questionnaires and clinical screening models have been developed to help identify patients with OSA and are needed in clinical practice. Purpose of study: To determine how the severity of obstructive sleep apnea risk depends on risk factors. Material and Methods: A cross-sectional study included 114 patients presenting to the Central State 3rd Hospital and the Central State 1st Hospital. Patients who had obstructive sleep apnea (OSA) were selected for this study. The standard STOP-Bang questionnaire was administered to all patients. According to their responses to the STOP-Bang questionnaire, the patients were divided into low-risk, intermediate-risk and high-risk groups. Descriptive statistics are presented as mean ± standard deviation (SD). Regression analysis was used to compare the groups on the likelihood ratios for positive and negative test results. Statistical analyses were performed using SPSS 16. Results: 114 patients were included (mean age 48 ± 16 years, 57 male) and were divided into low risk 54 (47.4%), intermediate risk 33 (28.9%) and high risk 27 (23.7%). Across the risk groups, mean age (38 ± 13 vs. 54 ± 14 vs. 59 ± 10, p<0.05), blood pressure (115 ± 18 vs. 133 ± 19 vs. 142 ± 21, p<0.05), BMI (24 IQR 22; 26 vs. 24 IQR 22; 29 vs. 28 IQR 25; 34, p<0.001) and neck circumference (35 ± 3.4 vs. 38 ± 4.7 vs. 41 ± 4.4, p<0.05) increased significantly. Multiple logistic regression showed that age is a significant independent factor for OSA (odds ratio 1.07, 95% CI 1.02-1.23, p<0.01). The predictive value of age was the highest among the factors for OSA (AUC = 0.833, 95% CI 0.758-0.909, p<0.001). Our study shows that the risk of OSA begins at 47 years of age (sensitivity 78.3%, specificity 74.1%).
Conclusions: Most patients' responses placed them in the intermediate-risk or high-risk groups. Age, blood pressure, neck circumference and BMI increased as the OSA risk category increased. Age in particular is an independent factor with the highest significance for OSA: each one-year increase in a patient's age increases the likelihood of OSA approximately 1.1-fold.
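An AUC like the one reported for age can be computed with the rank-based (Mann-Whitney) formulation of the area under the ROC curve: the probability that a randomly chosen high-risk patient is older than a randomly chosen low-risk patient. The ages below are invented to illustrate the calculation, not study data.

```python
def auc_from_scores(positives, negatives):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the probability that a positive case scores higher than a
    negative case, counting ties as half a win."""
    wins = 0.0
    for p in positives:
        for n in negatives:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positives) * len(negatives))

# Hypothetical ages: patients classified high risk vs. low risk
ages_high_risk = [52, 59, 61, 64, 70]
ages_low_risk = [31, 38, 41, 47, 58]
auc = auc_from_scores(ages_high_risk, ages_low_risk)
```

The cut-off reported in the abstract (47 years) is then the age threshold on this curve that balances sensitivity against specificity.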

Keywords: obstructive sleep apnea, Stop-Bang, BMI (Body Mass Index), blood pressure

Procedia PDF Downloads 294
19569 Optimization of Process Parameters for Copper Extraction from Wastewater Treatment Sludge by Sulfuric Acid

Authors: Usarat Thawornchaisit, Kamalasiri Juthaisong, Kasama Parsongjeen, Phonsiri Phoengchan

Abstract:

In this study, sludge samples collected from the wastewater treatment plant of a printed circuit board manufacturing industry in Thailand were subjected to acid extraction using sulfuric acid as the chemical extracting agent. The effects of sulfuric acid concentration (A), the ratio of volume of acid to quantity of sludge (B) and extraction time (C) on the efficiency of copper extraction were investigated with the aim of finding the optimal conditions for maximum removal of copper from the wastewater treatment sludge. A factorial experimental design was employed to model the copper extraction process. The results were analyzed statistically using analysis of variance to identify the process variables that significantly affected the copper extraction efficiency. Results showed that all linear terms, together with the interaction between the acid-volume to sludge-quantity ratio and extraction time (BC), had a statistically significant influence on the efficiency of copper extraction under the tested conditions; the most significant effect was ascribed to the acid-volume to sludge-quantity ratio (B), followed by sulfuric acid concentration (A), extraction time (C) and the interaction term BC, respectively. The remaining two-way interaction terms (AB, AC) and the three-way interaction term (ABC) were not statistically significant at the 0.05 significance level. A model equation was derived for the copper extraction process, and the process was optimized using a multiple-response method, the desirability (D) function, to select the extraction parameters targeting maximum removal. The optimum extraction conditions, giving 99% copper removal, were found to be: sulfuric acid concentration 0.9 M, a ratio of volume of acid (mL) to quantity of sludge (g) of 100:1, and an extraction time of 80 min. Experiments under the optimized conditions were carried out to validate the accuracy of the model.
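The effect estimates that feed such an ANOVA can be sketched for a 2³ factorial design as contrasts over the eight standard-order runs. The copper removal percentages below are invented, chosen only so that the resulting effect ordering (B > A > C > BC) mirrors the ranking reported in the abstract.

```python
# Effect estimates for a 2^3 factorial design with factors
# A (acid concentration), B (acid/sludge ratio), C (extraction time).
# Hypothetical copper removal (%) for the eight runs in standard
# Yates order: (1), a, b, ab, c, ac, bc, abc.
y = [52.0, 60.0, 75.0, 84.0, 55.0, 64.0, 88.0, 97.0]

# -1/+1 level of each factor for the eight standard-order runs
signs = [(a, b, c) for c in (-1, 1) for b in (-1, 1) for a in (-1, 1)]

def effect(contrast):
    """Average effect: (sum of +1 responses - sum of -1 responses) / 4."""
    return sum(s * yi for s, yi in zip(contrast, y)) / 4.0

A = effect([a for a, b, c in signs])
B = effect([b for a, b, c in signs])
C = effect([c for a, b, c in signs])
BC = effect([b * c for a, b, c in signs])  # two-way interaction
```

Squaring each contrast and dividing by the number of runs gives the corresponding sum of squares, which is what the ANOVA table partitions.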

Keywords: acid treatment, chemical extraction, sludge, waste management

Procedia PDF Downloads 184
19568 Coffee Consumption and Glucose Metabolism: a Systematic Review of Clinical Trials

Authors: Caio E. G. Reis, Jose G. Dórea, Teresa H. M. da Costa

Abstract:

Objective: Epidemiological data show an inverse association between coffee consumption and the risk of type 2 diabetes mellitus. However, the clinical effects of coffee consumption on glucose metabolism biomarkers remain controversial. This paper therefore reviews clinical trials that evaluated the effects of coffee consumption on glucose metabolism. Research Design and Methods: We identified studies published up to December 2014 by searching electronic databases and reference lists. We included randomized clinical trials in which the intervention group received caffeinated and/or decaffeinated coffee, the control group received water or placebo treatments, and biomarkers of glucose metabolism were measured. The Jadad score was applied to evaluate the quality of the studies, and studies scoring ≥ 3 points were considered for the analyses. Results: Seven clinical trials (237 subjects in total) were analyzed, involving healthy, overweight and diabetic adults. The studies were divided into short-term (1 to 3 h) and long-term (2 to 16 weeks) durations. The results for the short-term studies showed that caffeinated coffee consumption may increase the area under the curve of the glucose response, while the long-term studies showed that caffeinated coffee may improve glycemic metabolism by reducing the glucose curve and increasing the insulin response. These results suggest that the benefits of coffee consumption occur in the long term, consistent with the reduction of type 2 diabetes mellitus risk shown in epidemiological studies. Nevertheless, until the relationship between long-term coffee consumption and type 2 diabetes mellitus is better understood, and any mechanisms involved are identified, it is premature to make claims about coffee preventing type 2 diabetes mellitus. Conclusion: The findings suggest that caffeinated coffee may impair glucose metabolism in the short term, but in the long term the studies indicate a reduction of type 2 diabetes mellitus risk. More clinical trials with comparable methodology are needed to unravel this paradox.
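The short-term outcome measure above, the area under the glucose response curve, is typically computed with the trapezoidal rule over the sampled time points. The sampling schedule and glucose values below are invented for illustration, not data from any of the reviewed trials.

```python
def glucose_auc(times_min, glucose_mmol_l):
    """Area under the glucose response curve by the trapezoidal rule.
    Units of the result: (mmol/L) * min."""
    auc = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        auc += 0.5 * (glucose_mmol_l[i] + glucose_mmol_l[i - 1]) * dt
    return auc

# Hypothetical 2-h glucose tolerance test sampled every 30 min
t = [0, 30, 60, 90, 120]
g = [5.0, 8.2, 7.4, 6.1, 5.5]
auc = glucose_auc(t, g)
```

Comparing this quantity between the coffee and control arms of a trial is what the short-term studies report; a higher AUC after caffeinated coffee is the impairment the review describes.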

Keywords: coffee, diabetes mellitus type 2, glucose, insulin

Procedia PDF Downloads 447