Search results for: computational experiment
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4531

4291 A Local Invariant Generalized Hough Transform Method for Integrated Circuit Visual Positioning

Authors: Wei Feilong

Abstract:

In this study, a local invariant generalized Hough transform (LI-GHT) method is proposed for integrated circuit (IC) visual positioning. The original generalized Hough transform (GHT) is robust to external noise; however, it is not suitable for visual positioning of IC chips because its four-dimensional (4D) parameter space leads to substantial storage requirements and high computational complexity. The proposed LI-GHT method reduces the dimensionality of the parameter space to 2D thanks to the rotational invariance of a local invariant geometric feature, and it can estimate the position and rotation angle of IC chips accurately in real time under the influence of noise and blur. The experimental results show that the proposed LI-GHT estimates the position and rotation angle of IC chips with high accuracy and at high speed. The proposed LI-GHT algorithm was implemented in the IC visual positioning system of radio frequency identification (RFID) packaging equipment.
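As context for the abstract's storage and complexity argument, the voting step at the heart of any GHT variant can be sketched in a minimal translation-only (2D) form. This is a generic illustration with synthetic points, not the authors' LI-GHT:

```python
import numpy as np

def build_r_table(template_pts):
    """R-table of a classic GHT: displacement of each boundary point
    from the template's reference point (here, the centroid)."""
    ref = template_pts.mean(axis=0)
    return ref - template_pts  # one displacement vector per point

def ght_vote(scene_pts, r_table, shape):
    """Vote in the 2D translation space: each scene point casts a vote
    for every reference position its displacement vectors allow."""
    acc = np.zeros(shape, dtype=int)
    for p in scene_pts:
        for d in r_table:
            x, y = np.round(p + d).astype(int)
            if 0 <= x < shape[0] and 0 <= y < shape[1]:
                acc[x, y] += 1
    return acc

# A tiny square template, shifted by (20, 30) in the "scene".
tpl = np.array([[0, 0], [0, 4], [4, 0], [4, 4]], dtype=float)
scene = tpl + np.array([20.0, 30.0])
acc = ght_vote(scene, build_r_table(tpl), (64, 64))
peak = np.unravel_index(acc.argmax(), acc.shape)
print(peak)  # accumulator peak at the shifted reference point (22, 32)
```

Adding rotation (and scale) turns this 2D accumulator into the 4D one the abstract criticizes, which is exactly the growth the rotation-invariant feature is meant to avoid.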

Keywords: integrated circuit visual positioning, generalized Hough transform, local invariant generalized Hough transform, IC packing equipment

Procedia PDF Downloads 243
4290 A Matheuristic Algorithm for the School Bus Routing Problem

Authors: Cagri Memis, Muzaffer Kapanoglu

Abstract:

The school bus routing problem (SBRP) is a variant of the Vehicle Routing Problem (VRP) classified as a location-allocation-routing problem. In this study, the SBRP is decomposed into two sub-problems: (1) bus route generation and (2) bus stop selection, so that large instances of the SBRP can be solved in reasonable computational times. To solve the first sub-problem, we propose a genetic algorithm that generates bus routes. Once the routes have been fixed, a sub-problem remains of allocating students to stops, considering the capacity of the buses and the walkability constraints of the students. While an exact method solves small-scale problems, treating large-scale problems exactly becomes computationally intractable, a deficiency that the genetic algorithm can overcome. Results obtained from the proposed approach on 150 instances with up to 250 stops show that the matheuristic algorithm provides better solutions in reasonable computational times than benchmark algorithms.
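The second sub-problem, assigning students to stops once routes are fixed, can be sketched with a simple greedy rule. The data and the heuristic below are invented for illustration; the paper's exact and GA machinery is not reproduced:

```python
def allocate_students(walkable, capacity):
    """Greedy sketch of sub-problem (2): assign every student to one
    stop within walking distance without exceeding the capacity of
    the bus route serving that stop. Students with the fewest
    feasible stops are placed first."""
    remaining = dict(capacity)                      # route -> free seats
    assignment = {}
    for student, stops in sorted(walkable.items(), key=lambda kv: len(kv[1])):
        feasible = [(stop, route) for stop, route in stops if remaining[route] > 0]
        if not feasible:
            return None                             # infeasible for this routing
        stop, route = max(feasible, key=lambda sr: remaining[sr[1]])
        assignment[student] = stop
        remaining[route] -= 1
    return assignment

# Toy instance: each walkable entry is a (stop, route) pair.
walkable = {
    "ann": [("s1", "r1"), ("s2", "r2")],
    "bob": [("s1", "r1")],
    "eve": [("s2", "r2")],
}
capacity = {"r1": 1, "r2": 2}
alloc = allocate_students(walkable, capacity)
print(alloc)  # bob takes the only seat on r1; ann and eve share s2
```

A `None` return signals that the fixed routing cannot serve all students, which is the feedback a matheuristic would use to drive the route-generating GA toward feasible routings.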

Keywords: genetic algorithm, matheuristic, school bus routing problem, vehicle routing problem

Procedia PDF Downloads 40
4289 Pesticide Risk: A Study on the Effectiveness of Organic/Biopesticides in Sustainable Agriculture

Authors: Berk Kılıç, Ömer Aydın, Kerem Mestani, Defne Uzun

Abstract:

In agriculture and farming, pesticides are frequently used to kill or fend off pests (bugs, bacteria, fungi, etc.). However, traditional pesticides have proven to have harmful effects on both the environment and the human body, such as hazards to the endocrine, neurodevelopmental, and reproductive systems. This experiment aims to test the effectiveness of organic/bio-pesticides (environmentally friendly pesticides) compared to traditional pesticides. Black pepper and garlic were used as biopesticides in this experiment. The results support that organic farming applying organic pesticides operates through non-toxic mechanisms, offering minimal threats to human well-being and the environment. Consequently, consuming organic produce can significantly diminish the dangers associated with pesticide intake. In this study, a method is introduced to reduce pesticide-related risks by promoting organic farming techniques built around organic/bio-pesticide usage.

Keywords: pesticide, garlic, black pepper, bio-pesticide

Procedia PDF Downloads 37
4288 Project-Based Learning (PBL) Taken to Extremes: Full-Year/Full-Time PBL Replacement of Core Curriculum

Authors: Stephen Grant Atkins

Abstract:

Radical use of project-based learning (PBL) in a small New Zealand business school provides an opportunity to longitudinally examine its effects over a decade of pre-Covid data. Prior to this business school's implementation of PBL, starting in 2012, the business pedagogy literature presented just one example of PBL replacing an entire core set of courses. In that instance, a British business school merged four of its 'degree Year 3' accounting courses into one PBL semester. As radical as that would have seemed to students aged 20 to 22, the PBL experiment conducted in a New Zealand business school was notably more extreme: 41 nationally approved Learning Outcomes (L.O.s), deriving from 8 separate core courses, were aggregated into one grand set of L.O.s and then treated as a 'full-year'/'full-time' single course. The 8 courses in question were all components of this business school's compulsory 'degree Year 1' curriculum. Thus, the students involved were notably younger (ages 17 to 19), and no 'part-time' enrolments were allowed. Of interest are this PBL experiment's effects on subsequent performance outcomes in 'degree Years 2 & 3' (which continued to operate in their traditional ways). Of special interest is the quality of 'group project' outcomes, because traditionally 'degree Year 1' course assessments are only minimally based on group work. This PBL experiment altered that practice radically, such that PBL 'degree Year 1' alumni entered their remaining two years of business coursework with far more 'project group' experience. Timeline-wise, of interest here is 'degree Year 2' performance outcome data from the years 2010-2012 and 2016-2018, and likewise 'degree Year 3' data for the years 2011-2013 and 2017-2019. Those years provide a pre-and-post comparative baseline for performance outcomes of students never exposed to this school's radical PBL experiment. That baseline is then compared to PBL alumni outcomes (2013-2016, including 'Student Evaluation of Course Quality' outcomes) to clarify 'radical PBL' effects.

Keywords: project-based learning, longitudinal mixed-methods, student criticism, effects-on-learning

Procedia PDF Downloads 70
4287 Analysis of Fault Tolerance in Grid Computing Using a Real-Time Approach

Authors: Parampal Kaur, Deepak Aggarwal

Abstract:

In a computational grid, fault tolerance is an imperative issue to be considered during job scheduling. Due to the widespread use of resources, systems are highly prone to errors and failures. Hence, fault tolerance plays a key role in the grid to avoid the problem of unreliability. Scheduling a task to the appropriate resource is a vital requirement in a computational grid. The fittest-resource scheduling algorithm searches for the appropriate resource based on the job requirements, in contrast to general scheduling algorithms, where jobs are scheduled to the resources with the best performance factor. The proposed method improves the fault tolerance of the fittest-resource scheduling algorithm by scheduling the job in coordination with job replication when the resource has low reliability. Based on the reliability index of a resource, the resource is identified as critical, and tasks are scheduled according to the criticality of the resources. Results show that the execution time of the tasks is comparatively reduced with the proposed algorithm using a real-time approach rather than a simulator.
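The fittest-resource-plus-replication idea can be sketched in a few lines. The criticality rule, the threshold, and all data below are invented for illustration; the paper's actual reliability index and scheduling policy are not reproduced:

```python
def schedule_with_replication(job, resources, reliability, threshold=0.7):
    """Toy sketch: pick the fittest resource (smallest capacity that
    still meets the job's requirement); if its reliability index marks
    it as critical (below the threshold), replicate the job on the
    next-fittest resource as well."""
    fit = sorted((r for r in resources if r["cpu"] >= job["cpu"]),
                 key=lambda r: r["cpu"])            # closest fit first
    if not fit:
        return None
    placements = [fit[0]["name"]]
    if reliability[fit[0]["name"]] < threshold and len(fit) > 1:
        placements.append(fit[1]["name"])           # replica on a backup
    return placements

resources = [{"name": "a", "cpu": 4}, {"name": "b", "cpu": 8}, {"name": "c", "cpu": 16}]
reliability = {"a": 0.5, "b": 0.9, "c": 0.95}
placements = schedule_with_replication({"cpu": 4}, resources, reliability)
print(placements)  # resource "a" is critical, so a replica goes to "b"
```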

Keywords: computational grid, fault tolerance, task replication, job scheduling

Procedia PDF Downloads 409
4286 A Study of Behavioral Phenomena Using an Artificial Neural Network

Authors: Yudhajit Datta

Abstract:

Will is a phenomenon that has puzzled humanity for a long time. It is believed that the will power of an individual affects the success achieved by that individual in life: a person endowed with great will power is thought to overcome even the most crippling setbacks of life, while a person with a weak will cannot make the most of even the greatest assets. Behavioral aspects of the human experience such as will are rarely subjected to quantitative study owing to the numerous uncontrollable parameters involved. This work is an attempt to subject the phenomenon of will to the test of an artificial neural network. The claim being tested is that the will power of an individual largely determines the success achieved in life. In the study, an attempt is made to incorporate the behavioral phenomenon of will into a computational model using data pertaining to the success of individuals obtained from an experiment. A neural network is to be trained using data based on part of the model and subsequently used to make predictions regarding will corresponding to data points of success. If the prediction is in agreement with the model values, the model is to be retained as a candidate. Ultimately, the best-fit model from among the many candidates is to be selected and used for studying the correlation between success and will.

Keywords: will power, will, success, apathy factor, random factor, characteristic function, life story

Procedia PDF Downloads 353
4285 Consideration of Uncertainty in Engineering

Authors: A. Mohammadi, M. Moghimi, S. Mohammadi

Abstract:

Engineers need computational methods that provide solutions less sensitive to environmental effects, so techniques that take uncertainty into account should be used to control and minimize the risk associated with design and operation. In order to consider uncertainty in an engineering problem, the optimization problem should be solved over a suitable range of each uncertain input variable instead of just one estimated point. With deterministic optimization, a large computational burden is required to consider every possible and probable combination of uncertain input variables. Several methods have been reported in the literature to deal with problems under uncertainty. In this paper, different methods are presented and analyzed.
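The difference between optimizing at one estimated point and optimizing over a range of the uncertain input can be shown with a minimal Monte Carlo sketch. The cost function and the demand distribution are invented for illustration:

```python
import random

def cost(x, demand):
    """Illustrative asymmetric cost: shortage is penalized more
    heavily than excess (both the function and the numbers are
    invented for this example)."""
    return 2.0 * max(demand - x, 0.0) + 0.5 * max(x - demand, 0.0)

def expected_cost(x, n=10_000, seed=0):
    """Monte Carlo estimate of the cost over a range of the uncertain
    input (demand ~ Uniform(80, 120)) instead of one estimated point."""
    rng = random.Random(seed)
    return sum(cost(x, rng.uniform(80.0, 120.0)) for _ in range(n)) / n

# A design sized for the point estimate (100) versus a hedged design:
print(expected_cost(100.0), expected_cost(110.0))
```

With the asymmetric penalty, the hedged design at 110 has a lower expected cost than the design sized exactly for the point estimate, which is the effect robust methods exploit.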

Keywords: uncertainty, Monte Carlo simulation, stochastic programming, scenario method

Procedia PDF Downloads 379
4284 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all information necessary for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
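The core mechanism, intercepting system signals to emit a diagnostic report instead of dying silently, can be sketched in Python. This is an illustration of the idea only, not the iFDAQ implementation (which targets C++ processes); the report path and chosen signals are arbitrary:

```python
import datetime
import signal
import traceback

def install_debug_handler(report_path="daq_crash_report.txt"):
    """When a fatal signal arrives, write a timestamped report with a
    stack trace (the 'deeper investigation' artifact), then restore
    the default action and re-deliver the signal."""
    def handler(signum, frame):
        with open(report_path, "w") as f:
            f.write(f"caught signal {signum} at {datetime.datetime.now()}\n")
            traceback.print_stack(frame, file=f)
        signal.signal(signum, signal.SIG_DFL)   # restore default action
        signal.raise_signal(signum)             # and re-deliver the signal
    for sig in (signal.SIGTERM, signal.SIGINT):
        signal.signal(sig, handler)
    return handler

handler = install_debug_handler()
print("debug handler installed")
```

Because the handler only runs when a signal is delivered, it adds no overhead on the normal path, matching the "no impact on the process performance" property described above.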

Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework

Procedia PDF Downloads 259
4283 Frequency Recognition Models for Steady State Visual Evoked Potential Based Brain Computer Interfaces (BCIs)

Authors: Zeki Oralhan, Mahmut Tokmakçı

Abstract:

Steady state visual evoked potential (SSVEP) based brain computer interface (BCI) systems are preferred because of their high information transfer rate (ITR) and practical use. ITR is a parameter of overall BCI performance, and a high ITR requires, among other things, high accuracy. In this study, we investigated recognizing SSVEPs in a shorter time and with a lower error rate. In the experiment, there were 8 flickers on a liquid crystal display (LCD). Participants gazed for 10 seconds at the flicker that had a 12 Hz frequency and a 50% duty cycle ratio. During the experiment, EEG signals were acquired via an EEG device and filtered in a preprocessing session. After that, canonical correlation analysis (CCA), multiset CCA (MsetCCA), phase constrained CCA (PCCA), and multiway CCA (MwayCCA) methods were applied to the data. The highest average accuracy was reached when MsetCCA was applied.
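The basic CCA scoring step common to all four variants, comparing the EEG against sinusoidal references at each candidate frequency and picking the frequency with the highest canonical correlation, can be sketched with NumPy. The signals here are synthetic; the sampling rate, noise level, and channel count are arbitrary choices for the example:

```python
import numpy as np

def canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X
    and Y: the top singular value of Qx.T @ Qy from their QR bases."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_reference(freq, fs, n, harmonics=2):
    """Sine/cosine reference matrix for one candidate frequency."""
    t = np.arange(n) / fs
    return np.column_stack([f(2 * np.pi * h * freq * t)
                            for h in range(1, harmonics + 1)
                            for f in (np.sin, np.cos)])

# Synthetic 2-channel "EEG" with a 12 Hz response, fs = 250 Hz, 4 s.
rng = np.random.default_rng(0)
fs, n = 250, 1000
t = np.arange(n) / fs
eeg = np.column_stack([np.sin(2 * np.pi * 12 * t + 0.3),
                       np.cos(2 * np.pi * 12 * t)])
eeg = eeg + 0.5 * rng.standard_normal((n, 2))
scores = {f: canonical_corr(eeg, ssvep_reference(f, fs, n))
          for f in (10, 12, 15)}
print(max(scores, key=scores.get))  # the gazed 12 Hz target scores highest
```

MsetCCA differs from this baseline by learning the reference from multiple training trials rather than using fixed sinusoids, which is what the reported accuracy gain comes from.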

Keywords: brain computer interface, canonical correlation analysis, human computer interaction, SSVEP

Procedia PDF Downloads 242
4282 Fast and Efficient Algorithms for Evaluating Uniform and Nonuniform Lagrange and Newton Curves

Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong

Abstract:

Newton-Lagrange interpolations are widely used in numerical analysis. However, their construction requires quadratic computational time. In computer aided geometric design (CAGD), there are polynomial curves, namely the Wang-Ball, DP, and Dejdumrong curves, which have linear time complexity algorithms. Thus, the computational time for Newton-Lagrange interpolations can be reduced by applying the algorithms of the Wang-Ball, DP, and Dejdumrong curves. In order to use these algorithms, it is first necessary to convert Newton-Lagrange polynomials into Wang-Ball, DP, or Dejdumrong polynomials. In this work, the algorithms for converting both uniform and non-uniform Newton-Lagrange polynomials into Wang-Ball, DP, and Dejdumrong polynomials are investigated. Thus, the computational time for representing Newton-Lagrange polynomials can be reduced to linear complexity. In addition, other CAGD-based manipulations of Newton-Lagrange curves become available.
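The quadratic-construction versus linear-evaluation split the abstract relies on is visible already in the plain Newton form (the Wang-Ball/DP/Dejdumrong conversions themselves are not reproduced here):

```python
def divided_differences(xs, ys):
    """Newton interpolation coefficients via divided differences;
    this construction is the O(n^2) step the abstract refers to."""
    coef = list(ys)
    for j in range(1, len(xs)):
        for i in range(len(xs) - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

def newton_eval(coef, xs, x):
    """Horner-style nested evaluation of the Newton form:
    O(n) per evaluation point."""
    acc = coef[-1]
    for c, xk in zip(reversed(coef[:-1]), reversed(xs[:-1])):
        acc = acc * (x - xk) + c
    return acc

xs = [0.0, 1.0, 2.0, 3.0]
ys = [x**3 - 2 * x + 1 for x in xs]   # sample a cubic
coef = divided_differences(xs, ys)
print(newton_eval(coef, xs, 1.5))      # 1.375 = 1.5**3 - 2*1.5 + 1
```

Converting to a CAGD basis with a linear-complexity algorithm aims to keep this O(n) per-point cost while also avoiding the quadratic setup.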

Keywords: Lagrange interpolation, linear complexity, monomial matrix, Newton interpolation

Procedia PDF Downloads 198
4281 Experimental Study of Heat Transfer and Pressure Drop in a Water-Cooled Serpentine Channel Heat Sink

Authors: Hao Xiaohong, Wu Zongxiang, Chen Xuefeng

Abstract:

With the high power density and high integration of electronic devices, their heat flux has been increasing rapidly. Therefore, an effective cooling technology is essential for the reliability and efficient operation of electronic devices. Liquid cooling is studied increasingly widely for its higher heat transfer efficiency, and serpentine channels are superior in the augmentation of single-phase convective heat transfer because of their better channel velocity distribution. In this paper, eight water-cooled serpentine channel heat sinks with different frame sizes are designed to study their heat transfer and pressure drop characteristics. With water as the working fluid, an experimental setup is established, and the results show the effect of channel width, fin thickness, and number of channels on thermal resistance and pressure drop.
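Thermal resistance, the figure of merit the geometries are compared on, has a simple standard definition; the numbers below are invented for illustration and are not from the paper:

```python
def thermal_resistance(t_base_max, t_inlet, heat_load):
    """R_th = (T_base,max - T_inlet) / Q: temperature rise of the
    heat-sink base over the coolant inlet, per watt of heat load."""
    return (t_base_max - t_inlet) / heat_load

# e.g. a 200 W chip whose sink base reaches 65 degC with 25 degC water:
print(thermal_resistance(65.0, 25.0, 200.0))  # 0.2 K/W
```

A lower R_th at a given pressure drop is what wider channels, thinner fins, or more channels are traded off against in such experiments.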

Keywords: heat transfer, experiment, serpentine heat sink, pressure drop

Procedia PDF Downloads 427
4280 Underneath Vehicle Inspection Using Fuzzy Logic, Subsumption, and the OpenCV Library

Authors: Hazim Abdulsada

Abstract:

The inspection of the underneath of vehicles has been given significant attention by governments as the threat of terrorism has become more prevalent. New technologies such as mobile robots and computer vision help to provide a more secure environment. This paper proposes that a mobile robot such as the Aria robot can be used to search for and inspect bombs under vehicles in a parking lot. The robot uses fuzzy logic and subsumption algorithms to control its movements underneath the vehicle. The OpenCV library and a Hokuyo laser scanner are added to the Aria robot to complete the experiment for under-vehicle inspection. The experiment was conducted in an indoor environment to demonstrate the efficiency of our methods in searching for objects and controlling the robot's movements under a vehicle. We obtained excellent results not only in controlling the robot's movement but also in inspecting objects with the robot's camera at the same time. This success allowed us to identify the requirements for constructing a new, cost-effective robot with more functionality.

Keywords: fuzzy logic, mobile robots, OpenCV, subsumption, under vehicle inspection

Procedia PDF Downloads 444
4279 Numerical Simulation and Experimental Study on Cable Damage Detection Using an MFL Technique

Authors: Jooyoung Park, Junkyeong Kim, Aoqi Zhang, Seunghee Park

Abstract:

Non-destructive testing of cables is in great demand due to safety accidents at sites where much equipment using cables is installed. In this paper, the quantitative change of the obtained signal was analyzed using a magnetic flux leakage (MFL) method. A two-dimensional simulation was conducted with an FEM model replicating real elevator cables. The simulation data were compared for three parameters (depth of defect, width of defect, and inspection velocity). Then, an experiment under the same conditions was carried out to verify the results of the simulation. Signals obtained from both the simulation and the experiment were transformed to characterize the properties of the damage. Based on the results, cable damage detection using an MFL method was confirmed to be feasible. In a further study, it is expected that the MFL signals of an entire specimen will be obtained and visualized as well.

Keywords: magnetic flux leakage (MFL), cable damage detection, non-destructive testing, numerical simulation

Procedia PDF Downloads 352
4278 Sustainability in the Purchase of Airline Tickets: Analysis of Digital Communication from the Perspective of Neuroscience

Authors: Rodríguez Sánchez Carla, Sancho-Esper Franco, Guillen-Davo Marina

Abstract:

Tourism is one of the most important sectors worldwide, since it is an important economic engine for today's society. Because of this expansion, it is also one of the sectors that most negatively affect the environment in terms of CO₂ emissions. In light of this, airlines are developing Voluntary Carbon Offset (VCO) programs. There is important evidence focused on analyzing the features of these VCO programs and their efficacy in reducing CO₂ emissions, and findings are mixed, without a clear consensus. Different research approaches have centered on analyzing the factors and consequences of VCO programs, such as economic modelling based on panel data, survey research based on traveler responses, or experimental research analyzing customer decisions in a simulated context. This study belongs to the latter group because it tries to understand how different characteristics of an online ticket purchase website affect the willingness of a traveler to choose a more sustainable flight. The proposed behavioral model is based on several theories, such as nudge theory, the dual-processing elaboration likelihood model (ELM), and cognitive dissonance theory. This randomized experiment aims at overcoming previous studies based on self-reported measures that mainly study sustainable behavioral intention rather than actual decision-making. It also complements traditional self-reported independent variables by gathering objective information from an eye-tracking device. The experiment analyzes the influence of two characteristics of the online purchase website: i) the type of information regarding flight CO₂ emissions (quantitative vs. qualitative), and ii) the comparison framework for the sustainable purchase decision (negative: an alternative with more emissions than the average flight on the route, vs. positive: an alternative with fewer emissions than the average flight on the route); it is therefore a 2×2 experiment with four alternative scenarios. A pretest was run before the actual experiment to refine the experiment's features and to check the manipulations. Afterward, a different sample of students answered the pre-test questionnaire aimed at recruiting the cases and measuring several pre-stimulus variables. One week later, the students came to the neurolab at the university to take part in the experiment, made their online purchase decisions, and answered the post-test survey. A final sample of 21 students was gathered, and the institution's ethics committee approved the experiment. The results show that qualitative information generates more sustainable decisions (the less contaminant alternative) than quantitative information. Moreover, the evidence shows that subjects are more willing to make the sustainable decision in order to be more ecological (comparison of the average with the less contaminant alternative) rather than to be less contaminant (comparison of the average with the more contaminant alternative). There are also interesting differences in the information-processing variables from the eye tracker: both the total time to make the choice and the specific times by area of interest (AOI) differ depending on the assigned scenario. These results allow for a better understanding of the factors that condition a traveler's decision to take part in a VCO program and provide useful information for airline managers to promote these programs and reduce environmental impact.

Keywords: voluntary carbon offset, airline, online purchase, carbon emission, sustainability, randomized experiment

Procedia PDF Downloads 35
4277 Multitasking Incentives and Employee Performance: Evidence from Call Center Field Experiments and Laboratory Experiments

Authors: Sung Ham, Chanho Song, Jiabin Wu

Abstract:

Employees are commonly incentivized on both quantity and quality performance and much of the extant literature focuses on demonstrating that multitasking incentives lead to tradeoffs. Alternatively, we consider potential solutions to the tradeoff problem from both a theoretical and an experimental perspective. Across two field experiments from a call center, we find that tradeoffs can be mitigated when incentives are jointly enhanced across tasks, where previous research has suggested that incentives be reduced instead of enhanced. In addition, we also propose and test, in a laboratory setting, the implications of revising the metric used to assess quality. Our results indicate that metrics can be adjusted to align quality and quantity more efficiently. Thus, this alignment has the potential to thwart the classic tradeoff problem. Finally, we validate our findings with an economic experiment that verifies that effort is largely consistent with our theoretical predictions.

Keywords: incentives, multitasking, field experiment, experimental economics

Procedia PDF Downloads 135
4276 On the Study of the Electromagnetic Scattering by Large Obstacle Based on the Method of Auxiliary Sources

Authors: Hidouri Sami, Aguili Taoufik

Abstract:

We consider fast and accurate solutions of scattering problems by large perfectly electrically conducting (PEC) objects, formulated as an optimization of the method of auxiliary sources (MAS). We present various techniques used to reduce the total computational cost of the scattering problem. The first technique is based on replacing the object with an array of a finite number of small PEC objects of the same shape. The second reduces the problem by considering only half of the object. These two solutions are compared to results from the reference bibliography.

Keywords: method of auxiliary sources, scattering, large object, RCS, computational resources

Procedia PDF Downloads 212
4275 Numerical Simulation of Two-Component Particle Flow in a Fluidized Bed

Authors: Wang Heng, Zhong Zhaoping, Guo Feihong, Wang Jia, Wang Xiaoyi

Abstract:

The flow of gas and particles in fluidized beds is complex and chaotic, which makes it difficult to measure and analyze in experiments. Some bed materials with poor fluidization performance are always fluidized together with a fluidizing medium, and the material and the medium differ in many properties, such as density, size, and shape. These factors make the dynamic process more complex and the experimental research more limited. Numerical simulation is an efficient way to describe the process of gas-solid flow in a fluidized bed, and one of the most popular methods is CFD-DEM, i.e., computational fluid dynamics coupled with the discrete element method. The shapes of particles are usually simplified as spheres in most research. Although sphere-shaped particles make the particle calculations uncomplicated, the effects of different shapes are disregarded. In practical applications, however, two-component systems in fluidized beds contain both sphere-shaped and non-sphere-shaped particles, so the two-component flow of sphere particles and non-sphere particles needs to be studied. In this paper, the mixed flow of molded biomass particles and quartz in a fluidized bed was simulated. The integrated model was built on an Eulerian-Lagrangian approach, improved to suit the non-sphere particles. The cylinder-shaped particles were constructed differently in the two numerical methods. In the CFD part, each cylinder-shaped particle was constructed as an agglomerate of fictitious small particles, meaning that the small fictitious particles gather but do not combine with each other. The diameter of a fictitious particle, d_fic, and its solid volume fraction inside a cylinder-shaped particle, α_fic (the fictitious volume fraction), are introduced to modify the drag coefficient β through the volume fractions of the cylinder-shaped particles, α_cld, and the sphere-shaped particles, α_sph. In a computational cell, the void fraction ε can then be expressed as ε = 1 - α_cld·α_fic - α_sph. The Ergun equation and the Wen and Yu equation were used to calculate β. In the DEM part, cylinder-shaped particles were built by the multi-sphere method, in which small sphere elements are merged with each other. A soft-sphere model was used to obtain the contact force between particles, and the total contact force of a cylinder-shaped particle was calculated as the sum of the forces on its small sphere particles. The model (size = 1 × 0.15 × 0.032 m³) contained 420,000 sphere-shaped particles (diameter = 0.8 mm, density = 1350 kg/m³) and 60 cylinder-shaped particles (diameter = 10 mm, length = 10 mm, density = 2650 kg/m³). Each cylinder-shaped particle was constructed from 2072 small sphere-shaped particles (d = 0.8 mm) in the CFD mesh and 768 sphere-shaped particles (d = 3 mm) in the DEM mesh. The lengths of the CFD and DEM cells are 1 mm and 2 mm. The superficial gas velocity was varied across models as 1.0 m/s, 1.5 m/s, and 2.0 m/s, and the simulation results were compared with experimental results. The particles moved regularly, like a fountain, and the effect of superficial gas velocity on the cylinder-shaped particles was stronger than on the sphere-shaped particles. The results show that the present work provides an effective approach to simulating the flow of two-component particles.
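The void-fraction relation ε = 1 - α_cld·α_fic - α_sph can be checked with a small helper; the fractions below are invented for illustration:

```python
def cell_void_fraction(alpha_cld, alpha_fic, alpha_sph):
    """Void fraction of a computational cell when cylinder-shaped
    particles are modelled as agglomerates of fictitious spheres:
    eps = 1 - alpha_cld * alpha_fic - alpha_sph."""
    eps = 1.0 - alpha_cld * alpha_fic - alpha_sph
    if not 0.0 <= eps <= 1.0:
        raise ValueError("inconsistent volume fractions")
    return eps

# Illustrative cell: cylinder envelopes fill 25% of the cell and are
# 50% solid internally; sphere-shaped particles fill another 12.5%.
print(cell_void_fraction(0.25, 0.5, 0.125))  # 0.75
```

Note that only the solid part of the cylinder envelope (α_cld·α_fic) reduces the void, which is exactly why the fictitious volume fraction enters the drag correction.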

Keywords: computational fluid dynamics, discrete element method, fluidized bed, multiphase flow

Procedia PDF Downloads 288
4274 Culturing of Bovine Pre-Compacted Morulae in TCM-199 and BAF in a Standard 5% CO2 Laboratory Incubator and in the Vagina of a Goat Doe

Authors: Daniel M. Barry

Abstract:

For more than half a century, attempts have been made to culture cells and embryos outside the body (in vitro or ex vivo). This has been done with different culture media and in various "incubators". In the present study, two different culture media were used: a standard TCM-199 culture medium and first-trimester bovine amniotic fluid (BAF) collected sterilely from pregnant cows after slaughter. Two different culture conditions were also investigated: the standard laboratory CO2 incubator versus culturing bovine embryos in the vagina of a goat doe. Two experiments were done. First, the permeability of different receptacles to CO2 gas was analyzed for possible culture in the vagina. Four-well plates and straws were used to incubate TCM-199 and BAF for a period of 120 h in the presence or absence of 5% CO2 gas, and the pH values were measured and recorded every 24 h. In the second experiment, pre-compacted morula-stage bovine embryos were cultured in the above culture media in sealed 0.25 mL straws in a standard laboratory incubator and in the vagina of a goat doe. Evaluation was done on (1) the stage of development and (2) the number of blastomeres after 96 h of culture. The first experiment showed that CO2 gas diffused out of the 4-well plates as well as through the walls of the straws in the absence of CO2 gas, while in the presence of CO2 the pH of both media stabilized between 7.3 and 7.5. This meant that the semen straws were permeable to CO2 gas and could therefore be used as receptacles for culturing early-stage bovine embryos. In the second experiment, no statistical differences (p>0.05) were found in the number of pre-compacted bovine embryos that developed to the blastocyst stage or the hatched blastocyst stage, either for the culture medium used or for the method of culturing in the two incubators. Neither was there any difference (p>0.05) in the number of blastomeres at the blastocyst stage between the two types of incubators. The bovine embryos tended to develop more blastomeres when cultured in BAF than in TCM-199, both in the standard laboratory incubator and when using the vagina of a goat doe as an incubator.

Keywords: alternative culture, bovine embryos, vagina, bovine amniotic fluid, incubator

Procedia PDF Downloads 460
4273 Fire Protection Performance of Different Industrial Intumescent Coatings for Steel Beams

Authors: Serkan Kocapinar, Gülay Altay

Abstract:

This study investigates the efficiency of two different industrial intumescent coatings, which hold different types of certification, in protecting steel beams against the ISO 834 fire for 2 hours. A better understanding of industrial intumescent coatings, which assure structural integrity and prevent the collapse of steel structures, is needed to minimize fire risks in steel structures. Two fire-protective intumescent coatings, Product A and Product B, are compared as thermal barriers between the steel components and the fire. Product A is tested according to EN 13381-8 and BS 476-20,22 and is certified under ISO standards. Product B is tested according to EN 13381-8 and ASTM UL-94 and is certified by the Turkish Standards Institute (TSE). Generally, fire tests to evaluate the fire performance of steel components are done numerically with commercial software instead of experiments, due to the high cost of an ISO 834 fire test in a furnace. Hence, there is a gap in the literature on comparing differently certified intumescent coatings under an ISO 834 furnace fire for 2 hours. The experiment was carried out using two 1-meter UPN 200 steel sections, each coated with a different industrial intumescent coating. A furnace at the Turkish Standards Institute (TSE) was used for the experiment. The temperatures of the protected steels and of the furnace interior were measured during the two hours with the help of 24 thermocouples, applied before the intumescent coatings, to obtain temperature-time curves of the steel components and thus assess the performance of the coatings. FIN EC software was used to determine the critical temperatures of the protected steels, and Abaqus was used for thermal analysis to obtain theoretical results to compare with the experimental ones.

Keywords: fire safety, structural steel, ABAQUS, thermal analysis, FIN EC, intumescent coatings

Procedia PDF Downloads 74
4272 Simulation to Detect Virtual Fractional Flow Reserve in Coronary Artery Idealized Models

Authors: Nabila Jaman, K. E. Hoque, S. Sawall, M. Ferdows

Abstract:

Coronary artery disease (CAD) is one of the most lethal cardiovascular diseases. Coronary artery stenosis and bifurcation angles closely interact in myocardial infarction. We use computer-aided design models coupled with computational hemodynamics (CHD) simulation to detect several types of coronary artery stenosis at different locations in an idealized model and to identify the virtual fractional flow reserve (vFFR). The vFFR indicates the severity of stenosis in the computational models. A further goal is to imitate patient-specific computed tomography coronary angiography models when constructing idealized models with different left anterior descending (LAD) and left circumflex (LCx) bifurcation angles, and to analyze whether the bifurcation angle has an impact on the narrowing of coronary arteries. The numerical simulation provides CHD parameters such as wall shear stress (WSS), velocity magnitude, and pressure gradient (PGD) that characterize the stenosis condition in the computational domain.
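The fractional flow reserve itself is a simple pressure ratio, which is what makes the vFFR extractable from a simulated pressure field. A minimal sketch; the 0.80 cut-off is the commonly used clinical threshold, and the example pressures are hypothetical.

```python
def fractional_flow_reserve(p_distal: float, p_aortic: float) -> float:
    """FFR: mean pressure distal to the stenosis divided by mean aortic
    (proximal) pressure; in a vFFR workflow both come from the CHD solution."""
    return p_distal / p_aortic

def is_hemodynamically_significant(ffr: float, threshold: float = 0.80) -> bool:
    """FFR <= 0.80 is a widely used cut-off for a significant stenosis."""
    return ffr <= threshold
```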

Keywords: CAD, CHD, vFFR, bifurcation angles, coronary stenosis

Procedia PDF Downloads 134
4271 X-Ray Photoelectron Spectroscopy Characterization of the Surface Layer on Inconel 625 after Exposition in Molten Salt

Authors: Marie Kudrnova, Jana Petru

Abstract:

This study is part of the international research program Materials for Molten Salt Reactors (MSR) and addresses the part of the project dealing with the corrosion behavior of candidate construction materials. Inconel 625 was characterized by X-ray photoelectron spectroscopy (XPS) before and after a high-temperature experiment in molten salt. The experiment was performed in a horizontal tube furnace molten salt reactor at 450 °C in argon, at atmospheric pressure, for 150 hours. Industrially produced HITEC salt (NaNO3, KNO3, NaNO2) was used. The XPS study was carried out using the ESCAProbe P apparatus (Omicron Nanotechnology Ltd.) equipped with a monochromatic Al Kα (1486.6 eV) X-ray source. The surface layer on alloy 625 after exposure contains only Na, C, O, Ni (as NiOx), and Nb (as NbOx, BE 206.8 eV). Ni was also detected in the metallic state (Ni0: Ni 2p BE 852.7 eV; NiOx: Ni 2p BE 854.7 eV) after a short Ar sputtering, because the oxide layer on the surface was very thin. Nickel oxides can form a protective layer in the molten salt, but only future long-term exposures can determine the suitability of Inconel 625 for MSR.
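Peak assignment in XPS amounts to matching measured binding energies against reference values within an energy tolerance. A toy sketch using only the binding energies quoted in the abstract; the 0.5 eV tolerance is an assumption, and a real analysis would also use peak shape, spin-orbit splitting, and charge referencing.

```python
# Binding energies (eV) cited in the abstract; labels follow the text.
REFERENCE_BE = {
    "Ni0 (Ni 2p)": 852.7,
    "NiOx (Ni 2p)": 854.7,
    "NbOx": 206.8,
}

def identify_species(measured_be: float, tolerance: float = 0.5):
    """Return all reference species within `tolerance` eV of a measured peak."""
    return [name for name, be in REFERENCE_BE.items()
            if abs(be - measured_be) <= tolerance]
```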

Keywords: Inconel 625, molten salt, oxide layer, XPS

Procedia PDF Downloads 119
4270 A Parametric Study on Aerodynamic Performance of Tyre Using CFD

Authors: Sowntharya L.

Abstract:

Aerodynamics is the most important factor when it comes to resistive forces such as lift, drag, and side forces acting on a vehicle. In passenger vehicles, reducing drag not only unlocks higher achievable speeds but also reduces fuel consumption. The tyres generally contribute significantly to the overall aerodynamics of the vehicle; hence, understanding the air-flow behaviour around the tyre is vital to optimizing aerodynamic performance early in the design process. Nowadays, aerodynamic simulation employing Computational Fluid Dynamics (CFD) is gaining importance as it reduces the number of physical wind-tunnel experiments during the vehicle development process. This research develops a methodology to predict the aerodynamic drag of a standalone tyre using a numerical CFD solver and validates it against a wind-tunnel experiment. A parametric study was carried out on tyres with different tread patterns (slick, circumferential groove, and patterned) under stationary and rotating boundary conditions. To represent wheel rotation in contact with the ground, the moving reference frame (MRF) approach was used. Aerodynamic parameters such as drag, lift, and the air-flow behaviour around the tyre were simulated and compared with experimental results.
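Whether the drag force comes from the CFD solver or from a wind-tunnel balance, comparison is usually done through the dimensionless drag coefficient. A minimal sketch; the default air density, speed, and frontal area are hypothetical placeholders, not values from this study.

```python
def drag_coefficient(drag_force_N: float,
                     air_density: float = 1.225,   # kg/m3, sea-level air (assumed)
                     velocity: float = 27.8,       # m/s (~100 km/h), assumed test speed
                     frontal_area: float = 0.12):  # m2, hypothetical tyre frontal area
    """Cd = F / (q * A), with dynamic pressure q = 0.5 * rho * v^2."""
    q = 0.5 * air_density * velocity ** 2
    return drag_force_N / (q * frontal_area)
```

Expressed this way, the same coefficient can be compared directly across slick, grooved, and patterned tyres, and between stationary and rotating (MRF) cases.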

Keywords: aerodynamics, CFD, drag, MRF, wind-tunnel

Procedia PDF Downloads 165
4269 Optimization of Electrical Discharge Machining Parameters in Machining AISI D3 Tool Steel by Grey Relational Analysis

Authors: Othman Mohamed Altheni, Abdurrahman Abusaada

Abstract:

This study presents the optimization of multiple performance characteristics [material removal rate (MRR), surface roughness (Ra), and overcut (OC)] in electrical discharge machining (EDM) of hardened AISI D3 tool steel using the Taguchi method and grey relational analysis. The machining process parameters selected were pulse current Ip, pulse-on time Ton, pulse-off time Toff, and gap voltage Vg. Based on ANOVA, pulse current is found to be the most significant factor affecting the EDM process. The optimized process parameters, which simultaneously lead to higher MRR, lower Ra, and lower OC, are then verified through a confirmation experiment. The validation experiment shows improved MRR, Ra, and OC when the Taguchi method and grey relational analysis are used.
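Grey relational analysis collapses the three responses into a single grade per experimental run: each response is normalized (larger-the-better for MRR, smaller-the-better for Ra and OC), deviations from the ideal sequence are converted into grey relational coefficients, and the coefficients are averaged. A minimal sketch of that standard procedure with the usual distinguishing coefficient of 0.5; the sample data are hypothetical, not from this study.

```python
def normalize(values, larger_better):
    """Map a response onto [0, 1]; 1 is ideal for that characteristic."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    if larger_better:
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]

def grey_relational_grades(responses, zeta=0.5):
    """responses: list of (values, larger_better) pairs, one per characteristic.
    Returns one grey relational grade per experimental run."""
    norm = [normalize(v, lb) for v, lb in responses]
    n = len(norm[0])
    dev = [[1.0 - x for x in seq] for seq in norm]  # deviation from the ideal (1.0)
    dmin = min(min(seq) for seq in dev)
    dmax = max(max(seq) for seq in dev)
    coeff = [[(dmin + zeta * dmax) / (d + zeta * dmax) for d in seq] for seq in dev]
    return [sum(c[i] for c in coeff) / len(coeff) for i in range(n)]
```

The run with the highest grade identifies the parameter combination that best balances high MRR against low Ra and low OC.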

Keywords: EDM parameters, grey relational analysis, Taguchi method, ANOVA

Procedia PDF Downloads 268
4266 Density Functional Theory (DFT) Study of the Structural Properties and Phase Transitions of ThC and ThN: LDA vs. GGA

Authors: Hamza Rekab Djabri, Salah Daoud

Abstract:

The present paper deals with the computation of the structural and electronic properties of ThC and ThN compounds using density functional theory within the generalized gradient approximation (GGA) and the local density approximation (LDA). We employ the full-potential linear muffin-tin orbital (FP-LMTO) method as implemented in the LmtART code. Structural parameters were examined in eight different structures: NaCl (B1), CsCl (B2), zinc blende (B3), NiAs (B8), PbO (B10), wurtzite (B4), HCP (A3), and β-Sn (A5). The equilibrium lattice parameter, bulk modulus, and its pressure derivative are presented for all calculated phases. The calculated ground-state properties are in good agreement with available experimental and theoretical results.
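The equilibrium lattice parameter, bulk modulus B0, and pressure derivative B0' quoted for each phase are typically obtained by fitting total-energy-versus-volume data to an equation of state. A sketch of the third-order Birch-Murnaghan form commonly used for this; the parameter values in the test are hypothetical, not results for ThC or ThN.

```python
def birch_murnaghan_energy(V: float, E0: float, V0: float,
                           B0: float, B0p: float) -> float:
    """Third-order Birch-Murnaghan equation of state, E(V).
    E0, V0: equilibrium energy and volume; B0, B0p: bulk modulus
    and its pressure derivative (consistent energy/volume units assumed)."""
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9.0 * V0 * B0 / 16.0 * (
        (eta - 1.0) ** 3 * B0p + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
    )
```

Fitting this curve separately to the LDA and GGA energy points for each of the eight structures yields the per-phase equilibrium parameters, and comparing the fitted minima across structures locates the phase-transition pressures.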

Keywords: DFT, GGA, LDA, structural properties, ThC, ThN

Procedia PDF Downloads 68
4267 Regional Low Gravity Anomalies Influencing High Concentrations of Heavy Minerals on Placer Deposits

Authors: T. B. Karu Jayasundara

Abstract:

Regions of low gravity and gravity anomalies both influence heavy-mineral concentrations on placer deposits. Economically important heavy minerals are likely to be deposited at higher levels in low-gravity regions of placer deposits. This can be seen in the coastal regions of southern Asia, particularly in Sri Lanka and Peninsular India, areas located in the lowest-gravity region of the world. About 70 kilometers of the east coast of Sri Lanka is covered by a high percentage of ilmenite deposits, and the southwest coast of the island holds a monazite placer deposit; these are among the largest placer deposits in the world. In India, the heavy-mineral industry has a good market. On the other hand, based on the coastal placer deposits recorded, the high-gravity region located around Papua New Guinea has no such heavy-mineral deposits. In low-gravity regions, with the help of other depositional environmental factors, the grains have more time and space to float in the sea, which helps bring high concentrations of heavy minerals to the coast. The effect of low and high gravity can be demonstrated using heavy-mineral separation devices. The Wilfley table is one of these; it is used extensively in industry and in laboratories for heavy-mineral separation. The horizontally oscillating Wilfley table separates heavy and light mineral grains into different fractions with the use of water. In this experiment, the low and high angles of the Wilfley table represent low and high gravity, respectively. A sample mixture of heavy and light mineral grains with grain size <0.85 mm was used. The high and low table angles were 60 and 20, respectively. The fractions separated on the table were again divided into heavy and light minerals using a heavy liquid with a specific gravity of 2.85.
The separated heavy- and light-mineral fractions were used to draw two-dimensional graphs. The graphs show that the low-gravity stage collects a higher percentage of heavy minerals in the upper area of the table than the high-gravity stage. The results of the experiment can be used to compare heavy-mineral levels between regional low-gravity and high-gravity settings. If any heavy-mineral deposits exist in high-gravity regions, they will occur far from the coast, within the continental shelf.

Keywords: anomaly, gravity, influence, mineral

Procedia PDF Downloads 173
4266 The Effect of Phonetics Factors in Interpretation of Japanese Degree Adverbs

Authors: Yan Lyu

Abstract:

Japanese degree adverbs can be interpreted in different ways, which is hard for Japanese learners to comprehend. For instance, when ‘tyotto’ is used as a degree word, it can be interpreted literally or not. In the sentence ‘Ano mise, tyotto oishi yo. zehi iku to ii yo.’, ‘tyotto’ can be interpreted contextually as a high degree. Besides pragmatic factors, phonetic factors can also affect the interpretation of such ‘tyotto’. Concentrating on the pattern ‘tyotto + adjective’, this paper investigates the correlation between the interpretation of ‘tyotto’ and phonetic factors in some specific contexts, based on a listening experiment conducted via PRAAT. It is also investigated how phonetic factors affect the interpretation of the high-degree adverbs ‘soutou’, ‘totemo’, ‘kanari’, and ‘sugoku’. In the experiment, Japanese speakers listened to sentences composed of degree adverbs and adjectives in different intonations and judged which degree each sentence expressed. Two conclusions can be drawn from the results. First, for adverbs expressing a high degree in the pattern ‘degree adverb + adjective’, a higher degree is expressed when either the degree adverb or the adjective, or both, are pronounced at a higher pitch. Moreover, with the insertion of a geminate consonant and the extension of the vowel, the longer the duration of the degree adverb, the higher the degree expressed. Second, for ‘tyotto’, which expresses a low degree, the interpretation is influenced by both phonetic and contextual factors. Phonetically, three factors cause ‘tyotto’ to be interpreted as a common or high degree: a high pitch on the modified adjective, an extended silence period of the geminate consonant, and a change in the intonation of ‘tyotto’.
In some contexts, such as comparison sentences, ‘tyotto’ tends to be interpreted literally as a low degree no matter how ‘tyotto + adjective’ is pronounced.
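The two findings can be summarized as a toy decision rule over prosodic measurements of the kind PRAAT provides. A hedged sketch only: the pitch and duration thresholds are hypothetical illustration values, not ones reported by the study.

```python
def interpret_tyotto(adverb_pitch_hz: float, adjective_pitch_hz: float,
                     adverb_duration_ms: float,
                     pitch_threshold: float = 220.0,     # hypothetical threshold
                     duration_threshold: float = 350.0,  # hypothetical threshold
                     comparison_context: bool = False) -> str:
    """Toy rule from the reported findings: high pitch on either word, or a
    lengthened adverb (geminate insertion / vowel extension), shifts 'tyotto'
    away from its literal low-degree reading; comparison contexts force the
    literal reading regardless of prosody."""
    if comparison_context:
        return "low degree (literal)"
    if (adverb_pitch_hz > pitch_threshold
            or adjective_pitch_hz > pitch_threshold
            or adverb_duration_ms > duration_threshold):
        return "high degree"
    return "low degree (literal)"
```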

Keywords: contextual interpretation, Japanese degree adverbs, phonetic interpretation, PRAAT

Procedia PDF Downloads 239
4265 Study of Salinity Stress and Calcium Interaction on Morphological and Physiological Traits of Vicia villosa under Hydroponic Condition

Authors: Raheleh Khademian, Roghayeh Aminian

Abstract:

To study salinity stress on Vicia villosa and the modulating effect of calcium, an experiment was conducted under hydroponic conditions, and some important morphological and physiological characteristics were evaluated. The experiment was laid out as a factorial based on a completely randomized design with three replications. The treatments included salinity stress at three levels (0, 50, and 100 mM NaCl) and calcium at two levels (the content of Hoagland solution and double that content). The results showed that all morphological and physiological traits, including root and shoot length, root and shoot wet and dry weight, leaf area, leaf chlorophyll content, RWC, CMS, and biological yield, differed significantly from the control and were severely affected by salinity stress. The effect of calcium, however, was not significant, despite reducing the effect of salinity.

Keywords: Vicia villosa, salinity stress, calcium, hydroponic

Procedia PDF Downloads 230
4264 Affective Transparency in Compound Word Processing

Authors: Jordan Gallant

Abstract:

In the compound word processing literature, much attention has been paid to the relationship between a compound’s denotational meaning and that of its morphological whole-word constituents, referred to as ‘semantic transparency’. However, the parallel relationship between a compound’s connotation and that of its constituents has not been addressed at all. For instance, while a compound like ‘painkiller’ might be semantically transparent, it is not ‘affectively transparent’: both constituents have primarily negative connotations, while the whole compound has a positive one. This paper investigates the role of affective transparency in compound processing using two methodologies commonly employed in this field: a lexical decision task and a typing task. The critical stimuli were 112 English bi-constituent compounds that differed in the affective transparency of their constituents. Of these, 36 contained constituents with connotations similar to the compound (e.g., ‘dreamland’), 36 contained constituents with more positive connotations (e.g., ‘bedpan’), and 36 contained constituents with more negative connotations (e.g., ‘painkiller’). The connotations of whole-word constituents and compounds were operationalized via valence ratings taken from an off-line ratings database. In Experiment 1, compound stimuli and matched non-word controls were presented visually to participants, who were asked to indicate whether each was a real word of English; response times and accuracy were recorded. In Experiment 2, participants typed compound stimuli presented to them visually; individual keystroke response times and typing accuracy were recorded. The results of both experiments provided positive evidence that compound processing is influenced by affective transparency.
In Experiment 1, compounds in which both constituents had more negative connotations than the compound itself were responded to significantly more slowly than compounds in which the constituents had similar or more positive connotations. Typed responses from Experiment 2 showed that inter-keystroke intervals at the morphological constituent boundary were significantly longer when the connotation of the head constituent was either more positive or more negative than that of the compound. The interpretation of this finding is discussed in the context of previous compound typing research. Taken together, these findings suggest that affective transparency plays a role in the recognition, storage, and production of English compound words. This study provides a promising first step in a new direction for research on compound words.

Keywords: compound processing, semantic transparency, typed production, valence

Procedia PDF Downloads 96
4263 Water Saving and Awareness Actions

Authors: R. Morbidelli, C. Saltalippi, A. Flammini, J. Dari

Abstract:

This work analyses the effect that systematically raising public awareness has on domestic water consumption. In a period when the availability of water is continually decreasing due to reduced rainfall, raising awareness among the population is of paramount importance. We conducted an experiment on a large sample of homes in urban areas of Central Italy. In a first phase, lasting three weeks, normal per-capita water consumption was quantified. Subsequently, instructions were given on how to save water in various household uses (showers, washing hands, toilet flushing, watering small green areas, use of water in the kitchen, ...), and small visual messages were posted at water dispensers to remind users to behave properly. Finally, household consumption was assessed again for a further three weeks. This experiment made it possible to quantify the effect of the awareness-raising action on the reduction of water consumption without any structural measures (replacement of dispensers, improvement of the water system, ...).
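The before/after comparison reduces to a percent saving in mean per-capita daily consumption. A minimal sketch; the consumption figures in the example are hypothetical, not data from the study.

```python
def percent_water_saving(before_lpcd, after_lpcd):
    """Percent reduction in mean per-capita daily consumption (liters per
    capita per day) between the baseline and post-awareness periods."""
    mean_before = sum(before_lpcd) / len(before_lpcd)
    mean_after = sum(after_lpcd) / len(after_lpcd)
    return 100.0 * (mean_before - mean_after) / mean_before
```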

Keywords: water saving, urban areas, awareness-raising, climate change

Procedia PDF Downloads 10
4262 Optimal Allocation of Multiple Emergency Resources for a Single Potential Accident Node: A Mixed Integer Linear Program

Authors: Yongjian Du, Jinhua Sun, Kim M. Liew, Huahua Xiao

Abstract:

The optimal allocation of emergency resources before a disaster is of great importance for emergency response. In practice, pre-positioned protection for a single critical node where accidents may occur is common. In this study, a model is developed to determine the location and inventory decisions for multiple emergency resources among a set of candidate stations so as to minimize the total cost subject to budget and capacity constraints. The total cost includes the economic accident loss, which follows a probability distribution over time, and the warehousing cost of resources, which increases over time. A ratio is defined to measure the degree to which a storage station serves only the target node; it grows as the distance between them decreases. To keep the formulation linear, it is assumed that the travel time of emergency resources to the accident scene has a linear relationship with the economic accident loss. A computational experiment is conducted to illustrate how the proposed model works, and the results indicate its effectiveness and practicability.
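The cost trade-off at the heart of the model can be illustrated without a MILP solver on a tiny instance: choose inventory levels at candidate stations so that acquisition-plus-warehousing cost stays within budget while the shortfall-driven accident loss is minimized. A brute-force sketch under the abstract's linearity assumption; it is not the authors' formulation, and the data in the test are hypothetical.

```python
from itertools import product

def allocate(stations, budget, demand, loss_per_unit_shortfall):
    """Brute-force search over inventory levels at each candidate station
    (small instances only). stations: list of dicts with 'capacity' (units)
    and 'unit_cost' (purchase plus warehousing, assumed per unit).
    Returns (total_cost, inventory_tuple) or None if nothing is feasible."""
    best = None
    for inv in product(*(range(s["capacity"] + 1) for s in stations)):
        cost = sum(q * s["unit_cost"] for q, s in zip(inv, stations))
        if cost > budget:
            continue  # budget constraint
        shortfall = max(0, demand - sum(inv))
        # accident loss assumed linear in unmet demand (linearity assumption)
        total = cost + shortfall * loss_per_unit_shortfall
        if best is None or total < best[0]:
            best = (total, inv)
    return best
```

A real instance would replace the enumeration with an integer linear program, but the objective structure (inventory cost plus linearized accident loss) is the same.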

Keywords: emergency response, integer linear program, multiple emergency resources, pre-allocation decisions, single potential accident node

Procedia PDF Downloads 126