Search results for: simulated annealing technique (SA)
7539 Identification of Mx Gene Polymorphism in Indragiri Hulu duck by PCR-RFLP
Authors: Restu Misrianti
Abstract:
The amino acid variation Asn (allele A) at position 631 in the Mx gene is specific to positive antiviral activity against avian viral disease. This research aimed to identify polymorphism of the Mx gene in ducks using a molecular technique. The Polymerase Chain Reaction-Restriction Fragment Length Polymorphism (PCR-RFLP) technique was used to distinguish the AA, AG and GG genotypes. Thirteen ducks from Indragiri Hulu regency (Riau Province) were used in this experiment. DNA amplification results showed that the Mx gene in duck is found in a 73 bp fragment. The Mx gene in these ducks did not show any polymorphism: the frequency of the resistant allele (A) was 0%, while the frequency of the susceptible allele (G) was 100%.
Keywords: duck, Mx gene, PCR, RFLP
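As a minimal sketch of the population-genetics step behind results like those above, allele frequencies can be computed from PCR-RFLP genotype counts (AA, AG, GG). The function and the tie to the 13 GG ducks reflect the abstract; the second example's counts are invented for illustration.

```python
# Compute allele frequencies from genotype counts (each bird carries two alleles).
def allele_frequencies(n_aa, n_ag, n_gg):
    """Return (freq_A, freq_G) from counts of AA, AG and GG genotypes."""
    total_alleles = 2 * (n_aa + n_ag + n_gg)
    freq_a = (2 * n_aa + n_ag) / total_alleles
    freq_g = (2 * n_gg + n_ag) / total_alleles
    return freq_a, freq_g

# The study's result: all 13 ducks were genotype GG.
fa, fg = allele_frequencies(0, 0, 13)
print(fa, fg)  # 0.0 1.0
```

With a polymorphic sample, e.g. counts (1, 2, 1), the same function would return equal frequencies of 0.5 for each allele.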
Procedia PDF Downloads 325
7538 A Three-Dimensional (3D) Numerical Study of Roofs Shape Impact on Air Quality in Urban Street Canyons with Tree Planting
Authors: Bouabdellah Abed, Mohamed Bouzit, Lakhdar Bouarbi
Abstract:
The objective of this study is to investigate numerically the effect of roof shape on wind flow and pollutant dispersion in a street canyon with one row of trees of pore volume Pvol = 96%. A three-dimensional computational fluid dynamics (CFD) model was developed to evaluate air flow and pollutant dispersion within an urban street canyon, using the Reynolds-averaged Navier-Stokes (RANS) equations with the k-epsilon EARSM turbulence model to close the equation system. The numerical model was implemented in the ANSYS-CFX code. Vehicle emissions were simulated as double line sources along the street. The numerical model was validated against a wind tunnel experiment, and the simulation agrees reasonably with the wind tunnel data. Having established this, the wind flow and pollutant dispersion in urban street canyons with six roof shapes were simulated. The results obtained in this work indicate that the flow in the 3D domain is more complicated; this complexity is increased by the presence of trees and the variability of the roof shapes. The results also indicate that the largest pollutant concentration level on the two walls (leeward and windward) is observed with the upwind wedge-shaped roof, while the smallest pollutant concentration level is observed with the dome-shaped roof. Finally, the corner eddies provide additional ventilation and lead to lower traffic pollutant concentrations at the street canyon ends.
Keywords: street canyon, pollutant dispersion, trees, building configuration, numerical simulation, k-epsilon EARSM
Procedia PDF Downloads 366
7537 A Standard Operating Procedure (SOP) for Forensic Soil Analysis: Tested Using a Simulated Crime Scene
Authors: Samara A. Testoni, Vander F. Melo, Lorna A. Dawson, Fabio A. S. Salvador
Abstract:
Soil traces are useful as forensic evidence due to their potential to transfer and adhere to different types of surfaces on a range of objects or persons. The great variability expressed by soil physical, chemical, biological and mineralogical properties makes soil traces complex mixtures. Soils are continuous and variable, and no two soil samples are truly indistinguishable; nevertheless, this complexity of soil characteristics can provide powerful evidence for comparative forensic purposes. This work aimed to establish a Standard Operating Procedure (SOP) for forensic soil analysis in Brazil. We carried out a simulated crime scene with double-blind sampling to calibrate the sampling procedures. Samples were collected at a range of locations covering the soil types found in the south of Brazil: Santa Candida and Boa Vista, neighbourhoods of Curitiba (State of Parana), and Guarani and Guaraituba, neighbourhoods of Colombo (Curitiba Metropolitan Region). A previously validated sequence of chemical, physical and mineralogical analyses was performed on around 2 g of soil. The suggested SOP and the sequential range of analyses were effective in grouping together the samples from the same place and the same parent material, and successfully discriminated samples from different locations and different parent rocks. In addition, modifications to the sample treatment and analytical protocol can be made depending on the context of the forensic work.
Keywords: clay mineralogy, forensic soil analysis, sequential analyses, kaolinite, gibbsite
Procedia PDF Downloads 255
7536 Enhanced Cluster Based Connectivity Maintenance in Vehicular Ad Hoc Network
Authors: Manverpreet Kaur, Amarpreet Singh
Abstract:
The demand for vehicular ad hoc networks (VANETs) is increasing day by day because they offer various applications and substantial benefits to their users. Clustering is a key means of overcoming the connectivity problems of VANETs. In this paper, we propose a new clustering technique, enhanced cluster-based connectivity maintenance, for vehicular ad hoc networks. Our objective is to form long-living clusters. The proposed approach groups vehicles into clusters on the basis of the longest list of neighbors. Cluster formation and cluster head selection are performed by the roadside unit (RSU), which reduces the overhead on the network. In the cluster head selection procedure, the RSU elects as cluster head the vehicle whose speed is closest to the average speed; if two vehicles have the same speed closest to the average, they are compared on a further parameter, the distance to their respective destinations, and the vehicle with the larger distance to its destination is chosen as cluster head. Our simulation outcomes show that our technique performs better than the existing technique.
Keywords: VANETs, clustering, connectivity, cluster head, intelligent transportation system (ITS)
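The selection rule described above can be sketched as a small function: pick the vehicle whose speed is closest to the cluster average, breaking ties by the larger distance to its destination. The vehicle records and their field names are assumptions for illustration, not the paper's data structures.

```python
def select_cluster_head(vehicles):
    """RSU-side rule: closest speed to the average, ties broken by larger
    distance to destination (hence the negated 'dist' in the sort key)."""
    avg_speed = sum(v["speed"] for v in vehicles) / len(vehicles)
    return min(vehicles, key=lambda v: (abs(v["speed"] - avg_speed), -v["dist"]))

# Illustrative fleet: speeds in km/h, 'dist' = distance to destination in km.
fleet = [
    {"id": "v1", "speed": 60, "dist": 5.0},
    {"id": "v2", "speed": 72, "dist": 9.0},
    {"id": "v3", "speed": 66, "dist": 2.0},
]
print(select_cluster_head(fleet)["id"])  # v3 (average speed is exactly 66)
```

With two candidates equally far from the average speed (say 60 and 72 around an average of 66), the one with the larger remaining distance to its destination wins, matching the tie-break in the abstract.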
Procedia PDF Downloads 248
7535 Laser Cooling of Internal Degrees of Freedom of Molecules: Cesium Case
Authors: R. Horchani
Abstract:
The optical pumping technique with laser fields, combined with photo-association of ultra-cold atoms, makes it possible to control on demand the vibrational and/or rotational population of molecules. Here, we review the basic concepts and the main steps to be followed, including the excitation schemes and detection techniques we use to achieve the ro-vibrational cooling of Cs2 molecules. We also discuss the extension of this technique to other molecules. In addition, we present a theoretical model used to support the experiment. These simulations can be widely used in the preparation of various experiments, since they allow the optimization of several important experimental parameters.
Keywords: cold molecule, photo-association, optical pumping, vibrational and rotational cooling
Procedia PDF Downloads 302
7534 Synthesis and in vitro Characterization of a Gel-Derived SiO2-CaO-P2O5-SrO-Li2O Bioactive Glass
Authors: Mehrnaz Aminitabar, Moghan Amirhosseinian, Morteza Elsa
Abstract:
Bioactive glasses (BGs) are a group of surface-reactive biomaterials used in clinical applications as implants or filler materials in the human body to repair and replace diseased or damaged bone. The sol-gel technique was employed to prepare a SiO2-CaO-P2O5 glass with the nominal composition of 58S BG, with the addition of Sr and Li modifiers, which impart special properties to the BG. The effect of the simultaneous addition of Sr and Li on bioactivity and biocompatibility, on the proliferation and alkaline phosphatase (ALP) activity of the osteoblast cell line MC3T3-E1, and on antibacterial activity against methicillin-resistant Staphylococcus aureus (MRSA) bacteria was examined. The BGs were characterized by X-ray diffraction, Fourier transform infrared spectroscopy, and scanning electron microscopy before and after soaking the samples in simulated body fluid (SBF) for different time intervals, in order to characterize the hydroxyapatite (HA) formed on the surface of the BGs. Structural characterization indicated that the simultaneous presence of 5% Sr and 5% Li in the 58S-BG composition not only did not retard HA formation, owing to the opposite effects of Sr and Li on the dissolution of the BG in SBF, but also stimulated the differentiation and proliferation of MC3T3-E1 cells. Moreover, the dissolution of the Sr and Li ions resulted in an increase in the mean number of DAPI-labeled nuclei, in good agreement with the live/dead assay. The antibacterial tests revealed that the Sr- and Li-substituted 58S BG exhibited a potential antibacterial effect against MRSA bacteria. Because of the optimal proliferation and ALP activity of MC3T3-E1 cells, proper bioactivity, and high antibacterial potential against MRSA, BG-5/5 is suggested as a multifunctional candidate for bone tissue engineering.
Keywords: antibacterial activity, bioactive glass, sol-gel, strontium
Procedia PDF Downloads 121
7533 Minimizing Fresh and Wastewater Using Water Pinch Technique in Petrochemical Industries
Authors: Wasif Mughees, Malik Al-Ahmad, Muhammad Naeem
Abstract:
This research involves the design and analysis of pinch-based water/wastewater networks to minimize water utility in the petrochemical and petroleum industries. A study was carried out on the Tehran Oil Refinery to analyze the feasibility of regeneration, reuse and recycling within the water network. COD is considered as the single key contaminant. The amount of freshwater was reduced by about 149 m3/h (43.8%) with respect to COD. A re-design (retrofit) of the water allocation in the networks was undertaken. The results were analyzed through a graphical method and a mathematical programming technique, which clearly demonstrated that the amount of required water is determined by the mass transfer of COD.
Keywords: minimization, water pinch, water management, pollution prevention
Procedia PDF Downloads 450
7532 An Investigation of Surface Texturing by Ultrasonic Impingement of Micro-Particles
Authors: Nagalingam Arun Prasanth, Ahmed Syed Adnan, S. H. Yeo
Abstract:
Surface topography plays a significant role in the functional performance of engineered parts. It is important to have control of the surface geometry and an understanding of the surface details to obtain the desired performance. Hence, in the current research contribution, a non-contact micro-texturing technique has been explored and developed. The technique involves ultrasonic excitation of a tool as the prime source of surface texturing for aluminum alloy workpieces. The specimen surface is polished first and is then immersed in a liquid bath containing a 10% weight concentration of Ti6Al4V grade 5 spherical powder. A submerged slurry jet is used to recirculate the spherical powder under the ultrasonic horn, which is excited at an ultrasonic frequency of 40 kHz and an amplitude of 70 µm. The distance between the horn and the workpiece surface was kept fixed at 200 µm using a precision control stage. Texturing effects were investigated for process times of 1, 3 and 5 s. Thereafter, the specimens were cleaned in an ultrasonic bath for 5 min to remove loose debris from the surface. The developed surfaces were characterized by optical and contact surface profilers. The optical microscope images show a texture of circular spots on the workpiece surface indented by the titanium spherical balls. Waviness patterns obtained from the contact surface profiler support the texturing effect produced by the proposed technique. Furthermore, water droplet tests were performed to show the efficacy of the proposed technique in developing hydrophilic surfaces and to quantify the texturing effect produced.
Keywords: surface texturing, surface modification, topography, ultrasonic
Procedia PDF Downloads 223
7531 Application of Biopolymer for Adsorption of Methylene Blue Dye from Simulated Effluent: A Green Method for Textile Industry Wastewater Treatment
Authors: Rabiya, Ramkrishna Sen
Abstract:
The textile industry releases huge volumes of effluent containing reactive dyes into nearby water bodies. These effluents are a significant source of water pollution, since most of the dyes are toxic in nature. Moreover, they scavenge the dissolved oxygen essential to aquatic species. Therefore, it is necessary to treat dye effluent before it is discharged into nearby water bodies. The present study focuses on removing the basic dye methylene blue from simulated wastewater using a biopolymer. The biopolymer was partially purified from a culture of Bacillus licheniformis by ultrafiltration. Based on the elution profile of the biopolymer from an ion exchange column, it was found to be a negatively charged molecule. Its net anionic nature allows the biopolymer to adsorb the positively charged methylene blue molecule. The major factors that influence the removal of the dye by the biopolymer, such as incubation time, pH and initial dye concentration, were evaluated. The methylene blue uptake by the biopolymer is higher (14.84 mg/g) near neutral pH than in acidic water (12.05 mg/g). At low pH, the lower dissociation of the dye molecule, as well as the lower negative charge available on the biopolymer, reduces the interaction between the biopolymer and the dye. The optimum incubation time for maximum removal of the dye was found to be 60 min. The entire study was done with 25 mL of dye solution in a 100 mL flask at 25 °C with a biopolymer dose of 11 g/L. To study the adsorption isotherm, the dye concentration was varied in the range of 25 mg/L to 205 mg/L, and the dye uptake by the biopolymer was plotted against the equilibrium concentration. The plot indicates that the adsorption of the dye by the biopolymer follows the Freundlich adsorption isotherm (R² = 0.99). Hence, these studies indicate the potential use of the biopolymer for the removal of basic dyes from textile wastewater in an eco-friendly and sustainable way.
Keywords: biopolymer, methylene blue dye, textile industry, wastewater
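The Freundlich fit mentioned above is typically obtained by linear least squares on log-transformed data, since q = Kf · Ce^(1/n) becomes log q = log Kf + (1/n) log Ce. A minimal sketch follows; the (Ce, q) pairs are synthetic (generated from Kf = 2, n = 2), not the study's measurements.

```python
import math

def fit_freundlich(ce, q):
    """Fit q = Kf * Ce**(1/n) by least squares on log10-transformed data.
    Returns (Kf, n)."""
    x = [math.log10(c) for c in ce]
    y = [math.log10(v) for v in q]
    m = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    slope = (m * sxy - sx * sy) / (m * sxx - sx * sx)  # = 1/n
    intercept = (sy - slope * sx) / m                  # = log10(Kf)
    return 10 ** intercept, 1 / slope

# Synthetic isotherm data over the study's concentration range (25-205 mg/L),
# generated from Kf = 2, n = 2, i.e. q = 2 * sqrt(Ce):
ce = [25, 55, 105, 155, 205]
q = [2 * math.sqrt(c) for c in ce]
kf, n = fit_freundlich(ce, q)
print(round(kf, 3), round(n, 3))  # 2.0 2.0
```

On real data the fit would not be exact; the R² of the log-log regression is what the abstract reports as 0.99.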
Procedia PDF Downloads 142
7530 Improved Network Construction Methods Based on Virtual Rails for Mobile Sensor Network
Authors: Noritaka Shigei, Kazuto Matsumoto, Yoshiki Nakashima, Hiromi Miyajima
Abstract:
Although mobile wireless sensor networks (MWSNs), which consist of mobile sensor nodes (MSNs), can cover a wide observation region with a small number of sensor nodes, they need to construct a network to collect the sensing data at the base station by moving the MSNs. As an effective method, the network construction method based on virtual rails (VRs), referred to as the VR method, has been proposed. In this paper, we propose two effective techniques for the VR method. They can prolong the operation time of the network, which is limited by the battery capacities and energy consumption of the MSNs. The first technique, an effective arrangement of VRs, almost equalizes the number of MSNs belonging to each VR. The second technique, an adaptive movement method for MSNs, takes into account the residual battery energy. In the simulation, we demonstrate that each technique can improve the network lifetime and that the combination of both techniques is the most effective.
Keywords: mobile sensor node, relay of sensing data, residual energy, virtual rail, wireless sensor network
Procedia PDF Downloads 331
7529 Simulation of Optimal Runoff Hydrograph Using Ensemble of Radar Rainfall and Blending of Runoffs Model
Authors: Myungjin Lee, Daegun Han, Jongsung Kim, Soojun Kim, Hung Soo Kim
Abstract:
Recently, localized heavy rainfall and typhoons have occurred frequently due to climate change, and the damage is growing. Therefore, more accurate prediction of rainfall and runoff may be needed. However, gauge rainfall has limited spatial accuracy. Radar rainfall is better than gauge rainfall at explaining the spatial variability of rainfall, but it is mostly underestimated and involves uncertainty. Therefore, an ensemble of radar rainfall was simulated using an error structure, together with gauge rainfall, to overcome this uncertainty. The simulated ensemble was used as input data for rainfall-runoff models to obtain an ensemble of runoff hydrographs. Previous studies have discussed the accuracy of rainfall-runoff models: even if the same input rainfall is used for runoff analysis in the same basin, different models can give different results because of the uncertainty involved in the models. Therefore, we used two models, the SSARR model, which is a lumped model, and the Vflo model, which is a distributed model, and tried to simulate the optimum runoff considering the uncertainty of each rainfall-runoff model. The study basin is located in the Han River basin, and we obtained one integrated, optimum runoff hydrograph using blending methods such as Multi-Model Super Ensemble (MMSE), Simple Model Average (SMA), and Mean Square Error (MSE). From this study, we could confirm the accuracy of the rainfall and rainfall-runoff models using ensemble scenarios and various rainfall-runoff models, and this result can be used to study flood control measures under climate change. Acknowledgements: This work is supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 18AWMP-B083066-05).
Keywords: radar rainfall ensemble, rainfall-runoff models, blending method, optimum runoff hydrograph
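The simplest of the blending methods named above, the Simple Model Average (SMA), is just the time-step-wise mean of the member hydrographs. A minimal sketch, with invented lumped- and distributed-model outputs standing in for SSARR and Vflo results:

```python
def simple_model_average(*hydrographs):
    """Blend equal-length runoff series by averaging each time step."""
    return [sum(vals) / len(vals) for vals in zip(*hydrographs)]

ssarr_like = [10.0, 40.0, 90.0, 60.0, 20.0]  # assumed lumped-model runoff (m3/s)
vflo_like  = [12.0, 50.0, 80.0, 50.0, 24.0]  # assumed distributed-model runoff
print(simple_model_average(ssarr_like, vflo_like))
# [11.0, 45.0, 85.0, 55.0, 22.0]
```

MMSE- and MSE-based blending replace the equal weights with weights derived from each member's skill against observations; the averaging structure stays the same.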
Procedia PDF Downloads 280
7528 An Intelligent Scheme Switching for MIMO Systems Using Fuzzy Logic Technique
Authors: Robert O. Abolade, Olumide O. Ajayi, Zacheaus K. Adeyemo, Solomon A. Adeniran
Abstract:
Link adaptation is an important strategy for achieving robust wireless multimedia communications based on quality of service (QoS) demand. Scheme switching in multiple-input multiple-output (MIMO) systems is an aspect of link adaptation that involves selecting among different MIMO transmission schemes or modes so as to adapt to varying radio channel conditions for the purpose of achieving QoS delivery. However, finding the most appropriate switching method in MIMO links is still a challenge, as existing methods are either computationally complex or not always accurate. This paper presents an intelligent switching method for a MIMO system consisting of two schemes, transmit diversity (TD) and spatial multiplexing (SM), using a fuzzy logic technique. In this method, two channel quality indicators (CQIs), namely the average received signal-to-noise ratio (RSNR) and the received signal strength indicator (RSSI), are measured and passed as inputs to the fuzzy logic system, which then gives a decision, an inference. The switching decision of the fuzzy logic system is fed back to the transmitter to switch between the TD and SM schemes. Simulation results show that the proposed fuzzy logic-based switching technique outperforms the conventional static switching technique in terms of bit error rate and spectral efficiency.
Keywords: channel quality indicator, fuzzy logic, link adaptation, MIMO, spatial multiplexing, transmit diversity
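An illustrative (not the authors') two-rule fuzzy inference over the same two CQIs can make the idea concrete: ramp membership functions grade RSNR and RSSI as "high", and a max-min rule base scores SM against TD. All breakpoints below are assumptions chosen for the sketch.

```python
def ramp_up(x, a, b):
    """Membership rising linearly from 0 at a to 1 at b."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def choose_scheme(rsnr_db, rssi_dbm):
    snr_high = ramp_up(rsnr_db, 5.0, 20.0)      # assumed RSNR breakpoints (dB)
    rssi_high = ramp_up(rssi_dbm, -90.0, -60.0)  # assumed RSSI breakpoints (dBm)
    # Rule 1: good SNR AND good RSSI -> spatial multiplexing (min = fuzzy AND)
    sm_score = min(snr_high, rssi_high)
    # Rule 2: poor SNR OR poor RSSI -> transmit diversity (max = fuzzy OR)
    td_score = max(1.0 - snr_high, 1.0 - rssi_high)
    return "SM" if sm_score > td_score else "TD"

print(choose_scheme(25.0, -55.0))  # SM: strong link favors multiplexing
print(choose_scheme(8.0, -85.0))   # TD: weak link favors diversity
```

A real design would use richer membership functions and a defuzzification stage, but the feedback loop, measure CQIs, infer, switch the transmitter's scheme, is the same as described above.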
Procedia PDF Downloads 155
7527 Multi-Vehicle Detection Using Histogram of Oriented Gradients Features and Adaptive Sliding Window Technique
Authors: Saumya Srivastava, Rina Maiti
Abstract:
In order to achieve better vehicle detection performance in a complex environment, we present an efficient approach for a multi-vehicle detection system using an adaptive sliding window technique. For a given frame, image segmentation is carried out to establish the region of interest. Gradient computation followed by thresholding, denoising, and morphological operations is performed to extract a binary search image. Near-region and far-region fields are defined to generate hypotheses using the adaptive sliding window technique on the resultant binary search image. For each vehicle candidate, features are extracted using a histogram of oriented gradients, and a pre-trained support vector machine is applied for hypothesis verification. Later, a Kalman filter is used for tracking the vanishing point. The experimental results show that the method is robust and effective on various roads and in various driving scenarios. The algorithm was tested on highways and urban roads in India.
Keywords: gradient, vehicle detection, histograms of oriented gradients, support vector machine
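The adaptive part of the window generation can be sketched as scaling the search window with image row: far-region rows (near the top of the region of interest) get small windows, near-region rows get large ones. The scaling constants and linear growth law below are assumptions for illustration, not the paper's parameters; (x, y) is the window's top-left corner and clipping at the image bottom is omitted for brevity.

```python
def adaptive_windows(img_h, img_w, y_horizon, min_w=16, max_w=128, step_frac=0.5):
    """Generate (x, y, w, h) search windows whose size grows linearly
    from min_w at the horizon row to max_w at the image bottom."""
    windows = []
    y = y_horizon
    while y < img_h:
        frac = (y - y_horizon) / (img_h - y_horizon)
        w = int(min_w + frac * (max_w - min_w))
        step = max(1, int(w * step_frac))  # stride proportional to window size
        for x in range(0, img_w - w + 1, step):
            windows.append((x, y, w, w))
        y += step
    return windows

wins = adaptive_windows(img_h=240, img_w=320, y_horizon=100)
print(len(wins), wins[0], wins[-1])
```

Each generated window would then be fed to the HOG feature extractor and SVM verifier described in the abstract.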
Procedia PDF Downloads 124
7526 Perceived Restorativeness Scale-6: A Short Version of the Perceived Restorativeness Scale for Mixed (or Mobile) Devices
Authors: Sara Gallo, Margherita Pasini, Margherita Brondino, Daniela Raccanello, Roberto Burro, Elisa Menardo
Abstract:
Most studies on the ability of environments to restore people's cognitive resources have been conducted in the laboratory using simulated environments (e.g., photographs, videos, or virtual reality), based on the implicit assumption that exposure to simulated environments has the same effects as exposure to real environments. However, the technical characteristics of simulated environments, such as the dynamic or static character of the stimulus, critically affect their perception. Measuring perceived restorativeness in situ rather than in the laboratory could increase the validity of the obtained measurements. Personal mobile devices can be useful because they give immediate access to online surveys while people are directly exposed to an environment. At the same time, it becomes important to develop short and reliable measuring instruments that allow a quick assessment of the restorative qualities of environments. One of the frequently used self-report measures of perceived restorativeness is the Perceived Restorativeness Scale (PRS), based on Attention Restoration Theory. Many different versions have been proposed and used according to different research purposes and needs, without studying their validity. This longitudinal study reports some preliminary validation analyses of a short version of the original scale, the PRS-6, developed to be quick and mobile-friendly. It is composed of 6 items assessing fascination and being-away. 102 Italian university students participated in the study, 84% female, with age ranging from 18 to 47 (M = 20.7; SD = 2.9). Data were obtained through an online survey that asked them to report the perceived restorativeness of the environment they were in (and the kind of environment) and their positive emotions (Positive and Negative Affect Schedule, PANAS) once a day for seven days. Cronbach's alpha and item-total correlations were used to assess reliability and internal consistency. Confirmatory factor analysis (CFA) models were run to study the factorial structure (construct validity). Correlation analyses between PRS and PANAS scores were used to check discriminant validity. Finally, multigroup CFA models were used to study measurement invariance (configural, metric, scalar, strict) between mobile devices and between days of assessment. On the whole, the PRS-6 showed good psychometric properties, similar to those of the original scale, and invariance across devices and days. These results suggest that the PRS-6 could be a valid alternative for assessing perceived restorativeness when researchers need a brief and immediate evaluation of the restorative quality of an environment.
Keywords: restorativeness, validation, short scale development, psychometric properties
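The reliability statistic used above, Cronbach's alpha for a k-item scale, can be computed directly from a respondents × items score matrix. A self-contained sketch; the tiny 4-respondent, 3-item data set is invented for illustration (the PRS-6 itself has 6 items).

```python
def variance(xs):
    """Unbiased sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(scores[0])  # number of items
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Rows = respondents, columns = items (illustrative 5-point ratings).
scores = [
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
]
print(round(cronbach_alpha(scores), 3))  # 0.962
```

Values near or above 0.7-0.8 are conventionally read as acceptable internal consistency, which is the property the PRS-6 validation checks.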
Procedia PDF Downloads 254
7525 Monte Carlo Simulation Study on Improving the Flattening Filter-Free Radiotherapy Beam Quality Using Filters from Low-Z Material
Authors: H. M. Alfrihidi, H.A. Albarakaty
Abstract:
Flattening filter-free (FFF) photon beam radiotherapy has increased in the last decade, enabled by advancements in treatment planning systems and radiation delivery techniques such as multi-leaf collimators. FFF beams have higher dose rates, which reduces treatment time. On the other hand, FFF beams have a higher surface dose, due to the loss of the beam hardening effect provided by the flattening filter (FF). The possibility of improving FFF beam quality using filters of low-Z materials such as steel and aluminium (Al) was investigated using Monte Carlo (MC) simulations. The attenuation coefficient of low-Z materials is higher for low-energy photons than for high-energy photons, which leads to hardening of the FFF beam and, consequently, a reduction in the surface dose. The BEAMnrc user code, based on the Electron Gamma Shower (EGSnrc) MC code, was used to simulate the beam of a 6 MV TrueBeam linac. A phase-space file provided by Varian Medical Systems was used as the radiation source in the simulation. This phase-space file was scored just above the jaws, at 27.88 cm from the target. The linac from the jaws downward was constructed, and the radiation passing through was simulated and scored at 100 cm from the target. To study the effect of low-Z filters, steel and Al filters with a thickness of 1 cm were added below the jaws, and the phase-space file was scored at 100 cm from the target. For comparison, the FF beam was simulated using a similar setup. The BEAM Data Processor (BEAMdp) was used to analyse the energy spectra in the phase-space files. Then, the dose distribution resulting from these beams was simulated in a homogeneous water phantom using DOSXYZnrc. The dose profile was evaluated in terms of the surface dose, the lateral dose distribution, and the percentage depth dose (PDD). The energy spectra of the beams show that the FFF beam is softer than the FF beam: the energy peaks for the FFF and FF beams are 0.525 MeV and 1.52 MeV, respectively. The FFF beam's energy peak shifts to 1.1 MeV with a steel filter, while the Al filter does not affect the peak position. The steel and Al filters reduced the surface dose by 5% and 1.7%, respectively. The dose at a depth of 10 cm (D10) rises by around 2% and 0.5% with the steel and Al filter, respectively. On the other hand, the steel and Al filters reduce the dose rate of the FFF beam by 34% and 14%, respectively; however, this effect on the dose rate is much smaller than that of the tungsten FF, which reduces the dose rate by about 60%. In conclusion, filters of low-Z material decrease the surface dose and increase the D10 dose, allowing high-dose delivery to deep tumors with a low skin dose. Although these filters affect the dose rate, the effect is much lower than that of the FF.
Keywords: flattening filter free, Monte Carlo, radiotherapy, surface dose
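The beam-hardening argument above follows from Beer-Lambert attenuation: because the attenuation coefficient mu is larger at low photon energy, a 1 cm filter removes proportionally more of the soft component of the spectrum. A toy illustration; the mu values are invented for the sketch, not tabulated (e.g. NIST) data for steel or aluminium.

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Beer-Lambert law: I/I0 = exp(-mu * x)."""
    return math.exp(-mu_per_cm * thickness_cm)

mu_low, mu_high = 1.2, 0.45  # assumed mu (1/cm) at ~0.5 MeV vs ~1.5 MeV
t = 1.0                      # filter thickness (cm), as in the study

low = transmitted_fraction(mu_low, t)    # surviving soft component
high = transmitted_fraction(mu_high, t)  # surviving hard component
print(round(low, 3), round(high, 3))     # 0.301 0.638
print(round(high / low, 2))              # 2.12: spectrum shifts toward high energy
```

The soft component is suppressed roughly twice as strongly as the hard one in this toy case, which is the mechanism behind the shift of the FFF energy peak and the reduced surface dose reported above.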
Procedia PDF Downloads 73
7524 CFD Simulation Approach for Developing New Powder Dispensing Device
Authors: Revanth Rallapalli
Abstract:
Manually dispensing powders can be difficult, as it requires gradually pouring and checking the amount on the scale to be dispensed. Current systems are manual and non-continuous in nature, are user-dependent, and make it difficult to control powder dispensation. Repeated dosing of powdered medicines in precise amounts, quickly and accurately, has been an all-time challenge. Various new powder dispensing mechanisms are being designed to overcome these challenges, and a battery-operated screw conveyor mechanism is being developed to overcome the problems described above. Such inventions are numerically evaluated at the concept development stage by employing computational fluid dynamics (CFD) of gas-solid multiphase flow systems. CFD has been very helpful in the development of such devices, saving time and money by reducing the number of prototypes and tests. This paper describes a simulation of powder dispensation from the trocar's end, in which the powder is treated as a secondary flow in air and simulated using the technique called the Dense Discrete Phase Model incorporated with the Kinetic Theory of Granular Flow (DDPM-KTGF). Taking the volume fraction of powder as 50%, the transportation of powder from the inlet side to the trocar's end is driven by the rotation of the screw conveyor. The performance is calculated for a 1 s time frame in an unsteady (transient) computation. This methodology will help designers develop design concepts that improve dispensation and the effective area within a quick turnaround time.
Keywords: multiphase flow, screw conveyor, transient, dense discrete phase model (DDPM), kinetic theory of granular flow (KTGF)
Procedia PDF Downloads 147
7523 Unsteady Flow Simulations for Microchannel Design and Its Fabrication for Nanoparticle Synthesis
Authors: Mrinalini Amritkar, Disha Patil, Swapna Kulkarni, Sukratu Barve, Suresh Gosavi
Abstract:
Micro-mixers play an important role in lab-on-a-chip applications and micro total analysis systems, where the correct level of mixing must be achieved for any given process. The mixing process can be classified as active or passive according to the use of external energy. The microfluidics literature reports that most work has been done on models of steady laminar flow; however, the study of unsteady laminar flow is currently an active area of research. Among its wide range of applications, we consider nanoparticle synthesis in micro-mixers. In this work, we have developed an unsteady flow model to study the mixing performance of a passive micro-mixer for the reactants used in such synthesis. The model is developed in the finite volume method (FVM)-based software OpenFOAM and is tested by carrying out simulations at a Reynolds number of 0.5. The mixing performance of the micro-mixer is investigated using simulated concentration values of the mixed species across the width of the micro-mixer and calculating the variance across a line profile. Experimental validation is done by passing dyes through a Y-shaped micro-mixer fabricated from polydimethylsiloxane (PDMS) polymer and comparing the variances with the simulated ones. Gold nanoparticles are then synthesized in the micro-mixer and collected at two different times, leading to significantly different size distributions. These times match the time scales over which the reactant concentrations vary, as obtained from the simulations. Our simulations could thus be used to create design aids for passive micro-mixers used in nanoparticle synthesis.
Keywords: lab-on-a-chip (LOC), micro-mixer, OpenFOAM, PDMS
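The mixing metric described above, the variance of species concentration sampled across a line profile of the channel width, is easy to make concrete: a perfectly mixed cross-section gives variance 0, and two fully segregated streams give the maximum. The sample values below are illustrative, not simulation output.

```python
def mixing_variance(concentrations):
    """Population variance of concentration samples across the channel width;
    0 means perfectly mixed."""
    m = sum(concentrations) / len(concentrations)
    return sum((c - m) ** 2 for c in concentrations) / len(concentrations)

unmixed = [1.0, 1.0, 1.0, 0.0, 0.0, 0.0]      # inlet: two segregated streams
mixed = [0.5, 0.52, 0.49, 0.5, 0.51, 0.48]    # near the outlet, almost uniform
print(round(mixing_variance(unmixed), 4))     # 0.25
print(round(mixing_variance(mixed), 6))
```

Comparing such variances between simulated concentration profiles and dye measurements is exactly the validation step the abstract describes.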
Procedia PDF Downloads 163
7522 Simulation and Experimental Study on Dual Dense Medium Fluidization Features of Air Dense Medium Fluidized Bed
Authors: Cheng Sheng, Yuemin Zhao, Chenlong Duan
Abstract:
The air dense medium fluidized bed is a typical application of fluidization techniques for coal particle separation in arid areas, where it is costly to implement wet coal preparation technologies. In the last three decades, the air dense medium fluidized bed, as an efficient dry coal separation technique, has been studied in many respects, including energy and mass transfer, hydrodynamics, and bubbling behavior. Although numerous studies have been published, the fluidization features, especially dual dense medium fluidization features, have rarely been reported. In dual dense medium fluidized beds, different combinations of dense mediums play a significant role in fluidization quality variation, thus influencing coal separation efficiency. Moreover, to what extent different dense mediums mix, and to what extent the two-component particulate mixture affects fluidization performance and quality, have remained open questions. The proposed work attempts to reveal the underlying mechanisms of the generation and evolution of the two-component particulate mixture in the fluidization process. Based on computational fluid dynamics methods and discrete particle modelling, the movement and evolution of dual dense mediums in an air dense medium fluidized bed have been simulated. Dual dense medium fluidization experiments have been conducted, and electrical capacitance tomography was employed to investigate the distribution of the two-component mixture in the experiments. The underlying mechanisms of two-component particulate fluidization are expected to be demonstrated through the analysis and comparison of the simulation and experimental results.
Keywords: air dense medium fluidized bed, particle separation, computational fluid dynamics, discrete particle modelling
Procedia PDF Downloads 383
7521 Large Eddy Simulation of Hydrogen Deflagration in Open Space and Vented Enclosure
Authors: T. Nozu, K. Hibi, T. Nishiie
Abstract:
This paper discusses the applicability of a numerical model as a damage prediction method for accidental hydrogen explosions occurring at a hydrogen facility. The numerical model is based on the unstructured finite volume method (FVM) code “NuFD/FrontFlowRed”. To simulate the unsteady turbulent combustion of leaked hydrogen gas, Large Eddy Simulation (LES) is combined with a combustion model based on a two-scalar flamelet approach, in which a G-equation model and a conserved scalar model express the propagation of the premixed flame surface and the diffusion combustion process, respectively. To validate this numerical model, we have simulated two previous hydrogen explosion tests. The first is an open-space explosion test, in which the source was a prismatic 5.27 m3 volume containing a 30% hydrogen-air mixture; a reinforced concrete wall was set 4 m from the front surface of the source, which was ignited at the bottom center by a spark. The other is a vented-enclosure explosion test, in which the chamber was 4.6 m × 4.6 m × 3.0 m with a 5.4 m2 vent opening on one side; the test was performed with ignition at the center of the wall opposite the vent, using hydrogen-air mixtures with hydrogen concentrations close to 18 vol%. The results of the numerical simulations are compared with the previous experimental data to assess the accuracy of the numerical model, and we have verified that the simulated overpressures and flame time-of-arrival data are in good agreement with the results of the two explosion tests.
Keywords: deflagration, large eddy simulation, turbulent combustion, vented enclosure
Procedia PDF Downloads 245
7520 Use of Simulation in Medical Education: Role and Challenges
Authors: Raneem Osama Salem, Ayesha Nuzhat, Fatimah Nasser Al Shehri, Nasser Al Hamdan
Abstract:
Background: Recently, most medical schools around the globe have been using simulation for teaching and for assessing students’ clinical skills and competence. There are many obstacles that can face students and faculty when simulation sessions are introduced into an undergraduate curriculum. Objective: The aim of this study is to obtain the opinions of undergraduate medical students and our faculty regarding the role of simulation in the undergraduate curriculum, the simulation modalities used, and the perceived barriers to implementing simulation sessions. Methods: To address the role of simulation, the modalities used, and the perceived challenges to implementation of simulation sessions, a self-administered, pilot-tested questionnaire with 18 items using a 5-point Likert scale was distributed. Participants included undergraduate male medical students (n=125) and female students (n=70) as well as faculty members (n=14). Results: Various learning outcomes are achieved and improved through the technology-enhanced simulation sessions, such as communication skills, diagnostic skills, procedural skills, self-confidence, and integration of basic and clinical sciences. The use of high-fidelity simulators, simulated patients, and task trainers was preferred by our students and faculty for teaching and learning as well as for evaluation. According to most of the students, institutional support in terms of resources, staff, and duration of sessions was adequate; however, motivation to participate in the sessions and provision of adequate feedback by the staff were constraints. Conclusion: The use of a simulation laboratory is of great benefit to students and is a valuable teaching tool for staff to ensure that students learn the various skills.
Keywords: simulators, medical students, skills, simulated patients, performance, challenges, skill laboratory
Procedia PDF Downloads 409
7519 Computer Aided Diagnostic System for Detection and Classification of a Brain Tumor through MRI Using Level Set Based Segmentation Technique and ANN Classifier
Authors: Atanu K Samanta, Asim Ali Khan
Abstract:
Due to the acquisition of huge numbers of brain tumor magnetic resonance images (MRI) in clinics, it is very difficult for radiologists to manually interpret and segment these images within a reasonable span of time. Computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of radiologists and reduce the time required for accurate diagnosis. An intelligent computer-aided technique for automatic detection of brain tumors in MRI is presented in this paper. The technique uses the following computational methods: the level set method for segmentation of the brain tumor from other brain parts, the gray-level co-occurrence matrix (GLCM) for extraction of features from the segmented tumor region, and an Artificial Neural Network (ANN) to classify brain tumor images according to their respective types. The entire work was carried out on 50 images covering five types of brain tumor. The overall classification accuracy of this method is 98%, which is significantly good.
Keywords: brain tumor, computer-aided diagnostic (CAD) system, gray-level co-occurrence matrix (GLCM), tumor segmentation, level set method
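The GLCM step can be sketched directly in NumPy. The minimal version below builds a horizontal, distance-1 co-occurrence matrix and computes two standard Haralick-style features (contrast and energy); the tiny test images, quantization level, and single offset are illustrative simplifications of a full GLCM pipeline.

```python
import numpy as np

def glcm_features(image, levels=8):
    """Gray Level Co-occurrence Matrix (horizontal neighbor, distance 1)
    and two Haralick-style features: contrast and energy."""
    img = np.asarray(image)
    glcm = np.zeros((levels, levels), dtype=float)
    for i, j in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[i, j] += 1                      # count co-occurring gray pairs
    glcm /= glcm.sum()                       # normalise to joint probabilities
    idx = np.arange(levels)
    di, dj = np.meshgrid(idx, idx, indexing="ij")
    contrast = ((di - dj) ** 2 * glcm).sum()  # local intensity variation
    energy = (glcm ** 2).sum()                # textural uniformity
    return float(contrast), float(energy)

# Constant image: a single co-occurrence cell -> zero contrast, energy 1
flat = np.zeros((4, 4), dtype=int)
# Checkerboard: every horizontal pair differs by 1 -> contrast 1
checker = np.indices((4, 4)).sum(axis=0) % 2
print(glcm_features(flat, levels=4))     # (0.0, 1.0)
print(glcm_features(checker, levels=4))  # (1.0, 0.5)
```

In the CAD pipeline, features like these (computed on the segmented tumor region) would form the input vector fed to the ANN classifier.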
Procedia PDF Downloads 514
7518 An Improved Particle Swarm Optimization Technique for Combined Economic and Environmental Power Dispatch Including Valve Point Loading Effects
Authors: Badr M. Alshammari, T. Guesmi
Abstract:
In recent years, combined economic and emission power dispatch has become one of the main problems in electrical power systems. It aims to schedule the power generation of the generating units in order to minimize the production cost and the emission of harmful gases, such as CO, CO2, NOx, and SO2, caused by fossil-fueled thermal units. To solve this complicated multi-objective problem, an improved version of the particle swarm optimization technique that includes a non-dominated sorting concept has been proposed. Valve point loading effects and system losses have been considered. Three-unit and ten-unit benchmark systems have been used to show the effectiveness of the suggested optimization technique for solving this kind of nonconvex problem. The simulation results have been compared with those obtained using a genetic algorithm based method. The comparison shows that the proposed approach can provide higher-quality solutions with better performance.
Keywords: power dispatch, valve point loading effects, multiobjective optimization, Pareto solutions
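The two objectives being traded off can be sketched as follows: a fuel cost with the rectified-sine valve-point term (which is what makes the problem nonconvex) and a quadratic emission characteristic, evaluated for one candidate dispatch. All coefficients and the dispatch values are illustrative placeholders, not the benchmark-system data.

```python
import math

def unit_cost(p, a, b, c, e, f, p_min):
    # Fuel cost ($/h); the rectified-sine term models valve point
    # loading and makes the cost curve nonconvex
    return a + b * p + c * p ** 2 + abs(e * math.sin(f * (p_min - p)))

def unit_emission(p, alpha, beta, gamma):
    # Emission (kg/h) modelled as a quadratic in the unit output
    return alpha + beta * p + gamma * p ** 2

# One candidate dispatch (MW) for a three-unit system with a 300 MW demand
fuel = [(500, 5.3, 0.004, 50, 0.063, 50),
        (400, 5.5, 0.006, 40, 0.098, 20),
        (200, 5.8, 0.009, 30, 0.074, 30)]
emis = [(13.86, 0.33, 0.0042)] * 3
dispatch = [150.0, 100.0, 50.0]

total_cost = sum(unit_cost(p, *c) for p, c in zip(dispatch, fuel))
total_emission = sum(unit_emission(p, *e) for p, e in zip(dispatch, emis))
assert abs(sum(dispatch) - 300.0) < 1e-9  # power balance (losses neglected)
print(round(total_cost, 1), round(total_emission, 1))
```

In the multi-objective PSO, each particle encodes such a dispatch vector and the non-dominated sorting step ranks particles by these two objective values rather than by a single weighted sum.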
Procedia PDF Downloads 275
7517 Analysis of Different Classification Techniques Using WEKA for Diabetic Disease
Authors: Usama Ahmed
Abstract:
Data mining is the process of analyzing data in order to extract useful, predictive information. It is a field of research that addresses various types of problems. Within data mining, classification is an important technique for categorizing different kinds of data. Diabetes is one of the most common diseases. This paper applies different classification techniques to a diabetes dataset using the Waikato Environment for Knowledge Analysis (WEKA) and determines which algorithm is most suitable. The best-performing classification algorithm on the diabetes data is Naïve Bayes, with an accuracy of 76.31% and a model-building time of 0.06 seconds.
Keywords: data mining, classification, diabetes, WEKA
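WEKA's NaiveBayes classifier for numeric attributes fits a per-class Gaussian to each feature; the same idea can be sketched in a few lines of Python. The toy "glucose, BMI" rows below are invented for illustration and are not the diabetes dataset used in the paper.

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian Naive Bayes: per-class feature means/variances
    plus class priors, combined via log posteriors at prediction time."""
    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.classes = np.unique(y)
        self.stats = {c: (X[y == c].mean(0),            # feature means
                          X[y == c].var(0) + 1e-9,      # variances (smoothed)
                          (y == c).mean())              # class prior
                      for c in self.classes}
        return self

    def predict(self, X):
        X = np.asarray(X, float)
        def log_post(c):
            mu, var, prior = self.stats[c]
            ll = -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var)
            return ll.sum(1) + np.log(prior)
        scores = np.stack([log_post(c) for c in self.classes])
        return self.classes[scores.argmax(0)]

# Toy data: class 1 rows mimic a diabetic profile, class 0 a healthy one
X = [[85, 22], [90, 24], [95, 23], [160, 33], [155, 35], [170, 31]]
y = [0, 0, 0, 1, 1, 1]
model = GaussianNB().fit(X, y)
print(model.predict([[88, 23], [165, 34]]))  # -> [0 1]
```

Accuracy on a held-out split of such predictions is the figure WEKA reports (76.31% for Naïve Bayes on the paper's dataset).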
Procedia PDF Downloads 147
7516 Using Mind Map Technique to Enhance Medical Vocabulary Retention for the First Year Nursing Students at a Higher Education Institution
Authors: Nguyen Quynh Trang, Nguyễn Thị Hông Nhung
Abstract:
The study aimed to identify the effectiveness of using the mind map technique to enhance students’ medical vocabulary retention among a group of students at a higher education institution, Thai Nguyen University of Medicine and Pharmacy, during the first semester of the 2022-2023 school year. The research employed a quasi-experimental method, drawing on primary sources such as questionnaires and the analyzed results of pre- and post-tests. Most teachers and students showed a strong preference for the mind map technique in language teaching and learning. Furthermore, results from the pre- and post-tests of the experimental group and the control group indicated that this technique improved academic performance in teaching and learning English. The research findings suggest that more supportive policies are needed to encourage the use of the mind map technique in pedagogical contexts. Aim of the study: The purpose of this research was to investigate whether mind mapping can enhance nursing students’ medical vocabulary retention, and to assess students’ attitudes toward using mind mapping as a tool to improve their vocabulary. Methodology: The research employed a quasi-experimental method, using questionnaires and the analyzed results of pre- and post-tests. Contribution of the study: The research contributed to the innovation of vocabulary teaching methods for English teachers at a higher education institution. Moreover, it can help English teachers and university administrators build and maintain students' motivation, not only in English classes but also in other subjects. The findings are beneficial to teachers, students, and researchers interested in using mind mapping to teach and learn English vocabulary. The research explored and demonstrated the effectiveness of applying mind mapping in teaching and learning English vocabulary; teaching and learning activities were therefore conducted more effectively, helping students overcome challenges in remembering vocabulary and building motivation to learn English vocabulary.
Keywords: medical vocabulary retention, mind map technique, nursing students, medical vocabulary
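A quasi-experimental pre/post design of this kind is typically analyzed by comparing the score gains of the experimental and control groups, for example with Welch's t statistic. The sketch below uses invented gain scores, not the study's data.

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples with
    possibly unequal variances."""
    def mean_var(x):
        m = sum(x) / len(x)
        v = sum((xi - m) ** 2 for xi in x) / (len(x) - 1)  # sample variance
        return m, v
    ma, va = mean_var(a)
    mb, vb = mean_var(b)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical vocabulary-gain scores (post-test minus pre-test)
experimental = [8, 9, 7, 10, 9, 8, 11, 9]   # taught with mind maps
control = [5, 6, 4, 7, 5, 6, 5, 6]          # taught conventionally
t = welch_t(experimental, control)
print(round(t, 2))  # ~6.15, well above typical critical values
```

A t value this large would indicate a statistically significant advantage for the mind-map group, which is the form of evidence the study's pre/post comparison rests on.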
Procedia PDF Downloads 77
7515 Experimental Implementation of Model Predictive Control for Permanent Magnet Synchronous Motor
Authors: Abdelsalam A. Ahmed
Abstract:
A fast speed response is a crucial performance requirement for Permanent Magnet Synchronous Motor (PMSM) drives in electric traction systems. In this paper, a PMSM is driven with a Model-based Predictive Control (MPC) technique. Fast speed tracking is achieved by optimizing the utilization of the DC source using MPC; the technique is based on predicting the optimum voltage vector to apply to the driver. The control technique is evaluated by comparison with cascaded PI control based on Space Vector Pulse Width Modulation (SVPWM). MPC and SVPWM-based field-oriented control (FOC) are implemented with the TMS320F2812 DSP and its power driver circuits. The designed MPC for the PMSM drive is experimentally validated on a laboratory test bench, and its performance is compared with that of a conventional PI-based system in order to highlight the improvements, especially regarding the speed tracking response.
Keywords: permanent magnet synchronous motor, model-based predictive control, DC source utilization, cascaded PI control, space vector pulse width modulation, TMS320F2812 DSP
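The core of a finite-control-set MPC loop of this kind, selecting the optimum inverter voltage vector, can be sketched as below. The one-step Euler prediction deliberately neglects cross-coupling and back-EMF, and all machine parameters are illustrative; the paper's controller runs this kind of search in real time on a TMS320F2812 DSP.

```python
import math

# Evaluate each of the 8 two-level inverter voltage vectors in a one-step
# current prediction and apply the one minimising the tracking-error cost.
R, L, Ts, Vdc = 0.5, 0.002, 5e-5, 300.0   # ohm, H, s, V (illustrative)

def voltage_vector(n):
    """Alpha-beta components of inverter vector n (0..7); 0 and 7 are
    the zero vectors."""
    if n in (0, 7):
        return 0.0, 0.0
    angle = (n - 1) * math.pi / 3
    mag = 2.0 / 3.0 * Vdc
    return mag * math.cos(angle), mag * math.sin(angle)

def predict(i_ab, v_ab):
    # One-step forward-Euler prediction of the stator current
    return tuple(i + Ts / L * (v - R * i) for i, v in zip(i_ab, v_ab))

def best_vector(i_ab, i_ref):
    costs = []
    for n in range(8):
        ia, ib = predict(i_ab, voltage_vector(n))
        costs.append(((i_ref[0] - ia) ** 2 + (i_ref[1] - ib) ** 2, n))
    return min(costs)[1]

# Current error along +alpha: the vector aligned with alpha (n=1) wins
print(best_vector(i_ab=(0.0, 0.0), i_ref=(5.0, 0.0)))  # -> 1
```

Because the optimum vector is applied directly, no modulator (SVPWM) stage is needed, which is one reason MPC can respond faster than the cascaded PI scheme it is compared against.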
Procedia PDF Downloads 645
7514 Encapsulation of Probiotic Bacteria in Complex Coacervates
Authors: L. A. Bosnea, T. Moschakis, C. Biliaderis
Abstract:
Two probiotic strains of Lactobacillus paracasei subsp. paracasei (E6) and Lactobacillus paraplantarum (B1), isolated from traditional Greek dairy products, were microencapsulated by complex coacervation using whey protein isolate (WPI, 3% w/v) and gum arabic (GA, 3% w/v) solutions mixed at different polymer ratios (1:1, 2:1 and 4:1). The effect of total biopolymer concentration on cell viability was assessed using WPI and GA solutions of 1, 3 and 6% w/v at a constant ratio of 2:1. Several parameters were also examined to optimize microcapsule formation, such as the inoculum concentration and the effect of ionic strength. The viability of the bacterial cells during heat treatment and under simulated gut conditions was also evaluated. Among the different WPI/GA weight ratios tested (1:1, 2:1, and 4:1), the highest survival rate was observed for the coacervate structures made with the 2:1 ratio. The protection efficiency at low pH values is influenced by both the concentration and the ratio of the added biopolymers. Moreover, the inoculum concentration appears to affect the efficiency of the microcapsules in entrapping the bacterial cells, since an optimum level was noted below 8 log cfu/ml. Generally, entrapment of lactobacilli in the complex coacervate structure enhanced the viability of the microorganisms when exposed to a low-pH environment (pH 2.0). Both encapsulated strains retained high viability in simulated gastric juice (>73%), especially in comparison with non-encapsulated (free) cells (<19%). The encapsulated lactobacilli also exhibited enhanced viability after 10–30 min of heat treatment (65°C) as well as at different NaCl concentrations (pH 4.0). Overall, the results of this study suggest that complex coacervation with WPI/GA has the potential to deliver live probiotics in low-pH food systems and fermented dairy products; the complexes can dissolve at pH 7.0 (gut environment), releasing the microbial cells.
Keywords: probiotic, complex coacervation, whey, encapsulation
Procedia PDF Downloads 298
7513 Optimization of Multi Commodities Consumer Supply Chain: Part 1-Modelling
Authors: Zeinab Haji Abolhasani, Romeo Marian, Lee Luong
Abstract:
This paper and its companions (Part II, Part III) concentrate on optimizing a class of supply chain problems known as the Multi-Commodities Consumer Supply Chain (MCCSC) problem. The MCCSC problem belongs to the production-distribution (P-D) planning category. It aims to determine facility locations, consumer allocations, and facility configurations so as to minimize the total cost (CT) of the entire network. These facilities can be, but are not limited to, manufacturer units (MUs), distribution centres (DCs), and retailers/end-users (REs). To address this problem, three major tasks are undertaken. First, a mixed-integer non-linear programming (MINLP) mathematical model is developed. Then, the system's behavior under different conditions is observed using a simulation modeling tool. Finally, the most optimum solution (minimum CT) of the system is obtained using a multi-objective optimization technique. Due to the large size of the problem and the uncertainties involved in finding the most optimum solution, an integration of modeling and simulation methodologies is proposed, followed by the development of a new approach known as GASG: a genetic algorithm based on granular simulation, which is the subject of the methodology of this research. In Part II, the MCCSC is simulated using discrete-event simulation (DES) within an integrated SimEvents and Simulink environment of the MATLAB® software package, followed by a comprehensive case study to examine the given strategy. The effect of the genetic operators on the optimal/near-optimal solution obtained by the simulation model is discussed in Part III.
Keywords: supply chain, genetic algorithm, optimization, simulation, discrete event system
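The genetic-algorithm layer of such an approach can be sketched on a deliberately tiny facility-location subproblem: a binary chromosome encodes which DCs are open, and the fitness is a CT proxy (opening costs plus cheapest-open-DC assignment costs). This is a plain GA illustration with invented cost tables, not the paper's GASG method.

```python
import random

random.seed(42)

FIXED = [50, 60, 45, 70]            # opening cost of each candidate DC
ASSIGN = [[10, 30, 25, 40],         # ASSIGN[i][j]: cost of serving
          [28, 12, 30, 22],         # consumer i from DC j
          [35, 25, 10, 30],
          [40, 20, 28, 12],
          [15, 33, 22, 27]]

def total_cost(genes):
    """CT proxy: DC opening costs plus cheapest-open-DC assignment costs."""
    if not any(genes):
        return float("inf")          # at least one DC must be open
    open_cost = sum(f for f, g in zip(FIXED, genes) if g)
    serve = sum(min(row[j] for j in range(4) if genes[j]) for row in ASSIGN)
    return open_cost + serve

def ga(pop_size=30, generations=60, pm=0.1):
    pop = [[random.randint(0, 1) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_cost)           # elitist selection
        elite = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randint(1, 3)     # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([1 - g if random.random() < pm else g
                             for g in child])  # bit-flip mutation
        pop = elite + children
    best = min(pop, key=total_cost)
    return best, total_cost(best)

genes, cost = ga()
print(genes, cost)
```

In the full MCCSC setting, the chromosome would also encode allocations and configurations, and the fitness evaluation would be delegated to the discrete-event simulation described in Part II rather than a closed-form cost table.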
Procedia PDF Downloads 317
7512 Model-Based Fault Diagnosis in Carbon Fiber Reinforced Composites Using Particle Filtering
Abstract:
Carbon fiber reinforced polymer (CFRP) composites used in aircraft structures are subject to lightning strikes, putting structural integrity at risk. Indirect damage may occur after a lightning strike: the internal structure can be damaged by the excessive heat induced by the lightning current while the surface of the structure remains intact. Three damage modes may be observed after a lightning strike: fiber breakage, inter-ply delamination, and intra-ply cracks. The assessment of internal damage states in composites is challenging due to the complicated microstructure, inherent uncertainties, and the existence of multiple damage modes. In this work, a model-based approach is adopted to diagnose faults in carbon composites after lightning strikes. A resistor network model is implemented to relate the overall electrical and thermal conduction behavior under a simulated lightning current waveform to the intrinsic temperature-dependent material properties, microstructure, and degradation of the materials. A fault detection and identification (FDI) module utilizes the physics-based model and a particle filtering algorithm to identify the damage mode as well as to calculate the probability of structural failure. Extensive simulation results are provided to substantiate the proposed fault diagnosis methodology for both single-fault and multiple-fault cases. The approach is also demonstrated on transient resistance data collected from an IM7/Epoxy laminate under a simulated lightning strike.
Keywords: carbon composite, fault detection, fault identification, particle filter
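The particle filtering step can be sketched on a one-parameter stand-in: estimating an unknown laminate resistance (a proxy for the damage state) from noisy measurements with a bootstrap filter. The resistance value, noise level, and static model are illustrative; the paper's filter runs over the full resistor-network model and multiple damage modes.

```python
import random
import math

random.seed(0)

TRUE_R = 5.0      # "damaged" resistance (ohms) to be identified
NOISE = 0.3       # measurement noise standard deviation
N = 2000          # number of particles

def measure():
    return TRUE_R + random.gauss(0.0, NOISE)

def likelihood(z, r):
    # Gaussian measurement likelihood (unnormalised)
    return math.exp(-0.5 * ((z - r) / NOISE) ** 2)

particles = [random.uniform(0.0, 10.0) for _ in range(N)]  # prior over R
for _ in range(25):                    # assimilate 25 measurements
    z = measure()
    w = [likelihood(z, r) for r in particles]
    total = sum(w)
    w = [wi / total for wi in w]
    # resample in proportion to the weights
    particles = random.choices(particles, weights=w, k=N)
    # small jitter to avoid sample impoverishment
    particles = [r + random.gauss(0.0, 0.02) for r in particles]

estimate = sum(particles) / N
print(round(estimate, 2))  # close to 5.0
```

In the FDI module, the posterior particle cloud plays a second role: the fraction of particles beyond a damage threshold gives the probability of structural failure directly.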
Procedia PDF Downloads 196
7511 Estimation of Reservoirs Fracture Network Properties Using an Artificial Intelligence Technique
Authors: Reda Abdel Azim, Tariq Shehab
Abstract:
The main objective of this study is to develop a subsurface fracture map of naturally fractured reservoirs by overcoming the limitations associated with the different data sources for characterising fracture properties. Some of these limitations are overcome by employing a nested neuro-stochastic technique to establish inter-relationships between different data sources, such as conventional well logs, borehole images (FMI), core descriptions, and seismic attributes, and then characterising fracture properties in terms of fracture density and fractal dimension for each data source. Fracture density is an important property of a fracture network system, as it measures the cumulative area of all the fractures in a unit volume of the system; the fractal dimension is likewise used to characterize self-similar objects such as fractures. At the wellbore locations, fracture density and fractal dimension can only be estimated for the limited sections where FMI data are available. Therefore, an artificial intelligence technique is applied to approximate these quantities at locations along the wellbore where such hard data are not available. It should be noted that artificial intelligence techniques have proven their effectiveness in this domain of application.
Keywords: naturally fractured reservoirs, artificial intelligence, fracture intensity, fractal dimension
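The interpolation idea, learning a mapping from log-derived features to a fracture property where hard FMI data exist, and then applying it elsewhere along the wellbore, can be sketched with a small neural network. The data-generating relationship, feature count, and network size below are invented for illustration; they are not the paper's nested neuro-stochastic technique.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training set: three well-log-style features -> fracture density
X = rng.uniform(0, 1, size=(200, 3))                     # pseudo log features
y = (0.6 * X[:, 0] + 0.3 * X[:, 1] ** 2).reshape(-1, 1)  # pseudo density

# One-hidden-layer network trained by full-batch gradient descent
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.2
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)          # forward pass
    pred = h @ W2 + b2
    err = pred - y
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)  # backpropagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
print(round(mse, 4))  # small residual error on the training data
```

Once trained on the FMI-covered intervals, the same forward pass would be evaluated on log features from the uncovered intervals to fill in the fracture-density profile.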
Procedia PDF Downloads 256
7510 Outline of a Technique for the Recommendation of Tourism Products in Cuba Using GIS
Authors: Jesse D. Cano, Marlon J. Remedios
Abstract:
Cuban tourism has developed so much over the last 30 years that it has become one of the engines of the Cuban economy. With this development, the number of Cuban companies opting for e-tourism as a way to publicize their products and attract customers has also grown. Despite this, the majority of Cuban tourism-themed websites simply list the products and services they offer, so that users are often overwhelmed by the amount of information available and abandon their search before finding a product that fits their needs. Personalization has been recognized as a critical factor for a successful electronic tourism business, and the use of recommender systems is the best approach to addressing the personalization problem. This paper aims to outline a preliminary technique for predicting which products a particular user would rate highly; these are the products the website would display first. To achieve this, the theoretical elements of the Cuban tourism environment are discussed, as are recommendation systems and geographic information systems as tools for information representation. Finally, for each structural component identified, we define a set of rules that yields an electronic tourism system that effectively personalizes the service provided.
Keywords: geographic information system, technique, tourism products, recommendation
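The kind of prediction the outline calls for, estimating the rating a user would give an unseen product, can be sketched with user-based collaborative filtering. The users, products, and ratings below are invented for illustration, and the GIS (location) signals the paper combines with this are omitted for brevity.

```python
import math

# Known ratings of tourism products on a 1-5 scale (illustrative data)
ratings = {
    "ana":   {"beach_tour": 5, "museum": 2, "diving": 4},
    "bruno": {"beach_tour": 4, "museum": 1, "diving": 5, "salsa_night": 4},
    "carla": {"beach_tour": 1, "museum": 5, "city_walk": 4},
}

def similarity(u, v):
    """Cosine similarity over co-rated products."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    num = sum(ratings[u][p] * ratings[v][p] for p in common)
    du = math.sqrt(sum(ratings[u][p] ** 2 for p in common))
    dv = math.sqrt(sum(ratings[v][p] ** 2 for p in common))
    return num / (du * dv)

def predict(user, product):
    """Similarity-weighted average of neighbours' ratings for product."""
    pairs = [(similarity(user, v), r[product])
             for v, r in ratings.items() if v != user and product in r]
    weight = sum(s for s, _ in pairs)
    return sum(s * r for s, r in pairs) / weight if weight else None

print(round(predict("ana", "salsa_night"), 2))  # only bruno rated it -> 4.0
```

Sorting a user's unrated products by these predicted scores gives the ordering in which the website would display them first.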
Procedia PDF Downloads 504