Search results for: source domain
5398 Identification of Text Domains and Register Variation through the Analysis of Lexical Distribution in a Bangla Mass Media Text Corpus
Authors: Mahul Bhattacharyya, Niladri Sekhar Dash
Abstract:
The present research paper is an experimental attempt to investigate the nature of register variation in three major text domains, namely social, cultural, and political texts, collected from a corpus of Bangla printed mass media texts. The study uses a corpus of moderate size, containing nearly one million words collected from different media sources such as newspapers, magazines, advertisements, and periodicals. The analysis of corpus data reveals that each text has certain lexical properties that not only control its identity but also mark its uniqueness across domains. First, the subject domains of the texts are classified along two parameters, namely 'Genre' and 'Text Type'. Next, some empirical investigations are made to understand how the domains vary from each other in terms of lexical properties such as function and content words. Here the method of comparative-cum-contrastive matching of lexical load across domains is invoked through word frequency counts to track how domain-specific words and terms may be marked as decisive indicators for specifying textual contexts and subject domains. The study shows that the common lexical stock that percolates across all text domains is quite unreliable, as its lexicological identity has no bearing on specifying subject domains. Therefore, it becomes necessary for language users to anchor upon certain domain-specific lexical items to recognize a text as belonging to a specific text domain. The eventual findings of this study confirm that texts belonging to different subject domains in the Bangla news text corpus clearly differ on the parameters of lexical load, lexical choice, lexical clustering, and lexical collocation.
In fact, based on these parameters, along with some statistical calculations, it is possible to classify mass media texts into different types and to mark their relation to the domains to which they actually belong. The advantage of this analysis lies in the proper identification of the linguistic factors that give language users better insight into the methods they employ in text comprehension, as well as a systemic frame for designing text identification strategies for language learners. The availability of a large amount of Bangla media text data is useful for achieving accurate conclusions with a certain amount of reliability and authenticity. This kind of corpus-based analysis is quite relevant for a resource-poor language like Bangla, as no attempt has ever been made to understand how the structure and texture of Bangla mass media texts vary due to certain linguistic and extra-linguistic constraints that are actively operational in specific text domains. Since mass media language is assumed to be the most 'recent representation' of the actual use of the language, this study is expected to show how Bangla news texts reflect the thoughts of the society and how they leave a strong impact on the thought process of the speech community.
Keywords: Bangla, corpus, discourse, domains, lexical choice, mass media, register, variation
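The comparative-cum-contrastive word-frequency matching described in the abstract can be sketched in a few lines. This is a purely illustrative toy, not the authors' actual pipeline: the function name, the miniature English "corpus", and the whitespace tokenization are all assumptions.

```python
from collections import Counter

def domain_markers(domain_texts, top_n=3):
    """Return, per domain, the most frequent words that never occur
    in any other domain -- a toy version of comparative lexical
    matching through word frequency counts."""
    counts = {d: Counter(t.lower().split()) for d, t in domain_texts.items()}
    markers = {}
    for d, c in counts.items():
        elsewhere = set()
        for d2, c2 in counts.items():
            if d2 != d:
                elsewhere |= set(c2)
        # keep only words specific to this domain, ranked by frequency
        own = Counter({w: n for w, n in c.items() if w not in elsewhere})
        markers[d] = [w for w, _ in own.most_common(top_n)]
    return markers

# invented miniature "corpus", standing in for the Bangla data
texts = {
    "political": "election vote parliament policy election vote",
    "cultural":  "festival music dance tradition festival",
    "social":    "community family school welfare community",
}
print(domain_markers(texts))
```

Words shared across all domains drop out automatically, mirroring the paper's finding that the common lexical stock carries no domain signal.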
Procedia PDF Downloads 174
5397 Simultaneous Saccharification and Co-Fermentation of Paddy Straw and Fruit Wastes into Ethanol Production
Authors: Kamla Malik
Abstract:
For ethanol production from paddy straw, pretreatment was first done using sodium hydroxide solution (2.0%) at 15 psi for 1 hr. The maximum lignin removal was achieved with a 0.5 mm mesh size of paddy straw. After pretreatment it contained 72.4% cellulose, 15.9% hemicelluloses, and 2.0% lignin. Paddy straw hydrolysate (PSH) with fruit wastes (5%), such as sweet lime, apple, sapota, grapes, kinnow, banana, papaya, mango, and watermelon, was subjected to simultaneous saccharification and co-fermentation (SSCF) for 72 hrs by a co-culture of Saccharomyces cerevisiae HAU-1 and Candida sp. with 0.3% urea as a cheap nitrogen source. Fermentation was carried out at 35°C, and ethanol yield was determined at 24-hour intervals. The maximum ethanol yield within 72 hrs of fermentation was obtained with PSH + sapota peels (3.9% v/v), followed by PSH + kinnow peels (3.6%) and PSH + papaya peel extract (3.1%). In the case of PSH + banana peel and mango peel extract, the ethanol produced was 2.8% and 2.2% (v/v), respectively. The results of this study suggest that fruit wastes that contain fermentable sugar should not be discarded into the environment but should instead be added to paddy straw and converted into useful products like bio-ethanol, which can serve as an alternative energy source.
Keywords: ethanol, fermentation, fruit wastes, paddy straw
Procedia PDF Downloads 389
5396 A Perspective on Teaching Mathematical Concepts to Freshman Economics Students Using 3D-Visualisations
Authors: Muhammad Saqib Manzoor, Camille Dickson-Deane, Prashan Karunaratne
Abstract:
The Cobb-Douglas production (utility) function is a fundamental function widely used in economics teaching and research. The key reason is the function's ability to describe actual production using inputs like labour and capital. Characteristics of the function such as returns to scale and marginal and diminishing marginal productivities are covered in the introductory units of both microeconomics and macroeconomics with a 2-dimensional static visualisation of the function. However, less insight is provided regarding the three-dimensional surface, changes in curvature properties due to returns to scale, the linkage of the short-run production function with its long-run counterpart and marginal productivities, the level curves, and constrained optimisation. Since (freshman) learners have diverse prior knowledge and cognitive skills, the existing 'one size fits all' approach is not very helpful. The aim of this study is to bridge this gap by introducing a technological intervention with interactive animations of the three-dimensional surface and a sequential unveiling of the characteristics mentioned above using Python software. A small classroom intervention has helped students enhance their analytical and visualisation skills towards active and authentic learning of this topic. However, to authenticate the strength of our approach, a quasi-Delphi study will be conducted to ask domain-specific experts, 'What value to the learning process in economics is there in using a 2-dimensional static visualisation compared to using a 3-dimensional dynamic visualisation?' Here three perspectives of the intervention were reviewed by a panel comprising novice students, experienced students, novice instructors, and experienced instructors, in an effort to determine the learnings from each type of visualisation within a specific domain of knowledge.
The value of this approach lies in suggesting different pedagogical methods that can enhance learning outcomes.
Keywords: cobb-douglas production function, quasi-Delphi method, effective teaching and learning, 3D-visualisations
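The headline properties the abstract discusses (constant returns to scale when the exponents sum to one, and diminishing marginal productivity) can be checked numerically in a few lines. The parameter values below are arbitrary illustrations, not taken from the paper:

```python
def cobb_douglas(L, K, A=1.0, alpha=0.3, beta=0.7):
    """Q = A * L^alpha * K^beta (illustrative parameter values)."""
    return A * (L ** alpha) * (K ** beta)

# Constant returns to scale when alpha + beta == 1:
q1 = cobb_douglas(10, 20)
q2 = cobb_douglas(20, 40)              # double both inputs
print(round(q2 / q1, 6))               # -> 2.0, output exactly doubles

# Diminishing marginal product of labour, via finite differences:
mp_low  = cobb_douglas(11, 20) - cobb_douglas(10, 20)
mp_high = cobb_douglas(21, 20) - cobb_douglas(20, 20)
print(mp_low > mp_high)                # -> True
```

A 3D surface plot of the same function (e.g. with matplotlib's `plot_surface`) is the static starting point that the paper's interactive animations extend.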
Procedia PDF Downloads 145
5395 Electrochemical Deposition of Pb and PbO2 on Polymer Composites Electrodes
Authors: A. Merzouki, N. Haddaoui
Abstract:
Polymers are widely known as electrical insulators. These materials are characterized by low weight, low cost, and a large range of physical and chemical properties. They have conquered new application domains that until recently were the exclusive preserve of metals. In this work, we used some composite materials (polymers/conductive fillers) as electrodes and tried to cover them with metallic lead layers in order to use them as current collector grids in lead-acid battery plates.
Keywords: electrodeposition, polymer composites, carbon black, acetylene black
Procedia PDF Downloads 456
5394 The Analysis of Thermal Conductivity in Porcine Meat Due to Electricity by Finite Element Method
Authors: Orose Rugchati, Sarawut Wattanawongpitak
Abstract:
This research studied the analysis of thermal conductivity and heat transfer in porcine meat due to electric current flowing between parallel electrode plates. A hot-boned pork sample was prepared as a 2 × 1 × 1 cm cube. The finite element method with the ANSYS Workbench program was applied to simulate this heat transfer problem. In the thermal simulation, the input thermoelectric energy was calculated from the measured current flowing through the pork and the input voltage from the DC voltage source. Heat transfer in pork was compared for two voltage sources: a DC voltage of 30 volts and a DC pulsed voltage of 60 volts (pulse width 50 milliseconds and 50% duty cycle). The results show that the thermal conductivity tends to be steady at temperatures of 40°C and 60°C, at around 1.39 W/m°C and 2.65 W/m°C for the DC voltage source of 30 volts and the DC pulsed voltage of 60 volts, respectively. When the temperature increased to 50°C at 5 minutes, the colour of the porcine meat at the exposure point began to fade. This technique could be used for predicting thermal conductivity from certain characteristics of meat.
Keywords: thermal conductivity, porcine meat, electricity, finite element method
Procedia PDF Downloads 140
5393 Therapeutic Potential of GSTM2-2 C-Terminal Domain and Its Mutants, F157A and Y160A on the Treatment of Cardiac Arrhythmias: Effect on Ca2+ Transients in Neonatal Ventricular Cardiomyocytes
Authors: R. P. Hewawasam, A. F. Dulhunty
Abstract:
The ryanodine receptor (RyR) is an intracellular ion channel that releases Ca2+ from the sarcoplasmic reticulum and is essential for excitation-contraction coupling and contraction in striated muscle. Human muscle-specific glutathione transferase M2-2 (GSTM2-2) is a highly specific inhibitor of cardiac ryanodine receptor (RyR2) activity. Single-channel lipid bilayer studies and Ca2+ release assays performed using the C-terminal half of GSTM2-2 and its mutants F157A and Y160A confirmed the ability of the C-terminal domain of GSTM2-2 to specifically inhibit cardiac ryanodine receptor activity. The objective of the present study is to determine the effect of the C-terminal domain of GSTM2-2 (GSTM2-2C) and the mutants F157A and Y160A on the Ca2+ transients of neonatal ventricular cardiomyocytes. Primary cardiomyocytes were cultured from neonatal rats. They were treated with GSTM2-2C and the two mutants F157A and Y160A at 15 µM and incubated for 2 hours. The cells were then loaded with Fluo-4 AM, a fluorescent Ca2+ indicator, and the field-stimulated (1 Hz, 3 V and 2 ms) cells were excited using the 488 nm argon laser. Contractility of the cells was measured, and the Ca2+ transients in the stained cells were imaged using a Leica SP5 confocal microscope. The peak amplitude of the Ca2+ transient and the rise and decay times from the peak were measured for each transient. In contrast to GSTM2-2C, which significantly reduced the % shortening (42.8%) in the field-stimulated cells, F157A and Y160A failed to reduce the % shortening. Analysis revealed that the average amplitude of the Ca2+ transient was significantly reduced (P < 0.001) in cells treated with the wild-type GSTM2-2C compared to that of untreated cells. Cells treated with the mutants F157A and Y160A did not change the Ca2+ transient significantly compared to the control.
A significant increase in the rise time (P < 0.001) and a significant reduction in the decay time (P < 0.001) were observed in cardiomyocytes treated with GSTM2-2C compared to the control, but not with F157A and Y160A. These results are consistent with the observation that GSTM2-2C reduced Ca2+ release from the cardiac SR significantly, whereas the mutants F157A and Y160A did not show any effect compared to the control. GSTM2-2C has an isoform-specific effect on cardiac ryanodine receptor activity, and it inhibits RyR2 channel activity only during diastole. Selective inhibition of RyR2 by GSTM2-2C has significant clinical potential in the treatment of cardiac arrhythmias and heart failure. Since the GSTM2-2 C-terminal construct has no GST enzyme activity, its introduction into the cardiomyocyte would not exert any unwanted side effects that might alter its enzymatic action. The present study further confirms that GSTM2-2C is capable of decreasing Ca2+ release from the cardiac SR during diastole. These results raise the future possibility of using GSTM2-2C as a template for therapeutics that can depress RyR2 function when the channel is hyperactive in cardiac arrhythmias and heart failure.
Keywords: arrhythmia, cardiac muscle, cardiac ryanodine receptor, GSTM2-2
Procedia PDF Downloads 284
5392 Open Source, Open Hardware Ground Truth for Visual Odometry and Simultaneous Localization and Mapping Applications
Authors: Janusz Bedkowski, Grzegorz Kisala, Michal Wlasiuk, Piotr Pokorski
Abstract:
Ground-truth data is essential for the quantitative evaluation of VO (Visual Odometry) and SLAM (Simultaneous Localization and Mapping) using, e.g., ATE (Absolute Trajectory Error) and RPE (Relative Pose Error). Many open-access data sets provide raw and ground-truth data for benchmark purposes. The issue appears when one would like to validate Visual Odometry and/or SLAM approaches on data captured using the device for which the algorithm is targeted, for example a mobile phone, and to disseminate the data to other researchers. For this reason, we propose an open-source, open-hardware ground-truth system that provides an accurate and precise trajectory with a 3D point cloud. It is based on the Livox Mid-360 LiDAR with a non-repetitive scanning pattern, an on-board Raspberry Pi 4B computer, a battery, and software for off-line calculations (camera-to-LiDAR calibration, LiDAR odometry, SLAM, georeferencing). We show how this system can be used for the evaluation of various state-of-the-art algorithms (Stella SLAM, ORB-SLAM3, DSO) in typical indoor monocular VO/SLAM.
Keywords: SLAM, ground truth, navigation, LiDAR, visual odometry, mapping
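Of the two metrics the abstract names, ATE reduces to an RMSE over paired positions once the estimated trajectory has been time-synchronised and rigidly aligned to the ground truth. A minimal position-only sketch (the alignment step is omitted, and the sample trajectories are invented for illustration):

```python
import math

def ate_rmse(gt, est):
    """Absolute Trajectory Error as the RMSE over paired 3D positions;
    assumes the two trajectories are already time-synchronised and
    rigidly aligned (e.g. by Umeyama/Horn alignment, omitted here)."""
    assert len(gt) == len(est)
    se = sum((gx - ex) ** 2 + (gy - ey) ** 2 + (gz - ez) ** 2
             for (gx, gy, gz), (ex, ey, ez) in zip(gt, est))
    return math.sqrt(se / len(gt))

gt  = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]        # ground-truth positions
est = [(0, 0, 0), (1, 0.1, 0), (2, -0.1, 0)]   # estimated positions
print(ate_rmse(gt, est))  # ~0.0816 m of average positional drift
```

RPE works analogously but on relative pose deltas between consecutive timestamps, which isolates local drift from accumulated error.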
Procedia PDF Downloads 69
5391 Assessment of Drinking Water Contamination from the Water Source to the Consumer in Palapye Region, Botswana
Authors: Tshegofatso Galekgathege
Abstract:
Poor water quality is of great concern to human health as it can cause disease outbreaks. A standard practice today in developed countries is that people should be provided with safe, reliable drinking water, as safe drinking water is recognized as a basic human right and a cost-effective measure for reducing diseases. Over 1.1 billion people worldwide lack access to a safe water supply, and as a result the majority are forced to use polluted surface water or groundwater. It is widely accepted that our water supply systems are susceptible to intentional or accidental contamination. Water quality degradation may occur anywhere on the path that water takes from the water source to the consumer. Chlorine is believed to be an effective tool for disinfecting water, but its concentration may decrease with time due to consumption by chemical reactions. This shows that we are at risk of being infected by waterborne diseases if chlorine in water falls below the required level of 0.2-1 mg/litre that should be maintained and some contaminants enter the water distribution system. It is believed that the lack of adequate sanitation also contributes to the contamination of water globally. This study therefore assesses drinking water contamination from the source to the consumer by identifying the points vulnerable to contamination in the study area. To identify these points, water was sampled monthly from boreholes, the water treatment plant, the water distribution system (WDS), service reservoirs, and consumer taps in all twenty (20) villages of the Palapye region. Sampled water was then taken to the laboratory for testing and analysis of microbiological and chemical parameters. Water quality analyses were then compared with the Botswana drinking water quality standard (BOS 32:2009) to check compliance.
Major sources of water contamination identified during site visits were livestock, which were found drinking stagnant water from leaking pipes in 90 percent of the villages. Soil structure, and even vegetation, around these areas was negatively affected by livestock movement. In conclusion, the microbiological parameters of water in the study area do not comply with drinking water standards; some microbiological parameters indicated that livestock affect not only land degradation but also water quality. Chlorine has been applied to the water over some years, but it is not effective enough; thus preventative measures have to be developed to keep contaminants from reaching the water. Remember: prevention is better than cure.
Keywords: land degradation, leaking systems, livestock, water contamination
Procedia PDF Downloads 352
5390 Thermal Resistance of Special Garments Exposed to a Radiant Heat
Authors: Jana Pichova, Lubos Hes, Vladimir Bajzik
Abstract:
Protective clothing is designed to keep the wearer safe in hazardous conditions or to enable short working operations to be performed without injury or discomfort. Firefighters and related workers are exposed to abnormal heat, which can be of conductive, convective, or radiant type. Their garments are designed to resist these conditions and prevent burn injuries or death. However, the thermal comfort of firefighters exposed to a high heat source has not been studied yet. Thermal resistance is the best representative parameter of thermal comfort. In this study, a new method of testing the thermal resistance of special clothing exposed to a high-radiation heat source was designed. This method simulates a human body wearing a single- or multi-layered garment exposed to radiant heat. The setup of this method enables the measurement of radiative heat flow over time without the effect of convection. The new testing method is verified on a chosen group of textiles for firefighters.
Keywords: protective clothing, radiative heat, thermal comfort of firefighters, thermal resistance of special garments
Procedia PDF Downloads 379
5389 Large Eddy Simulation of Particle Clouds Using Open-Source CFD
Authors: Ruo-Qian Wang
Abstract:
Open-source CFD has become increasingly popular and promising. Recent progress in multiphase flow enables new CFD applications, providing an economical and flexible research tool for complex flow problems. We introduce our numerical study, which uses four-way-coupled Euler-Lagrangian Large-Eddy Simulations to resolve particle cloud dynamics with OpenFOAM and CFDEM: the fractioned Navier-Stokes equations are numerically solved for fluid-phase motion, solid-phase motion is addressed by Lagrangian tracking of every single particle, and total momentum is conserved by fluid-solid inter-phase coupling. A grid convergence test was performed, which shows that the current mesh resolution is appropriate. We then validated the code by comparing numerical results with experiments in terms of particle cloud settlement and growth. A good comparison was obtained, showing the reliability of the present numerical schemes. The time and height at phase separation were defined and analyzed for a variety of initial release conditions, and empirical formulas were drawn to fit the results.
Keywords: four-way coupling, dredging, land reclamation, multiphase flows, oil spill
Procedia PDF Downloads 429
5388 Magnetohemodynamic of Blood Flow Having Impact of Radiative Flux Due to Infrared Magnetic Hyperthermia: Spectral Relaxation Approach
Authors: Ebenezer O. Ige, Funmilayo H. Oyelami, Joshua Olutayo-Irheren, Joseph T. Okunlola
Abstract:
Hyperthermia therapy is an adjuvant procedure during which perfused body tissues are subjected to an elevated range of temperatures in a bid to achieve improved drug potency and efficacy in cancer treatment. While a selected class of hyperthermia techniques rests on the thermal radiation derived from single-source electro-radiation measures, there are deliberations on conjugating dual radiation field sources in an attempt to improve the delivery of the therapy procedure. This paper numerically explores the thermal effectiveness of combined infrared hyperthermia with nanoparticle recirculation in the vicinity of an imposed magnetic field on the subcutaneous strata of a model lesion as an ablation scheme. An elaborate spectral relaxation method (SRM) was formulated to handle the coupled equations of momentum and thermal equilibrium in the blood-perfused tissue domain of a spongy fibrous tissue. Thermal diffusion regimes in the presence of an externally imposed magnetic field were described, leveraging the renowned Rosseland diffusion approximation to delineate the impact of radiative flux within the computational domain. The contribution of tissue sponginess was examined using the mechanics of pore-scale porosity over a selection of clinically informed scenarios. Our observations showed that, for a substantial depth of spongy lesion, the magnetic field architecture constitutes the controlling regime of hemodynamics at the blood-tissue interface while facilitating thermal transport across the depth of the model lesion. This parameter-indicator could be utilized to control the dispensing of hyperthermia treatment in intravenously perfused tissue.
Keywords: spectral relaxation scheme, thermal equilibrium, Rosseland diffusion approximation, hyperthermia therapy
Procedia PDF Downloads 118
5387 LGG Architecture for Brain Tumor Segmentation Using Convolutional Neural Network
Authors: Sajeeha Ansar, Asad Ali Safi, Sheikh Ziauddin, Ahmad R. Shahid, Faraz Ahsan
Abstract:
The most aggressive form of brain tumor is called glioma. Glioma is a kind of tumor that arises from the glial tissue of the brain and occurs quite often. A fully automatic 2D-CNN model for brain tumor segmentation is presented in this paper. We performed pre-processing steps to remove noise and intensity variance using N4ITK and standard intensity correction, respectively. We used the Keras open-source library with Theano as the backend for fast implementation of the CNN model. In addition, we used the BRATS 2015 MRI dataset to evaluate our proposed model, and the SimpleITK open-source library to analyze the images. Moreover, we extracted random 2D patches for the proposed 2D-CNN model for efficient brain segmentation. We extracted 2D patches instead of 3D ones because the lower-dimensional information present in 2D helps reduce computational time. The Dice Similarity Coefficient (DSC) is used as the performance measure for the evaluation of the proposed method. Our method achieved DSC scores of 0.77 for the complete, 0.76 for the core, and 0.77 for the enhanced tumor regions. These results are comparable with methods using previously implemented 2D CNN architectures.
Keywords: brain tumor segmentation, convolutional neural networks, deep learning, LGG
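The DSC metric the abstract reports is straightforward to compute from binary masks; a minimal sketch over flat 0/1 lists (the toy masks are invented, standing in for flattened segmentation outputs):

```python
def dice_coefficient(pred, truth):
    """DSC = 2|A ∩ B| / (|A| + |B|) over flat binary masks.
    Returns 1.0 when both masks are empty, by convention."""
    inter = sum(1 for p, t in zip(pred, truth) if p == t == 1)
    total = sum(pred) + sum(truth)
    return 2 * inter / total if total else 1.0

# toy 1-D masks standing in for flattened segmentation outputs
pred  = [1, 1, 0, 1]
truth = [1, 1, 0, 0]
print(dice_coefficient(pred, truth))  # -> 0.8
```

A score of 0.77, as reported for the complete tumor region, means the predicted and reference regions overlap in roughly three quarters of their combined volume.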
Procedia PDF Downloads 182
5386 Seismic Stratigraphy of the First Deposits of the Kribi-Campo Offshore Sub-basin (Gulf of Guinea): Pre-cretaceous Early Marine Incursion and Source Rocks Modeling
Authors: Mike-Franck Mienlam Essi, Joseph Quentin Yene Atangana, Mbida Yem
Abstract:
The Kribi-Campo sub-basin belongs to the southern domain of the Cameroon Atlantic margin in the Gulf of Guinea. It is the African segment homologous to the Sergipe-Alagoas Basin, located on the northeast side of the Brazilian margin. The onset of the seafloor spreading period in the Southwest African margin in general, and in the study area particularly, remains controversial. Various studies locate this event during Cretaceous times (Early Aptian to Late Albian), while others suggest that it occurred during the Pre-Cretaceous period (Palaeozoic or Jurassic). This work analyses two Cameroon Span seismic lines to re-examine the early marine incursion period of the study area for a better understanding of the margin's evolution. The methodology of analysis is based on the delineation of the first seismic sequence, using the tracking of reflector terminations and the analysis of its internal reflections associated with the external configuration of the package. The results obtained indicate, from the bottom upwards, that the first deposits overlie a first seismic horizon (H1) associated with 'onlap' terminations at its top, and underlie a second horizon (H2) which shows 'downlap' terminations at its top. The external configuration of this package features a prograded fill pattern, and it is observed within the depocenter area with discontinuous reflections that pinch out against the basement. From east to west, this sequence shows two seismic facies (SF1 and SF2). SF1 has parallel to subparallel reflections characterized by high amplitude, and SF2 shows parallel and stratified reflections characterized by low amplitude. The distribution of these seismic facies reveals a lateral facies variation. Following the fundamental works on seismic stratigraphy and a literature review of the geological context of the study area, the stratigraphic nature of the identified horizons and seismic facies has been established.
The seismic horizons H1 and H2 correspond to the top of the basement and a 'downlap surface', respectively. SF1 indicates continental sediments (sands/sandstone) and SF2 marine deposits (shales, clays). The prograding configuration observed suggests a marine regression. The correlation of these results with the lithochronostratigraphic chart of the Sergipe-Alagoas Basin reveals that the first marine deposits across the study area date from Pre-Cretaceous times (Palaeozoic or Jurassic). The first deposits onto the basement represent the end of a cycle of sedimentation. The hypothesis of Cretaceous seafloor spreading through the study area is the onset of another cycle of sedimentation. Furthermore, the presence of marine sediments in the first deposits implies that this package could contain marine source rocks. The spatial tracking of these deposits reveals that they could be found in some onshore parts of the Kribi-Campo area or even on the northern side.
Keywords: cameroon span seismic, early marine incursion, kribi-campo sub-basin, pre-cretaceous period, sergipe-alagoas basin
Procedia PDF Downloads 107
5385 Impact of Instagram Food Bloggers on Consumer (Generation Z) Decision Making Process in Islamabad, Pakistan
Authors: Tabinda Sadiq, Tehmina Ashfaq Qazi, Hoor Shumail
Abstract:
Recently, the advent of emerging technology has created an emerging generation of restaurant marketing. This study explores the aspects that influence customers' decision-making process in selecting a restaurant after reading food bloggers' reviews online. The motivation behind this research is to investigate the correlation between the credibility of the source and consumers' attitude toward restaurant visits. The researcher collected the data by distributing a survey questionnaire through Google Forms, employing source credibility theory. A non-probability purposive sampling technique was used to collect data. The questionnaire used a pre-developed and validated scale by Ohanian to measure the relationship. The researcher collected data from 250 respondents in order to investigate the influence of food bloggers on Gen Z's decision-making process. SPSS version 26 was used for statistical testing and analyzing the data. The findings of the survey revealed that there is a moderate positive correlation between the variables. It can therefore be concluded that food bloggers do have an impact on Generation Z's decision-making process.
Keywords: credibility, decision making, food bloggers, generation z, e-wom
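The correlation the study reports is a standard Pearson coefficient, which SPSS computes but which is easy to sketch directly. The Likert-style scores below are invented for illustration and are not the study's actual data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# invented Likert-style scores: perceived blogger credibility vs.
# intention to visit the restaurant (NOT the study's actual data)
cred   = [3, 4, 2, 5, 4, 3, 5, 2]
intent = [2, 4, 3, 4, 5, 3, 4, 2]
r = pearson_r(cred, intent)
print(round(r, 2))
```

Values around 0.3-0.5 are conventionally read as a moderate positive correlation, the range the study reports.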
Procedia PDF Downloads 73
5384 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images
Authors: Amit Kumar Happy
Abstract:
This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visible image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, like a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (infrared) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal infrared camera acquires the thermal source image. In this paper, some image fusion algorithms based upon multi-scale transform (MST) and a region-based selection rule with consistency verification have been proposed and presented. This research includes the implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of levels for the MST and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the suggested method's validity. Experiments show that the proposed approach is capable of producing good fusion results. While deploying our image fusion approach, we observed several challenges with popular image fusion methods: although their high computational cost and complex processing steps provide accurate fused results, they also make them hard to deploy in systems and applications that require real-time operation, high flexibility, and low computational ability. The methods presented in this paper thus offer good results with minimal time complexity.
Keywords: image fusion, IR thermal imager, multi-sensor, multi-scale transform
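The selection idea behind coefficient fusion rules can be shown with the crudest possible fusion, a pixel-wise maximum. This is deliberately not the paper's MST method, which would first decompose each image into sub-bands and apply the rule per coefficient with consistency verification; the tiny images are invented:

```python
def fuse_max(visible, thermal):
    """Naive pixel-wise maximum fusion of two equal-size grayscale
    images given as nested lists. An MST method would instead apply
    this kind of selection per sub-band coefficient; this keeps only
    the selection idea at the pixel level."""
    return [[max(v, t) for v, t in zip(vrow, trow)]
            for vrow, trow in zip(visible, thermal)]

vis = [[10, 200], [30, 40]]   # bright feature in the top-right
ir  = [[90, 20], [35, 5]]     # warm feature in the left column
print(fuse_max(vis, ir))      # -> [[90, 200], [35, 40]]
```

The fused result keeps the salient detail from whichever modality is stronger at each location, which is the intuition MST fusion refines by making the choice scale by scale.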
Procedia PDF Downloads 115
5383 A Guide to the Implementation of Ambisonics Super Stereo
Authors: Alessio Mastrorillo, Giuseppe Silvi, Francesco Scagliola
Abstract:
In this work, we introduce an Ambisonics decoder with an implementation of the C-format, also called Super Stereo. This format is an alternative to conventional stereo and binaural decoding. Unlike those, this format conveys audio information from the horizontal plane and works with stereo speakers and headphones. The two C-format channels can also return a reconstructed planar B-format. This work provides an open-source implementation of this format. We implement an all-pass filter for signal quadrature, as required by the decoding equations. This filter works with six biquads in a cascade configuration, with values for control frequency and quality factor discovered experimentally. The phase response of the filter delivers a small error in the 20-14,000 Hz range. The decoder has been tested with audio sources of up to 192 kHz sample rate, returning pristine sound quality and a detailed stereo image. It has been included in the Envelop for Live suite and is available as an open-source repository. This decoder has applications in Virtual Reality and 360° audio productions, music composition, and online streaming.
Keywords: ambisonics, UHJ, quadrature filter, virtual reality, Gerzon, decoder, stereo, binaural, biquad
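The reason a cascade of biquads can provide quadrature is the all-pass property: when a biquad's numerator is the reversed denominator, its magnitude response is exactly 1 at every frequency, so only phase is shaped. A small sketch of that property (the coefficient values -0.5 and 0.25 are arbitrary placeholders, not the experimentally tuned values mentioned above):

```python
import cmath

def allpass_biquad_response(a1, a2, w):
    """H(e^{jw}) of an all-pass biquad, whose numerator is the
    reversed denominator: (a2 + a1*z^-1 + z^-2) / (1 + a1*z^-1 + a2*z^-2).
    The numerator equals e^{-2jw} * conj(denominator), so |H| = 1
    everywhere while the phase varies with w."""
    z1 = cmath.exp(-1j * w)   # e^{-jw}
    z2 = z1 * z1              # e^{-2jw}
    return (a2 + a1 * z1 + z2) / (1 + a1 * z1 + a2 * z2)

# Six sections in cascade multiply their responses: the magnitude
# stays 1 and only the phase accumulates, which is what lets the
# designer fit a ~90° phase difference across the audio band.
for w in (0.1, 1.0, 2.5):
    h6 = allpass_biquad_response(-0.5, 0.25, w) ** 6
    print(round(abs(h6), 6))  # -> 1.0 at each frequency
```

Tuning the per-section coefficients then becomes purely a phase-fitting problem over the 20-14,000 Hz band.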
Procedia PDF Downloads 91
5382 Analysis of Secondary Peak in Hα Emission Profile during Gas Puffing in Aditya Tokamak
Authors: Harshita Raj, Joydeep Ghosh, Rakesh L. Tanna, Prabal K. Chattopadhyay, K. A. Jadeja, Sharvil Patel, Kaushal M. Patel, Narendra C. Patel, S. B. Bhatt, V. K. Panchal, Chhaya Chavda, C. N. Gupta, D. Raju, S. K. Jha, J. Raval, S. Joisa, S. Purohit, C. V. S. Rao, P. K. Atrey, Umesh Nagora, R. Manchanda, M. B. Chowdhuri, Nilam Ramaiya, S. Banerjee, Y. C. Saxena
Abstract:
Efficient gas fueling is a critical aspect that needs to be mastered in order to maintain the plasma density required to carry out fusion. This demands a fair understanding of fuel recycling in order to optimize the gas fueling. In the Aditya tokamak, multiple gas puffs are applied in a precise and controlled manner for hydrogen fueling during the flat top of the plasma discharge, which has been instrumental in achieving discharges with enhanced density as well as energy confinement time. Following each gas puff, we observe peaks in the temporal profiles of Hα emission, soft X-ray (SXR) emission, and chord-averaged electron density in a number of discharges, indicating efficient gas fueling. Interestingly, the Hα temporal profile exhibited an additional peak following the peak corresponding to each gas puff. These additional Hα peaks appeared in between two gas puffs, indicating the presence of a secondary hydrogen source apart from the gas puffs. A thorough investigation revealed that these secondary Hα peaks coincide with hard X-ray bursts, which come from the interaction of runaway electrons with the vessel limiters. This leads to the conclusion that the runaway electrons (REs) hitting the wall in turn bring out the absorbed hydrogen and oxygen from the wall, making the interaction of REs with the limiter a secondary hydrogen source. These observations suggest that runaway-electron-induced recycling should also be included in the recycling particle source in particle balance calculations in tokamaks. The observation of two Hα peaks associated with one gas puff and their roles in enhancing and maintaining the plasma density in the Aditya tokamak will be discussed in this paper.
Keywords: fusion, gas fueling, recycling, Tokamak, Aditya
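The separation described above — distinguishing gas-puff-driven Hα peaks from runaway-electron-driven ones by their coincidence with either the puff timing or hard X-ray bursts — can be sketched as a simple peak-classification routine. The traces, prominence threshold, and coincidence window below are hypothetical illustrations, not the Aditya analysis code.

```python
import numpy as np
from scipy.signal import find_peaks

def classify_h_alpha_peaks(t, h_alpha, puff_times, hxr_times, window=0.002):
    """Split Hα peaks into primary (gas-puff-driven) and secondary
    (runaway-electron-driven) peaks, by whether each peak lies within
    `window` seconds of a programmed gas puff or of a hard X-ray burst.
    The prominence threshold and window are illustrative placeholders."""
    idx, _ = find_peaks(h_alpha, prominence=0.2)
    primary, secondary = [], []
    for i in idx:
        if np.min(np.abs(puff_times - t[i])) < window:
            primary.append(t[i])
        elif np.min(np.abs(hxr_times - t[i])) < window:
            secondary.append(t[i])
    return primary, secondary
```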
Procedia PDF Downloads 402
5381 Isolation and Identification of Biosurfactant Producing Microorganism for Bioaugmentation
Authors: Karthick Gopalan, Selvamohan Thankiah
Abstract:
Biosurfactants are lipid compounds produced by microbes, which are amphipathic molecules consisting of hydrophobic and hydrophilic domains. In the present investigation, ten bacterial strains were isolated from petroleum-oil-contaminated sites near a petrol bunk. The oil collapsing test and haemolytic activity were used as criteria for the primary isolation of biosurfactant-producing bacteria. In this study, all the bacterial strains gave positive results. Among the ten strains, two were observed to be good biosurfactant producers; they utilize diesel as the sole carbon source. Optimization of the biosurfactant-producing bacteria isolated from petroleum-oil-contaminated sites was carried out using different parameters, such as temperature (20 °C, 25 °C, 30 °C, 37 °C, and 45 °C), pH (5, 6, 7, 8, and 9), and nitrogen sources (ammonium chloride, ammonium carbonate, and sodium nitrate). The biosurfactants produced by the bacteria were extracted, dried, and quantified. As a result of the optimization of parameters, the most suitable values for the production of the largest amount of biosurfactant by the isolated bacterial species were observed to be 30 °C (0.543 g/L), pH 7 (0.537 g/L), and ammonium nitrate (0.431 g/L) as the sole nitrogen source.
Keywords: isolation and identification, biosurfactant, microorganism, bioaugmentation
Procedia PDF Downloads 348
5380 Identifying Controlling Factors for the Evolution of Shallow Groundwater Chemistry of Ellala Catchment, Northern Ethiopia
Authors: Grmay Kassa Brhane, Hailemariam Siyum Mekonen
Abstract:
This study was designed to identify the hydrogeochemical and anthropogenic processes controlling the evolution of groundwater chemistry in the Ellala catchment, which covers an areal extent of about 296.5 km². The chemical analysis revealed that the major ions in the groundwater are Ca²⁺, Mg²⁺, Na⁺, and K⁺ (cations) and HCO₃⁻, PO₄³⁻, Cl⁻, NO₃⁻, and SO₄²⁻ (anions). Most of the groundwater samples (68.42%) revealed that the groundwater in the catchment is non-alkaline. In addition to the contribution of the aquifer material, the solid materials and liquid wastes discharged from different sources can be the main sources of pH and EC in the groundwater. It is observed that the EC of the groundwater is fairly correlated with the TDS (total dissolved solids). This indicates that highly mineralized water is more conductive than water with a low concentration of dissolved solids. The degree of salinity of the groundwater increases along the groundwater flow path from east to west; thus, the areas surrounding Mekelle City are highly saline due to the liquid and solid wastes discharged from the city and the industries. The groundwater facies in the catchment are dominated by calcium, magnesium, and bicarbonate, labeled as Ca-Mg-HCO3 and Mg-Ca-HCO3. The main geochemical process controlling the evolution of the groundwater chemistry in the catchment is rock-water interaction, particularly carbonate dissolution; due to the clay layer in the aquifer, reverse ion exchange also occurs. Non-significant silicate weathering and halite dissolution also contribute to the evolution of groundwater chemistry in the catchment. The groundwater is under-saturated with respect to calcite, dolomite, and aragonite minerals; hence, the more these minerals encounter the groundwater, the more they dissolve.
The main source of calcium and magnesium in the groundwater is the dissolution of carbonate minerals (calcite and dolomite), since carbonate rocks are the dominant aquifer materials in the catchment. In addition, the weathering of dolerite rock is a possible source of magnesium ions. The relatively higher concentration of sodium over chloride indicates that the source of sodium ions is reverse ion exchange and/or the weathering of sodium-bearing materials, such as shale and dolerite, rather than halite dissolution. The high concentrations of phosphate, nitrate, and chloride in the groundwater are the main anthropogenic contributions, which call for treatment, quality control, and management in the catchment. From the Base Exchange index analysis, it is possible to understand that the groundwater in the catchment is dominated by a meteoric origin, although this needs further groundwater chemistry study with isotope dating analysis.
Keywords: Ellala catchment, factor, chemistry, geochemical, groundwater
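The Base Exchange (chloro-alkaline) indices mentioned above are simple ratios of milliequivalent concentrations. A minimal sketch, assuming Schoeller's definitions and the common convention that negative values (Na + K in the water exceeding Cl) indicate reverse ion exchange:

```python
# Equivalent weights (g/eq) used for the mg/L -> meq/L conversion.
EQ_WEIGHT = {"Na": 22.99, "K": 39.10, "Cl": 35.45,
             "SO4": 48.03, "HCO3": 61.02, "NO3": 62.00}

def chloro_alkaline_indices(mg_per_l):
    """Schoeller chloro-alkaline (Base Exchange) indices from ion
    concentrations given in mg/L.  Negative values are conventionally read
    as reverse ion exchange (rock releasing Na and K into the water)."""
    meq = {ion: mg_per_l[ion] / EQ_WEIGHT[ion] for ion in mg_per_l}
    cai1 = (meq["Cl"] - (meq["Na"] + meq["K"])) / meq["Cl"]
    cai2 = (meq["Cl"] - (meq["Na"] + meq["K"])) / (
        meq["SO4"] + meq["HCO3"] + meq["NO3"])
    return cai1, cai2
```

For a sodium-dominated sample like those described above, both indices come out negative, consistent with the reverse ion exchange interpretation.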
Procedia PDF Downloads 76
5379 Preliminary Phytochemical Screening and Comparison of Different Extracts of Capparidaceae Family
Authors: Noshaba Dilbar, Maria Jabbar
Abstract:
Medicinal plants are considered to be the richest source for drug discovery. The main cause of the medicinal properties of plants is the presence of bioactive compounds in them. Phytochemical screening is the valuable process that detects these bioactive compounds (secondary metabolites) in plants. The present study was carried out to determine the phytochemical profile and ethnobotanical importance of two Capparidaceae species (Capparis spinosa and Dipterygium glaucum). The selection of plants was made on the basis of traditional knowledge of their usage in ayurvedic medicines. Different types of solvents (ethanol, methanol, chloroform, benzene, and petroleum ether) were used to make extracts of dry and fresh plants. Phytochemical screening was performed using various standard techniques. The results reveal the presence of a large range of bioactive compounds, i.e., alkaloids, saponins, flavonoids, terpenoids, glycosides, phenols, and steroids. The methanol, petroleum ether, and chloroform extracts showed high extractability of bioactive compounds. The results obtained establish these plants as a reliable source for the pharmacological industry, usable in the making of various biologically friendly drugs.
Keywords: bioactive compounds, Capparidaceae, phytochemical screening, secondary metabolites
Procedia PDF Downloads 174
5378 Cell Biomass and Lipid Productivities of Meyerella planktonica under Autotrophic and Heterotrophic Growth Conditions
Authors: Rory Anthony Hutagalung, Leonardus Widjaja
Abstract:
The microalga Meyerella planktonica is a potential biofuel source because it can grow in bulk under either autotrophic or heterotrophic conditions. However, the quantitative growth of this algal type is still low, as it tends to precipitate to the bottom. Besides, the lipid concentration is low when the alga is grown under autotrophic conditions. In contrast, heterotrophic conditions can enhance the lipid concentration. The combination of autotrophic conditions and an agitation treatment was used to increase the density of the culture, while a heterotrophic condition was set up to raise the lipid production. A two-stage experiment was applied to increase the density in the first step and the lipid concentration in the next. The autotrophic condition resulted in higher density but lower lipid concentration compared to the heterotrophic one. The agitation treatment produced higher density under both autotrophic and heterotrophic conditions. The two-stage experiment managed to enhance the density during the autotrophic stage and the lipid concentration during the heterotrophic stage. The highest yield was obtained using 0.4% v/v glycerol as a carbon source (2.9 ± 0.016 × 10⁶ cells w/w), attained 7 days after the heterotrophic stage began. The lipid concentration was stable starting from day 7.
Keywords: agitation, glycerol, heterotrophic, lipid productivity, Meyerella planktonica
Procedia PDF Downloads 337
5377 Experimental Investigation on the Effects of Electroless Nickel Phosphorus Deposition, pH and Temperature with the Varying Coating Bath Parameters on Impact Energy by Taguchi Method
Authors: D. Kari Basavaraja, M. G. Skanda, C. Soumya, V. Ramesh
Abstract:
This paper discusses the effects of sodium hypophosphite concentration, pH, and temperature on the deposition rate. It also discusses the evaluation of the coating strength, surface, and subsurface obtained by varying the bath parameters: the percentage of phosphorus, the plating temperature, and the pH of the plating solution. The Taguchi technique has been used for the analysis. In the experiment, nickel chloride has been used as the source of nickel, sodium hypophosphite as the reducing agent and the source of phosphorus, and sodium hydroxide to vary the pH of the coating bath. The coated samples are tested for impact energy by conducting an impact test. Finally, the effects of the coating bath parameters on the absorbed impact energy have been plotted and analyzed. Further, the percentage contribution of the coating bath parameters has been analyzed using the Design of Experiments (DOE) approach. It can be concluded that the bath parameters of the Ni-P coating certainly influence the strength of the specimen.
Keywords: bath parameters, coatings, design of experiment, fracture toughness, impact strength
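The percentage-contribution analysis in the Taguchi/DOE approach reduces to sums of squares of the factor-level means. The following sketch assumes a hypothetical L9 orthogonal array with three bath parameters at three levels each and one impact-energy response per run; it illustrates the method, not the paper's actual design or data.

```python
import numpy as np

# Hypothetical L9 orthogonal array: three factors (hypophosphite
# concentration, pH, temperature) at three levels each; entries are
# level indices 0-2, one row per experimental run.
L9 = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 2],
               [1, 0, 1], [1, 1, 2], [1, 2, 0],
               [2, 0, 2], [2, 1, 0], [2, 2, 1]])

def taguchi_contribution(y):
    """Percentage contribution of each factor to the variation of the
    response y (one impact-energy value per L9 run), via sums of squares
    of the factor-level means around the grand mean."""
    grand = y.mean()
    ss_total = ((y - grand) ** 2).sum()
    contrib = []
    for col in range(L9.shape[1]):
        # each level appears in 3 runs of the L9 array
        ss = sum(3 * (y[L9[:, col] == lvl].mean() - grand) ** 2
                 for lvl in range(3))
        contrib.append(100 * ss / ss_total)
    return contrib
```

Because the array is orthogonal, a response that depends on only one factor attributes its entire variation to that factor.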
Procedia PDF Downloads 351
5376 Analysis of Bio-Oil Produced by Pyrolysis of Coconut Shell
Authors: D. S. Fardhyanti, A. Damayanti
Abstract:
The utilization of biomass as a source of new and renewable energy is being pursued. One of the technologies for converting biomass into an energy source is pyrolysis, which turns biomass into more valuable products, such as bio-oil. Bio-oil is a liquid produced by a steam condensation process from the pyrolysis of coconut shells. The components of a coconut shell, e.g., hemicellulose, cellulose, and lignin, are oxidized to phenolic compounds, the main components of the bio-oil. The phenolic compounds in bio-oil are corrosive; they cause various difficulties in combustion systems because of high viscosity, low calorific value, corrosiveness, and instability. At the same time, phenolic compounds are very valuable: phenol is used as the main component in the manufacture of antiseptics, disinfectants (known as Lysol), and deodorizers. The experiments typically occurred at atmospheric pressure in a pyrolysis reactor at temperatures ranging from 300 °C to 350 °C, with a heating rate of 10 °C/min and a holding time of 1 hour at the pyrolysis temperature. Gas Chromatography-Mass Spectrometry (GC-MS) was used to analyze the bio-oil components. The obtained bio-oil has a viscosity of 1.46 cP, a density of 1.50 g/cm³, a calorific value of 16.9 MJ/kg, and a molecular weight of 1996.64. The GC-MS analysis showed that the bio-oil contained phenol (40.01%), ethyl ester (37.60%), 2-methoxy-phenol (7.02%), furfural (5.45%), formic acid (4.02%), 1-hydroxy-2-butanone (3.89%), and 3-methyl-1,2-cyclopentanedione (2.01%).
Keywords: bio-oil, pyrolysis, coconut shell, phenol, gas chromatography-mass spectroscopy
Procedia PDF Downloads 247
5375 Effects of Audiovisual Contextualization of L2 Idioms on Enhancing Students’ Comprehension and Retention
Authors: Monica Karlsson
Abstract:
The positive effect of a supportive written context on comprehension and retention when a learner faces a previously unknown idiomatic expression is today an indisputable fact, especially if relevant clues are given in close proximity to the item in question. Also, giving learners a chance to visualize the meaning of an idiom by offering them its source domain and/or by elaborating etymologically, i.e., providing a mental picture in addition to the spoken/written form (referred to as dual coding), seems to enhance comprehension and retention even further, especially if the idiom is of a more transparent kind. For example, by explaining that walk the plank has a maritime origin, and that a canary in a coal mine comes from the time when canaries were kept in cages to warn miners if gas was leaking out, at which point the canaries succumbed immediately, learners' comprehension and retention have been shown to increase. The present study aims to investigate whether contextualization of an audiovisual kind can help increase comprehension and retention of L2 idioms. 40 Swedish first-term university students studying English as part of their education to become middle-school teachers participated in the investigation, which tested 24 idioms, all of which were ascertained to be previously unknown to the informants. While half of the learners were subjected to a test in which they were asked to watch scenes from various TV programmes, each scene including one idiomatic expression in a supportive context, the remaining 20 students, as a point of reference, were only offered written contexts, though equally supportive ones. Immediately after these sessions, both groups were given the same idioms in a decontextualized form and asked to give their meaning. After five weeks, finally, the students were subjected to yet another decontextualized comprehension test.
Furthermore, since mastery of idioms in one's L1 appears to correlate to a great extent with a person's ability to comprehend idioms in an L2, all the informants were also asked to take a test focusing on idioms in their L1. The result on this test is thus taken to indicate each student's potential for understanding and memorizing idiomatic expressions from a more general perspective. Preliminary results clearly show that audiovisual contextualization indeed has a positive effect on learners' retention. In addition, preliminary results also show that the learners who were able to recall most meanings were those who had a propensity for idiom comprehension in their L1.
Keywords: English, L2, idioms, audiovisual context
Procedia PDF Downloads 346
5374 Manganese Imidazole Complexes: Electrocatalytic Hydrogen Production
Authors: Vishakha Kaim, Mookan Natarajan, Sandeep Kaur-Ghumaan
Abstract:
Hydrogen is one of the most abundant elements in the Earth's crust and is considered the simplest element in existence. It is not found naturally as a gas on Earth and thus has to be manufactured. Hydrogen can be produced from a variety of sources, e.g., water, fossil fuels, or biomass, and it is a byproduct of many chemical processes. It is also considered a secondary source of energy, commonly referred to as an energy carrier. Though hydrogen is not widely used as a fuel, it still has the potential for greater use in the future as a clean and renewable source of energy. Electrocatalysis is one of the important routes for hydrogen production that could contribute to this prominent challenge. Metals such as platinum and palladium are considered efficient for hydrogen production but have limited applications. As a result, a wide variety of metal complexes of earth-abundant elements with varied ligand environments have been explored for the electrochemical production of hydrogen. In nature, the [FeFe] hydrogenase enzyme present in Desulfovibrio desulfuricans and Clostridium pasteurianum catalyses the reversible interconversion of protons and electrons into dihydrogen. Since the first structure of the enzyme was reported in the 1990s, a range of iron complexes has been synthesized as structural and functional mimics of the enzyme active site. Mn is one of the most desirable elements for sustainable catalytic transformations, immediately behind Fe and Ti. Only a limited number of manganese complexes have been reported in the last two decades as catalysts for proton reduction. Furthermore, redox reactions can be carried out in a facile manner owing to the capability of manganese complexes to remain stable in different oxidation states. Herein, four µ₂-thiolate-bridged manganese complexes, [Mn₂(CO)₆(μ-S₂N₄C₁₄H₁₀)] 1, [Mn₂(CO)₇(μ-S₂N₄C₁₄H₁₀)] 2, [Mn₂(CO)₆(μ-S₄N₂C₁₄H₁₀)] 3, and [Mn₂(CO)(μ-S₄N₂C₁₄H₁₀)] 4, are reported, which have been synthesized and characterized.
The cyclic voltammograms of the complexes displayed irreversible reduction peaks in the range −0.9 to −1.3 V (vs. Fc⁺/Fc in acetonitrile at 0.1 V s⁻¹). Electrochemical investigations showed that the complexes are catalytically active towards proton reduction in the presence of trifluoroacetic acid.
Keywords: earth abundant, electrocatalytic, hydrogen, manganese
Procedia PDF Downloads 172
5373 A Comparative Study of Regional Climate Models and Global Coupled Models over Uttarakhand
Authors: Sudip Kumar Kundu, Charu Singh
Abstract:
As a great physiographic divide, the Himalayas affect a large system of water and air circulation, which helps determine the climatic conditions of the Indian subcontinent to the south and the mid-Asian highlands to the north. They act as a barrier by keeping chill continental air from entering India from the north in winter, and by forcing the rain-bearing southwesterly monsoon to give up maximum precipitation in the region during the monsoon season. Nowadays, extreme weather events such as heavy precipitation, cloudbursts, flash floods, landslides, and extreme avalanches are regular occurrences in the North Western Himalayan (NWH) region. The present study has been planned to investigate the suitable model(s) to find out the rainfall pattern over that region. For this investigation, selected models from the Coordinated Regional Climate Downscaling Experiment (CORDEX) and the Coupled Model Intercomparison Project Phase 5 (CMIP5) have been utilized in a consistent framework for the period 1976 to 2000 (historical). The ability of these driving models from the CORDEX domain and CMIP5 has been examined in terms of their capability to reproduce the spatial distribution as well as the time series of rainfall over NWH in the rainy season, compared with the ground-based India Meteorological Department (IMD) gridded rainfall data set. The analysis shows that models like MIROC5 and MPI-ESM-LR from both CORDEX and CMIP5 provide the best spatial distribution of rainfall over the NWH region. However, the driving models from CORDEX underestimate the daily rainfall amount compared to the CMIP5 driving models, as they are unable to capture daily rainfall properly when time series (TS) are plotted individually for the states of Uttarakhand (UK) and Himachal Pradesh (HP). Finally, it can be concluded that the driving models from CMIP5 are better suited than the CORDEX domain models for investigating the rainfall pattern over the NWH region.
Keywords: global warming, rainfall, CMIP5, CORDEX, NWH
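The model-versus-observation comparison described above boils down to a few standard skill scores computed between co-located rainfall fields. A minimal sketch, assuming the model output has already been regridded onto the observational grid:

```python
import numpy as np

def rainfall_skill(model, obs):
    """Compare a model rainfall field against a gridded observational field
    (e.g., IMD gridded data regridded to the model grid): mean bias, RMSE,
    and spatial pattern correlation.  Grids are assumed co-located."""
    m, o = model.ravel(), obs.ravel()
    bias = np.mean(m - o)
    rmse = np.sqrt(np.mean((m - o) ** 2))
    corr = np.corrcoef(m, o)[0, 1]
    return bias, rmse, corr
```

A model field equal to the observations plus a constant offset yields a bias and RMSE equal to the offset but a perfect pattern correlation, which is why bias and correlation are reported separately.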
Procedia PDF Downloads 169
5372 Characterization of Bovine SERPIN- Alpha-1 Antitrypsin (AAT)
Authors: Sharique Ahmed, Khushtar Anwar Salman
Abstract:
Alpha-1-antitrypsin (AAT) is a major plasma serine protease inhibitor (SERPIN). Hereditary AAT deficiency is one of the common diseases in some parts of the world. AAT is mainly produced in the liver and functions to protect the lung against proteolytic damage (e.g., from neutrophil elastase), acting as the major inhibitor of neutrophil elastase. α(1)-Antitrypsin (AAT) deficiency is an under-recognized genetic condition that affects approximately 1 in 2,000 to 1 in 5,000 individuals and predisposes to liver disease and early-onset emphysema. Not only does α-1-antitrypsin deficiency lead to the disabling syndrome of pulmonary emphysema; there are other associated disorders too, which include ANCA (antineutrophilic cytoplasmic antibody)-positive Wegener's granulomatosis, diffuse bronchiectasis, necrotizing panniculitis in the α-1-antitrypsin phenotype (S), idiopathic pulmonary fibrosis, and steroid-dependent asthma. Augmentation therapy with alpha-1 antitrypsin (AAT) from human plasma has been available as a specific treatment for emphysema due to AAT deficiency. Apart from this, several observations have also suggested a role for endogenous suppressors of HIV-1, and alpha-1 antitrypsin (AAT) has been identified as one of those. In view of its varied and important roles in humans, serum from a mammalian source was chosen for the isolation and purification. Studies were performed on the homogeneous fraction. This study suggests that buffalo serum α-1-antitrypsin has characteristics close to ovine, dog, and horse α-1-antitrypsin and, more importantly, to human α-1-antitrypsin in terms of its hydrodynamic properties, such as molecular weight and carbohydrate content. The similarities in the hydrodynamic properties of buffalo serum α-1-antitrypsin with other sources of mammalian α-1-antitrypsin mean that it can be studied further as a potential source for "augmentation therapy", as well as a source of AAT replacement therapy to raise serum levels above the protective threshold.
Other parameters, like the amino acid sequence, the effect of denaturants, and the thermolability or thermostability of the inhibitor, will be the interesting basis of future studies on buffalo serum alpha-1 antitrypsin (AAT).
Keywords: α-1-antitrypsin, augmentation therapy, hydrodynamic properties, serine protease inhibitor
Procedia PDF Downloads 489
5371 Two Component Source Apportionment Based on Absorption and Size Distribution Measurement
Authors: Tibor Ajtai, Noémi Utry, Máté Pintér, Gábor Szabó, Zoltán Bozóki
Abstract:
Beyond its climate- and health-related issues, ambient light-absorbing carbonaceous particulate matter (LAC) has recently also become of great scientific interest in terms of its regulation. It has been experimentally demonstrated in recent studies that LAC is dominantly composed of traffic and wood-burning aerosol, particularly under wintertime urban conditions, when photochemical and biological activities are negligible. Several methods have been introduced to quantitatively apportion the aerosol fractions emitted by wood burning and traffic, but most of them require costly and time-consuming off-line chemical analysis. As opposed to chemical features, the microphysical properties of airborne particles, such as optical absorption and size distribution, can easily be measured on-line, with high accuracy and sensitivity, especially under highly polluted urban conditions. Recently, a new method has been proposed for the apportionment of wood-burning and traffic aerosols based on the spectral dependence of their absorption, quantified by the aerosol Ångström exponent (AAE). In this approach, the absorption coefficient is deduced from a transmission measurement on a filter-accumulated aerosol sample, and the conversion factor between the measured optical absorption and the corresponding mass concentration (the specific absorption cross section) is determined by on-site chemical analysis. The recently developed multi-wavelength photoacoustic instruments provide a novel, in-situ approach towards the reliable and quantitative characterization of carbonaceous particulate matter. Therefore, they also open up novel possibilities for source apportionment through the measurement of light absorption.
In this study, we demonstrate an in-situ spectral characterization method for the ambient carbon fraction based on light absorption and size distribution measurements, using our state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS) and a Scanning Mobility Particle Sizer (SMPS). The carbonaceous-particulate-selective source apportionment study was performed for ambient particulate matter in the city center of Szeged, Hungary, where the dominance of traffic and wood-burning aerosol has been experimentally demonstrated earlier. The proposed model is based on the parallel, in-situ measurement of optical absorption and size distribution. AAEff and AAEwb were deduced from the measured data using the defined correlation between the AOC(1064nm)/AOC(266nm) and N100/N20 ratios. σff(λ) and σwb(λ) were determined with the help of the independently measured temporal mass concentrations in the PM1 mode. Furthermore, the proposed optical source apportionment is based on the assumption that the light-absorbing fraction of PM is exclusively related to traffic and wood burning. This assumption is indirectly confirmed here by the fact that the measured size distribution is composed of two unimodal size distributions identified as corresponding to traffic and wood-burning aerosols. The method offers the possibility of replacing laborious chemical analysis with the simple in-situ measurement of aerosol size distribution data. The results of the proposed novel optical-absorption-based source apportionment method prove its applicability whenever measurements are performed at an urban site where traffic and wood burning are the dominant carbonaceous emission sources.
Keywords: absorption, size distribution, source apportionment, wood burning, traffic aerosol
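The two-component apportionment itself reduces to a 2×2 linear system once AAEff and AAEwb are known: the absorption at each wavelength is the sum of a traffic and a wood-burning component, each scaling as λ^(−AAE). A minimal sketch, with placeholder AAE values rather than the ones the paper derives from the N100/N20 size-distribution ratio:

```python
import numpy as np

def two_component_apportionment(b1, b2, lam1, lam2, aae_ff=1.0, aae_wb=2.0):
    """Split measured absorption coefficients b1 (at wavelength lam1) and
    b2 (at lam2) into fossil-fuel (traffic) and wood-burning contributions
    at lam1, assuming each component follows b(lam) ~ lam**(-AAE).

    The AAE values (1.0 for traffic, 2.0 for wood burning) are illustrative
    placeholders; the paper derives them from measured size-distribution
    ratios rather than assuming them."""
    rf = (lam2 / lam1) ** (-aae_ff)   # scaling of the fossil-fuel component
    rw = (lam2 / lam1) ** (-aae_wb)   # scaling of the wood-burning component
    # solve b1 = b_ff + b_wb and b2 = rf*b_ff + rw*b_wb
    b_ff = (b2 - rw * b1) / (rf - rw)
    b_wb = b1 - b_ff
    return b_ff, b_wb
```

Forward-constructing synthetic data from known components and inverting it recovers those components exactly, which is the defining property of the two-component model.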
Procedia PDF Downloads 227
5370 Robust Numerical Solution for Flow Problems
Authors: Gregor Kosec
Abstract:
A simple and robust numerical approach for solving flow problems is presented, in which the involved physical fields are represented through local approximation functions, i.e., the considered field is approximated over a local support domain. The approximation functions are then used to evaluate the partial differential operators. The type of approximation, the size of the support domain, and the type and number of basis functions can be general. The solution procedure is formulated completely through local computational operations. Besides the spatial discretization, the pressure-velocity coupling is also performed locally, while retaining the correct temporal transient. The complete locality of the introduced numerical scheme has several beneficial effects. One of the most attractive is its simplicity, since the scheme can be understood as a generalized Finite Difference Method, yet a much more powerful one. The presented methodology offers many possibilities for treating challenging cases, e.g., nodal adaptivity to address regions with sharp discontinuities, or p-adaptivity to treat obscure anomalies in the physical field. The balance between stability, computational complexity, and accuracy can be regulated by changing the number of support nodes, etc. All these features can be controlled on the fly during the simulation. The presented methodology is relatively simple to understand and implement, which makes it a potentially powerful tool for engineering simulations. Besides simplicity and straightforward implementation, there are many opportunities to fully exploit modern computer architectures through different parallel computing strategies. The performance of the method is presented on the lid-driven cavity problem, the backward-facing step problem, and the de Vahl Davis natural convection test, extended also to a low-Prandtl-number fluid and Darcy porous flow. Results are presented in terms of velocity profiles, convergence plots, and stability analyses.
Results of all cases are also compared against published data.
Keywords: fluid flow, meshless, low Pr problem, natural convection
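The core local operation — computing weights that turn nodal field values over a support domain into a partial differential operator at a point — can be sketched with a monomial basis. This is a generic illustration of the local-approximation idea, not the paper's exact formulation (basis type, support size, and weighting are all free choices in the method):

```python
import numpy as np

def laplace_weights(center, support):
    """Weights w such that sum_i w_i * u(x_i) approximates the Laplacian of
    u at `center`, using a monomial basis (1, x, y, x^2, xy, y^2) over the
    local support nodes.  Shifted local coordinates keep the linear system
    well conditioned; the weights are exact for any quadratic field."""
    d = support - center                        # local coordinates
    x, y = d[:, 0], d[:, 1]
    A = np.stack([np.ones_like(x), x, y, x**2, x * y, y**2], axis=1)
    rhs = np.array([0, 0, 0, 2, 0, 2], float)   # Laplacian of each monomial
    # enforce sum_i w_i * p_j(x_i) = (Laplacian p_j)(center) for all basis p_j
    w, *_ = np.linalg.lstsq(A.T, rhs, rcond=None)
    return w
```

Because the weights reproduce the Laplacian exactly on the span of the basis, a quadratic test field recovers its analytical Laplacian, which is the standard consistency check for such stencils.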
Procedia PDF Downloads 233
5369 Microarray Data Visualization and Preprocessing Using R and Bioconductor
Authors: Ruchi Yadav, Shivani Pandey, Prachi Srivastava
Abstract:
Microarrays provide a rich source of data on the molecular workings of cells. Each microarray reports on the abundance of tens of thousands of mRNAs. Virtually every human disease is being studied using microarrays, with the hope of finding the molecular mechanisms of disease. Bioinformatics analysis plays an important part in processing the information embedded in large-scale expression profiling studies and in laying the foundation for biological interpretation. A basic, yet challenging task in the analysis of microarray gene expression data is the identification of changes in gene expression that are associated with particular biological conditions. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. One of the most popular platforms for microarray analysis is Bioconductor, an open-source and open-development software project based on the R programming language. This paper describes specific procedures for conducting quality assessment, visualization, and preprocessing of Affymetrix GeneChip data, details the different Bioconductor packages used to analyze Affymetrix microarray data, and describes the analysis and outcome of each plot.
Keywords: microarray analysis, R language, affymetrix visualization, bioconductor
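As a language-neutral illustration of one standard preprocessing step: quantile normalization, which Bioconductor performs in R (e.g., within the affy/limma machinery), forces every array onto a common intensity distribution. The NumPy sketch below shows the algorithm itself; package and function choices in an actual Bioconductor workflow differ.

```python
import numpy as np

def quantile_normalize(X):
    """Quantile normalization of an expression matrix (rows = probes,
    columns = arrays): each array's values are replaced by the mean of the
    sorted intensities across arrays, so all arrays share one distribution.
    Ties are broken arbitrarily by the sort order."""
    order = np.argsort(X, axis=0)                  # per-column sort order
    ranks = np.argsort(order, axis=0)              # rank of each entry
    mean_sorted = np.sort(X, axis=0).mean(axis=1)  # reference distribution
    return mean_sorted[ranks]
```

After normalization, every column has exactly the same set of values, only permuted according to each array's original ranking.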
Procedia PDF Downloads 480