Search results for: computer experiments
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2911

1561 Proteins Length and their Phenotypic Potential

Authors: Tom Snir, Eitan Rubin

Abstract:

Mendelian Disease Genes represent a collection of single points of failure for the various systems they constitute. Such genes have been shown, on average, to encode longer proteins than 'non-disease' proteins. Existing models suggest that this results from the increased likelihood of longer genes undergoing mutations. Here, we show that in saturated mutagenesis experiments performed on model organisms, where the likelihood of each gene mutating is one, a similar relationship between length and the probability of a gene being lethal was observed. We thus suggest an extended model demonstrating that the likelihood of a mutated gene producing a severe phenotype is length-dependent. Using the occurrence of conserved domains, we bring evidence that this dependency results from a correlation between protein length and the number of functions it performs. We propose that protein length thus serves as a proxy for protein cardinality in the different networks required for the organism's survival and well-being. We use this example to argue that the collection of Mendelian Disease Genes can, and should, be used to study the rules governing systems vulnerability in living organisms.

Keywords: Systems Biology, Protein Length

1560 Li-Fi Technology: Data Transmission through Visible Light

Authors: Shahzad Hassan, Kamran Saeed

Abstract:

People are always in search of Wi-Fi hotspots because Internet access is a major demand nowadays. But like all other technologies, there is still room for improvement in Wi-Fi technology with regard to the speed and quality of connectivity. To address these aspects, Harald Haas, a professor at the University of Edinburgh, proposed what is now known as Li-Fi (Light Fidelity). Li-Fi is a new technology in the field of wireless communication that provides connectivity within a network environment. It is a two-way mode of wireless communication using light. The data are transmitted through Light Emitting Diodes, whose intensity can be varied very fast, even faster than the blink of an eye. From the research and experiments conducted so far, it can be said that Li-Fi can increase the speed and reliability of data transfer. This paper pays particular attention to the assessment of the performance of this technology, a 5G technology which uses LEDs as the medium of data transfer. For coverage within buildings, Wi-Fi is adequate, but Li-Fi is favorable in situations where large amounts of data are to be transferred in areas with electromagnetic interference. It brings qualities such as efficiency, security and large throughput to wireless communication. All in all, it can be said that Li-Fi is going to be a future phenomenon in which the presence of light will mean access to the Internet as well as speedy data transfer.

Keywords: Communication, LED, Li-Fi, Wi-Fi.
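
The intensity modulation described in this abstract can be illustrated with a minimal on-off keying (OOK) sketch: bits drive LED intensity samples, and a receiver-side threshold recovers them. This is a generic illustration rather than the authors' system; the bit rate, samples per bit, noise level and threshold are assumptions.

```python
# Minimal on-off keying (OOK) sketch of visible-light data transfer.
# All parameters (samples_per_bit, noise level, threshold) are illustrative assumptions.
import numpy as np

def ook_modulate(bits, samples_per_bit=8, high=1.0, low=0.0):
    """Map each bit to a block of LED intensity samples."""
    levels = np.where(np.array(bits) == 1, high, low)
    return np.repeat(levels, samples_per_bit)

def ook_demodulate(signal, samples_per_bit=8, threshold=0.5):
    """Average each block of received samples and threshold it back to a bit."""
    blocks = signal.reshape(-1, samples_per_bit)
    return (blocks.mean(axis=1) > threshold).astype(int)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=32)
tx = ook_modulate(bits)
rx = tx + rng.normal(0.0, 0.05, size=tx.size)   # ambient-light / receiver noise
recovered = ook_demodulate(rx)
print("bit errors:", int(np.sum(recovered != bits)))
```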

1559 Computational Intelligence Hybrid Learning Approach to Time Series Forecasting

Authors: Chunshien Li, Jhao-Wun Hu, Tai-Wei Chiang, Tsunghan Wu

Abstract:

Time series forecasting is an important and widely studied topic in the research of system modeling. This paper describes how to apply the hybrid PSO-RLSE neuro-fuzzy learning approach to the problem of time series forecasting. The PSO algorithm is used to update the premise parameters of the proposed prediction system, and the RLSE is used to update the consequence parameters. Thanks to the hybrid learning (HL) approach for the neuro-fuzzy system, the prediction performance is excellent and the speed of learning convergence is much faster than that of the other compared approaches. In the experiments, we use the well-known Mackey-Glass chaotic time series. According to the experimental results, the prediction performance and accuracy in time series forecasting by the proposed approach are much better than those of the other compared approaches, as shown in Table IV. Excellent prediction performance by the proposed approach has been observed.

Keywords: forecasting, hybrid learning (HL), Neuro-Fuzzy System (NFS), particle swarm optimization (PSO), recursive least-squares estimator (RLSE), time series
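
For reproducibility of the benchmark mentioned in the abstract, the Mackey-Glass chaotic series can be generated numerically. The sketch below uses the commonly cited parameters (tau=17, beta=0.2, gamma=0.1, n=10) and a simple Euler scheme; these settings are assumptions on our part, not values taken from the paper.

```python
# Minimal sketch: generate the Mackey-Glass chaotic time series by Euler
# integration of dx/dt = beta*x(t-tau)/(1 + x(t-tau)**n) - gamma*x(t).
# Parameter values and step size are the commonly used ones, assumed here.
import numpy as np

def mackey_glass(n_samples=1000, tau=17, beta=0.2, gamma=0.1, n=10, dt=1.0, x0=1.2):
    history = int(tau / dt)
    x = np.zeros(n_samples + history)
    x[:history] = x0                      # constant initial history
    for t in range(history, n_samples + history - 1):
        x_tau = x[t - history]
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau ** n) - gamma * x[t])
    return x[history:]

series = mackey_glass()
print(series[:5])
```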

1558 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models

Authors: Rossella Arcucci, Luisa D’Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti

Abstract:

This work is the first step in a rather wide research activity, in collaboration with the Euro Mediterranean Center for Climate Changes, aimed at introducing scalable approaches in Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model, previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped on GPU architectures.

Keywords: Data Assimilation, Parallel Algorithm, GPU architectures, Ocean Models.
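
The 3DVar analysis step that such a DA scheme performs can be written down and solved directly for a small toy problem. The sketch below is a dense NumPy illustration of the variational analysis only, not the authors' domain-decomposed GPU code; the state size, observation operator and covariances are illustrative assumptions.

```python
# Minimal dense sketch of a 3DVar analysis step: minimize
#   J(x) = (x - xb)^T B^{-1} (x - xb) + (H x - y)^T R^{-1} (H x - y),
# whose minimizer solves (B^{-1} + H^T R^{-1} H) dx = H^T R^{-1} (y - H xb).
# Sizes and covariances are toy assumptions, not the DD-DA configuration.
import numpy as np

rng = np.random.default_rng(1)
n, m = 50, 20                              # state size, number of observations
xb = rng.normal(size=n)                    # background state
H = rng.normal(size=(m, n)) / np.sqrt(n)   # observation operator
y = H @ xb + rng.normal(scale=0.1, size=m) # synthetic observations
B = np.eye(n) * 1.0                        # background error covariance
R = np.eye(m) * 0.01                       # observation error covariance

Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
A = Binv + H.T @ Rinv @ H
rhs = H.T @ Rinv @ (y - H @ xb)
xa = xb + np.linalg.solve(A, rhs)          # analysis state
print("background misfit:", np.linalg.norm(y - H @ xb))
print("analysis   misfit:", np.linalg.norm(y - H @ xa))
```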

1557 Continual Learning Using Data Generation for Hyperspectral Remote Sensing Scene Classification

Authors: Samiah Alammari, Nassim Ammour

Abstract:

When a massive number of tasks is provided successively to a deep learning process, good model performance requires preserving the data of previous tasks in order to retrain the model for each upcoming classification. Otherwise, the model performs poorly due to the catastrophic forgetting phenomenon. To overcome this shortcoming, we developed a successful continual learning deep model for remote sensing hyperspectral image region classification. The proposed neural network architecture encapsulates two trainable subnetworks. The first module adapts its weights by minimizing the discrimination error between the land-cover classes during the new task learning, while the second module learns how to replicate the data of the previous tasks by discovering the latent data structure of the new task dataset. We conduct experiments on the Indian Pines hyperspectral image (HSI) dataset. The results confirm the capability of the proposed method.

Keywords: Continual learning, data reconstruction, remote sensing, hyperspectral image segmentation.
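
The two-subnetwork idea (a classifier for the current task plus a generator that learns to reproduce the task data for later replay) can be sketched with a small PyTorch model. Layer sizes, losses, placeholder tensors and the training loop below are assumptions for illustration, not the architecture reported in the paper.

```python
# Minimal PyTorch sketch of the two-subnetwork idea: a classifier trained on the
# current task plus an autoencoder that learns to reproduce the task data so it
# can be replayed when the next task arrives. Dimensions and data are assumed.
import torch
import torch.nn as nn

n_bands, n_classes, hidden = 200, 16, 64       # e.g. hyperspectral pixel vectors

classifier = nn.Sequential(nn.Linear(n_bands, hidden), nn.ReLU(),
                           nn.Linear(hidden, n_classes))
autoencoder = nn.Sequential(nn.Linear(n_bands, 32), nn.ReLU(),   # generator module
                            nn.Linear(32, n_bands))

opt = torch.optim.Adam(list(classifier.parameters()) +
                       list(autoencoder.parameters()), lr=1e-3)
ce, mse = nn.CrossEntropyLoss(), nn.MSELoss()

x_new = torch.randn(128, n_bands)              # current-task pixels (placeholder)
y_new = torch.randint(0, n_classes, (128,))
x_replay = torch.randn(128, n_bands)           # samples replayed from earlier tasks
y_replay = torch.randint(0, n_classes, (128,))

for _ in range(10):                            # a few illustrative steps
    x = torch.cat([x_new, x_replay])
    y = torch.cat([y_new, y_replay])
    loss = ce(classifier(x), y) + mse(autoencoder(x_new), x_new)
    opt.zero_grad()
    loss.backward()
    opt.step()
```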

1556 Line Heating Forming: Methodology and Application Using Kriging and Fifth Order Spline Formulations

Authors: Henri Champliaud, Zhengkun Feng, Ngan Van Lê, Javad Gholipour

Abstract:

In this article, a method is presented to effectively estimate the deformed shape of a thick plate due to line heating. The method uses a fifth order spline interpolation, with up to C3 continuity at specific points, to compute the shape of the deformed geometry. First and second order derivatives over the surface are the resulting parameters of a given heating line on a plate. These parameters are determined through experiments and/or finite element simulations. Very accurate kriging models are fitted to real or virtual surfaces to build up a database of maps. Maps of first and second order derivatives are then applied to numerical plate models to evaluate their evolving shapes through a sequence of heating lines. Adding an optimization process to this approach would allow determining the trajectories of heating lines needed to shape complex geometries, such as Francis turbine blades.

Keywords: Deformation, kriging, fifth order spline interpolation, first, second and third order derivatives, C3 continuity, line heating, plate forming, thermal forming.
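
The kriging step, fitting a smooth surrogate to derivative values sampled across a heated plate so they can be looked up later, can be illustrated with a Gaussian-process regressor. The synthetic data, kernel and use of scikit-learn below are assumptions, not the authors' kriging formulation.

```python
# Minimal sketch of the kriging idea: fit a Gaussian-process (kriging) surrogate
# to derivative values sampled over a plate, then query it on a grid.
# The synthetic data, kernel and library choice are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 1.0, size=(40, 2))          # (x, y) positions on the plate
# pretend these are measured/simulated second-order derivatives near a heating line
d2w = np.exp(-8.0 * X[:, 1] ** 2) + 0.01 * rng.normal(size=40)

kernel = ConstantKernel(1.0) * RBF(length_scale=0.3)
gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-4, normalize_y=True)
gp.fit(X, d2w)

grid = np.array([[x, y] for x in np.linspace(-1, 1, 5)
                         for y in np.linspace(-1, 1, 5)])
pred, std = gp.predict(grid, return_std=True)
print(pred.round(3))
```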

1555 Adaptive Score Normalization: A Novel Approach for Multimodal Biometric Systems

Authors: Anouar Ben Khalifa, Sami Gazzah, Najoua Essoukri BenAmara

Abstract:

Multimodal biometric systems integrate the data presented by multiple biometric sources, hence offering better performance than systems based on a single biometric modality. Although the coupling of biometric systems can be done at different levels, fusion at the score level is the most common, since it has been proven more effective than fusion at the other levels. However, the scores from different modalities are generally heterogeneous. A normalization step is needed to transform these scores into a common domain before combining them. In this paper, we study the performance of several normalization techniques with various fusion methods in a context relating to the fusion of three unimodal systems based on the face, the palmprint and the fingerprint. We also propose a new adaptive normalization method that takes into account the distribution of client scores and impostor scores. Experiments conducted on a database of 100 people show that the performance of a multimodal system depends on the choice of the normalization method and the fusion technique. The proposed normalization method gave the best results.

Keywords: Multibiometrics, Fusion, Score level, Score normalization, Adaptive normalization.
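
The normalize-then-fuse pipeline can be made concrete with two classical normalizers (min-max and z-score) and a simple weighted-sum fusion. The adaptive method proposed in the paper is not reproduced here; the score values and modality weights are placeholder assumptions.

```python
# Minimal sketch of score-level fusion: normalize heterogeneous matcher scores
# (min-max and z-score shown; the paper's adaptive method is not reproduced)
# and combine them with a weighted sum. Scores and weights are illustrative.
import numpy as np

def min_max(scores):
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def z_score(scores):
    s = np.asarray(scores, dtype=float)
    return (s - s.mean()) / s.std()

# raw scores from three unimodal matchers (face, palmprint, fingerprint)
face   = np.array([0.62, 0.10, 0.81, 0.33])
palm   = np.array([412., 150., 505., 260.])   # different scale/domain
finger = np.array([55.,  12.,  70.,  30.])

normalized = [min_max(face), min_max(palm), min_max(finger)]
weights = np.array([0.4, 0.3, 0.3])           # assumed modality weights
fused = sum(w * s for w, s in zip(weights, normalized))
print("fused scores:", fused.round(3))
```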

1554 Optimisation of Polycyclic Aromatic Hydrocarbon Removal from Contaminated Soil Using Modified Fenton Treatment

Authors: Venny, S. Gan, H. K. Ng

Abstract:

The performance of modified Fenton (MF) treatment to promote PAH oxidation in artificially contaminated soil was investigated in a packed soil column with a hydrogen peroxide (H2O2) delivery system simulating in situ injection. Soil samples were spiked with phenanthrene (a low molecular weight PAH) and fluoranthene (a high molecular weight PAH) to an initial concentration of 500 mg/kg dried soil each. The effects of the process parameters H2O2/soil, iron/soil and chelating agent/soil weight ratios and reaction time were studied using a 2⁴ three-level factorial design of experiments. Statistically significant quadratic models were developed using Response Surface Methodology (RSM) for degrading PAHs from the soil samples. The optimum operating condition was achieved at mild H2O2/soil, iron/soil and chelating agent/soil weight ratios, indicating a cost-efficient method for treating highly contaminated land.

Keywords: Fenton, polycyclic aromatic hydrocarbon, chelate, response surface methodology
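
The RSM step, fitting a quadratic model to factorial-design responses so a favourable operating region can be located, can be sketched with an ordinary least-squares fit. The design points and removal values below are synthetic placeholders, not the paper's data.

```python
# Minimal sketch of a response-surface fit: a full quadratic model in two coded
# factors fitted by least squares. Design points and responses are synthetic
# placeholders, not the experimental PAH-removal data from the paper.
import numpy as np

# coded levels (-1, 0, +1) for, say, H2O2/soil and iron/soil ratios
x1, x2 = np.meshgrid([-1, 0, 1], [-1, 0, 1])
x1, x2 = x1.ravel().astype(float), x2.ravel().astype(float)
y = 80 - 5*x1**2 - 8*x2**2 + 3*x1 + 2*x2 + np.random.default_rng(3).normal(0, 1, 9)

# columns: intercept, x1, x2, x1*x2, x1^2, x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted quadratic coefficients:", coef.round(2))
```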

1553 Revising the Student Experiment Materials and Practices at the National University of Laos

Authors: Syhalath Xaphakdy, Toshio Nagata, Saykham Phommathat, Pavy Souwannavong, Vilayvanh Srithilat, Phoxay Sengdala, Bounaom Phetarnousone, Boualay Siharath, Xaya Chemcheng

Abstract:

The National University of Laos (NUOL) invited a group of volunteers from the Japan International Cooperation Agency (JICA) to revise the physics experiments to utilize the materials that were already available to students. The intention was to review and revise the materials regularly utilized in physics class. The project had access to limited materials and a small budget for the class in the unit; however, by developing experimental textbooks related to mechanics, electricity, and wave and vibration, the group found a way to apply them in the classroom and enhance the students' teaching activities. The aim was to introduce a way to incorporate the materials and practices in the classroom to enhance the students' learning and teaching skills, particularly when they graduate and begin working as high school teachers.

Keywords: NUOL, JICA, physics experiment materials, small budget, mechanics, electricity.

1552 Kinetic and Optimization Studies on Ethanol Production from Corn Flour

Authors: K. Manikandan, T. Viruthagiri

Abstract:

Simultaneous Saccharification and Fermentation (SSF) of corn flour, a major agricultural product, was studied in batch fermentation using the starch-digesting glucoamylase enzyme derived from Aspergillus niger and the non-starch-digesting, sugar-fermenting yeast Saccharomyces cerevisiae. Experiments based on a Central Composite Design (CCD) were conducted to study the effect of substrate concentration, pH, temperature and enzyme concentration on ethanol concentration, and these parameters were optimized using Response Surface Methodology (RSM). The optimum values of substrate concentration, pH, temperature and enzyme concentration were found to be 160 g/l, 5.5, 30°C and 50 IU, respectively. The effect of inoculum age on ethanol concentration was also investigated. The corn flour solution equivalent to 16% initial starch concentration gave the highest ethanol concentration of 63.04 g/l after 48 h of fermentation at the optimum conditions of pH and temperature. The Monod model and the Logistic model were used for growth kinetics, and the Luedeking-Piret model was used for product formation kinetics.

Keywords: Simultaneous Saccharification and Fermentation (SSF), Corn Starch, Ethanol, Logistic Model.
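
The kinetic models named in the abstract, logistic growth for biomass and the Luedeking-Piret equation for product formation, can be written as a small ODE system. The parameter values below are assumptions for illustration, not the fitted values from this study.

```python
# Minimal sketch of the growth/product kinetics named above: logistic biomass
# growth dX/dt = mu_max*X*(1 - X/Xmax) and Luedeking-Piret product formation
# dP/dt = alpha*dX/dt + beta*X. Parameter values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, Xmax = 0.25, 12.0          # 1/h, g/L   (assumed)
alpha, beta = 4.0, 0.05            # g ethanol / g biomass, g/(g.h)  (assumed)

def kinetics(t, state):
    X, P = state
    dX = mu_max * X * (1.0 - X / Xmax)
    dP = alpha * dX + beta * X
    return [dX, dP]

sol = solve_ivp(kinetics, (0.0, 48.0), [0.5, 0.0], t_eval=np.linspace(0, 48, 7))
for t, X, P in zip(sol.t, sol.y[0], sol.y[1]):
    print(f"t={t:5.1f} h  biomass={X:5.2f} g/L  ethanol={P:6.2f} g/L")
```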

1551 The Application of an Experimental Design for the Defect Reduction of Electrodeposition Painting on Stainless Steel Washers

Authors: Chansiri Singhtaun, Nattaporn Prasartthong

Abstract:

The purpose of this research is to reduce the number of incompletely coated stainless steel washers in the electrodeposition painting process by using an experimental design technique. Surface preparation was found to be a major determinant of painted surface quality. The influence of the pretreatment and painting process parameters, namely cleaning time, chemical concentration and shape of hanger, was studied. A 2³ factorial design with two replications was performed. The analysis of variance for the designed experiment showed the strong influence of cleaning time and hanger shape. From this study, an optimized cleaning time was determined, and a newly designed electrically conductive hanger was proved to be superior to the original one. The experimental verification results showed that the amount of incomplete coating defects decreased from 4% to 1.02% and the operation cost decreased by 10.5%.

Keywords: Defect reduction, design of experiments, electrodeposition painting, stainless steel.
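
The analysis of a 2³ factorial experiment reduces to computing main and interaction effects from coded factor levels. The sketch below shows the effect calculation on placeholder defect percentages, not the plant data reported above.

```python
# Minimal sketch of effect estimation for a 2^3 factorial design with two
# replications: effect = mean(response at +1) - mean(response at -1).
# The defect percentages below are placeholders, not the plant data.
import numpy as np

# coded levels for cleaning time (A), chemical concentration (B), hanger shape (C)
design = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])
# two replications of the defect percentage for each of the 8 runs (placeholder)
rng = np.random.default_rng(4)
response = 4.0 - 1.2*design[:, 0] - 0.9*design[:, 2] + rng.normal(0, 0.2, (2, 8))
mean_resp = response.mean(axis=0)

for name, col in zip(("cleaning time", "concentration", "hanger shape"), design.T):
    effect = mean_resp[col == 1].mean() - mean_resp[col == -1].mean()
    print(f"main effect of {name}: {effect:+.2f} % defects")
```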

1550 Hardiness vs Alienation Personality Construct Essentially Explains Burnout Proclivity and Erroneous Computer Entry Problems in Rural Hellenic Hospital Labs

Authors: Angela–M. Paleologou, Aphrodite Dellaporta

Abstract:

Erroneous computer entry problems [here: 'e-errors'] in hospital labs threaten the patients–health carers' relationship, undermining the health system's credibility. Are e-errors random, and do lab professionals make them accidentally, or may they be traced through meaningful determinants? Theories on the internal causality of mistakes compel us to seek specific causal ascriptions of hospital lab e-errors instead of accepting some inescapability. Undeniably, 'To Err is Human'. But in view of rapid global health organizational changes, e-errors are too expensive to lack in-depth consideration. Yet, whether that e-function might supposedly be entrenched in the health carers' job description remains under dispute – at least for Hellenic labs, where e-use falls behind generalized(able) appreciation and application. In this study: i) an empirical basis of a truly high annual cost of e-errors, at about €498,000.00 per rural Hellenic hospital, was established, hence interest in exploring the issue was sufficiently substantiated; ii) a sample of 270 lab-expert nurses, technicians and doctors was assessed on several personality, burnout and e-error measures; and iii) the hypothesis that the Hardiness vs Alienation personality construct disposition explains resistance vs proclivity to e-errors was tested and verified: Hardiness operates as a source of resilience in the encounter with the high pressures experienced in the hospital lab, whereas its 'opposite', i.e., Alienation, functions as a predictor not only of making e-errors but also of leading to burnout. Implications for apt interventions are discussed.

Keywords: Hospital lab, personality hardiness/alienation, e-errors' cost, burnout.

1549 Stock Portfolio Selection Using Chemical Reaction Optimization

Authors: Jin Xu, Albert Y.S. Lam, Victor O.K. Li

Abstract:

Stock portfolio selection is a classic problem in finance, and it involves deciding how to allocate an institution's or an individual's wealth to a number of stocks, with certain investment objectives (return and risk). In this paper, we adopt the classical Markowitz mean-variance model and consider an additional common realistic constraint, namely, the cardinality constraint. Thus, stock portfolio optimization becomes a mixed-integer quadratic programming problem that is difficult to solve with exact optimization algorithms. Chemical Reaction Optimization (CRO), which mimics the molecular interactions in a chemical reaction process, is a population-based metaheuristic method. Two different types of CRO, named canonical CRO and Super Molecule-based CRO (S-CRO), are proposed to solve the stock portfolio selection problem. We test both canonical CRO and S-CRO on a benchmark and compare their performance under two criteria: the Markowitz efficient frontier (Pareto frontier) and the Sharpe ratio. Computational experiments suggest that S-CRO is promising in handling the stock portfolio optimization problem.

Keywords: Stock portfolio selection, Markowitz model, Chemical Reaction Optimization, Sharpe ratio
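
The quantities that such a metaheuristic optimizes over, Markowitz return and variance under a cardinality constraint, plus the Sharpe-ratio criterion, can be evaluated for any candidate portfolio as below. CRO itself is not implemented here; the return data, weights and limits are illustrative assumptions.

```python
# Minimal sketch of the quantities CRO searches over: expected return, variance
# (Markowitz), a cardinality-constraint check, and the Sharpe ratio of a
# candidate portfolio. Returns, weights and limits are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)
returns = rng.normal(0.001, 0.02, size=(250, 6))   # daily returns, 6 stocks
mu, cov = returns.mean(axis=0), np.cov(returns.T)

w = np.array([0.4, 0.0, 0.3, 0.0, 0.2, 0.1])       # candidate portfolio weights
max_assets, rf = 4, 0.0                            # cardinality limit, risk-free rate

assert abs(w.sum() - 1.0) < 1e-9                   # fully invested
assert np.count_nonzero(w) <= max_assets           # cardinality constraint

port_return = float(w @ mu)
port_risk = float(np.sqrt(w @ cov @ w))
sharpe = (port_return - rf) / port_risk
print(f"return={port_return:.5f}  risk={port_risk:.5f}  Sharpe={sharpe:.3f}")
```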

1548 Data Gathering and Analysis for Arabic Historical Documents

Authors: Ali Dulla

Abstract:

This paper introduces a new dataset (and the methodology used to generate it) based on a wide range of historical Arabic documents containing clean data with simple and homogeneous page layouts. The experiments are implemented on printed and handwritten documents obtained from important libraries such as the Qatar Digital Library, the British Library and the Library of Congress. We have gathered and annotated 150 archival document images from different locations and time periods, drawn from documents of the 17th to 19th centuries. The dataset comprises differing page layouts and degradations that challenge text line segmentation methods. Ground truth is produced using the Aletheia tool by PRImA and stored in an XML representation, in the PAGE (Page Analysis and Ground truth Elements) format. The dataset will be readily available to researchers worldwide for research into the obstacles facing various historical Arabic documents, such as the geometric correction of historical Arabic documents.

Keywords: Dataset production, ground truth production, historical documents, arbitrary warping, geometric correction.

1547 Rotation Invariant Fusion of Partial Image Parts in Vista Creation using Missing View Regeneration

Authors: H. B. Kekre, Sudeep D. Thepade

Abstract:

The automatic construction of large, high-resolution image vistas (mosaics) is an active area of research in the fields of photogrammetry [1,2], computer vision [1,4], medical image processing [4], computer graphics [3] and biometrics [8]. Image stitching is one of the possible options for obtaining image mosaics. Vista creation in image processing is used to construct an image with a larger field of view than could be obtained with a single photograph. It refers to transforming and stitching multiple images into a new aggregate image without any visible seam or distortion in the overlapping areas. The vista creation process aligns two partial images over each other and blends them together. Image mosaics allow one to compensate for differences in viewing geometry. Thus they can be used to simplify tasks by simulating the condition in which the scene is viewed from a fixed position with a single camera. While obtaining partial images, geometric anomalies such as rotation and scaling are bound to occur. To nullify the effect of rotation of partial images on the vista creation process, we propose a rotation invariant vista creation algorithm in this paper. Rotation of partial image parts in the proposed method of vista creation may introduce some missing regions in the vista. To correct this error, that is, to fill the missing regions, we have applied an image inpainting method to the created vista. This missing view regeneration method also overcomes the problem of missing views [31] in the vista due to cropping, irregular boundaries of partial image parts and errors in digitization [35]. The method of missing view regeneration generates the missing view of the vista using the information present in the vista itself.

Keywords: Vista, Overlap Estimation, Rotation Invariance, Missing View Regeneration.
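
A generic version of the align-blend-inpaint pipeline (feature matching, homography estimation, warping, and inpainting of unfilled regions) can be sketched with OpenCV. This is standard stitching machinery, not the authors' rotation-invariant algorithm, and the input file names are placeholders.

```python
# Minimal OpenCV sketch of the generic pipeline behind vista creation: match
# features between two partial images, estimate a homography, warp one image
# onto the other, and inpaint any missing region. This is standard stitching
# machinery, not the authors' algorithm; file names are placeholders.
import cv2
import numpy as np

img1 = cv2.imread("part_left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical inputs
img2 = cv2.imread("part_right.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(1000)
k1, d1 = orb.detectAndCompute(img1, None)
k2, d2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
matches = sorted(matches, key=lambda m: m.distance)[:100]

src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

h, w = img1.shape
vista = cv2.warpPerspective(img2, H, (w * 2, h))             # warp second part
vista[:, :w] = np.maximum(vista[:, :w], img1)                # crude blend

missing = (vista == 0).astype(np.uint8) * 255                # unfilled pixels
vista = cv2.inpaint(vista, missing, 3, cv2.INPAINT_TELEA)    # regenerate missing view
cv2.imwrite("vista.png", vista)
```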

1546 Anticoagulatory Role of an Ergot Mesylate: Hydergine

Authors: Fareeha A., Irfan Z Qureshi

Abstract:

Thrombosis can be life-threatening, necessitating therefore its instant treatment. Hydergine, a nootropic agent, is used as a cognition enhancer in stroke patients, but relatively little is known about its antithrombotic effect. To investigate this aspect, in vivo and ex vivo experiments were designed and conducted. Three groups of rats were injected with 1.5 mg, 3.0 mg and 4.5 mg hydergine intraperitoneally, with and without prior exposure to fresh plasma. Positive and negative controls were run in parallel. Animals were sacrificed after 1.5 hrs and BT, CT, PT, INR, APTT and plasma calcium levels were estimated. For ex vivo analyses, each 1 ml of aspirated blood was exposed to a 0.1 mg, 0.2 mg or 0.3 mg dose of hydergine, with parallel controls. The parameters analyzed were as above. Statistical analysis was performed with one-way ANOVA; Duncan's and Tukey's tests provided intra-group variance. BT, CT, PT, INR and APTT increased while calcium levels dropped significantly (P<0.05). Ex vivo, CT, PT and APTT were elevated while plasma calcium levels lowered significantly (P<0.05). Our study suggests that hydergine may act as a thrombolytic agent but warrants further studies to elucidate this role of ergot mesylates.

Keywords: Hydergine, Coagulation assays, plasma calcium, ergot mesylates, thrombosis.

1545 Improved Safety Science: Utilizing a Design Hierarchy

Authors: Ulrica Pettersson

Abstract:

Collection of information on incidents is regularly done through pre-printed incident report forms. These tend to be incomplete and frequently lack essential information. One consequence is that reports with inadequate information, which do not fulfil analysts' requirements, are transferred into the analysis process. To improve an incident reporting form, theory from design science, witness psychology, and interview and questionnaire research has been used. Previously, three experiments were conducted to evaluate the form and showed significantly improved results. The form has proved to capture knowledge regardless of the incidents' character or context. The aim of this paper is to describe how design science, more specifically a design hierarchy, can be used to construct a collection form for improvements in safety science.

Keywords: Design science, data collection, form, incident report, safety science.

1544 Globally Convergent Edge-preserving Reconstruction with Contour-line Smoothing

Authors: Marc C. Robini, Pierre-Jean Viverge, Yuemin Zhu, Jianhua Luo

Abstract:

The standard approach to image reconstruction is to stabilize the problem by including an edge-preserving roughness penalty in addition to faithfulness to the data. However, this methodology produces noisy object boundaries and creates a staircase effect. Existing attempts to favor the formation of smooth contour lines take the edge field explicitly into account; they are either computationally expensive or produce disappointing results. In this paper, we propose to incorporate the smoothness of the edge field in an implicit way by means of an additional penalty term defined in the wavelet domain. We also derive an efficient half-quadratic algorithm to solve the resulting optimization problem, including the case when the data fidelity term is non-quadratic and the cost function is nonconvex. Numerical experiments show that our technique preserves edge sharpness while smoothing contour lines; it produces visually pleasing reconstructions which are quantitatively better than those obtained without wavelet-domain constraints.

Keywords:

1543 Solid Concentration in Circulating Fluidized Bed Reactor for the MTO Process

Authors: Biao Wang, Tao Li, Qi-wen Sun, Wei-yong Ying, Ding-ye Fang

Abstract:

Methanol-to-olefins (MTO), coupled with the transformation of coal or natural gas to methanol, offers an interesting and promising way to produce ethylene and propylene. To investigate the solid concentration in a gas-solid fluidized bed for the methanol-to-olefins process catalyzed by SAPO-34, a cold model experimental system is established in this paper. The system comprises a gas distributor in an acrylic column of 300 mm internal diameter and 5000 mm height, a fiber optic probe system and a series of cyclones. The experiments are carried out at ambient conditions under superficial gas velocities ranging from 0.3930 m/s to 0.7860 m/s and initial bed heights ranging from 600 mm to 1200 mm. The effects of radial distance, axial distance, superficial gas velocity and initial bed height on the solid concentration in the bed are discussed. The effects of distributor shape and porosity on solid concentration are also discussed. Time-averaged solid concentration profiles under different conditions are obtained.

Keywords: Branched pipe distributor, distributor porosity, gas-solid fluidized bed, solid concentration.

1542 Modeling and Simulation for Physical Vapor Deposition: Multiscale Model

Authors: Jürgen Geiser, Robert Röhle

Abstract:

In this paper we present modeling and simulation of physical vapor deposition for metallic bipolar plates. We discuss the application of different models to simulate the transport and chemical reactions of the gas species in the gas chamber. The so-called sputter process is an extremely sensitive process for depositing thin layers on metallic plates. We have taken into account lower order models to obtain first results with respect to the gas fluxes and the kinetics in the chamber. The model equations can be treated analytically in some circumstances, and complicated multi-dimensional models are solved numerically with a software package (UG, unstructured grids, see [1]). Because of the multi-scale and multi-physical behavior of the models, we discuss adapted schemes to solve more accurately in the different domains and scales. The results are discussed in comparison with physical experiments to give a valid model for the assumed growth of thin layers.

Keywords: Convection-diffusion equations, multi-scale problem, physical vapor deposition, reaction equations, splitting methods.
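
The splitting idea, advancing convection, diffusion and reaction terms in separate sub-steps, can be shown on a 1D toy transport equation. The grid, coefficients and first-order Lie splitting below are assumptions for illustration, not the paper's multiscale model.

```python
# Minimal sketch of operator (Lie) splitting for a 1D convection-diffusion-
# reaction equation u_t + a u_x = D u_xx - k u, advanced with explicit sub-steps.
# Grid, coefficients and boundary handling are toy assumptions, not the paper's model.
import numpy as np

nx, L = 200, 1.0
dx = L / nx
a, D, k = 0.5, 1e-3, 0.2                     # convection, diffusion, reaction rates
dt = 0.4 * min(dx / a, dx**2 / (2 * D))      # respect explicit stability limits

x = np.linspace(0.0, L, nx)
u = np.exp(-200.0 * (x - 0.2) ** 2)          # initial species concentration

for _ in range(300):
    # 1) convection sub-step (first-order upwind, inflow boundary u=0)
    u[1:] = u[1:] - a * dt / dx * (u[1:] - u[:-1])
    u[0] = 0.0
    # 2) diffusion sub-step (explicit central differences)
    u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    # 3) reaction sub-step (exact update for linear decay)
    u *= np.exp(-k * dt)

print("remaining mass:", float(u.sum() * dx))
```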

1541 Hexavalent Chromium Removal from Aqueous Solutions by Adsorption onto Synthetic Nano Size Zero-Valent Iron (nZVI)

Authors: A.R. Rahmani, M.T. Samadi, R. Noroozi

Abstract:

The present work was conducted to synthesize nano size zerovalent iron (nZVI) and to remove hexavalent chromium (Cr(VI)), a highly toxic pollutant, using these nanoparticles. Batch experiments were performed to investigate the effects of Cr(VI) concentration, nZVI concentration, solution pH and contact time on the removal efficiency of Cr(VI). nZVI was synthesized by reduction of ferric chloride using sodium borohydride. SEM and XRD examinations were applied for the determination of particle size and the characterization of the produced nanoparticles. The results showed that the removal efficiency decreased with Cr(VI) concentration and solution pH and increased with adsorbent dosage and contact time. The Langmuir and Freundlich isotherm models were applied to the adsorption equilibrium data, and the Langmuir isotherm model fitted the data well. The nZVI nanoparticles presented an outstanding ability to remove Cr(VI) due to their high surface area, small particle size and high inherent activity.

Keywords: Adsorption, aqueous solution, Chromium, nZVI, removal.
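
Fitting the Langmuir isotherm q = q_max·b·Ce / (1 + b·Ce) to equilibrium data, as done in the study above, is a small curve-fitting exercise. The data points below are synthetic placeholders, not the measured Cr(VI) values.

```python
# Minimal sketch of fitting the Langmuir isotherm q = qmax*b*Ce / (1 + b*Ce)
# to equilibrium adsorption data. The data points are synthetic placeholders,
# not the measured Cr(VI) values from the study.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, b):
    return qmax * b * Ce / (1.0 + b * Ce)

Ce = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])        # mg/L at equilibrium
qe = np.array([8.2, 14.1, 21.5, 30.3, 35.0, 38.1])     # mg/g adsorbed (placeholder)

popt, pcov = curve_fit(langmuir, Ce, qe, p0=[40.0, 0.2])
qmax, b = popt
residuals = qe - langmuir(Ce, *popt)
r2 = 1.0 - residuals.var() / qe.var()
print(f"qmax={qmax:.1f} mg/g  b={b:.3f} L/mg  R^2={r2:.3f}")
```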

1540 The Resource Description Framework (RDF) as a Modern Structure for Medical Data

Authors: Gabriela Lindemann, Danilo Schmidt, Thomas Schrader, Dietmar Keune

Abstract:

The amount and heterogeneity of data in biomedical research, notably in interdisciplinary fields, requires new methods for the collection, presentation and analysis of information. Important data from laboratory experiments as well as patient trials are available but come from distributed resources. The Charité - University Hospital Berlin, together with the German Research Foundation (DFG), has established a new information service centre for kidney diseases and transplantation (Open European Nephrology Science Centre - OpEN.SC). Besides the collaborative aspect of creating new research groups, every partner or institution of this science information centre that makes its own data available is allowed to search the whole data pool of the various involved centres. A core task is the implementation of a non-restricting open data structure for the various different data sources. We decided to use a modern RDF model and, in a first phase, transformed original data coming from the web-based Electronic Patient Record database TBase©.

Keywords: Medical databases, Resource Description Framework (RDF), metadata repository.
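
The flavour of modelling heterogeneous clinical data as RDF triples can be shown with rdflib. The namespace, predicate names and values below are invented for illustration; they are not the OpEN.SC or TBase© schema.

```python
# Minimal rdflib sketch of representing clinical data as RDF triples.
# The namespace, predicates and values are invented placeholders, not the
# OpEN.SC / TBase schema described in the paper.
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

EX = Namespace("http://example.org/nephrology/")     # hypothetical namespace
g = Graph()
g.bind("ex", EX)

patient = URIRef(EX["patient/123"])
lab = URIRef(EX["measurement/456"])

g.add((patient, RDF.type, EX.Patient))
g.add((lab, RDF.type, EX.CreatinineMeasurement))
g.add((lab, EX.belongsTo, patient))
g.add((lab, EX.valueMgPerDl, Literal(1.4, datatype=XSD.decimal)))
g.add((lab, EX.measuredOn, Literal("2010-03-01", datatype=XSD.date)))

print(g.serialize(format="turtle"))
```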

1539 Effect of Fuel Lean Reburning Process on NOx Reduction and CO Emission

Authors: Changyeop Lee, Sewon Kim

Abstract:

Reburning is a useful technology for reducing nitric oxide through the injection of a secondary hydrocarbon fuel. In this paper, an experimental study has been conducted to evaluate the effect of fuel lean reburning on NOx/CO reduction in an LNG flame. Experiments were performed in flames stabilized by a co-flow swirl burner mounted at the bottom of the furnace. Tests were conducted using LNG gas as the reburn fuel as well as the main fuel. The effects of the reburn fuel fraction and the injection manner of the reburn fuel were studied when the fuel lean reburning system was applied. The paper reports data on flue gas emissions and temperature distribution in the furnace for a wide range of experimental conditions. At steady state, the temperature distribution and emission formation in the furnace have been measured and compared. This paper makes clear that, when the pulsated fuel lean reburning system is adopted, it is important to control factors such as frequency and duty ratio in order to decrease both NOx and CO concentrations in the exhaust. It also shows that fuel lean reburning is an effective method for reducing NOx as much as conventional reburning.

Keywords: Fuel lean reburn, NOx, CO, LNG flame.

1538 3D Sensing and Mapping for a Tracked Mobile Robot with a Movable Laser Range Finder

Authors: Toyomi Fujita

Abstract:

This paper presents a system for 3D sensing and mapping by a tracked mobile robot with an arm-type sensor movable unit and a laser range finder (LRF). The arm-type sensor movable unit is mounted on the robot, and the LRF is installed at the end of the unit. This mechanism enables the sensor to change position and orientation so that it avoids occlusions according to the terrain. The sensing system is also able to change the height of the LRF while keeping its orientation flat for efficient sensing. In this kind of mapping, it may be difficult for a moving robot to apply mapping algorithms such as the iterative closest point (ICP) because the sets of 2D data at each sensor height may be distant on a common surface. For this kind of mapping, the authors therefore applied interpolation to generate plausible model data for ICP. The results of several experiments demonstrate the validity of this kind of sensing and mapping with the proposed system.

Keywords: Laser Range Finder, Arm-Type Sensor Movable Unit, Tracked Mobile Robot, 3D Mapping.
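
The geometric core of such mapping, turning a 2D LRF scan taken at a known sensor height and orientation into 3D points, is a small coordinate transform. The scan values, angular range and sensor pose below are illustrative assumptions, not the robot's actual configuration.

```python
# Minimal sketch of the geometric core of such mapping: convert one 2D LRF scan
# (ranges over a fan of angles) taken at a known sensor pose into 3D points.
# The scan values, angular range and sensor pose are illustrative assumptions.
import numpy as np

def scan_to_points(ranges, angle_min, angle_max, sensor_height, pitch=0.0):
    """Return an (N, 3) array of 3D points for one scan of the LRF."""
    angles = np.linspace(angle_min, angle_max, len(ranges))
    x = ranges * np.cos(angles)
    y = ranges * np.sin(angles)
    z = np.zeros_like(x)
    pts = np.stack([x, y, z], axis=1)
    # tilt the scan plane by the unit's pitch, then lift to the sensor height
    c, s = np.cos(pitch), np.sin(pitch)
    R = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])   # rotation about the y-axis
    pts = pts @ R.T
    pts[:, 2] += sensor_height
    return pts

ranges = np.full(181, 2.0)                              # 2 m readings over a 180° fan
cloud = scan_to_points(ranges, -np.pi / 2, np.pi / 2, sensor_height=0.6)
print(cloud.shape, cloud[:2])
```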

1537 Training in Psychology in Brazil – Reflections on the Role of Early Supervised Internships in Undergraduate Courses

Authors: Ana Paula Melchiors Stahlschmidt, Cristina Py de Pinto Gomes Mairesse

Abstract:

This paper presents observations on early supervised internships in Psychology, currently called basic internships in Brazil, and their importance in professional training. The work is an experience report and focuses on professional training, illustrated by the reality of a Brazilian institution used as a case study. It was developed from the authors' experience as academic supervisors of this kind of practice throughout the undergraduate course, combined with aspects investigated in the post-doctoral research of one of them. Theoretical references on the subject and the related national legislation are analyzed, as well as reports of students who experienced at least one semester of this type of practice, articulated with the observations of the authors. The results demonstrate the importance of early supervised internships as a way of creating opportunities for students to have a first contact with professional reality and the practice of psychologists in different fields of insertion, preparing them for further experiences that require more involvement in training activities and practices in Psychology.

Keywords: Training of psychologists, Internships in Psychology, Supervised internships, Combination of theory and practice.

1536 An Additive Watermarking Technique in Gray Scale Images Using Discrete Wavelet Transformation and Its Analysis on Watermark Strength

Authors: Kamaldeep Joshi, Rajkumar Yadav, Ashok Kumar Yadav

Abstract:

Digital watermarking is a procedure to prevent unauthorized access to and modification of personal data. It assures that the communication between two parties remains secure and that their communication remains undetected. This paper investigates the consequences of the watermark strength for the grayscale image using a Discrete Wavelet Transformation (DWT) additive technique. In this method, the gray scale host image is divided into four sub-bands: LL (Low-Low), HL (High-Low), LH (Low-High) and HH (High-High), and the watermark is inserted into the LL sub-band using the DWT technique. As the image is divided into four sub-bands, a watermark of equal size to the LL sub-band has been inserted, and the results are discussed. LL represents the average component of the host image, which contains the maximum information of the image. Two kinds of experiments are performed. In the first, the same watermark is embedded in different images; in the second, the strength of the watermark is varied by a factor s (s = 10, 20, 30, 40, 50) and it is inserted in the same image.

Keywords: Watermarking, discrete wavelet transform, scaling factor, steganography.
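
The embedding step described above, adding a watermark scaled by a strength factor s to the LL sub-band of a one-level DWT, can be sketched with PyWavelets. The host image, watermark and value of s below are placeholders, not the paper's test data.

```python
# Minimal PyWavelets sketch of the additive LL-band embedding described above:
# decompose the host image, add s * watermark to the LL sub-band, reconstruct.
# The host image, watermark and strength factor are placeholder values.
import numpy as np
import pywt

rng = np.random.default_rng(6)
host = rng.integers(0, 256, size=(256, 256)).astype(float)   # grayscale host image
LL, (LH, HL, HH) = pywt.dwt2(host, "haar")                   # one-level DWT

watermark = rng.integers(0, 2, size=LL.shape).astype(float)  # binary mark, LL-sized
s = 10.0                                                     # watermark strength
LL_marked = LL + s * watermark

watermarked = pywt.idwt2((LL_marked, (LH, HL, HH)), "haar")
psnr = 10 * np.log10(255.0 ** 2 / np.mean((watermarked - host) ** 2))
print(f"PSNR of watermarked image: {psnr:.1f} dB")
```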

1535 A Programming Assessment Software Artefact Enhanced with the Help of Learners

Authors: Romeo A. Botes, Imelda Smit

Abstract:

The demands of an ever-changing and complex higher education environment, along with the profile of modern learners, challenge current approaches to assessment and feedback. More learners enter the education system every year. The younger generation expects immediate feedback. At the same time, feedback should be meaningful. The assessment of practical activities in programming poses a particular problem, since both lecturers and learners in the information and computer science discipline acknowledge that paper-based assessment for programming subjects lacks meaningful real-life testing. At the same time, feedback lacks promptness, consistency, comprehensiveness and individualisation. Most of these aspects may be addressed by modern, technology-assisted assessment. The focus of this paper is the continuous development of an artefact that is used to assist the lecturer in the assessment of, and feedback on, practical programming activities in a senior database programming class. The artefact was developed using three Design Science Research cycles. The first implementation allowed one programming activity submission per assessment intervention. This pilot provided valuable insight into the obstacles regarding the implementation of this type of assessment tool. A second implementation improved the initial version to allow multiple programming activity submissions per assessment. The focus of this version is on providing scaffolded feedback to the learner, allowing improvement with each subsequent submission. It also has a built-in capability to provide the lecturer with information regarding the key problem areas of each assessment intervention.

Keywords: Programming, computer-aided assessment, technology-assisted assessment, programming assessment software, design science research, mixed-method.

1534 In situ Biodegradation of Endosulfan, Imidacloprid, and Carbendazim Using Indigenous Bacterial Cultures of Agriculture Fields of Uttarakhand, India

Authors: Geeta Negi, Pankaj, Anjana Srivastava, Anita Sharma

Abstract:

In the present study, the presence of endosulfan, imidacloprid and carbendazim in soil, vegetable/cereal and water samples was observed in agricultural fields of Uttarakhand. With a view to biodegradation of these pesticides, nine bacterial isolates that tolerated endosulfan, imidacloprid and carbendazim at 100 to 200 µg/ml were recovered from the soil samples of the fields. The three bacterial consortia used for in vitro bioremediation experiments consisted of three bacterial isolates each, for carbendazim, imidacloprid and endosulfan, respectively. Maximum degradation (87% and 83%) of α- and β-endosulfan, respectively, was observed in soil slurry by the consortium. Degradation of imidacloprid and carbendazim under similar conditions was 88.4% and 77.5%, respectively. FT-IR analysis of biodegraded samples of the pesticides in liquid media showed stretching of various bonds. GC-MS of the biodegraded endosulfan sample in soil slurry showed the presence of nontoxic intermediates. A pot trial with bacterial treatments lowered the uptake of pesticides in onion plants.

Keywords: Biodegradation, carbendazim, consortium, Endosulfan.

1533 Development of a Non-invasive System to Measure the Thickness of the Subcutaneous Adipose Tissue Layer for Human

Authors: Hyuck Ki Hong, Young Chang Jo, Yeon Shik Choi, Beom Joon Kim, Hyo Derk Park

Abstract:

To measure the thickness of the subcutaneous adipose tissue layer, a non-invasive optical measurement system (λ=1300 nm) is introduced. Animal and human subjects are used for the experiments. The results for human subjects are compared with the data of ultrasound device measurements, and a high correlation (r=0.94 for n=11) is observed. There are two modes in the corresponding signals measured by the optical system, which can be explained by two-layered and three-layered tissue models. If the target tissue is thinner than the critical thickness, the data detected using the diffuse reflectance method follow the three-layered tissue model, so the data increase as the thickness increases. On the other hand, if the target tissue is thicker than the critical thickness, the data follow the two-layered tissue model, so they decrease as the thickness increases.

Keywords: Subcutaneous adipose tissue layer, non-invasive measurement system, two-layered and three-layered tissue models.

1532 Error Analysis of Nonconventional Electrical Moisture-meter under Simplified Conditions

Authors: Kamil Ďurana, Robert Černý

Abstract:

An electrical apparatus for measuring moisture content was developed by our laboratory; it uses the dependence of electrical properties on the water content of the studied material. Error analysis of the apparatus was performed by measuring different volumes of water in a simplified specimen, i.e. a hollow plexiglass block, in order to avoid as many side effects as possible. The obtained data were processed using both basic and advanced statistics, and the results were compared with each other. The influence of water content on the accuracy of the measured data was studied, as well as the influence of variations in the apparatus' arrangement and the actual methodology of its use. The overall coefficient of variation was 4%. No trend was found in the dependence of the error on water content. Comparison with current surveys led to the conclusion that the studied apparatus can be used for indirect measurement of water content in porous materials, with an expectable error and under known conditions. Actual experiments with porous materials are not included here but are currently under investigation.

Keywords: device, capacitance method, error analysis, moisture meter
