Search results for: Signal Processing
808 3D Numerical Modelling of a Pulsed Pumping Process of a Large Dense Non-Aqueous Phase Liquid Pool: In situ Pilot-Scale Case Study of Hexachlorobutadiene in a Keyed Enclosure
Authors: Q. Giraud, J. Gonçalvès, B. Paris
Abstract:
Remediation of dense non-aqueous phase liquids (DNAPLs) represents a challenging issue because of their persistent behaviour in the environment. This pilot-scale study investigates, by means of in situ experiments and numerical modelling, the feasibility of a pulsed pumping process for a large amount of a DNAPL in an alluvial aquifer. The main compound of the DNAPL is hexachlorobutadiene, an emerging organic pollutant. A low-permeability keyed enclosure was built at the location of the DNAPL source zone in order to isolate a finite undisturbed volume of soil, and a 3-month pulsed pumping process was applied inside the enclosure to extract the DNAPL exclusively. The water/DNAPL interface elevation at both the pumping and observation wells and the cumulative pumped volume of DNAPL were also recorded. A total volume of about 20 m³ of pure DNAPL was recovered, since no water was extracted during the process. The three-dimensional multiphase flow simulator TMVOC was used, and a conceptual model was elaborated and generated with the pre/post-processing tool mView. The numerical model consisted of 10 layers of variable thickness and 5060 grid cells. Numerical simulations reproduce the pulsed pumping process and show an excellent match between simulated and field data for the cumulative pumped volume of DNAPL, and a reasonable agreement between modelled and observed data for the evolution of the water/DNAPL interface elevations at the two wells. This study offers a new perspective in remediation, since DNAPL pumping system optimisation may be performed where a large amount of DNAPL is encountered.
Keywords: dense non-aqueous phase liquid (DNAPL), hexachlorobutadiene, in situ pulsed pumping, multiphase flow, numerical modelling, porous media
Procedia PDF Downloads 174
807 The Need for Embodiment Perspectives and Somatic Methods in Social Work Curriculum: Lessons Learned from a Decade of Developing a Program to Support College Students Who Exited the State Foster Care System
Authors: Yvonne A. Unrau
Abstract:
Social work education is a competency-based curriculum that relies mostly on cognitive frameworks and problem-solving models. Absent from the curriculum are knowledge and skills that draw from an embodiment perspective, especially somatic practice methods. Embodiment broadly encompasses the understanding that biological, political, historical, and social factors impact human development via changes to the nervous system. In the past 20 years, research has well established that unresolved traumatic events, especially during childhood, negatively impact long-term health and well-being. Furthermore, traumatic stress compromises cognitive processing and activates reflexive actions such as ‘fight’ or ‘flight,’ which are the focus of somatic methods. The main objective of this paper is to show how embodiment perspectives and somatic methods can enhance social work practice overall. Using an exploratory approach, the author shares a decade-long journey that involved creating an education-support program for college students who exited the state foster care system. Personal experience, program outcomes and case study narratives revealed that ‘classical’ social work methods were insufficient to fully address the complex needs of college students who were living with complex traumatic stressors. The paper chronicles select case study scenarios and key program development milestones over a 10-year period to show the benefit of incorporating embodiment perspectives in social work practice. The lessons reveal an immediate need for social work curricula to include embodiment perspectives so that social workers may be equipped to respond competently to their many clients who live with unresolved trauma.
Keywords: social work practice, social work curriculum, embodiment, traumatic stress
Procedia PDF Downloads 124
806 Optimization of Mechanical Cacao Shelling Parameters Using Unroasted Cocoa Beans
Authors: Jeffrey A. Lavarias, Jessie C. Elauria, Arnold R. Elepano, Engelbert K. Peralta, Delfin C. Suministrado
Abstract:
Shelling is one of the primary processes and critical steps in the processing of chocolate or any product derived from cocoa beans. It affects the quality of the cocoa nibs in terms of flavor and purity. In the Philippines, small-scale food processors cannot really compete with large-scale confectionery manufacturers because of the lack of available postharvest facilities appropriate to their level of operation. The impact of this study is to provide the needed intervention that will pave the way for cacao farmers to engage in value-adding as a way to maximize the economic potential of cacao. Thus, provision and availability of needed postharvest machines like a mechanical cacao sheller will revolutionize the current state of the cacao industry in the Philippines. A mechanical cacao sheller was developed, fabricated, and evaluated to establish the optimum shelling conditions, namely the moisture content of the cocoa beans, the clearance through which the cocoa beans pass in the breaker section, and the speed of the breaking mechanism, with respect to shelling recovery, shelling efficiency, shelling rate, energy utilization and large nib recovery, and to establish the optimum levels of these shelling parameters. These factors were statistically analyzed using a Box-Behnken experimental design and Response Surface Methodology (RSM). By maximizing shelling recovery, shelling efficiency, shelling rate and large nib recovery and minimizing energy utilization, the optimum shelling conditions were established at a moisture content, clearance and breaker speed of 6.5%, 3 millimeters and 1300 rpm, respectively. The corresponding optimum values for shelling recovery, shelling efficiency, shelling rate, large nib recovery and energy utilization were 86.51%, 99.19%, 21.85 kg/h, 89.75%, and 542.84 W, respectively.
Experimental values obtained using the optimum conditions were compared with values predicted by the models and were found to be in good agreement.
Keywords: cocoa beans, optimization, RSM, shelling parameters
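The Box-Behnken design mentioned in this abstract, for three factors, can be generated programmatically. The sketch below is a minimal illustration: the coded levels (−1, 0, +1), the number of center points, and the factor spans (centred on the reported optima of 6.5% moisture, 3 mm clearance, 1300 rpm) are assumptions, not the study's actual design table.

```python
from itertools import combinations

def box_behnken(n_factors, n_center=3):
    """Generate a coded Box-Behnken design: for every pair of factors,
    all four (+/-1, +/-1) combinations with the remaining factors held
    at 0, plus n_center replicated center points."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * n_factors
                run[i], run[j] = a, b
                runs.append(run)
    runs.extend([[0] * n_factors for _ in range(n_center)])
    return runs

# Assumed mapping of coded levels to physical factor ranges.
levels = {
    "moisture_pct": {-1: 5.5, 0: 6.5, 1: 7.5},
    "clearance_mm": {-1: 2.0, 0: 3.0, 1: 4.0},
    "speed_rpm":    {-1: 1100, 0: 1300, 1: 1500},
}
design = box_behnken(3)
named = [{k: levels[k][c] for k, c in zip(levels, run)} for run in design]
print(len(design))  # 12 edge runs + 3 center points = 15
```

A second-order response surface fitted to the responses at these 15 runs is what RSM then optimizes to locate the reported optimum.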
Procedia PDF Downloads 358
805 A Web Service Based Sensor Data Management System
Authors: Rose A. Yemson, Ping Jiang, Oyedeji L. Inumoh
Abstract:
The deployment of wireless sensor networks has increased rapidly; however, the growing capacity and diversity of sensors, with applications ranging from the biological and environmental to the military, generate tremendous volumes of data, and more attention has been placed on distributed sensing than on how to manage, analyze, retrieve and understand the data generated. This makes it quite difficult to process live sensor data and to run concurrent control and updates, because sensor data are heavyweight, complex, and slow to handle. This work focuses on developing a web service platform for automatic detection of sensors, acquisition of sensor data, storage of sensor data in a database, and processing of sensor data using reconfigurable software components. This work also creates a web service based sensor data management system to monitor the physical movement of an individual wearing a wireless network sensor device (SunSPOT). The sensor detects the movement of that individual by sensing the acceleration along the X, Y and Z axes and then sends the readings to a database interfaced with an internet platform. The collected data determine the posture of the person, such as standing, sitting or lying down. The system is designed using the Unified Modeling Language (UML) and implemented using Java, JavaScript, HTML and MySQL. This system allows real-time, close monitoring of an individual and the collection of physical activity details without being physically present for in-situ measurement, enabling remote monitoring instead of time-consuming in-person checks. These details can help in evaluating an individual’s physical activity and generating feedback on medication. It can also help in keeping track of any mandatory physical activities required of the individual.
These evaluations and feedback can help in maintaining a better health status for the individual and providing improved health care.
Keywords: HTML, java, javascript, MySQL, sunspot, UML, web-based, wireless network sensor
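The posture inference described in this abstract (standing, sitting, lying from triaxial acceleration) can be sketched as a simple gravity-direction heuristic. Everything below is an illustrative assumption, not the authors' actual classifier: the axis convention (Y pointing head-to-toe on an upright wearer), the tilt threshold, and the fact that a single static reading cannot separate standing from sitting, so both are reported as "upright".

```python
import math

def classify_posture(ax, ay, az, tilt_threshold_deg=60.0):
    """Toy posture classifier for a torso-worn triaxial accelerometer.
    Assumes the sensor's Y axis is vertical when the wearer stands.
    Lying is flagged when the trunk tilts past the threshold; standing
    vs sitting is not separable from one static sample, so we return
    'upright' for both."""
    mag = math.sqrt(ax**2 + ay**2 + az**2)
    if mag < 1e-6:
        return "unknown"  # no gravity vector to orient against
    # Angle between the sensor's vertical axis and measured gravity.
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, ay / mag))))
    return "lying" if tilt > tilt_threshold_deg else "upright"

print(classify_posture(0.0, 9.81, 0.0))  # upright: gravity along Y
print(classify_posture(0.0, 0.0, 9.81))  # lying: gravity along Z
```

In a real pipeline these readings would be the SunSPOT samples stored in the MySQL database, smoothed over a window before classification.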
Procedia PDF Downloads 212
804 Detecting Memory-Related Gene Modules in sc/snRNA-seq Data by Deep-Learning
Authors: Yong Chen
Abstract:
Understanding the detailed molecular mechanisms of memory formation in engram cells is one of the most fundamental questions in neuroscience. Recent single-cell RNA-seq (scRNA-seq) and single-nucleus RNA-seq (snRNA-seq) techniques have allowed us to explore the sparsely activated engram ensembles, enabling access to the molecular mechanisms that underlie experience-dependent memory formation and consolidation. However, the absence of specific and powerful computational methods to detect memory-related genes (modules) and their regulatory relationships in sc/snRNA-seq datasets has strictly limited the analysis of underlying mechanisms and memory coding principles in mammalian brains. Here, we present a deep-learning method named SCENTBOX to detect memory-related gene modules and causal regulatory relationships among them from sc/snRNA-seq datasets. SCENTBOX first constructs a co-differential expression gene network (CEGN) from case versus control sc/snRNA-seq datasets. It then detects the highly correlated modules of differentially expressed genes (DEGs) in the CEGN. Deep network embedding and attention-based convolutional neural network strategies are employed to precisely detect regulatory relationships among the DEGs in a module. We applied SCENTBOX to scRNA-seq datasets of TRAP;Ai14 mouse neurons with fear memory and detected not only known memory-related genes, but also their modules and potential causal regulations. Our results provided novel regulations within an interesting module including Arc, Bdnf, Creb, Dusp1, Rgs4, and Btg2. Overall, our method provides a general computational tool for processing sc/snRNA-seq data from case versus control studies and for the systematic investigation of fear-memory-related gene modules.
Keywords: sc/snRNA-seq, memory formation, deep learning, gene module, causal inference
Procedia PDF Downloads 120
803 Quality Characteristics of Treated Wastewater of 'Industrial Area Foggia'
Authors: Grazia Disciglio, Annalisa Tarantino, Emanuele Tarantino
Abstract:
The production system of Foggia province (Apulia, Southern Italy) is characterized by the presence of numerous agro-food industries whose activities include the processing of vegetable products and release large quantities of wastewater. The reuse of these wastewaters in agriculture offers the opportunity to reduce the costs of their disposal and to minimize their environmental impact. In addition, in this area, which suffers from water shortage, the use of agro-industrial wastewater is essential for the very intensive irrigated cropping systems. The present investigation was carried out in 2009 and 2010 to monitor the physico-chemical and microbiological characteristics of the industrial wastewater (IWW) from the secondary treatment plant of the 'Industrial Area of Foggia'. The treatment plant released on average about 567,000 m³ y⁻¹ of IWW, whose distribution was not uniform over the year: the monthly values were about 250,000 m³ from November to June and about 90,000 m³ from July to October. The obtained results revealed that the IWW was characterized by low values of Total Suspended Solids (TSS), Biological Oxygen Demand (BOD), Chemical Oxygen Demand (COD), Electrical Conductivity (EC) and Sodium Adsorption Ratio (SAR). An occasional presence of heavy metals and high concentrations of total phosphorus, total nitrogen, ammoniacal nitrogen and microbial organisms (Escherichia coli and Salmonella) were observed. Due to the presence of these pathogenic microorganisms, and sometimes of heavy metals, which may raise sanitary and environmental problems for the possible irrigation reuse of this IWW, a tertiary treatment of the wastewater based on filtration and in-line disinfection is recommended. Research on the reuse of treated IWW on crops (olive, artichoke, industrial tomato, fennel, lettuce, etc.)
did not show significant differences among the irrigated plots for most of the soil and yield characteristics.
Keywords: agroindustrial wastewater, irrigation, microbiological characteristics, physico-chemical characteristics
Procedia PDF Downloads 316
802 Oxalate Method for Assessing the Electrochemical Surface Area for Ni-Based Nanoelectrodes Used in Formaldehyde Sensing Applications
Authors: S. Trafela, X. Xua, K. Zuzek Rozmana
Abstract:
In this study, we used an accurate and precise method to measure the electrochemically active surface areas (Aecsa) of nickel electrodes. The calculated Aecsa is important for the evaluation of an electro-catalyst’s activity in electrochemical reactions of different organic compounds. The method involves the electrochemical formation of Ni(OH)₂ and NiOOH in the presence of adsorbed oxalate in alkaline media. The studies were carried out using cyclic voltammetry with polycrystalline nickel as a reference material and with electrodeposited nickel nanowires and homogeneous and heterogeneous nickel films. From the cyclic voltammograms, the charge (Q) values for the formation of the Ni(OH)₂ and NiOOH surface oxides were calculated under various conditions. At sufficiently fast potential scan rates (200 mV s⁻¹), the adsorbed oxalate limits the growth of the surface hydroxides to a monolayer. Although the Ni(OH)₂/NiOOH oxidation peak overlaps with the oxygen evolution reaction, in the reverse scan the NiOOH/Ni(OH)₂ reduction peak is well separated from other electrochemical processes and can be easily integrated. The values of these integrals were used to correlate the experimentally measured charge density with the electrochemically active surface layer. The Aecsa values of the nickel nanowires and the homogeneous and heterogeneous nickel films were calculated to be Aecsa-NiNWs = 4.2066 ± 0.0472 cm², Aecsa-homNi = 1.7175 ± 0.0503 cm² and Aecsa-hetNi = 2.1862 ± 0.0154 cm². These results were then used in electrochemical studies of formaldehyde oxidation, in which the nickel nanowires and the heterogeneous and homogeneous nickel films served as simple and efficient sensors for formaldehyde detection. For this purpose, the electrodeposited nickel electrodes were modified in a 0.1 mol L⁻¹ solution of KOH in order to promote electrochemical activity towards formaldehyde.
The electrochemical behavior of formaldehyde oxidation in 0.1 mol L⁻¹ NaOH solution at the surface of the modified nickel nanowires and the homogeneous and heterogeneous nickel films was investigated by means of electrochemical techniques such as cyclic voltammetry and chronoamperometry. From investigations of the effect of different formaldehyde concentrations (from 0.001 to 0.1 mol L⁻¹) on the electrochemical signal (current), we derived the catalytic mechanism of formaldehyde oxidation and the detection limit and sensitivity of the nickel electrodes. The results indicated that the nickel electrodes participate directly in the electrocatalytic oxidation of formaldehyde. In the overall reaction, formaldehyde in alkaline aqueous solution exists predominantly in the form of CH₂(OH)O⁻, which is oxidized to CH₂(O)O⁻. Taking into account the determined Aecsa values, we were able to calculate the sensitivities: 7 mA mol L⁻¹ cm⁻² for the nickel nanowires, 3.5 mA mol L⁻¹ cm⁻² for the heterogeneous nickel film and 2 mA mol L⁻¹ cm⁻² for the homogeneous nickel film. The detection limit was 0.2 mM for the nickel nanowires, 0.5 mM for the porous Ni film and 0.8 mM for the homogeneous Ni film. All of these results make nickel electrodes suitable for further applications.
Keywords: electrochemically active surface areas, nickel electrodes, formaldehyde, electrocatalytic oxidation
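The area normalization underlying these numbers is a simple division: Aecsa = Q / q_ref, where Q is the integrated NiOOH → Ni(OH)₂ reduction charge and q_ref is the charge density of a one-electron monolayer conversion, and the area-specific sensitivity is the calibration slope divided by Aecsa. In the sketch below, both the value of q_ref and the measured charge are placeholder assumptions for illustration only; the study's actual reference comes from its calibration against polycrystalline nickel.

```python
def aecsa_from_charge(q_uC, q_ref_uC_per_cm2):
    """Aecsa = Q / q_ref: integrated reduction charge (microcoulombs)
    divided by the monolayer reference charge density (uC/cm^2)."""
    return q_uC / q_ref_uC_per_cm2

def sensitivity_per_area(slope_mA_per_molL, aecsa_cm2):
    """Normalize a calibration slope by Aecsa to obtain the
    area-specific sensitivity quoted in the abstract."""
    return slope_mA_per_molL / aecsa_cm2

# Illustrative numbers only: q_ref is an assumed placeholder, and
# q_measured is chosen so the result matches the nanowire Aecsa.
Q_REF = 500.0          # uC/cm^2, hypothetical monolayer charge density
q_measured = 2103.3    # uC, hypothetical integrated reduction peak
area = aecsa_from_charge(q_measured, Q_REF)
print(round(area, 4))  # ~4.2066 cm^2, cf. the reported nanowire value
```

The point of the normalization is that sensitivities of electrodes with very different geometries (nanowires vs films) become directly comparable once divided by their respective Aecsa.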
Procedia PDF Downloads 161
801 Safeguarding Product Quality through Pre-Qualification of Material Manufacturers: A Ship and Offshore Classification Society's Perspective
Authors: Sastry Y. Kandukuri, Isak Andersen
Abstract:
Despite recent advances in the manufacturing sector, quality issues remain a frequent occurrence and can result in fatal accidents, equipment downtime, and loss of life. Adequate quality is of high importance in high-risk industries such as sea-going vessels and offshore installations, in which third-party quality assurance and product control play an essential role in ensuring the manufacturing quality of critical components. Classification societies play a vital role in mitigating risk in these industries by making sure that all the stakeholders, i.e. manufacturers, builders, and end users, are provided with adequate rules and standards that effectively ensure components are produced at a level of quality appropriate to the area of application and the risk of failure. Quality issues have also been linked to a lack of competence or negligence of stakeholders in the supply value chain. However, continued actions and regulatory reforms through the modernization of rules and requirements have provided additional tools for purchasers and manufacturers to confront these issues. Included among these tools are updated ‘approval of manufacturer’ class programs aimed at developing and implementing a set of standardized manufacturing quality metrics for use by the manufacturer and verified by the classification society. The establishment and collection of the manufacturing and testing requirements described in these programs could provide various stakeholders, from industry to vessel owners, with greater insight into the state of quality at a given manufacturing facility, and allow stakeholders to better anticipate and address quality issues while simultaneously reducing unnecessary failures that are costly to the industry.
This publication introduces, explains and discusses the critical manufacturing and testing requirements set out in a leading class society’s approval-of-manufacturer regime, together with their rationale and some case studies.
Keywords: classification society, manufacturing, materials processing, materials testing, quality control
Procedia PDF Downloads 355
800 Anaerobic Co-Digestion of Sewage Sludge and Bagasse for Biogas Recovery
Authors: Raouf Ahmed Mohamed Hassan
Abstract:
In Egypt, the excess sewage sludge from Wastewater Treatment Plants (WWTPs) is rapidly increasing due to the continuous increase in population, urban planning and industrial development. Also, cane bagasse constitutes an important component of Urban Solid Waste (USW), especially in southern Egypt, and is difficult to degrade under normal composting conditions. These wastes need to be environmentally managed to reduce the negative impacts of their application or disposal. In terms of biogas recovery, the anaerobic digestion of sewage sludge or bagasse separately is inefficient, due to the imbalance of nutrients and minerals in each substrate. The carbon-to-nitrogen ratio (C/N) also plays an important role: sewage sludge has a ratio varying from 6 to 16, whereas cane bagasse has a ratio of around 150, while the suggested optimum C/N ratio for anaerobic digestion is in the range of 20 to 30. Anaerobic co-digestion is presented as a successful methodology that combines several biodegradable organic substrates, decreasing the amount of output waste through biodegradation, sharing processing facilities and reducing operating costs, while enabling the recovery of biogas. This paper presents a study of the co-digestion of sewage sludge from wastewater treatment plants, as a type of organic waste, with bagasse as an agricultural waste. Laboratory-scale mesophilic and thermophilic digesters were operated with varied hydraulic retention times. Different percentages of sludge and bagasse were investigated based on the total solids (TS). Before digestion, the bagasse was subjected to a grinding pretreatment and soaked in distilled water (water pretreatment). The effect of the operating parameters (mixing, temperature) was investigated in order to optimize biogas production. The yield and the composition of the biogas from the different experiments were evaluated and the cumulative curves were estimated.
The conducted tests showed that there is good potential for using the co-digestion of wastewater sludge and bagasse for biogas production.
Keywords: co-digestion, sewage sludge, bagasse, mixing, mesophilic, thermophilic
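The C/N balancing argument in this abstract can be made concrete: the C/N of a sludge-bagasse blend is the ratio of total carbon to total nitrogen, not a weighted average of the two C/N values. The carbon and nitrogen contents below are illustrative assumptions chosen to be consistent with the ratios quoted in the abstract (sludge C/N ≈ 10, bagasse C/N ≈ 150); they are not measured values from the study.

```python
def blend_cn(frac_bagasse, c_sludge, n_sludge, c_bagasse, n_bagasse):
    """C/N of a two-substrate mixture, by total-solids fraction of
    bagasse. Contents are in g per g of total solids."""
    f = frac_bagasse
    carbon = f * c_bagasse + (1 - f) * c_sludge
    nitrogen = f * n_bagasse + (1 - f) * n_sludge
    return carbon / nitrogen

# Assumed illustrative contents giving C/N = 10 (sludge), 150 (bagasse):
C_S, N_S = 0.40, 0.040  # sludge: 40 % C, 4 % N    -> C/N = 10
C_B, N_B = 0.45, 0.003  # bagasse: 45 % C, 0.3 % N -> C/N = 150

# Scan bagasse fractions for the 20-30 optimum quoted in the abstract.
for f in (0.0, 0.2, 0.4, 0.6, 0.8, 1.0):
    print(f, round(blend_cn(f, C_S, N_S, C_B, N_B), 1))
```

Because the blend ratio is a ratio of sums, a modest bagasse fraction already lifts the mixture into the 20-30 window, which is the quantitative rationale for co-digesting the two wastes.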
Procedia PDF Downloads 512
799 A Fast Optimizer for Large-scale Fulfillment Planning based on Genetic Algorithm
Authors: Choonoh Lee, Seyeon Park, Dongyun Kang, Jaehyeong Choi, Soojee Kim, Younggeun Kim
Abstract:
Market Kurly is the first South Korean online grocery retailer that guarantees same-day and overnight shipping. More than 1.6 million customers place an average of 4.7 million orders and add 3 to 14 products to a cart per month. The company has sold almost 30,000 kinds of products in the past 6 months, including food items, cosmetics, kitchenware, toys for kids and pets, and even flowers. The company is operating and expanding multiple dry, cold, and frozen fulfillment centers in order to store and ship these products. Due to the scale and complexity of the fulfillment, pick-pack-ship processes are planned and operated in batches, and thus the planning that decides the batching of the customers’ orders is a critical factor in overall productivity. This paper introduces a metaheuristic optimization method that reduces the complexity of batch processing in a fulfillment center. The method is an iterative genetic algorithm with heuristic creation and evolution strategies; it aims to group similar orders into pick-pack-ship batches to minimize the total number of distinct products. With a well-designed approach to creating the initial genes, the method produces streamlined plans, up to 13.5% less complex than the actual plans carried out in the company’s fulfillment centers in the previous months. Furthermore, our digital-twin simulations show that the optimized plans can reduce the operation time for packing, the most complex and time-consuming task in the process, by 3%. The optimization method implements a multithreading design on the Spring framework to support the company’s warehouse management systems in near real-time, finding a solution for 4,000 orders within 5 to 7 seconds on an AWS c5.2xlarge instance.
Keywords: fulfillment planning, genetic algorithm, online grocery retail, optimization
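The batching objective described in this abstract (group similar orders so the batches together touch as few distinct products as possible) can be sketched as a small genetic algorithm. This is an illustrative toy, not Kurly's production optimizer: the order data, batch count, population size, and mutation rate below are all assumptions.

```python
import random

def fitness(assignment, orders, n_batches):
    """Total number of distinct products across batches (lower is better)."""
    batches = [set() for _ in range(n_batches)]
    for order_idx, b in enumerate(assignment):
        batches[b] |= orders[order_idx]
    return sum(len(batch) for batch in batches)

def ga_batch(orders, n_batches, pop_size=40, generations=200, seed=0):
    """Elitist GA: a genome assigns each order to a batch; one-point
    crossover plus a move-one-order mutation evolve the population."""
    rng = random.Random(seed)
    n = len(orders)
    pop = [[rng.randrange(n_batches) for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda a: fitness(a, orders, n_batches))
        survivors = pop[: pop_size // 2]          # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = p1[:cut] + p2[cut:]           # one-point crossover
            if rng.random() < 0.3:                # mutate: move one order
                child[rng.randrange(n)] = rng.randrange(n_batches)
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda a: fitness(a, orders, n_batches))
    return best, fitness(best, orders, n_batches)

# Toy instance: 8 orders drawn from two disjoint product families.
orders = [{1, 2}, {2, 3}, {1, 3}, {2}, {10, 11}, {11, 12}, {10, 12}, {11}]
best, score = ga_batch(orders, n_batches=2)
print(score)  # ideally 6: each product family confined to one batch
```

The production system adds the pieces this sketch omits: heuristic initial-gene construction, capacity constraints per batch, and multithreaded evaluation to hit the 5-7 second budget.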
Procedia PDF Downloads 83
798 Electroencephalogram during Natural Reading: Theta and Alpha Rhythms as Analytical Tools for Assessing a Reader’s Cognitive State
Authors: D. Zhigulskaya, V. Anisimov, A. Pikunov, K. Babanova, S. Zuev, A. Latyshkova, K. Chernozatonskiy, A. Revazov
Abstract:
The electrophysiology of information processing in reading is certainly a popular research topic. Natural reading, however, has been relatively poorly studied, despite having broad potential applications for learning and education. In the current study, we explore the relationship between text categories and the spontaneous electroencephalogram (EEG) recorded while reading. Thirty healthy volunteers (mean age 26.68 ± 1.84) participated in this study. 15 Russian-language texts were used as stimuli. The first text was used for practice and was excluded from the final analysis. The remaining 14 formed opposite pairs of texts in one of 7 categories, the most important of which were: interesting/boring, fiction/non-fiction, free reading/reading with an instruction, and reading a regular text/reading a pseudo-text (consisting of strings of letters that formed meaningless words). Participants read the texts sequentially on an Apple iPad Pro. EEG was recorded from 12 electrodes simultaneously with eye movement data via Apple’s ARKit technology. EEG spectral amplitude was analyzed at Fz for the theta band (4-8 Hz) and at C3, C4, P3, and P4 for the alpha band (8-14 Hz) using the Friedman test. We found that reading an interesting text was accompanied by an increase in theta spectral amplitude at Fz compared to reading a boring text (3.87 ± 0.12 µV and 3.67 ± 0.11 µV, respectively). When instructions were given for reading, we observed less alpha activity than during free reading of the same text (3.34 ± 0.20 µV and 3.73 ± 0.28 µV, respectively, for C4 as the most representative channel). The non-fiction text elicited less activity in the alpha band (C4: 3.60 ± 0.25 µV) than the fiction text (C4: 3.66 ± 0.26 µV). A significant difference in alpha spectral amplitude was also observed between the regular text (C4: 3.64 ± 0.29 µV) and the pseudo-text (C4: 3.38 ± 0.22 µV). These results suggest that some of the brain activity we see on EEG is sensitive to particular features of the text.
We propose that changes in the theta and alpha bands during reading may serve as electrophysiological tools for assessing the reader’s cognitive state as well as his or her attitude to the text and the perceived information. These physiological markers have prospective practical value for developing technological solutions and biofeedback systems for reading in particular and for education in general.
Keywords: EEG, natural reading, reader's cognitive state, theta-rhythm, alpha-rhythm
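The band-limited spectral amplitudes this abstract compares can be illustrated with a plain discrete Fourier transform. The sketch below is a minimal, standard-library illustration on synthetic data; the sampling rate, window length, and single-channel setup are assumptions, not the study's actual processing pipeline.

```python
import math, cmath

def band_amplitude(signal, fs, f_lo, f_hi):
    """Mean single-sided DFT amplitude over the bins in [f_lo, f_hi] Hz."""
    n = len(signal)
    amps = []
    for k in range(1, n // 2):
        fk = k * fs / n                    # frequency of bin k
        if f_lo <= fk <= f_hi:
            xk = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                     for t in range(n))
            amps.append(2 * abs(xk) / n)   # single-sided amplitude
    return sum(amps) / len(amps)

# Synthetic 1-second epoch at 128 Hz: a 6 Hz (theta-band) oscillation.
fs, n = 128, 128
eeg = [math.sin(2 * math.pi * 6 * t / fs) for t in range(n)]
theta = band_amplitude(eeg, fs, 4, 8)
alpha = band_amplitude(eeg, fs, 8, 14)
print(theta > alpha)  # True: the signal's energy sits in the theta band
```

In practice such per-epoch band amplitudes, computed per channel (Fz for theta; C3, C4, P3, P4 for alpha), are what a nonparametric test like the Friedman test would then compare across text conditions.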
Procedia PDF Downloads 80
797 Intrusion Detection in SCADA Systems
Authors: Leandros A. Maglaras, Jianmin Jiang
Abstract:
The protection of national infrastructures from cyberattacks is one of the main issues for national and international security. The EU-funded Framework 7 (FP7) research project CockpitCI introduces intelligent intrusion detection, analysis and protection techniques for Critical Infrastructures (CI). The paradox is that CIs massively rely on the newest interconnected, and vulnerable, Information and Communication Technology (ICT), whilst the control equipment, legacy software/hardware, is typically old. Such a combination of factors may lead to very dangerous situations, exposing systems to a wide variety of attacks. To overcome such threats, the CockpitCI project combines machine learning techniques with ICT technologies to produce advanced intrusion detection, analysis and reaction tools that provide intelligence to field equipment. This allows the field equipment to make local decisions in order to self-identify and self-react to abnormal situations introduced by cyberattacks. In this paper, an intrusion detection module capable of detecting malicious network traffic in a Supervisory Control and Data Acquisition (SCADA) system is presented. Malicious data in a SCADA system disrupt its correct functioning and tamper with its normal operation. The One-Class Support Vector Machine (OCSVM) is an intrusion detection mechanism that does not need any labeled data for training, nor any information about the kind of anomaly to be expected during the detection process. This feature makes it ideal for processing SCADA environment data and automating SCADA performance monitoring. The OCSVM module developed is trained offline on network traces and detects anomalies in the system in real time.
The module is part of an IDS (intrusion detection system) developed under the CockpitCI project and communicates with the other parts of the system through the exchange of IDMEF messages that carry information about the source of the incident, the time, and a classification of the alarm.
Keywords: cyber-security, SCADA systems, OCSVM, intrusion detection
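The one-class idea this abstract relies on (train only on normal traffic, then flag deviations without ever seeing labeled attacks) can be sketched with a deliberately simplified stand-in for OCSVM. The z-score detector below is not a support vector machine, and the traffic features (packets per second, mean payload size) are hypothetical; it only illustrates the fit-on-normal, score-anything workflow.

```python
import statistics

class OneClassDetector:
    """Train on normal traffic feature vectors only; flag a new vector
    as anomalous when any feature deviates too far from the training
    profile. A simplified stand-in for the OCSVM used in CockpitCI."""

    def __init__(self, threshold=3.0):
        self.threshold = threshold
        self.means, self.stds = [], []

    def fit(self, normal_vectors):
        cols = list(zip(*normal_vectors))
        self.means = [statistics.fmean(c) for c in cols]
        # Guard against zero-variance features with a fallback of 1.0.
        self.stds = [statistics.pstdev(c) or 1.0 for c in cols]
        return self

    def is_anomaly(self, vector):
        zscores = [abs(x - m) / s
                   for x, m, s in zip(vector, self.means, self.stds)]
        return max(zscores) > self.threshold

# Hypothetical per-flow features: [packets/s, mean payload bytes].
normal = [[50, 120], [55, 118], [48, 125], [52, 121], [49, 119]]
det = OneClassDetector().fit(normal)
print(det.is_anomaly([51, 120]))  # False: looks like training traffic
print(det.is_anomaly([500, 8]))   # True: flooding-like deviation
```

An actual OCSVM replaces the per-feature thresholds with a kernelized decision boundary around the normal data, which is what lets it capture correlated, nonlinear notions of "normal" SCADA traffic.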
Procedia PDF Downloads 552
796 Expression of DNMT Enzymes-Regulated miRNAs Involving in Epigenetic Event of Tumor and Margin Tissues in Patients with Breast Cancer
Authors: Fatemeh Zeinali Sehrig
Abstract:
Background: miRNAs play an important role in the post-transcriptional regulation of genes, including genes involved in DNA methylation (DNMTs), and are also important regulators of oncogenic pathways. The study of microRNAs and DNMTs in breast cancer allows the development of targeted treatments and the early detection of this cancer. Methods and Materials: Clinical Patients and Samples: Institutional guidelines, including ethical approval and informed consent, were followed by the Ethics Committee (ethics code: IR.IAU.TABRIZ.REC.1401.063) of Tabriz Azad University, Tabriz, Iran. In this study, tissues of 100 patients with breast cancer and tissues of 100 healthy women were collected from Noor Nejat Hospital in Tabriz. The basic characteristics of the patients with breast cancer included: 1) tumor grade (Grade 3 = 5%, Grade 2 = 87.5%, Grade 1 = 7.5%), 2) lymph node involvement (yes = 87.5%, no = 12.5%), 3) family cancer history (yes = 47.5%, no = 41.3%, unknown = 11.2%), and 4) abortion history (yes = 36.2%). In silico methods (data gathering, processing, and network building): Gene Expression Omnibus (GEO), a high-throughput genomic database, was queried for miRNA expression profiles in breast cancer. For the experimental protocol, tissue processing, total RNA isolation, complementary DNA (cDNA) synthesis, and quantitative real-time PCR (qRT-PCR) analysis were performed. Results: In the present study, we found significant (p-value < 0.05) changes in the expression levels of miRNAs and DNMTs in patients with breast cancer. In bioinformatics studies, the GEO microarray dataset, consistent with the qPCR results, showed decreased expression of miRNAs and increased expression of DNMTs in breast cancer.
Conclusion: According to the results of the present study, which showed decreased expression of miRNAs and increased expression of DNMTs in breast cancer, these genes may serve as important diagnostic and therapeutic biomarkers in breast cancer.
Keywords: gene expression omnibus, microarray dataset, breast cancer, miRNA, DNMT (DNA methyltransferases)
Procedia PDF Downloads 35
795 Phelipanche Ramosa (L.) Pomel Control in Field Tomato Crop
Authors: G. Disciglio, F. Lops, A. Carlucci, G. Gatta, A. Tarantino, L. Frabboni, F. Carriero, F. Cibelli, M. L. Raimondo, E. Tarantino
Abstract:
The Phelipanche ramosa is is an important crop whose cultivation in the Mediterranean basin is severely contained the phitoparasitic weed Phelipanche ramose. The semiarid regions of the world are considered the main center of this parasitic weed, where heavy infestation is due to the ability to produce high numbers of seeds (up to 500,000 per plant), that remain viable for extended period (more than 19 years). In this paper 12 treatments of parasitic weed control including chemical, agronomic, biological and biotechnological methods have been carried out. In 2014 a trial was performed at Foggia (southern Italy). on processing tomato (cv Docet), grown in field infested by Phelipanche ramosa, Tomato seedlings were transplant on May 5, 2014 on a clay-loam soil (USDA) fertilized by 100 kg ha-1 of N; 60 kg ha-1 of P2O5 and 20 kg ha-1 of S. Afterwards, top dressing was performed with 70 kg ha-1 of N. The randomized block design with 3 replicates was adopted. During the growing cycle of the tomato, at 56-78 and 92 days after transplantation, the number of parasitic shoots emerged in each pot was detected. At harvesting, on August 18, the major quantity-quality yield parameters were determined (marketable yield, mean weight, dry matter, pH, soluble solids and color of fruits). All data were subjected to analysis of variance (ANOVA), using the JMP software (SAS Institute Inc., Cary, NC, USA), and for comparison of means was used Tukey's test. Each treatment studied did not provide complete control against Phelipanche ramosa. However among the 12 tested methods, Fusarium, gliphosate, radicon biostimulant and Red Setter tomato cv (improved genotypes obtained by Tilling technology) proved to mitigate the virulence of the attacks of Phelipanche ramose. 
It is assumed that these effects can be improved by combining some of these treatments with each other, especially for a gradual and continuing reduction of the “seed bank” of the parasite in the soil.
Keywords: control methods, Phelipanche ramosa, tomato crop, Mediterranean basin
Procedia PDF Downloads 563
794 Sustainable Membranes Based on 2D Materials for H₂ Separation and Purification
Authors: Juan A. G. Carrio, Prasad Talluri, Sergio G. Echeverrigaray, Antonio H. Castro Neto
Abstract:
Hydrogen, as a fuel and an environmentally friendly energy carrier, is part of the transition towards low-carbon systems. The extensive deployment of hydrogen production, purification and transport infrastructures still represents a significant challenge. Independent of the production process, hydrogen is generally mixed with light hydrocarbons and other undesirable gases that need to be removed to obtain H₂ with the required purity for end applications. In this context, membranes are one of the simplest, most attractive, sustainable, and performant technologies enabling hydrogen separation and purification. They demonstrate high separation efficiencies and low energy consumption in operation, a significant leap compared to current energy-intensive alternative technologies. The unique characteristics of 2D laminates have given rise to a diversity of research on their potential applications in separation systems. Specifically, it is already known in the scientific literature that graphene oxide-based membranes present the highest reported selectivity of H₂ over other gases. This work explores the potential of a new type of 2D materials-based membrane in separating H₂ from CO₂ and CH₄. We have developed nanostructured composites based on 2D materials and applied them in the fabrication of membranes to maximise H₂ selectivity and permeability for different gas mixtures by adjusting the membranes' characteristics. Our proprietary technology does not depend on specific porous substrates, which allows its integration in diverse separation modules with different geometries and configurations, addressing the technical performance required for industrial applications and economic viability. The tuning and precise control of the processing parameters allowed us to keep membrane thicknesses below 100 nanometres to provide high permeabilities. 
Our results for the selectivity of the new nanostructured 2D materials-based membranes are in the range of the performance reported in the literature on 2D materials (such as graphene oxide) applied to hydrogen purification, which validates their use as one of the most promising next-generation hydrogen separation and purification solutions.
Keywords: membranes, 2D materials, hydrogen purification, nanocomposites
Procedia PDF Downloads 134
793 Tracking the Effect of Ibutilide on Amplitude and Frequency of Fibrillatory Intracardiac Electrograms Using the Regression Analysis
Authors: H. Hajimolahoseini, J. Hashemi, D. Redfearn
Abstract:
Background: Catheter ablation is an effective therapy for symptomatic atrial fibrillation (AF). The intracardiac electrogram (IEGM) collected during this procedure contains valuable information that has not been explored to its full capacity. Novel processing techniques allow looking at these recordings from different perspectives, which can lead to improved therapeutic approaches. In our previous study, we showed that variation in amplitude measured through Shannon entropy could be used as an AF recurrence risk stratification factor in patients who received Ibutilide before the electrograms were recorded. The aim of this study is to further investigate the effect of Ibutilide on the characteristics of signals recorded from the left atrium (LA) of patients with persistent AF before and after administration of the drug. Methods: The IEGMs collected from different intra-atrial sites of 12 patients were studied and compared before and after Ibutilide administration. First, the before and after Ibutilide IEGMs recorded within a Euclidean distance of 3 mm in the LA were selected as pairs for comparison. For every selected pair of IEGMs, the probability distribution function (PDF) of the amplitude in the time domain and of the magnitude in the frequency domain was estimated using regression analysis. The PDF represents the relative likelihood of a variable falling within a specific range of values. Results: Our observations showed that in the time domain, the PDF of amplitudes fitted a Gaussian distribution, while in the frequency domain, the PDF of magnitudes fitted a Rayleigh distribution. Our observations also revealed that after Ibutilide administration, the IEGMs had significantly narrower, short-tailed PDFs in both the time and frequency domains. Conclusion: This study shows that the PDFs of the IEGMs before and after administration of Ibutilide exhibit significantly different properties, both in the time and frequency domains. 
Hence, by fitting the PDF of the IEGMs in the time domain to a Gaussian distribution, or in the frequency domain to a Rayleigh distribution, the effect of Ibutilide can easily be tracked using the statistics of these PDFs (e.g., the standard deviation), whereas this is difficult from the IEGM waveforms themselves.
Keywords: atrial fibrillation, catheter ablation, probability distribution function, time-frequency characteristics
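The distribution-fitting step described in this abstract can be sketched as follows. This is a minimal illustration on synthetic data (the clinical IEGM recordings are not public): Gaussian parameters are fitted to time-domain amplitudes and Rayleigh parameters to frequency-domain magnitudes, and a drug effect would then show up as a change in the fitted dispersion parameters.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-ins for IEGM data: amplitudes in the time domain,
# spectral magnitudes in the frequency domain.
amplitude = rng.normal(loc=0.0, scale=0.8, size=5000)   # approximately Gaussian
magnitude = rng.rayleigh(scale=1.5, size=5000)          # approximately Rayleigh

# Fit the two parametric PDFs named in the abstract.
mu, sigma = stats.norm.fit(amplitude)
loc, scale = stats.rayleigh.fit(magnitude, floc=0)

# A narrower PDF after drug administration would appear as a smaller
# dispersion parameter (sigma for the Gaussian, scale for the Rayleigh).
print(f"Gaussian fit: mu={mu:.3f}, sigma={sigma:.3f}")
print(f"Rayleigh fit: scale={scale:.3f}")
```

Tracking the effect then reduces to comparing `sigma` and `scale` before and after administration, rather than inspecting the raw waveforms.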
Procedia PDF Downloads 159
792 Management Methods of Food Losses in Polish Processing Plants
Authors: Beata Bilska, Marzena Tomaszewska, Danuta Kolozyn-Krajewska
Abstract:
Food loss and food waste are a global problem of the modern economy. The research undertaken aimed to analyze how food is handled in catering establishments with respect to food waste and to demonstrate the main ways of managing foods/dishes not served to consumers. A survey study was conducted from January to June 2019. The selection of catering establishments participating in the study was deliberate. The study included establishments located only in the Mazowieckie Voivodeship (Poland). Forty-two completed questionnaires were collected. In some questions, answers were based on a 5-point scale of 1 to 5 (from "always"/"every day" to "never"). The survey also included closed questions with a suggested set of answers. The respondents stated that in their workplaces, cold dishes and hot ready meals are discarded every day or almost every day (23.7% and 20.5% of answers, respectively). The procedure most frequently used for dealing with dishes not served to consumers on a given day is storage at a cool temperature until the following day. In the research, one-fifth of respondents admitted that consumers "always" or "usually" leave uneaten meals on their plates, and over 41% "sometimes" do so. It was additionally found that food not used in the foodservice sector is most often thrown into a public rubbish container. Most often thrown into the public container (with communal trash) were expired products (80.0%), plate waste (80.0%) and inedible products (fruit and vegetable peels, eggshells) (77.5%). Used deep-frying oil was the item most frequently thrown into a container dedicated only to food waste (62.5%). Ten percent of respondents indicated that inedible products in their workplaces are allocated to animal feed. Food waste in the foodservice sector remains an insufficiently studied issue, as the owners of these establishments are often unwilling to disclose data on the subject. Incorrect practices for managing foods not served to consumers were observed. 
There is a need to develop educational activities for employees and management in the context of food waste management in the foodservice sector.
Keywords: food waste, inedible products, plate waste, used deep-frying oil
Procedia PDF Downloads 125
791 A Location-Based Search Approach According to Users’ Application Scenario
Authors: Shih-Ting Yang, Chih-Yun Lin, Ming-Yu Li, Jhong-Ting Syue, Wei-Ming Huang
Abstract:
The global positioning system (GPS) has become increasingly precise in recent years, and location-based services (LBS) have developed rapidly. Consider the example of finding a parking lot with a parking app: the location-based service can offer immediate information about a nearby parking lot, including the number of remaining parking spaces. However, it cannot provide search results tailored to the situational requirements of users. For that reason, this paper develops a location-based search approach according to users' application scenarios, combining location-based search with demand determination to help users obtain information consistent with their requirements. The approach consists of one mechanism and three kernel modules. First, in the Information Pre-processing Mechanism (IPM), the cosine theorem is used to categorize the locations of users. Then, in the Information Category Evaluation Module (ICEM), kNN (k-nearest neighbor) classification is employed to classify the browsing records of users. After that, in the Information Volume Level Determination Module (IVLDM), a comparison is made between the number of users clicking the information at different locations and the average number of users clicking the information at a specific location, so as to evaluate the urgency of demand; the two-dimensional space is then used to estimate the application situations of users. Finally, in the Location-based Search Module (LBSM), all search results are compared against the average number of characters of the search results, the search results are categorized with the Manhattan distance, and results are selected according to the application scenario of users. Additionally, a Web-based system was developed according to this methodology to demonstrate its practical application. 
The application scenario-based estimation and the location-based search are used to evaluate the type and abundance of the information expected by the public at a specific location, so that information demanders can obtain information consistent with their application situations at that location.
Keywords: data mining, knowledge management, location-based service, user application scenario
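Two of the building blocks named above, kNN classification of browsing records and Manhattan-distance grouping, can be sketched together in a few lines. This is an illustrative toy only: the feature vectors and labels are invented, and the abstract does not specify which distance metric its kNN step uses (the Manhattan metric is borrowed here from the LBSM description).

```python
from collections import Counter

def manhattan(a, b):
    """Manhattan (L1) distance between two feature vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def knn_classify(query, examples, k=3):
    """Classify `query` by majority vote among its k nearest labelled
    examples; `examples` is a list of (vector, label) pairs."""
    nearest = sorted(examples, key=lambda e: manhattan(query, e[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical browsing records: (clicks on parking info, clicks on dining info)
records = [
    ((9, 1), "parking"), ((8, 2), "parking"), ((7, 0), "parking"),
    ((1, 9), "dining"),  ((2, 8), "dining"),  ((0, 7), "dining"),
]
print(knn_classify((6, 2), records))  # → parking
```

In the paper's pipeline, the predicted category would then be combined with the demand-urgency estimate to rank location-based search results.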
Procedia PDF Downloads 123
790 The Impact of Gender Difference on Crop Productivity: The Case of Decha Woreda, Ethiopia
Authors: Getinet Gezahegn Gebre
Abstract:
The study examined the impact of gender differences on crop productivity in Decha woreda of the southwest Kafa zone, located 140 km from Jimma town and 460 km southwest of Addis Ababa, between Bonga town and the Omo River. The specific objectives were to assess the extent to which the agricultural production system is gender oriented, to examine access to and control over productive resources, and to estimate men's and women's productivity in agriculture. Cross-sectional data collected from a total of 140 respondents were used in this study, of whom 65 were female-headed and 75 were male-headed households. The data were analyzed using the Statistical Package for the Social Sciences (SPSS). Descriptive statistics such as frequency, mean, percentage, t-test, and chi-square were used to summarize and compare the information between the two groups. Moreover, a Cobb-Douglas (CD) production function was used to estimate the productivity difference in agriculture between male- and female-headed households. Results of the study showed that male-headed households (MHH) own more productive resources, such as land, livestock, labor, and other agricultural inputs, than female-headed households (FHH). Moreover, the estimate of the CD production function shows that livestock, herbicide use, land size, and male labor were statistically significant for MHH, while livestock, land size, herbicide use and female labor were significant variables for FHH. The crop productivity difference between MHH and FHH was about 68.83% in the study area. However, if FHH had equal access to inputs as MHH, the gross value of output would be higher by 23.58% for FHH. This might suggest that FHH would be more productive than MHH if they had equal access to inputs. 
Based on the results obtained, the following policy implications can be drawn: giving FHH access to inputs that increase agricultural productivity, such as herbicides, livestock, and male labor; increasing the productivity of land; and introducing technologies that reduce the time and energy demands on women, especially for enset processing.
Keywords: gender difference, crop, productivity, efficiency
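The Cobb-Douglas estimation used above can be sketched as a log-linear least-squares fit: taking logarithms of Y = A·land^a·labour^b·livestock^c turns the elasticities into ordinary regression coefficients. The data below are synthetic (the study's survey data are not reproduced here), so the fit simply recovers the known elasticities.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Synthetic farm inputs (hypothetical units): land (ha), labour, livestock.
land = rng.uniform(0.5, 5.0, n)
labour = rng.uniform(1.0, 10.0, n)
livestock = rng.uniform(1.0, 8.0, n)

# Cobb-Douglas: Y = A * land^a * labour^b * livestock^c * noise
A, a, b, c = 2.0, 0.5, 0.3, 0.2
output = A * land**a * labour**b * livestock**c * np.exp(rng.normal(0, 0.05, n))

# Taking logs gives a linear model: ln Y = ln A + a*ln(land) + b*ln(labour) + c*ln(livestock)
X = np.column_stack([np.ones(n), np.log(land), np.log(labour), np.log(livestock)])
coef, *_ = np.linalg.lstsq(X, np.log(output), rcond=None)
lnA_hat, a_hat, b_hat, c_hat = coef
print(a_hat, b_hat, c_hat)  # estimated elasticities, close to 0.5, 0.3, 0.2
```

Fitting the same specification separately for MHH and FHH samples is what allows the productivity gap and the counterfactual "equal access" comparison to be computed.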
Procedia PDF Downloads 96
789 Fabrication of High-Aspect Ratio Vertical Silicon Nanowire Electrode Arrays for Brain-Machine Interfaces
Authors: Su Yin Chiam, Zhipeng Ding, Guang Yang, Danny Jian Hang Tng, Peiyi Song, Geok Ing Ng, Ken-Tye Yong, Qing Xin Zhang
Abstract:
Brain-machine interfaces (BMI) are a field rich in exploration opportunities, in which manipulation of neural activity is used to interconnect with myriad forms of external devices. This research and intensive development have evolved into various areas, from the medical field through the gaming and entertainment industry to the safety and security field. The technology has been extended to therapy for neurological disorders such as obsessive-compulsive disorder and Parkinson's disease by introducing current pulses to specific regions of the brain. Nonetheless, the work to develop a brain-machine interface system for real-time observing, recording and altering of neural signals will require a significant amount of effort to overcome the obstacles to improving this system without delays in response. To date, the feature size of interface devices and the density of the electrode population remain limitations to achieving seamless performance in BMI. Currently, the size of BMI devices ranges from 10 to 100 microns in terms of electrode diameter. Henceforth, to accommodate precise monitoring at the single-cell level, smaller and denser nano-scale nanowire electrode arrays are vital. In this paper, we showcase the fabrication of high-aspect-ratio vertical silicon nanowire electrode arrays using microelectromechanical system (MEMS) methods. Nanofabrication of the nanowire electrodes involves deep reactive ion etching, thermal oxide thinning, electron-beam lithography patterning, sputtering of metal targets and a bottom anti-reflection coating (BARC) etch. Metallization of the nanowire electrode tip is a prominent process for optimizing the nanowire's electrical conductivity, and this step remains a challenge during fabrication. Metal electrodes were lithographically defined, yet these metal contacts outline a size scale that is larger than nanometer-scale building blocks, further limiting potential advantages. 
Therefore, we present an integrated contact solution that overcomes this size constraint through a self-aligned nickel silicidation process on the tips of the vertical silicon nanowire electrodes. A 4 × 4 array of vertical silicon nanowire electrodes with a diameter of 290 nm and a height of 3 µm has been successfully fabricated.
Keywords: brain-machine interfaces, microelectromechanical systems (MEMS), nanowire, nickel silicide
Procedia PDF Downloads 435
788 Exploring Bidirectional Encoder Representations from the Transformers’ Capabilities to Detect English Preposition Errors
Authors: Dylan Elliott, Katya Pertsova
Abstract:
Preposition errors are some of the most common errors made by L2 speakers. In addition, improving error correction and detection methods remains an open issue in the realm of natural language processing (NLP). This research investigates whether the Bidirectional Encoder Representations from Transformers model (BERT) has the potential to correct preposition errors accurately enough to be useful in error correction software. This research finds that BERT performs strongly when the scope of its error correction is limited to preposition choice. The researchers used an open-source BERT model and over three hundred thousand edited sentences from Wikipedia, tagged for part of speech, in which only a preposition edit had occurred. To test BERT's ability to detect errors, a technique known as multi-level masking was used to generate suggestions based on sentence context for every prepositional environment in the test data. These suggestions were compared with the original errors in the data and their known corrections to evaluate BERT's performance. The suggestions were further analyzed to determine whether BERT more often agreed with the judgements of the Wikipedia editors. Both the untrained and fine-tuned models were compared. Fine-tuning led to a greater rate of error detection, which significantly improved recall but lowered precision due to an increase in false positives, or falsely flagged errors. However, in most cases, these false positives were not errors in preposition usage but merely cases where more than one preposition was possible. Furthermore, when BERT correctly identified an error, the model largely agreed with the Wikipedia editors, suggesting that BERT's ability to detect misused prepositions is better than previously believed. 
To evaluate to what extent BERT's false positives were grammatical suggestions, we plan a further crowd-sourcing study to test the grammaticality of BERT's suggested sentence corrections against native speakers' judgments.
Keywords: BERT, grammatical error correction, preposition error detection, prepositions
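The precision/recall trade-off described for the fine-tuned model can be made concrete with a small scoring helper. The token positions below are invented for illustration and are not the study's data; the point is only how more aggressive flagging raises recall while lowering precision.

```python
def precision_recall(flagged, true_errors):
    """Precision and recall for a set of positions a model flagged as
    preposition errors, against the gold-standard error positions."""
    flagged, true_errors = set(flagged), set(true_errors)
    tp = len(flagged & true_errors)  # true positives: correctly flagged
    precision = tp / len(flagged) if flagged else 0.0
    recall = tp / len(true_errors) if true_errors else 0.0
    return precision, recall

# Hypothetical token positions of preposition errors in a test corpus.
gold = {3, 17, 42, 58}
untrained = {17, 42}                   # conservative: few flags, all correct
fine_tuned = {3, 17, 42, 58, 70, 91}   # aggressive: recall up, precision down

print(precision_recall(untrained, gold))    # (1.0, 0.5)
print(precision_recall(fine_tuned, gold))   # (~0.67, 1.0)
```

As the abstract notes, some of the fine-tuned model's "false positives" (positions 70 and 91 here) may in fact be acceptable alternative prepositions, which is why a grammaticality study of the flagged cases is planned.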
Procedia PDF Downloads 147
787 Processing and Evaluation of Jute Fiber Reinforced Hybrid Composites
Authors: Mohammad W. Dewan, Jahangir Alam, Khurshida Sharmin
Abstract:
Synthetic fibers (carbon, glass, aramid, etc.) are generally utilized to make composite materials with better mechanical and thermal properties. However, they are expensive and non-biodegradable. From the perspective of Bangladesh, jute fibers are available, inexpensive, and possess good mechanical properties. The favorable properties of natural fibers (low cost, low density, eco-friendliness) have made them a promising reinforcement in hybrid composites without sacrificing mechanical properties. In this study, jute and E-glass fiber reinforced hybrid composite materials were fabricated using hand lay-up followed by a compression molding technique. A room-temperature-cured two-part epoxy resin was used as the matrix. Approximately 6-7 mm thick composite panels were fabricated utilizing 17 layers of woven glass and jute fibers with different fiber layering sequences: only jute, only glass, glass and jute alternating (g/j/g/j---), and 4 glass - 9 jute - 4 glass (4g-9j-4g). The fabricated composite panels were analyzed through fiber volume calculation, tensile testing, bending testing, and water absorption testing. The hybridization of jute and glass fiber results in better tensile, bending, and water absorption properties than only-jute fiber-reinforced composites, but inferior properties compared to only-glass fiber-reinforced composites. Among the different fiber layering sequences, the 4g-9j-4g layering sequence resulted in the best tensile, bending, and water absorption properties. The effects of chemical treatment of the woven jute fiber and of chopped glass microfiber infusion were also investigated in this study. The chemically treated jute fiber and 2 wt. % chopped glass microfiber infused hybrid composite shows about 12% improvement in flexural strength compared to the untreated, non-infused hybrid composite panel. 
However, fiber chemical treatment and micro-filler do not have a significant effect on tensile strength.
Keywords: compression molding, chemical treatment, hybrid composites, mechanical properties
Procedia PDF Downloads 158
786 The Impact of Sign Language on Generating and Maintaining a Mental Image
Authors: Yi-Shiuan Chiu
Abstract:
Deaf signers have been found to have better mental image performance than hearing non-signers. The goal of this study was to investigate the ability to generate mental images, to maintain them, and to manipulate them in deaf signers of Taiwanese Sign Language (TSL). In the visual image task, participants first memorized digits formed in cells of a 4 × 5 grid. After the presentation of a cue, a Chinese digit character shown on top of a blank cell, participants had to form a mental image of the corresponding digit. When shown a probe, a grid containing a red circle, participants had to decide as quickly as possible whether the probe would have been covered by the mental image of the digit. The ISI (interstimulus interval) between cue and probe was manipulated. In Experiment 1, 24 deaf signers and 24 hearing non-signers were asked to perform image generation tasks (ISI: 200, 400 ms) and image maintenance tasks (ISI: 800, 2000 ms). The results showed that deaf signers had an enhanced ability to generate and maintain mental images. To explore the process of mental imagery, in Experiment 2, 30 deaf signers and 30 hearing non-signers were asked to perform visual search while maintaining a mental image. Between a digit image cue and a red circle probe, participants performed a visual search task to determine whether a target triangle's apex pointed to the right or left. When there was only one triangle in the search task, the results showed that deaf signers and hearing non-signers had similar visual search performance, in which search targets at mental image locations were facilitated. However, deaf signers maintained mental images better and faster than non-signers. In Experiment 3, we increased the number of triangles to 4 to raise the difficulty of the visual search task. The results showed that deaf participants performed more accurately in the visual search and image maintenance tasks. 
The results suggested that people may use eye movements as a mnemonic strategy to maintain a mental image, and that deaf signers have an enhanced ability to resist the interference of eye movements when there are fewer distractors. In sum, these findings suggest that deaf signers have enhanced mental image processing.
Keywords: deaf signers, image maintenance, mental image, visual search
Procedia PDF Downloads 154
785 River Network Delineation from Sentinel 1 Synthetic Aperture Radar Data
Authors: Christopher B. Obida, George A. Blackburn, James D. Whyatt, Kirk T. Semple
Abstract:
In many regions of the world, especially in developing countries, river network data are outdated or completely absent, yet such information is critical for supporting important functions such as flood mitigation efforts, land use and transportation planning, and the management of water resources. In this study, a method was developed for delineating river networks using Sentinel 1 imagery. Unsupervised classification was applied to multi-temporal Sentinel 1 data to discriminate water bodies from other land covers, then the outputs were combined to generate a single persistent water bodies product. A thinning algorithm was then used to delineate river centre lines, which were converted into vector features and built into a topologically structured geometric network. The complex river system of the Niger Delta was used to compare the performance of the Sentinel-based method against alternative freely available water body products from the United States Geological Survey, the European Space Agency and OpenStreetMap, and a river network derived from a Shuttle Radar Topography Mission Digital Elevation Model. From both raster-based and vector-based accuracy assessments, it was found that the Sentinel-based river network products were superior to the comparator data sets by a substantial margin. The geometric river network that was constructed permitted a flow routing analysis, which is important for a variety of environmental management and planning applications. The extracted network will potentially be applied to modelling the dispersion of hydrocarbon pollutants in Ogoniland, a part of the Niger Delta. The approach developed in this study holds considerable potential for generating up-to-date, detailed river network data for the many countries where such data are deficient.
Keywords: Sentinel 1, image processing, river delineation, large scale mapping, data comparison, geometric network
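The final step above, turning a thinned (one-pixel-wide) centreline raster into a topologically structured network that supports flow routing, can be sketched on a toy grid. The raster below is invented; the real workflow operates on Sentinel-derived water masks, but the graph-building idea (river pixels become nodes, 8-connected neighbours become edges, routing is a graph traversal) is the same.

```python
from collections import deque

# A tiny, hypothetical one-pixel-wide centreline raster (1 = river pixel),
# standing in for a thinned Sentinel-1 water mask.
grid = [
    [0, 1, 0, 0, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
]

# Build an 8-connected adjacency list: each river pixel becomes a node.
nodes = {(r, c) for r, row in enumerate(grid) for c, v in enumerate(row) if v}
adj = {n: [(n[0] + dr, n[1] + dc)
           for dr in (-1, 0, 1) for dc in (-1, 0, 1)
           if (dr, dc) != (0, 0) and (n[0] + dr, n[1] + dc) in nodes]
       for n in nodes}

def route(start, goal):
    """Breadth-first flow routing along the network: returns the shortest
    pixel path from `start` to `goal`, or None if disconnected."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adj[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(route((0, 1), (4, 2)))
```

In a production setting, the nodes would carry geographic coordinates and the traversal would follow flow direction, but the topology construction is the essential enabler of the routing analysis described.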
Procedia PDF Downloads 139
784 Enhanced Disk-Based Databases towards Improved Hybrid in-Memory Systems
Authors: Samuel Kaspi, Sitalakshmi Venkatraman
Abstract:
In-memory database systems are becoming popular due to the availability and affordability of sufficiently large RAM and processors in modern high-end servers with the capacity to manage large in-memory database transactions. While fast and reliable in-memory systems are still being developed to overcome cache misses, CPU/IO bottlenecks and distributed transaction costs, disk-based data stores still serve as the primary persistence. In addition, with the recent growth in multi-tenancy cloud applications and associated security concerns, many organisations consider the trade-offs and continue to require the fast and reliable transaction processing of disk-based database systems as an available choice. For these organizations, the only way of increasing throughput is by improving the performance of disk-based concurrency control. This warrants a hybrid database system with the ability to selectively apply enhanced disk-based data management within the context of in-memory systems, which would help improve overall throughput. The general view is that in-memory systems substantially outperform disk-based systems. We question this assumption and examine how a modified variation of access invariance, which we call enhanced memory access (EMA), can be used to allow very high levels of concurrency in the pre-fetching of data in disk-based systems. We demonstrate how this prefetching in disk-based systems can yield close to in-memory performance, which paves the way for improved hybrid database systems. This paper proposes a novel EMA technique and presents a comparative study between disk-based EMA systems and in-memory systems running on hardware configurations of equivalent power in terms of the number of processors and their speeds. The results of the experiments conducted clearly substantiate that, when used in conjunction with all concurrency control mechanisms, EMA can increase the throughput of disk-based systems to levels quite close to those achieved by in-memory systems. 
The promising results of this work show that enhanced disk-based systems can help improve hybrid data management within the broader context of in-memory systems.
Keywords: in-memory database, disk-based system, hybrid database, concurrency control
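The pre-fetching idea behind the abstract's argument can be sketched conceptually: if a transaction's read set is known before it executes (the essence of access invariance), those pages can be fetched from disk concurrently, so the transaction then runs against a warmed cache with no disk stalls. This is a simplified illustration only, with simulated disk latency and an invented page store; it is not the paper's EMA implementation.

```python
import time
from concurrent.futures import ThreadPoolExecutor

DISK_LATENCY = 0.01  # seconds; simulated cost of one disk page read
disk = {page: f"data-{page}" for page in range(100)}  # simulated disk store
cache = {}  # in-memory page cache

def fetch(page):
    """Simulated disk read with fixed latency."""
    time.sleep(DISK_LATENCY)
    return disk[page]

def prefetch(read_set):
    """Fetch a transaction's declared read set concurrently; with access
    invariance, the pattern is known before the transaction executes."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        for page, data in zip(read_set, pool.map(fetch, read_set)):
            cache[page] = data

def run_transaction(read_set):
    """Execute against the warmed cache; no disk stalls if prefetched."""
    return [cache[p] for p in read_set]

read_set = [3, 14, 15, 92, 65, 35]
prefetch(read_set)
print(run_transaction(read_set))
```

With 8 concurrent fetchers, the six reads overlap instead of paying six sequential disk latencies, which is the mechanism by which concurrent prefetching narrows the gap to in-memory performance.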
Procedia PDF Downloads 417
783 Level Set Based Extraction and Update of Lake Contours Using Multi-Temporal Satellite Images
Authors: Yindi Zhao, Yun Zhang, Silu Xia, Lixin Wu
Abstract:
The contours and areas of water surfaces, especially lakes, often change due to natural disasters and construction activities. Extracting and updating water contours from satellite images using image processing algorithms is an effective approach. However, producing optimal water surface contours that are close to the true boundaries is still a challenging task. This paper compares the performance of three different level set models for extracting lake contours: the Chan-Vese (CV) model, the signed pressure force (SPF) model, and the region-scalable fitting (RSF) energy model. Experimental testing indicated that the RSF model, in which a region-scalable fitting energy functional is defined and incorporated into a variational level set formulation, is superior to CV and SPF and can produce desirable contour lines when there are “holes” in the water regions, such as islands in a lake. Therefore, the RSF model is applied to extracting lake contours from Landsat satellite images. Four Landsat images, from the years 2000, 2005, 2010, and 2014, are used in our study. All of them were acquired in May, with the same path/row (121/036), covering Xuzhou City, Jiangsu Province, China. Firstly, the near-infrared (NIR) band is selected for water extraction. Image registration is conducted on the NIR bands of the different temporal images for information updating, and linear stretching is applied in order to distinguish water from other land cover types. Then, for the first image, acquired in 2000, lake contours are extracted via the RSF model initialized with user-defined rectangles. Afterwards, using the lake contours extracted from the previous temporal image as the initialization, lake contours are updated for the current temporal image by means of the RSF model. Meanwhile, the changed and unchanged lakes are also detected. The results show that great changes have taken place in two lakes, i.e. 
Dalong Lake and Panan Lake, and that RSF can effectively extract and update lake contours using multi-temporal satellite images.
Keywords: level set model, multi-temporal image, lake contour extraction, contour update
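The two-phase idea underlying the CV/RSF family, and the initialize-from-the-previous-epoch update strategy, can be illustrated with a deliberately crude, level-set-free sketch: iterate the mean intensities inside and outside the current mask and reassign pixels to the nearer mean. The real models evolve a level set function variationally; this simplification only conveys the piecewise-constant intuition, on a synthetic image.

```python
import numpy as np

def two_phase_segment(image, init_mask, iters=20):
    """Iteratively refine a binary water mask: compute the mean intensity
    inside (c1) and outside (c2) the mask, then reassign every pixel to
    the closer mean. A crude stand-in for CV-style two-phase segmentation."""
    mask = init_mask.copy()
    for _ in range(iters):
        c1 = image[mask].mean()
        c2 = image[~mask].mean()
        mask = np.abs(image - c1) < np.abs(image - c2)
    return mask

rng = np.random.default_rng(2)
image = rng.normal(0.8, 0.05, (64, 64))                # bright "land"
image[20:44, 20:44] = rng.normal(0.2, 0.05, (24, 24))  # dark "lake"

# Initialize from a rough previous-epoch mask, mirroring the paper's use of
# the prior image's contours as the starting point for the update.
init = np.zeros((64, 64), dtype=bool)
init[24:40, 24:40] = True
mask = two_phase_segment(image, init)
print(mask.sum())  # close to 24 * 24 = 576 lake pixels
```

Starting from the previous epoch's mask means only boundary pixels need to be relabelled, which is why the update across temporal images converges quickly in practice.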
Procedia PDF Downloads 366
782 E4D-MP: Time-Lapse Multiphysics Simulation and Joint Inversion Toolset for Large-Scale Subsurface Imaging
Authors: Zhuanfang Fred Zhang, Tim C. Johnson, Yilin Fang, Chris E. Strickland
Abstract:
A variety of geophysical techniques are available to image the opaque subsurface with little or no contact with the soil. It is common to conduct time-lapse surveys of different types at a given site for improved subsurface imaging results. Regardless of the chosen survey methods, it is often a challenge to process the massive amount of survey data. The currently available software applications are generally based on one-dimensional assumptions and designed for desktop personal computers. Hence, they are usually incapable of imaging three-dimensional (3D) processes/variables in the subsurface at reasonable spatial scales, and the maximum amount of data that can be inverted simultaneously is often very small due to the capability limitations of personal computers. High-performance, integrative software that enables real-time integration of multiple geophysical methods is therefore needed. E4D-MP enables the integration and inversion of time-lapse, large-scale survey data from geophysical methods. Using supercomputing capabilities and parallel computation algorithms, E4D-MP is capable of processing data across vast spatiotemporal scales and in near real time. The main code and the modules of E4D-MP for inverting individual or combined data sets of time-lapse 3D electrical resistivity, spectral induced polarization, and gravity surveys have been developed and demonstrated for subsurface imaging. E4D-MP provides the capability of imaging the processes (e.g., liquid or gas flow, solute transport, cavity development) and subsurface properties (e.g., rock/soil density, conductivity) critical for the successful control of environmental engineering efforts such as environmental remediation, carbon sequestration, geothermal exploration, and mine land reclamation, among others.
Keywords: gravity survey, high-performance computing, sub-surface monitoring, electrical resistivity tomography
Procedia PDF Downloads 157
781 Revalidation and Harmonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precision Optimization and External Quality Assurance Programs
Authors: Junaid Mahmood Alam
Abstract:
Revalidating and harmonizing clinical chemistry analytical principles, and optimizing methods through quality control programs and assessments, is the preeminent means of attaining optimal outcomes within clinical laboratory services. The present study reports the revalidation of our existing IFCC-standardized analytical methods, particularly hepatic and thyroid function tests, by optimization of precision analyses and processing through external and internal quality assessments and regression determination. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I) and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunology analyzers, namely the Hitachi 912, Cobas 6000 e601, Cobas c501 and Cobas e411, with UV kinetic and colorimetric dry chemistry principles and electrochemiluminescence immunoassay (ECLi) techniques. The process of validation and revalidation was completed by evaluating and assessing the precision-analyzed PreciControl data of the various instruments, plotted against each other with regression (R²) analyses. Results showed that revalidation and optimization of the respective parameters, which were accredited through CAP, CLSI and NEQAPP assessments, depicted 99.0% to 99.8% optimization, in addition to the methodology and instruments used for the analyses. The regression R² value for BilT was 0.996, whereas ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3, and FT4 exhibited R² values of 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997, and 0.990, respectively. This confirmed marked harmonization of the analytical methods and instrumentation, thus revalidating optimized precision standardization as per IFCC-recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals. 
This will ensure delivery of standardized, proficiency-tested, optimized services for prompt and better patient care, guaranteeing maximum patient confidence.
Keywords: revalidation, standardized, IFCC, CAP, harmonized
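The cross-instrument regression step described above can be sketched in a few lines. This is a minimal illustration, not the study's procedure: the function fits paired quality-control results from two analyzers and reports the slope, intercept and R² used to judge harmonization. The instrument names and the ALT control values are hypothetical stand-ins.

```python
# Hypothetical sketch: regressing paired quality-control results from two
# analyzers (e.g. Hitachi 912 vs. Cobas c501) to obtain the R^2 used for
# method revalidation. All values below are illustrative, not study data.
import numpy as np

def regression_r2(x, y):
    """Return slope, intercept and R^2 of a least-squares fit of y on x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slope, intercept = np.polyfit(x, y, 1)
    y_hat = slope * x + intercept
    ss_res = np.sum((y - y_hat) ** 2)          # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)       # total sum of squares
    return slope, intercept, 1.0 - ss_res / ss_tot

# Illustrative paired ALT control values (U/L) from the two instruments
hitachi = [32.1, 48.7, 75.3, 110.2, 152.8]
cobas   = [31.8, 49.5, 74.6, 111.0, 153.9]
slope, intercept, r2 = regression_r2(hitachi, cobas)
print(f"slope={slope:.3f}  intercept={intercept:.2f}  R^2={r2:.4f}")
```

An R² close to 1 with slope near 1 indicates the two instruments report interchangeable results for that analyte, which is the harmonization criterion the abstract applies.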
Procedia PDF Downloads 269
780 Molecular Topology and TLC Retention Behaviour of s-Triazines: QSRR Study
Authors: Lidija R. Jevrić, Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević
Abstract:
Quantitative structure-retention relationship (QSRR) analysis was used to predict the chromatographic behaviour of s-triazine derivatives from theoretical descriptors computed from the chemical structure. The fundamental basis of the reported investigation is to relate molecular topological descriptors to the chromatographic behaviour of s-triazine derivatives obtained by reversed-phase (RP) thin-layer chromatography (TLC) on silica gel impregnated with paraffin oil, with ethanol-water mobile phases (φ = 0.5-0.8, v/v). The retention parameter (RM0) of the 14 investigated s-triazine derivatives was used as the dependent variable, while simple connectivity indices of different orders were used as independent variables. The best QSRR model for predicting the RM0 value was obtained with the simple third-order connectivity index (3χ) in a second-degree polynomial equation. The numerical values of the correlation coefficient (r = 0.915), Fisher's value (F = 28.34) and root mean square error (RMSE = 0.36) indicate that the model is statistically significant. To test the predictive power of the QSRR model, the leave-one-out cross-validation technique was applied. The parameters of the internal cross-validation analysis (r²CV = 0.79, r²adj = 0.81, PRESS = 1.89) reflect the high predictive ability of the generated model and confirm that it can be used to predict the RM0 value. A multivariate classification technique, hierarchical cluster analysis (HCA), was applied to group the molecules according to their molecular connectivity indices. HCA is a descriptive statistical method and is among those most frequently used for classification, an important area of data processing. The HCA performed on the simple molecular connectivity indices obtained from the 2D structures of the investigated s-triazine compounds resulted in two main clusters, in which the molecules were grouped according to the number of atoms in the molecule.
This is in agreement with the fact that these descriptors were calculated on the basis of the number of atoms in the molecules of the investigated s-triazine derivatives.
Keywords: s-triazines, QSRR, chemometrics, chromatography, molecular descriptors
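The model-validation step described above (a second-degree polynomial in 3χ, checked by leave-one-out cross-validation) can be sketched as follows. The 14 descriptor/retention pairs below are synthetic stand-ins constructed around an assumed quadratic trend, not the study's measurements; only the workflow (fit, hold-one-out prediction, PRESS, r²CV) mirrors the abstract.

```python
# Sketch of QSRR validation: fit RM0 as a second-degree polynomial in the
# third-order connectivity index (3chi) and estimate predictive power by
# leave-one-out cross-validation. The data are synthetic stand-ins.
import numpy as np

chi3 = np.array([1.2, 1.5, 1.8, 2.0, 2.3, 2.5, 2.8,
                 3.0, 3.2, 3.5, 3.7, 4.0, 4.2, 4.5])
noise = np.array([0.03, -0.02, 0.04, -0.05, 0.02, 0.01, -0.03,
                  0.05, -0.04, 0.02, -0.01, 0.03, -0.02, 0.01])
rm0 = 0.9 + 1.1 * chi3 - 0.08 * chi3**2 + noise   # assumed quadratic trend

def loo_press(x, y, degree=2):
    """Leave-one-out PRESS and cross-validated r^2 for a polynomial fit."""
    press = 0.0
    for i in range(len(x)):
        mask = np.arange(len(x)) != i              # hold out point i
        coeffs = np.polyfit(x[mask], y[mask], degree)
        press += (y[i] - np.polyval(coeffs, x[i])) ** 2
    r2_cv = 1.0 - press / np.sum((y - y.mean()) ** 2)
    return press, r2_cv

press, r2_cv = loo_press(chi3, rm0)
print(f"PRESS={press:.3f}  r2_cv={r2_cv:.3f}")
```

As in the abstract, a PRESS that is small relative to the total sum of squares (and hence an r²CV well above zero) indicates the polynomial generalizes beyond the fitted points.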
Procedia PDF Downloads 393
779 An Evaluation of the Influence of Corn Cob Ash on the Strength Parameters of Lateritic Soils
Authors: O. A. Apampa, Y. A. Jimoh
Abstract:
The paper reports an investigation of corn cob ash (CCA) as a chemical stabilizing agent for laterite soils. Corn cob feedstock was obtained from Maya, a rural community in the derived-savannah agro-ecological zone of south-western Nigeria, and burnt to ashes of pozzolanic quality. Reddish-brown silty clayey sand, characterized as AASHTO A-2-6(3) lateritic material, was obtained from a borrow pit in Abeokuta and subjected to strength characterization tests according to BS 1377:2000. The soil was subsequently mixed with CCA in varying percentages of 0-7.5% at 1.5% intervals. The influence of CCA stabilization was determined for the Atterberg limits, compaction characteristics, CBR and unconfined compressive strength. The tests were repeated on a laterite soil-cement mixture in order to establish a basis for comparison. The results show a similarity in the compaction characteristics of soil-cement and soil-CCA: with increasing addition of binder from 1.5% to 7.5%, the maximum dry density progressively declined while the optimum moisture content steadily increased. For the CBR, the maximum positive impact was observed at 1.5% CCA addition, at a value of 85% compared to the control value of 65% for the cement stabilization, but the CBR declined steadily thereafter with increasing addition of CCA, while that of soil-cement continued to increase with cement addition beyond 1.5%, though at a relatively slow rate. Similar behaviour was observed in the UCS values: the soil-CCA mix increased from a control value of 0.4 MN/m² to 1.0 MN/m² at 1.5% CCA and declined thereafter, while soil-cement continued to increase with increasing cement addition, but at a slower rate.
This paper demonstrates that CCA is effective for the chemical stabilization of a typical Nigerian AASHTO A-2-6 lateritic soil at a maximum stabilizer content of 1.5%. Given the economic and technical feasibility of processing the cobs, its use is recommended as a way of finding further application for agricultural waste products and achieving environmental sustainability, in line with the ideals of the Millennium Development Goals.
Keywords: corn cob ash, pozzolan, cement, laterite, stabilizing agent, cation exchange capacity
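The dosage-selection logic behind the 1.5% recommendation can be sketched simply: tabulate strength results against stabilizer content and take the dosage at the peak. The CBR and UCS arrays below only mirror the trend reported in the abstract (peak at 1.5% CCA, decline thereafter); the intermediate values are illustrative assumptions.

```python
# Hedged sketch: picking the optimum stabilizer content from strength tests.
# Only the 0% and 1.5% CBR/UCS values come from the abstract; the rest of
# the declining trend is illustrative.
cca_pct = [0.0, 1.5, 3.0, 4.5, 6.0, 7.5]   # CCA addition, % by mass
cbr_pct = [65, 85, 78, 72, 68, 64]         # California Bearing Ratio, %
ucs_mpa = [0.4, 1.0, 0.8, 0.7, 0.6, 0.5]   # unconfined compressive strength, MN/m^2

def optimum_content(dosages, strengths):
    """Return the dosage at which the measured strength peaks."""
    best = max(range(len(dosages)), key=lambda i: strengths[i])
    return dosages[best]

print(optimum_content(cca_pct, cbr_pct))   # dosage at peak CBR
print(optimum_content(cca_pct, ucs_mpa))   # dosage at peak UCS
```

Both criteria point to the same dosage here, matching the abstract's conclusion that strength gains from CCA are exhausted beyond 1.5%.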
Procedia PDF Downloads 297