Search results for: 2d and 3d data conversion
23961 One-Pot Synthesis of 5-Hydroxymethylfurfural from Hexose Sugar over Chromium Impregnated Zeolite Based Catalyst, Cr/H-ZSM-5
Authors: Samuel K. Degife, Kamal K. Pant, Sapna Jain
Abstract:
The world's population and the industrialization of countries continue to grow at an alarming rate, irrespective of the security of food, energy supply, and pure water availability. As a result, global energy consumption has increased significantly. Fossil energy resources, mainly comprising crude oil, coal, and natural gas, have served as mankind's main energy source for almost two centuries. However, sufficient evidence reveals that the consumption of fossil resources as transportation fuel emits environmental pollutants such as CO2, NOx, and SOx. These resources are dwindling rapidly, in addition to the enormous problems associated with them, such as fluctuating oil prices and the instability of oil-rich regions. Biomass is a promising renewable energy candidate to replace fossil-based transportation fuels and chemical production. The present study aims at the valorization of hexose sugars (glucose and fructose) using zeolite-based catalysts in an imidazolium-based ionic liquid (1-butyl-3-methylimidazolium chloride, [BMIM]Cl) reaction medium. The catalytic effect of chromium-impregnated H-ZSM-5 (Cr/H-ZSM-5) was studied for the dehydration of hexose sugars. The wet impregnation method was used to prepare the Cr/H-ZSM-5 catalyst. The prepared catalyst was characterized using techniques such as Fourier transform infrared spectroscopy (FT-IR), X-ray diffraction analysis (XRD), temperature-programmed desorption of ammonia (NH3-TPD), and BET surface area analysis. The dehydration product, 5-hydroxymethylfurfural (5-HMF), was analyzed using high-performance liquid chromatography (HPLC). Cr/H-ZSM-5 was effective in dehydrating fructose, with 87% conversion and a 55% yield of 5-HMF at 180 °C after 30 min of reaction time, compared with the H-ZSM-5 catalyst, which yielded only 31% 5-HMF under identical reaction conditions.
Keywords: chromium, hexose, ionic liquid, zeolite
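Headline figures of this kind (87% conversion, 55% yield) follow from simple mole balances. A minimal sketch with illustrative numbers only (not the study's raw data):

```python
def conversion_pct(moles_in: float, moles_unreacted: float) -> float:
    """Fraction of the starting sugar that reacted, in percent."""
    return 100.0 * (moles_in - moles_unreacted) / moles_in

def yield_pct(moles_product: float, moles_in: float) -> float:
    """Moles of 5-HMF formed per mole of sugar fed, in percent."""
    return 100.0 * moles_product / moles_in

def selectivity_pct(moles_product: float, moles_in: float, moles_unreacted: float) -> float:
    """Yield divided by conversion: how much of the reacted sugar became 5-HMF."""
    return 100.0 * moles_product / (moles_in - moles_unreacted)

# Hypothetical figures: 1.00 mol fructose fed, 0.13 mol left unreacted,
# 0.55 mol 5-HMF recovered.
print(round(conversion_pct(1.00, 0.13), 1))   # 87.0
print(round(yield_pct(0.55, 1.00), 1))        # 55.0
print(round(selectivity_pct(0.55, 1.00, 0.13), 1))  # 63.2
```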
Procedia PDF Downloads 179
23960 Multichannel Analysis of the Surface Waves of Earth Materials in Some Parts of Lagos State, Nigeria
Authors: R. B. Adegbola, K. F. Oyedele, L. Adeoti
Abstract:
We present a method that utilizes Multichannel Analysis of Surface Waves (MASW) to measure shear wave velocities, with a view to establishing the probable causes of road failure, subsidence, and weakening of structures in some Local Government Areas of Lagos, Nigeria. MASW data were acquired using a 24-channel seismograph. The acquired data were processed and transformed into a two-dimensional (2-D) structure reflective of depth and surface wave velocity distribution within a depth of 0-15 m beneath the surface, using SURFSEIS software. The shear wave velocity data were compared with other geophysical/borehole data acquired along the same profile. The comparison and correlation illustrate the accuracy and consistency of MASW-derived shear wave velocity profiles. Rigidity modulus and N-values were also generated. The study showed that the low and very low velocities are reflective of organic clay/peat materials and are thus likely responsible for the failure, subsidence, and weakening of structures within the study areas.
Keywords: seismograph, road failure, rigidity modulus, N-value, subsidence
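The rigidity (shear) modulus mentioned above is conventionally derived from shear wave velocity as G = ρVs². A minimal sketch, with illustrative soil values rather than the survey's measurements:

```python
def rigidity_modulus(density_kg_m3: float, vs_m_s: float) -> float:
    """Small-strain shear (rigidity) modulus G = rho * Vs^2, in pascals."""
    return density_kg_m3 * vs_m_s ** 2

# Hypothetical soft organic clay: density ~1500 kg/m^3, low Vs of 80 m/s.
g_pa = rigidity_modulus(1500.0, 80.0)
print(g_pa / 1e6)  # 9.6 MPa: a very low stiffness, consistent with peat/organic clay
```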
Procedia PDF Downloads 368
23959 Use of Statistical Correlations for the Estimation of Shear Wave Velocity from Standard Penetration Test N-Values: Case Study of Algiers Area
Authors: Soumia Merat, Lynda Djerbal, Ramdane Bahar, Mohammed Amin Benbouras
Abstract:
Along with shear wave velocity, many soil parameters are associated with the standard penetration test (SPT) as a dynamic in situ experiment. SPT-N data and geophysical data do not often exist for the same area. Statistical analysis of the correlation between these parameters is an alternative method to estimate Vₛ conveniently and without additional investigations or data acquisition. Shear wave velocity is a basic engineering tool required to define the dynamic properties of soils. In many instances, engineers opt for empirical correlations between shear wave velocity (Vₛ) and reliable static field test data, such as standard penetration test (SPT) N-values, CPT (cone penetration test) values, etc., to estimate shear wave velocity or dynamic soil parameters. The relation between Vₛ and SPT-N values for the Algiers area is predicted using the collected data, and it is also compared with previously suggested formulas for Vₛ determination by measuring the root mean square error (RMSE) of each model. The Algiers area is situated in a high seismic zone (Zone III [RPA 2003: règlement parasismique algérien]); therefore, the study is important for this region. The principal aim of this paper is to compare the field measurements of the down-hole test with the empirical models, to show which of the proposed formulas is applicable for predicting shear wave velocity values.
Keywords: empirical models, RMSE, shear wave velocity, standard penetration test
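The model-ranking step described here can be sketched as follows. The power-law coefficients and the paired SPT-N/down-hole data below are illustrative placeholders, not the fitted Algiers values; empirical Vs-N correlations commonly take the form Vs = a·N^b:

```python
import math

# Hypothetical power-law correlations Vs = a * N^b (placeholder coefficients).
MODELS = {
    "model_A": lambda n: 97.0 * n ** 0.314,
    "model_B": lambda n: 76.0 * n ** 0.430,
}

def rmse(predicted, measured):
    """Root mean square error between predicted and measured Vs (m/s)."""
    return math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured)) / len(measured))

# Paired SPT-N values and down-hole Vs measurements (fabricated for illustration).
spt_n = [5, 12, 20, 35, 50]
vs_downhole = [160.0, 210.0, 250.0, 300.0, 340.0]

for name, model in MODELS.items():
    predictions = [model(n) for n in spt_n]
    print(name, round(rmse(predictions, vs_downhole), 1))
```

The correlation with the lowest RMSE against the down-hole measurements is the one judged applicable for the area.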
Procedia PDF Downloads 341
23958 Exploring NLP for Mental Health Insights: Multi-Class Classification of Online Forum Texts
Authors: Jennifer Patricia
Abstract:
With the increasing incidence of mental health issues, there is a real need for early detection, which is currently limited by stigma and ignorance. This study explores multi-class classification models to analyze mental health problems through social media texts. The goal of the classification model is to categorize text into one of six categories of mental health problems and thus to identify patterns of language which might serve as an early indication of these problems. After data collection and labeling, the dataset was resampled to balance the classes for model training. Important data preprocessing steps included tokenization, removal of unnecessary characters and labels, and one-hot encoding. To further understand the language used in expressing the different conditions, word cloud and bigram analyses were conducted. The models used for initial training were BERT + XGBoost, T5, and MentalBERT. The final results demonstrated that T5 and MentalBERT achieved the highest accuracy of 0.83, significantly outperforming BERT + XGBoost, which obtained an accuracy of 0.6.
Keywords: mental health detection, exploratory data analysis, natural language processing, multi-class classification, data preprocessing, BERT, XGBoost, T5, MentalBERT
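The resampling step can be balanced in several ways; random oversampling of minority classes is one simple choice. A minimal stdlib-only sketch (the class labels below are illustrative, not the paper's six categories):

```python
import random
from collections import Counter

def oversample_to_balance(texts, labels, seed=0):
    """Randomly duplicate minority-class examples until every class
    matches the size of the largest class."""
    rng = random.Random(seed)
    by_class = {}
    for text, label in zip(texts, labels):
        by_class.setdefault(label, []).append(text)
    target = max(len(items) for items in by_class.values())
    out_texts, out_labels = [], []
    for label, items in by_class.items():
        # keep all originals, then draw random duplicates up to the target size
        sampled = items + [rng.choice(items) for _ in range(target - len(items))]
        out_texts.extend(sampled)
        out_labels.extend([label] * target)
    return out_texts, out_labels

texts = ["a", "b", "c", "d", "e"]
labels = ["anxiety", "anxiety", "anxiety", "depression", "stress"]
bt, bl = oversample_to_balance(texts, labels)
print(Counter(bl))  # every class now has 3 examples
```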
Procedia PDF Downloads 4
23957 A New Authenticable Steganographic Method via the Use of Numeric Data on Public Websites
Authors: Che-Wei Lee, Bay-Erl Lai
Abstract:
A new steganographic method using numeric data on public websites, with self-authentication capability, is proposed. The proposed technique transforms a secret message into partial shares by Shamir's (k, n)-threshold secret sharing scheme with n = k + 1. The generated k+1 partial shares are then embedded into selected numeric items on a website as if they were part of the website's numeric content. Afterward, a receiver links to the website and extracts each combination of k shares among the k+1 from the stego numeric content to compute k+1 copies of the secret; the value consistency of the computed k+1 copies is taken as evidence to determine whether the extracted message is authentic, attaining the goal of self-authentication of the extracted secret message. Experimental results and discussions are provided to show the feasibility and effectiveness of the proposed method.
Keywords: steganography, data hiding, secret authentication, secret sharing
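The (k, k+1) sharing and consistency check described here can be sketched directly. This is an illustrative implementation of standard Shamir sharing over a prime field (the field modulus and parameters are assumptions, not the paper's values), not the authors' embedding pipeline:

```python
import random
from itertools import combinations

P = 2_147_483_647  # Mersenne prime modulus for the finite field (illustrative choice)

def make_shares(secret: int, k: int, seed=None):
    """Shamir (k, n)-threshold shares with n = k + 1, as the scheme prescribes."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(1, P) for _ in range(k - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation of the degree-(k-1) polynomial
            acc = (acc * x + c) % P
        return acc
    return [(x, poly(x)) for x in range(1, k + 2)]  # n = k + 1 shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P) recovers the constant term (the secret)."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse of den
    return secret

def extract_and_authenticate(shares, k):
    """Compute k+1 candidate secrets, one per k-subset; identical values are
    the self-authentication evidence described in the abstract."""
    candidates = [reconstruct(list(sub)) for sub in combinations(shares, k)]
    return candidates[0], all(c == candidates[0] for c in candidates)

shares = make_shares(secret=123456, k=3, seed=42)
value, ok = extract_and_authenticate(shares, k=3)
print(value, ok)  # 123456 True

# A tampered numeric item breaks the consistency check:
shares[0] = (shares[0][0], (shares[0][1] + 1) % P)
print(extract_and_authenticate(shares, k=3)[1])  # False
```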
Procedia PDF Downloads 250
23956 A Novel Approach to Design of EDDR Architecture for High Speed Motion Estimation Testing Applications
Authors: T. Gangadhararao, K. Krishna Kishore
Abstract:
Motion estimation (ME) plays a critical role in a video coder, so testing such a module is of priority concern. Focusing on the testing of ME in a video coding system, this work presents an error detection and data recovery (EDDR) design, based on the residue-and-quotient (RQ) code, to embed into ME for video coding testing applications. Errors in processing elements (PEs), the key components of a ME, can be detected and recovered effectively by using the proposed EDDR design. The proposed EDDR design for ME testing can detect errors and recover data with an acceptable area overhead and timing penalty.
Keywords: area overhead, data recovery, error detection, motion estimation, reliability, residue-and-quotient (RQ) code
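The RQ-code idea can be illustrated in software: a value X is protected as its quotient and residue with respect to a modulus, and a PE output that disagrees with the implied value flags an error. The modulus and the fault model below are illustrative assumptions, not the paper's hardware parameters:

```python
M = 64  # modulus: the RQ code splits a value X into quotient q = X // M and residue r = X % M

def rq_encode(x: int):
    """Protected (quotient, residue) pair for value x."""
    return divmod(x, M)

def rq_check(x: int, q: int, r: int) -> bool:
    """Error detection: the PE output must agree with the value implied by (q, r)."""
    return x == q * M + r

def rq_recover(q: int, r: int) -> int:
    """Data recovery: rebuild the correct value from the protected RQ pair."""
    return q * M + r

x = 1234            # correct PE output for some computation (illustrative)
q, r = rq_encode(x)
faulty = x ^ 0x10   # a single bit flip models a PE fault
print(rq_check(faulty, q, r))  # False -> error detected
print(rq_recover(q, r))        # 1234  -> recovered value
```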
Procedia PDF Downloads 433
23955 Synthesis and Characterization of Poly(2-[[4-(Dimethylamino)Benzylidene]Amino]Phenol) in Organic Medium: Investigation of Thermal Stability, Conductivity, and Antimicrobial Properties
Authors: Nuray Yilmaz Baran, Mehmet Saçak
Abstract:
Schiff base polymers, also called poly(azomethines), are one class of conjugated polymers. They have drawn the attention of researchers in recent years due to properties such as optoelectronic, semiconductive, and photovoltaic behavior, antimicrobial activity, and high thermal stability. In this study, poly(2-[[4-(dimethylamino)benzylidene]amino]phenol), P(2-DBAP), a Schiff base polymer, was synthesized by the oxidative polycondensation reaction of 2-[[4-(dimethylamino)benzylidene]amino]phenol (2-DBAP) with the oxidants NaOCl, H₂O₂, and O₂ in various organic media. For the polymerizations carried out at various temperatures and times, the maximum conversion of monomer to polymer reached around 93.7%. The structures of the monomer and polymer were characterized by UV-Vis, FTIR, and ¹H-NMR techniques. Thermal analysis of the polymer was performed by TG-DTG and DTA techniques, and the thermal degradation behavior was supported by thermo-IR spectra recorded in the temperature range of 25-800 °C. The number average molecular weight (Mn), weight average molecular weight (Mw), and polydispersity index (PDI) of the polymer were found to be 9860 g/mol, 26337 g/mol, and 2.67, respectively. The change in the electrical conductivity of P(2-DBAP) doped with iodine vapor at different temperatures and times was investigated; its conductivity increased about 10¹⁰-fold, to a maximum of 2 x 10⁻⁴ S cm⁻¹, after doping for 48 h at 60 °C. Antibacterial and antifungal activities of the 2-DBAP Schiff base and its polymer were also investigated against Sarcina lutea, Enterobacter aerogenes, Escherichia coli, Enterococcus faecalis, Klebsiella pneumoniae, and Bacillus subtilis, and against Candida albicans and Saccharomyces cerevisiae, respectively.
Keywords: conductive properties, polyazomethines, polycondensation reaction, Schiff base polymers, thermal stability
Procedia PDF Downloads 290
23954 An Effective Route to Control of the Safety of Accessing and Storing Data in the Cloud-Based Database
Authors: Omid Khodabakhshi, Amir Rozdel
Abstract:
Cloud computing security research faces a number of challenges because data centers contain complex private information and are always exposed to risks of information disclosure by hacker attacks or internal adversaries. Accordingly, the security of virtual machines in the cloud computing infrastructure layer is very important. So far, there are many software solutions for improving security in virtual machines, but software alone is not enough to solve security problems. The purpose of this article is to examine the challenges and security requirements for accessing and storing data in an insecure cloud environment. In other words, this article proposes a structure for the implementation of highly isolated, security-sensitive code using secure computing hardware in virtual environments. It also allows remote code validation with inputs and outputs. These security features are provided even in situations where the BIOS, the operating system, and even the hypervisor are infected. To achieve these goals, we use the hardware support provided by the new Intel and AMD processors, as well as the TPM security chip. In conclusion, the use of these technologies ultimately creates a dynamic root of trust and reduces the TCB to security-sensitive code.
Keywords: code, cloud computing, security, virtual machines
Procedia PDF Downloads 194
23953 Packet Analysis in Network Forensics: Insights, Tools, and Case Study
Authors: Dalal Nasser Fathi, Amal Saud Al-Mutairi, Mada Hamed Al-Towairqi, Enas Fawzi Khairallah
Abstract:
Network forensics is essential for investigating cyber incidents and detecting malicious activities by analyzing network traffic, with a focus on packet and protocol data. This process involves capturing, filtering, and examining network data to identify patterns and signs of attacks. Packet analysis, a core technique in this field, provides insight into the origins of data, the protocols used, and any suspicious payloads, which aids in detecting malicious activity. This paper explores network forensics, providing guidance for the analyst on what to look for and how to identify attack sites, guided by the seven layers of the OSI model. Additionally, it explains the most commonly used tools in network forensics and demonstrates a practical example using Wireshark.
Keywords: network forensics, packet analysis, Wireshark tools, forensic investigation, digital evidence
Procedia PDF Downloads 16
23952 Assessment of Biofilm Production Capacity of Industrially Important Bacteria under Electroinductive Conditions
Authors: Omolola Ojetayo, Emmanuel Garuba, Obinna Ajunwa, Abiodun A. Onilude
Abstract:
Introduction: A biofilm is a functional community of microorganisms associated with a surface or an interface. These adherent cells become embedded within an extracellular matrix composed of polymeric substances; i.e., biofilms are biological deposits consisting of both microbes and their extracellular products on biotic and abiotic surfaces. Despite their detrimental effects in medicine, biofilms as natural cell immobilization have found several applications in biotechnology, such as the treatment of wastewater, bioremediation and biodegradation, desulfurization of gas, and the conversion of agro-derived materials into alcohols and organic acids. The means of enhancing immobilized cells have been chemically inductive, and this affects the medium composition and the final product. Physical factors, including electrical, magnetic, and electromagnetic flux, have shown potential for enhancing biofilms, depending on the bacterial species, the nature and intensity of the emitted signals, the duration of exposure, and the substratum used. However, the concept of cell immobilisation by electrical and magnetic induction is still underexplored. Methods: To assess the effects of physical factors on biofilm formation, six American Type Culture Collection strains (Acetobacter aceti ATCC15973, Pseudomonas aeruginosa ATCC9027, Serratia marcescens ATCC14756, Gluconobacter oxydans ATCC19357, Rhodobacter sphaeroides ATCC17023, and Bacillus subtilis ATCC6633) were used. Standard culture techniques for bacterial cells were adopted. Natural autoimmobilisation potentials of the test bacteria were assessed by simple biofilm ring formation on tubes, while crystal violet binding assay techniques were adopted to quantify biofilm. Electroinduction of bacterial cells by direct current (DC) application in cell broth, static magnetic field exposure, and electromagnetic flux was carried out, and the autoimmobilisation of cells in a biofilm pattern was determined on the various substrata tested, including wood, glass, steel, polyvinylchloride (PVC), and polyethylene terephthalate. The Biot-Savart law was used to quantify magnetic field intensity, and statistical analyses of the data obtained were carried out using analysis of variance (ANOVA) as well as other statistical tools. Results: Biofilm formation by the selected test bacteria was enhanced by the physical factors applied. Electromagnetic induction had the greatest effect on biofilm formation, with magnetic induction producing the least effect across all substrata used. Microbial cell-cell communication could be a possible means via which physical signals affect the cells in a polarisable manner. Conclusion: The enhancement of biofilm formation by bacteria using physical factors shows that their inherent capability as a cell immobilization method can be further optimised for industrial applications. A possible relationship between the presence of voltage-dependent channels, mechanosensitive channels, and bacterial biofilms could shed more light on this phenomenon.
Keywords: bacteria, biofilm, cell immobilization, electromagnetic induction, substrata
Procedia PDF Downloads 193
23951 Identifying the Factors Affecting the Success of Energy Usage Saving in the Municipality of Tehran
Authors: Rojin Bana Derakhshan, Abbas Toloie
Abstract:
For the purpose of optimizing and developing energy efficiency in buildings, it is necessary to recognize the key elements of success in the optimization of energy consumption before performing any actions. Principal component analysis is one of the most valuable results of linear algebra: a simple, non-parametric method for extracting relevant information from confusing data sets. An energy management system was implemented according to the international energy management standard ISO 50001:2011, and all energy parameters in the building were measured through an energy audit. In this study, using data mining, the key elements influencing energy saving in buildings are determined. The approach is based on data mining statistical techniques using a feature selection method and fuzzy logic, converting the data from a massive to a compressed form in order to increase the selected features. In addition, the influence and share of each energy-consuming element in energy dissipation, in percent, are recognized as separate norms, using the results obtained from the energy audit after measurement of all energy-consuming parameters and identified variables. Accordingly, energy saving solutions are divided into three categories: low-, medium-, and high-expense solutions.
Keywords: energy saving, key elements of success, optimization of energy consumption, data mining
Procedia PDF Downloads 473
23950 Analyzing the Evolution of Adverse Events in Pharmacovigilance: A Data-Driven Approach
Authors: Kwaku Damoah
Abstract:
This study presents a comprehensive data-driven analysis to understand the evolution of adverse events (AEs) in pharmacovigilance. Utilizing data from the FDA Adverse Event Reporting System (FAERS), we employed three analytical methods: rank-based, frequency-based, and percentage change analyses. These methods assessed temporal trends and patterns in AE reporting, focusing on various drug-active ingredients and patient demographics. Our findings reveal significant trends in AE occurrences, with both increasing and decreasing patterns from 2000 to 2023. This research highlights the importance of continuous monitoring and advanced analysis in pharmacovigilance, offering valuable insights for healthcare professionals and policymakers to enhance drug safety.
Keywords: event analysis, FDA adverse event reporting system, pharmacovigilance, temporal trend analysis
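Of the three methods, percentage change analysis is the most direct to sketch: year-over-year change in report counts per drug-event pair. The counts below are fabricated for illustration, not FAERS data:

```python
def percent_change_by_year(counts: dict) -> dict:
    """Year-over-year percentage change in adverse-event report counts."""
    years = sorted(counts)
    return {year: round(100.0 * (counts[year] - counts[prev]) / counts[prev], 1)
            for prev, year in zip(years, years[1:])}

# Fabricated yearly report counts for one drug-event pair (not FAERS data).
counts = {2019: 200, 2020: 250, 2021: 225, 2022: 270}
print(percent_change_by_year(counts))  # {2020: 25.0, 2021: -10.0, 2022: 20.0}
```

An increasing run of positive changes would correspond to the "increasing patterns" the abstract describes; sustained negative changes, to the decreasing ones.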
Procedia PDF Downloads 54
23949 Effect of Bactocell on White Leg Shrimp (Litopenaeus vannamei) Growth Performance and Shrimp Survival against Vibrio parahaemolyticus
Authors: M. Soltani, K. Pakzad, A. Haghigh-Khiyabani, M. Alavi, R. Naderi, M. Castex
Abstract:
The effect of the probiotic Bactocell (Pediococcus acidilactici) as a dietary supplement was studied on the growth performance of post-larvae 12-15 of white leg shrimp (Litopenaeus vannamei) (150,000 PL/0.5-ha pond, average body weight 0.02 g) under farm conditions for 102 days, at water quality parameters of temperature 30.5-36 °C, dissolved oxygen 4.1-6.6 mg/l, salinity 40-54 g/l, turbidity 35-110 cm, ammonia 0.1-0.8 mg/l, and nitrite 0.1-0.9 mg/l. The resistance of the treated shrimps was also assessed against a virulent strain of Vibrio parahaemolyticus administered by intramuscular injection at 1.4 x 10⁶ cells/shrimp. A significantly higher growth rate (11.3±1.54 g) and a lower feed conversion ratio (1.1) were obtained in shrimps fed diets supplemented with Bactocell at 350 g/ton feed, compared with the other treatments of 250 g Bactocell/ton feed (10.8±2 g, 1.3), 500 g Bactocell/ton feed (10.3±1.7 g, 1.3), and the untreated control (10.1±2 g, 1.4). The thermal growth coefficient (0.057%) and protein efficiency ratio (2.13) were also significantly improved in shrimps fed diets supplemented with Bactocell at 350 g/ton feed compared with the other groups. Shrimps fed a diet supplemented with Bactocell at 350 g/ton feed showed significantly higher protein content (72.56%) in their carcass composition than the treatments of 250 g/ton feed (65.9%), 500 g/ton feed (67.5%), and the control group (65.9%), while the carcass contents of moisture, lipid, and ash in all shrimp groups were not significantly affected by the different concentrations of Bactocell. No mortality occurred in the experimentally infected shrimps fed Bactocell at 500 g/ton feed after 7 hours post-challenge with V. parahaemolyticus. Mortality levels of 100%, 40%, 50%, and 70% were obtained in shrimps fed with 0.0, 500 g/ton feed, 350 g/ton feed, and 250 g/ton feed, respectively, 14 hours post-infection. The cumulative mortalities reached 100%, 92%, and 81% in shrimps fed with Bactocell at 500 g/ton feed, 250 g/ton feed, and 350 g/ton feed, respectively.
Keywords: Litopenaeus vannamei, Vibrio parahaemolyticus, Pediococcus acidilactici, growth performance, Bactocell
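The two headline performance indices can be computed from pond-level totals. The standard definitions are FCR = feed consumed / weight gained and PER = weight gain / protein intake; the input numbers below are illustrative, not the trial's raw data:

```python
def feed_conversion_ratio(feed_given_kg: float, weight_gain_kg: float) -> float:
    """FCR = feed consumed / wet weight gained; lower is better."""
    return feed_given_kg / weight_gain_kg

def protein_efficiency_ratio(weight_gain_g: float, protein_intake_g: float) -> float:
    """PER = weight gain / protein fed; higher is better."""
    return weight_gain_g / protein_intake_g

# Hypothetical pond totals chosen to reproduce the reported index values.
print(round(feed_conversion_ratio(1650.0, 1500.0), 1))    # 1.1
print(round(protein_efficiency_ratio(1500.0, 704.0), 2))  # 2.13
```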
Procedia PDF Downloads 680
23948 An Exploration of Renewal Utilization of Under-Bridge Space Based on Spatial Potential Evaluation: Taking Chongqing Municipality as an Example
Authors: Xuelian Qin
Abstract:
Urban "organic renewal" based on the development of existing resources in high-density urban areas has become the mainstream of urban development in the new era. As an important stock resource of public space in high-density urban areas, promoting the value remodeling of under-bridge space is an effective way to alleviate the shortage of public space resources. However, due to the lack of an evaluation step in the process of under-bridge space renewal, a large number of under-bridge space resources have been left idle, facing the problems of low space conversion efficiency, a lack of accuracy in development decision-making, and the low adaptability of functional positioning to citizens' needs. Therefore, it is of great practical significance to construct an evaluation system for under-bridge space renewal potential and to explore renewal modes. In this paper, some of the under-bridge spaces in the main urban area of Chongqing are selected as the research object. Through questionnaire interviews with users of well-developed under-bridge spaces, twenty-two potential evaluation indexes are selected across three factor types ("objective demand", "construction feasibility", and "construction suitability") and six levels: land resources, infrastructure, accessibility, safety, space quality, and ecological environment. The analytic hierarchy process and the expert scoring method are used to determine the index weights, to construct a potential evaluation system for under-bridge space in the high-density urban areas of Chongqing, and to explore suitable directions for renewal and utilization. The aim is to provide a feasible theoretical basis and scientific decision support for the use of under-bridge space in the future.
Keywords: high density urban area, potential evaluation, space under bridge, renewal utilization
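The analytic hierarchy process step can be sketched with the common geometric-mean approximation of the priority vector. The 3x3 pairwise-comparison matrix below is a hypothetical example for the three factor types, not the paper's expert judgments:

```python
import math

def ahp_weights(matrix):
    """Priority weights from a pairwise-comparison matrix using the
    geometric-mean (row) approximation of the principal eigenvector."""
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical comparisons on Saaty's 1-9 scale: demand vs feasibility = 3,
# demand vs suitability = 5, feasibility vs suitability = 2.
pairwise = [
    [1.0,     3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 0.5, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # roughly [0.648, 0.230, 0.122]
```

Each index score is then multiplied by its factor weight and summed to rank candidate under-bridge sites.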
Procedia PDF Downloads 102
23947 Agglomerative Hierarchical Clustering Using the Tθ Family of Similarity Measures
Authors: Salima Kouici, Abdelkader Khelladi
Abstract:
In this work, we begin with a presentation of the Tθ family of usual similarity measures for multidimensional binary data. Subsequently, some properties of these measures are proposed. Finally, the impact of the use of different inter-element measures on the results of agglomerative hierarchical clustering methods is studied.
Keywords: binary data, similarity measure, Tθ measures, agglomerative hierarchical clustering
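Similarity measures for binary data are built from the four contingency counts of two vectors. As a sketch, here are two classical members of the usual families (Jaccard and simple matching), used as stand-ins since the exact Tθ parametrisation is defined in the paper itself:

```python
def contingency(u, v):
    """Counts a (1-1 matches), b (1-0), c (0-1), d (0-0) for two binary vectors."""
    a = sum(1 for x, y in zip(u, v) if x == 1 and y == 1)
    b = sum(1 for x, y in zip(u, v) if x == 1 and y == 0)
    c = sum(1 for x, y in zip(u, v) if x == 0 and y == 1)
    d = len(u) - a - b - c
    return a, b, c, d

def jaccard(u, v):
    """Similarity ignoring joint absences (d)."""
    a, b, c, _ = contingency(u, v)
    return a / (a + b + c)

def simple_matching(u, v):
    """Similarity counting joint absences as agreement."""
    a, _, _, d = contingency(u, v)
    return (a + d) / len(u)

u = [1, 1, 0, 1, 0, 0]
v = [1, 0, 0, 1, 0, 1]
print(round(jaccard(u, v), 3))          # 0.5
print(round(simple_matching(u, v), 3))  # 0.667
```

Agglomerative clustering then repeatedly merges the pair of clusters with the highest inter-element similarity, so the choice of measure directly shapes the resulting dendrogram.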
Procedia PDF Downloads 486
23946 High Resolution Sandstone Connectivity Modelling: Implications for Outcrop Geological and Its Analog Studies
Authors: Numair Ahmed Siddiqui, Abdul Hadi bin Abd Rahman, Chow Weng Sum, Wan Ismail Wan Yousif, Asif Zameer, Joel Ben-Awal
Abstract:
Advances in data capture for outcrop studies have made possible the acquisition of high-resolution digital data, offering improved and economical reservoir modelling methods. Terrestrial laser scanning utilizing LiDAR (light detection and ranging) provides a new method to build outcrop-based reservoir models, which supply a crucial piece of information for understanding heterogeneities in sandstone facies, with high-resolution images and data sets. This study presents the detailed application of an outcrop-based sandstone facies connectivity model, combining information gathered from traditional fieldwork with detailed digital point-cloud data from LiDAR, to develop an intermediate small-scale reservoir sandstone facies model of the Miocene Sandakan Formation, Sabah, East Malaysia. The software RiScan Pro (v1.8.0) was used in digital data collection and post-processing, with an accuracy of 0.01 m and a point acquisition rate of up to 10,000 points per second. We provide an accurate and descriptive workflow to triangulate point clouds of different sets of sandstone facies with well-marked top and bottom boundaries, in conjunction with field sedimentology. This yields a highly accurate qualitative sandstone facies connectivity model, which is a challenge to obtain from subsurface datasets (i.e., seismic and well data). Finally, by applying this workflow, we can build an outcrop-based static connectivity model, which can serve as an analogue for subsurface reservoir studies.
Keywords: LiDAR, outcrop, high resolution, sandstone facies, connectivity model
Procedia PDF Downloads 233
23945 Beneficiation of Low Grade Chromite Ore and Its Characterization for the Formation of Magnesia-Chromite Refractory by Economically Viable Process
Authors: Amit Kumar Bhandary, Prithviraj Gupta, Siddhartha Mukherjee, Mahua Ghosh Chaudhuri, Rajib Dey
Abstract:
Chromite ores are primarily used for the extraction of chromium, which is an expensive metal. For low-grade chromite ores (containing less than 40% Cr2O3), chromium extraction is not usually economically viable. India possesses huge quantities of low-grade chromite reserves. This deposit can be utilized after proper physical beneficiation. Magnetic separation techniques may be useful, after reduction, for the beneficiation of low-grade chromite ore. The sample collected from the Sukinda mines was characterized by XRD, which shows predominant phases such as maghemite, chromite, silica, magnesia, and alumina. The raw ore was crushed and ground to below 75 micrometers in size. The microstructure of the ore shows chromite grains surrounded by a silicate matrix, with porosity observed on the exposed side of the chromite ore. This ore may, however, be utilized in refractory applications. Chromite ores contain Cr2O3, FeO, Al2O3, and other oxides; systems such as Fe-Cr and Mg-Cr have a high tendency to form spinel compounds, which usually show high refractoriness. Initially, the low-grade chromite ore (containing 34.8% Cr2O3) was reduced at 1200 °C for 80 minutes with 30% coke fines by weight, before being subjected to magnetic separation. Reduction by coke converts the higher oxidation states of iron oxide to lower states. The pre-reduced samples were then characterized by XRD. The magnetically inert mass was then reacted with 20% MgO by weight at 1450 °C for 2 hours. The resultant product was then tested for various refractoriness parameters, such as apparent porosity and slag resistance. The results were satisfactory, indicating that the resultant spinel compounds are suitable for refractory applications in elevated-temperature processes.
Keywords: apparent porosity, beneficiation, low-grade chromite, refractory, spinel compounds, slag resistance
Procedia PDF Downloads 390
23944 Spatial-Temporal Clustering Characteristics of Dengue in the Northern Region of Sri Lanka, 2010-2013
Authors: Sumiko Anno, Keiji Imaoka, Takeo Tadono, Tamotsu Igarashi, Subramaniam Sivaganesh, Selvam Kannathasan, Vaithehi Kumaran, Sinnathamby Noble Surendran
Abstract:
Dengue outbreaks are affected by biological, ecological, socio-economic, and demographic factors that vary over time and space. These factors have been examined separately and still require systematic clarification. The present study aimed to investigate the spatial-temporal clustering relationships between these factors and dengue outbreaks in the northern region of Sri Lanka. Remote sensing (RS) data gathered from a plurality of satellites were used to develop an index comprising rainfall, humidity, and temperature data. RS data gathered by ALOS/AVNIR-2 were used to detect urbanization, and a digital land cover map was used to extract land cover information. Other data on relevant factors and dengue outbreaks were collected from institutions and extant databases. The analyzed RS data and databases were integrated into geographic information systems, enabling temporal analysis, spatial statistical analysis, and space-time clustering analysis. Our results showed that increases in the number of combined ecological, socio-economic, and demographic factors that are above average or present contribute to significantly high rates of space-time dengue clusters.
Keywords: ALOS/AVNIR-2, dengue, space-time clustering analysis, Sri Lanka
Procedia PDF Downloads 481
23943 Enhanced Growth of Microalgae Chlamydomonas reinhardtii Cultivated in Different Organic Waste and Effective Conversion of Algal Oil to Biodiesel
Authors: Ajith J. Kings, L. R. Monisha Miriam, R. Edwin Raj, S. Julyes Jaisingh, S. Gavaskar
Abstract:
Microalgae are a potential bio-source for rejuvenated solutions in various disciplines of science and technology, especially in medicine and energy. Biodiesel is replacing conventional fuels in automobile industries, with reduced pollution and equivalent performance. Since it is a carbon-neutral fuel, recycling CO2 through photosynthesis, its global warming potential can be held in control. One of the ways to meet the rising demand for automotive fuel is to adopt eco-friendly, green alternative fuels: sustainable microalgal biodiesel. In this work, the microalga Chlamydomonas reinhardtii was cultivated and optimized in different media compositions developed from under-utilized waste materials at lab scale. Using the optimized process conditions, the cultures were then mass propagated in outdoor ponds, harvested, dried, and the oils extracted for optimization at ambient conditions. The microalgal oil was subjected to a two-step esterification process using an acid catalyst to reduce the acid value (0.52 mg KOH/g) in the initial stage, followed by transesterification to maximize the biodiesel yield. The optimized esterification process parameters were a methanol/oil ratio of 0.32 (v/v), sulphuric acid at 10 vol.%, and a duration of 45 min at 65 °C. In the transesterification process, a commercially available alkali catalyst (KOH) was used and optimized to obtain a maximum biodiesel yield of 95.4%. The optimized parameters were a methanol/oil ratio of 0.33 (v/v), alkali catalyst at 0.1 wt.%, and a duration of 90 min at 65 °C with smooth stirring. Response surface methodology (RSM) was employed as the tool for optimizing the process parameters. The biodiesel was then characterized by standard procedures, especially GC-MS, to confirm its suitability for use in internal combustion engines.
Keywords: microalgae, organic media, optimization, transesterification, characterization
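RSM optimization of this kind fits a second-order polynomial in coded factors and then searches it for the maximum response. A minimal sketch of evaluating such a model; the coefficients below are placeholders for illustration, not the fitted values from this study:

```python
def rsm_quadratic(x1, x2, x3, beta):
    """Second-order response-surface model in coded variables:
    y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)."""
    b0, b1, b2, b3, b11, b22, b33, b12, b13, b23 = beta
    return (b0 + b1 * x1 + b2 * x2 + b3 * x3
            + b11 * x1 ** 2 + b22 * x2 ** 2 + b33 * x3 ** 2
            + b12 * x1 * x2 + b13 * x1 * x3 + b23 * x2 * x3)

# Placeholder coefficients: response is biodiesel yield (%), factors are coded
# methanol/oil ratio, catalyst wt.%, and reaction time.
beta = (95.4, 1.2, -0.8, 0.5, -2.0, -1.5, -0.7, 0.3, -0.2, 0.1)
print(rsm_quadratic(0, 0, 0, beta))  # 95.4 (the response at the design centre point)
```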
Procedia PDF Downloads 239
23942 Statistical Inferences for GQARCH-It\^{o}-Jumps Model Based on the Realized Range Volatility
Authors: Fu Jinyu, Lin Jinguan
Abstract:
This paper introduces a novel approach that unifies two types of models: a continuous-time jump-diffusion used to model high-frequency data, and a discrete-time GQARCH employed to model low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the "GQARCH-It\^{o}-Jumps model." We adopt realized range-based threshold estimation for the high-frequency financial data rather than realized return-based volatility estimators, which entail the loss of intra-day information on the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for parametric estimation. The asymptotic theory is established for the proposed estimators in the case of finite-activity jumps. Moreover, simulation studies are implemented to check the finite-sample performance of the proposed methodology. Specifically, we demonstrate how our proposed approaches can be practically applied to financial data.
Keywords: It\^{o} process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate
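The realized range idea can be illustrated with the classical Parkinson-style estimator, which scales the summed squared per-interval log ranges by 1/(4 ln 2). The constant-volatility Brownian path below is an illustrative assumption, not the paper's GQARCH-It\^{o}-Jumps model or its threshold correction.

```python
import math
import random

# Realized range variance estimate on one simulated "trading day":
# RRV = sum_i (log(H_i / L_i))^2 / (4 * ln 2), using per-bar highs and lows.
random.seed(1)
sigma = 0.2                        # daily volatility of the log price
n_intervals, steps = 390, 50       # one-minute bars, 50 ticks per bar
dt = 1.0 / (n_intervals * steps)

logp, rrv = 0.0, 0.0
for _ in range(n_intervals):
    hi = lo = logp                 # track the log-price range inside the bar
    for _ in range(steps):
        logp += sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        hi, lo = max(hi, logp), min(lo, logp)
    rrv += (hi - lo) ** 2
rrv /= 4.0 * math.log(2.0)
# rrv approximates the integrated variance sigma**2 (here 0.04)
```

Because the range of each bar uses every tick rather than only the bar's endpoints, the estimator retains intra-day information that return-based realized variance discards.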
Procedia PDF Downloads 166
23941 Exchanging Radiology Reporting System with Electronic Health Record: Designing a Conceptual Model
Authors: Azadeh Bashiri
Abstract:
Introduction: In order to better design the electronic health record system in Iran, the health information systems must be integrated on the basis of a common language so that information can be interpreted and exchanged with this system. Background: This study provides a conceptual model of a radiology reporting system using the Unified Modeling Language (UML). The proposed model can solve the problem of integrating this information system with the electronic health record system. By using this model and designing its services accordingly, the radiology reporting system can easily connect to the electronic health record in Iran and facilitate the transfer of radiology report data. Methods: This cross-sectional study was conducted in 2013. The study population consisted of 22 experts working at the Imaging Center of Imam Khomeini Hospital in Tehran, and the sample coincided with the population. The research tool was a questionnaire prepared by the researcher to determine the information requirements. Content validity and the test-retest method were used to measure the validity and reliability of the questionnaire, respectively. Data were analyzed with an average index using SPSS, and Visual Paradigm software was used to design the conceptual model. Results: Based on the requirements assessment of the experts and the related literature, administrative, demographic, and clinical data, radiological examination results, and, if an anesthesia procedure was performed, anesthesia data were suggested as the minimum data set for the radiology report, and a class diagram was designed on this basis. A use case diagram was also drawn by identifying the radiology reporting process.
Conclusion: Given the application of radiology reports in the electronic health record system for diagnosing and managing the clinical problems of the patient, this conceptual model for the radiology reporting system, if used to design the system systematically, would eliminate the problem of data sharing between these systems and the electronic health record system.
Keywords: structured radiology report, information needs, minimum data set, electronic health record system in Iran
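The minimum data set described above can be sketched as a small set of classes. All field and class names here are illustrative assumptions; the study's actual class diagram was designed in Visual Paradigm and is not reproduced.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Patient:
    national_id: str          # administrative data
    name: str                 # demographic data
    birth_date: str
    sex: str

@dataclass
class AnesthesiaRecord:       # present only if an anesthesia procedure was performed
    agent: str
    duration_min: int

@dataclass
class RadiologyReport:
    patient: Patient
    clinical_data: str        # indication / clinical history
    examination: str          # modality and body region
    findings: str             # radiological examination results
    anesthesia: Optional[AnesthesiaRecord] = None

# hypothetical example record
report = RadiologyReport(
    patient=Patient("0011223344", "A. Sample", "1980-01-01", "F"),
    clinical_data="persistent cough",
    examination="chest X-ray",
    findings="no acute findings",
)
```

Making the anesthesia record optional mirrors the conditional "if the anesthesia procedure was performed" element of the proposed minimum data set.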
Procedia PDF Downloads 257
23940 Effect of Different Level of Pomegranate Molasses on Performance, Egg Quality Trait, Serological and Hematological Parameters in Older Laying Hens
Authors: Ismail Bayram, Aamir Iqbal, E. Eren Gultepe, Cangir Uyarlar, Umit Ozcınar, I. Sadi Cetingul
Abstract:
The current study was planned with the objective of exploring the potential of pomegranate molasses (PM) to affect performance, egg quality, and blood parameters in older laying hens. A total of 240 Babcock white laying hens (52 weeks old) were divided into 5 groups (n=48), each with 8 subgroups of 6 hens. Pomegranate molasses was added to the drinking water of the experimental groups at 0%, 0.1%, 0.25%, 0.5%, and 1%, respectively, for one month. In our results, egg weight remained the same in all pomegranate molasses supplemented groups relative to control, except the 1% group. Feed consumption, egg production, feed conversion ratio (FCR), egg mass, egg yolk cholesterol, body weight, and water consumption also remained unaffected (P > 0.05). In the mid-study (day 15) analyses, egg quality parameters such as Haugh unit, eggshell thickness, albumen index, yolk index, and egg yolk color showed non-significant differences (P > 0.05), while in the final (day 30) egg analyses, only egg yolk color had improved significantly (P < 0.05), in the 0.5% group. Haugh unit, eggshell thickness, and albumen index were not significantly (P > 0.05) affected by the supplementation of pomegranate molasses. Regarding serological parameters, pomegranate molasses did not show any effect on cholesterol, total protein, LDL, HDL, GGT, AST, ALT, or glucose levels. Similarly, pomegranate molasses showed non-significant (P > 0.05) results for hematological parameters such as HCT, RBC, MCV, MCH, MCHC, PLT, RDWC, and MPV, with the exception of hemoglobin: hemoglobin was increased in all experimental groups over control, suggesting that pomegranate molasses can be used as an enhancer in animals with low hemoglobin levels.
Keywords: pomegranate molasses, laying hen, egg yield, blood parameters
Procedia PDF Downloads 175
23939 Performance Analysis of Geophysical Database Referenced Navigation: The Combination of Gravity Gradient and Terrain Using Extended Kalman Filter
Authors: Jisun Lee, Jay Hyoun Kwon
Abstract:
As an alternative way to compensate for INS (inertial navigation system) error in non-GNSS (Global Navigation Satellite System) environments, geophysical database referenced navigation is being studied. In this study, gravity gradient and terrain data were combined, both to complement the weaknesses of a single geophysical data source and to improve the stability of the positioning. The main process compensating for the INS error using the geophysical database was constructed on the basis of the EKF (Extended Kalman Filter). In detail, two types of combination methods, centralized and decentralized filters, were applied to check the pros and cons of each algorithm and to find the more robust results. The performance of each navigation algorithm was evaluated in simulations that assume the aircraft flies with a precise geophysical DB and sensors above nine different trajectories. In particular, the results were compared to those from navigation referenced to a single geophysical database, to check the improvement due to the combination of heterogeneous geophysical databases. It was found that the overall navigation performance was improved, but not all trajectories produced better navigation results from combining gravity gradient with terrain data. It was also found that the centralized filter generally showed more stable results; this is because the weighting for the decentralized filter could not be optimized due to the local inconsistency of the geophysical data. In the future, switching between geophysical data sources or combining different navigation algorithms will be necessary to obtain more robust navigation results.
Keywords: Extended Kalman Filter, geophysical database referenced navigation, gravity gradient, terrain
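The EKF correction loop can be sketched in one dimension: a biased INS displacement is corrected by comparing terrain heights from a database with "measured" ones at the true position. The terrain profile, noise values, and bias below are illustrative assumptions, not the study's geophysical DB or its nine test trajectories.

```python
import math

def terrain(x):
    # smooth synthetic terrain profile (assumed for illustration)
    return 100.0 * math.sin(0.005 * x) + 0.01 * x

def terrain_grad(x):
    # analytic Jacobian of the measurement model, used for linearization
    return 0.5 * math.cos(0.005 * x) + 0.01

Q, R = 0.25, 1.0                # process / measurement noise variances
x_true = x_est = 0.0
P = 1.0                         # error covariance of the position estimate
bias = 0.5                      # constant INS drift per step

for _ in range(200):
    x_true += 10.0
    x_est += 10.0 + bias        # predict with the biased INS displacement
    P += Q
    z = terrain(x_true)         # terrain height observed at the true position
    H = terrain_grad(x_est)     # linearized measurement at the estimate
    K = P * H / (H * H * P + R)
    x_est += K * (z - terrain(x_est))
    P *= 1.0 - K * H

err_corrected = abs(x_est - x_true)
err_uncorrected = bias * 200    # drift left if no terrain updates were applied
```

Stretches where the terrain gradient is near zero make the update uninformative, which is the one-dimensional analogue of the local inconsistency that hampered the decentralized filter's weighting.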
Procedia PDF Downloads 353
23938 An Application of Remote Sensing for Modeling Local Warming Trend
Authors: Khan R. Rahaman, Quazi K. Hassan
Abstract:
Global changes in climate, environment, economies, populations, governments, institutions, and cultures converge in localities. Changes at a local scale, in turn, contribute to global changes as well as being affected by them. Our hypothesis builds on the consideration that temperature does vary at the local level (termed here local warming) in comparison to predictions from models at the regional and/or global scale. To date, the bulk of the research relating local places to global climate change has been top-down, from the global toward the local, concentrating on methods of impact analysis that take as their starting point climate change scenarios derived from global models, even though these have little regional or local specificity. Thus, our focus is on understanding such trends over southern Alberta, which will enable decision makers, scientists, the research community, and local people to adapt their policies to local-level temperature variations and act accordingly. The specific objectives of this study are: (i) to understand the local warming (temperature, in particular) trend relative to temperature normals over the period 1961-2010 at point locations, using meteorological data; (ii) to validate the data using specific yearly data; and (iii) to delineate the spatial extent of the local warming trends and understand the influential factors, so that local governments can adapt to the situation. Existing data have provided evidence of such changes, and future research will emphasize validating this hypothesis with remotely sensed data (i.e., the MODIS product by NASA).
Keywords: local warming, climate change, urban area, Alberta, Canada
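A point-location warming trend of the kind described in objective (i) is typically quantified as the least-squares slope of an annual temperature series. The series below is synthetic, with a fixed 0.03 °C/yr trend assumed for illustration; it is not station data for southern Alberta.

```python
# Least-squares warming trend from an annual mean-temperature series
# over 1961-2010: slope = cov(year, temp) / var(year).
years = list(range(1961, 2011))
rate = 0.03                      # assumed true trend of the synthetic series
temps = [4.0 + rate * (y - 1961) for y in years]

n = len(years)
my = sum(years) / n
mt = sum(temps) / n
slope = (sum((y - my) * (t - mt) for y, t in zip(years, temps))
         / sum((y - my) ** 2 for y in years))
decadal = 10.0 * slope           # warming expressed per decade
```

Comparing such slopes across stations against the 1961-2010 temperature normals is what distinguishes a local warming signal from the regional or global model prediction.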
Procedia PDF Downloads 342
23937 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges
Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch
Abstract:
Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data make it possible to gain insight into the molecular differences between tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints specifically measuring a mechanism is relatively straightforward, the interpretation of big data is more complex and benefits from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions for verifying systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow methods to be benchmarked for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been conducted successfully; they confirmed that the aggregation of predictions often leads to better results than individual predictions, and that methods perform best in specific contexts. Whenever the scientific question of interest has no gold standard but may greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work in teams on the most promising methods, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework for navigating different applications, databases, and services in biology and medicine.
We will present the results we obtained when analyzing data with our network-based method, and introduce a datathon that will take place in Japan to encourage the analysis of the same datasets with other methods and allow for the consolidation of conclusions.
Keywords: big data interpretation, datathon, systems toxicology, verification
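The observation that aggregated predictions often beat individual ones can be illustrated with a toy simulation: averaging several unbiased but noisy predictors shrinks the error roughly by the square root of their number. The five "methods" below are synthetic stand-ins, not the challenge submissions.

```python
import random

# Crowd aggregation sketch: compare the mean absolute error (MAE) of
# five noisy predictors against the MAE of their averaged prediction.
random.seed(0)
truth = [random.uniform(0, 10) for _ in range(500)]
# five hypothetical prediction methods, each unbiased but noisy
methods = [[t + random.gauss(0.0, 1.0) for t in truth] for _ in range(5)]
aggregate = [sum(m[i] for m in methods) / len(methods)
             for i in range(len(truth))]

def mae(pred):
    return sum(abs(p - t) for p, t in zip(pred, truth)) / len(truth)

individual_maes = [mae(m) for m in methods]
aggregate_mae = mae(aggregate)
```

When the individual errors are independent, the averaged prediction outperforms even the best single method, which is the statistical core of the wisdom-of-the-crowd result reported across the four challenges.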
Procedia PDF Downloads 279
23936 Scalable Learning of Tree-Based Models on Sparsely Representable Data
Authors: Fares Hedayatit, Arnauld Joly, Panagiotis Papadimitriou
Abstract:
Many machine learning tasks, such as text annotation, usually require training over very big datasets, e.g., millions of web documents, that can be represented in a sparse input space. State-of-the-art tree-based ensemble algorithms cannot scale to such datasets, since they include operations whose running time is a function of the input space size rather than of the number of non-zero input elements. In this paper, we propose an efficient splitting algorithm that leverages input sparsity within decision tree methods. Our algorithm improves training time over sparse datasets by more than two orders of magnitude, and it has been incorporated into the current version of scikit-learn, the most popular open-source Python machine learning library.
Keywords: big data, sparsely representable data, tree-based models, scalable learning
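The core idea of sparsity-aware splitting can be sketched as follows: the split search for a feature touches only its non-zero entries, while the implicit zero bucket's statistics are derived from totals without scanning it. This is a simplified pure-Python illustration, not the scikit-learn implementation.

```python
def best_sparse_split(n, y, nonzeros):
    """Best threshold for one sparse feature, touching only non-zeros.

    n: number of samples; y: binary labels for all n samples;
    nonzeros: list of (sample_index, value) pairs -- every sample not
    listed implicitly holds the value 0.
    """
    total_pos = sum(y)
    nz = sorted(nonzeros, key=lambda iv: iv[1])
    nz_pos = sum(y[i] for i, _ in nz)
    # stats of the implicit zero bucket, derived without scanning it
    zero_cnt = n - len(nz)
    zero_pos = total_pos - nz_pos

    # ordered value buckets: negative values, the zero bucket, positives
    buckets = [(v, 1, y[i]) for i, v in nz if v < 0]
    buckets.append((0.0, zero_cnt, zero_pos))
    buckets += [(v, 1, y[i]) for i, v in nz if v > 0]

    def weighted_gini(pos, cnt):
        if cnt == 0:
            return 0.0
        p = pos / cnt
        return 2.0 * p * (1.0 - p) * cnt

    best_impurity, best_thr = float("inf"), None
    left_cnt = left_pos = 0
    for (v, cnt, pos), nxt in zip(buckets, buckets[1:]):
        left_cnt += cnt
        left_pos += pos
        impurity = (weighted_gini(left_pos, left_cnt)
                    + weighted_gini(total_pos - left_pos, n - left_cnt))
        if impurity < best_impurity:
            best_impurity, best_thr = impurity, (v + nxt[0]) / 2
    return best_thr

# toy example: only positive-valued samples carry label 1
y = [1, 1, 0, 0, 0, 0, 0, 0]
nonzeros = [(0, 2.0), (1, 3.0), (2, -1.0)]   # samples 3..7 are implicit zeros
threshold = best_sparse_split(8, y, nonzeros)
```

The sweep costs O(nnz log nnz) per feature instead of O(n), which is where the reported speed-up on sparse datasets comes from.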
Procedia PDF Downloads 270
23935 On Estimating the Low Income Proportion with Several Auxiliary Variables
Authors: Juan F. Muñoz-Rosas, Rosa M. García-Fernández, Encarnación Álvarez-Verdejo, Pablo J. Moya-Fernández
Abstract:
Poverty measurement is a very important topic in many studies in the social sciences. One of the most important indicators when measuring poverty is the low income proportion, which gives the proportion of people in a population classified as poor. This indicator is generally unknown, and for this reason it is estimated using survey data obtained from official surveys carried out by statistical agencies such as Eurostat. The main feature of such survey data is that they contain several variables. The variable used to estimate the low income proportion is called the variable of interest. The survey data may contain several additional variables, also known as auxiliary variables, related to the variable of interest; when this is the situation, they can be used to improve the estimation of the low income proportion. In this paper, we use Monte Carlo simulation studies to analyze numerically the performance of estimators based on several auxiliary variables. In this simulation study, we considered real data sets obtained from the 2011 European Union Statistics on Income and Living Conditions. Results derived from this study indicate that the estimators based on auxiliary variables are more accurate than the naive estimator.
Keywords: inclusion probability, poverty, poverty line, survey sampling
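A minimal version of such a Monte Carlo study can be sketched with one auxiliary variable and a ratio estimator. The synthetic lognormal population below is an illustrative assumption, not the EU-SILC data, and the paper's estimators are more general.

```python
import math
import random

# Monte Carlo sketch: naive vs ratio estimator of the low income
# proportion, using an auxiliary indicator whose population mean is known.
random.seed(42)
N = 10000
income = [math.exp(random.gauss(0.0, 0.6)) for _ in range(N)]
line = 0.6 * sorted(income)[N // 2]          # 60% of the median income
poor = [1.0 if v < line else 0.0 for v in income]
# auxiliary variable: a noisy poverty indicator, correlated with `poor`
aux = [1.0 if v * random.uniform(0.9, 1.1) < line else 0.0 for v in income]

true_p = sum(poor) / N
aux_mean = sum(aux) / N                      # assumed known for the population

def estimates(n=200):
    idx = random.sample(range(N), n)
    p_hat = sum(poor[i] for i in idx) / n    # naive estimator
    x_bar = sum(aux[i] for i in idx) / n
    p_ratio = p_hat * aux_mean / x_bar if x_bar > 0 else p_hat
    return p_hat, p_ratio

reps = 500
sq_naive = sq_ratio = 0.0
for _ in range(reps):
    p_hat, p_ratio = estimates()
    sq_naive += (p_hat - true_p) ** 2
    sq_ratio += (p_ratio - true_p) ** 2
rmse_naive = (sq_naive / reps) ** 0.5
rmse_ratio = (sq_ratio / reps) ** 0.5
```

Because the auxiliary indicator is strongly correlated with the poverty indicator, the ratio estimator's RMSE comes out well below the naive estimator's, which is the qualitative finding the paper reports.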
Procedia PDF Downloads 460
23934 TessPy – Spatial Tessellation Made Easy
Authors: Jonas Hamann, Siavash Saki, Tobias Hagen
Abstract:
Discretization of urban areas is a crucial aspect of many spatial analyses. The process of discretizing space into subspaces without overlaps or gaps is called tessellation. It aids the understanding of space and provides a framework for analyzing geospatial data. Tessellation methods can be divided into two groups: regular tessellations and irregular tessellations. While regular tessellation methods, like square grids or hexagonal grids, are suitable for addressing purely geometric problems, they cannot take the unique characteristics of different subareas into account. Irregular tessellation methods, however, allow the borders between subareas to be defined more realistically, based on urban features like a road network or Points of Interest (POI). Even though Python is one of the most-used programming languages for spatial analysis, there is currently no library that combines different tessellation methods to enable users and researchers to compare different techniques. To close this gap, we propose TessPy, an open-source Python package which combines all the above-mentioned tessellation methods and makes them easily accessible to everyone. The core functions of TessPy implement five different tessellation methods: squares, hexagons, adaptive squares, Voronoi polygons, and city blocks. Using the regular methods, users can set the resolution of the tessellation, which defines the fineness of the discretization and the desired number of tiles. The irregular tessellation methods allow users to define which spatial data to consider (e.g., amenity, building, office) and how fine the tessellation should be. The spatial data used are open-source and provided by OpenStreetMap; they can be easily extracted and used for further analyses. Besides the methodology of the different techniques, the state of the art, including examples and future work, will be discussed.
All dependencies can be installed using conda or pip; the former is recommended.
Keywords: geospatial data science, geospatial data analysis, tessellations, urban studies
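The idea behind the Voronoi method can be sketched discretely: every point of a regular grid is assigned to its nearest seed, partitioning the area into Voronoi regions. This pure-Python sketch illustrates the concept only and does not use TessPy's API; the seed locations are arbitrary assumptions.

```python
# Discrete Voronoi tessellation of the unit square: label each grid
# point with the index of its nearest seed point.
def voronoi_labels(points, seeds):
    def nearest(p):
        return min(range(len(seeds)),
                   key=lambda j: (p[0] - seeds[j][0]) ** 2
                                 + (p[1] - seeds[j][1]) ** 2)
    return [nearest(p) for p in points]

# a 20x20 grid over the unit square, partitioned by three seeds
grid = [(x / 19, y / 19) for x in range(20) for y in range(20)]
seeds = [(0.2, 0.2), (0.8, 0.3), (0.5, 0.9)]
labels = voronoi_labels(grid, seeds)
```

In TessPy's irregular methods the role of the seeds is played by urban features such as POI clusters, so the resulting tiles follow the structure of the city rather than a fixed geometry.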
Procedia PDF Downloads 132
23933 A CFD Analysis of Hydraulic Characteristics of the Rod Bundles in the BREST-OD-300 Wire-Spaced Fuel Assemblies
Authors: Dmitry V. Fomichev, Vladimir V. Solonin
Abstract:
This paper presents the findings from a numerical simulation of the flow in 37-rod fuel assembly models spaced by a double-wire trapezoidal wrapping, as applied to the BREST-OD-300 experimental nuclear reactor. Data on the static pressure distribution within the models, and equations for determining the fuel bundle flow friction factors, have been obtained. Recommendations are provided on using the turbulence closure models available in ANSYS Fluent. A comparative analysis has been performed against the existing empirical equations for determining the flow friction factors, and the fit between the calculated and experimental data is shown. An analysis of the experimental data and the results of the numerical simulation of the BREST-OD-300 fuel rod assembly hydrodynamic performance is presented.
Keywords: BREST-OD-300, wire-spaced, fuel assembly, computational fluid dynamics
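Friction-factor equations of the kind obtained above are commonly cast as a power law f = C·Re^m and fitted in log-log space. The sketch below fits such a law to points generated from the smooth-pipe Blasius correlation; it is an illustration of the fitting procedure, not the paper's own equations for the wire-spaced bundle.

```python
import math

# Fit f = C * Re**m by linear least squares on (log Re, log f).
# Synthetic data from the Blasius correlation f = 0.316 * Re**-0.25.
Re = [1e4, 2e4, 5e4, 1e5, 2e5]
f = [0.316 * r ** -0.25 for r in Re]

lx = [math.log(r) for r in Re]
ly = [math.log(v) for v in f]
n = len(Re)
mx, my = sum(lx) / n, sum(ly) / n
m = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
     / sum((a - mx) ** 2 for a in lx))
C = math.exp(my - m * mx)          # recovered prefactor and exponent
```

Fitting CFD or experimental pressure-drop data in the same coordinates is a standard way to compare a bundle's friction law against existing empirical correlations.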
Procedia PDF Downloads 386
23932 Analysis of Lead Time Delays in Supply Chain: A Case Study
Authors: Abdel-Aziz M. Mohamed, Nermeen Coutry
Abstract:
Lead time is an important measure of supply chain performance. It impacts both customer satisfaction and the total cost of inventory. This paper presents the results of a study on the analysis of customer order lead time for a multinational company. In the study, the lead time was divided into three stages: order entry, order fulfillment, and order delivery. A sample of 2,425 order lines from the company's records was considered for this study. The sample data include information on customer orders from the time of order entry until order delivery, as well as the lead time of each stage for the different orders. Summary statistics on the lead time data reveal that about 30% of the orders were delivered after the scheduled due date. Multiple linear regression analysis revealed that component type, logistics parameters, order size, and customer type have a significant impact on lead time. Analysis of the stages of lead time indicates that stage 2 consumes over 50% of the lead time. A Pareto analysis was performed to study the reasons for customer order delay in each of the three stages, and recommendations were given to resolve the problem.
Keywords: lead time reduction, customer satisfaction, service quality, statistical analysis
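A Pareto analysis of delay causes can be sketched as follows: rank the causes by frequency and keep the smallest set covering 80% of the delays. The cause names and counts below are illustrative assumptions, not the study's data.

```python
# Pareto analysis: identify the "vital few" delay causes that account
# for at least 80% of all delayed order lines.
delays = {"credit hold": 120, "stockout": 310, "documentation": 45,
          "carrier capacity": 80, "address error": 25}

total = sum(delays.values())
ranked = sorted(delays.items(), key=lambda kv: kv[1], reverse=True)

vital_few, cum = [], 0
for cause, count in ranked:
    vital_few.append(cause)
    cum += count
    if cum / total >= 0.80:
        break
```

Running the same ranking separately for each of the three lead-time stages shows where corrective action would pay off most, which is the role the Pareto analysis plays in the study.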
Procedia PDF Downloads 737