Search results for: channel estimation
1031 Extraction of Dyes Using an Aqueous Two-Phase System in Stratified and Slug Flow Regimes of a Microchannel
Authors: Garima, S. Pushpavanam
Abstract:
In this work, an aqueous two-phase (polymer-salt) system is analyzed for the extraction of sunset yellow dye. A polymer-salt ATPS, i.e., polyethylene glycol-600 (PEG-600) and anhydrous sodium sulfate, is used for the extraction. Conditions are chosen to ensure that the extraction concentrates the dye in one of the phases; the dye has a propensity to migrate to the PEG-600 phase. The extracted sunset yellow dye is then degraded photocatalytically into less harmful components. The cloud point method was used to obtain the binodal curve of the ATPS. From the binodal curve, the composition of salt and PEG-600 was chosen such that the volume of the PEG-600-rich phase is low. This was selected to concentrate the dye from a dilute solution in a large volume of contaminated liquid into a small volume. This pre-concentration step provides a high reaction rate for the photocatalytic degradation reaction. Experimentally, the dye is extracted from the salt phase to the PEG-600 phase in batch extraction; this was found to be very fast, and all of the dye was extracted. The concentration of sunset yellow dye in the salt and polymer phases is measured at 482 nm by ultraviolet-visible spectrophotometry. The extraction experiment in microchannels under stratified flow is analyzed to determine the factors which affect dye extraction. The focus is on obtaining slug flow by adding nanoparticles in the microchannel. The primary aim is to exploit the fact that slug flow will help improve the mass transfer rate from one phase to another through shear-induced internal circulation in the dispersed phase.
Keywords: aqueous two-phase system, binodal curve, extraction, sunset yellow dye
Procedia PDF Downloads 358
1030 Colour Segmentation of Satellite Imagery to Estimate Total Suspended Solid at Rawa Pening Lake, Central Java, Indonesia
Authors: Yulia Chalri, E. T. P. Lussiana, Sarifuddin Madenda, Bambang Trisakti, Yuhilza Hanum
Abstract:
Water is a natural resource needed by humans and other living creatures. The territorial waters of Indonesia make up 81% of the country's area, consisting of inland waters and the sea. The research object is inland waters in the form of lakes and reservoirs, since 90% of inland water is found in them; therefore, their water quality should be monitored. One water quality parameter is Total Suspended Solid (TSS). Most earlier research performed direct measurement, taking water samples to obtain TSS values. This method takes a long time and needs special tools, resulting in significant cost. Remote sensing technology has solved many such problems, such as mapping watersheds and sedimentation, monitoring disaster areas, mapping coastline change, and weather analysis. The aim of this research is to estimate the TSS of Rawa Pening lake in Central Java by using a Landsat 8 image. The result shows that the proposed method successfully estimates Rawa Pening's TSS: in situ TSS falls in the normal water quality range, and so does the estimate produced by the segmentation method.
Keywords: total suspended solid (TSS), remote sensing, image segmentation, RGB value
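The abstract above classifies water pixels by their RGB values to estimate TSS. A minimal sketch of such a per-pixel classification is given below; the turbidity proxy and threshold bands are illustrative assumptions, not the study's calibrated values.

```python
# Hypothetical sketch: classify per-pixel TSS from satellite RGB values.
# The thresholds below are illustrative assumptions, not calibrated values.

def classify_tss(rgb):
    """Map an (R, G, B) tuple to a coarse TSS class."""
    r, g, b = rgb
    # Higher red/green reflectance relative to blue is commonly
    # associated with more suspended sediment in inland waters.
    turbidity_proxy = (r + g) / 2.0 - b
    if turbidity_proxy < 10:
        return "low TSS (clear water)"
    elif turbidity_proxy < 40:
        return "moderate TSS"
    return "high TSS (turbid water)"

print(classify_tss((30, 40, 50)))    # blue-dominated, clear-water pixel
print(classify_tss((120, 110, 60)))  # sediment-laden pixel
```

In practice the thresholds would be fitted against in situ TSS samples before the classification is applied lake-wide.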
Procedia PDF Downloads 412
1029 Comparative Evaluation of EBT3 Film Dosimetry Using Flatbed Scanner, Densitometer and Spectrophotometer Methods and Its Applications in Radiotherapy
Authors: K. Khaerunnisa, D. Ryangga, S. A. Pawiro
Abstract:
Over the past few decades, film dosimetry has become a tool used in various radiotherapy modalities, either for clinical quality assurance (QA) or for dose verification. The response of the film to irradiation is usually expressed as optical density (OD) or net optical density (netOD). Since the film's response to radiation is not linear, the use of film as a dosimeter must go through a calibration process. This study aimed to compare the calibration curve response functions of various measurement methods: a flatbed scanner, a point densitometer and a spectrophotometer. For every response function, a radiochromic film calibration curve was generated for each method by performing accuracy, precision and sensitivity analysis. netOD is obtained by measuring the change in optical density (OD) of the film before and after irradiation: with the film scanner, ImageJ is used to extract the pixel value of the film on the red channel of the three RGB channels; with the point densitometer, the change in OD before and after irradiation is calculated; and with the spectrophotometer, the change in absorbance before and after irradiation is calculated. The results showed that the three calibration methods gave readings with a netOD precision below 3% for an uncertainty of 1σ (one sigma). While the sensitivity of all three methods follows the same trend in responding to film readings against radiation, the magnitude of the sensitivity differs. The accuracy of the three methods is below 3% for doses of 100 cGy and 200 cGy and above, but for doses below 100 cGy it was found to exceed 3% when using the point densitometer and the spectrophotometer.
When all three methods are used for clinical implementation, the results of the study show accuracy and precision below 2% for the scanner and the spectrophotometer, and above 3% for the point densitometer.
Keywords: calibration methods, film dosimetry EBT3, flatbed scanner, densitometer, spectrophotometer
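The netOD quantity described above can be sketched as follows; the pixel values and the 16-bit scanner maximum are illustrative assumptions, not the study's measured data.

```python
import math

# Sketch of the netOD computation, assuming mean red-channel pixel values
# (PV) extracted with ImageJ from a 16-bit flatbed scan. Values illustrative.

def optical_density(pv, pv_max=65535):
    """OD of a scanned film region relative to the scanner's maximum value."""
    return math.log10(pv_max / pv)

def net_od(pv_before, pv_after, pv_max=65535):
    """Change in OD between the unexposed and exposed film."""
    return optical_density(pv_after, pv_max) - optical_density(pv_before, pv_max)

# Darker film after irradiation (lower pixel value) gives a positive netOD.
print(round(net_od(40000, 20000), 4))
```

The calibration curve is then a fit of delivered dose against netOD over a set of known exposures.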
Procedia PDF Downloads 135
1028 The Influence of Intellectual Capital Disclosures on Market Capitalization Growth
Authors: Nyoman Wijana, Chandra Arha
Abstract:
Disclosure of Intellectual Capital (IC) is a presentation of corporate information assets that are not recorded in the financial statements. This disclosure is very helpful because it provides information on the company's intangible assets. In the new economic era, a company's intangible assets will determine its competitive advantage. This study aimed to examine the effect of IC disclosures on market capitalization growth. Observational studies were conducted over ten years, from 2002 to 2011. One hundred companies with the largest market capitalization in 2011 were traced back over the last ten years; the data used are from 2011, 2008, 2005, and 2002. The method used for acquiring the data is content analysis. The analytical method is Ordinary Least Squares (OLS), and the analysis tool is EViews 7, whose Pooled Least Squares parameter estimation is specifically designed for panel data. The test results showed that disclosure levels affected market capitalization growth inconsistently across the observation years. The results of this study are expected to motivate public companies in Indonesia to make more voluntary IC disclosures and to encourage regulators to issue comprehensive regulations so that all categories of IC must be disclosed by companies.
Keywords: IC disclosures, market capitalization growth, analytical method, OLS
Procedia PDF Downloads 340
1027 Development of Prediction Models of Day-Ahead Hourly Building Electricity Consumption and Peak Power Demand Using the Machine Learning Method
Authors: Dalin Si, Azizan Aziz, Bertrand Lasternas
Abstract:
To encourage building owners to purchase electricity at the wholesale market and reduce building peak demand, this study aims to develop models that predict day-ahead hourly electricity consumption and demand using an artificial neural network (ANN) and a support vector machine (SVM). All prediction models are built in Python with the Scikit-learn and PyBrain libraries. The input data for both consumption and demand prediction are time stamp, outdoor dry bulb temperature, relative humidity, air handling unit (AHU) supply air temperature and solar radiation. Solar radiation, which is unavailable a day ahead, is predicted first, and this estimate is then used as an input to predict consumption and demand. Models to predict consumption and demand are trained with both SVM and ANN, and depend on cooling or heating and on weekdays or weekends. The results show that ANN is the better option for both consumption and demand prediction. It can achieve a coefficient of variation of the root mean square error (CVRMSE) of 15.50% to 20.03% for consumption prediction and 22.89% to 32.42% for demand prediction, respectively. To conclude, the presented models have the potential to help building owners purchase electricity at the wholesale market, but they are not robust when used for demand response control.
Keywords: building energy prediction, data mining, demand response, electricity market
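The CVRMSE metric used above to score the prediction models can be sketched as follows; the sample observed and predicted values are illustrative, not the study's data.

```python
import math

# Coefficient of variation of the root mean square error (CVRMSE), in percent:
# RMSE of the predictions normalized by the mean of the observations.

def cvrmse(observed, predicted):
    n = len(observed)
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    mean_obs = sum(observed) / n
    return 100.0 * rmse / mean_obs

obs = [100.0, 120.0, 90.0, 110.0]    # illustrative hourly consumption (kWh)
pred = [105.0, 115.0, 95.0, 100.0]   # illustrative model predictions
print(round(cvrmse(obs, pred), 2))
```

Lower CVRMSE indicates a better fit; ASHRAE-style guidelines typically use it to judge whether an hourly energy model is acceptably calibrated.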
Procedia PDF Downloads 316
1026 Estimation of Cholesterol Level in Different Brands of Vegetable Oils in Iraq
Authors: Mohammed Idaan Hassan Al-Majidi
Abstract:
An analysis of twenty-one assorted brands of vegetable oils in Babylon, Iraq, reveals varying levels of cholesterol content. Cholesterol was found to be present in most of the oil brands sampled using three standard methods. Cholesterol was detected in seventeen of the vegetable oil brands at concentrations of less than 1 mg/ml, while seven of the oil brands had cholesterol concentrations ranging between 1-4 mg/ml. Low iodine values were obtained in four of the vegetable oil brands, and three of them had high acid values. High performance liquid chromatography (HPLC) confirmed the presence of cholesterol at varying concentrations in all the oil brands and gave the lowest detectable cholesterol values in all the oil brands. The Laser brand, made from rapeseed, had the highest cholesterol concentration of 3.2 mg/ml, while the Grand brand, made from groundnuts, had the lowest concentration (0.12 mg/ml) of cholesterol by HPLC analysis. The Liebermann-Burchard method showed that the Gino brand, from palm kernel, had the lowest concentration of cholesterol (3.86 mg/ml ± 0.032), and the highest concentration of 3.996 mg/ml ± 0.0404 was obtained in the sesame seed oil brand. This report is important in view of the health implications of cholesterol in our diets. Consequently, we have been able to show that there is no cholesterol-free oil on the market, contrary to what is shown on vegetable oil brand labels. Therefore, companies producing and marketing vegetable oils are enjoined to desist from misleading the public by labeling their products as "cholesterol free". They should indicate the amount of cholesterol present in the vegetable oil, no matter how small the quantity may be.
Keywords: vegetable oils, heart diseases, Liebermann-Burchard, cholesterol
Procedia PDF Downloads 259
1025 Scentscape of the Soul as a Direct Channel of Communication with the Psyche and Physical Body
Authors: Elena Roadhouse
Abstract:
"When I take the kitchen middens from the latest canning session out to the compost before going to bed, the orchestra is in full chorus. Night vapors and scents from the earth mingle with the fragrance of honeysuckle nearby and basil grown in the compost. They merge into the rhythmic pulse of night." (William Longgood) Carl Jung did not specifically recognize scent and olfactory function as a window into the psyche. He did recognize instinct and the natural history of mankind as key to understanding and reconnecting with the psyche. The progressive path of modern humans has brought incredible scientific and industrial advancements that have changed the human relationship with Mother Earth and the primal wisdom of mankind, and has led to the loss of instinct. The olfactory bulbs are an integral part of our ancient brain and have evolved in a way that mirrors the human separation from the instinctual self. If olfaction is a gateway to our instinct, then it is also a portal to the soul. Natural aromatics are significant and powerful instruments for supporting the mind, our emotional selves, and our bodies. This paper aims to shed light on the important role of scent in understanding the existence of the psyche, generational trauma, and archetypal fragrance. Personalized natural perfume combined with mindfulness practices can be used as an effective behavioral conditioning tool to promote the healing of transgenerational and individual trauma, the fragmented self, and the physical body.
Keywords: scentscape of the soul, psyche, individuation, epigenetics, depth psychology, Carl Jung, instinct, trauma, archetypal scent, personal myth, holistic wellness, natural perfumery
Procedia PDF Downloads 104
1024 Machine Learning Classification of Fused Sentinel-1 and Sentinel-2 Image Data Towards Mapping Fruit Plantations in Highly Heterogenous Landscapes
Authors: Yingisani Chabalala, Elhadi Adam, Khalid Adem Ali
Abstract:
Mapping smallholder fruit plantations using optical data is challenging due to morphological landscape heterogeneity and crop types with overlapping spectral signatures. Furthermore, cloud cover limits the use of optical sensing, especially in subtropical climates where it is persistent. This research assessed the effectiveness of Sentinel-1 (S1) and Sentinel-2 (S2) data for mapping fruit trees and co-existing land-use types by using support vector machine (SVM) and random forest (RF) classifiers independently. These classifiers were also applied to fused data from the two sensors. Feature ranks were extracted using the RF mean decrease accuracy (MDA) and forward variable selection (FVS) to identify optimal spectral windows for classifying fruit trees. Based on RF MDA and FVS, the SVM classifier applied to the fused satellite data resulted in relatively high classification accuracy, with an overall accuracy (OA) of 91.6% and a kappa coefficient of 0.91. Application of SVM to S1, S2, the S2 selected variables, and the S1S2 fusion independently produced OA = 27.64%, kappa = 0.13; OA = 87%, kappa = 0.87; OA = 69.33%, kappa = 0.69; and OA = 87.01%, kappa = 0.87, respectively. Results also indicated that the optimal spectral bands for fruit tree mapping are green (B3) and SWIR_2 (B10) for S2, and the vertical-horizontal (VH) polarization band for S1. Including the textural metrics from the VV channel improved the discrimination of crops and co-existing land use cover types. The fusion approach proved robust and well-suited for accurate smallholder fruit plantation mapping.
Keywords: smallholder agriculture, fruit trees, data fusion, precision agriculture
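The accuracy measures reported above, overall accuracy and the kappa coefficient, are both derived from a confusion matrix. A minimal sketch follows; the 2x2 matrix is illustrative, not the study's results.

```python
# Overall accuracy and Cohen's kappa from a confusion matrix
# (rows: reference class, columns: predicted class). Matrix illustrative.

def overall_accuracy(cm):
    total = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(len(cm)))
    return correct / total

def kappa(cm):
    n = len(cm)
    total = sum(sum(row) for row in cm)
    po = overall_accuracy(cm)  # observed agreement
    # Expected chance agreement from row/column marginals
    pe = sum(
        (sum(cm[i]) / total) * (sum(cm[r][i] for r in range(n)) / total)
        for i in range(n)
    )
    return (po - pe) / (1 - pe)

cm = [[45, 5],
      [10, 40]]
print(round(overall_accuracy(cm), 2), round(kappa(cm), 2))
```

Kappa discounts the agreement expected by chance, which is why a high OA with a very low kappa (as in the S1-only result above) signals a classifier that is barely better than random class assignment.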
Procedia PDF Downloads 54
1023 A Study on Reliability of Gender and Stature Determination by Odontometric and Craniofacial Anthropometric Parameters
Authors: Churamani Pokhrel, C. B. Jha, S. R. Niraula, P. R. Pokharel
Abstract:
Human identification is one of the most challenging subjects that man has confronted. The determination of adult sex and stature are two of the four key factors (sex, stature, age, and race) in the identification of an individual. Craniofacial and odontometric parameters are important tools for forensic anthropologists when it is not possible to apply advanced techniques for identification purposes. The present study provides the anthropometric correlation of these parameters with stature and gender and also devises regression formulae for the reconstruction of stature. A total of 312 Nepalese students with an equal distribution of sex, i.e., 156 male and 156 female students aged 18-35 years, were taken for the study. A total of 10 parameters were measured (age, sex, stature, head circumference, head length, head breadth, facial height, bi-zygomatic width, mesio-distal canine width and inter-canine distance of both maxilla and mandible). Correlation and regression analysis was done to find the association between the parameters. All parameters were found to be greater in males than in females, and each difference was statistically significant. Out of the total 312 samples, the best regressors for the determination of stature were head circumference and mandibular inter-canine width, and those for gender were head circumference and the right mandibular teeth. The accuracy of prediction was 83%. Regression equations and analysis generated from craniofacial and odontometric parameters can be a supplementary approach for the estimation of stature and gender when the extremities are not available.
Keywords: craniofacial, gender, odontometric, stature
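Regression formulae of the kind devised above can be sketched as an ordinary least-squares fit of stature against a single predictor. The data points below are synthetic, chosen only to show the mechanics, not the study's measurements.

```python
# Simple linear regression y = a + b*x by ordinary least squares,
# e.g. stature (y) against head circumference (x). Data synthetic.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

head_circ = [54.0, 55.0, 56.0, 57.0, 58.0]       # cm (hypothetical)
stature = [160.0, 163.0, 166.0, 169.0, 172.0]    # cm (hypothetical)
a, b = fit_line(head_circ, stature)
print(round(a, 1), round(b, 1))          # intercept and slope
print(round(a + b * 56.5, 1))            # estimated stature for a new subject
```

In the actual study, multiple predictors and separate male/female equations would be fitted, and prediction accuracy assessed on held-out subjects.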
Procedia PDF Downloads 191
1022 From Bureaucracy to Organizational Learning Model: An Organizational Change Process Study
Authors: Vania Helena Tonussi Vidal, Ester Eliane Jeunon
Abstract:
This article aims to analyze the process of change from a bureaucratic management model to a learning organization model. The theoretical framework was based on the Beer and Nohria (2001) model, identified as Theory E and Theory O. Based on this theory, the empirical research was conducted around six key dimensions: goal, leadership, focus, process, reward systems and consulting. We used a case study of an educational institution located in Barbacena, Minas Gerais. This traditional center of technical knowledge had long adopted a bureaucratic style of management. After many changes in its business model, such as the creation of graduate and undergraduate courses, it decided to make a deep change in its management model, which is our research focus. The data were collected through semi-structured interviews with the director, managers and course supervisors. The analysis followed the procedures of the Collective Subject Discourse (CSD) method, developed by Lefèvre & Lefèvre (2000). Results showed the incremental growth of the management model toward a learning organization, with many visible impacts. Negative factors include resistance from people, poor information about the planning and implementation process, and old politics persisting inside the new model. Positive impacts include new procedures in human resources, mainly related to manager skills and empowerment; structure downsizing; an open discussion channel; and an integrated information system. The process is still under construction, and great effort is now being directed at manager and employee commitment to the process.
Keywords: bureaucracy, organizational learning, organizational change, E and O theory
Procedia PDF Downloads 434
1021 Use Multiphysics Simulations and Resistive Pulse Sensing to Study the Effect of Metal and Non-Metal Nanoparticles in Different Salt Concentration
Authors: Chun-Lin Chiang, Che-Yen Lee, Yu-Shan Yeh, Jiunn-Haur Shaw
Abstract:
Wafer fabrication is a critical part of the semiconductor process, as the finest linewidth continues to shrink with improving technology and device structures develop from 2D towards 3D. The nanoparticles contained in the slurry, or in the ultrapure water used for cleaning, have a large influence on the manufacturing process. Therefore, the semiconductor industry hopes to find a viable method for on-line detection of nanoparticle size and concentration. Resistive pulse sensing technology is one method that may answer this question. Since the properties of materials at the nanoparticle scale differ significantly from their properties at larger length scales, we want to clarify the translocation dynamics of metal and non-metal nanoparticles under resistive pulse sensing. In this study, we use the finite element method with three governing equations to perform multiphysics coupling simulations: the Navier-Stokes equation describes the laminar motion, the Nernst-Planck equation describes the ion transport, and the Poisson equation describes the potential distribution in the flow channel. We explore the ion current changes caused by metal and non-metal nanoparticles passing through the nanochannel in electrolytes of different concentrations. The reliability of the simulation results was then verified by resistive pulse sensing tests. The existing results show that the lower the ion concentration, the greater the effect of the nanoparticles on the ion concentration in the nanochannel. The conductive spikes are correlated with the nanoparticle surface charge. It can be concluded that in the resistive pulse sensing technique, the ion concentration in the nanochannel and the nanoparticle properties are important for the translocation dynamics, and the two interact.
Keywords: multiphysics simulations, resistive pulse sensing, nanoparticles, nanochannel
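The three governing equations named in the abstract can be written, for a steady, incompressible electrokinetic model (a standard textbook formulation; the exact form and boundary conditions used by the authors may differ), as:

```latex
% Laminar incompressible flow (Navier-Stokes with continuity); an electric
% body force term \rho_e \mathbf{E} is commonly included in electrokinetic models
\rho\,(\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\nabla p + \mu \nabla^{2}\mathbf{u} + \rho_e \mathbf{E},
\qquad \nabla\cdot\mathbf{u} = 0

% Ion transport (Nernst-Planck) for species i: diffusion, electromigration,
% and convection fluxes in steady state
\nabla\cdot\left( -D_i \nabla c_i - z_i \mu_i F\, c_i \nabla\phi
  + c_i \mathbf{u} \right) = 0

% Electric potential (Poisson), sourced by the net ionic space charge
\nabla^{2}\phi = -\frac{F}{\varepsilon}\sum_i z_i c_i
```

Here u is the velocity field, p the pressure, c_i, D_i, z_i and mu_i the concentration, diffusivity, valence and mobility of ion species i, phi the electric potential, F Faraday's constant and epsilon the permittivity of the electrolyte.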
Procedia PDF Downloads 349
1020 Earthquake Vulnerability and Repair Cost Estimation of Masonry Buildings in the Old City Center of Annaba, Algeria
Authors: Allaeddine Athmani, Abdelhacine Gouasmia, Tiago Ferreira, Romeu Vicente
Abstract:
Seismic risk mitigation for the old building stock is truly essential in Algerian urban areas, particularly those located in seismically prone regions, such as Annaba city, where the old buildings present high levels of degradation combined with no seismic strengthening and/or rehabilitation concerns. In this sense, the present paper approaches the issue of the seismic vulnerability assessment of old masonry building stocks through the adaptation of a simplified methodology developed for a European context similar to that of Annaba city, Algeria. This method is used for a first-level seismic vulnerability assessment of the masonry building stock of the old city center of Annaba. The methodology is based on a vulnerability index that is suitable for the evaluation of damage and for the creation of large-scale loss scenarios. Over 380 buildings were evaluated in accordance with the referred methodology, and the results obtained were then integrated into a Geographical Information System (GIS) tool. Such results can be used by the Annaba city council to support management decisions, based on a global view of the site under analysis, leading to more accurate and faster decisions on risk mitigation strategies and rehabilitation plans.
Keywords: damage scenarios, masonry buildings, old city center, seismic vulnerability, vulnerability index
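Vulnerability index methods of the kind referred to above are typically computed as a weighted sum of per-parameter class scores, normalized to a 0-100 scale. The sketch below is a generic illustration of that scheme; the parameter scores, weights, and maximum class score are hypothetical, not the values of the adapted methodology.

```python
# Hedged sketch of a vulnerability-index computation: each building parameter
# is assigned a class score, scores are weighted and normalized to [0, 100].
# All numbers below are illustrative assumptions.

MAX_CLASS_SCORE = 50  # assumed highest per-parameter class score

def vulnerability_index(scores, weights):
    """Weighted sum of class scores, normalized to a 0-100 index."""
    raw = sum(s * w for s, w in zip(scores, weights))
    max_raw = MAX_CLASS_SCORE * sum(weights)
    return 100.0 * raw / max_raw

scores = [20, 5, 50, 20]          # class scores of four hypothetical parameters
weights = [0.75, 1.0, 1.5, 0.5]   # hypothetical parameter weights
print(round(vulnerability_index(scores, weights), 1))
```

The resulting index for each of the 380+ surveyed buildings can then be mapped in GIS and fed into damage and loss scenario curves.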
Procedia PDF Downloads 451
1019 The Application of Insects in Forensic Investigations
Authors: Shirin Jalili, Hadi Shirzad, Samaneh Nabavi, Somayeh Khanjani
Abstract:
Forensic entomology is the science of the study and analysis of insect evidence to aid in criminal investigation. Awareness of the distribution, biology, ecology and behavior of the insects found at a crime scene can provide information about when, where and how the crime was committed, and it has many applications in criminal investigations. Its main use is the estimation of the minimum time since death in suspicious deaths. The close association between insects and corpses, and the use of insects in criminal investigations, is the subject of forensic entomology: insects attack a decomposing corpse and lay eggs on it from the initial stages, so forensic scientists can estimate the postmortem interval by studying the insect populations and the developing larval stages. In addition, toxicological and molecular studies of these insects can reveal the cause of death or even the identity of a victim. Entomology can also be used to detect drugs and poisons and to determine the location of an incident. Recent techniques make it possible for experts to gather robust entomological evidence, which can provide vital information about death, corpse movement or burial, submersion interval, time of decapitation, identification of specific sites of trauma, post-mortem artefacts on the body, use of drugs, linking a suspect to the scene of a crime, sexual molestation and the identification of suspects.
Keywords: forensic entomology, postmortem interval, insects, larvae
Procedia PDF Downloads 503
1018 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models and algorithms for the reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. The reliability assessment process included the following stages, which were reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of logical elements with probabilistic elements in the ODNF, yielding a reliability estimation polynomial and a quantified reliability; 6) calculation of the weights of the elements. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems and the research and design of systems with an optimal structure are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element
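Stage 5 above can be sketched on a toy system: two parallel branches, each of two elements in series, with shortest success paths A = x1·x2 and B = x3·x4. Orthogonalization rewrites A ∨ B as A + (¬A)·B, whose terms are disjoint, so element reliabilities can be substituted term by term. The element reliabilities below are illustrative.

```python
# Reliability polynomial obtained from the ODNF of a two-branch system:
# A v B  ->  A + (not A) * B, with A = x1&x2, B = x3&x4.
# Element reliabilities p1..p4 are illustrative values.

def system_reliability(p1, p2, p3, p4):
    path_a = p1 * p2                      # P(A): first series branch works
    path_b = p3 * p4                      # P(B): second series branch works
    # Orthogonal (disjoint) terms sum directly after substitution:
    return path_a + (1 - path_a) * path_b

r = system_reliability(0.9, 0.9, 0.9, 0.9)
print(round(r, 4))
# Cross-check with the direct parallel formula 1 - (1 - P(A))(1 - P(B)):
print(round(1 - (1 - 0.81) * (1 - 0.81), 4))
```

Both expressions agree because orthogonalization only rewrites the logic; it does not change the event being measured.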
Procedia PDF Downloads 73
1017 A Machine Learning-based Study on the Estimation of the Threat Posed by Orbital Debris
Authors: Suhani Srivastava
Abstract:
This research delves into the classification of orbital debris through machine learning (ML): it categorizes the intensity of the threat orbital debris poses using multiple ML models, to gain insight into effectively estimating the danger specific orbital debris can pose to future space missions. As the space industry expands, orbital debris becomes a growing concern in Low Earth Orbit (LEO), because increased orbital debris pollution can potentially jeopardize space missions. Moreover, detecting orbital debris and identifying its characteristics has become a major concern in Space Situational Awareness (SSA), and prior methods relying solely on physics can become inconvenient in the face of the growing issue. Thus, this research approaches orbital debris concerns through machine learning, an efficient and more convenient alternative for detecting the potential threat certain orbital debris poses. We found that logistic regression worked best, with 98% accuracy, and this research provides insight into the accuracies of specific machine learning models when classifying orbital debris. Our work would help provide spacecraft manufacturers with guidelines for mitigating risks, and would help provide aerospace engineers with the means to identify the kinds of protection that should be incorporated into objects traveling in LEO through the predictions our models provide.
Keywords: aerospace, orbital debris, machine learning, space, space situational awareness, NASA
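A minimal from-scratch sketch of the kind of logistic-regression classifier reported above is given below. The two features (imagine rescaled debris size and relative velocity) and the threat labels are synthetic, and the training loop is plain stochastic gradient descent rather than whatever solver the study used.

```python
import math

# Toy binary logistic regression (threat vs. no threat) trained by SGD.
# Features and labels are synthetic, purely for illustration.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(xs, ys, lr=0.5, epochs=2000):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = p - y                   # gradient of log-loss wrt the logit
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

xs = [(0.1, 0.2), (0.2, 0.1), (0.8, 0.9), (0.9, 0.7)]
ys = [0, 0, 1, 1]                          # 1 = high-threat debris
w, b = train(xs, ys)
preds = [int(sigmoid(w[0] * x[0] + w[1] * x[1] + b) > 0.5) for x in xs]
print(preds)
```

On real SSA catalogs the same model would be fitted to orbital elements and object properties, with accuracy evaluated on a held-out test set rather than the training data.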
Procedia PDF Downloads 21
1016 Phytochemical Screening, Anti-Microbial and Mineral Determination of Brysocarpus coccineus Root
Authors: I. L. Ibrahim, A. Mann, A. Ndanaimi
Abstract:
This research involved phytochemical screening, antibacterial activity testing, and mineral determination by flame photometry of the crude extract of Brysocarpus coccineus Schum. The phytochemical screening revealed that saponins, alkaloids, cardiac glycosides, and anthraquinones were present, suggesting that the plant extract could be used as an anti-inflammatory and anti-bleeding agent. Estimation of the mineral content shows that the crude extract of B. coccineus contains 0.73 (Na⁺), 1.06 (K⁺) and 1.98 (Ca²⁺), which suggests its use is safe for hypertensive patients and that it could be used to lower blood pressure. The antibacterial properties of the aqueous and ethanol extracts were studied against some bacteria (Pseudomonas aeruginosa, Escherichia coli, Bacillus subtilis, Klebsiella pneumoniae) by the disc diffusion method. The aqueous extract showed significant activity against the organisms, while at concentrations of 5-10 mg/ml the ethanol extract showed significant zones of inhibition against the organisms: E. coli (19 mm), B. cereus (12 mm), P. aeruginosa (11 mm), K. pneumoniae (11 mm). Minimum inhibitory concentration (MIC) testing was carried out, with considerable inhibitory effect on the organisms; the values observed were 1, 24, 16 and 19 mm against E. coli, B. cereus, P. aeruginosa and K. pneumoniae, respectively. Therefore, the plant could be a potential source of antibacterial agents, although more pharmacological and clinical study is recommended.
Keywords: phytochemicals, microorganisms, screenings, mineral ions
Procedia PDF Downloads 413
1015 Developing Critical-Process Skills Integrated Assessment Instrument as Alternative Assessment on Electrolyte Solution Matter in Senior High School
Authors: Sri Rejeki Dwi Astuti, Suyanta
Abstract:
Demands on assessment in the learning process have been affected by policy changes. Nowadays, assessment emphasizes not only knowledge but also skills and attitude; in reality, however, there are many obstacles to measuring them. This paper aimed to describe how to develop an integrated assessment instrument, as an alternative assessment, to measure critical thinking skills and science process skills on the electrolyte solution topic, and to describe the instrument's characteristics, such as logic validity and construct validity. The instrument development used the test development model by McIntire. Development process data were acquired from the development test steps and analyzed qualitatively. The initial product was reviewed by three peer reviewers and six expert judges (two subject matter experts, two evaluation experts and two chemistry teachers) to establish the logic validity, which was analyzed using Aiken's formula. The construct validity was estimated by exploratory factor analysis. Results showed that the integrated assessment instrument has an Aiken's value of 0.90, and all items in the integrated assessment were judged valid according to the construct validity.
Keywords: construct validity, critical thinking skills, integrated assessment instrument, logic validity, science process skills
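The Aiken's formula used above for the logic validity test can be sketched as follows; the ratings are illustrative (five raters scoring one item on a 1-5 relevance scale), not the study's expert-judgment data.

```python
# Aiken's V content-validity coefficient for one item:
# V = sum(r - lo) / (n * (hi - lo)), where r is each rater's score,
# lo/hi the scale endpoints, and n the number of raters. Ratings illustrative.

def aikens_v(ratings, lo=1, hi=5):
    n = len(ratings)
    s = sum(r - lo for r in ratings)
    return s / (n * (hi - lo))

ratings = [5, 4, 5, 4, 5]
print(aikens_v(ratings))  # values near 1 indicate strong rater agreement
```

An item is typically retained when its V exceeds a cutoff read from Aiken's significance tables for the given number of raters and scale points.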
Procedia PDF Downloads 263
1014 Plasma Treatment of a Lignite Using Water-Stabilized Plasma Torch at Atmospheric Pressure
Authors: Anton Serov, Alan Maslani, Michal Hlina, Vladimir Kopecky, Milan Hrabovsky
Abstract:
Recycling of organic waste has been an increasingly hot topic in recent years. The issue becomes even more interesting if raw material for fuel production can be obtained as a result of that recycling. A process of high-temperature decomposition of a lignite (a non-hydrolysable complex organic compound) was studied in the plasma gasification reactor PLASGAS, where a water-stabilized plasma torch was used as the source of high-enthalpy plasma. The plasma torch power was 120 kW and allowed heating of the reactor to more than 1000 °C. The material feeding rate into the gasification reactor was set to 30 and 60 kg per hour, which is comparable to small-scale industrial production. An efficiency estimation of the thermal decomposition process was made. The balance of the torch energy distribution was studied, as well as the influence of the lignite particle size and of an addition of methane (CH4) to the reaction volume on the syngas composition (H2 + CO). It was found that the H2:CO ratio had values in the range of 1.5 to 2.5, depending on the experimental conditions. The recycling process occurred at atmospheric pressure, which is an important benefit because it avoids the need for expensive vacuum pump systems. The work was supported by the Grant Agency of the Czech Republic under the project GA15-19444S.
Keywords: atmospheric pressure, lignite, plasma treatment, water-stabilized plasma torch
Procedia PDF Downloads 373
1013 Applying of an Adaptive Neuro-Fuzzy Inference System (ANFIS) for Estimation of Flood Hydrographs
Authors: Amir Ahmad Dehghani, Morteza Nabizadeh
Abstract:
This paper presents the application of an Adaptive Neuro-Fuzzy Inference System (ANFIS) to flood hydrograph modeling of the Shahid Rajaee reservoir dam located in Iran. This was carried out using 11 flood hydrographs recorded at the Tajan river gauging station. From this dataset, 9 flood hydrographs were chosen to train the model and 2 flood hydrographs to test it. Different architectures of the neuro-fuzzy model, varying in membership function and learning algorithm, were designed and trained with different numbers of epochs. The results were evaluated against the observed hydrographs, and the best model structure was chosen according to the least RMSE in each run. To evaluate the efficiency of the neuro-fuzzy model, statistical indices such as the Nash-Sutcliffe efficiency and flood peak discharge error criteria were calculated. In this simulation, the coordinates of a flood hydrograph, including peak discharge, were estimated using the discharge values of earlier time steps as input values to the neuro-fuzzy model. These results indicate the satisfactory efficiency of the neuro-fuzzy model for flood simulation. This performance demonstrates the suitability of the implemented approach for flood management projects. Keywords: adaptive neuro-fuzzy inference system, flood hydrograph, hybrid learning algorithm, Shahid Rajaee reservoir dam
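The Nash-Sutcliffe efficiency used above to score simulated against observed hydrographs can be sketched as follows (a generic implementation of the standard definition, not the authors' code):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    is no better than predicting the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# A perfect simulation scores exactly 1:
obs = [12.0, 35.0, 80.0, 60.0, 25.0]   # observed discharges (m^3/s)
print(nash_sutcliffe(obs, obs))         # -> 1.0
```

Values well below 1 on the test hydrographs would indicate the chosen ANFIS architecture generalizes poorly.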
Procedia PDF Downloads 478
1012 Estimation of Optimum Parameters of Non-Linear Muskingum Model of Routing Using Imperialist Competition Algorithm (ICA)
Authors: Davood Rajabi, Mojgan Yazdani
Abstract:
The non-linear Muskingum model is an efficient method for flood routing; however, its efficiency is influenced by three applied parameters. Therefore, this study assesses the efficiency of the Imperialist Competition Algorithm (ICA) in finding the optimum parameters of the non-linear Muskingum model. In addition to ICA, the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) were also used, to provide an available benchmark against which to judge ICA. ICA was first applied to the Wilson flood routing; then, the routing of two flood events of the DoAab Samsami River was investigated. For the Wilson flood, the target function was the sum of squared deviations (SSQ) of observed and calculated discharges. For routing the two other floods, in addition to SSQ, the sum of absolute deviations (SAD) of observed and calculated discharges was also considered as a target function. For the first flood, GA performed best based on SSQ, whereas ICA ranked first based on SAD. For the second flood, ICA performed better under both target functions. According to the obtained results, ICA can be used as an appropriate method to estimate the parameters of the non-linear Muskingum model. Keywords: Doab Samsami river, genetic algorithm, imperialist competition algorithm, meta-exploratory algorithms, particle swarm optimization, Wilson flood
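The three parameters referred to above are commonly denoted K, x, and m in the non-linear storage relation S = K[xI + (1-x)O]^m. A sketch of how one routing pass and the SSQ target function could look, assuming a simple explicit continuity update (the scheme and all parameter values are illustrative, not the paper's formulation):

```python
def route_muskingum(inflow, K, x, m, dt=1.0, s0=0.0):
    """Route an inflow hydrograph through non-linear Muskingum storage
    S = K * (x*I + (1-x)*O)**m, solving for O from S at each step and
    updating storage by continuity: dS/dt = I - O."""
    storage, outflows = s0, []
    for i_t in inflow:
        # invert the storage relation for the outflow
        o_t = ((storage / K) ** (1.0 / m) - x * i_t) / (1.0 - x)
        o_t = max(o_t, 0.0)
        outflows.append(o_t)
        storage += dt * (i_t - o_t)
    return outflows

def ssq(observed, calculated):
    """Sum of squared deviations, the target function minimized here."""
    return sum((o - c) ** 2 for o, c in zip(observed, calculated))
```

An optimizer such as ICA, GA, or PSO then searches (K, x, m) to minimize `ssq(observed, route_muskingum(inflow, K, x, m))`.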
Procedia PDF Downloads 504
1011 Does Trade and Institutional Quality Play Any Significant Role on Environmental Quality in Sub-Saharan Africa?
Authors: Luqman Afolabi
Abstract:
This paper measures the impacts of trade and institutions on environmental quality in Sub-Saharan Africa (SSA). To examine the direction and magnitude of the effects, the study employs the pooled mean group (PMG) estimation technique on panel data obtained from the World Bank's World Development and Governance Indicators between 1996 and 2018. The empirical estimates validate the environmental Kuznets curve (EKC) hypothesis for the region, even though results on the environment-growth nexus have been inconclusive. Similarly, a positive coefficient is obtained for the impact of trade on the environment, while the institutional indicators produce mixed results. A significant policy implication is that the governments of the SSA countries should pursue policies that tend to increase economic growth, so that pollution may be reduced. Such policies may include providing incentives for sustainable growth-driven industries in the region. In addition, governance infrastructure should be improved so that appropriate penalties are imposed on polluters, while advanced technologies that have the potential to reduce environmental degradation should be encouraged. Finally, these findings make it imperative that the governments of the region promote their trade relations and the competitiveness of their local industries in order to keep pace with global markets. Keywords: environmental quality, institutional quality, sustainable development goals, trade
Procedia PDF Downloads 142
1010 Oxidative Stress Markers in Sports Related to Training
Authors: V. Antevska, B. Dejanova, L. Todorovska, J. Pluncevic, E. Sivevska, S. Petrovska, S. Mancevska, I. Karagjozova
Abstract:
Introduction: The aim of this study was to optimise laboratory oxidative stress (OS) markers in soccer players. Material and methods: Plasma samples were taken from 37 soccer players (21±3 years old) and 25 sedentary control subjects for determination of d-ROMs (reactive oxygen metabolites) and NO (nitric oxide). The d-ROMs test was performed by measurement of hydroperoxide levels (Diacron, Italy). For NO determination, the nitrate enzymatic reduction method with the Griess reagent was used (OXIS, USA). The parameters were measured after the training of the soccer players and were compared with the control group. Training consisted of a maximal exercise treadmill test; the criterion of maximum loading for each subject was established as >95% of maximal heart rate. Results: The level of d-ROMs was increased in the soccer players vs. the control group, but no significant difference was noticed. After the training, d-ROMs in soccer players showed an increased value of 299±44 UCarr (p<0.05). NO showed an increased level in all soccer players vs. controls, but a significant difference was found only after the training, 102±29 μmol (p<0.05). Conclusion: Based on these results, we suggest that measuring these OS markers in sports medicine may be useful for better estimation and evaluation of the training program. More oxidative stress markers should be studied to clarify the optimization of training intensity programs. Keywords: oxidative stress markers, soccer players, training, sport
Procedia PDF Downloads 447
1009 Enhancing Solar Fuel Production by CO₂ Photoreduction Using Transition Metal Oxide Catalysts in Reactors Prepared by Additive Manufacturing
Authors: Renata De Toledo Cintra, Bruno Ramos, Douglas Gouvêa
Abstract:
There is huge global concern over the emission of greenhouse gases, the consequent environmental problems, and the increase in the average temperature of the planet, caused mainly by fossil fuels, of which petroleum derivatives represent a large part. One of the main greenhouse gases, in terms of volume, is CO₂. Recovering part of this product through chemical reactions that use sunlight as an energy source, and even producing renewable fuel (such as ethane, methane, or ethanol), is a great opportunity. The process of artificial photosynthesis, the conversion of CO₂ and H₂O into organic products and oxygen using a metal oxide catalyst under sunlight, is one of the promising solutions; this research is therefore of great relevance. For this reaction to take place efficiently, an optimized reactor was developed through simulation and prior analysis, so that the geometry of the internal channel provides an efficient route and allows the reaction to happen in a controlled and optimized way, in continuous flow and offering the least possible resistance. The prototype reactor can be made from different materials, such as polymers, ceramics, and metals, and produced through different processes, such as additive manufacturing (3D printing) or CNC machining, among others. To carry out the photocatalysis in the reactors, different types of catalysts will be used, such as ZnO deposited by spray pyrolysis on the illumination window, modified ZnO, TiO₂, and modified TiO₂, among others, aiming to increase the production of organic molecules with the lowest possible energy. Keywords: artificial photosynthesis, CO₂ reduction, photocatalysis, photoreactor design, 3D printed reactors, solar fuels
Procedia PDF Downloads 86
1008 Impact of Social Media in Tourism Marketing
Authors: Betül Garda
Abstract:
Technological developments have diversified the marketing activities of the tourism sector and increased opportunities for tourism businesses to compete on a global scale. Tourism businesses have been forced to use their core skills and knowledge effectively as the influence of technology in the global competitive environment has grown. Tourism businesses have reached beyond traditional boundaries in their commercial activities, so the boundaries of national markets have been either eliminated or blurred. The internet is therefore an alternative promotion tool and distribution channel providing unlimited facilities for tourism suppliers. For example, the internet provides an opportunity to reach customers on a global scale with direct email marketing, advertising, customer service, promotion, sales, and marketing. Tourism businesses have improved themselves through continuous information flows and have made these changes permanent. Especially for tourism businesses, social media is emerging as an extremely important tool for using knowledge effectively. This research paper investigates the impact of social media on tourism businesses. A social networking site is a type of social media that provides a platform for businesses and people to connect with each other. Social media is so flexible that it can be used for both leisure and business purposes. In the tourism industry, social networking sites are among the essential tools that play an important and beneficial role. The topics discussed in this research paper are consumer behavior, connection with consumers, effectiveness in terms of time and cost, creating brand awareness and building the image of the company, promoting the company, and targeting consumers, in a conceptual framework. Keywords: branding, promoting, social media in tourism, tourism marketing tools
Procedia PDF Downloads 283
1007 Whole Coding Genome Inter-Clade Comparison to Predict Global Cancer-Protecting Variants
Authors: Lamis Naddaf, Yuval Tabach
Abstract:
In this research, we identified missense genetic variants that have the potential to enhance resistance against cancer. This field has not been widely explored, as researchers tend to investigate mutations that cause diseases, in response to the suffering of patients, rather than mutations that protect from them. In conjunction with the genomic revolution and the advances in genetic engineering and synthetic biology, identifying protective variants will increase the power of genotype-phenotype predictions and can have significant implications for improved risk estimation, diagnostics, and prognosis, and even for personalized therapy and drug discovery. To approach our goal, we systematically investigated the sites of the coding genomes and picked the alleles that showed a correlation with the species' cancer resistance. We predicted 250 protecting variants (PVs) at a 0.01 false discovery rate and more than 20 thousand PVs at a 0.25 false discovery rate. Cancer resistance in mammals and reptiles was significantly predicted by the number of PVs a species has. Moreover, genes enriched with the protecting variants are enriched in pathways relevant to tumor suppression, such as the Hedgehog signaling and silencing pathways, whose improper activation is associated with the most common form of cancer malignancy. We also showed that the PVs are more abundant in healthy people than in cancer patients within different human populations. Keywords: comparative genomics, machine learning, cancer resistance, cancer-protecting alleles
Procedia PDF Downloads 97
1006 Optimal Image Representation for Linear Canonical Transform Multiplexing
Authors: Navdeep Goel, Salvador Gabarda
Abstract:
Digital images are widely used in computer applications. Storing or transmitting uncompressed images requires considerable storage capacity and transmission bandwidth. Image compression is a means to perform transmission or storage of visual data in the most economical way. This paper explains how images can be encoded for transmission over a multiplexing time-frequency domain channel. Multiplexing involves packing together signals whose representations are compact in the working domain. In order to optimize transmission resources, each 4x4 pixel block of the image is transformed, by a suitable polynomial approximation, into a minimal number of coefficients. Using fewer than 4x4 coefficients per block spares a significant amount of transmitted information, but some information is lost. Different approximations for the image transformation have been evaluated: polynomial representation (Vandermonde matrix), least squares with gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev polynomials, and singular value decomposition (SVD). Results have been compared in terms of nominal compression rate (NCR), compression ratio (CR), and peak signal-to-noise ratio (PSNR), in order to minimize the error function defined as the difference between the original pixel gray levels and the approximated polynomial output. Polynomial coefficients have been subsequently encoded and used to generate chirps, at a target rate of about two chirps per 4x4 pixel block, and then submitted to a transmission multiplexing operation in the time-frequency domain. Keywords: chirp signals, image multiplexing, image transformation, linear canonical transform, polynomial approximation
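The PSNR figure of merit used to compare the approximations can be sketched as follows (the generic definition for 8-bit gray levels, not code from the paper):

```python
import math

def psnr(original, approximated, max_val=255.0):
    """Peak signal-to-noise ratio in dB between the original pixel
    gray levels and the approximated polynomial output."""
    mse = sum((o - a) ** 2 for o, a in zip(original, approximated)) / len(original)
    if mse == 0:
        return float("inf")  # identical blocks
    return 10.0 * math.log10(max_val ** 2 / mse)

# A 4x4 block (flattened) reconstructed with a one-gray-level error per pixel:
block = [16 * i for i in range(16)]
approx = [p + 1 for p in block]
print(round(psnr(block, approx), 1))  # -> 48.1
```

Higher PSNR at the same number of retained coefficients per block indicates the better polynomial basis.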
Procedia PDF Downloads 412
1005 Modeling of Diurnal Pattern of Air Temperature in a Tropical Environment: Ile-Ife and Ibadan, Nigeria
Authors: Rufus Temidayo Akinnubi, M. O. Adeniyi
Abstract:
Existing diurnal air temperature models simulate night-time air temperature over Nigeria with high biases. An improved parameterization is presented for modeling the diurnal pattern of air temperature (Ta), applicable to the calculation of turbulent heat fluxes in global climate models, based on surface layer observations from the Nigeria Micrometeorological Experimental site (NIMEX). Five diurnal Ta models for estimating hourly Ta from daily maximum, daily minimum, and daily mean air temperature were validated using the root-mean-square error (RMSE), the mean bias error (MBE), and scatter plots. The original Fourier series model performed better for unstable air temperature parameterizations, while stable Ta was strongly overestimated, with a large error. The model was improved by including the atmospheric cooling rate, which accounts for the temperature inversion that occurs under nocturnal boundary layer conditions. The MBE and RMSE estimated by the modified Fourier series model were reduced by 4.45 °C and 3.12 °C during the transitional period from dry to wet stable atmospheric conditions. The modified Fourier series model gave a good estimation of the diurnal pattern of Ta when compared with other existing models for a tropical environment. Keywords: air temperature, mean bias error, Fourier series analysis, surface energy balance
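A first-harmonic Fourier model of the kind the paper builds on estimates hourly Ta from the daily extremes. A minimal sketch, assuming an illustrative hour of maximum temperature and omitting the paper's cooling-rate correction:

```python
import math

def hourly_temperature(t_hour, t_min, t_max, hour_of_max=15.0):
    """First-harmonic (cosine) diurnal air temperature model:
    Ta oscillates about the daily mean with amplitude half the daily
    range, peaking at `hour_of_max` local time."""
    mean = (t_max + t_min) / 2.0
    amplitude = (t_max - t_min) / 2.0
    return mean + amplitude * math.cos(2.0 * math.pi * (t_hour - hour_of_max) / 24.0)

# Peak at 15:00 equals the daily maximum; 12 hours later it is the minimum:
print(hourly_temperature(15.0, 22.0, 33.0))  # -> 33.0
print(hourly_temperature(3.0, 22.0, 33.0))   # -> 22.0
```

The pure cosine decays too slowly at night, which is exactly the stable-condition bias the paper's added cooling-rate term corrects.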
Procedia PDF Downloads 230
1004 Increasing Business Competitiveness in Georgia in Terms of Globalization
Authors: Badri Gechbaia, Levan Gvarishvili
Abstract:
Although many Georgian scientists have worked on the issue of business competitiveness, we believe it is necessary to deepen the work in this sphere: to refine the methodology for estimating business competitiveness, to identify the main factors that define competitive advantages in business, to establish the interconnections between the level of business competitiveness and the quality of the state's involvement in international economic processes, and to define ways to raise business competitiveness and its role in upgrading the country's economic development. The introduction justifies the relevance of the topic and the thesis; it defines the survey subject, object, and goals with the relevant objectives; the theoretical-methodological and informational-statistical base for the survey; what is new in the survey; and its value for theoretical and practical application. This study is an effort to raise public awareness of the issue. It analyzes the fundamental conditions for the efficient functioning of business in Georgia and identifies reserves for increasing its efficiency based on an assessment of the strengths and weaknesses of the business sector. Methods of system analysis, abstract-logical reasoning, induction and deduction, synthesis and generalization, and positive, normative, and comparative analysis are used in the research process. Specific regularities of the impact of the globalization process on the determinants of business competitiveness are established, and the reasons for the level of business competitiveness in Georgia are identified. Keywords: competitiveness, methodology, Georgian, economic
Procedia PDF Downloads 113
1003 Different Sampling Schemes for Semi-Parametric Frailty Model
Authors: Nursel Koyuncu, Nihal Ata Tutkun
Abstract:
The frailty model is a survival model that takes into account unobserved heterogeneity when exploring the relationship between the survival of an individual and several covariates. In recent years, proposed survival models have become more complex, and this feature causes convergence problems, especially in large data sets. Therefore, the selection of a sample from these big data sets is very important for the estimation of parameters. In the sampling literature, some authors have defined new sampling schemes to estimate the parameters correctly. To this end, we examine the effect of sampling design on the semi-parametric frailty model. We conducted a simulation study in the R program to estimate the parameters of the semi-parametric frailty model for different sample sizes and censoring rates under classical simple random sampling and ranked set sampling schemes. In the simulation study, we used as the population a data set recording 17,260 male civil servants aged 40-64 years with complete 10-year follow-up. Time to death from coronary heart disease is treated as the survival time, and age and systolic blood pressure are used as covariates. We selected 1000 samples from the population using the different sampling schemes and estimated the parameters. From the simulation study, we concluded that the ranked set sampling design performs better than simple random sampling in each scenario. Keywords: frailty model, ranked set sampling, efficiency, simple random sampling
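Ranked set sampling draws m small random sets, ranks each one, and keeps only the i-th ranked unit from the i-th set, repeating for several cycles. A minimal sketch (not the authors' R code; ranking here is by the value itself, standing in for a judgment or auxiliary-variable ranking):

```python
import random

def ranked_set_sample(population, m, cycles, key=None, rng=random):
    """Draw a ranked set sample of size m * cycles.

    In each cycle, for i = 1..m: draw a simple random set of m units,
    rank it, and retain only the i-th ranked unit."""
    sample = []
    for _ in range(cycles):
        for i in range(m):
            s = sorted(rng.sample(population, m), key=key)
            sample.append(s[i])  # keep the (i+1)-th ranked unit
    return sample

ages = list(range(40, 65))              # e.g. ages of the civil servants
rss = ranked_set_sample(ages, m=3, cycles=4)
print(len(rss))  # -> 12 (sample size m * cycles)
```

Each retained unit comes from a different order statistic, which is what tends to make RSS estimators more efficient than simple random sampling of the same size.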
Procedia PDF Downloads 211
1002 Development and Validation of a HPLC Method for 6-Gingerol and 6-Shogaol in Joint Pain Relief Gel Containing Ginger (Zingiber officinale)
Authors: Tanwarat Kajsongkram, Saowalux Rotamporn, Sirinat Limbunruang, Sirinan Thubthimthed.
Abstract:
A High-Performance Liquid Chromatography (HPLC) method was developed and validated for the simultaneous estimation of 6-gingerol (6G) and 6-shogaol (6S) in a joint pain relief gel containing ginger extract. The chromatographic separation was achieved using a C18 column (150 x 4.6 mm i.d., 5 μm, Luna) with a mobile phase of acetonitrile and water (gradient elution). The flow rate was 1.0 ml/min and the absorbance was monitored at 282 nm. The proposed method was validated in terms of analytical parameters such as specificity, accuracy, precision, linearity, range, limit of detection (LOD), and limit of quantification (LOQ), determined according to the International Conference on Harmonisation (ICH) guidelines. Linearity ranges of 20-60 and 6-18 µg/ml were obtained for 6G and 6S, respectively. Good linearity was observed over these ranges, with linear regression equations Y = 11016x - 23778 for 6G and Y = 19276x - 19604 for 6S (x is the concentration of the analyte in μg/ml and Y is the peak area). The correlation coefficient was 0.9994 for both markers. The LOD and LOQ were 0.8567 and 2.8555 µg/ml for 6G, and 0.3672 and 1.2238 µg/ml for 6S, respectively. The recovery ranges for 6G and 6S were 91.57 to 102.36% and 84.73 to 92.85% across all three spiked levels. The RSD values from repeated extractions were 3.43% for 6G and 3.09% for 6S. The validation of the developed method with respect to precision, accuracy, specificity, linearity, and range thus gave well-accepted results. Keywords: ginger, 6-gingerol, HPLC, 6-shogaol
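The calibration lines above turn a measured peak area back into a concentration by inverting Y = slope·x + intercept, and the ICH LOD/LOQ use the usual ratios of 3.3 and 10 times σ/slope. A sketch using the paper's 6G calibration (the peak area is synthetic and σ is not taken from the paper):

```python
def concentration(peak_area, slope, intercept):
    """Invert the calibration line Y = slope*x + intercept to get x (µg/ml)."""
    return (peak_area - intercept) / slope

def lod_loq(sigma, slope):
    """ICH limits of detection and quantification from the response s.d. sigma."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# 6G calibration from the paper: Y = 11016x - 23778
area = 11016 * 40.0 - 23778                 # synthetic area of a 40 µg/ml standard
print(concentration(area, 11016, -23778))   # -> 40.0
```

Note that LOQ/LOD is fixed at 10/3.3, consistent with the reported 6G pair (2.8555/0.8567) and 6S pair (1.2238/0.3672).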
Procedia PDF Downloads 443