Search results for: TNT equivalent method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19571

17501 The Effect of Fast Food Globalisation on Students’ Food Choice

Authors: Ijeoma Chinyere Ukonu

Abstract:

This research investigates how the globalisation of fast food has affected students’ food choice. A mixed-method approach was used, combining quantitative and qualitative methods. The quantitative strand used a self-completion questionnaire to randomly sample one hundred and four students, while the qualitative strand used semi-structured interviews to survey four students on their knowledge of fast food and their choice to consume it. Cross-tabulation of variables and the Kruskal-Wallis nonparametric test were used to analyse the quantitative data, while the qualitative data were analysed by deducing themes and trends from the interview transcripts. The findings revealed that globalisation has amplified the evolution of fast food, popularising it among students. Its global presence has affected students’ food choice and preference. Price, convenience, taste, and peer influence are some of the major factors affecting students’ choice of fast food. Although students are familiar with the health effects of fast food and the significance of using food information labels for healthy choices, they still prefer fast food to homemade food.
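
A minimal sketch of the Kruskal-Wallis comparison mentioned above, written in Python with SciPy; the preference scores and groupings below are illustrative placeholders, not the study's data.

```python
# Minimal sketch of the Kruskal-Wallis analysis described above.
# The rating values are illustrative placeholders, not the study's data.
from scipy.stats import kruskal

# Hypothetical fast-food preference scores grouped by year of study
year_1 = [4, 5, 3, 4, 5, 4]
year_2 = [3, 4, 4, 2, 3, 4]
year_3 = [2, 3, 2, 3, 1, 2]

h_stat, p_value = kruskal(year_1, year_2, year_3)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")
# A p-value below 0.05 would indicate that at least one group's
# distribution of preference scores differs from the others.
```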

Keywords: fast food, food choice, globalisation, students

Procedia PDF Downloads 291
17500 Semi-Automatic Method to Assist Expert for Association Rules Validation

Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen

Abstract:

To help the expert validate association rules extracted from data, several quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The first depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, so mining the rules manually is a very hard task. To solve this problem, we propose in this paper a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers); (ii) we use the generated classifiers for data classification; (iii) we visualize the association rules with their classification quality to give the expert an overview and to assist him during the validation process.
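
The rule-based classification step described above can be sketched as follows; the rules, confidences, and record are hypothetical examples, not taken from the paper.

```python
# Minimal sketch of rule-based classification: association rules whose
# consequent is a class label are treated as classifiers, and records are
# labelled by the best matching rule. All values below are illustrative.

rules = [
    # (antecedent itemset, predicted class, confidence)
    ({"bread", "butter"}, "buys_milk", 0.90),
    ({"beer"}, "buys_chips", 0.75),
    ({"bread"}, "buys_milk", 0.60),
]

def classify(record, rules, default="unknown"):
    """Return the class of the highest-confidence rule whose antecedent
    is contained in the record, or a default when no rule fires."""
    matching = [r for r in rules if r[0] <= record]
    if not matching:
        return default
    best = max(matching, key=lambda r: r[2])
    return best[1]

record = {"bread", "butter", "eggs"}
print(classify(record, rules))   # -> buys_milk (rule with confidence 0.90)
```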

Keywords: association rules, rule-based classification, classification quality, validation

Procedia PDF Downloads 439
17499 Execution of Optimization Algorithm in Cascaded H-Bridge Multilevel Inverter

Authors: M. Suresh Kumar, K. Ramani

Abstract:

This paper proposes harmonic elimination in a cascaded H-bridge multilevel inverter by using the Selective Harmonic Elimination Pulse Width Modulation (SHE-PWM) method programmed with the Particle Swarm Optimization (PSO) algorithm. The PSO method efficiently determines the switching angles required to eliminate low-order harmonics up to the 11th order from the inverter output voltage waveform while keeping the magnitude of the fundamental at the desired value. Results demonstrate that the proposed method efficiently eliminates a large number of specific harmonics, and the output voltage exhibits minimum Total Harmonic Distortion. The results also show that the PSO algorithm reaches the global solution faster than other algorithms.
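
A rough sketch of how SHE-PWM switching angles might be computed with a hand-rolled PSO; the five-angle configuration, the targeted harmonics, and the PSO parameters are assumptions rather than the paper's settings.

```python
# Illustrative SHE-PWM angle search with a simple PSO. Assumes a 5-cell
# cascaded H-bridge (5 angles), quarter-wave symmetry, and targets the
# 5th, 7th and 11th harmonics; all parameter values are assumptions.
import numpy as np

N_ANGLES, M = 5, 0.8          # number of switching angles, modulation index
HARMONICS = [5, 7, 11]        # low-order harmonics to eliminate

def cost(theta):
    f = np.sum(np.cos(theta)) - N_ANGLES * M          # fundamental error
    h = [np.sum(np.cos(n * theta)) for n in HARMONICS]
    return f**2 + sum(x**2 for x in h)

rng = np.random.default_rng(0)
n_particles, n_iter = 40, 300
pos = rng.uniform(0, np.pi / 2, (n_particles, N_ANGLES))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, np.pi / 2)
    vals = np.array([cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("switching angles (deg):", np.degrees(np.sort(gbest)).round(2))
```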

Keywords: multi-level inverter, Selective Harmonic Elimination Pulse Width Modulation (SHEPWM), Particle Swarm Optimization (PSO), Total Harmonic Distortion (THD)

Procedia PDF Downloads 603
17498 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery

Authors: Evans Belly, Imdad Rizvi, M. M. Kadam

Abstract:

Satellite imagery is one of the emerging technologies extensively utilized in applications such as the detection/extraction of man-made structures, monitoring of sensitive areas, and creation of graphic maps. The main approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building, and non-building regions (roads, vegetation, etc.) are investigated, with the focus on building extraction. Once the landscape is collected, a trimming process eliminates landscape regions arising from non-building objects. Finally, the label method is used to extract the building regions; the label method may be altered for more efficient building extraction. The images used for the analysis are extracted from sensors with a resolution of less than 1 meter (VHR). The method provides an efficient way to produce good results: the additional overhead of intermediate processing is eliminated without compromising the quality of the output, easing the processing steps required and reducing the time consumed.
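
A simplified sketch of the labelling-and-trimming stage described above, using SciPy on a synthetic binary building mask; the mask and the area threshold are assumptions.

```python
# Simplified sketch of thresholding, labelling and trimming on a synthetic
# binary building mask (no real VHR imagery is processed here).
import numpy as np
from scipy import ndimage

# Hypothetical binary mask: 1 = candidate building pixel, 0 = background
mask = np.zeros((100, 100), dtype=np.uint8)
mask[10:30, 10:40] = 1          # a rectangular "building"
mask[60:90, 50:70] = 1          # another one
mask[5:8, 90:95] = 1            # small noise blob

# Label connected regions, then trim regions too small to be buildings
labels, n_regions = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n_regions + 1))
min_building_area = 100         # assumed threshold in pixels
keep = {i + 1 for i, s in enumerate(sizes) if s >= min_building_area}
buildings = np.isin(labels, list(keep)).astype(np.uint8)

print("regions found:", n_regions, "kept as buildings:", len(keep))
```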

Keywords: building detection, shadow detection, landscape generation, label, partitioning, very high resolution (VHR) satellite imagery

Procedia PDF Downloads 314
17497 Determination of Marbofloxacin in Pig Plasma Using LC-MS/MS and Its Application to the Pharmacokinetic Studies

Authors: Jeong Woo Kang, MiYoung Baek, Ki-Suk Kim, Kwang-Jick Lee, ByungJae So

Abstract:

Introduction: A fast, easy, and sensitive detection method was developed and validated by liquid chromatography tandem mass spectrometry for the determination of marbofloxacin in pig plasma, which was further applied to study the pharmacokinetics of marbofloxacin. Materials and Methods: The plasma sample (500 μL) was mixed with 1.5 mL of 0.1% formic acid in MeCN to precipitate plasma proteins. After shaking for 20 min, the mixture was centrifuged at 5,000 × g for 30 min and dried under a nitrogen flow at 50℃. A 500 μL aliquot of the sample was injected into the LC-MS/MS system. Chromatographic analysis was carried out with a mobile phase gradient consisting of 0.1% formic acid in D.W. (A) and 0.1% formic acid in MeCN (B) on a C18 reverse phase column. Mass spectrometry was performed in positive ion mode using multiple reaction monitoring (MRM). Results and Conclusions: The method validation was performed in the sample matrix. Good linearities (R² > 0.999) were observed, and the average recoveries of marbofloxacin were 87-92% at levels of 10-100 ng g⁻¹. The coefficient of variation (CV) for the described method was less than 10% over the range of concentrations studied. The limits of detection (LOD) and quantification (LOQ) were 2 and 5 ng g⁻¹, respectively. The method was also applied successfully to the pharmacokinetic analysis of marbofloxacin after intravenous (IV), intramuscular (IM), and oral (PO) administration. The mean peak plasma concentration (Cmax) was 2,597 ng g⁻¹ at 0.25 h, 2,587 ng g⁻¹ at 0.44 h, and 2,355 ng g⁻¹ at 1.58 h for IV, IM, and PO, respectively. The area under the plasma concentration-time curve (AUC0–t) was 24.8, 29.0, and 25.2 h·μg/mL for IV, IM, and PO, respectively. The elimination half-life (T1/2) was 8.6, 13.1, and 9.5 h for IV, IM, and PO, respectively. The bioavailability (F) of marbofloxacin in pigs was 117% and 101% for IM and PO, respectively. Based on these results, there are no pharmacokinetic obstacles to developing oral formulations of marbofloxacin, such as tablets and capsules.
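
The non-compartmental quantities reported above (Cmax, AUC by the trapezoidal rule, terminal half-life) can be sketched as follows; the concentration-time points are invented placeholders, not the measured pig data.

```python
# Sketch of non-compartmental pharmacokinetic calculations: Cmax, AUC by the
# trapezoidal rule, and terminal half-life. Values below are illustrative.
import numpy as np

t = np.array([0.25, 0.5, 1, 2, 4, 8, 12, 24])               # h
c = np.array([2500, 2300, 1900, 1400, 800, 350, 150, 30])   # ng/mL (illustrative)

cmax, tmax = c.max(), t[c.argmax()]
auc = np.sum(np.diff(t) * (c[:-1] + c[1:]) / 2)             # trapezoidal AUC0-t

# Terminal half-life from a log-linear fit of the last few points
slope, _ = np.polyfit(t[-4:], np.log(c[-4:]), 1)
t_half = np.log(2) / -slope

print(f"Cmax = {cmax} ng/mL at {tmax} h, AUC = {auc:.0f} h*ng/mL, t1/2 = {t_half:.1f} h")
```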

Keywords: marbofloxacin, LC-MS/MS, pharmacokinetics, chromatographic

Procedia PDF Downloads 548
17496 Bending Tests for the Axial Load Identifications in Space Structures with Unknown Boundary Conditions

Authors: M. Bonopera, N. Tullini, C. C. Chen, T. K. Lin, K. C. Chang

Abstract:

This paper presents the extension of a static method for axial load identification in prismatic beam-columns with uncertain length and unknown boundary conditions belonging to generic space structures, such as the columns of space frames or the struts and ties of space trusses. The non-destructive method requires knowledge of the beam-column flexural rigidity only. Flexural displacements are measured at five cross sections along the beam-column, which is subjected to an additional vertical load at mid-span. Unlike analogous dynamic methods, any set of experimental data may be used in the identification procedure. The method is verified by means of numerous numerical and experimental tests on beam-columns with unknown boundary conditions and different slenderness, belonging to three different small-scale space prototypes. Excellent estimates of the tensile and compressive forces are obtained for the elements with higher slenderness and when the greatest possible distance between sensors is adopted; moreover, larger values of the vertical load and very accurate displacement measurements are required. The method could be an efficacious in-situ technique, considering that safety inspections will become increasingly important in the near future, especially because improved material properties have allowed the design of space structures composed of beam-columns with higher slenderness.

Keywords: force identification, in-situ test, space structure, static test

Procedia PDF Downloads 245
17495 Semilocal Convergence of a Three Step Fifth Order Iterative Method under Hölder Continuity Condition in Banach Spaces

Authors: Ramandeep Behl, Prashanth Maroju, S. S. Motsa

Abstract:

In this paper, we study the semilocal convergence of a fifth order iterative method using recurrence relations, under the assumption that the first order Fréchet derivative satisfies the Hölder condition. We also calculate the R-order of convergence and provide some a priori error bounds. Based on this, we give the existence and uniqueness region of the solution for a nonlinear Hammerstein integral equation of the second kind.

Keywords: Hölder continuity condition, Fréchet derivative, fifth order convergence, recurrence relations

Procedia PDF Downloads 612
17494 Wave Interaction with Defects in Pressurized Composite Structures

Authors: R. K. Apalowo, D. Chronopoulos, V. Thierry

Abstract:

A wave finite element (WFE) and finite element (FE) based computational method is presented by which the dispersion properties, as well as the wave interaction coefficients, for a one-dimensional structural system can be predicted. The structural system is discretized as a set of waveguides connected by a coupling joint. Uniform nodes are ensured at the interfaces of the coupling element with each waveguide; then, equilibrium and continuity conditions are enforced at the interfaces. The wave propagation properties of each waveguide are calculated using the WFE method, and the coupling element is modelled using the FE method. The scattering of waves through the coupling element, on which damage is modelled, is determined by coupling the FE and WFE models. The central aim is to evaluate the effect of pressurization on the wave dispersion and scattering characteristics of the prestressed structural system compared to the unprestressed one. Numerical case studies are exhibited for two waveguides coupled through a coupling joint.

Keywords: finite element, prestressed structures, wave finite element, wave propagation properties, wave scattering coefficients

Procedia PDF Downloads 295
17493 Effect of Scalping on the Mechanical Behavior of Coarse Soils

Authors: Nadine Ali Hassan, Ngoc Son Nguyen, Didier Marot, Fateh Bendahmane

Abstract:

This paper presents a study of the effect of scalping methods on the mechanical properties of coarse soils by means of numerical simulations based on the discrete element method (DEM) and experimental triaxial tests. Two reconstitution methods are used, designated as the scalping method and the substitution method. Triaxial compression tests are first simulated on a granular material with a gap-graded particle size distribution by using the DEM. We study the effect of these reconstitution methods on the stress-strain behavior of coarse soils with different fine contents and with different ways of controlling the densities of the scalped and substituted materials. Experimental triaxial tests are performed on original mixtures of sands and gravels with different fine contents and on their corresponding scalped and substituted samples. Numerical results are qualitatively compared to experimental ones. Agreements and discrepancies between these results are also discussed.
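
A rough sketch of how the two reconstitution methods might be represented on a particle size distribution; the distribution, the cut-off size, and the substitution rule (adding the removed mass to the coarsest kept class) are assumptions for illustration.

```python
# Rough sketch of scalping vs. substitution applied to a particle size
# distribution stored as (sieve size, mass fraction) pairs. Illustrative only.
import numpy as np

sizes = np.array([0.1, 0.5, 2.0, 10.0, 40.0])          # mm
fractions = np.array([0.10, 0.15, 0.25, 0.30, 0.20])   # mass fractions, sum = 1
d_max = 10.0                                            # largest size kept in the lab sample

def scalp(sizes, fractions, d_max):
    """Scalping: remove oversize particles and renormalise the remainder."""
    keep = sizes <= d_max
    f = fractions[keep]
    return sizes[keep], f / f.sum()

def substitute(sizes, fractions, d_max):
    """Substitution: replace the oversize mass with the coarsest kept class."""
    keep = sizes <= d_max
    removed = fractions[~keep].sum()
    f = fractions[keep].copy()
    f[-1] += removed
    return sizes[keep], f

print(scalp(sizes, fractions, d_max))
print(substitute(sizes, fractions, d_max))
```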

Keywords: coarse soils, mechanical behavior, scalping, replacement, triaxial devices

Procedia PDF Downloads 207
17492 Dynamical Characteristics of Interaction between Water Droplet and Aerosol Particle in Dedusting Technology

Authors: Ding Jue, Li Jiahua, Lei Zhidi, Weng Peifen, Li Xiaowei

Abstract:

With the rapid development of modern industry, attention is increasingly paid to environmental pollution and the harm caused by industrial dust. On this basis, a numerical study of dedusting technology for industrial environments was conducted. Dynamic models of multicomponent particle collision and coagulation, breakage, and deposition are developed, and the interaction of water droplets and aerosol particles in a two-dimensional flow field was investigated by the Eulerian-Lagrangian method and the Multi-Monte Carlo method. The effects of droplet size, droplet speed, and flow field structure on scavenging efficiency were analyzed. The results show that, under the conditions considered, droplets of 30 μm have the best scavenging efficiency. At an initial droplet speed of 1 m/s, droplets and aerosol particles have more time to interact, which yields a better scavenging efficiency for the particles.

Keywords: water droplet, aerosol particle, collision and coagulation, Multi-Monte Carlo method

Procedia PDF Downloads 307
17491 Preparation of Regional Input-Output Table for Fars Province in 2011: GRIT Method

Authors: Maryam Akbarzadeh, F. Esmaeilzadeh, A. Poostvar, M. Manuchehri

Abstract:

Preparation of regional input-output tables by statistical methods entails high costs and long timescales, while estimates obtained by non-statistical methods have a low confidence coefficient. Therefore, integrated methods for this purpose are suggested by recent input-output studies. In this study, the GRIT method is first introduced as an appropriate integrated method for the preparation of the input-output table of Fars province; the input-output table is then prepared for Fars province using this method. The work is based on the national input-output table for 2001, with the necessary modifications for changes in price levels and for differences between regional trade and trade at the national level. Moreover, up-to-date statistics and the views of technical experts on the various economic sectors, along with the 33-sector input-output table, were used for 2011, followed by an investigation of the general structure of the province's economy based on the value added obtained from this table.
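
A highly simplified sketch of one GRIT-style step, regionalising national technical coefficients with simple location quotients (SLQ); the coefficient matrix and employment shares are invented, and the full GRIT procedure involves further superior-data insertion stages not shown here.

```python
# Simplified sketch of regionalising national technical coefficients with
# simple location quotients (SLQ). All numbers are invented for illustration.
import numpy as np

A_national = np.array([[0.10, 0.20],      # national technical coefficients
                       [0.05, 0.15]])
region_emp = np.array([0.30, 0.10])       # regional employment by sector
nation_emp = np.array([0.25, 0.20])       # national employment by sector

# SLQ_i = regional share of sector i / national share of sector i
slq = (region_emp / region_emp.sum()) / (nation_emp / nation_emp.sum())

# Scale down the rows of sectors that are under-represented regionally,
# reflecting that part of their inputs must be imported from outside the region
scaling = np.minimum(slq, 1.0)
A_regional = A_national * scaling[:, np.newaxis]
print(A_regional.round(3))
```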

Keywords: GRIT, input-output table, regional

Procedia PDF Downloads 260
17490 Symbolic Computation and Abundant Travelling Wave Solutions to Modified Burgers' Equation

Authors: Muhammad Younis

Abstract:

In this article, the novel (G′/G)-expansion method is successfully applied to construct abundant travelling wave solutions to the modified Burgers’ equation with the aid of symbolic computation. The method is reliable and useful, and gives more general exact travelling wave solutions than the existing methods. The obtained solutions take the form of hyperbolic, trigonometric, and rational functions, including solitary, singular, and periodic solutions, which have many potential applications in physical science and engineering. Some of these solutions are new and some have already been constructed. Additionally, the constraint conditions for the existence of the solutions are also listed.

Keywords: traveling wave solutions, NLPDE, computation, integrability

Procedia PDF Downloads 433
17489 A Method for Reduction of Association Rules in Data Mining

Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa

Abstract:

The use of association rule algorithms within data mining is recognized as being of great value for knowledge discovery in databases. Very often the number of rules generated is high, sometimes even in databases of small volume, so the analysis of the results can be hampered by this quantity. The purpose of this research is to present a method for reducing the quantity of rules generated by association algorithms. A computational algorithm was therefore developed using the Weka Application Programming Interface, which allows the method to be executed on different types of databases. After development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, where even the worst case presented a gain of more than 50%, considering support, confidence, and lift as measures. The study concluded that the proposed model is feasible and quite interesting, contributing to the analysis of the association rules generated by these algorithms.
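
A sketch of rule generation followed by threshold-based reduction; the paper itself used the Weka API, so the Python mlxtend library is used here only as a stand-in, and the transactions and thresholds are illustrative.

```python
# Sketch of rule generation and threshold-based reduction (support, confidence,
# lift). mlxtend stands in for the Weka API used in the paper; data is toy data.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

transactions = [["bread", "milk"], ["bread", "butter", "milk"],
                ["beer", "chips"], ["bread", "butter"], ["milk", "chips"]]

te = TransactionEncoder()
df = pd.DataFrame(te.fit(transactions).transform(transactions), columns=te.columns_)

itemsets = apriori(df, min_support=0.3, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.5)

# Reduction step: keep only rules that pass support, confidence and lift filters
reduced = rules[(rules["support"] >= 0.4) &
                (rules["confidence"] >= 0.7) &
                (rules["lift"] > 1.0)]
print(len(rules), "rules generated,", len(reduced), "kept after reduction")
```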

Keywords: data mining, association rules, rules reduction, artificial intelligence

Procedia PDF Downloads 161
17488 Automatic Method for Exudates and Hemorrhages Detection from Fundus Retinal Images

Authors: A. Biran, P. Sobhe Bidari, K. Raahemifar

Abstract:

Diabetic Retinopathy (DR) is an eye disease that leads to blindness. The earliest signs of DR are the appearance of red and yellow lesions on the retina called hemorrhages and exudates. Early diagnosis of DR prevents blindness; hence, many automated algorithms have been proposed to extract hemorrhages and exudates. In this paper, an automated algorithm is presented to extract hemorrhages and exudates separately from retinal fundus images using different image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering, and thresholding. Since the optic disc has the same color as the exudates, it is first localized and detected. The presented method has been tested on fundus images from the Structured Analysis of the Retina (STARE) and Digital Retinal Images for Vessel Extraction (DRIVE) databases using MATLAB code. The results show that the method is capable of detecting hard exudates and the highly probable soft exudates. It is also capable of detecting hemorrhages and distinguishing them from blood vessels.
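
Two steps of the pipeline above (CLAHE enhancement and intensity thresholding for bright and dark lesion candidates) can be sketched with OpenCV as follows; the file path and threshold values are assumptions, and the CHT and Gabor stages are omitted.

```python
# Condensed sketch of CLAHE enhancement and thresholding for lesion candidates.
# File path and thresholds are placeholders; CHT and Gabor stages are omitted.
import cv2

img = cv2.imread("fundus.jpg")                     # path is a placeholder
green = img[:, :, 1]                               # green channel shows lesions best

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(green)

# Bright lesions (exudate candidates) as very high-intensity pixels
_, exudate_mask = cv2.threshold(enhanced, 200, 255, cv2.THRESH_BINARY)

# Dark lesions (hemorrhage candidates) by inverse thresholding
_, dark_mask = cv2.threshold(enhanced, 60, 255, cv2.THRESH_BINARY_INV)

cv2.imwrite("exudate_candidates.png", exudate_mask)
cv2.imwrite("hemorrhage_candidates.png", dark_mask)
```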

Keywords: diabetic retinopathy, fundus, CHT, exudates, hemorrhages

Procedia PDF Downloads 272
17487 Developing a Hybrid Method to Diagnose and Predict Sports Related Concussions with Machine Learning

Authors: Melody Yin

Abstract:

Concussions affect a large number of adolescents; they make up as much as half of the diagnosed concussions in America. This research proposes a hybrid machine learning model based on the combination of human knowledge-based domains and computer-generated feature rankings to improve the accuracy of diagnosing sports-related concussion (SRC). Using a data set of symptoms collected on the sideline after SRC events, a symptom selection criteria method was developed by using Google AutoML's importance score function to identify the top 10 symptom features. In addition, symptom domains were introduced as another parameter, categorizing the symptoms into physical, cognitive, sleep, and emotional domains. The hybrid machine learning model was trained with a combination of the top 10 symptoms and the 4 domains. From the results, the hybrid model was the best performer for symptom resolution time prediction at the 2- and 4-week thresholds. This research is a proof-of-concept study in the use of domains along with machine learning to improve concussion prediction accuracy; it is also possible that the use of domains makes the model more efficient due to reduced training time. The achievement rests on data preprocessing and the hybrid criteria-selection method described above.
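
A sketch of the two-part feature construction described above, with a random forest standing in for Google AutoML's importance scores; the symptom names, domain assignments, and data are illustrative.

```python
# Sketch of hybrid feature construction: computer-generated importance scores
# (random forest as a stand-in for Google AutoML) plus knowledge-based symptom
# domains. Symptom names, domain assignments and data are illustrative.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

symptoms = ["headache", "dizziness", "nausea", "fatigue", "memory", "focus",
            "insomnia", "drowsiness", "irritability", "sadness", "balance", "noise"]
domains = {"headache": "physical", "dizziness": "physical", "nausea": "physical",
           "balance": "physical", "noise": "physical", "memory": "cognitive",
           "focus": "cognitive", "fatigue": "sleep", "insomnia": "sleep",
           "drowsiness": "sleep", "irritability": "emotional", "sadness": "emotional"}

rng = np.random.default_rng(1)
X = pd.DataFrame(rng.integers(0, 7, (200, len(symptoms))), columns=symptoms)
y = rng.integers(0, 2, 200)          # 1 = symptoms persisted past the threshold

rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X, y)
top10 = X.columns[np.argsort(rf.feature_importances_)[::-1][:10]]

# Domain features: summed severity per domain, appended to the top-10 symptoms
for d in set(domains.values()):
    X[d] = X[[s for s in symptoms if domains[s] == d]].sum(axis=1)

hybrid_features = list(top10) + sorted(set(domains.values()))
print(hybrid_features)
```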

Keywords: hybrid model, machine learning, sports related concussion, symptom resolution time

Procedia PDF Downloads 168
17486 An Auxiliary Technique for Coronary Heart Disease Prediction by Analyzing Electrocardiogram Based on ResNet and Bi-Long Short-Term Memory

Authors: Yang Zhang, Jian He

Abstract:

Heart disease is one of the leading causes of death in the world, and coronary heart disease (CHD) is one of the major heart diseases. The electrocardiogram (ECG) is widely used in the detection of heart diseases, but the traditional manual method of CHD prediction by ECG analysis requires substantial professional knowledge from doctors. This paper introduces a sliding window and the continuous wavelet transform (CWT) to transform ECG signals into images, and then ResNet and Bi-LSTM are introduced to build an ECG feature extraction network (namely ECGNet). Finally, an auxiliary system for coronary heart disease prediction was developed based on a modified ResNet18 and Bi-LSTM, and a public ECG dataset of CHD from MIMIC-III was used to train and test the system. The experimental results show that the accuracy of the method is 83%, and the F1-score is 83%. Compared with available methods for CHD prediction based on ECG, such as kNN, decision trees, and VGGNet, this method not only improves the prediction accuracy but also avoids the degradation phenomenon of deep learning networks.
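
A skeleton of a ResNet18 + bidirectional LSTM feature network of the kind described above, in PyTorch; the window count, hidden size, and two-class head are assumptions, and the CWT step that turns ECG windows into images is not shown.

```python
# Skeleton of a ResNet18 + Bi-LSTM network over a sequence of CWT images,
# one image per sliding window of the ECG. Sizes and head are assumptions.
import torch
import torch.nn as nn
from torchvision.models import resnet18

class ECGNetSketch(nn.Module):
    def __init__(self, hidden=128, num_classes=2):
        super().__init__()
        backbone = resnet18(weights=None)
        self.cnn = nn.Sequential(*list(backbone.children())[:-1])  # drop final fc
        self.lstm = nn.LSTM(input_size=512, hidden_size=hidden,
                            batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):                # x: (batch, windows, 3, 224, 224)
        b, w = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).flatten(1)   # (b*w, 512)
        feats = feats.view(b, w, -1)                   # sequence of window features
        out, _ = self.lstm(feats)
        return self.fc(out[:, -1])                     # logits for CHD / non-CHD

model = ECGNetSketch()
dummy = torch.randn(2, 8, 3, 224, 224)   # 2 records, 8 sliding windows each
print(model(dummy).shape)                # torch.Size([2, 2])
```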

Keywords: Bi-LSTM, CHD, ECG, ResNet, sliding window

Procedia PDF Downloads 89
17485 Corrosion Protection of Structural Steel by Surfactant Containing Reagents

Authors: D. Erdenechimeg, T. Bujinlkham, N. Erdenepurev

Abstract:

The anti-corrosion performance of fatty-acid-coated mild steel samples is studied. Samples of structural steel were coated with collector reagents deposited from a surfactant in ethanol solution and overcoated with an epoxy barrier paint. The quantitative corrosion rate was determined by the linear polarization resistance method using a biopotentiostat/galvanostat 400. The coating morphology was determined by scanning electron microscopy, and the hydrophobicity imparted to the steel surface by the surfactant was tested. The main component of the samples, iron, was determined by a chemical method, and the other metal contents were determined by Inductively Coupled Plasma-Optical Emission Spectrometry (ICP-OES). Prior to measuring the corrosion rate, mechanical and chemical treatments were performed to prepare the test specimens. By overcoating the metal samples with the epoxy barrier paint after exposing them to the surfactant, the corrosion rate can be reduced by 34-35 µm/year.
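
A sketch of how a linear polarization resistance measurement is converted into a corrosion rate via the Stern-Geary relation and an ASTM G102-style unit conversion; the polarization resistance and Tafel slopes below are illustrative, not the measured values.

```python
# Sketch of the linear polarization resistance (LPR) calculation: Stern-Geary
# relation followed by conversion to a penetration rate. Values are illustrative.
Rp = 2500.0          # polarization resistance, ohm*cm^2 (assumed)
ba, bc = 0.12, 0.12  # anodic/cathodic Tafel slopes, V/decade (assumed)
B = (ba * bc) / (2.303 * (ba + bc))          # Stern-Geary constant, V

i_corr = B / Rp                              # corrosion current density, A/cm^2
i_corr_uA = i_corr * 1e6                     # uA/cm^2

EW, density = 27.92, 7.87                    # equivalent weight and g/cm^3 for iron
rate_mm_per_year = 3.27e-3 * i_corr_uA * EW / density
print(f"i_corr = {i_corr_uA:.2f} uA/cm^2, rate = {rate_mm_per_year*1000:.1f} um/year")
```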

Keywords: corrosion, linear polarization resistance, coating, surfactant

Procedia PDF Downloads 99
17484 An Analysis of Different Essential Components of Flight Plan Operations at Low Altitude

Authors: Apisit Nawapanpong, Natthapat Boonjerm

Abstract:

This project aims to analyze and identify the flight plan of low-altitude aviation in Thailand and other countries. The development of UAV technology has driven innovation and revolution in the aviation industry, including new modes of passenger and freight transportation, and it has also affected other industries widely. At present, this technology is being developed rapidly and tested all over the world to make it as efficient as possible, and it is likely to grow further. However, no flight plan for low-altitude operation has been published by the government; compared with high-altitude aviation with manned aircraft, many factors are unique, whether mission, operation, altitude range, or airspace restrictions. Major problems arose in making the essential components of low-altitude operation practical and tangible, so the main focus of this project is to analyze the components of low-altitude operations conducted up to altitudes of 400 ft (120 meters) above ground level with reference to the terrain, for example, air traffic management, classification of aircraft, basic necessities and safety, and control areas. This research confirms the theory through qualitative and quantitative research combined with theoretical modeling and a regulatory framework, and by gaining insights from various positions in the aviation industry, including aviation experts, government officials, air traffic controllers, pilots, and airline operators, to identify the critical essential components of low-altitude flight operation. The project analysis uses scientific and statistical computer programs to show that the results are consistent with the theory; the essential components identified can benefit the regulation of flight plans for low-altitude operation and can be developed further in future studies and research in the aviation industry.

Keywords: low-altitude aviation, UAV technology, flight plan, air traffic management, safety measures

Procedia PDF Downloads 68
17483 Structural Evaluation of Airfield Pavement Using Finite Element Analysis Based Methodology

Authors: Richard Ji

Abstract:

Nondestructive deflection testing has been widely accepted as a cost-effective tool for evaluating the structural condition of airfield pavements. Backcalculation of pavement layer moduli can be used to characterize the existing pavement condition in order to compute its load-bearing capacity. This paper presents an improved best-fit backcalculation methodology based on deflection predictions obtained using the finite element method (FEM). The best-fit approach is based on minimizing the squared error between falling weight deflectometer (FWD) measured deflections and FEM-predicted deflections. Concrete elastic modulus and modulus of subgrade reaction were then back-calculated using Heavy Weight Deflectometer (HWD) deflections collected at the National Airport Pavement Testing Facility (NAPTF) test site. The approach is an alternative, more versatile method for considering concrete slab geometry and HWD testing locations compared to the methods currently available.
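
The best-fit loop can be sketched as follows, with the FEM response replaced by a hypothetical placeholder function and an invented deflection basin; only the minimisation structure reflects the methodology above.

```python
# Sketch of best-fit backcalculation: minimise the squared error between
# measured deflections and model-predicted deflections. The FEM response is
# replaced by a placeholder function; the deflection basin is invented.
import numpy as np
from scipy.optimize import minimize

offsets = np.array([0.0, 0.3, 0.6, 0.9, 1.2, 1.5])          # sensor offsets, m
measured = np.array([310., 260., 210., 165., 130., 100.])   # deflections, microns

def predicted_deflections(E_concrete, k_subgrade, offsets):
    """Placeholder for the FEM deflection prediction (not a real FE model)."""
    return 1.0e8 / (E_concrete * np.sqrt(k_subgrade)) * np.exp(-offsets / 1.2)

def squared_error(params):
    E, k = params
    return np.sum((predicted_deflections(E, k, offsets) - measured) ** 2)

result = minimize(squared_error, x0=[30000.0, 80.0],        # MPa, MPa/m (guesses)
                  bounds=[(10000, 60000), (20, 300)], method="L-BFGS-B")
E_back, k_back = result.x
print(f"backcalculated E = {E_back:.0f} MPa, k = {k_back:.1f} MPa/m")
```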

Keywords: nondestructive testing, pavement moduli backcalculation, finite element method, concrete pavements

Procedia PDF Downloads 166
17482 Statistical Description of Counterpoise Effective Length Based on Regressive Formulas

Authors: Petar Sarajcev, Josip Vasilj, Damir Jakus

Abstract:

This paper presents a novel statistical description of the counterpoise effective length due to lightning surges, where the (impulse) effective length has been obtained by means of regressive formulas applied to transient simulation results. The effective length is described in terms of a statistical distribution function, from which the median, mean, variance, and other parameters of interest can be readily obtained. The influence of lightning current amplitude, lightning front duration, and soil resistivity on the effective length has been accounted for, assuming the statistical nature of these parameters. A method for determining the optimal counterpoise length, in terms of the statistical impulse effective length, is also presented; it is based on estimating the number of dangerous events associated with lightning strikes. The proposed statistical description and the associated method provide valuable information which could aid the design engineer in optimising the physical lengths of counterpoises in different grounding arrangements and soil resistivity situations.
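
A sketch of the Monte Carlo idea described above: lightning parameters are sampled from assumed distributions and passed to a regressive formula for the impulse effective length; the formula used here is a placeholder expression, not the paper's fitted formula.

```python
# Monte Carlo sketch: sample lightning current amplitude, front duration and
# soil resistivity, evaluate a placeholder "regressive formula" for effective
# length, and summarise it statistically. All distributions and the formula
# are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
I = rng.lognormal(mean=np.log(31.1), sigma=0.48, size=n)   # current amplitude, kA
tf = rng.lognormal(mean=np.log(3.8), sigma=0.55, size=n)   # front duration, us
rho = rng.uniform(100, 1000, size=n)                       # soil resistivity, ohm*m

def effective_length(I, tf, rho):
    """Hypothetical regressive formula: grows with resistivity and front time."""
    return 0.6 * np.sqrt(rho * tf) * (1 + 0.002 * I)

L_eff = effective_length(I, tf, rho)
print(f"median = {np.median(L_eff):.1f} m, mean = {L_eff.mean():.1f} m, "
      f"95th percentile = {np.percentile(L_eff, 95):.1f} m")
```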

Keywords: counterpoise, grounding conductor, effective length, lightning, Monte Carlo method, statistical distribution

Procedia PDF Downloads 426
17481 Design, Synthesis and Pharmacological Investigation of Novel 2-Phenazinamine Derivatives as a Mutant BCR-ABL (T315I) Inhibitor

Authors: Gajanan M. Sonwane

Abstract:

Nowadays, the entire pharmaceutical industry is facing the challenge of increasing efficiency and innovation. The major hurdles are the growing cost of research and development and a concurrently stagnating number of new chemical entities (NCEs). Hence, the challenge is to select the most druggable targets and to search for the corresponding drug-like compounds, which must also possess the specific pharmacokinetic and toxicological properties that allow them to be developed as drugs. The present work develops new anticancer heterocycles by using molecular modeling techniques. Heterocycles synthesized through such a methodology are more effective because the various physicochemical parameters have already been studied and the structure has been optimized for its best fit in the receptor. Hence, on the basis of the literature survey and considering the need to develop newer anticancer agents, new phenazinamine derivatives were designed by subjecting the nucleus to molecular modeling, viz., GQSAR analysis and docking studies. Simultaneously, these designed derivatives were subjected to in silico prediction of biological activity through PASS studies and then to in silico toxicity risk assessment. In the PASS studies, all the derivatives exhibited a good spectrum of biological activities, confirming their anticancer potential. The toxicity risk assessment revealed that all the derivatives obey Lipinski’s rule. Among the series, compounds 4c, 5b, and 6c were found to possess logP and drug-likeness values comparable with the standard imatinib (used for the anticancer activity studies) and also with the standard drug methotrexate (used for the antimitotic activity studies). One of the most notable mutations is the threonine-to-isoleucine mutation at codon 315 (T315I), which is known to be resistant to all currently available TKIs. An enzyme assay is planned to confirm target-selective activity.

Keywords: drug design, tyrosine kinases, anticancer, Phenazinamine

Procedia PDF Downloads 116
17480 A Deep Learning Model with Greedy Layer-Wise Pretraining Approach for Optimal Syngas Production by Dry Reforming of Methane

Authors: Maryam Zarabian, Hector Guzman, Pedro Pereira-Almao, Abraham Fapojuwo

Abstract:

Dry reforming of methane (DRM) has sparked significant industrial and scientific interest, not only as a viable alternative for addressing the environmental concerns of two main contributors to the greenhouse effect, i.e., carbon dioxide (CO₂) and methane (CH₄), but also because it produces syngas, i.e., a mixture of hydrogen (H₂) and carbon monoxide (CO) utilized by a wide range of downstream processes as a feedstock for other chemical products. In this study, we develop an AI-enabled syngas production model to tackle the problem of achieving an equivalent H₂/CO ratio [1:1] with respect to the most efficient conversion. Firstly, the unsupervised density-based spatial clustering of applications with noise (DBSCAN) algorithm removes outlier data points from the original experimental dataset. Then, random forest (RF) and deep neural network (DNN) models employ the error-free dataset to predict the DRM results. DNN models inherently cannot obtain accurate predictions without a huge dataset. To cope with this limitation, we employ approaches that reuse pre-trained layers, such as transfer learning and greedy layer-wise pretraining. Compared to the other deep models (i.e., the pure deep model and the transferred deep model), the greedy layer-wise pre-trained deep model provides the most accurate predictions, with accuracy similar to the RF model: R² values of 1.00, 0.999, 0.999, 0.999, 0.999, and 0.999 for the total outlet flow, H₂/CO ratio, H₂ yield, CO yield, CH₄ conversion, and CO₂ conversion outputs, respectively.
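
A minimal sketch of greedy layer-wise pretraining with stacked autoencoders in Keras: each hidden layer is first trained to reconstruct its input, then the pretrained layers are stacked and fine-tuned on the regression target; the layer sizes and synthetic data are assumptions, not the paper's setup.

```python
# Minimal sketch of greedy layer-wise pretraining with stacked autoencoders.
# Layer sizes and the synthetic data are assumptions, not the paper's setup.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

rng = np.random.default_rng(0)
X = rng.random((500, 6)).astype("float32")     # e.g. temperature, flow rates, ...
y = rng.random((500, 1)).astype("float32")     # e.g. H2/CO ratio (placeholder)

sizes, inputs, pretrained = [32, 16], X, []
for width in sizes:
    ae = keras.Sequential([layers.Dense(width, activation="relu"),
                           layers.Dense(inputs.shape[1])])
    ae.compile(optimizer="adam", loss="mse")
    ae.fit(inputs, inputs, epochs=20, verbose=0)      # unsupervised reconstruction
    encoder = ae.layers[0]
    pretrained.append(encoder)
    inputs = encoder(inputs).numpy()                  # feed encoded data forward

# Stack the pretrained encoders and fine-tune with a regression head
model = keras.Sequential(pretrained + [layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, verbose=0)
print(model.evaluate(X, y, verbose=0))
```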

Keywords: artificial intelligence, dry reforming of methane, artificial neural network, deep learning, machine learning, transfer learning, greedy layer-wise pretraining

Procedia PDF Downloads 86
17479 Current Characteristic of Water Electrolysis to Produce Hydrogen, Alkaline, and Acid Water

Authors: Ekki Kurniawan, Yusuf Nur Jayanto, Erna Sugesti, Efri Suhartono, Agus Ganda Permana, Jaspar Hasudungan, Jangkung Raharjo, Rintis Manfaati

Abstract:

The purpose of this research is to study the current characteristics of the electrolysis of mineral water to produce hydrogen, alkaline water, and acid water. Alkaline and hydrogen water are believed to have health benefits: alkaline water containing hydrogen can act as an anti-oxidant that captures free radicals, which strengthens the immune system. In Indonesia, there are two existing types of alkaline water producing equipment, but their installation is complicated and their price is relatively high, and the electrolysis process is slow (6-8 hours) since they are locally made using a 311 VDC full-bridge rectifier power supply. This paper discusses how to make hydrogen and alkaline water with a simple portable mineral water ionizer, an electrolysis device that is easy to carry and able to separate the ions of mineral water into acidic and alkaline water. Under an electric field, positive ions are attracted to the cathode, while negative ions are attracted to the anode. The equivalent circuit can be depicted as an RLC transient circuit, with a diode ensuring that the electrolytic current is a direct current and a switch S dividing the process into switching times t1, t2, and t3. In the first stage, up to t1, the electrolytic current increases exponentially, as does the charging current of an inductor (L); the molecules in the water, whose dipole orientations are initially random, align with the direction of the electric field. In the second stage, up to t2, the electrolytic current decreases exponentially, like the charging current of a capacitor (C). In the third stage, starting at t3, the current tends to a constant value, as with the current flowing through a resistor (R).
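
An illustrative simulation of the three-stage transient picture above, using an inductor feeding a parallel RC load as a stand-in for the equivalent circuit (the paper's actual circuit parameters are not given here); component values are assumptions.

```python
# Illustrative transient of an inductor feeding a parallel RC load: the source
# current rises (inductor-like), decays after an overshoot (capacitor-like),
# and settles at the resistive steady state. Component values are assumptions.
from scipy.integrate import solve_ivp

R, L, C, V = 50.0, 2.0, 1e-3, 12.0     # ohm, henry, farad, volts (assumed)

def circuit(t, y):
    i_L, v_C = y                       # inductor current, capacitor voltage
    di = (V - v_C) / L
    dv = (i_L - v_C / R) / C
    return [di, dv]

sol = solve_ivp(circuit, (0.0, 2.0), [0.0, 0.0], max_step=1e-3)
i_source = sol.y[0]
print(f"peak current {i_source.max():.3f} A, "
      f"steady-state {i_source[-1]:.3f} A (approx. V/R = {V/R:.3f} A)")
```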

Keywords: current electrolysis, mineral water, ions, alkaline and acid waters, inductor, capacitor, resistor

Procedia PDF Downloads 112
17478 Improving Topic Quality of Scripts by Using Scene Similarity Based Word Co-Occurrence

Authors: Yunseok Noh, Chang-Uk Kwak, Sun-Joong Kim, Seong-Bae Park

Abstract:

Scripts are one of the basic text resources for understanding broadcast content. Since broadcast media wields a great deal of influence over the public, tools for understanding broadcast content are increasingly required. Topic modeling is a method for summarizing broadcast content from its scripts. Generally, scripts describe content through directions and speeches, and they also provide scene segments that can be seen as semantic units; a script can therefore be topic-modeled by treating each scene segment as a document. Because scripts consist mainly of speech, however, relatively few co-occurrences among words are observed within scene segments, which inevitably degrades the quality of topics learned by statistical methods. To tackle this problem, we propose a method of learning with additional word co-occurrence information obtained using scene similarities. The main idea for improving topic quality is that knowing that two or more texts are topically related is useful for learning high-quality topics, and in turn, high-quality topics give more accurate information about whether two texts are related. In this paper, we regard two scene segments as related if their topical similarity is high enough, and we consider words to co-occur if they appear together in topically related scene segments. In the experiments, we show that the proposed method generates higher-quality topics from Korean drama scripts than the baselines.
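
The co-occurrence augmentation idea can be sketched as follows: scene segments whose similarity exceeds a threshold are pooled into pseudo-documents before topic modelling; the scenes, the TF-IDF similarity measure, and the threshold are illustrative simplifications of the proposed method.

```python
# Sketch of co-occurrence augmentation: similar scene segments are pooled into
# pseudo-documents before topic modelling. Scenes and threshold are illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

scenes = ["the detective questions the suspect at the station",
          "the suspect denies everything to the detective",
          "two friends share lunch and talk about the wedding",
          "wedding preparations continue at the bride's house"]

tfidf = TfidfVectorizer().fit_transform(scenes)
sim = cosine_similarity(tfidf)

# Pool each scene with its topically related neighbours (threshold is assumed)
augmented = []
for i, text in enumerate(scenes):
    related = [scenes[j] for j in range(len(scenes)) if i != j and sim[i, j] > 0.2]
    augmented.append(" ".join([text] + related))

counts = CountVectorizer().fit_transform(augmented)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print(lda.components_.shape)    # (topics, vocabulary)
```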

Keywords: broadcasting contents, scripts, text similarity, topic model

Procedia PDF Downloads 318
17477 A Comparative Study of Indoor Radon Concentrations between Dwellings and Workplaces in the Ko Samui District, Surat Thani Province, Southern Thailand

Authors: Kanokkan Titipornpun, Tripob Bhongsuwan, Jan Gimsa

Abstract:

The Ko Samui district of Surat Thani province is located in an area with high amounts of equivalent uranium in the ground surface, which is the source of radon. Our research in the Ko Samui district aimed at comparing indoor radon concentrations between dwellings and workplaces. Measurements of indoor radon concentrations were carried out in 46 dwellings and 127 workplaces, using CR-39 alpha-track detectors in closed cups. A total of 173 detectors were distributed in 7 sub-districts. The detectors were placed in the bedrooms of dwellings and the workrooms of workplaces, and all were exposed to airborne radon for 90 days. After exposure, the alpha tracks were made visible by chemical etching before being counted manually under an optical microscope. The track densities were assumed to be correlated with the radon concentration levels. We found that the radon concentrations could be well described by a log-normal distribution. Most concentrations (37%) were in the range between 16 and 30 Bq.m-3. The radon concentrations in dwellings and workplaces varied from a minimum of 11 Bq.m-3 to a maximum of 305 Bq.m-3; the minimum was found in a workplace and the maximum in a dwelling. Only in four samples (3%) were the indoor radon concentrations higher than the reference level recommended by the WHO (100 Bq.m-3). The overall geometric mean in the surveyed area was 32.6±1.65 Bq.m-3, which is lower than the worldwide average (39 Bq.m-3). The statistical comparison of geometric mean indoor radon concentrations showed that the geometric mean in dwellings (46.0±1.55 Bq.m-3) was significantly higher than in workplaces (28.8±1.58 Bq.m-3) at the 0.05 level. Moreover, the majority of the bedrooms in dwellings had a closed atmosphere, resulting in poorer ventilation than in most of the workplaces, which had air flow through open doors and windows during the daytime. We consider this to be the main reason for the higher geometric mean indoor radon concentration in dwellings compared to workplaces.
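
A sketch of the statistical comparison reported above: geometric means of log-normally distributed concentrations and a significance test on log-transformed values; the concentration lists are illustrative, not the surveyed data.

```python
# Sketch of the statistical comparison: geometric means and a test on the
# log-transformed values (appropriate for log-normal data). Data is illustrative.
import numpy as np
from scipy import stats

dwellings = np.array([25, 40, 60, 35, 80, 55, 45, 90, 30, 70])     # Bq/m^3
workplaces = np.array([15, 22, 30, 18, 40, 25, 28, 35, 20, 26])    # Bq/m^3

gm_dwell = stats.gmean(dwellings)
gm_work = stats.gmean(workplaces)

# Because the data are approximately log-normal, compare the log-values
t_stat, p_value = stats.ttest_ind(np.log(dwellings), np.log(workplaces))
print(f"GM dwellings = {gm_dwell:.1f} Bq/m^3, GM workplaces = {gm_work:.1f} Bq/m^3, "
      f"p = {p_value:.3f}")
```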

Keywords: CR-39 detector, indoor radon, radon in dwelling, radon in workplace

Procedia PDF Downloads 280
17476 Analysis of the Inverse Kinematics for 5 DOF Robot Arm Using D-H Parameters

Authors: Apurva Patil, Maithilee Kulkarni, Ashay Aswale

Abstract:

This paper proposes an algorithm to develop the kinematic model of a 5 DOF robot arm. The formulation of the problem is based on finding the D-H parameters of the arm. A brute-force iterative method is employed to solve the system of nonlinear equations. The focus of the paper is on obtaining accurate solutions by reducing the root mean square error. The results obtained are used to grip objects, and the trajectories followed by the end effector for the required workspace coordinates are plotted. The methodology used here can be applied to any other kinematic chain of up to six DOF.
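
A sketch of the two ingredients described above: the D-H transformation matrix and a simple brute-force, coordinate-descent style search for joint angles reaching a target position; the D-H table is a made-up 5 DOF arm, not the paper's manipulator.

```python
# D-H forward kinematics plus a brute-force iterative search for joint angles
# reaching a target position. The D-H table is a made-up 5 DOF arm.
import numpy as np

# Each row: (a, alpha, d, theta_offset) for one joint
DH = [(0.0,  np.pi/2, 0.10, 0.0),
      (0.25, 0.0,     0.00, 0.0),
      (0.20, 0.0,     0.00, 0.0),
      (0.0,  np.pi/2, 0.00, 0.0),
      (0.0,  0.0,     0.08, 0.0)]

def dh_matrix(a, alpha, d, theta):
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st*ca,  st*sa, a*ct],
                     [st,  ct*ca, -ct*sa, a*st],
                     [0.0,    sa,     ca,    d],
                     [0.0,   0.0,    0.0,  1.0]])

def forward(thetas):
    T = np.eye(4)
    for (a, alpha, d, off), th in zip(DH, thetas):
        T = T @ dh_matrix(a, alpha, d, th + off)
    return T[:3, 3]                      # end-effector position

def iterative_ik(target, steps=2000, step=0.01):
    q = np.zeros(5)
    for _ in range(steps):
        err = np.linalg.norm(forward(q) - target)
        for j in range(5):               # try nudging each joint both ways
            for dq in (step, -step):
                trial = q.copy()
                trial[j] += dq
                e = np.linalg.norm(forward(trial) - target)
                if e < err:
                    q, err = trial, e
    return q, err

q, err = iterative_ik(np.array([0.25, 0.10, 0.20]))
print("joint angles (rad):", q.round(3), "residual error (m):", round(err, 4))
```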

Keywords: 5 DOF robot arm, D-H parameters, inverse kinematics, iterative method, trajectories

Procedia PDF Downloads 202
17475 Microwave Accelerated Simultaneous Distillation –Extraction: Preparative Recovery of Volatiles from Food Products

Authors: Ferhat Mohamed, Boukhatem Mohamed Nadjib, Chemat Farid

Abstract:

Simultaneous distillation–extraction (SDE) is routinely used by analysts for sample preparation prior to gas chromatography analysis. In this work, a new process design and operation for microwave-assisted simultaneous distillation–solvent extraction (MW-SDE) of volatile compounds was developed. Using the proposed method, isolation, extraction, and concentration of volatile compounds can be carried out in a single step. To demonstrate its feasibility, MW-SDE was compared with the conventional technique, SDE, for gas chromatography-mass spectrometry (GC-MS) analysis of volatile compounds in fresh orange juice and a dry spice, “carvi” seeds. The SDE method required a long time (3 h) to isolate the volatile compounds and a large amount of organic solvent (200 mL of hexane) for further extraction, while MW-SDE needed little time (only 30 min) to prepare the sample and less organic solvent (10 mL of hexane). These results show that MW-SDE–GC-MS is a simple, rapid, and low-solvent method for the determination of volatile compounds from aromatic plants.

Keywords: essential oil, extraction, distillation, carvi seeds

Procedia PDF Downloads 560
17474 Development of Ecofriendly Ionic Liquid Modified Reverse Phase Liquid Chromatography Method for Simultaneous Determination of Anti-Hyperlipidemic Drugs

Authors: Hassan M. Albishri, Fatimah Al-Shehri, Deia Abd El-Hady

Abstract:

Among analytical techniques, reverse phase liquid chromatography (RPLC) is currently used throughout the pharmaceutical industry. Ecofriendly analytical chemistry offers the advantages of decreasing the environmental impact and increasing operator safety, which makes it a topic of industrial interest. Recently, ionic liquids have been successfully used to reduce or eliminate conventional toxic organic solvents. In the current work, a simple and ecofriendly ionic liquid modified RPLC (IL-RPLC) method has been developed for the first time and compared with RPLC under acidic and neutral mobile phase conditions for the simultaneous determination of atorvastatin-calcium, rosuvastatin, and simvastatin. Several effective chromatographic parameters were varied in a systematic way. Adequate results were achieved by mixing ionic liquids with ethanol as a mobile phase under neutral conditions at a 1 mL/min flow rate on a C18 column. The developed IL-RPLC method has been validated for the quantitative determination of the drugs in pharmaceutical formulations. The method showed excellent linearity for the analytes over a wide range of concentrations, with acceptable precision and accuracy. The IL-RPLC technique could have broad applications, particularly under neutral conditions, for simple and greener (bio)analytical determination of pharmaceuticals.

Keywords: ionic liquid, RPLC, anti-hyperlipidemic drugs, ecofriendly

Procedia PDF Downloads 256
17473 Received Signal Strength Indicator Based Localization of Bluetooth Devices Using Trilateration: An Improved Method for the Visually Impaired People

Authors: Muhammad Irfan Aziz, Thomas Owens, Uzair Khaleeq uz Zaman

Abstract:

Instantaneous and spatial localization for visually impaired people in dynamically changing environments with unexpected hazards and obstacles is the most demanding and challenging issue faced by navigation systems today. Since Bluetooth cannot utilize techniques like Time Difference of Arrival (TDOA) and Time of Arrival (TOA), it uses the received signal strength indicator (RSSI) to measure Received Signal Strength (RSS). Measurements using RSSI can be improved significantly by improving the existing RSSI-related methodologies. Therefore, the current paper proposes an improved trilateration-based method for the localization of Bluetooth devices for visually impaired people. To validate the method, class 2 Bluetooth devices were used along with purpose-built software. Experiments were then conducted to obtain surface plots that showed the signal interference and other environmental effects. The results show the surface plots for all Bluetooth modules used, with strong and weak points depicted by color codes in red, yellow, and blue. It was concluded that the suggested improved method of measuring RSS using trilateration not only measures signal strength effectively but also highlights how the signal strength can be influenced by atmospheric conditions such as noise, reflections, etc.
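
The two steps implied above can be sketched as follows: converting RSSI to distance with a log-distance path-loss model and then trilaterating by least squares; the beacon positions, RSSI readings, and propagation constants are illustrative.

```python
# Sketch of RSSI-based trilateration: a log-distance path-loss model converts
# RSSI to distance, then a linearised least-squares solve gives the position.
# Beacon positions, RSSI readings and propagation constants are illustrative.
import numpy as np

def rssi_to_distance(rssi, rssi_at_1m=-59.0, n=2.2):
    """Log-distance path loss: RSSI = RSSI(1 m) - 10*n*log10(d)."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * n))

beacons = np.array([[0.0, 0.0], [6.0, 0.0], [0.0, 6.0]])    # known positions, m
rssi = np.array([-65.0, -72.0, -70.0])                       # measured RSSI, dBm
d = rssi_to_distance(rssi)

# Linearise the circle equations (x-xi)^2 + (y-yi)^2 = di^2 against beacon 0
A = 2 * (beacons[1:] - beacons[0])
b = (d[0] ** 2 - d[1:] ** 2
     + np.sum(beacons[1:] ** 2, axis=1) - np.sum(beacons[0] ** 2))
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position (m):", position.round(2), "distances (m):", d.round(2))
```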

Keywords: Bluetooth, indoor/outdoor localization, received signal strength indicator, visually impaired

Procedia PDF Downloads 134
17472 Antiproliferative Effect of Polyphenols from Crocus sativus L. Leaves on Human Colon Adenocarcinoma Cells (Caco-2)

Authors: Gonzalo Ortiz de Elguea-Culebras, Raúl Sánchez-Vioque, Adela Mena-Morales, Manuel Alaiz, Enrique Melero-Bravo, Esteban García-Romero, Javier Vioque, Lourdes Marchante-Cuevas, Julio Girón-Calle

Abstract:

Saffron (Crocus sativus L.) is a crop highly valued for the manufacture of the spice, which consists of the dried stigmas of the flowers. This contrasts with other, underutilized parts of the saffron plant such as the leaves, which represent abundant biomass whose use might help to enhance the sustainability of the saffron crop. Saffron leaves contain significant amounts of phenolic compounds (7.8 g of gallic acid equivalents per 100 g of extract), which makes them very promising for exploring novel uses of saffron leaves. Given that phenolic compounds act on numerous cancer-related biological pathways, we have investigated the in vitro antiproliferative effect of saffron leaf polyphenols against human colon adenocarcinoma cells (Caco-2). Polyphenols were extracted from the leaves with 70% ethanol, defatted with hexane, and purified by solid phase extraction using C18 silica gel and then silica gel 60. Analysis of the polyphenols was performed by HPLC-ESI-MS. Di-, tri-, and tetrahexosides of quercetin, kaempferol, and isorhamnetin, as well as C-hexosides such as isoorientin and vitexin, were tentatively identified. The polyphenols strongly inhibited the proliferation of Caco-2 cells, which is consistent with model studies in which several of the polyphenols identified in saffron leaves have demonstrated their potential as chemopreventive agents in cancer. Given the low profitability that saffron leaf currently represents, we consider these results very encouraging, and this by-product deserves further investigation as a potential source of active molecules against colorectal cancer.

Keywords: saffron leaves, agricultural by-products, polyphenols, antiproliferative effect, human colon adenocarcinoma cells

Procedia PDF Downloads 94