Search results for: sample weights

1359 Weight-Based Query Optimization System Using Buffer

Authors: Kashif Irfan, Fahad Shahbaz Khan, Tehseen Zia, M. A. Anwar

Abstract:

Fast data retrieval is a basic requirement of any database application. This paper introduces a buffer-based query optimization technique in which queries are assigned weights according to their number of executions in a query bank. These queries and their optimized execution plans are loaded into the buffer at the start of the database application. For every query, the system searches for a match in the buffer and executes the stored plan without creating a new one.
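A minimal sketch of the weighting and buffering idea, assuming an in-memory dictionary as the buffer; the class and method names (PlanBuffer, record_execution, preload, lookup) are illustrative, not taken from the paper:

```python
from collections import defaultdict

class PlanBuffer:
    """Illustrative buffer caching optimized plans for the most frequently executed queries."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.plans = {}                  # query text -> cached execution plan
        self.weights = defaultdict(int)  # query text -> execution count (the query's weight)

    def record_execution(self, query):
        # Weight manager: every execution of a query in the query bank raises its weight.
        self.weights[query] += 1

    def preload(self, query_bank, optimize):
        # At application start, plan and load the heaviest queries into the buffer.
        heaviest = sorted(query_bank, key=lambda q: self.weights[q], reverse=True)
        for q in heaviest[: self.capacity]:
            self.plans[q] = optimize(q)

    def lookup(self, query):
        # Query matcher: on a hit, the cached plan is executed and no new plan is created.
        return self.plans.get(query)
```

On a buffer hit the optimizer is bypassed entirely; only queries missing from the buffer pay the planning cost.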

Keywords: Query Bank, Query Matcher, Weight Manager.

1358 Physical-Chemical Surface Characterization of Lake Nasser Sediments

Authors: Yousra M. Zakaria Helmy, Edward H. Smith

Abstract:

Lake Nasser is one of the largest reservoirs in the world. Over 120 million metric tons of sediments are deposited in its dead storage zone every year. The main objective of the present work was to determine the physical and chemical characteristics of Lake Nasser sediments. The sample had a relatively low surface area of 2.9 m²/g, which increased more than 3-fold upon chemical activation. The main chemical elements of the raw sediments were C, O and Si with some traces of Al, Fe and Ca. The organic functional groups for the tested sample included O-H, C=C, C-H and C-O, with indications of Si-O and other metal-C and/or metal-O bonds normally associated with clayey materials. Potentiometric titration of the sample in different ionic strength backgrounds revealed an alkaline material with very strong positive surface charge at pH values just a little less than the pH of zero charge, which is ~9. Surface interactions of the sediments with the background electrolyte were significant. An advanced surface complexation model was able to capture these effects, employing a single-site approach to represent protolysis reactions in aqueous solution, and to determine the significant surface species in the pH range of environmental interest.

Keywords: Lake Nasser, sediments, surface characterization

1357 Quantitative Determination of Trace Elements in Some Oriental Herb Products

Authors: Nguyen Thi Kim Dzung, Pham Ngoc Khai, Rainer Ludwig

Abstract:

The quantitative determination of several trace elements (Cr, As, Se, Cd, Hg, Pb) present as inorganic impurities in some oriental herb products, such as Lingzhi Mushroom capsules and Philamin powder, using ICP-MS has been studied. Various instrumental parameters such as power, gas flow rate and sample depth, as well as the concentration of nitric acid and the thick background caused by high concentrations of possible interferences, were investigated for the determination of the above-mentioned elements, and the optimum working conditions for sample measurement on ICP-MS (Agilent-7500a) are reported. Appropriate isotope internal standards were also used to improve the accuracy of mercury determination. Optimal parameters for sample digestion were also investigated. The recovery of the analytical procedure was examined using a Certified Reference Material (IAEA-CRM 359). The recommended procedure was then applied to the quantitative determination of Cr, As, Se, Cd, Hg and Pb in Lingzhi Mushroom capsule and Philamine powder samples. The reproducibility of sample measurement (average values between 94 and 102%) and the uncertainty of the analytical data (less than 20%) are acceptable.

Keywords: Oriental herbal product, trace elements, ICP-MS, biochemistry, medical chemistry.

1356 On a Conjecture Regarding the Adam Optimizer

Authors: Mohamed Akrout, Douglas Tweed

Abstract:

The great success of deep learning relies on efficient optimizers, which are the algorithms that decide how to adjust network weights and biases based on gradient information. One of the most effective and widely used optimizers in recent years has been the method of adaptive moments, or Adam, but the mathematical reasons behind its effectiveness are still unclear. Attempts to analyse its behaviour have remained incomplete, in part because they hinge on a conjecture which has never been proven, regarding ratios of powers of the first and second moments of the gradient. Here we show that this conjecture is in fact false, but that a modified version of it is true, and can take its place in analyses of Adam.
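For reference, the standard Adam update (Kingma and Ba) keeps exponential moving averages of the gradient and of its square and scales the step by their bias-corrected ratio; the conjecture at issue concerns bounds on ratios of powers of these moment estimates:

```latex
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\,g_t, \qquad & v_t &= \beta_2 v_{t-1} + (1-\beta_2)\,g_t^2, \\
\hat{m}_t &= \frac{m_t}{1-\beta_1^t}, \qquad & \hat{v}_t &= \frac{v_t}{1-\beta_2^t}, \\
\theta_t &= \theta_{t-1} - \alpha\,\frac{\hat{m}_t}{\sqrt{\hat{v}_t}+\epsilon}. & &
\end{aligned}
```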

Keywords: Adam optimizer, Bock’s conjecture, stochastic optimization, average regret.

1355 The Role and Effectiveness of Audit Committee in Corporate Governance of Credit Institutions

Authors: Tina Vuko, Marija Maretić, Marko Čular

Abstract:

The aim of this study is to analyze the role and effectiveness of an internal corporate governance mechanism (the audit committee) on the performance of credit institutions in Croatia. In line with this objective, a sample of 78 credit institutions listed on the Zagreb Stock Exchange from 2007 to 2012 was collected and an efficiency index of the audit committee (EIAC) was created. Based on this sample and the EIAC, the conclusions are as follows: audit committees of credit institutions have medium efficiency according to the EIAC measurement; there is a significant difference in audit committee effectiveness over the observed period; there is no positive relationship between audit committee effectiveness and credit institution performance; and there is a significant difference between the level of audit committee effectiveness and audit firm type. Future research should include a larger number of elements in the EIAC and a larger sample covering all entities required to establish an audit committee.

Keywords: Corporate Governance, Audit Committee, Financial Institutions, Efficiency Index of Audit Committee.

1354 Searching the Efficient Frontier for the Coherent Covering Location Problem

Authors: Felipe Azocar Simonet, Luis Acosta Espejo

Abstract:

In this article, we approximate the efficient frontier for the bi-objective coherent covering location problem with two levels of hierarchy (CCLP). We present the mathematical formulation of the model used. Supported and unsupported efficient solutions are obtained by solving the bi-objective combinatorial problem through the weighting method using a Lagrangian heuristic. Subsequently, the results are validated through DEA analysis with the GEM index (global efficiency measurement).
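A sketch of the weighting method referred to above: the two objectives are aggregated with nonnegative weights, and sweeping the weights traces out the supported efficient solutions (the unsupported ones require additional work, here the Lagrangian heuristic):

```latex
\min_{x \in X}\; \lambda_1 f_1(x) + \lambda_2 f_2(x),
\qquad \lambda_1,\lambda_2 \ge 0,\quad \lambda_1 + \lambda_2 = 1.
```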

Keywords: Coherent covering location problem, efficient frontier, Lagrangian relaxation, data envelopment analysis.

1353 Estimation of Bayesian Sample Size for Binomial Proportions Using Areas P-tolerance with Lowest Posterior Loss

Authors: H. Bevrani, N. Najafi

Abstract:

This paper uses p-tolerance with the lowest posterior loss, the quadratic loss function, the average length criterion, the average coverage criterion, and the worst outcome criterion to compute the sample size needed to estimate a proportion under the binomial probability function with a Beta prior distribution. The proposed methodology is examined, and its effectiveness is shown.
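For context, with a Beta(a, b) prior on the binomial proportion p and x successes observed in n trials, the posterior stays in the Beta family; the coverage, length and loss criteria above are evaluated against this posterior:

```latex
x \mid p \;\sim\; \mathrm{Binomial}(n, p),
\qquad
p \mid x \;\sim\; \mathrm{Beta}(a + x,\; b + n - x).
```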

Keywords: Bayesian inference, Beta-binomial distribution, LPL criteria, quadratic loss function.

1352 Structural Analysis of Warehouse Rack Construction for Heavy Loads

Authors: C. Kozkurt, A. Fenercioglu, M. Soyaslan

Abstract:

In this study, rack systems, the structural storage units of warehouses, have been analyzed structurally with the Finite Element Method (FEM). Each cell of the rack system under discussion stores pallets weighing 800 kg to 1000 kg with dimensions of 0.80 x 1.15 x 1.50 m. Under this load, the total deformations and equivalent stresses of the structural elements and the principal, tensile and shear stresses of the connection elements have been analyzed. The results of the analyses have been evaluated against the resistance limits of the structural and connection elements and are presented both visually and as magnitudes.

Keywords: warehouse, structural analysis, AS/RS, FEM, FEA

1351 An ensemble of Weighted Support Vector Machines for Ordinal Regression

Authors: Willem Waegeman, Luc Boullart

Abstract:

Instead of traditional (nominal) classification, we investigate ordinal classification, or ranking. An enhanced method based on an ensemble of Support Vector Machines (SVMs) is proposed. Each binary classifier is trained with specific weights for each object in the training data set. Experiments on benchmark datasets and synthetic data indicate that the performance of our approach is comparable to state-of-the-art kernel methods for ordinal regression. The ensemble method, which is straightforward to implement, provides a very good sensitivity-specificity trade-off for the highest and lowest ranks.
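A minimal sketch of the ensemble idea with scikit-learn, assuming the usual reduction of a K-rank problem to K-1 binary "rank greater than k" classifiers, each fitted with per-object weights; the weighting scheme shown is illustrative, not the authors':

```python
import numpy as np
from sklearn.svm import SVC

def fit_ordinal_ensemble(X, y, ranks, weight_fn=None):
    """Train one weighted binary SVM per threshold 'rank > k'."""
    models = []
    for k in ranks[:-1]:
        target = (y > k).astype(int)
        # Per-object weights; by default, objects near the threshold get more weight.
        w = weight_fn(y, k) if weight_fn else 1.0 / (1.0 + np.abs(y - k))
        models.append(SVC(kernel="rbf").fit(X, target, sample_weight=w))
    return models

def predict_rank(models, X, ranks):
    # The predicted rank is the number of thresholds an object is voted to exceed.
    votes = np.sum([m.predict(X) for m in models], axis=0).astype(int)
    return np.asarray(ranks)[votes]
```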

Keywords: Ordinal regression, support vector machines, ensemble learning.

1350 Dispersed Error Control based on Error Filter Design for Improving Halftone Image Quality

Authors: Sang-Chul Kim, Sung-Il Chien

Abstract:

The error diffusion method generates worm artifacts and weakens the edges of the halftone image when a continuous gray-scale image is reproduced as a binary image. First, to enhance the edges, we propose an edge-enhancing filter that considers the quantization error information and the gradient of the neighboring pixels. Furthermore, to remove the worm artifacts that often appear in a halftone image, we adaptively add random noise to the weights of the error filter.
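A compact sketch of Floyd-Steinberg-style error diffusion with small random noise added to the filter weights to break up worm artifacts; the noise amplitude and the base filter are illustrative, and the edge-enhancing filter from the paper is not shown:

```python
import numpy as np

def noisy_error_diffusion(img, noise=0.05, rng=None):
    """Binarize a grayscale image in [0, 1] using noise-perturbed Floyd-Steinberg weights."""
    rng = rng or np.random.default_rng()
    f = img.astype(float).copy()
    h, w = f.shape
    out = np.zeros_like(f)
    base = np.array([7.0, 3.0, 5.0, 1.0]) / 16.0  # weights to (x+1,y), (x-1,y+1), (x,y+1), (x+1,y+1)
    for y in range(h):
        for x in range(w):
            out[y, x] = 1.0 if f[y, x] >= 0.5 else 0.0
            err = f[y, x] - out[y, x]
            wts = np.clip(base + rng.uniform(-noise, noise, 4), 0.0, None)
            wts /= wts.sum()  # keep the diffused error mass equal to the quantization error
            if x + 1 < w:
                f[y, x + 1] += err * wts[0]
            if y + 1 < h:
                if x > 0:
                    f[y + 1, x - 1] += err * wts[1]
                f[y + 1, x] += err * wts[2]
                if x + 1 < w:
                    f[y + 1, x + 1] += err * wts[3]
    return out
```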

Keywords: Artifact suppression, Edge enhancement, Error diffusion method, Halftone image

1349 Influence of Overfeeding on Productive Performance Traits, Foie Gras Production, Blood Parameters, Internal Organs, Carcass Traits, and Mortality Rate in Two Breeds of Ducks

Authors: Mona Y. El-Sayed, U. E. Mahrous

Abstract:

A total of 60 male mule ducks and 60 male Muscovy ducks were allotted into three groups (n = 20) to estimate the effects of overfeeding (two and four meals) versus ad libitum feeding on productive performance traits, foie gras production, internal organs, and blood parameters.

The results show that force-feeding four meals significantly increased (P < 0.01) body weight, weight gain, and gain percentage compared to force-feeding two meals. Both force-feeding regimes (two or four meals) induced significantly higher body weight, weight gain, gain percentage, and absolute carcass weight than ad libitum feeding; however, carcass percentage was significantly higher in ad libitum feeding. Mule ducks had significantly higher weight gain and weight gain percentages than Muscovy ducks.

Feed consumption per kilogram of foie gras and per kilogram weight gain was lower for the four-meal than for the two-meal forced feeding regime. Force-feeding four meals induced significantly higher liver weight and percentage (488.96 ± 25.78g, 7.82 ± 0.40%) than force-feeding two meals (381.98 ± 13.60g, 6.42 ± 0.21%). Moreover, feed conversion was significantly higher under forced feeding than under ad libitum feeding (77.65 ± 3.41g, 1.72 ± 0.05%; P < 0.01).

Forced feeding (two or four meals) increased all organ weights (intestine, proventriculus, heart, spleen, and pancreas) over ad libitum feeding weights, except for the gizzard; however, intestinal and abdominal fat values were higher for four-meal forced feeding than for two-meal forced feeding.

Overfeeding did not change blood parameters significantly compared to ad libitum feeding; however, four-meal forced feeding improved the quality of foie gras since it significantly increased the percentage of grade A foie gras (62.5%) at the expense of grades B (33.33%) and C (4.17%) compared with the two-meal forced feeding.

The mortality percentage among Muscovy ducks during the forced feeding period was 22.5%, compared to 0% in mule ducks. Liver weight was highly significantly correlated with live weight after overfeeding and with certain blood plasma traits.

Keywords: Foie gras, overfeeding, ducks, productive performance.

1348 Particle Swarm Optimization with Interval-valued Genotypes and Its Application to Neuroevolution

Authors: Hidehiko Okada

Abstract:

The author proposes an extension of particle swarm optimization (PSO) for solving interval-valued optimization problems and applies the extended PSO to evolutionary training of neural networks (NNs) with interval weights. In the proposed PSO, values in the genotypes are not real numbers but intervals. Experimental results show that interval-valued NNs trained by the proposed method could well approximate hidden target functions despite the fact that no training data was explicitly provided.
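A minimal sketch of the interval-valued idea: each genotype value is an interval [lo, hi], the usual PSO velocity and position updates are applied to both endpoints, and the endpoints are re-sorted so each interval stays valid. This is one plausible reading for illustration, not the paper's exact update:

```python
import numpy as np

def interval_pso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One PSO update on interval-valued positions of shape (n_particles, dim, 2)."""
    rng = rng or np.random.default_rng()
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.sort(pos + vel, axis=-1)  # keep lower bound <= upper bound in every interval
    return pos, vel
```

The interval genotypes can then be decoded into interval weights of a neural network whose interval-arithmetic output is scored by the fitness function.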

Keywords: Evolutionary algorithms, swarm intelligence, particle swarm optimization, neural network, interval arithmetic.

1347 Rheological Properties of Dough and Sensory Quality of Crackers with Dietary Fibers

Authors: Ljubica Dokić, Ivana Nikolić, Dragana Šoronja–Simović, Zita Šereš, Biljana Pajin, Nils Juul, Nikola Maravić

Abstract:

The possibility of applying dietary fibers in the production of crackers was investigated in this work, along with their influence on the rheological and textural properties of the cracker dough and on the sensory properties of the resulting crackers. Three different dietary fibers, oat, potato and pea fibers, replaced 10% of the wheat flour. A long fermentation process and the baking test method were used for cracker production. The changes in the cracker dough were observed by rheological methods determining the viscoelastic dough properties and by textural measurements. The sensory quality of the obtained crackers was described using quantitative descriptive analysis (QDA) by trained members of a descriptive panel. An additional analysis of the cracker surface was performed with a videometer. Based on the rheological determination, the viscoelastic properties of the cracker dough were reduced by the application of dietary fibers. Dough with 10% potato fiber could not be handled, so the recipe was modified to increase the water content to 35%. The dough compliance under constant stress decreased for the samples with dietary fibers, reflecting a more rigid and stiffer dough consistency compared to the control sample. The hardness of the dough for these samples also increased, and the dough extensibility decreased. The sensory properties of the final products, the crackers, were reduced compared to the control sample. The application of dietary fibers mostly affected the hardness, structure and crispness of the crackers. The crackers received low scores for flavor and taste, due to the specific aroma of the fibers. The sample with 10% potato fiber and increased water content was the most tolerant of the applied stresses and of the production process, and it was also closest to the control sample without dietary fibers in the sensory evaluation and in the videometer results.

Keywords: Crackers, dietary fibers, rheology, sensory properties.

1346 Quadrature Formula for Sampled Functions

Authors: Khalid Minaoui, Thierry Chonavel, Benayad Nsiri, Driss Aboutajdine

Abstract:

This paper deals with efficient quadrature formulas involving functions that are observed only at fixed sampling points. The approach that we develop is derived from efficient continuous quadrature formulas, such as Gauss-Legendre or Clenshaw-Curtis quadrature. We select nodes at sampling positions that are as close as possible to those of the associated classical quadrature and we update quadrature weights accordingly. We supply the theoretical quadrature error formula for this new approach. We show on examples the potential gain of this approach.
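A rough sketch of the approach under simple assumptions: map the Gauss-Legendre nodes to [a, b], snap each node to the nearest available sampling position, and recompute the weights so the rule integrates low-order monomials exactly (the paper derives the updated weights and the Peano-kernel error formula more carefully):

```python
import numpy as np

def sampled_quadrature(sample_x, sample_y, a, b, n):
    """Approximate the integral of f over [a, b] from samples sample_y = f(sample_x)."""
    gl_nodes, _ = np.polynomial.legendre.leggauss(n)       # nodes on [-1, 1]
    target = 0.5 * (b - a) * gl_nodes + 0.5 * (a + b)       # nodes mapped to [a, b]
    # Snap each target node to the closest sampling position (duplicates removed).
    idx = np.unique([int(np.argmin(np.abs(sample_x - t))) for t in target])
    nodes = sample_x[idx]
    # Recompute weights so that monomials x^k, k < len(nodes), are integrated exactly.
    k = np.arange(len(nodes))
    V = nodes[None, :] ** k[:, None]
    moments = (b ** (k + 1) - a ** (k + 1)) / (k + 1)
    weights = np.linalg.solve(V, moments)
    return float(weights @ sample_y[idx])

# Example: integrate sin on [0, 1] from 101 equispaced samples with an 8-node rule.
xs = np.linspace(0.0, 1.0, 101)
print(sampled_quadrature(xs, np.sin(xs), 0.0, 1.0, 8))   # close to 1 - cos(1) ~ 0.4597
```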

Keywords: Gauss-Legendre, Clenshaw-Curtis, quadrature, Peano kernel, irregular sampling.

1345 Experimental Investigation on Effect of the Zirconium + Magnesium Coating of the Piston and Valve of the Single-Cylinder Diesel Engine to the Engine Performance and Emission

Authors: Erdinç Vural, Bülent Özdalyan, Serkan Özel

Abstract:

A four-stroke, single-cylinder diesel engine was used in this study. The pistons and valves of the engine were stabilized, aluminum oxide (Al2O3) in different ratios was added to the zirconium dioxide (ZrO2) + magnesium oxide (MgO) powder, and the parts were coated by the plasma spray method. The pistons and valves of the combustion chamber of the engine were coated with 5 different samples: (ZrO2 + MgO), (ZrO2 + MgO + 25% Al2O3), (ZrO2 + MgO + 50% Al2O3), (ZrO2 + MgO + 75% Al2O3), and (Al2O3). Material tests were performed on each of the coated engine parts with the surface analysis methods of scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDX) and X-ray diffraction (XRD) using Cu Kα radiation. The engine tests were repeated for each sample on an electric dynamometer at full load at engine speeds of 1600 rpm, 2000 rpm, 2400 rpm and 2800 rpm. The material analyses and engine tests showed that the best performance was obtained with (ZrO2 + MgO + 50% Al2O3): in the tests made with this sample (A3), the engine improved in power, torque, specific fuel consumption and CO emissions, with no significant change in HC and smoke emissions but an increase in NOx emissions.

Keywords: Ceramic coating, material characterization, engine performance, exhaust emissions.

1344 Optimizing the Number of Bits/Stage in 10-Bit, 50Ms/Sec Pipelined A/D Converter Considering Area, Speed, Power and Linearity

Authors: P. Prasad Rao, K. Lal Kishore

Abstract:

Pipeline ADCs are becoming popular at high speeds and high resolutions. This paper discusses the choices for the number of bits per stage in pipelined ADCs and their effect on area, speed, power dissipation and linearity. The basic building blocks used in every stage, such as the op-amp, sample-and-hold circuit, sub-converter, DAC and residue amplifier, are assumed to be identical. The sub-converters use flash architectures. The design is implemented using 0.18 µm CMOS technology.

Keywords: 1.5 bits/stage, Conversion Frequency, Redundancy, Switched Capacitor Sample and Hold Circuit.

1343 An Estimating Parameter of the Mean in Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods

Authors: Autcha Araveeporn

Abstract:

This paper compares parameter estimation of the mean of the normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data, the Bayes estimator is derived from the prior distribution, and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. Hypothesis testing is then used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variances of 4, 9, and 16, with sample sizes of 10, 20, 30, and 50. The results show that the ML and MCMC estimates differ perceivably from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator, computed with a prior mean of 1 and prior variance of 12, shows a significant difference in the mean for variance 9 at sample sizes 10 and 20.
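A short sketch of the three estimators in the conjugate case (known data variance; prior mean 1 and prior variance 12 as in the study). The Monte Carlo estimator here simply draws from the closed-form posterior, a simplification of the Gibbs sampler used in the paper:

```python
import numpy as np

def estimate_mean(x, sigma2, prior_mean=1.0, prior_var=12.0, n_draws=5000, rng=None):
    """Return the ML, Bayes, and Monte Carlo estimates of a normal mean with known variance."""
    rng = rng or np.random.default_rng()
    n = len(x)
    ml = float(np.mean(x))                                    # maximum likelihood: the sample average
    post_var = 1.0 / (n / sigma2 + 1.0 / prior_var)           # conjugate normal posterior variance
    post_mean = post_var * (np.sum(x) / sigma2 + prior_mean / prior_var)
    draws = rng.normal(post_mean, np.sqrt(post_var), n_draws) # posterior sampling step
    return ml, post_mean, float(np.mean(draws))

# Example: data simulated from N(2, 9) with sample size 20.
rng = np.random.default_rng(0)
print(estimate_mean(rng.normal(2.0, 3.0, 20), sigma2=9.0, rng=rng))
```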

Keywords: Bayes method, Markov Chain Monte Carlo method, Maximum Likelihood method, normal distribution.

1342 Adjusted Ratio and Regression Type Estimators for Estimation of Population Mean when some Observations are missing

Authors: Nuanpan Nangsue

Abstract:

Ratio and regression type estimators have been used by previous authors to estimate a population mean for the principal variable from samples in which both auxiliary x and principal y variable data are available. However, missing data are a common problem in statistical analyses with real data. Ratio and regression type estimators have also been used for imputing values of missing y data. In this paper, six new ratio and regression type estimators are proposed for imputing values for any missing y data and estimating a population mean for y from samples with missing x and/or y data. A simulation study has been conducted to compare the six ratio and regression type estimators with a previous estimator of Rueda. Two population sizes N = 1,000 and 5,000 have been considered with sample sizes of 10% and 30% and with correlation coefficients between population variables X and Y of 0.5 and 0.8. In the simulations, 10 and 40 percent of sample y values and 10 and 40 percent of sample x values were randomly designated as missing. The new ratio and regression type estimators give similar mean absolute percentage errors that are smaller than the Rueda estimator for all cases. The new estimators give a large reduction in errors for the case of 40% missing y values and sampling fraction of 30%.
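For reference, the classical ratio and linear regression estimators of the population mean of y, on which such imputation schemes build, are

```latex
\hat{\bar{Y}}_{R} = \bar{y}\,\frac{\bar{X}}{\bar{x}},
\qquad
\hat{\bar{Y}}_{lr} = \bar{y} + b\,(\bar{X}-\bar{x}),
\qquad
b = \frac{\sum_i (x_i-\bar{x})(y_i-\bar{y})}{\sum_i (x_i-\bar{x})^{2}},
```

where $\bar{X}$ is the known population mean of the auxiliary variable and $\bar{x}$, $\bar{y}$ are the corresponding sample means.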

Keywords: Auxiliary variable, missing data, ratio and regression type estimators.

1341 Genetic Algorithm with Fuzzy Genotype Values and Its Application to Neuroevolution

Authors: Hidehiko Okada

Abstract:

The author proposes an extension of genetic algorithm (GA) for solving fuzzy-valued optimization problems. In the proposed GA, values in the genotypes are not real numbers but fuzzy numbers. Evolutionary processes in GA are extended so that GA can handle genotype instances with fuzzy numbers. The proposed method is applied to evolving neural networks with fuzzy weights and biases. Experimental results showed that fuzzy neural networks evolved by the fuzzy GA could model hidden target fuzzy functions well despite the fact that no training data was explicitly provided.

Keywords: Evolutionary algorithm, genetic algorithm, fuzzy number, neural network, neuroevolution.

1340 Antenna Array Beamforming Using Neural Network

Authors: Maja Sarevska, Abdel-Badeeh M. Salem

Abstract:

This paper considers the problem of null-steering beamforming for an antenna array system using a Neural Network (NN) approach. Two cases are presented. First, unlike other authors, we use the estimated Directions Of Arrival (DOAs) for NN-based determination of the antenna array weights, and the imprecision of the DOA estimates is taken into account. Second, blind null-steering beamforming is presented; in this case the antenna array outputs are presented at the input of the NN without DOA estimation. The results of computer simulations show much better relative mean error performance for the first NN approach compared to the NN-based blind beamforming.
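For context, one classical (non-neural) null-steering formulation that such a network learns to approximate places a unit response on the desired DOA and nulls on the interferers; the minimum-norm weight vector satisfying those constraints is

```latex
C = \big[\mathbf{a}(\theta_0),\ \mathbf{a}(\theta_1),\ \dots,\ \mathbf{a}(\theta_K)\big],
\qquad
C^{H}\mathbf{w} = \mathbf{f} = [1, 0, \dots, 0]^{T},
\qquad
\mathbf{w} = C\,(C^{H}C)^{-1}\mathbf{f},
```

where $\mathbf{a}(\theta)$ is the array steering vector, $\theta_0$ the desired DOA, and $\theta_1,\dots,\theta_K$ the interference DOAs.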

Keywords: Beamforming, DOAs, neural network.

1339 Quantitative Evaluation of Frameworks for Web Applications

Authors: Thirumalai Selvi, N. V. Balasubramanian, P. Sheik Abdul Khader

Abstract:

An empirical study of web applications that use software frameworks is presented here. The analysis is based on two approaches. In the first, developers using such frameworks are required, based on their experience, to assign weights to parameters such as database connection. In the second approach, a performance testing tool, OpenSTA, is used to compute start time and other such measures. From such an analysis, it is concluded that open source software is superior to proprietary software. The motivation behind this research is to examine ways in which a quantitative assessment can be made of software in general and frameworks in particular. Concepts such as metrics and architectural styles are discussed along with previously published research.

Keywords: Metrics, Frameworks, Performance Testing, Web Applications, Open Source.

1338 Tissue Composition and Muscularity of Lamb Legs Fed with Sunflower Seeds and Vitamin E

Authors: A. G. Silva Sobrinho, G. M. Manzi, N. L. L. Lima, F. A. Almeida, V. Endo, N. M. B. L. Zeola

Abstract:

The purpose of this study was to evaluate the tissue composition and carcass muscularity of 32 legs of Ile de France lambs fed diets containing sunflower seeds and vitamin E. The lambs, with a mean body weight of 15 kg, were housed in individual pens and slaughtered at 32 kg of body weight. The treatments influenced (P < 0.05) leg weight, femur length and the muscle:bone ratio, with the highest values (2.13 kg, 16.19 cm and 7.38, respectively) in lambs that received the diet without sunflower seeds and vitamin E. The other variables were not affected (P > 0.05) by the treatments. The interaction of sunflower seeds and vitamin E was positive for total bone weight and intermuscular fat.

Keywords: sheep, conformation, feedlot, nutrition, sugar-cane

1337 A Hybrid Feature Selection by Resampling, Chi squared and Consistency Evaluation Techniques

Authors: Amir-Massoud Bidgoli, Mehdi Naseri Parsa

Abstract:

In this paper, a combined feature selection method is proposed which takes advantage of sample domain filtering, resampling and feature subset evaluation methods to reduce the dimensions of huge datasets and select reliable features. This method utilizes both the feature space and the sample domain to improve the feature selection process and uses a combination of chi-squared and consistency attribute evaluation methods to seek reliable features. The method consists of two phases. The first phase filters and resamples the sample domain, and the second phase adopts a hybrid procedure to find the optimal feature space by applying chi-squared and consistency subset evaluation methods together with a genetic search. Experiments on datasets of various sizes from the UCI Repository of Machine Learning databases show that the performance of five classifiers (Naïve Bayes, Logistic, Multilayer Perceptron, Best First Decision Tree and JRIP) improves simultaneously and the classification error for these classifiers decreases considerably. The experiments also show that this method outperforms other feature selection methods.
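A minimal scikit-learn sketch of the two phases under simple assumptions: stratified resampling for the sample-domain phase and chi-squared scores for the feature phase. The consistency-based subset evaluation and the genetic search used in the paper are not shown:

```python
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.utils import resample

def hybrid_select(X, y, n_samples, k_features, seed=0):
    """Phase 1: resample the sample domain; phase 2: keep the top-k chi-squared features."""
    # Phase 1: draw a smaller, stratified working sample of the rows.
    Xs, ys = resample(X, y, n_samples=n_samples, stratify=y, random_state=seed)
    # Phase 2: score features on the resampled data (chi2 expects nonnegative feature values).
    selector = SelectKBest(chi2, k=k_features).fit(Xs, ys)
    return selector.get_support(indices=True)
```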

Keywords: feature selection, resampling, reliable features, Consistency Subset Evaluation.

1336 High-Temperature X-Ray Powder Diffraction of Secondary Gypsum

Authors: D. Gazdič, I. Hájková, M. Fridrichová

Abstract:

This paper involved the performance of a high-temperature X-ray powder diffraction analysis (XRD) of a sample of chemical gypsum generated in the production of titanium white; this gypsum originates from neutralizing highly acidic water with a limestone suspension. Specifically, it was gypsum formed in the first stage of neutralization, when the resulting material contains, apart from gypsum, a number of waste products resulting from the decomposition of ilmenite by sulphuric acid, so it can be described as red titanogypsum. By conducting the experiment on a Bruker D8 Advance XRD apparatus with a Cu anode (λ(Kα) = 1.54184 Å) equipped with an Anton Paar HTK 16 high-temperature chamber, it was possible to clearly identify in the sample each phase transition in the CaSO4·xH2O system.

Keywords: Anhydrite, Gypsum, Bassanite, Hematite, XRD, Powder, High-Temperature.

1335 Block Activity in Metric Neural Networks

Authors: Mario Gonzalez, David Dominguez, Francisco B. Rodriguez

Abstract:

A model of neural networks on a small-world topology with metric (local and random) connectivity is investigated. The synaptic weights are random, driving the network towards a chaotic state of neural activity. An ordered macroscopic neuron state is induced by a bias in the network connections. When the connections are mainly local, the network emulates a block-like structure. It is found that the topology and the bias compete to drive the network into either a global or a block activity ordering, depending on the initial conditions.

Keywords: Block attractor, random interaction, small world, spin glass.

1334 Mining News Sites to Create Special Domain News Collections

Authors: David B. Bracewell, Fuji Ren, Shingo Kuroiwa

Abstract:

We present a method to create special domain collections from news sites. The method only requires a single sample article as a seed. No prior corpus statistics are needed and the method is applicable to multiple languages. We examine various similarity measures and the creation of document collections for English and Japanese. The main contributions are as follows. First, the algorithm can build special domain collections from as little as one sample document. Second, unlike other algorithms, it does not require a second "general" corpus to compute statistics. Third, in our testing the algorithm outperformed others in creating collections made up of highly relevant articles.
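A small sketch of the core step under simple assumptions: the seed article and the candidate articles are represented as TF-IDF vectors, and candidates whose cosine similarity to the seed exceeds a threshold join the collection. The paper compares several similarity measures and needs no prior corpus; here the TF-IDF statistics come from the candidate pool itself:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def build_collection(seed, candidates, threshold=0.2):
    """Return the candidate articles whose similarity to the seed article passes the threshold."""
    vec = TfidfVectorizer()
    tfidf = vec.fit_transform([seed] + list(candidates))
    sims = cosine_similarity(tfidf[0], tfidf[1:]).ravel()
    return [doc for doc, s in zip(candidates, sims) if s >= threshold]
```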

Keywords: Information Retrieval, News, Special Domain Collections.

1333 Bioremediation of Hydrocarbon and Some Heavy Metal Polluted Wastewater Effluent of a Typical Refinery

Authors: S. Abdulsalam, A. D. I. Suleiman, N. M. Musa, M. Yusuf

Abstract:

An environment free of pollutants should be the concern of every individual, but with industrialization and urbanization it is difficult to achieve. With a view to achieving a pollution-limited environment at low cost, a study was conducted on the use of bioremediation technology to remediate hydrocarbons and three heavy metals, namely copper (Cu), zinc (Zn) and iron (Fe), from a typical petroleum refinery wastewater in a closed system. Physicochemical and microbiological characterization of the wastewater sample revealed that it was polluted with the aforementioned pollutants. Isolation and identification of the microorganisms present in the wastewater sample revealed the presence of Bacillus subtilis, Micrococcus luteus, Staphylococcus aureus and Staphylococcus epidermidis. Bioremediation experiments carried out in five batch reactors with different compositions but under the same environmental conditions revealed that treatment T5 (boosted with the association of Bacillus subtilis and Micrococcus luteus) gave the best result in terms of oil and grease removal (67% in 63 days). In addition, these microorganisms were capable of reducing the concentrations of heavy metals in the sample: treatments T5, T3 (boosted with Bacillus subtilis only) and T4 (boosted with Micrococcus luteus only) gave optimum percentage uptakes of 65, 75 and 25 for Cu, Zn and Fe, respectively.

Keywords: Boosted, bioremediation, closed system, aeration, uptake, wastewater.

1332 A Diagnostic Fuzzy Rule-Based System for Congenital Heart Disease

Authors: Ersin Kaya, Bulent Oran, Ahmet Arslan

Abstract:

In this study, a fuzzy rule-based classifier is used for the diagnosis of congenital heart disease. Congenital heart diseases are defined as structural or functional heart diseases. Medical data sets were obtained from the Pediatric Cardiology Department at Selcuk University for the years 2000 to 2003. First, fuzzy rules were generated from the medical data. Then the weights of the fuzzy rules were calculated. Two different reasoning methods, the "weighted vote method" and the "single winner method", were used in this study. The results of the fuzzy classifiers were compared.

Keywords: Congenital heart disease, Fuzzy rule-based classifiers, Classification.

1331 Conspiracy Theory in Discussions of the Coronavirus Pandemic in the Gulf Region

Authors: Rasha Salameh

Abstract:

In light of the tense relationship between Saudi Arabia and Iran, this research paper sheds light on the Saudi-owned television network Al-Arabiya's reporting of the Coronavirus in the Gulf region. Particularly because most of the early cases were coming from Iran, some programs on this Saudi channel embraced a conspiracy theory, and hate speech was used in discussions of the topic. The results of these discussions are detailed in this paper as percentages with regard to the research sample, which includes five programs on the Al-Arabiya channel: ‘DNA’, ‘Marraya’ (Mirrors), ‘Panorama’, ‘Tafaolcom’ (Your Interaction) and ‘Diplomatic Street’, in the period between January 19, the date of the first case in Iran, and April 10, 2020. The research shows the use of a conspiracy theory in the programs, in addition to some professional violations. The surveyed sample also shows that the matter receded as the Arab Gulf states became preoccupied with the successively increasing cases that have appeared there since the start of the pandemic. The results indicate that hate speech was present in the sample at a rate of 98.1%, and that most of the programs that dealt with the Iranian issue under the Coronavirus pandemic on Al-Arabiya used the conspiracy theory, at a rate of 75.5%.

Keywords: Al-Arabiya, Iran, COVID-19, hate speech, conspiracy theory, politicization of the pandemic

1330 Functional Sample of the Portable Device for Fast Analysis of Explosives

Authors: A. Bumbová, J. Kellner, Z. Večeřa, V. Kahle, J. Navrátil

Abstract:

The construction of an original functional sample of a portable device for the fast analysis of energetic materials is described in this paper. The portable device, consisting of two parts, an original miniaturized microcolumn liquid chromatograph and a unique chemiluminescence detector, has been proposed and realized. In a very short time, this portable device is capable of selectively identifying most military nitramine- and nitroester-based explosives as well as inorganic nitrates occurring in trace concentrations in water or in soil. The total time required for the identification of extracts is shorter than 8 minutes.

Keywords: Chemiluminescence, microcolumn liquid chromatograph, nitramines, nitroesters, portable device.
