Search results for: binary mixture
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1970

1820 Direct Blind Separation Methods for Convolutive Images Mixtures

Authors: Ahmed Hammed, Wady Naanaa

Abstract:

In this paper, we propose a general approach to the problem of convolutive mixtures of images. We use a direct blind source separation method that adds a single, non-statistically justified constraint describing the relationships between the different mixing matrices, in order to make the separation easier to solve. Provided this constraint is known, the method can be applied to degraded documents affected by the overlapping of text patterns and images. Such overlapping arises from chemical and physical reactions of the materials (paper, inks, etc.) during document aging, and from other unpredictable causes such as humidity, microorganism infestation and human handling. We demonstrate that this problem corresponds to a convolutive mixture of images, and we then validate our method through numerical examples. In this way, clear images can be recovered from unreadable ones degraded by page superposition, a phenomenon frequently encountered in archival documents.
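
As an illustration only, the sketch below treats the simpler instantaneous (non-convolutive) case: two synthetic image layers are mixed by a known 2x2 matrix and then recovered blindly with FastICA from scikit-learn. The constrained convolutive method described in the abstract is not reproduced here, and all data and parameter values are invented for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic grayscale "source images" (flattened to 1-D signals)
rng = np.random.default_rng(0)
h, w = 64, 64
src1 = np.tile(np.linspace(0, 1, w), (h, 1)).ravel()   # gradient pattern (e.g., recto text layer)
src2 = rng.random(h * w)                                # texture-like layer (e.g., verso show-through)
S = np.column_stack([src1, src2])

# Instantaneous mixing: each observed page is a weighted sum of the two source layers
A = np.array([[0.8, 0.4],
              [0.3, 0.9]])
X = S @ A.T

# Blind separation: FastICA recovers the sources up to scale and permutation
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)
layer1, layer2 = S_est[:, 0].reshape(h, w), S_est[:, 1].reshape(h, w)
```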

Keywords: blind source separation, convolutive mixture, degraded documents, text-pattern overlapping

Procedia PDF Downloads 298
1819 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications

Authors: H. Hruschka

Abstract:

This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely latent Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as a joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables; variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine, which then generates second-layer hidden variables. Finally, in the third layer, hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves: one half is used for estimation, the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. Performance of the topic models is disappointing, as the holdout log likelihood of the correlated topic model, which is better than latent Dirichlet allocation, is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net, whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret the hidden variables discovered by binary factor analysis, the restricted Boltzmann machine and the deep belief net. Hidden variables, characterized by the product categories to which they are related, differ strongly between these three models. To derive managerial implications, we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications, as they disagree about which categories are associated with higher basket size increases due to a promotion. Of course, recommendations based on better performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research with appropriate extensions. Including predictors, especially marketing variables such as price, seems to be an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of purchases of product categories.
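
As a loose illustration of the restricted Boltzmann machine component of this comparison, the sketch below fits scikit-learn's BernoulliRBM to a synthetic binary basket matrix with the same shape as the data described (9,835 baskets by 169 categories) and scores a random holdout half. Note that score_samples returns a pseudo-log-likelihood proxy rather than the exact holdout log likelihood used in the paper, and the data here are random rather than real transactions.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

# Synthetic binary basket matrix: rows = baskets, columns = product categories
rng = np.random.default_rng(0)
baskets = (rng.random((9835, 169)) < 0.05).astype(float)

# Random split into estimation and holdout halves, as described in the abstract
idx = rng.permutation(len(baskets))
train, holdout = baskets[idx[:4917]], baskets[idx[4917:]]

rbm = BernoulliRBM(n_components=20, learning_rate=0.05, n_iter=50, random_state=0)
rbm.fit(train)

# Pseudo-log-likelihood as a rough stand-in for the holdout log-likelihood criterion
print("holdout pseudo-log-likelihood:", rbm.score_samples(holdout).sum())
```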

Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models

Procedia PDF Downloads 165
1818 Effect of a Mixture of Phenol, O-Cresol, P-Cresol, and M-Cresol on the Nitrifying Process in a Sequencing Batch Reactor

Authors: Adriana Sosa, Susana Rincon, Chérif Ben, Diana Cabañas, Juan E. Ruiz, Alejandro Zepeda

Abstract:

The complex chemical composition (mixtures of ammonium and recalcitrant compounds) of effluents from the chemical, pharmaceutical and petrochemical industries makes their biological treatment challenging. This treatment involves a nitrification process that can be inhibited by the presence of aromatic compounds, reducing process efficiency. The inhibitory effects of aromatic compounds on nitrification have already been studied; however, few studies have considered phenolic compounds in the form of mixtures, which is how they occur in practice. For this reason, we carried out a kinetic study of the nitrifying process in the presence of different concentrations of a mixture of phenol, o-cresol, m-cresol and p-cresol (0-320 mg C/L) in a sequencing batch reactor (SBR). Firstly, the nitrifying process was evaluated in the absence of the phenolic mixture (control 1) in an SBR with a 2 L working volume and 176 mg/L of nitrogen as microbial protein. Total oxidation of the initial ammonium (efficiency, ENH4+, of 100%) to nitrate (nitrifying yield, YNO3-, of 0.95) was obtained, with specific rates of ammonium consumption (qN-NH4+) and nitrate production (qN-NO3-) of 1.11 ± 0.04 h-1 and 0.67 ± 0.11 h-1, respectively. During the acclimation phase with 40 mg C/L of the phenolic mixture, an inhibitory effect on the nitrifying process was observed, decreasing ENH4+ and YNO3- (by 11 and 54%, respectively) as well as the specific rates (by 89 and 46%, respectively), with the ammonia-oxidizing bacteria (AOB) being the most affected. However, in the subsequent cycles without the phenolic mixture (control 2), the nitrifying consortium was able to recover its nitrifying capacity (ENH4+ = 100% and YNO3- = 0.98). Afterwards, the SBR was fed with 10 mg C/L of the phenolic mixture, giving an ENH4+ of 100%, a YNO3- of 0.62 ± 0.006 and a qN-NH4+ of 0.13 ± 0.004, while the qN-NO3- was 0.49 ± 0.007. Moreover, as the phenolic concentration (10-160 mg C/L) and the number of cycles increased, the nitrifying consortium was able to oxidize the ammonia with an ENH4+ of 100% and a YNO3- close to 1, although a decrease in the nitrification specific rates and an increase in the oxidation of the phenolic compounds (70 to 94%) were observed. Finally, in the presence of 320 mg C/L, the nitrifying consortium was able to simultaneously oxidize the ammonia (ENH4+ = 100%) and the phenolic mixture (p-cresol > phenol > m-cresol > o-cresol), o-cresol being the most recalcitrant compound. In all the experiments, the use of an SBR allowed a respiratory adaptation of the consortium to oxidize the phenolic mixture, with the nitrite-oxidizing bacteria (NOB) achieving greater adaptation than the ammonia-oxidizing bacteria (AOB).

Keywords: cresol, inhibition, nitrification, phenol, sequencing batch reactor

Procedia PDF Downloads 333
1817 Design Procedure of Cold Bitumen Emulsion Mixtures

Authors: Hayder Shanbara, Felicite Ruddock, William Atherton, Ali Al-Rifaie

Abstract:

In highway construction, Hot Mix Asphalt (HMA) has been used predominantly as a paving material for many years, and around 90 percent of the world's road network is surfaced with flexible pavements. However, paving with hot mix asphalt has several drawbacks, such as excessive greenhouse gas emissions, difficulties during the rainy season, fuel and energy consumption, and cost. Cold Bitumen Emulsion Mixture (CBEM) is therefore considered an alternative to HMA. CBEM is the most popular type of Cold Mix Asphalt (CMA): an unheated mixture of bitumen emulsion, aggregate and filler that can be prepared and mixed at ambient temperature. It is relatively easy to produce, but the design procedure provided by the Asphalt Institute (Manual Series 14, 1989) poses some issues in its practical application. This research presents a simple and more practicable design procedure for CBEM and discusses the limitations of this design.

Keywords: cold bitumen, emulsion mixture, design procedure, pavement

Procedia PDF Downloads 221
1816 Isothermal Vapour-Liquid Equilibria of Binary Mixtures of 1,2-Dichloroethane with Some Cyclic Ethers: Experimental Results and Modelling

Authors: Fouzia Amireche-Ziar, Ilham Mokbel, Jacques Jose

Abstract:

The vapour pressures of three binary mixtures, 1,2-dichloroethane + 1,3-dioxolane, + 1,4-dioxane or + tetrahydropyrane, were measured at ten temperatures ranging from 273 to 353.15 K. An accurate static device was employed for these measurements. The VLE data were reduced using the Redlich-Kister equation, taking into account the vapour-phase non-ideality in terms of the second molar virial coefficient. The experimental data were compared with the total pressures P and the excess molar Gibbs energies GE predicted by the DISQUAC and Dortmund UNIFAC group contribution models.
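
For readers unfamiliar with the data-reduction step, the sketch below fits a three-coefficient Redlich-Kister expansion, GE = x1 x2 [A0 + A1(x1 - x2) + A2(x1 - x2)^2], to hypothetical excess Gibbs energy points by least squares; the coefficient values and noise level are invented and do not correspond to the measured systems.

```python
import numpy as np

def redlich_kister(x1, coeffs):
    """Excess Gibbs energy GE (same units as the coefficients) for a binary mixture."""
    x2 = 1.0 - x1
    poly = sum(a * (x1 - x2) ** k for k, a in enumerate(coeffs))
    return x1 * x2 * poly

# Hypothetical (x1, GE) points; real values would come from the reduced VLE data
x1 = np.linspace(0.05, 0.95, 10)
ge_obs = redlich_kister(x1, [900.0, -150.0, 60.0]) + np.random.default_rng(0).normal(0, 5, x1.size)

# Least-squares fit of the first three Redlich-Kister coefficients
X = np.column_stack([x1 * (1 - x1) * (2 * x1 - 1) ** k for k in range(3)])
coeffs, *_ = np.linalg.lstsq(X, ge_obs, rcond=None)
print("fitted A0, A1, A2:", coeffs)
```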

Keywords: DISQUAC model, Dortmund UNIFAC model, excess molar Gibbs energies GE, VLE

Procedia PDF Downloads 203
1815 Design of Lead-Lag Based Internal Model Controller for Binary Distillation Column

Authors: Rakesh Kumar Mishra, Tarun Kumar Dan

Abstract:

A lead-lag based controller is proposed within the Internal Model Control (IMC) framework. In this paper, we design a lead-lag based internal model controller for a binary distillation column treated as a SISO process (considering only the bottom product). The transfer function is taken from the Wood and Berry model. We evaluate composition control and disturbance rejection using the lead-lag based IMC and compare the responses with those of a simple internal model controller.
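
A minimal sketch of the idea, assuming the python-control package and the commonly cited Wood and Berry bottoms-composition transfer function G(s) = -19.4 e^{-3s}/(14.4 s + 1): the delay is approximated by a first-order Pade term, the invertible part of the model is inverted and cascaded with a first-order filter (which yields a lead-lag structure), and the equivalent conventional controller and closed-loop step response are computed. Tuning values are illustrative, not those of the paper.

```python
import control as ct

# Commonly cited Wood-Berry bottoms transfer function (illustrative): G(s) = -19.4 e^{-3s} / (14.4 s + 1)
num_d, den_d = ct.pade(3, 1)                           # 1st-order Pade approximation of the 3 s delay
G = ct.tf([-19.4], [14.4, 1]) * ct.tf(num_d, den_d)

# IMC design: invert the delay-free part of the model and add a first-order filter
lam = 5.0                                              # filter time constant, the single tuning knob
Q = ct.tf([14.4, 1], [-19.4]) * ct.tf([1], [lam, 1])   # lead-lag structure

# Equivalent conventional feedback controller C = Q / (1 - Q*G_model)
C = ct.feedback(Q, G, sign=1)

# Closed-loop set-point response of the bottoms-composition loop
T = ct.feedback(C * G, 1)
t, y = ct.step_response(T)
```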

Keywords: SISO, lead-lag, internal model control, Wood and Berry, distillation column

Procedia PDF Downloads 609
1814 Effect of Ginger, Red Pepper, and Their Mixture in Diet on Growth Performance and Body Composition of Oscar, Astronotus ocellatus

Authors: Sarah Jorjani, Afshin Ghelichi, Mazyar Kamali

Abstract:

The aim of this study was to estimate the effect of adding ginger, red pepper and their mixture to the diet on the growth performance, survival rate and body composition of Astronotus ocellatus (Oscar fish). The study was carried out for 8 weeks. A total of 132 Oscar fish with an initial weight of 2.44 ± 0.26 g were divided into 4 treatments with three replicates in a completely randomized design and fed a 100% Biomar diet (T1), Biomar + red pepper (55 mg/kg) (T2), Biomar + ginger (1%) (T3) or Biomar + a mixture of red pepper and ginger (T4). The fish were fed at 5% of their body weight. The results showed that T2 differed significantly from the other treatments in most growth parameters, such as PBWI, SGR, PER and SR (P < 0.05), but there were no significant differences between treatments in FCR and FE (P > 0.05).

Keywords: red pepper, ginger, oscar fish, growth performance, body composition

Procedia PDF Downloads 395
1813 Cardiokey: A Binary and Multi-Class Machine Learning Approach to Identify Individuals Using Electrocardiographic Signals on Wearable Devices

Authors: S. Chami, J. Chauvin, T. Demarest, Stan Ng, M. Straus, W. Jahner

Abstract:

Biometric tools such as fingerprint and iris recognition are widely used in industry to protect critical assets. However, their vulnerability and lack of robustness raise several concerns about the protection of highly critical assets. Biometrics based on electrocardiographic (ECG) signals is a robust identification tool. However, most state-of-the-art techniques have worked on clinical signals, which are of higher quality and less noisy than those extracted from wearable devices such as a smartwatch. In this paper, we present a complete machine learning pipeline that identifies people using ECG signals extracted from an off-person device, i.e., a wearable device, such as a smartwatch, that is not used in a medical context. One of the main challenges of ECG biometrics is the variability of the ECG across different persons and different situations. To address this issue, we propose two different approaches: a per-person classifier and a one-for-all classifier. The first approach builds a binary classifier to distinguish one person from all others. The second approach builds a multi-class classifier that distinguishes a selected set of individuals from non-selected individuals (others). In preliminary results, the binary classifier achieved 90% accuracy on balanced data, and the multi-class approach reported a log loss of 0.05.
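
The per-person idea can be sketched with scikit-learn as below: one binary classifier is trained to separate a single enrolled subject's heartbeat feature vectors from everyone else's. The features here are random placeholders for the output of the signal-processing stage, and the classifier choice, subject counts and the crude subject-specific offset are all invented for the example; the paper's actual pipeline is not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical ECG feature matrix: rows = heartbeats, columns = engineered features
rng = np.random.default_rng(0)
n_subjects, beats_per_subject, n_features = 10, 200, 24
X = rng.normal(size=(n_subjects * beats_per_subject, n_features))
subject_id = np.repeat(np.arange(n_subjects), beats_per_subject)
X += subject_id[:, None] * 0.3                 # crude subject-specific shift, for illustration only

# "Per person" approach: one binary classifier per enrolled subject (subject vs. all others)
target_subject = 3
y = (subject_id == target_subject).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print("binary accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```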

Keywords: biometrics, electrocardiographic, machine learning, signals processing

Procedia PDF Downloads 116
1812 Toward Indoor and Outdoor Surveillance using an Improved Fast Background Subtraction Algorithm

Authors: El Harraj Abdeslam, Raissouni Naoufal

Abstract:

The detection of moving objects in video image sequences is very important for object tracking, activity recognition and behavior understanding in video surveillance. The most widely used approach for moving-object detection and tracking is background subtraction, and many background subtraction algorithms have been proposed. However, these are sensitive to illumination changes, and the solutions proposed to overcome this problem are time-consuming. In this paper, we propose a robust yet computationally efficient background subtraction approach and focus mainly on the ability to detect moving objects in dynamic scenes, for possible applications in monitoring complex and restricted-access areas, where moving and motionless persons must be reliably detected. It consists of three main phases: illumination-change compensation, background/foreground modeling, and morphological analysis for noise removal. We handle illumination changes using Contrast Limited Adaptive Histogram Equalization (CLAHE), which limits the intensity of each pixel to a user-determined maximum, thereby mitigating degradation due to scene illumination changes and improving the visibility of the video signal. Initially, the background and foreground images are extracted from the video sequence; then, the background and foreground images are separately enhanced by applying CLAHE. In order to form multi-modal backgrounds, we model each channel of a pixel as a mixture of K Gaussians (K = 5) using a Gaussian Mixture Model (GMM). Finally, we post-process the resulting binary foreground mask using morphological erosion and dilation to remove residual noise. For experimental testing, we used a standard dataset to assess the efficiency and accuracy of the proposed method on a diverse set of dynamic scenes.
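
A compressed version of such a pipeline can be put together with OpenCV, as sketched below: CLAHE for illumination compensation, OpenCV's mixture-of-Gaussians background model (MOG2, playing the role of the K = 5 GMM described), and morphological opening/dilation on the binary mask. The video filename and all parameter values are placeholders, and this is not the authors' implementation.

```python
import cv2

cap = cv2.VideoCapture("surveillance.avi")            # hypothetical input video
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
bg_model = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    enhanced = clahe.apply(gray)                       # mitigate illumination changes
    mask = bg_model.apply(enhanced)                    # per-pixel mixture-of-Gaussians foreground mask
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # erosion then dilation removes speckle noise
    mask = cv2.dilate(mask, kernel)                    # reconnect fragmented blobs
cap.release()
```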

Keywords: video surveillance, background subtraction, contrast limited histogram equalization, illumination invariance, object tracking, object detection, behavior understanding, dynamic scenes

Procedia PDF Downloads 233
1811 Impact of Fischer-Tropsch Wax on Ethylene Vinyl Acetate/Waste Crumb Rubber Modified Bitumen: An Energy-Sustainability Nexus

Authors: Keith D. Nare, Mohau J. Phiri, James Carson, Chris D. Woolard, Shanganyane P. Hlangothi

Abstract:

In an energy-intensive world, minimizing energy consumption is paramount to cost saving and to reducing the carbon footprint. Improving mixture procedures by utilizing the warm-mix additive Fischer-Tropsch (FT) wax in ethylene vinyl acetate (EVA) modified bitumen highlights a greener and more sustainable approach to modified bitumen. In this study, the impact of FT wax on optimized EVA/waste crumb rubber modified bitumen is assayed at a maximum loading of 2.5%. The rationale of the FT wax loading is to maintain the original maximum loading of EVA in the optimized mixture. The phase-change abilities of FT wax enable EVA co-crystallization with the support of the elastomeric backbone of crumb rubber. A loading of less than 1% FT wax proved effective in the EVA/crumb rubber modified bitumen energy-sustainability nexus. A response surface methodology approach to the mixture design is implemented across the different loadings of FT wax and EVA, for a consistent amount of crumb rubber and bitumen. Rheological parameters (complex shear modulus, phase angle and rutting parameter) were used as performance indicators of the different optimized mixtures. The low-temperature chemistry of the optimized mixtures is analyzed using elementary beam theory and the elastic-viscoelastic correspondence principle. Master curves and black space diagrams are developed and used to predict age-induced cracking of the different long-term aged mixtures. Modified binder rheology reveals that the strain response is not linear and that there is substantial rearrangement of polymer chains as stress is increased; this depends on the age state of the mixture and on the FT wax and EVA loadings. Individual effects dominate over synergistic effects in the co-interaction of EVA and FT wax. All-inclusive FT wax and EVA formulations were best optimized in mixture 4, with mixture 7 reflecting increased ease of workability. The findings show that the interaction chemistry of bitumen, crumb rubber, EVA and FT wax is of first and second order in all cases, involving both individual contributions and co-interactions among the components of the mixture.

Keywords: bitumen, crumb rubber, ethylene vinyl acetate, FT wax

Procedia PDF Downloads 144
1810 Evaluation and Compression of Different Language Transformer Models for Semantic Textual Similarity Binary Task Using Minority Language Resources

Authors: Ma. Gracia Corazon Cayanan, Kai Yuen Cheong, Li Sha

Abstract:

Training a language model for a minority language is a challenging task. The lack of corpora available to train and fine-tune state-of-the-art language models remains a challenge in Natural Language Processing (NLP), and the need for high computational resources and bulk data further limits this task. In this paper, we present the following contributions: (1) we introduce and use a Tagalog-English (TL-EN) translation pair set to pre-train a language model on a minority-language resource; (2) we fine-tune and evaluate top-ranking pre-trained semantic textual similarity binary task (STSB) models on both the TL-EN and STS dataset pairs; and (3) we reduce the size of the model to offset the need for high computational resources. Based on our results, models pre-trained on translation pairs and STS pairs perform well on the STSB task. Reducing the model to a smaller dimension has no negative effect on performance and instead yields a notable increase in similarity scores. Moreover, pre-training on a similar dataset has a strong positive effect on the model's performance scores.
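
A rough sketch of the scoring and compression steps, assuming the sentence-transformers and scikit-learn packages: sentence pairs are embedded with a pre-trained multilingual checkpoint (the model name below is illustrative, not the one used in the paper), the embeddings are reduced with PCA as a stand-in for the paper's dimension-reduction step, and a cosine-similarity threshold gives the binary similar/not-similar decision.

```python
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.decomposition import PCA
from sklearn.metrics.pairwise import cosine_similarity

# Any pre-trained multilingual STS-capable checkpoint could stand in here
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

pairs = [("Mahal kita", "I love you"),
         ("Umuulan ngayon", "The weather is sunny")]
emb_a = model.encode([a for a, _ in pairs])
emb_b = model.encode([b for _, b in pairs])

# Reduce embedding dimensionality (the compression step) before scoring
pca = PCA(n_components=2).fit(np.vstack([emb_a, emb_b]))   # tiny n_components only because of the tiny sample
red_a, red_b = pca.transform(emb_a), pca.transform(emb_b)

# Binary STS decision: similar / not similar via a cosine-similarity threshold
scores = cosine_similarity(red_a, red_b).diagonal()
labels = (scores >= 0.5).astype(int)
print(scores, labels)
```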

Keywords: semantic matching, semantic textual similarity binary task, low resource minority language, fine-tuning, dimension reduction, transformer models

Procedia PDF Downloads 175
1809 Effect of Ecologic Fertilizers on Productivity and Yield Quality of Common and Spelt Wheat

Authors: Danutė Jablonskytė-Raščė, Audronė Mankevičienė, Laura Masilionytė

Abstract:

During the period 2009–2015, at the Joniškėlis Experimental Station of the Lithuanian Research Centre for Agriculture and Forestry, the effect of the ecologic fertilizer Ekoplant, the bio-activators Biokal 01 and Terra Sorb Foliar, and their combinations on the formation of productivity elements, grain yield and quality of winter wheat, spelt (Triticum spelta L.) and common wheat (Triticum aestivum L.), was analysed in an ecological agro-system. The soil, under the FAO classification, was an Endocalcari-Endohypogleyic Cambisol. In a clay loam soil, the ecological fertilizer produced from sunflower hull ash, alone and in combination with plant extracts and bio-humus, influenced the grain yield of spelt, common wheat and their mixture (increasing grain yield by 10.0% compared with unfertilized crops). Spelt grain yield was on average 16.9% lower than that of common wheat and 11.7% lower than that of the mixture, but the role of spelt in organic production systems is important because, with no mineral fertilization, it produced grain with a higher (by 4%) gluten content and exhibited a greater ability to suppress weeds (on average 61.9% lower weed weight) than common wheat and the mixture. Cultivating spelt in a mixture with common wheat significantly improved the quality indicators of the mixture (its grain contained 2.0% more protein and 4.0% more gluten than common wheat grain), reduced disease incidence (by 2-8%) and lowered weed infestation (by 34-81%).

Keywords: common and spelt wheat, ecological fertilizers, bio-activators, productivity elements, yield, quality

Procedia PDF Downloads 265
1808 High Pressure Thermophysical Properties of Complex Mixtures Relevant to Liquefied Natural Gas (LNG) Processing

Authors: Saif Al Ghafri, Thomas Hughes, Armand Karimi, Kumarini Seneviratne, Jordan Oakley, Michael Johns, Eric F. May

Abstract:

Knowledge of the thermophysical properties of complex mixtures at extreme conditions of pressure and temperature has always been essential to the Liquefied Natural Gas (LNG) industry’s evolution because of the tremendous technical challenges present at all stages in the supply chain from production to liquefaction to transport. Each stage is designed using predictions of the mixture’s properties, such as density, viscosity, surface tension, heat capacity and phase behaviour as a function of temperature, pressure, and composition. Unfortunately, currently available models lead to equipment over-designs of 15% or more. To achieve better designs that work more effectively and/or over a wider range of conditions, new fundamental property data are essential, both to resolve discrepancies in our current predictive capabilities and to extend them to the higher-pressure conditions characteristic of many new gas fields. Furthermore, innovative experimental techniques are required to measure different thermophysical properties at high pressures and over a wide range of temperatures, including near the mixture’s critical points, where gas and liquid become indistinguishable and most existing predictive fluid property models break down. In this work, we present a wide range of experimental measurements made for different binary and ternary mixtures relevant to LNG processing, with a particular focus on viscosity, surface tension, heat capacity, bubble-points and density. For this purpose, customized and specialized apparatus were designed and validated over the temperature range (200 to 423) K at pressures to 35 MPa. The mixtures studied were (CH4 + C3H8), (CH4 + C3H8 + CO2) and (CH4 + C3H8 + C7H16); in the last of these the heptane content was up to 10 mol %. Viscosity was measured using a vibrating wire apparatus, while mixture densities were obtained by means of a high-pressure magnetic-suspension densimeter and an isochoric cell apparatus; the latter was also used to determine bubble-points. Surface tensions were measured using the capillary rise method in a visual cell, which also enabled the location of the mixture critical point to be determined from observations of critical opalescence. Mixture heat capacities were measured using a customised high-pressure differential scanning calorimeter (DSC). The combined standard relative uncertainties were less than 0.3% for density, 2% for viscosity, 3% for heat capacity and 3% for surface tension. The extensive experimental data gathered in this work were compared with a variety of different advanced engineering models frequently used for predicting thermophysical properties of mixtures relevant to LNG processing. In many cases the discrepancies between the predictions of different engineering models for these mixtures were large, and the high quality data allowed erroneous but often widely-used models to be identified. The data enable the development of new or improved models, to be implemented in process simulation software, so that the fluid properties needed for equipment and process design can be predicted reliably. This in turn will enable reduced capital and operational expenditure by the LNG industry. The current work also aided the community of scientists working to advance theoretical descriptions of fluid properties by helping to identify deficiencies in theoretical descriptions and calculations.
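
As a small illustration of the model-comparison side of such work, the sketch below queries a reference Helmholtz-energy equation of state for the density of a methane + propane mixture, assuming the CoolProp package and its HEOS mixture backend; the composition and state point are arbitrary and are not taken from the measurements reported here.

```python
from CoolProp.CoolProp import PropsSI

# Predicted density of a (CH4 + C3H8) mixture from a multiparameter Helmholtz EOS
mixture = "HEOS::Methane[0.9]&Propane[0.1]"   # mole-fraction composition (illustrative)
T = 250.0        # K
P = 10.0e6       # Pa
rho = PropsSI("D", "T", T, "P", P, mixture)   # kg/m^3
print(f"predicted density at {T} K, {P/1e6} MPa: {rho:.1f} kg/m^3")
```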

Keywords: LNG, thermophysical, viscosity, density, surface tension, heat capacity, bubble points, models

Procedia PDF Downloads 247
1807 Combined Use of Microbial Consortia for the Enhanced Degradation of Type-IIx Pyrethroids

Authors: Parminder Kaur, Chandrajit B. Majumder

Abstract:

The unrestrained use of pesticides to meet the burgeoning demand for enhanced crop productivity has led to serious contamination of both terrestrial and aquatic ecosystems. The remediation of pesticide mixtures is challenging, given the inadvertent mixing of pesticides from agricultural lands treated with various compounds. Global concerns about the excessive use of pesticides have driven the need to develop more effective and safer alternatives for their remediation. We focused our work on the microbial degradation of a mixture of three Type-II pyrethroids, namely cypermethrin, cyhalothrin and deltamethrin, commonly applied for both agricultural and domestic purposes. The fungal strains (Fusarium strain 8-11P and Fusarium sp. zzz1124) had previously been isolated from agricultural soils, and their ability to biotransform this amalgam was studied. In brief, the experiment was conducted in two growth systems (added-carbon and carbon-free) enriched with pyrethroid concentrations between 100 and 300 mgL⁻¹. Parameter optimization (pH, temperature, concentration and time) was done using a central composite design matrix of Response Surface Methodology (RSM). At concentrations below 200 mgL⁻¹, complete removal was observed; however, degradation of 95.6%/97.4% and 92.27%/95.65% (carbon-free/added carbon) was observed for 250 and 300 mgL⁻¹, respectively. The consortium was shown to degrade the pyrethroid mixture (300 mg L⁻¹) within 120 h. After a 5-day incubation, the residual pyrethroid concentrations in unsterilized soil were much lower than in sterilized soil, indicating that microbial degradation predominates in pyrethroid elimination, with a half-life (t₁/₂) of 1.6 d and R² ranging from 0.992 to 0.999. Overall, these results show that microbial consortia might be more efficient than single degrader strains. The findings complement our current understanding of the bioremediation of Type-II pyrethroid mixtures with microbial consortia and highlight the importance of considering bioremediation as an effective alternative for the remediation of such pollutants.
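
The reported half-life follows from simple first-order kinetics, t1/2 = ln 2 / k; the sketch below fits that model to hypothetical residual-concentration data with SciPy. The time series is invented for illustration and merely mimics a half-life of roughly 1.6 d.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical residual-pyrethroid time series (mg/L); real values come from the soil assays
t = np.array([0, 1, 2, 3, 4, 5], dtype=float)                 # days
c = np.array([300, 196, 131, 85, 57, 38], dtype=float)

def first_order(t, c0, k):
    # C(t) = C0 * exp(-k t)
    return c0 * np.exp(-k * t)

(c0_fit, k_fit), _ = curve_fit(first_order, t, c, p0=(300, 0.4))
half_life = np.log(2) / k_fit                                  # t1/2 = ln 2 / k
print(f"k = {k_fit:.2f} 1/d, t1/2 = {half_life:.1f} d")
```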

Keywords: bioremediation, fungi, pyrethroids, soil

Procedia PDF Downloads 117
1806 Improving the Quality of Cassava Peel-Leaf Mixture through Fermentation with Rhizopus oligosporus as Poultry Ration

Authors: Mirnawati, G. Ciptaan, Ferawati

Abstract:

This study aims to improve the quality of a cassava peel-leaf mixture (CPLM) fermented with Rhizopus oligosporus as a poultry ration. The research is an experimental study using a completely randomized design (CRD) with four treatments and five replications. The treatments were CPLM fermented with Rhizopus oligosporus, combining cassava peel and leaves in the ratios A (9:1), B (8:2), C (7:3) and D (6:4). The observed variables were protease enzyme activity, crude protein, crude fiber, nitrogen retention, crude fiber digestibility and metabolic energy. The analysis of variance showed a highly significant (p < 0.01) effect on protease activity, crude protein, crude fiber, nitrogen retention, crude fiber digestibility and metabolic energy of the fermented CPLM. Based on the results, it can be concluded that CPLM (6:4) fermented with Rhizopus oligosporus gave the best results, with a protease activity of 7.25 U/ml, 21.23% crude protein, 19.80% crude fiber, 59.65% nitrogen retention, 62.99% crude fiber digestibility and a metabolic energy of 2671 kcal/kg.

Keywords: quality, cassava peel-leaf mixture, fermentation, Rhizopus oligosporus

Procedia PDF Downloads 146
1805 Unsupervised Learning and Similarity Comparison of Water Mass Characteristics with Gaussian Mixture Model for Visualizing Ocean Data

Authors: Jian-Heng Wu, Bor-Shen Lin

Abstract:

The temperature-salinity relationship is one of the most important characteristics used for identifying water masses in marine research. Temperature-salinity characteristics, however, may change dynamically with respect to geographic location and are quite sensitive to depth at the same location. When depth is taken into consideration, it is not easy to compare the characteristics of different water masses efficiently over a wide range of ocean areas. In this paper, a Gaussian mixture model is proposed to analyze the temperature-salinity-depth characteristics of water masses, based on which comparisons between water masses may be conducted. A Gaussian mixture model can describe the distribution of a random vector and is formulated as a weighted sum of multivariate normal distributions. The temperature-salinity-depth data for different locations are first used to train a set of Gaussian mixture models individually. The distance between two Gaussian mixture models can then be defined as the weighted sum of pairwise Bhattacharyya distances among their Gaussian components. Consequently, the distance between two water masses may be measured quickly, which allows automatic and efficient comparison of water masses over a wide area. The proposed approach not only approximates the distribution of temperature, salinity and depth directly, without prior assumptions about the regression family, but can also restrict model complexity by controlling the number of mixture components when the samples are unevenly distributed. In addition, it is critical for knowledge discovery in marine research to represent, manage and share temperature-salinity-depth characteristics flexibly and responsively. The proposed approach has been applied to a real-time visualization system of ocean data, which facilitates the comparison of water masses by aggregating the data without degrading the discriminating capabilities. This system provides an interface for interactively querying geographic locations with similar temperature-salinity-depth characteristics and for tracking specific patterns of water masses, such as the Kuroshio near Taiwan or those in the South China Sea.
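
A minimal sketch of the core computation with scikit-learn and SciPy: fit one Gaussian mixture model per site on temperature-salinity-depth samples, then score the weighted sum of pairwise Bhattacharyya distances between their components. The sample data and the number of components are invented for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.linalg import det, inv

def bhattacharyya(m1, c1, m2, c2):
    """Bhattacharyya distance between two multivariate Gaussians."""
    c = (c1 + c2) / 2.0
    diff = m1 - m2
    return (diff @ inv(c) @ diff) / 8.0 + 0.5 * np.log(det(c) / np.sqrt(det(c1) * det(c2)))

def gmm_distance(g1, g2):
    """Weighted sum of pairwise Bhattacharyya distances between two fitted GMMs."""
    d = 0.0
    for w1, m1, c1 in zip(g1.weights_, g1.means_, g1.covariances_):
        for w2, m2, c2 in zip(g2.weights_, g2.means_, g2.covariances_):
            d += w1 * w2 * bhattacharyya(m1, c1, m2, c2)
    return d

# Hypothetical temperature-salinity-depth samples from two locations (columns: T, S, depth)
rng = np.random.default_rng(0)
site_a = rng.normal([20.0, 34.5, 100.0], [2.0, 0.3, 50.0], size=(500, 3))
site_b = rng.normal([15.0, 34.8, 400.0], [1.5, 0.2, 80.0], size=(500, 3))

gmm_a = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(site_a)
gmm_b = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(site_b)
print("inter-site distance:", gmm_distance(gmm_a, gmm_b))
```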

Keywords: water mass, Gaussian mixture model, data visualization, system framework

Procedia PDF Downloads 105
1804 Aggregation of Fractal Aggregates Inside Fractal Cages in Irreversible Diffusion Limited Cluster Aggregation Binary Systems

Authors: Zakiya Shireen, Sujin B. Babu

Abstract:

Irreversible diffusion-limited cluster aggregation (DLCA) of binary sticky spheres was simulated by modifying Brownian Cluster Dynamics (BCD). We randomly distribute N spheres in a 3D box of size L; the volume fraction is given by Φtot = (π/6)N/L³. We identify NA and NB spheres as species A and B, both having identical size. In these systems, both A and B particles undergo Brownian motion. Irreversible bond formation happens only between intra-species particles, while inter-species particles interact only through hard-core repulsion. Performing simulations with BCD, we observe binary gels. We find that species B always percolates (cluster size equal to L), as expected for the monomeric case, whereas species A does not percolate below a critical ratio that differs for different volume fractions. We also show that the accessible volume of the system increases compared to the monomeric case, which means that species A aggregates inside the cage created by B. We further observe that, for moderate Φtot, the system undergoes a transition from the flocculation region to the percolation region, indicated by a change in fractal dimension from 1.8 to 2.5. For smaller ratios of A, species A stays in the flocculation regime even though B has already crossed over to the percolation regime. Thus, we observe two fractal dimensions in the same system.

Keywords: BCD, fractals, percolation, sticky spheres

Procedia PDF Downloads 257
1803 Features Reduction Using Bat Algorithm for Identification and Recognition of Parkinson Disease

Authors: P. Shrivastava, A. Shukla, K. Verma, S. Rungta

Abstract:

Parkinson's disease is a chronic neurological disorder that directly affects human gait, leading to slowness of movement, muscle rigidity and tremors. Gait serves as a primary outcome measure for studies aiming at early recognition of the disease. Using gait features, this paper implements an efficient binary bat algorithm for the early detection of Parkinson's disease by selecting the optimal features required to classify affected patients from others. Data from 166 people, both healthy and affected, were collected, and optimal feature selection was performed using PSO and the bat algorithm. The reduced dataset was then classified using a neural network. The experiments indicate that the binary bat algorithm outperforms traditional PSO and the genetic algorithm and gives a fairly good recognition rate even with the reduced dataset.
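
A simplified sketch of binary-bat-style feature selection is given below, using scikit-learn and a public binary dataset as a stand-in for the gait data (which is not available here). Velocities are mapped to bit-flip probabilities through a sigmoid transfer function and subsets are scored by cross-validated k-NN accuracy with a small size penalty; the loudness and pulse-rate mechanics of the full bat algorithm are omitted.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)           # stand-in binary dataset
rng = np.random.default_rng(0)
n_bats, n_feat, n_iter = 15, X.shape[1], 30
f_min, f_max = 0.0, 2.0

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask == 1], y, cv=3).mean()
    return acc - 0.01 * mask.sum() / n_feat           # small penalty on subset size

pos = (rng.random((n_bats, n_feat)) < 0.5).astype(int)
vel = np.zeros((n_bats, n_feat))
fit = np.array([fitness(p) for p in pos])
best_idx = int(fit.argmax())
best, best_fit = pos[best_idx].copy(), fit[best_idx]

for _ in range(n_iter):
    for i in range(n_bats):
        freq = f_min + (f_max - f_min) * rng.random()
        vel[i] += (pos[i] - best) * freq
        prob = 1.0 / (1.0 + np.exp(-vel[i]))          # sigmoid transfer to binary positions
        cand = (rng.random(n_feat) < prob).astype(int)
        cand_fit = fitness(cand)
        if cand_fit > fit[i]:
            pos[i], fit[i] = cand, cand_fit
        if cand_fit > best_fit:
            best, best_fit = cand.copy(), cand_fit

print(f"selected {int(best.sum())} of {n_feat} features, fitness {best_fit:.3f}")
```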

Keywords: Parkinson, gait, feature selection, bat algorithm

Procedia PDF Downloads 512
1802 A Demonstration of How to Employ and Interpret Binary IRT Models Using the New IRT Procedure in SAS 9.4

Authors: Ryan A. Black, Stacey A. McCaffrey

Abstract:

Over the past few decades, great strides have been made towards improving the science of measuring psychological constructs. Item Response Theory (IRT) has been the foundation upon which statistical models have been derived to increase both precision and accuracy in psychological measurement. These models are now being used widely to develop and refine tests intended to measure an individual's level of academic achievement, aptitude, and intelligence. Recently, the field of clinical psychology has adopted IRT models to measure psychopathological phenomena such as depression, anxiety, and addiction. Because advances in IRT measurement models are being made so rapidly across various fields, it has become quite challenging for psychologists and other behavioral scientists to keep abreast of the most recent developments, much less learn how to employ them and decide which models are the most appropriate to use in their line of work. In the same vein, IRT measurement models vary greatly in complexity in several interrelated ways, including but not limited to the number of item-specific parameters estimated in a given model, the function which links the expected response and the predictor, response option formats, and dimensionality. As a result, inferior methods (a.k.a. Classical Test Theory methods) continue to be employed in efforts to measure psychological constructs, despite evidence showing that IRT methods yield more precise and accurate measurement. To increase the use of IRT methods, this study endeavors to provide a comprehensive overview of binary IRT models; that is, measurement models employed on test data consisting of binary response options (e.g., correct/incorrect, true/false, agree/disagree). Specifically, this study will cover binary IRT models from the most basic, the 1-parameter logistic (1-PL) model dating back over 50 years, up to the most recent and complex 4-parameter logistic (4-PL) model. Binary IRT models will be defined mathematically and the interpretation of each parameter will be provided. Next, all four binary IRT models will be employed on two sets of data: 1. simulated data of N = 500,000 subjects who responded to four dichotomous items, and 2. a pilot analysis of real-world data collected from a sample of approximately 770 subjects who responded to four self-report dichotomous items pertaining to emotional consequences of alcohol use. Real-world data were based on responses collected on items administered to subjects as part of a scale-development study (NIDA Grant No. R44 DA023322). The IRT analyses conducted on both the simulated data and the real-world pilot data will provide a clear demonstration of how to construct, evaluate, and compare binary IRT measurement models. All analyses will be performed using the new IRT procedure in SAS 9.4. SAS code to generate the simulated data and analyses will be available upon request to allow for replication of results.
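
The analyses themselves use PROC IRT in SAS 9.4, which is not reproduced here; purely to make the parameter interpretation concrete, the short Python sketch below evaluates the 4-PL item response function, from which the 3-PL, 2-PL and 1-PL forms follow by fixing d, c and a in turn. The example item values are made up.

```python
import numpy as np

def irt_prob(theta, a=1.0, b=0.0, c=0.0, d=1.0):
    """P(correct/endorse) under the 4-PL model: a = discrimination, b = difficulty,
    c = lower asymptote (guessing), d = upper asymptote.
    Fixing d = 1 gives the 3-PL, additionally c = 0 the 2-PL, and a common a across items the 1-PL."""
    return c + (d - c) / (1.0 + np.exp(-a * (theta - b)))

# Example item: discrimination 1.5, difficulty 0.5, guessing 0.2, ceiling 0.95 (illustrative values)
for theta in (-2.0, 0.0, 2.0):
    print(theta, round(irt_prob(theta, a=1.5, b=0.5, c=0.2, d=0.95), 3))
```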

Keywords: instrument development, item response theory, latent trait theory, psychometrics

Procedia PDF Downloads 327
1801 Improving Swelling Performance Using Industrial Waste Products

Authors: Mohieldin Elmashad, Salwa Yassin

Abstract:

Expansive soils are regarded as one of the most problematic unsaturated formations in the Egyptian arid zones and present a great challenge in civil engineering in general, and in geotechnical engineering in particular. Severe geotechnical complications and consequent structural damage have arisen due to excessive and differential volumetric changes upon wetting and changes in water content. Different studies have been carried out on the swelling performance of expansive soils using different additives, including phospho-gypsum as an industrial waste product. This paper describes the results of a comprehensive testing programme carried out to investigate the effect of phospho-gypsum (PG) and sodium chloride (NaCl), as an additive mixture, on the swelling performance of constituent samples of swelling soils. The constituent samples comprise commercial bentonite collected from a natural site, mixed with different percentages of the PG-NaCl mixture. The testing programme was scoped to cover the physical and chemical properties of the constituent samples. In addition, a mineralogical study using X-ray diffraction (XRD) was performed on the collected bentonite and on the bentonite mixed with the PG-NaCl mixture. The results of this study show a significant improvement in the swelling performance of the tested samples as the proposed PG-NaCl mixture content increases.

Keywords: expansive soils, industrial waste, mineralogical study, swelling performance, X-ray diffraction

Procedia PDF Downloads 243
1800 Binary Programming for Manufacturing Material and Manufacturing Process Selection Using Genetic Algorithms

Authors: Saleem Z. Ramadan

Abstract:

The material selection problem is concerned with determining the right material for a certain product to optimize performance indices of that product, such as mass, energy density and power-to-weight ratio. This paper is concerned with optimizing the selection of the manufacturing process along with the material used in the product, under performance-index and availability constraints. The problem is formulated using binary programming and solved with a genetic algorithm. The objective function of the model minimizes the total manufacturing cost subject to performance-index, material availability and manufacturing process availability constraints.
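
A toy sketch of the formulation is given below: each chromosome is a binary string encoding a one-hot material choice followed by a one-hot process choice, constraint violations are handled with penalties, and a small genetic algorithm (truncation selection, single-point crossover, bit-flip mutation) minimizes total cost. All cost, performance and availability numbers are invented, and this encoding is only one of several reasonable choices, not necessarily the paper's.

```python
import numpy as np

# Invented data: candidate materials and processes
mat_cost   = np.array([4.0, 6.5, 9.0])        # per-unit material cost
mat_perf   = np.array([0.6, 0.8, 1.0])        # normalized performance index
proc_cost  = np.array([2.0, 3.5])             # manufacturing process cost
proc_avail = np.array([[1, 1],                # is process j available for material i?
                       [1, 0],
                       [0, 1]])
PERF_MIN = 0.7
rng = np.random.default_rng(0)

def decode(bits):
    """Chromosome = 3 one-hot material bits followed by 2 one-hot process bits."""
    return int(np.argmax(bits[:3])), int(np.argmax(bits[3:]))

def total_cost(bits):
    m, p = decode(bits)
    penalty = 0.0
    if mat_perf[m] < PERF_MIN:
        penalty += 100.0                       # performance-index constraint
    if proc_avail[m, p] == 0:
        penalty += 100.0                       # availability constraint
    return mat_cost[m] + proc_cost[p] + penalty

pop = rng.integers(0, 2, size=(20, 5))
for _ in range(50):
    cost = np.array([total_cost(ind) for ind in pop])
    parents = pop[np.argsort(cost)[:10]]       # truncation selection
    kids = []
    for _ in range(10):
        p1, p2 = parents[rng.integers(0, 10)], parents[rng.integers(0, 10)]
        cut = rng.integers(1, 5)               # single-point crossover
        kids.append(np.concatenate([p1[:cut], p2[cut:]]))
        kids.append(np.concatenate([p2[:cut], p1[cut:]]))
    pop = np.array(kids)
    flips = rng.random(pop.shape) < 0.05       # bit-flip mutation
    pop[flips] ^= 1

best = min(pop, key=total_cost)
print("material, process:", decode(best), "cost:", total_cost(best))
```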

Keywords: optimization, material selection, process selection, genetic algorithm

Procedia PDF Downloads 383
1799 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm

Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn

Abstract:

Feature selection and attribute reduction are crucial problems and widely used techniques in machine learning, data mining and pattern recognition for overcoming the well-known curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we employ the fuzzy-rough dependency degree (FRDD) of lower approximation-based fuzzy-rough feature selection (L-FRFS), due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets from the UCI repository, including nine high-dimensional and large ones. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and classification accuracy.

Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct

Procedia PDF Downloads 185
1798 Getting to Know the Types of Asphalt, Its Manufacturing and Processing Methods and Its Application in Road Construction

Authors: Hamid Fallah

Abstract:

Asphalt is generally a mixture of stone materials with continuous gradation and a binder, which is usually bitumen. Asphalt is produced in different forms according to its use; the most familiar type is hot asphalt, or hot asphalt concrete. Stone materials usually make up more than 90% of the asphalt mixture and therefore have a significant impact on the quality of the resulting asphalt. According to the method of application and mixing, asphalt is divided into three categories: hot asphalt, protective asphalt, and cold asphalt. Cold mix asphalt is a mixture of stone materials and mixed bitumen or bitumen emulsion whose raw materials are combined at ambient temperature. In some types of cold asphalt the bitumen may be heated as necessary, but the other materials are mixed with the bitumen without heating. Protective asphalts are used to make the roadbed impermeable, to increase its abrasion and skid resistance, and to temporarily improve existing asphalt and concrete surfaces. This type of paving is very economical compared to hot asphalt because of its speed and ease of application and the limited need for asphalt machinery and equipment. The present article, prepared as a descriptive library study, introduces asphalt, its types, its characteristics, and its applications.

Keywords: asphalt, type of asphalt, asphalt concrete, sulfur concrete, bitumen in asphalt, sulfur, stone materials

Procedia PDF Downloads 33
1797 Comparative Performance of Artificial Bee Colony Based Algorithms for Wind-Thermal Unit Commitment

Authors: P. K. Singhal, R. Naresh, V. Sharma

Abstract:

This paper presents three optimization models, namely the New Binary Artificial Bee Colony (NBABC) algorithm, NBABC with Local Search (NBABC-LS), and NBABC with Genetic Crossover (NBABC-GC), for solving the Wind-Thermal Unit Commitment (WTUC) problem. The uncertain nature of wind power is incorporated using the Weibull probability density function, which is used to calculate the overestimation and underestimation costs associated with wind power fluctuation. The NBABC algorithm uses a mechanism based on a dissimilarity measure between binary strings to generate the binary solutions of the WTUC problem. In the NBABC algorithm, an intelligent scout bee phase is proposed that replaces the abandoned solution with the global best solution. The local search operator exploits the neighborhood of the current solutions, whereas the integration of genetic crossover with the NBABC algorithm increases diversity in the search space and thus avoids the local trapping encountered with the NBABC algorithm. These models are used to decide the units' on/off status, after which the lambda iteration method dispatches the hourly load demand among the committed units. The effectiveness of the proposed models is validated on an IEEE 10-unit thermal system combined with a wind farm over a planning period of 24 hours.
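
The lambda iteration step mentioned at the end can be sketched in a few lines: with quadratic unit costs, the equal incremental cost condition b_i + 2 c_i P_i = lambda gives each unit's output, clipped to its limits, and lambda is bisected until the committed units meet the hourly demand. The cost coefficients, limits and demand below are illustrative textbook-style values, not the IEEE 10-unit data.

```python
import numpy as np

# Illustrative quadratic cost data for three committed thermal units: C_i(P) = a + b*P + c*P^2
b = np.array([7.0, 7.85, 7.97])                 # $/MWh
c = np.array([0.00156, 0.00194, 0.00482])
p_min = np.array([100.0, 100.0, 50.0])
p_max = np.array([600.0, 400.0, 200.0])
demand = 850.0                                   # MW to be shared among the committed units

def dispatch(lam):
    # Equal incremental cost rule: b + 2cP = lambda, clipped at the unit limits
    return np.clip((lam - b) / (2 * c), p_min, p_max)

lo, hi = 5.0, 20.0                               # bisection bounds on the system lambda ($/MWh)
for _ in range(60):
    lam = 0.5 * (lo + hi)
    if dispatch(lam).sum() < demand:
        lo = lam
    else:
        hi = lam
print("lambda:", round(lam, 3), "dispatch (MW):", dispatch(lam).round(1))
```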

Keywords: artificial bee colony algorithm, economic dispatch, unit commitment, wind power

Procedia PDF Downloads 351
1796 Video Foreground Detection Based on Adaptive Mixture Gaussian Model for Video Surveillance Systems

Authors: M. A. Alavianmehr, A. Tashk, A. Sodagaran

Abstract:

Modeling the background and moving objects is a significant technique for video surveillance and other video processing applications. This paper presents a foreground detection algorithm, based on an adaptive Gaussian mixture model (GMM), that is robust against illumination changes and noise and provides a novel and practical choice for intelligent video surveillance systems using static cameras. In previous methods, the image of still objects (the background image) is given little significance. In contrast, this method is based on forming a meticulous background image and exploiting it to separate moving objects from their background. The background image is either specified manually, by taking an image without vehicles, or detected in real time by forming a mathematical or exponential average of successive images. The proposed scheme introduces little image degradation, and the simulation results demonstrate a high degree of performance for the proposed method.

Keywords: image processing, background models, video surveillance, foreground detection, Gaussian mixture model

Procedia PDF Downloads 486
1795 Competitive Adsorption of Heavy Metals onto Natural and Activated Clay: Equilibrium, Kinetics and Modeling

Authors: L. Khalfa, M. Bagane, M. L. Cervera, S. Najjar

Abstract:

The aim of this work is to present a low-cost adsorbent for removing toxic heavy metals from aqueous solutions. We therefore investigate the efficiency of natural clay minerals collected from southern Tunisia, and of their form modified with sulfuric acid, in removing the toxic metal ions Zn(II) and Pb(II) from synthetic wastewater solutions. The results indicate that metal uptake is pH-dependent, with maximum removal occurring at pH 6. Adsorption equilibrium is reached rapidly, after 90 min, for both metal ions studied. The kinetic results show that the pseudo-second-order model describes the adsorption and that intraparticle diffusion is the rate-limiting step. Treating the natural clay with sulfuric acid creates more active sites and increases the surface area, leading to an increase in the adsorbed quantities of lead and zinc in both single and binary systems. The competitive adsorption study showed that lead uptake was inhibited in the presence of 10 mg/L of zinc, indicating an antagonistic binary adsorption mechanism. These results reveal that clay is an effective natural material for removing lead and zinc from aqueous solution in single and binary systems.
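
To make the kinetic claim concrete, the sketch below fits the pseudo-second-order model q(t) = k2 qe^2 t / (1 + k2 qe t) to hypothetical uptake data with SciPy; the time points and uptake values are invented and only illustrate the fitting procedure, not the reported results.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical uptake data q(t) in mg/g for a metal ion on activated clay (illustrative values)
t = np.array([5, 10, 20, 30, 45, 60, 90], dtype=float)         # min
q = np.array([8.1, 13.0, 18.5, 21.2, 23.4, 24.6, 25.5])

def pseudo_second_order(t, qe, k2):
    # q(t) = k2 * qe^2 * t / (1 + k2 * qe * t)
    return (k2 * qe**2 * t) / (1.0 + k2 * qe * t)

(qe_fit, k2_fit), _ = curve_fit(pseudo_second_order, t, q, p0=(26.0, 0.01))
print(f"qe = {qe_fit:.1f} mg/g, k2 = {k2_fit:.4f} g/(mg*min)")
```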

Keywords: heavy metal, activated clay, kinetic study, competitive adsorption, modeling

Procedia PDF Downloads 193
1794 Change Point Analysis in Average Ozone Layer Temperature Using Exponential Lomax Distribution

Authors: Amjad Abdullah, Amjad Yahya, Bushra Aljohani, Amani Alghamdi

Abstract:

Change point detection is an important part of data analysis. The presence of a change point indicates a significant change in the behavior of a time series. In this article, we examine the detection of multiple change points in the parameters of the exponential Lomax distribution, which is broad and flexible compared with other distributions when fitting data. We used the Schwarz information criterion and binary segmentation to detect multiple change points in publicly available data on the average temperature in the ozone layer. The change points were successfully located.
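
The same binary-segmentation idea can be sketched with the ruptures package, as below, using a Gaussian least-squares cost and an SIC-style penalty in place of the paper's exponential Lomax likelihood; the temperature-like series is synthetic.

```python
import numpy as np
import ruptures as rpt

# Synthetic monthly mean-temperature-like series with two shifts in level
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(-1.0, 0.3, 120),
                         rng.normal(-0.4, 0.3, 120),
                         rng.normal(0.2, 0.3, 120)])

# Binary segmentation; a Schwarz/SIC-style penalty selects the number of change points
algo = rpt.Binseg(model="l2").fit(signal)
bkps = algo.predict(pen=np.log(len(signal)) * 3)
print("estimated change points at indices:", bkps)
```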

Keywords: binary segmentation, change point, exponential Lomax distribution, information criterion

Procedia PDF Downloads 146
1793 Musical Instrument Recognition in Polyphonic Audio Through Convolutional Neural Networks and Spectrograms

Authors: Rujia Chen, Akbar Ghobakhlou, Ajit Narayanan

Abstract:

This study investigates the task of identifying musical instruments in polyphonic compositions using Convolutional Neural Networks (CNNs) from spectrogram inputs, focusing on binary classification. The model showed promising results, with an accuracy of 97% on solo instrument recognition. When applied to polyphonic combinations of 1 to 10 instruments, the overall accuracy was 64%, reflecting the increasing challenge with larger ensembles. These findings contribute to the field of Music Information Retrieval (MIR) by highlighting the potential and limitations of current approaches in handling complex musical arrangements. Future work aims to include a broader range of musical sounds, including electronic and synthetic sounds, to improve the model's robustness and applicability in real-time MIR systems.
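
A minimal PyTorch sketch of the kind of binary spectrogram classifier described ("is instrument X present?") is shown below; the architecture, input size and all hyperparameters are placeholders rather than the network evaluated in the study.

```python
import torch
import torch.nn as nn

class InstrumentCNN(nn.Module):
    """Tiny CNN for binary instrument-presence decisions on spectrogram patches."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 32 * 32, 64), nn.ReLU(), nn.Linear(64, 1)
        )

    def forward(self, x):                       # x: (batch, 1, 128, 128) spectrogram
        return self.classifier(self.features(x))

model = InstrumentCNN()
logits = model(torch.randn(4, 1, 128, 128))     # dummy batch of 4 spectrograms
loss = nn.BCEWithLogitsLoss()(logits.squeeze(1), torch.tensor([1.0, 0.0, 1.0, 0.0]))
```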

Keywords: binary classifier, CNN, spectrogram, instrument

Procedia PDF Downloads 1
1792 Engineering Optimization Using Two-Stage Differential Evolution

Authors: K. Y. Tseng, C. Y. Wu

Abstract:

This paper employs a heuristic algorithm to solve engineering problems, including truss structure optimization and optimal chiller loading (OCL) problems. Two different types of algorithms, real-valued differential evolution (DE) and modified binary differential evolution (MBDE), are successfully integrated so as to obtain better performance in solving engineering problems. To demonstrate the performance of the proposed algorithm, this study adopts one test case each of the truss structure optimization and OCL problems and compares the results with those of other heuristic optimization methods. The results indicate that the proposed algorithm obtains similar or better solutions compared with previous studies.

Keywords: differential evolution, truss structure optimization, optimal chiller loading, modified binary differential evolution

Procedia PDF Downloads 136
1791 A Monte Carlo Fuzzy Logistic Regression Framework against Imbalance and Separation

Authors: Georgios Charizanos, Haydar Demirhan, Duygu Icen

Abstract:

Two of the most impactful issues in classical logistic regression are class imbalance and complete separation. These can result in model predictions that lean heavily towards the dominant class of the binary response variable, or in over-fitting. Fuzzy methodology offers key solutions for handling these problems. However, most studies propose transforming the binary responses into a continuous format limited within [0,1]. This is called the possibilistic approach within fuzzy logistic regression. Following this approach is more aligned with straightforward regression, since a logit-link function is not utilized and fuzzy probabilities are not generated. In contrast, we propose a method of fuzzifying binary response variables that allows for the use of the logit-link function; hence, a probabilistic fuzzy logistic regression model with the Monte Carlo method. The fuzzy probabilities are then classified by selecting a fuzzy threshold. Different combinations of fuzzy and crisp input, output, and coefficients are explored, aiming to understand which of these perform better under different conditions of imbalance and separation. We conduct numerical experiments using both synthetic and real datasets to demonstrate the performance of the fuzzy logistic regression framework against seven crisp machine learning methods. The proposed framework shows better performance irrespective of the degree of imbalance and presence of separation in the data, while the considered machine learning methods are significantly impacted.
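
The abstract does not spell out the algorithmic details, so the sketch below is only one plausible reading of a Monte Carlo, probabilistic-fuzzy scheme: each crisp binary label is replaced by a membership degree, crisp labels are repeatedly resampled from those memberships, a standard logit-link logistic regression is fitted to each draw, and the averaged predicted probabilities are thresholded. It is written for illustration with scikit-learn and synthetic data and should not be taken as the authors' method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification

rng = np.random.default_rng(0)
# Imbalanced synthetic binary data (roughly 90% / 10% classes)
X, y = make_classification(n_samples=400, n_features=5, weights=[0.9, 0.1], random_state=0)

# Fuzzify the binary responses: a membership degree near 0 or 1 instead of a hard label
membership = np.clip(y + rng.normal(0.0, 0.1, size=y.shape), 0.0, 1.0)

# Monte Carlo: resample crisp labels from the memberships and fit a logit-link model each time
n_draws = 200
probs = np.zeros((n_draws, len(y)))
for i in range(n_draws):
    y_draw = (rng.random(len(y)) < membership).astype(int)
    model = LogisticRegression(max_iter=1000).fit(X, y_draw)
    probs[i] = model.predict_proba(X)[:, 1]

fuzzy_prob = probs.mean(axis=0)          # averaged ("fuzzy") probabilities
threshold = 0.5                          # fuzzy threshold for the final classification
y_hat = (fuzzy_prob >= threshold).astype(int)
print("positive predictions:", int(y_hat.sum()), "of", len(y))
```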

Keywords: fuzzy logistic regression, fuzzy, logistic, machine learning

Procedia PDF Downloads 36