Search results for: multiscale entropy
149 Use of Diatomite for the Elimination of Chromium Three from Wastewater, Annaba, Algeria
Authors: Sabiha Chouchane, Toufik Chouchane, Azzedine Hani
Abstract:
The wastewater was treated with a natural adsorbent, diatomite, to eliminate chromium(III). The diatomite comes from Sig (west of Algeria). Physicochemical characterization revealed that the diatomite is mainly made up of silica and lime, with a lower proportion of alumina. The process, carried out in static regime at 20°C, with a stirring speed of 150 rpm, pH = 4, and a grain diameter between 100 and 150 µm, shows that one gram of purified diatomite can fix, according to the Langmuir model, up to 39.64 mg/g of chromium with pseudo-first-order kinetics. The pseudo-equilibrium time is 25 minutes. The value of the RL separation factor indicates a good affinity between the adsorbent and the adsorbate and hence a good adsorption capacity of the solid used. External transport of the metal ions from the solution to the adsorbent appears to be a rate-controlling step of the overall process; internal transport in the pores, on the other hand, is not the only mechanism limiting the sorption kinetics. Thermodynamic parameters show that chromium sorption is spontaneous and exothermic, with negative entropy.
Keywords: adsorption, diatomite, Cr(III), wastewater
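As an aside for readers reproducing such isotherm work, the Langmuir maximum capacity and the RL separation factor referred to in the abstract are conventionally estimated from equilibrium data along these lines (a minimal sketch with hypothetical data, not the authors' measurements):

```python
def langmuir_linear_fit(ce, qe):
    """Least-squares fit of the linearized Langmuir form
    Ce/qe = Ce/qmax + 1/(KL*qmax); returns (qmax, KL)."""
    n = len(ce)
    y = [c / q for c, q in zip(ce, qe)]
    sx, sy = sum(ce), sum(y)
    sxx = sum(c * c for c in ce)
    sxy = sum(c * v for c, v in zip(ce, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    qmax = 1.0 / slope
    kl = slope / intercept
    return qmax, kl


def separation_factor(kl, c0):
    """R_L = 1/(1 + KL*C0); 0 < R_L < 1 indicates favourable adsorption."""
    return 1.0 / (1.0 + kl * c0)


# Hypothetical equilibrium data (mg/L, mg/g) consistent with qmax = 40, KL = 0.5
ce = [1.0, 2.0, 5.0, 10.0, 20.0]
qe = [40.0 * 0.5 * c / (1.0 + 0.5 * c) for c in ce]
qmax, kl = langmuir_linear_fit(ce, qe)
```

A value of RL strictly between 0 and 1 is the "favourable adsorption" criterion the abstract invokes.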
Procedia PDF Downloads 55
148 Patient-Specific Modeling Algorithm for Medical Data Based on AUC
Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper
Abstract:
Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms, based on the decision tree paradigm, that use the AUC as the metric for attribute selection. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results support patient-specific methods as a promising approach for making clinical predictions.
Keywords: instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions
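The AUC used here as the attribute-selection metric can be read as the probability that a randomly chosen positive case is scored above a randomly chosen negative one; a minimal rank-based sketch (not the authors' implementation):

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank (Mann-Whitney U) formulation:
    the fraction of positive/negative pairs in which the positive outranks
    the negative, with ties counted as half."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```

A perfect ranking gives 1.0 and an uninformative one 0.5, which is what makes the measure usable for comparing candidate attributes.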
Procedia PDF Downloads 479
147 Kinetics and Thermodynamics of Sorption of 5-Fluorouracil (5-FL) on Carbon Nanotubes
Authors: Muhammad Imran Din
Abstract:
The aim of this study was to understand the interaction between multi-walled carbon nanotubes (MWCNTs) and anticancer agents and to evaluate the drug-loading ability of MWCNTs. Batch adsorption experiments were carried out for the adsorption of 5-fluorouracil (5-FL) on MWCNTs. The effects of various operating variables, viz. adsorbent dosage, pH, contact time, and temperature, on the adsorption of 5-FL were studied. The Freundlich adsorption model successfully described the adsorption process. The pseudo-second-order mechanism was found to be predominant, and the overall rate of the 5-FL adsorption process appears to be controlled by more than one step. Thermodynamic parameters, namely the free energy change (ΔG°), enthalpy change (ΔH°), and entropy change (ΔS°), were calculated, revealing the spontaneous, endothermic, and feasible nature of the adsorption process. The results showed that the carbon nanotubes were able to form supramolecular complexes with 5-FL by π-π stacking and possessed favorable loading properties as drug carriers.
Keywords: drug, adsorption, anticancer, 5-fluorouracil (5-FL)
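The thermodynamic parameters ΔG°, ΔH°, and ΔS° named in the abstract are conventionally obtained from a van't Hoff plot of ln K against 1/T; a sketch with hypothetical equilibrium constants (not the paper's data):

```python
import math

R = 8.314  # universal gas constant, J mol^-1 K^-1

def vant_hoff(temps_K, K_eq):
    """Fit ln K = -dH/(R*T) + dS/R (van't Hoff plot) by least squares,
    then compute dG = dH - T*dS at each temperature.
    Returns (dH [J/mol], dS [J/(mol K)], list of dG [J/mol])."""
    x = [1.0 / t for t in temps_K]
    y = [math.log(k) for k in K_eq]
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    dH = -R * slope           # positive dH -> endothermic
    dS = R * intercept
    dG = [dH - t * dS for t in temps_K]
    return dH, dS, dG
```

A positive ΔH° with negative ΔG° at all temperatures matches the "endothermic and spontaneous" reading in the abstract.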
Procedia PDF Downloads 361
146 A Similarity Measure for Classification and Clustering in Image Based Medical and Text Based Banking Applications
Authors: K. P. Sandesh, M. H. Suman
Abstract:
Text processing plays an important role in information retrieval, data mining, and web search. Measuring the similarity between documents is an important operation in the text processing field. In this project, a new similarity measure is proposed. To compute the similarity between two documents with respect to a feature, the proposed measure takes the following three cases into account: (1) the feature appears in both documents; (2) the feature appears in only one document; and (3) the feature appears in neither document. The proposed measure is extended to gauge the similarity between two sets of documents. The effectiveness of the measure is evaluated on several real-world datasets for text classification and clustering problems, especially in the banking and health sectors. The results show that the performance obtained with the proposed measure is better than that achieved with the other measures.
Keywords: document classification, document clustering, entropy, accuracy, classifiers, clustering algorithms
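The three-case structure described above can be illustrated as follows; the abstract does not give the exact formula, so the per-case scores (and the `lam` reward for shared absence) are assumptions, not the authors' measure:

```python
def doc_similarity(d1, d2, lam=0.5):
    """Per-feature similarity over aligned term-frequency vectors, with the
    three cases from the abstract (case scores are illustrative):
      (1) feature in both docs    -> 1 - |x - y| / max(x, y)
      (2) feature in only one doc -> 0
      (3) feature in neither doc  -> lam
    averaged over the vocabulary."""
    total = 0.0
    for x, y in zip(d1, d2):
        if x > 0 and y > 0:
            total += 1.0 - abs(x - y) / max(x, y)
        elif x == 0 and y == 0:
            total += lam
        # one-sided occurrence contributes 0
    return total / len(d1)
```

The point of case (3) is that two documents jointly lacking a term carry weaker, but non-zero, evidence of similarity than jointly containing it.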
Procedia PDF Downloads 518
145 Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC) within Operational Research (OR) with Sustainability and Phenomenology
Authors: Hussain Abdullah Al-Salamin, Elias Ogutu Azariah Tembe
Abstract:
Supply chain (SC) is an operational research (OR) approach and technique which acts as a catalyst within the central nervous system of business today. Without SC, any type of business is in the doldrums, hence entropy. SC is the lifeblood of business today because it is the pivotal hub which provides an imperative competitive advantage. The paper presents a conceptual framework dubbed the Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC). The term homomorphic is derived from the abstract-algebraic term homomorphism (same shape), which also embeds the following mathematical notions: monomorphism, isomorphism, automorphism, and endomorphism. The HCEFSC is intertwined and integrated with wide and broad sets of elements.
Keywords: homomorphism, isomorphism, monomorphisms, automorphisms, epimorphisms, endomorphism, supply chain, operational research (OR)
Procedia PDF Downloads 372
144 Modeling Driving Distraction Considering Psychological-Physical Constraints
Authors: Yixin Zhu, Lishengsa Yue, Jian Sun, Lanyue Tang
Abstract:
Modeling driving distraction in microscopic traffic simulation is crucial for enhancing simulation accuracy. Current driving distraction models are mainly derived from physical motion constraints under distracted states, in which distraction-related error terms are added to existing microscopic driver models. However, model accuracy is not very satisfying, owing to a lack of modeling of the cognitive mechanism underlying distraction. This study models driving distraction based on the Queueing Network-Model Human Processor (QN-MHP), using the model's queuing structure to perform task invocation and switching for distracted operation and control of the vehicle. Under the QN-MHP's assumption about the cognitive sub-network, server F is a structural bottleneck: new information must wait for the previous information to leave server F before it can be processed there, so the waiting time for task switching needs to be calculated. Since the QN-MHP processes auditory and visual information along different paths, this study divides driving distraction into two types: auditory distraction and visual distraction. For visual distraction, both the visual distraction task and the driving task pass through the visual perception sub-network, and their stimuli are asynchronous; this stimulus onset asynchrony (SOA) must be considered when calculating the waiting time for task switching. For auditory distraction, the auditory distraction task and the driving task do not compete for the server resources of the perceptual sub-network, so their stimuli can be treated as synchronized, without considering the difference in stimulus arrival times. Following the Theory of Planned Behavior (TPB) for drivers, this study uses risk entropy as the decision criterion for driver task switching.
A logistic regression model with risk entropy as the independent variable is used to determine whether the driver performs a distraction task, explaining the relationship between perceived risk and distraction. Furthermore, to model a driver's perception characteristics, a neurophysiological model of visual distraction tasks is incorporated into the QN-MHP and coupled with the classical Intelligent Driver Model (IDM). The proposed driving distraction model integrates the psychological cognitive process of a driver with physical motion characteristics, resulting in both high accuracy and interpretability. Using 773 segments of distracted car-following from the Shanghai Naturalistic Driving Study (SH-NDS) data, the paper classifies distracted behavior on different road facilities and obtains three distraction patterns: numbness, delay, and aggressiveness. The model was calibrated and verified by simulation. The results indicate that the model can effectively simulate distracted car-following behavior of the different patterns on various roadway facilities, and that its performance is better than the traditional IDM with distraction-related error terms. The proposed model overcomes the limitations of physical-constraints-based models in replicating dangerous driving behaviors and the internal characteristics of an individual driver. Moreover, the model is shown to effectively generate more dangerous distracted driving scenarios, which can be used to construct high-value automated driving test scenarios.
Keywords: computational cognitive model, driving distraction, microscopic traffic simulation, psychological-physical constraints
Procedia PDF Downloads 91
143 Study of Structural Behavior and Proton Conductivity of Inorganic Gel Paste Electrolyte at Various Phosphorous to Silicon Ratio by Multiscale Modelling
Authors: P. Haldar, P. Ghosh, S. Ghoshdastidar, K. Kargupta
Abstract:
In polymer electrolyte membrane fuel cells (PEMFC), the membrane electrode assembly (MEA) consists of two platinum-coated carbon electrodes sandwiching a proton-conducting, phosphoric acid-doped polymeric membrane. Owing to low mechanical stability, flooding, and fuel crossover, the use of phosphoric acid in a polymeric membrane is very critical. Phosphorus- and silica-based 3D inorganic gels have gained attention in the fields of supercapacitors, fuel cells, and metal hydride batteries due to their thermally stable, highly proton-conductive behavior. Moreover, since large amounts of water molecules and phosphoric acid can easily be trapped in the cavities of the Si-O-Si network, leaching out is prevented. In this study, we performed molecular dynamics (MD) simulations and first-principles calculations to understand the structural, electronic, electrochemical, and morphological behavior of this inorganic gel at various P to Si ratios. We used dipole-dipole interactions, H bonding, and van der Waals forces to study the main interactions between the molecules. A 'structure-property-performance' mapping was initiated to determine the optimum P to Si ratio for the best proton conductivity. We performed MD simulations at various temperatures to understand the temperature dependence of proton conductivity. The results yield a model which fits well with experimental data and other literature values. We also studied the mechanism behind the proton conductivity, and finally we propose a structure for the gel paste with the optimum P to Si ratio.
Keywords: first principle calculation, molecular dynamics simulation, phosphorous and silica based 3D inorganic gel, polymer electrolyte membrane fuel cells, proton conductivity
Procedia PDF Downloads 129
142 A Survey on Lossless Compression of Bayer Color Filter Array Images
Authors: Alina Trifan, António J. R. Neves
Abstract:
Although most digital cameras acquire images in a raw format, based on a Color Filter Array (CFA) that arranges RGB color filters on a square grid of photosensors, most image compression techniques do not use the raw data; instead, they use the RGB result of an interpolation algorithm applied to the raw data. This approach is inefficient: by performing lossless compression of the raw data, followed by pixel interpolation, digital cameras could be more power-efficient and provide images with increased resolution, given that the interpolation step could be shifted to an external processing unit. In this paper, we conduct a survey on the use of lossless compression algorithms with raw Bayer images. Moreover, in order to reduce the effect of the transitions between colors, which increase the entropy of the raw Bayer image, we split the image into three new images corresponding to each channel (red, green, and blue) and study the same compression algorithms applied to each one individually. This simple pre-processing stage yields an improvement of more than 15% in prediction-based methods.
Keywords: Bayer image, CFA, lossless compression, image coding standards
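The channel-splitting pre-processing step can be sketched for an RGGB mosaic as follows (the actual CFA layout varies by sensor; RGGB is an assumption here, and real pipelines operate on arrays rather than nested lists):

```python
def split_bayer_rggb(raw):
    """Split a single-channel RGGB Bayer mosaic (list of rows) into
    red, green, and blue sub-images so that each channel can be
    compressed separately, avoiding the color-transition entropy of
    the interleaved mosaic."""
    r = [row[0::2] for row in raw[0::2]]                                  # even rows, even cols
    g = [row[1::2] for row in raw[0::2]] + [row[0::2] for row in raw[1::2]]  # both green sites
    b = [row[1::2] for row in raw[1::2]]                                  # odd rows, odd cols
    return r, g, b
```

Each sub-image is spatially coherent within one color, which is what lets predictive coders exploit smoother local statistics.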
Procedia PDF Downloads 320
141 Performance Study of Cascade Refrigeration System Using Alternative Refrigerants
Authors: Gulshan Sachdeva, Vaibhav Jain, S. S. Kachhwaha
Abstract:
Cascade refrigeration systems employ a series of single-stage vapor compression units which are thermally coupled through evaporator/condenser cascades. A different refrigerant is used in each circuit, depending on the optimum characteristics shown by the refrigerant for a particular application. In the present study, a steady-state thermodynamic model is developed which simulates the working of an actual cascade system. The model provides the COP and all other system parameters, such as total compressor work, temperature, pressure, enthalpy, and entropy at the different state points. The working fluid in the low-temperature circuit (LTC) is CO2 (R744), while ammonia (R717), propane (R290), propylene (R1270), R404A, and R12 are the refrigerants in the high-temperature circuit (HTC). The performance curves of ammonia, propane, propylene, and R404A are compared with those of R12 to find its nearest substitute. Results show that ammonia is the best substitute for R12.
Keywords: cascade system, refrigerants, thermodynamic model, production engineering
Procedia PDF Downloads 361
140 Quantitative Comparisons of Different Approaches for Rotor Identification
Authors: Elizabeth M. Annoni, Elena G. Tolkacheva
Abstract:
Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia and a known prognostic marker for stroke, heart failure, and death. Reentrant mechanisms of rotor formation, which are stable electrical sources of cardiac excitation, are believed to cause AF. No existing commercial mapping system has been demonstrated to consistently and accurately predict rotor locations outside of the pulmonary veins in patients with persistent AF. There is a clear need for robust spatio-temporal techniques that can consistently identify rotors using unique characteristics of the electrical recordings at the pivot point and that can be applied to clinical intracardiac mapping. Recently, we developed four new signal analysis approaches – Shannon entropy (SE), kurtosis (Kt), multi-scale frequency (MSF), and multi-scale entropy (MSE) – to identify the pivot points of rotors. These proposed techniques utilize cardiac signal characteristics other than local activation to uncover the intrinsic complexity of the electrical activity in the rotors, which is not taken into account in current mapping methods. We validated these techniques using high-resolution optical mapping experiments in which direct visualization and identification of rotors in ex-vivo Langendorff-perfused hearts were possible. Episodes of ventricular tachycardia (VT) were induced using burst pacing, and two examples of rotors were used: 3-sec episodes of a single stationary rotor and of figure-8 reentry with one stationary and one meandering rotor. Movies were captured at a rate of 600 frames per second for 3 sec at 64x64 pixel resolution. These optical mapping movies were used to evaluate the performance and robustness of the SE, Kt, MSF, and MSE techniques with respect to the following clinical limitations: different durations of recording, different spatial resolutions, and the presence of meandering rotors.
To quantitatively compare the results, the SE, Kt, MSF, and MSE techniques were compared to the "true" rotor(s) identified using the phase map. Accuracy was calculated for each approach as the duration of the time series and the spatial resolution were reduced. The time-series duration was decreased from its original length of 3 sec down to 2, 1, and 0.5 sec. The spatial resolution of the original VT episodes was decreased from 64x64 pixels to 32x32, 16x16, and 8x8 pixels by uniformly removing pixels from the optical mapping video. Our results demonstrate that Kt, MSF, and MSE were able to accurately identify the pivot point of the rotor under all three clinical limitations. The MSE approach demonstrated the best overall performance, while Kt was the best at identifying the pivot point of the meandering rotor. Artifacts mildly affected the performance of the Kt, MSF, and MSE techniques but had a strong negative impact on the performance of SE. The results of our study motivate further validation of the SE, Kt, MSF, and MSE techniques using intra-atrial electrograms from paroxysmal and persistent AF patients, to see whether these approaches can identify pivot points in a clinical setting. More accurate rotor localization could significantly increase the efficacy of catheter ablation to treat AF, resulting in a higher success rate for single procedures.
Keywords: atrial fibrillation, optical mapping, signal processing, rotors
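Of the four approaches, multi-scale entropy is the most involved; a simplified sketch of the coarse-graining step plus a histogram-based Shannon entropy at each scale is shown below. Note that published MSE formulations typically use sample entropy rather than a binned Shannon entropy, so this is only illustrative of the multi-scale idea, not the authors' algorithm:

```python
import math
from collections import Counter

def shannon_entropy(signal, bins=8):
    """Histogram-based Shannon entropy (bits) of a 1-D signal."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0  # guard against a constant signal
    idx = [min(int((v - lo) / width), bins - 1) for v in signal]
    n = len(signal)
    return -sum((c / n) * math.log2(c / n) for c in Counter(idx).values())

def coarse_grain(signal, scale):
    """Non-overlapping window averages: the MSE coarse-graining step."""
    return [sum(signal[i:i + scale]) / scale
            for i in range(0, len(signal) - scale + 1, scale)]

def multiscale_entropy(signal, max_scale=4, bins=8):
    """Entropy of the signal at successively coarser temporal scales."""
    return [shannon_entropy(coarse_grain(signal, s), bins)
            for s in range(1, max_scale + 1)]
```

Applied pixel-wise to an optical mapping movie, the resulting entropy map would be highest where the signal is most complex, which is the property exploited at the rotor pivot point.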
Procedia PDF Downloads 324
139 Metric Suite for Schema Evolution of a Relational Database
Authors: S. Ravichandra, D. V. L. N. Somayajulu
Abstract:
The requirement of stakeholders for adding more details to the database is the main cause of schema evolution in a relational database, and this schema evolution in turn causes instability in the database. Hence, we aim to define a metric suite for the schema evolution of a relational database. The metric suite calculates metrics based on the features of the database, analyses the queries on the database, and measures the coupling, cohesion, and component dependencies of the schema for the existing and evolved versions of the database. The metric suite also provides an indicator of problems related to the stability and usability of the evolved database. The degree of change in the schema of a database is presented in the form of graphs that act as indicators and also show the relations between the various parameters (metrics) of the database architecture. The acquired information is used to defend and improve the stability of the database architecture. The challenges that arise in incorporating these metrics, with their varying parameters, into a suitable metric suite are discussed. To validate the proposed metric suite, experiments were performed on publicly available datasets.
Keywords: cohesion, coupling, entropy, metric suite, schema evolution
Procedia PDF Downloads 451
138 On Generalized Cumulative Past Inaccuracy Measure for Marginal and Conditional Lifetimes
Authors: Amit Ghosh, Chanchal Kundu
Abstract:
Recently, the notion of the cumulative past inaccuracy (CPI) measure has been proposed in the literature as a generalization of cumulative past entropy (CPE) in both the univariate and bivariate setups. In this paper, we introduce the notion of CPI of order α (alpha) and study the proposed measure for conditionally specified models of two components that failed at different time instants, called the generalized conditional CPI (GCCPI). We provide some bounds using the usual stochastic order and investigate several properties of the GCCPI. The effect of monotone transformations on the proposed measure is also examined. Furthermore, we characterize some bivariate distributions under the assumption of the conditional proportional reversed hazard rate model. Moreover, the role of the GCCPI in reliability modeling is investigated for a real-life problem.
Keywords: cumulative past inaccuracy, marginal and conditional past lifetimes, conditional proportional reversed hazard rate model, usual stochastic order
Procedia PDF Downloads 253
137 Eco-Index for Assessing Ecological Disturbances at Downstream of a Hydropower Project
Authors: Chandra Upadhyaya, Arup Kumar Sarma
Abstract:
In the north-eastern part of India, several hydropower projects are being proposed, and execution has already been initiated for some of them. There are controversies surrounding these constructions. The impact of these dams on the downstream reaches of the rivers needs to be assessed so that the ecosystem and the people living downstream are protected, by redesigning the projects if it becomes necessary. This may reduce the stresses on the affected ecosystem and on the people living downstream. Many index-based ecological assessment methods exist at present. However, none of these methods is capable of assessing the effect of dam-induced diurnal variation of flow in the downstream reaches. We need an environmental flow methodology, based on a hydrological index, which can address the effect of dam-induced diurnal flow variation, play an important role in riverine ecosystem management, and provide a qualitative idea of changes in the habitat of aquatic and riparian species.
Keywords: ecosystem, environmental flow assessment, entropy, IHA, TNC
Procedia PDF Downloads 384
136 Controversies and Contradiction in (Ir)reversibility and the Equilibrium of Reactive Systems
Authors: Joao Teotonio Manzi
Abstract:
Reversibility, irreversibility, equilibrium, and steady state, which play a central role in the thermodynamic analysis of processes arising in the context of reactive systems, are discussed in this article. These concepts have generated substantial doubts, even among the most experienced researchers and engineers, because conclusive or definitive statements cannot be extracted from the literature. Concepts such as the time-reversibility of irreversible processes seem paradoxical and require further analysis. Equilibrium and reversibility, which appear to be of the same nature, are also re-examined in the light of maximum entropy. The goal of this paper is to revisit and explore these concepts on the basis of classical thermodynamics, in order to understand them better, given their impact on technological advances, and, as a result, to generate an optimal procedure for design, monitoring, and engineering optimization. Furthermore, an effective graphical procedure for dimensioning a plug flow reactor is provided. Thus, to meet the needs of chemical engineering through a simple conceptual analysis with significant practical effects, a macroscopic approach is taken so as to integrate the different parts of this paper.
Keywords: reversibility, equilibrium, steady-state, thermodynamics, reactive system
Procedia PDF Downloads 106
135 A Ratio-Weighted Decision Tree Algorithm for Imbalance Dataset Classification
Authors: Doyin Afolabi, Phillip Adewole, Oladipupo Sennaike
Abstract:
Most well-known classifiers, including the decision tree algorithm, can make predictions on balanced datasets efficiently. However, the decision tree algorithm tends to be biased on imbalanced datasets because of the skewness of the distribution of such datasets. To overcome this problem, this study proposes a weighted decision tree algorithm that aims to remove the bias toward the majority class and to prevent the reduction of majority observations in imbalanced dataset classification. The proposed weighted decision tree algorithm was tested on three imbalanced datasets: a cancer dataset, a German credit dataset, and a banknote dataset. The specificity, sensitivity, and accuracy metrics were used to evaluate the performance of the proposed decision tree algorithm on these datasets. The evaluation results show that, for some of the weights of the proposed decision tree, the specificity, sensitivity, and accuracy metrics gave better results than those of the ID3 decision tree and of a decision tree induced with minority entropy, for all three datasets.
Keywords: data mining, decision tree, classification, imbalance dataset
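The abstract does not give the exact ratio-weighting formula, but the general idea of a weighted split criterion can be illustrated as follows: a class-weighted entropy makes impurity involving the minority class count more when choosing a split (the weight values here are assumptions, not the authors'):

```python
import math

def weighted_entropy(labels, class_weights):
    """Entropy with per-class weights so that impurity involving the
    minority class is penalized more heavily than under plain entropy."""
    n = len(labels)
    if n == 0:
        return 0.0
    ent = 0.0
    for c, w in class_weights.items():
        p = labels.count(c) / n
        if p > 0:
            ent -= w * p * math.log2(p)
    return ent

def weighted_info_gain(parent, left, right, class_weights):
    """Information gain of a binary split under the weighted entropy."""
    n = len(parent)
    return (weighted_entropy(parent, class_weights)
            - len(left) / n * weighted_entropy(left, class_weights)
            - len(right) / n * weighted_entropy(right, class_weights))
```

With equal weights this reduces to the ordinary ID3 criterion; raising the minority-class weight steers the tree toward splits that isolate minority observations.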
Procedia PDF Downloads 136
134 Contextual Factors of Innovation for Improving Commercial Banks' Performance in Nigeria
Authors: Tomola Obamuyi
Abstract:
The banking system in Nigeria adopted innovative banking with the aim of enhancing financial inclusion, making financial services readily and cheaply available to the majority of the people, and contributing to the efficiency of the financial system. The innovative services include Automatic Teller Machines (ATMs), the National Electronic Fund Transfer (NEFT), Point of Sale (PoS) terminals, internet (web) banking, Mobile Money payment (MMO), Real-Time Gross Settlement (RTGS), and agent banking, among others. The introduction of these payment systems is expected to increase bank efficiency and customers' satisfaction, culminating in better performance for the commercial banks. However, opinions differ on the possible effects of the various innovative payment systems on the performance of commercial banks in the country. Thus, this study empirically determines how commercial banks use innovation to gain competitive advantage in the specific context of Nigeria's finance and business. The study also analyses the effects of financial innovation on the performance of commercial banks when different periods of analysis are considered. The study employed secondary data from 2009 to 2018, a period that witnessed aggressive innovation in the financial sector of the country. The Vector Autoregression (VAR) estimation technique was used to forecast the relative variance contribution of each random innovation to the variables in the VAR, to examine the effect of a standard-deviation shock to one of the innovations on the current and future values of the impulse response, and to determine the causal relationships between the variables (VAR Granger causality test). The study also employed Multi-Criteria Decision Making (MCDM) to rank the innovations and the performance criteria of Return on Assets (ROA) and Return on Equity (ROE). The entropy method of MCDM was used to determine which of the performance criteria better reflect the contributions of the various innovations in the banking sector.
On the other hand, the Range of Values (ROV) method was used to rank the contributions of the seven innovations to performance. The analysis was done for the medium term (five years) and the long run (ten years) of innovations in the sector. The impulse response function derived from the VAR system indicated that the response of ROA to the values of cheque transactions, NEFT transactions, and POS transactions was positive and significant in the periods of analysis. The paper also confirmed, with the entropy and range-of-values methods, that in the long run both cheques and MMO performed best, while NEFT was next in performance. The paper concludes that commercial banks can enhance their performance by continuously improving the services provided through cheques, the National Electronic Fund Transfer, and Point of Sale, since these instruments have long-run effects on their performance. This will increase the confidence of the populace and encourage more usage and patronage of these services. The banking sector will in turn experience better performance, which will improve the economy of the country.
Keywords: bank performance, financial innovation, multi-criteria decision making, vector autoregression
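The entropy method of MCDM used above assigns a higher weight to criteria whose values are more dispersed across alternatives, since a uniform column carries no discriminating information; a minimal sketch with an illustrative decision matrix (not the study's data):

```python
import math

def entropy_weights(matrix):
    """Entropy weighting for an m-alternative x n-criterion decision matrix
    with positive entries: each column is normalized to a probability
    vector, its entropy is computed, and the criterion weight is the
    normalized degree of divergence (1 - entropy)."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)  # normalizes entropy to [0, 1]
    degrees = []
    for j in range(n):
        col = [row[j] for row in matrix]
        s = sum(col)
        p = [v / s for v in col]
        e = -k * sum(v * math.log(v) for v in p if v > 0)
        degrees.append(1.0 - e)  # degree of divergence of criterion j
    total = sum(degrees)         # assumes at least one non-uniform column
    return [d / total for d in degrees]
```

In the study's setting, the alternatives would be the innovations and the criteria ROA and ROE; the criterion receiving the larger weight is the one that "better reflects" the innovations' contributions.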
Procedia PDF Downloads 120
133 A Network-Theoretical Perspective on Music Analysis
Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria
Abstract:
The present paper describes a framework for constructing mathematical networks that encode relevant musical information from a music score for structural analysis. These graphs encompass statistical information about musical elements such as notes, chords, rhythms, and intervals, and the relations among them, and so become helpful for visualizing and understanding important stylistic features of a music fragment. In order to build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: predominant elements, the connectivity between them, and the complexity of the information contained in it. Music pieces in different styles are analyzed, and the results are contrasted with the outcome of traditional analysis in order to show the consistency and potential utility of this method for music analysis.
Keywords: computational musicology, mathematical music modelling, music analysis, style classification
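A minimal version of such a network, with pitch-transition edges and an entropy measure over them, might look like the following (illustrative only; the authors' framework also covers chords, rhythms, and intervals, and would use richer graph analytics):

```python
import math
from collections import defaultdict

def note_network(notes):
    """Directed pitch-transition network: edge (a, b) counts how often
    pitch b immediately follows pitch a in the parsed score."""
    edges = defaultdict(int)
    for a, b in zip(notes, notes[1:]):
        edges[(a, b)] += 1
    return edges

def transition_entropy(edges):
    """Shannon entropy (bits) of the edge-weight distribution, a rough
    proxy for the complexity of the information in the fragment."""
    total = sum(edges.values())
    return -sum((c / total) * math.log2(c / total) for c in edges.values())
```

Centrality and community detection would then run on the same edge structure, e.g. via a graph library, to expose predominant elements and their connectivity.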
Procedia PDF Downloads 102
132 Implementation and Comparative Analysis of PET and CT Image Fusion Algorithms
Authors: S. Guruprasad, M. Z. Kurian, H. N. Suma
Abstract:
Medical imaging modalities are becoming life-saving components. These modalities are essential to doctors for proper diagnosis, treatment planning, and follow-up. Some modalities provide anatomical information, such as Computed Tomography (CT), Magnetic Resonance Imaging (MRI), and X-rays, while some provide only functional information, such as Positron Emission Tomography (PET). Therefore, a single-modality image does not give complete information. This paper presents the fusion of the structural information in CT with the functional information present in the PET image. The fused image is essential for detecting the stages and locations of abnormalities and is particularly needed in oncology for improved diagnosis and treatment. We have implemented and compared image fusion techniques such as pyramid, wavelet, and principal-component fusion methods, along with a hybrid method combining the DWT and PCA. The performance of the algorithms is evaluated quantitatively and qualitatively. The system was implemented and tested using MATLAB. Based on the MSE, PSNR, and entropy analysis, the PCA and DWT-PCA methods showed the best results over all experiments.
Keywords: image fusion, pyramid, wavelets, principal component analysis
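The MSE, PSNR, and entropy figures of merit used in the comparison can be computed on flattened pixel lists as follows (a minimal sketch, not the authors' MATLAB code):

```python
import math
from collections import Counter

def mse(a, b):
    """Mean squared error between two equal-length pixel lists."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB (infinite for identical images)."""
    m = mse(a, b)
    return float('inf') if m == 0 else 10.0 * math.log10(peak * peak / m)

def entropy(pixels):
    """Shannon entropy (bits) of the grey-level histogram; for a fused
    image, higher entropy suggests more retained information."""
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(pixels).values())
```

MSE and PSNR compare a fused image against a reference, while entropy is a no-reference measure of the information content of the fused result.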
Procedia PDF Downloads 283
131 Synthesis and Characterization of Thiourea-Formaldehyde Coated Fe3O4 (TUF@Fe3O4) and Its Application for Adsorption of Methylene Blue
Authors: Saad M. Alshehri, Tansir Ahamad
Abstract:
A thiourea-formaldehyde pre-polymer (TUF) was prepared by the reaction of thiourea and formaldehyde in basic medium and used as a coating material for magnetite Fe3O4. The synthesized polymer-coated microspheres (TUF@Fe3O4) were characterized using FTIR, TGA, SEM, and TEM. The BET surface area was up to 1680 m² g⁻¹. The adsorption capacity of the product was evaluated through its adsorption of methylene blue (MB) in water at different pH values and temperatures. We found that the adsorption process was well described by both the Langmuir and Freundlich isotherm models. The kinetics of MB adsorption onto TUF@Fe3O4 are described in order to provide a clearer interpretation of the adsorption rate and uptake mechanism. The overall kinetic data were acceptably explained by a pseudo-second-order rate model. The evaluated ΔG° and ΔH° indicate the spontaneous and exothermic nature of the reaction, and the adsorption takes place with a decrease in entropy (ΔS° is negative). The monolayer capacity for MB was up to 450 mg g⁻¹, one of the highest among similar polymeric products, owing to the large BET surface area.
Keywords: TGA, FTIR, magnetite, thiourea formaldehyde resin, methylene blue, adsorption
Procedia PDF Downloads 350
130 Thermodynamics of Stable Micro Black Holes Production by Modeling from the LHC
Authors: Aref Yazdani, Ali Tofighi
Abstract:
We study a simulative model for the production of stable micro black holes based on an investigation of the thermodynamics of the LHC experiment. We show how this production can be achieved through a thermodynamic process of stabilization, using only a very small amount of powerful fuel. By applying the second law of black hole thermodynamics at the quantum-gravity scale and a perturbation expansion of the given entropy function, a time-dependent potential function is obtained, which is illustrated with exact numerical values in higher dimensions. Establishing the conditions for the stability of micro black holes is another purpose of this study. Stability is achieved through an injection method: putting an exact amount of energy into the final phase of production, equivalent to injecting the same energy into the center of collision at the LHC in order to stabilize the produced particles. Energy injection into the center of collision at the LHC is a new pattern that is worth attempting for the first time.
Keywords: micro black holes, LHC experiment, black holes thermodynamics, extra dimensions model
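For reference, the second law of black hole thermodynamics invoked above is conventionally stated in terms of the Bekenstein-Hawking entropy (standard textbook forms only; the paper's specific entropy function and its perturbation expansion are not reproduced here):

```latex
S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar}, \qquad \delta S_{\mathrm{BH}} \geq 0,
```

where A is the horizon area; in a D-dimensional spacetime the horizon "area", and hence the entropy, scales with the horizon radius as S ∝ r_h^{D-2}, which is why the numerical behavior changes in higher dimensions.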
Procedia PDF Downloads 144
129 Towards Establishing a Universal Theory of Project Management
Authors: Divine Kwaku Ahadzie
Abstract:
Project management (PM) as a concept has evolved from the early 20th century into a recognized academic and professional discipline, and indications are that it is here to stay in the 21st century as a worldwide paradigm for managing successful construction projects. However, notwithstanding the strong inroads that PM has made in legitimizing its academic and professional status in construction management practice, its underlying philosophies are still based on cases and conventional practices. An important theoretical issue yet to be addressed is the lack of a universal theory that offers philosophical legitimacy for the PM concept as a uniquely specialized management concept. Here, it is hypothesized that the law of entropy, the theory of uncertainties, and the theory of risk management offer plausible explanations for addressing the lacuna of what constitutes PM theory. The theoretical bases of these plausible underlying theories are argued, and attempts are made to establish the functional relationships that exist between these theories and the PM concept. The paper then draws on data related to the success and/or failure of a number of construction projects to validate the theory.
Keywords: concepts, construction, project management, universal theory
Procedia PDF Downloads 328
128 Research and Application of Multi-Scale Three Dimensional Plant Modeling
Authors: Weiliang Wen, Xinyu Guo, Ying Zhang, Jianjun Du, Boxiang Xiao
Abstract:
Reconstructing and analyzing three-dimensional (3D) models from in situ measured data is important for a number of research areas and applications in plant science, including plant phenotyping, functional-structural plant modeling (FSPM), plant germplasm resource protection, and the popularization of agricultural technology. Plant modeling spans many scales, from the microscopic to the macroscopic: cell, tissue, organ, plant, and canopy. The techniques currently used for data capture, feature analysis, and 3D reconstruction differ considerably across these scales. In this context, morphological data acquisition and 3D analysis and modeling of plants at different scales are introduced systematically. The data capture equipment commonly used at these scales is introduced, and the hot issues and difficulties of each scale are then described. Some examples are also given, such as micron-scale phenotyping quantification and 3D microstructure reconstruction of vascular bundles within maize stalks based on micro-CT scanning; 3D reconstruction of leaf surfaces and feature extraction from point clouds acquired with a 3D handheld scanner; and plant modeling by combining parameter-driven 3D organ templates. Several application examples using the resulting 3D models and analysis results are also introduced. A 3D maize canopy was constructed and the light distribution within it simulated, which was used for the design of an ideal plant type. A grape tree model was constructed from 3D digitizing and point cloud data and used to produce science content for the 11th International Conference on Grapevine Breeding and Genetics. Using tissue models of plants, Google Glass was used to look around visually inside the plant to understand its internal structure.
With the development of information technology, 3D data acquisition and data processing techniques will play an ever greater role in plant science.
Keywords: plant, three dimensional modeling, multi-scale, plant phenotyping, three dimensional data acquisition
Procedia PDF Downloads 277
127 Maxwell’s Economic Demon Hypothesis and the Impossibility of Economic Convergence of Developing Economies
Authors: Firano Zakaria, Filali Adib Fatine
Abstract:
The issue of convergence in theoretical models (classical or Keynesian) has been widely discussed. This body of work affirms that most countries seek to get as close as possible to a steady state in order to catch up with developed countries. In this paper, we retest this question, asking whether convergence is absolute or conditional. The results affirm that the degree of convergence of countries like Morocco is very low and that income is still far from its equilibrium state. Moreover, the analysis of financial convergence for the countries in our panel shows that the pace in this sector is more intense: countries converge more rapidly in financial terms. The question arises as to why, with a fairly convergent financial system, growth does not respond, even though the financial system should facilitate economic convergence. Our results confirm that the degree of information exchange between the financial system and the economic system did not change significantly between 1985 and 2017. This leads to the hypothesis that the financial system fails to serve its role as a creator of information in developing countries despite all the reforms undertaken, so that an economic analogue of Maxwell's demon prevails.
Keywords: economic convergence, financial convergence, financial system, entropy
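The "information exchange" between two systems can be illustrated with an entropy-based dependence measure; the sketch below uses discretized mutual information on synthetic series (the data, bin count, and coupling strength are illustrative assumptions, not the paper's methodology):

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Mutual information (in bits) between two series after equal-width binning."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

# Illustrative series: a "financial" index, one strongly coupled "growth" series
# and one independent series.
rng = np.random.default_rng(1)
finance = rng.normal(size=500)
growth_coupled = finance + 0.1 * rng.normal(size=500)
growth_indep = rng.normal(size=500)
print(mutual_information(finance, growth_coupled))  # high: information flows
print(mutual_information(finance, growth_indep))    # near zero: no exchange
```

A financial system that "creates information" for the real economy would show high and growing mutual information with growth series; a flat value over time is the pattern the abstract reports.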
Procedia PDF Downloads 91
126 Improved Rare Species Identification Using Focal Loss Based Deep Learning Models
Authors: Chad Goldsworthy, B. Rajeswari Matam
Abstract:
The use of deep learning for species identification in camera trap images has revolutionised our ability to study, conserve, and monitor species in a highly efficient and unobtrusive manner, with state-of-the-art models achieving accuracies surpassing manual human classification. The high imbalance of camera trap datasets, however, results in poor accuracy for minority (rare or endangered) species because of their small contribution to the overall model accuracy. This paper investigates the use of focal loss, in comparison to the traditional cross-entropy loss function, to improve the identification of minority species in the “255 Bird Species” dataset from Kaggle. The results show that, although focal loss slightly decreased the accuracy on the majority species, it increased the F1-score by 0.06 and improved the identification of the bottom two, five, and ten (minority) species by 37.5%, 15.7%, and 10.8%, respectively, while also improving overall accuracy by 2.96%.
Keywords: convolutional neural networks, data imbalance, deep learning, focal loss, species classification, wildlife conservation
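The loss function at the heart of this comparison can be sketched as follows; γ = 2 is the default from Lin et al. (2017), and the scalar form below (per-example probability of the true class) is a simplification of the batched implementation a real training pipeline would use:

```python
import numpy as np

def cross_entropy(p_t):
    """Standard cross-entropy given the model's probability for the true class."""
    return -np.log(p_t)

def focal_loss(p_t, gamma=2.0, alpha=1.0):
    """Focal loss FL(p_t) = -alpha * (1 - p_t)**gamma * log(p_t).
    The (1 - p_t)**gamma factor shrinks the loss of easy, well-classified
    examples so that rare (hard) classes dominate the gradient."""
    return -alpha * (1.0 - p_t) ** gamma * np.log(p_t)

easy, hard = 0.95, 0.10   # a confident majority-class hit vs a minority-class miss
print(cross_entropy(easy), focal_loss(easy))  # easy example: loss shrunk ~400x
print(cross_entropy(hard), focal_loss(hard))  # hard example: loss nearly unchanged
```

With γ = 0 the focal loss reduces to (α-weighted) cross-entropy; as γ grows, confidently classified majority-class examples contribute less and less to the total loss.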
Procedia PDF Downloads 191
125 Removal of Lead from Aqueous Solutions by Biosorption on Pomegranate Skin: Kinetics, Equilibrium and Thermodynamics
Authors: Y. Laidani, G. Henini, S. Hanini, A. Labbaci, F. Souahi
Abstract:
In this study, pomegranate skin, a material readily available under Algerian conditions, was chosen as the adsorbent for the removal of lead from aqueous solution. Biosorption studies were carried out under various parameters, such as adsorbent mass, pH, contact time, initial metal concentration, and temperature. The experimental results show that the percentage of biosorption increases with an increase in biosorbent mass (0.25 g, 0.035 mg/g; 1.25 g, 0.096 mg/g). The maximum biosorption occurred at a pH value of 8 for lead. The equilibrium uptake increased with the initial metal concentration in solution (Co = 4 mg/L, qt = 1.2 mg/g). Biosorption kinetic data were properly fitted by the pseudo-second-order kinetic model. The best fit was obtained with the Langmuir model, with high correlation coefficients (R² > 0.995) and a maximum monolayer adsorption capacity of 0.85 mg/g for lead. The adsorption of lead was exothermic in nature (ΔH° = -17.833 kJ/mol for Pb(II)), and the reaction was accompanied by a decrease in entropy (ΔS° = -0.056 kJ/K·mol). The Gibbs energy (ΔG°) increased from -1.458 to -0.305 kJ/mol for Pb(II) when the temperature was increased from 293 to 313 K.
Keywords: biosorption, Pb (+II), pomegranate skin, wastewater
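The reported thermodynamic quantities are tied together by ΔG° = ΔH° − TΔS°; the sketch below recomputes ΔG° from the reported ΔH° and ΔS° (the small gap to the reported −1.458 kJ/mol at 293 K is presumably rounding in the published ΔH° and ΔS° values):

```python
# Gibbs energy from the reported enthalpy and entropy of Pb(II) biosorption:
# dG = dH - T*dS, with dH = -17.833 kJ/mol and dS = -0.056 kJ/(K*mol).
dH = -17.833   # kJ/mol
dS = -0.056    # kJ/(K*mol)

for T in (293, 303, 313):
    dG = dH - T * dS
    print(f"T = {T} K: dG = {dG:.3f} kJ/mol")
# dG rises from about -1.4 to -0.3 kJ/mol over 293-313 K: still negative
# (spontaneous), but spontaneity decreases as temperature rises, as expected
# for an exothermic process with negative entropy change.
```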
Procedia PDF Downloads 270
124 Adsorption Isotherm, Kinetic and Mechanism Studies of Some Substituted Phenols from Aqueous Solution by Jujuba Seeds Activated Carbon
Authors: O. Benturki, A. Benturki
Abstract:
Activated carbon was prepared from jujube seeds by chemical activation with potassium hydroxide (KOH), followed by pyrolysis at 800°C. Batch experiments were conducted for kinetic, thermodynamic, and equilibrium studies of the adsorption of phenol (P) and 2,4-dichlorophenol (2,4-DCP) from aqueous solution; the adsorption capacities followed the order 2,4-dichlorophenol > phenol. The operating variables studied were initial phenol concentration, contact time, temperature, and solution pH. Results show that a pH value of 7 is favorable for the adsorption of phenols. The sorption data were analyzed using the Langmuir and Freundlich isotherms, and the isotherm data followed the Langmuir model. The adsorption processes conformed to pseudo-second-order rate kinetics. Thermodynamic parameters such as the enthalpy, entropy, and Gibbs free energy changes were also calculated, and it was found that the sorption of phenols by jujube seed activated carbon is a spontaneous process. The maximum adsorption capacities for phenol and 2,4-dichlorophenol were 142.85 mg g⁻¹ and 250 mg g⁻¹, respectively.
Keywords: activated carbon, adsorption, isotherms, Jujuba seeds, phenols, langmuir
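The Langmuir fit described above is commonly performed on the linearized form Ce/qe = 1/(qm·KL) + Ce/qm. A minimal sketch with assumed parameters (qm matches the reported 250 mg/g for 2,4-DCP; KL and the concentration grid are illustrative):

```python
import numpy as np

# Langmuir isotherm: qe = qm*KL*Ce / (1 + KL*Ce).
# Linearized: Ce/qe = 1/(qm*KL) + Ce/qm, so a plot of Ce/qe vs Ce
# gives slope 1/qm and intercept 1/(qm*KL).
def fit_langmuir(Ce, qe):
    slope, intercept = np.polyfit(Ce, Ce / qe, 1)
    qm = 1.0 / slope
    KL = slope / intercept
    return qm, KL

# Synthetic equilibrium data with assumed qm = 250 mg/g and KL = 0.05 L/mg.
qm_true, KL_true = 250.0, 0.05
Ce = np.linspace(5, 200, 20)
qe = qm_true * KL_true * Ce / (1.0 + KL_true * Ce)
qm_fit, KL_fit = fit_langmuir(Ce, qe)

# Separation factor RL = 1/(1 + KL*C0); 0 < RL < 1 indicates favorable adsorption.
C0 = 100.0
RL = 1.0 / (1.0 + KL_fit * C0)
print(qm_fit, KL_fit, RL)
```

The separation factor RL computed at the end is the usual quick check that the fitted isotherm corresponds to favorable adsorption.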
Procedia PDF Downloads 313
123 Dye Removal from Aqueous Solution by Regenerated Spent Bleaching Earth
Authors: Ahmed I. Shehab, Sabah M. Abdel Basir, M. A. Abdel Khalek, M. H. Soliman, G. Elgemeie
Abstract:
Spent bleaching earth (SBE) recycling and its utilization as an adsorbent to eliminate dyes from aqueous solution were studied. Organic solvents and subsequent thermal treatment were used to recover and reactivate the SBE. The effects of pH, temperature, initial dye concentration, and contact time on dye removal using recycled spent bleaching earth (RSBE) were investigated. Recycled SBE showed a better removal affinity for cationic than for anionic dyes. The maximum removal was achieved at pH 2 for anionic and pH 8 for cationic dyes. Kinetic data matched the pseudo-second-order model. For the anionic dye, the equilibrium data were described by both the Langmuir and Freundlich isotherms, while the Freundlich model represented the sorption process for the cationic dye. The changes in Gibbs free energy (ΔG°), enthalpy (ΔH°), and entropy (ΔS°) were computed and compared through a thermodynamic study of both dyes.
Keywords: spent bleaching earth, reactivation, regeneration, thermal treatment, dye removal, thermodynamic
Procedia PDF Downloads 183
122 Coupled Space and Time Homogenization of Viscoelastic-Viscoplastic Composites
Authors: Sarra Haouala, Issam Doghri
Abstract:
In this work, a multiscale computational strategy is proposed for the analysis of structures described at a refined level both in space and in time. The proposal is applied to two-phase viscoelastic-viscoplastic (VE-VP) reinforced thermoplastics subjected to large numbers of cycles. The main aim is to predict the effective long-time response while reducing the computational cost considerably. The proposed computational framework combines mean-field space homogenization, based on the generalized incrementally affine formulation for VE-VP composites, with the asymptotic time homogenization approach for coupled isotropic VE-VP homogeneous solids under large numbers of cycles. The time homogenization method is based on the definition of micro- and macro-chronological time scales and on asymptotic expansions of the unknown variables. First, the original anisotropic VE-VP initial-boundary value problem of the composite material is decomposed into coupled micro-chronological (fast time scale) and macro-chronological (slow time scale) problems. The former is purely VE and is solved once for each macro time step, whereas the latter is nonlinear and is solved iteratively using fully implicit time integration. Second, mean-field space homogenization is used for both the micro- and macro-chronological problems to determine the corresponding effective behavior of the composite material. The response of the matrix material is VE-VP with J2 flow theory, assuming small strains. The formulation exploits the return-mapping algorithm for the J2 model, with its two steps: viscoelastic predictor and plastic correction.
The proposal is implemented for an extended Mori-Tanaka scheme and verified against finite element simulations of representative volume elements for a number of polymer composite materials subjected to large numbers of cycles.
Keywords: asymptotic expansions, cyclic loadings, inclusion-reinforced thermoplastics, mean-field homogenization, time homogenization
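The micro/macro-chronological decomposition described above is conventionally written with a fast time τ and a small scale ratio ζ (a generic sketch of asymptotic time homogenization, not the paper's exact notation): the fast time resolves one loading cycle, the slow time the envelope over many cycles, and the unknown fields are expanded in powers of ζ,

```latex
\tau = \frac{t}{\zeta}, \qquad
u^{\zeta}(t) = u_0(t,\tau) + \zeta\, u_1(t,\tau) + \zeta^{2} u_2(t,\tau) + \cdots,
\qquad
\frac{d}{dt} = \frac{\partial}{\partial t} + \frac{1}{\zeta}\frac{\partial}{\partial \tau}.
```

Collecting terms order by order in ζ is what separates the fast (micro-chronological) cell problem from the slow (macro-chronological) evolution problem.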
Procedia PDF Downloads 368
121 Adsorption of Cd2+ from Aqueous Solutions Using Chitosan Obtained from a Mixture of Littorina littorea and Achatinoidea Shells
Authors: E. D. Paul, O. F. Paul, J. E. Toryila, A. J. Salifu, C. E. Gimba
Abstract:
The adsorption of Cd²⁺ ions from aqueous solution by chitosan, a natural polymer obtained from a mixture of the exoskeletons of Littorina littorea (periwinkle) and Achatinoidea (snail), was studied at varying adsorbent dose, contact time, metal ion concentration, temperature, and pH using the batch adsorption method. The equilibrium adsorption isotherms were determined between 298 K and 345 K. The adsorption data were fitted to the Langmuir and Freundlich isotherms and the pseudo-second-order kinetic model. The Langmuir isotherm model fitted the experimental data best, with a maximum monolayer adsorption of 35.1 mg kg⁻¹ at 308 K. The entropy and enthalpy of adsorption were -0.1121 kJ mol⁻¹ K⁻¹ and -11.43 kJ mol⁻¹, respectively. The Freundlich adsorption model gave Kf and n values consistent with good adsorption. The pseudo-second-order reaction model gave a straight-line plot with a rate constant of 1.291 × 10⁻³ kg mg⁻¹ min⁻¹ and a qe value of 21.98 mg kg⁻¹, indicating that the adsorption of cadmium ions by the chitosan composite followed pseudo-second-order kinetics.
Keywords: adsorption, chitosan, littorina littorea, achatinoidea, natural polymer
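Enthalpy and entropy values like those above are typically extracted from a van't Hoff plot, ln K = ΔS°/R − ΔH°/(RT). The sketch below regenerates synthetic equilibrium constants from the reported ΔH° and ΔS° and recovers them by linear regression (the temperature grid is illustrative):

```python
import numpy as np

R = 8.314e-3  # gas constant in kJ/(K*mol)

# Van't Hoff relation: ln K = dS/R - dH/(R*T).
# A plot of ln K against 1/T has slope -dH/R and intercept dS/R.
def vant_hoff_fit(T, lnK):
    slope, intercept = np.polyfit(1.0 / T, lnK, 1)
    dH = -slope * R
    dS = intercept * R
    return dH, dS

# Synthetic equilibrium constants from the reported dH and dS (assumed exact).
dH_true, dS_true = -11.43, -0.1121   # kJ/mol and kJ/(K*mol)
T = np.array([298.0, 308.0, 318.0, 328.0, 345.0])
lnK = dS_true / R - dH_true / (R * T)

dH_fit, dS_fit = vant_hoff_fit(T, lnK)
print(dH_fit, dS_fit)  # recovers the assumed dH and dS on noise-free data
```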
Procedia PDF Downloads 403
120 Cobalt Ions Adsorption by Quartz and Illite and Calcite from Waste Water
Authors: Saad A. Aljlil
Abstract:
The adsorption of cobalt ions from waste water onto quartz, illite, and calcite was investigated, including the effect of pH. The maximum cobalt uptake capacities of the three adsorbents increase with solution temperature and were 4.66 mg/g for quartz, 3.94 mg/g for illite, and 3.44 mg/g for calcite. The enthalpy, Gibbs free energy, and entropy of cobalt ion adsorption on the three adsorbents were calculated. The adsorption of cobalt ions was found to be endothermic; consequently, increasing the temperature increases the adsorption, and the process is therefore favored at higher temperatures. The equilibrium adsorption data were correlated using the Langmuir and Freundlich models, and the experimental data correlated well with the Freundlich model.
Keywords: adsorption, Langmuir, Freundlich, quartz, illite, calcite, waste water
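The Freundlich model that best described these data is usually fitted in its log-linear form ln qe = ln KF + (1/n)·ln Ce; the sketch below uses illustrative KF and n values, not the paper's measurements:

```python
import numpy as np

# Freundlich isotherm: qe = KF * Ce**(1/n).
# Linearized: ln(qe) = ln(KF) + (1/n)*ln(Ce), so a log-log plot
# gives slope 1/n and intercept ln(KF); 1 < n < 10 indicates good adsorption.
def fit_freundlich(Ce, qe):
    slope, intercept = np.polyfit(np.log(Ce), np.log(qe), 1)
    KF = np.exp(intercept)
    n = 1.0 / slope
    return KF, n

# Synthetic data with assumed KF = 0.9 and n = 2.5 (illustrative values only).
KF_true, n_true = 0.9, 2.5
Ce = np.linspace(1, 50, 15)
qe = KF_true * Ce ** (1.0 / n_true)

KF_fit, n_fit = fit_freundlich(Ce, qe)
print(KF_fit, n_fit)  # recovers the assumed parameters on noise-free data
```

Unlike the Langmuir model, the Freundlich isotherm has no saturation plateau, which is why it often suits heterogeneous mineral surfaces such as quartz, illite, and calcite.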
Procedia PDF Downloads 372