Search results for: entropy
134 Umbrella Reinforcement Learning – A Tool for Hard Problems
Authors: Egor E. Nuzhin, Nikolay V. Brilliantov
Abstract:
We propose an approach for addressing Reinforcement Learning (RL) problems. It combines the idea of umbrella sampling, borrowed from the Monte Carlo techniques of computational physics and chemistry, with optimal control methods, and is realized on the basis of neural networks. This results in a powerful algorithm designed to solve hard RL problems – problems with long-delayed rewards, sticking in state traps, and a lack of terminal states. It outperforms prominent algorithms such as PPO, RND, iLQR and VI, which are among the most efficient for hard problems. The new algorithm deals with a continuous ensemble of agents and an expected return that includes the ensemble entropy. This results in a quick and efficient search for the optimal policy in terms of the "exploration-exploitation trade-off" in the state-action space.
Keywords: umbrella sampling, reinforcement learning, policy gradient, dynamic programming
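For orientation, an "expected return that includes the ensemble entropy" plausibly takes the familiar entropy-regularized form below; this is a sketch of the general idea, not the paper's exact functional:

$$ J(\pi) = \mathbb{E}_{\tau\sim\pi}\Big[\sum_{t}\gamma^{t}\,r(s_t,a_t)\Big] + \lambda\,H(\rho_{\pi}), $$

where $\rho_{\pi}$ is the distribution of the agent ensemble over the state-action space, $H$ its Shannon entropy, and $\lambda$ trades exploitation of reward against exploration pressure.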
133 Suitability of Black Box Approaches for the Reliability Assessment of Component-Based Software
Authors: Anjushi Verma, Tirthankar Gayen
Abstract:
Although reliability is an important attribute of quality, especially for mission-critical systems, no versatile model exists even today for the reliability assessment of component-based software. The existing black box models make various assumptions which may not always be realistic and may be quite contrary to the actual behaviour of software. They focus on observing the manner in which the system behaves without considering the structure of the system, the components composing the system, their interconnections, dependencies, usage frequencies, etc. As a result, the entropy (uncertainty) in assessment using these models is quite high. Though there are some models based on the operation profile, it sometimes becomes extremely difficult to obtain the exact operation profile for a given operation. This paper discusses the drawbacks, deficiencies and limitations of black box approaches from the perspective of various authors and finally proposes a conceptual model for the reliability assessment of software.
Keywords: black box, faults, failure, software reliability
132 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications
Authors: Atish Bagchi, Siva Chandrasekaran
Abstract:
Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by the industries that lead to unplanned downtimes are: current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but occurring at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real-time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and the consequential financial losses. Method: The data stored within a process control system, e.g., Industrial IoT, Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian, to optimise storage and query performance. The sampling may inadvertently discard values that contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of GNNs, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal contexts, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (by 20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's process control system in real-time to perform forecasting and classification tasks, aiding asset management engineers in operating their machines more efficiently and reducing unplanned downtimes. A series of trials is planned for this model in the future in other manufacturing industries.
Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning
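As an illustration of the entropy and spectral-change features mentioned above, the sketch below computes a histogram entropy and a dominant frequency for one window of sensor data; the function, bin count and sampling setup are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def window_features(x, fs):
    """Histogram entropy (bits) and dominant frequency (Hz) for one window of
    sensor data -- simplified stand-ins for the entropy/spectral estimates
    described above; bin count and window length are assumptions."""
    counts, _ = np.histogram(x, bins=16)
    p = counts[counts > 0] / counts.sum()
    entropy = -(p * np.log2(p)).sum()
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    dom_freq = np.fft.rfftfreq(x.size, d=1.0 / fs)[spectrum.argmax()]
    return entropy, dom_freq

t = np.arange(0, 3600, 10.0)                 # one hour of flow data, 10 s sampling
x = np.sin(2 * np.pi * t / 600) + 0.1 * np.random.randn(t.size)
print(window_features(x, fs=0.1))            # dominant period is 600 s (~0.0017 Hz)
```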
131 Tuning Cubic Equations of State for Supercritical Water Applications
Authors: Shyh Ming Chern
Abstract:
Cubic equations of state (EoS), popular due to their simple mathematical form, ease of use, semi-theoretical nature and reasonable accuracy, are normally fitted to vapor-liquid equilibrium P-v-T data. As a result, they often show poor accuracy in the region near and above the critical point. In this study, the performance of the renowned Peng-Robinson (PR) and Patel-Teja (PT) EoS's around the critical region has been examined against the P-v-T data of water. Both display large deviations at the critical point. For instance, the PR EoS exhibits discrepancies as high as 47% for the specific volume, 28% for the enthalpy departure and 43% for the entropy departure at the critical point. It is shown that incorporating P-v-T data of the supercritical region into the retuning of a cubic EoS can improve its performance above the critical point dramatically. Adopting a retuned acentric factor of 0.5491, instead of its genuine value of 0.344, for water in the PR EoS, and a new F of 0.8854, instead of its original value of 0.6898, for water in the PT EoS, reduces the discrepancies to about one third or less.
Keywords: equation of state, EoS, supercritical water, SCW
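To make the retuning concrete: in the Peng-Robinson EoS the acentric factor ω enters only through the temperature-dependent attraction term, so swapping ω = 0.344 for the retuned 0.5491 is a one-parameter change. The sketch below uses the standard PR expressions; the near-critical molar volume chosen for the comparison is an assumption.

```python
import math

R = 8.314                    # J/(mol K)
TC, PC = 647.096, 22.064e6   # critical temperature (K) and pressure (Pa) of water

def pr_pressure(T, v, omega):
    """Peng-Robinson pressure (Pa) at temperature T (K) and molar volume v (m^3/mol)."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(T / TC))) ** 2
    a = 0.45724 * R ** 2 * TC ** 2 / PC
    b = 0.07780 * R * TC / PC
    return R * T / (v - b) - a * alpha / (v ** 2 + 2 * b * v - b ** 2)

# compare the genuine and retuned acentric factors just above the critical point
v_near_crit = 5.6e-5         # assumed near-critical molar volume, roughly 3*b
for omega in (0.344, 0.5491):
    print(omega, pr_pressure(650.0, v_near_crit, omega))
```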
130 Post-Quantum Resistant Edge Authentication in Large Scale Industrial Internet of Things Environments Using Aggregated Local Knowledge and Consistent Triangulation
Authors: C. P. Autry, A. W. Roscoe, Mykhailo Magal
Abstract:
We discuss the theoretical model underlying 2BPA (two-band peer authentication), a practical alternative to conventional authentication of entities and data in the IoT. In essence, this involves assembling a virtual map of authentication assets in the network, typically leading to many paths of confirmation between any pair of entities. This map is continuously updated, confirmed, and evaluated. The value of authentication along multiple disjoint paths becomes very clear, and we require analogues of triangulation to extend authentication along extended paths and deliver it along all possible paths. We discover that if an attacker wants to make an honest node falsely believe she has authenticated another, then the length of the authentication paths is of little importance. This is because optimal attack strategies correspond to minimal cuts in the authentication graph and do not contain multiple edges on the same path. The authentication provided by disjoint paths is normally additive (in entropy).
Keywords: authentication, edge computing, industrial IoT, post-quantum resistance
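The additivity claim can be read as follows (a standard argument sketched here, not a quotation from the paper): if defeating disjoint path $i$ costs the attacker on the order of $2^{H_i}$ work, and the paths share no edges, then defeating them all costs

$$ \prod_i 2^{H_i} = 2^{\sum_i H_i}, $$

so the entropies of disjoint confirmation paths add.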
129 Risk Assessment of Building Information Modelling Adoption in Construction Projects
Authors: Amirhossein Karamoozian, Desheng Wu, Behzad Abbasnejad
Abstract:
Building information modelling (BIM) is a new technology to enhance the efficiency of project management in the construction industry. In addition to the potential benefits of this useful technology, there are various risks and obstacles to applying it in construction projects. In this study, a decision-making approach is presented for risk assessment of BIM adoption in construction projects. Various risk factors of applying BIM during different phases of the project lifecycle are identified with the help of the Delphi method, experts' opinions and related literature. Afterward, Shannon's entropy and fuzzy TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) are applied to derive priorities of the identified risk factors. Results indicated that lack of knowledge among professional engineers about workflows in BIM, and conflict of opinions between different stakeholders, are the risk factors with the highest priority.
Keywords: risk, BIM, fuzzy TOPSIS, construction projects
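A minimal sketch of the Shannon entropy weighting step that conventionally precedes a fuzzy TOPSIS ranking; the decision matrix shown is hypothetical, not the study's Delphi data.

```python
import numpy as np

def entropy_weights(X):
    """Shannon-entropy objective weights for an m x n decision matrix X
    (m alternatives, n criteria, all entries positive)."""
    P = X / X.sum(axis=0)                    # normalise each criterion column
    k = 1.0 / np.log(X.shape[0])
    E = -k * (P * np.log(P)).sum(axis=0)     # entropy per criterion, in [0, 1]
    d = 1.0 - E                              # degree of diversification
    return d / d.sum()                       # criteria weights

# e.g. 4 risk factors scored against 3 criteria by an expert panel
scores = np.array([[7, 5, 8], [6, 9, 4], [8, 6, 7], [5, 7, 6]], dtype=float)
print(entropy_weights(scores))
```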
128 Use of Diatomite for the Elimination of Chromium Three from Wastewater Annaba, Algeria
Authors: Sabiha Chouchane, Toufik Chouchane, Azzedine Hani
Abstract:
The wastewater was treated with a natural adsorbent, diatomite, to eliminate chromium(III). The diatomite comes from Sig (west of Algeria). Physicochemical characterization revealed that it is mainly made up of silica and lime, with a lower proportion of alumina. The process, considered in a static regime at 20°C, with a stirring speed of 150 rpm, a pH of 4 and a grain diameter between 100 and 150 µm, shows that one gram of purified diatomite can fix, according to the Langmuir model, up to 39.64 mg/g of chromium, with pseudo-first-order kinetics. The pseudo-equilibrium time highlighted is 25 minutes. The value of the separation factor RL indicates a good affinity between the adsorbent and the adsorbate and that the solid used has a good adsorption capacity. The external transport of the metal ions from the solution to the adsorbent seems to be a step controlling the speed of the overall process. On the other hand, internal transport in the pores is not the only limiting mechanism of the sorption kinetics. Thermodynamic parameters show that chromium sorption is spontaneous and exothermic, with negative entropy.
Keywords: adsorption, diatomite, Cr(III), wastewater
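The separation factor referred to above is conventionally computed from the Langmuir constant $K_L$ and the initial concentration $C_0$:

$$ R_L = \frac{1}{1 + K_L C_0}, $$

with $0 < R_L < 1$ indicating favourable adsorption, $R_L > 1$ unfavourable, $R_L = 1$ linear, and $R_L = 0$ irreversible.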
127 Patient-Specific Modeling Algorithm for Medical Data Based on AUC
Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper
Abstract:
Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use the AUC as a metric to select an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
Keywords: instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions
126 Kinetics and Thermodynamics of Sorption of 5-Fluorouracil (5-FL) on Carbon Nanotubes
Authors: Muhammad Imran Din
Abstract:
The aim of this study was to understand the interaction between multi-walled carbon nanotubes (MCNTs) and anticancer agents, and to evaluate the drug-loading ability of MCNTs. Batch adsorption experiments were carried out for the adsorption of 5-fluorouracil (5-FL) using MCNTs. The effect of various operating variables, viz., adsorbent dosage, pH, contact time and temperature, on the adsorption of 5-FL has been studied. The Freundlich adsorption model successfully described the adsorption process. It was found that the pseudo-second-order mechanism is predominant and that the overall rate of the 5-FL adsorption process appears to be controlled by more than one step. Thermodynamic parameters such as the free energy change (ΔG°), enthalpy change (ΔH°) and entropy change (ΔS°) were calculated, revealing the spontaneous, endothermic and feasible nature of the adsorption process. The results showed that the carbon nanotubes were able to form supramolecular complexes with 5-FL by π-π stacking and possessed favorable loading properties as drug carriers.
Keywords: drug, adsorption, anticancer, 5-fluorouracil (5-FL)
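For reference, the pseudo-second-order model invoked here is usually fitted in its linearized form:

$$ \frac{t}{q_t} = \frac{1}{k_2 q_e^2} + \frac{t}{q_e}, $$

where $q_t$ and $q_e$ (mg/g) are the uptake at time $t$ and at equilibrium and $k_2$ is the rate constant; a linear plot of $t/q_t$ against $t$ supports the model.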
125 A Similarity Measure for Classification and Clustering in Image Based Medical and Text Based Banking Applications
Authors: K. P. Sandesh, M. H. Suman
Abstract:
Text processing plays an important role in information retrieval, data mining, and web search. Measuring the similarity between documents is an important operation in the text processing field. In this project, a new similarity measure is proposed. To compute the similarity between two documents with respect to a feature, the proposed measure takes the following three cases into account: (1) the feature appears in both documents; (2) the feature appears in only one document; and (3) the feature appears in neither document. The proposed measure is extended to gauge the similarity between two sets of documents. The effectiveness of our measure is evaluated on several real-world data sets for text classification and clustering problems, especially in the banking and health sectors. The results show that the performance obtained by the proposed measure is better than that achieved by the other measures.
Keywords: document classification, document clustering, entropy, accuracy, classifiers, clustering algorithms
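The abstract does not give the measure's formula, so the sketch below only illustrates the three-case structure it describes; the per-case contributions are hypothetical stand-ins, not the proposed measure itself.

```python
def pairwise_similarity(d1, d2, vocabulary):
    """Illustrative three-case similarity between two term-weight dicts.
    The per-case contributions below are hypothetical stand-ins for the
    measure proposed in the paper, which the abstract does not spell out."""
    score = 0.0
    for term in vocabulary:
        w1, w2 = d1.get(term, 0.0), d2.get(term, 0.0)
        if w1 > 0 and w2 > 0:       # case 1: feature present in both documents
            score += 1.0 - abs(w1 - w2) / max(w1, w2)
        elif w1 > 0 or w2 > 0:      # case 2: feature present in exactly one
            score += 0.2
        else:                       # case 3: feature absent from both
            score += 0.5
    return score / len(vocabulary)

docs = [{"loan": 2.0, "credit": 1.0}, {"loan": 1.5, "deposit": 1.0}]
print(pairwise_similarity(*docs, vocabulary=["loan", "credit", "deposit", "atm"]))
```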
124 Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC) within Operational Research (OR) with Sustainability and Phenomenology
Authors: Hussain Abdullah Al-Salamin, Elias Ogutu Azariah Tembe
Abstract:
Supply chain (SC) is an operational research (OR) approach and technique which acts as a catalyst within the central nervous system of business today. Without SC, any type of business is in the doldrums, hence entropy. SC is the lifeblood of business today because it is the pivotal hub which provides imperative competitive advantage. The paper presents a conceptual framework dubbed the Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC). The term homomorphic is derived from the abstract algebraic term homomorphism (same shape), which also embeds the following mathematical application sets: monomorphism, isomorphism, automorphism, and endomorphism. The HCEFSC is intertwined and integrated with wide and broad sets of elements.
Keywords: homomorphism, isomorphism, monomorphisms, automorphisms, epimorphisms, endomorphism, supply chain, operational research (OR)
123 Modeling Driving Distraction Considering Psychological-Physical Constraints
Authors: Yixin Zhu, Lishengsa Yue, Jian Sun, Lanyue Tang
Abstract:
Modeling driving distraction in microscopic traffic simulation is crucial for enhancing simulation accuracy. Current driving distraction models are mainly derived from physical motion constraints under distracted states, in which distraction-related error terms are added to existing microscopic driver models. However, the model accuracy is not very satisfying, due to a lack of modeling of the cognitive mechanism underlying the distraction. This study models driving distraction based on the Queueing Network-Model Human Processor (QN-MHP). It utilizes the queuing structure of the model to perform task invocation and switching for distracted operation and control of the vehicle under driver distraction. Based on the QN-MHP model's assumption about the cognitive sub-network, server F is a structural bottleneck: later information must wait for previous information to leave server F before it can be processed there. Therefore, the waiting time for task switching needs to be calculated. Since the QN-MHP model has different information processing paths for auditory and visual information, this study divides driving distraction into two types: auditory distraction and visual distraction. For visual distraction, both the visual distraction task and the driving task need to go through the visual perception sub-network, and their stimuli are asynchronous, which is called stimulus onset asynchrony (SOA); this must be considered when calculating the waiting time for switching tasks. In the case of auditory distraction, the auditory distraction task and the driving task do not need to compete for the server resources of the perceptual sub-network, and their stimuli can be treated as synchronized, without considering the time difference in receiving the stimuli. Following the Theory of Planned Behavior (TPB) for drivers, this study uses risk entropy as the decision criterion for driver task switching. A logistic regression model with risk entropy as the independent variable is used to determine whether the driver performs a distraction task, explaining the relationship between perceived risk and distraction. Furthermore, to model a driver's perception characteristics, a neurophysiological model of visual distraction tasks is incorporated into the QN-MHP, which executes the classical Intelligent Driver Model (IDM). The proposed driving distraction model integrates the psychological cognitive process of a driver with physical motion characteristics, resulting in both high accuracy and interpretability. This paper uses 773 segments of distracted car-following from the Shanghai Naturalistic Driving Study (SH-NDS) data to classify the patterns of distracted behavior on different road facilities, obtaining three types of distraction patterns: numbness, delay, and aggressiveness. The model was calibrated and verified by simulation. The results indicate that the model can effectively simulate distracted car-following behavior of different patterns on various roadway facilities, and its performance is better than the traditional IDM with distraction-related error terms. The proposed model overcomes the limitations of physical-constraints-based models in replicating dangerous driving behaviors and the internal characteristics of an individual.
Moreover, the model is demonstrated to effectively generate more dangerous distracted driving scenarios, which can be used to construct high-value automated driving test scenarios.
Keywords: computational cognitive model, driving distraction, microscopic traffic simulation, psychological-physical constraints
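The task-switching decision described above is a standard binary logistic model with risk entropy $H$ as the single regressor (a sketch of the stated setup; the coefficients are whatever the calibration yields):

$$ P(\text{switch to distraction task} \mid H) = \frac{1}{1 + e^{-(\beta_0 + \beta_1 H)}}. $$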
122 A Survey on Lossless Compression of Bayer Color Filter Array Images
Authors: Alina Trifan, António J. R. Neves
Abstract:
Although most digital cameras acquire images in a raw format, based on a Color Filter Array (CFA) that arranges RGB color filters on a square grid of photosensors, most image compression techniques do not use the raw data; instead, they use the RGB result of an interpolation algorithm applied to the raw data. This approach is inefficient: by performing a lossless compression of the raw data, followed by pixel interpolation, digital cameras could be more power efficient and provide images with increased resolution, given that the interpolation step could be shifted to an external processing unit. In this paper, we conduct a survey on the use of lossless compression algorithms with raw Bayer images. Moreover, in order to reduce the effect of the transitions between colors, which increase the entropy of the raw Bayer image, we split the image into three new images corresponding to each channel (red, green and blue) and study the same compression algorithms applied to each one individually. This simple pre-processing stage allows an improvement of more than 15% in predictive-based methods.
Keywords: Bayer image, CFA, lossless compression, image coding standards
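A minimal sketch of the channel-splitting pre-processing stage, assuming an RGGB layout (actual sensors may use GRBG, GBRG, etc.):

```python
import numpy as np

def split_bayer(raw):
    """Split a Bayer mosaic into its colour planes before lossless coding.
    An RGGB layout is assumed here; each plane is smoother than the mosaic,
    which lowers the entropy seen by a predictive coder."""
    r = raw[0::2, 0::2]                                       # red sites
    g = np.concatenate((raw[0::2, 1::2], raw[1::2, 0::2]))    # both green sites
    b = raw[1::2, 1::2]                                       # blue sites
    return r, g, b

raw = np.random.randint(0, 4096, (64, 64), dtype=np.uint16)   # mock 12-bit frame
r, g, b = split_bayer(raw)
print(r.shape, g.shape, b.shape)
```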
121 Performance Study of Cascade Refrigeration System Using Alternative Refrigerants
Authors: Gulshan Sachdeva, Vaibhav Jain, S. S. Kachhwaha
Abstract:
Cascade refrigeration systems employ a series of single-stage vapor compression units which are thermally coupled through evaporator/condenser cascades. A different refrigerant is used in each circuit, depending on the optimum characteristics shown by the refrigerant for a particular application. In the present research study, a steady-state thermodynamic model is developed which simulates the working of an actual cascade system. The model provides the COP and all other system parameters, such as total compressor work, temperature, pressure, enthalpy and entropy at different state points. The working fluid in the low temperature circuit (LTC) is CO2 (R744), while ammonia (R717), propane (R290), propylene (R1270), R404A and R12 are the refrigerants in the high temperature circuit (HTC). The performance curves of ammonia, propane, propylene, and R404A are compared with those of R12 to find its nearest substitute. Results show that ammonia is the best substitute for R12.
Keywords: cascade system, refrigerants, thermodynamic model, production engineering
120 Quantitative Comparisons of Different Approaches for Rotor Identification
Authors: Elizabeth M. Annoni, Elena G. Tolkacheva
Abstract:
Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia and a known prognostic marker for stroke, heart failure and death. Reentrant mechanisms of rotor formation, which are stable electrical sources of cardiac excitation, are believed to cause AF. No existing commercial mapping system has been demonstrated to consistently and accurately predict rotor locations outside of the pulmonary veins in patients with persistent AF. There is a clear need for robust spatio-temporal techniques that can consistently identify rotors using unique characteristics of the electrical recordings at the pivot point and that can be applied to clinical intracardiac mapping. Recently, we have developed four new signal analysis approaches – Shannon entropy (SE), kurtosis (Kt), multi-scale frequency (MSF), and multi-scale entropy (MSE) – to identify the pivot points of rotors. These proposed techniques utilize cardiac signal characteristics other than local activation to uncover the intrinsic complexity of the electrical activity in the rotors, which is not taken into account in current mapping methods. We validated these techniques using high-resolution optical mapping experiments in which direct visualization and identification of rotors in ex-vivo Langendorff-perfused hearts was possible. Episodes of ventricular tachycardia (VT) were induced using burst pacing, and two examples of rotors were used, showing 3-sec episodes of a single stationary rotor and of figure-8 reentry with one rotor stationary and one meandering. Movies were captured at a rate of 600 frames per second for 3 sec with 64x64 pixel resolution. These optical mapping movies were used to evaluate the performance and robustness of the SE, Kt, MSF and MSE techniques with respect to the following clinical limitations: different durations of recordings, different spatial resolutions, and the presence of meandering rotors. To quantitatively compare the results, the SE, Kt, MSF and MSE techniques were compared to the "true" rotor(s) identified using the phase map. Accuracy was calculated for each approach as the duration of the time series and the spatial resolution were reduced. The time series duration was decreased from its original length of 3 sec down to 2, 1, and 0.5 sec. The spatial resolution of the original VT episodes was decreased from 64x64 pixels to 32x32, 16x16, and 8x8 pixels by uniformly removing pixels from the optical mapping video. Our results demonstrate that Kt, MSF and MSE were able to accurately identify the pivot point of the rotor under all three clinical limitations. The MSE approach demonstrated the best overall performance, while Kt was the best at identifying the pivot point of the meandering rotor. Artifacts mildly affect the performance of the Kt, MSF and MSE techniques, but had a strong negative impact on the performance of SE. The results of our study motivate further validation of the SE, Kt, MSF and MSE techniques using intra-atrial electrograms from paroxysmal and persistent AF patients, to see if these approaches can identify pivot points in a clinical setting. More accurate rotor localization could significantly increase the efficacy of catheter ablation to treat AF, resulting in a higher success rate for single procedures.
Keywords: atrial fibrillation, optical mapping, signal processing, rotors
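As an illustration of the simplest of the four metrics, the sketch below computes a histogram-based Shannon entropy for a single pixel's optical-mapping trace; the bin count and the toy signals are assumptions, not the study's data or implementation.

```python
import numpy as np

def shannon_entropy(signal, bins=32):
    """Histogram-based Shannon entropy (bits) of one pixel's time series."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

# toy traces: a pivot-point pixel shows more irregular activity than the periphery
t = np.linspace(0, 3, 1800)                        # 3 s at 600 frames/s
periphery = np.sin(2 * np.pi * 8 * t)              # regular 8 Hz activation
pivot = periphery + 0.8 * np.random.randn(t.size)  # irregular pivot-point activity
print(shannon_entropy(periphery), shannon_entropy(pivot))
```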
119 Metric Suite for Schema Evolution of a Relational Database
Authors: S. Ravichandra, D. V. L. N. Somayajulu
Abstract:
The requirement of stakeholders for adding more details to the database is the main cause of schema evolution in relational databases. Further, this schema evolution causes instability in the database. Hence, this work aims to define a metric suite for the schema evolution of a relational database. The metric suite calculates metrics based on the features of the database, analyses the queries on the database, and measures the coupling, cohesion and component dependencies of the schema for the existing and evolved versions of the database. The metric suite also provides an indicator for problems related to the stability and usability of the evolved database. The degree of change in the schema of a database is presented in the form of graphs that act as an indicator and also provide the relations between the various parameters (metrics) related to the database architecture. The acquired information is used to defend and improve the stability of the database architecture. The challenges that arise in incorporating these metrics, with varying parameters, into a suitable metric suite are discussed. To validate the proposed metric suite, experiments have been performed on publicly available datasets.
Keywords: cohesion, coupling, entropy, metric suite, schema evolution
118 On Generalized Cumulative Past Inaccuracy Measure for Marginal and Conditional Lifetimes
Authors: Amit Ghosh, Chanchal Kundu
Abstract:
Recently, the notion of the cumulative past inaccuracy (CPI) measure has been proposed in the literature as a generalization of cumulative past entropy (CPE) in both the univariate and the bivariate setup. In this paper, we introduce the notion of CPI of order α and study the proposed measure for conditionally specified models of two components that failed at different time instants, called the generalized conditional CPI (GCCPI). We provide some bounds using the usual stochastic order and investigate several properties of GCCPI. The effect of monotone transformations on the proposed measure has also been examined. Furthermore, we characterize some bivariate distributions under the assumption of the conditional proportional reversed hazard rate model. Moreover, the role of GCCPI in reliability modeling has also been investigated for a real-life problem.
Keywords: cumulative past inaccuracy, marginal and conditional past lifetimes, conditional proportional reversed hazard rate model, usual stochastic order
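For orientation, the univariate measures referred to here are commonly defined as follows (the standard definitions from the CPE/CPI literature, with $F_X$, $F_Y$ distribution functions; not quoted from this paper):

$$ \bar{\mathcal{E}}(X) = -\int_0^\infty F_X(x)\,\log F_X(x)\,dx, \qquad \mathrm{CPI}(X,Y) = -\int_0^\infty F_X(x)\,\log F_Y(x)\,dx, $$

with the CPI reducing to the CPE when $X$ and $Y$ are identically distributed.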
117 Eco-Index for Assessing Ecological Disturbances at Downstream of a Hydropower Project
Authors: Chandra Upadhyaya, Arup Kumar Sarma
Abstract:
In the north-eastern part of India, several hydropower projects are being proposed, and execution of some of them has already been initiated. There are controversies surrounding these constructions. The impact of these dams on the downstream part of the rivers needs to be assessed so that the ecosystem and the people living downstream are protected, by redesigning the projects if necessary. This may reduce the stresses on the affected ecosystem and the people living downstream. At present, many index-based ecological methods exist to assess impacts on ecology. However, none of these methods is capable of assessing the effect resulting from dam-induced diurnal variation of flow in the downstream reach. We need an environmental flow methodology, based on a hydrological index, which can address the effect resulting from dam-induced diurnal variation of flow, play an important role in riverine ecosystem management, and provide a qualitative idea about changes in the habitat for aquatic and riparian species.
Keywords: ecosystem, environmental flow assessment, entropy, IHA, TNC
116 Controversies and Contradiction in (Ir)reversibility and the Equilibrium of Reactive Systems
Authors: Joao Teotonio Manzi
Abstract:
Reversibility, irreversibility, equilibrium and steady state, which play a central role in the thermodynamic analysis of processes arising in the context of reactive systems, are discussed in this article. Such concepts have generated substantial doubts, even among the most experienced researchers and engineers, because conclusive or definitive statements cannot be extracted from the literature. Concepts such as the time-reversibility of irreversible processes seem paradoxical and require further analysis. Equilibrium and reversibility, which appear to be of the same nature, have also been re-examined in the light of maximum entropy. The goal of this paper is to revisit and explore these concepts on the basis of classical thermodynamics, in order to understand them better, given their impact on technological advances, and, as a result, to generate an optimal procedure for design, monitoring, and engineering optimization. Furthermore, an effective graphical procedure for dimensioning a plug flow reactor has been provided. Thus, to meet the needs of chemical engineering from a simple conceptual analysis but with significant practical effects, a macroscopic approach is taken so as to integrate the different parts of this paper.
Keywords: reversibility, equilibrium, steady-state, thermodynamics, reactive system
115 A Ratio-Weighted Decision Tree Algorithm for Imbalance Dataset Classification
Authors: Doyin Afolabi, Phillip Adewole, Oladipupo Sennaike
Abstract:
Most well-known classifiers, including the decision tree algorithm, can make predictions on balanced datasets efficiently. However, the decision tree algorithm tends to be biased on imbalanced datasets because of the skewness of their class distribution. To overcome this problem, this study proposes a weighted decision tree algorithm that aims to remove the bias toward the majority class and to prevent the reduction of majority observations in imbalanced dataset classification. The proposed weighted decision tree algorithm was tested on three imbalanced datasets: a cancer dataset, the German credit dataset, and a banknote dataset. The specificity, sensitivity, and accuracy metrics were used to evaluate the performance of the proposed decision tree algorithm on these datasets. The evaluation results show that, for some of the weights of our proposed decision tree, the specificity, sensitivity, and accuracy metrics gave better results than those of the ID3 decision tree and of a decision tree induced with minority entropy, for all three datasets.
Keywords: data mining, decision tree, classification, imbalance dataset
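One plausible way to weight the entropy criterion by class ratio is sketched below; the inverse-frequency weights are an illustrative choice, not necessarily the ratio scheme of the paper.

```python
import numpy as np

def weighted_entropy(counts, weights):
    """Class-weighted Shannon entropy (bits) at a tree node.
    counts: samples per class at the node; weights: per-class weights
    (e.g. inverse class frequencies) -- one plausible weighting, not
    necessarily the ratio scheme proposed in the paper."""
    w = np.asarray(counts, float) * np.asarray(weights, float)
    p = w / w.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# an imbalanced node: 95 majority vs 5 minority samples
print(weighted_entropy([95, 5], [1.0, 1.0]))     # plain entropy, ~0.29 bits
print(weighted_entropy([95, 5], [1/95, 1/5]))    # reweighted, 1.0 bit
```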
114 Contextual Factors of Innovation for Improving Commercial Banks' Performance in Nigeria
Authors: Tomola Obamuyi
Abstract:
The banking system in Nigeria adopted innovative banking with the aim of enhancing financial inclusion, making financial services readily and cheaply available to the majority of the people, and contributing to the efficiency of the financial system. Some of the innovative services include: Automatic Teller Machines (ATMs), National Electronic Fund Transfer (NEFT), Point of Sale (PoS), internet (web) banking, Mobile Money payment (MMO), Real-Time Gross Settlement (RTGS), and agent banking, among others. The introduction of these payment systems is expected to increase bank efficiency and customers' satisfaction, culminating in better performance for the commercial banks. However, opinions differ on the possible effects of the various innovative payment systems on the performance of commercial banks in the country. Thus, this study empirically determines how commercial banks use innovation to gain competitive advantage in the specific context of Nigeria's finance and business. The study also analyses the effects of financial innovation on the performance of commercial banks when different periods of analysis are considered. The study employed secondary data from 2009 to 2018, the period that witnessed aggressive innovation in the financial sector of the country. The Vector Autoregression (VAR) estimation technique was used to forecast the relative variance of each random innovation to the variables in the VAR, to examine the effect of a standard deviation shock to one of the innovations on the current and future values of the impulse response, and to determine the causal relationship between the variables (VAR Granger causality test). The study also employed Multi-Criteria Decision Making (MCDM) to rank the innovations and the performance criteria of Return on Assets (ROA) and Return on Equity (ROE). The entropy method of MCDM was used to determine which of the performance criteria better reflect the contributions of the various innovations in the banking sector, while the Range of Values (ROV) method was used to rank the contributions of the seven innovations to performance. The analysis was done over the medium term (five years) and the long run (ten years) of innovations in the sector. The impulse response function derived from the VAR system indicated that the response of ROA to the values of cheque transactions, NEFT transactions and POS transactions was positive and significant in the periods of analysis. The paper also confirmed with the entropy and range-of-values methods that, in the long run, both CHEQUE and MMO performed best, while NEFT was next in performance. The paper concluded that commercial banks would enhance their performance by continuously improving the services provided through cheques, National Electronic Fund Transfer and Point of Sale, since these instruments have long-run effects on their performance. This will increase the confidence of the populace and encourage more usage/patronage of these services. The banking sector will in turn experience better performance, which will improve the economy of the country.
Keywords: bank performance, financial innovation, multi-criteria decision making, vector autoregression
113 A Network-Theoretical Perspective on Music Analysis
Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria
Abstract:
The present paper describes a framework for constructing mathematical networks encoding relevant musical information from a music score for structural analysis. These graphs aggregate statistical information about musical elements such as notes, chords, rhythms, intervals, etc., and the relations among them, and thus become helpful in visualizing and understanding important stylistic features of a music fragment. In order to build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: predominant elements, the connectivity between them, and the complexity of the information contained in it. Music pieces in different styles are analyzed, and the results are contrasted with the outcome of traditional analysis in order to show the consistency and potential utility of this method for music analysis.
Keywords: computational musicology, mathematical music modelling, music analysis, style classification
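A small sketch of one such graph-theoretic entropy calculation, here the Shannon entropy of the degree distribution of a note-transition graph; the toy graph and the choice of statistic are illustrative assumptions, not the paper's procedure.

```python
from collections import Counter
from math import log2

def degree_entropy(edges):
    """Shannon entropy (bits) of the degree distribution of an undirected graph,
    a simple proxy for the complexity measures described above."""
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    dist = Counter(degree.values())          # how many nodes have each degree
    n = sum(dist.values())
    return -sum((c / n) * log2(c / n) for c in dist.values())

# toy note-transition graph: C and G dominate, as in a tonal fragment
edges = [("C", "E"), ("C", "G"), ("E", "G"), ("G", "B"), ("B", "C")]
print(degree_entropy(edges))
```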
112 Implementation and Comparative Analysis of PET and CT Image Fusion Algorithms
Authors: S. Guruprasad, M. Z. Kurian, H. N. Suma
Abstract:
Medical imaging modalities are becoming life-saving components. These modalities are essential to doctors for proper diagnosis, treatment planning and follow-up. Some modalities provide anatomical information, such as Computed Tomography (CT), Magnetic Resonance Imaging (MRI) and X-rays, and some provide only functional information, such as Positron Emission Tomography (PET). Therefore, a single-modality image does not give complete information. This paper presents the fusion of the structural information in CT with the functional information present in PET images. The fused image is essential for detecting the stages and locations of abnormalities and is particularly needed in oncology for improved diagnosis and treatment. We have implemented and compared image fusion techniques, namely the pyramid, wavelet, and principal component fusion methods, along with a hybrid method combining the discrete wavelet transform (DWT) and principal component analysis (PCA). The performance of the algorithms is evaluated quantitatively and qualitatively. The system is implemented and tested using MATLAB software. Based on the MSE, PSNR and entropy analysis, the PCA and DWT-PCA methods showed the best results over all experiments.
Keywords: image fusion, pyramid, wavelets, principal component analysis
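The quantitative metrics named above are standard; a minimal sketch follows (the paper's implementation is in MATLAB, this is an equivalent rendering, with random stand-in images).

```python
import numpy as np

def psnr(reference, fused, peak=255.0):
    """Peak signal-to-noise ratio (dB); assumes the two images differ."""
    mse = np.mean((reference.astype(float) - fused.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of the grey-level histogram."""
    counts, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

ct = np.random.randint(0, 256, (128, 128))                         # stand-in for a CT slice
fused = np.clip(ct + np.random.randint(-5, 6, ct.shape), 0, 255)   # stand-in fused result
print(psnr(ct, fused), image_entropy(fused))
```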
111 Synthesis and Characterization of Thiourea-Formaldehyde Coated Fe3O4 (TUF@Fe3O4) and Its Application for Adsorption of Methylene Blue
Authors: Saad M. Alshehri, Tansir Ahamad
Abstract:
A thiourea-formaldehyde pre-polymer (TUF) was prepared by the reaction of thiourea and formaldehyde in a basic medium and used as a coating material for magnetite (Fe3O4). The synthesized polymer-coated microspheres (TUF@Fe3O4) were characterized using FTIR, TGA, SEM and TEM. The BET surface area was up to 1680 m² g⁻¹. The adsorption capacity of the product was evaluated through its adsorption of methylene blue (MB) in water under different pH values and temperatures. We found that the adsorption process was well described by both the Langmuir and Freundlich isotherm models. The kinetic processes of MB adsorption onto TUF@Fe3O4 were described in order to provide a clearer interpretation of the adsorption rate and uptake mechanism. The overall kinetic data was acceptably explained by a pseudo-second-order rate model. The evaluated ΔG° and ΔH° indicate the spontaneous and exothermic nature of the reaction. The adsorption takes place with a decrease in entropy (ΔS° is negative). The monolayer capacity for MB was up to 450 mg g⁻¹, one of the highest among similar polymeric products, owing to the large BET surface area.
Keywords: TGA, FTIR, magnetite, thiourea-formaldehyde resin, methylene blue, adsorption
110 Thermodynamics of Stable Micro Black Holes Production by Modeling from the LHC
Authors: Aref Yazdani, Ali Tofighi
Abstract:
We study a simulative model for the production of stable micro black holes, based on an investigation of the thermodynamics of the LHC experiment. We show how this production can be achieved through a thermodynamic process of stabilization. Indeed, this process can be driven by a very small amount of powerful fuel. By applying the second law of black hole thermodynamics at the scale of quantum gravity, together with a perturbation expansion of the given entropy function, a time-dependent potential function is obtained, which is illustrated with exact numerical values in higher dimensions. Identifying the conditions for the stability of micro black holes is another purpose of this study. This is demonstrated through a method of injecting the exact amount of energy into the final phase of the production, equivalent to injecting the same energy into the center of collision at the LHC, in order to stabilize the produced particles. Injection of energy into the center of collision at the LHC is a new pattern that is worth trying for the first time.
Keywords: micro black holes, LHC experiment, black hole thermodynamics, extra dimensions model
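For context, the entropy of black hole thermodynamics referred to here is, in its standard four-dimensional form, the Bekenstein-Hawking entropy, with the second law asserting that it is non-decreasing; in higher-dimensional models $A$ is replaced by the $(d-2)$-dimensional horizon area. This is textbook background, not the paper's perturbed entropy function:

$$ S_{BH} = \frac{k_B c^3 A}{4 G \hbar}, \qquad dS_{BH} \ge 0. $$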
109 Towards Establishing a Universal Theory of Project Management
Authors: Divine Kwaku Ahadzie
Abstract:
Project management (PM) as a concept has evolved from the early 20th century into a recognized academic and professional discipline, and indications are that it has come to stay in the 21st century as a world-wide paradigm shift for managing successful construction projects. However, notwithstanding the strong inroads that PM has made in legitimizing its academic and professional status in construction management practice, the underlying philosophies are still based on cases and conventional practices. An important theoretical issue yet to be addressed is the lack of a universal theory that offers philosophical legitimacy for the PM concept as a uniquely specialized management concept. Here, it is hypothesized that the law of entropy, the theory of uncertainties and the theory of risk management offer plausible explanations for addressing the lacuna of what constitutes PM theory. The theoretical bases of these plausible underlying theories are argued, and attempts are made to establish the functional relationships that exist between these theories and the PM concept. The paper then draws on data related to the success and/or failure of a number of construction projects to validate the theory.
Keywords: concepts, construction, project management, universal theory
108 Maxwell's Economic Demon Hypothesis and the Impossibility of Economic Convergence of Developing Economies
Authors: Firano Zakaria, Filali Adib Fatine
Abstract:
The issue of convergence in theoretical models (classical or Keynesian) has been widely discussed. The results of this work affirm that most countries are seeking to get as close as possible to a steady state in order to catch up with developed countries. In this paper, we retest the question of whether this convergence is absolute or conditional. The results affirm that the degree of convergence of countries like Morocco is very low, and income is still far from its equilibrium state. Moreover, the analysis of financial convergence of the countries in our panel shows that the pace in this sector is more intense: countries are converging more rapidly in financial terms. The question arises as to why, with a fairly convergent financial system, growth does not respond, although the financial system should facilitate this economic convergence. Our results confirm that the degree of information exchange between the financial system and the economic system did not change significantly between 1985 and 2017. This supports the hypothesis that the financial system is failing to serve its role as a creator of information in developing countries, despite all the reforms undertaken, thus making the case that an economic Maxwell's demon prevails.
Keywords: economic convergence, financial convergence, financial system, entropy
107 Improved Rare Species Identification Using Focal Loss Based Deep Learning Models
Authors: Chad Goldsworthy, B. Rajeswari Matam
Abstract:
The use of deep learning for species identification in camera trap images has revolutionised our ability to study, conserve and monitor species in a highly efficient and unobtrusive manner, with state-of-the-art models achieving accuracies surpassing those of manual human classification. The high imbalance of camera trap datasets, however, results in poor accuracies for minority (rare or endangered) species, due to their relative insignificance to the overall model accuracy. This paper investigates the use of Focal Loss, in comparison to the traditional cross entropy loss function, to improve the identification of minority species in the "255 Bird Species" dataset from Kaggle. The results show that, although Focal Loss slightly decreased the accuracy on the majority species, it was able to increase the F1-score by 0.06 and improve the identification of the bottom two, five and ten (minority) species by 37.5%, 15.7% and 10.8%, respectively, as well as improving the overall accuracy by 2.96%.
Keywords: convolutional neural networks, data imbalance, deep learning, focal loss, species classification, wildlife conservation
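A minimal NumPy sketch of the binary focal loss of Lin et al. (2017) next to plain cross entropy; γ = 2 and α = 0.25 are the commonly used defaults, not necessarily the values used in this study.

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: FL(p_t) = -alpha_t * (1 - p_t)**gamma * log(p_t).
    p: predicted probability of the positive class; y: 0/1 labels."""
    p_t = np.where(y == 1, p, 1.0 - p)
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)

p = np.array([0.9, 0.6, 0.1])    # an easy, a borderline and a hard positive
y = np.array([1, 1, 1])
print(focal_loss(p, y))           # well-classified examples are strongly down-weighted
print(-np.log(p))                 # plain cross entropy, for comparison
```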
106 Removal of Lead from Aqueous Solutions by Biosorption on Pomegranate Skin: Kinetics, Equilibrium and Thermodynamics
Authors: Y. Laidani, G. Henini, S. Hanini, A. Labbaci, F. Souahi
Abstract:
In this study, pomegranate skin, a material suited to the conditions in Algeria, was chosen as the adsorbent for the removal of lead from aqueous solution. Biosorption studies were carried out under various parameters such as adsorbent mass, pH, contact time, initial metal concentration, and temperature. The experimental results show that the percentage of biosorption increases with an increase in the biosorbent mass (0.25 g, 0.035 mg/g; 1.25 g, 0.096 mg/g). The maximum biosorption occurred at a pH value of 8 for lead. The equilibrium uptake increased with an increase in the initial concentration of metal in solution (C0 = 4 mg/L, qt = 1.2 mg/g). Biosorption kinetic data were properly fitted with the pseudo-second-order kinetic model. The best fit was obtained by the Langmuir model, with high correlation coefficients (R² > 0.995) and a maximum monolayer adsorption capacity of 0.85 mg/g for lead. The adsorption of lead was exothermic in nature (ΔH° = -17.833 kJ/mol for Pb(II)), and the reaction was accompanied by a decrease in entropy (ΔS° = -0.056 kJ/(K·mol)). The Gibbs energy (ΔG°) increased from -1.458 to -0.305 kJ/mol for Pb(II) when the temperature was increased from 293 to 313 K.
Keywords: biosorption, Pb(II), pomegranate skin, wastewater
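The reported values are internally consistent with the standard relation $\Delta G^\circ = \Delta H^\circ - T\Delta S^\circ$; at 313 K, for example,

$$ \Delta G^\circ = -17.833 - 313\times(-0.056) = -0.305\ \text{kJ/mol}, $$

matching the reported figure exactly (the 293 K value agrees to within rounding of $\Delta S^\circ$).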
105 Adsorption Isotherm, Kinetic and Mechanism Studies of Some Substituted Phenols from Aqueous Solution by Jujuba Seeds Activated Carbon
Authors: O. Benturki, A. Benturki
Abstract:
Activated carbon was prepared from jujube seeds by chemical activation with potassium hydroxide (KOH), followed by pyrolysis at 800°C. Batch studies were conducted for kinetic, thermodynamic and equilibrium studies of the adsorption of phenol (P) and 2,4-dichlorophenol (2,4-DCP) from aqueous solution; the adsorption capacities followed the order 2,4-dichlorophenol > phenol. The operating variables studied were the initial phenol concentration, contact time, temperature and solution pH. Results show that a pH value of 7 is favorable for the adsorption of phenols. The sorption data have been analyzed using the Langmuir and Freundlich isotherms; the isotherm data followed the Langmuir model. The adsorption processes conformed to pseudo-second-order rate kinetics. Thermodynamic parameters such as the enthalpy, entropy and Gibbs free energy changes were also calculated, and it was found that the sorption of phenols by jujube seeds activated carbon is a spontaneous process. The maximum adsorption capacity for phenol and 2,4-dichlorophenol was 142.85 mg·g⁻¹ and 250 mg·g⁻¹, respectively.
Keywords: activated carbon, adsorption, isotherms, jujube seeds, phenols, Langmuir
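A short sketch of fitting the Langmuir isotherm $q_e = q_m K_L C_e / (1 + K_L C_e)$ to batch equilibrium data; the data points below are hypothetical, chosen only so the fitted $q_m$ lands near the 250 mg·g⁻¹ reported for 2,4-DCP.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qm, KL):
    """Langmuir isotherm: qe = qm * KL * Ce / (1 + KL * Ce)."""
    return qm * KL * Ce / (1.0 + KL * Ce)

# hypothetical equilibrium data (mg/L, mg/g) -- not the paper's measurements
Ce = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
qe = np.array([55.0, 95.0, 160.0, 205.0, 230.0])

(qm, KL), _ = curve_fit(langmuir, Ce, qe, p0=(250.0, 0.05))
print(f"qm = {qm:.1f} mg/g, KL = {KL:.4f} L/mg")
```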