Search results for: gravitational search algorithm
1474 Analysis of Non-Uniform Characteristics of Small Underwater Targets Based on Clustering
Authors: Tianyang Xu
Abstract:
Small underwater targets generally have a non-centrosymmetric geometry, and under active sonar detection conditions the acoustic scattering field of the target is spatially inhomogeneous. To address this problem, this paper takes the hemispherical-capped cylindrical shell as the research object and, considering the angle continuity implied in the echo characteristics, proposes a cluster-driven method for studying the non-uniform angular characteristics of the target echo. First, the target echo features are extracted and feature vectors are constructed. Second, the t-SNE algorithm is used to preserve the internal relationships of the feature vectors in a low-dimensional feature space and to construct a visual feature space. Finally, the implicit angular relationship between echo features is extracted under unsupervised conditions by cluster analysis. The reconstruction results of the local geometric structure of the target corresponding to the different categories show that the method can effectively divide the angle intervals of the target's local structure according to its natural acoustic scattering characteristics.
Keywords: underwater target, non-uniform characteristics, cluster-driven method, acoustic scattering characteristics
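A minimal sketch of the embed-then-cluster pipeline described above, assuming synthetic echo feature vectors; the feature dimension, t-SNE perplexity, and cluster count are illustrative placeholders, not the paper's values.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.normal(size=(360, 32))   # one 32-D echo feature vector per aspect angle

# Step 2: embed into a low-dimensional, visualizable feature space.
embedded = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)

# Step 3: unsupervised clustering recovers implicit angle intervals.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(embedded)
for k in np.unique(labels):
    angles = np.where(labels == k)[0]   # row index stands in for aspect angle
    print(f"cluster {k}: angles {angles.min()}..{angles.max()} deg")
```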
Procedia PDF Downloads 134
1473 Economic Analysis of Rainwater Harvesting Systems for Dairy Cattle
Authors: Sandra Cecilia Muhirirwe, Bart Van Der Bruggen, Violet Kisakye
Abstract:
Economic analysis of rainwater harvesting (RWH) systems is vital in the search for a cost-effective solution to water unreliability, especially in low-income countries, yet there is little literature focusing on the financial aspects of RWH for dairy farmers. The main purpose was to assess the economic viability of rainwater harvesting for dairy farmers in the Rwenzori region. The study focused on rooftop rainwater harvesting systems with collection in above-surface tanks. Daily rainfall time series covering 12 years were obtained across nine gauging stations. The daily water balance equation was used for optimal sizing of the tank. Economic analysis of the investment was carried out based on the life cycle costs and the accruing benefits over a period of 15 years. Roof areas were varied from 75 m², the minimum required area, to 500 m², while maintaining the same number of cattle and keeping the daily water demand constant. The results show that the required rainwater tank sizes are very large and may be impractical to install due to the strongly varying terrain and the initial cost of investment. In all districts, the volume of the required tank decreases significantly with increasing collection area, although further increases in collection area have only a minor additional effect on the required tank size. Generally, for all rainfall areas, reliability increases with roof area, but 100% reliability can only be realized with very large collection areas that are impractical to install. The estimated benefits outweigh the cost of investment: the Net Present Value shows that the investment is economically viable, with a payback period of at most 3 years for all the time series in the study area.
Keywords: dairy cattle, optimisation, rainwater harvesting, economic analysis
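A minimal sketch of daily water-balance tank sizing, assuming a simple yield-after-spillage model; the runoff coefficient, demand, and rainfall series are illustrative placeholders, not the study's data.

```python
import numpy as np

def simulate(rain_mm, roof_m2, tank_m3, demand_m3, runoff_coeff=0.8):
    storage, met_days = 0.0, 0
    for r in rain_mm:
        inflow = runoff_coeff * roof_m2 * r / 1000.0   # mm over m2 -> m3
        storage = min(tank_m3, storage + inflow)        # spill above capacity
        supplied = min(storage, demand_m3)
        storage -= supplied
        met_days += supplied >= demand_m3
    return met_days / len(rain_mm)                      # reliability

rain = np.random.default_rng(1).gamma(0.5, 8.0, size=12 * 365)  # synthetic daily rain
for area in (75, 150, 300, 500):
    print(area, "m2 ->", round(simulate(rain, area, tank_m3=30, demand_m3=0.6), 3))
```

Reliability here is the fraction of days on which the daily demand is fully met, which rises with roof area as the abstract reports.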
Procedia PDF Downloads 205
1472 Median-Based Nonparametric Estimation of Returns in Mean-Downside Risk Portfolio Frontier
Authors: H. Ben Salah, A. Gannoun, C. de Peretti, A. Trabelsi
Abstract:
The Downside Risk (DSR) model for portfolio optimisation makes it possible to overcome the drawbacks of the classical mean-variance model concerning the asymmetry of returns and the risk perception of investors. The optimization in this model deals with a positive definite matrix that is endogenous with respect to the portfolio weights, which makes the problem far more difficult to handle. For this purpose, Athayde (2001) developed a new recursive minimization procedure that ensures convergence to the solution. However, when only a finite number of observations is available, the portfolio frontier is not very smooth. To overcome that, Athayde (2003) proposed a mean kernel estimation of the returns so as to create a smoother portfolio frontier; this technique produces an effect similar to having continuous observations. In this paper, taking advantage of the robustness of the median, we replace the mean estimator in Athayde's model by a nonparametric median estimator of the returns, and we give a new version of the former algorithm of Athayde (2001, 2003). We finally analyse the properties of this improved portfolio frontier and apply the new method to real examples.
Keywords: downside risk, kernel method, median, nonparametric estimation, semivariance
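A minimal sketch of the mean-downside-risk trade-off, assuming the below-target semivariance definition; the returns are synthetic, and the median replaces the mean as the per-asset return estimator, echoing the paper's robustness argument. This is not Athayde's full recursive scheme.

```python
import numpy as np

rng = np.random.default_rng(2)
R = rng.standard_t(df=4, size=(1000, 3)) * 0.01     # heavy-tailed asset returns
mu = np.median(R, axis=0)                           # robust location estimate

def downside_risk(w, benchmark=0.0):
    port = R @ w
    shortfall = np.minimum(port - benchmark, 0.0)
    return np.mean(shortfall ** 2)                  # semivariance below the benchmark

w = np.ones(3) / 3                                  # equal-weight example portfolio
print("portfolio median return:", mu @ w, "downside risk:", downside_risk(w))
```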
Procedia PDF Downloads 493
1471 Optimization of Reinforced Concrete Buildings According to the Algerian Seismic Code
Authors: Nesreddine Djafar Henni, Nassim Djedoui, Rachid Chebili
Abstract:
Recent decades have witnessed significant efforts to optimize different types of structures and components. The concept of cost optimization in reinforced concrete structures, which aims at minimizing financial resources while ensuring maximum building safety, involves multiple materials; the objective function for optimal design is derived from the construction cost of the steel and the concrete, which contribute significantly to the overall weight of reinforced concrete (RC) structures. To achieve this objective, this work is devoted to optimizing the structural design of 3D RC frame buildings in a way that integrates, for the first time, the Algerian regulations. Three different test examples were investigated to assess the efficiency of the approach. The hybrid GWOPSO algorithm is used, and 30,000 generations are run, with the building cost reduced at each iteration. The building cost accounts for both the concrete and the reinforcement bars. As a result, the cost of the reinforced concrete structure is reduced by 30% compared with the initial design, meaning that the 3D cost-design optimization of the framed structure is successfully achieved.
Keywords: optimization, automation, API, Matlab, RC structures
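A minimal sketch of a grey-wolf-style search loop on a placeholder cost function; the paper's method couples this with PSO terms and with the Algerian seismic code constraints, both of which are omitted here, and all settings are illustrative.

```python
import numpy as np

def cost(x):                          # stand-in for concrete + steel cost of a design
    return np.sum((x - 3.0) ** 2) + 10.0

rng = np.random.default_rng(3)
wolves = rng.uniform(0, 10, size=(20, 5))            # candidate design vectors
for t in range(200):
    order = np.argsort([cost(w) for w in wolves])
    alpha, beta, delta = wolves[order[:3]]           # three best designs lead the pack
    a = 2.0 * (1 - t / 200)                          # exploration weight decays over time
    for i in range(len(wolves)):
        guide = np.stack([alpha, beta, delta])
        A = a * (2 * rng.random(guide.shape) - 1)
        C = 2 * rng.random(guide.shape)
        wolves[i] = np.mean(guide - A * np.abs(C * guide - wolves[i]), axis=0)
print("best cost:", min(cost(w) for w in wolves))
```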
Procedia PDF Downloads 49
1470 The Effectiveness of a Hybrid Diffie-Hellman-RSA-Advanced Encryption Standard Model
Authors: Abdellahi Cheikh
Abstract:
With the emergence of quantum computers with very powerful capabilities, the security of the exchange of shared keys between two interlocutors poses a serious problem given the rapid development of computing power and speed. The Diffie-Hellman (DH) algorithm is therefore more vulnerable than ever: no mechanism guarantees the security of the basic key exchange, so an intermediary who manages to intercept the exchanged values can easily compromise the key. In this regard, several studies have been conducted to improve the security of key exchange between two interlocutors, which has led to interesting results. This work modifies our Diffie-Hellman-RSA-AES (DRA) model, which encrypts the information exchanged between two users using the three encryption algorithms DH, RSA, and AES, by using steganographic images to hide the contents of the p, g, and ClesAES values that are sent in an unencrypted state in the original DRA model to compute each user's public key. This work includes a comparative study between the DRA model and the existing solutions, as well as the modification made to this model, with an emphasis on reliability in terms of security. A simulation is presented to demonstrate the effectiveness of the modification made to the DRA model. The obtained results show that our model has a security advantage over the existing solutions, so these changes reinforce the security of the DRA model.
Keywords: Diffie-Hellman, DRA, RSA, advanced encryption standard
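A minimal sketch of a DH exchange whose shared secret is hashed into an AES-size key, using only the standard library. The small Mersenne prime and generator are toy values for illustration (real deployments use standardized large groups), and the steganographic hiding of p, g, and the AES key described above is not reproduced here.

```python
import hashlib
import secrets

p = 2**127 - 1                        # Mersenne prime, used only as a toy modulus
g = 5                                 # toy generator; not a vetted group parameter

a = secrets.randbelow(p - 2) + 1      # Alice's private exponent
b = secrets.randbelow(p - 2) + 1      # Bob's private exponent
A = pow(g, a, p)                      # public values exchanged in the clear
B = pow(g, b, p)

shared_alice = pow(B, a, p)           # both sides compute the same secret
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob

# hash the shared secret down to a 32-byte AES-256 key
aes_key = hashlib.sha256(shared_alice.to_bytes(16, "big")).digest()
print("derived key:", aes_key.hex())
```

An unauthenticated exchange like this is exactly what a man-in-the-middle can subvert, which is the weakness the hybrid DRA model targets.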
Procedia PDF Downloads 94
1469 Optimum Turbomachine Preliminary Selection for Power Regeneration in Vapor Compression Cool Production Plants
Authors: Sayyed Benyamin Alavi, Giovanni Cerri, Leila Chennaoui, Ambra Giovannelli, Stefano Mazzoni
Abstract:
Sustainability concerns about primary energy consumption and pollutant emissions (including CO2) call for methodologies that lower the power absorbed per unit of a given product. Cool production plants based on vapour compression are widely used for many applications: air conditioning, food conservation, domestic refrigerators and freezers, special industrial processes, etc. In the field of cool production, the yearly consumed primary energy is enormous; thus, saving even some percentage of it has a big worldwide impact on energy consumption and energy sustainability. Among the various techniques to reduce the power required by a Vapour Compression Cool Production Plant (VCCPP), this paper considers the technique based on power regeneration by means of an Internal Direct Cycle (IDC). The power produced by the IDC reduces the power needed per unit of cooling power produced by the VCCPP. The paper contains the basic concepts that lead to the development of IDCs and the proposed options for using the IDC power. Among the various turbomachine options, Best Economically Available Technologies (BEATs) have been explored; being based on vehicle engine turbochargers, they have been taken into consideration for this application. According to the BEAT database and similarity rules, the best turbomachine selection leads to the minimum nominal power required by the VCCPP main compressor. Results obtained by installing the prototype in an ad hoc designed test bench will be discussed and compared with the expected performance. Forecasts for upgrading VCCPPs in various applications will be given and discussed: a 4-6% saving is expected for air-conditioning cooling plants and 15-22% for cryogenic plants.
Keywords: refrigeration plant, vapour pressure amplifier, compressor, expander, turbine, turbomachinery selection, power saving
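A minimal sketch of the similarity (affinity) rules commonly used in this kind of preliminary turbomachine selection: scaling flow, head, and power from a known machine to a new speed and diameter. The baseline numbers are illustrative, not taken from the turbocharger database mentioned above.

```python
def scale(Q, H, P, n_ratio, d_ratio):
    """Affinity laws: Q ~ N*D^3, H ~ N^2*D^2, P ~ N^3*D^5."""
    return (Q * n_ratio * d_ratio**3,
            H * n_ratio**2 * d_ratio**2,
            P * n_ratio**3 * d_ratio**5)

# known machine: 0.05 m3/s, 120 m head, 75 kW; rescale to 1.2x speed, 0.9x diameter
Q2, H2, P2 = scale(Q=0.05, H=120.0, P=75.0, n_ratio=1.2, d_ratio=0.9)
print(f"scaled: Q={Q2:.4f} m3/s, H={H2:.1f} m, P={P2:.1f} kW")
```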
Procedia PDF Downloads 426
1468 Parameter Tuning of Complex Systems Modeled in Agent Based Modeling and Simulation
Authors: Rabia Korkmaz Tan, Şebnem Bora
Abstract:
The major problem encountered when modeling complex systems with agent-based modeling and simulation techniques is the existence of large parameter spaces. A complex system model cannot be expected to reflect the whole of the real system, but by specifying the most appropriate parameters, the actual system can be represented by the model under certain conditions. A review of the studies conducted in recent years shows that there are few studies on the parameter tuning problem in agent-based simulations, and that these have focused on tuning the parameters of a single model. In this study, a parameter tuning approach is proposed that uses metaheuristic algorithms such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), and Firefly (FA) algorithms. With this hybrid-structured approach, the parameter tuning problems of models in different fields were solved. The proposed approach was tested on two different models, and its performance on the different problems was compared. The simulations and the results reveal that the proposed approach outperforms the existing parameter tuning studies.
Keywords: parameter tuning, agent based modeling and simulation, metaheuristic algorithms, complex systems
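A minimal sketch of simulation-in-the-loop parameter tuning: a candidate parameter vector drives the (here trivial) agent-based model, and its fitness is the error against reference data. A plain random search stands in for the GA/PSO/ABC/Firefly metaheuristics compared in the paper; the model and reference statistics are invented.

```python
import numpy as np

reference = np.array([0.8, 0.3])                 # observed real-system statistics

def run_model(params, rng):
    # placeholder ABM: summary statistics depend noisily on the parameters
    return params * (1 + 0.01 * rng.standard_normal(2))

def fitness(params, rng):
    return np.linalg.norm(run_model(params, rng) - reference)

rng = np.random.default_rng(4)
best, best_f = None, np.inf
for _ in range(500):                             # a metaheuristic would replace this loop
    cand = rng.uniform(0, 1, size=2)
    f = fitness(cand, rng)
    if f < best_f:
        best, best_f = cand, f
print("tuned parameters:", best, "error:", best_f)
```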
Procedia PDF Downloads 229
1467 Application and Assessment of Artificial Neural Networks for Biodiesel Iodine Value Prediction
Authors: Raquel M. De sousa, Sofiane Labidi, Allan Kardec D. Barros, Alex O. Barradas Filho, Aldalea L. B. Marques
Abstract:
Several parameters are established in order to measure biodiesel quality. One of them is the iodine value, an important parameter that measures the total unsaturation within a mixture of fatty acids. Limiting unsaturated fatty acids is necessary, since heating larger quantities of them leads either to the formation of deposits inside the motor or to damage of the lubricant. Because determination of the iodine value by the official procedure tends to be very laborious, with high costs and toxic reagents, this study uses an artificial neural network (ANN) to predict the iodine value as an alternative to these problems. The network development methodology used 13 fatty acid esters as inputs, and backpropagation-type training algorithms were optimized in order to obtain an architecture for iodine value prediction. This study demonstrates the ability of neural networks to learn the correlation between biodiesel quality properties, in this case the iodine value, and the molecular structures that make it up. The model developed in the study reached a correlation coefficient (R) of 0.99 for both network validation and network simulation with the Levenberg-Marquardt algorithm.
Keywords: artificial neural networks, biodiesel, iodine value, prediction
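A minimal sketch of the regression setup: 13 fatty-acid-ester fractions in, iodine value out. The data are synthetic, and sklearn's MLPRegressor with the quasi-Newton L-BFGS solver stands in for the Levenberg-Marquardt training used in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
X = rng.dirichlet(np.ones(13), size=300)         # ester composition, rows sum to 1
iv_coeff = rng.uniform(0, 180, size=13)          # pretend per-ester unsaturation weights
y = X @ iv_coeff                                 # synthetic iodine values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs", max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", round(net.score(X_te, y_te), 3))
```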
Procedia PDF Downloads 608
1466 The Relations among Business Model, Higher Education, University and Entrepreneurship Education: An Analysis of Academic Literature of 2009-2019 Period
Authors: Elzo Alves Aranha, Marcio M. Araki
Abstract:
Business model (BM) is a term that has been receiving the attention of scholars and practitioners and has been consolidating itself as a field of study and research. Although there is no agreement in the academic literature on the definition of BM, there is at least one explicit agreement: a BM defines the logical structure of how an organization creates, captures, and delivers value for customers and stakeholders. The lack of understanding about the connections and elements among BM and higher education, university, and entrepreneurship education opens a gap in the academic literature. It is therefore interesting to analyze how BM has been approached by the literature and applied in higher education, university, and entrepreneurship education, in order to identify the main streams of research. This matters because higher education institutions are characterized by innovation, leading to a greater acceptance of new and modern concepts such as BM. Our main motivation is to fill this gap in the academic literature, making it possible to better understand the connections among BM and higher education, university, and entrepreneurship education. The objective of the research is to analyze the main aspects linking BM with higher education, university, and entrepreneurship education in the academic literature. The research followed the systematic literature review (SLR) method, which is based on three main factors: clarity, validity, and auditability. Eighty-two academic papers from the 2009-2019 period were found; the search was carried out in the Science Direct and Periodicos Capes databases. The main findings indicate that there are links between BM and higher education, BM and university, and BM and entrepreneurship education, and that these links fall within seven aspects. The findings are innovative and help to deepen understanding of these connections in the academic literature, addressing the gap exposed above. The research findings have several practical implications, of which we highlight two. First, researchers will be able to use the findings to develop a BM research agenda involving the connections between BM and higher education, BM and university, and BM and entrepreneurship education. Second, directors, deans, and university leaders will be able to carry out BM awareness programs and BM training programs for professors, and to plan for the inclusion of BM as one of the components of the curricula of undergraduate and graduate courses.
Keywords: business model, entrepreneurship education, higher education, university
Procedia PDF Downloads 187
1465 Cash Flow Optimization on Synthetic CDOs
Authors: Timothée Bligny, Clément Codron, Antoine Estruch, Nicolas Girodet, Clément Ginet
Abstract:
Collateralized Debt Obligations are not as widely used nowadays as they were before the 2007 subprime crisis. Nonetheless, there remains an enthralling challenge in optimizing the cash flows associated with synthetic CDOs. A Gaussian-based model is used here, in which default correlation and unconditional probabilities of default are highlighted. Numerous simulations are then performed based on this model for different scenarios, in order to evaluate the associated cash flows given a specific number of defaults at different periods of time. Cash flows are not calculated solely on a single bought or sold tranche but rather on a combination of bought and sold tranches. Under some assumptions, the simplex algorithm gives a way to find the maximum cash flow according to the correlation of defaults and the maturities. The Gaussian model used is not realistic in crisis situations, and the present system does not handle buying or selling a portion of a tranche, only the whole tranche. However, the work provides the investor with relevant elements on what to buy and sell, and when.
Keywords: synthetic collateralized debt obligation (CDO), credit default swap (CDS), cash flow optimization, probability of default, default correlation, strategies, simulation, simplex
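A minimal sketch of the cash-flow maximization posed as a linear program and solved with SciPy; the expected per-tranche cash flows and the budget constraint are invented placeholders, not the paper's model. Note also that this relaxation allows fractional positions, whereas the system described above trades whole tranches only.

```python
import numpy as np
from scipy.optimize import linprog

cashflow = np.array([4.0, 2.5, 1.0, -1.5])   # expected cash flow per unit of tranche
cost = np.array([3.0, 2.0, 1.0, -2.0])       # upfront cost (negative = premium received)

res = linprog(c=-cashflow,                   # linprog minimizes, so negate to maximize
              A_ub=[cost], b_ub=[5.0],       # budget constraint
              bounds=[(0, 1)] * 4,           # hold between 0 and 1 of each tranche
              method="highs")
print("tranche weights:", res.x, "max expected cash flow:", -res.fun)
```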
Procedia PDF Downloads 276
1464 An Efficient Encryption Scheme Using DWT and Arnold Transforms
Authors: Ali Abdrhman M. Ukasha
Abstract:
Data security is needed in data transmission, storage, and communication. The color image is decomposed into red, green, and blue channels. The blue and green channels are compressed using a 3-level discrete wavelet transform. The Arnold transform is used to change the locations of the red channel's pixels as an image scrambling step. All the channels are then encrypted separately using a key image that has the same size as the original and is generated using private keys and modulo operations. XOR and modulo operations are performed between the encrypted channel images in order to change the image pixel values. The contours of the recovered color image can be extracted with an acceptable level of distortion using the Canny edge detector. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D color image, which can then be completely reconstructed without any distortion. It has been shown that the color image can be protected at a higher security level. The presented method is easy to implement in hardware and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
Keywords: color image, wavelet transform, edge detector, Arnold transform, lossy image encryption
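A minimal sketch of Arnold-transform scrambling on a square channel: the cat map (x, y) -> (x + y, x + 2y) mod N permutes pixel positions and is exactly invertible, which is what makes lossless unscrambling possible. The tiny 8x8 array stands in for the red channel.

```python
import numpy as np

def arnold(channel, iterations=1):
    n = channel.shape[0]                       # requires a square N x N channel
    out = channel
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        nx, ny = (x + y) % n, (x + 2 * y) % n  # the Arnold cat map, a bijection
        scrambled = np.empty_like(out)
        scrambled[nx, ny] = out[x, y]
        out = scrambled
    return out

red = np.arange(64, dtype=np.uint8).reshape(8, 8)
scrambled = arnold(red, iterations=3)
# the map is periodic: iterating enough times returns the original channel
assert not np.array_equal(red, scrambled)
```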
Procedia PDF Downloads 487
1463 Progressive Multimedia Collection Structuring via Scene Linking
Authors: Aman Berhe, Camille Guinaudeau, Claude Barras
Abstract:
In order to facilitate information seeking in large collections of multimedia documents with long and progressive content (such as broadcast news or TV series), one can extract the semantic links that exist between semantically coherent parts of documents, i.e., scenes. The links can then create a coherent collection of scenes on which it is easier to perform content analysis, topic extraction, or information retrieval. In this paper, we focus on TV series structuring and propose two approaches for scene linking at different levels of granularity (episode and season): a fuzzy online clustering technique and a graph-based community detection algorithm. When evaluated on the first two seasons of the TV series Game of Thrones, the fuzzy online clustering approach performed better than graph-based community detection at the episode level, while the graph-based approach showed better performance at the season level.
Keywords: multimedia collection structuring, progressive content, scene linking, fuzzy clustering, community detection
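A minimal sketch of the graph-based variant: scenes are nodes, edges are weighted by (here random) semantic similarity, and modularity communities give the linked-scene groups. The similarity function and threshold are placeholders for the semantic features used in the paper.

```python
import itertools
import networkx as nx
import numpy as np

rng = np.random.default_rng(6)
scenes = [f"scene_{i}" for i in range(12)]
G = nx.Graph()
for a, b in itertools.combinations(scenes, 2):
    sim = rng.random()                         # stand-in for semantic similarity
    if sim > 0.6:                              # keep only sufficiently similar pairs
        G.add_edge(a, b, weight=sim)

communities = nx.algorithms.community.greedy_modularity_communities(G, weight="weight")
for i, c in enumerate(communities):
    print(f"community {i}: {sorted(c)}")
```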
Procedia PDF Downloads 101
1462 Development of Carrageenan-Psyllium/Montmorillonite Clay Hybrid Hydrogels for Agriculture Purpose
Authors: D. Aydinoglu, N. Karaca, O. Ceylan
Abstract:
Limited water resources are among the most alarming issues on Earth. In this respect, several solutions have been proposed, from wastewater treatment to water management. Recently, the use of hydrogels as soil additives, one of the water management approaches in agriculture, has gained increasing interest. In traditional agricultural applications, irrigation water rapidly flows down through the pore structure of the soil without being sufficiently useful to it. To overcome this limitation, several natural-based hydrogels have recently been suggested and tested to determine their efficiency in soil. However, most of these studies have dealt with grafting synthetic acrylate-based monomers onto natural gelling agents, most probably to reinforce the natural gels. These results motivated us to seek natural-based hydrogel formulations that contain no synthetic component and are strengthened with montmorillonite clay instead of graft polymerization with synthetic monomers, to examine their potential in this field, and to characterize them. With this purpose, carrageenan-psyllium/montmorillonite hybrid hydrogels were successfully prepared. Their swelling capacities were determined in both deionized and tap water and were found to depend on the carrageenan, psyllium, and montmorillonite ratios, as well as on the water type. Mechanical tests revealed that the carrageenan and montmorillonite contents in particular have a great effect on gel strength, an essential feature that prevents the gels from cracking, which would otherwise cause all the water in the gel to flow out readily without benefiting the soil; the gel strength was found to reach 0.23 MPa. The experiments carried out with soil indicated that the hydrogels significantly improved the water uptake capacity of the soil from 49 g to 85 g per g of soil and its water retention degree from 32 to 67%, depending on the ingredient ratios. Biodegradation tests also demonstrated that all the hydrogels undergo biodegradation, as expected from their natural origin. The overall results suggest that these hybrid hydrogels have potential for use as soil additives and can be used safely owing to their entirely natural structure.
Keywords: carrageenan, hydrogel, montmorillonite, psyllium
Procedia PDF Downloads 116
1461 Comparison of Deep Brain Stimulation Targets in Parkinson's Disease: A Systematic Review
Authors: Hushyar Azari
Abstract:
Aim and background: Deep brain stimulation (DBS) is regarded as an important therapeutic choice for Parkinson's disease (PD). The two most common targets for DBS are the subthalamic nucleus (STN) and the globus pallidus internus (GPi). This review was conducted to compare the clinical effectiveness of these two targets. Methods: A systematic literature search in the electronic databases Embase, Cochrane Library, and PubMed was restricted to English-language publications from 2010 to 2021. Specified MeSH terms were searched in all databases. Studies that evaluated the Unified Parkinson's Disease Rating Scale (UPDRS) III were selected if they met the following criteria: (1) compared both GPi and STN DBS; (2) had a follow-up period of at least three months; (3) included at least five participants in each group; and (4) were conducted after 2010. Study quality assessment was performed using the Modified Jadad Scale. Results: 3577 potentially relevant articles were identified; of these, 3569 were excluded based on title and abstract, duplication, or unsuitability. Eight articles satisfied the inclusion criteria and were scrutinized (458 PD patients). According to the Modified Jadad Scale, the majority of the included studies had low evidence quality, which is a limitation of this review. Five studies reported no statistically significant between-group difference in UPDRS III score improvements. At the same time, some results in terms of pain, action tremor, rigidity, and urinary symptoms indicated that STN DBS might be the better choice, whereas regarding adverse effects, GPi was superior. Conclusion: Larger randomized clinical trials with longer follow-up periods and control groups are clearly needed to decide which target is more efficient for deep brain stimulation in Parkinson's disease and imposes fewer adverse effects on patients. Meanwhile, STN seems more reasonable according to the results of this systematic review.
Keywords: brain stimulation, globus pallidus, Parkinson's disease, subthalamic nucleus
Procedia PDF Downloads 179
1460 A Neuron Model of Facial Recognition and Detection of an Authorized Entity Using Machine Learning System
Authors: J. K. Adedeji, M. O. Oyekanmi
Abstract:
This paper critically examines the use of machine learning procedures in curbing unauthorized access to valuable areas of an organization. The use of passwords, PIN codes, and user identification has in recent times been only partially successful in curbing identity-related crimes, hence the need for a system that incorporates biometric characteristics such as DNA and pattern recognition of variations in facial expressions. The facial model used is based on the OpenCV library, which relies on certain physiological features. A Raspberry Pi 3 module is used to compile the OpenCV library, which extracts the detected faces through the camera and stores them in the datasets directory. The model is trained with a 50-epoch run on the database, and recognition is performed by the Local Binary Pattern Histogram (LBPH) recognizer contained in OpenCV. The training algorithm used by the neural network is backpropagation, coded in Python, with 200-epoch runs to identify specific resemblance in the exclusive-OR (XOR) output neurons. The research confirmed that physiological parameters are more effective measures for curbing identity-related crimes.
Keywords: biometric characters, facial recognition, neural network, OpenCV
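A minimal sketch of LBPH enrollment and prediction with OpenCV; it requires the opencv-contrib-python package for the cv2.face module, and the face images here are synthetic grayscale arrays standing in for camera captures.

```python
import cv2
import numpy as np

rng = np.random.default_rng(7)
faces = [rng.integers(0, 255, (100, 100), dtype=np.uint8) for _ in range(10)]
labels = np.array([0] * 5 + [1] * 5, dtype=np.int32)   # two enrolled identities

recognizer = cv2.face.LBPHFaceRecognizer_create()
recognizer.train(faces, labels)

probe = faces[0]
label, confidence = recognizer.predict(probe)   # lower confidence = closer match
print("predicted identity:", label, "distance:", round(confidence, 2))
```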
Procedia PDF Downloads 258
1459 Designing a Syllabus for an Academic Writing Course Instruction Based on Students' Needs
Authors: Nuur Insan Tangkelangi
Abstract:
The need for academic writing competence as a primary focus in higher education encourages university institutions around the world to provide academic writing courses to support their students in the tasks pertaining to this competence. However, a pilot study conducted previously in one of the universities in Palopo, a city in South Sulawesi, revealed that even though the institution provides academic writing courses, supported by workshops related to academic writing and supporting facilities on campus, the students still face difficulties in completing their assignments related to academic writing, particularly in writing their theses. The present study investigates the specific needs of students in the same institution in terms of the competences required in academic writing. It also examines whether a syllabus exists and whether it accommodates the students' needs. A questionnaire and interviews were used to collect data from sixty sixth-semester students and two lecturers of the academic writing courses. The results reveal that the students need to learn all aspects of linguistic competence (language features, lexical phrases, academic language and vocabulary, and proper language) and some aspects of discourse competence (how to write introductions, search for appropriate literature, design research methods, write coherent paragraphs, refer to sources, summarize and display data, and link sentences smoothly). Regarding the syllabus, it is found that the academic writing courses provided in the institution where this study takes place do not have a syllabus. This differs from other institutions, which provide syllabi for all courses. However, at the commencement of the course, the students and the lecturers negotiated their learning goals, the topics discussed, the learning activities, and the assessment criteria for the course. Therefore, even though the syllabus does not exist, its elements are there. This negotiation between the students and the lecturers contributes to the students' attitude toward the courses: the students are contented with the course and feel that their needs in academic writing have been accommodated, although they also offer some suggestions for the next academic writing courses. Considering these results, a syllabus is then proposed, which is expected to accommodate the specific needs of students in that institution.
Keywords: students' needs, academic writing, syllabus design for instruction, case study
Procedia PDF Downloads 208
1458 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network
Authors: Abdulaziz Alsadhan, Naveed Khan
Abstract:
In recent years, intrusions on computer networks have become a major security threat, and it is important to impede them. Impeding such intrusions relies entirely on their detection, which is the primary concern of any security tool like an Intrusion Detection System (IDS). It is therefore imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance, which can be improved by increasing the accurate detection rate and reducing false positives. The existing intrusion detection techniques are limited by their use of the raw data set for classification: the classifier may become confused by redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Local Binary Pattern (LBP) can be applied to transform the raw features into a principal feature space and select the features based on their sensitivity, with eigenvalues used to determine the sensitivity. To classify further, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can be used on the selected features to obtain a subset of features with optimal sensitivity and the highest discriminatory power. This optimal feature subset is then used to perform the classification. For classification purposes, the Support Vector Machine (SVM) and Multilayer Perceptron (MLP) are used due to their proven ability in classification. The Knowledge Discovery and Data Mining (KDD'99) cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach provides an optimal intrusion detection mechanism that outperforms the existing approaches and has the capability to minimize the number of features and maximize the detection rates.
Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP)
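A minimal sketch of the reduce-then-classify idea: PCA projects raw features into principal-component space, and an SVM classifies the result. Synthetic data stands in for KDD'99 records, and the greedy/PSO feature-subset search described above is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
X = rng.normal(size=(1000, 41))               # 41 features, as in KDD'99 records
y = (X[:, :5].sum(axis=1) > 0).astype(int)    # synthetic attack/normal label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
ids = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
ids.fit(X_tr, y_tr)
print("detection accuracy:", round(ids.score(X_te, y_te), 3))
```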
Procedia PDF Downloads 367
1457 Fuzzy-Machine Learning Models for the Prediction of Fire Outbreak: A Comparative Analysis
Authors: Uduak Umoh, Imo Eyoh, Emmauel Nyoho
Abstract:
This paper compares fuzzy-machine learning algorithms, namely the Support Vector Machine (SVM) and K-Nearest Neighbor (KNN), for predicting cases of fire outbreak. The paper uses a fire outbreak dataset with three features (temperature, smoke, and flame). The data are pre-processed using an Interval Type-2 Fuzzy Logic (IT2FL) algorithm, Min-Max normalization, and Principal Component Analysis (PCA), which are used to predict feature labels in the dataset, normalize the dataset, and select relevant features, respectively. The output of the pre-processing is a dataset with two principal components (PC1 and PC2). The pre-processed dataset is then used to train the aforementioned machine learning models. K-fold cross-validation (with K=10) is used to evaluate the performance of the models with the metrics ROC (Receiver Operating Characteristic) curve, specificity, and sensitivity; the models are also tested with 20% of the dataset. The validation results show that KNN is the better model for fire outbreak detection, with an ROC value of 0.99878, followed by SVM with an ROC value of 0.99753.
Keywords: machine learning algorithms, interval type-2 fuzzy logic, fire outbreak, support vector machine, k-nearest neighbour, principal component analysis
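A minimal sketch of the evaluation protocol: 10-fold cross-validated ROC-AUC for KNN and SVM on a synthetic three-feature (temperature, smoke, flame) dataset; the IT2FL pre-processing step is not reproduced.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(9)
X = rng.uniform(0, 1, size=(500, 3))                   # temperature, smoke, flame
y = (X.sum(axis=1) > 1.8).astype(int)                  # synthetic fire/no-fire label

for name, model in [("KNN", KNeighborsClassifier(5)),
                    ("SVM", SVC(probability=True, random_state=0))]:
    auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
    print(f"{name}: mean ROC-AUC = {auc.mean():.4f}")
```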
Procedia PDF Downloads 185
1456 Predictive Modelling Approach to Identify Spare Parts Inventory Obsolescence
Authors: Madhu Babu Cherukuri, Tamoghna Ghosh
Abstract:
Factory supply chain management spends billions of dollars every year to procure and manage equipment spare parts. Due to technology and process changes, some of these spares become obsolete or dead inventory. Factories accumulate huge dead inventories worth millions of dollars over time, due to the lack of a scientific methodology to identify them and send the inventory back to the suppliers on a timely basis. The standard approach followed across industries is: if a part is not used for a set pre-defined period of time, it is declared dead. This leads to an accumulation of dead parts over time, and these parts cannot then be sold back to the suppliers, as it is too late under the contract agreement. Our main idea is that the time period for identifying a part as dead should not be a fixed pre-defined duration across all parts; rather, it should depend on various properties of the part, such as its historical consumption pattern, the type of part, how many machines it is used in, and whether it is a preventive maintenance part. We have designed a predictive algorithm that predicts part obsolescence well in advance with reasonable accuracy and can help save millions.
Keywords: obsolete inventory, machine learning, big data, supply chain analytics, dead inventory
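A minimal sketch of the part-level idea: instead of one fixed dead-stock cutoff, a classifier learns obsolescence risk from the per-part properties listed above. The features, labels, and model choice are synthetic illustrations, not the authors' algorithm.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(10)
n = 2000
X = np.column_stack([
    rng.poisson(3, n),            # historical yearly consumption
    rng.integers(1, 20, n),       # number of machines using the part
    rng.integers(0, 2, n),        # is it a preventive-maintenance part?
    rng.integers(1, 120, n),      # months since last use
])
y = ((X[:, 0] < 2) & (X[:, 3] > 36)).astype(int)   # synthetic "went obsolete" label

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
risk = clf.predict_proba([[1, 2, 0, 48]])[0, 1]    # a slow-moving, long-idle part
print("predicted obsolescence risk:", round(risk, 3))
```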
Procedia PDF Downloads 319
1455 Yields and Composition of the Gas, Liquid and Solid Fractions Obtained by Conventional Pyrolysis of Different Lignocellulosic Biomass Residues
Authors: María del Carmen Recio-Ruiz, Ramiro Ruiz-Rosas, Juana María Rosas, José Rodríguez-Mirasol, Tomás Cordero
Abstract:
Nowadays, fossil resources are the main precursors for fuel production. Because of their contribution to the greenhouse effect and their future depletion, there is a constant search for environmentally friendly feedstock alternatives. Biomass residues constitute an interesting replacement for fossil resources because of their zero net CO₂ emissions. One of the main routes to convert biomass into energy and chemicals is pyrolysis. In this work, the conventional pyrolysis of different highly available biomass residues, such as almond shells, hemp hurds, olive stones, and Kraft lignin, was studied. In a typical experiment, the biomass was crushed and loaded into a fixed bed reactor under a continuous nitrogen flow. The influence of temperature (400-800 ºC) and heating rate (10 and 20 ºC/min) on the pyrolysis yield and the composition of the different fractions was studied. In every case, the mass yields revealed that the solid fraction decreased with temperature, while the liquid and gas fractions increased due to depolymerization and cracking reactions at high temperatures. The composition of every pyrolysis fraction was studied in detail. The gas fraction consisted mainly of CO and CO₂ at low temperatures and mostly of CH₄ and H₂ at high temperatures. The solid fraction developed an incipient microporosity, with a narrow micropore volume of 0.21 cm³/g. Regarding the liquid fraction, the pyrolysis of almond shells, hemp hurds, and olive stones led mainly to a high content of aliphatic acids and furans, due to the high volatile matter content of these biomasses (>74 wt%), and to phenols to a lesser degree, formed by the degradation of lignin at higher temperatures. However, when Kraft lignin was used as the bio-oil precursor, the presence of phenols was very prominent, and aliphatic compounds were also detected to a lesser extent.
Keywords: bio-oil, biomass, conventional pyrolysis, lignocellulosic
Procedia PDF Downloads 134
1454 Open Source, Open Hardware Ground Truth for Visual Odometry and Simultaneous Localization and Mapping Applications
Authors: Janusz Bedkowski, Grzegorz Kisala, Michal Wlasiuk, Piotr Pokorski
Abstract:
Ground-truth data is essential for the quantitative evaluation of VO (Visual Odometry) and SLAM (Simultaneous Localization and Mapping) using, e.g., the ATE (Absolute Trajectory Error) and RPE (Relative Pose Error). Many open-access data sets provide raw and ground-truth data for benchmark purposes. The issue appears when one would like to validate Visual Odometry and/or SLAM approaches on data captured with the device the algorithm is targeted for, such as a mobile phone, and to disseminate the data to other researchers. For this reason, we propose an open source, open hardware ground-truth system that provides an accurate and precise trajectory together with a 3D point cloud. It is based on the Livox Mid-360 LiDAR with a non-repetitive scanning pattern, an on-board Raspberry Pi 4B computer, a battery, and software for off-line calculations (camera-to-LiDAR calibration, LiDAR odometry, SLAM, and georeferencing). We show how this system can be used for the evaluation of various state-of-the-art algorithms (Stella SLAM, ORB SLAM3, DSO) in typical indoor monocular VO/SLAM settings.
Keywords: SLAM, ground truth, navigation, LiDAR, visual odometry, mapping
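A minimal sketch of the ATE metric used in such evaluations: the RMSE of translational differences between time-aligned estimated and ground-truth poses. Time association and SE(3) registration are assumed to have been done beforehand, as evaluation tools normally do first; the trajectories are synthetic.

```python
import numpy as np

def ate_rmse(traj_est, traj_gt):
    """Both arguments: (N, 3) arrays of aligned positions."""
    err = np.linalg.norm(traj_est - traj_gt, axis=1)
    return np.sqrt(np.mean(err ** 2))

gt = np.cumsum(np.random.default_rng(11).normal(0, 0.1, (100, 3)), axis=0)
est = gt + np.random.default_rng(12).normal(0, 0.02, gt.shape)   # noisy estimate
print("ATE RMSE [m]:", round(ate_rmse(est, gt), 4))
```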
Procedia PDF Downloads 76
1453 Recovery of Chromium(III) from Tannery Wastewater by Nanoparticles and Whiskers of Chitosan
Authors: El Montassir Dahmane, Nadia Eladlani, Aziz Ouahrouch, Mohammed Rhazi, Moha Taourirte
Abstract:
The present study aimed to approximate the optimal conditions for chromium recovery from wastewater by nanoparticles and whiskers of chitosan. Chitosan with an average molecular weight of 63 kDa and a 96% deacetylation degree was prepared according to our previous study. Chromium recovery is influenced by different parameters. In our work, we determined the appropriate pH range for forming chitosan-Cr(III), nanoparticle-Cr(III), and whisker-Cr(III) complexes, and we also studied the influence of the chromium concentration and of the nature of the chitosan-based material on the complexation process. Our main aim is to approximate the optimal conditions for removing chromium(III) from the tanning bath recovered from tannery wastewater of Marrakech in Morocco. A Perkin Elmer Optima 2000 Inductively Coupled Plasma-Optical Emission Spectrometer (ICP-OES) was used to determine the quantity of chromium remaining in the tannery wastewater after complexation. To the best of our knowledge, this is the first report on the optimal conditions for chromium recovery from wastewater by nanoparticles and whiskers of chitosan. We found that in chromium solution the appropriate pH range for complex formation is between 5.6 and 6.7, and that the complexation of Cr(III) depends on the nature of the complexing ligand and the chromium concentration. The results reveal that the nanoparticles present an excellent adsorption capacity regardless of the chromium concentration. Beyond a critical chromium concentration (250 mg/l), however, the ligand becomes saturated, which requires an increase in ligand mass for increasing chromium concentrations in order to maintain a good adsorption capacity. Hence, under the same conditions, we used chitosan, its nanoparticles, whiskers, and chitosan-based films to remove Cr(III) from tannery wastewater. The pH of this effluent was around 6, and its chromium concentration was 300 mg/l. The results show that the ranking of the complexing ligands in the effluent is the same as in chromium solution, as determined in our previous study; however, the adsorbed quantity is lower due to the presence of other metallic ions in the tannery wastewater. We conclude that the best chitosan-based complexing ligand is the chitosan nanoparticles, whether in chromium solution or in tannery wastewater: after 24 h of contact, the nanoparticles can remove 70% of the chromium from this tannery wastewater.
Keywords: nanoparticles, whiskers, chitosan, chromium
Procedia PDF Downloads 137
1452 Application of Modified Vermiculite for Cationic Textile Dyestuffs Removal: Sorption and Regeneration Studies
Authors: W. Stawiński, A. Wegrzyn, O. M. Freitas, S. A. Figueiredo
Abstract:
Water is a life-supporting resource, crucial for humanity and essential for natural ecosystems, which have been endangered by developing industry and an increasing human population. Dyes are common in the effluents discharged by various industries such as paper, plastics, food, cosmetics, and textiles. They produce toxic effects on animals and disturb natural biological processes in receiving waters. Having complex molecular structures and resistance to biological decomposition, they are problematic and difficult to treat by conventional methods. In the search for an efficient and sustainable method, sorption has been attracting increasing interest for application in wastewater treatment. Clays are minerals with a layer structure based on phyllosilicate sheets that may carry a charge, which is balanced by ions located between the sheets. These charge-balancing ions can be exchanged, resulting in very good ion-exchange properties of the material. Modifications of clays enhance their properties, producing a good and inexpensive sorbent for the removal of pollutants from wastewaters. The presented work proves that treating a clay, vermiculite, with nitric acid followed by washing in citric acid strongly increases the sorption of two cationic dyes, methylene blue (C.I. 52015) and astrazon red (C.I. 110825). Desorption studies showed that the best eluent for regeneration is a solution of NaCl in ethanol. Cycles of sorption and desorption in a column system showed no significant deterioration of the sorption capacity, proving that the material performs very well as a sorbent and can be recycled and reused. These results open new possibilities for further modifications of vermiculite and of other materials in order to obtain very efficient sorbents for wastewater treatment.
Keywords: cationic dyestuffs, sorption and regeneration, vermiculite, wastewater treatment
Procedia PDF Downloads 265
1451 As a Little-Known Side a Passionate Statistician: Florence Nightingale
Authors: Gülcan Taşkıran, Ayla Bayık Temel
Abstract:
Background: Florence Nightingale, the founder of modern nursing, is most famous for her role as a nurse, but much less is known about her contributions as a mathematician and statistician. Aim: This conceptual article aims to examine Florence Nightingale's statistics education, how she used her passion for statistics and applied statistical data in nursing care, and her scientific contributions to statistical science. Design: The literature review method was used in the study. The databases of the Istanbul University Library search engine, the Turkish Medical Directory, the Thesis Scanning Center of the Higher Education Council, PubMed, Google Scholar, EBSCO Host, and Web of Science were scanned to reach the studies. The keywords 'statistics' and 'Florence Nightingale' were used in Turkish and English during the screening. As a result of the screening, a total of 41 studies from the national and international literature were examined. Results: Florence Nightingale was interested in mathematics and statistics from an early age and received various training in these subjects. The lessons learned by Nightingale in a cultured family environment, her talent in mathematics and numbers, and her religious beliefs played a crucial role in directing her toward statistics. She was influenced by Quetelet's ideas in forming her statistical philosophy and received support from William Farr in her statistical studies. During the Crimean War, she applied statistical knowledge to nursing care and developed many statistical methods and graphics, through which she made revolutionary reforms in the health field. Conclusions: Nightingale's interest in statistics, her broad vision, her statistical ideas fused with religious beliefs, the innovative graphics she developed, and the extraordinary statistical projects she carried out underpinned her professional achievements. Florence Nightingale has also become a model for women in statistics. Today, the use and teaching of statistics and research in nursing care practices and education programs continue in the light she provided.
Keywords: Crimean war, Florence Nightingale, nursing, statistics
Procedia PDF Downloads 293
1450 Implementation of Edge Detection Based on Autofluorescence Endoscopic Image of Field Programmable Gate Array
Authors: Hao Cheng, Zhiwu Wang, Guozheng Yan, Pingping Jiang, Shijia Qin, Shuai Kuang
Abstract:
Autofluorescence imaging (AFI) is a technology developed in recent years for detecting early carcinogenesis of the gastrointestinal tract. Compared with traditional white light endoscopy (WLE), this technology greatly improves the detection accuracy of early carcinogenesis, because the colors of normal tissues differ from those of cancerous tissues, so edge detection can distinguish them in grayscale images. In this paper, the traditional Sobel edge detection method is optimized with the gastrointestinal environment in mind, using adaptive thresholding and morphological processing. All of the processing is implemented on our self-designed system based on the OV6930 image sensor and a Field Programmable Gate Array (FPGA). The system can capture the gastrointestinal image taken by the lens in real time and detect edges. Final experiments verified the feasibility of our system and the effectiveness and accuracy of the edge detection algorithm.
Keywords: AFI, edge detection, adaptive threshold, morphological processing, OV6930, FPGA
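A minimal software sketch of the optimized pipeline: Sobel gradients, an adaptive (Otsu) threshold instead of a fixed one, and morphological closing to clean the edge map. On the target device this would run in FPGA logic; the random frame and parameters here are illustrative.

```python
import cv2
import numpy as np

img = np.random.default_rng(13).integers(0, 255, (240, 320), dtype=np.uint8)

gx = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=3)
gy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=3)
mag = cv2.convertScaleAbs(cv2.magnitude(gx, gy))

# Otsu picks the threshold from the gradient histogram (adaptive per frame).
_, edges = cv2.threshold(mag, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
edges = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)
print("edge pixels:", int(np.count_nonzero(edges)))
```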
Procedia PDF Downloads 230
1449 Output-Feedback Control Design for a General Class of Systems Subject to Sampling and Uncertainties
Authors: Tomas Menard
Abstract:
The synthesis of output-feedback control laws has been investigated by many researchers since the last century. While many results exist for Linear Time Invariant systems whose measurements are continuously available, control laws are nowadays usually implemented on micro-controllers, so the measurements are discrete-time by nature. This fact has to be taken into account explicitly in order to obtain satisfactory behavior of the closed-loop system. We consider here a general class of systems corresponding to an observability normal form that is subject to uncertainties in the dynamics and to sampling of the output. Indeed, in practice, the modeling of the system is never perfect, which results in unknown uncertainties in the dynamics of the model. We propose an output-feedback algorithm based on a linear state feedback and a continuous-discrete time observer. The main feature of the proposed control law is that only discrete-time measurements of the output are needed. Furthermore, it is formally proven that the state of the closed-loop system converges exponentially toward the origin despite the unknown uncertainties. Finally, the performance of this control scheme is illustrated with simulations.
Keywords: dynamical systems, output feedback control law, sampling, uncertain systems
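A minimal simulation sketch of output feedback with a continuous-discrete observer on a double integrator: between samples the observer integrates the model open-loop, and it is corrected only when a sampled measurement arrives. The gains and sampling period are illustrative choices, not the paper's design.

```python
import numpy as np

A = np.array([[0.0, 1.0], [0.0, 0.0]])     # double integrator in observability form
B = np.array([0.0, 1.0])
C = np.array([1.0, 0.0])
K = np.array([2.0, 3.0])                   # state-feedback gain
L = np.array([4.0, 4.0])                   # observer output-injection gain

dt, Ts = 1e-3, 0.05                        # integration step and sampling period
x, xh = np.array([1.0, 0.0]), np.zeros(2)
for k in range(int(5.0 / dt)):
    u = -K @ xh                            # control uses the estimate only
    x = x + dt * (A @ x + B * u)           # true plant (Euler integration)
    xh = xh + dt * (A @ xh + B * u)        # open-loop prediction between samples
    if k % int(Ts / dt) == 0:              # a new output sample arrives
        xh = xh + L * (C @ x - C @ xh) * Ts
print("final estimation error:", np.linalg.norm(x - xh))
```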
Procedia PDF Downloads 286
1448 A Comparison of Methods for Neural Network Aggregation
Authors: John Pomerat, Aviv Segev
Abstract:
Recently, deep learning has had many theoretical breakthroughs. For deep learning to be successful in industry, however, there need to be practical algorithms capable of handling the many real-world hiccups that prevent the immediate application of a learning algorithm. Although AI promises to revolutionize the healthcare industry, getting access to patient data in order to train learning algorithms has not been easy. One proposed solution to this is data-sharing. In this paper, we propose an alternative protocol, based on multi-party computation, to train deep learning models while maintaining both the privacy and the security of training data. We examine three methods of training neural networks in this way: transfer learning, average ensemble learning, and series network learning. We compare these methods to the equivalent model obtained through data-sharing across two different experiments. Additionally, we address the security concerns of this protocol. While the motivating example is healthcare, our findings regarding multi-party computation of neural network training are purely theoretical and have use-cases outside the domain of healthcare.
Keywords: neural network aggregation, multi-party computation, transfer learning, average ensemble learning
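A minimal sketch of average ensemble learning, one of the three aggregation methods compared: each party trains its own model locally, and predictions are averaged at inference time, so raw training data never leaves a site. The models and data are synthetic stand-ins, not the paper's experimental setup.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(14)
sites = []
for seed in range(3):                         # three hospitals, three private datasets
    X = rng.normal(size=(200, 10))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    sites.append(MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                               random_state=seed).fit(X, y))

X_test = rng.normal(size=(50, 10))
avg_proba = np.mean([m.predict_proba(X_test) for m in sites], axis=0)
ensemble_pred = avg_proba.argmax(axis=1)      # aggregated decision
print("ensemble positives:", int(ensemble_pred.sum()))
```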
Procedia PDF Downloads 163
1447 An Accurate Computation of 2D Zernike Moments via Fast Fourier Transform
Authors: Mohammed S. Al-Rawi, J. Bastos, J. Rodriguez
Abstract:
Object detection and object recognition are essential components of every computer vision system. Despite their high computational complexity and other problems related to numerical stability and accuracy, Zernike moments of 2D images (ZMs) have shown resilience when used in object recognition and have been used in various image analysis applications. In this work, we propose a novel method for computing ZMs via the Fast Fourier Transform (FFT). Notably, this is the first algorithm that can generate ZMs up to extremely high orders accurately; e.g., it can be used to generate ZMs for orders up to 1000 or even higher. Furthermore, the proposed method is also simpler and faster than the other methods due to the availability of FFT software and/or hardware. The accuracy and numerical stability of ZMs computed via FFT have been confirmed using the orthogonality property. We also introduce normalization of ZMs with the Neumann factor when the image is embedded in a larger grid, and color image reconstruction based on RGB normalization of the reconstructed images. Astonishingly, higher-order image reconstruction experiments show that the proposed methods are superior, both quantitatively and subjectively, to the q-recursive method.
Keywords: Chebyshev polynomial, Fourier transform, fast algorithms, image recognition, pseudo Zernike moments, Zernike moments
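A minimal sketch of the Zernike radial polynomial and the orthogonality check mentioned above; this is the standard closed-form definition, not the paper's FFT-based computation.

```python
import numpy as np
from math import factorial

def radial(n, m, rho):
    """Zernike radial polynomial R_n^|m|(rho)."""
    m = abs(m)
    out = np.zeros_like(rho)
    for k in range((n - m) // 2 + 1):
        coeff = ((-1) ** k * factorial(n - k)
                 / (factorial(k) * factorial((n + m) // 2 - k)
                    * factorial((n - m) // 2 - k)))
        out += coeff * rho ** (n - 2 * k)
    return out

# Radial orthogonality on the unit disk: the integral of
# R_n^m(rho) * R_n'^m(rho) * rho over [0, 1] vanishes for n != n'.
rho = np.linspace(0, 1, 20001)
dr = rho[1] - rho[0]
inner = np.sum(radial(4, 0, rho) * radial(6, 0, rho) * rho) * dr
print("inner product (should be ~0):", inner)
```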
Procedia PDF Downloads 265
1446 An Approach for Coagulant Dosage Optimization Using Soft Jar Test: A Case Study of Bangkhen Water Treatment Plant
Authors: Ninlawat Phuangchoke, Waraporn Viyanon, Setta Sasananan
Abstract:
The most important process in a water treatment plant is coagulation using alum and poly-aluminum chloride (PACL), whose usage is worth a hundred thousand baht per day. Determining the dosages of alum and PACL is therefore the most important factor to prescribe so that water production is economical and valuable. This research applies an artificial neural network (ANN) trained with the Levenberg-Marquardt algorithm to create a mathematical model (soft jar test) for predicting the doses of the coagulation chemicals alum and PACL. The input data consist of the turbidity, pH, alkalinity, conductivity, and oxygen consumption (OC) of the Bangkhen water treatment plant (BKWTP) of the Metropolitan Waterworks Authority. The data were collected from 1 January 2019 to 31 December 2019, covering the changing seasons of Thailand. The ANN input data are divided into three groups: a training set, a test set, and a validation set. The best model performance, in terms of the coefficient of determination and the mean absolute error, is 0.73 and 3.18 for alum, and 0.59 and 3.21 for PACL, respectively.
Keywords: soft jar test, jar test, water treatment plant process, artificial neural network
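A minimal sketch of fitting a tiny one-hidden-layer network with a Levenberg-Marquardt least-squares solver, mirroring the training algorithm named above; the water-quality inputs and dose targets are synthetic placeholders, and the network size is illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(15)
X = rng.uniform(0, 1, size=(200, 5))          # turbidity, pH, alkalinity, conductivity, OC
y = 20 * X[:, 0] + 5 * X[:, 1] + rng.normal(0, 0.5, 200)   # synthetic alum dose

H = 4                                          # hidden units
def unpack(p):
    W1 = p[:5 * H].reshape(H, 5); b1 = p[5 * H:6 * H]
    W2 = p[6 * H:7 * H];          b2 = p[7 * H]
    return W1, b1, W2, b2

def residuals(p):
    W1, b1, W2, b2 = unpack(p)
    hidden = np.tanh(X @ W1.T + b1)
    return hidden @ W2 + b2 - y

p0 = 0.1 * rng.standard_normal(7 * H + 1)
fit = least_squares(residuals, p0, method="lm")        # Levenberg-Marquardt
print("RMSE:", np.sqrt(np.mean(fit.fun ** 2)))
```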
Procedia PDF Downloads 168
1445 Forecasting the Fluctuation of Currency Exchange Rate Using Random Forest
Authors: Lule Basha, Eralda Gjika
Abstract:
The exchange rate is one of the most important economic variables, especially for a small, open economy such as Albania's. Its effect is noticeable in a country's competitiveness, trade and current account, inflation, wages, domestic economic activity, and bank stability. This study investigates the fluctuation of Albania's exchange rate using the monthly average Euro (EUR) to Albanian Lek (ALL) exchange rate over the time span January 2008 to June 2021, together with the macroeconomic factors that have a significant effect on the exchange rate. Initially, a Random Forest regression model is constructed to understand the impact of the economic variables on the behavior of the monthly average exchange rate. Then, 12-month forecasts of the macroeconomic indicators are produced using time series models. The predicted values are fed into the random forest model in order to obtain the monthly average forecast of the EUR to ALL exchange rate for the period July 2021 to June 2022.
Keywords: exchange rate, random forest, time series, machine learning, prediction
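A minimal sketch of the two-stage scheme: macro indicators are forecast forward (a naive persistence forecast stands in for the time-series models) and then fed to a random forest trained on historical indicator/exchange-rate pairs. All series are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(16)
n = 162                                            # monthly observations, 2008-2021
macro = rng.normal(size=(n, 4))                    # e.g., inflation, rates, M2, trade
eur_all = 123 + macro @ np.array([2.0, -1.0, 0.5, 0.3]) + rng.normal(0, 0.5, n)

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(macro, eur_all)

# Stage 2: plug 12 months of forecast indicators into the fitted forest.
macro_forecast = macro[-1] + rng.normal(0, 0.1, size=(12, 4))   # naive forecast
print("EUR/ALL forecast, next 12 months:", rf.predict(macro_forecast).round(2))
```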
Procedia PDF Downloads 104