Search results for: Discrete Cuckoo Optimization Algorithm (DCOA)

1808 Experimental and Numerical Investigations of Impact Response on High-Speed Train Windshield

Authors: Wen Ma, Yong Peng, Zhixiang Li

Abstract:

Travel safety is a vital focus in the field of rail transportation. Accidents caused by damage to high-speed train windshields have occurred many times and have given rise to terrible consequences. The train windshield consists of tempered glass and a polyvinyl butyral (PVB) film. In this work, quasi-static tests and split Hopkinson pressure bar (SHPB) tests were first carried out to obtain the mechanical properties and constitutive models of the tempered glass and the PVB film. The test results revealed that the stress and Young’s modulus of the tempered glass were weakly sensitive to strain rate, whereas the stress and Young’s modulus of the PVB film were strongly sensitive to strain rate. An impact experiment on the windshield was then carried out to investigate the dynamic response and failure characteristics of the train windshield. In addition, a finite element model based on the combined finite element method was proposed to investigate the fracture and fragmentation responses of the train windshield under impacts at different velocities. The results can be used for the further design and optimization of windshields for high-speed train applications.

Keywords: constitutive model, impact response, mechanical properties, PVB film, tempered glass

Procedia PDF Downloads 133
1807 Hierarchical Tree Long Short-Term Memory for Sentence Representations

Authors: Xiuying Wang, Changliang Li, Bo Xu

Abstract:

A fixed-length feature vector is required by many machine learning algorithms in the NLP field. Word embeddings have been very successful at learning lexical information. However, they cannot capture the compositional meaning of sentences, which prevents a deeper understanding of language. In this paper, we introduce a novel hierarchical tree long short-term memory (HTLSTM) model that learns vector representations for sentences of arbitrary syntactic type and length. We propose to split each sentence into three hierarchies: short phrase, long phrase and full sentence level. The HTLSTM model gives our algorithm the potential to fully exploit the hierarchical information and long-term dependencies of language. We design experiments on both English and Chinese corpora to evaluate our model on the sentiment analysis task, and the results show that our model significantly outperforms several existing state-of-the-art approaches.
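
The hierarchical composition described above can be illustrated with a minimal PyTorch sketch (not the authors' implementation): one LSTM composes word embeddings into short-phrase vectors, a second composes those into long-phrase vectors, and a third produces the sentence representation. The layer sizes, vocabulary size and the example phrase segmentation are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class HTLSTMSketch(nn.Module):
    """Minimal sketch of a three-level hierarchical LSTM sentence encoder."""
    def __init__(self, vocab_size, emb_dim=100, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.short_phrase_lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.long_phrase_lstm = nn.LSTM(hid_dim, hid_dim, batch_first=True)
        self.sentence_lstm = nn.LSTM(hid_dim, hid_dim, batch_first=True)

    def encode(self, lstm, seq):
        # seq: (1, steps, features) -> last hidden state as the segment vector
        _, (h, _) = lstm(seq)
        return h[-1]                                      # (1, hid_dim)

    def forward(self, sentence):
        # sentence: list of long phrases; each long phrase is a list of
        # short phrases; each short phrase is a list of word ids
        long_vecs = []
        for long_phrase in sentence:
            short_vecs = []
            for phrase in long_phrase:
                ids = torch.tensor(phrase).unsqueeze(0)   # (1, words)
                short_vecs.append(self.encode(self.short_phrase_lstm,
                                              self.embed(ids)))
            long_seq = torch.stack(short_vecs, dim=1)     # (1, n_short, hid)
            long_vecs.append(self.encode(self.long_phrase_lstm, long_seq))
        sent_seq = torch.stack(long_vecs, dim=1)          # (1, n_long, hid)
        return self.encode(self.sentence_lstm, sent_seq)  # sentence vector

# Example: a sentence split into two long phrases made of short phrases (word ids)
model = HTLSTMSketch(vocab_size=1000)
sentence = [[[1, 2, 3], [4, 5]], [[6, 7, 8, 9]]]
print(model(sentence).shape)                              # torch.Size([1, 128])
```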

Keywords: deep learning, hierarchical tree long short-term memory, sentence representation, sentiment analysis

Procedia PDF Downloads 342
1806 The Customization of 3D Last Form Design Based on Weighted Blending

Authors: Shih-Wen Hsiao, Chu-Hsuan Lee, Rong-Qi Chen

Abstract:

The last is regarded as the critical foundation of shoe design and development. Not only does the last relate to wearing comfort, but it also aids shoe styling and manufacturing. In order to enhance the efficiency and application of last development, a computer-aided methodology for customized last form design is proposed in this study. Reverse engineering is applied to scan the last form. A minimum-energy criterion is then used to revise surface continuity, and the last surface is reconstructed from the feature curves of the scanned last. Once the surface is reconstructed, and building on the proposed last form reconstruction module, the weighted arithmetic mean method is applied to compute the shape morphing, which differs from grading of the last's control mesh, and a subdivision algorithm is used to create the last mesh surface; thus, foot-fitting 3D last forms of different sizes are generated from the original form while its features and functions are retained. Finally, the practicability of the proposed methodology is verified through case studies.
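
As a rough illustration of the weighted arithmetic mean step, blending the corresponding control-mesh vertices of several reference lasts yields a customized form; this is a sketch only, and the vertex arrays, weights and mesh correspondence are assumed rather than taken from the study's module.

```python
import numpy as np

def blend_last_forms(control_meshes, weights):
    """Weighted arithmetic mean of corresponding control-mesh vertices.

    control_meshes: array-like of shape (n_lasts, n_vertices, 3)
    weights:        array-like of shape (n_lasts,)
    Returns the blended control mesh of shape (n_vertices, 3).
    """
    meshes = np.asarray(control_meshes, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalize so the weights sum to 1
    return np.tensordot(w, meshes, axes=(0, 0))

# Example: blend two reference lasts, 70% of the first and 30% of the second
last_a = np.random.rand(500, 3)          # placeholder scanned control meshes
last_b = np.random.rand(500, 3)
custom = blend_last_forms([last_a, last_b], [0.7, 0.3])
print(custom.shape)                      # (500, 3)
```

The blended control mesh would then feed the subdivision step described above to obtain the smooth last surface.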

Keywords: 3D last design, customization, reverse engineering, weighted morphing, shape blending

Procedia PDF Downloads 329
1805 Electrochemical Detection of Polycyclic Aromatic Hydrocarbons in Urban Air by Exfoliated Graphite Based Electrode

Authors: A. Sacko, H. Nyoni, T. A. M. Msagati, B. Ntsendwana

Abstract:

Carbon-based materials for targeting environmental pollutants have become increasingly recognized in science. Electrochemical methods using carbon-based materials are notable for the highly sensitive detection of organic pollutants in air. It is in this light that an exfoliated graphite electrode was fabricated for the electrochemical analysis of PAHs in urban atmospheric air. The electrochemical properties of the graphite electrode were studied using cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS) in an acetate buffer supporting electrolyte with 2 mM ferricyanide as a redox probe. The graphite electrode showed an enhanced current response, which confirms facile kinetics and enhanced sensitivity. However, the peak-to-peak separation (ΔE) increased as a function of scan rate. The EIS showed a high charge-transfer resistance. The detection of phenanthrene on the exfoliated graphite electrode was studied in acetate buffer solution at pH 3.5 using differential pulse voltammetry (DPV). The oxidation peak of phenanthrene was observed at 0.4 V. Under optimized conditions (supporting electrolyte, pH, deposition time, etc.), the observed detection limit was 5 × 10⁻⁸ M. Thus, the results demonstrate that, with further optimization and modification, detection at lower concentrations can be achieved.

Keywords: electrochemical detection, exfoliated graphite, PAHs (polycyclic aromatic hydrocarbons), urban air

Procedia PDF Downloads 191
1804 Amharic Text News Classification Using Supervised Learning

Authors: Misrak Assefa

Abstract:

The Amharic language is the second most widely spoken Semitic language in the world. There is an overload of news on the web, and searching for useful documents on a specific topic written in the Amharic language is a challenging task. Hence, document categorization is required for managing and filtering important information. In the classification of Amharic text news, there is still a gap in this domain that needs to be addressed. This study attempts to design an automatic Amharic news classifier using a supervised learning mechanism on four previously untouched classes. To achieve this, 4,182 news articles were used. Naive Bayes (NB) and decision tree (J48) algorithms were used to classify the given Amharic dataset. In this paper, k-fold cross-validation is used to estimate the accuracy of the classifiers. The results show that these algorithms are applicable to Amharic news categorization. The best average accuracies achieved by the J48 decision tree and naive Bayes are 95.2345% and 94.6245%, respectively, using three categories. This research indicates that a typical decision tree algorithm is more applicable to Amharic news categorization.
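
A minimal scikit-learn sketch of the workflow described above, using multinomial naive Bayes and a CART decision tree (as a stand-in for J48) with k-fold cross-validation; the file name, column names and number of folds are assumptions.

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Hypothetical dataset: one Amharic news article per row with a category label
data = pd.read_csv("amharic_news.csv")        # assumed columns: "text", "category"
X, y = data["text"], data["category"]

models = {
    "naive_bayes": make_pipeline(TfidfVectorizer(), MultinomialNB()),
    "decision_tree": make_pipeline(TfidfVectorizer(),
                                   DecisionTreeClassifier(random_state=0)),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10)   # 10-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.4f}")
```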

Keywords: text categorization, supervised machine learning, naive Bayes, decision tree

Procedia PDF Downloads 182
1803 Dynamic Mode Decomposition and Wake Flow Modelling of a Wind Turbine

Authors: Nor Mazlin Zahari, Lian Gan, Xuerui Mao

Abstract:

The power production in wind farms and the mechanical loads on the turbines are strongly impacted by the wake of the wind turbine. Thus, there is a need to understand and model turbine wake dynamics for wind farm operation and layout optimization. Having a good wake model is important for predicting plant performance and understanding fatigue loads. In this paper, Dynamic Mode Decomposition (DMD) was applied to simulation data generated by a Direct Numerical Simulation (DNS) of the flow around a turbine, perturbed by upstream inflow noise. This technique is useful for analyzing the wake flow, predicting its future states and capturing the flow dynamics associated with the coherent structures in the wind turbine wake. DMD was employed to describe the dynamics of the flow around the turbine from the DNS data. Since the DNS data come on unstructured meshes with a non-uniform grid, the data within each element were interpolated onto an evenly spaced mesh before DMD was applied. The DMD analyses reveal characteristics of the travelling waves behind the turbine, e.g., the dominant helical flow structures and the corresponding frequencies. As a result, the dominant frequency is detected, the associated spatial structure is identified, and the dynamic mode representing the coherent structure is presented.
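
The core DMD computation on the interpolated snapshot matrix can be sketched in a few lines of NumPy (standard exact DMD, not the study's code); the snapshot matrix, time step and truncation rank below are placeholders.

```python
import numpy as np

def dmd(snapshots, dt, rank=10):
    """Exact DMD: snapshots is (n_points, n_times), uniformly sampled every dt."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, V = U[:, :rank], s[:rank], Vh.conj().T[:, :rank]
    # Low-rank approximation of the linear operator mapping X to Y
    A_tilde = U.conj().T @ Y @ V @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)
    eigvals = eigvals.astype(complex)
    modes = Y @ V @ np.diag(1.0 / s) @ W                 # spatial DMD modes
    freqs = np.log(eigvals).imag / (2 * np.pi * dt)      # oscillation frequencies [Hz]
    growth = np.log(eigvals).real / dt                   # growth/decay rates
    return modes, freqs, growth

# Example with a synthetic snapshot matrix (rows: grid points, cols: time steps)
snaps = np.random.rand(2000, 200)
modes, freqs, growth = dmd(snaps, dt=0.01, rank=8)
dominant = np.argmax(np.abs(modes).sum(axis=0))          # crude pick of dominant mode
print(freqs[dominant])
```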

Keywords: coherent structure, Direct Numerical Simulation (DNS), dominant frequency, Dynamic Mode Decomposition (DMD)

Procedia PDF Downloads 327
1802 Optimization of the Self-Recognition Direct Digital Radiology Technology by Applying the Density Detector Sensors

Authors: M. Dabirinezhad, M. Bayat Pour, A. Dabirinejad

Abstract:

In 2020, SDDR technology was introduced to solve some of the deficiencies of direct digital radiology. SDDR is an invention capable of capturing dental images without human intervention, and it was invented by the authors of this paper. Adjusting the radiology wave dose is part of the tasks of dentists, radiologists and dental nurses during the radiographic imaging process. In this paper, an improvement will be added that enables SDDR to set a suitable wave dose automatically according to the density and age of the patient. Separate sensors will be included in the sensor package that use ultrasonic waves to detect the density of the teeth and adjust the wave dose accordingly. This facilitates the dental imaging process in terms of time and enhances the accuracy of choosing the correct wave dose for each patient individually. Since radiological radiation is well known to trigger diseases such as cancer, choosing the most suitable wave dose helps to decrease its side effects on human health; in other words, it decreases the exposure for patients. Moreover, because time is saved, less energy is consumed, and saving energy is also beneficial in decreasing the environmental impact.

Keywords: dental direct digital imaging, environmental impacts, SDDR technology, wave dose

Procedia PDF Downloads 174
1801 An Application for Risk of Crime Prediction Using Machine Learning

Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento

Abstract:

The increase of the world population, especially in large urban centers, has resulted in new challenges, particularly in the control and optimization of public safety. Thus, in the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical incident data and demographic information. The entire research and implementation are presented, starting with the data collection from its original source, the treatment and transformations applied to the data, and the choice, evaluation and implementation of the machine learning model, up to the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformations used. The results show that the use of machine learning techniques helps to anticipate criminal occurrences, which contributes to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API enabling other entities to request predictions in real time. An application is also presented in which the criminal predictions can be shown visually.
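
A compressed sketch of the model-comparison step is given below; the feature names, file name and train/test split are assumptions and not the authors' exact pipeline.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Hypothetical table: one row per (grid cell, time slot) with engineered features
df = pd.read_csv("incidents_features.csv")
X = df[["cell_id", "hour", "weekday", "month", "population_density"]]
y = df["crime_occurred"]                      # 1 if at least one incident occurred

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

models = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "knn": KNeighborsClassifier(n_neighbors=15),
    "neural_network": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "F1 =", round(f1_score(y_te, model.predict(X_te)), 3))
```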

Keywords: crime prediction, machine learning, public safety, smart city

Procedia PDF Downloads 95
1800 A Model to Assess Sustainability Using Multi-Criteria Analysis and Geographic Information Systems: A Case Study

Authors: Antonio Boggia, Luisa Paolotti, Gianluca Massei, Lucia Rocchi, Elaine Pace, Maria Attard

Abstract:

The aim of this paper is to present a methodology and a computer model for sustainability assessment based on the integration of Multi-criteria Decision Analysis (MCDA) with a Geographic Information System (GIS). It presents the results of a study on the implementation of a model for measuring sustainability to address policy actions for the improvement of sustainability at the territorial level. The aim is to rank areas in order to understand the specific technical and/or financial support that is required to develop sustainable growth. Assessing sustainable development is a multidimensional problem: economic, social and environmental aspects have to be taken into account at the same time. The tool for a multidimensional representation is a proper set of indicators. The set of indicators must be integrated into a model, that is, an assessment methodology, to be used for measuring sustainability. The model, developed by the Environmental Laboratory of the University of Perugia, is called GeoUmbriaSUIT. It is a calculation procedure developed as a plugin working in the open-source GIS software QuantumGIS. The multi-criteria method used within GeoUmbriaSUIT is the TOPSIS algorithm (Technique for Order Preference by Similarity to Ideal Solution), which defines a ranking based on the distance from the worst point and the closeness to an ideal point, for each of the criteria used. For the sustainability assessment procedure, GeoUmbriaSUIT uses a geographic vector file where the graphic data represent the study area and the single evaluation units within it (the alternatives, e.g. the regions of a country, or the municipalities of a region), while the alphanumeric data (attribute table) describe the environmental, economic and social aspects related to the evaluation units by means of a set of indicators (criteria). The algorithm available in the plugin allows the indicators representing the three dimensions of sustainability to be treated individually and computes three different indices: an environmental index, an economic index and a social index. The graphic output of the model allows for an integrated assessment of the three dimensions, avoiding aggregation. The presence of separate indices and graphic output makes GeoUmbriaSUIT a readable and transparent tool, since it does not produce an aggregate index of sustainability as the final result of the calculations, which is often cryptic and difficult to interpret. In addition, it is possible to develop a 'back analysis', able to explain the positions obtained by the alternatives in the ranking, based on the criteria used. The case study presented is an assessment of the level of sustainability in the six regions of Malta, an island state in the middle of the Mediterranean Sea and the southernmost member of the European Union. The results show that the integration of MCDA and GIS is an adequate approach for sustainability assessment. In particular, the implemented model is able to provide easy-to-understand results. This is a very important condition for a sound decision support tool, since most of the time decision makers are not experts and need understandable output. In addition, the evaluation path is traceable and transparent.
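
The TOPSIS ranking used inside GeoUmbriaSUIT can be sketched with a generic implementation of the standard formulation; the indicator matrix, weights and benefit/cost flags below are illustrative assumptions, not the plugin's code.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) on criteria (columns).

    matrix:  (n_alternatives, n_criteria) indicator values
    weights: (n_criteria,) criterion weights
    benefit: (n_criteria,) True if higher is better, False if lower is better
    Returns closeness scores in [0, 1]; higher means closer to the ideal point.
    """
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)            # vector normalization
    v = norm * w                                    # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_ideal = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_ideal + d_worst)

# Example: 3 regions, two benefit indicators (higher better) and one cost indicator
scores = topsis([[0.8, 120, 30], [0.6, 150, 25], [0.9, 100, 40]],
                weights=[0.4, 0.3, 0.3],
                benefit=[True, True, False])
print(np.argsort(scores)[::-1])                     # ranking, best first
```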

Keywords: GIS, multi-criteria analysis, sustainability assessment, sustainable development

Procedia PDF Downloads 268
1799 Characterization of Agroforestry Systems in Burkina Faso Using an Earth Observation Data Cube

Authors: Dan Kanmegne

Abstract:

Africa will become the most populated continent by the end of the century, with around 4 billion inhabitants. Food security and climate change will become continental issues, since agricultural practices depend on climate but also contribute to global emissions and land degradation. Agroforestry has been identified as a cost-efficient and reliable strategy to address these two issues. It is defined as the integrated management of trees and crops/animals in the same land unit. Agroforestry provides benefits in terms of goods (fruits, medicine, wood, etc.) and services (windbreaks, fertility, etc.), and is acknowledged to have great potential for carbon sequestration; therefore, it can be integrated into carbon emission reduction mechanisms. Particularly in sub-Saharan Africa, the constraint lies in the lack of information about both the areas under agroforestry and the characterization (composition, structure, and management) of each agroforestry system at the country level. This study describes and quantifies “what is where?” as a prerequisite to the quantification of carbon stock in the different systems. Remote sensing (RS) is the most efficient approach to map such a dynamic technology as agroforestry, since it gives relatively adequate and consistent information over a large area at nearly no cost. RS data fulfill the good practice guidelines of the Intergovernmental Panel on Climate Change (IPCC) for use in carbon estimation. Satellite data are becoming more and more accessible, and the archives are growing exponentially. To retrieve useful information that supports decision-making out of this large amount of data, satellite data need to be organized so as to ensure fast processing, quick accessibility, and ease of use. A new solution is the data cube, which can be understood as a multi-dimensional stack (space, time, data type) of spatially aligned pixels used for efficient access and analysis. A data cube for Burkina Faso has been set up through the cooperation project between the international service provider WASCAL and Germany, which provides an accessible exploitation architecture for multi-temporal satellite data. The aim of this study is to map and characterize agroforestry systems using the Burkina Faso earth observation data cube. The approach, in its initial stage, is based on an unsupervised image classification of a normalized difference vegetation index (NDVI) time series from 2010 to 2018, to stratify the country based on its vegetation. Fifteen strata were identified, and four samples per location were randomly assigned to define the sampling units. For safety reasons, the northern part of the country will not be part of the fieldwork. A total of 52 locations will be visited by the end of the dry season in February-March 2020. The field campaigns will consist of identifying and describing the different agroforestry systems, together with qualitative interviews. A multi-temporal supervised image classification will then be done with a random forest algorithm, and the field data will be used both for training the algorithm and for accuracy assessment. The expected outputs are (i) map(s) of agroforestry dynamics, (ii) characteristics of the different systems (main species, management, area, etc.), and (iii) an assessment report of the Burkina Faso data cube.
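
A minimal sketch of the initial stratification step, as described above: NDVI is computed from red and near-infrared bands, and the per-pixel time series are clustered into strata (here with k-means as one example of unsupervised classification). The band arrays, image size and cluster count are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def ndvi(red, nir):
    """NDVI = (NIR - RED) / (NIR + RED), computed per pixel."""
    red, nir = red.astype(float), nir.astype(float)
    return (nir - red) / (nir + red + 1e-9)

# Hypothetical stacks of annual composites, shape (n_years, height, width)
red_stack = np.random.rand(9, 400, 400)    # 2010-2018 red band
nir_stack = np.random.rand(9, 400, 400)    # 2010-2018 near-infrared band

series = ndvi(red_stack, nir_stack)                     # (9, H, W)
pixels = series.reshape(series.shape[0], -1).T          # (H*W, 9) NDVI time series

kmeans = KMeans(n_clusters=15, n_init=10, random_state=0).fit(pixels)
strata = kmeans.labels_.reshape(series.shape[1:])       # map of 15 vegetation strata
print(np.bincount(kmeans.labels_))                      # pixels per stratum
```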

Keywords: agroforestry systems, Burkina Faso, earth observation data cube, multi-temporal image classification

Procedia PDF Downloads 128
1798 An Integrated Lightweight Naïve Bayes Based Webpage Classification Service for Smartphone Browsers

Authors: Mayank Gupta, Siba Prasad Samal, Vasu Kakkirala

Abstract:

The internet world and its priorities have changed considerably in the last decade. Browsing on smartphones has increased manifold and is set to grow much more. Users spend considerable time browsing different websites, which gives a great deal of insight into their preferences. Instead of presenting plain information, classifying the different aspects of browsing, such as bookmarks, history, and the download manager, into useful categories would improve and enhance the user's experience. Most classification solutions are server-side, which involves maintaining servers and other heavy resources, brings security constraints, and may miss contextual data during classification. On-device classification solves many of these problems, but the challenge is to achieve good classification accuracy under resource constraints. On-device classification can be much more useful for personalization, reducing dependency on cloud connectivity, and better privacy/security. This approach provides more relevant results than current standalone solutions because it uses the content rendered by the browser, which is customized by the content provider based on the user's profile. This paper proposes a Naive Bayes based lightweight classification engine targeted at resource-constrained devices. Our solution integrates with the web browser, which in turn triggers the classification algorithm. Whenever a user browses a webpage, the solution extracts DOM tree data from the browser's rendering engine. This DOM data is dynamic, contextual and secure data that cannot be replicated. The proposal extracts different features of the webpage and runs them through an algorithm to classify the page into multiple categories. A Naive Bayes based engine is chosen in this solution for its inherent advantages in using limited resources compared to other classification algorithms like Support Vector Machines, Neural Networks, etc. Naive Bayes classification requires a small memory footprint and little computation, which suits the smartphone environment. The solution has a feature to partition the model into multiple chunks, which in turn reduces memory usage compared to loading a complete model. Classification of webpages through the integrated engine is faster, more relevant and more energy-efficient than other standalone on-device solutions. This classification engine has been tested on Samsung Z3 Tizen hardware. The engine is integrated into the Tizen Browser, which uses the Chromium rendering engine. For this solution, an extensive dataset was sourced from dmoztools.net and cleaned. This cleaned dataset has 227.5K webpages, which are divided into 8 generic categories ('education', 'games', 'health', 'entertainment', 'news', 'shopping', 'sports', 'travel'). Our browser-integrated solution has resulted in 15% less memory usage (due to the partition method) and 24% less power consumption in comparison with the standalone solution. The solution used 70% of the dataset for training the data model and the remaining 30% for testing. An average accuracy of ~96.3% is achieved across the above-mentioned 8 categories. This engine can be further extended to suggest dynamic tags and to use the classification in different use cases to enhance the browsing experience.
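
The on-device engine itself can be sketched as a hand-rolled multinomial naive Bayes working in log space, which keeps the model to a few dictionaries of counts; the feature choice (whitespace tokens), smoothing value and toy training data are assumptions, and this is not the Tizen implementation.

```python
import math
from collections import Counter, defaultdict

class TinyNaiveBayes:
    """Multinomial naive Bayes over bag-of-words features from DOM text."""
    def __init__(self, alpha=1.0):
        self.alpha = alpha
        self.word_counts = defaultdict(Counter)   # category -> word counts
        self.doc_counts = Counter()               # category -> number of pages
        self.vocab = set()

    def fit(self, pages, labels):
        for text, label in zip(pages, labels):
            words = text.lower().split()
            self.word_counts[label].update(words)
            self.doc_counts[label] += 1
            self.vocab.update(words)

    def predict(self, text):
        words = text.lower().split()
        total_docs = sum(self.doc_counts.values())
        best, best_score = None, -math.inf
        for cat in self.doc_counts:
            counts = self.word_counts[cat]
            total = sum(counts.values()) + self.alpha * len(self.vocab)
            score = math.log(self.doc_counts[cat] / total_docs)       # prior
            for w in words:                                           # likelihood
                score += math.log((counts[w] + self.alpha) / total)
            if score > best_score:
                best, best_score = cat, score
        return best

# Toy example with two of the eight categories
nb = TinyNaiveBayes()
nb.fit(["football match score", "hotel flight booking"], ["sports", "travel"])
print(nb.predict("cheap flight and hotel deals"))     # -> "travel"
```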

Keywords: chromium, lightweight engine, mobile computing, Naive Bayes, Tizen, web browser, webpage classification

Procedia PDF Downloads 149
1797 Fuel Quality of Biodiesel from Chlorella protothecoides Microalgae Species

Authors: Mukesh Kumar, Mahendra Pal Sharma

Abstract:

Depleting fossil fuel resources, coupled with serious environmental degradation, have led to the search for alternative resources for biodiesel production as a substitute for petro-diesel. Currently, edible and non-edible oil plants as well as microalgal species are cultivated for biodiesel production. Given the demerits of edible and non-edible oil resources, the focus is on growing microalgal species with high oil productivity, shorter maturation time and lower land requirements. Among the various microalgal species, Chlorella protothecoides is considered the most promising for biodiesel production owing to its high oil content (58%), fast growth rate (24-48 h) and high biomass productivity (1214 mg/l/day). The present paper reports the results of the optimization of the reaction parameters of the transesterification process, as well as the kinetics of transesterification, with a 97% biodiesel yield. The measurement of the fuel quality of the microalgal biodiesel shows that it exhibits very good oxidation stability (OS) of 7 h, exceeding the ASTM D6751 (3 h) and EN 14112 (6 h) specifications. The cloud point (CP) and pour point (PP) of 0 °C and -3 °C were found as per the ASTM D2500-11 and ASTM D97-12 standards. These results show that the microalgal biodiesel does not need any enhancement of its OS and CFP and hence can be recommended for direct use as MB100, or as blends with diesel, in engine operation. Further, there is scope for the production of binary blends using poorer-quality biodiesel for engine operation.

Keywords: fuel quality, methyl ester yield, microalgae, transesterification

Procedia PDF Downloads 205
1796 Defining a Pathway to Zero Energy Building: A Case Study on Retrofitting an Old Office Building into a Net Zero Energy Building for Hot-Humid Climate

Authors: Kwame B. O. Amoah

Abstract:

This paper focuses on retrofitting an old existing office building into a net-zero energy building (NZEB). An existing small office building in Melbourne, Florida, was chosen as a case study to integrate state-of-the-art design strategies and energy-efficient building systems in order to improve building performance and reduce energy consumption. The study aimed to explore possible ways to maximize energy savings and to identify renewable energy generation sources to cover the building's remaining energy needs, as necessary to achieve the net-zero energy goal. A series of retrofit options were reviewed and adopted, with some significant additional decision considerations. The detailed processes and considerations leading to zero energy are well documented in this study, with lessons learned adequately outlined. Based on building energy simulations, multiple design considerations were investigated, such as emerging state-of-the-art technologies, material selection, improvements to the building envelope, optimization of the HVAC and lighting systems, occupancy load analysis, and the application of renewable energy sources. A comparative analysis of the simulation results was used to determine how specific techniques led to energy and cost reductions. The research results indicate that this small office building can meet net-zero energy use after appropriate design modifications and the addition of renewable energy sources.

Keywords: energy consumption, building energy analysis, energy retrofits, energy-efficiency

Procedia PDF Downloads 203
1795 The Comparison and Optimization of the Analytic Method for Canthaxanthin, Food Colorants

Authors: Hee-Jae Suh, Kyung-Su Kim, Min-Ji Kim, Yeon-Seong Jeong, Ok-Hwan Lee, Jae-Wook Shin, Hyang-Sook Chun, Chan Lee

Abstract:

Canthaxanthin is a keto-carotenoid produced from beta-carotene, and it has been approved for use as a food coloring agent in many countries. Canthaxanthin has been analyzed using high-performance liquid chromatography (HPLC) systems with various pretreatment methods. Four official methods for the verification of canthaxanthin, from the FSA (UK), AOAC (US), EFSA (EU) and MHLW (Japan), were compared in order to improve the analytical and pretreatment methods. The linearity, limit of detection (LOD), limit of quantification (LOQ), accuracy, precision and recovery ratio were determined for each method with modifications to the pretreatment. All HPLC methods exhibited correlation coefficients of 0.9999 for the canthaxanthin calibration curves. The analysis methods from the FSA, AOAC, and MHLW showed LODs of 0.395 ppm, 0.105 ppm, and 0.084 ppm, and LOQs of 1.196 ppm, 0.318 ppm, and 0.254 ppm, respectively. Among the tested methods, the MHLW HPLC method with a modified pretreatment was finally selected for the analysis of canthaxanthin in the lab, because it exhibited a resolution factor of 4.0 and a selectivity of 1.30. This analysis method showed a correlation coefficient of 0.9999 and the lowest LOD and LOQ. Furthermore, the precision ratio was lower than 1 and the accuracy was almost 100%. The method presented a recovery ratio of 90-110% with the modified pretreatment. The cross-validation coefficient of variation was 5 or less among the three institutions tested in Korea.
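
The LOD and LOQ figures quoted above are conventionally derived from the calibration regression as LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the response (e.g., of the regression residuals) and S is the slope of the calibration curve. A small sketch with made-up calibration points, assuming this convention:

```python
import numpy as np

def lod_loq(concentrations, responses):
    """Estimate LOD and LOQ from a linear calibration curve.

    Uses LOD = 3.3 * sigma / slope and LOQ = 10 * sigma / slope, with sigma
    taken as the standard deviation of the regression residuals.
    """
    x = np.asarray(concentrations, dtype=float)
    y = np.asarray(responses, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    sigma = residuals.std(ddof=2)          # two fitted parameters
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical canthaxanthin standards (ppm) and measured peak areas
conc = [0.5, 1.0, 2.0, 5.0, 10.0]
area = [5.1, 10.3, 20.2, 50.8, 101.0]
lod, loq = lod_loq(conc, area)
print(f"LOD = {lod:.3f} ppm, LOQ = {loq:.3f} ppm")
```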

Keywords: analytic method, canthaxanthin, food colorants, pretreatment method

Procedia PDF Downloads 670
1794 Enhancing Plant Throughput in Mineral Processing Through Multimodal Artificial Intelligence

Authors: Muhammad Bilal Shaikh

Abstract:

Mineral processing plants play a pivotal role in extracting valuable minerals from raw ores, contributing significantly to various industries. However, the optimization of plant throughput remains a complex challenge, necessitating innovative approaches for increased efficiency and productivity. This research paper investigates the application of Multimodal Artificial Intelligence (MAI) techniques to address this challenge, aiming to improve overall plant throughput in mineral processing operations. The integration of multimodal AI leverages a combination of diverse data sources, including sensor data, images, and textual information, to provide a holistic understanding of the complex processes involved in mineral extraction. The paper explores the synergies between various AI modalities, such as machine learning, computer vision, and natural language processing, to create a comprehensive and adaptive system for optimizing mineral processing plants. The primary focus of the research is on developing advanced predictive models that can accurately forecast various parameters affecting plant throughput. Utilizing historical process data, machine learning algorithms are trained to identify patterns, correlations, and dependencies within the intricate network of mineral processing operations. This enables real-time decision-making and process optimization, ultimately leading to enhanced plant throughput. Incorporating computer vision into the multimodal AI framework allows for the analysis of visual data from sensors and cameras positioned throughout the plant. This visual input aids in monitoring equipment conditions, identifying anomalies, and optimizing the flow of raw materials. The combination of machine learning and computer vision enables the creation of predictive maintenance strategies, reducing downtime and improving the overall reliability of mineral processing plants. Furthermore, the integration of natural language processing facilitates the extraction of valuable insights from unstructured textual data, such as maintenance logs, research papers, and operator reports. By understanding and analyzing this textual information, the multimodal AI system can identify trends, potential bottlenecks, and areas for improvement in plant operations. This comprehensive approach enables a more nuanced understanding of the factors influencing throughput and allows for targeted interventions. The research also explores the challenges associated with implementing multimodal AI in mineral processing plants, including data integration, model interpretability, and scalability. Addressing these challenges is crucial for the successful deployment of AI solutions in real-world industrial settings. To validate the effectiveness of the proposed multimodal AI framework, the research conducts case studies in collaboration with mineral processing plants. The results demonstrate tangible improvements in plant throughput, efficiency, and cost-effectiveness. The paper concludes with insights into the broader implications of implementing multimodal AI in mineral processing and its potential to revolutionize the industry by providing a robust, adaptive, and data-driven approach to optimizing plant operations. In summary, this research contributes to the evolving field of mineral processing by showcasing the transformative potential of multimodal artificial intelligence in enhancing plant throughput. 
The proposed framework offers a holistic solution that integrates machine learning, computer vision, and natural language processing to address the intricacies of mineral extraction processes, paving the way for a more efficient and sustainable future in the mineral processing industry.

Keywords: multimodal AI, computer vision, NLP, mineral processing, mining

Procedia PDF Downloads 58
1793 An Optimal Approach for Full-Detailed Friction Model Identification of Reaction Wheel

Authors: Ghasem Sharifi, Hamed Shahmohamadi Ousaloo, Milad Azimi, Mehran Mirshams

Abstract:

The ever-increasing use of satellites demands a search for increasingly accurate and reliable pointing systems. Reaction wheels are rotating devices commonly used for spacecraft attitude control, since they provide a wide range of torque magnitudes and high reliability. Numerical modeling of this device can significantly enhance the accuracy of satellite control in space. Modeling the wheel rotation in the presence of the various friction effects is one of the critical parts of this approach. This paper presents a Dynamic Model Control of a Reaction Wheel (DMCR) in the current control mode. In current mode, the required current is delivered to the coils in order to achieve the desired torque. During this research, all the friction parameters, such as the viscous and Coulomb terms, as well as the motor coefficient, resistance and voltage constant, are identified. For the model identification of the reaction wheel, numerous varying current commands are applied to the particular wheel in order to verify the estimated model. All the parameters of the DMCR are identified by the classical Levenberg-Marquardt (CLM) optimization method. The experimental results demonstrate that the developed model has appropriate precision and can be used in satellite control simulation.
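
As an illustration of the identification step, the sketch below fits a steady-state torque balance (motor torque against viscous plus Coulomb friction) to measured current/speed data with SciPy's Levenberg-Marquardt solver. The model form, data values and solver choice are assumptions, not the paper's exact DMCR formulation.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical steady-state measurements: coil current [A] and wheel speed [rad/s]
current = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
omega   = np.array([ 50., 120., 190., 260., 330., 400.])

def residuals(params):
    kt, c_viscous, c_coulomb = params
    # At steady state the motor torque balances viscous + Coulomb friction
    motor_torque = kt * current
    friction = c_viscous * omega + c_coulomb * np.sign(omega)
    return motor_torque - friction

guess = [0.02, 1e-5, 1e-3]
result = least_squares(residuals, guess, method="lm")   # Levenberg-Marquardt
kt, c_v, c_c = result.x
print(f"motor constant = {kt:.4g} N*m/A, viscous = {c_v:.4g}, Coulomb = {c_c:.4g}")
```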

Keywords: experimental modeling, friction parameters, model identification, reaction wheel

Procedia PDF Downloads 221
1792 The Fit of the Partial Pair Distribution Functions of BaMnFeF7 Fluoride Glass Using the Buckingham Potential by the Hybrid RMC Simulation

Authors: Sidi Mohamed Mesli, Mohamed Habchi, Arslane Boudghene Stambouli, Rafik Benallal

Abstract:

BaMnMF7 (M = Fe, V) transition metal fluoride glasses, assuming isomorphous replacement, have been structurally studied through the simultaneous simulation of their neutron diffraction patterns by reverse Monte Carlo (RMC) and by hybrid reverse Monte Carlo (HRMC) analysis. The latter is applied to remedy the problem of the artificial satellite peaks that appear in the partial pair distribution functions (PDFs) produced by the RMC simulation. The HRMC simulation is an extension of the RMC algorithm that introduces an energy penalty term (potential) into the acceptance criterion. The idea of this work is to apply the Buckingham potential to the title glass, ignoring the van der Waals term, in order to fit the partial pair distribution functions and give the most realistic features possible. When displaying the partial PDFs, we suggest that the Buckingham potential is useful for describing average correlations, especially for similar interactions.
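
For reference, the Buckingham pair potential with the dispersion (van der Waals) term dropped, and an HRMC-style acceptance rule that adds a weighted energy penalty to the usual χ² criterion, can be sketched as follows; the parameter values and weighting are placeholders, not the fitted values of this work.

```python
import math
import random

def buckingham(r, A, rho, C=0.0):
    """Buckingham potential U(r) = A*exp(-r/rho) - C/r**6.

    With C = 0 the dispersion (van der Waals) term is ignored, as in the text,
    leaving only the short-range repulsive wall.
    """
    return A * math.exp(-r / rho) - C / r**6

def hrmc_accept(delta_chi2, delta_energy, weight, kT):
    """HRMC acceptance: chi-squared misfit change plus a weighted energy penalty."""
    cost = delta_chi2 / 2.0 + weight * delta_energy / kT
    return cost <= 0.0 or random.random() < math.exp(-cost)

# Example: a trial atom move that slightly worsens the fit but lowers the energy
print(hrmc_accept(delta_chi2=0.4, delta_energy=-0.05, weight=1.0, kT=0.025))
```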

Keywords: fluoride glasses, RMC simulation, hybrid RMC simulation, Buckingham potential, partial pair distribution functions

Procedia PDF Downloads 492
1791 Specified Human Motion Recognition and Unknown Hand-Held Object Tracking

Authors: Jinsiang Shaw, Pik-Hoe Chen

Abstract:

This paper aims to integrate human recognition, motion recognition, and object tracking technologies without requiring a pre-trained database model for motion recognition or for the unknown object itself. Furthermore, it can simultaneously track multiple users and multiple objects. Unlike other existing human motion recognition methods, our approach employs a rule-based condition method to determine whether a user's hand is approaching or departing from an object. It uses a background subtraction method to separate the human and the object from the background, and employs behavior features to effectively interpret human object-grabbing actions. With an object's histogram characteristics, we are able to isolate and track it using back projection. Hence, a moving object's trajectory can be recorded and the object itself can be located. This particular technique can be used in a camera surveillance system in a shopping area to perform real-time intelligent surveillance, thus preventing theft. Experimental results verify the validity of the developed surveillance algorithm with an accuracy of 83% for shoplifting detection.
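
The histogram back-projection tracking described above follows a standard OpenCV pattern; the sketch below is generic (not the authors' code), and the video source, initial region of interest and termination criteria are assumptions.

```python
import cv2

cap = cv2.VideoCapture("shop_camera.mp4")          # hypothetical video source
fgbg = cv2.createBackgroundSubtractorMOG2()        # background subtraction

ok, frame = cap.read()
x, y, w, h = 200, 150, 80, 80                      # assumed initial object ROI
roi = frame[y:y+h, x:x+w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])   # hue histogram
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
track_window = (x, y, w, h)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fgmask = fgbg.apply(frame)                     # keep only moving regions
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    backproj = cv2.bitwise_and(backproj, backproj, mask=fgmask)
    _, track_window = cv2.meanShift(backproj, track_window, term)
    x, y, w, h = track_window                      # object location this frame
```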

Keywords: automatic tracking, back projection, motion recognition, shoplifting

Procedia PDF Downloads 320
1790 Development of Quasi Real-Time Comprehensive System for Earthquake Disaster

Authors: Zhi Liu, Hui Jiang, Jin Li, Kunhao Chen, Langfang Zhang

Abstract:

Fast acquisition of seismic information and accurate assessment of the earthquake disaster are the key problems for emergency rescue after a destructive earthquake. In order to meet the requirements of earthquake emergency response and rescue for cities and counties, a quasi real-time comprehensive evaluation system for earthquake disaster is developed. Based on monitoring data from a Micro-Electro-Mechanical Systems (MEMS) strong-motion network, the structure database of a county area, and real-time disaster information reported via mobile terminals after an earthquake, a fragility analysis method and a dynamic correction algorithm are combined in the developed system. Real-time evaluation of the seismic disaster in the county region is thereby realized to provide a scientific basis for seismic emergency command, rescue, and decision support.

Keywords: quasi real-time, earthquake disaster data collection, MEMS accelerometer, dynamic correction, comprehensive evaluation

Procedia PDF Downloads 199
1789 A Novel Multi-Objective Park and Ride Control Scheme Using Renewable Energy Sources: Cairo Case Study

Authors: Mohammed Elsayed Lotfy Elsayed Abouzeid, Tomonobu Senjyu

Abstract:

A novel multi-objective park and ride control approach is presented in this research. Park and ride encourages vehicle owners to leave their cars at the nearest points (on the edges of crowded cities) and use public transportation facilities (train, bus, metro, or monorail) to reach their workplaces inside the crowded city. The proposed control scheme is used to design electric vehicle charging stations (EVCS) to charge 1000 electric vehicles (EV) during their owners' working hours. Cairo, Egypt is used as a case study. Photovoltaic (PV) generation and a battery energy storage system (BESS) are used to meet the EVCS demand. Two multi-objective optimization techniques (MOGA and epsilon-MOGA) are utilized to obtain the optimal sizes of the PV and BESS so as to meet the load demand and minimize the total life cycle cost. A detailed analysis and comparison are carried out in MATLAB to investigate the performance of the proposed control scheme.

Keywords: battery energy storage system, electric vehicle, park and ride, photovoltaic, multi-objective

Procedia PDF Downloads 125
1788 Modified Naive Bayes-Based Prediction Modeling for Crop Yield Prediction

Authors: Kefaya Qaddoum

Abstract:

Most greenhouse growers desire a determined amount of yield in order to accurately meet market requirements. The purpose of this paper is to model a simple but often satisfactory supervised classification method. The original naive Bayes has a serious weakness, which is retaining redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction, which utilizes an L1 penalty, is capable of clearing redundant predictors, and a modification of the LARS algorithm is devised to solve this problem, making the method applicable to a wide range of data. In the experimental section, a study is conducted to examine the effect of redundant and irrelevant predictors and to test the method on the WSG dataset of tomato yields, where there are many more predictors than data points and where predicting the weekly yield is the goal of the approach. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN) and is shown to be fairly good.
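
The flavor of the approach, L1-based screening of redundant predictors followed by a naive Bayes classifier, can be imitated with off-the-shelf components. The sketch below uses LassoLars screening and Gaussian naive Bayes as illustrative stand-ins for the paper's modified LARS/penalized naive Bayes, and the synthetic data shapes are made up.

```python
import numpy as np
from sklearn.linear_model import LassoLars
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 300))                 # many more predictors than samples
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # yield class driven by two predictors

# L1 (LARS-based) screening: keep only predictors with non-zero coefficients
screen = LassoLars(alpha=0.01).fit(X, y)
kept = np.flatnonzero(screen.coef_)
print("kept predictors:", kept)

# Naive Bayes on the reduced predictor set, scored by cross-validation
nb = GaussianNB()
print("CV accuracy:", cross_val_score(nb, X[:, kept], y, cv=5).mean())
```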

Keywords: tomato yield prediction, naive Bayes, redundancy, WSG

Procedia PDF Downloads 219
1787 Predicting Indonesia External Debt Crisis: An Artificial Neural Network Approach

Authors: Riznaldi Akbar

Abstract:

In this study, we compared the performance of an Artificial Neural Network (ANN) model with the back-propagation algorithm in correctly predicting in-sample and out-of-sample external debt crises in Indonesia. We found that the exchange rate, foreign reserves, and exports are the major determinants of experiencing an external debt crisis. The ANN's in-sample performance provides relatively superior results: the model is able to correctly classify 89.12 per cent of crises with a reasonably low false-alarm rate of 7.01 per cent. Out of sample, the prediction performance deteriorates somewhat compared to the in-sample performance. This could be explained by the ANN model tending to over-fit the in-sample data while not fitting the out-of-sample data very well. Ten-fold cross-validation has been used to improve the out-of-sample prediction accuracy. The results also offer policy implications. The out-of-sample performance can be very sensitive to the size of the samples, as it can yield a higher total misclassification error and lower prediction accuracy. The ANN model can be used to identify past crisis episodes with some accuracy, but predicting crises outside the estimation sample is much more challenging because of the presence of uncertainty.
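
A minimal scikit-learn analogue of the back-propagation network with 10-fold cross-validation described above; the indicator file, network size and solver settings are assumptions rather than the study's specification.

```python
import pandas as pd
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Hypothetical quarterly indicators with a binary crisis label
df = pd.read_csv("indonesia_external_debt.csv")
X = df[["exchange_rate", "foreign_reserves", "exports"]]
y = df["crisis"]                                  # 1 = external debt crisis

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), solver="sgd",   # back-propagation / SGD
                  max_iter=2000, random_state=0),
)

scores = cross_val_score(model, X, y, cv=10)      # 10-fold cross-validation
print("mean accuracy:", scores.mean())
```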

Keywords: debt crisis, external debt, artificial neural network, ANN

Procedia PDF Downloads 427
1786 Optimization of Parameters for Electrospinning of Pan Nanofibers by Taguchi Method

Authors: Gamze Karanfil Celep, Kevser Dincer

Abstract:

The effects of polymer concentration and electrospinning process parameters on the average diameter of electrospun polyacrylonitrile (PAN) nanofibers were experimentally investigated. In addition, the mechanical and thermal properties of the PAN nanofibers were examined by tensile testing and thermogravimetric analysis (TGA), respectively. For this purpose, the polymer concentration, solution feed rate, supply voltage and tip-to-collector distance were chosen as the control factors. To achieve these aims, Taguchi's L16 orthogonal design (4 parameters, 4 levels) was employed for the experimental design. The optimal electrospinning conditions were defined using the signal-to-noise (S/N) ratio, calculated from the diameters of the electrospun PAN nanofibers according to the 'smaller-the-better' approach. In addition, analysis of variance (ANOVA) was performed to assess the statistical significance of the process parameters. The smallest diameter of the PAN nanofibers was thereby obtained. According to the S/N ratio response results, the parameter with the strongest effect on the nanofiber diameter was determined. Overall, the Taguchi design of experiments method was found to be an effective way to statistically optimize the critical electrospinning parameters used in nanofiber production. After determining the optimum process parameters for nanofiber production, the electrical conductivity and fuel cell performance of electrospun PAN nanofibers on carbon papers will be evaluated.
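
The 'smaller-the-better' signal-to-noise ratio used to score each run is S/N = -10·log10(mean(y²)); a small sketch of scoring a few of the L16 runs follows, with made-up diameter values.

```python
import numpy as np

def sn_smaller_is_better(measurements):
    """Taguchi S/N ratio for the 'smaller-the-better' case (dB)."""
    y = np.asarray(measurements, dtype=float)
    return -10.0 * np.log10(np.mean(y**2))

# Hypothetical replicate fiber diameters (nm) for three of the L16 runs
runs = {
    "run_1": [310, 295, 320],
    "run_2": [180, 175, 190],
    "run_3": [450, 440, 470],
}
ratios = {name: sn_smaller_is_better(d) for name, d in runs.items()}
best = max(ratios, key=ratios.get)      # highest S/N ratio = smallest diameters
print(ratios, "->", best)
```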

Keywords: nanofiber, electrospinning, polyacrylonitrile, Taguchi method

Procedia PDF Downloads 191
1785 An Approaching Index to Evaluate a forward Collision Probability

Authors: Yuan-Lin Chen

Abstract:

This paper presents an approaching forward collision probability index (AFCPI) for alerting and assisting the driver in keeping a safe distance to avoid forward collision accidents in highway driving. The time to collision (TTC) and time headway (TH) are used to evaluate the TTC forward collision probability index (TFCPI) and the TH forward collision probability index (HFCPI), respectively. A Mamdani fuzzy inference algorithm combining the TFCPI and HFCPI is presented to calculate the approaching collision probability index of the vehicle. The AFCPI is easy to understand, even for drivers without any professional knowledge of the vehicle field. At the same time, the driver's behavior is taken into account so that the index suits each driver. For the approaching index, a value of 0 indicates a 0% probability of forward collision, and values of 0.5 and 1 indicate 50% and 100% probabilities of forward collision, respectively. The AFCPI is useful and easy to understand for alerting the driver to avoid forward collision accidents when driving on the highway.
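
The two ingredients of the index are straightforward: TTC is the gap divided by the closing speed, and TH is the gap divided by the following vehicle's own speed. The sketch below maps each to a simple probability-like score and blends them with a weighted average standing in for the paper's Mamdani fuzzy inference; the thresholds and weights are illustrative assumptions only.

```python
def ttc_index(gap_m, closing_speed_ms, ttc_critical=2.0, ttc_safe=8.0):
    """TFCPI-like score: 1 when TTC <= critical, 0 when TTC >= safe."""
    if closing_speed_ms <= 0:          # not closing in on the lead vehicle
        return 0.0
    ttc = gap_m / closing_speed_ms
    return min(1.0, max(0.0, (ttc_safe - ttc) / (ttc_safe - ttc_critical)))

def th_index(gap_m, own_speed_ms, th_critical=0.5, th_safe=2.0):
    """HFCPI-like score based on time headway."""
    if own_speed_ms <= 0:
        return 0.0
    th = gap_m / own_speed_ms
    return min(1.0, max(0.0, (th_safe - th) / (th_safe - th_critical)))

def afcpi(gap_m, own_speed_ms, lead_speed_ms, w_ttc=0.6, w_th=0.4):
    """Blend of the two scores; 0 = no risk, 1 = imminent forward collision."""
    closing = own_speed_ms - lead_speed_ms
    return w_ttc * ttc_index(gap_m, closing) + w_th * th_index(gap_m, own_speed_ms)

# Example: following a slower car at a 30 m gap with an 8 m/s closing speed
print(round(afcpi(gap_m=30.0, own_speed_ms=33.0, lead_speed_ms=25.0), 2))
```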

Keywords: approaching index, forward collision probability, time to collision, time headway

Procedia PDF Downloads 277
1784 Quantification of Effects of Shape of Basement Topography below the Circular Basin on the Ground Motion Characteristics and Engineering Implications

Authors: Kamal, Dinesh Kumar, J. P. Narayan, Komal Rani

Abstract:

This paper presents the effects of the shape of the basement topography on the characteristics of the basin-generated surface (BGS) waves and the associated average spectral amplification (ASA) in 3D basins having a circular surface area. Seismic responses were computed using a recently developed 3D fourth-order spatially accurate time-domain finite-difference (FD) algorithm based on a parsimonious staggered-grid approximation of the 3D viscoelastic wave equations. An increase of amplitude amplification and ASA towards the centre of the different considered basins was obtained. Further, it may be concluded that the ASA in a basin depends strongly on the impedance contrast, the exposure area of the basement to the incident wavefront, the edge slope, the focusing of the BGS waves, and the sediment damping. There is an urgent need to incorporate a map of the differential ground motion (DGM) caused by the BGS waves as one of the output maps of seismic microzonation.

Keywords: 3D viscoelastic simulation, basin-generated surface waves, maximum displacement, average spectral amplification

Procedia PDF Downloads 283
1783 Optimization of Pressure in Deep Drawing Process

Authors: Ajay Kumar Choubey, Geeta Agnihotri, C. Sasikumar, Rashmi Dwivedi

Abstract:

Deep-drawing operations are performed widely in industrial applications. For efficiency, it is very important to achieve parts with no or minimal defects. Deep-drawn parts are used in high-performance, high-strength and high-reliability applications where tension, stress, load and human safety are critical considerations. Wrinkling is a kind of defect caused by stresses in the flange part of the blank during metal forming operations. To avoid wrinkling, an appropriate blank-holder pressure/force or a drawbead can be applied. Nowadays, computer simulation plays a vital role in the field of manufacturing processes. Computer simulation of manufacturing has many advantages over the previous conventional approach, e.g., mass production, good product quality and fast operation. In this study, a two-dimensional elasto-plastic finite element (FE) model of a mild steel blank has been developed to study the behavior of flange wrinkling and the deep drawing parameters under different blank-holder pressures (BHP). For this, the commercially available finite element software ANSYS 14 has been used. The simulation results are critically studied, and salient conclusions have been drawn.

Keywords: ANSYS, deep drawing, BHP, finite element simulation, wrinkling

Procedia PDF Downloads 440
1782 Aerodynamic Design of a Light Long Range Blended Wing Body Unmanned Vehicle

Authors: Halison da Silva Pereira, Ciro Sobrinho Campolina Martins, Vitor Mainenti Leal Lopes

Abstract:

Long-range performance is a goal of aircraft configuration optimization. The Blended Wing Body (BWB) is presented in many works in the literature as the most aerodynamically efficient design for a fixed-wing aircraft. Because of its high weight-to-thrust ratio, the BWB is the ideal configuration for many Unmanned Aerial Vehicle (UAV) missions in geomatics applications. In this work, a BWB aerodynamic design for a typical light geomatics payload is presented. The aerodynamic non-dimensional coefficients are predicted using low Reynolds number computational techniques (3D panel method), and wing parameters such as aspect ratio, taper ratio, wing twist and sweep are optimized for high cruise performance and flight quality. The methodology of this work is a summary of tailless aircraft wing design and its application, with appropriate computational schemes, to light UAVs subjected to low Reynolds number flows; it leads to conclusions such as the higher performance and flight quality of thicker airfoils in the airframe body and the benefits of using aerodynamic twist rather than just geometric twist.

Keywords: blended wing body, low Reynolds number, panel method, UAV

Procedia PDF Downloads 572
1781 High Harmonics Generation in Hexagonal Graphene Quantum Dots

Authors: Armenuhi Ghazaryan, Qnarik Poghosyan, Tadevos Markosyan

Abstract:

We have considered high-order harmonic generation in planar graphene quantum dots of hexagonal shape using the independent quasiparticle approximation within a tight-binding model. We have investigated how such a nonlinear effect is affected by a strong optical wave field, by the quantum dot's typical band gap and lateral size, and by dephasing processes. The equation of motion for the density matrix is solved by performing the time integration with an eighth-order Runge-Kutta algorithm. If the optical wave frequency is much less than the quantum dot's intrinsic band gap, the main aspects of multiphoton high harmonic emission in quantum dots are revealed. In such a case, the dependence of the cutoff photon energy on the strength of the optical pump wave is almost linear. But when the wave frequency is comparable to the band gap of the quantum dot, the cutoff photon energy shows saturation behavior with an increase in the wave field strength.

Keywords: strong wave field, multiphoton, bandgap, wave field strength, nanostructure

Procedia PDF Downloads 133
1780 Contention Window Adjustment in IEEE 802.11-based Industrial Wireless Networks

Authors: Mohsen Maadani, Seyed Ahmad Motamedi

Abstract:

The use of wireless technology in industrial networks has gained considerable attention in recent years. In this paper, we thoroughly analyze the effect of the contention window (CW) size on the performance of IEEE 802.11-based industrial wireless networks (IWN) from the delay and reliability perspectives. The results show that the default values of CWmin, CWmax, and the retry limit (RL) are far from the optimum performance, owing to industrial application characteristics including short packets and a noisy environment. An adaptive (payload-dependent) CW algorithm has been proposed to minimize the average delay. Finally, a simple but effective CW and RL setting has been proposed for industrial applications, which outperforms the minimum-average-delay solution from the maximum-delay and jitter perspectives, at the cost of a slightly higher average delay. Simulation results show improvements of up to 20%, 25%, and 30% in average delay, maximum delay and jitter, respectively.

Keywords: average delay, contention window, distributed coordination function (DCF), jitter, industrial wireless network (IWN), maximum delay, reliability, retry limit

Procedia PDF Downloads 404
1779 Renovation Planning Model for a Shopping Mall

Authors: Hsin-Yun Lee

Abstract:

In this study, the pedestrian simulation software VISWALK is integrated with a program implementing the ant algorithm to construct a renovation engineering schedule planning model. The simulation platform is used to analyze the construction site: after simulating how users walk around the areas closed for construction and calculating the resulting delays, the ant algorithm searches for the schedule plan with the minimum delay time, and the loss of business associated with the deactivated floor area is also computed. Finally, the best schedule plan is selected by weighing the considerations of the two different positions of the owner and the users. To assess and validate its effectiveness, this study applied the model to a floor renovation engineering case in a shopping mall. The case shows that the proposed schedule planning approach can effectively reduce the delay time and the mall's loss of business due to users' walking detours, keeping the impact of the renovation work on the operation of the building's facilities to a minimum.

Keywords: pedestrian, renovation, schedule, simulation

Procedia PDF Downloads 401