Search results for: fast Fourier algorithms
3779 Comparative Study of Skeletonization and Radial Distance Methods for Automated Finger Enumeration
Authors: Mohammad Hossain Mohammadi, Saif Al Ameri, Sana Ziaei, Jinane Mounsef
Abstract:
Automated enumeration of hand fingers is widely used in motion gaming and distance control applications and is discussed in several published papers as a building block for hand recognition systems. The automated finger enumeration technique should not only be accurate but must also respond quickly to a moving-picture input. The demands of high-frame-rate video in motion games or distance control constrain the program's overall speed, so image processing software such as Matlab needs to produce results at high computation speeds. Since automated finger enumeration with minimum error and processing time is desired, a comparative study between two finger enumeration techniques is presented and analyzed in this paper. In the pre-processing stage, various image processing functions were applied to a real-time video input to obtain a final cleaned, auto-cropped image of the hand to be used by the two techniques. The first technique uses the well-known morphological tool of skeletonization and counts the skeleton's endpoints as fingers. The second technique builds a one-dimensional hand representation and uses a radial distance method to enumerate the fingers. For both methods, the different steps of the algorithms are explained. Then, a comparative study analyzes the accuracy and speed of both techniques. Through experimental testing in different background conditions, it was observed that the radial distance method was more accurate and more responsive to a real-time video input than the skeletonization method. All test results were generated in Matlab and were based on displaying a human hand in three different orientations on top of a plain-color background. Finally, the limitations surrounding the enumeration techniques are presented. Keywords: comparative study, hand recognition, fingertip detection, skeletonization, radial distance, Matlab
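A minimal sketch of one plausible reading of the radial distance method described above, written in Python/OpenCV rather than the Matlab used by the authors; the peak-prominence and spacing thresholds and the helper name `count_fingers_radial` are illustrative assumptions, not the authors' implementation:

```python
# Illustrative only: count peaks of the centroid-to-contour radial distance signal.
import cv2
import numpy as np
from scipy.signal import find_peaks

def count_fingers_radial(binary_hand):
    """binary_hand: uint8 mask of the segmented hand (255 = hand, 0 = background)."""
    contours, _ = cv2.findContours(binary_hand, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return 0
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2)

    # Palm centre from image moments of the mask.
    m = cv2.moments(binary_hand, binaryImage=True)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]

    # 1-D radial distance signal: distance from the centre to each contour point.
    radial = np.hypot(contour[:, 0] - cx, contour[:, 1] - cy)

    # Fingers appear as prominent peaks in the radial signal.
    peaks, _ = find_peaks(radial, prominence=0.15 * radial.max(),
                          distance=max(1, len(radial) // 20))
    return len(peaks)
```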
Procedia PDF Downloads 382
3778 A Comparative Study of the Maximum Power Point Tracking Methods for PV Systems Using Boost Converter
Authors: M. Doumi, A. Miloudi, A.G. Aissaoui, K. Tahir, C. Belfedal, S. Tahir
Abstract:
Studies on photovoltaic systems are increasing rapidly because they offer a large, secure, essentially inexhaustible and broadly available resource as a future energy supply. However, the output power of photovoltaic modules is influenced by the intensity of solar radiation, the temperature of the solar cells, and other factors. Therefore, to maximize the efficiency of the photovoltaic system, it is necessary to track the maximum power point of the PV array; Maximum Power Point Tracking (MPPT) techniques are used for this purpose. The algorithms considered here are based on the Perturb and Observe, Incremental Conductance and Fuzzy Logic methods. These techniques differ in many aspects, such as simplicity, convergence speed, digital or analog implementation, sensors required, cost, and range of effectiveness. This paper presents a comparative study of three widely adopted MPPT algorithms; their performance is evaluated from an energy point of view, using the simulation tool Simulink® and considering different solar irradiance variations. MPPT using fuzzy logic shows superior performance and more reliable control than the other methods for this application. Keywords: photovoltaic system, MPPT, perturb and observe (P&O), incremental conductance (INC), Fuzzy Logic (FLC)
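For reference, a minimal sketch of the textbook perturb-and-observe logic compared in this abstract (not the authors' Simulink model); the function names and the fixed step size dV are assumptions:

```python
# Textbook perturb-and-observe MPPT step: perturb the operating voltage and keep
# moving in the direction that increases measured power. Illustrative only.
def perturb_and_observe(v_ref, v_meas, p_meas, v_prev, p_prev, dV=0.5):
    """Return the next voltage reference for the boost-converter controller."""
    if p_meas > p_prev:
        # Power increased: keep perturbing in the same direction.
        v_ref += dV if v_meas > v_prev else -dV
    else:
        # Power decreased: reverse the perturbation direction.
        v_ref -= dV if v_meas > v_prev else -dV
    return v_ref

# Typical use inside a control loop (pseudo-measurements):
# v_ref = perturb_and_observe(v_ref, v_now, v_now * i_now, v_last, p_last)
```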
Procedia PDF Downloads 411
3777 Evaluation of NoSQL in the Energy Marketplace with GraphQL Optimization
Authors: Michael Howard
Abstract:
The growing popularity of electric vehicles in the United States requires an ever-expanding infrastructure of commercial DC fast charging stations. The U.S. Department of Energy estimates 33,355 publicly available DC fast charging stations as of September 2023. In 2017, 115,370 gasoline stations were operating in the United States, far more ubiquitous than DC fast chargers. Range anxiety is an important impediment to the adoption of electric vehicles and is even more relevant in underserved regions of the country. The peer-to-peer energy marketplace helps fill the demand by allowing private home and small business owners to rent out their 240-volt, level-2 charging facilities. The existing, publicly accessible outlets are wrapped with a Cloud-connected microcontroller managing security and charging sessions. These microcontrollers act as Edge devices communicating with a Cloud message broker, while both buyer and seller users interact with the framework via a web-based user interface. The database storage used by the marketplace framework is a key component in both the cost of development and the performance that contributes to the user experience. A traditional storage solution is the SQL database. The architecture and query language have been in existence since the 1970s and are well understood and documented. The Structured Query Language supported by the query engine provides fine granularity in user query conditions. However, difficulty in scaling across multiple nodes and the cost of its server-based compute have resulted in a trend over the last 20 years towards other NoSQL, serverless approaches. In this study, we evaluate NoSQL vs. SQL solutions through a comparison of the Google Cloud Firestore and Cloud SQL MySQL offerings. The comparison pits Google's serverless, document-model, non-relational NoSQL service against its server-based, table-model, relational SQL service. The evaluation is based on query latency, flexibility/scalability, and cost criteria. Through benchmarking and analysis of the architecture, we determine whether Firestore can support the energy marketplace storage needs and whether the introduction of a GraphQL middleware layer can overcome its deficiencies. Keywords: non-relational, relational, MySQL, mitigate, Firestore, SQL, NoSQL, serverless, database, GraphQL
Procedia PDF Downloads 63
3776 Toward Indoor and Outdoor Surveillance using an Improved Fast Background Subtraction Algorithm
Authors: El Harraj Abdeslam, Raissouni Naoufal
Abstract:
The detection of moving objects from video image sequences is very important for object tracking, activity recognition, and behavior understanding in video surveillance. The most widely used approach for moving object detection and tracking is background subtraction. Many approaches have been suggested for background subtraction, but these are sensitive to illumination changes, and the solutions proposed to bypass this problem are time consuming. In this paper, we propose a robust yet computationally efficient background subtraction approach and focus mainly on the ability to detect moving objects in dynamic scenes, for possible applications in monitoring complex and restricted-access areas, where moving and motionless persons must be reliably detected. It consists of three main phases: establishing invariance to illumination changes, background/foreground modeling, and morphological analysis for noise removal. We handle illumination changes using Contrast Limited Adaptive Histogram Equalization (CLAHE), which limits the intensity of each pixel to a user-determined maximum. This mitigates the degradation due to scene illumination changes and improves the visibility of the video signal. Initially, the background and foreground images are extracted from the video sequence. Then, the background and foreground images are separately enhanced by applying CLAHE. In order to form multi-modal backgrounds, we model each channel of a pixel as a mixture of K Gaussians (K=5) using a Gaussian Mixture Model (GMM). Finally, we post-process the resulting binary foreground mask using morphological erosion and dilation transformations to remove possible noise. For the experimental test, we used a standard dataset to challenge the efficiency and accuracy of the proposed method on a diverse set of dynamic scenes. Keywords: video surveillance, background subtraction, contrast limited adaptive histogram equalization, illumination invariance, object tracking, object detection, behavior understanding, dynamic scenes
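A compact sketch of the kind of pipeline the abstract describes, using OpenCV building blocks (CLAHE, a GMM-based background subtractor, and morphological clean-up); all parameter values are assumptions, and OpenCV's MOG2 subtractor stands in for the authors' K=5 GMM:

```python
# Illustrative pipeline only: CLAHE pre-processing, GMM background subtraction,
# and morphological clean-up, as outlined in the abstract. Parameters are guesses.
import cv2

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

def foreground_mask(frame_bgr):
    # Equalize contrast on the luminance channel to reduce illumination effects.
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB)
    lab[:, :, 0] = clahe.apply(lab[:, :, 0])
    enhanced = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

    # GMM-based background/foreground classification per pixel.
    mask = subtractor.apply(enhanced)

    # Morphological erosion + dilation (opening) to remove speckle noise.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    return mask
```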
Procedia PDF Downloads 258
3775 Analysis and Detection of Facial Expressions in Autism Spectrum Disorder People Using Machine Learning
Authors: Muhammad Maisam Abbas, Salman Tariq, Usama Riaz, Muhammad Tanveer, Humaira Abdul Ghafoor
Abstract:
Autism Spectrum Disorder (ASD) refers to a developmental disorder that impairs an individual's communication and interaction ability. Affected individuals find it difficult to read facial expressions while communicating or interacting. Facial Expression Recognition (FER) is a method of classifying basic human expressions, i.e., happiness, fear, surprise, sadness, disgust, neutral, and anger, from static and dynamic sources. This paper conducts a comprehensive comparison and proposes an optimal method for a continued research project—a system that can assist people who have Autism Spectrum Disorder (ASD) in recognizing facial expressions. The comparison was conducted on three supervised learning algorithms: EigenFace, FisherFace, and LBPH. The JAFFE, CK+, and TFEID (I&II) datasets were used to train and test the algorithms. The results were then evaluated based on variance, standard deviation, and accuracy. The experiments showed that FisherFace has the highest accuracy on all datasets and is considered the best algorithm to be implemented in our system. Keywords: autism spectrum disorder, ASD, EigenFace, facial expression recognition, FisherFace, local binary pattern histogram, LBPH
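An illustrative sketch of how the three compared recognizers are exposed in opencv-contrib-python; data loading, image sizes, and the seven expression labels are assumptions, and this is not the authors' training setup:

```python
# Illustrative only: the Eigen/Fisher/LBPH recognizers from opencv-contrib.
import cv2
import numpy as np

def train_and_predict(train_images, train_labels, test_image, method="fisher"):
    """train_images: list of equal-sized grayscale arrays; train_labels: ints 0..6."""
    factories = {
        "eigen":  cv2.face.EigenFaceRecognizer_create,
        "fisher": cv2.face.FisherFaceRecognizer_create,
        "lbph":   cv2.face.LBPHFaceRecognizer_create,
    }
    recognizer = factories[method]()
    recognizer.train(train_images, np.array(train_labels))
    label, confidence = recognizer.predict(test_image)  # lower confidence = closer match
    return label, confidence
```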
Procedia PDF Downloads 176
3774 Influence of Densification Process and Material Properties on Final Briquettes Quality from Fast-Growing Willows
Authors: Peter Križan, Juraj Beniak, Ľubomír Šooš, Miloš Matúš
Abstract:
Biomass treatment through densification is a very suitable and important technology prior to effective energy recovery. The densification process of biomass is significantly influenced by various technological and material parameters, which are ultimately reflected in the final solid biofuel quality. The paper deals with experimental research on the relationship between technological and material parameters during densification of fast-growing trees, namely fast-growing willow. The main goal of the presented experimental research is to determine the relationship between pressing pressure and raw material fraction size from the point of view of final briquette density. The experimental research was realized by single-axis densification. The impact of fraction size, in interaction with pressing pressure and stabilization time, on the quality properties of briquettes was determined. The interaction of these parameters affects the final solid biofuel (briquette) quality. From the point of view of briquette production, and also of densification machine construction, it is very important to know the mutual interaction of these parameters and its effect on final briquette quality. The experimental findings presented here show the importance of the mentioned parameters during the densification process. Keywords: briquettes density, densification, fraction size, pressing pressure, stabilization time
Procedia PDF Downloads 368
3773 An Exploratory Study of Reliability of Ranking vs. Rating in Peer Assessment
Authors: Yang Song, Yifan Guo, Edward F. Gehringer
Abstract:
Fifty years of research has found great potential for peer assessment as a pedagogical approach. With peer assessment, not only do students receive more copious assessments; they also learn to become assessors. In recent decades, more educational peer assessments have been facilitated by online systems. Those online systems are designed differently to suit different class settings and student groups, but they basically fall into two categories: rating-based and ranking-based. The rating-based systems ask assessors to rate the artifacts one by one following some review rubrics. The ranking-based systems allow assessors to review a set of artifacts and give a rank for each of them. Though there are different systems and a large number of users of each category, there is no comprehensive comparison on which design leads to higher reliability. In this paper, we designed algorithms to evaluate assessors' reliabilities based on their rating/ranking against the global ranks of the artifacts they have reviewed. These algorithms are suitable for data from both rating-based and ranking-based peer assessment systems. The experiments were done based on more than 15,000 peer assessments from multiple peer assessment systems. We found that the assessors in ranking-based peer assessments are at least 10% more reliable than the assessors in rating-based peer assessments. Further analysis also demonstrated that the assessors in ranking-based assessments tend to assess the more differentiable artifacts correctly, but there is no such pattern for rating-based assessors.Keywords: peer assessment, peer rating, peer ranking, reliability
Procedia PDF Downloads 440
3772 Kinetic, Equilibrium and Thermodynamic Studies of the Adsorption of Crystal Violet Dye Using Groundnut Hulls
Authors: Olumuyiwa Ayoola Kokapi, Olugbenga Solomon Bello
Abstract:
Dyes are organic compounds with complex aromatic molecular structures that impart fast colour to a substance. Dye effluent found in wastewater generated by the dyeing industries is one of the greatest contributors to water pollution. Groundnut hull (GH) is an agricultural material that constitutes waste in the environment. Environmental contamination by hazardous organic chemicals is an urgent problem, which is partially solved through adsorption technologies. The choice of groundnut hull was premised on the understanding that some materials of agricultural origin have shown potential to act as adsorbents for hazardous organic chemicals. The aim of this research is to evaluate the potential of groundnut hull to adsorb Crystal Violet dye through kinetic, isotherm and thermodynamic studies. The prepared groundnut hulls were characterized using Brunauer-Emmett-Teller (BET) analysis, Fourier transform infrared (FTIR) spectroscopy and scanning electron microscopy (SEM). Operational parameters such as contact time, initial dye concentration, pH, and temperature were studied. Equilibrium time for the adsorption process was attained in 80 minutes. The adsorption data were tested against the Langmuir and Freundlich isotherm models. Thermodynamic parameters such as ∆G°, ∆H°, and ∆S° of the adsorption processes were determined. The results showed that the uptake of dye by groundnut hulls occurred at a fast rate, corresponding to an increase in adsorption capacity at the equilibrium time of 80 min from 0.78 to 4.45 mg/g and from 0.77 to 4.45 mg/g as the initial dye concentration increased from 10 to 50 mg/L at pH 3.0 and 8.0, respectively. High regression values obtained for the pseudo-second-order kinetic model and the sum of squared error (SSE%) values, along with strong agreement between experimental and calculated values of qe, showed that the pseudo-second-order kinetic model fitted the data better than the pseudo-first-order model. The results for the Langmuir and Freundlich models showed that the adsorption data fit the Langmuir model better than the Freundlich model. The thermodynamic study demonstrated the feasibility, spontaneity and endothermic nature of the adsorption process, due to the negative values of the free energy change (∆G) at all temperatures and the positive value of the enthalpy change (∆H), respectively. The positive values of ∆S showed that there was increased disorderliness and randomness at the solid/solution interface of Crystal Violet dye and groundnut hulls. The present investigation showed that groundnut hulls (GH) are a good low-cost alternative adsorbent for the removal of Crystal Violet (CV) dye from aqueous solution. Keywords: adsorption, crystal violet dye, groundnut hulls, kinetics
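For reference, the standard forms of the models named in the abstract (textbook expressions, not equations reproduced from the paper):

```latex
% Pseudo-second-order kinetics (linearized), Langmuir and Freundlich isotherms,
% and the Gibbs free energy relations used in the thermodynamic analysis.
\frac{t}{q_t} = \frac{1}{k_2 q_e^2} + \frac{t}{q_e}
\qquad
\frac{C_e}{q_e} = \frac{1}{q_m K_L} + \frac{C_e}{q_m}
\qquad
q_e = K_F C_e^{1/n}
\qquad
\Delta G^{\circ} = -RT \ln K ,\quad \Delta G^{\circ} = \Delta H^{\circ} - T\,\Delta S^{\circ}
```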
Procedia PDF Downloads 377
3771 Quantitative and Fourier Transform Infrared Analysis of Saponins from Three Kenyan Ruellia Species: Ruellia prostrata, Ruellia lineari-bracteolata and Ruellia bignoniiflora
Authors: Christine O. Wangia, Jennifer A. Orwa, Francis W. Muregi, Patrick G. Kareru, Kipyegon Cheruiyot, Eric Guantai
Abstract:
Ruellia (syn. Dipteracanthus) species are wild perennial creepers belonging to the Acanthaceae family. These species are reported to possess anti-inflammatory, analgesic, antioxidant, gastroprotective, anticancer, and immuno-stimulant properties. Phytochemical screening of both aqueous and methanolic extracts of Ruellia species revealed the presence of saponins. Saponins have been reported to possess anti-inflammatory, antioxidant, immuno-stimulant, antihepatotoxic, antibacterial, anticarcinogenic, and antiulcerogenic activities. The objective of this study was to quantify and analyze the Fourier transform infrared (FTIR) spectra of saponins in crude extracts of three Kenyan Ruellia species, namely Ruellia prostrata (RPM), Ruellia lineari-bracteolata (RLB) and Ruellia bignoniiflora (RBK). Sequential organic extraction of the ground whole-plant material was done using petroleum ether (PE), chloroform, ethyl acetate (EtOAc), and absolute methanol by cold maceration, while aqueous extraction was by hot maceration. The plant powders and extracts were mixed with spectroscopic-grade KBr and compressed into a pellet. The infrared spectra were recorded using an 8000-series Shimadzu FTIR spectrophotometer in the range of 3500 cm-1 - 500 cm-1. Quantitative determination of the saponins was done using standard procedures. Quantitative analysis of saponins showed that RPM had the highest quantity of crude saponins (2.05% ± 0.03), followed by RLB (1.4% ± 0.15) and RBK (1.25% ± 0.11). FTIR spectra revealed the spectral peaks characteristic of saponins in RPM, RLB, and RBK plant powders and their aqueous and methanol extracts: O-H absorption (3265 - 3393 cm-1), C-H absorption ranging from 2851 to 2924 cm-1, C=C absorbance (1628 - 1655 cm-1), and oligosaccharide linkage (C-O-C) absorption due to sapogenins (1036 - 1042 cm-1). The crude saponins from RPM, RLB and RBK showed peaks similar to their respective extracts. The presence of the saponins in extracts of RPM, RLB and RBK may be responsible for some of the biological activities reported in the Ruellia species. Keywords: Ruellia bignoniiflora, Ruellia linearibracteolata, Ruellia prostrata, Saponins
Procedia PDF Downloads 181
3770 Two Efficient Heuristic Algorithms for the Integrated Production Planning and Warehouse Layout Problem
Authors: Mohammad Pourmohammadi Fallah, Maziar Salahi
Abstract:
In the literature, a mixed-integer linear programming model for the integrated production planning and warehouse layout problem has been proposed. To solve the model, its authors proposed a Lagrangian relax-and-fix heuristic that takes a significant amount of time to stop, with gaps above 5% for large-scale instances. Here, we present two heuristic algorithms to solve the problem. In the first one, we use a greedy approach: warehouse locations with lower reservation costs and lower transportation costs (from the production area to the location and from the location to the output point) are allocated to the items with higher demands, and then a smaller model is solved. In the second heuristic, we first sort the items in descending order according to the fraction formed by the sum of the demands for that item over the time horizon plus the maximum demand for that item over the horizon, divided by the sum of all its demands over the horizon. Then we categorize the sorted items into groups of 3, 4, or 5 and solve a small-scale optimization problem for each group, hoping to improve the solution of the first heuristic. Our preliminary numerical results show the effectiveness of the proposed heuristics. Keywords: capacitated lot-sizing, warehouse layout, mixed-integer linear programming, heuristic algorithms
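An illustrative sketch of the greedy idea in the first heuristic: pair high-demand items with low-cost locations. The cost fields, data shapes and the one-to-one pairing are assumptions made for the sketch, not the authors' exact allocation rule:

```python
# Greedy pairing of high-demand items with low-cost warehouse locations.
def greedy_assignment(items, locations):
    """items: list of dicts with 'id' and 'total_demand';
    locations: list of dicts with 'id', 'reservation_cost', 'to_location_cost',
    'to_output_cost'. Returns {item_id: location_id}."""
    items_sorted = sorted(items, key=lambda it: it["total_demand"], reverse=True)
    locs_sorted = sorted(
        locations,
        key=lambda lc: lc["reservation_cost"] + lc["to_location_cost"] + lc["to_output_cost"],
    )
    return {it["id"]: lc["id"] for it, lc in zip(items_sorted, locs_sorted)}
```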
Procedia PDF Downloads 197
3769 Resource-Constrained Assembly Line Balancing Problems with Multi-Manned Workstations
Authors: Yin-Yann Chen, Jia-Ying Li
Abstract:
Assembly line balancing problems can be categorized into one-sided, two-sided, and multi-manned ones according to the number of operators deployed at workstations. This study explores the balancing problem of a resource-constrained assembly line with multi-manned workstations. Resources include machines or tools in assembly lines, such as jigs, fixtures, and hand tools. A mathematical programming model was developed to carry out decision-making and planning in order to minimize the numbers of workstations, resources, and operators and achieve optimal production efficiency. To improve the solution-finding efficiency, a genetic algorithm (GA) and a simulated annealing algorithm (SA) were designed and developed in this study and applied to a practical case in car manufacturing. Results of the GA/SA and the mathematical programming model were compared to verify their validity. Finally, analysis and comparison were conducted in terms of the target values, production efficiency, and deployment combinations provided by the algorithms, so that the results of this study can provide references for decision-making on production deployment. Keywords: heuristic algorithms, line balancing, multi-manned workstation, resource-constrained
Procedia PDF Downloads 210
3768 Joint Training Offer Selection and Course Timetabling Problems: Models and Algorithms
Authors: Gianpaolo Ghiani, Emanuela Guerriero, Emanuele Manni, Alessandro Romano
Abstract:
In this article, we deal with a variant of the classical course timetabling problem that has a practical application in many areas of education. In particular, in this paper we are interested in high school remedial courses. The purpose of such courses is to provide under-prepared students with the skills necessary to succeed in their studies; a student might be under-prepared in an entire course or only in part of it. The limited availability of funds, as well as the limited amount of time and teachers at their disposal, often requires schools to choose which courses and/or which teaching units to activate. Thus, schools need to model the training offer and the related timetabling, with the goal of ensuring the highest possible teaching quality while meeting the above-mentioned financial, time and resource constraints. Moreover, there are some prerequisites between the teaching units that must be satisfied. We first present a Mixed-Integer Programming (MIP) model to solve this problem to optimality. However, the presence of many peculiar constraints inevitably increases the complexity of the mathematical model. Thus, solving it through a general-purpose solver is feasible for small instances only, while solving real-life-sized instances of the model requires specific techniques or heuristic approaches. For this purpose, we also propose a heuristic approach, in which we make use of a fast constructive procedure to obtain a feasible solution. To assess our exact and heuristic approaches, we performed extensive computational experiments on both real-life instances (obtained from a high school in Lecce, Italy) and randomly generated instances. Our tests show that the MIP model is never solved to optimality, with an average optimality gap of 57%. On the other hand, the heuristic algorithm is much faster (in about 50% of the considered instances it converges in approximately half of the time limit) and in many cases achieves an improvement on the objective function value obtained by the MIP model. Such an improvement ranges between 18% and 66%. Keywords: heuristic, MIP model, remedial course, school, timetabling
Procedia PDF Downloads 605
3767 Computer-Integrated Surgery of the Human Brain, New Possibilities
Authors: Ugo Galvanetto, Pirto G. Pavan, Mirco Zaccariotto
Abstract:
The discipline of computer-integrated surgery (CIS) will provide equipment able to improve the efficiency of healthcare systems and, more importantly, clinical results. Surgeons and machines will cooperate in new ways that will extend surgeons' ability to train, plan and carry out surgery. Patient-specific CIS of the brain requires several steps: 1 – Fast generation of brain models. Based on recognition of MR images and equipped with artificial intelligence, image recognition techniques should differentiate among all brain tissues and segment them. After that, automatic mesh generation should create the mathematical model of the brain in which the various tissues (white matter, grey matter, cerebrospinal fluid …) are clearly located in the correct positions. 2 – Reliable and fast simulation of the surgical process. Computational mechanics will be the crucial aspect of the entire procedure. New algorithms will be used to simulate the mechanical behaviour of cutting through cerebral tissues. 3 – Real-time provision of visual and haptic feedback. A sophisticated human-machine interface based on ergonomics and psychology will provide the feedback to the surgeon. The present work addresses in particular point 2. Modelling the cutting of soft tissue in a structure as complex as the human brain is an extremely challenging problem in computational mechanics. The finite element method (FEM), which accurately represents complex geometries and accounts for material and geometrical nonlinearities, is the most widely used computational tool to simulate the mechanical response of soft tissues. However, the main drawback of FEM lies in the mechanics theory on which it is based, classical continuum mechanics, which assumes matter is a continuum with no discontinuity. FEM must resort to complex tools such as pre-defined cohesive zones, external phase-field variables, and demanding remeshing techniques to include discontinuities. However, all approaches that equip FEM computational methods with the capability to describe material separation, such as interface elements with cohesive zone models, X-FEM, element erosion, and phase-field, have drawbacks that make them unsuitable for surgery simulation. Interface elements require a priori knowledge of crack paths. The use of X-FEM in 3D is cumbersome. Element erosion does not conserve mass. The phase-field approach adopts a diffusive crack model instead of describing the true tissue separation typical of surgical procedures. Modelling discontinuities, so difficult for computational approaches based on classical continuum mechanics, is instead easy for novel computational methods based on peridynamics (PD). PD is a non-local theory of mechanics formulated with no use of spatial derivatives. Its governing equations are valid at points or surfaces of discontinuity, and it is therefore especially suited to describing crack propagation and fragmentation problems. Moreover, PD does not require any criterion to decide the direction of crack propagation or the conditions for crack branching or coalescence; in PD-based computational methods, cracks develop spontaneously in the way that is most convenient from an energy point of view. Therefore, in PD computational methods, crack propagation in 3D is as easy as it is in 2D, a remarkable advantage with respect to all other computational techniques. Keywords: computational mechanics, peridynamics, finite element, biomechanics
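For context, the standard bond-based peridynamic equation of motion referred to above (a textbook form, not taken from the paper):

```latex
% Bond-based peridynamic equation of motion: the internal force at a point x is an
% integral of pairwise bond forces over its neighbourhood (horizon) H_x, so no
% spatial derivatives appear and discontinuities pose no difficulty.
\rho(\mathbf{x})\,\ddot{\mathbf{u}}(\mathbf{x},t)
  = \int_{H_{\mathbf{x}}} \mathbf{f}\big(\mathbf{u}(\mathbf{x}',t)-\mathbf{u}(\mathbf{x},t),\,
      \mathbf{x}'-\mathbf{x}\big)\, dV_{\mathbf{x}'} + \mathbf{b}(\mathbf{x},t)
```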
Procedia PDF Downloads 82
3766 Optimisation of Structural Design by Integrating Genetic Algorithms in the Building Information Modelling Environment
Authors: Tofigh Hamidavi, Sepehr Abrishami, Pasquale Ponterosso, David Begg
Abstract:
Structural design and analysis is an important and time-consuming process, particularly at the conceptual design stage. Decisions made at this stage can have an enormous effect on the entire project, as it becomes ever costlier and more difficult to alter the choices made early on in the construction process. Hence, optimisation of the early stages of structural design can provide important efficiencies in terms of cost and time. This paper suggests a structural design optimisation (SDO) framework in which Genetic Algorithms (GAs) may be used to semi-automate the production and optimisation of early structural design alternatives. This framework has the potential to leverage conceptual structural design innovation in Architecture, Engineering and Construction (AEC) projects. Moreover, this framework improves the collaboration between the architectural stage and the structural stage. It will be shown that this SDO framework can make this achievable by generating the structural model based on the extracted data from the architectural model. At the moment, the proposed SDO framework is in the process of validation, involving the distribution of an online questionnaire among structural engineers in the UK.Keywords: building information, modelling, BIM, genetic algorithm, GA, architecture-engineering-construction, AEC, optimisation, structure, design, population, generation, selection, mutation, crossover, offspring
Procedia PDF Downloads 242
3765 Radial Distribution Network Reliability Improvement by Using Imperialist Competitive Algorithm
Authors: Azim Khodadadi, Sahar Sadaat Vakili, Ebrahim Babaei
Abstract:
This study presents a numerical method to optimize the failure rate and repair time of a typical radial distribution system. Failure rate and repair time are effective parameters in the customer- and energy-based indices of reliability; decreasing these parameters improves the reliability indices and thus boosts system stability. Penalty functions indirectly reflect the investment cost spent to improve these indices. Constraints on the customer- and energy-based indices, i.e., SAIFI, SAIDI, CAIDI and AENS, have been considered by using a new method that reduces the number of controlling parameters of the optimization algorithm. The Imperialist Competitive Algorithm (ICA) was used as the main optimization technique, and particle swarm optimization (PSO), simulated annealing (SA) and differential evolution (DE) were applied for further investigation. These algorithms were implemented on a test system in MATLAB, and the obtained results were compared with each other. The optimized values of repair time and failure rate are much lower than the current values; this achievement reduces the investment cost, and ICA also gives better answers than the other algorithms used. Keywords: imperialist competitive algorithm, failure rate, repair time, radial distribution network
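For reference, the standard definitions of the reliability indices constrained in this study (textbook forms, not reproduced from the paper; N_i is the number of customers at load point i, λ_i and U_i its failure rate and annual outage time, and L_{a,i} its average connected load):

```latex
\mathrm{SAIFI} = \frac{\sum_i \lambda_i N_i}{\sum_i N_i},\qquad
\mathrm{SAIDI} = \frac{\sum_i U_i N_i}{\sum_i N_i},\qquad
\mathrm{CAIDI} = \frac{\mathrm{SAIDI}}{\mathrm{SAIFI}},\qquad
\mathrm{AENS} = \frac{\sum_i L_{a,i}\, U_i}{\sum_i N_i}
```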
Procedia PDF Downloads 669
3764 State Estimation of a Biotechnological Process Using Extended Kalman Filter and Particle Filter
Authors: R. Simutis, V. Galvanauskas, D. Levisauskas, J. Repsyte, V. Grincas
Abstract:
This paper deals with advanced state estimation algorithms for the estimation of biomass concentration and specific growth rate in a typical fed-batch biotechnological process. The biotechnological process was represented by a nonlinear, mass-balance-based process model. An Extended Kalman Filter (EKF) and a Particle Filter (PF) were used to estimate the unmeasured state variables from oxygen uptake rate (OUR) and base consumption (BC) measurements. To obtain more general results, a simplified process model was used in the EKF and PF estimation algorithms. This model does not require any special growth kinetic equations and could be applied for state estimation in various bioprocesses. The focus of this investigation was on comparing the estimation quality of the EKF and PF estimators under different measurement noises. The simulation results show that the Particle Filter algorithm requires significantly more computation time for state estimation but gives lower estimation errors for both biomass concentration and specific growth rate. Also, the tuning procedure for the Particle Filter is simpler than for the EKF. Consequently, the Particle Filter should be preferred in real applications, especially for monitoring industrial bioprocesses, where simplified implementation procedures are always desirable. Keywords: biomass concentration, extended Kalman filter, particle filter, state estimation, specific growth rate
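A minimal sketch of a bootstrap particle filter of the kind compared in the abstract, written for a generic one-dimensional state; the process and measurement maps and the noise levels are placeholders, not the paper's mass-balance model:

```python
# Generic bootstrap particle filter step (predict, weight, resample).
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, weights, z, f, h, q_std, r_std):
    """particles: (N,) state samples; z: new measurement; f/h: process/measurement maps."""
    # Predict: propagate each particle through the process model plus process noise.
    particles = f(particles) + rng.normal(0.0, q_std, size=particles.shape)

    # Update: weight particles by the likelihood of the measurement.
    weights = weights * np.exp(-0.5 * ((z - h(particles)) / r_std) ** 2)
    weights += 1e-300            # avoid all-zero weights
    weights /= weights.sum()

    # Resample when the effective sample size drops below half the particle count.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))

    estimate = np.sum(particles * weights)   # posterior mean estimate
    return particles, weights, estimate
```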
Procedia PDF Downloads 430
3763 Evaluation of Urban Land Development Direction in Kabul City, Afghanistan
Authors: Ahmad Sharif Ahmadi, Yoshitaka Kajita
Abstract:
Kabul, the capital and largest city of Afghanistan, has been experiencing massive population expansion and fast economic development in the last decade, during which urban land has increasingly expanded and a large informally developed territory has formed in the city. This paper investigates the urban land development direction based on the integrated urbanization trends in Kabul city since the last and fastest-ever urban land growth period (1999-2008), which parallels the establishment of the new government in Afghanistan. Considering the existing challenges in terms of informal settlements, squatter settlements, the population expansion of the city, fast economic development, the huge influx of refugees returning from neighboring countries, and the sprawl direction of urbanization at the Kabul city urban fringes, this research focuses on the possible urban land development direction and trends for the city. The paper studies the feasible future land development direction of Kabul city in the northern part, called the Shamali basin, for which district 17 is the gateway for future development. The area has much developable land, including eight districts of Kabul province and the vast area of Parwan and Kapisa provinces. The northern area generally has favorable conditions for further urbanization from the city: it is a large and relatively flat area in the northern part of Kabul city, with ample water resources available from the Panjshir basin as a basis for the land development direction in the area. Keywords: Kabul city, land development trends, urban land development, urbanization
Procedia PDF Downloads 281
3762 Predictive Analytics in Traffic Flow Management: Integrating Temporal Dynamics and Traffic Characteristics to Estimate Travel Time
Authors: Maria Ezziani, Rabie Zine, Amine Amar, Ilhame Kissani
Abstract:
This paper introduces a predictive model for urban transportation engineering, which is vital for efficient traffic management. Utilizing comprehensive datasets and advanced statistical techniques, the model accurately forecasts travel times by considering temporal variations and traffic dynamics. Machine learning algorithms, including regression trees and neural networks, are employed to capture sequential dependencies. Results indicate significant improvements in predictive accuracy, particularly during peak hours and holidays, with the incorporation of traffic flow and speed variables. Future enhancements may integrate weather conditions and traffic incidents. The model's applications range from adaptive traffic management systems to route optimization algorithms, facilitating congestion reduction and enhancing journey reliability. Overall, this research extends beyond travel time estimation, offering insights into broader transportation planning and policy-making realms, empowering stakeholders to optimize infrastructure utilization and improve network efficiency.Keywords: predictive analytics, traffic flow, travel time estimation, urban transportation, machine learning, traffic management
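A small sketch of the kind of model the abstract describes: a tree-ensemble regressor predicting travel time from temporal and traffic features. The column names and the choice of a random forest are illustrative assumptions, not the authors' exact models:

```python
# Illustrative travel-time regression on temporal + traffic features.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

def fit_travel_time_model(df: pd.DataFrame):
    """df columns assumed: hour, day_of_week, is_holiday, flow, speed, travel_time."""
    X = df[["hour", "day_of_week", "is_holiday", "flow", "speed"]]
    y = df["travel_time"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = RandomForestRegressor(n_estimators=300, random_state=42)
    model.fit(X_train, y_train)

    mae = mean_absolute_error(y_test, model.predict(X_test))
    return model, mae
```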
Procedia PDF Downloads 85
3761 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetic
Authors: Fei Gao, Rodolfo C. Raga Jr.
Abstract:
This research proposal aims to ascertain the major risk factors for diabetes and to design a predictive model for risk assessment. The project aims to improve diabetes early detection and management by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. The phase-relation values of each attribute were used to analyze and choose the attributes that might influence the examinee's survival probability, using the Diabetes Health Indicators Dataset from Kaggle as the research data. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as a foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess their performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators. Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle
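A compact sketch of the evaluation protocol described above (five-fold cross-validation scored on the five listed metrics), using scikit-learn; the classifiers shown are an assumption, since the abstract does not list the eight algorithms compared:

```python
# Five-fold cross-validation over several classifiers, scored with the five
# metrics named in the abstract. The classifier list is an illustrative subset.
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_validate

SCORING = ["accuracy", "precision", "recall", "f1", "roc_auc"]

def evaluate_models(X, y):
    models = {
        "logreg": LogisticRegression(max_iter=1000),
        "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
        "grad_boost": GradientBoostingClassifier(random_state=0),
    }
    results = {}
    for name, model in models.items():
        cv = cross_validate(model, X, y, cv=5, scoring=SCORING)
        results[name] = {m: cv[f"test_{m}"].mean() for m in SCORING}
    return results
```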
Procedia PDF Downloads 77
3760 Multimedia Firearms Training System
Authors: Aleksander Nawrat, Karol Jędrasiak, Artur Ryt, Dawid Sobel
Abstract:
The goal of the article is to present a novel Multimedia Firearms Training System. The system was developed in order to compensate for major problems of existing shooting training systems. The designed and implemented solution can be characterized by five major advantages: algorithm for automatic geometric calibration, algorithm of photometric recalibration, firearms hit point detection using thermal imaging camera, IR laser spot tracking algorithm for after action review analysis, and implementation of ballistics equations. The combination of the abovementioned advantages in a single multimedia firearms training system creates a comprehensive solution for detecting and tracking of the target point usable for shooting training systems and improving intervention tactics of uniformed services. The introduced algorithms of geometric and photometric recalibration allow the use of economically viable commercially available projectors for systems that require long and intensive use without most of the negative impacts on color mapping of existing multi-projector multimedia shooting range systems. The article presents the results of the developed algorithms and their application in real training systems.Keywords: firearms shot detection, geometric recalibration, photometric recalibration, IR tracking algorithm, thermography, ballistics
Procedia PDF Downloads 224
3759 Computational Neurosciences: An Inspiration from Biological Neurosciences
Authors: Harsh Sadawarti, Kamal Malik
Abstract:
Humans are unique and the most powerful creatures on this planet because of the high level of intelligence gifted to them by nature. Computational intelligence is highly influenced by natural intelligence, the neurosciences and mathematics. To study computational intelligence in depth and to utilize it in real-life applications, it is quite important to understand how it can be modeled on the human brain. In this paper, three important parts of the human brain, the frontal lobe, occipital lobe and parietal lobe, are compared with the ANN (Artificial Neural Network), CNN (Convolutional Neural Network), and RNN (Recurrent Neural Network), respectively. Intelligent computational systems are created by combining deductive reasoning, logical concepts and high-level algorithms with the simulation and study of the human brain. The human brain is a combination of physiology, psychology, emotions, calculations and many other parameters, which together determine overall intelligence. To create intelligent algorithms and smart machines and to simulate the human brain effectively, it is quite important to have an insight into the human brain and the basic concepts of the biological neurosciences. Keywords: computational intelligence, neurosciences, convolutional neural network, recurrent neural network, artificial neural network, frontal lobe, occipital lobe, parietal lobe
Procedia PDF Downloads 112
3758 Electrocardiogram-Based Heartbeat Classification Using Convolutional Neural Networks
Authors: Jacqueline Rose T. Alipo-on, Francesca Isabelle F. Escobar, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar Al Dahoul
Abstract:
Electrocardiogram (ECG) signal analysis and processing are crucial in the diagnosis of cardiovascular diseases, which are considered one of the leading causes of mortality worldwide. However, the traditional rule-based analysis of large volumes of ECG data is time-consuming, labor-intensive, and prone to human error. With the advancement of programming paradigms, machine learning algorithms have been increasingly used to perform analysis of ECG signals. In this paper, various deep learning algorithms were adapted to classify five classes of heartbeat types. The dataset used in this work is the synthetic MIT-BIH Arrhythmia dataset produced from generative adversarial networks (GANs). Various deep learning models, such as the ResNet-50 convolutional neural network (CNN), a 1-D CNN, and long short-term memory (LSTM), were evaluated and compared. ResNet-50 was found to outperform the other models in terms of recall and F1 score, with five-fold average scores of 98.88% and 98.87%, respectively. The 1-D CNN, on the other hand, was found to have the highest average precision of 98.93%. Keywords: heartbeat classification, convolutional neural network, electrocardiogram signals, generative adversarial networks, long short-term memory, ResNet-50
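An illustrative sketch of a small 1-D CNN for five-class heartbeat classification of the kind compared above; the layer sizes and the 187-sample beat length (a common choice for MIT-BIH-derived beats) are assumptions, not the authors' architectures:

```python
# Minimal 1-D CNN for 5-class heartbeat classification (illustrative only).
import tensorflow as tf

def build_1d_cnn(input_len=187, n_classes=5):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(input_len, 1)),
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```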
Procedia PDF Downloads 130
3757 Comparative Study of IC and Perturb and Observe Method of MPPT Algorithm for Grid Connected PV Module
Authors: Arvind Kumar, Manoj Kumar, Dattatraya H. Nagaraj, Amanpreet Singh, Jayanthi Prattapati
Abstract:
The purpose of this paper is to study and compare two maximum power point tracking (MPPT) algorithms in a photovoltaic simulation system, presenting a simulation study of MPPT for photovoltaic systems using the perturb-and-observe algorithm and the incremental conductance algorithm. Maximum power point tracking plays an important role in photovoltaic systems because it maximizes the power output from a PV system for a given set of conditions, and therefore maximizes the array efficiency and minimizes the overall system cost. Since the maximum power point (MPP) varies with irradiation and cell temperature, appropriate algorithms must be utilized to track the MPP and maintain the operation of the system at it. MATLAB/Simulink is used to establish a model of a photovoltaic system with MPPT functionality. This system is developed by combining the models established for the solar PV module and the DC-DC boost converter. The system is simulated under different climate conditions. Simulation results show that the photovoltaic simulation system can track the maximum power point accurately. Keywords: incremental conductance algorithm, perturb and observe algorithm, photovoltaic system, simulation results
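To complement the perturb-and-observe sketch given earlier in this listing, a minimal version of the incremental-conductance decision rule compared here; it relies on the MPP condition dP/dV = 0, i.e. dI/dV = -I/V. The step size, tolerance and function name are assumptions, not the authors' Simulink implementation:

```python
# Incremental conductance MPPT step: compare the incremental conductance dI/dV
# with the instantaneous conductance -I/V to decide which way to move the
# operating voltage. Illustrative only.
def incremental_conductance(v_ref, v, i, v_prev, i_prev, dV=0.5, tol=1e-3):
    dv, di = v - v_prev, i - i_prev
    if abs(dv) < 1e-9:                    # voltage unchanged: react to current change
        if abs(di) > tol:
            v_ref += dV if di > 0 else -dV
    else:
        dI_dV = di / dv
        if abs(dI_dV + i / v) < tol:      # dI/dV == -I/V  ->  at the MPP, hold
            pass
        elif dI_dV > -i / v:              # left of the MPP: increase voltage
            v_ref += dV
        else:                             # right of the MPP: decrease voltage
            v_ref -= dV
    return v_ref
```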
Procedia PDF Downloads 557
3756 AI Software Algorithms for Drivers Monitoring within Vehicles Traffic - SiaMOTO
Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu
Abstract:
Creating a personalized statistic for an individual within the population using IT systems, based on the searches and intercepted spheres of interest they manifest, is just one 'atom' of the artificial intelligence analysis network. However, having the ability to generate statistics based on individual data intercepted from large demographic areas leads to reasoning like that issued by a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that allows the interception of car events caused by a driver, positioning them in time and space. The device's connection to the vehicle allows the creation of a source of data whose analysis can create psychological, behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers with a unique fingerprint in their approach to driving. In this paper, we aimed to explain the software infrastructure of the SiaMOTO system, a system designed to monitor and improve driver driving behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO
Procedia PDF Downloads 93
3755 An Effective and Efficient Web Platform for Monitoring, Control, and Management of Drones Supported by a Microservices Approach
Authors: Jorge R. Santos, Pedro Sebastiao
Abstract:
In recent years there has been great growth in the use of drones, which are used in several areas such as security, agriculture, and research. Some systems that allow the remote control of drones already exist; however, these systems are quite simple and aimed at specific functionality. This paper proposes the development of a web platform built with Vue.js and Node.js to control, manage, and monitor drones in real time. Using a microservice architecture, the proposed project will be able to integrate algorithms that allow the optimization of processes. Communication with remote devices is proposed via HTTP over 3G, 4G, and 5G networks and can be done in real time or by scheduling routes. This paper addresses the case of forest fires as one of the services that could be included in a system similar to the one presented. The results obtained from this project were a success. The communication between the web platform and the drones allowed their remote control and monitoring. The incorporation of the fire detection algorithm into the platform made possible a real-time analysis of the images captured by the drone without human intervention. The proposed system has proved to be an asset to the use of drones in fire detection. The architecture of the developed application allows other algorithms to be implemented, yielding a more complex application with clear room for expansion. Keywords: drone control, microservices, node.js, unmanned aerial vehicles, vue.js
Procedia PDF Downloads 151
3754 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language
Authors: Wenjun Hou, Marek Perkowski
Abstract:
The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of least total weight in a given graph with N nodes. All variations on this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, to the best of our knowledge no quantum circuit implementation of these algorithms has been created. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measures based on the number of oracle iterations, but to be able to evaluate the real circuit and time costs on a quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover's algorithm to this problem, a quantum oracle was designed that evaluates the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating the Grover algorithm with an oracle that finds a successively lower cost each time makes it possible to transform the decision problem into an optimization problem, finding the minimum cost of a Hamiltonian cycle. N log₂ K qubits are put into an equiprobabilistic superposition by applying the Hadamard gate to each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, a node index calculator, a uniqueness checker, and a comparator, which were all created using only quantum Toffoli gates, including its special forms, the Feynman (CNOT) and Pauli X gates. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the nodes calculated in the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the oracle an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is then modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs. Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover's algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language
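For reference, the standard relation behind "performing the oracle an optimal number of times": with a search register of n qubits (here n = N log₂ K, so a space of size 2^n) and M marked states, the number of Grover iterations that maximizes the success probability is approximately

```latex
% Optimal number of Grover iterations for M marked states in a space of size 2^n.
R \approx \left\lfloor \frac{\pi}{4}\sqrt{\frac{2^{\,n}}{M}} \right\rfloor ,
\qquad n = N \log_2 K
```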
Procedia PDF Downloads 192
3753 Synchronous Reference Frame and Instantaneous P-Q Theory Based Control of Unified Power Quality Conditioner for Power Quality Improvement of Distribution System
Authors: Ambachew Simreteab Gebremedhn
Abstract:
Context: The paper explores the use of synchronous reference frame theory (SRFT) and instantaneous reactive power theory (IRPT) based control of Unified Power Quality Conditioner (UPQC) for improving power quality in distribution systems. Research Aim: To investigate the performance of different control configurations of UPQC using SRFT and IRPT for mitigating power quality issues in distribution systems. Methodology: The study compares three control techniques (SRFT-IRPT, SRFT-SRFT, IRPT-IRPT) implemented in series and shunt active filters of UPQC. Data is collected under various control algorithms to analyze UPQC performance. Findings: Results indicate the effectiveness of SRFT and IRPT based control techniques in addressing power quality problems such as voltage sags, swells, unbalance, harmonics, and current harmonics in distribution systems. Theoretical Importance: The study provides insights into the application of SRFT and IRPT in improving power quality, specifically in mitigating unbalanced voltage sags, where conventional methods fall short. Data Collection: Data is collected under various control algorithms using simulation in MATLAB Simulink and real-time operation executed with experimental results obtained using RT-LAB. Analysis Procedures: Performance analysis of UPQC under different control algorithms is conducted to evaluate the effectiveness of SRFT and IRPT based control techniques in mitigating power quality issues. Questions Addressed: How do SRFT and IRPT based control techniques compare in improving power quality in distribution systems? What is the impact of using different control configurations on the performance of UPQC? Conclusion: The study demonstrates the efficacy of SRFT and IRPT based control of UPQC in mitigating power quality issues in distribution systems, highlighting their potential for enhancing voltage and current quality.Keywords: power quality, UPQC, shunt active filter, series active filter, non-linear load, RT-LAB, MATLAB
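For reference, the core relations of the instantaneous p-q theory used in the IRPT-based controllers (the standard Akagi formulation in the α-β frame; sign conventions for q vary in the literature, and these equations are not reproduced from the paper):

```latex
% Clarke (alpha-beta) voltages and currents give the instantaneous real and
% imaginary (reactive) powers; compensating current references follow by inversion.
p = v_{\alpha} i_{\alpha} + v_{\beta} i_{\beta},
\qquad
q = v_{\beta} i_{\alpha} - v_{\alpha} i_{\beta}
\qquad\Longleftrightarrow\qquad
\begin{bmatrix} p \\ q \end{bmatrix}
=
\begin{bmatrix} v_{\alpha} & v_{\beta} \\ v_{\beta} & -v_{\alpha} \end{bmatrix}
\begin{bmatrix} i_{\alpha} \\ i_{\beta} \end{bmatrix}
```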
Procedia PDF Downloads 14
3752 The Trajectory of the Ball in Football Game
Authors: Mahdi Motahari, Mojtaba Farzaneh, Ebrahim Sepidbar
Abstract:
Tracking of moving and flying targets is one of the most important issues in image processing. Estimating the trajectory of a desired object, on both short-term and long-term scales, is even more important than merely tracking it. In this paper, a new way of identifying and estimating the future trajectory of a moving ball on a long-term scale is presented, using a combination of interacting algorithms: image processing for noise removal and image segmentation, a Kalman filter for estimating the trajectory of the ball in a football game on a short-term scale, and an intelligent adaptive neuro-fuzzy algorithm based on the time series of traversed distance. The proposed system attains more than 96% identification accuracy using the aforesaid methods, relying on the combined, interacting algorithms and a video database. Although the present method has high precision, it is time consuming. By comparing this method with other methods, we can see its accuracy and efficiency. Keywords: tracking, signal processing, moving and flying targets, artificial intelligence systems, trajectory estimation, Kalman filter
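For reference, the standard predict-update equations of the Kalman filter used for the short-term trajectory estimate (textbook form, not taken from the paper; F, H, Q and R denote the state-transition, measurement, process-noise and measurement-noise matrices, e.g. for a constant-velocity model of the ball's image position):

```latex
% Prediction
\hat{x}_{k}^{-} = F\,\hat{x}_{k-1}, \qquad P_{k}^{-} = F P_{k-1} F^{\top} + Q
% Update
K_{k} = P_{k}^{-} H^{\top}\left(H P_{k}^{-} H^{\top} + R\right)^{-1}, \qquad
\hat{x}_{k} = \hat{x}_{k}^{-} + K_{k}\left(z_{k} - H \hat{x}_{k}^{-}\right), \qquad
P_{k} = \left(I - K_{k} H\right) P_{k}^{-}
```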
Procedia PDF Downloads 461
3751 Estimation of Transition and Emission Probabilities
Authors: Aakansha Gupta, Neha Vadnere, Tapasvi Soni, M. Anbarsi
Abstract:
Protein secondary structure prediction is one of the most important goals pursued by bioinformatics and theoretical chemistry; it is highly important in medicine and biotechnology. Some aspects of protein function and genome analysis can be inferred from secondary structure prediction, which is used to help annotate sequences, classify proteins, identify domains, and recognize functional motifs. In this paper, we represent protein secondary structure as a mathematical model. To extract and predict the protein secondary structure from the primary structure, we require a set of parameters. Any constants appearing in the model are specified by these parameters, which also provide a mechanism for efficient and accurate use of data. To estimate these model parameters there are many algorithms, of which the most popular is the EM (Expectation-Maximization) algorithm. These model parameters are estimated with the use of protein datasets like RS126 by using a Bayesian probabilistic method (the data set being categorical). This paper can then be extended to compare the efficiency of the EM algorithm with that of other algorithms for estimating the model parameters, which will in turn lead to an efficient component for protein secondary structure prediction. Further, this paper provides scope to use these parameters for predicting the secondary structure of proteins using machine learning techniques like neural networks and fuzzy logic. The ultimate objective is to obtain accuracy greater than previously achieved. Keywords: model parameters, expectation maximization algorithm, protein secondary structure prediction, bioinformatics
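For context, the standard EM (Baum-Welch) re-estimation formulas for transition and emission probabilities in a hidden-Markov-style model of secondary structure (textbook form; γ_t(i) and ξ_t(i,j) are the usual state and state-pair posteriors from the forward-backward step, and these are not formulas quoted from the paper):

```latex
\hat{a}_{ij} = \frac{\sum_{t=1}^{T-1} \xi_t(i,j)}{\sum_{t=1}^{T-1} \gamma_t(i)},
\qquad
\hat{b}_{j}(k) = \frac{\sum_{t=1}^{T} \gamma_t(j)\,\mathbf{1}[o_t = k]}{\sum_{t=1}^{T} \gamma_t(j)}
```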
Procedia PDF Downloads 482
3750 Telecontrolled Service Robots for Increasing the Quality of Life of Elderly and Disabled
Authors: Nayden Chivarov, Denis Chikurtev, Kaloyan Yovchev, Nedko Shivarov
Abstract:
This paper presents methods for improving the efficiency and precision of a service mobile robot. This robot is used for increasing the quality of life of elderly and disabled people. The key concept of the proposed intelligent service mobile robot is its easy adaptability to a wide range of an elderly or disabled person's needs, by performing different tasks that support their care. We developed robot autonomous navigation and computer vision systems in order to recognize different objects and bring them to the people. A web-based user interface was developed to provide easy access and tele-control of the robot from any device through the internet. In this study, algorithms for object recognition and localization are proposed to provide reliable object recognition and accurate positioning. After executing experiments to show the results of the research, we can conclude that these systems and algorithms provide good control of the service mobile robot and that it will be useful in helping elderly and disabled persons. Keywords: service robot, mobile robot, autonomous navigation, computer vision, web user interface, ROS
Procedia PDF Downloads 340