Search results for: expanded invasive weed optimization algorithm (exIWO)

6722 Bionaut™: A Breakthrough Robotic Microdevice to Treat Non-Communicating Hydrocephalus in Both Adult and Pediatric Patients

Authors: Suehyun Cho, Darrell Harrington, Florent Cros, Olin Palmer, John Caputo, Michael Kardosh, Eran Oren, William Loudon, Alex Kiselyov, Michael Shpigelmacher

Abstract:

Bionaut Labs, LLC is developing a minimally invasive robotic microdevice designed to treat non-communicating hydrocephalus in both adult and pediatric patients. The device utilizes biocompatible microsurgical particles (Bionaut™) that are specifically designed to safely and reliably perform accurate fenestration(s) in the 3rd ventricle, aqueduct of Sylvius, and/or trapped intraventricular cysts of the brain in order to re-establish normal cerebrospinal fluid flow dynamics and thereby balance and/or normalize intra/intercompartmental pressure. The Bionaut™ is navigated to the target via CSF or brain tissue in a minimally invasive fashion with precise control using real-time imaging. Upon reaching the pre-defined anatomical target, the external driver allows for directing the specific microsurgical action defined to achieve the surgical goal. Notable features of the proposed protocol are: i) Bionaut™ access to the intraventricular target follows a clinically validated endoscopy trajectory which may not be feasible via ‘traditional’ rigid endoscopy; ii) the treatment is microsurgical, and no foreign materials are left behind post-procedure; iii) the Bionaut™ is an untethered device that is navigated through the subarachnoid and intraventricular compartments of the brain, following pre-designated non-linear trajectories as determined by the safest anatomical and physiological path; iv) the overall protocol involves minimally invasive delivery and post-operational retrieval of the surgical Bionaut™. The approach is expected to be suitable for treating pediatric patients 0-12 months old as well as adult patients with obstructive hydrocephalus who fail traditional shunts or are eligible for endoscopy. Current progress, including platform optimization, Bionaut™ control, real-time imaging, and in vivo safety studies of the Bionauts™ in large animals, specifically the spine and the brain of ovine models, will be discussed.

Keywords: Bionaut™, cerebrospinal fluid, CSF, fenestration, hydrocephalus, micro-robot, microsurgery

Procedia PDF Downloads 172
6721 Leveraging Deep Q Networks in Portfolio Optimization

Authors: Peng Liu

Abstract:

Deep Q networks (DQNs) represent a significant advancement in reinforcement learning, utilizing neural networks to approximate the optimal Q-value for guiding sequential decision processes. This paper presents a comprehensive introduction to reinforcement learning principles, delves into the mechanics of DQNs, and explores their application in portfolio optimization. By evaluating the performance of DQNs against traditional benchmark portfolios, we demonstrate their potential to enhance investment strategies. Our results underscore the advantages of DQNs in dynamically adjusting asset allocations, offering a robust portfolio management framework.
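As a rough illustration of the mechanics described above, the sketch below implements the Q-learning target at the heart of a DQN, with a simple linear Q-function approximator standing in for the deep network and a toy three-action allocation choice driven by synthetic returns; the feature dimension, learning rate, and exploration schedule are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

# Minimal sketch of the DQN-style temporal-difference update, using a linear
# Q-function approximator in place of a deep network. States are (hypothetical)
# market feature vectors; actions are discrete portfolio weight choices.

rng = np.random.default_rng(0)
n_features, actions = 4, [0.0, 0.5, 1.0]    # fraction allocated to the risky asset
W = np.zeros((len(actions), n_features))    # one linear Q-head per action

def q_values(state):
    return W @ state                        # Q(s, a) for every action

def td_update(state, a_idx, reward, next_state, gamma=0.99, lr=0.01):
    """One Q-learning step: move Q(s,a) toward r + gamma * max_a' Q(s',a')."""
    target = reward + gamma * np.max(q_values(next_state))
    td_error = target - q_values(state)[a_idx]
    W[a_idx] += lr * td_error * state       # gradient step for the linear head
    return td_error

# Toy interaction loop with synthetic returns (illustration only, not market data).
state = rng.normal(size=n_features)
for t in range(1000):
    eps = max(0.05, 1.0 - t / 500)          # epsilon-greedy exploration schedule
    a_idx = rng.integers(len(actions)) if rng.random() < eps else int(np.argmax(q_values(state)))
    risky_return = rng.normal(0.001, 0.02)  # synthetic one-period asset return
    reward = actions[a_idx] * risky_return  # portfolio return proxy used as reward
    next_state = rng.normal(size=n_features)
    td_update(state, a_idx, reward, next_state)
    state = next_state

print("learned Q-weights per action:\n", W)
```

In a full DQN, the linear heads would be replaced by a neural network trained from an experience replay buffer against a periodically updated target network.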

Keywords: deep reinforcement learning, deep Q networks, portfolio optimization, multi-period optimization

Procedia PDF Downloads 35
6720 Cloud Monitoring and Performance Optimization Ensuring High Availability and Security

Authors: Inayat Ur Rehman, Georgia Sakellari

Abstract:

Cloud computing has evolved into a vital technology for businesses, offering scalability, flexibility, and cost-effectiveness. However, maintaining high availability and optimal performance in the cloud is crucial for reliable services. This paper explores the significance of cloud monitoring and performance optimization in sustaining the high availability of cloud-based systems. It discusses diverse monitoring tools, techniques, and best practices for continually assessing the health and performance of cloud resources. The paper also delves into performance optimization strategies, including resource allocation, load balancing, and auto-scaling, to ensure efficient resource utilization and responsiveness. Addressing potential challenges in cloud monitoring and optimization, the paper offers insights into data security and privacy considerations. Through this thorough analysis, the paper aims to underscore the importance of cloud monitoring and performance optimization for ensuring a seamless and highly available cloud computing environment.

Keywords: cloud computing, cloud monitoring, performance optimization, high availability

Procedia PDF Downloads 66
6719 Updating Stochastic Hosting Capacity Algorithm for Voltage Optimization Programs and Interconnect Standards

Authors: Nicholas Burica, Nina Selak

Abstract:

The ADHCAT (Automated Distribution Hosting Capacity Assessment Tool) was designed to run hosting capacity analysis on the ComEd system via stochastic DER (Distributed Energy Resource) placement on multiple power flow simulations against a set of violation criteria. The violation criteria in the initial version of the tool captured a limited number of issues that individual departments design against for DER interconnections. Enhancements were made to the tool to further align with individual department violation and operation criteria, as well as to add new modules for use in future load profile analysis. A reporting engine was created for future analytical use based on the simulations and observations in the tool.
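For readers unfamiliar with stochastic hosting capacity analysis, the sketch below shows the general Monte Carlo pattern: place DER randomly on a simplified feeder, grow its size until a violation criterion trips, and aggregate the results over many trials. The toy feeder model, the linearized voltage-rise proxy, and the limits are assumptions for illustration only and do not reflect the ADHCAT implementation.

```python
import numpy as np

# Sketch of a stochastic hosting-capacity sweep: randomly place DER of increasing
# size on a toy radial feeder and record the level at which violation criteria
# (here, just an over-voltage and a thermal limit) are first hit. The feeder model,
# limits, and linearized voltage-rise formula are illustrative assumptions only.

rng = np.random.default_rng(1)
n_buses = 20
r_ohm = np.cumsum(np.full(n_buses, 0.05))        # cumulative resistance to each bus
v_limit, thermal_limit_kw = 1.05, 5000.0         # hypothetical violation criteria
v_base, s_base_kw = 1.0, 10000.0

def violates(der_bus, der_kw):
    """Very rough linearized voltage rise + feeder loading check."""
    dv = der_kw / s_base_kw * r_ohm[der_bus]     # per-unit voltage rise proxy
    return (v_base + dv > v_limit) or (der_kw > thermal_limit_kw)

hosting_capacities = []
for trial in range(500):                         # stochastic DER placements
    bus = rng.integers(n_buses)
    kw = 0.0
    while kw < 10000.0 and not violates(bus, kw + 100.0):
        kw += 100.0                              # grow DER until a criterion trips
    hosting_capacities.append(kw)

print(f"median hosting capacity: {np.median(hosting_capacities):.0f} kW")
print(f"5th percentile:          {np.percentile(hosting_capacities, 5):.0f} kW")
```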

Keywords: distributed energy resources, hosting capacity, interconnect, voltage optimization

Procedia PDF Downloads 193
6718 Implementation of a Multimodal Biometrics Recognition System with Combined Palm Print and Iris Features

Authors: Rabab M. Ramadan, Elaraby A. Elgallad

Abstract:

With extensive application, the performance of unimodal biometric systems has to face a diversity of problems such as signal and background noise, distortion, and environment differences. Therefore, multimodal biometric systems are proposed to solve the above-stated problems. This paper introduces a bimodal biometric recognition system based on the extracted features of the human palm print and iris. Palm print biometrics is a fairly new, evolving technology that is used to identify people by their palm features. The iris, together with the face and fingerprints, is a strong candidate for inclusion in multimodal recognition systems. In this research, we introduce an algorithm that combines the palm- and iris-extracted features using a texture-based descriptor, the Scale Invariant Feature Transform (SIFT). Since the feature sets are non-homogeneous, as features of different biometric modalities are used, these features are concatenated to form a single feature vector. Particle swarm optimization (PSO) is used as a feature selection technique to reduce the dimensionality of the feature vector. The proposed algorithm will be applied to the Indian Institute of Technology Delhi (IITD) database and its performance will be compared with various iris recognition algorithms found in the literature.
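The feature-level fusion step described above can be pictured with the minimal sketch below: two non-homogeneous descriptor sets (stand-ins for the SIFT-derived palm and iris features) are normalised per modality and concatenated into a single vector before matching. The synthetic data and the 1-NN check are assumptions, and the PSO pruning stage is only noted, not implemented.

```python
import numpy as np

# Sketch of feature-level fusion for a bimodal system: palm-print and iris
# descriptors (synthetic stand-ins for SIFT-based features) are z-score normalised
# per modality, since the two feature sets are non-homogeneous, and concatenated
# into a single vector that a selector such as PSO would subsequently prune.

rng = np.random.default_rng(2)
n_subjects, n_samples = 6, 20
palm = rng.normal(0, 5.0, size=(n_subjects * n_samples, 64))    # e.g. pooled palm stats
iris = rng.normal(0, 0.2, size=(n_subjects * n_samples, 32))    # different scale/dimension
labels = np.repeat(np.arange(n_subjects), n_samples)
palm[np.arange(len(labels)), labels] += 8.0                     # make palm mildly informative
iris[np.arange(len(labels)), labels] += 0.5                     # and iris mildly informative

def zscore(F):
    return (F - F.mean(axis=0)) / (F.std(axis=0) + 1e-9)

fused = np.hstack([zscore(palm), zscore(iris)])                 # single 96-D feature vector

# Simple 1-NN identification check on a held-out half (no PSO pruning here).
idx = rng.permutation(len(labels))
tr, te = idx[: len(idx) // 2], idx[len(idx) // 2:]
d2 = ((fused[te][:, None, :] - fused[tr][None, :, :]) ** 2).sum(-1)
acc = (labels[tr][d2.argmin(1)] == labels[te]).mean()
print(f"fused-feature 1-NN identification accuracy: {acc:.2f}")
```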

Keywords: iris recognition, particle swarm optimization, feature extraction, feature selection, palm print, the Scale Invariant Feature Transform (SIFT)

Procedia PDF Downloads 235
6717 Phasor Measurement Unit Based on Particle Filtering

Authors: Rithvik Reddy Adapa, Xin Wang

Abstract:

Phasor Measurement Units (PMUs) are very sophisticated measuring devices that find the amplitude, phase, and frequency of various voltages and currents in a power system. The particle filter is a state estimation technique that uses Bayesian inference. Particle filters are widely used in pose estimation and indoor navigation and are very reliable. This paper studies and compares four different particle filters as PMUs, namely the generic particle filter (GPF), the genetic algorithm particle filter (GAPF), the particle swarm optimization particle filter (PSOPF), and the adaptive particle filter (APF). Two different test signals are used to test the performance of the filters in terms of responsiveness and correctness of the estimates.
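A minimal bootstrap particle filter, corresponding to the "generic" variant, is sketched below for a noisy sinusoidal test signal whose amplitude, phase, and frequency are to be estimated; the random-walk dynamics, noise levels, and particle count are illustrative assumptions rather than the configurations compared in the paper.

```python
import numpy as np

# Minimal bootstrap ("generic") particle filter estimating the amplitude, phase and
# frequency of a noisy sinusoidal test signal, as a stand-in for a PMU signal model.
# State = [amplitude, phase_offset, frequency]; dynamics are a simple random walk.

rng = np.random.default_rng(3)
fs, T = 1000.0, 1.0                                   # sample rate (Hz), duration (s)
t = np.arange(0, T, 1 / fs)
true_A, true_phi, true_f = 1.0, 0.4, 50.0             # hypothetical "grid" signal
meas = true_A * np.cos(2 * np.pi * true_f * t + true_phi) + rng.normal(0, 0.1, t.size)

Np = 2000
particles = np.column_stack([rng.uniform(0.5, 1.5, Np),      # amplitude
                             rng.uniform(-np.pi, np.pi, Np), # phase
                             rng.uniform(45.0, 55.0, Np)])   # frequency (Hz)
weights = np.full(Np, 1.0 / Np)
sigma_meas, walk = 0.1, np.array([0.01, 0.02, 0.05])

for k, z in enumerate(meas):
    particles += rng.normal(0, walk, size=particles.shape)   # predict (random walk)
    pred = particles[:, 0] * np.cos(2 * np.pi * particles[:, 2] * t[k] + particles[:, 1])
    weights *= np.exp(-0.5 * ((z - pred) / sigma_meas) ** 2)  # Gaussian likelihood
    weights += 1e-300
    weights /= weights.sum()
    if 1.0 / np.sum(weights ** 2) < Np / 2:                   # resample if degenerate
        idx = rng.choice(Np, size=Np, p=weights)
        particles, weights = particles[idx], np.full(Np, 1.0 / Np)

est = weights @ particles
print(f"estimated A={est[0]:.3f}, phi={est[1]:.3f} rad, f={est[2]:.2f} Hz")
```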

Keywords: phasor measurement unit, particle filter, genetic algorithm, particle swarm optimisation, state estimation

Procedia PDF Downloads 12
6716 A Comparative Study of k-NN and MLP-NN Classifiers Using GA-kNN Based Feature Selection Method for Wood Recognition System

Authors: Uswah Khairuddin, Rubiyah Yusof, Nenny Ruthfalydia Rosli

Abstract:

This paper presents a comparative study between the k-Nearest Neighbour (k-NN) and Multi-Layer Perceptron Neural Network (MLP-NN) classifiers using a Genetic Algorithm (GA) as the feature selector for a wood recognition system. The features have been extracted from the images using the Grey Level Co-Occurrence Matrix (GLCM). GA-based feature selection is used mainly to ensure that the database used for training the wood species pattern classifier consists of only optimized features. The feature selection process is aimed at selecting only the most discriminating features of the wood species in order to reduce confusion for the pattern classifier. This feature selection approach keeps the ‘good’ features that minimize the intra-class distance and maximize the inter-class distance. A wrapper GA is used with the k-NN classifier as the fitness evaluator (GA-kNN). The results show that k-NN is the best choice of classifier because it uses a very simple distance calculation algorithm and classification tasks can be done in a short time with good classification accuracy.
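The wrapper idea can be made concrete with the short sketch below: chromosomes are binary feature masks, and the fitness of each mask is the accuracy of a 1-NN classifier trained on the selected columns. The synthetic data standing in for GLCM features, the population size, and the rates are assumptions for illustration.

```python
import numpy as np

# Minimal wrapper-GA sketch: chromosomes are binary feature masks, fitness is the
# accuracy of a 1-NN classifier trained on the selected features. The "GLCM" data
# here are synthetic stand-ins; population size, rates, etc. are illustrative.

rng = np.random.default_rng(4)
n, d, n_species = 150, 24, 5
X = rng.normal(size=(n, d))
y = rng.integers(0, n_species, size=n)
X[:, :4] += y[:, None] * 0.8                     # only the first 4 features are useful
tr, te = np.arange(0, 100), np.arange(100, n)

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    A, B = X[np.ix_(tr, np.where(mask)[0])], X[np.ix_(te, np.where(mask)[0])]
    d2 = ((B[:, None, :] - A[None, :, :]) ** 2).sum(-1)
    return float((y[tr][d2.argmin(1)] == y[te]).mean())

pop = rng.random((30, d)) < 0.5                  # initial random masks
for gen in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:10]] # truncation selection
    children = []
    while len(children) < len(pop):
        p1, p2 = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, d)                 # one-point crossover
        child = np.concatenate([p1[:cut], p2[cut:]])
        child ^= rng.random(d) < 0.02            # bit-flip mutation
        children.append(child)
    pop = np.array(children)
    pop[0] = parents[0]                          # elitism: keep the best mask

best = pop[0]
print("best accuracy:", fitness(best), "features kept:", int(best.sum()))
```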

Keywords: feature selection, genetic algorithm, optimization, wood recognition system

Procedia PDF Downloads 545
6715 Multi-Subpopulation Genetic Algorithm with Estimation of Distribution Algorithm for Textile Batch Dyeing Scheduling Problem

Authors: Nhat-To Huynh, Chen-Fu Chien

Abstract:

The textile batch dyeing scheduling problem is complicated: it includes batch formation, batch assignment to machines, and batch sequencing with sequence-dependent setup times. Most manufacturers schedule their orders manually, which is time-consuming and inefficient, so more powerful methods are needed to improve the solution. Motivated by these real needs, this study proposes approaches in which a genetic algorithm is developed with multiple subpopulations and hybridised with an estimation of distribution algorithm to solve the constructed problem and minimise the makespan. A heuristic algorithm is designed and embedded into the proposed algorithms to improve their ability to escape local optima. In addition, an empirical study is conducted in a textile company in Taiwan to validate the proposed approaches. The results show that the proposed approaches are more efficient than a simulated annealing algorithm.

Keywords: estimation of distribution algorithm, genetic algorithm, multi-subpopulation, scheduling, textile dyeing

Procedia PDF Downloads 300
6714 Multi-Objective Optimization of a Solar-Powered Triple-Effect Absorption Chiller for Air-Conditioning Applications

Authors: Ali Shirazi, Robert A. Taylor, Stephen D. White, Graham L. Morrison

Abstract:

In this paper, a detailed simulation model of a solar-powered triple-effect LiBr–H2O absorption chiller is developed to supply both the cooling and heating demand of a large-scale building, aiming to reduce fossil fuel consumption and greenhouse gas emissions in the building sector. TRNSYS 17 is used to simulate the performance of the system over a typical year. A combined energetic-economic-environmental analysis is conducted to determine the system's annual primary energy consumption and total cost, which are considered as two conflicting objectives. A multi-objective optimization of the system is performed using a genetic algorithm to minimize these objectives simultaneously. The optimization results show that the final optimal design of the proposed plant has a solar fraction of 72% and leads to an annual primary energy saving of 0.69 GWh and an annual CO2 emissions reduction of ~166 tonnes, as compared to a conventional HVAC system. The economics of this design, however, are not appealing without public funding, which is often the case for many renewable energy systems. The results show that a good funding policy is required in order for these technologies to achieve satisfactory payback periods within the lifetime of the plant.
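Since the two objectives conflict, the optimizer's output is a set of trade-off designs rather than a single optimum; the sketch below shows the non-dominated (Pareto) filtering step on synthetic candidate designs scored by annual primary energy and total cost. The toy cost and energy models are assumptions, not the TRNSYS results.

```python
import numpy as np

# Sketch of the Pareto-filtering step in a two-objective design search: each
# candidate plant design is scored by (annual primary energy, total cost), both to
# be minimized, and only non-dominated designs are kept. Candidates are synthetic.

rng = np.random.default_rng(5)
solar_fraction = rng.uniform(0.2, 0.9, size=200)          # hypothetical design variable
energy_gwh = 1.5 - 1.0 * solar_fraction + rng.normal(0, 0.05, 200)   # toy models
cost_musd = 1.0 + 2.5 * solar_fraction ** 2 + rng.normal(0, 0.05, 200)
objectives = np.column_stack([energy_gwh, cost_musd])

def pareto_front(points):
    """Indices of non-dominated points when every objective is minimized."""
    keep = []
    for i, p in enumerate(points):
        dominated = np.any(np.all(points <= p, axis=1) & np.any(points < p, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

front = pareto_front(objectives)
print(f"{front.size} non-dominated designs out of {len(objectives)}")
for i in front[np.argsort(objectives[front, 0])][:5]:
    print(f"  solar fraction {solar_fraction[i]:.2f}: "
          f"{energy_gwh[i]:.2f} GWh/yr, {cost_musd[i]:.2f} M$")
```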

Keywords: economic, environmental, multi-objective optimization, solar air-conditioning, triple-effect absorption chiller

Procedia PDF Downloads 240
6713 Nanomaterials Based Biosensing Chip for Non-Invasive Detection of Oral Cancer

Authors: Suveen Kumar

Abstract:

Oral cancer (OC) is the sixth deadliest cancer in the world; it includes tumours of the lips, floor of the mouth, tongue, palate, cheeks, sinuses, throat, etc. Conventionally, the techniques used for OC detection are toluidine blue staining, biopsy, liquid-based cytology, visual attachments, etc.; however, these are limited by their highly invasive nature, low sensitivity, time consumption, sophisticated instrument handling, sample processing, and high cost. Therefore, we developed biosensing chips for non-invasive detection of OC via the CYFRA-21-1 biomarker. CYFRA-21-1 (molecular weight: 40 kDa) is secreted in the saliva of OC patients, which is a non-invasive biological fluid, with a cut-off value of 3.8 ng mL-1, above which subjects are considered to be suffering from oral cancer. Therefore, in the first work, 3-aminopropyl triethoxy silane (APTES) functionalized zirconia (ZrO2) nanoparticles (APTES/nZrO2) were used to successfully detect CYFRA-21-1 in a linear detection range (LDR) of 2-16 ng mL-1 with a sensitivity of 2.2 µA mL ng-1. Successively, APTES/nZrO2-RGO was employed to prevent agglomeration of ZrO2 by providing a high-surface-area reduced graphene oxide (RGO) support, and a much wider LDR (2-22 ng mL-1) was obtained with a remarkable limit of detection (LOD) of 0.12 ng mL-1. Further, an APTES/nY2O3/ITO platform was used for oral cancer biosensor development. The developed biosensor (BSA/anti-CYFRA-21-1/APTES/nY2O3/ITO) has a wider LDR (0.01-50 ng mL-1) with a remarkable limit of detection (LOD) of 0.01 ng mL-1. To improve the sensitivity of the biosensing platform, a nanocomposite of yttria-stabilized nanostructured zirconia-reduced graphene oxide (nYZR) based biosensor has been developed. The developed biosensing chip is able to detect CYFRA-21-1 biomolecules in the range of 0.01-50 ng mL-1, with an LOD of 7.2 pg mL-1 and a sensitivity of 200 µA mL ng-1. Further, the applicability of the fabricated biosensing chips was also checked through real sample (saliva) analysis of OC patients, and the obtained results showed good correlation with the standard enzyme-linked immunosorbent assay (ELISA) protein detection technique.

Keywords: non-invasive, oral cancer, nanomaterials, biosensor, biochip

Procedia PDF Downloads 129
6712 Kinematic Hardening Parameters Identification with Respect to Objective Function

Authors: Marina Franulovic, Robert Basan, Bozidar Krizan

Abstract:

Constitutive modelling of material behaviour is becoming increasingly important in the prediction of possible failures in highly loaded engineering components and, consequently, in the optimization of their design. In order to account for the large number of phenomena that occur in the material during operation, such as the kinematic hardening effect in the low cycle fatigue behaviour of steels, complex nonlinear material models are used ever more frequently, despite the complexity of determining their parameters. As a method for the determination of these parameters, the genetic algorithm is a good choice because of its capability to provide a very good approximation of the solution in systems with a large number of unknown variables. For the application of the genetic algorithm to parameter identification, the inverse analysis must first be defined. It is used as a tool to fine-tune calculated stress-strain values against experimental ones. In order to choose the proper objective function for the inverse analysis from among already existing and newly developed functions, research is performed to investigate its influence on material behaviour modelling.
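The role of the objective function can be illustrated with the sketch below, where a simple power-law hardening curve stands in for the full kinematic-hardening model and two candidate objectives (RMS error and maximum absolute error) are compared inside a plain evolutionary search (selection and mutation only); all material values and search settings are illustrative assumptions.

```python
import numpy as np

# Sketch of how the choice of objective function enters inverse identification:
# a simple power-law hardening curve stands in for the full kinematic-hardening
# model, and two candidate objectives (RMS error and maximum absolute error) are
# compared inside a plain evolutionary search loop. All values are illustrative.

rng = np.random.default_rng(6)
strain = np.linspace(0.001, 0.02, 50)
true_K, true_n = 900.0, 0.15
stress_exp = true_K * strain ** true_n + rng.normal(0, 5.0, strain.size)  # "experiment"

def simulate(K, n):
    return K * strain ** n

def obj_rms(params):
    K, n = params
    return float(np.sqrt(np.mean((simulate(K, n) - stress_exp) ** 2)))

def obj_max(params):
    K, n = params
    return float(np.max(np.abs(simulate(K, n) - stress_exp)))

def identify(objective, generations=60, pop_size=40):
    pop = np.column_stack([rng.uniform(500, 1500, pop_size), rng.uniform(0.05, 0.4, pop_size)])
    for _ in range(generations):
        scores = np.array([objective(p) for p in pop])
        parents = pop[np.argsort(scores)[: pop_size // 4]]          # keep the best quarter
        pop = np.repeat(parents, 4, axis=0)
        pop += rng.normal(0, [10.0, 0.01], size=pop.shape)          # Gaussian mutation
        pop[0] = parents[0]                                         # elitism
    return min(pop, key=objective)

for name, obj in [("RMS error", obj_rms), ("max abs error", obj_max)]:
    K, n = identify(obj)
    print(f"{name:>14s}: K = {K:7.1f} (true {true_K}), n = {n:.3f} (true {true_n})")
```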

Keywords: genetic algorithm, kinematic hardening, material model, objective function

Procedia PDF Downloads 335
6711 A Rapid Code Acquisition Scheme in OOC-Based CDMA Systems

Authors: Keunhong Chae, Seokho Yoon

Abstract:

We propose a code acquisition scheme called improved multiple-shift (IMS) for optical code division multiple access systems, where the optical orthogonal code is used instead of the pseudo-noise code. Although the IMS algorithm follows a similar process to that of the conventional MS algorithm, it achieves better code acquisition performance. We analyze the code acquisition performance of the IMS algorithm and compare the code acquisition performances of the MS and IMS algorithms in single-user and multi-user environments.

Keywords: code acquisition, optical CDMA, optical orthogonal code, serial algorithm

Procedia PDF Downloads 540
6710 Metastatic Invasive Lobular Cancer Presenting as a Cervical Polyp

Authors: Sally Shepherd, Craig Murphy

Abstract:

Introduction: The uterus and cervix are unusual metastatic sites for cancers. It is even more unusual for either to be a site of metastasis while the primary malignancy remains occult. Case Report: A 63-year-old female with three months of altered bowel habits underwent a CT scan of the abdomen and pelvis, revealing a bulky uterus and left ovary, nonspecific colonic thickening, and diffuse peritoneal changes. She underwent colposcopy, which revealed a large endocervical polyp that was excised, revealing strongly hormone-positive metastatic invasive lobular breast cancer. She subsequently underwent a PET scan, which showed moderately diffuse activity in the cervix and left adnexa. Breast examination was unremarkable, and screening mammography, ultrasound, and MRI of the breast did not identify any lesions. Her blood tests revealed a CA 15-3 of 934, CA-125 of 220, and CEA of 27. She was commenced on letrozole and ribociclib with an improvement in her symptoms. Conclusion: It is rare for occult breast cancer to be established and diagnosed by pelvic imaging and biopsy. Suspicion of uterine or cervical metastasis should be heightened in patients with an active or past history of breast cancer.

Keywords: occult breast cancer, cervical metastasis, invasive lobular carcinoma, metastasis

Procedia PDF Downloads 125
6709 Early and Mid-Term Results of Anesthetic Management of Minimal Invasive Coronary Artery Bypass Grafting Using One Lung Ventilation

Authors: Devendra Gupta, S. P. Ambesh, P. K Singh

Abstract:

Introduction: Minimally invasive coronary artery bypass grafting (MICABG) is a less invasive method of performing surgical revascularization. Minimally invasive direct coronary artery bypass (MIDCAB) presents many anesthetic challenges, including one lung ventilation (OLV), managing myocardial ischemia, and pain. We present early and mid-term results of the use of this technique with OLV. Method: We enrolled 62 patients for analysis, operated on between 2008 and 2012. Patients were anesthetized and a left endobronchial tube was placed. During the procedure, the left lung was isolated and one lung ventilation was maintained through the right lung. The operation was performed utilizing the off-pump technique of coronary artery bypass grafting through a minimally invasive incision. A left internal mammary artery graft was done for single vessel disease, and the radial artery was utilized for other grafts if required. Postoperative ventilation was done with a single-lumen endotracheal tube. Median follow-up is 2.5 years (6 months to 4 years). Results: Median age was 58.5 years (41-77) and all were male. Single vessel disease was present in 36, double vessel disease in 24, and triple vessel disease in 2 patients. All the patients had normal left ventricular size and function. In 2 cases, difficulty was encountered in placement of the endobronchial tube. In 1 case, the cuff of the endobronchial tube ruptured during intubation. High airway pressure developed on OLV in 1 case, and surgery was accomplished with two lung anesthesia with low tidal volume. Mean postoperative ventilation time was 14.4 hours (11-22). There was no perioperative or 30-day mortality. Conversion to median sternotomy to complete the operation was done in 3.23% (2 out of 62 patients). One patient had an acute myocardial infarction postoperatively, and there were no deaths during follow-up. Conclusion: MICABG is a safe and effective method of revascularization with OLV in low-risk candidates for coronary artery bypass grafting.

Keywords: MIDCABG, one lung ventilation, coronary artery bypass grafting, endobronchial tube

Procedia PDF Downloads 425
6708 Genetic Algorithms for Parameter Identification of DC Motor ARMAX Model and Optimal Control

Authors: A. Mansouri, F. Krim

Abstract:

This paper presents two techniques for DC motor parameter identification. We propose a numerical method using the adaptive extensive recursive least squares (AERLS) algorithm for real-time parameter estimation. This algorithm, based on minimization of a quadratic criterion, is realized in simulation for parameter identification of the DC motor autoregressive moving average with exogenous inputs (ARMAX) model. As an advanced technique, we use genetic algorithm (GA) identification with biased estimation for high dynamic performance speed regulation. DC motors are extensively used in variable speed drives and for robot and solar panel trajectory control. The effectiveness of the GA is derived through comparison of the two approaches.
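The recursive estimation idea behind the least-squares technique can be sketched as below for a discrete-time ARX-style motor model; the "true" coefficients, the forgetting factor, and the excitation are assumptions, and the adaptive/extended refinements of the AERLS algorithm itself are not reproduced.

```python
import numpy as np

# Sketch of recursive least squares (RLS) identification of a discrete-time
# ARX-style DC-motor model y[k] = -a1*y[k-1] - a2*y[k-2] + b1*u[k-1] + b2*u[k-2] + e[k].
# This illustrates the recursive estimation idea only; the adaptive/extended
# refinements of the AERLS algorithm in the paper are not reproduced here.

rng = np.random.default_rng(7)
N = 2000
a1, a2, b1, b2 = -1.5, 0.7, 0.05, 0.04            # hypothetical "true" plant
u = rng.normal(size=N)                            # excitation signal
y = np.zeros(N)
for k in range(2, N):
    y[k] = -a1 * y[k - 1] - a2 * y[k - 2] + b1 * u[k - 1] + b2 * u[k - 2] \
           + rng.normal(0, 0.01)

theta = np.zeros(4)                               # [a1, a2, b1, b2] estimate
P = np.eye(4) * 1000.0                            # large initial covariance
lam = 0.999                                       # forgetting factor

for k in range(2, N):
    phi = np.array([-y[k - 1], -y[k - 2], u[k - 1], u[k - 2]])   # regressor
    err = y[k] - phi @ theta                                     # prediction error
    gain = P @ phi / (lam + phi @ P @ phi)                       # RLS gain
    theta += gain * err
    P = (P - np.outer(gain, phi) @ P) / lam                      # covariance update

print("estimated [a1, a2, b1, b2]:", np.round(theta, 3))
print("true      [a1, a2, b1, b2]:", [a1, a2, b1, b2])
```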

Keywords: ARMAX model, DC motor, AERLS, GA, optimization, parameter identification, PID speed regulation

Procedia PDF Downloads 381
6707 Efficient Heuristic Algorithm to Speed Up Graphcut in Gpu for Image Stitching

Authors: Tai Nguyen, Minh Bui, Huong Ninh, Tu Nguyen, Hai Tran

Abstract:

The GraphCut algorithm has been widely utilized to solve various types of computer vision problems. Its expensive computational cost has encouraged many researchers to improve the speed of the algorithm. Recent works proposed schemes that work on parallel computing platforms such as CUDA. However, the problem of low convergence speed prevents the usage of GraphCut for real-time applications. In this paper, we propose a global suppression heuristic to boost the convergence process of the algorithm. A parallel implementation of the GraphCut algorithm on CUDA designed for the image stitching problem is introduced. Our method achieves up to a 3× speedup on a graph of size 80 × 480 compared to the best sequential GraphCut algorithm while achieving satisfactory stitched images, suitable for panorama applications. Our source code will soon be available for further research.
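To show the underlying formulation that such a parallel implementation accelerates, the sketch below runs a plain CPU max-flow/min-cut (Edmonds-Karp) on a tiny synthetic overlap region, where pixels are nodes, neighbouring pixels are joined by colour-difference edges, and the resulting cut is the stitching seam; this is an illustration of the graph construction only, not the authors' CUDA code or the global suppression heuristic.

```python
import numpy as np
from collections import deque

# CPU sketch of the seam-finding graph cut used in image stitching: pixels in the
# overlap region are graph nodes, neighbouring pixels are joined by edges weighted
# by colour difference, and a max-flow/min-cut between the two image sides picks
# the seam. This tiny Edmonds-Karp version only illustrates the formulation.

rng = np.random.default_rng(8)
H, W = 6, 8                                       # toy overlap region
imgA, imgB = rng.random((H, W)), rng.random((H, W))
diff = np.abs(imgA - imgB)                        # per-pixel disagreement

n = H * W
S, T = n, n + 1                                   # source = image A side, sink = B side
cap = np.zeros((n + 2, n + 2))

def nid(r, c):
    return r * W + c

for r in range(H):
    cap[S, nid(r, 0)] = 1e9                       # left column forced to image A
    cap[nid(r, W - 1), T] = 1e9                   # right column forced to image B
    for c in range(W):
        for dr, dc in ((0, 1), (1, 0)):           # 4-neighbour smoothness edges
            rr, cc = r + dr, c + dc
            if rr < H and cc < W:
                w = diff[r, c] + diff[rr, cc] + 1e-6
                cap[nid(r, c), nid(rr, cc)] = w
                cap[nid(rr, cc), nid(r, c)] = w

def min_cut_source_side(cap, s, t):
    res = cap.copy()
    while True:
        parent = np.full(len(res), -1)            # BFS for an augmenting path
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in np.nonzero(res[u] > 1e-12)[0]:
                if parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            break                                 # no augmenting path: flow is maximal
        bottleneck, v = np.inf, t
        while v != s:
            bottleneck = min(bottleneck, res[parent[v], v])
            v = parent[v]
        v = t
        while v != s:                             # push flow along the path
            res[parent[v], v] -= bottleneck
            res[v, parent[v]] += bottleneck
            v = parent[v]
    return parent != -1                           # nodes still reachable from the source

reach = min_cut_source_side(cap, S, T)
labels = reach[:n].reshape(H, W)                  # True -> take pixel from image A
print(labels.astype(int))
```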

Keywords: CUDA, graph cut, image stitching, texture synthesis, maxflow/mincut algorithm

Procedia PDF Downloads 132
6706 Patterns of Malignant and Benign Breast Lesions in Hail Region: A Retrospective Study at King Khalid Hospital

Authors: Laila Seada, Ashraf Ibrahim, Amjad Al Shammari

Abstract:

Background and Objectives: Breast carcinoma is the most common cancer of females in the Hail region, accounting for 31% of all diagnosed cancer cases, followed by thyroid carcinoma (25%) and colorectal carcinoma (13%). Methods: In the present retrospective study, all cases of breast lesions received at the histopathology department in King Khalid Hospital, Hail, during the period from May 2011 to April 2016 were retrieved from department files. For all cases, a trucut biopsy, lumpectomy, or modified radical mastectomy was available for histopathologic diagnosis, while 105/140 (75%) also had preoperative fine needle aspirates (FNA). Results: 49 out of 140 (35%) breast lesions were carcinomas: 44/49 (89.75%) were invasive ductal carcinomas, 2/49 (4.1%) invasive lobular carcinomas, 1/49 (2.05%) intracystic low grade papillary carcinoma, and 2/49 (4.1%) ductal carcinoma in situ (DCIS). Mean age for malignant cases was 45.06 (+/-10.58) years: 32.6% were below the age of 40, 30.6% below 50 years, 18.3% below 60, and 16.3% below 70 years. For the benign group, mean age was 32.52 (+/-10.5) years. Benign lesions were, in order of frequency: 34 fibroadenomas, 14 fibrocystic disease, 12 chronic mastitis, five granulomatous mastitis, three intraductal papillomas, and three benign phyllodes tumors. Tubular adenoma, lipoma, skin nevus, pilomatrixoma, and breast reduction specimens constituted the remaining specimens. Conclusion: Breast lesions are common in our series, and invasive carcinoma accounts for more than a third of the lumps, with a 63.2% incidence in pre-menopausal women below the age of 50 years. FNA, as a non-invasive procedure, proved to be an effective tool in diagnosing both benign and malignant/suspicious breast lumps and should continue to be used as a first-line assessment of palpable breast masses.

Keywords: age incidence, breast carcinoma, fine needle aspiration, hail region

Procedia PDF Downloads 280
6705 An Audit on the Role of Sentinel Node Biopsy in High-Risk Ductal Carcinoma in Situ and Intracystic Papillary Carcinoma

Authors: M. Sulieman, H. Arabiyat, H. Ali, K. Potiszil, I. Abbas, R. English, P. King, I. Brown, P. Drew

Abstract:

Introduction: The incidence of breast ductal carcinoma in situ (DCIS) has been increasing; it currently represents up to 20-25% of all breast carcinomas. Some aspects of DCIS management are still controversial, mainly due to the heterogeneity of its clinical presentation and of its biological and pathological characteristics. In DCIS, a histological diagnosis obtained preoperatively carries the risk of sampling error if the presence of invasive cancer is subsequently diagnosed. A mammographic extent of more than 4–5 cm and the presence of architectural distortion, focal asymmetric density, or a mass on mammography are proven important risk factors for preoperative histological understaging. Intracystic papillary cancer (IPC) is a rare form of breast carcinoma. Despite being previously compared to DCIS, it has been shown to present histologically with invasion of the basement membrane and even metastasis. SLNB carries a risk of associated comorbidity that should be considered when planning surgery for DCIS and IPC. Objectives: The aim of this audit was to better define a ‘high risk’ group of patients with a pre-operative diagnosis of non-invasive cancer undergoing breast conserving surgery, who would benefit from sentinel node biopsy. Method: Retrospective data collection of all patients with ductal carcinoma in situ over 5 years: 636 patients were identified and, after exclusion criteria were applied, 394 patients were included. High risk was defined as extensive micro-calcification >40 mm or any mass-forming DCIS. IPC: a Winpath search for the term ‘papillary carcinoma’ in any breast specimen over a 5-year duration; 29 patients were included in this group. Results: DCIS: 188 were deemed high risk due to >40 mm calcification or a mass forming (radiological or palpable); 61% of those had a mastectomy and 32% BCS. Overall, in that high-risk group, the proportion with invasive disease was 38%. Of those high-risk DCIS patients, 85% had a SLNB - 80% at the time of surgery and 5% at a second operation. For the BCS patients, 42% had SLNB at the time of surgery and 13% (8 patients) at a second operation. 15 patients (7.9%) in the high-risk group had a positive SLNB, 11 having a mastectomy and 4 having BCS. IPC: The provisional diagnosis of encysted papillary carcinoma was upgraded to an invasive carcinoma on final histology in around a third of cases. This may have implications when deciding whether to offer sentinel node removal at the time of therapeutic surgery. Conclusions: We have defined a ‘high risk’ group of patients with a pre-operative diagnosis of non-invasive cancer undergoing BCS, who would benefit from SLNB at the time of surgery. In patients with high-risk features, the risk of invasive disease is up to 40%, but the risk of nodal involvement is approximately 8%. The risk of morbidity from SLNB is up to about 5%, especially the risk of lymphedema.

Keywords: breast ductal carcinoma in Situ (DCIS), intracystic papillary carcinoma (IPC), sentinel node biopsy (SLNB), high-risk, non-invasive, cancer disease

Procedia PDF Downloads 111
6704 Development of Light-Weight Fibre-Based Materials for Building Envelopes

Authors: René Čechmánek, Vladan Prachař, Ludvík Lederer, Jiří Loskot

Abstract:

Thin-walled elements with a matrix based on high-value Portland cement with dispersed reinforcement from alkali-resistant glass fibres are used in a range of applications, such as claddings of buildings and infrastructure constructions, as well as various architectural elements of residential buildings. Even though their elementary thickness and therefore total weight are quite low, architects and building companies demand further decreases in the bulk density of these fibre-cement elements, in order to reduce the loading of connected superstructures and to ease assembly in demanding conditions. By means of various kinds of lightweight aggregates, it is possible to reduce the weight of thin-walled fibre-cement composite elements. From the range of possible fillers with different material properties, granulated expanded glass worked best. The effect of two fillers based on expanded glass on the fibre-reinforced cement composite was verified by means of laboratory testing. Practical applicability was tested in the production of commonly manufactured glass fibre reinforced concrete elements, such as channels for electrical cable deposition, products for urban equipment, and especially various cladding elements. Even though these are not structural elements, it is necessary to also evaluate strength characteristics and resistance to the environment for their durability in certain applications.

Keywords: fibre-cement composite, granulated expanded glass, light-weighing

Procedia PDF Downloads 292
6703 Reinforcement Learning Optimization: Unraveling Trends and Advancements in Metaheuristic Algorithms

Authors: Rahul Paul, Kedar Nath Das

Abstract:

The field of machine learning (ML) is experiencing rapid development, resulting in a multitude of theoretical advancements and extensive practical implementations across various disciplines. The objective of ML is to facilitate the ability of machines to perform cognitive tasks by leveraging knowledge gained from prior experiences and effectively addressing complex problems, even in situations that deviate from previously encountered instances. Reinforcement Learning (RL) has emerged as a prominent subfield within ML and has gained considerable attention in recent times from researchers. This surge in interest can be attributed to the practical applications of RL, the increasing availability of data, and the rapid advancements in computing power. At the same time, optimization algorithms play a pivotal role in the field of ML and have attracted considerable interest from researchers. A multitude of proposals have been put forth to address optimization problems or improve optimization techniques within the domain of ML. A thorough examination and implementation of optimization algorithms within the context of ML is therefore essential in order to provide guidance for the advancement of research in both optimization and ML. This article provides a comprehensive overview of the application of metaheuristic evolutionary optimization algorithms in conjunction with RL to address a diverse range of scientific challenges. Furthermore, this article delves into the various challenges and unresolved issues pertaining to the optimization of RL models.

Keywords: machine learning, reinforcement learning, loss function, evolutionary optimization techniques

Procedia PDF Downloads 76
6702 Travel Planning in Public Transport Networks Applying the Algorithm A* for Metropolitan District of Quito

Authors: M. Fernanda Salgado, Alfonso Tierra, Wilbert Aguilar

Abstract:

The present project consists of applying the informed search algorithm A star (A*) to solve traveller routing problems on urban public transportation routes. Digitization of the information allowed us to identify 26% of the total routes registered within the Metropolitan District of Quito. To validate this information, travel times were measured in the field and compared with the times estimated by the program; the difference between them was not greater than 2:20 minutes. We validated the A* algorithm against the Dijkstra algorithm, comparing node vectors based on the public transport stops; the validation was established through a Student's t-test. We then verified that the times estimated by the program using the A* algorithm are similar to those registered in the field. Furthermore, we reviewed the performance of both algorithms by generating iterations in each. Finally, with these iterations, a hypothesis test was carried out again with a Student's t-test, which concluded that the iterations of the baseline Dijkstra algorithm are greater than those generated by the A* algorithm.
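A compact version of the informed search is sketched below on a toy stop graph: edge weights are travel times in minutes and the heuristic is straight-line distance divided by an assumed maximum speed, which keeps it admissible. The stops, coordinates, times, and speed bound are hypothetical and are not taken from the Quito network data.

```python
import heapq
import math

# Minimal A* sketch over a toy public-transport stop graph. Edge weights are travel
# times in minutes; the heuristic is straight-line distance divided by an assumed
# maximum bus speed, which keeps it optimistic (admissible).

coords = {          # stop -> (x, y) position in km on a local grid
    "A": (0.0, 0.0), "B": (1.2, 0.3), "C": (2.0, 1.0),
    "D": (1.0, 2.0), "E": (3.0, 2.2),
}
edges = {           # stop -> list of (neighbour, travel time in minutes)
    "A": [("B", 4), ("D", 9)],
    "B": [("C", 5), ("D", 6)],
    "C": [("E", 6)],
    "D": [("C", 3), ("E", 10)],
    "E": [],
}
MAX_SPEED_KM_MIN = 0.5          # assumed 30 km/h upper bound for the heuristic

def heuristic(u, goal):
    (x1, y1), (x2, y2) = coords[u], coords[goal]
    return math.hypot(x1 - x2, y1 - y2) / MAX_SPEED_KM_MIN   # optimistic minutes

def a_star(start, goal):
    open_heap = [(heuristic(start, goal), 0.0, start, [start])]
    best_g = {start: 0.0}
    while open_heap:
        f, g, u, path = heapq.heappop(open_heap)
        if u == goal:
            return g, path
        for v, w in edges[u]:
            ng = g + w
            if ng < best_g.get(v, math.inf):
                best_g[v] = ng
                heapq.heappush(open_heap, (ng + heuristic(v, goal), ng, v, path + [v]))
    return math.inf, []

time_min, route = a_star("A", "E")
print(f"fastest route {' -> '.join(route)} in {time_min:.0f} min")
```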

Keywords: algorithm A*, graph, mobility, public transport, travel planning, routes

Procedia PDF Downloads 241
6701 Comparative Study Between Oral and Intralesional Injection of Beta Blocker in the Treatment of Infantile Capillary Hemangioma

Authors: Nadeen Eltoukhy, Sahar S. Sheta, Walaa Elnaggar, Karim Bakr

Abstract:

Purpose: The aim of this study is to compare the clinical efficacy and side effects of oral versus intralesional propranolol treatment of infantile capillary hemangiomas. Methods: The study enrolled 40 infants diagnosed with infantile capillary hemangiomas. Patients were divided into 2 groups: Group A (non-invasive group) included 20 infants who received oral propranolol hydrochloride starting at a dose of 1 mg/kg/day BID, then increased gradually over 2 weeks to a maximum of 2 mg/kg/day BID for 3 months. Group B (invasive group) included 20 infants who received intralesional propranolol injection at a dose of 1 mg/mL; the volume of the injected drug depended on the size of the lesion (0.2 mL injected per cm of lesion diameter), with a maximum volume of 1 mL for a lesion of 5 cm diameter, under complete aseptic conditions in the operating theater. Results: At three months after initiating treatment, the circumferential size of the hemangioma showed a statistically significant decrease in both groups: in Group A from 3.66±2.89 cm to 1.56±1.26 cm (p-value <0.05) and in Group B from 2.99±2.73 cm to 1.32±1.18 cm (p-value <0.05). There was no statistically significant difference between the two groups (p-value = 0.538). Regarding the complications of oral propranolol, one patient (5%) had bradycardia and one patient (5%) had diarrhea. In the injection group, 20 patients (100%) had local edema and one patient (5%) had a local infection. Conclusions: Both oral non-invasive and intralesional invasive propranolol can be used safely and successfully to treat and decrease the size of infantile hemangioma, with no statistically significant difference between the two treatment techniques.

Keywords: hemangioma, oral beta blocker, intralesional beta blocker, infants

Procedia PDF Downloads 52
6700 Impact of Enzyme-Treated Bran on the Physical and Functional Properties of Extruded Sorghum Snacks

Authors: Charles Kwasi Antwi, Mohammad Naushad Emmambux, Natalia Rosa-Sibakov

Abstract:

The consumption of high-fibre snacks is beneficial in reducing the prevalence of most non-communicable diseases and improving human health. However, using high-fibre flour to produce snacks by extrusion cooking reduces the expansion ratio of the snacks, thereby decreasing the sensory properties and consumer acceptability of the snack. This study determines the effects of adding Viscozyme®-treated sorghum bran on the properties of extruded sorghum snacks, with the aim of producing high-fibre expanded snacks of acceptable quality. With a twin-screw extruder, sorghum endosperm flour (obtained by decortication) with and without sorghum bran, and with enzyme-treated sorghum bran, was extruded at high shear rates with a feed moisture of 20%, a feed rate of 10 kg/hr, a screw speed of 500 rpm, and temperature zones of 60°C, 70°C, 80°C, 140°C, and 140°C toward the die. The expanded snacks that resulted from this process were analysed in terms of their physical (expansion ratio, bulk density, colour profile), chemical (soluble and insoluble dietary fibre), and functional (water solubility index (WSI) and water absorption index (WAI)) characteristics. The expanded snacks produced from refined sorghum flour enriched with Viscozyme-treated bran had expansion ratios similar to those of refined sorghum flour extrudates, which were higher than those for untreated bran-sorghum extrudates. Sorghum extrudates without bran showed higher expansion ratios and lower bulk densities compared to the untreated bran extrudates. The enzyme-treated fibre increased the expansion ratio significantly, with low bulk density values compared to untreated bran. Compared to untreated bran extrudates, WSI values in enzyme-treated samples increased, while WAI values decreased. Enzyme treatment of the bran reduced particle size and increased soluble dietary fibre, thereby increasing expansion. The lower particle size suggests less interference with bubble formation at the die. Viscozyme-treated bran-sorghum composite flour could be used as a raw material to produce high-fibre expanded snacks with improved physicochemical and functional properties.

Keywords: extrusion, sorghum bran, decortication, expanded snacks

Procedia PDF Downloads 93
6699 Review on Optimization of Drinking Water Treatment Process

Authors: M. Farhaoui, M. Derraz

Abstract:

In drinking water treatment processes, optimization of the treatment is an issue of particular concern. In general, the process consists of many units, such as settling, coagulation, flocculation, sedimentation, filtration, and disinfection. Optimization of the process consists of measures to decrease management and monitoring expenses and improve the quality of the produced water. The objective of this study is to provide water treatment operators with methods and practices that enable the most effective use of the facility and, in consequence, optimize the price per cubic meter of the treated water. This paper proposes a review of the optimization of the drinking water treatment process by analyzing all of the water treatment units and gives some solutions in order to maximize water treatment performance without compromising water quality standards. Some solutions and methods are applied in the water treatment plant located in the middle of Morocco (Meknes).

Keywords: coagulation process, optimization, turbidity removal, water treatment

Procedia PDF Downloads 423
6698 Design of an Augmented Automatic Choosing Control with Constrained Input by Lyapunov Functions Using Gradient Optimization Automatic Choosing Functions

Authors: Toshinori Nawata

Abstract:

In this paper, a nonlinear feedback control called augmented automatic choosing control (AACC) for a class of nonlinear systems with constrained input is presented. When designing the control, a constant term which arises from linearization of a given nonlinear system is treated as a coefficient of a stable zero dynamics. Parameters of the control are suboptimally selected by maximizing the stable region in the sense of Lyapunov with the aid of a genetic algorithm. This approach is applied to a field excitation control problem of a power system to demonstrate the effectiveness of the AACC. Simulation results show that the new controller can improve performance remarkably well.

Keywords: augmented automatic choosing control, nonlinear control, genetic algorithm, zero dynamics

Procedia PDF Downloads 479
6697 Probabilistic Approach of Dealing with Uncertainties in Distributed Constraint Optimization Problems and Situation Awareness for Multi-agent Systems

Authors: Sagir M. Yusuf, Chris Baber

Abstract:

In this paper, we describe how Bayesian inferential reasoning contributes to obtaining well-satisfied predictions for Distributed Constraint Optimization Problems (DCOPs) with uncertainties. We also demonstrate how DCOPs can be merged with multi-agent knowledge understanding and prediction (i.e., situation awareness). The DCOP functions were merged with a Bayesian Belief Network (BBN) in the form of situation, awareness, and utility nodes. We describe how the uncertainties can be represented in the BBN and how an effective prediction can be made using the expectation-maximization algorithm or the conjugate gradient descent algorithm. The idea of variable prediction using Bayesian inference may reduce the number of variables in the agents' sampling domain and also allows estimation of missing variables. Experimental results showed that the BBN performs compelling predictions with samples containing uncertainties compared to the perfect samples. That is, Bayesian inference can help in handling the uncertainties and dynamism of DCOPs, which is a current issue in the DCOP community. We show how Bayesian inference can be formalized with Distributed Situation Awareness (DSA) using uncertain and missing agents' data. The whole framework was tested on a multi-UAV mission for forest fire searching. Future work focuses on augmenting the existing architecture to deal with dynamic DCOP algorithms and multi-agent information merging.

Keywords: DCOP, multi-agent reasoning, Bayesian reasoning, swarm intelligence

Procedia PDF Downloads 119
6696 Hybrid Approach for Software Defect Prediction Using Machine Learning with Optimization Technique

Authors: C. Manjula, Lilly Florence

Abstract:

Software technology is developing rapidly, which leads to the growth of various industries. Nowadays, software-based applications have been adopted widely for business purposes. For any software industry, development of reliable software is becoming a challenging task because a faulty software module may be harmful to the growth of the industry and business. Hence, there is a need to develop techniques which can be used for early prediction of software defects. Due to the complexities of manual prediction, automated software defect prediction techniques have been introduced. These techniques are based on learning patterns from previous software versions and finding the defects in the current version. These techniques have attracted researchers due to their significant impact on industrial growth by identifying the bugs in software. Based on this, several studies have been carried out, but achieving desirable defect prediction performance is still a challenging task. To address this issue, here we present a machine learning based hybrid technique for software defect prediction. First of all, a Genetic Algorithm (GA) is presented in which an improved fitness function is used for better optimization of features in data sets. Later, these features are processed through a Decision Tree (DT) classification model. Finally, an experimental study is presented in which the results from the proposed GA-DT based hybrid approach are compared with those from the DT classification technique. The results show that the proposed hybrid approach achieves better classification accuracy.
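A compact sketch of the GA-DT pairing is given below: chromosomes are binary feature masks and fitness is cross-validated decision-tree accuracy minus a small penalty per selected feature. The penalty term stands in for the paper's improved fitness function (which is not reproduced here), and the synthetic "defect" dataset and GA settings are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Sketch of a GA + decision-tree (GA-DT) hybrid for defect prediction: chromosomes
# are binary feature masks, fitness is cross-validated DT accuracy minus a mild
# parsimony penalty. The dataset is a synthetic stand-in for software metrics.

rng = np.random.default_rng(9)
X, y = make_classification(n_samples=400, n_features=20, n_informative=6,
                           n_redundant=4, weights=[0.8, 0.2], random_state=0)

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(DecisionTreeClassifier(random_state=0),
                          X[:, mask], y, cv=5).mean()
    return acc - 0.002 * mask.sum()              # mild parsimony pressure

pop = rng.random((20, X.shape[1])) < 0.5
for gen in range(25):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[::-1][:6]]    # truncation selection
    children = [elite[0]]                        # elitism
    while len(children) < len(pop):
        p1, p2 = elite[rng.integers(6)], elite[rng.integers(6)]
        mix = rng.random(X.shape[1]) < 0.5       # uniform crossover
        child = np.where(mix, p1, p2)
        child ^= rng.random(X.shape[1]) < 0.03   # bit-flip mutation
        children.append(child)
    pop = np.array(children)

best = max(pop, key=fitness)
print("selected features:", np.flatnonzero(best), "fitness:", round(fitness(best), 3))
```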

Keywords: decision tree, genetic algorithm, machine learning, software defect prediction

Procedia PDF Downloads 330
6695 A Speeded up Robust Scale-Invariant Feature Transform Currency Recognition Algorithm

Authors: Daliyah S. Aljutaili, Redna A. Almutlaq, Suha A. Alharbi, Dina M. Ibrahim

Abstract:

All currencies around the world look very different from each other; for instance, the size, color, and pattern of the paper are different. With the development of modern banking services, automatic methods for paper currency recognition have become important in many applications, such as vending machines. One of the phases of a currency recognition architecture is feature detection and description. There are many algorithms that are used for this phase, but they still have some disadvantages. This paper proposes a feature detection algorithm which merges the advantages of the current SIFT and SURF algorithms and which we call the Speeded up Robust Scale-Invariant Feature Transform (SR-SIFT) algorithm. Our proposed SR-SIFT algorithm overcomes the problems of both the SIFT and SURF algorithms. The proposed algorithm aims to speed up the SIFT feature detection algorithm while keeping it robust. Simulation results demonstrate that the proposed SR-SIFT algorithm decreases the average response time, especially for small and minimum numbers of best key points, and increases the distribution of the best key points over the surface of the currency. Furthermore, the proposed algorithm increases the accuracy of the true best point distribution inside the currency edge compared to the other two algorithms.
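The detection stage that SIFT-style methods share can be illustrated with the short difference-of-Gaussians sketch below, which finds scale-space extrema on a synthetic test image; it shows only the generic detection idea, not the proposed SR-SIFT merger of SIFT and SURF, and the image and thresholds are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

# Sketch of the scale-space stage shared by SIFT-style detectors: build a stack of
# Gaussian-blurred images, take differences of Gaussians (DoG), and keep points that
# are local extrema across space and scale. The synthetic "banknote" image, scale
# ladder, and contrast threshold are illustrative assumptions.

rng = np.random.default_rng(12)
img = rng.random((96, 160)) * 0.1                 # flat background with mild noise
img[30:60, 40:80] += 0.8                          # a bright rectangular "pattern"
img[20:28, 110:118] += 0.6                        # a smaller blob

sigmas = [1.0, 1.6, 2.56, 4.1, 6.55]              # geometric scale ladder
blurred = np.stack([gaussian_filter(img, s) for s in sigmas])
dog = blurred[1:] - blurred[:-1]                  # difference-of-Gaussians stack

# A point is a candidate key point if it is an extremum of its 3x3x3 neighbourhood
# in the DoG stack and its response exceeds a contrast threshold.
neigh_max = maximum_filter(dog, size=3)
neigh_min = minimum_filter(dog, size=3)
extrema = ((dog == neigh_max) | (dog == neigh_min)) & (np.abs(dog) > 0.03)
keypoints = np.argwhere(extrema)

print(f"{len(keypoints)} candidate key points (scale_index, row, col):")
print(keypoints[:10])
```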

Keywords: currency recognition, feature detection and description, SIFT algorithm, SURF algorithm, speeded up and robust features

Procedia PDF Downloads 235
6694 Features Reduction Using Bat Algorithm for Identification and Recognition of Parkinson Disease

Authors: P. Shrivastava, A. Shukla, K. Verma, S. Rungta

Abstract:

Parkinson's disease is a chronic neurological disorder that directly affects human gait. It leads to slowness of movement and causes muscle rigidity and tremors. Gait serves as a primary outcome measure for studies aiming at early recognition of the disease. Using gait techniques, this paper implements an efficient binary bat algorithm for early detection of Parkinson's disease by selecting the optimal features required for classifying affected patients from others. Data from 166 people, both fit and affected, are collected, and optimal feature selection is done using PSO and the bat algorithm. The reduced dataset is then classified using a neural network. The experiments indicate that the binary bat algorithm outperforms traditional PSO and the genetic algorithm and gives a fairly good recognition rate even with the reduced dataset.
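The selection mechanism can be sketched as below: each bat carries a continuous position that is mapped through a sigmoid to a binary feature mask, and the mask is scored by 1-NN accuracy on synthetic gait-like data. The frequency range, loudness, and pulse rate are illustrative settings, and the data are not the 166-subject gait recordings used in the paper.

```python
import numpy as np

# Minimal binary bat algorithm sketch for gait-feature selection: positions are
# continuous, a sigmoid threshold turns them into binary feature masks, and the
# fitness is 1-NN accuracy on synthetic gait-like data.

rng = np.random.default_rng(10)
n, d = 166, 22                                    # subjects x gait features (synthetic)
X = rng.normal(size=(n, d))
y = (rng.random(n) < 0.5).astype(int)             # 0 = healthy, 1 = affected
X[y == 1, :5] += 1.2                              # a few genuinely informative features
tr, te = np.arange(0, 110), np.arange(110, n)

def accuracy(mask):
    if mask.sum() == 0:
        return 0.0
    A, B = X[np.ix_(tr, np.flatnonzero(mask))], X[np.ix_(te, np.flatnonzero(mask))]
    d2 = ((B[:, None, :] - A[None, :, :]) ** 2).sum(-1)
    return float((y[tr][d2.argmin(1)] == y[te]).mean())

bats, f_min, f_max, loudness, pulse = 15, 0.0, 2.0, 0.9, 0.4
pos = rng.normal(size=(bats, d))
vel = np.zeros_like(pos)
masks = 1 / (1 + np.exp(-pos)) > 0.5
fit = np.array([accuracy(m) for m in masks])
best_pos, best_fit = pos[fit.argmax()].copy(), fit.max()

for it in range(40):
    freq = f_min + (f_max - f_min) * rng.random((bats, 1))
    vel += (pos - best_pos) * freq                # velocity pulled toward the best bat
    cand = pos + vel
    local = rng.random(bats) > pulse              # occasional local walk around the best
    cand[local] = best_pos + 0.1 * rng.normal(size=(local.sum(), d))
    cand_masks = 1 / (1 + np.exp(-cand)) > 0.5
    cand_fit = np.array([accuracy(m) for m in cand_masks])
    accept = (cand_fit >= fit) & (rng.random(bats) < loudness)
    pos[accept], fit[accept] = cand[accept], cand_fit[accept]
    if cand_fit.max() > best_fit:
        best_fit, best_pos = cand_fit.max(), cand[cand_fit.argmax()].copy()

best_mask = 1 / (1 + np.exp(-best_pos)) > 0.5
print("features kept:", int(best_mask.sum()), "held-out accuracy:", best_fit)
```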

Keywords: parkinson, gait, feature selection, bat algorithm

Procedia PDF Downloads 549
6693 A New Framework for ECG Signal Modeling and Compression Based on Compressed Sensing Theory

Authors: Siavash Eftekharifar, Tohid Yousefi Rezaii, Mahdi Shamsi

Abstract:

The purpose of this paper is to exploit the compressed sensing (CS) method in order to model and compress electrocardiogram (ECG) signals at a high compression ratio. In order to obtain a sparse representation of the ECG signals, first a suitable basis matrix with Gaussian kernels, which are shown to fit the ECG signals nicely, is constructed. Then the sparse model is extracted by applying an optimization technique. Finally, the CS theory is utilized to obtain a compressed version of the sparse signal. Reconstruction of the ECG signal from the compressed version is also performed to prove the reliability of the algorithm. At this stage, a greedy optimization technique is used to reconstruct the ECG signal, and the Mean Square Error (MSE) is calculated to evaluate the precision of the proposed compression method.
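The pipeline described above can be sketched end to end on a synthetic ECG-like beat: build a dictionary of Gaussian kernels, take random compressed measurements, and recover the sparse coefficients with orthogonal matching pursuit as the greedy step; the kernel widths, measurement count, and test signal are assumptions rather than the paper's settings.

```python
import numpy as np

# Sketch of the pipeline on a synthetic ECG-like beat: a dictionary of Gaussian
# kernels gives the sparse representation, a random matrix gives the compressed
# measurements, and orthogonal matching pursuit (a greedy technique) reconstructs.

rng = np.random.default_rng(11)
N = 256
t = np.linspace(0, 1, N)

def gauss(center, width):
    return np.exp(-0.5 * ((t - center) / width) ** 2)

# Synthetic "ECG" beat: a narrow spike (QRS-like) plus two broader waves (P/T-like).
x = 1.0 * gauss(0.45, 0.01) + 0.25 * gauss(0.30, 0.04) + 0.35 * gauss(0.70, 0.05)

# Dictionary of Gaussian kernels at several widths and all centres.
widths = [0.01, 0.02, 0.04, 0.08]
Psi = np.column_stack([gauss(c, w) for w in widths for c in t])
Psi /= np.linalg.norm(Psi, axis=0)

M = 64                                            # compressed measurements (4:1 ratio)
Phi = rng.normal(size=(M, N)) / np.sqrt(M)        # random sensing matrix
y = Phi @ x                                       # compressed version of the signal

def omp(A, y, k):
    """Plain orthogonal matching pursuit: pick k atoms greedily, least-squares refit."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    s = np.zeros(A.shape[1])
    s[support] = coef
    return s

s_hat = omp(Phi @ Psi, y, k=12)                   # sparse code in the Gaussian dictionary
x_hat = Psi @ s_hat                               # reconstructed ECG-like beat
mse = np.mean((x - x_hat) ** 2)
print(f"reconstruction MSE from {M} of {N} samples: {mse:.2e}")
```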

Keywords: compressed sensing, ECG compression, Gaussian kernel, sparse representation

Procedia PDF Downloads 463