Search results for: light weight algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10745

9815 A Method for Compression of Short Unicode Strings

Authors: Masoud Abedi, Abbas Malekpour, Peter Luksch, Mohammad Reza Mojtabaei

Abstract:

The use of short texts in communication has increased greatly in recent years. The use of different languages in short texts makes Unicode strings compulsory. These strings need twice the space of common strings; hence, applying compression algorithms to accelerate transmission and reduce cost is worthwhile. Nevertheless, general-purpose compression methods such as gzip, bzip2, or PAQ are not appropriate because of their high overhead on small inputs. The Huffman algorithm is one of the rare algorithms effective in reducing the size of short Unicode strings. In this paper, an algorithm is proposed for the compression of very short Unicode strings. First, every new character to be sent to a destination is inserted into the proposed mapping table; at the beginning, every character is new. If a character is repeated for the same destination, it is not considered new. Next, the new characters, together with the mapping values of the repeated characters, are arranged by a specific technique and specially formatted for transmission. The results of an assessment made on a set of short Persian and Arabic strings indicate that the proposed algorithm outperforms the Huffman algorithm in size reduction.
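The per-destination mapping-table idea can be sketched roughly as follows (a simplified illustration, not the authors' implementation; the bit-level packing of literals and mapping values is not specified here):

```python
def encode(text, table):
    """Sketch of the mapping-table scheme: the first occurrence of a
    character for a destination is sent literally and inserted into the
    table; later occurrences are replaced by their small mapping value."""
    out = []
    for ch in text:
        if ch in table:
            out.append(("ref", table[ch]))   # repeated character for this destination
        else:
            table[ch] = len(table)           # new character: insert into the table
            out.append(("lit", ch))
    return out

table = {}                                   # one table is kept per destination
print(encode("salaam", table))
```

Because repeated characters shrink to small table indices, the scheme pays off precisely on the short, repetitive strings that defeat gzip-style methods.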

Keywords: Algorithms, Data Compression, Decoding, Encoding, Huffman Codes, Text Communication

Procedia PDF Downloads 349
9814 Evaluation of the UV Stability of Unidirectional Crossply Ultrahigh-Molecular-Weight-Polyethylene Composite

Authors: Jonmichael Weaver, David Miller

Abstract:

Dyneema is an ultra-high-molecular-weight polyethylene (UHMWPE) fiber created by DSM. This fiber has many applications owing to its high tensile strength, low weight, and inability to absorb water. DSM manufactures a non-woven unidirectional cross-ply [0,90]2 lamina using the Dyneema fiber. With this lamina system, panels of various thicknesses are created as a 40% lighter-weight alternative to Kevlar for the same ballistic protection. Environmental effects on the ply/laminate system alter the material properties, resulting in diminished ultimate performance. Understanding the specific environmental parameters and characterizing the resulting material property degradation is essential for determining the safety and reliability of Dyneema in service. Two laminas were contrasted in their response to accelerated aging by UV, humidity, and temperature cycling. Both laminas contain the same fiber, SK-99, but differ in matrix composition: Dyneema HB-210 employs a polyurethane (PUR) based matrix, and HB-212 contains a rubber-based matrix. Each system was inspected using a scanning electron microscope (SEM) and evaluated by dynamic mechanical analysis (DMA) to characterize the material property changes alongside the corresponding composite damage and matrix failure mode over the aging parameters. Overall, the HB-212 degraded faster than the HB-210.

Keywords: dyneema, accelerated aging, polymers, ballistics protection, armor, DSM, kevlar, composites

Procedia PDF Downloads 151
9813 Non-Dominated Sorting Genetic Algorithm (NSGA-II) for the Redistricting Problem in Mexico

Authors: Antonin Ponsich, Eric Alfredo Rincon Garcia, Roman Anselmo Mora Gutierrez, Miguel Angel Gutierrez Andrade, Sergio Gerardo De Los Cobos Silva, Pedro Lara Velzquez

Abstract:

The electoral zone design problem consists in redrawing the boundaries of legislative districts for electoral purposes in such a way that federal or state requirements are fulfilled. In Mexico, this process has historically been carried out by the National Electoral Institute (INE) by optimizing an integer nonlinear programming model, in which population equality and compactness of the designed districts are considered as two conflicting objective functions, while contiguity is included as a hard constraint. The solution technique used by the INE is a Simulated Annealing (SA) based algorithm, which handles the multi-objective nature of the problem through an aggregation function. The present work represents the first attempt to apply a classical Multi-Objective Evolutionary Algorithm (MOEA), the second version of the Non-dominated Sorting Genetic Algorithm (NSGA-II), to this hard combinatorial problem. First results show that, when compared with the SA algorithm, the NSGA-II obtains promising results. The MOEA manages to produce well-distributed solutions over a wide-spread front, even though convergence troubles on some instances remain an issue to be corrected in future adaptations of MOEAs to the redistricting problem.
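At the core of NSGA-II's non-dominated sorting is the Pareto dominance test between solutions; a minimal sketch (with two minimized objectives, e.g. population deviation and a compactness penalty, as generic placeholders) might look like:

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective
    # and strictly better in at least one (minimization)
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(points):
    # the first non-dominated front: solutions no other solution dominates
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

sols = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(first_front(sols))   # (3.0, 3.0) is dominated by (2.0, 2.0)
```

NSGA-II repeats this ranking over successive fronts and uses crowding distance within each front to keep the population spread out.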

Keywords: multi-objective optimization, NSGA-II, redistricting, zone design problem

Procedia PDF Downloads 367
9812 Application of Hybrid Honey Bees Mating Optimization Algorithm in Multiuser Detection of Wireless Communication Systems

Authors: N. Larbi, F. Debbat

Abstract:

Wireless communication systems have changed dramatically and shown spectacular evolution over the past two decades. These radio technologies are engaged in an endless quest for high-speed transmission coupled with a constant need to improve transmission quality. Various radio communication systems under development use the code division multiple access (CDMA) technique. This work analyses a hybrid honey bees mating optimization (HBMO) algorithm applied to multiuser detection (MuD) in CDMA communication systems. The HBMO is a swarm-based optimization algorithm that simulates the mating process of real honey bees. We hybridize HBMO with simulated annealing (SA) in order to improve the solutions generated by the HBMO. Simulation results show that detection based on the hybrid HBMO is, in terms of bit error rate (BER), a viable option when compared with classic detectors from the literature under a Rayleigh flat fading channel.
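The SA refinement step in such hybrids typically rests on the Metropolis acceptance rule; a minimal sketch (illustrative only, not the authors' exact hybridization):

```python
import math
import random

def accept(delta, temp):
    """Metropolis rule used in simulated annealing: always accept an
    improving move (delta < 0); accept a worsening move with probability
    exp(-delta / temp), which shrinks as the temperature cools."""
    return delta < 0 or random.random() < math.exp(-delta / temp)
```

In the hybrid, candidate detection vectors produced by the HBMO mating step would be perturbed and kept or discarded by this rule as `temp` is gradually lowered.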

Keywords: BER, DS-CDMA multiuser detection, genetic algorithm, hybrid HBMO, simulated annealing

Procedia PDF Downloads 437
9811 Use of Improved Genetic Algorithm in Cloud Computing to Reduce Energy Consumption in Migration of Virtual Machines

Authors: Marziyeh Bahrami, Hamed Pahlevan Hsseini, Behnam Ghamami, Arman Alvanpour, Hamed Ezzati, Amir Salar Sadeghi

Abstract:

One of the ways to increase the efficiency of services in agent systems and, of course, in the world of cloud computing, is to use virtualization techniques. The aim of this research is to introduce changes to cloud computing services that reduce as much as possible the energy consumption related to the migration of virtual machines and to the allocation of resources, and thereby reduce pollution. Several methods have so far been proposed to increase the efficiency of cloud computing services in order to save energy in the cloud environment. The method presented in this article tries to limit the energy consumed by data centers, and the consequent production of carbon and biological pollutants, by increasing the efficiency of cloud computing services. The results show that the proposed algorithm, using improved virtualization techniques and with the help of a genetic algorithm, improves the efficiency of cloud services in migrating virtual machines and ultimately saves energy.

Keywords: consumption reduction, cloud computing, genetic algorithm, live migration, virtual Machine

Procedia PDF Downloads 61
9810 Content-Based Color Image Retrieval Based on the 2-D Histogram and Statistical Moments

Authors: El Asnaoui Khalid, Aksasse Brahim, Ouanan Mohammed

Abstract:

In this paper, we are interested in the problem of finding similar images in a large database. For this purpose, we propose a new algorithm based on a combination of the 2-D histogram intersection in the HSV space and statistical moments. The proposed histogram is based on a 3x3 window and not only on the intensity of the pixel. This approach overcomes the drawback of the conventional 1-D histogram, which ignores the spatial distribution of pixels in the image, while the statistical moments are used to escape the effects of the discretisation of the color space which is intrinsic to the use of histograms. We compare the performance of our new algorithm to various methods of the state of the art and show that it has several advantages: it is fast, consumes little memory, and requires no learning. To validate our results, we apply this algorithm to search for similar images in different image databases.
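The histogram intersection measure at the core of this approach can be sketched as follows (a generic illustration; the paper's 3x3-window 2-D histogram construction itself is not reproduced here):

```python
import numpy as np

def hist_intersection(h1, h2):
    """Similarity of two histograms: the sum of element-wise minima of the
    normalized bins; 1.0 for identical distributions, 0.0 for disjoint ones."""
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    return float(np.minimum(h1, h2).sum())

a = np.array([4.0, 2.0, 2.0])
b = np.array([2.0, 2.0, 4.0])
print(hist_intersection(a, a))   # 1.0
print(hist_intersection(a, b))   # 0.75
```

The same call works unchanged on a flattened 2-D histogram, and the per-channel statistical moments (mean, standard deviation) can be appended to the signature before comparison.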

Keywords: 2-D histogram, statistical moments, indexing, similarity distance, histograms intersection

Procedia PDF Downloads 457
9809 The Effect of Nutrition Education on Glycemic and Lipidemic Control in Iranian Patients with Type 2 Diabetes

Authors: Samira Rabiei, Faezeh Askari, Reza Rastmanesh

Abstract:

Objective: To evaluate the effects of nutrition education and adherence to a healthy diet on glycemic and lipidemic control in patients with T2DM. Material and Methods: A randomized controlled trial was conducted on 494 patients with T2DM, aged 14-87 years, of both sexes, selected by convenience sampling from patients referred to Aliebneabitaleb Hospital in Ghom. The participants were divided into two 247-person groups by stratified randomization. Both groups received a diet adjusted to ideal body weight, and the intervention group was additionally educated about healthy food choices regarding diabetes. Information on medications, psychological factors, diet, and physical activity was obtained from questionnaires. Blood samples were collected to measure FBS, 2 hPG, HbA1c, cholesterol, and triglyceride. After 2 months, weight and biochemical parameters were measured again. Independent t-test, Mann-Whitney, Chi-square, and Wilcoxon tests were used as appropriate. Logistic regression was used to determine the odds ratio of abnormal glycemic and lipidemic control according to the intervention. Results: The mean weight, FBS, 2 hPG, cholesterol, and triglyceride after the intervention were significantly lower than before it (p < 0.05). Discussion: Nutrition education plus a weight-reducer diet is more effective for glycemic and lipidemic control than a weight-reducer diet alone.

Keywords: type 2 diabetes mellitus, nutrition education, glycemic control, lipid profile

Procedia PDF Downloads 209
9808 DOA Estimation Using Golden Section Search

Authors: Niharika Verma, Sandeep Santosh

Abstract:

DOA estimation is a localization technique used in the communications field. Various algorithms have been developed for direction of arrival estimation, such as MUSIC, ROOT MUSIC, etc. These algorithms depend on various parameters, such as the number of antenna array elements and the number of snapshots. Basically, the MUSIC spectrum is evaluated, and the peaks obtained are considered the angles of arrival. The angles evaluated by this process depend on the scanning interval chosen, and the accuracy of the results depends on the coarseness of that interval. In this paper, golden section search is applied to the MUSIC algorithm so that more accurate results are achieved. Initially, coarse DOA estimation is done using the MUSIC algorithm in the range -90 to 90 degrees at an interval of 10 degrees. After the peaks are obtained, fine DOA estimation is done using golden section search. Also, the partitioning method is applied to estimate the number of signals incident on the antenna array. The dependency of the algorithm on the number of snapshots is also explained. Hence, accurate results are determined using this algorithm.
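The refinement step can be illustrated with a generic golden section search over a bracketed peak of the pseudo-spectrum (a sketch; the MUSIC spectrum evaluation itself is not shown, so a toy unimodal function stands in for it):

```python
import math

def golden_max(f, a, b, tol=1e-5):
    """Golden-section search for the maximum of a unimodal f on [a, b]."""
    invphi = (math.sqrt(5) - 1) / 2            # 1/phi, about 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) > f(d):
            b, d = d, c                        # maximum lies in [a, d]
            c = b - invphi * (b - a)
        else:
            a, c = c, d                        # maximum lies in [c, b]
            d = a + invphi * (b - a)
    return (a + b) / 2

# refine a coarse 10-degree-grid peak near 10 degrees (toy spectrum)
peak = golden_max(lambda t: -(t - 10.0) ** 2, 5.0, 15.0)
print(round(peak, 3))
```

Each iteration shrinks the bracket by the golden ratio, so the fine estimate converges far faster than re-scanning the MUSIC spectrum on a denser grid.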

Keywords: Direction of Arrival (DOA), golden section search, MUSIC, number of snapshots

Procedia PDF Downloads 447
9807 Fruit Identification System in Sweet Orange Citrus (L.) Osbeck Using Thermal Imaging and Fuzzy

Authors: Ingrid Argote, John Archila, Marcelo Becker

Abstract:

In agriculture, intelligent systems applications have generated great advances in automating some of the processes in the production chain. In order to improve the efficiency of those systems, a vision system is proposed to estimate the number of fruits on sweet orange trees. This work presents a system proposal using the capture of thermal images and fuzzy logic. A bibliographical review has been done to analyze the state of the art of the different systems used in fruit recognition and the different applications of thermography in agricultural systems. The algorithm developed for this project uses fuzziness metrics for contrast improvement and segmentation of the image; for the counting algorithm, the Hough transform was used. In order to validate the proposed algorithm, a bank of images of sweet orange Citrus (L.) Osbeck acquired at the Maringá Farm was created. The tests indicated that the variation in temperature between the tree branches and the fruit is not very high, which makes image segmentation based on this difference hard and increases the number of false positives in the fruit counting algorithm. Recognition of isolated fruits with the proposed algorithm presented an overall accuracy of 90.5%, while for grouped fruits the accuracy was 81.3%. The experiments show the need for more suitable hardware to achieve better recognition of small temperature changes in the image.

Keywords: agricultural systems, citrus, fuzzy logic, thermal images

Procedia PDF Downloads 230
9806 Optimization of Flexible Job Shop Scheduling Problem with Sequence-Dependent Setup Times Using Genetic Algorithm Approach

Authors: Sanjay Kumar Parjapati, Ajai Jain

Abstract:

This paper presents optimization of makespan for an 'n'-job, 'm'-machine flexible job shop scheduling problem with sequence-dependent setup times using a genetic algorithm (GA) approach. A restart scheme has also been applied to prevent premature convergence. Two case studies are taken into consideration. Results are obtained by considering a crossover probability of pc = 0.85 and a mutation probability of pm = 0.15. Five simulation runs are performed for each case study, and the minimum value among them is taken as the optimal makespan. Results indicate that the optimal makespan can be achieved with more than one sequence of jobs in a production order.

Keywords: flexible job shop, genetic algorithm, makespan, sequence dependent setup times

Procedia PDF Downloads 333
9805 A Review of Optomechatronic Ecosystem

Authors: Sam Zhang

Abstract:

The landscape of optomechatronics is viewed along the lines of light vs. matter, photonics vs. semiconductors, and optics vs. mechatronics. Optomechatronics is redefined as the integration of light and matter from the atom, device, and system to the application. The markets and megatrends in optomechatronics are then listed. The author focuses on optomechatronic technology in the semiconductor industry as an example and reviews practical systems, characteristics, and trends. Optomechatronics, together with photonics and semiconductors, will continue producing the computational and smart infrastructure required for the 4th industrial revolution.

Keywords: photonics, semiconductor, optomechatronics, 4th industrial revolution

Procedia PDF Downloads 131
9804 Unseen Classes: The Paradigm Shift in Machine Learning

Authors: Vani Singhal, Jitendra Parmar, Satyendra Singh Chouhan

Abstract:

Unseen class discovery has now become an important part of a machine-learning algorithm's ability to judge new classes. Unseen classes are the classes on which the machine learning model has not been trained. With the advancement of technology and AI replacing humans, the amount of data has increased to the next level. So, while deploying a model on real-world examples, we come across unseen new classes. Our aim is to find the number of unseen classes by using a hierarchical active learning algorithm. The algorithm is based on hierarchical clustering as well as active sampling. The number of clusters that we obtain in the end gives the number of unseen classes; the total clusters will also contain some clusters that have unseen classes. Instead of first discovering the unseen classes and then counting them, we calculate the number directly by applying the algorithm. The dataset used is for intent classification, where the target is the intent of the corresponding query. We conclude that when the machine learning model encounters real-world data, it will automatically find the number of unseen classes. In future work, we will label these unseen classes correctly.

Keywords: active sampling, hierarchical clustering, open world learning, unseen class discovery

Procedia PDF Downloads 173
9803 A Deep Learning-Based Pedestrian Trajectory Prediction Algorithm

Authors: Haozhe Xiang

Abstract:

With the rise of the Internet of Things era, intelligent products are gradually integrating into people's lives. Pedestrian trajectory prediction has become a key issue, which is crucial for the motion path planning of intelligent agents such as autonomous vehicles, robots, and drones. In the current technological context, deep learning technology is becoming increasingly sophisticated and gradually replacing traditional models. The pedestrian trajectory prediction algorithm combining neural networks and attention mechanisms has significantly improved prediction accuracy. Based on in-depth research on deep learning and pedestrian trajectory prediction algorithms, this article focuses on physical environment modeling and learning of historical trajectory time dependence. At the same time, social interaction between pedestrians and scene interaction between pedestrians and the environment were handled. An improved pedestrian trajectory prediction algorithm is proposed by analyzing the existing model architecture. With the help of these improvements, acceptable predicted trajectories were successfully obtained. Experiments on public datasets have demonstrated the algorithm's effectiveness and achieved acceptable results.

Keywords: deep learning, graph convolutional network, attention mechanism, LSTM

Procedia PDF Downloads 73
9802 A Nutritional Wellness Program for Overweight Health Care Providers in Hospital Setting: A Randomized Controlled Trial Pilot Study

Authors: Kim H. K. Choy, Oliva H. K. Chu, W. Y. Keung, B. Lim, Winnie P. Y. Tang

Abstract:

Background: The prevalence of workplace obesity is rising worldwide; therefore, the workplace is an ideal venue to implement weight control intervention. This pilot randomized controlled trial aimed to develop, implement, and evaluate a nutritional wellness program for obese health care providers working in a hospital. Methods: This hospital-based nutritional wellness program was an 8-week pilot randomized controlled trial for obese health care providers. The primary outcomes were body weight and body mass index (BMI). The secondary outcomes were serum fasting glucose, fasting cholesterol, triglyceride, high-density (HDL) and low-density (LDL) lipoprotein, body fat percentage, and body mass. Participants were randomly assigned to the intervention (n = 20) or control (n = 22) group. Participants in both groups received individual nutrition counselling and nutrition pamphlets, whereas only participants in the intervention group were given mobile phone text messages. Results: 42 participants completed the study. In comparison with the control group, the intervention group showed approximately 0.98 kg weight reduction after two months. Participants in intervention group also demonstrated clinically significant improvement in BMI, serum cholesterol level, and HDL level. There was no improvement of body fat percentage and body mass for both intervention and control groups. Conclusion: The nutritional wellness program for obese health care providers was feasible in hospital settings. Health care providers demonstrated short-term weight loss, decrease in serum fasting cholesterol level, and HDL level after completing the program.

Keywords: weight management, weight control, health care providers, hospital

Procedia PDF Downloads 244
9801 Secure Hashing Algorithm and Advance Encryption Algorithm in Cloud Computing

Authors: Jaimin Patel

Abstract:

Cloud computing is one of the most significant developments across computing technologies. It provides flexibility to users, cost effectiveness, location independence, easy maintenance, multitenancy, drastic performance improvements, and increased productivity. On the other hand, there are also major issues, such as security. Since the cloud is a shared server, security is a major concern; it is important to protect users' private data, especially in e-commerce and social networks. In this paper, encryption algorithms such as the Advanced Encryption Standard, their vulnerabilities, the risk of attacks, optimal time and complexity management, and a comparison with other algorithms based on software implementation are presented. Encryption techniques to improve the performance of the AES algorithm and to reduce risk are given. Secure Hash Algorithms, their vulnerabilities, software implementations, and the risk of attacks are compared with other hashing algorithms, and the advantages and disadvantages of hashing techniques versus encryption are discussed.
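As a small illustration of the hashing side, Python's standard library exposes SHA-256 directly; the avalanche property shown below is what the birthday and brute-force attacks discussed above must work against:

```python
import hashlib

digest = hashlib.sha256(b"cloud data block").hexdigest()
print(len(digest))        # 64 hex characters = 256 bits

# avalanche effect: a one-character change yields an unrelated digest
other = hashlib.sha256(b"cloud data block").hexdigest()
print(digest != other)    # True
```

A brute-force search over a 256-bit digest space is infeasible, which is why practical attacks instead target weak passwords, protocol misuse, or man-in-the-middle positions.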

Keywords: Cloud computing, encryption algorithm, secure hashing algorithm, brute force attack, birthday attack, plaintext attack, man in middle attack

Procedia PDF Downloads 282
9800 Optimal Design of Substation Grounding Grid Based on Genetic Algorithm Technique

Authors: Ahmed Z. Gabr, Ahmed A. Helal, Hussein E. Said

Abstract:

With the incessant increase of power system capacity and voltage grade, the safety of the grounding grid becomes more and more prominent. In this paper, the design of a substation grounding grid by means of a genetic algorithm (GA) is presented. This approach aims to control the grounding cost of the power system by controlling the number of grounding rods and the conductor lengths under the same safety limitations. The proposed technique was used for the design of the substation grounding grid in Khalda Petroleum Company's "El-Qasr" power plant, and the design was simulated using CYMGRD software for results verification. The resulting design complies closely with the IEEE 80-2000 standard requirements.

Keywords: genetic algorithm, optimum grounding grid design, power system analysis, power system protection, single layer model, substation

Procedia PDF Downloads 537
9799 Enhancement of Road Defect Detection Using First-Level Algorithm Based on Channel Shuffling and Multi-Scale Feature Fusion

Authors: Yifan Hou, Haibo Liu, Le Jiang, Wandong Su, Binqing Wang

Abstract:

Road defect detection is crucial for modern urban management and infrastructure maintenance. Traditional road defect detection methods mostly rely on manual labor, which is not only inefficient but also difficult to make reliable. Existing deep learning-based road defect detection models, however, have poor detection performance in complex environments and lack robustness to multi-scale targets. To address this challenge, this paper proposes a distinct detection framework based on a one-stage network structure. This article designs a deep feature extraction network based on RCSDarknet, which applies channel shuffling to enhance information fusion between tensors. Through repeated stacking of RCS modules, the information flow between the channels of adjacent layer features is enhanced to improve the model's ability to capture target spatial features. In addition, a multi-scale feature fusion mechanism with weighted dual flow paths is adopted to fuse spatial features of different scales, further improving the detection performance of the model at different scales. To validate the performance of the proposed algorithm, we tested it on the RDD2022 dataset. The experimental results show that the enhanced algorithm achieved 84.14% mAP, which is 1.06% higher than the currently advanced YOLOv8 algorithm. Visualization of the results also shows that our proposed algorithm performs well in detecting targets of different scales in complex scenes. These experimental results demonstrate the effectiveness and superiority of the proposed algorithm, providing valuable insights for advancing real-time road defect detection methods.
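The channel shuffling operation referred to above is commonly implemented as a reshape-transpose-reshape; a NumPy sketch of the standard form (illustrative, not the paper's code):

```python
import numpy as np

def channel_shuffle(x, groups):
    """Channel shuffle: reshape (N, C, H, W) -> (N, g, C//g, H, W),
    swap the two group axes, and flatten back, interleaving the channels
    so that information mixes across groups in the next layer."""
    n, c, h, w = x.shape
    assert c % groups == 0
    return (x.reshape(n, groups, c // groups, h, w)
             .transpose(0, 2, 1, 3, 4)
             .reshape(n, c, h, w))

x = np.arange(8).reshape(1, 8, 1, 1)           # 8 channels labelled 0..7
print(channel_shuffle(x, 2).ravel())           # [0 4 1 5 2 6 3 7]
```

Because it is a pure permutation of memory, the operation adds cross-group information flow at essentially zero FLOP cost.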

Keywords: roads, defect detection, visualization, deep learning

Procedia PDF Downloads 13
9798 Simulation of Non-Crimp 3D Orthogonal Carbon Fabric Composite for Aerospace Applications Using Finite Element Method

Authors: Sh. Minapoor, S. Ajeli, M. Javadi Toghchi

Abstract:

Non-crimp 3D orthogonal fabric composite is one of the textile-based composite materials that are rapidly developing as light-weight engineering materials. The present paper focuses on geometric and micromechanical modeling of non-crimp 3D orthogonal carbon fabric, and of composites reinforced with it, for aerospace applications. In this research, meso-scale finite element (FE) modeling is employed for stress analysis under different load conditions. Since mechanical testing of expensive textile carbon composites for a specific application is not affordable, simulating the composite in a virtual environment is a helpful way to investigate its mechanical properties under different conditions.

Keywords: woven composite, aerospace applications, finite element method, mechanical properties

Procedia PDF Downloads 465
9797 Arabic Character Recognition Using Regression Curves with the Expectation Maximization Algorithm

Authors: Abdullah A. AlShaher

Abstract:

In this paper, we demonstrate how regression curves can be used to recognize 2D non-rigid handwritten shapes. Each shape is represented by a set of non-overlapping, uniformly distributed landmarks. The underlying models utilize 2nd-order polynomials to model the shapes within a training set. To estimate the regression models, we need to extract the coefficients which describe the variations for a shape class. Hence, a least squares method is used to estimate such modes. We then proceed by training these coefficients using the Expectation Maximization algorithm. Recognition is carried out by finding the least landmark displacement error with respect to the model curves. Handwritten isolated Arabic characters are used to evaluate our approach.
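The least-squares fit of a 2nd-order polynomial to landmark coordinates can be sketched as follows (a simplified, hypothetical example for one coordinate of a landmark sequence; the EM training over shape classes is not shown):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 10)      # landmark index, normalized to [0, 1]
y = 3 * t**2 - 2 * t + 1           # synthetic landmark ordinates (known curve)

# np.polyfit solves the least-squares problem for the quadratic coefficients,
# which is the same estimation step applied per shape class in the paper
coeffs = np.polyfit(t, y, deg=2)
print(np.round(coeffs, 6))         # approximately [ 3. -2.  1.]
```

Recognition then reduces to evaluating each class's fitted curve at the observed landmark positions and picking the class with the smallest displacement error.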

Keywords: character recognition, regression curves, handwritten Arabic letters, expectation maximization algorithm

Procedia PDF Downloads 145
9796 An Improved Parallel Algorithm of Decision Tree

Authors: Jiameng Wang, Yunfei Yin, Xiyu Deng

Abstract:

Parallel optimization is one of the important research topics in data mining at this stage. Taking Classification and Regression Tree (CART) parallelization as an example, this paper proposes a parallel data mining algorithm based on SSP-OGini-PCCP. Aiming at the problem of choosing the best CART split point, this paper designs an S-SP model without data association; and in order to calculate the Gini index efficiently, a parallel OGini calculation method is designed. In addition, in order to improve the efficiency of the pruning algorithm, a synchronous PCCP pruning strategy is proposed. The optimal split calculation, Gini index calculation, and pruning algorithm, all important components of parallel data mining, are studied in depth. By constructing a distributed cluster simulation system based on SPARK, data mining methods based on SSP-OGini-PCCP are tested. Experimental results show that this method can increase the search efficiency of the best split point by an average of 89%, increase the search efficiency of the Gini split index by 3853%, and increase pruning efficiency by 146% on average; and as the size of the data set increases, the performance of the algorithm remains stable, which meets the requirements of contemporary massive data processing.
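For reference, the quantity that the parallel OGini method computes is the standard (serial) Gini impurity of a candidate CART split; a minimal sketch:

```python
def gini(labels):
    """Gini impurity of a set of class labels: 1 - sum over classes of p_k^2."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def split_gini(left, right):
    # weighted Gini of a candidate split: lower is better, 0.0 is a pure split
    n = len(left) + len(right)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

print(gini([0, 0, 1, 1]))          # 0.5 for a perfectly mixed node
print(split_gini([0, 0], [1, 1]))  # 0.0 for a pure split
```

The parallel scheme distributes the class counts behind `gini` across workers; the impurity itself is unchanged.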

Keywords: classification, Gini index, parallel data mining, pruning ahead

Procedia PDF Downloads 124
9795 A Hybrid Particle Swarm Optimization-Nelder-Mead Algorithm (PSO-NM) for Nelson-Siegel-Svensson Calibration

Authors: Sofia Ayouche, Rachid Ellaia, Rajae Aboulaich

Abstract:

Today, insurers may use the yield curve as an indicator for evaluating the profit or performance of their portfolios; therefore, they model it with a class of models able to fit and forecast the future term structure of interest rates: the Nelson-Siegel-Svensson (NSS) model. Unfortunately, many authors have reported great difficulty in calibrating the model because the optimization problem is not convex and has multiple local optima. In this context, we implement a hybrid Particle Swarm Optimization and Nelder-Mead algorithm in order to minimize, by the least squares method, the difference between the zero-coupon curve and the NSS curve.
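For concreteness, the NSS zero rate and the least-squares objective being minimized can be sketched as below (parameter names are generic placeholders; the search over them would be driven by the PSO-NM hybrid):

```python
import math

def nss_yield(t, b0, b1, b2, b3, tau1, tau2):
    """Nelson-Siegel-Svensson zero rate at maturity t > 0."""
    x1, x2 = t / tau1, t / tau2
    f1 = (1 - math.exp(-x1)) / x1
    return (b0
            + b1 * f1
            + b2 * (f1 - math.exp(-x1))
            + b3 * ((1 - math.exp(-x2)) / x2 - math.exp(-x2)))

def sse(params, maturities, observed):
    # the least-squares objective: squared gap between NSS and market zero rates
    return sum((nss_yield(t, *params) - y) ** 2
               for t, y in zip(maturities, observed))
```

The non-convexity mentioned above comes from the exponential terms in `tau1` and `tau2`, which is why a global explorer (PSO) is paired with a local polisher (Nelder-Mead).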

Keywords: optimization, zero-coupon curve, Nelson-Siegel-Svensson, particle swarm optimization, Nelder-Mead algorithm

Procedia PDF Downloads 430
9794 Stabilisation of a Soft Soil by Alkaline Activation

Authors: Mohammadjavad Yaghoubi, Arul Arulrajah, Mahdi M. Disfani, Suksun Horpibulsuk, Myint W. Bo, Stephen P. Darmawan

Abstract:

This paper investigates the changes in the strength development of a high water content soft soil stabilised by alkaline activation of fly ash (FA) for use in deep soil mixing (DSM) technology. The content of FA was 20% by dry mass of soil, and the alkaline activator was sodium silicate (Na2SiO3). Samples were cured for 3, 7, 14, 28 and 56 days to evaluate the effect of curing time on strength development. To study the effect of adding slag (S) to the mixture on the strength development, 5% of the FA was replaced with S. In addition, the effect of the initial unit weight of the samples on strength development was studied by preparing specimens at two different static compaction stresses. This was to replicate field conditions, where, during implementation of the DSM technique, the pressure on the soil being mixed increases with depth. Unconfined compressive strength (UCS), scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDS) tests were conducted on the specimens. The results show that adding S to the FA-based geopolymer activated by Na2SiO3 decreases the strength. Furthermore, samples prepared at a higher unit weight demonstrate greater strengths. Moreover, samples prepared at the lower unit weight reached their final strength at about 14 days of curing, whereas strength development continued to 56 days for specimens prepared at the higher unit weight.

Keywords: alkaline activation, curing time, fly ash, geopolymer, slag

Procedia PDF Downloads 338
9793 Large-Capacity Image Information Reduction Based on Single-Cue Saliency Map for Retinal Prosthesis System

Authors: Yili Chen, Xiaokun Liang, Zhicheng Zhang, Yaoqin Xie

Abstract:

In an effort to restore visual perception in retinal diseases, an electronic retinal prosthesis with thousands of electrodes has been developed. The image processing strategies of the retinal prosthesis system convert the original images from the camera into a stimulus pattern which can be interpreted by the brain. Practically, the original images have a much higher resolution (256x256) than the stimulus pattern (such as 25x25), which poses a technical image processing challenge: large-capacity image information reduction. In this paper, we focus on developing an efficient stimulus pattern extraction algorithm that uses a single-cue saliency map to extract salient objects in the image with an optimal trimming threshold. Experimental results showed that the proposed stimulus pattern extraction algorithm performs quite well for different scenes in terms of the stimulus pattern. In the algorithm performance experiment, our proposed SCSPE algorithm achieved almost five times the score of Boyle's algorithm. Through the experiments, we suggest that when there are salient objects in the scene (such as when the blind user meets or talks with people), the trimming threshold should be set around 0.4max; in other situations, the trimming threshold can be set between 0.2max and 0.4max to give a satisfactory stimulus pattern.
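The trimming-threshold rule described above can be sketched as a simple binarization of the saliency map (illustration only; the single-cue saliency computation and the downsampling to the electrode grid are omitted):

```python
import numpy as np

def trim(saliency, k=0.4):
    """Binary stimulus mask: keep pixels whose saliency is at least
    k * max(saliency). k around 0.4 when a salient object is present,
    between 0.2 and 0.4 otherwise, per the thresholds suggested above."""
    return (saliency >= k * saliency.max()).astype(np.uint8)

s = np.array([[0.1, 0.9],
              [0.3, 0.5]])
print(trim(s))   # threshold is 0.4 * 0.9 = 0.36 -> [[0 1], [0 1]]
```

The resulting mask would then be downsampled to the 25x25 electrode resolution before stimulation.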

Keywords: retinal prosthesis, image processing, region of interest, saliency map, trimming threshold selection

Procedia PDF Downloads 249
9792 Effect of the pH on the Degradation Kinetics of Biodegradable Mg-0.8Ca Orthopedic Implants

Authors: A. Mohamed, A. El-Aziz

Abstract:

The pH of the body plays a great role in the degradation kinetics of biodegradable Mg-Ca orthopedic implants. At the site of a fracture, the pH of the body is no longer neutral, which motivates studying a range of pH values of the body fluid. In this study, the pH of Hank's balanced salt solution (HBSS) was adjusted with phosphate buffers to an aggressive acidic pH 1.8, a slightly acidic pH 5.3 and an alkaline pH 8.1. The biodegradation of the Mg-0.8Ca implant was tested in these three media using immersion tests and electrochemical polarization measurements. The degradation rate was found to increase with decreasing pH of the HBSS. The immersion test revealed weight gain for all samples, followed by weight loss as the immersion time increased. The weight gain was most pronounced at the acidic pH 1.8 and least at the alkaline pH 8.1. This was in agreement with the electrochemical polarization results, where the degradation rate was highest (7.29 ± 2.2 mm/year) in the aggressive acidic solution of pH 1.8 and lowest (0.31 ± 0.06 mm/year) in the alkaline medium of pH 8.1. Furthermore, it was confirmed that the pH of the HBSS reached a steady alkaline value (~pH 11) by the end of the two-month immersion period, regardless of the initial pH of the solution. Finally, the corrosion products formed on the samples' surfaces were investigated by SEM, EDX and XRD analyses, which revealed the formation of magnesium and calcium phosphates with different morphologies depending on the pH.
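
Degradation rates in mm/year such as those quoted above are conventionally obtained from polarization data via the standard Faraday-law conversion (the ASTM G102 form). A minimal sketch, with an illustrative current density that is not a value from the paper:

```python
def corrosion_rate_mm_per_year(i_corr_uA_cm2, equiv_weight_g, density_g_cm3):
    """Faraday-law conversion of corrosion current density to penetration rate.

    CR [mm/yr] = 3.27e-3 * i_corr [uA/cm^2] * EW [g/eq] / rho [g/cm^3]
    """
    return 3.27e-3 * i_corr_uA_cm2 * equiv_weight_g / density_g_cm3

# Pure Mg: atomic weight 24.305 g/mol, valence 2, density 1.74 g/cm^3
EW_MG = 24.305 / 2
rate = corrosion_rate_mm_per_year(100.0, EW_MG, 1.74)  # 100 uA/cm^2 is illustrative
```

The equivalent weight and density would in practice be adjusted slightly for the Mg-0.8Ca alloy composition.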

Keywords: biodegradable, electrochemical polarization means, orthopedics, immersion test, simulated body fluid

Procedia PDF Downloads 124
9791 Intelligent Minimal Allocation of Capacitors in Distribution Networks Using Genetic Algorithm

Authors: S. Neelima, P. S. Subramanyam

Abstract:

A distribution system is an interface between the bulk power system and the consumers. Among these systems, the radial distribution system is popular because of its low cost and simple design. In distribution systems, the voltages at buses decrease with distance from the substation, and the losses are high. The reason for the voltage drop and high losses is an insufficient amount of reactive power, which can be supplied by shunt capacitors. However, placing a capacitor of appropriate size is always a challenge. Thus, the optimal capacitor placement problem is to determine the locations and sizes of capacitors to be placed in distribution networks so as to efficiently reduce power losses and improve the voltage profile of the system. For this purpose, a two-stage methodology is used in this paper. In the first stage, the load flow of the pre-compensated distribution system is carried out using the 'dimension reducing distribution load flow algorithm (DRDLFA)'. On the basis of this load flow, the potential compensation locations are computed. In the second stage, a Genetic Algorithm (GA) is used to determine the optimal locations and sizes of the capacitors such that the sum of the energy-loss cost and the capacitor cost is minimized. The method is tested on the IEEE 9 and 34 bus systems and compared with other methods in the literature.
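
The second-stage GA search can be sketched in a minimal form. This is not the paper's implementation: the DRDLFA load flow is replaced by a crude branch-loss proxy, and the bus data, capacitor sizes, and cost coefficients below are hypothetical.

```python
import random

# Hypothetical per-bus data (active load kW, reactive load kVAr, branch resistance ohm);
# illustrative numbers, not the paper's IEEE 9/34-bus cases.
BUSES = [(200, 120, 4.0), (150, 90, 6.0), (300, 180, 5.0)]
SIZES = [0, 50, 100, 150]     # discrete capacitor sizes (kVAr)
KE, KC = 0.06, 3.0            # $/kWh energy price, $/kVAr capacitor cost
V = 11.0                      # bus voltage (kV)
HOURS = 8760                  # hours per year

def cost(chrom):
    """Annual energy-loss cost plus capacitor cost for one placement plan."""
    # crude branch-loss proxy: R * (P^2 + (Q - Qc)^2) / V^2, in kW
    loss_kw = sum(r * (p ** 2 + (q - qc) ** 2) / (V ** 2 * 1000)
                  for (p, q, r), qc in zip(BUSES, chrom))
    return KE * HOURS * loss_kw + KC * sum(chrom)

def ga(pop_size=30, gens=60, mut_rate=0.2):
    """Elitist genetic algorithm over capacitor-size assignments."""
    pop = [[random.choice(SIZES) for _ in BUSES] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        elite = pop[:pop_size // 2]              # keep the cheapest half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, len(BUSES))  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut_rate:         # random resize mutation
                child[random.randrange(len(BUSES))] = random.choice(SIZES)
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)

best = ga()
```

Each chromosome assigns one discrete capacitor size per candidate bus; the fitness trades annual loss savings against capacitor purchase cost, mirroring the objective described in the abstract.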

Keywords: dimension reducing distribution load flow algorithm, DRDLFA, genetic algorithm, electrical distribution network, optimal capacitors placement, voltage profile improvement, loss reduction

Procedia PDF Downloads 392
9790 An Efficient Machine Learning Model to Detect Metastatic Cancer in Pathology Scans Using Principal Component Analysis Algorithm, Genetic Algorithm, and Classification Algorithms

Authors: Bliss Singhal

Abstract:

Machine learning (ML) is a branch of Artificial Intelligence (AI) in which computers analyze data and find patterns in it. This study focuses on the detection of metastatic cancer using ML. Metastatic cancer is the stage at which cancer has spread to other parts of the body and is the cause of approximately 90% of cancer-related deaths. Normally, pathologists spend hours each day manually classifying tumors as benign or malignant. This tedious task contributes to metastases being mislabeled over 60% of the time and underscores the importance of accounting for human error and other inefficiencies. ML is a good candidate for improving the correct identification of metastatic cancer, potentially saving thousands of lives; it can also improve the speed and efficiency of the process, requiring fewer resources and less time. To date, deep learning has been the AI methodology used in research to detect cancer. This study takes a novel approach by assessing the potential of combining preprocessing algorithms with classification algorithms to detect metastatic cancer. The study used two preprocessing algorithms, principal component analysis (PCA) and a genetic algorithm, to reduce the dimensionality of the dataset, and then three classification algorithms, logistic regression, a decision tree classifier, and k-nearest neighbors, to detect metastatic cancer in pathology scans. The highest accuracy, 71.14%, was produced by the ML pipeline comprising PCA, the genetic algorithm, and the k-nearest neighbors algorithm, suggesting that preprocessing and classification algorithms hold great potential for detecting metastatic cancer.
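
The PCA-then-classify structure of such a pipeline can be sketched as follows. This is a simplified illustration, not the paper's pipeline: the genetic-algorithm preprocessing step is omitted, and synthetic Gaussian features stand in for pathology-scan data.

```python
import numpy as np

def pca_fit_transform(X, n_components):
    """Center the data and project it onto its top principal components (SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def knn_predict(X_train, y_train, X_test, k=3):
    """Label each test point by majority vote among its k nearest neighbours."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(dists)[:k]]
        preds.append(np.bincount(nearest).argmax())
    return np.array(preds)

# Synthetic stand-in for pathology-scan feature vectors: two Gaussian classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 20)),   # "benign"
               rng.normal(3.0, 1.0, (50, 20))])  # "metastatic"
y = np.array([0] * 50 + [1] * 50)
idx = rng.permutation(100)
X, y = X[idx], y[idx]

Z = pca_fit_transform(X, n_components=5)          # dimensionality reduction
pred = knn_predict(Z[:80], y[:80], Z[80:], k=3)   # classify the held-out 20
accuracy = (pred == y[80:]).mean()
```

The key design point is that dimensionality reduction is fit before classification, so the k-nearest-neighbors distance computation runs in a 5-dimensional space instead of the original 20 (or, for real scans, thousands) of features.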

Keywords: breast cancer, principal component analysis, genetic algorithm, k-nearest neighbors, decision tree classifier, logistic regression

Procedia PDF Downloads 83
9789 Using Maximization Entropy in Developing a Filipino Phonetically Balanced Wordlist for a Phoneme-Level Speech Recognition System

Authors: John Lorenzo Bautista, Yoon-Joong Kim

Abstract:

In this paper, a Filipino Phonetically Balanced Word list consisting of 250 words (PBW250) was constructed for a phoneme-level ASR system for the Filipino language. Entropy maximization is used to obtain phonological balance in the list: the entropy of the phonemes in each word is maximized, providing an optimal balance in each word's phonological distribution, using the Add-Delete method (PBW algorithm), which is compared to a modified PBW algorithm implemented with a dynamic-programming approach to obtain optimization. The PBW and modified algorithms attained entropy scores of 4.2791 and 4.2902, respectively. The PBW250 list was recorded by 40 respondents, each providing two sets of data. Recordings from 30 respondents were used to train an acoustic model, which was tested on recordings from the remaining 10 respondents using the HMM Toolkit (HTK). The tests gave a maximum accuracy rate of 97.77% for the speaker-dependent test and 89.36% for the speaker-independent test.
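
The entropy objective behind the Add-Delete selection can be sketched in a minimal form. Assumptions: phoneme sequences are supplied by a caller-provided `phonemes_of` function (here single characters stand in for phonemes), and the greedy loop below maximizes the entropy of the running list rather than reproducing the paper's exact per-word balancing.

```python
from collections import Counter
from math import log2

def phoneme_entropy(words, phonemes_of):
    """Shannon entropy (bits) of the phoneme distribution over a word list."""
    counts = Counter(p for w in words for p in phonemes_of(w))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def add_delete(candidates, phonemes_of, target_size):
    """Greedy add step: grow the list one word at a time, always picking the
    candidate that maximizes the entropy of the list built so far."""
    chosen, pool = [], list(candidates)
    while len(chosen) < target_size and pool:
        best = max(pool, key=lambda w: phoneme_entropy(chosen + [w], phonemes_of))
        chosen.append(best)
        pool.remove(best)
    return chosen
```

Entropy is maximal when all phonemes occur equally often, so the greedy step naturally favors words that introduce underrepresented phonemes, which is the balancing intuition behind the PBW construction.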

Keywords: entropy maximization, Filipino language, Hidden Markov Model, phonetically balanced words, speech recognition

Procedia PDF Downloads 458
9788 Cycleloop Personal Rapid Transit: An Exploratory Study for Last Mile Connectivity in Urban Transport

Authors: Suresh Salla

Abstract:

In this paper, the author explores the most sustainable last-mile transport mode, addressing the present problems of traffic congestion, jams, pollution, and travel stress. The development of energy-efficient, sustainable, integrated transport systems is essential to make our cities more livable. Emphasis on autonomous, connected, electric, and shared systems for the effective utilization of vehicles and public infrastructure is on the rise. Many surface mobility innovations, such as public bicycle sharing (PBS), ride hailing, and ride sharing, although workable, add to already congested roads when analyzed holistically; they are difficult to ride in hostile weather, cause pollution, and impose commuter stress. The sustainability of transportation is evaluated with respect to public adoption, average speed, energy consumption, and pollution. Why does the public prefer certain modes over others? What role does commute time play in mode selection or shift? What factors drive energy consumption and pollution? The study makes clear that the public prefers a transport mode that is exhaustive (less need for interchange; the network is widespread), intensive (less waiting time; vehicles are available at frequent intervals), and convenient, with the latest technologies. Average speed depends on stops, the number of intersections, signals, clear route availability, and similar factors. It is clear from physics that the higher the kerb weight of a vehicle, the higher its operational energy consumption; higher kerb weight also demands heavier infrastructure. Pollution depends on the energy source, vehicle efficiency, and average speed. A mode can be made exhaustive when the unit infrastructure cost is low, and offered intensively when the vehicle cost is low. Reliable and seamless integrated mobility down to the last quarter mile (the five-minute walk, FMW) is a must to encourage sustainable public transportation. The study shows that the average speed and reliability of dedicated modes (such as Metro, PRT, and BRT) are high compared to road vehicles, and that electric vehicles, and even more so battery-less or third-rail vehicles, reduce pollution. One potential mode is Cycleloop PRT, in which the commuter rides an e-cycle along a dedicated path: elevated, at grade, or underground. With a kerb weight per rider of 15 kg, roughly 1/50th that of a car or 1/10th that of other PRT systems, the e-bike makes this a sustainable mode. The Cycleloop tube will be light, sleek, and scalable, and can be erected modularly, either on modified street lamp-posts or suspended between two stations. Embarking and disembarking points, or offline stations, can be placed at intervals that bring mass public transit within an FMW. For convenience, a guided e-bike can be made self-balancing, enabling driverless on-demand vehicles. An e-bike equipped with smart electronics and drive controls can respond intelligently to field sensors and move autonomously under a central controller. Smart switching allows travel from origin to destination without an interchange of cycles. A DC-powered, battery-less e-cycle with voluntary manual pedaling makes the mode sustainable and provides health benefits. Tandem e-bikes, smart switching, and platoon-operation algorithms offer superior throughput for the Cycleloop. Thus, Cycleloop PRT will be an exhaustive, intensive, convenient, reliable, speedy, sustainable, safe, pollution-free, and healthy alternative mode for last-mile connectivity in cities.

Keywords: cycleloop PRT, five-minute walk, lean modular infrastructure, self-balanced intelligent e-cycle

Procedia PDF Downloads 134
9787 Feasibility Study of Distributed Lightless Intersection Control with Level 1 Autonomous Vehicles

Authors: Bo Yang, Christopher Monterola

Abstract:

Urban intersection control without traffic lights has the potential to vastly improve the efficiency of urban traffic flow. Most proposals in the literature for such lightless intersection control depend on the mass-market commercialization of highly intelligent autonomous vehicles (AVs), which limits the prospects of near-future implementation. We present an efficient lightless intersection traffic control scheme that requires only Level 1 AVs as defined by the NHTSA. The technological barriers to such lightless intersection control are thus very low. Our algorithm can also accommodate a mixture of AVs and conventional vehicles. We also carry out a large-scale numerical analysis to illustrate the feasibility, safety, robustness, comfort level, and control efficiency of our intersection control scheme.
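
The general reservation idea behind lightless intersection control can be sketched minimally. This is an illustration only, not the authors' scheme (which targets Level 1 AVs and mixed traffic): vehicles request crossing times first-come-first-served, and the controller grants the earliest slot that does not overlap a conflicting movement's reservation.

```python
# Movement pairs whose paths cross inside the intersection (hypothetical layout)
CONFLICTS = {("NS", "EW"), ("EW", "NS")}
SLOT = 2.0  # seconds a vehicle occupies the conflict zone

def reserve(schedule, movement, earliest):
    """Grant a crossing time for `movement`, no sooner than `earliest`,
    that avoids overlap with conflicting reservations in `schedule`."""
    t = earliest
    changed = True
    while changed:
        changed = False
        for mv, start in schedule:
            if (movement, mv) in CONFLICTS and abs(t - start) < SLOT:
                t = start + SLOT   # push past the conflicting slot
                changed = True
    schedule.append((movement, t))
    return t
```

Non-conflicting movements (e.g., two northbound vehicles in separate lanes) are granted their requested times immediately, which is why lightless schemes can outperform fixed signal phases that hold all approaches.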

Keywords: intersection control, autonomous vehicles, traffic modelling, intelligent transport system

Procedia PDF Downloads 459
9786 Consideration of Starlight Waves Redshift as Produced by Friction of These Waves on Its Way through Space

Authors: Angel Pérez Sánchez

Abstract:

In 1929, a redshift was discovered in the light of distant galaxies and was interpreted as being produced by galaxies moving away from each other at high speed. This interpretation eventually led to the consideration of a new source of energy, which was called Dark Energy. Redshift is a loss of light-wave frequency attributed to galaxies receding at high speed, but a loss of frequency could also be produced by friction acting on light waves on their way to Earth. Such friction would be impossible if outer space were truly empty, but if a medium existed in this apparently empty space, it would be possible. The consequences would be extraordinary, because the acceleration of the Universe and Dark Energy would be called into doubt. This article presents evidence that empty space is actually a medium occupied by different particles, among which the most significant would be the graviton or the Higgs boson, because, let us not forget, gravity also acts on empty space.

Keywords: Big Bang, dark energy, doppler effect, redshift, starlight frequency reduction, universe acceleration

Procedia PDF Downloads 65