Search results for: hybrid PSO-GA algorithm and mutual information
13922 A Study on the Synthesis and Antioxidant Activity of Hybrid Pyrazoline Integrated with Pyrazole and Thiazole Nuclei
Authors: Desta Gebretekle Shiferaw, Balakrishna Kalluraya
Abstract:
Pyrazole is an aromatic five-membered heterocycle with two nitrogen and three carbon atoms in its ring structure. According to the literature, pyrazoline-, pyrazole-, and thiazole-containing moieties are found in various drug structures and are responsible for a wide range of pharmacological effects. The pyrazoline-pyrazole carbothioamides were synthesized via the reaction of pyrazole-bearing chalcones (3-(5-chloro-3-methyl-1-phenyl-1H-pyrazol-4-yl)-1-(substituted aryl)prop-2-en-1-one derivatives) with the nucleophile thiosemicarbazide by heating in ethanol using fused sodium acetate as a catalyst. The carbothioamide derivatives were then converted into pyrazoline hybrids bearing pyrazole and thiazole moieties by condensation with substituted phenacyl bromides in alcohol in a basic medium. Next, the chemical structure of the newly synthesized molecules was confirmed by IR, 1H-NMR, and mass spectral data. Further, they were screened for their in vitro antioxidant activity. Compared to butylated hydroxyanisole (BHA), the antioxidant data showed that the synthesized compounds had good to moderate activity.
Keywords: pyrazoline-pyrazole carbothioamide derivatives, pyrazoline-pyrazole-thiazole derivatives, spectral studies, antioxidant activity
Procedia PDF Downloads 72
13921 Fixed Point of Lipschitz Quasi Nonexpansive Mappings
Authors: Maryam Moosavi, Hadi Khatibzadeh
Abstract:
The main purpose of this paper is to study the proximal point algorithm for quasi-nonexpansive mappings in Hadamard spaces. Δ-convergence and strong convergence of the cyclic resolvents for a finite family of quasi-nonexpansive mappings to a common fixed point of the mappings are established.
Keywords: fixed point, Hadamard space, proximal point algorithm, quasi-nonexpansive sequence of mappings, resolvent
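In Hadamard-space notation, the proximal point iteration studied above can be written, for a convex function f, metric d, and step sizes λₙ > 0 (a standard formulation; the paper's cyclic scheme applies the resolvents of the finite family in turn):

\[ x_{n+1} = J_{\lambda_n}(x_n) = \operatorname*{arg\,min}_{y \in X} \left[ f(y) + \frac{1}{2\lambda_n}\, d^2(y, x_n) \right] \]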
Procedia PDF Downloads 91
13920 Performance Analysis of Proprietary and Non-Proprietary Tools for Regression Testing Using Genetic Algorithm
Authors: K. Hema Shankari, R. Thirumalaiselvi, N. V. Balasubramanian
Abstract:
The present paper addresses research in the area of regression testing, with emphasis on automated tools as well as prioritization of test cases. The uniqueness of regression testing and its cyclic nature are pointed out. The difference in approach between industry, with a business model as its basis, and academia, with a focus on data mining, is highlighted. Test metrics are discussed as a prelude to our formula for prioritization; a case study is further discussed to illustrate this methodology. An industrial case study is also described in the paper, where the number of test cases is so large that they have to be grouped as test suites. In such situations, a genetic algorithm proposed by us can be used to reconfigure these test suites in each cycle of regression testing. The comparison is made between a proprietary tool and an open source tool using the above-mentioned metrics. Our approach is clarified through several tables.
Keywords: APFD metric, genetic algorithm, regression testing, RFT tool, test case prioritization, Selenium tool
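The APFD metric named in the keywords rewards orderings that reveal faults early. A minimal sketch of how it is computed (the fault matrix here is hypothetical test data, not from the paper):

```python
def apfd(order, faults_detected_by):
    """Average Percentage of Faults Detected for a test-case ordering.

    order: list of test-case IDs in execution order.
    faults_detected_by: dict mapping fault ID -> set of test IDs revealing it.
    """
    n = len(order)
    m = len(faults_detected_by)
    position = {test: i + 1 for i, test in enumerate(order)}  # 1-based ranks
    # TF_i: rank of the first test in the order that detects fault i
    tf = [min(position[t] for t in tests if t in position)
          for tests in faults_detected_by.values()]
    return 1 - sum(tf) / (n * m) + 1 / (2 * n)

# Example: 4 tests, 3 faults
faults = {"f1": {"t2"}, "f2": {"t1", "t4"}, "f3": {"t3"}}
print(apfd(["t1", "t2", "t3", "t4"], faults))  # 0.625
```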
Procedia PDF Downloads 436
13919 An Investigation on Interface Shear Resistance of Twinwall Units for Tank Structures
Authors: Jaylina Rana, Chanakya Arya, John Stehle
Abstract:
Hybrid precast twinwall concrete units, mainly used in basement, core and crosswall construction, are now being adopted in water retaining tank structures. Their use offers many advantages compared with conventional in-situ concrete alternatives; however, the design could be optimised further via a deeper understanding of the unique load transfer mechanisms in the system. In the tank application, twinwall units, which consist of two precast concrete biscuits connected by steel lattices and an in-situ concrete core, are subject to bending. Uncertainties about the degree of composite action between the precast biscuits, and hence the flexural performance of the units, necessitated laboratory tests to investigate the interface shear resistance. Testing was also required to assess both the leakage performance and buildability of a variety of joint details. This paper describes some aspects of this novel approach to the design/construction of tank structures as well as selected results from some of the tests that were carried out.
Keywords: hybrid construction, twinwall, precast construction, composite action
Procedia PDF Downloads 482
13918 Expert Supporting System for Diagnosing Lymphoid Neoplasms Using Probabilistic Decision Tree Algorithm and Immunohistochemistry Profile Database
Authors: Yosep Chong, Yejin Kim, Jingyun Choi, Hwanjo Yu, Eun Jung Lee, Chang Suk Kang
Abstract:
For the past decades, immunohistochemistry (IHC) has played an important role in the diagnosis of human neoplasms by helping pathologists to make clearer decisions on differential diagnosis, subtyping, personalized treatment planning, and, finally, prognosis prediction. However, the IHC performed on various tumors in daily practice often yields conflicting results that are very challenging to interpret. Even a comprehensive diagnosis synthesizing clinical, histologic and immunohistochemical findings can be helpless in some intricate cases. Another important issue is that IHC data is increasing exponentially, and more and more information has to be taken into account. For this reason, we set out to develop an expert supporting system to help pathologists make better decisions when diagnosing human neoplasms with IHC results. We designed a probabilistic decision tree algorithm and tested it with real case data of lymphoid neoplasms, in which the IHC profile is more important for making a proper diagnosis than in other human neoplasms. The probabilistic decision tree was designed based on Bayes' theorem, the computational process was programmed using MATLAB (The MathWorks, Inc., USA), and an IHC profile database (about 104 disease categories and 88 IHC antibodies) was prepared based on the WHO classification by reviewing the literature. The initial probability of each neoplasm was set with the epidemiologic data of lymphoid neoplasms in Korea. With the IHC results of 131 sequentially selected patients, the top three presumptive diagnoses for each case were made and compared with the original diagnoses. After review of the data, 124 out of 131 were used for the final analysis. As a result, the presumptive diagnoses were concordant with the original diagnoses in 118 cases (93.7%). The major reason for discordance was the similarity of the IHC profiles of two or three different neoplasms. The expert supporting system algorithm presented in this study is in its elementary stage and needs more optimization using more advanced technology, such as deep learning with real case data, especially for differentiating T-cell lymphomas. Although it needs more refinement, it may be used to aid pathological decision making in the future. A further application, determining IHC antibodies for a certain subset of differential diagnoses, might also be possible in the near future.
Keywords: database, expert supporting system, immunohistochemistry, probabilistic decision tree
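The Bayesian update behind such a decision tree can be sketched as follows; the priors and marker likelihoods here are illustrative placeholders, not values from the paper's database:

```python
# Hypothetical priors/likelihoods for three lymphoid neoplasm categories;
# real values would come from epidemiologic data and an IHC profile database.
priors = {"DLBCL": 0.40, "Follicular lymphoma": 0.25, "Mantle cell lymphoma": 0.35}
# P(marker positive | disease) for each IHC antibody
likelihood = {
    "CD10":     {"DLBCL": 0.40, "Follicular lymphoma": 0.90, "Mantle cell lymphoma": 0.05},
    "CyclinD1": {"DLBCL": 0.02, "Follicular lymphoma": 0.02, "Mantle cell lymphoma": 0.95},
}

def update(posterior, marker, positive=True):
    """One Bayes step: multiply by P(result | disease), then renormalize."""
    post = {d: p * (likelihood[marker][d] if positive else 1 - likelihood[marker][d])
            for d, p in posterior.items()}
    z = sum(post.values())
    return {d: p / z for d, p in post.items()}

post = dict(priors)
for marker, result in [("CD10", False), ("CyclinD1", True)]:
    post = update(post, marker, result)
# Ranked presumptive diagnoses, as in the abstract's "top three" output
print(sorted(post.items(), key=lambda kv: -kv[1]))
```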
Procedia PDF Downloads 224
13917 A Parallel Implementation of k-Means in MATLAB
Authors: Dimitris Varsamis, Christos Talagkozis, Alkiviadis Tsimpiris, Paris Mastorocostas
Abstract:
The aim of this work is the parallel implementation of k-means in MATLAB, in order to reduce the execution time. Specifically, a new MATLAB function for the serial k-means algorithm is developed, which meets all the requirements for conversion to a MATLAB function with parallel computations. Additionally, two different variants for the definition of the initial values are presented. In the sequel, the parallel approach is presented. Finally, performance tests for the computation times with respect to the numbers of features and classes are illustrated.
Keywords: k-means algorithm, clustering, parallel computations, MATLAB
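The step that parallelizes naturally is the point-to-centroid assignment, since each data chunk can be processed independently. A minimal sketch of that idea (in Python rather than the paper's MATLAB; the chunking and initialization choices are illustrative):

```python
import numpy as np
from multiprocessing import Pool

def assign_chunk(args):
    """Assignment step for one chunk of points: nearest centroid index."""
    points, centroids = args
    d = ((points[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def parallel_kmeans(X, k, iters=20, workers=4, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]  # one initialization variant
    chunks = np.array_split(X, workers)
    with Pool(workers) as pool:
        for _ in range(iters):
            labels = np.concatenate(
                pool.map(assign_chunk, [(c, centroids) for c in chunks]))
            # Update step: cluster means (serial; cheap relative to assignment)
            centroids = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                                  else centroids[j] for j in range(k)])
    return centroids, labels

if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(10_000, 5))
    C, y = parallel_kmeans(X, k=3)
    print(C.shape, np.bincount(y))
```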
Procedia PDF Downloads 385
13916 Oil Pollution Analysis of the Ecuadorian Rainforest Using Remote Sensing Methods
Authors: Juan Heredia, Naci Dilekli
Abstract:
The Ecuadorian Rainforest has been polluted for almost 60 years with little to no regard for oversight, laws, or regulations. The consequences have been vast environmental damage, such as pollution and deforestation, as well as sickness and the death of many people and animals. The aim of this paper is to quantify and localize the polluted zones, something that has not been done before and is the first step toward remediation. To approach this problem, multi-spectral remote sensing imagery was processed with a novel algorithm developed for this study, based on four normalized indices available in the literature. The algorithm classifies each pixel as polluted or healthy. The results of this study include a new algorithm for pixel classification and a quantification of the polluted area in the selected image. Those results were finally validated by ground control points found in the literature. The main conclusion of this work is that, using hyperspectral images, it is possible to identify polluted vegetation. Future work includes environmental remediation, in-situ tests, and more extensive results that would inform new policymaking.
Keywords: remote sensing, oil pollution quantification, Amazon forest, hyperspectral remote sensing
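Index-based pixel classification of this kind can be sketched as below; the two indices, band names, and thresholds are illustrative stand-ins for the four literature indices the abstract refers to:

```python
import numpy as np

def normalized_index(band_a, band_b):
    """Generic normalized difference index, e.g. NDVI = (NIR - Red)/(NIR + Red)."""
    return (band_a - band_b) / (band_a + band_b + 1e-9)

def classify_pixels(bands, thresholds):
    """Flag a pixel as polluted when an index crosses its threshold.

    bands: dict of 2D reflectance arrays. The paper's exact decision rule over
    its four indices is not given in the abstract; this is one plausible form.
    """
    ndvi = normalized_index(bands["nir"], bands["red"])
    ndwi = normalized_index(bands["green"], bands["nir"])
    votes = (ndvi < thresholds["ndvi"]).astype(int) + (ndwi > thresholds["ndwi"]).astype(int)
    return votes >= 1  # boolean mask of "polluted" pixels

rng = np.random.default_rng(0)
bands = {k: rng.uniform(0.0, 1.0, (64, 64)) for k in ("nir", "red", "green")}
mask = classify_pixels(bands, {"ndvi": 0.2, "ndwi": 0.3})
print("polluted fraction:", mask.mean())
```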
Procedia PDF Downloads 163
13915 Verification & Validation of Map Reduce Program Model for Parallel K-Mediod Algorithm on Hadoop Cluster
Authors: Trapti Sharma, Devesh Kumar Srivastava
Abstract:
This paper is an analysis study of a MapReduce implementation, aiming to verify and validate the MapReduce solution model for the parallel K-medoid algorithm on a Hadoop cluster. MapReduce is a programming model which enables the processing of huge amounts of data in parallel on a large number of devices. It is especially well suited to constant or moderately changing sets of data, since the startup overhead of a job is usually high. MapReduce has slowly become the framework of choice for "big data". The MapReduce model allows for systematic and rapid processing of large-scale data with a cluster of compute nodes. One of the primary concerns in Hadoop is how to minimize the completion time (i.e., makespan) of a set of MapReduce jobs. In this paper, we have verified and validated various MapReduce applications like wordcount, grep, terasort, and the parallel K-medoid clustering algorithm. We have found that as the number of nodes increases, the completion time decreases.
Keywords: Hadoop, MapReduce, k-medoid, validation, verification
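The wordcount application named above is the canonical MapReduce example; a minimal local simulation of its two phases (a Python stand-in for the Hadoop job, with the framework's shuffle/sort replaced by an explicit sort):

```python
from itertools import groupby

def mapper(lines):
    """Map phase of wordcount: emit (word, 1) pairs."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce phase: sum counts per word (input must be sorted by key)."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

# Local simulation; on a cluster the shuffle/sort between the phases
# is handled by the Hadoop framework across many nodes.
text = ["map reduce map", "hadoop cluster map"]
for word, count in reducer(mapper(text)):
    print(word, count)
```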
Procedia PDF Downloads 369
13914 Fingerprint Image Encryption Using a 2D Chaotic Map and Elliptic Curve Cryptography
Authors: D. M. S. Bandara, Yunqi Lei, Ye Luo
Abstract:
Fingerprints are suitable as long-term markers of human identity since they provide detailed and unique individual features which are difficult to alter and durable over a lifetime. In this paper, we propose an algorithm to encrypt and decrypt fingerprint images by using a specially designed Elliptic Curve Cryptography (ECC) procedure based on block ciphers. In addition, to increase the confusing effect of fingerprint encryption, we also utilize a chaos-based method called the Arnold Cat Map (ACM) for 2D scrambling of pixel locations in our method. Experiments are carried out with various types of efficiency and security analyses. As a result, we demonstrate that the proposed fingerprint encryption/decryption algorithm is advantageous in several different aspects, including efficiency, security and flexibility. In particular, using this algorithm, we achieve a margin of about 0.1% in the test of Number of Pixel Changing Rate (NPCR) values compared to state-of-the-art performances.
Keywords: Arnold cat map, biometric encryption, block cipher, elliptic curve cryptography, fingerprint encryption, Koblitz’s encoding
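The ACM scrambling step is a simple area-preserving permutation of pixel coordinates. A minimal sketch for a square image (the iteration count and image are illustrative; the paper combines this step with the ECC block cipher):

```python
import numpy as np

def arnold_cat_map(img, iterations=1):
    """Scramble an N x N image with the Arnold cat map
    (x, y) -> (x + y, x + 2y) mod N, applied 'iterations' times."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "ACM as used here needs a square image"
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = img.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        nxt[(x + y) % n, (x + 2 * y) % n] = out  # move pixel (x, y) to its image
        out = nxt
    return out

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
print(arnold_cat_map(img, iterations=3))
```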
Procedia PDF Downloads 204
13913 HR MRI CS Based Image Reconstruction
Authors: Krzysztof Malczewski
Abstract:
A Magnetic Resonance Imaging (MRI) reconstruction algorithm using compressed sensing is presented in this paper. It is shown that the proposed approach improves the spatial resolution of MR images in circumstances when highly undersampled k-space trajectories are applied. Compressed Sensing (CS) aims at reconstructing signals and images from significantly fewer measurements than were conventionally assumed necessary. MRI is a fundamental medical imaging method that struggles with an inherently slow data acquisition process. The application of CS to MRI has the potential for significant scan time reductions, with visible benefits for patients and health care economics. In this study, the objective is to combine a super-resolution image enhancement algorithm with the benefits of the CS framework to achieve a high-resolution MR output image. Both methods emphasize maximizing image sparsity in a known sparse transform domain while maintaining data fidelity. The presented algorithm accounts for cardiac and respiratory movements.
Keywords: super-resolution, MRI, compressed sensing, sparse-sense, image enhancement
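In standard notation, the CS reconstruction the abstract builds on solves (a textbook formulation; the paper's variant adds super-resolution and motion handling on top):

\[ \hat{x} = \operatorname*{arg\,min}_{x} \|\Psi x\|_1 \quad \text{subject to} \quad \|F_u x - y\|_2 \le \varepsilon \]

where \(\Psi\) is the sparsifying transform, \(F_u\) the undersampled Fourier (k-space) operator, \(y\) the acquired k-space data, and \(\varepsilon\) a noise tolerance.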
Procedia PDF Downloads 430
13912 Triangulations via Iterated Largest Angle Bisection
Authors: Yeonjune Kang
Abstract:
A triangulation of a planar region is a partition of that region into triangles. In the finite element method, triangulations are often used as the grid underlying a computation. In order to be suitable as a finite element mesh, a triangulation must have well-shaped triangles, according to criteria that depend on the details of the particular problem. For instance, most methods require that all triangles be small and as close to the equilateral shape as possible. Stated differently, one wants to avoid having either thin or flat triangles in the triangulation. There are many triangulation procedures, a particular one being the one known as the longest edge bisection algorithm described below. Starting with a given triangle, locate the midpoint of the longest edge and join it to the opposite vertex of the triangle. Two smaller triangles are formed; apply the same bisection procedure to each of these triangles. Continuing in this manner, after n steps one obtains a triangulation of the initial triangle into 2^n smaller triangles. The longest edge algorithm was first considered in the late 1970s. It was shown by various authors that this triangulation has the desirable properties for the finite element method: independently of the number of iterations, the angles of these triangles cannot get too small; moreover, the size of the triangles decays exponentially. In the present paper we consider a related triangulation algorithm we refer to as the largest angle bisection procedure. As the name suggests, rather than bisecting the longest edge, at each step we bisect the largest angle. We study the properties of the resulting triangulation and prove that, while the general behavior resembles the one in the longest edge bisection algorithm, there are several notable differences as well.
Keywords: angle bisectors, geometry, triangulation, applied mathematics
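The longest edge procedure described above is easy to state in code; a minimal sketch (the largest angle variant would instead split at the vertex with the widest angle, bisecting that angle):

```python
import numpy as np

def bisect_longest_edge(tri):
    """One step of longest-edge bisection: split the triangle at the midpoint
    of its longest edge, joining it to the opposite vertex. tri: 3x2 array."""
    tri = np.asarray(tri, dtype=float)
    # Edge i is opposite vertex i; find the longest one
    lengths = [np.linalg.norm(tri[(i + 1) % 3] - tri[(i + 2) % 3]) for i in range(3)]
    i = int(np.argmax(lengths))
    mid = (tri[(i + 1) % 3] + tri[(i + 2) % 3]) / 2
    return [np.array([tri[i], tri[(i + 1) % 3], mid]),
            np.array([tri[i], mid, tri[(i + 2) % 3]])]

def refine(tri, steps):
    """Apply n bisection steps, yielding 2**n triangles."""
    tris = [np.asarray(tri, dtype=float)]
    for _ in range(steps):
        tris = [child for t in tris for child in bisect_longest_edge(t)]
    return tris

print(len(refine([[0, 0], [1, 0], [0, 1]], steps=4)))  # 16 triangles
```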
Procedia PDF Downloads 401
13911 Hybrid Finite Element Analysis of Expansion Joints for Piping Systems in Aircraft Engine External Configurations and Nuclear Power Plants
Authors: Dong Wook Lee
Abstract:
This paper presents a method to analyze the stiffness of an expansion joint with structural support, using a hybrid approach that combines computational and analytical methods. Many expansion joints found in the tubes and ducts of mechanical structures are designed to absorb the thermal expansion mismatch between their structural members and to deal with misalignments introduced by the assembly/manufacturing processes. One of the important design perspectives is the system’s vibrational characteristics. We calculate the stiffness as a characterization parameter for structural joint systems using combined Finite Element Analysis (FEA) and an analytical method. We apply the methods to two sample applications: external configurations of aircraft engines and nuclear power plant structures.
Keywords: expansion joint, expansion joint stiffness, finite element analysis, nuclear power plants, aircraft engine external configurations
Procedia PDF Downloads 111
13910 Traditional Drawing, BIM and Erudite Design Process
Authors: Maryam Kalkatechi
Abstract:
Nowadays, parametric design, scientific analysis, and digital fabrication are dominant. Many architectural practices are increasingly seeking to incorporate advanced digital software and fabrication in their projects. An erudite design process that combines digital and practical aspects in a strong frame within the method resulted from the dissertation research. The digital aspects are the progressive advancements in algorithm design and simulation software. These aspects have assisted firms in developing more holistic concepts at the early stage and maintaining collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to implement construction and architecture knowledge within the algorithm to make design processes successful. The erudite design process also involves ongoing improvements in applying the new method of 3D printing in construction. This is achieved through 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation. It accommodates the decisions of the architect on the algorithm. This paper introduces the erudite design process and its components. It summarizes the application of this process in the development of the '3D printed construction unit'. This paper contributes to overlaying academia and practice with advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes.
Keywords: erudite, data-sketch, algorithm design in architecture, design process
Procedia PDF Downloads 275
13909 Facial Biometric Privacy Using Visual Cryptography: A Fundamental Approach to Enhance the Security of Facial Biometric Data
Authors: Devika Tanna
Abstract:
'Biometrics' means 'life measurement', but the term is usually associated with the use of unique physiological characteristics to identify an individual. It is important to secure the privacy of digital face images stored in a central database. To impart privacy to such biometric face images, the digital face image is first split into two host face images, such that each of them gives no idea of the existence of the original face image, and then each cover image is stored in one of two geographically separate databases. Only when both cover images are simultaneously available can the original image be accessed. This can be achieved by using the XM2VTS and IMM face databases and an adaptive algorithm for spatial greyscale. The algorithm helps to select the appropriate host images, which are most likely to be compatible with the secret image stored in the central database, based on its geometry and appearance. The encryption is done using GEVCS, which results in a reconstructed image identical to the original private image.
Keywords: adaptive algorithm, database, host images, privacy, visual cryptography
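The two-database split can be illustrated with a much simpler 2-out-of-2 sharing scheme; this XOR sketch captures only the "both shares required" property, whereas the GEVCS used in the paper additionally makes each share look like a natural host face:

```python
import numpy as np

def make_shares(secret, rng=None):
    """Simplified 2-out-of-2 secret sharing for a greyscale image: one random
    share plus an XOR share, to be stored in separate databases."""
    rng = rng or np.random.default_rng()
    share1 = rng.integers(0, 256, secret.shape, dtype=np.uint8)
    share2 = np.bitwise_xor(secret, share1)
    return share1, share2

def reconstruct(share1, share2):
    """Both shares are required; either one alone is uniformly random."""
    return np.bitwise_xor(share1, share2)

secret = np.random.default_rng(0).integers(0, 256, (8, 8), dtype=np.uint8)
s1, s2 = make_shares(secret)
assert np.array_equal(reconstruct(s1, s2), secret)
```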
Procedia PDF Downloads 130
13908 Antibacterial Studies on Cellulolytic Bacteria for Termite Control
Authors: Essam A. Makky, Chan Cai Wen, Muna Jalal, Mashitah M. Yusoff
Abstract:
Termites are considered important pests that can cause severe wood damage and economic losses in urban, agricultural and forest areas of Malaysia. The ability of termites to degrade cellulose depends on the association of gut cellulolytic microflora, better known as mutual symbionts. With the idea of disrupting this mutual symbiotic association, better pest control practices can be attained. This study aimed to isolate cellulolytic bacteria from the gut of termites and carry out antibacterial studies for termite control. Confirmation of cellulase activity was done by qualitative and quantitative methods. The impacts of antibiotics and their combinations, as well as heavy metals and disinfectants, were assessed using the disc diffusion method. Effective antibacterial agents were then subjected to termite treatment to study their effectiveness as termiticides. 24 cellulolytic bacteria were isolated, purified and screened from the gut of termites. All isolates were identified as Gram-negative and either rod- or cocci-shaped. In the antibacterial studies, isolates were found to be 100% sensitive to 4 antibiotics (rifampicin, tetracycline, gentamycin, and neomycin), 2 heavy metals (cadmium and mercury) and 3 disinfectants (lactic acid, formalin, and hydrogen peroxide). 22 out of 36 antibiotic combinations showed a synergistic effect, while 15 antibiotic combinations showed an antagonistic effect on isolates. The 2 heavy metals and 3 disinfectants that showed 100% effectiveness, as well as the 22 antibiotic combinations that showed a synergistic effect, were used for termite control. Among the 27 selected antibacterial agents, 12 were found to be effective in killing all the termites within 1 to 6 days. Mercury, lactic acid, formalin and hydrogen peroxide were found to be the most effective termiticides, killing all termites within only 1 day. These effective antibacterial agents possess great potential as a new approach to controlling termite pest species in the future.
Keywords: antibacterial, cellulase, termiticide, termites
Procedia PDF Downloads 467
13907 The Information-Seeking Behaviour of Kuwaiti Judges (KJs)
Authors: Essam Mansour
Abstract:
The key purpose of this study is to show the information-seeking behaviour of Kuwaiti Judges (KJs). Being one of the few studies about information needs and information-seeking behaviour conducted in Arab and developing countries, this study is a pioneering one, especially with regard to this significant group of information users. The authors investigated this behaviour in terms of KJs' thoughts, perceptions, motivations, techniques, preferences, tools and the barriers met when seeking information. The authors employed a questionnaire, with a response rate of 77.2 percent. This study showed that most KJs were likely to be older and educated, with work experience ranging from new to long-standing. There is a statistically significant difference between KJs' demographic characteristics and some sources of information, such as books, encyclopedias, references and mass media. KJs used information moderately to make decisions, to keep up with current events, to collect statistics and to conduct specific/general research. The office and home were the most frequent locations from which KJs accessed information. KJs' proficiency in English was described as moderately good, and a small number of them confirmed that their proficiency in French was passable. The assistance provided by colleagues, followed by consultants, translators, secretaries and librarians, was found to be the strongest type of assistance needed when seeking information. Mobile apps, followed by PCs, information networks (the Internet) and information databases, were the technology tools most used by KJs. Printed materials, followed by non-printed and audiovisual materials, were the information formats KJs most preferred. The use of languages, the recency of information, the place of information, and the deficient role of the library in delivering information were significant barriers to KJs when seeking information.
Keywords: information users, information-seeking behaviour, information needs, judges, Kuwait
Procedia PDF Downloads 307
13906 Development and Characterization of Acoustic Energy Harvesters for Low Power Wireless Sensor Network
Authors: Waheed Gul, Muhammad Zeeshan, Ahmad Raza Khan, Muhammad Khurram
Abstract:
Wireless Sensor Nodes (WSNs) have developed significantly over the years and have significant potential in diverse applications in the fields of science and technology. The inadequate energy accompanying WSNs is a key constraint on WSN capabilities. To overcome this main restraint, the development of effective and reliable energy harvesting systems for WSN environments is being explored. In this research, low-power acoustic energy harvesters are designed and developed by applying different techniques of energy transduction from the sound available in the surroundings. Three acoustic energy harvesters were developed, based on the piezoelectric phenomenon, electromagnetic transduction, and a hybrid of the two, respectively. CAD modelling, lumped modelling and Finite Element Analysis of the harvesters were carried out. The voltages were obtained using FEA for each acoustic harvester. Characterization of all three harvesters was carried out, and the power generated by the piezoelectric harvester, the electromagnetic harvester and the hybrid acoustic energy harvester is 2.25×10⁻⁹ W, 0.0533 W and 0.0232 W, respectively.
Keywords: energy harvesting, WSNs, piezoelectric, electromagnetic, power
Procedia PDF Downloads 71
13905 A Review on the Potential of Electric Vehicles in Reducing World CO2 Footprints
Authors: S. Alotaibi, S. Omer, Y. Su
Abstract:
Conventional Internal Combustion Engine (ICE) based vehicles are a threat to the environment, as they account for a large proportion of the overall greenhouse gas (GHG) emissions in the world. Hence, it is necessary to replace these vehicles with more environmentally friendly vehicles. Electric Vehicles (EVs) are promising technologies which offer both human comfort (less noise and pollution) as well as reduced (or no) emissions of GHGs. In this paper, different types of EVs are reviewed and their advantages and disadvantages are identified. It is found that, in terms of fuel economy, Plug-in Hybrid EVs (PHEVs) have the best fuel economy, followed by Hybrid EVs (HEVs) and ICE vehicles. Since Battery EVs (BEVs) do not use any fuel, their fuel economy is estimated as price per kilometer. Similarly, in terms of GHG emissions, BEVs are the most environmentally friendly, since they do not produce any emissions, while HEVs and PHEVs produce fewer emissions compared to conventional ICE based vehicles. Fuel Cell EVs (FCEVs) are also zero-emission vehicles, but they have large costs associated with them. Finally, if the electricity is provided by renewable energy technologies through the grid connection, then BEVs could be considered zero-emission vehicles.
Keywords: electric vehicles, zero emission car, fuel economy, CO₂ footprint
Procedia PDF Downloads 147
13904 Distribution System Planning with Distributed Generation and Capacitor Placements
Authors: Nattachote Rugthaicharoencheep
Abstract:
This paper presents a feeder reconfiguration problem in distribution systems. The objective is to minimize the system power loss and to improve the bus voltage profile. The optimization problem is subject to system constraints consisting of load-point voltage limits, the radial configuration format, no load-point interruption, and feeder capability limits. A method based on a genetic algorithm, a search algorithm based on the mechanics of natural selection and natural genetics, is proposed to determine the optimal pattern of configuration. The developed methodology is demonstrated on a 33-bus radial distribution system with distributed generations and feeder capacitors. The study results show that the optimal on/off patterns of the switches can be identified to give the minimum power loss while respecting all the constraints.
Keywords: network reconfiguration, distributed generation, capacitor placement, loss reduction, genetic algorithm
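The encoding is natural: each chromosome is a binary on/off pattern over the switches. A minimal GA sketch (the fitness shown is a placeholder; a real one would run a load flow on the 33-bus system and penalize constraint violations):

```python
import random

N_SWITCHES = 10  # tie/sectionalizing switches in the feeder (illustrative)

def power_loss(pattern):
    """Placeholder fitness: a real implementation would run a radial load flow
    and return total I^2*R loss, penalizing violations of voltage limits,
    radiality, load-point interruption, and feeder capacity."""
    return sum(b * w for b, w in zip(pattern, range(1, N_SWITCHES + 1)))

def ga(pop_size=30, generations=50, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(N_SWITCHES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=power_loss)                  # selection: keep the fittest half
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_SWITCHES)  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]  # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=power_loss)

print(ga())
```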
Procedia PDF Downloads 177
13903 Hybrid Method for Smart Suggestions in Conversations for Online Marketplaces
Authors: Yasamin Rahimi, Ali Kamandi, Abbas Hoseini, Hesam Haddad
Abstract:
Online/offline chat is a convenient channel in electronic markets for second-hand products, in which potential customers would like more information about the products to fill the information gap between buyers and sellers. Online peer-to-peer markets are trying to create artificial intelligence-based systems that help customers ask more informative questions in an easier way. In this article, we introduce a method for the question/answer system that we have developed for the top-ranked electronic market in Iran, called Divar. When it comes to second-hand products, incomplete product information in a purchase will result in a loss to the buyer. One way to balance buyer and seller information about a product is to help the buyer ask more informative questions when purchasing. Also, a short time to start the conversation and achieve its desired result was one of our main goals, which was achieved according to A/B test results. In this paper, we propose and evaluate a method for suggesting questions and answers in the messaging platform of the e-commerce website Divar. The purpose of creating such systems is to help users gather knowledge about the product more easily and quickly, all from the Divar database. We collected a dataset of around 2 million messages in colloquial Persian, and for each category of product, we gathered 500K messages, of which only 2K were tagged, so semi-supervised methods were used. In order to release the proposed model to production, it is required to be fast enough to process 10 million messages daily on CPU processors. In order to reach that speed, in many subtasks, faster and simpler models are preferred over deep neural models. The proposed method, which requires only a small amount of labeled data, is currently used in Divar production on CPU processors; 15% of buyers' and sellers' messages in conversations are directly chosen from our model output, and more than 27% of buyers have used this model's suggestions in at least one daily conversation.
Keywords: smart reply, spell checker, information retrieval, intent detection, question answering
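One simple retrieval-based backbone for such suggestions is to rank a pool of canned questions by similarity to the conversation so far. The sketch below is an illustrative stand-in (TF-IDF over English text), not Divar's production pipeline, which per the abstract also involves intent detection, spell checking, and semi-supervised tagging:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical pool of canned buyer questions for one product category
question_pool = [
    "Is the price negotiable?",
    "What is the condition of the item?",
    "Which year was it manufactured?",
    "Is the warranty still valid?",
]

def suggest(context, pool=question_pool, k=2):
    """Rank canned questions by TF-IDF cosine similarity to the conversation."""
    vec = TfidfVectorizer()
    m = vec.fit_transform(pool + [context])
    sims = cosine_similarity(m[-1], m[:-1]).ravel()
    return [pool[i] for i in sims.argsort()[::-1][:k]]

print(suggest("hello, does it still have warranty and what condition is it in"))
```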
Procedia PDF Downloads 187
13902 Energy Harvesting and Storage System for Marine Applications
Authors: Sayem Zafar, Mahmood Rahi
Abstract:
Rigorous international maritime regulations are in place to limit boat and ship hydrocarbon emissions. The global sustainability goals include reducing fuel consumption and minimizing emissions from ships and boats. These maritime sustainability goals have attracted a lot of research interest. An energy harvesting and storage system is designed in this study based on hybrid renewable and conventional energy systems. This energy harvesting and storage system is designed for marine applications, such as boats and small ships. These systems can be utilized for mobile use or off-grid remote electrification. This study analyzed the use of micro power generation for boats and small ships. The energy harvesting and storage system has two distinct subsystems: a dockside shore-based system and an on-board system. The shore-based system consists of a small wind turbine, photovoltaic (PV) panels, a small gas turbine, a hydrogen generator and a high-pressure hydrogen storage tank. This dockside system provides boats and small ships easy access to a supply of hydrogen. The on-board system consists of hydrogen storage tanks and fuel cells. The wind turbine and PV panels generate electricity to operate the electrolyzer. A small gas turbine is used as a supplementary power system in case the hybrid renewable energy system does not provide the required energy. The electrolyzer performs electrolysis on distilled water to produce hydrogen. The hydrogen is stored in high-pressure tanks. The hydrogen from the high-pressure tank is transferred to the low-pressure tanks on board seagoing vessels to operate the fuel cell. The boats and small ships use the hydrogen fuel cell to power electric propulsion motors and for on-board auxiliary use. For the shore-based system, a small wind turbine with a total length of 4.5 m and a disk diameter of 1.8 m is used. The small wind turbine's dimensions make it big enough to charge batteries yet small enough to be installed on the rooftop of the dockside facility. The small dimensions also make the wind turbine easily transportable. In this paper, PV sizing and solar flux are studied parametrically. System performance is evaluated under different operating and environmental conditions. The parametric study is conducted to evaluate the energy output and storage capacity of the energy storage system. Results are generated for a wide range of conditions to analyze the usability of the hybrid energy harvesting and storage system. This energy harvesting method significantly improves the usability and output of the renewable energy sources. It also shows that small hybrid energy systems have promising practical applications.
Keywords: energy harvesting, fuel cell, hybrid energy system, hydrogen, wind turbine
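For scale, the available power of the 1.8 m rotor mentioned above can be estimated from the standard wind power relation P = ½ ρ A v³ Cp; the wind speed and power coefficient below are assumptions, not figures from the paper:

```python
import math

def wind_power(diameter_m, wind_speed_ms, cp=0.35, air_density=1.225):
    """Rotor power P = 0.5 * rho * A * v^3 * Cp (Cp assumed, not from the paper)."""
    area = math.pi * (diameter_m / 2) ** 2
    return 0.5 * air_density * area * wind_speed_ms ** 3 * cp

# 1.8 m disk diameter as in the abstract, at an assumed 6 m/s wind speed
print(f"{wind_power(1.8, 6.0):.0f} W")  # roughly 120 W
```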
Procedia PDF Downloads 138
13901 A Comparison of Sequential Quadratic Programming, Genetic Algorithm, Simulated Annealing, Particle Swarm Optimization for the Design and Optimization of a Beam Column
Authors: Nima Khosravi
Abstract:
This paper describes an integrated optimization technique with concurrent use of sequential quadratic programming, genetic algorithm, simulated annealing, and particle swarm optimization for the design and optimization of a beam column. In this research, a comparison between the four different types of optimization methods is carried out. It is found that all the methods meet the required constraints, and the lowest value of the objective function is achieved by SQP, which was also the fastest optimizer to produce the results. SQP is a gradient-based optimizer; hence its results are usually the same after every run. The only thing which affects the results is the initial conditions given. The initial conditions given in the various test runs differed widely; hence, the values converged at different points. The rest of the methods are heuristic methods, which provide different values for different runs even if every parameter is kept constant.
Keywords: beam column, genetic algorithm, particle swarm optimization, sequential quadratic programming, simulated annealing
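As a point of reference, SQP-style constrained minimization is available off the shelf; the sketch below uses SciPy's SLSQP on a toy section-sizing problem that is only illustrative, not the paper's beam-column formulation:

```python
from scipy.optimize import minimize

# Illustrative constrained design problem (NOT the paper's beam-column model):
# minimize cross-sectional area subject to a section-modulus requirement.
def objective(x):
    b, h = x              # width and depth of a rectangular section
    return b * h          # area as a weight proxy

def modulus_constraint(x):
    b, h = x
    return b * h ** 2 / 6 - 5.0e-4   # S = b*h^2/6 must exceed a required value

result = minimize(
    objective, x0=[0.1, 0.2], method="SLSQP",
    bounds=[(0.01, 0.5), (0.01, 0.5)],
    constraints=[{"type": "ineq", "fun": modulus_constraint}],
)
# Gradient-based: reruns from the same x0 reproduce the same optimum,
# matching the repeatability observation in the abstract.
print(result.x, result.fun)
```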
Procedia PDF Downloads 386
13900 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation
Authors: Aicha Majda, Abdelhamid El Hassani
Abstract:
Lung CT image segmentation is a prerequisite for lung CT image analysis. Most of the conventional methods need post-processing to deal with abnormal lung CT scans, such as those with lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm consists of directly comparing the pixel values of the two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor perturbations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined based on a patch-based similarity measurement instead of the simple intensity measurement of the standard method. The weights between each pixel and its neighboring pixels are based on the obtained new term. The graph is then created using these weights between its nodes. Finally, the segmentation is completed with the minimum-cut/max-flow algorithm. Experimental results show that the proposed method is very accurate and efficient and can directly provide explicit lung regions without any post-processing operations, compared to the standard method.
Keywords: graph cuts, lung CT scan, lung parenchyma segmentation, patch-based similarity metric
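A boundary weight of this kind compares whole neighborhoods rather than single intensities; one common Gaussian form is sketched below (the patch radius, sigma, and exact normalization are assumptions, as the paper's precise definition is not given in the abstract):

```python
import numpy as np

def patch_weight(img, p, q, radius=2, sigma=10.0):
    """Boundary weight between neighboring pixels p and q from patch similarity:
    w = exp(-||Patch(p) - Patch(q)||^2 / (2 * sigma^2)).
    Assumes p and q are interior pixels (full patches fit in the image)."""
    def patch(c):
        y, x = c
        return img[y - radius: y + radius + 1, x - radius: x + radius + 1].astype(float)
    d2 = np.sum((patch(p) - patch(q)) ** 2)
    return np.exp(-d2 / (2 * sigma ** 2))

img = np.random.default_rng(0).integers(0, 255, (32, 32))
print(patch_weight(img, (10, 10), (10, 11)))
```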
Procedia PDF Downloads 169
13899 Graph Codes - 2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval
Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje
Abstract:
Multimedia indexing and retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based algorithms for similarity are quite inefficient and computationally intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a graph projection into a 2D space (Graph Code) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph traversals due to a simpler processing model and a high level of parallelization. In consequence, we prove that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
Keywords: indexing, retrieval, multimedia, graph algorithm, graph code
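The gain comes from replacing traversal with array arithmetic. A simplified reading of the idea, with a made-up vocabulary and matrix encoding (the authors' exact layout may differ):

```python
import numpy as np

def graph_code(nodes, edges, vocabulary):
    """Project a labeled feature graph onto a fixed 2D matrix: one row/column
    per vocabulary term; the diagonal marks node presence, off-diagonal cells
    mark edges. A sketch of the Graph Code idea, not the authors' encoding."""
    index = {term: i for i, term in enumerate(vocabulary)}
    m = np.zeros((len(vocabulary), len(vocabulary)), dtype=np.uint8)
    for n in nodes:
        m[index[n], index[n]] = 1
    for a, b in edges:
        m[index[a], index[b]] = 1
    return m

def similarity(m1, m2):
    """Matrix overlap instead of graph traversal: cheap and parallelizable."""
    return np.sum((m1 == 1) & (m2 == 1)) / max(np.sum((m1 | m2) == 1), 1)

vocab = ["person", "dog", "park", "holds", "leash"]
g1 = graph_code(["person", "dog", "leash"], [("person", "leash"), ("leash", "dog")], vocab)
g2 = graph_code(["person", "dog", "park"], [("person", "dog")], vocab)
print(similarity(g1, g2))
```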
Procedia PDF Downloads 161
13898 Interpretation and Clustering Framework for Analyzing ECG Survey Data
Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif
Abstract:
The Indo-Pak region has been the victim of heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for performing detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data is evaluated and filtered by using automated Minnesota codes, and only those ECGs that fulfil the standardized conditions specified in the Minnesota codes are used for further analysis. Then, feature selection is performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm with fuzzy c-means. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.
Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix
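A discernibility matrix records, for every pair of records with different labels, the attributes that tell them apart; features covering many entries are good candidates to keep. A minimal rough-set-style sketch (the greedy covering step is a heuristic stand-in for the paper's selection algorithm, and the ECG records are toy data):

```python
from itertools import combinations

def discernibility_matrix(rows, decisions):
    """For each pair of records with different decision labels, record which
    attribute indices distinguish them."""
    m = {}
    for i, j in combinations(range(len(rows)), 2):
        if decisions[i] != decisions[j]:
            m[(i, j)] = {a for a, (u, v) in enumerate(zip(rows[i], rows[j])) if u != v}
    return m

def greedy_reduct(matrix, n_attrs):
    """Greedily pick attributes covering all discernibility entries."""
    uncovered, chosen = set(matrix), []
    while uncovered:
        best = max(range(n_attrs), key=lambda a: sum(a in matrix[p] for p in uncovered))
        chosen.append(best)
        uncovered = {p for p in uncovered if best not in matrix[p]}
    return chosen

rows = [(1, 0, 2), (1, 1, 2), (0, 1, 0), (1, 1, 0)]
labels = ["normal", "normal", "arrhythmia", "arrhythmia"]
print(greedy_reduct(discernibility_matrix(rows, labels), n_attrs=3))  # [2]
```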
Procedia PDF Downloads 470
13897 New Segmentation of Piecewise Linear Regression Models Using Reversible Jump MCMC Algorithm
Authors: Suparman
Abstract:
Piecewise linear regression models are very flexible models for modeling data. When piecewise linear regression models are fitted to data, the parameters are generally not known. This paper studies the problem of parameter estimation for piecewise linear regression models. The method used to estimate the parameters of piecewise linear regression models is the Bayesian method. However, the Bayes estimator cannot be found analytically. To overcome this problem, the reversible jump MCMC algorithm is proposed. The reversible jump MCMC algorithm generates a Markov chain that converges to the limit distribution, namely the posterior distribution of the parameters of piecewise linear regression models. The resulting Markov chain is used to calculate the Bayes estimator for the parameters of piecewise linear regression models.
Keywords: regression, piecewise, Bayesian, reversible jump MCMC
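Reversible jump MCMC handles moves between models of different dimension, e.g., adding or deleting a breakpoint of the piecewise linear fit. The standard acceptance probability for a proposed jump from parameters θ to θ', generated with auxiliary variables u, is (a textbook form; the paper's specific proposals determine the individual terms):

\[ \alpha = \min\left\{ 1,\ \frac{p(y \mid \theta')\, p(\theta')\, q(u' \mid \theta')}{p(y \mid \theta)\, p(\theta)\, q(u \mid \theta)} \left| \frac{\partial \theta'}{\partial (\theta, u)} \right| \right\} \]

with likelihood \(p(y \mid \cdot)\), prior \(p(\cdot)\), proposal densities \(q\), and the Jacobian of the dimension-matching map.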
Procedia PDF Downloads 521
13896 Kinematic Modelling and Task-Based Synthesis of a Passive Architecture for an Upper Limb Rehabilitation Exoskeleton
Authors: Sakshi Gupta, Anupam Agrawal, Ekta Singla
Abstract:
An exoskeleton design for rehabilitation purposes encounters many challenges, including ergonomically acceptable wearing technology, compatibility of the architectural design with human motion, actuation type, human-robot interaction, etc. In this paper, a passive architecture for an upper limb exoskeleton is proposed for assisting in rehabilitation tasks. Kinematic modelling is detailed for task-based kinematic synthesis of the wearable exoskeleton for self-feeding tasks. The exoskeleton architecture possesses expansion and torsional springs which are able to store and redistribute energy over the human arm joints. The elastic characteristics of the springs have been optimized to minimize the mechanical work of the human arm joints. The concept of a hybrid combination of a 4-bar parallelogram linkage and a serial linkage was chosen, where the 4-bar parallelogram linkage with an expansion spring acts as a rigid structure used to provide the rotational degree of freedom (DOF) required for lowering and raising the arm. The single linkage with a torsional spring allows for the rotational DOF required for elbow movement. The focus of the paper is the kinematic modelling, analysis and task-based synthesis framework for the proposed architecture, keeping in consideration the essential tasks of self-feeding and self-exercising during rehabilitation of a partially healthy person. Primary functional movements (activities of daily living, i.e., ADL) are routine activities that people attend to every day, such as cleaning, dressing and feeding. We focus on the feeding process to make people independent with respect to feeding tasks. The tasks are aimed at post-surgery patients under rehabilitation with less than 40% weakness. The main challenge addressed in this work is emulating the natural movement of the human arm. Human motion data is extracted through motion sensors for the targeted tasks of feeding and specific exercises. The task-based synthesis procedure framework is discussed for the proposed architecture. The results include the simulation of the architectural concept for tracking human-arm movements while displaying the kinematic and static study parameters for a standard human weight. D-H parameters are used for kinematic modelling of the hybrid mechanism, and the model is used in performing task-based optimal synthesis utilizing an evolutionary algorithm.
Keywords: passive mechanism, task-based synthesis, emulating human-motion, exoskeleton
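The D-H convention mentioned above composes one homogeneous transform per link. A minimal sketch (the two-link chain and its lengths are illustrative, not the paper's synthesized dimensions):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform between successive links."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Illustrative 2-DOF planar chain (shoulder elevation + elbow); link lengths
# are assumptions, not the paper's synthesized dimensions.
def forward_kinematics(q1, q2, upper_arm=0.30, forearm=0.25):
    T = dh_transform(q1, 0.0, upper_arm, 0.0) @ dh_transform(q2, 0.0, forearm, 0.0)
    return T[:3, 3]  # wrist position in the base frame

print(forward_kinematics(np.deg2rad(30), np.deg2rad(45)))
```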
Procedia PDF Downloads 137
13895 Genetic Algorithm Optimization of a Small Scale Natural Gas Liquefaction Process
Authors: M. I. Abdelhamid, A. O. Ghallab, R. S. Ettouney, M. A. El-Rifai
Abstract:
An optimization scheme based on a COM server is suggested for communication between the Genetic Algorithm (GA) toolbox of MATLAB and Aspen HYSYS. The structure and details of the proposed framework are discussed. The power of the developed scheme is illustrated by its application to the optimization of a recently developed natural gas liquefaction process, in which Aspen HYSYS was used to minimize the power consumption by optimizing the values of five operating variables. In this work, optimization by coupling the GA in MATLAB with the Aspen HYSYS model of the same process, using the same five decision variables, enabled an improvement in power consumption of 3.3% when 77% of the natural gas feed is liquefied. Also, on inclusion of the flow rates of both nitrogen and carbon dioxide refrigerants as two additional decision variables, the power consumption decreased by 6.5% for 78% liquefaction of the natural gas feed.
Keywords: stranded gas liquefaction, genetic algorithm, COM server, single nitrogen expansion, carbon dioxide pre-cooling
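The coupling pattern is: the optimizer proposes decision variables, pushes them into the simulator over COM, lets the flowsheet re-solve, and reads back the objective. A sketch of that loop from Python with pywin32 (the paper drives HYSYS from MATLAB instead; the file path, stream/operation names, and property calls below follow the commonly documented HYSYS automation interface but should be treated as assumptions):

```python
import win32com.client

# Attach to HYSYS and open the flowsheet (hypothetical file path)
hysys = win32com.client.Dispatch("HYSYS.Application")
case = hysys.SimulationCases.Open(r"C:\models\lng_process.hsc")

def power_consumption(x):
    """GA fitness: set decision variables, re-solve, read back compression duty."""
    streams = case.Flowsheet.MaterialStreams
    streams.Item("N2-refrigerant").MolarFlow.SetValue(x[0], "kgmole/h")
    streams.Item("CO2-refrigerant").MolarFlow.SetValue(x[1], "kgmole/h")
    case.Solver.CanSolve = True  # let the flowsheet converge again
    return case.Flowsheet.EnergyStreams.Item("Q-comp").HeatFlow.GetValue("kW")

# A GA (e.g., MATLAB's ga(), or the GA sketch shown earlier in this listing)
# would call power_consumption once per candidate solution.
```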
Procedia PDF Downloads 449
13894 Reliable Soup: Reliable-Driven Model Weight Fusion on Ultrasound Imaging Classification
Authors: Shuge Lei, Haonan Hu, Dasheng Sun, Huabin Zhang, Kehong Yuan, Jian Dai, Yan Tong
Abstract:
It remains challenging to measure the reliability of classification results from different machine learning models. This paper proposes a reliable soup optimization algorithm based on the model weight fusion algorithm Model Soup, aiming to improve reliability by using dual-channel reliability as the objective function to fuse a series of weights in breast ultrasound classification models. Experimental results on breast ultrasound clinical datasets demonstrate that reliable soup significantly enhances the reliability of breast ultrasound image classification tasks. The effectiveness of the proposed approach was verified via multicenter trials. The results from five centers indicate that the reliability optimization algorithm can enhance the reliability of the breast ultrasound image classification model and exhibits low multicenter correlation.
Keywords: breast ultrasound image classification, feature attribution, reliability assessment, reliability optimization
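Model Soup fuses networks by averaging their weights; the greedy variant only keeps a model in the soup if the fused score improves. A minimal sketch with plain arrays standing in for network weights (score() is a placeholder where the paper would plug in its dual-channel reliability objective):

```python
import numpy as np

def average_weights(weight_dicts):
    """Uniformly average parameter tensors across models (the 'soup')."""
    keys = weight_dicts[0].keys()
    return {k: np.mean([w[k] for w in weight_dicts], axis=0) for k in keys}

def greedy_soup(candidates, score):
    """Greedy Model Soup: add a model's weights only if the fused score improves."""
    order = sorted(candidates, key=score, reverse=True)
    soup, best = [order[0]], score(order[0])
    for w in order[1:]:
        trial = average_weights(soup + [w])
        if score(trial) >= best:
            soup, best = soup + [w], score(trial)
    return average_weights(soup)

# Toy demo: "models" are weight dicts; the score prefers weights near zero
rng = np.random.default_rng(0)
models = [{"w": rng.normal(size=4)} for _ in range(5)]
fused = greedy_soup(models, score=lambda m: -np.abs(m["w"]).sum())
print(fused["w"])
```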
Procedia PDF Downloads 85
13893 A Memetic Algorithm for an Energy-Costs-Aware Flexible Job-Shop Scheduling Problem
Authors: Christian Böning, Henrik Prinzhorn, Eric C. Hund, Malte Stonis
Abstract:
In this article, the flexible job-shop scheduling problem is extended by consideration of energy costs, which arise owing to the power peak, and further decision variables, such as work in process and throughput time, are incorporated into the objective function. This enables a production plan to be simultaneously optimized with respect to the actually arising energy and logistics costs. The resulting energy-costs-aware flexible job-shop scheduling problem (EFJSP) is described mathematically, and a memetic algorithm (MA) is presented as a solution. In the MA, the evolutionary process is supplemented with a local search. Furthermore, repair procedures are used in order to rectify any infeasible solutions that have arisen in the evolutionary process. The potential for lowering the actually arising costs of a production plan through consideration of energy consumption levels is highlighted.
Keywords: energy costs, flexible job-shop scheduling, memetic algorithm, power peak
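The defining feature of a memetic algorithm is the local search applied to offspring inside the evolutionary loop. A minimal skeleton on a toy permutation problem (the fitness, operators, and the point where the paper's repair procedures would be invoked are illustrative):

```python
import random

def memetic(fitness, init, mutate, local_search, pop_size=20, generations=40):
    """Memetic algorithm skeleton: evolution plus local refinement of offspring.
    For the EFJSP, a repair procedure for infeasible schedules would run
    right after mutation, before the local search."""
    pop = [local_search(init(), fitness) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]
        children = [local_search(mutate(random.choice(parents)), fitness)
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=fitness)

# Toy stand-in problem: order n jobs to minimize a weighted completion proxy
weights = [3, 1, 4, 1, 5, 9, 2, 6]
fit = lambda perm: sum(i * weights[j] for i, j in enumerate(perm))
init = lambda: random.sample(range(len(weights)), len(weights))

def mutate(perm):
    p = perm[:]
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

def local_search(perm, fitness, tries=30):
    best = perm
    for _ in range(tries):  # first-improvement swap neighborhood
        cand = mutate(best)
        if fitness(cand) < fitness(best):
            best = cand
    return best

print(memetic(fit, init, mutate, local_search))
```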
Procedia PDF Downloads 345