Search results for: entity extraction
589 The Weavability of Waste Plants and Their Application in Fashion and Textile Design
Authors: Jichi Wu
Abstract:
The dwindling of resources calls for more sustainable design. New technology could bring new materials and processing techniques to the fashion industry and push it toward a more sustainable future. This paper therefore explores cutting-edge research on the life cycle of closed-loop products and aims to find innovative ways to recycle and upcycle. To that end, the author investigated how low-utilization plants and leftover fibre could be turned into ecological textiles for fashion. By examining the physical and chemical properties (cellulose content, fibre form) of ecological textiles to explore their wearability, this paper analyses the prospects of bio-fabrics (weavable plants) in body-oriented fashion design and their potential in sustainable fashion and textile design. By extracting cellulose from nine different types or sections of plants, the author intends to find an appropriate method (such as ion-solution extraction) to maximize the weavability of plants, so that raw materials can be changed into fabrics more effectively. All first-hand experimental data were carefully collected and then analysed under the guidance of related theories. The results of the analysis were recorded in detail and presented in an understandable way. Various research methods are adopted throughout this project, including field trips and experiments to make comparisons and recycle materials. Cross-disciplinary cooperation is also conducted for related knowledge and theories. From this, experimental data are collected, analysed, and interpreted into descriptive and visual results. Based on the above conclusions, it is possible to apply weavable plant fibres to develop new textiles and fashion.
Keywords: wearable bio-textile, sustainability, economy, ecology, technology, weavability, fashion design
Procedia PDF Downloads 147
588 A Chemical-Free Colouration Technique for Regenerated Fibres Using Waste Alpaca Fibres
Authors: M. Abdullah Al Faruque, Rechana Remadevi, Abu Naser M. Ahsanul Haque, Joselito Razal, Xungai Wang, Maryam Naebe
Abstract:
Generally, the colouration of textile fibres is performed using synthetic colourants in dope dyeing or conventional dyeing methods. However, the toxic effects of some synthetic colourants under long-term exposure can cause several health threats, including cancer, asthma, and skin diseases. Moreover, the colouration process not only consumes a massive amount of water but also discharges a huge proportion of wastewater into the environment. Despite their environmentally friendly characteristics, current natural colourants have downsides in their yield and need chemical extraction processes, which are water-consuming as well. In view of this, the present work focuses on developing a chemical-free, biocompatible, natural-pigment-based colouration technique for regenerated fibres. Waste alpaca fibre was used as the colourant, and the colour properties, as well as the mechanical properties, of the regenerated fibres were investigated. The colourant from waste alpaca was fabricated through a mechanical milling process and applied directly to the polyacrylonitrile (PAN) dope solution in different alpaca:PAN ratios (10:90, 20:80, 30:70). The results obtained from the chemical structure characterisation suggested that all the coloured regenerated fibres exhibited chemical functional groups of both PAN and alpaca. Furthermore, the colour strength increased gradually with the alpaca content and showed excellent washing fastness properties. These results reveal a potential new pathway to a chemical-free dyeing technique for fibres with improved properties.
Keywords: alpaca, chemical-free coloration, natural colorant, polyacrylonitrile, water consumption, wet spinning
Procedia PDF Downloads 172
587 Enzymatic Repair Prior to DNA Barcoding: Aspirations and Restraints
Authors: Maxime Merheb, Rachel Matar
Abstract:
Retrieving ancient DNA sequences, which in turn permits entire-genome sequencing from fossils, has improved extraordinarily in recent years, thanks to sequencing technology and other methodological advances. Even so, the quest for ancient DNA is still obstructed by the damage inflicted on DNA, which accumulates after the death of a living organism. This damage falls into three main categories: (i) physical abnormalities, such as strand breaks, which lead to the presence of short DNA fragments; (ii) modified bases (mainly cytosine deamination), which cause errors in the sequence due to the incorporation of a false nucleotide during DNA amplification; and (iii) DNA modifications referred to as blocking lesions, which halt PCR extension and in turn also affect the amplification and sequencing process. The issues arising from breakage and coding errors have been significantly reduced in recent years: fast sequencing of short DNA fragments was empowered by high-throughput sequencing platforms, and most coding errors were found to be the consequence of cytosine deamination, which can easily be removed from the DNA using enzymatic treatment. The methodology to repair DNA sequences is still in development; it can basically be explained as the process of reintroducing cytosine rather than uracil. This technique is thus restricted to amplified DNA molecules. Since eliminating every type of damage (particularly those that block PCR) is a process still pending complete repair methodologies, DNA detection right after extraction is highly needed. Before pouring resources into extensive, unreasonable, and uncertain repair techniques, it is vital to distinguish between two possible hypotheses: (i) the DNA is nonexistent and cannot be amplified to begin with, and is therefore completely unrepairable; or (ii) the DNA is refractory to PCR and is worth repairing and amplifying. Hence, it is extremely important to develop a non-enzymatic technique to detect the most degraded DNA.
Keywords: ancient DNA, DNA barcoding, enzymatic repair, PCR
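As an illustrative aside (not the authors' protocol), the deamination-driven coding errors described above can be sketched in a few lines: a reference C read as T is the characteristic C→U→T pattern, and "repair" amounts to reintroducing cytosine at the flagged positions. The function names and sequences below are hypothetical.

```python
# Illustrative sketch only: cytosine deamination converts C -> U, which is
# read as T during amplification. Flag positions whose mismatch pattern
# (reference C, read T) is consistent with deamination, then "repair" them.

def deamination_candidates(reference: str, read: str):
    """Return positions where the reference has C but the read has T."""
    assert len(reference) == len(read)
    return [i for i, (r, q) in enumerate(zip(reference, read))
            if r == "C" and q == "T"]

def repair(read: str, positions):
    """Reintroduce cytosine at the flagged positions."""
    bases = list(read)
    for i in positions:
        bases[i] = "C"
    return "".join(bases)

ref  = "ACCTGACGTC"
read = "ACTTGACGTT"          # hypothetical read with two C->T mismatches
pos = deamination_candidates(ref, read)
print(pos)                   # -> [2, 9]
print(repair(read, pos))     # -> ACCTGACGTC
```

Real tools work on sequencing reads with quality scores and strand information; this toy version only captures the core substitution pattern.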
Procedia PDF Downloads 400
586 The Optimization of the Parameters for Eco-Friendly Leaching of Precious Metals from Waste Catalyst
Authors: Silindile Gumede, Amir Hossein Mohammadi, Mbuyu Germain Ntunka
Abstract:
Goal 12 of the 17 Sustainable Development Goals (SDGs) encourages sustainable consumption and production patterns. This necessitates achieving the environmentally safe management of chemicals and all wastes throughout their life cycle and the proper disposal of pollutants and toxic waste. Fluid catalytic cracking (FCC) catalysts are widely used in refineries to convert heavy feedstocks to lighter ones. During the refining processes, the catalysts are deactivated and discarded as hazardous toxic solid waste. Spent catalysts (SCs) contain high-cost metals, and the recovery of metals from SCs is a tactical plan for supplying part of the demand for these substances and minimizing environmental impacts. Leaching followed by solvent extraction has been found to be the most efficient method to recover valuable metals with high purity from spent catalysts. However, the use of inorganic acids during the leaching process causes a secondary environmental issue. Therefore, it is necessary to explore alternative leaching agents that are efficient, economical, and environmentally friendly. In this study, the waste catalyst was collected from a domestic refinery and was characterised using XRD, ICP, XRF, and SEM. Response surface methodology (RSM) with a Box-Behnken design was used to model and optimize the influence of the parameters affecting the acidic leaching process: acid concentration, temperature, and leaching time. The characterisation results showed that the spent catalyst contains high concentrations of vanadium (V) and nickel (Ni); hence, this study focuses on the leaching of Ni and V using a biodegradable acid to eliminate the formation of secondary pollution.
Keywords: eco-friendly leaching, optimization, metal recovery, leaching
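The Box-Behnken design mentioned above can be sketched in coded units: for the three chosen factors (acid concentration, temperature, leaching time) it yields 12 edge-midpoint runs plus replicated centre points. This is a generic sketch of the design, not the authors' exact run table; the number of centre points is an assumption.

```python
from itertools import combinations

def box_behnken(n_factors: int, n_center: int = 3):
    """Coded-unit Box-Behnken design: for each pair of factors, all four
    (+/-1, +/-1) combinations with the remaining factors held at 0,
    followed by replicated centre points."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * n_factors for _ in range(n_center)]
    return runs

# Three factors: acid concentration, temperature, leaching time
design = box_behnken(3)
print(len(design))   # -> 15 runs (12 edge midpoints + 3 centre points)
```

Coded levels -1/0/+1 would then be mapped to the actual low/mid/high settings of each leaching parameter before running the experiments.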
Procedia PDF Downloads 68
585 Novel Aspects of Merger Control Pertaining to Nascent Acquisition: An Analytical Legal Research
Authors: Bhargavi G. Iyer, Ojaswi Bhagat
Abstract:
It is often noted that the value of a novel idea lies in its successful implementation. However, successful implementation requires the nurturing and encouragement of innovation. Nascent competitors are a true representation of innovation in any given industry. A nascent competitor is an entity whose prospective innovation poses a future threat to an incumbent dominant competitor. While a nascent competitor benefits in several ways, it is also significantly exposed and at greater risk of facing the brunt of exclusionary practices and abusive conduct by dominant incumbent competitors in the industry. This research paper aims to explore the risks and threats faced by nascent competitors and to analyse the benefits they accrue, as well as the advantages they offer to the economy, through an analytical, critical study. In such competitive market environments, a rise in acquisitions of nascent competitors by incumbent dominants is observed. Therefore, this paper examines the dynamics of nascent acquisition. Further, this paper delves into the role of antitrust bodies in regulating nascent acquisition and addresses the question of how to distinguish harmful from harmless acquisitions in order to facilitate ideal enforcement practice. This paper proposes mechanisms of scrutiny to ensure healthy market practices and efficient merger control in the context of nascent acquisitions. Taking into account the scope and nature of the topic, as well as the resources available and accessible, a combination of doctrinal and analytical research methods was employed, utilising secondary sources to assess and analyse the subject of research. While legally evaluating the killer acquisition theory and the nascent potential acquisition theory, this paper critically surveys precedents and instances of nascent acquisitions. In addition to affording a compendious account of the legislative frameworks and regulatory mechanisms in the United States, the United Kingdom, and the European Union, it suggests an internationally practicable legal foundation for domestic legislation and enforcement to adopt. This paper appreciates the complexities and uncertainties with respect to nascent acquisitions and attempts to suggest viable and plausible policy measures in antitrust law. It additionally examines the effects of such nascent acquisitions upon the consumer and the market economy. This paper weighs the argument for shifting the evidentiary burden onto the merging parties in order to improve merger control and regulation, and expounds on the strengths and weaknesses of that approach. It is posited that an effective combination of factual, legal, and economic analysis of both the acquired and acquiring companies has the potential to improve ex post and ex ante merger review outcomes involving nascent companies, thus preventing anti-competitive practices. This paper concludes with an analysis of the possibility and feasibility of industry-specific identification of anti-competitive nascent acquisitions and the implementation of measures accordingly.
Keywords: acquisition, antitrust law, exclusionary practices, merger control, nascent competitor
Procedia PDF Downloads 161
584 Comparison of Different Machine Learning Algorithms for Solubility Prediction
Authors: Muhammet Baldan, Emel Timuçin
Abstract:
Molecular solubility prediction plays a crucial role in various fields, such as drug discovery, environmental science, and material science. In this study, we compare the performance of five machine learning algorithms—linear regression, support vector machines (SVM), random forests, gradient boosting machines (GBM), and neural networks—for predicting molecular solubility using the AqSolDB dataset. The dataset consists of 9981 data points with their corresponding solubility values. MACCS keys (166 bits), RDKit properties (20 properties), and structural properties (3) are extracted for every SMILES representation in the dataset, giving a total of 189 features per molecule for training and testing. Each algorithm is trained on a subset of the dataset and evaluated using accuracy scores. Additionally, the computational time for training and testing is recorded to assess the efficiency of each algorithm. Our results demonstrate that the random forest model outperformed the other algorithms in predictive accuracy, achieving a 0.93 accuracy score. Gradient boosting machines and neural networks also exhibit strong performance, closely followed by support vector machines. Linear regression, while simpler in nature, demonstrates competitive performance but with slightly higher errors compared to the ensemble methods. Overall, this study provides valuable insights into the performance of machine learning algorithms for molecular solubility prediction, highlighting the importance of algorithm selection in achieving accurate and efficient predictions in practical applications.
Keywords: random forest, machine learning, comparison, feature extraction
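The feature layout described above (166 MACCS bits + 20 RDKit properties + 3 structural properties = 189 features per molecule) can be sketched as a simple concatenation. In a real pipeline the values would be computed with RDKit from each SMILES string; here the values are placeholders and the function name is hypothetical.

```python
# Hypothetical sketch of the 189-dimensional feature vector per molecule:
# 166 MACCS fingerprint bits, 20 descriptor values and 3 structural
# properties concatenated in a fixed order.

def build_feature_vector(maccs_bits, rdkit_props, structural_props):
    assert len(maccs_bits) == 166 and all(b in (0, 1) for b in maccs_bits)
    assert len(rdkit_props) == 20
    assert len(structural_props) == 3
    return list(maccs_bits) + list(rdkit_props) + list(structural_props)

# Placeholder values standing in for RDKit-computed features:
vec = build_feature_vector([1] * 166, [0.5] * 20, [1.0, 2.0, 3.0])
print(len(vec))   # -> 189
```

Keeping the concatenation order fixed matters: every row of the training matrix must align column-for-column before any of the five models is fitted.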
Procedia PDF Downloads 40
583 Free Radical Scavenging, Antioxidant Activity, Phenolic and Alkaloid Contents, and Inhibitory Properties against α-Amylase and Invertase Enzymes of Stem Bark Extracts of Coula edulis B
Authors: Eric Beyegue, Boris Azantza, Judith Laure Ngondi, Julius E. Oben
Abstract:
Background: It is clear that the phytochemical constituents of plants exhibit free radical scavenging, antioxidant, and antiglycation properties. This study investigated the in vitro antioxidant, free radical scavenging, and inhibitory activities against α-amylase and invertase enzymes of stem bark extracts of C. edulis (Olacaceae). Methods: Four extracts (hexane, dichloromethane, ethanol, and aqueous) from the bark of C. edulis were used in this study. Colorimetric in vitro methods were used to evaluate free radical scavenging activity (DPPH, ABTS, NO, OH), antioxidant capacity, antiglycation activity, inhibition of α-amylase and invertase activities, and phenolic, flavonoid, and alkaloid contents. Results: C. edulis extracts (CEE) had a high scavenging potential on the 2,2-diphenyl-1-picrylhydrazyl (DPPH), hydroxyl (OH), nitric oxide (NO), and 2,2-azinobis(3-ethylbenzthiazoline)-6-sulfonic acid (ABTS) radicals, as well as glucose scavenging, with IC50 values varying between 41.95 and 36694.43 µg/ml depending on the solvent of extraction. The ethanol extract of C. edulis stem bark (CE EtOH) showed the highest polyphenolic (289.10 ± 30.32), flavonoid (1.12 ± 0.09), and alkaloid (18.47 ± 0.16) contents. All tested extracts demonstrated a relatively high inhibition potential against the α-amylase and invertase digestive enzyme activities. Conclusion: This study suggests that CEE exhibit high antioxidant potential and significant inhibition potential against digestive enzymes.
Keywords: Coula edulis, antioxidant, scavenging activity, amylase, invertase
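IC50 values such as those reported above are commonly estimated from the concentration-inhibition curve around 50 % inhibition. The sketch below shows one minimal way to do this (linear interpolation between the two bracketing points); it is illustrative, not the assay-specific fitting method used in the study, and the data points are made up.

```python
def ic50(concentrations, inhibitions):
    """Estimate the IC50 (concentration giving 50 % inhibition) by linear
    interpolation between the two data points bracketing 50 %.
    Inputs must be sorted by increasing concentration."""
    pairs = list(zip(concentrations, inhibitions))
    for (c1, i1), (c2, i2) in zip(pairs, pairs[1:]):
        if i1 <= 50 <= i2:
            return c1 + (50 - i1) * (c2 - c1) / (i2 - i1)
    raise ValueError("50 % inhibition is not bracketed by the data")

# Hypothetical dose-response data: concentration (ug/ml) vs % inhibition
print(ic50([10, 50, 100, 500], [12, 38, 62, 95]))   # -> 75.0
```

A lower IC50 means a smaller amount of extract achieves 50 % radical scavenging, i.e. a stronger antioxidant; full dose-response modelling would fit a sigmoidal curve instead of straight segments.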
Procedia PDF Downloads 351
582 Inhibitory Activity of Lactic Acid Bacteria on the Growth and Biogenic Amines Production by Foodborne Pathogens and Food Spoilage Bacteria
Authors: Abderrezzak Khatib
Abstract:
Biogenic amines are low-molecular-weight nitrogenous compounds that have the potential to accumulate in food, posing a significant risk to food safety and human health. In this study, we investigated the inhibitory activity of three strains of lactic acid bacteria (LAB) against the growth and production of biogenic amines by both foodborne pathogens and food spoilage bacteria. The foodborne pathogens studied included Staphylococcus aureus, Pseudomonas aeruginosa, and Salmonella Paratyphi, while the food spoilage bacteria comprised Enterobacter cloacae and Proteus mirabilis. The methodology involved bacterial growth determination in Petri dishes, bacterial culture extraction and derivatization, and biogenic amine analysis using HPLC. Our findings revealed that the inhibitory effects of LAB on these pathogens varied, with all three LAB strains demonstrating a remarkable reduction in the total bacterial count when combined with most pathogens, compared to the individual cultures of the pathogens. Furthermore, the presence of LAB in co-cultures with the pathogens resulted in a significant decrease in the production of tyramine and other biogenic amines by the pathogens themselves. These results suggest that LAB strains hold considerable promise in preventing the accumulation of biogenic amines in food products, thereby enhancing food safety. This study provides insights into the potential utilization of LAB in the context of preserving and ensuring the safety of food products. It highlights the significance of conducting additional research to elucidate the underlying mechanisms involved and to identify the precise bioactive compounds responsible for the observed inhibitory effects.
Keywords: food safety, lactic acid bacteria, foodborne pathogens, food spoilage bacteria, biogenic amines, tyrosine
Procedia PDF Downloads 55
581 Salting Effect in Partially Miscible Systems of Water/Acetic Acid/1-Butanol at 298.15 K: Experimental Study and Estimation of New Solvent-Solvent and Salt-Solvent Binary Interaction Parameters for the NRTL Model
Authors: N. Bourayou, A. -H. Meniai, A. Gouaoura
Abstract:
The presence of salt can either raise or lower the distribution coefficient of a solute (here, acetic acid) in liquid-liquid equilibria. The distribution coefficient of the solute is defined as the ratio of the composition of solute in the solvent-rich phase to the composition of solute in the diluent (water)-rich phase; the phenomena are known as salting-out and salting-in, respectively. The effects of a monovalent salt, sodium chloride, and a bivalent salt, sodium sulfate, on the distribution of acetic acid between 1-butanol and water at 298.15 K were experimentally shown to be effective in modifying the liquid-liquid equilibrium of the water/acetic acid/1-butanol system in favour of the solvent extraction of acetic acid from an aqueous solution with 1-butanol, particularly at high concentrations of both salts. Both salts studied were found to have a salting-out effect on acetic acid, in varying degrees. The experimentally measured data were well correlated by the Eisen-Joffe equation. The NRTL model for solvent mixtures containing salts was able to provide a good correlation of the present liquid-liquid equilibrium data, using the regressed salt concentration coefficients for the salt-solvent interaction parameters and the solvent-solvent interaction parameters obtained from the same system without salt. The calculated phase equilibrium was in quite good agreement with the experimental data, showing the ability of the NRTL model to correlate the salt effect on the liquid-liquid equilibrium.
Keywords: activity coefficient, Eisen-Joffe, NRTL model, sodium chloride
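The distribution coefficient defined above is a simple ratio, and the salting-out effect corresponds to that ratio increasing with salt concentration. A minimal sketch with hypothetical tie-line compositions (not data from this study):

```python
def distribution_coefficient(x_solvent_phase: float, x_water_phase: float) -> float:
    """D = (solute fraction in the 1-butanol-rich phase) /
           (solute fraction in the water-rich phase)."""
    return x_solvent_phase / x_water_phase

# Hypothetical acetic acid mole fractions on a tie line,
# without salt and with added NaCl:
d_no_salt = distribution_coefficient(0.12, 0.10)
d_salt    = distribution_coefficient(0.18, 0.06)
print(round(d_no_salt, 2), round(d_salt, 2))  # -> 1.2 3.0 (salting-out raises D)
```

Salting-out works because the dissolved ions tie up water and reduce the solute's solubility in the aqueous phase, pushing more acetic acid into the 1-butanol phase, which is exactly what favours the extraction.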
Procedia PDF Downloads 283
580 Text Analysis to Support Structuring and Modelling a Public Policy Problem: Outline of an Algorithm to Extract Inferences from Textual Data
Authors: Claudia Ehrentraut, Osama Ibrahim, Hercules Dalianis
Abstract:
Policy-making situations are real-world problems that exhibit complexity in that they are composed of many interrelated problems and issues. To be effective, policies must holistically address the complexity of the situation rather than propose solutions to single problems. Formulating and understanding the situation and its complex dynamics is therefore key to finding holistic solutions. Analysis of text-based information on the policy problem, using Natural Language Processing (NLP) and text analysis techniques, can support modelling of public policy problem situations in a more objective way, based on domain experts' knowledge and scientific evidence. The objective of this study is to support modelling of public policy problem situations using text analysis of verbal descriptions of the problem. We propose a formal methodology for the analysis of qualitative data from multiple information sources on a policy problem to construct a causal diagram of the problem. The analysis process aims at identifying key variables, linking them by cause-effect relationships, and mapping that structure into a graphical representation that is adequate for designing action alternatives, i.e., policy options. This study describes the outline of an algorithm used to automate the initial step of a larger methodological approach, which is so far done manually. In this initial step, inferences about key variables and their interrelationships are extracted from textual data to support better problem structuring. A small prototype for this step is also presented.
Keywords: public policy, problem structuring, qualitative analysis, natural language processing, algorithm, inference extraction
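The initial inference-extraction step described above can be caricatured with a cue-phrase pattern: scan the text for causal connectives and emit (cause, effect) pairs as candidate edges of the causal diagram. A real system would use NLP parsing rather than a single regular expression; the pattern and example sentences below are illustrative only.

```python
import re

# Toy cue-phrase extractor: matches "<cause> leads to/causes/results in <effect>".
CAUSAL = re.compile(
    r"(?P<cause>[\w\s]+?)\s+(?:leads to|causes|results in)\s+(?P<effect>[\w\s]+)",
    re.IGNORECASE,
)

def extract_inferences(text: str):
    """Return candidate (cause, effect) pairs found in the text."""
    return [(m.group("cause").strip(), m.group("effect").strip())
            for m in CAUSAL.finditer(text)]

pairs = extract_inferences("Unemployment leads to poverty. Poverty causes crime.")
print(pairs)   # -> [('Unemployment', 'poverty'), ('Poverty', 'crime')]
```

Each pair corresponds to a directed edge in the causal diagram (Unemployment → poverty → crime); chaining shared variables across sentences is what starts to reveal the problem's interrelated structure.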
Procedia PDF Downloads 589
579 Teaching Children about Their Brains: Evaluating the Role of Neuroscience Undergraduates in Primary School Education
Authors: Clea Southall
Abstract:
Many children leave primary school having formed preconceptions about their relationship with science. Thus, primary school represents a critical window for stimulating scientific interest in younger children. Engagement relies on the provision of hands-on activities coupled with an ability to capture a child's innate curiosity. This requires children to perceive science topics as interesting and relevant to their everyday life. Teachers and pupils alike have suggested the school curriculum be tailored to help stimulate scientific interest. Young children are naturally inquisitive about the human body; the brain is one topic which frequently engages pupils, although it is not currently included in the UK primary curriculum. Teaching children about the brain could have wider societal impacts, such as increasing knowledge of neurological disorders. However, many primary school teachers do not receive formal neuroscience training and may feel apprehensive about delivering lessons on the nervous system. This is exacerbated by a lack of educational neuroscience resources. One solution is for undergraduates to form partnerships with schools, delivering engaging lessons and supplementing teacher knowledge. The aim of this project was to evaluate the success of a short lesson on the brain delivered by an undergraduate neuroscientist to primary school pupils. Prior to entering schools, semi-structured online interviews were conducted with teachers to gain pedagogical advice, and relevant websites were searched for neuroscience resources. Subsequently, a single lesson plan was created comprising four hands-on activities. The activities were devised in a top-down manner, beginning with learning about the brain as an entity before focusing on individual neurons. Students were asked to label a 'brain map' to assess prior knowledge of brain structure and function. They viewed animal brains and created 'pipe-cleaner neurons', which were later used to depict electrical transmission. The same session was delivered by an undergraduate student to 570 key stage 2 (KS2) pupils across five schools in Leeds, UK. Post-session surveys, designed for teachers and pupils respectively, were used to evaluate the session. Children in all year groups had relatively poor knowledge of brain structure and function at the beginning of the session. When asked to label four brain regions with their respective functions, older pupils labelled a mean of 1.5 (± 1.0) brain regions, compared to 0.8 (± 0.96) for younger pupils (p = 0.002). However, by the end of the session, 95% of pupils felt their knowledge of the brain had increased. Hands-on activities were rated most popular by pupils and were considered the most successful aspect of the session by teachers. Although only half the teachers were aware of neuroscience educational resources, nearly all (95%) felt they would have more confidence in teaching a similar session in the future. All teachers felt the session was engaging and that the content could be linked to the current curriculum. Thus, a short fifty-minute session can successfully enhance pupils' knowledge of a new topic: the brain. Partnership with an undergraduate student can provide an alternative method for supplementing teacher knowledge, increasing teachers' confidence in delivering future lessons on the nervous system.
Keywords: education, neuroscience, primary school, undergraduate
Procedia PDF Downloads 211
578 A Hybrid Energy Storage Module for the Emergency Energy System of the Community Shelter in Yucatán, México
Authors: María Reveles-Miranda, Daniella Pacheco-Catalán
Abstract:
The Sierra Papacal commissary is located north of Merida, Yucatan, México, where the indigenous Maya population predominates. Due to its location, the region has an elevation of less than 4.5 meters above sea level, with a high risk of flooding associated with storms and hurricanes and a high vulnerability of infrastructure and housing in the presence of strong gusts of wind. In environmental contingencies, the challenge is to provide an autonomous electrical supply using renewable energy sources that covers the vulnerable population's health, food, and water-pumping needs. To address this challenge, a hybrid energy storage module is proposed for the emergency photovoltaic (PV) system of the community shelter in Sierra Papacal, Yucatán, which combines high-energy-density batteries and high-power-density supercapacitors (SC) in a single module, providing a quick response to energy demand, reducing the thermal stress on the batteries, and extending their useful life. Incorporating SC into energy storage modules can provide fast response times to power variations and balanced energy extraction, ensuring a more extended period of electrical supply to vulnerable populations during contingencies. The implemented control strategy increases the module's overall performance by ensuring the optimal use of devices and balanced energy exploitation. The operation of the module with the control algorithm is validated with MATLAB/Simulink® and experimental tests.
Keywords: batteries, community shelter, environmental contingencies, hybrid energy storage, isolated photovoltaic system, supercapacitors
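A common way to realize the battery/supercapacitor split described above is frequency-based: a low-pass filter routes the slow component of the power demand to the battery, while the fast residual goes to the supercapacitors, sparing the battery from thermally stressful transients. This is a generic sketch of that idea, not the authors' controller; the filter constant and power values are assumptions.

```python
# Generic frequency-based power-split sketch for a battery + supercapacitor
# module: an exponential moving average (first-order low-pass filter)
# produces the battery reference; the supercapacitor absorbs the residual.

def split_power(demand, alpha=0.2):
    """demand: list of demanded power samples [W];
    alpha: filter smoothing factor in (0, 1] (smaller = smoother battery)."""
    battery, supercap = [], []
    low = demand[0]                 # initialise the filter state
    for p in demand:
        low = alpha * p + (1 - alpha) * low   # slow component -> battery
        battery.append(low)
        supercap.append(p - low)              # fast residual -> supercapacitor
    return battery, supercap

# A demand step from 100 W to 400 W and back:
batt, sc = split_power([100, 100, 400, 100, 100])
print([round(b) for b in batt])   # -> [100, 100, 160, 148, 138]
```

Note that the two references always sum to the demand at every sample, so the load is fully served while the battery only sees the smoothed trajectory.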
Procedia PDF Downloads 91
577 Geographic Information System and Dynamic Segmentation of Very High Resolution Images for the Semi-Automatic Extraction of Sandy Accumulation
Authors: A. Bensaid, T. Mostephaoui, R. Nedjai
Abstract:
A considerable area of Algerian land is threatened by the phenomenon of wind erosion. For a long time, wind erosion and its associated harmful effects on the natural environment have posed a serious threat, especially in the arid regions of the country. In recent years, as a result of increases in the irrational exploitation of natural resources (fodder) and extensive land clearing, wind erosion has become particularly accentuated. The extent of degradation in the arid region of the Algerian Mecheria department generated a new situation characterized by the reduction of vegetation cover, the decrease of land productivity, and sand encroachment on urban development zones. In this study, we investigate the potential of remote sensing and geographic information systems for detecting the spatial dynamics of the ancient dune cordons, based on the numerical processing of LANDSAT images (5, 7, and 8) of three scenes, 197/37, 198/36, and 198/37, for the year 2020. As a second step, we assess the use of geospatial techniques to monitor the progression of sand dunes on developed (urban) lands as well as the formation of sandy accumulations (dunes, dune fields, nebkhas, barchans, etc.). For this purpose, this study made use of a semi-automatic processing method for the dynamic segmentation of images with very high spatial resolution (SENTINEL-2 and Google Earth). This study was able to demonstrate that, under current conditions, urban lands are located in sand transit zones mobilized by winds from the northwest and southwest directions.
Keywords: land development, GIS, segmentation, remote sensing
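As a side note, the reduction of vegetation cover mentioned above is typically tracked on LANDSAT imagery with a vegetation index such as NDVI, computed per pixel from the red and near-infrared bands. The sketch below uses hypothetical reflectance values; the study's own workflow relies on dynamic segmentation of very-high-resolution imagery, not on NDVI alone.

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Values near 0 indicate bare soil or sand; higher values, vegetation."""
    return (nir - red) / (nir + red)

# Hypothetical surface reflectances for two pixels:
print(round(ndvi(0.45, 0.15), 2))   # -> 0.5  (vegetated pixel)
print(round(ndvi(0.30, 0.28), 2))   # -> 0.03 (bare sand)
```

Comparing NDVI maps across the LANDSAT 5/7/8 acquisition dates is one simple way to quantify where vegetation loss coincides with dune progression.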
Procedia PDF Downloads 155
576 Polymerization of Epsilon-Caprolactone Using Lipase Enzyme for Medical Applications
Authors: Sukanya Devi Ramachandran, Vaishnavi Muralidharan, Kavya Chandrasekaran
Abstract:
Polycaprolactone is a polymer belonging to the polyester family that has the noticeable characteristics of biodegradability and biocompatibility, which are essential for medical applications. Polycaprolactone is produced by the ring-opening polymerization of the monomer epsilon-caprolactone (ε-CL), a cyclic ester comprising a seven-membered ring. This process is normally catalysed by metallic compounds such as stannous octoate. These catalysts are difficult to remove after the reaction and are also toxic to the human body. An alternative route, using enzymes as catalysts, is being employed to reduce the toxicity. Lipase is a subclass of esterase that can easily attack the ester bonds of ε-CL. This research paper throws light on the extraction of lipase from germinating sunflower seeds and the activity of the biocatalyst in the polymerization of ε-CL. Germinating sunflower seeds were crushed with fine sand in phosphate buffer (pH 6.5) into a fine paste, which was centrifuged at 5000 rpm for 10 minutes. The clear enzyme solution was tested for activity at pH values ranging from 5 to 7 and temperatures ranging from 40°C to 70°C. The enzyme was most active at pH 6.0 and 60°C. Polymerization of ε-CL was carried out using toluene as the solvent under lipase catalysis, after which chloroform was added to terminate the reaction, and the product was washed in cold methanol to obtain the polymer. The polymerization time was varied from 72 hours to 6 days, and the product was tested for molecular weight and monomer conversion. The molecular weight obtained at 6 days is comparatively higher. This method is effective, economical, and eco-friendly, as the enzyme can be recovered as such at the end of the reaction and reused. The obtained polymers can be used for drug delivery and other medical applications.
Keywords: lipase, monomer, polycaprolactone, polymerization
Procedia PDF Downloads 296
575 Low-Cost Parking Lot Mapping and Localization for Home Zone Parking Pilot
Authors: Hongbo Zhang, Xinlu Tang, Jiangwei Li, Chi Yan
Abstract:
Home zone parking pilot (HPP) is a fast-growing segment of low-speed autonomous driving applications. It requires the car to automatically cruise around a parking lot and park itself within a range of up to 100 meters inside a recurrent home/office parking lot, which demands a precise parking lot mapping and localization solution. Although Lidar is ideal for SLAM, car OEMs favor a low-cost, fish-eye-camera-based visual SLAM approach. Recent approaches have employed segmentation models to extract semantic features and improve mapping accuracy, but these AI models are memory-unfriendly and computationally expensive, making them difficult to deploy on embedded ADAS systems. To address this issue, we propose a new method that utilizes object detection models to extract robust and accurate parking lot features. The proposed method reduces computational costs while maintaining high accuracy. Once combined with the vehicle's wheel-pulse information, the system can construct maps and locate the vehicle in real time. This article discusses in detail (1) fish-eye-based Around View Monitoring (AVM) with transparent chassis images as the inputs, (2) an object detection (OD) based feature point extraction algorithm to generate a point cloud, (3) a low-computational-cost parking lot mapping algorithm, and (4) the real-time localization algorithm. Finally, we demonstrate experimental results with an embedded ADAS system installed on a real car in an underground parking lot.
Keywords: ADAS, home zone parking pilot, object detection, visual SLAM
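The wheel-pulse information mentioned above feeds a standard dead-reckoning step: pulse counts per wheel are converted to distances and integrated through a differential-drive motion model to propagate the vehicle pose between visual updates. This is a generic odometry sketch, not the paper's algorithm; the calibration constants are illustrative.

```python
import math

# Generic wheel-pulse dead reckoning with a differential-drive model.
# pulses_per_meter and track_width are calibration constants (illustrative).

def dead_reckon(pulses, pulses_per_meter=100.0, track_width=1.5,
                pose=(0.0, 0.0, 0.0)):
    """pulses: list of (left_count, right_count) per time step.
    Returns the propagated (x, y, heading) pose; heading in radians."""
    x, y, heading = pose
    for left, right in pulses:
        dl = left / pulses_per_meter          # left-wheel distance [m]
        dr = right / pulses_per_meter         # right-wheel distance [m]
        d = (dl + dr) / 2.0                   # distance of vehicle centre
        heading += (dr - dl) / track_width    # heading change [rad]
        x += d * math.cos(heading)
        y += d * math.sin(heading)
    return x, y, heading

# Two steps of 100 pulses on both wheels: a straight 2 m drive.
x, y, h = dead_reckon([(100, 100), (100, 100)])
print(round(x, 2), round(y, 2), round(h, 2))   # -> 2.0 0.0 0.0
```

In a full system this odometry prediction drifts over time, so it is periodically corrected by the camera-based feature matches against the stored parking lot map.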
Procedia PDF Downloads 67
574 Quartz Crystal Microbalance Based Hydrophobic Nanosensor for Lysozyme Detection
Authors: F. Yılmaz, Y. Saylan, A. Derazshamshir, S. Atay, A. Denizli
Abstract:
The quartz crystal microbalance (QCM) is a high-resolution mass-sensing technique that measures changes in mass on an oscillating quartz crystal surface by tracking changes in the crystal's oscillation frequency in real time. Protein adsorption via hydrophobic interaction between a protein and a solid support, known as hydrophobic interaction chromatography (HIC), can be favorable in many cases, and some nanoparticles can be effectively applied for HIC. HIC takes advantage of the hydrophobicity of proteins by promoting their separation on the basis of hydrophobic interactions between immobilized hydrophobic ligands and nonpolar regions on the surface of the proteins. Lysozyme is found in a variety of vertebrate cells and secretions, such as spleen, milk, tears, and egg white. Its common applications are as a cell-disrupting agent for the extraction of bacterial intracellular products, as an antibacterial agent in ophthalmologic preparations, as a food additive in milk products, and as a drug for the treatment of ulcers and infections. Lysozyme has also been used in cancer chemotherapy. The aim of this study is the synthesis of hydrophobic nanoparticles for lysozyme detection. For this purpose, methacryloyl-L-phenylalanine was chosen as the hydrophobic matrix. The hydrophobic nanoparticles were synthesized by the micro-emulsion polymerization method. The hydrophobic QCM nanosensor was then characterized by attenuated total reflection Fourier transform infrared (ATR-FTIR) spectroscopy, atomic force microscopy (AFM) and zeta size analysis, and tested for real-time detection of lysozyme from aqueous solution. Kinetic and affinity studies were performed using lysozyme solutions of different concentrations. The mass (Δm) and frequency (Δf) shifts were used to evaluate the adsorption properties.
Keywords: nanosensor, HIC, lysozyme, QCM
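The Δf-to-Δm conversion mentioned above is conventionally done with the Sauerbrey equation. The abstract does not state the crystal's fundamental frequency, so the sketch below assumes a 5 MHz AT-cut crystal, for which the mass sensitivity is about 17.7 ng cm⁻² Hz⁻¹:

```python
def sauerbrey_mass(delta_f_hz, c_f=17.7, n=1):
    """Mass change per unit area (ng/cm^2) from a QCM frequency shift.

    delta_f_hz : frequency shift in Hz (negative for mass loading)
    c_f        : mass sensitivity, ~17.7 ng cm^-2 Hz^-1 for a 5 MHz
                 AT-cut crystal (an assumption; not stated in the abstract)
    n          : overtone number
    """
    return -c_f * delta_f_hz / n

# A -25 Hz shift on the fundamental corresponds to ~442.5 ng/cm^2 adsorbed.
print(sauerbrey_mass(-25.0))  # 442.5
```

Note that Sauerbrey strictly applies to thin, rigid films; viscoelastic protein layers may deviate from it.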
Procedia PDF Downloads 348
573 Screening of Antiviral Compounds in Medicinal Plants: Non-Volatiles
Authors: Tomas Drevinskas, Ruta Mickiene, Audrius Maruska, Nicola Tiso, Algirdas Salomskas, Raimundas Lelesius, Agneta Karpovaite, Ona Ragazinskiene, Loreta Kubiliene
Abstract:
The antiviral effect of substances accumulated by plants and of natural products is known to ethnopharmacy and modern medicine. Antiviral properties are usually attributed to volatile compounds and polyphenols. This research work is divided into several parts; the task of this part was to identify candidate plants, substances and preparation conditions for antiviral agents. Sixteen different medicinal plants and plant parts and two types of propolis were selected for screening. First, the extraction conditions for non-volatile compounds were investigated: three pre-selected plants were extracted with five different ethanol-water mixtures (96%, 75%, 60%, 40%, 20%, vol.) and bidistilled water. Total phenolic content, total flavonoid content and radical scavenging activity were determined. The results indicated that the optimal extractant is a 40% vol. ethanol-water mixture, which was used for all further investigations. All 16 selected plants, their parts and the two types of propolis were extracted with this extractant. The determined total phenolic content, total flavonoid content and radical scavenging activity indicated that extracts of Origanum vulgare L., Mentha piperita L., Geranium macrorrhizum L., Melissa officinalis L. and Desmodium canadense L. contain the highest extractable phenolic content (7.31, 5.48, 7.88, 8.02 and 7.16 rutin equivalents (mg/ml), respectively), flavonoid content (2.14, 2.23, 2.49, 0.79 and 1.51 rutin equivalents (mg/ml), respectively) and radical scavenging activity (11.98, 8.72, 13.47, 13.22 and 12.22 rutin equivalents (mg/ml), respectively). The composition of the extracts is analyzed using HPLC.
Keywords: antiviral effect, plants, propolis, phenols
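Reporting results in rutin equivalents implies calibration against a rutin standard curve. A minimal sketch of that conversion is shown below; the calibration absorbances are purely illustrative, not the study's data:

```python
import numpy as np

# Illustrative rutin standard curve: absorbance vs. concentration (mg/ml).
std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
std_abs = np.array([0.02, 0.27, 0.52, 1.02, 2.02])  # hypothetical readings

# Least-squares fit of A = slope * C + intercept.
slope, intercept = np.polyfit(std_conc, std_abs, 1)

def rutin_equivalents(sample_abs, dilution=1.0):
    """Convert a sample absorbance to rutin equivalents (mg/ml)."""
    return (sample_abs - intercept) / slope * dilution

print(round(rutin_equivalents(1.52), 2))  # 3.0 with this illustrative curve
```

The same template serves total phenolic, flavonoid and radical-scavenging assays, each with its own standard curve.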
Procedia PDF Downloads 324
572 Fake News Detection Based on Fusion of Domain Knowledge and Expert Knowledge
Authors: Yulan Wu
Abstract:
The spread of fake news on social media has caused significant harm to the public and the nation, with threats spanning various domains, including politics, economics, health, and more. News on social media often covers multiple domains, and the existing models studied by researchers and relevant organizations often perform well on datasets from a single domain. However, when these methods are applied to social platforms with news spanning multiple domains, their performance deteriorates significantly. Existing research has attempted to enhance detection performance on multi-domain datasets by adding single-domain labels to the data. However, these methods overlook the fact that a news article typically belongs to multiple domains, leading to the loss of domain knowledge contained in the news text. Research addressing this issue has found that news records in different domains often use different vocabularies to describe their content. In this paper, we propose a fake news detection framework that combines domain knowledge and expert knowledge. First, an unsupervised domain discovery module generates a low-dimensional vector for each news article, a domain embedding that retains multi-domain knowledge of the news content. Then, a feature extraction module uses these domain embeddings to guide multiple experts in extracting news knowledge and forming the overall feature representation. Finally, a classifier determines whether the news is fake. Experiments show that this approach can improve multi-domain fake news detection performance while reducing the cost of manually labeling domains.
Keywords: fake news, deep learning, natural language processing, multiple domains
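The "domain embedding guides multiple experts" step resembles a mixture-of-experts gate. The paper's exact architecture is not specified, so the following is only a minimal NumPy sketch of that pattern, with random weights standing in for trained ones:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def moe_features(domain_emb, text_feat, gate_w, experts_w):
    """Domain-guided mixture of experts (illustrative, not the paper's
    exact architecture): the domain embedding produces one gate weight
    per expert, and the expert outputs are blended into one vector."""
    gates = softmax(gate_w @ domain_emb)         # (n_experts,), sums to 1
    expert_out = np.tanh(experts_w @ text_feat)  # (n_experts, d_out)
    return gates @ expert_out                    # gated blend, (d_out,)

n_experts, d_dom, d_text, d_out = 3, 8, 16, 4
gate_w = rng.normal(size=(n_experts, d_dom))
experts_w = rng.normal(size=(n_experts, d_out, d_text))

fused = moe_features(rng.normal(size=d_dom), rng.normal(size=d_text),
                     gate_w, experts_w)
print(fused.shape)  # (4,)
```

A binary classifier (fake / real) would then be trained on the fused vector.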
Procedia PDF Downloads 73
571 Determination of Selected Engineering Properties of Giant Palm Seeds (Borassus Aethiopum) in Relation to Its Oil Potential
Authors: Rasheed Amao Busari, Ahmed Ibrahim
Abstract:
The engineering properties of giant palm seeds are crucial for the rational design of processing and handling systems. This research investigated some engineering properties of giant palm seeds in relation to their oil potential. Ripe giant palm fruits were sourced from parts of Zaria in Kaduna State and Ado Ekiti in Ekiti State, Nigeria. The mesocarps of the collected fruits were removed to obtain the nuts, which were dried under ambient conditions for several days. The actual moisture content of the nuts at the time of the experiment, determined using a KT100S moisture meter, ranged from 17.9% to 19.15%. The physical properties determined were the axial dimensions, geometric mean diameter, arithmetic mean diameter, sphericity, true and bulk densities, porosity, angle of repose, and coefficient of friction. The nuts were measured with a vernier caliper for physical assessment of their sizes. The axial dimensions of 100 nuts were taken; the results show that the major diameter ranges from 7.30 to 9.32 cm, the intermediate diameter from 7.2 to 8.9 cm, and the minor diameter from 4.2 to 6.33 cm. The mechanical properties determined were compressive force, compressive stress, and deformation, both at peak and at break, using an Instron hydraulic universal testing machine. The work also revealed that the giant palm seed can be classified as an oil-bearing seed: it gave an oil yield of 18% using the solvent extraction method. The results obtained from the study will help in solving problems of equipment design, handling, and further processing of the seeds.
Keywords: giant palm seeds, engineering properties, oil potential, moisture content, giant palm fruit
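The derived size descriptors listed above follow standard definitions in agricultural-engineering property studies. A small sketch, assuming those conventional formulas (the paper does not reproduce them) and illustrative input values:

```python
def seed_geometry(L, W, T):
    """Conventional size descriptors from the three axial dimensions
    (major L, intermediate W, minor T, all in cm)."""
    Dg = (L * W * T) ** (1 / 3)  # geometric mean diameter
    Da = (L + W + T) / 3         # arithmetic mean diameter
    phi = Dg / L                 # sphericity (1 = perfect sphere)
    return Dg, Da, phi

def porosity(bulk_density, true_density):
    """Porosity (%) from bulk and true densities (same units)."""
    return (1 - bulk_density / true_density) * 100

# Illustrative nut in the middle of the reported size ranges.
Dg, Da, phi = seed_geometry(8.0, 8.0, 5.0)
print(round(Da, 2), round(porosity(0.6, 1.0), 1))  # 7.0 40.0
```

These quantities feed directly into hopper, conveyor and cracker sizing calculations.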
Procedia PDF Downloads 78
570 Using Visualization Techniques to Support Common Clinical Tasks in Clinical Documentation
Authors: Jonah Kenei, Elisha Opiyo
Abstract:
Electronic health records (EHRs), as repositories of patient information, are nowadays the most commonly used technology to record, store and review patient clinical records and perform other clinical tasks. However, the accurate identification and retrieval of relevant information from clinical records is difficult because clinical documents are unstructured and, in particular, lack a clear organization. Medical practice therefore faces a challenge from the rapid growth of health information in EHRs, mostly in narrative text form, and it is becoming important to manage the growing amount of data for a single patient effectively. There is thus a requirement to visualize EHRs in a way that aids physicians in clinical tasks and medical decision-making. Leveraging text visualization techniques for unstructured clinical narrative texts is a new area of research that aims to provide better information extraction and retrieval to support clinical decision-making in scenarios where the volume of generated data continues to grow. Clinical datasets in EHRs offer a lot of potential for training accurate statistical models to classify facets of information, which can then be used to improve patient care and outcomes; however, the unstructured nature of clinical texts is a common obstacle in many clinical note datasets. This paper examines the issue of taking raw clinical texts and mapping them into meaningful structures that can support healthcare professionals working with narrative texts. Our work is the result of a collaborative design process aided by empirical data collected through formal usability testing.
Keywords: classification, electronic health records, narrative texts, visualization
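A first step in mapping raw notes to structure is splitting them into labelled sections. The sketch below is only an illustrative regex-based sectionizer; the header list is hypothetical, not a clinical standard or the paper's method:

```python
import re

# Illustrative section headers; real notes vary widely by institution.
SECTION_RE = re.compile(r"^(history|medications|allergies|assessment|plan):",
                        re.IGNORECASE | re.MULTILINE)

def sectionize(note):
    """Split a free-text clinical note into {header: body} sections."""
    sections, matches = {}, list(SECTION_RE.finditer(note))
    for i, m in enumerate(matches):
        end = matches[i + 1].start() if i + 1 < len(matches) else len(note)
        sections[m.group(1).lower()] = note[m.end():end].strip()
    return sections

note = "History: hypertension for 10 years.\nPlan: start lisinopril."
print(sectionize(note))
# {'history': 'hypertension for 10 years.', 'plan': 'start lisinopril.'}
```

Once sectioned, each fragment can be classified and rendered in a task-oriented visual layout.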
Procedia PDF Downloads 118
569 Optimum Dewatering Network Design Using Firefly Optimization Algorithm
Authors: S. M. Javad Davoodi, Mojtaba Shourian
Abstract:
A groundwater table close to the ground surface causes major problems in construction and mining operations. One of the methods to control groundwater in such cases is pumping wells, which remove excess water from the project site and lower the water table to a desirable level. Although the efficiency of this method is acceptable, it is expensive to apply, which means that even a small improvement in the design of the pumping wells can lead to substantial cost savings. To minimize the total cost of the pumping-well method, a simulation-optimization approach is applied. The proposed model integrates MODFLOW as the simulation model with the Firefly algorithm as the optimizer: MODFLOW computes the drawdown due to pumping in an aquifer, and the Firefly algorithm finds the optimum values of the design parameters, namely the number, pumping rates and layout of the wells. The developed Firefly-MODFLOW model is applied to minimize the cost of the dewatering project for the ancient mosque of Kerman city in Iran. Repeated runs of the Firefly-MODFLOW model indicate that drilling two wells with a total pumping rate of 5503 m3/day solves the minimization problem. Results show that implementing the proposed solution leads to at least 1.5 m of drawdown in the aquifer beneath the mosque region, while the subsidence due to groundwater depletion is less than 80 mm. Sensitivity analyses indicate that the target groundwater drawdown has an enormous impact on the total cost of the project. Besides, in a hypothetical aquifer, decreasing the hydraulic conductivity decreases the total water extraction required for dewatering.
Keywords: groundwater dewatering, pumping wells, simulation-optimization, MODFLOW, firefly algorithm
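The optimization side of such a loop can be sketched with a textbook firefly algorithm. This is a generic sketch of Yang's standard form, not the authors' implementation, and the quadratic toy objective merely stands in for a MODFLOW drawdown-and-cost evaluation:

```python
import numpy as np

def firefly_minimize(cost, bounds, n_fireflies=15, n_iter=100,
                     beta0=1.0, gamma=0.01, alpha=0.2, seed=0):
    """Minimal firefly algorithm: every firefly moves toward each
    brighter (lower-cost) one, with attractiveness decaying with
    squared distance, plus a small random exploration step."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_fireflies, len(lo)))
    F = np.array([cost(x) for x in X])
    best_x, best_f = X[F.argmin()].copy(), F.min()
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if F[j] < F[i]:
                    beta = beta0 * np.exp(-gamma * np.sum((X[i] - X[j]) ** 2))
                    X[i] += beta * (X[j] - X[i]) + alpha * (rng.random(len(lo)) - 0.5)
                    X[i] = np.clip(X[i], lo, hi)
                    F[i] = cost(X[i])
        if F.min() < best_f:
            best_f, best_x = F.min(), X[F.argmin()].copy()
    return best_x, best_f

# Toy stand-in for the simulation-evaluated objective (a quadratic bowl);
# in the paper each evaluation would be a full groundwater-flow run.
best_x, best_f = firefly_minimize(lambda x: float(np.sum(x ** 2)),
                                  (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
print(round(best_f, 3))
```

In the coupled model, `cost` would run MODFLOW for a candidate well layout and return the project cost, with infeasible drawdowns penalized.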
Procedia PDF Downloads 294
568 Management Prospects of Winery By-Products Based on Phenolic Compounds and Antioxidant Activity of Grape Skins: The Case of Greek Ionian Islands
Authors: Marinos Xagoraris, Iliada K. Lappa, Charalambos Kanakis, Dimitra Daferera, Christina Papadopoulou, Georgios Sourounis, Charilaos Giotis, Pavlos Bouchagier, Christos S. Pappas, Petros A. Tarantilis, Efstathia Skotti
Abstract:
The aim of this work was to recover phenolic compounds from grape skins of Greek varieties grown on the Ionian Islands, in order to form the basis of calculations for their further utilization in the context of the circular economy. The isolation and further utilization of phenolic compounds is an important issue in the management of winery by-products. For this purpose, 37 samples were collected, extracted, and analyzed to provide an appropriate basis for their sustainable exploitation. Extraction of the bioactive compounds was carried out using an eco-friendly, non-toxic, and highly effective water-glycerol solvent system. The extracts were then analyzed using UV-Vis, liquid chromatography-mass spectrometry (LC-MS), FT-IR, and Raman spectroscopy, and the total phenolic content and antioxidant activity were measured. LC-MS chromatography showed qualitative differences between varieties. Peaks were attributed to monomeric 3-flavanols as well as monomeric, dimeric, and trimeric proanthocyanidins. The FT-IR and Raman spectra agreed with the chromatographic data and contributed to identifying phenolic compounds. The grape skins exhibited a high total phenolic content (TPC), proving that a large fraction of the polyphenols remains in the pomace during vinification. This study confirmed that grape skins from the Ionian Islands are a promising source of bioactive compounds, suggesting their utilization within a bio-economic and environmental strategic framework.
Keywords: antioxidant activity, grape skin, phenolic compounds, waste recovery
Procedia PDF Downloads 148
567 Effect of Ultrasonic Assisted High Pressure Soaking of Soybean on Soymilk Properties
Authors: Rahul Kumar, Pavuluri Srinivasa Rao
Abstract:
This study investigates the effect of ultrasound-assisted high pressure (HP) treatment on the soaking characteristics of soybeans and the quality of the extracted soy milk. The soybean (variety) was subjected to sonication (US) at ambient temperature for 15 and 30 min, followed by HP treatment in the range of 200-400 MPa for dwell times of 5-10 min. The bean samples were also compared with HPP-only samples (200-400 MPa; 5-10 min), overnight-soaked samples (12-15 h), and thermally treated samples (100°C/30 min) followed by overnight soaking for 12-15 h. Rapid soaking within 40 min was achieved by the combined US-HPP treatment, reducing the soaking time by a factor of about 25 compared with overnight soaking or thermal treatment followed by soaking. Reducing the soaking time of soybeans is expected to suppress the development of the undesirable beany flavor that arises during conventional soaking and milk extraction. The optimum moisture uptake of the sonicated, pressure-treated soybeans was 60-62% (w.b.), similar to that obtained after overnight soaking for 12-15 h or thermal treatment followed by overnight soaking. The pH of the soy milk was not much affected by the different US-HPP treatments and overnight soaking, remaining in the range of 6.6-6.7, much like normal cow milk; for milk extracted from thermally treated soy samples, the pH dropped to 6.2. Total soluble solids (TSS) were maximal for the conventionally overnight-soaked samples, at 10.3-10.6. For the HPP-treated soy milk the TSS fell to 7.4, and sonication reduced it further to 6.2, with TSS decreasing as ultrasonication time increased. A further reduction in TSS to 2.3 was observed in soy milk produced from thermally treated samples after overnight soaking.
Our results indicate that milk from thermally treated beans is less stable and more acidic, and that the combined treatment makes soaking very rapid compared to overnight soaking; hence milk productivity can be enhanced with less development of the undesirable beany flavor.
Keywords: beany flavor, high pressure processing, high pressure, soybean, soaking, milk, ultrasound, wet basis
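The wet-basis (w.b.) moisture figures quoted above follow the usual definition, which the sketch below illustrates with hypothetical masses (not the study's measurements):

```python
def moisture_wb(mass_wet, mass_dry):
    """Moisture content on a wet basis (%), the convention behind
    the 60-62% (w.b.) uptake figures reported above."""
    return (mass_wet - mass_dry) / mass_wet * 100

# A lot with 100 g dry matter swelling to 256 g total sits at ~61% (w.b.),
# inside the reported optimum band.
print(round(moisture_wb(256.0, 100.0), 1))  # 60.9
```

Note that the wet basis divides by the wet mass; the dry-basis value for the same lot would be 156%.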
Procedia PDF Downloads 255
566 Inhouse Inhibitor for Mitigating Corrosion in the Algerian Oil and Gas Industry
Authors: Hadjer Didouh, Mohamed Hadj Meliani, Izzeddine Sameut Bouhaik
Abstract:
As global demand for natural gas intensifies, Algeria is increasing its production to meet this rising need, placing significant strain on the nation's extensive pipeline infrastructure. Sonatrach, Algeria's national oil and gas company, faces persistent challenges from metal corrosion, particularly microbiologically influenced corrosion (MIC), leading to substantial economic losses. This study investigates the corrosion-inhibiting properties of Calotropis procera extracts, known as karanka, as a sustainable alternative to conventional inhibitors, which often pose environmental risks. The Calotropis procera extracts were evaluated for their efficacy on carbon steel API 5L X52 through electrochemical techniques, including potentiodynamic polarization and electrochemical impedance spectroscopy (EIS), under simulated operational conditions at varying concentrations, particularly at 10%, and elevated temperatures up to 60°C. The results demonstrated remarkable inhibition efficiency, achieving 96.73% at 60°C, attributed to the formation of a stable protective film on the metal surface that suppressed anodic and cathodic corrosion reactions. Scanning electron microscopy (SEM) confirmed the stability and adherence of these protective films, while EIS analysis indicated a significant increase in charge transfer resistance, highlighting the extract's effectiveness in enhancing corrosion resistance. The abundant availability of Calotropis procera in Algeria and its low-cost extraction processes present a promising opportunity for sustainable biocorrosion management strategies in the oil and gas industry, reinforcing the potential of plant-based extracts as viable alternatives to synthetic inhibitors for environmentally friendly corrosion control.
Keywords: corrosion inhibition, Calotropis procera, microbiologically influenced corrosion, eco-friendly inhibitor
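Inhibition efficiencies like the 96.73% figure are conventionally computed from the polarization and EIS data. The formulas below are the standard ones; the numerical inputs are illustrative, since the abstract reports only the final percentage:

```python
def inhibition_efficiency_icorr(i_corr_blank, i_corr_inhibited):
    """Inhibition efficiency (%) from corrosion current densities
    measured by potentiodynamic polarization (same units)."""
    return (i_corr_blank - i_corr_inhibited) / i_corr_blank * 100

def inhibition_efficiency_rct(rct_blank, rct_inhibited):
    """Inhibition efficiency (%) from EIS charge-transfer resistances:
    a larger Rct with inhibitor means a better-protected surface."""
    return (1 - rct_blank / rct_inhibited) * 100

# Illustrative values only, chosen to land near the reported efficiency.
print(round(inhibition_efficiency_icorr(245.0, 8.0), 2))  # 96.73
print(round(inhibition_efficiency_rct(50.0, 1529.0), 2))
```

Agreement between the two routes is itself a useful consistency check on the electrochemical measurements.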
Procedia PDF Downloads 25
565 A Visual Analytics Tool for the Structural Health Monitoring of an Aircraft Panel
Authors: F. M. Pisano, M. Ciminello
Abstract:
Aerospace, mechanical, and civil engineering infrastructures can take advantage of damage detection and identification strategies, in terms of maintenance cost reduction and operational life improvement as well as safety. The challenge is to detect so-called “barely visible impact damage” (BVID), due to low/medium energy impacts, that can progressively compromise structural integrity. The occurrence of any local change in material properties that can degrade structural performance is monitored using Structural Health Monitoring (SHM) systems, which compare the structure's states before and after damage occurs. SHM looks for any "anomalous" response collected by means of sensor networks and analyzed using appropriate algorithms. Independently of the specific analysis approach adopted for structural damage detection and localization, textual reports, tables and graphs describing possible outlier coordinates and damage severity are usually the artifacts from which information about the current health condition of the structure under investigation must be extracted. Visual Analytics can support the processing of monitored measurements by offering data navigation and exploration tools that leverage the native human capability of understanding images faster than texts and tables. Herein, the enrichment of an SHM system through the integration of a Visual Analytics component is investigated. Analytical dashboards have been created by combining worksheets, providing structural analysts with a useful Visual Analytics tool for exploring the structural health conditions examined by a Principal Component Analysis based algorithm.
Keywords: interactive dashboards, optical fibers, structural health monitoring, visual analytics
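A common PCA-based outlier score in SHM is the Q-statistic (squared reconstruction residual) of new sensor snapshots against a model of the healthy baseline. The sketch below illustrates that generic idea with synthetic data; it is not the authors' specific algorithm:

```python
import numpy as np

def pca_q_statistic(baseline, samples, n_components=2):
    """Q-statistic of new sensor snapshots against a PCA model of
    the healthy baseline; large values flag 'anomalous' responses."""
    mu = baseline.mean(axis=0)
    _, _, Vt = np.linalg.svd(baseline - mu, full_matrices=False)
    P = Vt[:n_components].T                # principal subspace basis
    centered = samples - mu
    resid = centered - centered @ P @ P.T  # part PCA cannot reconstruct
    return np.sum(resid ** 2, axis=1)

rng = np.random.default_rng(1)
# Synthetic healthy strain readings from 6 channels, driven by 2 factors.
healthy = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 6))
snapshot = healthy[:3].copy()
snapshot[0, 0] += 5.0                      # inject a local anomaly
q = pca_q_statistic(healthy, snapshot)
print(q.argmax())  # 0: the perturbed snapshot stands out
```

On a dashboard, these scores would be plotted per sensor location so the analyst can spot candidate BVID sites at a glance.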
Procedia PDF Downloads 124
564 Mordenite as Catalyst Support for Complete Volatile Organic Compounds Oxidation
Authors: Yuri A. Kalvachev, Totka D. Todorova
Abstract:
Zeolite mordenite has been investigated as a transition-metal support for the preparation of efficient catalysts for the oxidation of volatile organic compounds (VOCs). Highly crystalline mordenite samples were treated with hydrofluoric acid and ammonium fluoride to obtain a hierarchical material with secondary porosity. The supports obtained by this method have a high active surface area and good diffusion properties and prevent the extraction of metal components during catalytic reactions. The active metal phases, platinum and copper, were loaded by impregnation on both mordenite materials (the parent and its acid-treated counterpart), yielding monometallic Pt and Cu and bimetallic Pt/Cu catalysts. The metal phases were finely dispersed as nanoparticles on the functional porous materials. All samples were characterized by X-ray diffraction analysis, nitrogen adsorption, scanning electron microscopy (SEM), X-ray photoelectron spectroscopy (XPS) and temperature-programmed reduction (TPR). The catalytic activity of the obtained samples was investigated in the complete oxidation of three organic molecules, methane, propane and benzene, using gas chromatography (GC), and the activities of the platinum, copper and platinum/copper catalysts were compared. For methane oxidation, the activity of the metal-loaded mordenite catalysts is almost the same whether the parent or the treated mordenite is used as support. For larger molecules such as propane and benzene, the catalysts based on treated mordenite are more active than those based on the parent zeolite.
Keywords: metal loaded catalysts, mordenite, VOCs oxidation, zeolites
Procedia PDF Downloads 131
563 Preserving Urban Cultural Heritage with Deep Learning: Color Planning for Japanese Merchant Towns
Authors: Dongqi Li, Yunjia Huang, Tomo Inoue, Kohei Inoue
Abstract:
Urban cultural heritage is facing the impact and destruction of modernization and urbanization. Many historical areas are losing their historical information and regional cultural characteristics, so systematic color planning for historical areas is a necessary part of conservation. Japan, an early adopter of urban color planning, has a systematic approach to it. Hence, this study selects five merchant towns from the category of important traditional building preservation areas in Japan as its subjects, to explore the color structure and emotion of this type of historic area. First, an image semantic segmentation method identifies the buildings, roads, and landscape environments, and their color data are extracted for color composition and emotion analysis to summarize their common features. Second, online evaluations are processed with natural language processing to extract keywords. Correlation analysis between the color structure and the keywords provides a valuable reference for conservation decisions in these historic town areas. This paper also combines the color structure and online evaluation results with generative adversarial networks to generate predicted images of color structure improvements and color improvement schemes. The methods and conclusions of this paper can provide new ideas for the digital management of environmental colors in historic districts and a valuable reference for the inheritance of local traditional culture.
Keywords: historic districts, color planning, semantic segmentation, natural language processing
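After segmentation isolates the building pixels, a palette can be extracted by clustering their colors. The following is a minimal illustration of that step using plain k-means on a toy two-color "facade"; it is not the paper's pipeline, and the colors are invented:

```python
import numpy as np

def dominant_colors(pixels, k=2, n_iter=20):
    """Plain k-means on an (N, 3) array of RGB pixels: the cluster
    centres serve as the extracted colour palette."""
    # Deterministic init: k pixels spread evenly through the array.
    centers = pixels[np.linspace(0, len(pixels) - 1, k, dtype=int)]
    for _ in range(n_iter):
        d = np.linalg.norm(pixels[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([pixels[labels == i].mean(axis=0)
                            for i in range(k)])
    return centers

# Toy image: half dark wood-brown, half white plaster.
img = np.vstack([np.tile([101, 67, 33], (50, 1)),
                 np.tile([240, 235, 225], (50, 1))]).astype(float)
print(dominant_colors(img, k=2))  # recovers the two palette colours
```

Palette statistics aggregated over many facades would then feed the color-composition and emotion analysis described above.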
Procedia PDF Downloads 88
562 Unlocking Justice: Exploring the Power and Challenges of DNA Analysis in the Criminal Justice System
Authors: Sandhra M. Pillai
Abstract:
This article examines the relevance, difficulties, and potential applications of DNA analysis in the criminal justice system. A potent tool for connecting suspects to crime scenes, clearing the innocent of wrongdoing, and resolving cold cases, DNA analysis has transformed forensic investigations. The scientific foundations of DNA analysis, including DNA extraction, sequencing, and statistical analysis, are covered, as are the quality assurance procedures, chain of custody, and DNA sample storage practices needed to guarantee accurate and trustworthy findings. DNA analysis has significantly advanced forensic science, but it also raises substantial moral and legal issues. To safeguard individual rights and uphold public confidence, privacy concerns, possible discrimination, and misuse of DNA information must be properly addressed. The article also emphasises the effects on people and communities across the criminal justice system while highlighting the necessity of equity, openness, and fair access to DNA testing. It then describes the obstacles and future directions for DNA analysis, looking at cutting-edge technologies like next-generation sequencing, which promises to make DNA analysis quicker and more affordable, and emphasising the significance of multidisciplinary collaboration among scientists, law enforcement organisations, legal experts, and policymakers to ensure the appropriate and informed use of DNA evidence. In conclusion, DNA analysis holds enormous potential for improving criminal justice. We can exploit the potential of DNA technology while respecting the ideals of justice, fairness, and individual rights by navigating the ethical, legal, and societal issues and encouraging discussion and collaboration.
Keywords: DNA analysis, DNA evidence, reliability, validity, legal framework, admissibility, ethical considerations, impact, future directions, challenges
Procedia PDF Downloads 64
561 Computational Intelligence and Machine Learning for Urban Drainage Infrastructure Asset Management
Authors: Thewodros K. Geberemariam
Abstract:
The rapid physical expansion of urbanization, coupled with aging infrastructure, presents unique decision-making and management challenges for many big-city municipalities, which must upgrade and maintain their existing aging urban drainage infrastructure systems to keep up with demand. Given the overall contribution of assets to municipal revenue and the importance of infrastructure to the success of a livable city, many municipalities are currently looking for a robust and smart urban drainage infrastructure asset management solution that combines management, financial, engineering and technical practices. Such robust decision-making must rely on sound, complete, current and relevant data that enables asset valuation, impairment testing, lifecycle modeling, and forecasting across multiple asset portfolios. In this paper, predictive computational intelligence (CI) and multi-class machine learning (ML), coupled with online, offline, and historical record data collected from an array of multi-parameter sensors, are used to extract the operational and non-conforming patterns hidden in structured and unstructured data and to produce actionable insight into the current and future states of the network. This paper aims to improve the strategic decision-making process by identifying all possible alternatives, evaluating the risk of each, and choosing the one most likely to attain the required goal in a cost-effective manner, using historical and near real-time urban drainage infrastructure data for assets that have previously not benefited from advances in computational intelligence and machine learning.
Keywords: computational intelligence, machine learning, urban drainage infrastructure, classification, prediction, asset management
Procedia PDF Downloads 152
560 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach
Authors: Gong Zhilin, Jing Yang, Jian Yin
Abstract:
The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The proposed model comprises five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw input data are pre-processed to enhance their quality by alleviating missing data, noisy data and null values. The pre-processed data are class-imbalanced in nature and are therefore handled with a K-means clustering-based SMOTE model. From the class-balanced data, the most relevant features are extracted: improved Principal Component Analysis (PCA) features, statistical features (mean, median, standard deviation) and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most informative are selected with the Self-improved Arithmetic Optimization Algorithm (SI-AOA), a conceptual improvement of the standard Arithmetic Optimization Algorithm. The deep learning models employed are Long Short-Term Memory (LSTM), a Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained with the selected optimal features, and their outputs enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is also fine-tuned with the SI-AOA.
Keywords: credit card, data mining, fraud detection, money transactions
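The statistical and higher-order statistical features named above (mean, median, standard deviation, skewness, kurtosis) can be computed directly from each transaction window. A minimal sketch using central-moment definitions (population moments, excess kurtosis; the paper's exact estimator conventions are not stated):

```python
import numpy as np

def statistical_features(x):
    """Low- and higher-order statistical features of one transaction
    window, matching the feature families named in the abstract."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    m2 = ((x - mu) ** 2).mean()  # population variance
    m3 = ((x - mu) ** 3).mean()
    m4 = ((x - mu) ** 4).mean()
    return {"mean": float(mu),
            "median": float(np.median(x)),
            "std": float(x.std()),
            "skewness": float(m3 / m2 ** 1.5),   # 0 for symmetric data
            "kurtosis": float(m4 / m2 ** 2 - 3)}  # excess kurtosis

print(statistical_features([1, 2, 3, 4, 5]))
# symmetric window: skewness 0.0, excess kurtosis -1.3
```

These five numbers per window would be concatenated with the PCA features before the SI-AOA selection step.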
Procedia PDF Downloads 131