Search results for: data mining techniques
27711 Cultural References in Jean-François Ménard's French Translation of Harry Potter à l'école des sorciers: An Analysis of the Translated Catchphrases and Spells and Cultural Elements
Authors: Brynn Patrice Fader
Abstract:
The objective of this research project is to assess the ways in which Jean-François Ménard's French translation, Harry Potter à l'école des sorciers, renders the cultural references of the original text, J. K. Rowling's Harry Potter and the Philosopher's Stone. The method of this analysis is to focus on the reasons for and the ways in which Ménard translates the spells and catchphrases throughout the novel and the effects that these choices have on the reader. While at times Ménard resorts to omission, manipulation, or borrowing, he also contrasts these techniques by transferring the cultural references using a direct translational approach. It appears that the translator resorts to techniques other than direct translation when it is necessary to ensure that the target audience will understand the events and conversations taking place.
Keywords: cultural elements, direct translation, manipulation, omission
Procedia PDF Downloads 317
27710 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction
Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz
Abstract:
In the software development lifecycle, quality prediction techniques are of prime importance for minimizing future design errors and expensive maintenance. Many techniques have been proposed by various researchers, but with the increasing complexity of the software lifecycle model, it is crucial to develop a flexible system which can cater for the factors that ultimately have an impact on the quality of the end product. These factors include properties of the software development process and of the product, along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving the time and resources otherwise spent on eliminating design errors and on costly maintenance. The technique can be brought into practical use through successful training.
Keywords: software quality, fuzzy logic, perceptron, prediction
Procedia PDF Downloads 317
27709 A Comparison of Convolutional Neural Network Architectures for the Classification of Alzheimer's Disease Patients Using MRI Scans
Authors: Tomas Premoli, Sareh Rowlands
Abstract:
In this study, we investigate the impact of various convolutional neural network (CNN) architectures on the accuracy of diagnosing Alzheimer's disease (AD) using patient MRI scans. Alzheimer's disease is a debilitating neurodegenerative disorder that affects millions worldwide. Early, accurate, and non-invasive diagnostic methods are required for providing optimal care and symptom management. Deep learning techniques, particularly CNNs, have shown great promise in enhancing this diagnostic process. We aim to contribute to the ongoing research in this field by comparing the effectiveness of different CNN architectures and providing insights for future studies. Our methodology involved preprocessing MRI data, implementing multiple CNN architectures, and evaluating the performance of each model. We employed intensity normalization, linear registration, and skull stripping for our preprocessing. The selected architectures included VGG, ResNet, and DenseNet models, all implemented using the Keras library. We employed transfer learning and trained models from scratch to compare their effectiveness. Our findings demonstrated significant differences in performance among the tested architectures, with DenseNet201 achieving the highest accuracy of 86.4%. Transfer learning proved to be helpful in improving model performance. We also identified potential areas for future research, such as experimenting with other architectures, optimizing hyperparameters, and employing fine-tuning strategies. By providing a comprehensive analysis of the selected CNN architectures, we offer a solid foundation for future research in Alzheimer's disease diagnosis using deep learning techniques. Our study highlights the potential of CNNs as a valuable diagnostic tool and emphasizes the importance of ongoing research to develop more accurate and effective models.
Keywords: Alzheimer's disease, convolutional neural networks, deep learning, medical imaging, MRI
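As an illustration of the kind of transfer-learning setup described in this abstract, the following is a minimal, hypothetical Keras sketch (not the authors' code): a frozen DenseNet201 base with a new classification head. The input size, optimizer, dropout rate, and binary AD-vs-control labelling are assumptions; the study itself compares several architectures and also trains models from scratch.

```python
# Minimal sketch (not the authors' code): transfer learning with DenseNet201 in Keras
# for a binary AD-vs-control classifier. Input size, optimizer and head layers are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_densenet201_classifier(input_shape=(224, 224, 3), n_classes=2):
    # Load DenseNet201 pretrained on ImageNet, without its classification head
    base = tf.keras.applications.DenseNet201(
        include_top=False, weights="imagenet", input_shape=input_shape)
    base.trainable = False  # freeze the convolutional base for transfer learning

    inputs = layers.Input(shape=input_shape)
    x = tf.keras.applications.densenet.preprocess_input(inputs)
    x = base(x, training=False)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)

    model = models.Model(inputs, outputs)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_densenet201_classifier()
# model.fit(train_ds, validation_data=val_ds, epochs=20)  # train_ds/val_ds: preprocessed MRI slices
```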
Procedia PDF Downloads 73
27708 Integration of Smart Grid Technologies with Smart Phones for Energy Monitoring and Management
Authors: Arjmand Khaliq, Pemra Sohaib
Abstract:
There is an increasing trend toward the use of smart devices in the present age. Growth in computing techniques and advances in hardware have also, over time, greatly expanded the use of sensors and smart devices. As a result, the use of smart devices for control, management, communication, and optimization has become very popular. This paper proposes a methodology involving a sensing and switching unit for the load, two-way communication between the utility company and consumers' smart phones using cellular techniques, and price signaling, resulting in the active participation of users in energy management. The goal of the proposed control methodology is the active participation of the user in energy management while accommodating renewable energy resources. This will provide load adjustment according to the consumer's choice, increased security and reliability for the consumer, switching of loads according to consumer needs, and monitoring and management of energy.
Keywords: cellular networks, energy management, renewable energy source, smart grid technology
Procedia PDF Downloads 413
27707 Cooperative CDD Scheme Based On Hierarchical Modulation in OFDM System
Authors: Seung-Jun Yu, Yeong-Seop Ahn, Young-Min Ko, Hyoung-Kyu Song
Abstract:
In order to achieve a high data rate and increase spectral efficiency, the multiple-input multiple-output (MIMO) system has been proposed. However, multiple antennas are limited by size and cost. Therefore, the recently developed cooperative diversity scheme, which obtains transmit diversity with only the existing hardware by constituting a virtual antenna array, can be a solution. However, most of the cooperative techniques introduced so far share a common fault of decreased transmission rate, because the destination must receive decodable compositions of symbols from both the source and the relay. In this paper, we propose a cooperative cyclic delay diversity (CDD) scheme that uses hierarchical modulation. This scheme is free from the rate loss and allows seamless cooperative communication.
Keywords: MIMO, cooperative communication, CDD, hierarchical modulation
Procedia PDF Downloads 549
27706 Exploring Mechanical Properties of Additive Manufacturing Ceramic Components Across Techniques and Materials
Authors: Venkatesan Sundaramoorthy
Abstract:
The field of ceramics has undergone a remarkable transformation with the advent of additive manufacturing technologies. This comprehensive review explores the mechanical properties of additively manufactured ceramic components, focusing on key materials such as Alumina, Zirconia, and Silicon Carbide. The study reviews work by various authors on additive manufacturing techniques, including Stereolithography, Powder Bed Fusion, and Binder Jetting, highlighting their advantages and challenges. It provides a detailed analysis of the mechanical properties of these ceramics, offering insights into their hardness, strength, fracture toughness, and thermal conductivity. Factors affecting mechanical properties, such as microstructure and post-processing, are thoroughly examined. Recent advancements and future directions in 3D-printed ceramics are discussed, showcasing the potential for further optimization and innovation. This review underscores the profound implications of additive manufacturing for ceramics in industries such as aerospace, healthcare, and electronics, ushering in a new era of engineering and design possibilities for ceramic components.
Keywords: mechanical properties, additive manufacturing, ceramic materials, PBF
Procedia PDF Downloads 66
27705 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets
Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe
Abstract:
Data are the primary asset of biomedical researchers, and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools to analyze such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g. supercomputers, GPU clusters etc.), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for research subjects that reside in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research, leveraging existing high performance computing resources, and analysis techniques currently available or being developed. It builds these into The Ark, an open-source web-based system designed to manage medical data. SPARK provides a next-generation biomedical data management solution that is based upon a novel Micro-Service architecture and Big Data technologies. The system serves to demonstrate the applicability of Micro-Service architectures for the development of high performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as insert (i.e. importing a GWAS dataset) and the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating non-relational NoSQL databases that have been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets, and enabling cutting edge analysis approaches that have previously been out of reach for many medical researchers.
Keywords: biomedical research, genomics, information systems, software
Procedia PDF Downloads 270
27704 Genome Editing in Sorghum: Advancements and Future Possibilities: A Review
Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie
Abstract:
The advancement of target-specific genome editing tools, including clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9), mega-nucleases, base editing (BE), prime editing (PE), transcription activator-like effector nucleases (TALENs), and zinc-finger nucleases (ZFNs), has paved the way for a modern era of gene editing. CRISPR/Cas9, as a versatile, simple, cost-effective and robust system for genome editing, has dominated the genome manipulation field over the last few years. The application of CRISPR/Cas9 in sorghum improvement is particularly vital in the context of ecological, environmental and agricultural challenges, as well as global climate change. In this context, gene editing using CRISPR/Cas9 can improve nutritional value, yield, resistance to pests and disease, and tolerance to different abiotic stresses. Moreover, CRISPR/Cas9 can potentially perform complex editing to reshape already available elite varieties and to create new genetic variations. Existing research is therefore targeted at further improving the effectiveness of CRISPR/Cas9 genome editing techniques to fruitfully edit endogenous sorghum genes. These findings suggest that genome editing is a feasible and successful venture in sorghum. Newer improvements and developments of CRISPR/Cas9 techniques have further enabled researchers to modify additional genes in sorghum with improved efficiency. The fruitful application and development of CRISPR techniques for genome editing in sorghum will help not only in gene discovery, the creation of new and improved traits, the regulation of gene expression, and sorghum functional genomics, but also in producing site-specific integration events.
Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield
Procedia PDF Downloads 59
27703 Finite Element Modeling Techniques of Concrete in Steel and Concrete Composite Members
Authors: J. Bartus, J. Odrobinak
Abstract:
The paper presents a 3D model for the nonlinear analysis of composite steel and concrete beams with web openings using the Finite Element Method (FEM). The core of the study is the introduction of basic modeling techniques, comprising the description of material behavior, the selection of appropriate elements, and recommendations for overcoming problems with convergence. Results from various finite element models are compared in the study. The main objective is to observe the concrete failure mechanism and its influence on the structural performance of the numerical models of the beams at particular load stages. The bearing capacity of the beams and the corresponding deformations, stresses, strains, and fracture patterns were determined. The results show how load-bearing elements consisting of concrete parts can be analyzed using FEM software with various options to create the most suitable numerical model. The paper demonstrates the versatility of Ansys software for structural simulations.
Keywords: Ansys, concrete, modeling, steel
Procedia PDF Downloads 121
27702 A Study of Fatigue Life Estimation of a Modular Unmanned Aerial Vehicle by Developing a Structural Health Monitoring System
Authors: Zain Ul Hassan, Muhammad Zain Ul Abadin, Muhammad Zubair Khan
Abstract:
Unmanned aerial vehicles (UAVs) have now become of predominant importance for various operations, and an immense amount of work is going on in this specific category. The structural stability and life of these UAVs are key factors that should be considered when deploying them on different intelligent operations, as their failure leads to the loss of sensitive real-time data and to cost. This paper presents applied research on the development of a structural health monitoring (SHM) system for a UAV designed and fabricated using a modular approach. Firstly, a modular UAV was designed which allows the components of the UAV to be dismantled and reassembled without affecting the whole assembly. This novel approach makes the vehicle very sustainable and decreases its maintenance cost significantly by making it possible to replace only the part leading to failure. The SHM system for the designed UAV architecture was then specified as a combination of wings integrated with strain gauges, an on-board data logger, bridge circuitry, and the ground station. For the purposes of this research, sensors were attached only to the wings, the most load-bearing part according to the analysis performed in ANSYS. Based on analysis of the load-time spectrum obtained by the data logger during flight, the fatigue life of the respective component was predicted using the fracture mechanics techniques of the rainflow counting method and Miner's rule. This allows the health of a specified component to be monitored over time, helping to avoid failure.
Keywords: fracture mechanics, rain flow method, structural health monitoring system, unmanned aerial vehicle
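To illustrate the fatigue-life step described above, here is a minimal, hypothetical Python sketch of Palmgren-Miner damage summation applied to cycles that a rainflow count of the flight load-time history is assumed to have already produced. The S-N curve constants and the cycle table are illustrative values, not the study's data.

```python
# Minimal sketch (illustrative, not the authors' code): Palmgren-Miner damage summation
# applied to cycles extracted from a flight load-time history by rainflow counting.
# The S-N curve constants (Basquin form N = C * S**(-m)) and the cycle table are assumptions.

def cycles_to_failure(stress_range_mpa, C=2.0e12, m=3.0):
    """Basquin-type S-N curve: number of cycles to failure at a given stress range."""
    return C * stress_range_mpa ** (-m)

def miner_damage(counted_cycles):
    """counted_cycles: iterable of (stress_range_mpa, n_cycles) pairs from rainflow counting."""
    return sum(n / cycles_to_failure(s) for s, n in counted_cycles)

# Example cycle table for one flight (hypothetical values)
one_flight = [(40.0, 120), (65.0, 35), (90.0, 6), (120.0, 1)]

damage_per_flight = miner_damage(one_flight)
estimated_flights = 1.0 / damage_per_flight  # failure predicted when cumulative damage reaches 1
print(f"Damage per flight: {damage_per_flight:.2e}, estimated life: {estimated_flights:,.0f} flights")
```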
Procedia PDF Downloads 294
27701 A Proper Design of Wind Turbine Grounding Systems under Lightning
Authors: M. A. Abd-Allah, Mahmoud N. Ali, A. Said
Abstract:
Lightning Protection Systems (LPS) for wind power generation are becoming an important public issue. Serious damage to blades and accidents in which low-voltage and control circuits break down frequently occur in many wind farms. A grounding system is one of the most important components required for appropriate LPSs in wind turbines (WTs). Proper design of a wind turbine grounding system is demanding, and several factors must be taken into account for its proper and effective implementation. In this paper, a procedure for the proper design of wind turbine grounding systems is introduced. The procedure relies on measuring the ground current of a simulated wind farm under lightning, taking soil ionization into consideration. The procedure also includes the Ground Potential Rise (GPR), the voltage distributions at ground surface level, and the touch potential. In particular, the contribution of mitigating techniques, such as rings, rods, and the proposed design, was investigated.
Keywords: WTs, Lightning Protection Systems (LPS), GPR, grounding system, mitigating techniques
Procedia PDF Downloads 377
27700 Comparison of Two Fuzzy Skyhook Control Strategies Applied to an Active Suspension
Authors: Reginaldo Cardoso, Magno Enrique Mendoza Meza
Abstract:
This work focuses on the simulation and comparison of two skyhook control techniques applied to a quarter-car model of an active suspension. The objective is to provide comfort to the driver. The main idea of skyhook control is to imagine a damper connected to an imaginary sky; the feedback is thus based on the resultant force between the imaginary damper and the suspension damper. The first control technique is a Mamdani fuzzy skyhook and the second is a Takagi-Sugeno fuzzy skyhook controller. In both controllers, the inputs are the relative velocity between the two masses and the vehicle body velocity; the output of the Mamdani fuzzy skyhook is the viscous-friction coefficient of the imaginary damper, while the output of the Takagi-Sugeno fuzzy skyhook is the force. Finally, we compared the techniques. The Mamdani fuzzy skyhook gave the more comfortable response for the driver, followed closely by the Takagi-Sugeno fuzzy skyhook.
Keywords: active suspension, Mamdani, quarter-car, skyhook, Sugeno
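For illustration only, the sketch below shows a zero-order Takagi-Sugeno fuzzy skyhook of the general kind compared in this abstract: body velocity and relative velocity are fuzzified, and a rule table yields a damper force by weighted-average defuzzification. The membership breakpoints and rule consequents are assumptions, not the authors' tuned controller.

```python
# Minimal sketch (not the authors' controller): a zero-order Takagi-Sugeno fuzzy skyhook.
# Inputs: body velocity v_body and relative velocity v_rel (body minus wheel); output: damper force.
# Membership breakpoints and rule consequents are assumptions for illustration only.

def tri(x, a, b, c):
    """Triangular membership function with corners a, b, c."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def mf_set(x, span=1.0):
    """Fuzzify x into Negative / Zero / Positive memberships over [-span, span]."""
    return {"N": tri(x, -2 * span, -span, 0.0),
            "Z": tri(x, -span, 0.0, span),
            "P": tri(x, 0.0, span, 2 * span)}

# Rule table: consequent force (N) for each (v_body label, v_rel label) pair,
# mimicking the skyhook idea: oppose body velocity whenever the damper can dissipate energy.
RULES = {("P", "P"): -800.0, ("P", "Z"): -400.0, ("P", "N"): 0.0,
         ("Z", "P"): -100.0, ("Z", "Z"): 0.0,    ("Z", "N"): 100.0,
         ("N", "P"): 0.0,    ("N", "Z"): 400.0,  ("N", "N"): 800.0}

def ts_skyhook_force(v_body, v_rel, span=1.0):
    mu_b, mu_r = mf_set(v_body, span), mf_set(v_rel, span)
    weights = {k: mu_b[k[0]] * mu_r[k[1]] for k in RULES}  # rule firing strengths
    total = sum(weights.values())
    if total == 0.0:
        return 0.0
    return sum(w * RULES[k] for k, w in weights.items()) / total  # weighted-average defuzzification

print(ts_skyhook_force(0.4, 0.2))  # body moving up while suspension extends -> negative (opposing) force
```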
Procedia PDF Downloads 464
27699 CanVis: Towards a Web Platform for Cancer Progression Tree Analysis
Authors: Michael Aupetit, Mahmoud Al-ismail, Khaled Mohamed
Abstract:
Cancer is a major public health problem all over the world. Breast cancer has the highest incidence rate of all cancers among women in Qatar, making its study a top priority for the country. Human cancer is a dynamic disease that develops over an extended period through the accumulation of a series of genetic alterations. A Darwinian process drives the tumor cells toward higher malignancy, growing the branches of a progression tree in the space of gene expression. Although it is not possible to track these genetic alterations dynamically for one patient, it is possible to reconstruct the progression tree from the aggregation of thousands of tumor cells' genetic profiles from thousands of different patients at different stages of the disease. Analyzing the progression tree is a way to detect pivotal molecular events that drive the malignant evolution and to provide a guide for the development of cancer diagnostics, prognostics and targeted therapeutics. In this work we present the development of a Visual Analytics web platform, CanVis, enabling users to upload gene-expression data and analyze their progression tree. The server computes the progression tree based on state-of-the-art techniques and allows an interactive visual exploration of this tree and of the gene-expression data along its branching structure, helping to discover potential driver genes.
Keywords: breast cancer, progression tree, visual analytics, web platform
Procedia PDF Downloads 416
27698 Collective Intelligence-Based Early Warning Management for Agriculture
Authors: Jarbas Lopes Cardoso Jr., Frederic Andres, Alexandre Guitton, Asanee Kawtrakul, Silvio E. Barbin
Abstract:
The main objective of the CyberBrain Mass Agriculture Alarm Acquisition and Analysis (CBMa4) project is to minimize the impacts of diseases and disasters on rice cultivation. For example, early detection of insects will reduce the volume of insecticides applied to the rice fields through the use of the CBMa4 platform. In order to reach this goal, two major factors need to be considered: (1) the social network of smart farmers; and (2) the warning data alarm acquisition and analysis component. This paper outlines the process for collecting warnings and improving the decision-making that responds to them. It involves two sub-processes: warning collection and understanding enrichment. Human sensors are combined with suitable basic data processing techniques in order to extract warning-related semantics according to collective intelligence. We identify each warning by a unit of semantic content called a 'warncon', with multimedia metaphors and metadata related to these metaphors. It is important to define the metric for measuring the relation among warncons. With this knowledge, a collective intelligence-based decision-making approach determines the action(s) to be launched regarding one or a set of warncons.
Keywords: agricultural engineering, warning systems, social network services, context awareness
Procedia PDF Downloads 382
27697 The Representation of J. D. Salinger's Views on Changes in American Society in the 1940s in The Catcher in the Rye
Authors: Jessadaporn Achariyopas
Abstract:
The objectives of this study are to analyze the protagonist of The Catcher in the Rye in terms of both ideological concepts and the narrative techniques which influence the construction of the representation, and to analyze the relationship between that representation and J. D. Salinger's views on changes in American society in the 1940s. This area of study concerns two theories: namely, the theory of representation and narratology. In addition, this research is intended to answer the following three questions. Firstly, how is the production of meaning through language in The Catcher in the Rye constructed? Secondly, what are J. D. Salinger's views on changes in American society in the 1940s? Lastly, what is the relationship between the representation and J. D. Salinger's views? The findings showed that the protagonist's views, J. D. Salinger's views, and changes in American society in the 1940s are clearly interrelated. The production of meaning, which is the representation of the protagonist's views, was constructed through narrative techniques. J. D. Salinger's views on changes in American society in the 1940s were the same antisocial perspectives as Holden Caulfield's, namely phoniness, alienation and meltdown.
Keywords: representation, construction of the representation, systems of representation, phoniness, alienation, meltdown
Procedia PDF Downloads 321
27696 Development of Fuzzy Logic and Neuro-Fuzzy Surface Roughness Prediction Systems Coupled with Cutting Current in Milling Operation
Authors: Joseph C. Chen, Venkata Mohan Kudapa
Abstract:
The development of two real-time surface roughness (Ra) prediction systems for milling operations was attempted. The systems used not only cutting parameters, such as feed rate and spindle speed, but also the cutting current generated during machining and measured by a clamp-type energy sensor. Two different approaches were developed. First, a fuzzy inference system (FIS), in which the fuzzy logic rules are generated by experts in the milling processes, was used to conduct prediction modeling using current cutting data. Second, a neuro-fuzzy system (ANFIS) was explored. Neuro-fuzzy systems are adaptive techniques in which data are collected on the network and processed, and rules are generated by the system. The inference system then uses these rules to predict Ra as the output. Experimental results showed that the parameters of spindle speed, feed rate, depth of cut, and input current variation can predict Ra. These two systems enable the prediction of Ra during the milling operation, with average accuracies of 91.83% and 94.48% for the FIS and ANFIS systems, respectively. Statistically, the ANFIS system provided better prediction accuracy than the FIS system.
Keywords: surface roughness, input current, fuzzy logic, neuro-fuzzy, milling operations
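As a rough illustration of the ANFIS-style approach described above, the following hypothetical Python sketch implements a first-order Takagi-Sugeno predictor of Ra from feed rate and cutting current. The membership parameters and linear consequent coefficients are assumed values; in the actual system they would be learned from the collected cutting data.

```python
import numpy as np

# Minimal sketch (not the authors' system): a first-order Takagi-Sugeno (ANFIS-style) predictor
# of surface roughness Ra from feed rate and cutting current. All parameters are assumptions.

def gauss(x, c, sigma):
    """Gaussian membership function."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

# Two fuzzy sets per input: "low" and "high" (center, sigma)
FEED_SETS = {"low": (0.05, 0.04), "high": (0.20, 0.06)}   # feed rate (mm/tooth)
CURR_SETS = {"low": (2.0, 1.0), "high": (6.0, 1.5)}       # cutting current (A)

# One linear consequent Ra = p*feed + q*current + r per rule (coefficients assumed)
CONSEQUENTS = {("low", "low"):  (2.0, 0.02, 0.30),
               ("low", "high"): (3.0, 0.05, 0.40),
               ("high", "low"): (6.0, 0.04, 0.50),
               ("high", "high"): (8.0, 0.08, 0.70)}

def predict_ra(feed, current):
    weight_sum, numerator = 0.0, 0.0
    for (f_lab, c_lab), (p, q, r) in CONSEQUENTS.items():
        fire = gauss(feed, *FEED_SETS[f_lab]) * gauss(current, *CURR_SETS[c_lab])
        numerator += fire * (p * feed + q * current + r)
        weight_sum += fire
    return numerator / weight_sum  # weighted average of rule outputs (micrometres)

print(round(predict_ra(0.12, 4.5), 3))
```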
Procedia PDF Downloads 145
27695 Innovation Potential of Palm Kernel Shells from the Littoral Region in Cameroon
Authors: Marcelle Muriel Domkam Tchunkam, Rolin Feudjio
Abstract:
This work investigates the ultrastructure and the physicochemical and thermal properties of Palm Kernel Shells (PKS). PKS Tenera waste samples were obtained from a palm oil mill in Dizangué Sub-Division, Littoral region of Cameroon, while PKS Dura waste samples were collected from the Institute of Agricultural Research for Development (IRAD) of Mbongo. A sodium hydroxide solution was used to wash the shells. They were then rinsed with demineralised water and dried in an oven at 70 °C for 72 hours. They were then ground and sieved to obtain powders from 0.04 mm to 0.45 mm in size. Transmission Electron Microscopy (TEM) and Scanning Electron Microscopy (SEM) were used to characterize the powder samples. Chemical compounds and elemental constituents, as well as thermal performance, were evaluated by the Van Soest method and the TEM/EDXA and SEM/EDS techniques. Thermal characterization was also performed using Differential Scanning Calorimetry (DSC) and Thermogravimetric Analysis (TGA). The results of the microstructural analysis revealed that most of the PKS material is made of particles with irregular morphology, mainly amorphous phases of carbon/oxygen with small amounts of Ca, K, and Mg. The DSC data enabled the derivation of the materials' thermal transition phases and the relevant characteristic temperatures and physical properties. Overall, our data show that PKS have nanopores and show potential in 3D printing and membrane filtration applications.
Keywords: DSC, EDXA, palm kernel shells, SEM, TEM
Procedia PDF Downloads 120
27694 Dynamic Analysis of Viscoelastic Plates with Variable Thickness
Authors: Gülçin Tekin, Fethi Kadıoğlu
Abstract:
In this study, the dynamic analysis of viscoelastic plates with variable thickness is examined. The solutions for the dynamic response of viscoelastic thin plates with variable thickness have been obtained by using the functional analysis method in conjunction with the Gâteaux differential. The four-node serendipity element with four degrees of freedom, namely the deflection and the bending and twisting moments at each node, is used. Additionally, boundary condition terms are included in the functional in a systematic way. For the viscoelastic modeling, the three-parameter Kelvin solid model is employed. The solutions obtained in the Laplace-Carson domain are transformed back to the real time domain by using the MDOP, Dubner & Abate, and Durbin inverse transform techniques. To test the performance of the proposed mixed finite element formulation, numerical examples are treated.
Keywords: dynamic analysis, inverse Laplace transform techniques, mixed finite element formulation, viscoelastic plate with variable thickness
Procedia PDF Downloads 331
27693 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features
Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova
Abstract:
Emotion recognition is a challenging problem. It is still an open problem from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used for building an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are given, and a discussion about the validity and the expressiveness of different emotions is presented. A comparison is made between the classifiers built from facial data only, from voice data only, and from the combination of both. The need for a better combination of the information from facial expressions and voice data is argued.
Keywords: emotion recognition, facial recognition, signal processing, machine learning
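A minimal, hypothetical sketch of this kind of comparison is shown below: SVM classifiers trained on voice-only, face-only, and combined feature sets, scored by cross-validated accuracy. The synthetic feature arrays stand in for the features actually extracted from the video recordings.

```python
# Minimal sketch (not the authors' pipeline): SVM classifiers on voice-only, face-only,
# and combined feature sets, compared by cross-validated accuracy. Feature extraction is
# assumed to have already produced the numeric arrays used below (synthetic placeholders here).
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
voice_feats = rng.normal(size=(n, 20))   # e.g., pitch/energy/formant statistics (placeholder)
face_feats = rng.normal(size=(n, 30))    # e.g., facial landmark descriptors (placeholder)
labels = rng.integers(0, 6, size=n)      # six basic emotions (placeholder labels)

def svm_accuracy(X, y):
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    return cross_val_score(clf, X, y, cv=5).mean()

print("voice only:", svm_accuracy(voice_feats, labels))
print("face only :", svm_accuracy(face_feats, labels))
print("combined  :", svm_accuracy(np.hstack([voice_feats, face_feats]), labels))
```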
Procedia PDF Downloads 316
27692 Investigating the Prevalence of HCV from Laboratory Centers in Tehran City - Iran by Electrochemiluminescence (ECL) and PCR Techniques
Authors: Zahra Rakhshan Masoudi, Sona Rostampour Yasouri
Abstract:
Considering that blood transfusion is often the only way to save the lives of patients and of healthy people who have suffered sudden accidents, an important concern is the hepatitis C virus (HCV), the most important cause of disease following blood transfusion. HCV is one of the major global health problems, and its transmission through blood causes life-threatening complications and extensive legal, social and economic consequences. Unfortunately, there is still no effective vaccine available to prevent HCV. In Iran, exact statistics on the prevalence of this disease have not yet been fully announced. The main purpose of this study is to investigate the prevalence rate and rapid diagnosis of HCV among those referred to laboratory centers in Tehran. From spring to winter of 1401 (2022-2023), 2166 blood samples were collected from laboratory centers in Tehran. The blood samples were evaluated for the presence of HCV by Electrochemiluminescence (ECL) and PCR techniques, the latter with specific HCV primers. In total, 36 samples (1.6%) tested positive by the mentioned techniques. The results indicated that the ECL technique is a sensitive and specific diagnostic method for detecting HCV in the early stages of the disease; it can be very helpful and makes it possible to begin treatment earlier and prevent exacerbation of the disease. The results of the PCR technique also showed that PCR is an accurate, sensitive and fast method for the definitive diagnosis of HCV. The incidence rate of this disease appears to be increasing in Iran, and investigating the spread of the disease throughout Iran over a longer period in the continuation of this research would help in taking the measures necessary to prevent transmission of the disease and in rapidly starting treatment for patients with HCV.
Keywords: electrochemiluminescence, HCV, PCR, prevalence
Procedia PDF Downloads 68
27691 Cryptosystems in Asymmetric Cryptography for Securing Data on Cloud at Various Critical Levels
Authors: Sartaj Singh, Amar Singh, Ashok Sharma, Sandeep Kaur
Abstract:
With the threats emerging in a digital world, we need to work continuously on security in all its aspects, from hardware to software as well as data modelling. The rise in social media activity and the hunger for data of various entities lead to cybercrime and to more attacks on the privacy and security of individuals. Cryptography has always been employed to prevent access to important data, using many processes. Symmetric key and asymmetric key cryptography have been used for keeping data secret at rest as well as in transit. Various cryptosystems have evolved over time to make data more secure. In this research article, we study various cryptosystems in asymmetric cryptography and their applications and usefulness, with particular emphasis on elliptic curve cryptography and the algebraic mathematics involved.
Keywords: cryptography, symmetric key cryptography, asymmetric key cryptography
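To illustrate the algebra underlying elliptic curve cryptography mentioned above, here is a toy Python sketch (not production cryptography) of point addition, scalar multiplication, and a Diffie-Hellman-style key agreement on a tiny curve. The curve parameters and private keys are arbitrary teaching values.

```python
# Minimal sketch (illustrative only, not production cryptography): elliptic curve point
# arithmetic over a tiny prime field and a toy Diffie-Hellman exchange on the curve
# y^2 = x^3 + 2x + 3 (mod 97). All constants are arbitrary teaching values.
P, A, B = 97, 2, 3          # field prime and curve coefficients
O = None                    # point at infinity (group identity)

def inv(x):
    return pow(x, P - 2, P)  # modular inverse via Fermat's little theorem

def add(p1, p2):
    if p1 is O: return p2
    if p2 is O: return p1
    x1, y1 = p1
    x2, y2 = p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return O                                   # p2 is the negation of p1
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * inv(2 * y1) % P  # tangent slope (point doubling)
    else:
        lam = (y2 - y1) * inv(x2 - x1) % P         # chord slope (point addition)
    x3 = (lam * lam - x1 - x2) % P
    y3 = (lam * (x1 - x3) - y1) % P
    return (x3, y3)

def mul(k, pt):
    result, addend = O, pt                         # double-and-add scalar multiplication
    while k:
        if k & 1:
            result = add(result, addend)
        addend = add(addend, addend)
        k >>= 1
    return result

G = (3, 6)                   # generator on the toy curve (6^2 = 3^3 + 2*3 + 3 mod 97)
a_priv, b_priv = 17, 29      # toy private keys
a_pub, b_pub = mul(a_priv, G), mul(b_priv, G)
assert mul(a_priv, b_pub) == mul(b_priv, a_pub)    # both sides derive the same shared point
```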
Procedia PDF Downloads 124
27690 Irrigation Challenges, Climate Change Adaptation and Sustainable Water Usage in Developing Countries. A Case Study, Nigeria
Authors: Faith Eweluegim Enahoro-Ofagbe
Abstract:
Worldwide, every nation is experiencing the effects of global warming. Developing countries, due to their heavy reliance on agriculture for socioeconomic growth and security, among other things, are more affected by climate change, particularly with respect to the availability of water. Floods, droughts, rising temperatures, saltwater intrusion, groundwater depletion, and other severe environmental alterations are all brought on by climatic change. Life depends on water, a vital resource; these ecological changes affect all water use, including agricultural and household water use. Adequate and adaptive water usage strategies for sustainability are therefore essential in developing countries. This paper investigates Nigeria's challenges due to climate change and the adaptive techniques that have evolved in response to such issues to ensure water management and sustainability for irrigation and to provide quality water to residents. Questionnaires were distributed to respondents in the study area, central Nigeria, for a quantitative evaluation of sustainable water resource management techniques. Physicochemical analysis was carried out on soil and water samples collected from several locations under investigation. The findings show that farmers use different methods for water resource management, ranging from intelligent technologies to traditional strategies. Farmers also need to learn better water resource management techniques for sustainability. Since most residents obtain their water from privately held sources, the government should enforce legislation to ensure that private borehole construction businesses treat water sources of poor quality before the general public uses them.
Keywords: developing countries, irrigation, strategies, sustainability, water resource management, water usage
Procedia PDF Downloads 115
27689 Terrestrial Laser Scans to Assess Aerial LiDAR Data
Authors: J. F. Reinoso-Gordo, F. J. Ariza-López, A. Mozas-Calvache, J. L. García-Balboa, S. Eddargani
Abstract:
The quality of DEMs may depend on several factors, such as the data source, the capture method, the type of processing used to derive them, or the cell size of the DEM. The two most important capture methods for producing regional-sized DEMs are photogrammetry and LiDAR; DEMs covering entire countries have been obtained with these methods. The quality of these DEMs has traditionally been evaluated by the national cartographic agencies through punctual sampling focused on the vertical component. For this type of evaluation there are standards such as NMAS and the ASPRS Positional Accuracy Standards for Digital Geospatial Data. However, it seems more appropriate to carry out this evaluation by means of a method that takes into account the surface nature of the DEM, so that the sampling is surface-based rather than point-based. This work is part of the Research Project "Functional Quality of Digital Elevation Models in Engineering", in which it is necessary to control the quality of a DEM whose data source is an experimental LiDAR flight with a density of 14 points per square meter, which we call the Point Cloud Product (PCpro). The present work describes the data capture on the ground and the post-processing tasks required to obtain the point cloud used as reference (PCref) to evaluate the quality of the PCpro. Each PCref consists of a 50 x 50 m patch obtained by registering four different scan stations. The area studied was the Spanish region of Navarra, which covers an area of 10,391 km2; 30 homogeneously distributed patches were necessary to sample the entire surface. The patches were captured using a Leica BLK360 terrestrial laser scanner mounted on a pole that reached heights of up to 7 meters; the position of the scanner was inverted so that the characteristic shadow circle does not appear when the scanner is in the direct position. To ensure that the accuracy of the PCref is greater than that of the PCpro, the georeferencing of the PCref was carried out with real-time GNSS, and its positioning accuracy was better than 4 cm; this accuracy is much better than the altimetric mean square error estimated for the PCpro (<15 cm). The DEM of interest corresponds to the bare earth, so it was necessary to apply a filter to eliminate vegetation and auxiliary elements such as poles, tripods, etc. After the post-processing tasks, the PCref is ready to be compared with the PCpro using different techniques: cloud to cloud, or DEM to DEM after a resampling process.
Keywords: data quality, DEM, LiDAR, terrestrial laser scanner, accuracy
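As an illustration of the cloud-to-cloud comparison mentioned at the end of the abstract, the following hypothetical Python sketch pairs each reference point with its nearest product point in planimetry and summarises the vertical differences. The file names and the pairing threshold are assumptions, not the project's actual settings.

```python
# Minimal sketch (not the project's actual workflow): a cloud-to-cloud check of the LiDAR
# product (PCpro) against a terrestrial-scan reference patch (PCref) using nearest-neighbour
# vertical differences. File names and the neighbourhood radius are assumptions.
import numpy as np
from scipy.spatial import cKDTree

pc_ref = np.loadtxt("pcref_patch.xyz")   # reference patch, columns: X Y Z (filtered bare earth)
pc_pro = np.loadtxt("pcpro_patch.xyz")   # aerial LiDAR points over the same 50 x 50 m patch

# For every reference point, find the closest product point in planimetry (X, Y)
tree = cKDTree(pc_pro[:, :2])
dist_xy, idx = tree.query(pc_ref[:, :2], k=1)

# Keep only pairs whose planimetric separation is small enough to be comparable
mask = dist_xy < 0.5                      # metres (assumed threshold)
dz = pc_pro[idx[mask], 2] - pc_ref[mask, 2]

print(f"pairs used: {mask.sum()}")
print(f"mean dz: {dz.mean():.3f} m, RMSE: {np.sqrt((dz**2).mean()):.3f} m")
```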
Procedia PDF Downloads 100
27688 Data Recording for Remote Monitoring of Autonomous Vehicles
Authors: Rong-Terng Juang
Abstract:
Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars may not arrive in the near future. To speed the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, a self-driving vehicle (e.g., a shuttle bus) has to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. Nevertheless, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. Therefore, this paper proposes a data compression method for in-vehicle networks for the remote monitoring of autonomous vehicles. Firstly, the time-series data are rearranged into a multi-dimensional signal space. Upon arrival, for controller area network (CAN) data, the new data are mapped onto a two-dimensional time-data space associated with the specific CAN identity. Secondly, the data are sampled based on differential sampling. Finally, the whole set of data is encoded using existing algorithms such as Huffman, arithmetic, and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house built autonomous vehicle. The testing results show that the amount of data can be reduced to as little as 1/7 of the raw data.
Keywords: autonomous vehicle, data compression, remote monitoring, controller area networks (CAN), Lidar
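To illustrate the general idea of differential sampling followed by entropy coding, here is a hypothetical Python sketch (not the proposed system itself) that delta-encodes a slowly varying CAN signal and Huffman-codes the resulting byte deltas. The signal values are made up.

```python
# Minimal sketch (not the proposed system itself): delta ("differential") encoding of a CAN
# signal followed by Huffman coding of the byte-level deltas, to show how redundancy in
# slowly changing in-vehicle signals can be squeezed out. The signal values are made up.
import heapq
from collections import Counter
from itertools import count

def delta_encode(samples):
    prev, out = 0, []
    for s in samples:
        out.append((s - prev) & 0xFF)   # store byte-sized differences
        prev = s
    return out

def huffman_code(symbols):
    """Build a prefix code (symbol -> bitstring) from symbol frequencies."""
    freq = Counter(symbols)
    tiebreak = count()
    heap = [(f, next(tiebreak), {sym: ""}) for sym, f in freq.items()]
    heapq.heapify(heap)
    if len(heap) == 1:                              # degenerate case: one distinct symbol
        ((_, _, table),) = heap
        return {sym: "0" for sym in table}
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]

signal = [100, 100, 101, 101, 102, 102, 102, 103, 103, 104] * 50   # slowly varying CAN value
deltas = delta_encode(signal)
code = huffman_code(deltas)
encoded_bits = sum(len(code[d]) for d in deltas)
print(f"raw: {len(signal) * 8} bits, delta+Huffman: {encoded_bits} bits")
```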
Procedia PDF Downloads 163
27687 Digital Twin for a Floating Solar Energy System with Experimental Data Mining and AI Modelling
Authors: Danlei Yang, Luofeng Huang
Abstract:
The integration of digital twin technology with renewable energy systems offers an innovative approach to predicting and optimising performance throughout the entire lifecycle. A digital twin is a continuously updated virtual replica of a real-world entity, synchronised with data from its physical counterpart and environment. Many digital twin companies today claim to have mature digital twin products, but their focus is primarily on equipment visualisation. However, the core of a digital twin should be its model, which can mirror, shadow, and thread with the real-world entity; this core is still underdeveloped. For a floating solar energy system, a digital twin model can be defined in three aspects: (a) the physical floating solar energy system along with environmental factors such as solar irradiance and wave dynamics, (b) a digital model powered by artificial intelligence (AI) algorithms, and (c) the integration of real system data with the AI-driven model and a user interface. The experimental setup for the floating solar energy system is designed to replicate the real-ocean conditions of floating solar installations within a controlled laboratory environment. The system consists of a water tank that simulates an aquatic surface, where a floating catamaran structure supports a solar panel. The solar simulator is set up in three positions: one directly above and two inclined at a 45° angle in front of and behind the solar panel. This arrangement allows the simulation of different sun angles, such as sunrise, midday, and sunset. The solar simulator is positioned 400 mm away from the solar panel to maintain consistent solar irradiance on its surface. Stability of the floating structure is achieved through ropes attached to anchors at the bottom of the tank, which simulate the mooring systems used in real-world floating solar applications. The floating solar energy system's sensor setup includes various devices to monitor environmental and operational parameters. An irradiance sensor measures solar irradiance on the photovoltaic (PV) panel. Temperature sensors monitor ambient air and water temperatures, as well as the PV panel temperature. Wave gauges measure wave height, while load cells capture mooring force. Inclinometers and ultrasonic sensors record the heave and pitch amplitudes of the floating system's motions. An electric load measures the voltage and current output from the solar panel. All sensors collect data simultaneously. Artificial neural network (ANN) algorithms are central to developing the digital model, which processes historical and real-time data, identifies patterns, and predicts the system's performance in real time. The data collected from the various sensors are partly used to train the digital model, with the remaining data reserved for validation and testing. The digital twin model combines the experimental setup with the ANN model, enabling monitoring, analysis, and prediction of the floating solar energy system's operation. The digital model mirrors the functionality of the physical setup, running in sync with the experiment to provide real-time insights and predictions. It provides useful industrial benefits, such as informing maintenance plans as well as design and control strategies for optimal energy efficiency. In the long term, this digital twin will help improve the overall solar energy yield whilst minimising operational costs and risks.
Keywords: digital twin, floating solar energy system, experiment setup, artificial intelligence
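As a simplified illustration of the ANN-based digital model described above, the following hypothetical Python sketch trains a small neural network surrogate that maps logged sensor inputs to electrical power output, holding back part of the data for validation. The file name, column names, network size, and split are assumptions.

```python
# Minimal sketch (not the authors' model): an ANN surrogate for the floating-PV digital twin,
# mapping measured environmental inputs to electrical power output. Column names, network
# size, and the train/validation split are assumptions.
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("floating_pv_logs.csv")            # synchronised sensor log (assumed file)
inputs = df[["irradiance", "air_temp", "water_temp", "panel_temp", "wave_height", "pitch"]]
target = df["electrical_power"]

X_tr, X_val, y_tr, y_val = train_test_split(inputs, target, test_size=0.3, random_state=42)

twin_model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=42))
twin_model.fit(X_tr, y_tr)                           # training portion of the sensor data

pred = twin_model.predict(X_val)                     # held-out portion for validation
print(f"validation MAE: {mean_absolute_error(y_val, pred):.2f} W")
```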
Procedia PDF Downloads 8
27686 Applications of Multi-Path Futures Analyses for Homeland Security Assessments
Authors: John Hardy
Abstract:
A range of future-oriented intelligence techniques is commonly used by states to assess their national security and develop strategies to detect and manage threats, to develop and sustain capabilities, and to recover from attacks and disasters. Although homeland security organizations use futures intelligence tools to generate scenarios and simulations which inform their planning, there have been relatively few studies of the methods available or their applications for homeland security purposes. This study presents an assessment of one category of strategic intelligence techniques, termed Multi-Path Futures Analyses (MPFA), and how they can be applied to three distinct tasks for the purpose of analyzing homeland security issues. Within this study, MPFA are categorized as a suite of analytic techniques which can include effects-based operations principles, general morphological analysis, multi-path mapping, and multi-criteria decision analysis techniques. These techniques generate multiple pathways to potential futures and thereby generate insight into the relative influence of individual drivers of change, the desirability of particular combinations of pathways, and the kinds of capabilities which may be required to influence or mitigate certain outcomes. The study assessed eighteen uses of MPFA for homeland security purposes and found that there are five key applications of MPFA which add significant value to analysis. The first application is generating measures of success and associated progress indicators for strategic planning. The second application is identifying homeland security vulnerabilities and relationships between individual drivers of vulnerability which may amplify or dampen their effects. The third application is selecting appropriate resources and methods of action to influence individual drivers. The fourth application is prioritizing and optimizing path selection preferences and decisions. The fifth application is informing capability development and procurement decisions to build and sustain homeland security organizations. Each of these applications provides a unique perspective on a homeland security issue by comparing a range of potential future outcomes at a set number of intervals and by contrasting the relative resource requirements, opportunity costs, and effectiveness measures of alternative courses of action. These findings indicate that MPFA enhance analysts' ability to generate tangible measures of success, identify vulnerabilities, select effective courses of action, prioritize future pathway preferences, and contribute to ongoing capability development in homeland security assessments.
Keywords: homeland security, intelligence, national security, operational design, strategic intelligence, strategic planning
Procedia PDF Downloads 139
27685 Mitigating Acid Mine Drainage Pollution: A Case Study In the Witwatersrand Area of South Africa
Authors: Elkington Sibusiso Mnguni
Abstract:
In South Africa, mining has been a key economic sector since the discovery of gold in 1886 in the Witwatersrand region, where the city of Johannesburg is located. However, some mines have since been decommissioned, and the continuous pumping of acid mine drainage (AMD) also stopped, causing the AMD to rise towards the ground surface. This posed a serious environmental risk to the groundwater resources and river systems in the region. This paper documents the development and extent of the environmental damage as well as the measures implemented by the government to alleviate such damage. The study will add to the body of knowledge on the subject of AMD treatment to prevent environmental degradation. The method used to gather and collate relevant data and information was a desktop study. The key findings concern the social and environmental impacts of the AMD, which include the pollution of water sources used for domestic purposes, leading to skin and other health problems, and the loss of biodiversity in some areas. It was also found that the technical intervention of constructing a plant to pump and treat the AMD using high-density sludge technology was the most effective short-term solution available while a long-term solution was being explored. Some successes and challenges experienced during the implementation of the project are also highlighted. The study will be a useful record of the current status of the AMD treatment interventions in the region.
Keywords: acid mine drainage, groundwater resources, pollution, river systems, technical intervention, high-density sludge
Procedia PDF Downloads 186
27684 Spatial Information and Urbanizing Futures
Authors: Mohammad Talei, Neda Ranjbar Nosheri, Reza Kazemi Gorzadini
Abstract:
Today, municipalities are searching for new tools to increase public participation at different levels of urban planning. This approach to urban planning involves the community in the planning process through participatory methods instead of the long-established top-down planning methods. These tools can be used to capture particular problems with urban furniture from the residents' point of view. One of the tools designed with this goal is public participation GIS (PPGIS), which enables citizens to record and follow up their perceptions and spatial knowledge regarding the main problems of the city, specifically urban furniture, in the form of maps. However, despite the good intentions of PPGIS, its practical implementation in developing countries faces many problems, including the lack of basic supporting infrastructure and services and the unavailability of sophisticated public participatory models. In this research, we develop a PPGIS that uses Web 2.0 to collect volunteered geodata and to perform spatial analysis based on Spatial OnLine Analytical Processing (SOLAP) and Spatial Data Mining (SDM). These tools provide urban planners with proper information regarding the type, spatial distribution, and clusters of reported problems. The system is implemented in a case study area in Tehran, Iran, and the challenges of making it applicable and its potential for real urban planning have been evaluated. It helps decision makers to better understand, plan, and allocate scarce resources for providing the most requested urban furniture.
Keywords: PPGIS, spatial information, urbanizing futures, urban planning
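As one simple illustration of the spatial data mining step mentioned above, the hypothetical Python sketch below clusters citizen-reported urban furniture problems with DBSCAN to reveal spatial hot spots. The coordinates and clustering parameters are illustrative only.

```python
# Minimal sketch (not the implemented platform): density-based clustering (DBSCAN) of
# citizen-reported urban furniture problems to reveal spatial hot spots, one basic step of a
# SOLAP/spatial data mining analysis. Coordinates and parameters are illustrative placeholders.
import numpy as np
from sklearn.cluster import DBSCAN

# Reported problem locations as projected metric coordinates (easting, northing), in metres
reports = np.array([
    [535010, 3950020], [535015, 3950030], [535008, 3950025],   # a cluster of bench complaints
    [536200, 3951900], [536210, 3951890], [536205, 3951905],   # a cluster of lighting complaints
    [534000, 3949000],                                          # an isolated report (noise)
], dtype=float)

labels = DBSCAN(eps=50.0, min_samples=3).fit_predict(reports)   # 50 m neighbourhood radius

for cluster_id in sorted(set(labels)):
    members = reports[labels == cluster_id]
    tag = "noise" if cluster_id == -1 else f"hot spot {cluster_id}"
    print(tag, "-> centroid:", members.mean(axis=0), f"({len(members)} reports)")
```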
Procedia PDF Downloads 726
27683 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016
Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi
Abstract:
This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as 'personal data' or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subjects' rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, resulting in sometimes ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects' rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is attached to the scope of 'consent' as a legal basis for the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.
Keywords: big health data, data subject rights, GDPR, pandemic
Procedia PDF Downloads 129
27682 A Comparative Study of Malware Detection Techniques Using Machine Learning Methods
Authors: Cristina Vatamanu, Doina Cosovan, Dragos Gavrilut, Henri Luchian
Abstract:
In the past few years, the amount of malicious software has increased exponentially and, therefore, machine learning algorithms have become instrumental in identifying clean and malware files through semi-automated classification. When working with very large datasets, the major challenge is to reach both a very high malware detection rate and a very low false positive rate. Another challenge is to minimize the time needed for the machine learning algorithm to do so. This paper presents a comparative study of different machine learning techniques such as linear classifiers, ensembles, decision trees and various hybrids thereof. The training dataset consists of approximately 2 million clean files and 200,000 infected files, which is a realistic quantitative mixture. The paper investigates the above-mentioned methods with respect to both their performance (detection rate and false positive rate) and their practicability.
Keywords: ensembles, false positives, feature selection, one side class algorithm
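For illustration, the hypothetical Python sketch below trains an ensemble classifier on a synthetic stand-in for static file features and reports the two figures of merit discussed in the abstract, detection rate and false positive rate. The data, features, and model choice are assumptions, not the study's setup.

```python
# Minimal sketch (not the study's system): training an ensemble classifier on stand-in static
# file features and reporting detection rate and false positive rate. The feature matrix is
# synthetic; a real pipeline would use features extracted from clean and infected files.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(1)
n_clean, n_malware = 20000, 2000                          # keep the clean/malware imbalance realistic
X = np.vstack([rng.normal(0.0, 1.0, (n_clean, 40)),
               rng.normal(0.8, 1.2, (n_malware, 40))])    # 40 static features per file (synthetic)
y = np.concatenate([np.zeros(n_clean), np.ones(n_malware)])  # 0 = clean, 1 = malware

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=1)

clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=1)
clf.fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print(f"detection rate:      {tp / (tp + fn):.4f}")
print(f"false positive rate: {fp / (fp + tn):.4f}")
```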
Procedia PDF Downloads 292