Search results for: data mining applications and discovery
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30900

27390 Speeding Up Lenia: A Comparative Study Between Existing Implementations and CUDA C++ with OpenGL Interop

Authors: L. Diogo, A. Legrand, J. Nguyen-Cao, J. Rogeau, S. Bornhofen

Abstract:

Lenia is a system of cellular automata with continuous states, space and time, which surprises not only with the emergence of interesting life-like structures but also with its beauty. This paper reports ongoing research on a GPU implementation of Lenia using CUDA C++ and OpenGL Interoperability. We demonstrate how CUDA as a low-level GPU programming paradigm allows optimizing performance and memory usage of the Lenia algorithm. A comparative analysis through experimental runs with existing implementations shows that the CUDA implementation outperforms the others by one order of magnitude or more. Cellular automata hold significant interest due to their ability to model complex phenomena in systems with simple rules and structures. They allow exploring emergent behavior such as self-organization and adaptation, and find applications in various fields, including computer science, physics, biology, and sociology. Unlike classic cellular automata, which rely on discrete cells and values, Lenia generalizes the concept of cellular automata to continuous space, time and states, thus providing additional fluidity and richness in emerging phenomena. In the current literature, there are many implementations of Lenia utilizing various programming languages and visualization libraries. However, each implementation also presents certain drawbacks, which serve as motivation for further research and development. In particular, speed is a critical factor when studying Lenia, for several reasons. Rapid simulation allows researchers to observe the emergence of patterns and behaviors in more configurations, on bigger grids and over longer periods without prohibitive waiting times. This enables the exploration and discovery of new species within the Lenia ecosystem more efficiently. Moreover, faster simulations are beneficial when we include additional time-consuming algorithms such as computer vision or machine learning to evolve and optimize specific Lenia configurations.
We developed a Lenia implementation for GPU using the C++ and CUDA programming languages, and CUDA/OpenGL Interoperability for immediate rendering. The goal of our experiment is to benchmark this implementation against the existing ones in terms of speed, memory usage, configurability and scalability. In our comparison we focus on the most important Lenia implementations, selected for their prominence, accessibility and widespread use in the scientific community. The implementations include MATLAB, JavaScript, ShaderToy GLSL, Jupyter, Rust and R. The list is not exhaustive but provides a broad view of the principal current approaches and their respective strengths and weaknesses. Our comparison primarily considers computational performance and memory efficiency, as these factors are critical for large-scale simulations, but we also investigate ease of use and configurability. The experimental runs conducted so far demonstrate that the CUDA C++ implementation outperforms the other implementations by one order of magnitude or more. The benefits of using the GPU become apparent especially with larger grids and convolution kernels. However, our research is still ongoing. We are currently exploring the impact of several software design choices and optimization techniques, such as convolution with Fast Fourier Transforms (FFT), various GPU memory management scenarios, and the trade-off between speed and accuracy using single versus double precision floating point arithmetic. The results will give valuable insights into the practice of parallel programming of the Lenia algorithm, and all conclusions will be thoroughly presented in the conference paper. The final version of our CUDA C++ implementation will be published on GitHub and made freely accessible to the ALife community for further development.
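As an illustration of the FFT-based convolution the abstract mentions, the following NumPy sketch performs one continuous Lenia update step. It is not the authors' CUDA C++ code; the ring-kernel shape and the growth parameters (mu, sigma, dt) are typical illustrative values, not taken from the paper.

```python
import numpy as np

def lenia_step(world, kernel_fft, dt=0.1, mu=0.15, sigma=0.015):
    """One Lenia update: FFT convolution followed by a smooth growth map.
    Illustrative NumPy sketch; parameter values are typical, not the paper's."""
    # Convolve the world with the (pre-transformed) kernel via FFT
    potential = np.real(np.fft.ifft2(np.fft.fft2(world) * kernel_fft))
    # Gaussian growth function mapping potential to growth in [-1, 1]
    growth = 2.0 * np.exp(-((potential - mu) ** 2) / (2 * sigma ** 2)) - 1.0
    # Integrate in continuous time and clip states back to [0, 1]
    return np.clip(world + dt * growth, 0.0, 1.0)

# Build a normalized ring kernel and pre-transform it once
N, R = 64, 12
y, x = np.ogrid[-N // 2:N // 2, -N // 2:N // 2]
r = np.sqrt(x ** 2 + y ** 2) / R
kernel = np.exp(-((r - 0.5) ** 2) / 0.02) * (r < 1)
kernel /= kernel.sum()
kernel_fft = np.fft.fft2(np.fft.ifftshift(kernel))

world = np.random.rand(N, N)
world = lenia_step(world, kernel_fft)
```

Pre-transforming the kernel once and reusing it each step is what makes the FFT route attractive on large grids, since the per-step cost drops from O(N²K²) for direct convolution to O(N² log N).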

Keywords: artificial life, cellular automaton, GPU optimization, Lenia, comparative analysis.

Procedia PDF Downloads 48
27389 Liposomal Encapsulation of Silver Nanoparticle for Improved Delivery and Enhanced Anticancer Properties

Authors: Azeez Yusuf, Alan Casey

Abstract:

Silver nanoparticles (AgNP) are among the most widely investigated metallic nanoparticles due to their promising antibacterial activities. In recent years, AgNP research has shifted beyond antimicrobial use to potential applications in the medical arena. This shift, coupled with the extensive commercial applications of AgNP, will further increase human exposure and the subsequent risk of adverse effects that may result from repeated exposures and inefficient delivery, meaning research into improved AgNP delivery is of paramount importance. In this study, AgNP were encapsulated in a natural bio-surfactant, dipalmitoylphosphatidylcholine (DPPC), in an attempt to enhance intracellular delivery and simultaneously mediate the associated cytotoxicity of the AgNP. As a result of the encapsulation, liposomal AgNP (Lipo-AgNP) at 0.625 μg/ml induced significant cell death in THP-1 cells at a notably lower dose than that at which uncoated AgNP induced cytotoxicity. The induced cytotoxicity was shown to result in an increased level of DNA fragmentation, causing a cell cycle interruption at the S phase. The predominant form of cell death upon exposure to both uncoated and Lipo-AgNP was apoptosis; however, a ROS-independent activation of the executioner caspases 3/7 occurred upon exposure to the Lipo-AgNP. These findings showed that encapsulation enhances AgNP cytotoxicity and mediates a ROS-independent induction of apoptosis.

Keywords: silver nanoparticles, AgNP, cytotoxicity, encapsulation, liposome

Procedia PDF Downloads 159
27388 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structural restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using an open-source graph-based EHR dataset, the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it is voluminous and closely represents real-world data. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to retrieve nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate node embeddings and eventually perform node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
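The pipeline described relies on pyTigerGraph, PyTorch and PyG. As a library-free illustration of the core GNN idea, the sketch below performs one round of mean-aggregation message passing to produce node embeddings on a toy graph; the graph, features and weight matrix are invented for the example, not drawn from the Synthea data.

```python
import numpy as np

# Toy graph standing in for EHR entities: 4 nodes, undirected edges
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
features = np.array([[1.0, 0.0],
                     [0.0, 1.0],
                     [1.0, 1.0],
                     [0.5, 0.5]])

def gcn_layer(features, edges, weight):
    """One graph-convolution round: mean-aggregate each node's own and
    neighbors' features, then apply a learned linear map and ReLU."""
    n = features.shape[0]
    agg = features.copy()          # include the node itself (self-loop)
    deg = np.ones(n)               # degree count starts at 1 for the self-loop
    for u, v in edges:             # undirected: aggregate both directions
        agg[u] += features[v]
        agg[v] += features[u]
        deg[u] += 1
        deg[v] += 1
    agg /= deg[:, None]            # mean aggregation
    return np.maximum(agg @ weight, 0.0)   # linear map + ReLU

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 2))
embeddings = gcn_layer(features, edges, W)   # per-node embeddings
```

In the actual pipeline, stacking such layers (and training them, e.g. through an autoencoder objective) yields the node embeddings that the classifier consumes.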

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 90
27387 Magnetic Navigation in Underwater Networks

Authors: Kumar Divyendra

Abstract:

Underwater Sensor Networks (UWSNs) have wide applications in areas such as water quality monitoring and marine wildlife management. A typical UWSN system consists of a set of sensors deployed randomly underwater which communicate with each other using acoustic links, since neither RF communication nor GPS is available underwater. Additionally, Autonomous Underwater Vehicles (AUVs) are deployed to collect data from special nodes called Cluster Heads (CHs). These CHs aggregate data from their neighboring nodes and forward them to the AUVs over optical links when an AUV is in range. This reduces the number of hops covered by data packets and helps conserve energy. We consider a three-dimensional model of the UWSN. Nodes are initially deployed randomly underwater. They attach themselves to the surface using a rod and can only move upwards or downwards using a pump-and-bladder mechanism. We use graph theory concepts to maximize the coverage volume while every node maintains connectivity with at least one surface node. We treat the surface nodes as landmarks, and each node finds its hop distance from every surface node. We treat these hop distances as coordinates and use them for AUV navigation: an AUV intending to move closer to a node with given coordinates moves hop by hop through the nodes closest to it in terms of these coordinates. In the absence of GPS, multiple approaches have been proposed, such as Inertial Navigation Systems (INS), Doppler Velocity Logs (DVL), and computer vision-based navigation. Each has its own drawbacks: INS accumulates error with time, and vision techniques require prior information about the environment. We propose a method that uses the earth's magnetic field values for navigation and combines it with other methods that simultaneously increase the coverage volume of the UWSN.
The AUVs are fitted with magnetometers that measure the magnetic intensity (I), horizontal inclination (H), and declination (D). The International Geomagnetic Reference Field (IGRF) is a mathematical model of the earth's magnetic field, which provides the field values for geographical coordinates on earth. Researchers have developed an inverse deep learning model that takes the magnetic field values and predicts the location coordinates, and we make use of this model within our work. We combine it with the hop-by-hop movement described earlier so that the AUVs move in a sequence that trains the deep learning predictor as quickly and precisely as possible. We run simulations in MATLAB to demonstrate the effectiveness of our model with respect to other methods described in the literature.
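The hop-distance coordinates and hop-by-hop movement described above can be sketched as follows. The network topology, the choice of two surface landmarks, and the greedy L1 metric are illustrative assumptions for the example, not details from the paper.

```python
from collections import deque

def hop_distances(adj, landmark):
    """BFS hop counts from a surface (landmark) node to every node."""
    dist = {landmark: 0}
    q = deque([landmark])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Small network: nodes 0 and 1 are surface landmarks
adj = {0: [2], 1: [3], 2: [0, 3, 4], 3: [1, 2, 5], 4: [2, 5], 5: [3, 4]}
landmarks = [0, 1]

# Each node's "coordinates" = tuple of hop distances to every landmark
coords = {n: tuple(hop_distances(adj, L)[n] for L in landmarks) for n in adj}

def next_hop(current, target_coord):
    """Greedy hop-by-hop step: move to the neighbor whose hop-coordinate
    vector is closest (L1 distance) to the target's coordinates."""
    return min(adj[current],
               key=lambda v: sum(abs(a - b) for a, b in zip(coords[v], target_coord)))

step = next_hop(4, coords[1])  # AUV at node 4 heading toward landmark 1
```

Here node 4 has neighbors 2 and 5; the greedy rule picks node 5, which lies on the shortest hop path 4 → 5 → 3 → 1 toward the target.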

Keywords: clustering, deep learning, network backbone, parallel computing

Procedia PDF Downloads 101
27386 Genome Editing in Sorghum: Advancements and Future Possibilities: A Review

Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie

Abstract:

The advancement of target-specific genome editing tools, including clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9), meganucleases, base editing (BE), prime editing (PE), transcription activator-like effector nucleases (TALENs), and zinc-finger nucleases (ZFNs), has paved the way for a modern era of gene editing. CRISPR/Cas9, as a versatile, simple, cost-effective and robust system for genome editing, has dominated the genome manipulation field over the last few years. The application of CRISPR/Cas9 to sorghum improvement is particularly vital in the context of ecological, environmental and agricultural challenges, as well as global climate change. In this context, gene editing using CRISPR/Cas9 can improve nutritional value, yield, resistance to pests and disease, and tolerance to different abiotic stresses. Moreover, CRISPR/Cas9 can potentially perform complex editing to reshape already available elite varieties and create new genetic variations. Existing research is targeted at further improving the effectiveness of CRISPR/Cas9 genome editing techniques to fruitfully edit endogenous sorghum genes. These findings suggest that genome editing is a feasible and successful venture in sorghum. Newer improvements and developments of CRISPR/Cas9 techniques have further enabled researchers to modify additional genes in sorghum with improved efficiency. The successful application and development of CRISPR techniques for genome editing in sorghum will help not only in gene discovery, creating new, improved traits, regulating gene expression and advancing sorghum functional genomics, but also in making site-specific integration events.

Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield

Procedia PDF Downloads 65
27385 In vitro Cytotoxicity Study on Silver Powders Synthesized via Different Routes

Authors: Otilia Ruxandra Vasile, Ecaterina Andronescu, Cristina Daniela Ghitulica, Bogdan Stefan Vasile, Roxana Trusca, Eugeniu Vasile, Alina Maria Holban, Carmen Mariana Chifiriuc, Florin Iordache, Horia Maniu

Abstract:

Engineered powders offer great promise in several applications, but little is known about their cytotoxic effects. The aim of the current study was the synthesis and cytotoxicity examination of silver powders prepared by the pyrosol method at temperatures of 600°C, 650°C and 700°C, and by the sol-gel method with calcination at 500°C, 600°C, 700°C and 800°C. We chose to synthesize and examine silver particles due to their use in biological applications. The synthesized Ag powders were characterized from the structural, compositional and morphological points of view using XRD, SEM, and TEM with SAED. In order to determine the influence of the synthesis route on the cytotoxicity of Ag particles, micro- and nano-silver powders of different sizes were evaluated for their potential toxicity. Cell cycle and apoptosis analyses were performed by flow cytometry on human colon carcinoma cells and mesenchymal stem cells, together with the MTT assay, while the viability and morphological changes of the cells were evaluated using cloning studies. The results showed that the synthesized silver nanoparticles displayed significant cytotoxic effects on cell cultures. Our silver powders were found to be toxic in a synthesis-route- and time-dependent manner for the pyrosol-synthesized nanoparticles, whereas lower cytotoxicity was measured after cells were treated with silver nanoparticles synthesized through the sol-gel method.

Keywords: Ag, cytotoxicity, pyrosol method, sol-gel method

Procedia PDF Downloads 602
27384 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector

Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar

Abstract:

Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector. However, as the usage of these distributed systems increases, many issues arise. The main focus of this paper is security, particularly information security, since many security issues stem from distributed systems in the healthcare industry. Personal data are especially sensitive in healthcare: if important information is leaked (e.g., identity card number, credit card number, address), a person's identity, financial status, and safety might be compromised. The responsible organization then loses a lot of money compensating the affected people, and even more resources are expended trying to fix the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, a blockchain network is used to store the encryption key of the patient's data, while the data itself is encrypted and the resulting ciphertext is stored on a cloud storage platform. Due to the nature of blockchain technology, the data will be tamper-proof, and its read-only function can only be accessed by authorized users such as doctors and nurses. This guarantees the confidentiality and immutability of the patient's data. Some issues remain to be emphasized and tackled in future work, such as proposing a multi-user scheme, addressing authentication, and migrating the backend processes into the blockchain network.
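A minimal sketch of the split the framework describes, ciphertext off-chain and a tamper-evident chain recording the key reference, is given below. The hash-chained list stands in for a real blockchain network, and the XOR "cipher" is a placeholder for illustration only; a real system would use an authenticated scheme such as AES-GCM.

```python
import hashlib
import json

cloud_storage = {}   # ciphertext lives off-chain, in "cloud storage"
ledger = []          # hash-chained records stand in for the blockchain

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Placeholder cipher for illustration only, NOT for real use."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def add_record(patient_id: str, record: bytes, key: bytes):
    """Store ciphertext off-chain; append a hash-linked key record on-chain."""
    cloud_storage[patient_id] = xor_encrypt(record, key)
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    block = {"patient": patient_id,
             "key_fingerprint": hashlib.sha256(key).hexdigest(),
             "prev": prev}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    ledger.append(block)

def verify_chain() -> bool:
    """Tampering with any earlier block breaks every later hash link."""
    prev = "0" * 64
    for block in ledger:
        if block["prev"] != prev:
            return False
        body = {k: v for k, v in block.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

add_record("patient-001", b"blood pressure 120/80", b"secret-key")
ok = verify_chain()
```

The point of the split is that the bulky medical data never touches the chain; only the small, hash-linked key record does, which is what gives the tamper-evidence the abstract relies on.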

Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability

Procedia PDF Downloads 189
27383 Culture, Consumption, and Markets of Aesthetics: A 10-Year Literature Review

Authors: Chin-Hsiang Chu

Abstract:

This article reviews ten years of literature at the intersection of marketing and aesthetics. Current markets are customer-oriented, and product value has gradually shifted from practical functionality toward visual appearance and the substance of the marketing experience, a trend often described as the 'aesthetic economy'. How to introduce aesthetic concepts and differentiate products has therefore become an important part of marketing management for an organization. Since research on marketing aesthetics remains scarce, the purpose of this study is to explore the connection between aesthetics and marketing in the market economy and, by aggregating content through a literature review, to draw out implications for the management of marketing aesthetics, market orientation, customer value, and product development. The review proceeds through the problem statement and background, the evolution of theory, and the methods and findings of each study. The results show that: (1) the study of aesthetics helps deepen the common understanding of shopping and service environments; (2) introducing aesthetics into the perceived value of products increases consumers' willingness to buy, and even premium products become more attractive; (3) marketing personnel generally identify strongly with aesthetics in marketing management; (4) among the connotations of marketing aesthetics, five aesthetic characteristics are greatly valued: immediacy, complexity, specificity, attractiveness, and richness; (5) by stimulating the senses, the mind and thinking through the experiential process, consumers form a deeper link with the corporate brand. The results of this study can serve as a guide for new product development and design for businesses in competitive markets.

Keywords: marketing aesthetics, aesthetic economy, aesthetics, experiential marketing

Procedia PDF Downloads 263
27382 Influence of Geometry on Performance of Type-4 Filament Wound Composite Cylinder for Compressed Gas Storage

Authors: Pranjali Sharma, Swati Neogi

Abstract:

Composite pressure vessels are low-weight structures mainly used in a variety of applications such as automobiles, aeronautics and chemical engineering. Fiber reinforced polymer (FRP) composite materials offer simplicity of design and use, high fuel storage capacity, rapid refueling capability, excellent shelf life, minimal infrastructure impact, high safety due to the inherent strength of the pressure vessel, and little to no development risk. Beyond these merits, the reduced weight of composite vessels relative to metallic cylinders is their biggest asset to the automotive industry, increasing fuel efficiency. The result is a lightweight, flexible, non-explosive, and non-fragmenting pressure vessel that can be tailor-made for specific applications. The winding pattern of the composite over-wrap is a primary focus while designing a pressure vessel, since the critical stresses in the system depend on the thickness, angle and sequence of the composite layers. The composite over-wrap is wound over a plastic liner, whose geometry can be varied for ease of winding. In the present study, we aim to optimize the FRP vessel geometry so that it eases winding and also reduces weight, enhancing vessel performance. Finite element analysis is used to study the effect of dome geometry, yielding a design with the maximum burst pressure and the least vessel weight. The stress and strain analysis of different dome ends, along with the cylindrical portion, is carried out in ANSYS 19.2. Failure is predicted using different failure theories, such as the Tsai-Wu theory, the Tsai-Hill theory and the maximum stress theory. Corresponding to a given winding sequence, the optimum dome geometry is determined for a fixed internal pressure to identify the theoretical burst pressure. Finally, this geometry is used to decrease the number of layers to reach the set safety value in accordance with the available safety standards. This decreases the weight of the composite over-wrap and the manufacturing cost of the pressure vessel. The improved weight performance of the pressure vessel gives higher fuel efficiency in automobile applications.
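As an example of the failure criteria named above, the Tsai-Hill index for a plane-stress lamina can be computed as below; failure is predicted when the index reaches 1. The stress state and the strength values X, Y, S are hypothetical illustrative numbers, not results from the study.

```python
def tsai_hill_index(s1, s2, t12, X, Y, S):
    """Tsai-Hill failure index for a plane-stress lamina.
    s1, s2, t12: longitudinal, transverse and shear stresses.
    X, Y, S: longitudinal, transverse and shear strengths.
    Failure is predicted when the index >= 1."""
    return (s1 / X) ** 2 - (s1 * s2) / X ** 2 + (s2 / Y) ** 2 + (t12 / S) ** 2

# Hypothetical glass/epoxy-like strengths, in MPa
X, Y, S = 1000.0, 40.0, 60.0
index = tsai_hill_index(s1=400.0, s2=10.0, t12=20.0, X=X, Y=Y, S=S)
safe = index < 1.0   # lamina survives this load case
```

Evaluating such an index ply by ply for a candidate dome geometry and winding sequence is how the burst pressure is located: the internal pressure is raised until the first ply's index reaches 1.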

Keywords: compressed gas storage, dome geometry, theoretical analysis, type-4 composite pressure vessel, vessel weight performance

Procedia PDF Downloads 151
27381 Advanced Nuclear Measurements and Systems for Facilitating the Implementation of Safeguards and Safeguards-By-Design in SMR, AMR and Microreactor

Authors: Massimo Morichi

Abstract:

Over the last five years, starting in 2019, several nuclear measurement systems have been conceived and realized for specific nuclear safeguards and nuclear security applications, implementing innovative technologies and methods for attended and unattended nuclear measurements. Some of these technologies integrate combined gamma and neutron detection systems, for both counting and spectroscopy, that allow Special Nuclear Material (SNM) verification and quantification through simultaneous gamma and neutron measurements, from standard to high count rates under high-flux irradiation. The IAEA has implemented some of these technologies in key international safeguards inspections worldwide, for example a Fast Neutron Collar Monitor for verifying the U-235 mass of fresh fuel (used during inspections to verify material declarations), or unattended measuring systems combining a distinct shift register installed in an anti-tampering sealed housing (for remote inspection and continuous monitoring) with an Unattended Multichannel Analyzer for spectroscopic analysis of SNM items such as canisters. Such developments, realized with integrated mid-resolution scintillators (FWHM < 3.5%) together with organic scintillators such as stilbene detectors or sealed liquid scintillators like EJ-309 offering good pulse shape discrimination, managed by a fast DAQ with a high level of system integration, offer the near-term possibility of reducing the form factor and thereby facilitating their deployment in many critical parts of nuclear fuel operations as well as in the next generation of nuclear reactors. This will facilitate embedding these advanced technical solutions in the next generation of nuclear installations, assuring the Safeguards by Design requested by the IAEA for all future nuclear installations.
This work presents the most recent designs and systems and provides clear examples of ongoing applications in fuel cycle and fuel fabrication facilities as well as in SMRs, AMRs and microreactors. Detailed technology testing and validation in different configurations is provided, together with case studies and operational implications.

Keywords: nuclear safeguards, gamma and neutron detection systems, spectroscopy analysis, nuclear fuel cycle, nuclear power reactors, decommissioning and dismantling, nuclear security

Procedia PDF Downloads 13
27380 Turmeric Mediated Synthesis and Characterization of Cerium Oxide Nanoparticles

Authors: Nithin Krisshna Gunasekaran, Prathima Prabhu Tumkur, Nicole Nazario Bayon, Krishnan Prabhakaran, Joseph C. Hall, Govindarajan T. Ramesh

Abstract:

Cerium oxide and turmeric both have antioxidant properties, which have gained interest among researchers studying their applications in the field of biomedicine, such as anti-inflammatory, anticancer, and antimicrobial applications. In this study, turmeric extract was prepared and mixed with cerium nitrate hexahydrate, stirred continuously to obtain a homogeneous solution, heated on a hot plate to evaporate the supernatant, and then calcined at 600°C to obtain cerium oxide nanoparticles. Characterization of the synthesized cerium oxide nanoparticles by scanning electron microscopy determined the particle size to be in the range of 70 nm to 250 nm, and energy dispersive X-ray spectroscopy determined the elemental composition of cerium and oxygen. Individual particles were identified by field emission scanning electron microscopy, which showed the particles to be spherical and around 70 nm in size. The presence of cerium oxide was confirmed by analyzing the spectrum obtained through Fourier transform infrared spectroscopy. The crystal structure of the cerium oxide nanoparticles was determined to be face-centered cubic by analyzing the peaks obtained through X-ray diffraction, and the crystallite size was determined to be around 13 nm using the Debye-Scherrer equation. This study confirmed the synthesis of cerium oxide nanoparticles using turmeric extract.
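The crystallite-size estimate mentioned above follows the Debye-Scherrer equation, D = K·λ / (β·cos θ), with β the peak FWHM in radians. The sketch below illustrates the calculation with Cu K-alpha radiation and a hypothetical peak width, not the paper's measured data; the chosen inputs happen to yield a value near the reported ~13 nm.

```python
import math

def scherrer_size(wavelength_nm, fwhm_deg, theta_deg, K=0.9):
    """Debye-Scherrer crystallite size: D = K * lambda / (beta * cos(theta)).
    Inputs: X-ray wavelength in nm, peak FWHM and Bragg angle theta in degrees.
    K is the shape factor (commonly ~0.9)."""
    beta = math.radians(fwhm_deg)          # FWHM converted to radians
    return K * wavelength_nm / (beta * math.cos(math.radians(theta_deg)))

# Cu K-alpha radiation and a hypothetical CeO2 (111) peak
D = scherrer_size(wavelength_nm=0.15406, fwhm_deg=0.65, theta_deg=14.2)
```

Note that instrumental broadening should be subtracted from the measured FWHM before applying the equation; the sketch omits that correction for brevity.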

Keywords: antioxidant, characterization, cerium oxide, synthesis, turmeric

Procedia PDF Downloads 173
27379 Finite Element Modelling of a 3D Woven Composite for Automotive Applications

Authors: Ahmad R. Zamani, Luigi Sanguigno, Angelo R. Maligno

Abstract:

A 3D woven composite, designed for automotive applications, is studied using Abaqus Finite Element (FE) software suite. Python scripts were developed to build FE models of the woven composite in Complete Abaqus Environment (CAE). They can read TexGen or WiseTex files and automatically generate consistent meshes of the fabric and the matrix. A user menu is provided to help define parameters for the FE models, such as type and size of the elements in fabric and matrix as well as the type of matrix-fabric interaction. Node-to-node constraints were imposed to guarantee periodicity of the deformed shapes at the boundaries of the representative volume element of the composite. Tensile loads in three axes and biaxial loads in x-y directions have been applied at different Fibre Volume Fractions (FVFs). A simple damage model was implemented via an Abaqus user material (UMAT) subroutine. Existing tools for homogenization were also used, including voxel mesh generation from TexGen as well as Abaqus Micromechanics plugin. Linear relations between homogenised elastic properties and the FVFs are given. The FE models of composite exhibited balanced behaviour with respect to warp and weft directions in terms of both stiffness and strength.
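The linear relation between homogenised elastic properties and fibre volume fraction noted above is exemplified by the classical Voigt rule of mixtures for the longitudinal modulus, which is linear in FVF by construction. The constituent moduli below are illustrative carbon/epoxy-like values, not the study's inputs.

```python
def rule_of_mixtures(E_fibre, E_matrix, vf):
    """Voigt (rule-of-mixtures) estimate of the longitudinal modulus:
    E1 = E_f * Vf + E_m * (1 - Vf), linear in fibre volume fraction Vf."""
    return E_fibre * vf + E_matrix * (1.0 - vf)

# Illustrative carbon/epoxy-like moduli in GPa at 55% fibre volume fraction
E1 = rule_of_mixtures(E_fibre=230.0, E_matrix=3.5, vf=0.55)
```

Fitting the FE homogenisation results against such a linear model at several FVFs is a quick consistency check on the voxel-mesh and Micromechanics-plugin outputs.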

Keywords: 3D woven composite (3DWC), meso-scale finite element model, homogenisation of elastic material properties, Abaqus Python scripting

Procedia PDF Downloads 148
27378 Design and Implementation of a Geodatabase and WebGIS

Authors: Sajid Ali, Dietrich Schröder

Abstract:

The merging of the internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing with geospatial data in a proficient way. Web GIS technologies provide easy access to and sharing of geospatial data over the internet. However, the European Caribbean Association (Europäische Karibische Gesellschaft, EKG) lacks a single platform for easy, shared access to its data to assist its members and the wider research community. The technique presented in this paper deals with the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, and a Web GIS using the OpenGeo Suite for fast sharing and distribution of the data over the internet. The characteristics of the required geodatabase design have been studied, and a specific methodology is given for designing the Web GIS. Finally, this Web-based geodatabase has been validated with two desktop GIS applications and a web map application, and it is discussed how the contribution provides all the desired modules to expedite further research in the area as per the requirements.

Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, WebGIS, web map application

Procedia PDF Downloads 343
27377 Integration of “FAIR” Data Principles in Longitudinal Mental Health Research in Africa: Lessons from a Landscape Analysis

Authors: Bylhah Mugotitsa, Jim Todd, Agnes Kiragga, Jay Greenfield, Evans Omondi, Lukoye Atwoli, Reinpeter Momanyi

Abstract:

The INSPIRE network aims to build an open, ethical, sustainable, and FAIR (Findable, Accessible, Interoperable, Reusable) data science platform, particularly for longitudinal mental health (MH) data. While studies have been done at the clinical and population levels, limitations in data and research still exist in LMICs, which poses a risk of underrepresentation of mental disorders. It is vital to examine the existing longitudinal MH data, focusing on how FAIR the datasets are. This landscape analysis aimed to provide both an overall level of evidence of the availability of longitudinal datasets and the degree of consistency in the longitudinal studies conducted. Utilizing prompts proved instrumental in streamlining the analysis, facilitating access, crafting code snippets, and supporting the categorization and analysis of extensive data repositories related to depression, anxiety, and psychosis in Africa. Leveraging artificial intelligence (AI), we filtered through over 18,000 scientific papers spanning 1970 to 2023. This AI-driven approach identified 228 longitudinal research papers meeting the inclusion criteria. Quality assurance revealed 10% incorrectly identified articles and 2 duplicates, and underscored the prevalence of longitudinal MH research in South Africa, with a focus on depression. From the analysis, evaluating data and metadata adherence to FAIR principles remains crucial for enhancing the accessibility and quality of MH research in Africa. While AI has the potential to enhance research processes, challenges such as privacy concerns and data security risks must be addressed, and ethical and equity considerations in data sharing and reuse are also vital. There is a need for collaborative efforts across disciplinary and national boundaries to improve the Findability and Accessibility of data, and current efforts should also focus on creating integrated data resources and tools to improve the Interoperability and Reusability of MH data.
Practical steps for researchers include careful study planning, data preservation, machine-actionable metadata, and promoting data reuse to advance science and improve equity. Metrics and recognition should be established to incentivize adherence to FAIR principles in MH research.

Keywords: longitudinal mental health research, data sharing, FAIR data principles, Africa, landscape analysis

Procedia PDF Downloads 101
27376 Modeling the Present Economic and Social Alienation of Working Class in South Africa in the Musical Production ‘from Marikana to Mahagonny’ at Durban University of Technology (DUT)

Authors: Pamela Tancsik

Abstract:

The stage production in 2018, titled ‘From Marikana to Mahagonny’, began with a prologue in the form of the award-winning documentary ‘Miners Shot Down’ by Rehad Desai, followed by Brecht/Weill’s song play or scenic cantata ‘Mahagonny’, premièred in Baden-Baden in 1927. The central directorial concept of the DUT musical production ‘From Marikana to Mahagonny’ was to show a connection between the socio-political alienation of mineworkers in present-day South Africa and Brecht’s alienation effect in his scenic cantata ‘Mahagonny’. Marikana is a mining town about 50 km west of South Africa’s capital Pretoria. Mahagonny is a fantasy name for a utopian mining town in the United States. The characters, setting, and lyrics refer to America with songs like ‘Benares’ and ‘Moon of Alabama’ and the use of typical American inventions such as dollars, saloons, and the telephone. The six singing characters in ‘Mahagonny’ all have typical American names: Charlie, Billy, Bobby, and Jimmy, and the two girls they meet later are called Jessie and Bessie. The four men set off to seek Mahagonny. For them, it is the ultimate dream destination promising the fulfilment of all their desires, such as girls, alcohol, and dollars – in short, materialistic goals. Instead of finding a paradise, they experience how money, the practice of exploitative capitalism, and the lack of any morals and humanity destroy their lives. In the end, Mahagonny is demolished by a hurricane, an event which happened in 1926 in the United States. ‘God’ in person arrives disillusioned and bitter, complaining about violent and immoral mankind. In the end, he sends them all to hell. Charlie, Billy, Bobby, and Jimmy reply that this punishment does not mean anything to them because they have already been in hell for a long time – hell on earth is a reality, so the threat of hell after life is meaningless.
Human life was also taken during the stand-off between striking mineworkers and the South African police on 16 August 2012. Miners from the Lonmin Platinum Mine went on an illegal strike, equipped with bush knives and spears. They were striking because their living conditions had never improved; they still lived in muddy shacks with no running water or electricity. Wages were as low as R4,000 (South African Rands), equivalent to just over 200 Euro per month. By August 2012, the negotiations between Lonmin management and the mineworkers’ unions, asking for a minimum wage of R12,500 per month, had failed. Police were sent in by the Government, and when the miners did not withdraw, the police shot at them. 34 were killed, some by bullets in their backs while running away and trying to hide behind rocks. In the musical play ‘From Marikana to Mahagonny’, audiences in South Africa are confronted with a documentary about Marikana, followed by Brecht/Weill’s scenic cantata, highlighting the tragic parallels between the Mahagonny story and characters from 1927 America and the Lonmin workers today in South Africa, showing that in 95 years, capitalism has not changed.

Keywords: alienation, Brecht/Weill, Mahagonny, Marikana/South Africa, musical theatre

Procedia PDF Downloads 103
27375 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments.
It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
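The workload-placement idea mentioned above can be sketched in a few lines. This is a toy cost comparison under invented prices; real providers have dynamic, multi-dimensional pricing models, as the abstract notes.

```python
# Hedged sketch: a toy workload-placement calculation for a multi-cloud
# setup. Provider names, prices, and workload figures are invented for
# illustration only.

PRICING = {  # (egress $/GB, compute $/hour) per hypothetical provider
    "cloud_a": (0.09, 0.40),
    "cloud_b": (0.05, 0.55),
    "cloud_c": (0.08, 0.35),
}

def placement_cost(provider, gb_to_move, compute_hours):
    egress, compute = PRICING[provider]
    return gb_to_move * egress + compute_hours * compute

def cheapest_placement(gb_to_move, compute_hours):
    """Pick the provider minimizing combined transfer and compute cost."""
    return min(PRICING, key=lambda p: placement_cost(p, gb_to_move, compute_hours))

best = cheapest_placement(gb_to_move=500, compute_hours=100)
```

A compute-heavy job lands on the cheap-compute provider even if its egress is not the lowest; predictive cost modeling generalizes this static lookup to forecasted prices and loads.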

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 72
27374 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study

Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos

Abstract:

This article presents a method to analyze the use of indoor spaces based on data analytics obtained from inbuilt digital devices. The study uses the data generated by the in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights on space occupancy, user behaviour, and comfort. Those devices, originally installed to facilitate remote operations, report data through the internet that the research uses to analyze information on the real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution to analyze building interior spaces without incorporating external data collection systems such as sensors. The methodology is applied to a real case study of coliving, a residential building of 3000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data based on the analysis framework. The information is collected remotely through the devices' different platforms. The first step is to curate the data and understand what insights each device can provide according to the objectives of the study; this generates an analysis framework that can be scaled for future building assessments, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available in the IoT network of each building. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
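The curation-and-aggregation step can be illustrated with a small sketch. This is not the study's actual pipeline: the device names, record fields, and the assumption that Wi-Fi client counts proxy occupancy are all hypothetical.

```python
# Illustrative sketch (not the study's actual pipeline): estimating
# occupancy from in-place IoT readings. Device types, fields, and the
# clients-as-people proxy are hypothetical assumptions.

from collections import Counter

readings = [  # (device_type, location, payload) pulled from device platforms
    ("wifi_router", "floor_3", {"connected_clients": 12}),
    ("smart_lock", "room_301", {"event": "unlock"}),
    ("wifi_router", "floor_5", {"connected_clients": 4}),
]

def occupancy_by_floor(readings):
    """Curate the raw stream: keep only devices that carry an occupancy
    signal, then aggregate per location."""
    counts = Counter()
    for device, location, payload in readings:
        if device == "wifi_router":
            counts[location] += payload["connected_clients"]
    return dict(counts)

print(occupancy_by_floor(readings))  # {'floor_3': 12, 'floor_5': 4}
```

Each device class would get its own curation rule (lock events, electrical load, etc.), which is what the classification step in the framework above amounts to.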

Keywords: in-place devices, IoT, human-centred data-analytics, spatial design

Procedia PDF Downloads 201
27373 Central Solar Tower Model

Authors: Elmo Thiago Lins Cöuras Ford, Valentina Alessandra Carvalho do Vale

Abstract:

A model of two subsystems of a central solar tower to produce steam for applications that help reduce energy consumption is presented. The first subsystem consists of 24 heliostats constructed from adaptive and mobile metal structures that track the apparent movement of the sun on its focus, covered by 96 layers of mirror of 150 mm in width and 220 mm in length, totaling a concentration area of 3.2 m². This yields the optical parameters essential to the reflection of sunlight by the reflector surface and the absorption of this light by the focus located in the light receiver, which is part of the second subsystem at the top of a tower. The tower was built in galvanized iron, able to support the absorber and a gas cylinder to cool the equipment. The area illuminated by the sun was 9 × 10⁻² m², yielding a concentration factor of 35.22. The processes of manufacture and assembly of the proposed mini central tower, whose main characteristics are ease of construction and assembly in addition to reduced cost, will be shown. Test data on water vapor production parameters are presented and used to diagnose the efficiency of the mini solar central tower. The thermal, economic, and material viability of the proposed system will be demonstrated.
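The reported concentration figures can be checked from the mirror dimensions given above. The gap between the quoted 3.2 m² and the product of the stated mirror dimensions is rounding: 96 mirrors of 0.15 m × 0.22 m give 3.168 m².

```python
# Checking the reported concentration figures against the stated geometry.
# All dimensions come from the abstract above.

n_mirrors = 96
mirror_area = 0.150 * 0.220                      # m^2 per mirror layer
total_reflective_area = n_mirrors * mirror_area  # 3.168 m^2, quoted as 3.2 m^2
illuminated_area = 9e-2                          # m^2 at the receiver focus

concentration_factor = total_reflective_area / illuminated_area
print(round(concentration_factor, 2))  # 35.2, consistent with the reported 35.22
```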

Keywords: solar oven, solar cooker, composite material, low cost, sustainable development

Procedia PDF Downloads 420
27372 A Unique Multi-Class Support Vector Machine Algorithm Using MapReduce

Authors: Aditi Viswanathan, Shree Ranjani, Aruna Govada

Abstract:

With data sizes constantly expanding, and with classical machine learning algorithms that analyze such data requiring larger and larger amounts of computation time and storage space, the need to distribute computation and memory requirements among several computers has become apparent. Although substantial work has been done in developing distributed binary SVM algorithms and multi-class SVM algorithms individually, the field of multi-class distributed SVMs remains largely unexplored. This research seeks to develop an algorithm that implements the Support Vector Machine over a multi-class data set and is efficient in a distributed environment. For this, we recursively choose the best binary split of a set of classes using a greedy technique, much like the divide-and-conquer approach. Our algorithm has shown better computation time during the testing phase than the traditional sequential SVM methods (One vs. One, One vs. Rest) and outperforms them as the size of the data set grows. This approach also classifies the data with higher accuracy than the traditional multi-class algorithms.
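The recursive class-splitting idea can be sketched as follows. The "best split" heuristic here (distance between group centroids) and the lack of an actual binary SVM at each node are simplifying stand-ins for the paper's greedy criterion; only the tree-building structure is the point.

```python
# Sketch of greedy recursive binary splitting of classes, as described
# above. The centroid-distance heuristic is a toy separability proxy,
# not the paper's actual criterion; a binary SVM would be trained at
# each internal node of the resulting tree.

import numpy as np
from itertools import combinations

def best_binary_split(classes, centroids):
    """Greedily choose the two-group partition whose group centroids
    are farthest apart."""
    best, best_gap = None, -1.0
    for r in range(1, len(classes) // 2 + 1):
        for left in combinations(classes, r):
            right = tuple(c for c in classes if c not in left)
            gap = np.linalg.norm(
                np.mean([centroids[c] for c in left], axis=0)
                - np.mean([centroids[c] for c in right], axis=0))
            if gap > best_gap:
                best, best_gap = (left, right), gap
    return best

def build_tree(classes, centroids):
    """Recursively split until each leaf holds a single class."""
    if len(classes) == 1:
        return classes[0]
    left, right = best_binary_split(tuple(classes), centroids)
    return (build_tree(left, centroids), build_tree(right, centroids))

centroids = {0: np.array([0.0, 0.0]), 1: np.array([0.0, 1.0]),
             2: np.array([5.0, 5.0])}
tree = build_tree((0, 1, 2), centroids)
print(tree)  # (2, (0, 1)) -- the distant class 2 splits off first
```

Classification then walks the tree, invoking one binary classifier per level, which is where the distributed (e.g., MapReduce) training of the per-node SVMs would come in.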

Keywords: distributed algorithm, MapReduce, multi-class, support vector machine

Procedia PDF Downloads 405
27371 Readiness of Estonian Working and Non-working Older Adults to Benefit from eHealth

Authors: Marianne Paimre

Abstract:

Estonia is heralded as the most successful digital country in the world, with a highly acclaimed eHealth system. Yet 40% of 65–74-year-olds do not use the Internet at all, and the digital divide between young and elderly people's use of ICT is larger than in many advanced countries. Poor access to ICT resources and insufficient digital skills can lead to detachment from digital health resources, delayed diagnoses, and increased rates of hospitalization. To reveal the digital divide within the elderly population itself, the presentation focuses on the health information behavior of Estonian seniors who either continue or have stopped working after retirement, and on their readiness to use digital health applications. The author's main interest is in access, trust, and skills to use the Internet for medical purposes. Fifteen in-depth interviews with working persons aged 65+, as well as 15 interviews with full-time retirees, were conducted. Also, six think-aloud protocols were conducted. The results indicate that older adults who, due to the nature of their work, have regular access to computers often search for health-related information online. They showed high source criticism and were successful in solving the given tasks. Conversely, most of the fully retired older adults claimed not to use computers or other digital devices and cited lack of skills as the main reason for their inactivity. Thus, when developing health applications, it should be borne in mind that the ability and willingness of older adults to use e-solutions are very different.

Keywords: digital divide, digital healthcare, health information behavior, older adults

Procedia PDF Downloads 158
27370 Unlocking Academic Success: A Comprehensive Exploration of Shaguf Bites’s Impact on Learning and Retention

Authors: Joud Zagzoog, Amira Aldabbagh, Radiyah Hamidaddin

Abstract:

This research aims to test and observe whether artificial intelligence (AI) software and applications can actually be effective, useful, and time-saving for those who use them. Shaguf Bites, a web application that uses AI technology, claims to help students study and memorize information more effectively in less time. The website uses smart learning, or AI-powered bite-sized repetitive learning, by transforming documents or PDFs, with the help of AI, into summarized interactive smart flashcards (Bites, n.d.). To properly test the website's effectiveness, both qualitative and quantitative methods were used in this research. An experiment was conducted on a number of students, who were first requested to use Shaguf Bites without any prior knowledge or explanation of how to use it. Second, they were asked for feedback through a survey on their experience after using it and whether it was helpful, efficient, time-saving, and easy to use for studying. After reviewing the collected data, we found that the majority of students found the website straightforward and easy to use. 58% of the respondents agreed that the website accurately formulated the flashcard questions, and 53% reported that they are likely to use the website again in the future as well as recommend it to others. Overall, from the given results, it is clear that Shaguf Bites has proved to be very beneficial, accurate, and time-saving for the majority of the students.
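The spaced-repetition technique behind flashcard tools of this kind can be sketched with a common SM-2-style scheduling update. This is a generic textbook formulation, not Shaguf Bites's actual algorithm.

```python
# Generic SM-2-style spaced-repetition scheduling: a common formulation
# of the technique named in the keywords, NOT Shaguf Bites's own logic.

def next_review(interval_days, ease, quality):
    """quality: 0-5 self-rating. Returns (new_interval_days, new_ease)."""
    if quality < 3:                      # failed recall: restart at 1 day
        return 1, ease
    # ease factor update, floored at 1.3 as in the standard formulation
    ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    if interval_days == 0:
        return 1, ease
    if interval_days == 1:
        return 6, ease
    return round(interval_days * ease), ease

# A card answered perfectly moves from 1 day out to 6, then ~16 days:
i1, e1 = next_review(1, 2.5, 5)   # (6, 2.6)
i2, e2 = next_review(i1, e1, 5)   # (16, 2.7)
```

The intervals grow geometrically while recall succeeds and collapse on failure, which is what makes the "bite-sized repetitive learning" time-efficient.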

Keywords: artificial intelligence (AI), education, memorization, spaced repetition, flashcards

Procedia PDF Downloads 196
27369 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene

Abstract:

Data flows and the purposes of reporting the data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point, containing all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and for identifying the statistical significance of data reporting event cases. The Grubbs test is often used as it examines one extreme value at a time that exceeds the boundaries of the standard normal distribution. In the study area, the test has not been widely applied by authors, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that are more likely to extract the best solution. For freight delivery management, genetic algorithm schemas are an effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor.
The authors suggest a methodology for the multi-objective analysis, which evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for the freight transfer service in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value for the management of multi-modal transportation processes.
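The Grubbs test used for data validation can be sketched directly. This implementation follows the standard two-sided formulation at the 99% confidence level mentioned above; the example fuel-consumption readings are invented, and the critical value is computed from SciPy's Student-t quantile.

```python
# Two-sided Grubbs outlier test at alpha = 0.01 (the 99% confidence
# level used in the study). Flags at most one extreme value per pass,
# assuming normally distributed data. Example readings are invented.

import math
from scipy import stats

def grubbs_outlier(x, alpha=0.01):
    """Return the index of a detected outlier, or None."""
    n = len(x)
    mean = sum(x) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / (n - 1))
    i = max(range(n), key=lambda j: abs(x[j] - mean))
    g = abs(x[i] - mean) / sd
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = (n - 1) / math.sqrt(n) * math.sqrt(t * t / (n - 2 + t * t))
    return i if g > g_crit else None

fuel = [10.1, 9.8, 10.3, 9.9, 10.0, 10.2, 9.7, 15.0]  # L/100 km, say
print(grubbs_outlier(fuel))  # 7 -- the 15.0 reading is flagged
```

Repeated application after removing each flagged value gives the iterative cleaning pass described in the methodology.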

Keywords: multi-objective, analysis, data flow, freight delivery, methodology

Procedia PDF Downloads 182
27368 From Parents to Pioneers: Examining Parental Impact on Entrepreneurial Traits in Latin America

Authors: Bert Seither

Abstract:

Entrepreneurship is a critical driver of economic growth, especially in emerging regions such as Latin America. This study investigates how parental influences, particularly parental individual entrepreneurial orientation (IEO), shape the entrepreneurial traits of Latin American entrepreneurs. By examining key factors like parental IEO, work ethic, parenting style, and family support, this research assesses how much of an entrepreneur's own IEO can be attributed to parental influence. The study also explores how socio-economic status and cultural context moderate the relationship between parental traits and entrepreneurial orientation. Data will be collected from 600 Latin American entrepreneurs via an online survey. This research aims to provide a comprehensive understanding of the intergenerational transmission of entrepreneurial traits and the broader socio-cultural factors that contribute to entrepreneurial success in diverse contexts. Findings from this study will offer valuable insights for policymakers, educators, and business leaders on fostering entrepreneurship across Latin America, providing practical applications for shaping entrepreneurial behavior through family influences.

Keywords: entrepreneurial orientation, parental influence, Latin American entrepreneurs, socio-economic status, cultural context

Procedia PDF Downloads 25
27367 Minimization of Denial of Services Attacks in Vehicular Adhoc Networking by Applying Different Constraints

Authors: Amjad Khan

Abstract:

The security of vehicular ad hoc networks is of great importance, as it involves serious threats to life. Thus, to provide secure communication among vehicles on the road, the conventional security system is not enough. It is necessary to prevent the network resources from being wasted and to protect them against malicious nodes, so as to ensure data bandwidth availability to the legitimate nodes of the network. This work provides a non-conventional security system by introducing constraints to minimize DoS (denial of service) attacks, especially on data and bandwidth. The data packets received by a node in the network pass through a number of tests, and if any of the tests fails, the node drops those data packets and does not forward them anymore. Also, if a node claims to be the nearest node for forwarding emergency messages, the sender can effectively identify the true or false status of the claim by using these constraints. Consequently, the DoS (denial of service) attack is minimized by the instant availability of data without wasting network resources.
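The pass-every-test-or-drop rule can be sketched as a chain of constraint checks. The specific checks and thresholds below are illustrative assumptions, not the constraints defined in the paper.

```python
# Toy version of the constraint-based filtering described above: a packet
# must pass every test; failing any single one gets it dropped. The
# specific checks and limits are hypothetical, not the paper's.

MAX_HOPS = 10          # no looping / replayed traffic
MAX_RANGE_M = 300.0    # plausible radio range for a "nearest node" claim
MAX_SIZE_BYTES = 2048  # bandwidth protection

def checks(packet):
    yield packet["hop_count"] <= MAX_HOPS
    yield packet["claimed_distance_m"] <= MAX_RANGE_M
    yield packet["size_bytes"] <= MAX_SIZE_BYTES

def forward(packet):
    """Forward only packets that satisfy all constraints."""
    return all(checks(packet))

good = {"hop_count": 3, "claimed_distance_m": 120.0, "size_bytes": 512}
bad = {"hop_count": 3, "claimed_distance_m": 950.0, "size_bytes": 512}
print(forward(good), forward(bad))  # True False
```

The distance check mirrors the nearest-node verification above: a claim of proximity beyond plausible radio range marks the claimant as suspect, and its packets stop consuming bandwidth.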

Keywords: black hole attack, grey hole attack, in-transit traffic tampering, networking

Procedia PDF Downloads 288
27366 Traffic Prediction with Raw Data Utilization and Context Building

Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao

Abstract:

Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain carries out the investigation chiefly with two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses these issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain the intra-trajectory features with graph-based encoding and the inter-trajectory ones with a grid-based model and the technique of back projection, which restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information that is around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.
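The "prediction context" idea can be illustrated with a small sketch: instead of conditioning only on the immediately preceding values, gather the training observations that sit near the prediction timestamp in time-of-day position. The window width, slot granularity, and data layout below are illustrative assumptions, not the paper's construction.

```python
# Sketch of context building around a prediction timestamp. Assumes
# 5-minute slots (288 per day) and a symmetric slot window; both are
# hypothetical choices for illustration.

def context_window(history, target_slot, slots_per_day=288, width=2):
    """Collect values observed at nearby time-of-day slots on any day,
    capturing the typical behaviour around that timestamp."""
    ctx = []
    for t, value in history:
        slot = t % slots_per_day
        if abs(slot - target_slot) <= width:
            ctx.append(value)
    return ctx

# (timestep, traffic value); timesteps 288 and 290 fall on day two
history = [(0, 10.0), (1, 12.0), (288, 11.0), (290, 9.0), (144, 30.0)]
print(context_window(history, target_slot=1))  # [10.0, 12.0, 11.0, 9.0]
```

The context features would then be fed to the predictor alongside the raw-trajectory features, giving it both the recent past and the typical state around the target time.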

Keywords: traffic prediction, raw data utilization, context building, data reduction

Procedia PDF Downloads 132
27365 Additive Carbon Dots Nanocrystals for Enhancement of the Efficiency of Dye-Sensitized Solar Cell in Energy Applications Technology

Authors: Getachew Kuma Watiro

Abstract:

The need for solar energy is constantly increasing, and it is widely available on the earth's surface. Photovoltaic technology is one of the most capable of all viable energy technologies and is seen as a promising approach to power generation, as it is readily available and has zero carbon emissions. Inexpensive and versatile dye-sensitized solar cells have achieved good conversion efficiency and long life, improving the conversion of sunlight to electricity. DSSCs have received a lot of attention for various potential commercial uses, such as mobile devices and portable electronic devices, as well as integrated solar cell modules. Systematic reviews were used to show the critical impact of additive C-dots in dye-sensitized solar cells for energy application technology. This research focuses on methods to synthesize nanoparticles, such as the facile, polyol, calcination, and hydrothermal techniques, as well as on C-dots added by the hydrothermal method. This study deals with the progressive development of DSSCs in photovoltaic technology. Single and heterojunction structure technology devices (ZnO, NiO, SnO2, and NiO/ZnO/N719) were used, additive C-dots were applied (ZnO/C-dots/N719, NiO/C-dots/N719, SnO2/C-dots/N719, and NiO/ZnO/C-dots/N719), and the effects of the C-dots were reviewed. Above all, DSSC technology with C-dots enhances efficiency. Finally, recommendations have been made for future research on the application of DSSCs with the use of these additives.

Keywords: dye-sensitized solar cells, heterojunction’s structure, carbon dot, conversion efficiency

Procedia PDF Downloads 124
27364 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya

Authors: Abdalla Abdelnabi, Yousf Abushalah

Abstract:

The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow marine limestone. Well data and 3D seismic data, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation of the SM Field area. The data include three-dimensional seismic data acquired in 2009, covering approximately 75 mi², with more than 9 wells penetrating the reservoir. Seismic data are used to identify stratigraphic and structural features, such as channels and faults, which may play a significant role in hydrocarbon trapping. The well data are used for the petrophysical analysis of the SM Field. The average porosity of the Middle Eocene G Formation is very good, reaching 24%, especially around well W6. Average water saturation was calculated for each well from porosity and resistivity logs using Archie's formula. The average water saturation for the whole field is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM Field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells using well data and structural maps created from seismic data revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwest and south parts of the field. The gas-water contact is found at 4860 ft using the resistivity log. The net isopach map, using both the trapezoidal and pyramid rules, is used to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically to be 890 Billion Standard Cubic Feet (BSCF) and 630 BSCF, respectively.
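The saturation calculation above follows Archie's formula, which can be written out directly. The parameter values here (a, m, n, Rw, and the example log readings) are typical defaults for a clean carbonate, not the study's calibrated values.

```python
# Water saturation from Archie's formula, as used in the petrophysical
# evaluation above. Parameter values are generic textbook defaults
# (a=1, m=n=2, Rw=0.03 ohm-m), NOT the study's calibrated constants.

def archie_sw(phi, rt, rw=0.03, a=1.0, m=2.0, n=2.0):
    """Sw = ((a * Rw) / (phi^m * Rt)) ** (1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# e.g. 24% porosity (as reported around well W6) and a hypothetical
# 8 ohm-m deep-resistivity reading:
sw = archie_sw(phi=0.24, rt=8.0)
print(round(sw, 3))  # 0.255
```

With these generic parameters the example lands near the 25% average saturation reported for the field, though the study's actual Rw and exponents would come from core and log calibration.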

Keywords: 3D seismic data, well logging, petrel, kingdom suite

Procedia PDF Downloads 154
27363 Analysis of Spatial and Temporal Data Using Remote Sensing Technology

Authors: Kapil Pandey, Vishnu Goyal

Abstract:

Spatial and temporal data analysis is well known in the field of satellite image processing. When spatial data are combined with time series analysis, they give significant results in change detection studies. In this paper, GIS and remote sensing techniques have been used to detect change using time series satellite imagery of Uttarakhand state during the years 1990-2010. Natural vegetation, urban area, forest cover, etc. were chosen as the main landuse classes to study. Landuse/landcover classes for several years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work; finally, a landuse change index was generated, and graphical models were used to present the changes.
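The maximum likelihood classification step can be sketched per pixel. This simplified single-band version with one Gaussian per class (real workflows use multivariate statistics over all bands, and these class statistics are invented) shows the decision rule.

```python
# Minimal per-pixel maximum likelihood classifier of the kind used for
# supervised landuse classification. One Gaussian per class on a single
# band is a simplification of the multivariate case; the statistics
# below are hypothetical training-sample values.

import math

CLASS_STATS = {  # (mean, std) of a band's digital numbers per class
    "forest": (40.0, 5.0),
    "urban": (120.0, 15.0),
    "water": (15.0, 3.0),
}

def log_likelihood(value, mean, std):
    return -math.log(std) - 0.5 * ((value - mean) / std) ** 2

def classify(pixel_value):
    """Assign the class whose Gaussian gives the highest likelihood."""
    return max(CLASS_STATS,
               key=lambda c: log_likelihood(pixel_value, *CLASS_STATS[c]))

print(classify(42), classify(110))  # forest urban
```

Classifying each epoch's image this way and differencing the resulting class maps is what yields the landuse change index described above.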

Keywords: GIS, landuse/landcover, spatial and temporal data, remote sensing

Procedia PDF Downloads 436
27362 Thermo-Physical and Morphological Properties of PDLC Films Doped with TiO2 Nanoparticles

Authors: Salima Bouadjela, Fatima Zohra Abdoune, Lahcene Mechernene

Abstract:

PDLCs are currently considered promising materials for specific applications such as window blinds controlled by an electric field, fog simulators, UV-protective glasses, high-density data storage devices, etc. We know that the electric field inside the liquid crystal is low compared with the external electric field [1,2]. The addition of compounds with high magnetic and electrical properties to a polymer-dispersed liquid crystal (PDLC) will enhance its electrical, optical, and magnetic properties [3,4]. A low concentration of inorganic TiO2 nanoparticles was added to a nematic liquid crystal (E7), combined with monomers (TPGDA), and the monomer/LC mixture was cured to elaborate a polymer-LC-nanoparticle dispersion. The presence of the liquid crystal and nanoparticles in the TPGDA matrix was confirmed, and the modified properties of the PDLC due to the doped nanoparticles were studied and explained by the results of FTIR, POM, and UV measurements. Incorporation of the nanoparticles modifies the structure of the PDLC, increasing the number of droplets and decreasing the droplet size. We found that the presence of TiO2 nanoparticles leads to a shift of the nematic-isotropic transition temperature TNI.

Keywords: nanocomposites, PDLC, phases diagram, TiO2

Procedia PDF Downloads 377
27361 Forecasting of the Mobility of Rainfall-Induced Slow-Moving Landslides Using a Two-Block Model

Authors: Antonello Troncone, Luigi Pugliese, Andrea Parise, Enrico Conte

Abstract:

The present study deals with landslides periodically reactivated by groundwater level fluctuations owing to rainfall. The main type of movement that generally characterizes these landslides is sliding at quite small displacement rates. Another peculiar characteristic of these landslides is that soil deformations are essentially concentrated within a thin shear band located below the body of the landslide, which consequently undergoes an approximately rigid sliding. In this context, a simple method is proposed to forecast the movements of this type of landslide owing to rainfall. To this purpose, the landslide body is schematized by means of a two-block model. Analytical solutions are derived to relate rainfall measurements to groundwater level oscillations and these latter, in turn, to landslide mobility. The proposed method is attractive for engineering applications since it requires few parameters as input data, many of which can be obtained from conventional geotechnical tests. To demonstrate the predictive capability of the proposed method, an application to a well-documented landslide periodically reactivated by rainfall is shown.
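The rainfall-to-mobility chain can be pictured with a toy numerical illustration. This is emphatically not the authors' analytical two-block solution: a linear-reservoir groundwater response and a simple threshold sliding law, with invented parameter values, stand in for it.

```python
# Toy illustration of the rainfall -> groundwater -> movement chain, NOT
# the authors' analytical formulation: groundwater responds to rainfall
# as a linear reservoir, and sliding occurs only while the level exceeds
# a critical threshold. All parameter values are invented.

def simulate(rainfall_mm, k=0.1, recharge=0.05, h_crit=2.0, rate=0.01):
    """Return cumulative displacement (m) for a daily rainfall series."""
    h, displacement = 0.0, 0.0
    for r in rainfall_mm:
        h += recharge * r - k * h      # groundwater level update (m)
        if h > h_crit:                 # reactivation above the threshold
            displacement += rate * (h - h_crit)
    return displacement

dry = simulate([0.0] * 30)   # no rain: the slide stays dormant
wet = simulate([40.0] * 30)  # a wet month reactivates the movement
print(dry, wet > 0)
```

The same two-regime behaviour (dormancy below a groundwater threshold, slow sliding above it) is what the analytical solutions capture with physically derived, rather than invented, parameters.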

Keywords: rainfall, water level fluctuations, landslide mobility, two-block model

Procedia PDF Downloads 125