Search results for: data mining applications and discovery
29721 Blockchain-Based Assignment Management System
Authors: Amogh Katti, J. Sai Asritha, D. Nivedh, M. Kalyan Srinivas, B. Somnath Chakravarthi
Abstract:
Today's education system uses Learning Management System (LMS) portals to score and grade student performance, maintain student records, and accept assignments through online submissions of .pdf, .doc, .ppt, and similar files. Because traditional portals carry a risk of data tampering, we apply a blockchain model in their place to prevent tampering and to provide a decentralized mechanism for overall fairness. Blockchain technology is recommended here because of its consensus mechanism, decentralized architecture, cryptographic encryption, smart contracts, and the Ethereum platform. The proposed system ensures data integrity and tamper-proof assignment submission and grading, benefiting both students and educators. Keywords: education technology, learning management system, decentralized applications, blockchain
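The integrity check at the core of such a system can be illustrated independently of the blockchain layer. The following is a minimal sketch, assuming a simple setup in which the SHA-256 digest of each submission would be anchored on-chain (for example, by a smart contract); it does not reproduce the authors' Ethereum implementation, and all field names and identifiers are hypothetical.

```python
import hashlib
import json
import time

def submission_fingerprint(file_bytes: bytes, student_id: str, assignment_id: str) -> dict:
    """Build a tamper-evident record for one assignment submission.

    Only the SHA-256 digest would be written to the chain; the file itself stays off-chain.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return {
        "student_id": student_id,        # hypothetical field names
        "assignment_id": assignment_id,
        "sha256": digest,
        "submitted_at": int(time.time()),
    }

def verify_submission(file_bytes: bytes, anchored_record: dict) -> bool:
    """Re-hash the stored file and compare against the anchored digest."""
    return hashlib.sha256(file_bytes).hexdigest() == anchored_record["sha256"]

if __name__ == "__main__":
    original = b"%PDF-1.7 ... student answer ..."
    record = submission_fingerprint(original, "S1023", "CS101-A2")
    print(json.dumps(record, indent=2))
    print("untampered:", verify_submission(original, record))          # True
    print("tampered:  ", verify_submission(original + b"x", record))   # False
```

Any later change to the stored file changes its digest, so the comparison against the anchored record fails and the tampering is detected.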
Procedia PDF Downloads 85
29720 Enhancing Healthcare Data Protection and Security
Authors: Joseph Udofia, Isaac Olufadewa
Abstract:
Every day, the size of Electronic Health Records data keeps increasing as new patients visit health practitioners and returning patients fulfil their appointments. As these data grow, so does their susceptibility to cyber-attacks from criminals waiting to exploit them. In the US, the damages from cyberattacks were estimated at $8 billion (2018), $11.5 billion (2019) and $20 billion (2021). These attacks usually involve the exposure of PII. Health data is considered PII, and its exposure carries significant impact. To this end, an enhancement of health policy and standards in relation to data security, especially among patients and their clinical providers, is critical to ensure ethical practices, confidentiality, and trust in the healthcare system. As clinical accelerators and applications that contain user data are used, it is expedient to review and revamp policies like the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and the Fast Healthcare Interoperability Resources (FHIR), all aimed at ensuring data protection and security in healthcare. FHIR caters to healthcare data interoperability, as data is shared across different systems from customers to health insurers and care providers. The astronomical cost of implementation has deterred players in the space from ensuring compliance, leading to susceptibility to data exfiltration and loss of protected health information (PHI). Though HIPAA hones in on the security of protected health information (PHI) and PCI DSS on the security of payment card data, they intersect in the shared goal of protecting sensitive information in line with industry standards. With advancements in technology and the emergence of new tools, it is necessary to revamp these policies to address their complexity and ambiguity, the cost barrier, and the ever-increasing threats in cyberspace. Healthcare data in the wrong hands is a recipe for disaster, and we must enhance its protection and security to protect the mental health of the current and future generations. Keywords: cloud security, healthcare, cybersecurity, policy and standard
Procedia PDF Downloads 93
29719 Biological Applications of CNT Inherited Polyaniline Nano-Composites
Authors: Yashfeen Khan, Anees Ahmad
Abstract:
In the last few decades, nano-composites have been a topic of considerable interest. At present, hybrid nano-composites are favoured over their individual counterparts because of their higher application potential and synergism. Recently, CNT hybrids have demonstrated pronounced capability as effective sorbents for the removal of heavy metal ions and organic contaminants due to their high specific surface area, enhanced reactivity, and sequestration characteristics. The present abstract discusses the removal efficiencies of organic and inorganic pollutants achieved with CNT/PANI composites. It also presents the widespread applications of CNTs, such as monitoring biological systems, biosensors, heat sources for treating cancer, and fire-retardant polymer/CNT composites. On this basis, the article aims to outline the current scenario of CNT-PANI nano-composites. Keywords: biosensors, CNT, hybrids, polyaniline, synergism
Procedia PDF Downloads 377
29718 Ibrutinib and the Potential Risk of Cardiac Failure: A Review of Pharmacovigilance Data
Authors: Abdulaziz Alakeel, Roaa Alamri, Abdulrahman Alomair, Mohammed Fouda
Abstract:
Introduction: Ibrutinib is a selective, potent, and irreversible small-molecule inhibitor of Bruton's tyrosine kinase (BTK). It forms a covalent bond with a cysteine residue (CYS-481) at the active site of BTK, leading to inhibition of BTK enzymatic activity. The drug is indicated to treat certain types of cancer, such as mantle cell lymphoma (MCL), chronic lymphocytic leukaemia and Waldenström's macroglobulinaemia (WM). Cardiac failure refers to the inability of the heart muscle to pump adequate blood to the body's organs. There are multiple types of cardiac failure, including left- and right-sided heart failure and systolic and diastolic heart failure. The aim of this review is to evaluate the risk of cardiac failure associated with the use of ibrutinib and to suggest regulatory recommendations if required. Methodology: The Signal Detection team at the National Pharmacovigilance Center (NPC) of the Saudi Food and Drug Authority (SFDA) performed a comprehensive signal review using its national database as well as the World Health Organization (WHO) database (VigiBase) to retrieve information for assessing the causality between cardiac failure and ibrutinib. We used the WHO-Uppsala Monitoring Centre (UMC) criteria as the standard for assessing the causality of the reported cases. Results: Case review: the search for the combined drug/adverse drug reaction returned 212 global ICSRs as of July 2020. The reviewers selected and assessed the causality for the well-documented ICSRs with completeness scores of 0.9 and above (35 ICSRs); the value 1.0 represents the highest score for the best-documented ICSRs. Among the reviewed cases, more than half provide a supportive association (four probable and 15 possible cases). Data mining: the disproportionality between the observed and the expected reporting rate for the drug/adverse drug reaction pair is estimated using the information component (IC), a tool developed by WHO-UMC to measure the reporting ratio. A positive IC reflects a higher statistical association, while negative values indicate a lower one, with the null value equal to zero. The result (IC = 1.5) revealed a positive statistical association for the drug/ADR combination, which means that ibrutinib with cardiac failure has been observed more often than expected when compared to other medications in the WHO database. Conclusion: Health regulators and health care professionals must be aware of the potential risk of cardiac failure associated with ibrutinib, and monitoring of any signs or symptoms in treated patients is essential. The weighted cumulative evidence identified from the causality assessment of the reported cases and from data mining is sufficient to support a causal association between ibrutinib and cardiac failure. Keywords: cardiac failure, drug safety, ibrutinib, pharmacovigilance, signal detection
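A minimal sketch of the disproportionality measure described above is given below, assuming the commonly published IC formulation with +0.5 shrinkage terms; the report counts are hypothetical and were chosen only to reproduce an IC near the reported value of 1.5, not taken from VigiBase.

```python
import math

def information_component(n_observed, n_drug, n_reaction, n_total):
    """IC = log2((O + 0.5) / (E + 0.5)) with E = n_drug * n_reaction / n_total.

    The +0.5 shrinkage follows the commonly published IC formulation;
    all counts passed in below are illustrative, not real VigiBase figures.
    """
    expected = n_drug * n_reaction / n_total
    return math.log2((n_observed + 0.5) / (expected + 0.5))

# Hypothetical counts: reports with the drug AND the reaction, all reports with the drug,
# all reports with the reaction, and all reports in the database.
ic = information_component(n_observed=212, n_drug=10_000,
                           n_reaction=150_000, n_total=20_000_000)
print(f"IC = {ic:.2f}")   # ~1.5; a positive value indicates higher-than-expected reporting
```

With these illustrative counts the expected number of co-reports is 75, so the observed 212 yields an IC of about 1.5, i.e. a positive disproportionality signal.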
Procedia PDF Downloads 131
29717 Customized Temperature Sensors for Sustainable Home Appliances
Authors: Merve Yünlü, Nihat Kandemir, Aylin Ersoy
Abstract:
Temperature sensors are used in home appliances not only to monitor the basic functions of the machine but also to minimize energy consumption and ensure safe operation. In parallel with the development of smart home applications and IoT algorithms, these sensors produce important data such as the frequency of use of the machine, user preferences, and the compilation of critical data in terms of diagnostic processes for fault detection throughout an appliance's operational lifespan. Commercially available thin-film resistive temperature sensors have a well-established manufacturing procedure that allows them to operate over a wide temperature range. However, these sensors are over-designed for white goods applications. The operating temperature range of these sensors is between -70°C and 850°C, while the temperature range requirement in home appliance applications is between 23°C and 500°C. To ensure the operation of commercial sensors in this wide temperature range, usually, a platinum coating of approximately 1-micron thickness is applied to the wafer. However, the use of platinum in coating and the high coating thickness extends the sensor production process time and therefore increases sensor costs. In this study, an attempt was made to develop a low-cost temperature sensor design and production method that meets the technical requirements of white goods applications. For this purpose, a custom design was made, and design parameters (length, width, trim points, and thin film deposition thickness) were optimized by using statistical methods to achieve the desired resistivity value. To develop thin film resistive temperature sensors, one side polished sapphire wafer was used. To enhance adhesion and insulation 100 nm silicon dioxide was coated by inductively coupled plasma chemical vapor deposition technique. The lithography process was performed by a direct laser writer. The lift-off process was performed after the e-beam evaporation of 10 nm titanium and 280 nm platinum layers. Standard four-point probe sheet resistance measurements were done at room temperature. The annealing process was performed. Resistivity measurements were done with a probe station before and after annealing at 600°C by using a rapid thermal processing machine. Temperature dependence between 25-300 °C was also tested. As a result of this study, a temperature sensor has been developed that has a lower coating thickness than commercial sensors but can produce reliable data in the white goods application temperature range. A relatively simplified but optimized production method has also been developed to produce this sensor.Keywords: thin film resistive sensor, temperature sensor, household appliance, sustainability, energy efficiency
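For background, thin-film platinum elements are normally read out through a resistance-temperature relation. The sketch below uses the Callendar-Van Dusen form with the standard IEC 60751 coefficients for platinum RTDs; these are generic textbook values and may differ from the custom sensor characterized in the abstract above.

```python
import math

# IEC 60751 coefficients for a standard platinum RTD (valid for T >= 0 degC).
A = 3.9083e-3   # 1/degC
B = -5.775e-7   # 1/degC^2

def resistance(t_c: float, r0: float = 100.0) -> float:
    """Callendar-Van Dusen resistance of a Pt element with R(0 degC) = r0 ohm."""
    return r0 * (1.0 + A * t_c + B * t_c ** 2)

def temperature(r: float, r0: float = 100.0) -> float:
    """Invert R(T) for T >= 0 degC via the quadratic formula."""
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r / r0))) / (2.0 * B)

if __name__ == "__main__":
    for t in (25, 100, 300, 500):   # spans the home-appliance range quoted above
        r = resistance(t)
        print(f"{t:>3} degC -> {r:7.2f} ohm -> back to {temperature(r):6.1f} degC")
```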
Procedia PDF Downloads 73
29716 On the Framework of Contemporary Intelligent Mathematics Underpinning Intelligent Science, Autonomous AI, and Cognitive Computers
Authors: Yingxu Wang, Jianhua Lu, Jun Peng, Jiawei Zhang
Abstract:
The fundamental demand in contemporary intelligent science towards Autonomous AI (AI*) is the creation of unprecedented formal means of Intelligent Mathematics (IM). It is discovered that natural intelligence is inductively created rather than exhaustively trained. Therefore, IM is a family of algebraic and denotational mathematics encompassing Inference Algebra, Real-Time Process Algebra, Concept Algebra, Semantic Algebra, Visual Frame Algebra, etc., developed in our labs. IM plays indispensable roles in training-free AI* theories and systems beyond traditional empirical data-driven technologies. A set of applications of IM-driven AI* systems will be demonstrated in contemporary intelligence science, AI*, and cognitive computers.Keywords: intelligence mathematics, foundations of intelligent science, autonomous AI, cognitive computers, inference algebra, real-time process algebra, concept algebra, semantic algebra, applications
Procedia PDF Downloads 63
29715 Development of a Catalogs System for Augmented Reality Applications
Authors: J. Ierache, N. A. Mangiarua, S. A. Bevacqua, N. N. Verdicchio, M. E. Becerra, D. R. Sanz, M. E. Sena, F. M. Ortiz, N. D. Duarte, S. Igarza
Abstract:
Augmented Reality is a technology that overlays virtual content, sensitive to context or environment, on images of the physical world in real time. This paper presents the development of a catalog system that facilitates the creation, publishing, management and exploitation of augmented multimedia content and Augmented Reality applications, giving anyone who wants to attach information to real objects a space of their own in which to edit that content and then share it online with others. These spaces can be built for different domains without initially requiring expert users. The system operates in the context of Web 2.0, or the Social Web, with its various applications, developing content that enriches the real context in which people act and allowing the catalog's contents to evolve in an emergent way. Keywords: augmented reality, catalog system, computer graphics, mobile application
Procedia PDF Downloads 354
29714 Persistent Homology of Convection Cycles in Network Flows
Authors: Minh Quang Le, Dane Taylor
Abstract:
Convection is a well-studied topic in fluid dynamics, yet it is less understood in the context of network flows. Here, we incorporate techniques from topological data analysis (namely, persistent homology) to automate the detection and characterization of convective/cyclic/chiral flows over networks, particularly those that arise for irreversible Markov chains (MCs). As two applications, we study convection cycles arising under the PageRank algorithm, and we investigate chiral edge flows for a stochastic model of a bi-monomer's configuration dynamics. Our experiments highlight how system parameters, e.g., the teleportation rate for PageRank and the transition rates of external and internal state changes for a monomer, can act as homology regularizers of convection, which we summarize with persistence barcodes and homological bifurcation diagrams. Our approach establishes a new connection between the study of convection cycles and homology, the branch of mathematics that formally studies cycles, which has diverse potential applications throughout the sciences and engineering. Keywords: homology, persistent homology, Markov chains, convection cycles, filtration
Procedia PDF Downloads 140
29713 BingleSeq: A User-Friendly R Package for Single-Cell RNA-Seq Data Analysis
Authors: Quan Gu, Daniel Dimitrov
Abstract:
BingleSeq was developed as a Shiny-based, intuitive, and comprehensive application that enables the analysis of single-cell RNA-sequencing count data. This was achieved by incorporating three state-of-the-art software packages for each type of RNA sequencing analysis, alongside functional annotation analysis and a way to assess the overlap of differential expression method results. In its current state, the functionality implemented within BingleSeq is comparable to that of other applications developed with the purpose of lowering the entry requirements for RNA sequencing analyses. BingleSeq is available on GitHub and will be submitted to R/Bioconductor. Keywords: bioinformatics, functional annotation analysis, single-cell RNA-sequencing, transcriptomics
Procedia PDF Downloads 207
29712 A Dynamic Solution Approach for Heart Disease Prediction
Authors: Walid Moudani
Abstract:
The healthcare environment is generally perceived as information rich yet knowledge poor, and there is a lack of effective analysis tools to discover hidden relationships and trends in the data. Valuable knowledge can, however, be discovered by applying data mining techniques in healthcare systems. This study presents a proficient methodology for extracting significant patterns from coronary heart disease warehouses for heart attack prediction, a condition which unfortunately remains a leading cause of mortality worldwide. For this purpose, we propose to dynamically enumerate the optimal subsets of the reduced features of high interest by using the rough sets technique combined with dynamic programming, and then to validate the classification using a Random Forest (RF) decision tree ensemble to identify the risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions according to the medical profiles of patients. Moreover, experts' knowledge in this field has been taken into consideration to define the disease and its risk factors and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem. Keywords: multi-classifier decision trees, feature reduction, dynamic programming, rough sets
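A minimal sketch of the Random Forest validation step on synthetic data is shown below; the feature names and the data-generating process are hypothetical, and the rough-sets and dynamic-programming feature reduction described by the authors is not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 525  # same cohort size as the study; the records themselves are synthetic

# Hypothetical reduced feature set (the paper derives its subset with rough sets
# plus dynamic programming, which this sketch does not reproduce).
X = np.column_stack([
    rng.integers(30, 80, n),          # age
    rng.integers(0, 2, n),            # smoker flag
    rng.normal(130, 20, n),           # systolic blood pressure
    rng.normal(220, 40, n),           # cholesterol
])
risk = (0.04 * (X[:, 0] - 30) + 1.2 * X[:, 1]
        + 0.02 * (X[:, 2] - 120) + 0.01 * (X[:, 3] - 200))
y = (risk + rng.normal(0, 1, n) > risk.mean()).astype(int)   # 1 = risky case

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```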
Procedia PDF Downloads 411
29711 Advanced Magnetic Field Mapping Utilizing Vertically Integrated Deployment Platforms
Authors: John E. Foley, Martin Miele, Raul Fonda, Jon Jacobson
Abstract:
This paper presents the development and implementation of new and innovative data collection and analysis methodologies based on the deployment of total-field magnetometer arrays. Our research has focused on the development of a vertically-integrated suite of platforms all utilizing common data acquisition, data processing and analysis tools. These survey platforms include low-altitude helicopters and ground-based vehicles, including robots, for terrestrial mapping applications. For marine settings, the sensor arrays are deployed either from a hydrodynamic bottom-following wing towed by a surface vessel or from a towed floating platform in shallow water. Additionally, sensor arrays are deployed from tethered remotely operated vehicles (ROVs) for underwater settings where high maneuverability is required. While the primary application of these systems is the detection and mapping of unexploded ordnance (UXO), these systems are also used for various infrastructure mapping and geologic investigations. For each application, success is driven by the integration of magnetometer arrays, accurate geo-positioning, system noise mitigation, and stable deployment of the system in appropriate proximity to expected targets or features. Each of the systems collects geo-registered data compatible with a web-enabled data management system providing immediate access to data and metadata for remote processing, analysis and delivery of results. This approach allows highly sophisticated magnetic processing methods, including classification based on dipole modeling and remanent magnetization, to be efficiently applied to many projects. This paper also briefly describes the initial development of magnetometer-based detection systems deployed from low-altitude helicopter platforms and the subsequent successful transition of this technology to the marine environment. Additionally, we present examples from a range of terrestrial and marine settings as well as ongoing research efforts related to sensor miniaturization for unmanned aerial vehicle (UAV) magnetic field mapping applications. Keywords: dipole modeling, magnetometer mapping systems, sub-surface infrastructure mapping, unexploded ordnance detection
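As background to the dipole-modeling classification mentioned above, the following is a minimal sketch of the point-dipole field commonly used in such inversions; the target depth and magnetic moment are hypothetical values chosen only for illustration.

```python
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability (T m / A)

def dipole_field(r_obs, r_dip, m):
    """Magnetic flux density (T) at r_obs due to a point dipole of moment m (A m^2) at r_dip."""
    r = np.asarray(r_obs, float) - np.asarray(r_dip, float)
    d = np.linalg.norm(r)
    rhat = r / d
    return MU0 / (4.0 * np.pi) * (3.0 * rhat * np.dot(m, rhat) - np.asarray(m, float)) / d ** 3

# Hypothetical buried ferrous object 2 m below a sensor profile, moment along z.
m = np.array([0.0, 0.0, 5.0])                      # A m^2
for x in np.linspace(-3, 3, 7):
    b = dipole_field([x, 0.0, 0.0], [0.0, 0.0, -2.0], m)
    print(f"x = {x:+4.1f} m -> |B| = {np.linalg.norm(b) * 1e9:7.2f} nT")
```

Fitting this forward model to the measured anomaly yields the moment and position estimates on which dipole-based target classification typically rests.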
Procedia PDF Downloads 466
29710 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas
Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards
Abstract:
Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, spatial analysis, etc. The laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data are mainly ground points (that represent the bare earth) and non-ground points (that represent buildings, trees, cars, etc.). Removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data. The presented filter utilizes a weight function to allocate a weight to each point in the data. Furthermore, unlike most existing methods, the presented filtering algorithm is designed to be fully automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) value is 0.35 m. Keywords: airborne laser scanning, digital terrain models, filtering, forested areas
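A simplified, one-dimensional sketch of the spline-based ground filtering idea is given below; the iteration count, residual threshold and weight values are assumptions for illustration and do not reproduce the authors' exact algorithm.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)

# Synthetic 1-D ALS profile: gentle terrain plus canopy returns 5-20 m above ground.
x = np.unique(rng.uniform(0, 100, 400))
ground = 0.05 * x + 2.0 * np.sin(x / 15.0)
z = ground + rng.normal(0, 0.1, x.size)
canopy = rng.random(x.size) < 0.4
z[canopy] += rng.uniform(5, 20, canopy.sum())

w = np.ones_like(z)
for _ in range(5):
    spline = UnivariateSpline(x, z, w=w, s=float(x.size))   # weighted smoothing spline
    resid = z - spline(x)
    # Assumed weight function: points far above the fitted surface (vegetation) are
    # down-weighted; points on or below the surface keep full weight.
    w = np.where(resid > 0.5, 0.01, 1.0)

dtm = spline(x)
rmse = np.sqrt(np.mean((dtm - ground) ** 2))
print(f"RMSE of recovered terrain vs. true ground: {rmse:.2f} m")
```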
Procedia PDF Downloads 140
29709 Integrating Radar Sensors with an Autonomous Vehicle Simulator for an Enhanced Smart Parking Management System
Authors: Mohamed Gazzeh, Bradley Null, Fethi Tlili, Hichem Besbes
Abstract:
The burgeoning global ownership of personal vehicles has placed a significant strain on urban infrastructure, notably parking facilities, leading to traffic congestion and environmental concerns. Effective parking management systems (PMS) are indispensable for optimizing urban traffic flow and reducing emissions. The most commonly deployed systems nowadays rely on computer vision technology. This paper explores the integration of radar sensors and simulation in the context of smart parking management. We concentrate on radar sensors due to their versatility and utility in automotive applications, which extend to PMS. Additionally, radar sensors play a crucial role in driver assistance systems and autonomous vehicle development. However, the resource-intensive nature of radar data collection for algorithm development and testing necessitates innovative solutions. Simulation, particularly the monoDrive simulator, an internal development tool used by NI, the Test and Measurement division of Emerson, offers a practical means to overcome this challenge. The primary objectives of this study encompass simulating radar sensors to generate a substantial dataset for algorithm development and testing and, critically, assessing the transferability of models between simulated and real radar data. We focus on occupancy detection in parking as a practical use case, categorizing each parking space as vacant or occupied. The simulation approach using monoDrive enables algorithm validation and reliability assessment for virtual radar sensors. Various parking scenarios were meticulously designed, involving manual measurement of parking spot coordinates and orientations and the use of a TI AWR1843 radar. To create a diverse dataset, we generated 4950 scenarios, comprising a total of 455,400 parking spots. This extensive dataset encompasses radar configuration details, ground truth occupancy information, radar detections, and associated object attributes such as range, azimuth, elevation, radar cross-section, and velocity data. The paper also addresses the intricacies and challenges of real-world radar data collection, highlighting the advantages of simulation in producing radar data for parking lot applications. We developed classification models based on Support Vector Machines (SVM) and Density-Based Spatial Clustering of Applications with Noise (DBSCAN), exclusively trained and evaluated on simulated data. Subsequently, we applied these models to real-world data, comparing their performance against the monoDrive dataset. The study demonstrates the feasibility of transferring models from a simulated environment to real-world applications, achieving an impressive accuracy score of 92% using only one radar sensor. This finding underscores the potential of radar sensors and simulation in the development of smart parking management systems, offering significant benefits for improving urban mobility and reducing environmental impact. The integration of radar sensors and simulation represents a promising avenue for enhancing smart parking management systems, addressing the challenges posed by the exponential growth in personal vehicle ownership. This research contributes valuable insights into the practicality of using simulated radar data in real-world applications and underscores the role of radar technology in advancing urban sustainability. Keywords: autonomous vehicle simulator, FMCW radar sensors, occupancy detection, smart parking management, transferability of models
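A minimal occupancy-classification sketch in the spirit of the SVM model described above is given below; it is trained on synthetic per-spot radar features rather than the monoDrive dataset, and the feature definitions are assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n_spots = 2000   # synthetic parking spots; the study simulates 455,400 of them

# Hypothetical per-spot features aggregated from radar detections:
# [detections inside the spot, mean RCS (dBsm), spread of detection ranges (m)]
occupied = rng.integers(0, 2, n_spots)
n_det  = np.where(occupied, rng.poisson(12, n_spots), rng.poisson(1, n_spots))
rcs    = np.where(occupied, rng.normal(8, 3, n_spots), rng.normal(-5, 3, n_spots))
spread = np.where(occupied, rng.normal(1.5, 0.4, n_spots), rng.normal(0.2, 0.2, n_spots))
X = np.column_stack([n_det, rcs, spread])

X_tr, X_te, y_tr, y_te = train_test_split(X, occupied, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_tr, y_tr)
print(f"occupancy accuracy: {accuracy_score(y_te, model.predict(X_te)):.3f}")
```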
Procedia PDF Downloads 83
29708 Breast Cancer Survivability Prediction via Classifier Ensemble
Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia
Abstract:
This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then tries to find the most important features among the four groups that maximize the weighted average F-score of a given classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER through the feature selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores from each of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all published systems to date when evaluated against the exact same SEER data (period of 1973-2002). It gives an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set. Keywords: classifier ensemble, breast cancer survivability, data mining, SEER
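A minimal stacking sketch of the described ensemble is shown below; because scikit-learn provides no Bayesian network classifier, a logistic regression stands in for that base learner, and the scikit-learn breast cancer dataset stands in for SEER.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer            # stand-in dataset, not SEER
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression        # stand-in for the Bayesian network
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.metrics import f1_score

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

# Base learners: the paper uses a decision tree, a Bayesian network and Naive Bayes.
bases = [DecisionTreeClassifier(max_depth=5, random_state=0),
         GaussianNB(),
         LogisticRegression(max_iter=5000)]

# Meta-features: out-of-fold class-1 probabilities, i.e. the base "confidence scores".
meta_tr = np.column_stack([
    cross_val_predict(b, X_tr, y_tr, cv=5, method="predict_proba")[:, 1] for b in bases
])
meta_clf = GaussianNB().fit(meta_tr, y_tr)                  # Naive Bayes on top, as in the paper

for b in bases:
    b.fit(X_tr, y_tr)
meta_te = np.column_stack([b.predict_proba(X_te)[:, 1] for b in bases])
pred = meta_clf.predict(meta_te)
print(f"weighted F-score: {f1_score(y_te, pred, average='weighted'):.4f}")
```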
Procedia PDF Downloads 330
29707 Significance of Transient Data and Its Applications in Turbine Generators
Authors: Chandra Gupt Porwal, Preeti C. Porwal
Abstract:
Transient data reveals much about the machine's condition that steady-state data cannot. New technologies make this information much more available for evaluating the mechanical integrity of a machine train. Recent surveys at various stations indicate that simplicity is preferred over completeness in machine audits throughout the power generation industry. This is most clearly shown by the number of rotating machinery predictive maintenance programs in which only steady-state vibration amplitude is trended, while important transient vibration data is not even acquired. Efforts have been made to explain what transient data is, its importance, the types of plots used for its display, and its effective utilization for analysis. To demonstrate the value of measuring transient data and its practical application in rotating machinery for resolving complex and persistent issues with turbine generators, the author presents a few case studies experienced at Indian power plants: rotor instability caused by the shaft moving towards the bearing centre in a 100 MW LMZ unit located in the Northern Capital Region (NCR); heavy misalignment, noticed especially above 2993 rpm and caused by loose coupling bolts, which prevented a 250 MW KWU unit in the Western Region (WR) from being synchronized for more than four months; and heavy preload at the intermediate pressure turbine (IPT) bearing near the HP-IP coupling, caused by high points on the coupling faces of a 500 MW KWU unit in the Northern Region (NR). Keywords: transient data, steady-state data, intermediate pressure turbine, high points
Procedia PDF Downloads 72
29706 Blue Economy and Marine Mining
Authors: Fani Sakellariadou
Abstract:
The Blue Economy includes all marine-based and marine-related activities. They correspond to established, emerging, as well as unborn ocean-based industries. Seabed mining is an emerging marine-based activity; its operations depend particularly on cutting-edge science and technology. The 21st century will face a crisis in resources as a consequence of the world's population growth and the rising standard of living. The natural capital stored in the global ocean is decisive for its ability to provide a wide range of sustainable ecosystem services. Seabed mineral deposits were identified as having a high potential for critical elements and base metals. They have a crucial role in the fast evolution of green technologies. The major categories of marine mineral deposits are deep-sea deposits, including cobalt-rich ferromanganese crusts, polymetallic nodules, phosphorites, and deep-sea muds, as well as shallow-water deposits, including marine placers. Seabed mining operations may take place within continental shelf areas of nation-states. In international waters, the International Seabed Authority (ISA) has entered into 15-year contracts for deep-seabed exploration with 21 contractors. These contracts are for polymetallic nodules (18 contracts), polymetallic sulfides (7 contracts), and cobalt-rich ferromanganese crusts (5 contracts). Exploration areas are located in the Clarion-Clipperton Zone, the Indian Ocean, the Mid Atlantic Ridge, the South Atlantic Ocean, and the Pacific Ocean. Potential environmental impacts of deep-sea mining include habitat alteration, sediment disturbance, plume discharge, release of toxic compounds, light and noise generation, and air emissions. They could cause burial and smothering of benthic species, health problems for marine species, biodiversity loss, reduced photosynthesis, behavioural change and masking of acoustic communication in mammals and fish, heavy metal bioaccumulation up the food web, a decrease in dissolved oxygen content, and climate change. An important concern related to deep-sea mining is our knowledge gap regarding deep-sea bio-communities. The ecological consequences for the remote, unique, fragile, and little-understood deep-sea ecosystems and their inhabitants are still largely unknown. The blue economy conceptualizes oceans as developing spaces supplying socio-economic benefits for current and future generations while also protecting, supporting, and restoring biodiversity and ecological productivity. In that sense, people should apply holistic management and assess marine mining impacts on ecosystem services, including the categories of provisioning, regulating, supporting, and cultural services. The variety in environmental parameters, the range in sea depth, the diversity in the characteristics of marine species, and the possible proximity to other existing maritime industries mean that marine mining impacts span a wide range and affect the ability of ecosystems to support people and nature. In conclusion, the use of the untapped potential of the global ocean demands a responsible and sustainable attitude. Moreover, there is a need to change our lifestyle and move beyond the philosophy of single-use. Living in a throw-away society based on a linear approach to resource consumption, humans are putting too much pressure on the natural environment. By applying modern, sustainable and eco-friendly approaches according to the principle of the circular economy, substantial natural resource savings can be achieved.
Acknowledgement: This work is part of the MAREE project, financially supported by the Division VI of IUPAC. This work has been partly supported by the University of Piraeus Research Center.Keywords: blue economy, deep-sea mining, ecosystem services, environmental impacts
Procedia PDF Downloads 86
29705 Tripeptide Inhibitor: The Simplest Aminogenic PEGylated Drug against Amyloid Beta Peptide Fibrillation
Authors: Sutapa Som Chaudhury, Chitrangada Das Mukhopadhyay
Abstract:
Alzheimer's disease has been a well-known form of dementia since its discovery in 1906. Current Food and Drug Administration approved medications, e.g., cholinesterase inhibitors and memantine, offer modest symptomatic relief but do not play any role in disease modification or recovery. In the last three decades, many small molecules, chaperones, synthetic peptides, and partial β-secretase enzyme blockers have been tested for the development of a drug against Alzheimer's, though none passed phase 3 clinical trials. In this study, we designed a PEGylated, aminogenic, tripeptidic polymer with two different molecular weights based on the aggregation-prone amino acid sequence 17-20 in amyloid beta (Aβ) 1-42. Being conjugated with poly-ethylene glycol (PEG), which self-assembles into hydrophilic nanoparticles, these PEGylated tripeptides constitute a very good drug delivery system crossing the blood-brain barrier, while the peptide remains protected from proteolytic degradation and non-specific protein interactions. Moreover, being completely aminogenic, they would not raise any side effects. These peptide inhibitors were evaluated for their effectiveness against Aβ42 fibrillation at an early stage of oligomer-to-fibril formation as well as for preformed fibril clearance via Thioflavin T (ThT) assay, dynamic light scattering analyses, atomic force microscopy and scanning electron microscopy. The inhibitors were proved to be safe at a concentration as high as 20 µM by the reduction assay of 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) dye. Moreover, SH-SY5Y neuroblastoma cells have shown greater survivability when treated with the inhibitors following Aβ42 fibril and oligomer treatment, as compared with control Aβ42 fibril and/or oligomer treated neuroblastoma cells. These results make the peptidic inhibitors promising candidates in the discovery of alternative medication for Alzheimer's disease. Keywords: Alzheimer's disease, alternative medication, amyloid beta, PEGylated peptide
Procedia PDF Downloads 209
29704 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations
Authors: Ram Mohan, Richard Haney, Ajit Kelkar
Abstract:
Current developments in computing have shown the advantage of using one or more Graphic Processing Units (GPU) to boost the performance of many computationally intensive applications but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPU(s) for High Performance Computing (HPC) can be categorized as hardware and software oriented in nature. Understanding how these factors affect performance is essential to develop efficient and robust applications codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation will focus on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems using a computationally intensive application that is representative of a large portion of challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as the computational core, characteristics and results of which reflect many other HPC applications via the sparse matrix system used for the solution of linear system of equations. This work describes these various software and hardware factors and how they interact to affect performance of computationally intensive applications enabling more efficient development and porting of High Performance Computing applications that includes current, legacy, and future large scale computational modeling applications in various engineering and scientific disciplines.Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance
Procedia PDF Downloads 366
29703 The Socio-Technical Relationship between Architects and Nano-Enhanced Materials: An Ethnographic Study in Cairo, Egypt
Authors: Ramy Bakir
Abstract:
Advancements in the field of nanoscience and nanotechnology have had a sweeping effect on the manufacturing industry in the last two decades, and have specifically allowed for the enhancement of a multitude of applications in the field of building technology. Research carried out in the architectural field in the past decade highlights how those enhancements have improved the structural and environmental performance of buildings, and/or how they developed the aesthetic value of façade or interior treatments. In developing countries, such as Egypt, the actual use of those nano-enhanced applications and their benefits rarely manifest. Hence this paper investigates the socio-technical relationship between the architectural design process and nanotechnology in Cairo using participant observation within an ethnographic study. The study focused on the socio-cultural context of an environmental design process in a specific design firm, and the role of nano-enhanced applications in it, and provided a thick description of the design decisions made within the preliminary stages of the design process of a residential building in Cairo, Egypt. Using Grounded Theory, and through the analysis and coding of the qualitative data collected, this paper was able to identify specific socio-cultural issues influencing individual architect cognition, clarifying how the context of the design process of the studied project affected the design team members’ responses to nano-enhanced materials. This paper presents those findings within a framework of the three identified statuses of response to nanotechnology and classifies the socio-cultural reasons influencing them. In doing so, the paper aims to shed more light on the relation between nanotechnology and architects in their natural environment, and hence allow both to benefit more from a clearer understanding of how the socio-cultural context, along with the benefits of using nanotechnology, influences the design decisions made.Keywords: nanotechnology, design process, socio-cultural context, nano-enhanced applications
Procedia PDF Downloads 269
29702 Distributed Cyber Physical Secure Framework for DC Microgrids: DC Ship Power System Applications
Authors: Grace karimi Muriithi, Behnaz Papari, Ali Arsalan, Christopher Shannon Edrington
Abstract:
The complexity and nonlinearity of control system design are increasing for DC microgrid applications when cyber concerns, together with the technology's constraints, are added to the picture. Controller functionality during critical operation modes must be guaranteed, particularly for high-profile applications such as the Navy DC ship power system (SPS), a small-scale DC microgrid. The SPS is susceptible to cyber-attacks, which can accordingly have disastrous effects. In this study, a machine learning (ML) approach is demonstrated that offers promising SPS performance, providing effective and robust functionality over the duration of an attack. Analysis of the simulation results demonstrates that the proposed method successfully improves controllability. Keywords: controllability, cyber attacks, distributed control, machine learning
Procedia PDF Downloads 116
29701 Revealing the Genome Based Biosynthetic Potential of a Streptomyces sp. Isolate BR123 Presenting Broad Spectrum Antimicrobial Activities
Authors: Neelma Ashraf
Abstract:
Actinomycetes, particularly the genus Streptomyces, are of great importance due to their role in the discovery of new natural products, particularly antimicrobial secondary metabolites, in medicinal science and the biotechnology industry. Different Streptomyces strains were isolated from Helianthus annuus plants and tested for antibacterial and antifungal activities. The five most promising strains were chosen for further investigation, and growth conditions for antibiotic synthesis were optimised. The supernatants were extracted in different solvents, and the extracted products were analyzed using liquid chromatography-mass spectrometry (LC-MS) and biological testing. From one of the potent strains, Streptomyces globusus sp. BR123, the compound lavendamycin was identified using these analytical techniques. In addition, this potent strain also produces a strong antifungal polyene compound with a quasimolecular ion of 2072. Streptomyces sp. BR123 was genome sequenced because of its promising antimicrobial potential, in order to identify the gene cluster responsible for the analyzed compound lavendamycin. The genome analysis yielded candidate genes responsible for the production of this potent compound. A genome sequence of 8.15 Mb for Streptomyces sp. isolate BR123, with a GC content of 72.63% and 8103 protein-coding genes, was obtained. Many antimicrobial, antiparasitic, and anticancerous compounds were detected through multiple biosynthetic gene clusters predicted by in silico analysis. The novelty of the metabolites was indicated by their low resemblance to known biosynthetic gene clusters. The current study gives insight into the bioactive potential of Streptomyces sp. isolate BR123 with respect to the synthesis of bioactive secondary metabolites through genomic and spectrometric analysis. Moreover, the comparative genome study revealed the connection of isolate BR123 with other Streptomyces strains, which could expand the knowledge of this genus and the mechanisms involved in the discovery of new antimicrobial metabolites. Keywords: streptomyces, secondary metabolites, genome, biosynthetic gene clusters, high performance liquid chromatography, mass spectrometry
Procedia PDF Downloads 71
29700 A Functional Thermochemical Energy Storage System for Mobile Applications: Design and Performance Analysis
Authors: Jure Galović, Peter Hofmann
Abstract:
Thermochemical energy storage (TCES), as a long-term and lossless energy storage principle, contributes to the reduction of greenhouse gas emissions from mobile applications, such as passenger vehicles with an internal combustion engine. A prototype of a TCES system, based on reversible sorption reactions of a LiBr composite and methanol, has been designed at Vienna University of Technology. In this paper, the selection of reactive and inert carrier materials as well as the design of the heat exchangers (reactor vessel and evapo-condenser) were reviewed, and the cycle stability under real operating conditions was investigated. The performance of the developed system strongly depends on the environmental temperatures to which the reactor vessel and evapo-condenser are exposed during the phases of thermal conversion. For integration of the system into mobile applications, the functionality of the designed prototype was proven over numerous cycles, during which no adverse reactions were observed. Keywords: dynamic applications, LiBr composite, methanol, performance of TCES system, sorption process, thermochemical energy storage
Procedia PDF Downloads 168
29699 Determinants of Poverty: A Logit Regression Analysis of Zakat Applicants
Authors: Zunaidah Ab Hasan, Azhana Othman, Abd Halim Mohd Noor, Nor Shahrina Mohd Rafien
Abstract:
Zakat is a portion of wealth contributed by financially able Muslims to be distributed to predetermined recipients, chief among them the poor and the needy. The zakat fund is distributed with the objective of lifting the recipients out of poverty. Due to the multidimensional and multifaceted nature of poverty, it is imperative that the causes of poverty are properly identified so that assistance given by zakat authorities reaches the intended target. Despite various studies undertaken to identify the poor correctly, there are reports of the poor not receiving the adequate assistance required from zakat. Thus, this study examines the determinants of poverty among applicants for zakat assistance distributed by the State Islamic Religious Council in Malacca (SIRCM). Malacca is a state in Malaysia. The respondents were drawn from the list of new zakat applicants for the months of April and May 2014 provided by SIRCM. A binary logistic regression was estimated on these data, with the rejection or acceptance of the zakat application as the dependent variable and a set of demographic variables and health as the explanatory variables. Overall, the logistic model successfully predicted the factors behind acceptance of zakat applications. The independent variables of gender, age, household size, and health significantly explain the likelihood of a successful zakat application. Among others, the findings suggest the importance of focusing on providing education opportunities in helping the poor. Keywords: logistic regression, zakat distribution, status of zakat applications, poverty, education
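A minimal sketch of the binary logit described above, fitted to synthetic applicant data, is given below; the variable names and the data-generating process are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 300   # synthetic applicants; the actual study uses the April-May 2014 SIRCM list

df = pd.DataFrame({
    "gender_female": rng.integers(0, 2, n),
    "age": rng.integers(18, 80, n),
    "household_size": rng.integers(1, 10, n),
    "poor_health": rng.integers(0, 2, n),
})
# Hypothetical data-generating process: older, larger, and less healthy households
# are made more likely to have their zakat application accepted.
logit_true = (-4 + 0.04 * df["age"] + 0.35 * df["household_size"]
              + 0.8 * df["poor_health"] + 0.3 * df["gender_female"])
df["accepted"] = (rng.random(n) < 1 / (1 + np.exp(-logit_true))).astype(int)

X = sm.add_constant(df[["gender_female", "age", "household_size", "poor_health"]])
model = sm.Logit(df["accepted"], X).fit(disp=False)
print(model.summary())
print("odds ratios:\n", np.exp(model.params))
```

Exponentiating the coefficients gives odds ratios, i.e. how much each demographic factor multiplies the odds of a successful application.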
Procedia PDF Downloads 341
29698 De Novo Assembly and Characterization of the Transcriptome from the Fluoroacetate Producing Plant, Dichapetalum Cymosum
Authors: Selisha A. Sooklal, Phelelani Mpangase, Shaun Aron, Karl Rumbold
Abstract:
Organically bound fluorine (C-F bond) is extremely rare in nature. Despite this, the first fluorinated secondary metabolite, fluoroacetate, was isolated from the plant Dichapetalum cymosum (commonly known as Gifblaar). However, the enzyme responsible for fluorination (fluorinase) in Gifblaar was never isolated and very little progress has been achieved in understanding this process in higher plants. Fluorinated compounds have vast applications in the pharmaceutical, agrochemical and fine chemicals industries. Consequently, an enzyme capable of catalysing a C-F bond has great potential as a biocatalyst in the industry considering that the field of fluorination is virtually synthetic. As with any biocatalyst, a range of these enzymes are required. Therefore, it is imperative to expand the exploration for novel fluorinases. This study aimed to gain molecular insights into secondary metabolite biosynthesis in Gifblaar using a high-throughput sequencing-based approach. Mechanical wounding studies were performed using Gifblaar leaf tissue in order to induce expression of the fluorinase. The transcriptome of the wounded and unwounded plant was then sequenced on the Illumina HiSeq platform. A total of 26.4 million short sequence reads were assembled into 77 845 transcripts using Trinity. Overall, 68.6 % of transcripts were annotated with gene identities using public databases (SwissProt, TrEMBL, GO, COG, Pfam, EC) with an E-value threshold of 1E-05. Sequences exhibited the greatest homology to the model plant, Arabidopsis thaliana (27 %). A total of 244 annotated transcripts were found to be differentially expressed between the wounded and unwounded plant. In addition, secondary metabolic pathways present in Gifblaar were successfully reconstructed using Pathway tools. Due to lack of genetic information for plant fluorinases, a transcript failed to be annotated as a fluorinating enzyme. Thus, a local database containing the 5 existing bacterial fluorinases was created. Fifteen transcripts having homology to partial regions of existing fluorinases were found. In efforts to obtain the full coding sequence of the Gifblaar fluorinase, primers were designed targeting the regions of homology and genome walking will be performed to amplify the unknown regions. This is the first genetic data available for Gifblaar. It has provided novel insights into the mechanisms of metabolite biosynthesis and will allow for the discovery of the first eukaryotic fluorinase.Keywords: biocatalyst, fluorinase, gifblaar, transcriptome
Procedia PDF Downloads 277
29697 Origins: An Interpretive History of MMA Design Studio’s Exhibition for the 2023 Venice Biennale
Authors: Jonathan A. Noble
Abstract:
‘Origins’ is an exhibition designed and installed by MMA Design Studio at the 2023 Venice Biennale. The installation formed part of the ‘Dangerous Liaisons’ group exhibition at the Arsenale building. An immersive experience was created for those who visited, where video projection and the bodies of visitors interacted with the scene. Designed by South African architect Mphethi Morojele, founder and owner of MMA, the installation took its primary inspiration from the discovery in 2019, by Professor Karim Sadr, of a substantial Tswana settlement. Situated in the present-day Suikerbosrand Nature Reserve, some 45 km south of Johannesburg, this precolonial city, named Kweneng, has been dated back to the fifteenth century. This remarkable discovery was achieved thanks to advanced aerial LiDAR scanning technology, which was used to capture the traces of Kweneng, spanning a terrain some 10 km long and 2 km wide. Discovered by light (LiDAR) and exhibited through light, Origins presents a simulated experience of Kweneng. The presentation of Kweneng was achieved primarily through video, with a circular projection onto the floor of an animated LiDAR data sequence, and onto the walls a filmed dance sequence choreographed to embody the architectural, spatial and symbolic significance of Kweneng. This paper documents the design process involved in the conceptualization, development and final realization of this noteworthy exhibition, with an elucidation of key social and cultural questions pertaining to precolonial heritage, reimagined histories and postcolonial identity. Periods of change and of social awakening sometimes spark an interest in questions of origin, of cultural lineage and belonging, which is certainly the case for contemporary, post-Apartheid South Africa. Researching this paper has required primary study of MMA Design Studio’s project archive, including various proposals and other design-related documents, conceptual design sketches, architectural drawings and photographs. This material is supported by the author’s first-hand interviews with Morojele and others who were involved, especially with respect to the choreography of the interpretive dance, the LiDAR visualization techniques and the video production that informed the simulated, immersive experience at the exhibition. Presenting a ‘dangerous liaison’ between architecture and dance, Origins looks into the distant past to frame contemporary questions pertaining to intangible heritage, animism and embodiment through architecture and dance, considerations which are required “to survive the future”, says Morojele. Keywords: architecture and dance, Kweneng, MMA design studio, origins, Venice Biennale
Procedia PDF Downloads 91
29696 Modalmetric Fiber Sensor and Its Applications
Authors: M. Zyczkowski, P. Markowski, M. Karol
Abstract:
The team from IOE MUT has been developing fiber optic sensors for security systems for 15 years. The conclusions of this work indicate that these sensors are complicated, expensive to produce, and reliant on sophisticated signal processing methods. We present the results of investigations of three different applications of the modalmetric sensor: • Protection of museum collections and heritage buildings, • Protection of fiber optic transmission lines, • Protection of objects of critical infrastructure. Each of the presented applications involves different requirements for the system. The results indicate that it is possible to develop a fiber optic sensor based on a single fiber. Modifying the optoelectronic parts, changing the length of the sensor, and changing the method of reflection of the propagating light at the end of the sensor allow the system to be adjusted to the specific application. Keywords: modalmetric fiber optic sensor, security sensor, optoelectronic parts, signal processing
Procedia PDF Downloads 623
29695 An Engineering Review of Grouting in Soil Improvement Applications
Authors: Mohamad Kazem Zamani, Meldi Suhatril
Abstract:
Soil improvement is one of the main concerns of every civil engineer working in soil mechanics and geotechnics. Grouting has been used as a powerful treatment for soil improvement. In this paper, we review grouting applications according to the grouts used and give a general view of where and when each can be applied. Keywords: cementitious grouting, chemical grouting, soil improvement, civil engineering
Procedia PDF Downloads 521
29694 A Bivariate Inverse Generalized Exponential Distribution and Its Applications in Dependent Competing Risks Model
Authors: Fatemah A. Alqallaf, Debasis Kundu
Abstract:
The aim of this paper is to introduce a bivariate inverse generalized exponential distribution which has a singular component. The proposed bivariate distribution can be used when the marginals have heavy-tailed distributions and non-monotone hazard functions. Due to the presence of the singular component, it can be used quite effectively when there are ties in the data. Since it has four parameters, it is a very flexible bivariate distribution, and it can be used quite effectively for analyzing various bivariate data sets. Several dependency properties and dependency measures have been obtained. The maximum likelihood estimators cannot be obtained in closed form, and computing them involves solving a four-dimensional optimization problem. To avoid that, we have proposed to use an EM algorithm, which involves solving only one non-linear equation at each E-step. Hence, the implementation of the proposed EM algorithm is very straightforward in practice. Extensive simulation experiments and the analysis of one data set have been performed. We have observed that the proposed bivariate inverse generalized exponential distribution can be used for modeling dependent competing risks data. One data set has been analyzed to show the effectiveness of the proposed model. Keywords: Block and Basu bivariate distributions, competing risks, EM algorithm, Marshall-Olkin bivariate exponential distribution, maximum likelihood estimators
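As a hedged illustration of the univariate building block only, the sketch below uses the inverse generalized exponential marginal obtained by inverting a generalized exponential variable (if Y has CDF (1 - exp(-lambda*y))^alpha, then X = 1/Y is inverse generalized exponential); the bivariate singular-component construction and the EM algorithm of the paper are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
alpha_true, lam_true = 2.0, 1.5

# Sample Y ~ GE(alpha, lambda) by inverting its CDF, then take X = 1/Y.
u = rng.random(5000)
y = -np.log(1.0 - u ** (1.0 / alpha_true)) / lam_true
x = 1.0 / y

def neg_loglik(params):
    """Negative log-likelihood of the inverse generalized exponential density
    f(x) = alpha * lambda * x^-2 * exp(-lambda/x) * (1 - exp(-lambda/x))^(alpha-1)."""
    a, lam = params
    if a <= 0 or lam <= 0:
        return np.inf
    t = np.exp(-lam / x)
    return -np.sum(np.log(a) + np.log(lam) - 2 * np.log(x) - lam / x + (a - 1) * np.log1p(-t))

res = minimize(neg_loglik, x0=[1.0, 1.0], method="Nelder-Mead")
print("MLE (alpha, lambda):", res.x)   # should land near (2.0, 1.5)
```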
Procedia PDF Downloads 144
29693 Tuning Cubic Equations of State for Supercritical Water Applications
Authors: Shyh Ming Chern
Abstract:
Cubic equations of state (EoS), popular due to their simple mathematical form, ease of use, semi-theoretical nature, and reasonable accuracy, are normally fitted to vapor-liquid equilibrium P-v-T data. As a result, they often show poor accuracy in the region near and above the critical point. In this study, the performance of the renowned Peng-Robinson (PR) and Patel-Teja (PT) EoSs around the critical region has been examined against the P-v-T data of water. Both of them display large deviations at the critical point. For instance, the PR EoS exhibits discrepancies as high as 47% for the specific volume, 28% for the enthalpy departure and 43% for the entropy departure at the critical point. It is shown that incorporating P-v-T data of the supercritical region into the retuning of a cubic EoS can improve its performance above the critical point dramatically. Adopting a retuned acentric factor of 0.5491 instead of its genuine value of 0.344 for water in the PR EoS, and a new F of 0.8854 instead of its original value of 0.6898 for water in the PT EoS, reduces the discrepancies to about one-third or less. Keywords: equation of state, EoS, supercritical water, SCW
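A minimal sketch of the retuning comparison is given below: the standard Peng-Robinson cubic is solved for water at one assumed supercritical state point, once with the genuine acentric factor and once with the retuned value reported above; the state point itself is illustrative and not taken from the paper.

```python
import numpy as np

R = 8.314462618              # J/(mol K)
Tc, Pc = 647.10, 22.064e6    # critical constants of water (K, Pa)

def pr_molar_volume(T, P, omega):
    """Largest real root of the Peng-Robinson EoS, returned as molar volume (m^3/mol)."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc))) ** 2
    a = 0.45724 * R ** 2 * Tc ** 2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A, B = a * P / (R * T) ** 2, b * P / (R * T)
    # Z^3 - (1 - B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    coeffs = [1.0, -(1.0 - B), A - 3.0 * B ** 2 - 2.0 * B, -(A * B - B ** 2 - B ** 3)]
    z = max(r.real for r in np.roots(coeffs) if abs(r.imag) < 1e-10)
    return z * R * T / P

T, P = 673.15, 25.0e6        # an assumed supercritical state point (400 degC, 25 MPa)
for omega in (0.344, 0.5491):   # genuine acentric factor vs. the retuned value
    v = pr_molar_volume(T, P, omega)
    print(f"omega = {omega:>6}: v = {v * 1e6:8.2f} cm^3/mol")
```

Comparing both predictions against tabulated steam data at the same state point shows how the retuned parameter shifts the predicted specific volume in the supercritical region.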
Procedia PDF Downloads 538
29692 Synthesis, Characterization, Validation of Resistant Microbial Strains and Antimicrobial Activity of Substituted Pyrazoles
Authors: Rama Devi Kyatham, D. Ashok, K. S. K. Rao Patnaik, Raju Bathula
Abstract:
We have shown the importance of pyrazoles as anti-microbial chemical entities. These compounds have generally been considered significant due to their wide range of pharmacological activities, and their discovery motivates new avenues of research. The proposed pyrazoles were synthesized and evaluated for their anti-microbial activities. The synthesized compounds were analyzed by different spectroscopic methods. Keywords: pyrazoles, validation, resistant microbial strains, anti-microbial activities
Procedia PDF Downloads 176