Search results for: data mining applications and discovery
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30852

29112 Multi-Criteria Decision Support System for Modeling of Civic Facilities Using GIS Applications: A Case Study of F-11, Islamabad

Authors: Asma Shaheen Hashmi, Omer Riaz, Khalid Mahmood, Fahad Ullah, Tanveer Ahmad

Abstract:

Urban landscapes are changing with population growth and advances in new technologies. Urban sprawl patterns and land uses are related to local socioeconomic and physical conditions. Urban policy decisions are executed mostly through spatial planning. A decision support system (DSS) is a powerful tool that provides a flexible, knowledge-based method for urban planning. An application was developed using a geographical information system (GIS) for urban planning. A scenario-based DSS was developed to integrate hierarchical multi-criteria data on different aspects of the urban landscape: the physical environment, the dumping site, the spatial distribution of the road network, gas and water supply lines, urban watershed management, and selection criteria for new residential, recreational, commercial, and industrial sites. The model provides a framework to incorporate sustainable future development. Data can be entered dynamically by planners according to the appropriate criteria for the management of urban landscapes.

Keywords: urban, GIS, spatial, criteria

Procedia PDF Downloads 638
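The multi-criteria integration the abstract describes can be illustrated with a weighted-sum suitability score. This is a minimal sketch, assuming illustrative criteria names, weights, and normalized scores; none of them come from the actual DSS.

```python
# Hypothetical weighted-sum multi-criteria scoring for candidate sites.
# Criteria, weights, and scores are illustrative assumptions.

def suitability(scores, weights):
    """Weighted-sum suitability for one candidate site.

    scores  -- dict criterion -> normalized score in [0, 1]
    weights -- dict criterion -> weight (weights must sum to 1)
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

# Example: ranking two candidate residential sites.
weights = {"road_access": 0.4, "water_supply": 0.35, "distance_dump": 0.25}
sites = {
    "A": {"road_access": 0.8, "water_supply": 0.6, "distance_dump": 0.9},
    "B": {"road_access": 0.5, "water_supply": 0.9, "distance_dump": 0.4},
}
best = max(sites, key=lambda s: suitability(sites[s], weights))
print(best)  # site A scores 0.755 vs. 0.615 for B
```

In a GIS setting the same computation is typically run per raster cell rather than per discrete site, with each criterion layer normalized before weighting.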
29111 Stress Corrosion Crack Identification with Direct Assessment Method in Pipeline Downstream from a Compressor Station

Authors: H. Gholami, M. Jalali Azizpour

Abstract:

Stress corrosion cracking (SCC) in pipelines is a type of environmentally assisted cracking (EAC). Since its discovery in 1965 as a possible cause of pipeline failure, SCC has caused, on average, one or two failures per year in the U.S. According to the NACE SCC direct assessment (DA) standard, a pipeline segment is considered susceptible to SCC if all of the following factors are met: the operating stress exceeds 60% of the specified minimum yield strength (SMYS), the operating temperature exceeds 38°C, the segment is less than 32 km downstream from a compressor station, the age of the pipeline is greater than 10 years, and the coating type is other than fusion bonded epoxy (FBE). In this paper, as a practical experience in NISOC, the direct assessment (DA) method is used to identify SCC defects in an unpiggable pipeline located downstream of a compressor station.

Keywords: stress corrosion crack, direct assessment, disbondment, transgranular SCC, compressor station

Procedia PDF Downloads 387
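The five susceptibility factors listed in the abstract combine into a single conjunctive test, which can be sketched as a predicate. The parameter names are illustrative; only the thresholds come from the abstract.

```python
# Sketch of the NACE SCC-DA susceptibility screen from the abstract:
# all five conditions must hold for a segment to be flagged.

def scc_susceptible(stress_pct_smys, temp_c, dist_downstream_km,
                    age_years, coating):
    return (stress_pct_smys > 60            # operating stress > 60% SMYS
            and temp_c > 38                 # operating temperature > 38 C
            and dist_downstream_km < 32     # < 32 km below compressor station
            and age_years > 10              # pipeline older than 10 years
            and coating != "FBE")           # fusion bonded epoxy is excluded

# A 15-year-old, coal-tar-coated segment 20 km downstream of the station:
print(scc_susceptible(65, 45, 20, 15, "coal tar"))  # True
```

Changing any single argument past its threshold (for example, coating `"FBE"`) makes the predicate false, mirroring the "all of the following" wording of the standard.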
29110 Cybersecurity Assessment of Decentralized Autonomous Organizations in Smart Cities

Authors: Claire Biasco, Thaier Hayajneh

Abstract:

A smart city is the integration of digital technologies in urban environments to enhance the quality of life. Smart cities capture real-time information from devices, sensors, and network data to analyze and improve city functions such as traffic analysis, public safety, and environmental impacts. Current smart cities face controversy due to their reliance on real-time data tracking and surveillance. Internet of Things (IoT) devices and blockchain technology are converging to reshape smart city infrastructure away from its centralized model. Connecting IoT data to blockchain applications would create a peer-to-peer, decentralized model. Furthermore, blockchain technology powers the ability for IoT device data to shift from the ownership and control of centralized entities to individuals or communities with Decentralized Autonomous Organizations (DAOs). In the context of smart cities, DAOs can govern cyber-physical systems to have a greater influence over how urban services are being provided. This paper will explore how the core components of a smart city now apply to DAOs. We will also analyze different definitions of DAOs to determine their most important aspects in relation to smart cities. Both categorizations will provide a solid foundation to conduct a cybersecurity assessment of DAOs in smart cities. It will identify the benefits and risks of adopting DAOs as they currently operate. The paper will then provide several mitigation methods to combat cybersecurity risks of DAO integrations. Finally, we will give several insights into what challenges will be faced by DAO and blockchain spaces in the coming years before achieving a higher level of maturity.

Keywords: blockchain, IoT, smart city, DAO

Procedia PDF Downloads 125
29109 Applications and Development of a Plug Load Management System That Automatically Identifies the Type and Location of Connected Devices

Authors: Amy Lebar, Kim L. Trenbath, Bennett Doherty, William Livingood

Abstract:

Plug and process loads (PPLs) account for 47% of U.S. commercial building energy use. There is a huge potential to reduce whole building consumption by targeting PPLs for energy savings measures or implementing some form of plug load management (PLM). Despite this potential, there has yet to be a widely adopted commercial PLM technology. This paper describes the Automatic Type and Location Identification System (ATLIS), a PLM system framework with automatic and dynamic load detection (ADLD). ADLD gives PLM systems the ability to automatically identify devices as they are plugged into the outlets of a building. The ATLIS framework takes advantage of smart, connected devices to identify device locations in a building, meter and control their power, and communicate this information to a central database. ATLIS includes five primary capabilities: location identification, communication, control, energy metering and data storage. A laboratory proof of concept (PoC) demonstrated all but the data storage capabilities and these capabilities were validated using an office building scenario. The PoC can identify when a device is plugged into an outlet and the location of the device in the building. When a device is moved, the PoC’s dashboard and database are automatically updated with the new location. The PoC implements controls to devices from the system dashboard so that devices maintain correct schedules regardless of where they are plugged in within a building. ATLIS’s primary technology application is improved PLM, but other applications include asset management, energy audits, and interoperability for grid-interactive efficient buildings. A system like ATLIS could also be used to direct power to critical devices, such as ventilators, during a brownout or blackout. Such a framework is an opportunity to make PLM more widespread and reduce the amount of energy consumed by PPLs in current and future commercial buildings.

Keywords: commercial buildings, grid-interactive efficient buildings (GEB), miscellaneous electric loads (MELs), plug loads, plug load management (PLM)

Procedia PDF Downloads 135
29108 Towards a Secure Storage in Cloud Computing

Authors: Mohamed Elkholy, Ahmed Elfatatry

Abstract:

Cloud computing has emerged as a flexible computing paradigm that has reshaped the Information Technology map. However, cloud computing has brought about a number of security challenges as a result of the physical distribution of computational resources and the limited control that users have over the physical storage. This situation raises many security challenges for data integrity and confidentiality as well as authentication and access control. This work proposes a security mechanism for data integrity that allows a data owner to be aware of any modification that takes place to their data. The data integrity mechanism is integrated with an extended Kerberos authentication that ensures authorized access control. The proposed mechanism protects data confidentiality even if the data are stored on untrusted storage. The mechanism has been evaluated against different types of attacks and proved effective in protecting cloud data storage from malicious attacks.

Keywords: access control, data integrity, data confidentiality, Kerberos authentication, cloud security

Procedia PDF Downloads 336
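The owner-verifiable integrity idea in the abstract can be sketched with a keyed hash: the owner tags data before upload and recomputes the tag on retrieval. This assumes a shared secret established by the authentication step; the paper's actual mechanism may differ, and the key below is a placeholder.

```python
# Minimal integrity-tag sketch: a keyed HMAC lets the data owner detect
# any modification of data held on untrusted cloud storage. The key is an
# illustrative placeholder for a session key from the Kerberos exchange.
import hmac
import hashlib

def tag(key: bytes, data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

key = b"session-key-from-kerberos"   # placeholder, not a real key
original = b"cloud object v1"
stored_tag = tag(key, original)

# On retrieval, recompute and compare in constant time:
tampered = b"cloud object v2"
print(hmac.compare_digest(stored_tag, tag(key, original)))  # True
print(hmac.compare_digest(stored_tag, tag(key, tampered)))  # False
```

Because the tag is keyed, the storage provider cannot silently recompute a valid tag after altering the data, which is the property the abstract's owner-awareness requirement needs.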
29107 Bitplanes Image Encryption/Decryption Using Edge Map (SSPCE Method) and Arnold Transform

Authors: Ali A. Ukasha

Abstract:

Data security is needed in data transmission, storage, and communication. The single step parallel contour extraction (SSPCE) method is used to create an edge map, serving as a key image, from a gray-level or binary image. An XOR operation is performed between the key image and each bit plane of the original image to change the image pixel values. The Arnold transform is then used to change the locations of image pixels as an image scrambling process. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D gray-level image, which can then be completely reconstructed without any distortion. It is also shown that the analyzed algorithm has extremely high security against attacks such as salt-and-pepper noise and JPEG compression. This proves that a gray-level image can be protected at a higher security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.

Keywords: SSPCE method, image compression, salt and peppers attacks, bitplanes decomposition, Arnold transform, lossless image encryption

Procedia PDF Downloads 501
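The two building blocks the abstract combines, XOR with a key image and Arnold scrambling, can be sketched on a tiny square image. The SSPCE edge extraction itself is not shown; the key image below is an arbitrary stand-in, and XOR-ing whole 8-bit pixel values is equivalent to XOR-ing all eight bit planes separately.

```python
# Sketch of XOR-with-key-image plus Arnold cat-map scrambling, with the
# exact inverse for decryption. The key image is a stand-in for the
# SSPCE edge map; images are N x N lists of 8-bit values.

def arnold(img):
    """One Arnold cat-map iteration: (x, y) -> ((x + y) % n, (x + 2y) % n)."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
    return out

def arnold_inv(img):
    """Inverse map: (x, y) -> ((2x - y) % n, (y - x) % n)."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(2 * x - y) % n][(y - x) % n] = img[x][y]
    return out

def xor_key(img, key):
    # XOR on full pixel values == XOR applied to every bit plane.
    return [[p ^ k for p, k in zip(r1, r2)] for r1, r2 in zip(img, key)]

img = [[10, 20], [30, 40]]
key = [[255, 0], [0, 255]]            # stand-in for the SSPCE edge map
cipher = arnold(xor_key(img, key))

# Decryption reverses both steps and recovers the image exactly:
assert xor_key(arnold_inv(cipher), key) == img
```

Both steps are exactly invertible, which is why the scheme is lossless; in practice the Arnold map is iterated several times, with the iteration count acting as part of the key.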
29106 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data

Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah

Abstract:

National Statistical Institutes hold a large volume of data, generally in formats that condition how the information they contain is published. Each household or business data collection project includes its own dissemination platform. These previously used dissemination methods do not promote rapid access to information and, especially, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to publish all these data on a single platform and offer the option of linking them with external data sources. The approach will be applied to data from major national surveys, such as those on employment, poverty, and child labor, and the general census of the population of Senegal.

Keywords: Semantic Web, linked open data, database, statistic

Procedia PDF Downloads 178
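Publishing a statistical record as Linked Open Data amounts to lifting each observation into subject-predicate-object triples. This is a hypothetical sketch: the W3C RDF Data Cube namespace, the `example.org` URIs, and the record fields are all illustrative assumptions, not the authors' ontology.

```python
# Hypothetical lifting of one survey observation into RDF-style triples.
# Namespaces and field names are assumptions for illustration only.
QB = "http://purl.org/linked-data/cube#"
EX = "http://example.org/stats/"

record = {"survey": "employment-2019", "region": "Dakar", "rate": 17.3}

triples = [
    (EX + "obs1", "rdf:type", QB + "Observation"),
    (EX + "obs1", EX + "survey", EX + record["survey"]),
    (EX + "obs1", EX + "region", EX + record["region"]),
    (EX + "obs1", EX + "unemploymentRate", str(record["rate"])),
]

for s, p, o in triples:
    print(s, p, o)
```

Once every observation is expressed this way, linking with external sources reduces to reusing shared URIs (for regions, indicators, time periods) instead of local identifiers.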
29105 A Study on the Computation of Gourava Indices for Poly-L Lysine Dendrimer and Its Biomedical Applications

Authors: M. Helen

Abstract:

A chemical graph serves as a convenient model for any real or abstract chemical system. Dendrimers are novel three-dimensional, hyper-branched, globular nanopolymeric architectures. Drug delivery scientists are especially enthusiastic about the possible utility of dendrimers as drug delivery tools. Dendrimers like poly-L-lysine (PLL), poly-propylene imine (PPI), and poly-amidoamine (PAMAM) are used as gene carriers in drug delivery systems because of their chemical characteristics. These characteristics of chemical compounds are analysed using topological indices (invariants under graph isomorphism) such as the Wiener index, the Zagreb indices, etc. Motivated by the application of the Zagreb indices in finding the total π-electron energy, Prof. V. R. Kulli derived the Gourava indices, an improved version of the Zagreb indices. In this paper, we study the structure of the PLL dendrimer, which has the following applications: reduction in toxicity, colon delivery, and topical delivery. We determine the first and second Gourava indices, the first and second hyper-Gourava indices, and the product and sum connectivity Gourava indices for the PLL dendrimer. Gourava indices have found applications in Quantitative Structure-Property Relationship (QSPR) and Quantitative Structure-Activity Relationship (QSAR) studies.

Keywords: connectivity Gourava indices, dendrimer, Gourava indices, hyper Gourava indices

Procedia PDF Downloads 142
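For a graph G with edge set E and vertex degrees d(u), the first and second Gourava indices are GO1(G) = Σ_{uv∈E} [d(u)+d(v)+d(u)d(v)] and GO2(G) = Σ_{uv∈E} [(d(u)+d(v))·d(u)d(v)]. A sketch of the computation over an edge list follows; the example graph is an arbitrary path, not the PLL dendrimer.

```python
# First and second Gourava indices computed from an edge list.
# The example graph (path P4) is illustrative only.
from collections import Counter

def gourava(edges):
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    go1 = sum(deg[u] + deg[v] + deg[u] * deg[v] for u, v in edges)
    go2 = sum((deg[u] + deg[v]) * deg[u] * deg[v] for u, v in edges)
    return go1, go2

# Path graph P4 has degrees 1, 2, 2, 1:
print(gourava([(1, 2), (2, 3), (3, 4)]))  # (18, 28)
```

The hyper-Gourava and connectivity Gourava indices studied in the paper replace each edge term with its square or reciprocal square root, so they follow the same edge-sum pattern.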
29104 Efficient Motion Estimation by Fast Three Step Search Algorithm

Authors: S. M. Kulkarni, D. S. Bormane, S. L. Nalbalwar

Abstract:

Rapid developments in technology have had a dramatic impact on the medical health care field. Medical databases obtained with the latest machines, such as CT and MRI scanners, require a large amount of memory for storage and a large bandwidth for the transmission of data in telemedicine applications. Thus, there is a need for video compression. As a medical image database contains a number of frames (slices), motion estimation is needed while coding these images. Motion estimation finds the movement of objects in an image sequence and yields motion vectors that represent the estimated motion of objects in the frame. Motion compensation is performed in order to reduce the temporal redundancy between successive frames of a video sequence. In this paper, the three step search (TSS) block matching algorithm is implemented on different types of video sequences. It is shown that the three step search algorithm produces better quality performance and lower computational time compared with the exhaustive full search algorithm.

Keywords: block matching, exhaustive search motion estimation, three step search, video compression

Procedia PDF Downloads 493
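Three-step search evaluates nine candidate displacements around the current best vector, halves the step size, and repeats, so it probes far fewer positions than full search. A compact sketch, assuming frames as 2-D lists and SAD as the matching cost; this is an illustration, not the paper's implementation.

```python
# Three-step search block matching with SAD cost. Illustrative sketch.

def sad(cur, ref, bx, by, dx, dy, n):
    """Sum of absolute differences between the n x n block of `cur` at
    (bx, by) and the block of `ref` displaced by (dx, dy)."""
    s = 0
    for i in range(n):
        for j in range(n):
            s += abs(cur[by + i][bx + j] - ref[by + dy + i][bx + dx + j])
    return s

def three_step_search(cur, ref, bx, by, n=4, step=4):
    best = (0, 0)
    h, w = len(ref), len(ref[0])
    while step >= 1:
        cands = [(best[0] + sx * step, best[1] + sy * step)
                 for sx in (-1, 0, 1) for sy in (-1, 0, 1)]
        # keep only candidates whose displaced block stays in the frame
        cands = [(dx, dy) for dx, dy in cands
                 if 0 <= bx + dx <= w - n and 0 <= by + dy <= h - n]
        best = min(cands, key=lambda d: sad(cur, ref, bx, by, d[0], d[1], n))
        step //= 2
    return best  # estimated motion vector (dx, dy)

# A block copied from a (+4, +4) offset in a textured frame is recovered:
ref = [[(31 * x + 17 * y) % 97 for x in range(16)] for y in range(16)]
cur = [[0] * 16 for _ in range(16)]
for i in range(4):
    for j in range(4):
        cur[8 + i][8 + j] = ref[12 + i][12 + j]
print(three_step_search(cur, ref, 8, 8))  # (4, 4)
```

With a ±7 search range, TSS visits at most 25 positions per block versus 225 for exhaustive search, which is the computational saving the abstract reports.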
29103 Novel Aminoglycosides to Target Resistant Pathogens

Authors: Nihar Ranjan, Derrick Watkins, Dev P. Arya

Abstract:

Current methods in the study of the antibiotic activity of ribosome-targeted antibiotics depend on cell-based bacterial inhibition assays or various forms of ribosomal binding assays. These assays are typically independent of each other, and little direct correlation between ribosomal binding and bacterial inhibition is established with the complementary assay. We have developed novel high-throughput-capable assays for ribosome-targeted drug discovery. One such assay examines a compound's ability to bind to a model ribosomal RNA A-site. We have also coupled this assay to other functional orthogonal assays. Such analysis can provide a valuable understanding of the relationship between two complementary drug screening methods and could be used as a standard analysis to correlate the affinity of a compound for its target with the effect the compound has on a cell.

Keywords: bacterial resistance, aminoglycosides, screening, drugs

Procedia PDF Downloads 373
29102 Kinetics and Removal of Amoxicillin Using Aliquat336 as a Carrier via HFSLM

Authors: Teerapon Pirom, Ura Pancharoen

Abstract:

Amoxicillin is an antibiotic widely used to treat various infections in both human beings and animals. However, amoxicillin released into the environment is a major problem: it causes bacterial resistance to these drugs and failure of treatment with antibiotics. Liquid membranes are of great interest as a promising method for the separation and recovery of target ions from aqueous solutions, because carriers drive the transport mechanism, resulting in high selectivity and rapid transport of the desired species. The simultaneous extraction and stripping processes in a single liquid membrane unit operation are very attractive. Therefore, it is practical to apply liquid membranes, particularly the hollow fiber supported liquid membrane (HFSLM), for industrial applications, as HFSLM has proved to be a separation process with lower capital and operating costs, low energy use, long extractant lifetime, high selectivity, and high fluxes compared with solid membranes. It has a simple design amenable to scaling up for industrial applications. The extraction and recovery of amoxicillin through the HFSLM using Aliquat336 as a carrier were explored experimentally. The important variables affecting the transport of amoxicillin, viz. extractant concentration and operating time, were investigated. The highest amoxicillin extraction of 85.35% and stripping of 80.04% were achieved under the best conditions of 6 mmol/L Aliquat336 and an operating time of 100 min. The extraction reaction order (n) and the extraction reaction rate constant (kf) were found to be 1.00 and 0.0344 min⁻¹, respectively.

Keywords: aliquat336, amoxicillin, HFSLM, kinetic

Procedia PDF Downloads 277
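A reaction order of 1.00 means the feed concentration follows dC/dt = -kf·C. The sketch below evaluates this idealized first-order model with the reported kf = 0.0344 min⁻¹; it is an illustration of the kinetic form only and is not claimed to reproduce the paper's measured extraction percentages, which reflect the full membrane transport process.

```python
# Idealized first-order extraction kinetics, dC/dt = -kf * C, using the
# rate constant reported in the abstract. Illustrative model only.
import math

def fraction_extracted(kf, t):
    """Fraction removed from the feed, 1 - C(t)/C0, at time t (min)."""
    return 1.0 - math.exp(-kf * t)

kf = 0.0344  # min^-1, from the abstract
print(round(100 * fraction_extracted(kf, 100), 1))  # percent at 100 min
```

The model predicts a higher removal at 100 min than the 85.35% observed, which is expected: the measured value also includes membrane diffusion and stripping-side resistances that the bare rate law ignores.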
29101 The Role of Data Protection Officer in Managing Individual Data: Issues and Challenges

Authors: Nazura Abdul Manap, Siti Nur Farah Atiqah Salleh

Abstract:

For decades, the misuse of personal data has been a critical issue. Malaysia has accepted responsibility by implementing the Malaysian Personal Data Protection Act 2010 (PDPA 2010) to secure personal data. After more than a decade, this legislation is set to be revised by the current PDPA 2023 Amendment Bill to align with the world's key personal data protection regulations, such as the European Union General Data Protection Regulation (GDPR). Among the suggested adjustments is the Data User's appointment of a Data Protection Officer (DPO) to ensure the commercial entity's compliance with the PDPA 2010 criteria. The change is expected to be enacted in parliament fairly soon; nevertheless, based on the experience of the Personal Data Protection Department (PDPD) in implementing the Act, it is projected that there will be a slew of additional concerns associated with the DPO mandate. Consequently, the goal of this article is to highlight the issues that the DPO will encounter and how the Personal Data Protection Department should respond. The study result was produced using a qualitative technique based on an examination of the current literature. This research reveals that the DPO is likely to face obstacles, and thus there should be a definite, clear guideline in place to aid the DPO in executing their tasks. It is argued that appointing a DPO is a wise measure for ensuring that the legal data security requirements are met.

Keywords: guideline, law, data protection officer, personal data

Procedia PDF Downloads 79
29100 Development of Soft 3D Printing Materials for Textile Applications

Authors: Chi-Chung Marven Chick, Chu-Po Ho, Sau-Chuen Joe Au, Wing-Fai Sidney Wong, Chi-Wai Kan

Abstract:

Recently, 3D printing has become a popular manufacturing process and has attracted special attention in textile applications. However, there are various types of 3D printing materials, including plastics, resins, rubber, ceramics, gold, platinum, silver, iron, and titanium, and not all of these materials are suitable for textile applications. Generally speaking, 3D printing of textiles mainly uses thermoplastic polymers such as acrylonitrile butadiene styrene (ABS), polylactide (PLA), polycaprolactone (PCL), thermoplastic polyurethane (TPU), polyethylene terephthalate glycol-modified (PETG), polystyrene (PS), and polypropylene (PP). Due to the characteristics of these polymers, 3D printed textiles usually have low air permeability and poor comfort. Therefore, in this paper, we review the possible materials suitable for textile applications with the desired physical and mechanical properties.

Keywords: 3D printing, 3D printing materials, textile, properties

Procedia PDF Downloads 68
29099 A Recognition Method for Spatio-Temporal Background in Korean Historical Novels

Authors: Seo-Hee Kim, Kee-Won Kim, Seung-Hoon Kim

Abstract:

The most important elements of a novel are its characters, events, and background. The background represents the time, place, and situation in which characters appear, and it conveys events and atmosphere more realistically. If readers have proper knowledge about the background of a novel, it may help them understand its atmosphere and choose a novel they want to read. In this paper, we target Korean historical novels because the spatio-temporal background plays an especially important role in historical novels among the genres of Korean novels. To the best of our knowledge, no previous study has been aimed at Korean novels. In this paper, we build a Korean historical national dictionary. Our dictionary contains historical places and the temple names of kings over many generations, as well as currently existing spatial and temporal words in Korean history. We also present a method for recognizing the spatio-temporal background based on patterns of phrasal words in Korean sentences. Our rules utilize postpositions for spatial background recognition and temple names for temporal background recognition. Knowledge of the recognized background can help readers understand the flow of events and the atmosphere, and it can be used to visualize the elements of novels.

Keywords: data mining, Korean historical novels, Korean linguistic feature, spatio-temporal background

Procedia PDF Downloads 280
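The postposition-based spatial rule can be illustrated with a toy pattern: a noun immediately followed by a locative postposition such as 에서 ("in/at"). This is a deliberate simplification of the phrasal-word patterns the paper describes; the regex and the example sentence are illustrative only.

```python
# Toy postposition-based spatial-background matcher: capture the noun
# preceding the locative postpositions 에서/으로. A simplification of
# the rule patterns described in the abstract.
import re

SPATIAL = re.compile(r"(\S+?)(?:에서|으로)\b")

sentence = "그는 한양에서 왕을 만났다."   # "He met the king in Hanyang."
print(SPATIAL.findall(sentence))        # ['한양']
```

A real system would additionally check the captured noun against the historical place dictionary, and apply an analogous lookup of kings' temple names for the temporal background.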
29098 Slope Stability Assessment in Metasedimentary Deposit of an Opencast Mine: The Case of the Dikuluwe-Mashamba (DIMA) Mine in the DR Congo

Authors: Dina Kon Mushid, Sage Ngoie, Tshimbalanga Madiba, Kabutakapua Kakanda

Abstract:

Slope stability assessment is still the biggest challenge in mining activities and civil engineering structures. The slope in an opencast mine frequently crosses multiple weak layers that lead to the instability of the pit. Faults and soft layers throughout the rock increase weathering and erosion rates. Therefore, it is essential to investigate the stability of complex strata to determine how stable they are. In the Dikuluwe-Mashamba (DIMA) area, the lithology of the stratum is a set of metamorphic rocks whose parent rocks are sedimentary rocks with a low degree of metamorphism. Thus, due to the composition and metamorphism of the parent rock, the rock formation varies in hardness: where the dolomitic and siliceous content is high, the rock is hard, and where the argillaceous and sandy content is high, it is softer. Therefore, the formation appears as alternating weak and hard layers in the vertical direction, and as soft and hard bands within the same rock layer in the horizontal direction. From the structural point of view, the main structures in the mining area are the Dikuluwe dipping syncline and the Mashamba dipping anticline, and the occurrence of rock formations varies greatly. During the folding of the rock formation, stress concentrates on the soft layers, causing the weak layers to break; at the same time, interlayer dislocation occurs. This article aimed to evaluate the stability of the metasedimentary rocks of the Dikuluwe-Mashamba (DIMA) open-pit mine using limit equilibrium and stereographic methods. Based on the presence of statistical structural planes, the stereographic projection was used to study the slope's stability and to examine the discontinuity orientation data to identify failure zones along the mine. The results revealed that the slope angle is too steep and landslides are easily induced. The numerical method's sensitivity analysis showed that the slope angle and groundwater significantly impact the slope safety factor. An increase in the groundwater level substantially reduces the stability of the slope. Among the factors affecting the variation of the safety factor, the effect of the bulk density of the soil is greater than that of the rock mass, the effect of the cohesion of the soil mass is smaller than that of the rock mass, and the effect of the friction angle in the rock mass is much larger than that in the soil mass. The analysis showed that the rock mass structure types are mostly scattered and fragmented, the stratum changes considerably, and the variation of rock and soil mechanical parameters is significant.

Keywords: slope stability, weak layer, safety factor, limit equilibrium method, stereographic method

Procedia PDF Downloads 267
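The groundwater sensitivity the abstract reports can be illustrated with the classical infinite-slope limit-equilibrium expression, FS = [c' + (γ − m·γw)·z·cos²β·tanφ'] / (γ·z·sinβ·cosβ), where m is the fraction of the failure depth below the water table. This is a textbook sketch with illustrative parameter values, not the DIMA mine's actual slope model.

```python
# Infinite-slope limit-equilibrium safety factor, showing how FS drops
# as the groundwater level (m) rises. Parameter values are illustrative.
import math

def safety_factor(c, phi_deg, gamma, z, beta_deg, m, gamma_w=9.81):
    """FS for an infinite slope.

    c        -- effective cohesion (kPa)
    phi_deg  -- effective friction angle (degrees)
    gamma    -- unit weight of the slope material (kN/m^3)
    z        -- depth of the slip surface (m)
    beta_deg -- slope angle (degrees)
    m        -- fraction of z below the water table (0 = dry, 1 = saturated)
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Dry vs. fully saturated slope (c = 20 kPa, phi = 30 deg, gamma = 22 kN/m^3):
print(round(safety_factor(20, 30, 22, 10, 35, 0.0), 2))  # dry
print(round(safety_factor(20, 30, 22, 10, 35, 1.0), 2))  # saturated
```

With these values the dry slope sits marginally above FS = 1 while the saturated slope falls well below it, echoing the abstract's finding that a rising water table substantially reduces stability.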
29097 Medicompills Architecture: A Mathematical Precise Tool to Reduce the Risk of Diagnosis Errors on Precise Medicine

Authors: Adriana Haulica

Abstract:

Powered by Machine Learning, Precise medicine is by now tailored to use genetic and molecular profiling, with the aim of optimizing the therapeutic benefits for cohorts of patients. As the majority of Machine Learning algorithms come from heuristics, the outputs have contextual validity. This is not very restrictive, in the sense that medicine itself is not an exact science. Meanwhile, the progress made in Molecular Biology, Bioinformatics, Computational Biology, and Precise Medicine, correlated with the huge amount of human biology data and the increase in computational power, opens new healthcare challenges. A more accurate diagnosis is needed, along with real-time treatments, by processing as much as possible of the available information. The purpose of this paper is to present a deeper vision for the future of Artificial Intelligence in Precise medicine. In fact, current Machine Learning algorithms use standard mathematical knowledge, mostly Euclidean metrics and standard computation rules. The loss of information arising from the classical methods prevents obtaining 100% evidence in the diagnosis process. To overcome these problems, we introduce MEDICOMPILLS, a new architectural concept tool for information processing in Precise medicine that delivers diagnosis and therapy advice. This tool processes poly-field digital resources: global knowledge related to biomedicine in a direct or indirect manner, but also technical databases, Natural Language Processing algorithms, and strong class optimization functions. As the name suggests, the heart of this tool is a compiler. The approach is completely new, tailored for omics and clinical data. Firstly, the intrinsic biological intuition is different from the well-known "needle in a haystack" approach usually used when Machine Learning algorithms have to process differential genomic or molecular data to find biomarkers. Also, even if the input is seized from various types of data, the working engine inside MEDICOMPILLS does not search for patterns as an integrative tool. This approach deciphers the biological meaning of the input data down to the metabolic and physiologic mechanisms, based on a compiler with grammars issued from bio-algebra-inspired mathematics. It translates input data into bio-semantic units with the help of contextual information, iteratively, until Bio-Logical operations can be performed on the basis of the "common denominator" rule. The rigorousness of MEDICOMPILLS comes from the structure of the contextual information on functions, built to be analogous to mathematical "proofs". The major impact of this architecture is expressed by the high accuracy of the diagnosis. Delivered as a multiple-conditions diagnostic, constituted by some main diseases along with unhealthy biological states, this format is highly suitable for therapy proposals and disease prevention. The use of the MEDICOMPILLS architecture is highly beneficial for the healthcare industry. The expectation is to generate a strategic trend in Precise medicine, making medicine more like an exact science and reducing the considerable risk of errors in diagnostics and therapies. The tool can be used by pharmaceutical laboratories for the discovery of new cures. It will also contribute to better design of clinical trials and speed them up.

Keywords: bio-semantic units, multiple conditions diagnosis, NLP, omics

Procedia PDF Downloads 74
29096 Unlocking the Potential of Phosphatic Wastes: Sustainable Valorization Pathways for Synthesizing Functional Metal-Organic Frameworks and Zeolites

Authors: Ali Mohammed Yimer, Ayalew H. Assen, Youssef Belmabkhout

Abstract:

This study delves into sustainable approaches for valorizing phosphatic wastes, specifically phosphate mining wastes and phosphogypsum, which are byproducts of phosphate industries and pose significant environmental challenges due to their accumulation. We propose a unified strategic synthesis method aimed at converting these wastes into hetero-functional porous materials. Our approach involves isolating the primary components of phosphatic wastes, such as CaO, SiO2 and Al2O3 to fabricate functional porous materials falling into two distinct classes. Firstly, alumina and silica components are extracted or isolated to produce zeolites (including CAN, GIS, SOD, FAU, and LTA), characterized by a Si/Al ratio of less than 5. Secondly, residual calcium is utilized to synthesize calcium-based metal–organic frameworks (Ca-MOFs) employing various organic linkers like Ca-BDC, Ca-BTC and Ca-TCPB (SBMOF-2), thereby providing flexibility in material design. Characterization techniques including XRD, SEM-EDX, FTIR, and TGA-MS affirm successful material assembly, while sorption analyses using N2, CO2, and H2O demonstrate the porosity of the materials. Particularly noteworthy is the water/alcohol separation potential exhibited by the Ca-BTC MOF, owing to its optimal pore aperture size (∼3.4 Å). To enhance replicability and scalability, detailed protocols for each synthesis step and specific conditions for each process are provided, ensuring that the methodology can be easily reproduced and scaled up for industrial applications. This synthetic transformation approach represents a valorization route for converting phosphatic wastes into extended porous structures, promising significant environmental and economic benefits.

Keywords: calcium-based metal-organic frameworks, low-silica zeolites, porous materials, sustainable synthesis, valorization

Procedia PDF Downloads 44
29095 A Study of Mortars with Granulated Blast Furnace Slag as Fine Aggregate and Its Influence on Properties of Burnt Clay Brick Masonry

Authors: Vibha Venkataramu, B. V. Venkatarama Reddy

Abstract:

Natural river sand is the most preferred fine aggregate in masonry mortars. Uncontrolled mining of sand from riverbeds for several decades has had detrimental effects on the environment, and several countries across the world have put strict restrictions on sand mining from riverbeds. However, in countries like India, the huge infrastructural boom has made the local construction industry look for alternative materials to sand. This study aims at understanding the suitability of granulated blast furnace slag (GBS) as a fine aggregate in masonry mortars. Apart from characterising the material properties of GBS, such as particle size distribution, pH, and chemical composition, tests were performed on mortars with GBS as the fine aggregate. Additionally, the properties of five-brick-tall, stack-bonded masonry prisms made with various types of GBS mortars were studied. Mortars with mix proportions of 1:0:6 (cement:lime:fine aggregate), 1:1:6, and 1:0:3 were considered for the study. Fresh and hardened properties of the mortars, such as flow and compressive strength, were studied. To understand the behaviour of GBS mortars in masonry, compressive strength and flexural bond strength tests were performed on masonry prisms made with different types of GBS mortars. Furthermore, the elastic properties of masonry with GBS mortars were studied under compression. For comparison, the properties of corresponding control mortars with natural sand as the fine aggregate, and of masonry prisms with sand mortars, were also studied under similar testing conditions. The study showed that the addition of GBS negatively influenced the flow of the mortars and positively influenced their compressive strength. The GBS mortars showed 20 to 25% higher compressive strength at 28 days of age compared to the corresponding control mortars. Furthermore, masonry made with GBS mortars showed nearly 10% higher compressive strength compared to the control specimens, but the impact of GBS on the flexural strength of the masonry was marginal.

Keywords: building materials, fine aggregate, granulated blast furnace slag in mortars, masonry properties

Procedia PDF Downloads 123
29094 Data Collection Based on a Questionnaire Survey in Hospital Emergencies

Authors: Nouha Mhimdi, Wahiba Ben Abdessalem Karaa, Henda Ben Ghezala

Abstract:

The methods identified for data collection are diverse: electronic media, focus group interviews and short-answer questionnaires [1]. The collection of poor-quality data, resulting for example from poorly designed questionnaires, the absence of good translators or interpreters, or the incorrect recording of data, can lead to conclusions that are not supported by the data, or to a focus only on the average effect of a program or policy. There are several ways to avoid or minimize the most frequent errors, including obtaining expert advice on the design or adaptation of data collection instruments, or using technologies that allow better "anonymity" in the responses [2]. In this context, we opted to collect good-quality data through a sizeable questionnaire-based survey on hospital emergencies, with the aim of improving emergency services and alleviating the problems encountered. In this paper, we present our study and detail the steps followed to collect relevant, consistent and practical data.

Keywords: data collection, survey, questionnaire, database, data analysis, hospital emergencies

Procedia PDF Downloads 112
29093 A Semi-supervised Classification Approach for Trend Following Investment Strategy

Authors: Rodrigo Arnaldo Scarpel

Abstract:

Trend following is a widely accepted investment strategy that adopts a rule-based trading mechanism rather than striving to predict market direction or relying on information gathering to decide when to buy and when to sell a stock. In trend following, one responds to market movements that have recently happened or are currently happening, rather than to what will happen. The optimal outcome of a trend following strategy is to catch a bull market at its early stage, ride the trend, and liquidate the position at the first evidence of the subsequent bear market. Applying the strategy requires finding the trend and identifying trade signals. In order to avoid false signals, i.e., to distinguish short-, mid- and long-term fluctuations and to separate noise from real changes in the trend, most academic works rely on moving averages and other technical analysis indicators, such as the moving average convergence divergence (MACD) and the relative strength index (RSI), to uncover intelligible stock trading rules that follow the trend following philosophy. Recently, some works have applied machine learning techniques for trading rule discovery. In those works, rule construction is based on evolutionary learning, which aims to adapt the rules to the current environment and searches for the globally optimal rules in the search space. In this work, instead of focusing on machine learning techniques for creating trading rules, a time series trend classification employing a semi-supervised approach was used to identify early both the beginning and the end of upward and downward trends. Such a classification model can be employed to generate trade signals: if an up-trend (down-trend) is identified, a buy (sell) signal is generated.
Semi-supervised learning is used for model training when only part of the data is labeled; semi-supervised classification aims to train a classifier from both the labeled and unlabeled data such that it outperforms a supervised classifier trained only on the labeled data. To illustrate the proposed approach, daily trade information of the São Paulo Exchange Composite index (IBOVESPA), including the open, high, low and closing values and volume, was employed from January 1, 2000 to December 31, 2022. Over this period, consistent upward or downward changes in price were visually identified and labeled, leaving the remaining days (those without a consistent change in price) unlabeled. For training the classification model, a pseudo-label semi-supervised learning strategy was used, employing different technical analysis indicators as features. The core of this strategy is to use the unlabeled data to generate pseudo-labels for supervised training. The achieved results were evaluated using the annualized return, the excess return, and the Sortino and Sharpe indicators. Over the evaluated period, the obtained results were very consistent and can be considered promising for generating the intended trading signals.
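As a rough illustration of the pseudo-label workflow (not the paper's actual pipeline), the sketch below labels only days inside clearly rising or falling stretches of a synthetic price series and lets scikit-learn's SelfTrainingClassifier pseudo-label the rest; the moving-average features and thresholds are illustrative stand-ins for the MACD/RSI-style indicators used in the study:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(0)
n = 500
close = 100 + np.cumsum(rng.normal(0, 1, n))  # synthetic closing prices

def sma(x, w):
    # simple moving average (edge effects ignored for brevity)
    return np.convolve(x, np.ones(w) / w, mode="same")

# Indicator-style features: deviations from short/long moving averages
features = np.column_stack([
    close - sma(close, 10),
    close - sma(close, 50),
    np.gradient(close),
])

# Label only days inside consistent trends; -1 marks unlabeled days
labels = np.full(n, -1)
trend = sma(np.gradient(close), 20)
labels[trend > 0.2] = 1   # up-trend days -> buy class
labels[trend < -0.2] = 0  # down-trend days -> sell class

# Pseudo-label training: confident predictions on unlabeled days are
# added to the training set iteratively
model = SelfTrainingClassifier(RandomForestClassifier(random_state=0),
                               threshold=0.8)
model.fit(features, labels)
signals = model.predict(features)  # 1 -> buy signal, 0 -> sell signal
```

In a real application the labeled stretches would come from the visual identification step described above, and the features would be the actual technical indicators.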

Keywords: evolutionary learning, semi-supervised classification, time series data, trading signals generation

Procedia PDF Downloads 91
29092 Use of Remote Sensing for Seasonal and Temporal Monitoring in Wetlands: A Case Study of Akyatan Lagoon

Authors: A. Cilek, S. Berberoglu, A. Akin Tanriover, C. Donmez

Abstract:

Wetlands have important functions in protecting human life and biological diversity, besides being potential resources for exploitation. Observing the changes in these sensitive areas is important for data collection and for sound planning for the future. Remote sensing and Geographic Information Systems are increasingly used for environmental studies such as biotope mapping and habitat monitoring. Akyatan Lagoon, one of the most important wetlands in Turkey, has faced serious threats from agricultural activities in recent years. In this study, seasonal and temporal changes in the wetland system between 1985 and 2015 were determined using remotely sensed data and Geographic Information Systems (GIS). The research method is based on classifying and mapping biotopes in the study area. The natural biotope types identified were coastal sand dunes, salt marshes, river beds, coastal woods, lakes, and lagoons.

Keywords: biotope mapping, GIS, remote sensing, wetlands

Procedia PDF Downloads 395
29091 Federated Learning in Healthcare

Authors: Ananya Gangavarapu

Abstract:

Convolutional Neural Network (CNN) based models are providing diagnostic capabilities on par with medical specialists in many specialty areas. However, collecting medical data for training purposes is very challenging because of increased regulations around data collection and privacy concerns around personal health data. Gathering the data becomes even more difficult if the capture devices are edge-based mobile devices (such as smartphones) with weak wireless connectivity in rural or remote areas. In this paper, I highlight the federated learning approach as a way to mitigate these data privacy and security issues.
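To make the core idea concrete, here is a minimal, hypothetical sketch of federated averaging (FedAvg) with a plain linear model in NumPy; the simulated hospitals, data, and hyperparameters are invented for illustration, and the raw data never leaves each simulated client:

```python
import numpy as np

def client_update(weights, X, y, lr=0.1, epochs=5):
    # Local gradient-descent steps on private data (data stays on the client)
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    # Server aggregates only model parameters: weighted mean of client models
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):  # three simulated hospitals with private datasets
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):  # communication rounds
    updates = [client_update(w, X, y) for X, y in clients]
    w = federated_average(updates, [len(y) for _, y in clients])
print(w)  # converges toward true_w without pooling the raw data
```

A production system would replace the linear model with a CNN and add secure aggregation, but the parameter-averaging loop is the same pattern.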

Keywords: deep learning in healthcare, data privacy, federated learning, training in distributed environment

Procedia PDF Downloads 144
29090 Recreation and Environmental Quality of Tropical Wetlands: A Social Media Based Spatial Analysis

Authors: Michael Sinclair, Andrea Ghermandi, Sheela A. Moses, Joseph Sabu

Abstract:

Passively crowdsourced data, such as geotagged photographs from social media, are an opportunistic source of location-based, time-specific behavioral data for ecosystem services analysis. Such data have innovative applications for environmental management and protection, which are replicable at wide spatial scales and in the context of both developed and developing countries. Here we test one such innovation, based on the analysis of the metadata of online geotagged photographs, to investigate the provision of recreational services by the entire network of wetland ecosystems in the state of Kerala, India. We estimate visitation to individual wetlands state-wide and extend, for the first time to a developing region, the emerging application of cultural ecosystem services modelling using data from social media. The impacts of restoring wetland areal extent and improving water quality are explored as a means to inform more sustainable management strategies. Findings show that improving water quality to a level suitable for the preservation of wildlife and fisheries could increase annual visits by 350,000, an increase of 13% in wetland visits state-wide, while restoring previously encroached wetland area in the Ashtamudi and Vembanad lakes alone, two large coastal Ramsar wetlands in Kerala, could result in a 7% increase in annual visits, corresponding to 49,000 visitors. We discuss how passive crowdsourcing of social media data has the potential to improve current ecosystem service analyses and environmental management practices in the context of developing countries.
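As an illustration of how visitation can be estimated from geotagged photo metadata, the sketch below computes the widely used "photo-user-days" proxy, in which one user photographing at one site on one day counts as one visit; the user IDs, timestamps and records are fabricated (only the wetland names are taken from the abstract), and the field names are assumptions:

```python
import pandas as pd

# Fabricated photo metadata records (in practice: scraped from a photo API)
photos = pd.DataFrame({
    "user_id":  ["u1", "u1", "u2", "u3", "u3", "u3"],
    "wetland":  ["Vembanad", "Vembanad", "Vembanad",
                 "Ashtamudi", "Ashtamudi", "Vembanad"],
    "taken_at": pd.to_datetime([
        "2022-01-05 09:00", "2022-01-05 11:30",  # same user, same day -> 1 visit
        "2022-01-05 10:00",
        "2022-02-10 08:00", "2022-02-11 08:00",  # same user, two days -> 2 visits
        "2022-02-11 12:00",
    ]),
})

# One (user, site, day) triple = one photo-user-day
photos["day"] = photos["taken_at"].dt.date
pud = (photos.drop_duplicates(["user_id", "wetland", "day"])
             .groupby("wetland").size())
print(pud)  # photo-user-days per wetland
```

In published applications this proxy is then calibrated against on-site visitor counts to convert photo-user-days into estimated annual visits.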

Keywords: coastal wetlands, cultural ecosystem services, India, passive crowdsourcing, social media, wetland restoration

Procedia PDF Downloads 158
29089 “The Unbearable Lightness of Being” Book as an Interdisciplinary Study Basis for Students’ Learning Process about Love and Politics in Old Communist Czechoslovakia

Authors: Clarissa Valença Travassos da Silva

Abstract:

In this article, the book “The Unbearable Lightness of Being” by the Czech writer Milan Kundera is studied. The main objective is to provide an interdisciplinary study basis for students around the world about love and politics in old communist Czechoslovakia. Love is presented through the discussion of the relationship between Tomas and Tereza and the discovery of true love. Furthermore, the Russian invasion of Czechoslovakia and its outcomes for the characters are debated, all of this related to the contradiction between lightness and heaviness in life. For the production of this didactic material, the researcher based her work on the original book, on Milan Kundera’s interviews, and on Friedrich Nietzsche, Zygmunt Bauman and George Orwell, among Brazilian and international articles on the issue.

Keywords: lightness, heaviness, Russia, Czechoslovakia, love

Procedia PDF Downloads 403
29088 Fractional, Component and Morphological Composition of Ambient Air Dust in the Areas of Mining Industry

Authors: S.V. Kleyn, S.Yu. Zagorodnov, А.А. Kokoulina

Abstract:

Technogenic emissions of the mining and processing complex are characterized by a high content of chemical components and solid dust particles. However, each industrial enterprise and its surrounding area have features that require refinement and parameterization. Numerous studies have shown the negative impact of fine dust PM10 and PM2.5 on health, as well as the possibility of toxic components, including heavy metals, being absorbed by dust particles. The target of the study was the quantitative assessment of the fractional and particle size composition of ambient air dust in the area of impact of a primary magnesium production complex. We also describe the morphological features of the dust particles. Study methods. To identify the dust emission sources, an analysis of the production process was carried out. The particulate composition of the emissions was measured using a Microtrac S3500 laser particle analyzer (covered particle size range: 0.02 to 2,000 µm; the original text's "2000 km" is assumed to be a unit error). Particle morphology and component composition were established by electron microscopy using a high-resolution scanning microscope (magnification 5 to 300,000 times) with the S3400N 'HITACHI' X-ray fluorescence device. The chemical composition was identified by X-ray analysis of the samples using an XRD-700 'Shimadzu' X-ray diffractometer. The dust pollution level was determined using model calculations of the dispersion of emissions in the atmosphere, verified by instrumental studies. Results of the study. The results demonstrated that the dust emissions of different technological processes are heterogeneous and have a complicated fractional structure. The percentage of particle sizes up to 2.5 µm inclusive ranged from 0.00 to 56.70%; up to 10 µm inclusive, from 0.00 to 85.60%; and greater than 10 µm, from 14.40% to 100.00%. During microscopy, the presence of nanoscale particles was detected.
The studied dust particles have round, irregular, cubic and aggregate shapes. The composition of the dust includes magnesium, sodium, potassium, calcium, iron and chlorine. On the basis of the obtained results, model calculations of the dispersion of dust emissions were performed and the areas of fine dust PM10 and PM2.5 distribution were established. It was found that the fine fractions PM10 and PM2.5 are dispersed over large distances, beyond the border of the industrial site of the enterprise. The population living near the enterprise is exposed to the risk of diseases associated with dust exposure. The data were transferred to the economic entity to support decisions on measures to minimize the risks, and the exposure and health risk indicators are used to provide targeted health and preventive care to citizens living in the area of the facility's negative impact.
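The dispersion-modelling step can be illustrated with a basic Gaussian plume estimate of ground-level concentration downwind of a stack. This is a generic textbook formulation, not the model used in the study; the sigma power laws are rough neutral-stability fits, and the emission rate, wind speed and stack height are illustrative assumptions:

```python
import math

def plume_concentration(Q, u, H, x, y=0.0):
    """Ground-level concentration (g/m^3) at downwind distance x (m) and
    crosswind offset y (m); Q is emission rate (g/s), u wind speed (m/s),
    H effective stack height (m)."""
    # Illustrative neutral-stability (class D) dispersion coefficient fits
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
    return (Q / (2 * math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * 2 * math.exp(-H**2 / (2 * sigma_z**2)))  # ground reflection

# Centreline concentration profile for an assumed 10 g/s dust source,
# 4 m/s wind, 30 m stack: the plume touches down and then dilutes
profile = [plume_concentration(Q=10.0, u=4.0, H=30.0, x=x)
           for x in (200, 500, 1000, 2000)]
print(profile)
```

A regulatory assessment would instead use a validated model with site meteorology, but this captures why fine fractions can remain significant well beyond the industrial site boundary.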

Keywords: dust emissions, exposure assessment, PM10, PM2.5

Procedia PDF Downloads 262
29087 A Comparative Study on the Dimensional Error of 3D CAD Model and SLS RP Model for Reconstruction of Cranial Defect

Authors: L. Siva Rama Krishna, Sriram Venkatesh, M. Sastish Kumar, M. Uma Maheswara Chary

Abstract:

Rapid Prototyping (RP) is a technology that produces models and prototype parts from 3D CAD model data, CT/MRI scan data, and data created by 3D object digitizing systems. There are several RP processes, such as Stereolithography (SLA), Solid Ground Curing (SGC), Selective Laser Sintering (SLS), Fused Deposition Modelling (FDM) and 3D Printing (3DP); among them, the SLS and FDM processes are used to fabricate patterns for custom cranial implants. RP technology is useful in both engineering and biomedical applications: in engineering for product design, tooling and manufacture, and in biomedicine for the design and development of medical devices, instruments, prosthetics and implants, as well as for planning complex surgical operations. The traditional approach limits the full appreciation of the movements of various bony structures, and with the custom implants produced this way it is difficult to measure the anatomy of the parts and to analyse changes in facial appearance accurately. Cranioplasty is the surgical correction of a defect in the cranial bone by implanting a metal or plastic replacement to restore the missing part. This paper presents a comparative study of the dimensional error of 3D CAD and SLS RP models for the reconstruction of a cranial defect, comparing the virtual CAD model with the physical RP model of the defect.
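The dimensional-error comparison itself reduces to the percentage deviation of each measured RP dimension from its CAD nominal. The sketch below uses invented dimension names and values, not the paper's measurements:

```python
# Nominal dimensions from the 3D CAD model vs. measured SLS part dimensions
# (all values in mm; purely illustrative)
cad_dims_mm = {"length": 120.00, "width": 95.50, "thickness": 4.20}
sls_dims_mm = {"length": 119.62, "width": 95.81, "thickness": 4.27}

# Percentage dimensional error: (measured - nominal) / nominal * 100
errors = {
    name: 100.0 * (sls_dims_mm[name] - nominal) / nominal
    for name, nominal in cad_dims_mm.items()
}
for name, err in errors.items():
    print(f"{name}: {err:+.2f} %")
```

In practice the measurements would come from a coordinate measuring machine or scan comparison over many landmarks on the cranial geometry, but the error metric is this same ratio.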

Keywords: rapid prototyping, selective laser sintering, cranial defect, dimensional error

Procedia PDF Downloads 326
29086 Caregiver Experiences of Attachment-Based Interventions

Authors: Mikaela E. Flood, Elaine Greidanus

Abstract:

This study examines how caregivers construct and interpret their experiences in applying attachment-based interventions, guided by the research question: How do caregivers construct and interpret their experiences when attempting to apply attachment-based interventions in their relationships? Using a constructivist paradigm, this qualitative study explores caregivers' experiences through semi-structured interviews with five individuals, focusing on how caregivers perceive, implement, and reflect upon attachment-based interventions, and on the challenges and successes they encounter. Thematic analysis of the interview data seeks to reveal recurring patterns and themes, offering insights into the practical implications of attachment theory and interventions within caregiver contexts. The findings may impact the field by integrating theoretical insights with practical applications: they may inform therapeutic approaches, support services, and how attachment-based interventions are implemented, thereby enhancing caregivers' capacity to foster secure attachments. Moreover, this research may contribute to existing bodies of knowledge by providing empirical support and a deeper understanding of attachment theory's application in real-world caregiving scenarios, and may identify avenues for further exploration of the application and implementation of attachment-based interventions. Ultimately, this research aims to advance both the theoretical understanding and the practical application of attachment-based interventions for enhancing relationship dynamics and emotional well-being in caregiving settings.

Keywords: children, attachment, intervention, caregiver

Procedia PDF Downloads 18
29085 The Utilization of Big Data in Knowledge Management Creation

Authors: Daniel Brian Thompson, Subarmaniam Kannan

Abstract:

The volume of knowledge in the world, and within the repositories of organizations, has reached immense proportions and is constantly increasing as time goes by. To cope with this growth, Big Data implementations and algorithms are utilized to obtain new or enhanced knowledge for decision-making. The transition from data to knowledge provides transformational changes that yield tangible benefits to those implementing these practices. Today, many organizations derive knowledge from observations and intuitions, and this information is translated into best practices for knowledge acquisition, generation and sharing. Through the widespread use of Big Data, the main intention is to provide information that has been cleaned and analyzed so as to nurture tangible insights that an organization can apply to its knowledge-creation practices, based on facts and figures. The translation of data into knowledge generates value for an organization, enabling decisive decisions about adopting best practices. Without a strong foundation of knowledge and Big Data, businesses cannot grow or strengthen their position in a competitive environment.

Keywords: big data, knowledge management, data driven, knowledge creation

Procedia PDF Downloads 118
29084 Screening of Potential Cytotoxic Activities of Some Medicinal Plants of Saudi Arabia

Authors: Syed Farooq Adil, Merajuddinkhan, Mujeeb Khan, Hamad Z. Alkhathlan

Abstract:

Phytochemicals from plant extracts are an important source of natural products that have demonstrated excellent cytotoxic activities. However, plants of different origins exhibit diverse chemical compositions and bioactivities; therefore, the discovery of plant-based anticancer agents from different parts of the world remains challenging. In this study, methanolic extracts of different parts of 11 plants from Saudi Arabia were tested in vitro for their anticancer potential on a human liver cancer cell line (HepG2). Plants from the Asteraceae, Resedaceae, and Polygonaceae families were chosen on the basis of locally available ethnobotanical data and their medicinal properties. Among the 12 tested extract samples, three samples, obtained from Artemisia monosperma stem, Ochradenus baccatus aerial parts, and Pulicaria glutinosa stem, demonstrated interesting cytotoxic activities, with cell viabilities of 29.3%, 28.4% and 24.2%, respectively, whereas four plant extracts, including Calendula arvensis aerial parts, Scorzonera musilii whole plant, and A. monosperma leaves, showed moderate anticancer properties, with cell viabilities ranging from 11.9 to 16.7%. The remaining extracts showed poor cytotoxic activities. Subsequently, GC-MS analysis of the methanolic extracts of the four most active plants, C. comosum, O. baccatus, P. glutinosa and A. monosperma, detected the presence of 41 phytomolecules, among which 3-(4-hydroxyphenyl)propionitrile (1), 8,11-octadecadiynoic acid methyl ester (2), 6,7-dimethoxycoumarin (3), and 1-(2-hydroxyphenyl)ethanone (4) were found to be the lead compounds of C. comosum, O. baccatus, P. glutinosa and A. monosperma, respectively.

Keywords: medicinal plants, Asteraceae, Polygonaceae, HepG2

Procedia PDF Downloads 129
29083 The Role of Healthcare Informatics in Combating the COVID-19 Pandemic

Authors: Philip Eappen, Narasimha Rao Vajjhala

Abstract:

This chapter examines how healthcare organizations harnessed innovative healthcare informatics to navigate the challenges posed by the COVID-19 pandemic, addressing critical needs and improving care delivery. The pandemic's unprecedented demands necessitated the adoption of new and advanced tools to manage healthcare operations more effectively. Informatics solutions played a crucial role in facilitating the smooth functioning of healthcare systems during this crisis and are anticipated to remain central to future healthcare management. Technologies such as telemedicine helped healthcare professionals minimize exposure to COVID-19 patients, thereby reducing infection risks within healthcare facilities. This chapter explores a range of informatics applications utilized worldwide, including telemedicine, AI-driven solutions, big data analytics, drones, robots, and digital platforms for drug delivery, all of which enabled remote patient care and enhanced healthcare accessibility and safety during the pandemic.

Keywords: healthcare informatics, COVID-19 pandemic, telemedicine, AI-driven healthcare, big data analytics, remote patient care, digital health platforms

Procedia PDF Downloads 13