Search results for: graph vulnerability measure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4209

4089 Drug-Drug Interaction Prediction in Diabetes Mellitus

Authors: Rashini Maduka, C. R. Wijesinghe, A. R. Weerasinghe

Abstract:

Drug-drug interactions (DDIs) can occur when two or more drugs are taken together. DDIs have become a serious health issue due to adverse drug effects. In vivo and in vitro methods for identifying DDIs are time-consuming and costly, so in-silico approaches are preferred for DDI identification. Most machine learning models for DDI prediction use chemical and biological drug properties as features. However, some drug features are unavailable or costly to extract, which makes automatic feature engineering preferable. Furthermore, people with diabetes often suffer from other diseases and take more than one medicine together, so adverse drug effects may occur in diabetic patients and cause unpleasant reactions in the body. In this study, we present a model with a graph convolutional autoencoder and a graph decoder, using a dataset from DrugBank version 5.1.3. The main objective of the model is to identify unknown interactions between antidiabetic drugs and the drugs taken by diabetic patients for other diseases. We relied on automatic feature engineering, using known DDIs as the only input to the model. Our model achieved 0.86 in AUC and 0.86 in AP.
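As a rough sketch of the architecture described above (not the authors' implementation; the adjacency matrix, identity features, and untrained weights below are purely illustrative), a one-layer graph convolutional encoder with an inner-product decoder can be outlined as:

```python
import numpy as np

def normalize_adj(a):
    """Symmetrically normalize adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    a_hat = a + np.eye(a.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    return a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_encoder(a_norm, x, w):
    """One-layer graph convolution: ReLU(A_norm @ X @ W)."""
    return np.maximum(a_norm @ x @ w, 0.0)

def inner_product_decoder(z):
    """Score every drug pair as sigmoid(z_i . z_j)."""
    return 1.0 / (1.0 + np.exp(-(z @ z.T)))

# Toy known-DDI adjacency for 4 drugs (1 = known interaction).
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.eye(4)                      # identity features: no hand-engineered inputs
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 2))        # untrained weights, illustration only

Z = gcn_encoder(normalize_adj(A), X, W)
scores = inner_product_decoder(Z)  # scores[i, j] ~ P(drug i interacts with drug j)
```

In a trained model, W would be fit so that reconstructed scores for known DDIs are high, and the off-diagonal scores for unobserved pairs serve as interaction predictions.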

Keywords: drug-drug interaction prediction, graph embedding, graph convolutional networks, adverse drug effects

Procedia PDF Downloads 69
4088 Vulnerability Assessment of Vertically Irregular Structures during Earthquake

Authors: Pranab Kumar Das

Abstract:

A vulnerability assessment of buildings with irregularity in the vertical direction has been carried out in this study. The construction of vertically irregular buildings is increasing amid the fast urbanization of developing countries, including India. During two reconnaissance-based surveys performed after the Nepal earthquake of 2015 and the Imphal (India) earthquake of 2016, it was observed that many structures were damaged due to vertically irregular configurations. These irregular buildings must perform safely during seismic excitation, so there is an urgent need to establish their actual vulnerability so that remedial measures can be taken to protect them during natural hazards such as earthquakes. This assessment will be helpful for India as well as for other developing countries. A substantial body of research has addressed the vulnerability of plan-asymmetric buildings, but much less effort has been devoted to the earthquake vulnerability of vertically irregular buildings. Irregularity in the vertical direction may be caused by an irregular distribution of mass or stiffness, or by a geometrically irregular configuration. Detailed analysis of such structures, particularly non-linear/pushover analysis for performance-based design, is challenging. The present paper considers a number of models of irregular structures. Building models made of both reinforced concrete and brick masonry are considered for the sake of generality. The analyses are performed with both the finite element method and computational methods. The study, as a whole, may help to arrive at a reasonably good estimate of, and insight into, the fundamental and other natural periods of such vertically irregular structures. The ductility demand, storey drift, and seismic response study help to identify the locations of critical stress concentration. In summary, this paper is a modest step toward understanding the vulnerability of vertically irregular structures and framing guidelines for them.
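To illustrate the natural-period estimation mentioned above, a minimal lumped-mass shear-building eigenanalysis can be sketched as follows (storey masses and stiffnesses are illustrative, not the paper's models; a heavier top storey stands in for vertical mass irregularity):

```python
import numpy as np

def shear_building_periods(masses, stiffnesses):
    """Natural periods (s) of a lumped-mass shear building, longest first.

    masses[i] is the storey mass (kg) and stiffnesses[i] the storey
    stiffness (N/m), numbered from the bottom; solves K phi = w^2 M phi.
    """
    n = len(masses)
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] += stiffnesses[i]
        if i + 1 < n:
            K[i, i] += stiffnesses[i + 1]
            K[i, i + 1] = K[i + 1, i] = -stiffnesses[i + 1]
    M_inv = np.diag(1.0 / np.asarray(masses, dtype=float))
    w2 = np.sort(np.linalg.eigvals(M_inv @ K).real)   # squared circular frequencies
    return 2.0 * np.pi / np.sqrt(w2)                  # ascending w2 -> descending T

regular = shear_building_periods([2e5] * 3, [1e8] * 3)
irregular = shear_building_periods([2e5, 2e5, 6e5], [1e8] * 3)  # heavy top storey
```

As expected from Rayleigh's principle, the mass-irregular building has a longer fundamental period than the regular one; a full study would extend this with pushover analysis for drift and ductility demand.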

Keywords: ductility, stress concentration, vertically irregular structure, vulnerability

Procedia PDF Downloads 208
4087 Assessing the Efficacy of Network Mapping, Vulnerability Scanning, and Penetration Testing in Enhancing Security for Academic Networks

Authors: Kenny Onayemi

Abstract:

In an era where academic institutions increasingly rely on information technology, the security of academic networks has emerged as a paramount concern. This comprehensive study delves into the effectiveness of security practices, including network mapping, vulnerability scanning, and penetration testing, within academic networks. Leveraging data from surveys administered to faculty, staff, IT professionals, and IT students in the university, the study assesses their familiarity with these practices, perceived effectiveness, and frequency of implementation. The findings reveal that a significant portion of respondents exhibit a strong understanding of network mapping, vulnerability scanning, and penetration testing, highlighting the presence of knowledgeable professionals within academic institutions. Additionally, active scanning using network scanning tools and automated vulnerability scanning tools emerges as a highly effective method. However, concerns arise as respondents indicate that academic institutions conduct these practices rarely or never. Notably, many respondents have reported significant vulnerabilities or security incidents discovered through these security measures within their institution. This study concludes with recommendations to enhance network security awareness and practices among faculty, staff, IT personnel, and students, ultimately fortifying the security posture of academic networks in the digital age.
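A minimal illustration of the active-scanning idea discussed above is a TCP connect probe (a toy sketch, not a substitute for dedicated scanning tools, and only appropriate for hosts one is authorized to test; the demo probes a listener it creates itself so the result is deterministic):

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Minimal network-mapping probe: report which TCP ports accept a connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Demo against a listener we control, so the result is deterministic.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))        # OS picks a free port
listener.listen(1)
port = listener.getsockname()[1]
found = scan_ports("127.0.0.1", [port])
listener.close()
```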

Keywords: network security, academic networks, vulnerability scanning, penetration testing, information security

Procedia PDF Downloads 22
4086 The Impact of Karst Structures on the Urban Environment in Semi-Arid Area

Authors: Benhammadi Hocine, Chaffai Hicham

Abstract:

Urban development often depends on adequate land for expansion, but sometimes these areas are vulnerable. This is the case in karst regions, characterized by carbonate geological formations marked by the presence of cavities and cracks. In the Cheria area, climate variability marked by a growing shortage of rainfall has increased the vulnerability of these karst structures. This vulnerability has led to the appearance of collapse phenomena in both agricultural and urban areas. Two phenomena have been proposed to explain the collapses: the first is attributed to a filling process in the cavities, and the second to a weakening of the resistance of the limestone slab, which fails through a shear mechanism. In urban areas, the weight of buildings has increased the load on the limestone slab and accelerated collapse. The analysis of this environmental process forms the context of our modest work, after which we indicate appropriate methods for a management policy of urban expansion. Such preventive (upstream) management is much less expensive than remedial (downstream) solutions, which are needed after the event and are sometimes ineffective.

Keywords: Cheria, urban, climate variability, vulnerability karst collapse, extension, management

Procedia PDF Downloads 439
4085 Scheduling in Cloud Networks Using Chakoos Algorithm

Authors: Masoumeh Ali Pouri, Hamid Haj Seyyed Javadi

Abstract:

Nowadays, cloud processing is one of the important issues in information technology. Since scheduling a task graph is an NP-hard problem, approaches based on nondeterministic methods such as evolutionary computation, mostly genetic and cuckoo algorithms, are effective. Therefore, an efficient algorithm has been proposed for scheduling task graphs to obtain an appropriate schedule with minimum time. In this algorithm, the new approach is based on shortening the length of the critical path and reducing the cost of communication. Finally, the results obtained from the implementation of the presented method show that this algorithm performs the same as other algorithms on graphs without communication cost, and performs quicker and better than algorithms such as DSC and MCP on graphs involving communication cost.
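The critical-path length that the algorithm tries to shorten can be computed, for a toy task graph with optional communication costs (illustrative values, not the paper's benchmark graphs), as:

```python
from collections import defaultdict

def critical_path_length(durations, edges, comm_cost=None):
    """Length of the longest (critical) path through a DAG of tasks.

    durations[t] is the execution time of task t; edges is a list of
    (u, v) precedence pairs; comm_cost maps an edge to its communication
    delay, the quantity a scheduler tries to hide by co-locating tasks.
    """
    comm_cost = comm_cost or {}
    succ = defaultdict(list)
    indeg = {t: 0 for t in durations}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    order = [t for t in durations if indeg[t] == 0]   # Kahn's algorithm
    for u in order:                                   # list grows while iterating
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                order.append(v)
    est = {t: 0 for t in durations}                   # earliest start times
    for u in order:
        for v in succ[u]:
            est[v] = max(est[v], est[u] + durations[u] + comm_cost.get((u, v), 0))
    return max(est[t] + durations[t] for t in durations)

tasks = {"A": 2, "B": 3, "C": 2, "D": 1}
deps = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]
no_comm = critical_path_length(tasks, deps)                      # path A -> B -> D
with_comm = critical_path_length(tasks, deps, {("B", "D"): 2})   # delay on B -> D
```

Mapping the tasks on the B-to-D path to the same processor would remove that edge's communication delay, shortening the schedule, which is the effect the cuckoo-style search exploits.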

Keywords: cloud computing, scheduling, tasks graph, chakoos algorithm

Procedia PDF Downloads 35
4084 Feature Engineering Based Detection of Buffer Overflow Vulnerability in Source Code Using Deep Neural Networks

Authors: Mst Shapna Akter, Hossain Shahriar

Abstract:

One of the most important challenges in the field of software code audit is the presence of vulnerabilities in software source code. Every year, more and more software flaws are found, either internally in proprietary code or revealed publicly. These flaws are highly likely to be exploited and lead to system compromise, data leakage, or denial of service. C and C++ open-source code are now available for creating a large-scale, machine-learning system for function-level vulnerability identification. We assembled a sizable dataset of millions of open-source functions that point to potential exploits. We developed an efficient and scalable vulnerability detection method based on deep neural network models that learn features extracted from the source code. The source code is first converted into a minimal intermediate representation to remove pointless components and shorten dependencies. Moreover, we keep the semantic and syntactic information using state-of-the-art word embedding algorithms such as GloVe and fastText. The embedded vectors are subsequently fed into deep learning models such as LSTM, BiLSTM, LSTM-Autoencoder, word2vec, BERT, and GPT-2 to classify the possible vulnerabilities. Furthermore, we proposed a neural network model that can overcome issues associated with traditional neural networks. Evaluation metrics such as F1 score, precision, recall, accuracy, and total execution time have been used to measure the performance. We made a comparative analysis between results derived from features containing a minimal text representation and those containing semantic and syntactic information. We found that all of the deep learning models provide comparatively higher accuracy when we use semantic and syntactic information as the features, but require longer execution time, as the word embedding algorithm adds complexity to the overall system.
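A highly simplified sketch of the front of this pipeline, with a crude stand-in for the minimal intermediate representation and untrained stand-ins for the embeddings and classifier (all names, weights, and the snippet are illustrative, not the authors' code):

```python
import re
import numpy as np

def minimal_representation(source):
    """Crude intermediate representation: drop comments, then tokenize."""
    source = re.sub(r"/\*.*?\*/|//[^\n]*", " ", source, flags=re.S)
    return re.findall(r"[A-Za-z_]\w*|\S", source)

def embed_and_score(tokens, vocab, emb, w, b=0.0):
    """Average token embeddings, then apply a logistic layer.

    Stands in for GloVe/fastText vectors feeding an LSTM-style classifier.
    """
    vecs = np.stack([emb[vocab[t]] for t in tokens if t in vocab])
    logit = vecs.mean(axis=0) @ w + b
    return 1.0 / (1.0 + np.exp(-logit))   # P(function is vulnerable)

snippet = "char buf[8]; /* fixed buffer */ strcpy(buf, user_input);"
tokens = minimal_representation(snippet)
vocab = {t: i for i, t in enumerate(sorted(set(tokens)))}
rng = np.random.default_rng(1)
emb = rng.normal(size=(len(vocab), 4))   # untrained embeddings, illustration only
score = embed_and_score(tokens, vocab, emb, rng.normal(size=4))
```

In the real system, the embedding table would come from GloVe or fastText trained on code, and the mean-pool-plus-logistic layer would be replaced by a sequence model such as a BiLSTM.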

Keywords: cyber security, vulnerability detection, neural networks, feature extraction

Procedia PDF Downloads 49
4083 Assessing Livelihood Vulnerability to Climate Change and Adaptation Strategies in Rajanpur District, Pakistan

Authors: Muhammad Afzal, Shahbaz Mushtaq, Duc-Anh-An-Vo, Kathryn Reardon Smith, Thanh Ma

Abstract:

Climate change has become one of the most challenging environmental issues of the 21st century. Climate change-induced natural disasters, especially floods, are major drivers of livelihood vulnerability, impacting millions of individuals worldwide. Evaluating and mitigating the effects of floods requires an in-depth understanding of the relationship between vulnerability and livelihood capital assets. Using an integrated approach combining the sustainable livelihood framework and systems thinking, the study developed a conceptual model of a generalized livelihood system in District Rajanpur, Pakistan. The model visualizes the livelihood vulnerability system as a whole and identifies the key feedback loops likely to influence livelihood vulnerability. The study suggests that such conceptual models provide effective communication and understanding tools for stakeholders and decision-makers to anticipate the problem and design appropriate policies. The model can also serve as an evaluation technique for rural livelihood policy and identify key systemic interventions. The key finding of the study is that household income, health, and education are the major factors behind the livelihood vulnerability of the rural poor of District Rajanpur. The Pakistani government has tried to reduce the livelihood vulnerability of the region through different income, health, and education programs, but many changes are still required to make these programs more effective, especially during floods. The government has provided only cash to vulnerable and marginalized families through income support programs; this study suggests that, along with the cash, the government must provide seed storage facilities and access to crop insurance for farmers. Similarly, the government should establish basic health units in villages and arrange frequent visits by mobile medical vans with advanced medical laboratory facilities during and after floods.

Keywords: livelihood vulnerability, rural communities, flood, sustainable livelihood framework, system dynamics, Pakistan

Procedia PDF Downloads 24
4082 Mental Vulnerability and Coping Strategies as a Factor for Academic Success for Pupils with Special Education Needs

Authors: T. Dubayova

Abstract:

Slovak as well as foreign authors believe that the influence of non-cognitive factors on a student's academic success or failure is unquestionable. The aim of this paper is to establish a link between the mental vulnerability and coping strategies used by 4th-grade elementary school students in dealing with stressful situations and their academic performance, which was used as a simple quantitative indicator of academic success. The research sample consists of 320 students representing the standard population and 60 students with special education needs (SEN), who were assessed with the Strengths and Difficulties Questionnaire (SDQ) by their teachers and who themselves filled in the Children's Coping Strategies Checklist (CCSC-R1). Students with SEN showed a markedly higher frequency of mental vulnerability (34.5%) than students representing the standard population (7%). The poorest academic performance of students with SEN was associated with avoidance behavior displayed during stressful situations; students from the standard population did not demonstrate this association. Students with SEN are more likely to display mental health problems than students from the standard population. This may be caused by the accumulation of, and frequent exposure to, situations that they perceive as stressful.

Keywords: coping, mental vulnerability, pupil with special education needs, school performance, school success

Procedia PDF Downloads 326
4081 Spatial Analysis of Flood Vulnerability in Highly Urbanized Area: A Case Study in Taipei City

Authors: Liang Weichien

Abstract:

Without adequate information and mitigation plans for natural disasters, the risk to urban populated areas will increase in the future as populations grow, especially in Taiwan. Taiwan is recognized as one of the world's high-risk areas, experiencing an average of 5.7 floods per year, and should seek to strengthen coherence and consensus on how cities can plan for floods and climate change. Therefore, this study aims at understanding the vulnerability to flooding in Taipei City, Taiwan, by creating indicators and calculating the vulnerability of each study unit. The indicators were grouped into sensitivity and adaptive capacity based on the definition of vulnerability of the Intergovernmental Panel on Climate Change, and were weighted using Principal Component Analysis. However, previous research has been based on the assumption that the composition and influence of the indicators are the same in different areas, disregarding spatial correlation and potentially producing inaccurate explanations of local vulnerability. This study used Geographically Weighted Principal Component Analysis, adding a geographic weighting matrix to capture the different dominant flood impact characteristics in different areas. The cross-validation method and the Akaike Information Criterion were used to select the bandwidth, with a Gaussian pattern as the bandwidth weighting scheme. The ultimate outcome can be used to reduce damage potential by integrating the outputs into local mitigation plans and urban planning.
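For intuition, a global (non-geographically-weighted) version of the PCA weighting step can be sketched as follows; GWPCA would repeat this per location with a spatial kernel, so loadings vary over space (the data below is synthetic):

```python
import numpy as np

def pca_weighted_index(indicators):
    """Composite vulnerability score from first-principal-component loadings.

    indicators: (n_units, n_indicators) array; columns are standardized,
    and the leading eigenvector of their covariance supplies the weights.
    """
    z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
    cov = np.cov(z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)   # eigenvalues in ascending order
    pc1 = eigvec[:, -1]                    # loadings of the largest component
    if pc1.sum() < 0:                      # fix the arbitrary sign of the axis
        pc1 = -pc1
    return z @ pc1

rng = np.random.default_rng(2)
data = rng.normal(size=(50, 4))           # 50 study units, 4 indicators
scores = pca_weighted_index(data)
```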

Keywords: flood vulnerability, geographically weighted principal components analysis, GWPCA, highly urbanized area, spatial correlation

Procedia PDF Downloads 268
4080 Two-Level Graph Causality to Detect and Predict Random Cyber-Attacks

Authors: Van Trieu, Shouhuai Xu, Yusheng Feng

Abstract:

Tracking attack trajectories can be difficult with limited information about the nature of the attack, and even more difficult when attack information is collected by Intrusion Detection Systems (IDSs), since current IDSs have limitations in identifying malicious and anomalous traffic. Moreover, IDSs only point out suspicious events; they do not show how the events relate to each other or which event possibly caused another to happen. It is therefore important to investigate new methods capable of tracking attack trajectories quickly, with less attack information and less dependency on IDSs, in order to prioritize actions during incident response. This paper proposes a two-level graph causality framework for tracking attack trajectories in internet networks by leveraging observable malicious behaviors to detect which attack events most probably cause another event to occur in the system. Technically, given a time series of malicious events, the framework extracts events with useful features, such as attack time and port number, and applies conditional independence tests to detect relationships between attack events. Using academic datasets collected by IDSs, experimental results show that the framework can quickly detect causal pairs that offer meaningful insights into the nature of the internet network, given only reasonable restrictions on network size and structure. Without the framework's guidance, these insights could not be discovered by existing tools such as IDSs, and would cost expert human analysts significant time, if they could be obtained at all. The computational results from the proposed two-level graph network model reveal clear patterns and trends. In fact, more than 85% of causal pairs have an average time difference between the causal and effect events, in both computed and observed data, of within 5 minutes. This result can be used as a preventive measure against future attacks. Although the forecast window may be short, from 0.24 seconds to 5 minutes, it is long enough to design a prevention protocol to block those attacks.
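A first-level screen for candidate causal pairs, using only event order and the 5-minute window reported above, can be sketched as follows (toy event log; the paper's framework additionally applies conditional-independence tests on features such as port numbers):

```python
from datetime import datetime, timedelta

def candidate_causal_pairs(events, window=timedelta(minutes=5)):
    """Collect ordered event-type pairs (a, b) where b follows a within the window.

    events: time-sorted list of (timestamp, event_type). Returns a dict
    mapping each pair to the list of observed cause-to-effect time lags.
    """
    pairs = {}
    for i, (t_a, a) in enumerate(events):
        for t_b, b in events[i + 1:]:
            if t_b - t_a > window:     # events are sorted, so we can stop early
                break
            if a != b:
                pairs.setdefault((a, b), []).append(t_b - t_a)
    return pairs

t0 = datetime(2024, 1, 1, 0, 0, 0)
log = [(t0, "scan"), (t0 + timedelta(minutes=2), "brute_force"),
       (t0 + timedelta(minutes=30), "scan"),
       (t0 + timedelta(minutes=33), "brute_force")]
pairs = candidate_causal_pairs(sorted(log))
```

Here the scan-then-brute-force pattern recurs within the window, so the pair survives the screen; a real deployment would then test whether the dependence persists after conditioning on other events.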

Keywords: causality, multilevel graph, cyber-attacks, prediction

Procedia PDF Downloads 136
4079 Nonlinear Evolution on Graphs

Authors: Benniche Omar

Abstract:

We are concerned with abstract fully nonlinear differential equations of the form y'(t) = Ay(t) + f(t, y(t)), where A is an m-dissipative operator (possibly multi-valued) defined on a subset D(A) of a Banach space X with values in X, and f is a given function defined on I×X with values in X. We consider a graph K in I×X. We recall that K is said to be viable with respect to the above abstract differential equation if for each initial datum in K there exists at least one trajectory starting from that initial datum and remaining in K at least for a short time. The viability problem has been studied by many authors using various techniques and frameworks. If K is closed, it is shown that a tangency condition, which is mainly linked to the dynamics, is crucial for viability. In the case when X is infinite-dimensional, compactness and convexity assumptions are needed. In this paper, we are concerned with the notion of near viability for a given graph K with respect to y'(t) = Ay(t) + f(t, y(t)). Roughly speaking, the graph K is said to be near viable with respect to y'(t) = Ay(t) + f(t, y(t)) if for each initial datum in K there exists at least one trajectory remaining arbitrarily close to K at least for a short time. It is interesting to note that near viability is equivalent to an appropriate tangency condition under mild assumptions on the dynamics. Adding natural convexity and compactness assumptions on the dynamics, we may recover (exact) viability. Here we investigate near viability for a graph K in I×X with respect to y'(t) = Ay(t) + f(t, y(t)), where A and f are as above. We emphasize that the t-dependence of the perturbation f leads us to introduce a new tangency concept. On the basis of tangency conditions expressed in terms of that concept, we formulate criteria for K to be near viable with respect to y'(t) = Ay(t) + f(t, y(t)). As an application, an abstract null-controllability theorem is given.
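For orientation, the classical Nagumo-type tangency condition can be stated in the simplest autonomous, single-valued case with A = 0 (the paper's new concept generalizes this to the t-dependent graph setting):

```latex
% For a closed set K \subset X and y'(t) = f(y(t)),
% K is viable if and only if, for every x \in K,
\liminf_{h \downarrow 0} \; \frac{1}{h}\, d\bigl(x + h\,f(x),\, K\bigr) = 0,
% where d(\cdot, K) denotes the distance to K.
```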

Keywords: abstract differential equation, graph, tangency condition, viability

Procedia PDF Downloads 118
4078 Research on Fuzzy Test Framework Based on Concolic Execution

Authors: Xiong Xie, Yuhang Chen

Abstract:

Vulnerability discovery is a significant field of current research. In this paper, a fuzzing framework based on concolic execution is proposed. Fuzz testing and symbolic execution are widely used in vulnerability discovery, but each has its own advantages and disadvantages. During the path generation stage, a generation-based path traversal algorithm is used to obtain more accurate paths. During the constraint solving stage, dynamic concolic execution is used to avoid path explosion. If an external call occurs, concolic execution based on function summaries is used. Experiments show that the framework can effectively improve the ability to trigger vulnerabilities and increase code coverage.
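A toy mutational fuzzer conveys the coverage-feedback loop that such frameworks build on (purely illustrative; the paper's framework replaces blind mutation with generation-based path traversal and concolic constraint solving, which can reach a guarded branch directly instead of by chance):

```python
import random

def fuzz(target, seeds, rounds=200, rng=None):
    """Toy coverage-guided mutational fuzzer.

    target(data) returns (set_of_covered_branch_ids, crashed). Inputs that
    reach new branches are kept in the corpus for further mutation.
    """
    rng = rng or random.Random(0)
    corpus, seen, crashes = list(seeds), set(), []
    for _ in range(rounds):
        data = bytearray(rng.choice(corpus))
        if data:
            data[rng.randrange(len(data))] = rng.randrange(256)  # 1-byte mutation
        cov, crashed = target(bytes(data))
        if crashed:
            crashes.append(bytes(data))
        if not cov <= seen:                    # new coverage: keep this input
            seen |= cov
            corpus.append(bytes(data))
    return crashes

def magic_target(data):
    """Simulated target: a 'vulnerability' guarded by a magic first byte."""
    cov = {"entry"}
    crashed = bool(data) and data[0] == 0x42
    if crashed:
        cov.add("magic")
    return cov, crashed

crashes = fuzz(magic_target, [b"\x00\x00"], rounds=2000)
```

Random mutation may or may not hit the magic byte within the budget; a concolic engine would instead solve the branch constraint data[0] == 0x42 and generate the triggering input in one step.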

Keywords: concolic execution, constraint solving, fuzzy test, vulnerability discovery

Procedia PDF Downloads 198
4077 A Semiotic Approach to Vulnerability in Conducting Gesture and Singing Posture

Authors: Johann Van Niekerk

Abstract:

The disciplines of conducting (instrumental or choral) and of singing presume a willingness toward an open posture and, in many cases, demand it for effective communication and technique. Yet, this very openness, with the "spread-eagle" gesture as an extreme, is oftentimes counterintuitive for musicians and within the trajectory of human evolution. Conversely, it is in this very gesture of "taking up space" that confidence-gaining techniques such as the popular "power pose" are based. This paper consists primarily of a literature review, exploring the topics of physical openness and vulnerability, considering the semiotics of the "spread-eagle" and its accompanying letter X. A major finding of this research is the discrepancy between evolutionary instinct towards physical self-protection and “folding in” and the demands of the discipline of physical and gestural openness, expansiveness and vulnerability. A secondary finding is ways in which encouragement of confidence-gaining techniques may be more effective in obtaining the required results than insistence on vulnerability, which is influenced by various cultural contexts and socialization. Choral conductors and music educators are constantly seeking ways to promote engagement and healthy singing. Much of the information and direction toward this goal is gleaned by students from conducting gestures and other pedagogies employed in the rehearsal. The findings of this research provide yet another avenue toward reaching the goals required for sufficient and effective teaching and artistry on the part of instructors and students alike.

Keywords: conducting, gesture, music, pedagogy, posture, vulnerability

Procedia PDF Downloads 38
4076 Land Degradation Vulnerability Modeling: A Study on Selected Micro Watersheds of West Khasi Hills Meghalaya, India

Authors: Amritee Bora, B. S. Mipun

Abstract:

Land degradation is a term often used to describe environmental phenomena that reduce land's original productivity both qualitatively and quantitatively. The study of land degradation vulnerability primarily deals with "Environmentally Sensitive Areas" (ESA) and the amount of topsoil loss due to erosion. In many studies, the assessment of the existing status of land degradation is used to represent vulnerability, and in most of them the primary emphasis is on sensitivity to soil erosion only. However, the concept of land degradation vulnerability can have different objectives depending upon the perspective of the study. It shows the extent to which changes in land use and land cover can imprint their effect on the land; in other words, it represents the susceptibility of a piece of land to a permanent or long-run degradation of its productive quality. It is also important to mention that land degradation vulnerability is not the outcome of a single factor: it is a probability assessment that evaluates the status of land degradation and needs to consider both biophysical and human-induced parameters. To avoid the complexity of previous models in this regard, the present study generates a simplified model to assess land degradation vulnerability in terms of current human population pressure, land use practices, and existing biophysical conditions. It is a mixed method termed the land degradation vulnerability index (LDVi). It was originally inspired by the MEDALUS model (Mediterranean Desertification and Land Use, 1999) and Farazadeh's 2007 revised version of it, and it follows the guidelines of the Space Application Center, Ahmedabad / Indian Space Research Organization for land degradation vulnerability.
The model integrates the climatic index (Ci), vegetation index (Vi), erosion index (Ei), land utilization index (Li), population pressure index (Pi), and cover management index (CMi), giving equal weightage to each parameter. The final result shows that the very high vulnerability zone primarily indicates three prominent circumstances within the study sites: land under continuous population pressure, a high concentration of human settlement, and a high amount of topsoil loss due to surface runoff. As all the parameters of the model are amalgamated with equal weightage, and further analyzed with regression, the LDVi model also provides a strong grasp of each parameter and of how far each is competent to trigger the land degradation process.
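With equal weightage, the integration step reduces to an average of the six sub-indices; a minimal sketch (sub-index values below are illustrative, and note that MEDALUS-style models sometimes use a geometric rather than an arithmetic mean):

```python
import numpy as np

def ldvi(ci, vi, ei, li, pi, cmi):
    """Equal-weight land degradation vulnerability index.

    Each argument is an array of one sub-index (climatic, vegetation,
    erosion, land utilization, population pressure, cover management),
    already scaled to a common range such as 0-1.
    """
    return np.stack([ci, vi, ei, li, pi, cmi]).mean(axis=0)

# Three micro-watersheds with illustrative sub-index values.
scores = ldvi(np.array([0.2, 0.6, 0.9]),   # Ci
              np.array([0.3, 0.5, 0.8]),   # Vi
              np.array([0.1, 0.7, 0.9]),   # Ei
              np.array([0.2, 0.4, 0.7]),   # Li
              np.array([0.3, 0.6, 1.0]),   # Pi
              np.array([0.1, 0.8, 0.9]))   # CMi
```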

Keywords: population pressure, land utilization, soil erosion, land degradation vulnerability

Procedia PDF Downloads 139
4075 Modeling of Bioelectric Activity of Nerve Cells Using Bond Graph Method

Authors: M. Ghasemi, F. Eskandari, B. Hamzehei, A. R. Arshi

Abstract:

The bioelectric activity of nerve cells can be altered by various factors, and this alteration can lead to unforeseen consequences in other organs of the body. Therefore, the purpose of this study was to model a single neuron and its behavior under an initial stimulation. The model was developed from cable theory by means of the bond graph method, with the numerical values of the parameters derived from empirical cellular electrophysiology experiments. An initial excitation was applied through square current functions, and the resulting action potential was estimated along the neuron. The results revealed that the model developed in this research agrees with the results of experimental studies and properly reproduces the electrical behavior of nerve cells.
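The passive backbone of such a model is a single RC membrane compartment driven by a square current pulse; a minimal sketch follows (parameter values are illustrative, and the paper's bond-graph model chains such compartments along the cable and adds active channel dynamics):

```python
import numpy as np

def membrane_response(i_amps, dt=1e-5, c_m=1e-10, r_m=1e8, v_rest=-0.070):
    """Passive (sub-threshold) membrane patch: C dV/dt = -(V - V_rest)/R + I.

    i_amps: injected current (A) at each time step; forward-Euler integration
    with dt much smaller than the membrane time constant tau = R*C = 10 ms.
    """
    v = np.empty(len(i_amps))
    v[0] = v_rest
    for k in range(1, len(i_amps)):
        dv = (-(v[k - 1] - v_rest) / r_m + i_amps[k - 1]) / c_m
        v[k] = v[k - 1] + dt * dv
    return v

n = 2000
stim = np.zeros(n)
stim[200:1200] = 2e-10            # 0.2 nA square current pulse for 10 ms
v = membrane_response(stim)       # depolarizes during the pulse, then decays
```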

Keywords: bond graph, stimulation, nervous cells, modeling

Procedia PDF Downloads 400
4074 Robust Diagnosis Efficiency by Bond-Graph Approach

Authors: Benazzouz Djamel, Termeche Adel, Touati Youcef, Alem Said, Ouziala Mahdi

Abstract:

This paper presents an approach that detects and isolates faults in a system efficiently, avoiding false alarms, missed detections, and delays in detecting faults. A case study is presented to show the importance of taking uncertainties into consideration in the decision-making procedure, their effect on diagnostic performance degradation, and the advantage of using the Bond Graph (BG) to handle such degradation. The use of the BG in Linear Fractional Transformation (LFT) form allows the generation of robust Analytical Redundancy Relations (ARRs), where the uncertain part of the ARRs is used to generate adaptive thresholds for the residuals. The case study concerns an electromechanical system composed of a motor, a reducer, and an external load. The aim of this application is to show the effectiveness of the BG-LFT approach for robust fault detection.

Keywords: bond graph, LFT, uncertainties, detection and faults isolation, ARR

Procedia PDF Downloads 279
4073 Leveraging Artificial Intelligence to Analyze the Interplay between Social Vulnerability Index and Mobility Dynamics in Pandemics

Authors: Joshua Harrell, Gideon Osei Bonsu, Susan Garza, Clarence Conner, Da’Neisha Harris, Emma Bukoswki, Zohreh Safari

Abstract:

The Social Vulnerability Index (SVI) stands as a pivotal tool for gauging community resilience amidst diverse stressors, including pandemics like COVID-19. This paper synthesizes recent research and underscores the significance of SVI in elucidating the differential impacts of crises on communities. Drawing on studies by Fox et al. (2023) and Mah et al. (2023), we delve into the application of SVI alongside emerging data sources to uncover nuanced insights into community vulnerability. Specifically, we explore the utilization of SVI in conjunction with mobility data from platforms like SafeGraph to probe the intricate relationship between social vulnerability and mobility dynamics during the COVID-19 pandemic. By leveraging 16 community variables derived from the American Community Survey, including socioeconomic status and demographic characteristics, SVI offers actionable intelligence for guiding targeted interventions and resource allocation. Building upon recent advancements, this paper contributes to the discourse on harnessing AI techniques to mitigate health disparities and fortify public health resilience in the face of pandemics and other crises.
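The SVI construction referenced above is built from percentile ranks of census variables; a simplified sketch is given below (synthetic data; the CDC/ATSDR index additionally groups the 16 variables into four themes before summing, which this flattened version omits):

```python
import numpy as np

def percentile_rank(values):
    """Rank each tract's value in [0, 1]; the SVI building block."""
    order = np.argsort(np.argsort(values))   # rank 0..n-1 for each tract
    return order / (len(values) - 1)

def svi(indicators):
    """Sum per-variable percentile ranks, then percentile-rank the sums.

    indicators: (n_tracts, n_vars); higher raw values mean more vulnerable.
    """
    ranks = np.column_stack([percentile_rank(col) for col in indicators.T])
    return percentile_rank(ranks.sum(axis=1))

rng = np.random.default_rng(3)
tracts = rng.normal(size=(10, 16))     # 10 tracts, 16 ACS-derived variables
scores = svi(tracts)                   # 0 = least, 1 = most vulnerable tract
```

Joining these tract-level scores to mobility records (e.g., SafeGraph visit counts keyed by census tract) is what enables the vulnerability-versus-mobility analyses described above.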

Keywords: social vulnerability index, mobility dynamics, data analytics, health equity, pandemic preparedness, targeted interventions, data integration

Procedia PDF Downloads 34
4072 GeneNet: Temporal Graph Data Visualization for Gene Nomenclature and Relationships

Authors: Jake Gonzalez, Tommy Dang

Abstract:

This paper proposes a temporal graph approach to visualize and analyze the evolution of gene relationships and nomenclature over time. An interactive web-based tool implements this temporal graph, enabling researchers to traverse a timeline and observe coupled dynamics in network topology and naming conventions. Analysis of a real human genomic dataset reveals the emergence of densely interconnected functional modules over time, representing groups of genes involved in key biological processes. For example, the antimicrobial peptide DEFA1A3 shows increased connections to related alpha-defensins involved in infection response. Tracking degree and betweenness centrality shifts over timeline iterations also quantitatively highlights the reprioritization of certain genes' topological importance as knowledge advances. Examination of the CNR1 gene encoding the cannabinoid receptor CB1 demonstrates changing synonymous relationships and consolidating naming patterns over time, reflecting the discovery of its unique functional role. The integrated framework interconnecting these topological and nomenclature dynamics provides richer contextual insights compared to isolated analysis methods. Overall, this temporal graph approach enables a more holistic study of knowledge evolution to elucidate complex biology.
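The degree-centrality shifts described above reduce to simple per-snapshot counts; a minimal sketch with hypothetical snapshot edge lists (the gene names are used only for illustration, not taken from the paper's dataset):

```python
def degree_centrality(edges):
    """Degree of each node in one timeline snapshot of the gene graph."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return deg

# Two hypothetical snapshots: DEFA1A3 gains alpha-defensin connections later.
t1 = [("DEFA1A3", "DEFA4")]
t2 = [("DEFA1A3", "DEFA4"), ("DEFA1A3", "DEFA5"), ("DEFA1A3", "DEFA6")]
shift = degree_centrality(t2)["DEFA1A3"] - degree_centrality(t1)["DEFA1A3"]
```

A positive shift flags a gene whose topological importance grew between snapshots, which the tool would surface as the user traverses the timeline.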

Keywords: temporal graph, gene relationships, nomenclature evolution, interactive visualization, biological insights

Procedia PDF Downloads 33
4071 Enhancing Flood Modeling: Unveiling the Role of Hazard Parameters in Building Vulnerability

Authors: Mohammad Shoraka, Raulina Wojtkiewicz, Karthik Ramanathan

Abstract:

Following the devastating summer 2021 floods in Germany, catastrophe modelers realized that hazard parameters, such as flow velocity, flood duration, and debris flow, play a significant role in capturing the overall damage potential of such events. Accounting for the location-specific static depth as the only hazard intensity metric may lead to a substantial underestimation of the vulnerability of building stock and, eventually, the loss potential of such catastrophic events. As the flow velocity increases, the hydrodynamic forces acting on various building components are amplified. Longer flood duration leads to water permeating porous components, incurring additional cleanup costs that contribute to an overall increase in damage. Debris flow possesses the power to erode extensive sections of buildings, thus substantially augmenting the extent of losses. This paper introduces four flow velocity classes, ranging from no flow velocity to major velocity, along with two flood duration classes: short and long, in estimating the vulnerability of the building stock. Additionally, the study examines the impact of the presence of debris flow and its role in exacerbating flood damage. The paper delves into the effects of each of these parameters on building component damageability and their collective impact on the overall building vulnerability.
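A hypothetical composite damage function shows how the extra hazard parameters would scale a depth-only estimate (all class boundaries and modifier values below are placeholders for illustration, not the calibrated vulnerability functions discussed in the paper):

```python
def flood_damage_ratio(depth_m, velocity_class, long_duration, debris):
    """Illustrative mean damage ratio combining the paper's hazard parameters.

    depth sets a base ratio; velocity class (0 = none .. 3 = major),
    long flood duration, and debris flow each scale the damage upward.
    """
    base = min(1.0, 0.15 * depth_m)              # depth-only vulnerability
    velocity_factor = {0: 1.0, 1: 1.1, 2: 1.3, 3: 1.5}[velocity_class]
    duration_factor = 1.2 if long_duration else 1.0
    debris_factor = 1.4 if debris else 1.0
    return min(1.0, base * velocity_factor * duration_factor * debris_factor)

static_only = flood_damage_ratio(2.0, 0, False, False)   # static depth alone
severe = flood_damage_ratio(2.0, 3, True, True)          # same depth, full hazard
```

The gap between the two results illustrates the underestimation the abstract warns about when static depth is the only intensity metric.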

Keywords: catastrophe modeling, building vulnerability, hazard parameters, component damage function

Procedia PDF Downloads 38
4070 Coastal Flood Mapping of Vulnerability Due to Sea Level Rise and Extreme Weather Events: A Case Study of St. Ives, UK

Authors: S. Vavias, T. R. Brewer, T. S. Farewell

Abstract:

Coastal floods have been identified as an important natural hazard that can cause significant damage to populated built-up areas, related infrastructure, and ecosystems and habitats. This study attempts to fill the gap associated with the development of preliminary assessments of coastal flood vulnerability for compliance with the EU Directive on the Assessment and Management of Flood Risks (2007/60/EC). In this context, a methodology has been created by taking into account three major parameters: the maximum wave run-up modelled from historical weather observations, the highest tide according to historical time series, and the sea level rise projections due to climate change. A high-resolution digital terrain model (DTM) derived from LIDAR data has been used to integrate the estimated flood events in a GIS environment. The flood vulnerability map created shows potential risk areas and can play a crucial role in the coastal zone planning process. The proposed method has the potential to be a powerful tool for policy and decision makers in spatial planning and strategic management.
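A minimal sketch of the methodology is given below, assuming the three parameters are simply summed into a design flood elevation that is then compared against DTM cell elevations; all numbers are illustrative, not values from the St. Ives case study.

```python
# Illustrative sketch: combine highest tide, maximum wave run-up, and a
# sea-level-rise projection into one flood elevation, then flag DTM
# cells at or below it. Elevations are made-up values (metres).

def flood_level(highest_tide_m, max_runup_m, slr_projection_m):
    """Design flood elevation above datum (simple sum assumption)."""
    return highest_tide_m + max_runup_m + slr_projection_m

def vulnerable_cells(dtm, level):
    """Flag DTM cells (elevations in metres) at or below the flood level."""
    return [[1 if z <= level else 0 for z in row] for row in dtm]

level = flood_level(3.1, 1.8, 0.5)   # about 5.4 m
dtm = [[4.0, 5.2, 6.1],
       [5.5, 7.0, 3.9]]
print(vulnerable_cells(dtm, level))  # [[1, 1, 0], [0, 0, 1]]
```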

Keywords: coastal floods, vulnerability mapping, climate change, extreme weather events

Procedia PDF Downloads 369
4069 Measurement of Susceptibility Users Using Email Phishing Attack

Authors: Cindy Sahera, Sarwono Sutikno

Abstract:

Rapid technological development also has negative impacts, namely the increasing number of criminal cases based on technology, or cybercrime. One technique that can be used to conduct cybercrime attacks is email phishing. The issue is whether users are aware that their email can be misused by others in ways that harm them. This research was conducted to measure the susceptibility of selected targets to email abuse. The objectives of this research are to measure the targets' susceptibility and to find vulnerabilities among email recipients. Three steps were taken in this research: (1) the information gathering phase, (2) the design phase, and (3) the execution phase. The first step includes the collection of the information necessary to carry out an attack on a target. The next step is to design an attack against a target. The last step is to send phishing emails to the target. Susceptibility is rated on three levels: level 1 indicates low susceptibility, level 2 indicates intermediate susceptibility, and level 3 indicates high susceptibility. The results showed that more users are at level 1 and level 2 than at level 3, which means users are not overly careless. However, this does not mean they are safe: vulnerabilities may still be exploited, such as automatic location detection when an email is opened and malware downloaded automatically when the user clicks a link in the email.
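The three-level rating might be sketched as follows; the mapping from observed target behaviour to levels is an assumption for illustration, not the paper's actual scoring rule.

```python
# Hypothetical sketch of the three-level susceptibility rating.
# The behaviour-to-level mapping is an assumption, not the study's rule.

def susceptibility_level(opened, clicked_link, submitted_data):
    """Level 1 = low, 2 = intermediate, 3 = high susceptibility."""
    if submitted_data or clicked_link:
        return 3
    if opened:
        return 2
    return 1

targets = [
    {"opened": False, "clicked_link": False, "submitted_data": False},
    {"opened": True,  "clicked_link": False, "submitted_data": False},
    {"opened": True,  "clicked_link": True,  "submitted_data": False},
]
levels = [susceptibility_level(**t) for t in targets]
print(levels)  # [1, 2, 3]
```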

Keywords: cybercrime, email phishing, susceptibility, vulnerability

Procedia PDF Downloads 255
4068 Constructing Orthogonal De Bruijn and Kautz Sequences and Applications

Authors: Yaw-Ling Lin

Abstract:

A de Bruijn graph of order k is a graph whose vertices represent all length-k sequences, with edges joining pairs of vertices whose sequences have the maximum possible overlap (length k−1). Every Hamiltonian cycle of this graph defines a distinct, minimum-length de Bruijn sequence containing all k-mers exactly once. A Kautz sequence is the analogous minimal generating sequence: the sequence of minimal length that produces all possible length-k sequences under the restriction that every two consecutive alphabet symbols in the sequence must differ. A collection of de Bruijn/Kautz sequences is orthogonal if any two sequences differ maximally in composition; that is, the maximum length of their common substring is k. In this paper, we discuss how such a collection of (maximal) orthogonal de Bruijn/Kautz sequences can be constructed, and we use the algorithm to build a web application service for synthesized DNA and other related biomolecular sequences.
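For the de Bruijn half of the construction, a single minimum-length sequence can be generated with the classic recursive (Lyndon-word) algorithm sketched below; producing the orthogonal collection discussed in the paper is a further step not shown here.

```python
# Classic construction of a de Bruijn sequence B(k, n) over the
# alphabet {0, ..., k-1}, via concatenation of Lyndon words.

def de_bruijn(k, n):
    """Return one minimum-length sequence containing every n-mer once
    (cyclically)."""
    a = [0] * k * n
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

s = de_bruijn(2, 3)
print(s)  # [0, 0, 0, 1, 0, 1, 1, 1]
```

Read cyclically, the eight symbols above contain each of the eight binary 3-mers exactly once.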

Keywords: biomolecular sequence synthesis, de Bruijn sequences, Eulerian cycle, Hamiltonian cycle, Kautz sequences, orthogonal sequences

Procedia PDF Downloads 130
4067 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structural restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using open-source graph-based EHR data, the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it closely represents real-world data and is voluminous. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch-Geometric (PyG) to train the Graph Neural Network (GNN) and adopt self-supervised learning techniques with autoencoders to generate the node embeddings and eventually perform the node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
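The propagation step at the heart of such a GNN can be illustrated without the PyTorch-Geometric stack the paper uses; the dependency-free sketch below applies one GCN layer, H' = D^(-1/2)(A + I)D^(-1/2) H W, to a toy three-node graph with made-up features and weights.

```python
# Dependency-free sketch of one graph-convolution (GCN) layer.
# The tiny graph, features, and weights are illustrative only.
import math

def matmul(x, y):
    """Plain nested-list matrix product."""
    return [[sum(x[i][k] * y[k][j] for k in range(len(y)))
             for j in range(len(y[0]))] for i in range(len(x))]

def gcn_layer(adj, feats, weight):
    """One GCN propagation: H' = D^-1/2 (A + I) D^-1/2 · H · W."""
    n = len(adj)
    a_hat = [[adj[i][j] + (1 if i == j else 0) for j in range(n)]
             for i in range(n)]                      # add self-loops
    deg = [sum(row) for row in a_hat]
    norm = [[a_hat[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)]
            for i in range(n)]                       # symmetric normalization
    return matmul(matmul(norm, feats), weight)

adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]               # toy 3-node graph (a path)
feats = [[1.0], [0.0], [1.0]]   # one made-up feature per node
weight = [[1.0]]                # trivial layer weight
print(gcn_layer(adj, feats, weight))
```

In PyG this single step corresponds to a `GCNConv` call; node classification stacks such layers and reads class logits off the final embeddings.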

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 62
4066 Aspect-Level Sentiment Analysis with Multi-Channel and Graph Convolutional Networks

Authors: Jiajun Wang, Xiaoge Li

Abstract:

The purpose of the aspect-level sentiment analysis task is to identify the sentiment polarity of aspects in a sentence. Currently, most methods mainly focus on using neural networks and attention mechanisms to model the relationship between aspects and context, but they ignore the dependence of words across different ranges in the sentence, resulting in deviation when assigning relationship weights to words other than the aspect words. To solve these problems, we propose a new aspect-level sentiment analysis model that combines a multi-channel convolutional network and a graph convolutional network (GCN). First, the context and the degree of association between words are characterized by Long Short-Term Memory (LSTM) and a self-attention mechanism. In addition, a multi-channel convolutional network is used to extract the features of words in different ranges. Finally, a graph convolutional network is used to associate the node information of the dependency tree structure. We conduct experiments on four benchmark datasets and compare the results with those of other models, which shows that our model is more effective.
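The multi-channel idea, convolutions with different window widths capturing word dependencies over different ranges, can be sketched with a toy averaging "convolution" over scalar token scores; the real model operates on embedding vectors with learned kernels, so everything below is an illustrative simplification.

```python
# Toy sketch of multi-channel 1-D convolution: each channel uses a
# different window width, covering a different word range. Real models
# use learned kernels over embedding vectors, not scalar averages.

def conv1d(seq, width):
    """Mean over each sliding window of `width` tokens."""
    return [sum(seq[i:i + width]) / width
            for i in range(len(seq) - width + 1)]

# Scalar stand-ins for token embeddings in a 6-token sentence.
tokens = [0.2, 0.9, -0.4, 0.7, 0.1, -0.8]

# Three channels with widths 2, 3, and 4.
channels = {w: conv1d(tokens, w) for w in (2, 3, 4)}
for w, feats in sorted(channels.items()):
    print(w, [round(f, 2) for f in feats])
```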

Keywords: aspect-level sentiment analysis, attention, multi-channel convolution network, graph convolution network, dependency tree

Procedia PDF Downloads 174
4065 An Analytical Approach to Assess and Compare the Vulnerability Risk of Operating Systems

Authors: Pubudu K. Hitigala Kaluarachchilage, Champike Attanayake, Sasith Rajasooriya, Chris P. Tsokos

Abstract:

Operating system (OS) security is a key component of computer security. Assessing and improving an OS's strength to resist vulnerabilities and attacks is a mandatory requirement, given the rate at which new vulnerabilities are discovered and attacks occur. The frequency and number of different kinds of vulnerabilities found in an OS can be considered an index of its information security level. In the present study, five widely used OSs, Microsoft Windows (Windows 7, Windows 8 and Windows 10), Apple's Mac and Linux, are assessed for their discovered vulnerabilities and the risk associated with each. Each discovered and reported vulnerability has an exploitability score assigned in the CVSS record of the National Vulnerability Database. In this study, the risk from vulnerabilities in each of the five operating systems is compared. The risk indexes used are developed based on a Markov model to evaluate the risk of each vulnerability. The statistical methodology and the underlying mathematical approach are described. Initially, parametric procedures were conducted and measured; however, violations of some statistical assumptions were observed, so the need for non-parametric approaches was recognized. A total of 6,838 recorded vulnerabilities were considered in the analysis. According to the risk associated with all the vulnerabilities considered, it was found that there is a statistically significant difference among average risk levels for some operating systems, indicating that, by our method and given the assumptions and limitations, some operating systems have been more vulnerable to risk than others. Relevant test results revealing a statistically significant difference in the risk levels of different OSs are presented.
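A hedged sketch of a Markov-chain risk index is given below: a vulnerability lifecycle with assumed states and transition probabilities (not the paper's estimated values), where the index weights the probability of the "exploited" state by the CVSS exploitability score.

```python
# Hypothetical Markov-chain risk index for a single vulnerability.
# States and transition probabilities are illustrative assumptions.

STATES = ["not_exploited", "exploited", "patched"]
# Row-stochastic transition matrix; "patched" is absorbing.
P = [[0.70, 0.20, 0.10],
     [0.00, 0.60, 0.40],
     [0.00, 0.00, 1.00]]

def state_after(dist, steps):
    """Propagate a state distribution `steps` transitions forward."""
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
    return dist

def risk_index(exploitability, steps=5):
    """Probability of 'exploited' after `steps`, scaled by the CVSS
    exploitability score (0-10)."""
    dist = state_after([1.0, 0.0, 0.0], steps)
    return round(dist[1] * exploitability / 10.0, 4)

print(risk_index(8.6))
```

In the long run the absorbing "patched" state dominates, so the index is most informative over a finite horizon after disclosure.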

Keywords: cybersecurity, Markov chain, non-parametric analysis, vulnerability, operating system

Procedia PDF Downloads 158
4064 Surface to the Deeper: A Universal Entity Alignment Approach Focusing on Surface Information

Authors: Zheng Baichuan, Li Shenghui, Li Bingqian, Zhang Ning, Chen Kai

Abstract:

Entity alignment (EA) tasks often play a pivotal role in the integration of knowledge graphs, where structural differences commonly exist between the source and target graphs, such as the presence or absence of attribute information and the types of attribute information (text, timestamps, images, etc.). However, most current research efforts focus on improving alignment accuracy, often along with an increased reliance on specific structures, a dependency that inevitably diminishes their practical value and causes difficulties when facing knowledge graph alignment tasks with varying structures. Therefore, we propose a universal knowledge graph alignment approach that utilizes only the common basic structures shared by knowledge graphs. We have demonstrated through experiments that our method achieves state-of-the-art performance in fair comparisons.

Keywords: knowledge graph, entity alignment, transformer, deep learning

Procedia PDF Downloads 14
4063 Robust Diagnosability of PEMFC Based on Bond Graph LFT

Authors: Ould Bouamama, M. Bressel, D. Hissel, M. Hilairet

Abstract:

Fuel cell (FC) is one of the best alternatives to fossil energy. Recently, the fuel cell research community has shown considerable interest in diagnosis, in order to ensure safety, security, and availability when faults occur in the process. The difficulty in model-based FC diagnosis is that the model is complex because several kinds of energy are coupled, and the numerical values of parameters are not always known or are uncertain. The present paper deals with the use of one tool, the Linear Fractional Transformation bond graph, not only for uncertain modelling but also for monitorability analysis (the ability to detect and isolate faults) and the formal generation of fault indicators that are robust with respect to parameter uncertainties. The developed theory, applied to a nonlinear FC system, has proved its efficiency.

Keywords: bond graph, fuel cell, fault detection and isolation (FDI), robust diagnosis, structural analysis

Procedia PDF Downloads 340
4062 Plotting of an Ideal Logic versus Resource Outflow Graph through Response Analysis on a Strategic Management Case Study Based Questionnaire

Authors: Vinay A. Sharma, Shiva Prasad H. C.

Abstract:

The initial stages of any project are often observed to be in a mixed set of conditions. Setting up the project is a tough task, but taking the initial decisions is not especially complex, as some of the critical factors are yet to be introduced into the scenario. These simple initial decisions potentially shape the timeline and the subsequent events that might later be plotted on it. Proceeding towards the solution to a problem is the primary objective in the initial stages. Optimization of the solutions can come later, and hence the resources deployed towards attaining the solution are higher than they would be in the optimized versions. A 'logic' that counters the problem is essentially the core of the desired solution. Thus, if the problem is solved, the deployment of resources has led to the required logic being attained. As the project proceeds, the individuals working on it face fresh challenges as a team and become better accustomed to their surroundings. The developed, optimized solutions are then considered for implementation, as the individuals are now experienced, know better the consequences and causes of possible failure, and thus integrate adequate tolerances wherever required. Furthermore, as the team grows in strength, acquires substantial knowledge, and begins to transfer it efficiently, the individuals in charge of the project, along with the managers, focus more on the optimized solutions than on the traditional ones to minimize the required resources. Hence, as time progresses, the authorities prioritize attainment of the required logic at a lower amount of dedicated resources. For empirical analysis of the stated theory, leaders and key figures in organizations are surveyed for their ideas on the appropriate logic required for tackling a problem. Key pointers spotted in successfully implemented solutions are noted from the analysis of the responses, and a metric for measuring logic is developed.
A graph is plotted with the quantifiable logic on the Y-axis and the resources dedicated to the solutions of various problems on the X-axis. The dedicated resources are plotted over time, and hence the X-axis is also a measure of time. In the initial stages of the project, the graph is rather linear, as the required logic is attained but the consumed resources are also high. With time, the authorities begin focusing on optimized solutions, since the logic attained through them is higher while the resources deployed are comparatively lower. Hence, the difference between consecutively plotted 'resources' decreases and, as a result, the slope of the graph gradually increases. On an overview, the graph takes a parabolic shape (beginning at the origin), as with each resource investment, ideally, the difference keeps decreasing and the logic attained through the solution keeps increasing. Even if the resource investment is higher, the managers and authorities ideally make sure that the investment is being made in a proportionally high logic for a larger problem; that is, ideally, the slope of the graph increases with the plotting of each point.
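The claimed shape of the ideal curve can be checked with a toy sketch: if each successive unit of logic costs a smaller resource increment, the slope of the plotted graph strictly increases. The increments below are assumed values, not survey data.

```python
# Toy sketch of the ideal logic-versus-resources curve. Resource
# increments per logic unit are assumed to decline over time.

resource_increments = [10, 8, 6, 4, 2]   # declining cost per logic unit
x, y = [0.0], [0.0]
for step, dr in enumerate(resource_increments, start=1):
    x.append(x[-1] + dr)   # cumulative resources (also a proxy for time)
    y.append(float(step))  # cumulative logic attained

slopes = [(y[i + 1] - y[i]) / (x[i + 1] - x[i]) for i in range(len(x) - 1)]
print([round(s, 3) for s in slopes])  # [0.1, 0.125, 0.167, 0.25, 0.5]
```

The strictly increasing slopes reproduce the parabolic shape described above.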

Keywords: decision-making, leadership, logic, strategic management

Procedia PDF Downloads 85
4061 Approaches to Tsunami Mitigation and Prevention: Explaining Architectural Strategies for Reducing Urban Risk

Authors: Hedyeh Gamini, Hadi Abdus

Abstract:

Tsunami, as a natural disaster, is composed of waves that are usually caused by severe movements at the sea floor. Although a tsunami and its consequences cannot be prevented, by examining past tsunamis, extracting key points on how to deal with such incidents, and learning from them, a positive step can be taken to reduce the vulnerability of human settlements and reduce the risk of this phenomenon in architecture and urbanism. The method is a review that has examined written documents and reliable internet sources related to managing and reducing the vulnerability of human settlements in the face of tsunamis. This paper has explored the tsunamis in Indonesia (2004), Sri Lanka (2004) and Japan (2011), and one of the study objectives has been to understand how these events were dealt with and to extract key points and lessons for reducing the vulnerability of human settlements. Finally, strategies to prevent and reduce the vulnerability of communities at risk of tsunamis have been offered in terms of architecture and urban planning. According to what was learned from studying the recent tsunamis, the quality of the authorities' response, the management of the crisis, and the manner of construction, it can be concluded that there are generally four ways to reduce the vulnerability of human settlements against tsunamis: 1- Constructing tall buildings with openings on the first floor so that water can flow easily underneath, oriented so that water passes easily along their sides. 2- Constructing multi-purpose centers that can be used for vertical evacuation during disasters. 3- Constructing buildings in core forms with a diagonal orientation to the coastline. 4- Building physical barriers (natural and synthetic) such as water dams, earth mounds, sea walls and coastal forests.

Keywords: tsunami, architecture, reducing vulnerability, human settlements, urbanism

Procedia PDF Downloads 368
4060 Strategies to Promote Safety and Reduce the Vulnerability of Urban Worn-out Textures to the Potential Risk of Earthquake

Authors: Bahareh Montakhabi

Abstract:

Earthquake is known as one of the deadliest natural disasters, with a high potential for damage to life and property. Some of Iran's cities were completely destroyed after major earthquakes, and the people of those regions suffered great mental, financial and psychological damage. Tehran is one of the cities located on a fault line. According to experts, the only city that could be severely damaged (70% destruction) by an earthquake of moderate intensity on the Earthquake Engineering Intensity Scale (EEIS) is Tehran, because Tehran is built precisely on the fault. Seismic risk assessment (SRA) of cities at the scale of urban areas and neighborhoods is the first phase of the earthquake crisis management process; it can provide the information required to make optimal use of available resources and facilities in order to reduce the destructive effects and consequences of an earthquake. This study has investigated strategies to promote safety and reduce the vulnerability of worn-out urban textures in District 12 of Tehran to the potential risk of earthquake, aimed at prioritizing the factors affecting the vulnerability of worn-out urban textures to earthquake crises and determining how to reduce them, using the analytical-exploratory method, the analytic hierarchy process (AHP), Expert Choice software and the SWOT technique. The results of the SWOT and AHP analyses of the vulnerability of the worn-out textures of District 12 to internal threats (1.70) and external threats (2.40) indicate weak safety of these textures with respect to internal and external factors and a high possibility of damage.
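The AHP step can be sketched with the standard geometric-mean method for deriving priority weights from a pairwise-comparison matrix; the factors and Saaty-scale judgments below are illustrative, not the study's data.

```python
# Minimal AHP sketch: priority weights for hypothetical vulnerability
# factors via the geometric-mean method. Judgments are illustrative.
import math

factors = ["building age", "access width", "population density"]
# Saaty-scale pairwise comparisons (row i relative to column j).
M = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]

def ahp_weights(matrix):
    """Normalize the geometric mean of each row into priority weights."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

w = ahp_weights(M)
for f, wi in zip(factors, w):
    print(f, round(wi, 3))
```

The weights sum to one and rank the factors for the subsequent prioritization step.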

Keywords: risk management, vulnerability, worn-out textures, earthquake

Procedia PDF Downloads 165