Search results for: Python vulnerabilities
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 396

186 A Convolutional Neural Network-Based Model for Lassa fever Virus Prediction Using Patient Blood Smear Image

Authors: A. M. John-Otumu, M. M. Rahman, M. C. Onuoha, E. P. Ojonugwa

Abstract:

A Convolutional Neural Network (CNN) model for predicting Lassa fever was built using the Python 3.8.0 programming language, with the Keras 2.2.4 and TensorFlow 2.6.1 libraries as the development environment, in order to reduce the currently high risk of Lassa fever in West Africa, particularly in Nigeria. The study was prompted by major flaws in the existing conventional laboratory method for diagnosing Lassa fever (RT-PCR), as well as flaws reported in the literature for AI-based techniques used for probing and prognosis of Lassa fever. A total of 15,679 blood smear microscopic images were collected. The proposed model was trained on 70% of the dataset and tested on the remaining 30% of the microscopic images to avoid overfitting. A 3x3x3 convolution filter was used in the proposed system to extract features from the microscopic images. The proposed CNN-based model achieved a recall of 96%, a precision of 93%, an F1 score of 95%, and an accuracy of 94% in predicting and correctly classifying the images as clean or infected samples. Based on empirical evidence from the literature consulted, the proposed model outperformed the other existing AI-based techniques evaluated. If properly deployed, the model will assist physicians, medical laboratory scientists, and patients in making accurate diagnoses of Lassa fever cases, allowing the mortality rate due to the Lassa fever virus to be reduced through sound decision-making.
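
To illustrate the kind of setup this abstract describes, the following is a minimal Keras/TensorFlow sketch of a binary clean-vs-infected CNN using 3x3 convolutions; the layer sizes, image dimensions, and training call are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch of a Keras CNN for binary blood-smear classification.
# Layer sizes, input shape, and the training call are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(input_shape=(128, 128, 3)):
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # clean vs. infected
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy",
                           tf.keras.metrics.Precision(),
                           tf.keras.metrics.Recall()])
    return model

# A 70/30 train/test split of the 15,679 images would then be used, e.g.:
# model.fit(x_train, y_train, validation_data=(x_test, y_test), epochs=20)
```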

Keywords: artificial intelligence, ANN, blood smear, CNN, deep learning, Lassa fever

Procedia PDF Downloads 86
185 The Challenge of Characterising Drought Risk in Data Scarce Regions: The Case of the South of Angola

Authors: Natalia Limones, Javier Marzo, Marcus Wijnen, Aleix Serrat-Capdevila

Abstract:

In this research, we developed a structured approach for detecting the areas under the highest levels of drought risk that is suitable for data-scarce environments. The methodology is based on recent scientific outcomes and methods and can be easily adapted to different contexts in successive exercises. The research reviews the history of drought in the south of Angola and characterizes the hazard experienced in the episode beginning in 2012, focusing on the meteorological and hydrological drought types. Only global open data from modeling or remote sensing were used to describe the hydroclimatological variables, since there is almost no ground data in this part of the country. The study also portrays the socioeconomic vulnerabilities and the exposure to the phenomenon in the region in order to fully understand the risk. As a result, a map of the areas under the highest risk in the south of the country is produced, which is one of the main outputs of this work. It was also possible to confirm that the set of indicators used revealed different drought vulnerability profiles in the south of Angola, and consequently several types of priority areas prone to distinct impacts were recognized. The results demonstrated that most of the region experienced a severe multi-year meteorological drought that triggered an unprecedented exhaustion of surface water resources, and that the majority of the socioeconomic impacts started soon after the identified onset of these processes.

Keywords: drought risk, exposure, hazard, vulnerability

Procedia PDF Downloads 171
184 “It Takes a Community to Save a Child”: A Qualitative Analysis of Child Trafficking Interventions from Practitioner Perspectives

Authors: Crispin Rakibu Mbamba

Abstract:

Twenty-two years after the adoption of the United Nations Trafficking Protocol, evidence suggests that child trafficking continues to rise. Community-level factors, such as poverty, which create the conditions for children's vulnerability, are key to the rise in trafficking cases in Ghana. Nevertheless, growing evidence suggests that, despite these vulnerabilities, communities have the capacity to prevent and address child trafficking. This study contributes to this positive agenda by exploring the ways in which communities (and their key actors) in Ghana contribute to child trafficking interventions. The study objective is explored through in-depth interviews with practitioners (including social workers) from an organization working in trafficking hotspots in Ghana. Interviews were analyzed thematically with the help of HyperRESEARCH software. From the in-depth interviews, three themes were identified as the ways in which communities are involved in child trafficking interventions: 1) engagement of community leaders, 2) community-led anti-trafficking committees, and 3) knowledge about trafficking. Despite cultural differences, evidence on the instrumental role of community chiefs and leaders provides important learning on how to harness trafficking intervention measures and ensure better child protection practices. Based on the findings, we recommend intensifying trafficking awareness campaigns in rural communities where education is lacking, contributing to the United Nations (UN) mandate of promoting just, peaceful and inclusive societies.

Keywords: child trafficking, community interventions, knowledge on trafficking, human trafficking intervention

Procedia PDF Downloads 80
183 EasyModel: Web-Based Bioinformatics Software for Protein Modeling Based on Modeller

Authors: Alireza Dantism

Abstract:

Presently, describing the function of a protein sequence is one of the most common problems in biology. This problem can usually be facilitated by studying the three-dimensional structure of the protein. In the absence of an experimental protein structure, comparative modeling often provides a useful three-dimensional model of the protein, based on at least one known protein structure. Comparative modeling predicts the three-dimensional structure of a given protein sequence (the target) mainly based on its alignment with one or more proteins of known structure (the templates). Comparative modeling consists of five main steps: 1. identification of similarity between the target sequence and at least one known template structure; 2. alignment of the target sequence and template(s); 3. building a model based on the alignment with the selected template(s); 4. prediction of model errors; 5. optimization of the built model. Many computer programs and web servers automate the comparative modeling process. One of their most important advantages is that they make comparative modeling available to both experts and non-experts, who can easily do their own modeling without any programming knowledge; some experts, however, prefer to do their modeling manually using programming, because this lets them maximize the accuracy of the result. In this study, a web-based tool, called EasyModel, has been designed to predict the tertiary structure of proteins using the PHP and Python programming languages. According to the user's inputs, EasyModel receives the unknown sequence of interest (the target) and the protein structure file of a template that shares a percentage of sequence similarity with it, predicts the tertiary structure of the unknown sequence, and presents the results in the form of graphs and constructed protein files.
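
For orientation, the core comparative-modelling step that a web front end such as EasyModel could wrap around MODELLER might look like the following sketch; the file names are placeholders, and the actual EasyModel pipeline (PHP/Python) is not given in the abstract.

```python
# Sketch of a MODELLER comparative-modelling run; file names are placeholders.
from modeller import environ
from modeller.automodel import automodel

env = environ()
env.io.atom_files_directory = ['.']           # directory containing template PDB files

a = automodel(env,
              alnfile='target_template.ali',  # target-template alignment (step 2)
              knowns='template',              # known template structure(s) (step 1)
              sequence='target')              # target sequence to be modelled (step 3)
a.starting_model = 1
a.ending_model = 5                            # build several models, then assess and optimize (steps 4-5)
a.make()
```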

Keywords: structural bioinformatics, protein tertiary structure prediction, modeling, comparative modeling, modeller

Procedia PDF Downloads 65
182 Igbo Art: A Reflection of the Igbo’s Visual Culture

Authors: David Osa-Egonwa

Abstract:

Visual culture is the expression of the norms and social behavior of a society in visual images. A reflection simply shows how you look when you stand before a mirror, clear water, or a stream. The mirror does not alter, improve, or distort your original appearance, nor does it show a caricature of what stands before it; this is also the case with visual images created by a tribe or society. The ‘uli’ is a hand-drawn body design done on Igbo women and speaks of a culture of body adornment appreciated by that tribe. The use of the pattern of the gliding python snake, ‘ije eke’ or ‘ijeagwo’, for wall painting speaks of Igbo culture as one that appreciates wall paintings based on these patterns. Modern life brought a great deal of change to the Igbo-speaking people of Nigeria. Change cloaked in the garment of Westernization has influenced the culture of the Igbos. This has resulted in a break in cultural practice that has also affected art produced by the Igbos. Before the colonial masters arrived and changed the established culture practiced by the Igbos, visual images were created that retained the culture of this people. To bring this point to light, this paper adopts a historical method. A large number of works produced during the pre- and post-colonial eras, ranging from sculptural pieces to paintings and other artifacts, to mention a few, were studied carefully, and it was discovered that the visual images hold the culture, or aspects of the culture, of the Igbos in their renditions and can rightly serve as a mirror of Igbo visual culture.

Keywords: artistic renditions, historical method, Igbo visual culture, changes

Procedia PDF Downloads 156
181 Approaches to Ethical Hacking: A Conceptual Framework for Research

Authors: Lauren Provost

Abstract:

The digital world remains increasingly vulnerable, making the development of effective cybersecurity approaches even more critical to the success of the digital economy and national security. Although approaches to cybersecurity have shifted and improved in the last decade with new models, especially with cloud computing and mobility, a record number of high-severity vulnerabilities were recorded in the National Institute of Standards and Technology (NIST) National Vulnerability Database (NVD) in 2020. This is due, in part, to the increasing complexity of cyber ecosystems. Security must be approached with a more comprehensive, multi-tool strategy that addresses the complexity of cyber ecosystems, including the human factor. Ethical hacking has emerged as such an approach: a more effective, multi-strategy, comprehensive approach to cybersecurity's most pressing needs, especially understanding the human factor. Research on ethical hacking, however, is limited in scope. The two main objectives of this work are to (1) provide highlights of case studies in ethical hacking and (2) provide a conceptual framework for research in ethical hacking that embraces and addresses both technical and nontechnical security measures. Recommendations include an improved conceptual framework for research centered on ethical hacking that addresses the many factors and attributes of significant attacks threatening computer security, and a more robust, integrative, multi-layered framework embracing the complexity of cybersecurity ecosystems.

Keywords: ethical hacking, literature review, penetration testing, social engineering

Procedia PDF Downloads 185
180 Mapping Environmental Complexity: A Strategic Tool for Sustainable Development of Road Infrastructure in Santa Catarina, Brazil

Authors: Edinei Coser, Cátia Regina Silva de Carvalho Pinto, Kleber Isaac Silva de Souza

Abstract:

The road transportation system is an integral part of the Brazilian economy, so investing in this sector is paramount. Although it is a significant contributor to national and regional development, implementing road infrastructure brings about significant environmental changes, resulting in negative impacts that need to be mitigated through environmental licensing. By considering potential environmental impacts from a strategic perspective at an earlier stage, however, the sustainable development resulting from investments in this sector can be made more efficient. Therefore, this work aims to incorporate strategic environmental assessment into the road transportation system of the state of Santa Catarina using a tool that evaluates the entire territory. The tool analyzes 15 qualitative socio-environmental factors that may complicate environmental licensing and project implementation, with the help of multi-criteria analysis based on the Analytic Hierarchy Process (AHP) and geographic information systems scripted in Python, and produces a surface map of environmental cost for the state of Santa Catarina, Brazil. This map represents how environmental restrictions are spatially distributed across the territory and can be used by governments and decision-makers to assess potential areas for road implementation or paving; evaluate and propose road corridors; propose, promote, and evaluate risks for governmental programs and investments; set environmental management guidelines; and enhance contracting and environmental assessment processes.
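
As an illustration of the AHP weighting step mentioned above, the sketch below derives factor weights and a consistency ratio from a pairwise comparison matrix; the 3x3 matrix is a made-up example, since the study's 15 factors and pairwise judgements are not given in the abstract.

```python
# Illustrative AHP weighting: principal-eigenvector weights and consistency ratio.
import numpy as np

def ahp_weights(pairwise):
    pairwise = np.asarray(pairwise, dtype=float)
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    ci = (eigvals[k].real - n) / (n - 1)                  # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.24)
    cr = ci / ri if ri else 0.0                           # should be < 0.1
    return weights, cr

# Three hypothetical factors compared pairwise on Saaty's 1-9 scale.
matrix = [[1, 3, 5],
          [1 / 3, 1, 2],
          [1 / 5, 1 / 2, 1]]
w, cr = ahp_weights(matrix)
print("weights:", w.round(3), "consistency ratio:", round(cr, 3))
# The weights would then multiply the normalized raster layers (e.g., with
# numpy/rasterio) to produce the environmental-cost surface.
```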

Keywords: environmental impact assessment, GIS, highways, multi-criteria analysis, strategic environmental assessment

Procedia PDF Downloads 27
179 Efficient Credit Card Fraud Detection Based on Multiple ML Algorithms

Authors: Neha Ahirwar

Abstract:

In the contemporary digital era, the rise of credit card fraud poses a significant threat to both financial institutions and consumers. As fraudulent activities become more sophisticated, there is an escalating demand for robust and effective fraud detection mechanisms. Advanced machine learning algorithms have become crucial tools in addressing this challenge. This paper conducts a thorough examination of the design and evaluation of a credit card fraud detection system, utilizing four prominent machine learning algorithms: random forest, logistic regression, decision tree, and XGBoost. The surge in digital transactions has opened avenues for fraudsters to exploit vulnerabilities within payment systems. Consequently, there is an urgent need for proactive and adaptable fraud detection systems. This study addresses this imperative by exploring the efficacy of machine learning algorithms in identifying fraudulent credit card transactions. The selection of random forest, logistic regression, decision tree, and XGBoost for scrutiny in this study is based on their documented effectiveness in diverse domains, particularly in credit card fraud detection. These algorithms are renowned for their capability to model intricate patterns and provide accurate predictions. Each algorithm is implemented and evaluated for its performance in a controlled environment, utilizing a diverse dataset comprising both genuine and fraudulent credit card transactions.
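
A minimal sketch of the four-model comparison described in this abstract is shown below; the dataset file, feature columns, and hyperparameters are placeholders, not the study's own.

```python
# Sketch: train and compare four classifiers on a labelled transactions dataset.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import classification_report
from xgboost import XGBClassifier

df = pd.read_csv("transactions.csv")                 # hypothetical labelled dataset
X, y = df.drop(columns=["is_fraud"]), df["is_fraud"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

models = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(random_state=42),
    "xgboost": XGBClassifier(eval_metric="logloss"),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name)
    print(classification_report(y_test, model.predict(X_test), digits=3))
```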

Keywords: efficient credit card fraud detection, random forest, logistic regression, XGBoost, decision tree

Procedia PDF Downloads 29
178 Client Hacked Server

Authors: Bagul Abhijeet

Abstract:

Background: The client-server model is the backbone of today’s internet communication, in which a normal user does not have control over a particular website or server. Using the same processing model, however, one can gain unauthorized access to a particular server. This paper discusses an application scenario for hacking a simple website or server by obtaining unauthorized access to its database. The application autonomously takes direct access to a simple website or server and retrieves the essential information maintained by the administrator. In this system, the IP address of the server is given as input in order to retrieve the user-id and password of the server. This breaks the administrative security of the server and acquires control of its database, while a virus helps to escape server security by crashing the whole server. Objective: To understand and control malicious attacks, to help protect government websites, and to identify illegal hacker activity. Results: After implementing different hacking as well as non-hacking techniques, the system was able to hack simple websites with normal security credentials. It provided access to the server database and allowed the attacker to perform database operations from a client machine. Experiments on different servers produced satisfactory results. Conclusion: In this paper, we have presented a view of how a server can be hacked, including some hacking as well as non-hacking methods. These algorithms and methods provide an efficient way to access a server database. Demonstrating how network security can be broken makes it possible to introduce new and better security frameworks. The term “hacking” should not be considered only for its illegal activities but should also be used to strengthen our global network.

Keywords: hacking, vulnerabilities, dummy request, virus, server monitoring

Procedia PDF Downloads 228
177 SISSLE in Consensus-Based Ripple: Some Improvements in Speed, Security, Last Mile Connectivity and Ease of Use

Authors: Mayank Mundhra, Chester Rebeiro

Abstract:

Cryptocurrencies are rapidly finding wide application in areas such as Real Time Gross Settlements and Payments Systems. Ripple is a cryptocurrency that has gained prominence with banks and payment providers. It solves the Byzantine Generals’ Problem with its Ripple Protocol Consensus Algorithm (RPCA), in which each server maintains a list of servers, called the Unique Node List (UNL), that represents the network for that server and is trusted not to collectively defraud it. The server believes that the network has come to a consensus when the members of its UNL come to a consensus on a transaction. In this paper we improve Ripple to achieve better speed, security, last mile connectivity, and ease of use. We implement guidelines and automated systems for building and maintaining UNLs for resilience, robustness, improved security, and efficient information propagation. We enhance the system to ensure that each server receives information from across the whole network rather than just from its UNL members. We also introduce the paradigm of UNL overlap as a function of information propagation and of the trust a server assigns to its own UNL. Our design not only reduces vulnerabilities such as eclipse attacks, but also makes it easier to identify malicious behaviour and entities attempting to fraudulently double spend or stall the system. We provide experimental evidence of the benefits of our approach over the current Ripple scheme. We observe speedups and success-rate improvements of ≥ 4.97x and 98.22x, respectively, for information propagation, and of ≥ 3.16x and 51.70x, respectively, for consensus.
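
The UNL idea can be illustrated with the toy sketch below: a server regards a transaction as agreed once a sufficient fraction of its Unique Node List has voted for it, and UNL overlap can be measured between servers. The 0.8 quorum and the overlap metric are illustrative values, not the paper's exact parameters.

```python
# Toy illustration of UNL-based agreement and UNL overlap (illustrative values).
def reached_consensus(unl, votes, quorum=0.8):
    """votes: dict mapping server id -> True/False for a given transaction."""
    agreeing = sum(1 for server in unl if votes.get(server, False))
    return agreeing / len(unl) >= quorum

def unl_overlap(unl_a, unl_b):
    """Fraction of shared members between two servers' UNLs."""
    a, b = set(unl_a), set(unl_b)
    return len(a & b) / min(len(a), len(b))

unl_1 = ["s1", "s2", "s3", "s4", "s5"]
unl_2 = ["s3", "s4", "s5", "s6", "s7"]
votes = {"s1": True, "s2": True, "s3": True, "s4": True, "s5": False}
print(reached_consensus(unl_1, votes))   # True: 4 of 5 members agree
print(unl_overlap(unl_1, unl_2))         # 0.6
```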

Keywords: Ripple, Kelips, unique node list, consensus, information propagation

Procedia PDF Downloads 108
176 Influences of Separation of the Boundary Layer on the Reservoir Pressure in the Shock Tube

Authors: Bruno Coelho Lima, Joao F.A. Martos, Paulo G. P. Toro, Israel S. Rego

Abstract:

The shock tube is a ground facility widely used in aerospace and aeronautics science and technology for studies of gas-dynamic and chemical-physical processes in gases at high temperature, explosions, and the dynamic calibration of pressure sensors. A shock tube in its simplest form comprises two tubes of equal cross-section separated by a diaphragm. The function of the diaphragm is to separate the two reservoirs, which are at different pressures. The reservoir containing high pressure is called the driver; the low-pressure reservoir is called the driven section. When the diaphragm is broken by the pressure difference, a non-stationary normal shock wave (named the incident shock wave) forms at the diaphragm location and travels toward the closed end of the driven section. When this shock wave reaches the closed end of the driven section, it is completely reflected. The reflected shock wave then interacts with the boundary layer created by the flow induced by the passage of the incident shock wave. The interaction between the boundary layer and the shock wave forces the separation of the boundary layer. The aim of this paper is to analyze the influence of this boundary layer separation on the reservoir pressure in the shock tube. A comparison among CFD (Computational Fluid Dynamics), experimental tests, and analytical analysis was performed. For the analytical analysis, routines were written in Python; the numerical simulations (CFD) used Ansys Fluent; and the experimental tests used the T1 shock tube located at IEAv (Institute of Advanced Studies).
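
The kind of analytical Python routine mentioned in the abstract can be sketched with the ideal one-dimensional shock-tube relations below (calorically perfect gas, no boundary-layer effects); the study's actual routines are not reproduced here.

```python
# Ideal-gas 1-D shock-tube relations (sketch; no boundary-layer effects).
import math

def incident_shock_ratios(Ms, gamma=1.4):
    """Pressure and temperature ratios across a normal shock of Mach number Ms."""
    p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (Ms**2 - 1.0)
    t_ratio = p_ratio * ((gamma - 1.0) * Ms**2 + 2.0) / ((gamma + 1.0) * Ms**2)
    return p_ratio, t_ratio

def diaphragm_pressure_ratio(p21, gamma1=1.4, gamma4=1.4, a1_over_a4=1.0):
    """Driver/driven pressure ratio p4/p1 producing a given shock strength p2/p1."""
    num = (gamma4 - 1.0) * a1_over_a4 * (p21 - 1.0)
    den = math.sqrt(2.0 * gamma1 * (2.0 * gamma1 + (gamma1 + 1.0) * (p21 - 1.0)))
    return p21 * (1.0 - num / den) ** (-2.0 * gamma4 / (gamma4 - 1.0))

p21, t21 = incident_shock_ratios(Ms=2.0)
print(p21, t21, diaphragm_pressure_ratio(p21))   # 4.5, ~1.69, required p4/p1
```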

Keywords: boundary layer separation, moving shock wave, shock tube, transient simulation

Procedia PDF Downloads 291
175 Counter-Terrorism Policies in the Wider Black Sea Region: Evaluating the Robustness of Constantza Port under Potential Terror Attacks

Authors: A. V. Popa, C. Barna, V. Mihalache

Abstract:

Being the largest port on the Black Sea and functioning as a civil and military nodal point between Europe and Asia, Constantza Port has become a potential target on the international terrorist agenda. The authors use qualitative research based on both face-to-face and online semi-structured interviews with relevant stakeholders (top decision-makers in the Romanian Naval Authority, the Romanian Maritime Training Centre, the National Company "Maritime Ports Administration", and military staff) in order to detect potential vulnerabilities which might be exploited by terrorists in the case of Constantza Port. This likewise enables bringing together the experts’ opinions on potential mitigation measures. Subsequently, this paper formulates various counter-terrorism policies to enhance the robustness of Constantza Port under potential terror attacks and connects them with the critical infrastructure protection attributions conferred by law on the lead national authority for preventing and countering terrorism, namely the Romanian Intelligence Service. Extending the national counterterrorism efforts to the international level, the authors propose the establishment – among the experts of the NATO member states of the Wider Black Sea Region – of a platform for the exchange of know-how and best practices in the field of critical infrastructure protection.

Keywords: Constantza Port, counter-terrorism policies, critical infrastructure protection, security, Wider Black Sea Region

Procedia PDF Downloads 274
174 Land Suitability Prediction Modelling for Agricultural Crops Using Machine Learning Approach: A Case Study of Khuzestan Province, Iran

Authors: Saba Gachpaz, Hamid Reza Heidari

Abstract:

The sharp increase in population growth puts more pressure on agricultural areas to satisfy the food supply. Achieving this requires consuming more resources and, among other environmental concerns, highlights the need for sustainable agricultural development. Land-use management is a crucial factor in obtaining optimum productivity. Machine learning is a widely used technique in the agricultural sector, from yield prediction to customer behavior; it learns patterns and correlations from the data set. In this study, nine physical control factors, namely soil classification, electrical conductivity, normalized difference water index (NDWI), groundwater level, elevation, annual precipitation, pH of water, annual mean temperature, and slope in the alluvial plain of Khuzestan (an agricultural hotspot in Iran), are used to decide the best agricultural land use for both rainfed and irrigated agriculture for ten different crops. For this purpose, each variable was imported into ArcGIS and a raster layer was obtained. In the next step, using training samples, all layers were imported into the Python environment. A random forest model was applied, and the weight of each variable was determined. In the final step, results were visualized using a digital elevation model, and the importance of all factors for each of the crops was obtained. Our results show that although 62% of the study area is allocated to agricultural purposes, only 42.9% of these areas can be defined as a suitable class for cultivation purposes.
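
The random-forest step described above can be sketched as follows; the column names and training file are assumptions about how the sampled raster layers might be organized.

```python
# Sketch: fit a random forest on training samples of the nine factors and
# report the weight (importance) of each variable. Column names are assumed.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

factors = ["soil_class", "electrical_conductivity", "ndwi", "groundwater_level",
           "elevation", "annual_precipitation", "water_ph",
           "annual_mean_temperature", "slope"]

samples = pd.read_csv("training_samples.csv")      # points sampled from the raster layers
X, y = samples[factors], samples["suitability_class"]

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X, y)

for name, weight in sorted(zip(factors, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {weight:.3f}")
# rf.predict() would then be applied cell by cell to map suitability classes.
```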

Keywords: land suitability, machine learning, random forest, sustainable agriculture

Procedia PDF Downloads 55
173 A Modular and Reusable Bond Graph Model of Epithelial Transport in the Proximal Convoluted Tubule

Authors: Leyla Noroozbabaee, David Nickerson

Abstract:

We introduce a modular, consistent, reusable bond graph model of the renal nephron’s proximal convoluted tubule (PCT), which can reproduce biological behaviour. In this work, we focus on ion and volume transport in the proximal convoluted tubule of the renal nephron. Modelling complex systems requires complex modelling problems to be broken down into manageable pieces. This can be enabled by developing models of subsystems that are subsequently coupled hierarchically, which bond graphs support because they are based on a graph structure. In the current work, we define two modular subsystems: the resistive module representing the membrane and the capacitive module representing the solution compartments. Each module is analyzed in terms of thermodynamic processes, and all the subsystems are reintegrated using circuit theory in network thermodynamics. The epithelial transport system we introduce in the current study consists of five transport membranes and four solution compartments. Coupled dissipations in the system occur in the membrane subsystems, and coupled free-energy-increasing or -decreasing processes appear in the solution compartment subsystems. These structural subsystems also consist of elementary thermodynamic processes: dissipation, free-energy change, and power conversion. We provide free and open access to the Python implementation to ensure our model is accessible, enabling readers to explore the model by setting up their own simulations and reproducibility tests.
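
The resistive-membrane / capacitive-compartment coupling can be illustrated with the deliberately simplified sketch below (chemical-potential-driven flux dissipating free energy between two storage compartments); it is not the authors' actual bond graph implementation, and the parameter values are arbitrary.

```python
# Simplified resistive (membrane) + capacitive (compartment) coupling.
# Parameter values are arbitrary and for illustration only.
import numpy as np

R_GAS, T = 8.314, 310.0                      # J/(mol K), K

def chemical_potential(c, mu0=0.0):
    return mu0 + R_GAS * T * np.log(c)

def simulate(c1=10.0, c2=1.0, resistance=5e4, volume=1.0, dt=0.01, steps=2000):
    for _ in range(steps):
        # resistive module: flux driven by the chemical potential difference
        flux = (chemical_potential(c1) - chemical_potential(c2)) / resistance
        # capacitive modules: compartments store or release solute
        c1 -= flux * dt / volume
        c2 += flux * dt / volume
    return c1, c2

print(simulate())                            # concentrations relax toward equality
```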

Keywords: bond graph, epithelial transport, water transport, mathematical modeling

Procedia PDF Downloads 59
172 A Comparative Assessment of the Food Supply Vulnerability to Large-Scale Disasters in OECD Countries

Authors: Karolin Bauer, Anna Brinkmann

Abstract:

Vulnerabilities in critical infrastructure can cause significant difficulties for the affected population during crises. Securing the food supply, as part of the critical infrastructure, in crisis situations is an essential part of public services and a cornerstone of a successful concept of civil protection. In most industrialized countries, there are currently no comparative studies of the food supply of the population during crisis and disaster events. In order to mitigate the potential impact of major disasters in Germany, it is absolutely necessary to investigate how the food supply can be secured. The research project aims to provide in-depth research on the experiences gathered during past large-scale disasters in the 34 OECD member countries in order to discover alternatives for an updated civil protection system in Germany. The basic research question is: "Which international approaches and structures of civil protection have been proven and would be useful to modernize German civil protection with regard to critical infrastructure and the food supply?" Research findings are extracted from an extensive literature review covering the entire research period as well as from personal and online-based interviews with experts and responsible persons from the institutions involved. The strength of the research project lies in its deliberate choice to investigate previous large-scale disasters in order to formulate important and practical approaches to modernizing civil protection in Germany.

Keywords: food supply, vulnerability, critical infrastructure, large-scale disaster

Procedia PDF Downloads 313
171 Landslide Vulnerability Assessment in the Context of the Indian Himalaya

Authors: Neha Gupta

Abstract:

Landslide vulnerability is considered a crucial parameter for the assessment of landslide risk. The term vulnerability is defined as the damage to, or degree of loss of, elements at risk along different dimensions, i.e., physical, social, economic, and environmental. The Himalaya region is highly prone to multiple hazards such as floods, forest fires, earthquakes, and landslides. Increasing fatality rates and losses of infrastructure and economic assets due to landslides in the Himalaya region motivate the assessment of vulnerability. In this study, a methodology is presented to measure the combination of vulnerability dimensions, i.e., social vulnerability, physical vulnerability, and environmental vulnerability, in one framework. Such a combined assessment has rarely been carried out, and no such approach had been applied in the Indian scenario. The methodology was applied to an area of east Sikkim Himalaya, India. The physical vulnerability comprises a building footprint layer extracted from remote sensing data and Google Earth imagery. The social vulnerability was assessed using population density based on land use. The land use map was derived from a high-resolution satellite image, and for the environmental vulnerability assessment, NDVI, forest, agricultural land, and distance from the river were assessed from remote sensing data and a DEM. The classes of social vulnerability, physical vulnerability, and environmental vulnerability were normalized on a scale of 0 (no loss) to 1 (loss) to obtain a homogeneous dataset. Multi-Criteria Analysis (MCA) was then used to assign individual weights to each dimension and integrate them into one framework. The final vulnerability was further classified into four classes from very low to very high.
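
The normalization and multi-criteria combination step can be sketched as follows; the weights are illustrative, not those assigned in the study.

```python
# Sketch: normalize vulnerability layers to 0-1, combine with MCA weights,
# and classify into four classes (very low to very high). Weights are illustrative.
import numpy as np

def normalize(layer):
    layer = np.asarray(layer, dtype=float)
    return (layer - layer.min()) / (layer.max() - layer.min())

def combined_vulnerability(social, physical, environmental, weights=(0.4, 0.35, 0.25)):
    layers = [normalize(social), normalize(physical), normalize(environmental)]
    total = sum(w * l for w, l in zip(weights, layers))
    return np.digitize(total, bins=[0.25, 0.5, 0.75])   # 0..3 -> four classes

social = np.random.rand(100, 100)            # stand-ins for the derived raster layers
physical = np.random.rand(100, 100)
environmental = np.random.rand(100, 100)
classes = combined_vulnerability(social, physical, environmental)
print(np.bincount(classes.ravel()))          # cell count per vulnerability class
```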

Keywords: landslide, multi-criteria analysis, MCA, physical vulnerability, social vulnerability

Procedia PDF Downloads 282
170 Copper Price Prediction Model for Various Economic Situations

Authors: Haidy S. Ghali, Engy Serag, A. Samer Ezeldin

Abstract:

Copper is an essential raw material used in the construction industry. During 2021 and the first half of 2022, the global market suffered a significant fluctuation in copper raw material prices due to the aftermath of both the COVID-19 pandemic and the Russia-Ukraine war, which exposed consumers to an unexpected financial risk. Therefore, this paper aims to develop two ANN-LSTM price prediction models, using Python, that can forecast the average monthly copper prices traded on the London Metal Exchange; the first is a multivariate model that forecasts the copper price for the next month, and the second is a univariate model that predicts the copper prices of the upcoming three months. Historical data on average monthly London Metal Exchange copper prices were collected from January 2009 to July 2022, and potential external factors were identified and employed in the multivariate model. These factors lie under three main categories, including energy prices and economic indicators of the three major copper-exporting countries, depending on data availability. Before developing the LSTM models, the collected external parameters were analyzed with respect to the copper prices using correlation and multicollinearity tests in the R software, and the parameters were further screened to select those that influence copper prices. The two LSTM models were then developed, and the dataset was divided into training, validation, and testing sets. The results show that the performance of the 3-month prediction model is better than that of the 1-month prediction model, but both models can act as predicting tools for diverse economic situations.
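
A univariate LSTM of the kind described might be sketched as below, forecasting the next month's price from a 12-month window; the window length, layer sizes, and input file are assumptions rather than the paper's settings.

```python
# Sketch: univariate LSTM forecasting next month's average copper price.
import numpy as np
import pandas as pd
from tensorflow.keras import Sequential
from tensorflow.keras.layers import LSTM, Dense

prices = pd.read_csv("lme_copper_monthly.csv")["price"].to_numpy(dtype=float)
prices = (prices - prices.mean()) / prices.std()          # simple scaling

window = 12
X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:]
X = X[..., np.newaxis]                                     # (samples, timesteps, features)

model = Sequential([
    LSTM(64, input_shape=(window, 1)),
    Dense(1),                                              # next-month price
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=100, validation_split=0.2, verbose=0)
```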

Keywords: copper prices, prediction model, neural network, time series forecasting

Procedia PDF Downloads 84
169 Exploring Influence Range of Tainan City Using Electronic Toll Collection Big Data

Authors: Chen Chou, Feng-Tyan Lin

Abstract:

Big data has attracted a lot of attention in many fields for analyzing research issues based on large volumes of data. Electronic Toll Collection (ETC) is one of the Intelligent Transportation System (ITS) applications in Taiwan, used to record the starting point, end point, distance, and travel time of vehicles on the national freeway. Taking advantage of ETC big data, combined with urban planning theory, this study attempts to explore various phenomena of inter-city transportation activities. ETC data, one of the government's open data sources, are numerous, complete, and quickly updated. One may recall that living areas have traditionally been delimited by location, population, area, and subjective consciousness. However, these factors cannot appropriately reflect people’s actual movement paths in daily life. In this study, the concept of "living area" is replaced by "influence range" to capture the dynamics and variation of movement with time and the purposes of activities. This study uses data mining with Python and Excel, and visualizes the number of trips with GIS, to explore the influence range of Tainan City and the purposes of trips, and to discuss how living areas are currently delimited. It creates a dialogue between the concepts of "central place theory" and "living area", presents a new point of view, and integrates the application of big data, urban planning, and transportation. The findings will be valuable for resource allocation and land apportionment in spatial planning.
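
The trip aggregation underlying an influence-range map can be sketched with pandas as below; the column names are assumptions about the ETC record layout.

```python
# Sketch: count ETC trips that start or end in Tainan, grouped by partner city.
import pandas as pd

trips = pd.read_csv("etc_records.csv")       # start point, end point, distance, travel time

tainan = trips[(trips["origin"] == "Tainan") | (trips["destination"] == "Tainan")]
influence = (tainan
             .assign(partner=lambda d: d["destination"].where(d["origin"] == "Tainan",
                                                              d["origin"]))
             .groupby("partner")
             .agg(trip_count=("travel_time", "size"),
                  mean_travel_time=("travel_time", "mean"))
             .sort_values("trip_count", ascending=False))
print(influence.head(10))                    # partner cities, ready for GIS visualization
```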

Keywords: Big Data, ITS, influence range, living area, central place theory, visualization

Procedia PDF Downloads 246
168 A Grey-Box Text Attack Framework Using Explainable AI

Authors: Esther Chiramal, Kelvin Soh Boon Kai

Abstract:

Explainable AI is a strong strategy implemented to understand complex black-box model predictions in a human-interpretable language. It provides the evidence required to deploy trustworthy and reliable AI systems. On the other hand, however, it also opens the door to locating possible vulnerabilities in an AI model. Traditional adversarial text attacks use word substitution, data augmentation techniques, and gradient-based attacks on powerful pre-trained Bidirectional Encoder Representations from Transformers (BERT) variants to generate adversarial sentences. These attacks are generally white-box in nature and not practical, as they can be easily detected by humans, e.g., changing the word from “Poor” to “Rich”. We propose a simple yet effective grey-box cum black-box approach that does not require knowledge of the target model, while using a set of surrogate Transformer/BERT models to perform the attack with explainable AI techniques. As Transformers are the current state-of-the-art models for almost all Natural Language Processing (NLP) tasks, an attack generated from one BERT variant is transferable to another. This transferability is made possible by the attention mechanism in the Transformer, which allows the model to capture long-range dependencies in a sequence. Using the power of BERT generalization via attention, we attempt to exploit how Transformers learn by attacking a few surrogate Transformer variants, each based on a different architecture. We demonstrate that this approach is highly effective at generating semantically good sentences by changing as little as one word, in a way that is not detectable by humans while still fooling other BERT models.
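
A model-agnostic core of such a grey-box attack can be sketched as below: each word is scored by how much a surrogate model's confidence drops when the word is removed, and only the most influential word is substituted. The `surrogate_score` function and the substitute list are placeholders; in practice they would wrap a surrogate BERT classifier and an embedding-based synonym search.

```python
# Sketch: occlusion-based word importance followed by a single-word substitution.
def word_importance(sentence, surrogate_score):
    words = sentence.split()
    base = surrogate_score(sentence)
    scores = []
    for i in range(len(words)):
        occluded = " ".join(words[:i] + words[i + 1:])
        scores.append((base - surrogate_score(occluded), i, words[i]))
    return sorted(scores, reverse=True)            # biggest confidence drop first

def attack(sentence, surrogate_score, substitutes):
    _, idx, word = word_importance(sentence, surrogate_score)[0]
    words = sentence.split()
    words[idx] = substitutes.get(word.lower(), word)
    return " ".join(words)

# Toy scorer standing in for a surrogate transformer's softmax output.
toy_score = lambda s: 0.9 if "poor" in s.lower() else 0.4
print(attack("The service was poor and slow", toy_score, {"poor": "mediocre"}))
```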

Keywords: BERT, explainable AI, Grey-box text attack, transformer

Procedia PDF Downloads 114
167 Assessing the Resilience to Economic Shocks of the Households in Bistekville 2, Quezon City, Philippines

Authors: Maria Elisa B. Manuel

Abstract:

The Philippine housing sector is bracing for challenges, with a massive housing backlog and a persistent cycle of relocation, resettlement, and return of informal settler families to the cities due to the inaccessibility of necessities and opportunities in past off-city housing projects. Bistekville 2 was established as a model socialized housing project, using government partnerships with private developers and individuals in the first in-city and onsite resettlement effort in the country. The study looked into the residents' resilience to idiosyncratic economic shocks by analyzing their vulnerabilities, assets, and coping strategies. The study formulated an economic resilience framework to identify how these factors interact to build a household's capacity to adapt positively to sudden expenses. The framework is supplemented with a scale that locates a household's proximity to resilience by identifying, through its indicators, whether the household is at the level of subsistence, coping, adaptation, or transformation. Survey interviews were conducted with 91 households from Bistekville 2 on the components identified by the framework, and the data were processed with qualitative and quantitative methods. The study found that the households are highly vulnerable due to their family composition and other conditions such as unhealthy loans and inconsistent amortization payments. Along with their high vulnerability, the households have inadequate strategies to anticipate shocks and primarily react to a shock after it occurs. This leads to the conclusion that the households do not exhibit resilience to idiosyncratic economic shocks and are still at the level of coping.

Keywords: idiosyncratic economic shocks, socialized housing, economic resilience, economic vulnerability, adaptive capacity

Procedia PDF Downloads 117
166 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication

Authors: Vedant Janapaty

Abstract:

Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the “kidneys of our planet”, they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and loss of native species are ailing our wetlands. There is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate derived metrics with in situ observations collected at five estuaries. Satellite images were processed in Python to calculate seven spectral indices (SIs), and average SI values were calculated per month for 23 years. Publicly available data from six sites at ELK were used to obtain ten in situ parameters (OPs), whose average values were likewise calculated per month for 23 years. Linear correlations between the seven SIs and ten OPs were computed and found to be inadequate (correlation = 1 to 64%). Fourier transform analysis was then performed on the seven SIs. Dominant frequencies and amplitudes were extracted for the seven SIs, and a machine learning (ML) model was trained, validated, and tested for the ten OPs. Better correlations were observed between SIs and OPs with certain time delays (0-, 3-, 4-, and 6-month delays), and ML was performed again. The OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It shows that remote sensing can be used to develop correlations with critical in situ parameters that measure eutrophication, and can be used by practitioners to easily monitor wetland health.
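
The Fourier step described can be sketched as follows; the synthetic monthly series stands in for a real spectral-index time series.

```python
# Sketch: dominant frequency and amplitude of a monthly spectral-index series.
import numpy as np

months = np.arange(276)                                   # 23 years of monthly values
si = 0.3 + 0.1 * np.sin(2 * np.pi * months / 12) + 0.02 * np.random.randn(276)

spectrum = np.fft.rfft(si - si.mean())
freqs = np.fft.rfftfreq(len(si), d=1.0)                   # cycles per month
amplitudes = np.abs(spectrum) / len(si) * 2

k = np.argmax(amplitudes)
print(f"dominant period: {1 / freqs[k]:.1f} months, amplitude: {amplitudes[k]:.3f}")
# The per-index dominant frequencies and amplitudes become the ML model's features.
```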

Keywords: estuary, remote sensing, machine learning, Fourier transform

Procedia PDF Downloads 69
165 Identification of Flooding Attack (Zero Day Attack) at Application Layer Using Mathematical Model and Detection Using Correlations

Authors: Hamsini Pulugurtha, V.S. Lakshmi Jagadmaba Paluri

Abstract:

Distributed denial of service (DDoS) attacks are among the top-rated cyber threats at present. A DDoS attack exhausts victim server resources such as bandwidth and buffer size by preventing the server from supplying resources to legitimate clients. In this paper, we propose a mathematical model of a DDoS attack and discuss its dependence on features such as the inter-arrival time, or rate of arrival, of the attacking clients accessing the server. We further analyze the attack model in the context of exhausting the bandwidth and buffer size of the victim server. The proposed technique uses an unsupervised learning technique, the self-organizing map, to form clusters of similar features. Finally, the approach applies mathematical correlation and the normal probability distribution to the clusters and analyzes their behavior to detect a DDoS attack. The systems concerned not only interconnect small devices exchanging personal data but also critical infrastructures reporting the status of nuclear facilities. Although this interconnection brings many benefits and advantages, it also creates new vulnerabilities and threats that can be used to mount attacks. In such sophisticated interconnected systems, the ability to detect attacks as early as possible is of paramount importance.
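
The unsupervised clustering step can be sketched with a self-organizing map over per-client traffic features, as below; the third-party `minisom` package is used here as a stand-in SOM implementation, and the feature values are synthetic.

```python
# Sketch: SOM clustering of per-client traffic features (synthetic data).
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(0)
legit = np.column_stack([rng.normal(1.0, 0.3, 500),       # inter-arrival time (s)
                         rng.normal(2.0, 0.5, 500)])      # requests per second
flood = np.column_stack([rng.normal(0.05, 0.02, 100),     # near-zero inter-arrival
                         rng.normal(50.0, 5.0, 100)])     # very high request rate
data = np.vstack([legit, flood])
data = (data - data.mean(axis=0)) / data.std(axis=0)

som = MiniSom(5, 5, input_len=2, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(data, 1000)

# Map each client to its winning node; the correlation and probability-distribution
# analysis would then be applied to the resulting clusters.
cells = [som.winner(x) for x in data]
print(set(cells[500:]) - set(cells[:500]))                # nodes used only by flood traffic
```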

Keywords: application layer attack, bandwidth, buffer size, correlation, DDoS, flooding, intrusion prevention, normal probability distribution

Procedia PDF Downloads 194
164 Experiences of Marital Relationship of Middle-Aged Couples in Hong Kong: Implications for Services Interventions

Authors: Wai M. Shum

Abstract:

There is evidence that changes in marital satisfaction are related to the different stages of the family life cycle. Research studies, largely based on western contexts, have found a curvilinear U-shaped trend in changes of marital satisfaction over the course of a marriage, but little is known about the marital experiences of Hong Kong couples. Through in-depth interviews, this qualitative study explored the marital relationships of middle-aged couples in satisfying marriages and identified how couples maintain a satisfying relationship in the local context. Findings from this study suggested twelve themes, some showing consistency with previous literature, such as communication, companionship, trust, and fidelity. The affective aspects of empathetic understanding and perceived empathy were found to have an enormous effect on couples’ bondedness. A high level of differentiation and security served as a basis for unconditional contribution, acceptance, and adjustment to unsolvable issues, such that negative emotion would not be escalated. The manifestations of intimacy and commitment in the triangular theory of love were more frequently addressed than passion in striving for marital longevity in the local context. This study challenged the curvilinear trend of marital satisfaction throughout marriage, with couples showing different pathways of marital satisfaction. The study gives insights on marital enrichment, such as facilitating couples to disclose their vulnerabilities, desire for physical intimacy, and passion in the pursuit of an enduring marriage, instead of emphasizing skills training on communication and conflict resolution.

Keywords: intimacy, marital relationship, marital satisfaction, middle-aged

Procedia PDF Downloads 89
163 Stigma Associated with Invisible Disabilities and Its Effect on Intended Disclosure in the Workplace

Authors: Jessica Lynne Hicksted

Abstract:

Disability discrimination is a long-standing issue that, despite protections, continues to result in unemployment, underemployment, and lack of advancement for disabled persons. Visible stigma is researched substantially; however, less is known about the impact of stigma associated with identities that can be concealed. Although researchers have investigated this issue, there is currently no tool to measure this phenomenon. The purpose of this quantitative study was to create and validate a new tool to measure stigma associated with invisible disabilities. The study is grounded in Roberts’ conceptual model of professional image construction integrating social identity, impression management, and organizational behavior; Meisenbach’s stigma management communication theory, which addresses vulnerability and resilience to stigma communication by focusing on how individuals encounter and react to perceived stigmas; and Kelley and Michela’s causal attribution theory. Participants included 1,412 adults in the United States aged 18 years or older who are currently employed or have been employed within the last 5 years. Confirmatory factor analysis of the new Workplace Invisible Disabilities Experience scale showed excellent fit of the factor structure to the data, χ²/df = 1.855, CFI = .955, RMSEA = .045, p = .0001. The scale has three subscales, Ableism, Advocacy, and Acceptance, with excellent internal consistency reliability. The total score, Advocacy, and Acceptance were associated with intention to disclose. Implications for positive social change include helping organizations understand the extent of invisible disability stigma, which can help improve workplace performance and satisfaction.

Keywords: invisible disabilities, accommodations, acceptance, social change, workplace inclusion

Procedia PDF Downloads 43
162 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images

Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn

Abstract:

The detection and segmentation of mitochondria from fluorescence microscopy images are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make the task of detection and segmentation challenging. In the literature, a number of open-source software tools and artificial intelligence (AI) methods have been described for analyzing mitochondrial images, achieving remarkable classification and quantitation results. However, the combined expertise in the medical field and in AI required to utilize these tools poses a challenge to their full adoption and use in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimum detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the low-cost, open-source Python language and the OpenCV library, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated using a mitochondrial fluorescence dataset. Ground truth labels generated using a labeling kit were also used to evaluate the performance of the detection and segmentation model. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology from the fluorescence dataset. A discussion of the methods and future perspectives of fully automated frameworks concludes the paper.
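
The three-stage pipeline can be sketched with OpenCV as follows; the thresholds and minimum-area filter are illustrative values, not the study's.

```python
# Sketch of the pipeline: pre-processing, binarization, coarse-to-fine segmentation.
import cv2
import numpy as np

img = cv2.imread("mitochondria.png", cv2.IMREAD_GRAYSCALE)   # placeholder file name

# 1. Pre-processing: contrast-limited adaptive histogram equalization (CLAHE)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
blurred = cv2.GaussianBlur(clahe.apply(img), (5, 5), 0)

# 2. Image binarization: Otsu threshold separates foreground from background
_, binary = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# 3. Coarse-to-fine segmentation: clean up, then keep components by size/shape
opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
n, labels, stats, _ = cv2.connectedComponentsWithStats(opened)
mitochondria = [i for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > 30]
print(f"detected {len(mitochondria)} candidate mitochondria")
```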

Keywords: 2D, binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation

Procedia PDF Downloads 336
161 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects

Authors: Ma Yuzhe, Burra Venkata Durga Kumar

Abstract:

The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware is also rising to handle more complex processing and operations. The operating system, which provides the soul of the computer, has at times stalled in its development, however. Faced with the high price of UNIX (Uniplexed Information and Computing System), batch after batch of personal computer owners could only give up. The Disk Operating System is too simple and leaves little room for innovation, so it is not a good choice. MacOS is a special operating system for Apple computers and cannot be widely used on other personal computers. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of these operating systems; its core architecture, built from many kernel modules, is relatively powerful. The Linux system supports all Internet protocols, so it has very good networking functionality. Linux supports multiple users, and no user can interfere with another user's files. Linux can also multitask, running different programs independently at the same time. Linux is a completely open-source operating system: users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers. The Linux system is also constantly upgraded and improved, and many different versions have been issued, suitable for both community and commercial use. The Linux system has good security, partly because of its file permission system. However, because vulnerabilities and hazards are constantly emerging, the security of the operating system in use also needs continued attention. This article focuses on the analysis and discussion of Linux security issues.

Keywords: Linux, operating system, system management, security

Procedia PDF Downloads 83
160 The Management of Climate Change by Indigenous People: A Focus on Himachal Pradesh, India

Authors: Anju Batta Sehgal

Abstract:

Climate change is a major challenge in terms of agriculture, food security, and rural livelihoods for thousands of people, especially the poor, in Himachal Pradesh, which falls in the North-Western Himalayas. Agriculture contributes over 45 per cent of the net state domestic product. It is the main source of income and employment. Over 93 per cent of the population depends on agriculture, which provides direct employment to 71 per cent of its people. The operational holding area is about 9.79 lakh hectares, owned by 9.14 lakh farmers. About 80 per cent of this area is rain-fed, and farmers depend on the weather gods for rain. The region is home to diverse ethnic communities with enormous socio-economic and cultural diversity and is gifted with a range of farming systems and rich resource wealth, including biodiversity hotspots and ecosystems sustaining millions of people living in the region. However, growing demands for ecosystem goods and services are posing threats to natural resources. Climate change is already having an adverse impact on indigenous people. The rural populace depends directly on the climate for all its food, shelter, and other needs. Our aim should be to shift the focus to indigenous people as primary actors in global climate change monitoring, adaptation, and innovation. The objective of this paper is to identify the climate change related threats and vulnerabilities associated with agriculture as a sector and agriculture as people’s livelihood. Broadly, it analyses the connections between nature and the rural consumers, the ethnic groups.

Keywords: climate change, agriculture, indigenous people, Himachal Pradesh

Procedia PDF Downloads 250
159 Monitor Student Concentration Levels on Online Education Sessions

Authors: M. K. Wijayarathna, S. M. Buddika Harshanath

Abstract:

Monitoring student engagement has become a crucial part of the educational process and a reliable indicator of the capacity to retain information. As online learning classrooms are now more common, students' attention levels have become increasingly important, and it is more difficult to check each student's concentration level in an online classroom setting. To profile student attention across various gradients of engagement, a study is planned using machine learning models. Using a convolutional neural network, the findings and confidence score of a high-accuracy model are to be obtained. In this research, convolutional neural networks are used to help discover the essential emotions that are critical in defining various levels of participation. Students' attention levels have been shown to be influenced by emotions such as calm, enjoyment, surprise, and fear. An improved virtual learning system is to be created from these data, allowing teachers to focus their support and advice on those students who need it. Student participation is a crucial component of the learning technique and a consistent predictor of a student's capacity to retain material in the classroom. Convolutional neural networks are planned for the implementation of the platform. As a preliminary step, a video of the pupil is taken; frames from the recordings are then processed by a convolutional neural network built with the Keras toolkit. Two convolutional neural network methods are planned to be used to determine the pupils' attention level. Finally, the predicted student attention level results are to be displayed on the graphical user interface of the system.
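
The planned inference loop might resemble the sketch below: frames are grabbed from the student's video, a trained emotion CNN classifies each frame, and the emotion is mapped to an attention level for display. The model file, class list, and emotion-to-attention mapping are assumptions, not the system's actual ones.

```python
# Sketch: per-frame emotion prediction mapped to an attention level (assumed model).
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["calm", "enjoyment", "surprise", "fear"]          # assumed class order
ATTENTION = {"calm": "high", "enjoyment": "high",
             "surprise": "medium", "fear": "low"}              # assumed mapping

model = load_model("emotion_cnn.h5")                           # hypothetical trained model
capture = cv2.VideoCapture("student_session.mp4")

while True:
    ok, frame = capture.read()
    if not ok:
        break
    face = cv2.cvtColor(cv2.resize(frame, (48, 48)), cv2.COLOR_BGR2GRAY)
    probs = model.predict(face[np.newaxis, ..., np.newaxis] / 255.0, verbose=0)[0]
    emotion = EMOTIONS[int(np.argmax(probs))]
    print(emotion, "->", ATTENTION[emotion])                   # shown on the GUI
capture.release()
```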

Keywords: HTML5, JavaScript, Python Flask framework, AI, graphical user interface

Procedia PDF Downloads 69
158 Estimation of Particle Size Distribution Using Magnetization Data

Authors: Navneet Kaur, S. D. Tiwari

Abstract:

Magnetic nanoparticles possess fascinating properties that make their behavior unique in comparison to the corresponding bulk materials. Superparamagnetism is one such interesting phenomenon exhibited only by small particles of magnetic materials. In this state, the thermal energy of the particles becomes greater than their magnetic anisotropy energy, so the particle magnetic moment vectors fluctuate between states of minimum energy. This situation is similar to the paramagnetism of non-interacting ions and is termed superparamagnetism. The magnetization of such systems has been described by the Langevin function, but the fit parameters estimated in this way are found to be unphysical. This is due to neglect of the particle size distribution. In this work, an analysis of magnetization data on NiO nanoparticles is presented that takes the effect of the particle size distribution into account. NiO nanoparticles of two different sizes were prepared by heating freshly synthesized Ni(OH)₂ at different temperatures. Room temperature X-ray diffraction patterns confirm the formation of a single phase of NiO. The diffraction lines are quite broad, indicating the nanocrystalline nature of the samples. The average crystallite sizes are estimated to be about 6 and 8 nm. The samples were also characterized by transmission electron microscopy. The magnetization of both samples was measured as a function of temperature and applied magnetic field. Zero-field-cooled and field-cooled magnetization were measured as a function of temperature to determine the bifurcation temperature. The magnetization was also measured at several temperatures in the superparamagnetic region. The data were fitted to an appropriate expression incorporating a distribution in particle size, following a least-squares fit procedure. The computer codes were written in Python. The presented analysis is found to be very useful for estimating the particle size distribution present in the samples. The estimated distributions are compared with those determined from transmission electron micrographs.
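
The fitted expression can be sketched as a Langevin response averaged over a log-normal distribution of particle moments, fitted by least squares with SciPy; the parameter values and units below are illustrative, not those of the NiO samples.

```python
# Sketch: Langevin magnetization averaged over a log-normal moment distribution,
# fitted with scipy.optimize.curve_fit. Values are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

KB = 1.380649e-23          # Boltzmann constant, J/K
MU_B = 9.274e-24           # Bohr magneton, J/T

def langevin(x):
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    small = np.abs(x) < 1e-6
    out[small] = x[small] / 3.0
    out[~small] = 1.0 / np.tanh(x[~small]) - 1.0 / x[~small]
    return out

def magnetization(H, m_sat, mu_median, sigma, T=300.0):
    """Moment-weighted Langevin response for a log-normal distribution of moments (in mu_B)."""
    mu = np.exp(np.linspace(np.log(mu_median) - 4 * sigma,
                            np.log(mu_median) + 4 * sigma, 200))
    weights = np.exp(-(np.log(mu / mu_median)) ** 2 / (2 * sigma ** 2)) / mu
    weights /= np.trapz(weights, mu)
    arg = np.outer(H, mu) * MU_B / (KB * T)
    return m_sat * np.trapz(langevin(arg) * mu * weights, mu, axis=1) / np.trapz(mu * weights, mu)

# H (tesla) and M would come from the measured isotherms; here they are synthetic.
H = np.linspace(-5, 5, 101)
M_meas = magnetization(H, 1.0, 2000.0, 0.5) + 0.01 * np.random.randn(H.size)
popt, _ = curve_fit(magnetization, H, M_meas, p0=[1.0, 1500.0, 0.4])
print("fitted saturation, median moment (mu_B), sigma:", popt)
```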

Keywords: anisotropy, magnetization, nanoparticles, superparamagnetism

Procedia PDF Downloads 106
157 Multiscale Hub: An Open-Source Framework for Practical Atomistic-To-Continuum Coupling

Authors: Masoud Safdari, Jacob Fish

Abstract:

Despite the vast amount of existing theoretical knowledge, the implementation of a universal multiscale modeling, analysis, and simulation software framework remains challenging. Existing multiscale software and solutions are often domain-specific and closed-source and demand a high level of experience and skill in both multiscale analysis and programming. Furthermore, tools currently existing for Atomistic-to-Continuum (AtC) multiscaling are developed with assumptions such as users having access to high-performance computing facilities. These and many other challenges have reduced the adoption of multiscale modeling in academia and especially in industry. In the current work, we introduce Multiscale Hub (MsHub), an effort towards making AtC more accessible through cloud services. As a joint effort between academia and industry, MsHub provides a universal web-enabled framework for practical multiscaling. Developed on top of the widely used scientific programming language Python, the package currently provides an open-source, comprehensive, easy-to-use framework for AtC coupling. MsHub offers an easy-to-use interface to prominent molecular dynamics and multiphysics continuum mechanics packages such as LAMMPS and MFEM (a free, lightweight, scalable C++ library for finite element methods). In this work, we first report on the design philosophy of MsHub and the challenges and issues faced regarding its implementation. MsHub takes advantage of a comprehensive set of tools and algorithms developed for AtC that can be used for a variety of governing physics. We then briefly report key AtC algorithms implemented in MsHub. Finally, we conclude with a few examples illustrating the capabilities of the package and its future directions.

Keywords: atomistic, continuum, coupling, multiscale

Procedia PDF Downloads 156