Search results for: Python vulnerabilities
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 396

156 Development of Medical Intelligent Process Model Using Ontology Based Technique

Authors: Emmanuel Chibuogu Asogwa, Tochukwu Sunday Belonwu

Abstract:

The rapid expansion of medical knowledge, the complexity of patient care, and the requirement for more precise decision-making have created an urgent demand for creative solutions. The creation of a Medical Intelligent Process Model (MIPM) using ontology-based techniques appears to be a promising way to overcome this obstacle and unleash the full potential of healthcare systems. The development of the MIPM is motivated by the lack of quick access to relevant medical information and of advanced tools for treatment planning and clinical decision-making, both of which ontology-based techniques can provide. The aim of this work is to develop a structured and knowledge-driven framework that leverages an ontology, a formal representation of domain knowledge, to enhance various aspects of healthcare. The Object-Oriented Analysis and Design Methodology (OOADM) was adopted in the design of the system, as we desired to build a usable and evolvable application. The medical dataset used to test our model was obtained from Kaggle. The ontology-based technique was implemented and evaluated using a confusion matrix, together with MySQL, Python, Hypertext Markup Language (HTML), PHP: Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript, Dreamweaver, and Fireworks. According to the confusion-matrix test results on the new system, both the accuracy and overall effectiveness of the medical intelligent process improved significantly, by 20%, compared to the previous system. The use of the model is therefore recommended for healthcare professionals.
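
The evaluation above relies on a confusion matrix; a minimal sketch of that step in Python, assuming scikit-learn and hypothetical label arrays `y_true` and `y_pred` in place of the Kaggle test data:

```python
from sklearn.metrics import confusion_matrix, accuracy_score

# Hypothetical ground-truth and predicted labels for a test split
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

# Rows are actual classes, columns are predicted classes
cm = confusion_matrix(y_true, y_pred)
print("Confusion matrix:\n", cm)
print("Accuracy:", accuracy_score(y_true, y_pred))
```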

Keywords: ontology-based, model, database, OOADM, healthcare

Procedia PDF Downloads 45
155 The Design of Safe Spaces in Healthcare Facilities Vulnerable to Tornado Impact in Central US

Authors: Lucy Ampaw-Asiedu, Terri R. Norton

Abstract:

In the wake of recent disasters around the world, such as the earthquake in Italy (January 2017), the hurricanes in the United States (September 2016 and September 2017), and the compounding disasters in Haiti (September 2010 and September 2016), the need to work on preemptive rather than reactionary measures has, to the best of our knowledge, never been greater. Tornadoes are natural hazards that mostly affect the midwestern and central states of the US. Like all natural hazards, such as hurricanes, earthquakes and floods, tornadoes are very destructive: they result in massive destruction to homes, cause billions of dollars in damage and claim many lives. Healthcare facilities in general are vulnerable to disasters, and therefore the safety of patients, health workers and those who come in to seek shelter should be a priority. The focus of this study is to assess disaster management measures instituted by healthcare facilities. Thus, the sole aim of the study is to examine the vulnerabilities and the design of safe spaces in healthcare facilities in the Central US. The objectives that guide the study are primarily to identify the impacts of tornadoes on hospitals and to assess the structural design or specifications of safe spaces. St. John’s Regional Medical Center, now Mercy Hospital in Joplin, is used as a case study. Preliminary results show the lateral base shear of the proposed safe-space design to be 684.24 tons (1508.49 kip). Findings from this work will be used to make recommendations about the design of safe spaces for healthcare facilities in the Central US.

Keywords: disaster management, safe spaces, structural design, tornado, vulnerability

Procedia PDF Downloads 182
154 Early Gastric Cancer Prediction from Diet and Epidemiological Data Using Machine Learning in Mizoram Population

Authors: Brindha Senthil Kumar, Payel Chakraborty, Senthil Kumar Nachimuthu, Arindam Maitra, Prem Nath

Abstract:

Gastric cancer is predominantly caused by demographic and diet factors, as compared to other cancer types. The aim of this study is to predict Early Gastric Cancer (EGC) from diet and lifestyle factors using supervised machine learning algorithms. For this study, 160 healthy individuals and 80 cases who had been followed for 3 years (2016-2019) at Civil Hospital, Aizawl, Mizoram, were selected. A dataset containing 11 features that are core risk factors for gastric cancer was extracted. Supervised machine learning algorithms: Logistic Regression, Naive Bayes, Support Vector Machine (SVM), Multilayer Perceptron, and Random Forest were used to analyze the dataset in a Python Jupyter Notebook (version 3). The classification results were evaluated using the following metrics: minimum false positives, Brier score, accuracy, precision, recall, F1 score, and the Receiver Operating Characteristic (ROC) curve. With respect to accuracy (in percent) and Brier score, the analysis showed Naive Bayes - 88, 0.11; Random Forest - 83, 0.16; SVM - 77, 0.22; Logistic Regression - 75, 0.25; and Multilayer Perceptron - 72, 0.27. The Naive Bayes algorithm outperformed the others, with very low false positive rates, a low Brier score, and good accuracy. Naive Bayes classification predicted EGC very satisfactorily using only diet and lifestyle factors, which will be very helpful for physicians in educating patients and the public, so that gastric cancer mortality can be reduced or avoided with this knowledge-mining work.
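
A minimal sketch of the reported comparison, assuming scikit-learn and a synthetic stand-in for the 240-subject, 11-feature dataset:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, brier_score_loss

# Hypothetical stand-in for the 240-subject, 11-risk-factor dataset
X, y = make_classification(n_samples=240, n_features=11, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Random Forest", RandomForestClassifier(random_state=0))]:
    clf.fit(X_tr, y_tr)
    proba = clf.predict_proba(X_te)[:, 1]      # probability of the positive class
    acc = accuracy_score(y_te, clf.predict(X_te))
    brier = brier_score_loss(y_te, proba)      # lower is better
    print(f"{name}: accuracy={acc:.2f}, brier={brier:.2f}")
```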

Keywords: early gastric cancer, machine learning, diet, lifestyle characteristics

Procedia PDF Downloads 125
153 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structure restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data from the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it closely represents real-world data and is voluminous. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN); self-supervised learning techniques with autoencoders are adopted to generate the node embeddings, which are eventually used to perform the node classifications. The model predicts patient conditions ranging from common to rare. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
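
A minimal sketch of the node-classification step described above, assuming PyTorch Geometric and hypothetical stand-in tensors (`x`, `edge_index`, `y`) in place of the data fetched via pyTigerGraph:

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Hypothetical tensorized EHR graph: 4 nodes, 8 features, 2 condition classes
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 0, 3, 2]])        # COO edge list
y = torch.tensor([0, 1, 0, 1])                   # node labels (conditions)
data = Data(x=x, edge_index=edge_index, y=y)

class NodeClassifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(8, 16)              # message passing layer
        self.conv2 = GCNConv(16, 2)              # per-node class logits

    def forward(self, d):
        h = F.relu(self.conv1(d.x, d.edge_index))
        return self.conv2(h, d.edge_index)

model = NodeClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(100):                             # supervised node classification
    optimizer.zero_grad()
    loss = F.cross_entropy(model(data), data.y)
    loss.backward()
    optimizer.step()
print(model(data).argmax(dim=1))                 # predicted condition per node
```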

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 63
152 The Effect of the Organization of Mental Health Care on General Practitioners’ Prescription Behavior of Psychotropics for Adolescents in Belgium

Authors: Ellen Lagast, Melissa Ceuterick, Mark Leys

Abstract:

Although adolescence is a stressful period with an increased risk of mental illnesses such as anxiety and depression, little in-depth knowledge is available on the determinants and effects of the use of psychotropic drugs (BZD/SSRIs). A qualitative study with adolescents in Flanders was performed. In in-depth interviews, the interviewees indicated feelings of ambiguity towards their medication use: on the one hand, the medication helps to manage their mental vulnerability and disrupted lives, but on the other hand, they experience a loss of control of their self and their environment. Undesired side-effects and stigma led to a negative pharmaceutical self. The interviewed youngsters also expressed dissatisfaction with the prescription behavior of their general practitioner (GP) with regard to psychotropic drugs. They wished they had received more information about alternative non-pharmaceutical treatment options. Notwithstanding these comments, the majority of the interviewees maintained trust in their GP to act in their best interest. This paper relates prescription behavior in primary care to the organization of mental health care, to better understand the “pharmaceuticalization” and medicalization of mental health problems in Belgium. Belgium implemented fundamental mental health care reforms to foster collaboration, integrate care, and optimize continuity of care. Children and adolescents are still confronted with long waiting lists to access (non-medicalized) mental health services. This limited access to mental health care partly explains general practitioners’ prescription behavior of psychotropics. Moreover, multidisciplinary practices have not yet pervaded primary health care. The medicalization and pharmaceuticalization of the mental health vulnerabilities of youth are both a structural and a cultural problem.

Keywords: adolescents, antidepressants, benzodiazepines, mental health system, psychotropic drugs

Procedia PDF Downloads 81
151 A Case-Study Analysis on the Necessity of Testing for Cyber Risk Mitigation on Maritime Transport

Authors: Polychronis Kapalidis

Abstract:

In recent years, researchers have started to turn their attention to cyber security and maritime security independently, neglecting, in most cases, to examine the areas where these two critical issues are intertwined. The impact of cybersecurity issues on the maritime economy is emerging dramatically. Maritime transport and all related activities are conducted by technology-intensive platforms, which today rely heavily on information systems. The paper’s argument is that, since no defense is completely effective against cyber attacks, it is vital to test responses to the inevitable incursions. Hence, preparedness in the form of testing the existing cybersecurity structure with different tools against potential attacks is vital for minimizing risks. Traditional criminal activities may further be facilitated and evolved through the misuse of cyberspace. Kidnapping, piracy, fraud, theft of cargo and the imposition of ransomware are the main such activities, and they chiefly target the industry’s most valuable asset: the ship. The paper, adopting a case-study analysis based on stakeholder consultation and secondary data analysis, namely policy and strategy-related documentation, presents the importance of holistic testing in the sector. Arguing that poor understanding of the issue leads to the adoption of ineffective policies, the paper presents the level of awareness within the industry and assesses the risks and vulnerabilities of ships to these cybercriminal activities. It concludes by suggesting that testing procedures must be focused on three main pillars within the maritime transport sector: the human factor, the infrastructure, and the procedures.

Keywords: cybercrime, cybersecurity, organized crime, risk mitigation

Procedia PDF Downloads 133
150 Developing an Automated Protocol for the Wristband Extraction Process Using Opentrons

Authors: Tei Kim, Brooklynn McNeil, Kathryn Dunn, Douglas I. Walker

Abstract:

To better characterize the relationship between complex chemical exposures and disease, our laboratory uses an approach that combines low-cost polydimethylsiloxane (silicone) wristband samplers, which absorb many of the chemicals we are exposed to, with untargeted high-resolution mass spectrometry (HRMS) to characterize thousands of chemicals at a time. In studies with human populations, these wristbands can provide an important measure of our environment; however, there is a need to use this approach in large cohorts to study exposures associated with disease. To facilitate the use of silicone samplers in large-scale population studies, the goal of this research project was to establish automated sample preparation methods that improve the throughput, robustness, and scalability of analytical methods for silicone wristbands. Using the Opentrons OT-2 automated liquid handling platform, which provides a low-cost and open-source framework for automated pipetting, we created two separate workflows that translate the manual wristband preparation method into a fully automated protocol requiring only minor intervention by the operator. These protocols include a sequence generation step, which defines the location of all plates and labware according to user-specified settings, and a transfer protocol that includes all necessary instrument parameters and instructions for automated solvent extraction of wristband samplers. These protocols were written in Python and uploaded to GitHub for use by others in the research community. Results from this project show it is possible to establish automated, open-source methods for the preparation of silicone wristband samplers to support profiling of many environmental exposures. Ongoing studies include deployment in longitudinal cohort studies to investigate the relationship between personal chemical exposure and disease.
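
A minimal sketch of what such a transfer protocol can look like, assuming the Opentrons Python API v2 and a hypothetical deck layout; the actual GitHub protocols are not reproduced here:

```python
from opentrons import protocol_api

metadata = {"apiLevel": "2.13",
            "protocolName": "Wristband solvent extraction (sketch)"}

def run(protocol: protocol_api.ProtocolContext):
    # Hypothetical deck layout: tip rack, solvent reservoir, wristband plate
    tips = protocol.load_labware("opentrons_96_tiprack_1000ul", "1")
    reservoir = protocol.load_labware("nest_12_reservoir_15ml", "2")
    plate = protocol.load_labware("nest_96_wellplate_2ml_deep", "3")
    pipette = protocol.load_instrument("p1000_single_gen2", "right",
                                       tip_racks=[tips])

    # Dispense extraction solvent onto each wristband sampler well
    for well in plate.wells()[:8]:
        pipette.transfer(800, reservoir["A1"], well, new_tip="always")
```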

Keywords: bioinformatics, automation, opentrons, research

Procedia PDF Downloads 85
149 Comparison of Deep Learning and Machine Learning Algorithms to Diagnose and Predict Breast Cancer

Authors: F. Ghazalnaz Sharifonnasabi, Iman Makhdoom

Abstract:

Breast cancer is a serious health concern that affects many people around the world. According to a study published in the Breast journal, the global burden of breast cancer is expected to increase significantly over the next few decades. The number of deaths from breast cancer has been increasing over the years, but the age-standardized mortality rate has decreased in some countries. It is important to be aware of the risk factors for breast cancer and to get regular check-ups to catch it early if it does occur. Machine learning techniques have been used to aid in the early detection and diagnosis of breast cancer. These techniques, which have been shown to be effective in predicting and diagnosing the disease, have become a research hotspot. In this study, we consider two deep learning approaches: Multi-Layer Perceptron (MLP) and Convolutional Neural Network (CNN). We also consider five machine learning algorithms: Decision Tree (C4.5), Naïve Bayesian (NB), Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and XGBoost (eXtreme Gradient Boosting), applied to the Breast Cancer Wisconsin Diagnostic dataset. We carried out the process of evaluating and comparing the classifiers, selecting appropriate metrics to evaluate classifier performance and an appropriate tool to quantify this performance. The main purpose of the study is to predict and diagnose breast cancer by applying the mentioned algorithms, and to discover the most effective of them with respect to the confusion matrix, accuracy, and precision. CNN outperformed all the other classifiers and achieved the highest accuracy (0.982456). The work was implemented in the Anaconda environment, based on the Python programming language.
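
A minimal sketch of part of this comparison, assuming scikit-learn's bundled copy of the Wisconsin diagnostic dataset; the deep learning models and XGBoost are omitted for brevity:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

classifiers = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(),
    "KNN": KNeighborsClassifier(),
}
for name, clf in classifiers.items():
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(name, accuracy_score(y_te, pred))
    print(confusion_matrix(y_te, pred))   # rows: actual, columns: predicted
```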

Keywords: breast cancer, multi-layer perceptron, Naïve Bayesian, SVM, decision tree, convolutional neural network, XGBoost, KNN

Procedia PDF Downloads 46
148 Assessing the Impact of Climate Change on Biodiversity Hotspots: A Multidisciplinary Study

Authors: Reet Bishnoi

Abstract:

Climate change poses a pressing global challenge, with far-reaching consequences for the planet's ecosystems and biodiversity. This abstract introduces the research topic, "Assessing the Impact of Climate Change on Biodiversity Hotspots: A Multidisciplinary Study," which delves into the intricate relationship between climate change and biodiversity in the world's most ecologically diverse regions. Biodiversity hotspots, characterized by their exceptionally high species richness and endemism, are under increasing threat due to rising global temperatures, altered precipitation patterns, and other climate-related factors. This research employs a multidisciplinary approach, incorporating ecological, climatological, and conservationist methodologies to comprehensively analyze the effects of climate change on these vital regions. Through a combination of field research, climate modelling, and ecological assessments, this study aims to elucidate the vulnerabilities of biodiversity hotspots and understand how changes in temperature and precipitation are affecting the diverse species and ecosystems that inhabit these areas. The research seeks to identify potential tipping points, assess the resilience of native species, and propose conservation strategies that can mitigate the adverse impacts of climate change on these critical regions. By illuminating the complex interplay between climate change and biodiversity hotspots, this research not only contributes to our scientific understanding of these issues but also informs policymakers, conservationists, and the public about the urgent need for coordinated efforts to safeguard our planet's ecological treasures. The outcomes of this multidisciplinary study are expected to play a pivotal role in shaping future climate policies and conservation practices, emphasizing the importance of protecting biodiversity hotspots for the well-being of the planet and future generations.

Keywords: climate change, biodiversity hotspots, ecological diversity, conservation, multidisciplinary study

Procedia PDF Downloads 41
147 Cybersecurity Challenges in the Era of Open Banking

Authors: Krish Batra

Abstract:

The advent of open banking has revolutionized the financial services industry by fostering innovation, enhancing customer experience, and promoting competition. However, this paradigm shift towards more open and interconnected banking ecosystems has introduced complex cybersecurity challenges. This research paper delves into the multifaceted cybersecurity landscape of open banking, highlighting the vulnerabilities and threats inherent in sharing financial data across a network of banks and third-party providers. Through a detailed analysis of recent data breaches, phishing attacks, and other cyber incidents, the paper assesses the current state of cybersecurity within the open banking framework. It examines the effectiveness of existing security measures, such as encryption, API security protocols, and authentication mechanisms, in protecting sensitive financial information. Furthermore, the paper explores the regulatory response to these challenges, including the implementation of standards such as PSD2 in Europe and similar initiatives globally. By identifying gaps in current cybersecurity practices, the research aims to propose a set of robust, forward-looking strategies that can enhance the security and resilience of open banking systems. This includes recommendations for banks, third-party providers, regulators, and consumers on how to mitigate risks and ensure a secure open banking environment. The ultimate goal is to provide stakeholders with a comprehensive understanding of the cybersecurity implications of open banking and to outline actionable steps for safeguarding the financial ecosystem in an increasingly interconnected world.

Keywords: open banking, financial services industry, cybersecurity challenges, data breaches, phishing attacks, encryption, API security protocols, authentication mechanisms, regulatory response, PSD2, cybersecurity practices

Procedia PDF Downloads 29
146 Mobulid Ray Post-Release Mortality to Assess the Feasibility of Live-Release Management Measures

Authors: Sila K. Sari, Betty J.L. Laglbauer, Muhammad G. Salim, Irianies C. Gozali, Iqbal Herwata, Fahmi Fahmi, Selvia Oktaviyani, Isabel Ender, Sarah Lewis, Abraham Sianipar, Mark Erdmann

Abstract:

Taking strides towards the sustainable use of marine stocks requires science-based management of target fish populations and the reduction of bycatch in non-selective fisheries. Among elasmobranchs, mobulid rays face a high risk of extinction due to their intrinsic vulnerability to fishing, and their conservation has been recognized as a strong priority both in Indonesia and worldwide. Despite their common vulnerabilities to fishing pressure, owing to slow growth, late maturation and low fecundity, only manta rays, but not devil rays, are protected in Indonesian waters. However, both manta and devil rays are captured in non-selective fisheries, in particular drift gillnets, since their habitat overlaps with the fishing grounds for primary target species (e.g. marlin, swordfish and bullet tuna off the coast of Muncar). For this reason, mobulid populations are being heavily impacted, and while national-level protections are crucial for conservation, they may not suffice alone to ensure population sustainability. In order to assess the potential of live-release management measures for conserving mobulids captured as bycatch in drift gillnets, we deployed pop-up survival archival transmitters to measure post-release mortality in Indonesian mobulid rays. We also assessed which fishing practices, in particular soak duration, affect post-release mortality, in order to draw relevant conclusions for management.

Keywords: mobulid, devil ray, manta ray, bycatch

Procedia PDF Downloads 140
145 Climate Change Vulnerability and Capacity Assessment in Coastal Areas of Sindh Pakistan and Its Impact on Water Resources

Authors: Falak Nawaz

Abstract:

The Climate Change Vulnerability and Capacity Assessment carried out in the coastal regions of the Thatta and Malir districts underscores the potential risks and challenges that climate change poses to water resources. The study was conducted by the author using participatory rural appraisal (PRA) tools, with a particular focus on focus group discussions, direct observations, and key informant interviews. The assessment delves into the specific impacts of climate change along the coastal belt, concentrating on aspects such as rising sea levels, depletion of freshwater, alterations in precipitation patterns, fluctuations in water table levels, and the intrusion of saltwater into rivers. These factors have significant consequences for the availability and quality of water resources in coastal areas, manifesting in frequent migration and alterations in agriculture-based livelihood practices. Furthermore, the assessment evaluates the adaptive capacity of communities and organizations in these coastal regions to effectively confront and alleviate the effects of climate change on water resources. It considers various measures aimed at bolstering adaptive capacity, including infrastructure enhancements, water management practices, adjustments in agricultural approaches, and disaster preparedness. The study’s findings emphasize the necessity of prompt action to address the identified vulnerabilities and fortify the adaptive capacities of Sindh’s coastal areas. This calls for comprehensive strategies and policies that promote sustainable water resource management, integrate climate change considerations, and provide essential resources and support to vulnerable communities.

Keywords: climate, climate change adaptation, disaster resilience, vulnerability, capacity, assessment

Procedia PDF Downloads 32
144 Domain-Specific Deep Neural Network Model for Classification of Abnormalities on Chest Radiographs

Authors: Nkechinyere Joy Olawuyi, Babajide Samuel Afolabi, Bola Ibitoye

Abstract:

This study collected a preprocessed dataset of chest radiographs, formulated a deep neural network model for detecting abnormalities, evaluated the performance of the formulated model, and implemented a prototype of it, with a view to developing a deep neural network model that automatically classifies abnormalities in chest radiographs. To achieve the overall purpose of this research, a large set of chest X-ray images was sourced from the CheXpert dataset, an online repository of annotated chest radiographs compiled by the Machine Learning Research Group at Stanford University. The chest radiographs were preprocessed, using standardization and normalization, into a format that can be fed into a deep neural network. The classification problem was formulated as a multi-label binary classification model, which used a convolutional neural network architecture to decide whether an abnormality was present in the chest radiographs. The classification model was evaluated using specificity, sensitivity, and the Area Under the Curve (AUC) score as the parameters. A prototype of the classification model was implemented using the Keras open-source deep learning framework in the Python programming language. Based on the AUC-ROC curve, the model was able to classify atelectasis, support devices, pleural effusion, pneumonia, normal CXR (no finding), pneumothorax, and consolidation. However, lung opacity and cardiomegaly had probabilities of less than 0.5 and were thus classified as absent. Precision, recall, and F1 score values were all 0.78; this implies that the numbers of false positives and false negatives are the same, revealing some measure of label imbalance in the dataset. The study concluded that the developed model is sufficient to classify the abnormalities present in chest radiographs as present or absent.
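
A minimal sketch of a multi-label binary classifier of this kind, assuming Keras and hypothetical image dimensions and label count; the abstract does not specify the actual architecture:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_LABELS = 9  # e.g., atelectasis, pleural effusion, pneumonia, ...

# Small CNN; one sigmoid output per label makes the task multi-label binary
model = models.Sequential([
    layers.Input(shape=(224, 224, 1)),           # grayscale radiograph
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.GlobalAveragePooling2D(),
    layers.Dense(NUM_LABELS, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",        # independent per-label losses
              metrics=[tf.keras.metrics.AUC(name="auc")])
# Labels with predicted probability >= 0.5 are reported as "present"
```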

Keywords: transfer learning, convolutional neural network, radiograph, classification, multi-label

Procedia PDF Downloads 87
143 FlameCens: Visualization of Expressive Deviations in Music Performance

Authors: Y. Trantafyllou, C. Alexandraki

Abstract:

Music interpretation refers to the way musicians shape their performance by deliberately deviating from the composer’s intentions, which are commonly communicated via some form of music transcription, such as a score. For transcribed, non-improvised music, expression is manifested by introducing subtle deviations in tempo, dynamics and articulation during the evolution of a performance. This paper presents an application, named FlameCens, which, given two recordings of the same piece of music, presumably performed by different musicians, allows visualising deviations in tempo and dynamics during playback. The application may also compare a certain performance to the music score of that piece (i.e. a MIDI file), which may be thought of as an expression-neutral representation of that piece, hence depicting the expressive cues employed by certain performers. FlameCens uses the Dynamic Time Warping algorithm to compare two audio sequences, based on CENS (Chroma Energy distribution Normalized Statistics) audio features. Expressive deviations are illustrated in a moving flame, which is generated by an animation of particles. The length of the flame is mapped to deviations in dynamics, while the slope of the flame is mapped to tempo deviations, so that faster tempo tilts the slope to the right and slower tempo tilts it to the left; a constant slope signifies no tempo deviation. The detected deviations in tempo and dynamics can additionally be recorded in a text file, which allows for offline investigation. Moreover, in the case of monophonic music, the color of the particles is used to convey the pitch of the notes during the performance. FlameCens has been implemented in Python and is openly available via GitHub. The application has been experimentally validated for different music genres, including classical, contemporary, jazz and popular music. These experiments revealed that FlameCens can be a valuable tool for music specialists (i.e. musicians or musicologists) to investigate the expressive performance strategies employed by different musicians, as well as for music audiences to enhance their listening experience.
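
A minimal sketch of the alignment step, assuming the librosa library (which provides both CENS features and DTW) and two hypothetical recordings of the same piece:

```python
import librosa
import numpy as np

# Two hypothetical performances of the same piece
y1, sr1 = librosa.load("performance_a.wav")
y2, sr2 = librosa.load("performance_b.wav")

# Chroma Energy Normalized Statistics features, robust to timbre and dynamics
c1 = librosa.feature.chroma_cens(y=y1, sr=sr1)
c2 = librosa.feature.chroma_cens(y=y2, sr=sr2)

# Dynamic Time Warping aligns the two feature sequences
D, wp = librosa.sequence.dtw(X=c1, Y=c2)
wp = np.flip(wp, axis=0)  # warping path in chronological order

# The local slope of the warping path reflects relative tempo deviation
slopes = np.diff(wp[:, 0]) / np.maximum(np.diff(wp[:, 1]), 1)
```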

Keywords: audio synchronization, computational music analysis, expressive music performance, information visualization

Procedia PDF Downloads 107
142 Machine Learning-Based Techniques for Detecting and Mitigating Cyber-attacks on Automatic Generation Control in Smart Grids

Authors: Sami M. Alshareef

Abstract:

The rapid growth of smart grid technology has brought significant advancements to the power industry. However, with the increasing interconnectivity and reliance on information and communication technologies, smart grids have become vulnerable to cyber-attacks, posing significant threats to the reliable operation of power systems. Among the critical components of smart grids, the Automatic Generation Control (AGC) system plays a vital role in maintaining the balance between generation and load demand. Therefore, protecting the AGC system from cyber threats is of paramount importance to maintain grid stability and prevent disruptions. Traditional security measures often fall short in addressing sophisticated and evolving cyber threats, necessitating the exploration of innovative approaches. Machine learning, with its ability to analyze vast amounts of data and learn patterns, has emerged as a promising solution to enhance AGC system security. Therefore, this research proposal aims to address the challenges associated with detecting and mitigating cyber-attacks on AGC in smart grids by leveraging machine learning techniques on automatic generation control of two-area power systems. By utilizing historical data, the proposed system will learn the normal behavior patterns of AGC and identify deviations caused by cyber-attacks. Once an attack is detected, appropriate mitigation strategies will be employed to safeguard the AGC system. The outcomes of this research will provide power system operators and administrators with valuable insights into the vulnerabilities of AGC systems in smart grids and offer practical solutions to enhance their cyber resilience.
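
A minimal sketch of the detection idea, assuming scikit-learn and hypothetical two-area AGC telemetry; the abstract does not name the specific models used:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Hypothetical normal AGC telemetry: [frequency deviation, tie-line power deviation]
normal = rng.normal(0.0, 0.02, size=(1000, 2))
# Hypothetical attack samples, e.g., injected false frequency measurements
attack = rng.normal(0.3, 0.05, size=(20, 2))

# Learn the normal behavior pattern, then flag deviations from it
detector = IsolationForest(contamination=0.02, random_state=0).fit(normal)
print(detector.predict(attack))  # -1 flags anomalous (possible attack) samples
```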

Keywords: machine learning, cyber-attacks, automatic generation control, smart grid

Procedia PDF Downloads 58
141 Process for Analyzing Information Security Risks Associated with the Incorporation of Online Dispute Resolution Systems in the Context of Conciliation in Colombia

Authors: Jefferson Camacho Mejia, Jenny Paola Forero Pachon, Luis Carlos Gomez Florez

Abstract:

The innumerable possibilities offered by the use of Information Technology (IT) in the development of different socio-economic activities have brought about a change in the social paradigm and the emergence of the so-called information and knowledge society. The Colombian government, aware of this reality, has been promoting the use of IT as part of the E-government strategy adopted in the country. However, it is well known that the use of IT implies the existence of certain threats that put the security of information in the digital environment at risk. One of the priorities of the Colombian government is to improve access to alternative justice through IT, in particular access to Alternative Dispute Resolution (ADR): conciliation, arbitration and friendly composition, by means of which citizens resolve their differences directly. To this end, a trend has been identified in the use of Online Dispute Resolution (ODR) systems, which extend the benefits of ADR to the digital environment through the use of IT. This article presents a process for the analysis of the information security risks associated with the incorporation of ODR systems in the context of conciliation in Colombia, based on four fundamental stages identified in the literature: (1) identification of assets, (2) identification of threats and vulnerabilities, (3) estimation of the impact, and (4) estimation of risk levels. The methodological design adopted for this research was grounded theory, since it involves interactions that are applied to a specific context and considered from the perspective of diverse participants. As a result of this investigation, the activities to be followed in carrying out an analysis of information security risks, in the context of conciliation in Colombia supported by ODR systems, are defined, thus contributing to the estimation of the risks and making their subsequent treatment possible.
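
A minimal sketch of the last two stages, the estimation of impact and of risk levels, assuming a conventional likelihood-times-impact scoring scheme with hypothetical assets and scales (these are illustrations, not the paper's):

```python
# Hypothetical 1-5 ordinal scales for likelihood and impact
assets = {
    # asset: (threat, likelihood, impact)
    "case database": ("unauthorized access", 3, 5),
    "video hearing link": ("interception", 2, 4),
    "user credentials": ("phishing", 4, 4),
}

def risk_level(score):
    # Map the product score onto discrete risk levels
    return "high" if score >= 15 else "medium" if score >= 8 else "low"

for asset, (threat, likelihood, impact) in assets.items():
    score = likelihood * impact
    print(f"{asset} / {threat}: score={score} ({risk_level(score)})")
```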

Keywords: alternative dispute resolution, conciliation, information security, online dispute resolution systems, process, risk analysis

Procedia PDF Downloads 211
140 Alternative (In)Security: Using Photovoice Research Methodology to Explore Refugee Anxieties in Lebanon

Authors: Jessy Abouarab

Abstract:

For more than half a century, international norms related to refugee security and protection have proliferated, yet their role in alleviating war’s negative impacts on human life remains limited. The impact of refugee-security processes often manifests asymmetrically within populations. Many issues and people are silenced by narrow security policies that focus either on abstract threat containment and refugee control or on refugee protection and humanitarian aid. (In)security practices are gendered and experienced. Examining the case study of Syrian refugees in Lebanon, this study explores the gendered impact of refugee security mechanisms on local realities. A transnational feminist approach is used to position this research in relation to existing studies in the field of security and the refugee-protection regime, highlighting the social, cultural, legal, and political barriers to gender equality in the areas of violence, rights, and social inclusion. Through the Photovoice methodology, the Syrian refugees’ (in)securities in Lebanon were given visibility by enabling local volunteers to record and reflect their realities through pictures, while voicing the participants’ anxieties and recommendations in order to reach normative policy change. This Participatory Action Research approach helped participants observe the structural barriers and the lack of culturally inclusive refugee services that hinder security and increase discrimination, stigma, and poverty. The findings have implications for shifting refugee protection mechanisms to a community-based approach, in ways that extend beyond narrow security policies that hinder women’s empowerment and raise vulnerabilities such as gendered exploitation, abuse, and neglect.

Keywords: gender, (in)security, Lebanon, refugee, Syrian refugees, women

Procedia PDF Downloads 116
139 An Overview of Domain Models of Urban Quantitative Analysis

Authors: Mohan Li

Abstract:

Nowadays, intelligent research technology is more important than traditional research methods in urban research work, and this proportion will greatly increase in the next few decades. Frequently, such analytical work cannot be carried out without some software engineering knowledge, and domain models of urban research become necessary when applying software engineering knowledge to urban work. In many urban planning practice projects, making rational models, feeding in reliable data, and providing enough computation all provide indispensable assistance in producing good urban planning. Throughout the work process, domain models can optimize workflow design. At present, human beings have entered the era of big data. The amount of digital data generated by cities every day will increase at an exponential rate, and new data forms are constantly emerging. How to select a suitable dataset from the massive amount of data and manage and process it has become an ability that more and more planners and urban researchers need to possess. This paper summarizes, and makes predictions about, the emergence of technologies and technological iterations that may affect urban research in the future, helping researchers discover urban problems and implement targeted sustainable urban strategies. These are summarized into seven domain models: the urban and rural regional domain model, the urban ecological domain model, the urban industry domain model, the development dynamics domain model, the urban social and cultural domain model, the urban traffic domain model, and the urban space domain model. These seven domain models can be used to guide the construction of systematic urban research topics and help researchers organize a series of intelligent analytical tools, such as Python, R, and GIS. They make full use of quantitative spatial analysis, machine learning, and other technologies to achieve higher efficiency and accuracy in urban research, assisting people in making reasonable decisions.

Keywords: big data, domain model, urban planning, urban quantitative analysis, machine learning, workflow design

Procedia PDF Downloads 156
138 Software-Defined Networking: A New Approach to Fifth Generation Networks: Security Issues and Challenges Ahead

Authors: Behrooz Daneshmand

Abstract:

Software-Defined Networking (SDN) is designed to meet the future needs of 5G mobile networks. The SDN architecture offers a new solution that involves separating the control plane from the data plane, which are usually paired together. Network functions traditionally performed on specific hardware can now be abstracted and virtualized on any device, and a centralized software-based administration approach, based on a central controller, facilitates the development of modern applications and services. These design standards clear the way for a more adaptable, faster, and more dynamic network under software control compared with a conventional network. We believe SDN provides new research opportunities in security, and it can significantly affect network security research in many different ways. The SDN architecture enables networks to actively monitor traffic and analyze threats, facilitating security policy modification and security service insertion. The separation of the data and control planes, however, opens up security challenges, such as man-in-the-middle attacks (MITM), denial-of-service (DoS) attacks, and saturation attacks. In this paper, we analyze security threats to each layer of SDN: the application layer, the southbound and northbound interfaces, the controller layer, and the data layer. From a security point of view, the components that make up the SDN architecture have several vulnerabilities, which may be exploited by attackers to perform malicious activities and hence affect the network and its services. Attacks on software-defined networks are, unfortunately, a reality these days. In a nutshell, this paper highlights architectural weaknesses and develops attack vectors at each layer, leading to conclusions about further progress in identifying the consequences of attacks and proposing mitigation strategies.

Keywords: software-defined networking, security, SDN, 5G/IMT-2020

Procedia PDF Downloads 66
137 Risk Management and Resiliency: Evaluating Walmart’s Global Supply Chain Leadership Using the Supply Chain Resilience Assessment and Management Framework

Authors: Meghan Biallas, Amanda Hoffman, Tamara Miller, Kimmy Schnibben, Janaina Siegler

Abstract:

This paper assesses Walmart’s supply chain resiliency amidst continuous supply chain disruptions. It aims to evaluate how Walmart can use supply chain resiliency theory to retain its status as a global supply chain leader. The Bloomberg terminal was used to organize Walmart’s 754 Tier-1 suppliers by the size of their relationship to Walmart; additional data from IBISWorld and Statista were also used in the analysis. This research focused on the top ten Tier-1 suppliers with the greatest percentage of their revenue attributed to Walmart. The paper also applied the firm’s information to the Supply Chain Resilience Assessment and Management (SCRAM) framework to evaluate the firm’s capabilities, vulnerabilities, and gaps. A rubric was created to quantify Walmart’s risks using four pillars: flexibility, velocity, visibility, and collaboration. Information and examples were drawn from Walmart’s 10-K filing. For each example, a rating of 1 indicated “high” resiliency, 0 indicated “medium” resiliency, and -1 indicated “low” resiliency. The findings of this study are as follows: (1) Walmart has maintained its leadership through its ability to remain resilient with regard to visibility, efficiency, capacity, and collaboration. (2) Walmart is experiencing increases in supply chain costs due to internal factors affecting the company and external factors affecting its suppliers. (3) There are a number of emerging supply chain risks among Walmart’s suppliers, which could make it difficult for Walmart to remain a supply chain leader in the future. Using the SCRAM framework, this paper assesses how Walmart measures up to supply chain resiliency theory, identifying areas of strength as well as areas where Walmart can improve in order to remain a global supply chain leader.
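
A minimal sketch of how such a rubric can be tallied, assuming hypothetical per-example ratings rather than the actual 10-K evidence items:

```python
# Ratings per SCRAM pillar: 1 = high, 0 = medium, -1 = low resiliency
ratings = {
    "flexibility":   [1, 0, 1],    # hypothetical per-example ratings
    "velocity":      [0, 0, 1],
    "visibility":    [1, 1, 1],
    "collaboration": [1, -1, 0],
}

for pillar, scores in ratings.items():
    avg = sum(scores) / len(scores)   # mean rating per pillar
    print(f"{pillar}: mean rating {avg:+.2f}")
```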

Keywords: supply chain resiliency, zone of balanced resilience, supply chain resilience assessment and management, supply chain theory

Procedia PDF Downloads 87
136 A Convolutional Neural Network Based Vehicle Theft Detection, Location, and Reporting System

Authors: Michael Moeti, Khuliso Sigama, Thapelo Samuel Matlala

Abstract:

One of the principal challenges the world is confronted with is insecurity. The crime rate is increasing exponentially, and protecting our physical assets, especially in the motorist industry, is becoming impossible through our own strength alone. The need to develop technological solutions that detect and report theft without any human interference is therefore inevitable. This is critical especially for vehicle owners, to ensure theft detection and speedy identification towards recovery efforts in cases where a vehicle is missing or an attempted theft is taking place. The vehicle theft detection system uses a Convolutional Neural Network (CNN) to recognize the driver’s face, captured using an installed mobile phone device. The location identification function uses the Global Positioning System (GPS) to determine the real-time location of the vehicle. Upon identification of the location, Global System for Mobile Communications (GSM) technology is used to report or notify the vehicle owner about the whereabouts of the vehicle. The mobile app was implemented in Python, which allows easy access to machine learning algorithms through its widely developed library ecosystem. The graphical user interface was developed in Java, as it is better suited for mobile development. Google’s online database (Firebase) was used as the means of storage for the application. The system integration test was performed using a simple percentage analysis. Sixty (60) vehicle owners participated in this study as a sample, and questionnaires were used to establish the acceptability of the developed system. The results indicate the efficiency of the proposed system; consequently, the paper proposes that the system can effectively monitor a vehicle at any given place, even if it is driven outside its normal jurisdiction. Moreover, the system can be used as a database to detect, locate and report missing vehicles to different security agencies.

Keywords: CNN, location identification, tracking, GPS, GSM

Procedia PDF Downloads 130
135 Dislocation Density-Based Modeling of the Grain Refinement in Surface Mechanical Attrition Treatment

Authors: Reza Miresmaeili, Asghar Heydari Astaraee, Fereshteh Dolati

Abstract:

In the present study, an analytical model based on dislocation density was developed to simulate grain refinement in surface mechanical attrition treatment (SMAT). The correlation between SMAT time and the development of plastic strain, on the one hand, and dislocation density evolution, on the other, was established to simulate grain refinement in SMAT. A dislocation density-based constitutive material law was implemented using a VUHARD subroutine. A random sequence of shots was generated for the multiple-impact model in the Python programming language, using a random function. The simulation technique was to model each impact in a separate run and then transfer the results of each run as the initial conditions for the next run (impact). The developed Finite Element (FE) model of multiple impacts describes the coverage evolution in SMAT. Simulations were run to coverage levels as high as 4500%. It is shown that the coverage implemented in the FE model equals the experimental coverage, and that the numerical SMAT coverage parameter conforms well to the well-known Avrami model. Comparison between the numerical results and experimental measurements of residual stresses and depths of the deformation layers confirms the performance of the established FE model for surface engineering evaluations in SMA treatment. X-ray diffraction (XRD) studies of grain refinement, including the resultant grain size and dislocation density, were conducted to validate the established model; the full width at half maximum of the XRD profiles can be used to measure the grain size. The numerical results and experimental measurements of grain refinement are in good agreement and show the capability of the established FE model to predict the gradient microstructure in SMA treatment.
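
A minimal sketch of the random shot sequence and an Avrami-type coverage check, assuming hypothetical geometry; the FE runs and the transfer of results between them are omitted:

```python
import random
import math

random.seed(42)
AREA_SIDE = 10.0   # hypothetical treated square, mm
SHOT_R = 0.5       # hypothetical indent radius, mm

# Random impact coordinates, one per FE run in the sequential scheme
shots = [(random.uniform(0, AREA_SIDE), random.uniform(0, AREA_SIDE))
         for _ in range(500)]

# Avrami-type coverage: C = 100 * (1 - exp(-a * n)), where a is the
# indent-to-surface area ratio per impact and n the number of impacts
a_ratio = math.pi * SHOT_R**2 / AREA_SIDE**2
for n in (100, 300, 500):
    coverage = 100 * (1 - math.exp(-a_ratio * n))
    print(f"{n} impacts -> {coverage:.1f}% coverage")
```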

Keywords: dislocation density, grain refinement, severe plastic deformation, simulation, surface mechanical attrition treatment

Procedia PDF Downloads 111
134 Climate Change Impacts, Vulnerability, and Adaptation among Rural Households in Ethiopia

Authors: Birtukan Atinkut Asmare

Abstract:

Climate change disproportionately affects many Africans who rely heavily on climate-exposed sectors such as rain-fed agriculture and fishing, rendering them highly vulnerable. Gender plays a significant role, as men and women experience unequal impacts and vulnerabilities due to gender norms, labor divisions, resource access, and power dynamics. Drawing on an integrated framework, this study sheds light on the gendered impacts of climate change on household livelihoods, vulnerability, and adaptation in rural Ethiopia’s Lake Tana Basin. The study used mixed research methods, integrating diverse qualitative techniques, such as focus group discussions, key informant interviews, and field observations, with quantitative data gathered through household surveys. The findings reveal that women-headed households were more vulnerable to climate change than male-headed households. Flooding was the major climate-induced hazard in the area, threatening the lives and livelihoods of households. In response to climate change, households undertook different adaptation measures, such as agroforestry practices, crop diversification, seasonal migration, petty trading, and charcoal and fuelwood sales. However, the adaptation strategies varied slightly with the gender of the household head: women-headed households specifically engaged in fuelwood collection and selling and in petty trading activities. The main constraints on adaptation were limited access to technologies, extension services, information, and financial services. This research therefore urges attention from research, policy, and advisory services to rural households trying to survive in the face of climate change.

Keywords: agriculture, climate change impacts, Ethiopia, gender

Procedia PDF Downloads 33
133 Building Transparent Supply Chains through Digital Tracing

Authors: Penina Orenstein

Abstract:

In today’s world, particularly with COVID-19 a constant worldwide threat, organizations need greater visibility over their supply chains more than ever before, in order to find areas for improvement and greater efficiency, reduce the chances of disruption, and stay competitive. Supply chain mapping is the practice of mapping every process and route in detail between each vendor and supplier. The simplest method of mapping involves sourcing publicly available data, including news and financial information concerning relationships between suppliers. An additional layer of information can be disclosed by large, direct suppliers about their production and logistics sites. While this method has the advantage of not requiring any input from suppliers, it also does not allow for much transparency beyond the first supplier tier and may generate irrelevant data (noise) that must be filtered out to find the actionable data. The primary goal of this research is to build data maps of supply chains using a layered approach. Using these maps, the secondary goal is to address the question of whether the supply chain can be re-engineered to make improvements, for example, to lower the carbon footprint. Using a drill-down approach, the end result is a comprehensive map detailing the linkages between tier-one, tier-two, and tier-three suppliers, superimposed on a geographical map. The driving force behind this idea is the ability to trace individual parts to the exact site where they are manufactured. In this way, companies can ensure sustainability practices from the production of raw materials through to the finished goods. The approach allows companies to identify and anticipate vulnerabilities in their supply chain. It unlocks predictive analytics capabilities and enables them to act proactively. The research is particularly compelling because it unites network science theory with empirical data and presents the results in a visual, intuitive manner.
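
A minimal sketch of the layered mapping idea, assuming the networkx library and hypothetical supplier names:

```python
import networkx as nx

G = nx.DiGraph()
# Hypothetical tiered links: (supplier, customer, tier of the supplier)
edges = [
    ("RawMetalCo", "PartsCo", 3),
    ("PartsCo", "AssemblyCo", 2),
    ("AssemblyCo", "FocalFirm", 1),
    ("ChipCo", "AssemblyCo", 2),
]
for supplier, customer, tier in edges:
    G.add_edge(supplier, customer, tier=tier)

# Trace every upstream path feeding the focal firm (part-to-site tracing)
for source in [n for n in G if G.in_degree(n) == 0]:
    for path in nx.all_simple_paths(G, source, "FocalFirm"):
        print(" -> ".join(path))
```

Superimposing such paths on a geographical map then only requires attaching site coordinates as node attributes.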

Keywords: data mining, supply chain, empirical research, data mapping

Procedia PDF Downloads 149
132 Evaluation of the Impact of Telematics Use on Young Drivers’ Driving Behaviour: A Naturalistic Driving Study

Authors: WonSun Chen, James Boylan, Erwin Muharemovic, Denny Meyer

Abstract:

In Australia, drivers aged between 18 and 24 have remained at high risk of road fatality over the last decade. Despite the successful implementation of the Graduated Licensing System (GLS), which supports young drivers in the early phases of driving, the road fatality statistics for these drivers remain high. In response, studies conducted in Australia prior to the start of the COVID-19 pandemic demonstrated the benefits of using telematics devices for improving driving behaviour. However, the impact of COVID-19 lockdowns on young drivers’ driving behaviour has emerged as a global concern. Therefore, this naturalistic study aimed to evaluate and compare the driving behaviour (such as acceleration, braking, and speeding) of young drivers with the adoption of in-vehicle telematics devices. Forty-two drivers aged between 18 and 30 and residing in the Australian state of Victoria participated in this study during the period of May to October 2022. All participants drove with the telematics devices during the first 30-day period. At the start of the second 30-day period, twenty-one participants were randomised to an intervention group, in which they were provided with an additional telematics ray device that gave visual feedback to the drivers, especially when they engaged in aggressive driving behaviour. The remaining twenty-one participants continued their driving journeys without the extra telematics ray device (control group). The resulting data enabled the assessment of changes in the driving behaviour of these young drivers using a machine learning approach in Python. Results are expected to show that participants from the intervention group improve their driving behaviour compared to those from the control group. Furthermore, the telematics data enable the assessment and quantification of such improvements in driving behaviour. The findings from this study are anticipated to help guide the development of customised campaigns and interventions to further address the high road fatality rate among young drivers in Australia.

Keywords: driving behaviour, naturalistic study, telematics data, young drivers

Procedia PDF Downloads 95
131 Simulation Studies on Phosphate Removal from Laundry Wastewater Using Biochar: Dubinin Approach

Authors: Eric York, James Tadio, Silas Owusu Antwi

Abstract:

Laundry wastewater contains a diverse range of chemical pollutants that can have detrimental effects on human health and the environment. In this study, simulation studies were conducted in Python (Spyder v3.2) to assess the efficacy of biochar in removing PO₄³⁻ from wastewater. Through modeling and simulation, the mechanisms involved in the adsorption of phosphate by biochar were studied by altering variables specific to the phosphate from common laundry phosphate detergents, such as the aqueous solubility, initial concentration, and temperature, using the Dubinin approach (DA). Results showed that the concentrations equilibrated near the highest values for sugar beet - 120 mgL⁻¹, tailing - 85 mgL⁻¹, CaO-rich - 50 mgL⁻¹, eggshell and rice straw - 48 mgL⁻¹, Undaria pinnatifida roots - 190 mgL⁻¹, Ca-alginate granular beads - 240 mgL⁻¹, Laminaria japonica powder - 900 mgL⁻¹, pine sawdust - 57 mgL⁻¹, rice hull - 190 mgL⁻¹, sesame straw - 470 mgL⁻¹, sugar bagasse - 380 mgL⁻¹, Miscanthus giganteus - 240 mgL⁻¹, wood biochar - 130 mgL⁻¹, pine - 25 mgL⁻¹, sawdust - 6.8 mgL⁻¹, sewage sludge - , rice husk - 12 mgL⁻¹, corncob - 117 mgL⁻¹, and maize straw - 1800 mgL⁻¹, while peanut, Eucalyptus polybractea, and crawfish biochars equilibrated at near concentration. CO₂-activated Thalia, sewage sludge biochar, and Broussonetia papyrifera leaves equilibrated just at the lower concentration. Only soybean stover exhibited a sharp rise-and-fall peak in mid-concentration at 2 mgL⁻¹ volume. The modelling results were consistent with experimental findings from the literature, supporting the accuracy, repeatability, and reliability of the simulation study. The simulation study provided insight into the adsorption of PO₄³⁻ from wastewater by biochar, in terms of the concentration per volume that can be adsorbed ideally under the given conditions. The studies showed that applying the principle experimentally in real wastewater, with all its complexity, is warranted and not far-fetched.
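
A minimal sketch of the Dubinin-Astakhov isotherm that underlies this approach, assuming hypothetical parameter values; the paper's fitted constants are not given in the abstract:

```python
import math

R = 8.314      # J mol^-1 K^-1, gas constant
T = 298.15     # K, hypothetical temperature
C_S = 500.0    # mg L^-1, hypothetical aqueous solubility
Q0 = 120.0     # mg g^-1, hypothetical limiting adsorption capacity
E = 8000.0     # J mol^-1, hypothetical characteristic energy
N = 2          # heterogeneity exponent (n = 2 gives Dubinin-Radushkevich)

def da_uptake(ce):
    """Dubinin-Astakhov uptake: q = Q0 * exp(-(A/E)^n), A = RT ln(Cs/Ce)."""
    A = R * T * math.log(C_S / ce)   # adsorption potential
    return Q0 * math.exp(-((A / E) ** N))

for ce in (5, 50, 200, 450):  # equilibrium concentrations, mg L^-1
    print(f"Ce={ce:>3} mg/L -> q={da_uptake(ce):6.1f} mg/g")
```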

Keywords: simulation studies, phosphate removal, biochar, adsorption, wastewater treatment

Procedia PDF Downloads 75
130 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is well established; however, properly classifying this textual information in a given context has remained difficult. As a result, we conducted a systematic review of the previous literature on sentiment classification and the AI-based techniques that have been used, in order to gain a better understanding of how to design and develop a robust and more accurate sentiment classifier, one that can correctly classify social media text of a given context as hate speech or inverted compliments with a high level of accuracy, by assessing different artificial intelligence techniques. We evaluated over 250 articles from digital sources such as ScienceDirect, ACM, Google Scholar, and IEEE Xplore, and whittled the number down to 31 studies. The findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of performance accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources such as Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques such as CNN+LSTM, CNN+GRU, and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development, due to its simplicity and AI-oriented library ecosystem. Based on some of the important findings of this study, we make recommendations for future research.
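
A minimal sketch of the CNN+LSTM hybrid the review found most effective, assuming Keras and hypothetical vocabulary and sequence sizes:

```python
from tensorflow.keras import layers, models

VOCAB, SEQ_LEN = 20000, 100  # hypothetical vocabulary and padded length

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN,)),
    layers.Embedding(VOCAB, 128),              # token embeddings
    layers.Conv1D(64, 5, activation="relu"),   # local n-gram features (CNN)
    layers.MaxPooling1D(2),
    layers.LSTM(64),                           # long-range context (LSTM)
    layers.Dense(1, activation="sigmoid"),     # binary sentiment score
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```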

Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text

Procedia PDF Downloads 93
129 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution

Authors: Masomeh Jamshid Nejad

Abstract:

Nowadays, statistical literacy is no longer merely a useful skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and a deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its application is the normal distribution, often called the bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students. This requires undergraduate students in economics and business management to visualize and work with data following a normal distribution. Since technology is interconnected with education these days, it is important to teach statistics topics to undergraduate students in the context of Python, RStudio, and Microsoft Excel. This research endeavours to shed light on the effect of Excel-based instruction on learners’ knowledge of statistics, specifically the central concept of the normal distribution. Two groups of undergraduate students from the Business Management program were compared in this research study: one group underwent Excel-based instruction, and the other relied only on traditional teaching methods. We analyzed experimental data and the BBA participants’ responses to statistics-related questions focusing on the normal distribution, including its key attributes, such as the mean and standard deviation. The results of our study indicate that exposing students to Excel-based learning supports learners in comprehending statistical concepts more effectively compared with the group taught with the traditional method. In addition, students receiving Excel-based instruction showed greater ability in picturing and interpreting data concentrated on the normal distribution.

Keywords: statistics, excel-based instruction, data visualization, pedagogy

Procedia PDF Downloads 31
128 A Perspective on Teaching Mathematical Concepts to Freshman Economics Students Using 3D-Visualisations

Authors: Muhammad Saqib Manzoor, Camille Dickson-Deane, Prashan Karunaratne

Abstract:

The Cobb-Douglas production (utility) function is a fundamental function widely used in economics teaching and research, the key reason being its ability to describe actual production using inputs such as labour and capital. The characteristics of the function, such as returns to scale and marginal and diminishing marginal productivities, are covered in the introductory units of both microeconomics and macroeconomics with a 2-dimensional, static visualisation of the function. However, less insight is provided regarding the three-dimensional surface, the changes in curvature properties due to returns to scale, the linkage of the short-run production function with its long-run counterpart and marginal productivities, the level curves, and constrained optimisation. Since (freshman) learners have diverse prior knowledge and cognitive skills, the existing “one size fits all” approach is not very helpful. The aim of this study is to bridge this gap by introducing a technological intervention with interactive animations of the three-dimensional surface and the sequential unveiling of the characteristics mentioned above, using Python software. A small classroom intervention has helped students enhance their analytical and visualisation skills towards active and authentic learning of this topic. However, to authenticate the strength of our approach, a quasi-Delphi study will be conducted to ask domain-specific experts, “What value to the learning process in economics is there in using a 2-dimensional static visualisation compared to using a 3-dimensional dynamic visualisation?” Here, three perspectives on the intervention were reviewed by a panel comprising novice students, experienced students, novice instructors, and experienced instructors, in an effort to determine the learnings from each type of visualisation within a specific domain of knowledge. The value of this approach is key to suggesting different pedagogical methods that can enhance learning outcomes.
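
A minimal sketch of such a 3D visualisation in Python, assuming matplotlib and hypothetical parameters A = 1 and alpha = beta = 0.5 (constant returns to scale):

```python
import numpy as np
import matplotlib.pyplot as plt

A, alpha, beta = 1.0, 0.5, 0.5   # hypothetical Cobb-Douglas parameters

L, K = np.meshgrid(np.linspace(0.1, 10, 50), np.linspace(0.1, 10, 50))
Q = A * L**alpha * K**beta       # Q = A * L^alpha * K^beta

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(L, K, Q, cmap="viridis")
ax.contour(L, K, Q, zdir="z", offset=0)   # level curves (isoquants)
ax.set_xlabel("Labour (L)")
ax.set_ylabel("Capital (K)")
ax.set_zlabel("Output (Q)")
plt.show()
```

Animating alpha and beta interactively then shows how returns to scale reshape the surface and its isoquants.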

Keywords: cobb-douglas production function, quasi-Delphi method, effective teaching and learning, 3D-visualisations

Procedia PDF Downloads 118
127 Sustainable Adaptation: Social Equity and Local-Level Climate Adaptation Planning in U.S. Cities

Authors: Duran Fiack, Jeremy Cumberbatch, Michael Sutherland, Nadine Zerphey

Abstract:

Civic leaders have increasingly relied upon local climate adaptation plans to identify vulnerabilities, prioritize goals, and implement actions in order to prepare cities for the present and projected effects of global climate change. The concept of sustainability is central to these efforts, as climate adaptation discussions are often framed within the context of economic resilience, environmental protection, and the distribution of climate change impacts across various socioeconomic groups. For urban centers, the climate change issue presents unique challenges for each of these dimensions; however, its potential impacts on marginalized populations are extensive. This study draws from the ‘just sustainabilities’ framework to perform a qualitative analysis of climate adaptation plans prepared by 22 of the 100 largest U.S. cities and examine whether, and to what extent, such initiatives prioritize social equity improvements. Past research has found that the integration of sustainability in urban policy and planning often produces outcomes that favor environmental and economic objectives over social equity improvements. We find that social equity is a particularly prominent theme in local-level climate adaptation efforts, relative to environmental quality and economic development. The findings contribute to the literature on climate adaptation and sustainability within the urban context and offer practical insight for local-level stakeholders concerning potential obstacles and opportunities for the integration of social equity initiatives into climate adaptation planning. Given the likelihood that climate changes will continue to impose unique challenges for marginalized communities in urban areas, advancing our understanding of how social equity concerns are integrated into adaptation efforts is likely to become an increasingly critical area of inquiry.

Keywords: climate adaptation plan, climate change, social equity, sustainability

Procedia PDF Downloads 119