Search results for: automatic reporting
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1643

713 Offline Signature Verification Using Minutiae and Curvature Orientation

Authors: Khaled Nagaty, Heba Nagaty, Gerard McKee

Abstract:

A signature is a behavioral biometric that is used for authenticating users in most financial and legal transactions. Signatures can be easily forged by skilled forgers. Therefore, it is essential to verify whether a signature is genuine or forged. The aim of any signature verification algorithm is to accommodate the differences between signatures of the same person and increase the ability to discriminate between signatures of different persons. The work presented in this paper proposes an automatic signature verification system to indicate whether a signature is genuine or not. The system comprises four phases: (1) The pre-processing phase, in which image scaling, binarization, image rotation, dilation, thinning, and connecting of ridge breaks are applied. (2) The feature extraction phase, in which global and local features are extracted. The local features are minutiae points, curvature orientation, and curve plateau. The global features are signature area, signature aspect ratio, and Hu moments. (3) The post-processing phase, in which false minutiae are removed. (4) The classification phase, in which features are enhanced before feeding them into the classifier. k-nearest neighbors and support vector machines are used. The classifier was trained on a benchmark dataset to compare the performance of the proposed offline signature verification system against the state-of-the-art. The accuracy of the proposed system is 92.3%.
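
A minimal illustrative sketch (not the authors' code) of the kind of pipeline described above: global descriptors such as area, aspect ratio and Hu moments are extracted from a binarized signature image and passed to k-NN and SVM classifiers. The OpenCV/scikit-learn calls and the dummy data are assumptions.

```python
# Sketch only: global feature extraction + k-NN / SVM classification for signatures.
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

def global_features(binary_img):
    """Signature area, aspect ratio and the seven Hu moments."""
    ys, xs = np.nonzero(binary_img)
    area = len(xs)
    aspect_ratio = (xs.ptp() + 1) / (ys.ptp() + 1)
    hu = cv2.HuMoments(cv2.moments(binary_img)).flatten()
    return np.hstack([area, aspect_ratio, hu])

# Dummy data standing in for a labelled benchmark set (1 = genuine, 0 = forged)
rng = np.random.default_rng(0)
images = (rng.random((40, 64, 128)) > 0.9).astype(np.uint8)
labels = rng.integers(0, 2, size=40)
X = np.array([global_features(img) for img in images])

knn = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
svm = SVC(kernel="rbf").fit(X, labels)
print(knn.predict(X[:5]), svm.predict(X[:5]))
```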

Keywords: signature, ridge breaks, minutiae, orientation

Procedia PDF Downloads 141
712 Multi-Campus Universities: Exploring Structures and Administrative Relationships: A Comparative Study of Eight Universities in the UK and Five in Pakistan

Authors: Laila Akbarali

Abstract:

In this small-scale study, an attempt is made to explore the structures and administrative relationships adopted by Multi-Campus Universities (MCUs) in the UK and Pakistan and how these universities deal with selected issues with respect to student-related functions. For this study, literature on multi-site, divisionalized and other complex organizations related to business and industry was consulted, and an attempt was made to empirically test the normative models in the literature with respect to centralized, deconcentrated and decentralized structures. A questionnaire was used to gather data, and purposive sampling was applied. The findings of this study differ somewhat between the UK and Pakistan. Contrary to a substantial body of organization theory, the results show that deconcentrated and decentralized universities in the UK are prone to delays in decision making and tend not to be sensitive to local needs. In Pakistan, on the other hand, deconcentrated and decentralized universities are more sensitive to local needs and experience fewer delays in decision making. The findings suggest that distance and reporting relationships could perhaps be responsible for this contradiction. The results also suggest that there is better coordination when the subsidiary campus sub-registrar reports to the registrar. The findings also highlight that, in both contexts, leadership at the campus level remains an issue. The results suggest that there may be factors other than structure that allow universities to keep their identity intact. The study highlights that MCUs are inclined to use information technology and to develop broad policies within which they allow their campuses to operate.

Keywords: administrative relationships, Multi-Campus, organization structure, registrar

Procedia PDF Downloads 316
711 Ultrasonic Spectroscopy of Polymer Based PVDF-TrFE Composites with CNT Fillers

Authors: J. Belovickis, V. Samulionis, J. Banys, M. V. Silibin, A. V. Solnyshkin, A. V. Sysa

Abstract:

Ferroelectric polymers exhibit good flexibility, processability and low cost of production. Doping ferroelectric polymers with nanofillers may modify their dielectric, elastic or piezoelectric properties. Carbon nanotubes are one of the ingredients that can improve the mechanical properties of polymer-based composites. In this work, we report on both the ultrasonic and the dielectric properties of the copolymer polyvinylidene fluoride/trifluoroethylene (P(VDF-TrFE)) of composition 70/30 mol% with various concentrations of carbon nanotubes (CNT). An experimental study of ultrasonic wave attenuation and velocity in these composites has been performed over a wide temperature range (100 K – 410 K) using an automatic ultrasonic pulse-echo technique. The temperature dependences of ultrasonic velocity and attenuation showed anomalies attributed to the glass transition and the paraelectric-ferroelectric phase transition. Our investigations showed the mechanical losses to depend on the volume fraction of the CNTs within the composites. A broad hysteresis of the ultrasonic wave attenuation and velocity between cooling and heating cycles is observed within the nanocomposites. By means of dielectric spectroscopy, it is shown that the dielectric properties may be tuned by varying the volume fraction of the CNT fillers.
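
As a minimal sketch of the kind of quantities a pulse-echo measurement yields (the formulas are standard and the numbers below are assumed, not the authors' data): velocity follows from the sample thickness and the round-trip delay, and attenuation from the amplitude ratio of successive back-wall echoes.

```python
# Sketch only: longitudinal velocity and attenuation from a pulse-echo measurement.
import numpy as np

d = 1.0e-3          # sample thickness in m (assumed value)
dt = 0.9e-6         # round-trip delay between successive echoes in s (assumed value)
A1, A2 = 1.00, 0.62 # amplitudes of successive echoes (assumed values)

velocity = 2 * d / dt                           # m/s
attenuation = 20 * np.log10(A1 / A2) / (2 * d)  # dB/m
print(f"v = {velocity:.0f} m/s, alpha = {attenuation:.1f} dB/m")
```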

Keywords: carbon nanotubes, polymer composites, PVDF-TrFE, ultrasonic spectroscopy

Procedia PDF Downloads 335
710 The Role of Psychological Resilience in Predicting Psychological Distress in Kuwaiti Adults during the Coronavirus Pandemic

Authors: Al-Tammar M. Shahah

Abstract:

Background and Objective: A novel pneumonia caused by coronavirus disease 2019 (COVID-19), which is spreading domestically and internationally, has been identified in the Chinese city of Wuhan since the end of December 2019. Few studies have examined psychological experiences such as anxiety, depression, and stress during the corona pandemic. Moreover, to the best of the author's knowledge, no study to date has examined psychological resilience and mental health during the corona pandemic in Kuwait. Therefore, the present research investigates the role of psychological resilience in predicting psychological distress among Kuwaiti adults during the corona pandemic. Method: Kuwaiti citizens (N = 735) completed an online survey comprising the Hospital Anxiety and Depression Scale (HADS; anxiety and depression subscales), the Connor-Davidson Resilience Scale (CD-RISC-25), and the Perceived Stress Scale (PSS). A cross-sectional correlational design was used. Results: A high level of stress was observed, with 59% reporting moderate to severe stress. In contrast, low levels of anxiety and depression were observed, with 70% reporting no anxiety symptoms and 74% reporting no depression symptoms. Psychological resilience was negatively correlated with anxiety, depression, and stress, consistent with previous studies. As expected, resilience was found to account for significant variance in anxiety and stress after controlling for quarantine and demographic variables. Conclusion: The findings suggest that increasing psychological resilience might help reduce psychological distress after confronting stressful life events among Kuwaiti citizens.
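
An illustrative sketch (not the study's analysis code) of the hierarchical regression implied above: resilience is entered after the covariates and the change in explained variance is inspected. Variable names and the synthetic data are assumptions.

```python
# Sketch only: hierarchical regression - covariates first, then resilience.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 735
df = pd.DataFrame({
    "age": rng.integers(18, 70, n),
    "quarantined": rng.integers(0, 2, n),
    "resilience": rng.normal(60, 12, n),
})
df["stress"] = 20 - 0.15 * df["resilience"] + rng.normal(0, 3, n)

step1 = smf.ols("stress ~ age + quarantined", data=df).fit()
step2 = smf.ols("stress ~ age + quarantined + resilience", data=df).fit()
print(f"R2 step 1: {step1.rsquared:.3f}, step 2: {step2.rsquared:.3f}, "
      f"delta R2: {step2.rsquared - step1.rsquared:.3f}")
```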

Keywords: anxiety, corona, depression, psychological resilience, stress

Procedia PDF Downloads 116
709 Promoting Couple HIV Testing among Migrants for HIV Prevention: Learnings from Integrated Counselling and Testing Centre (ICTC) in Odisha, India

Authors: Sunil Mekale, Debasish Chowdhury, Sanchita Patnaik, Amitav Das, Ashok Agarwal

Abstract:

Background: Odisha is a low HIV prevalence state in India (ANC-HIV positivity of 0.42% as per HIV sentinel surveillance 2010-2011); however, it is an important source-migration state, with 3.2% of male migrants reporting to be PLHIV. The USAID Public Health Foundation of India PIPPSE project is piloting a source-destination corridor programme between Odisha and Gujarat. In Odisha, the focus has been on developing a comprehensive strategy to reach out-migrants and their spouses in their place of origin, based on their availability. The project has made concerted attempts to identify vulnerable districts with high out-migration and high positivity rates. Description: 48 out of 97 ICTCs were selected from the nine districts with the highest out-migration through multistage sampling. A retrospective descriptive analysis of HIV positive male migrants and their spouses over two years (April 2013-March 2015) was conducted. A total of 3,645 HIV positive records were analysed. Findings: Male migrants (23.3%) and spouses of male migrants (11%) together accounted for 34.2% of those detected HIV positive in the ICTCs, and for almost 50% of total ICTC attendees. More than 70% of the PLHIV male migrants and their spouses were less than 45 years old. Conclusions: A couple HIV testing approach may be considered for male migrants and their spouses. ICTC data analysis could guide the identification of locations with high HIV positivity among male migrants and their spouses.

Keywords: HIV testing, migrants, spouse of migrants, Integrated Counselling and Testing Centre (ICTC)

Procedia PDF Downloads 372
708 Studying the Possibility to Weld AA1100 Aluminum Alloy by Friction Stir Spot Welding

Authors: Ahmad K. Jassim, Raheem Kh. Al-Subar

Abstract:

Friction stir welding is a modern and environmentally friendly solid-state joining process used to join relatively light families of materials. Recently, friction stir spot welding has been used instead of resistance spot welding, which has received considerable attention from the automotive industry. It is an environmentally friendly process that eliminates heat and pollution. In this research, friction stir spot welding has been used to study the possibility of welding AA1100 aluminum alloy sheets of 3 mm thickness by overlapping the edges of the sheets as a lap joint. The process was carried out using a drilling machine instead of a milling machine. Different tool rotational speeds of 760, 1065, 1445, and 2000 RPM were applied with manual and automatic compression to study their effect on the quality of the welded joints. Heat generation, applied pressure, and depth of tool penetration were measured during the welding process. The results show that it is possible to weld AA1100 sheets; however, some surface defects occurred due to insufficient welding conditions. Moreover, the relationship between rotational speed, pressure, heat generation and tool penetration depth was established.

Keywords: friction, spot, stir, environmental, sustainable, AA1100 aluminum alloy

Procedia PDF Downloads 192
707 Evaluation of Clinical Decision Support System in Electronic Medical Record System: A Case of Malawi National Art Electronic Medical Record System

Authors: Pachawo Bisani, Goodall Nyirenda

Abstract:

The Malawi National Antiretroviral Therapy (NART) Electronic Medical Record (EMR) system was designed and developed with guidance from the Ministry of Health through the Department of HIV and AIDS (DHA), with the aim of supporting the management of HIV patient data and reporting in high-prevalence ART clinics. As of 2021, the system has been scaled up to over 206 facilities across the country. The system is integrated with a clinical decision support system (CDSS) to assist healthcare providers in making decisions about an individual patient at a particular point in time. Despite the NART EMR undergoing several evaluations and assessments, little has been done to evaluate the clinical decision support system within it. Hence, this study aimed to evaluate the use of the CDSS in the NART EMR system in Malawi. The study adopted a mixed-method approach, and data were collected through interviews, observations, and questionnaires. The study revealed that the CDSS tools were integrated into the ART clinic workflow, making them easy to use. The study also revealed challenges in system reliability and information accuracy. Despite the challenges, the study further revealed that the system is effective and efficient, and overall, users are satisfied with it. The study recommends that the implementers focus more on the logic behind the clinical decision-support intervention in order to address some of the concerns and enhance the accuracy of the information supplied. The study further suggests consulting the system's actual users throughout implementation.

Keywords: clinical decision support system, electronic medical record system, usability, antiretroviral therapy

Procedia PDF Downloads 87
706 ECG Based Reliable User Identification Using Deep Learning

Authors: R. N. Begum, Ambalika Sharma, G. K. Singh

Abstract:

Identity theft has serious ramifications beyond data and personal information loss. This necessitates the implementation of robust and efficient user identification systems. Automatic biometric recognition systems are therefore the need of the hour, and ECG-based systems are unquestionably the best choice due to their appealing inherent characteristics. CNNs are the recent state-of-the-art technique for ECG-based user identification systems. However, the results obtained are significantly below standards, and the situation worsens as the number of users and types of heartbeats in the dataset grows. As a result, this study proposes a highly accurate and resilient ECG-based person identification system using a densely connected CNN (DenseNet) framework. The proposed research explicitly explores the calibre of dense CNNs in the field of ECG-based human recognition. The study tests four different configurations of dense CNNs, which are trained on a dataset of recordings collected from eight popular ECG databases. With a highest FAR of 0.04% and a highest FRR of 5%, the best-performing network achieved an identification accuracy of 99.94%. The best network is also tested with various train/test split ratios. The findings show that DenseNets are not only extremely reliable but also highly efficient. Thus, they might also be implemented in real-time ECG-based human recognition systems.
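
As a minimal sketch (assumed data, not the authors' pipeline) of the error rates quoted above: false acceptance rate (FAR) and false rejection rate (FRR) follow from thresholding similarity scores of genuine and impostor comparisons.

```python
# Sketch only: FAR/FRR computation from biometric match scores.
import numpy as np

rng = np.random.default_rng(0)
genuine_scores = rng.normal(0.9, 0.05, 1000)   # same-person comparisons (assumed)
impostor_scores = rng.normal(0.3, 0.10, 1000)  # different-person comparisons (assumed)
threshold = 0.6

far = np.mean(impostor_scores >= threshold)  # impostors wrongly accepted
frr = np.mean(genuine_scores < threshold)    # genuine users wrongly rejected
print(f"FAR = {far:.2%}, FRR = {frr:.2%}")
```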

Keywords: biometrics, dense networks, identification rate, train/test split ratio

Procedia PDF Downloads 157
705 Using Risk Management Indicators in Decision Tree Analysis

Authors: Adel Ali Elshaibani

Abstract:

Risk management indicators augment the reporting infrastructure, particularly for the board and senior management, to identify, monitor, and manage risks. This enhancement facilitates improved decision-making throughout the banking organization. Decision tree analysis is a tool that visually outlines potential outcomes, costs, and consequences of complex decisions. It is particularly beneficial for analyzing quantitative data and making decisions based on numerical values. By calculating the expected value of each outcome, decision tree analysis can help assess the best course of action. In the context of banking, decision tree analysis can assist lenders in evaluating a customer’s creditworthiness, thereby preventing losses. However, applying these tools in developing countries may face several limitations, such as data availability, lack of technological infrastructure and resources, lack of skilled professionals, cultural factors, and cost. Moreover, decision trees can create overly complex models that do not generalize well to new data, known as overfitting. They can also be sensitive to small changes in the data, which can result in different tree structures and can become computationally expensive when dealing with large datasets. In conclusion, while risk management indicators and decision tree analysis are beneficial for decision-making in banks, their effectiveness is contingent upon how they are implemented and utilized by the board of directors, especially in the context of developing countries. It’s important to consider these limitations when planning to implement these tools in developing countries.
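
An illustrative sketch (figures assumed, not from the abstract) of the expected-value calculation behind a simple credit decision tree: approve or decline a loan given a default probability, a loss on default, and the interest income if repaid.

```python
# Sketch only: expected value of the two branches of a credit decision tree.
p_default = 0.08          # assumed probability of default
loss_if_default = 10_000  # assumed loss on default
gain_if_repaid = 1_200    # assumed interest income if repaid

ev_approve = (1 - p_default) * gain_if_repaid - p_default * loss_if_default
ev_decline = 0.0
decision = "approve" if ev_approve > ev_decline else "decline"
print(f"EV(approve) = {ev_approve:.0f}, EV(decline) = {ev_decline:.0f} -> {decision}")
```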

Keywords: risk management indicators, decision tree analysis, developing countries, board of directors, bank performance, risk management strategy, banking institutions

Procedia PDF Downloads 53
704 Synthesis of Plant-Mediated Silver Nanoparticles Using Erythrina indica Extract and Evaluation of Their Anti-Microbial Activities

Authors: Chandra Sekhar Singh, P. Chakrapani, B. Arun Jyothi, A. Roja Rani

Abstract:

The green synthesis of metallic nanoparticles (NPs) involves biocompatible ingredients under physiological conditions of temperature and pressure. Moreover, the biologically active molecules involved in the green synthesis of NPs act as functionalizing ligands, making these NPs more suitable for biomedical applications. Among the most important bioreductants are plant extracts, which are relatively easy to handle, readily available, low cost, and well explored for the green synthesis of other nanomaterials. Various types of metallic NPs have already been synthesized using plant extracts. They have wide applicability in areas such as electronics, catalysis, chemistry, energy, and medicine. Metallic nanoparticles are traditionally synthesized by wet chemical techniques, where the chemicals used are quite often toxic and flammable. In our study, we describe a cost-effective and environmentally friendly technique for the green synthesis of silver nanoparticles from 1 mM AgNO3 solution using the aqueous extract of Erythrina indica as both reducing and capping agent. The nanoparticles were characterized using UV–Vis absorption spectroscopy, FTIR, X-ray diffraction (XRD), SEM and TEM analysis, which showed an average particle size of 30 nm and revealed their spherical structure. Furthermore, these biologically synthesized nanoparticles were found to be highly toxic against different human pathogens, namely the Gram-positive bacteria Bacillus subtilis and Staphylococcus aureus and the Gram-negative bacteria Klebsiella pneumoniae and Escherichia coli (E. coli). To our knowledge, this is the first report of Erythrina indica plant extract being used for the synthesis of nanoparticles.

Keywords: silver nanoparticles, green synthesis, antibacterial activity, FTIR, TEM, SEM

Procedia PDF Downloads 492
703 Automated Classification of Hypoxia from Fetal Heart Rate Using Advanced Data Models of Intrapartum Cardiotocography

Authors: Malarvizhi Selvaraj, Paul Fergus, Andy Shaw

Abstract:

Uterine contractions produced during labour have the potential to damage the foetus by diminishing the maternal blood flow to the placenta. In order to observe this phenomenon, labour and delivery are routinely monitored using cardiotocography monitors. An obstetrician usually makes the diagnosis of foetal hypoxia by interpreting cardiotocography recordings. However, cardiotocography capture and interpretation is time-consuming and subjective, often leading to misclassification that causes damage to the foetus and unnecessary caesarean sections. Both of these have a high impact on the foetus and on the cost to national healthcare services. Automatic detection of the foetal heart rate may be an objective solution to help reduce unnecessary medical interventions, as reported in several studies. The aim of this paper is to provide a system for better identification and interpretation of abnormalities of the fetal heart rate using RStudio. An open dataset of 552 intrapartum recordings has been filtered with 0.034 Hz filters in an attempt to remove noise while keeping as much of the discriminative data as possible. Features were chosen following an extensive literature review, which concluded with FIGO features such as acceleration, deceleration, mean, variance and standard deviation. The five features were extracted from the 552 recordings. Using these features, recordings are classified as either normal or abnormal; an abnormal recording indicates a higher likelihood of hypoxia.
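
A minimal sketch (not the study's RStudio code; sampling rate and excursion thresholds are assumptions) of extracting the five FIGO-style features named above from a fetal heart rate trace.

```python
# Sketch only: accelerations, decelerations, mean, variance and std from an FHR trace.
import numpy as np

def figo_features(fhr, fs=4, excursion=15, min_duration=15):
    """Count excursions of >= `excursion` bpm from baseline lasting >= `min_duration` s."""
    baseline = np.median(fhr)
    above = fhr > baseline + excursion
    below = fhr < baseline - excursion
    min_samples = int(min_duration * fs)

    def count_runs(mask):
        runs, length = 0, 0
        for flag in mask:
            length = length + 1 if flag else 0
            if length == min_samples:   # count each run once when it reaches min length
                runs += 1
        return runs

    return {
        "accelerations": count_runs(above),
        "decelerations": count_runs(below),
        "mean": float(np.mean(fhr)),
        "variance": float(np.var(fhr)),
        "std": float(np.std(fhr)),
    }

rng = np.random.default_rng(0)
trace = 140 + rng.normal(0, 5, 4 * 60 * 20)  # 20 minutes of synthetic FHR at 4 Hz
print(figo_features(trace))
```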

Keywords: cardiotocography, foetus, intrapartum, hypoxia

Procedia PDF Downloads 213
702 Reducing Support Structures in Design for Additive Manufacturing: A Neural Networks Approach

Authors: Olivia Borgue, Massimo Panarotto, Ola Isaksson

Abstract:

This article presents a neural networks-based strategy for reducing the need for support structures when designing for additive manufacturing (AM). Additive manufacturing is a relatively new and immature industrial technology, and the information needed to make confident decisions when designing for AM is limited. This lack of information especially impacts the early stages of engineering design; for instance, it is difficult to actively consider the support structures needed for manufacturing a part. This difficulty is related to the challenge of designing a product geometry that accounts for customer requirements, manufacturing constraints and minimization of support structures. The approach presented in this article proposes an automated geometry modification technique for reducing the use of support structures while designing for AM. The strategy starts with a neural network-based shape recognition step to achieve product classification, using an STL file of the product as input. Based on the classification, an automatic part geometry modification implemented in MATLAB is applied. At the end of the process, the strategy presents different geometry modification alternatives depending on the type of product to be designed. The geometry alternatives are then evaluated using a QFD-like decision support tool.
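
An illustrative sketch (not the authors' MATLAB/NN pipeline) of shape recognition for part classification: simple geometric descriptors are computed from triangle-mesh data (standing in for parsed STL files) and classified with a small neural network. All data and class labels below are assumptions.

```python
# Sketch only: mesh descriptors + MLP classification of AM part types.
import numpy as np
from sklearn.neural_network import MLPClassifier

def mesh_descriptors(triangles):
    """triangles: (n, 3, 3) array of vertex coordinates."""
    v = triangles.reshape(-1, 3)
    extents = v.max(axis=0) - v.min(axis=0)
    # per-triangle area from the cross product of two edge vectors
    areas = np.linalg.norm(np.cross(triangles[:, 1] - triangles[:, 0],
                                    triangles[:, 2] - triangles[:, 0]), axis=1) / 2
    return np.array([extents[0] / extents[2], extents[1] / extents[2], areas.sum()])

rng = np.random.default_rng(0)
meshes = [rng.random((200, 3, 3)) * rng.uniform(1, 5, 3) for _ in range(60)]
labels = rng.integers(0, 3, 60)  # assumed product classes
X = np.array([mesh_descriptors(m) for m in meshes])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, labels)
print(clf.predict(X[:5]))
```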

Keywords: additive manufacturing, engineering design, geometry modification optimization, neural networks

Procedia PDF Downloads 246
701 Sustainable Management Practices in Facilities Management for Housing Estates: Literature Review - Case of South Africa

Authors: Chidozie Celestine Uzoigwe, Thabelo Ramantswana

Abstract:

Purpose: The purpose of the paper is to review the current state of knowledge in Sustainable Management (SM) practices in Facilities Management (FM) for housing estates, with a view to identifying gaps and drawbacks in the existing body of knowledge. Integrating SM practices in housing estates is understood and applied in some developed countries, but little is known about the concept among practitioners in developing nations like South Africa. Indeed, South African housing studies and policy documents emphasize the significance of sustainability practices in housing estates, but regrettably, housing has remained unsustainable for decades. Method: This is done through a comprehensive literature review in combination with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines and the Recursive Content Abstraction (RCA) analytical approach. Findings: The review revealed significant challenges to the integration of SM practices in FM for housing estates in South Africa, such as a lack of understanding and expertise among practitioners, an absence of support from top-level management, the lack of an appropriate sustainability management framework, and people management. The review further suggests the need for an enabler to guide practitioners in integrating the SM concept in housing estates. Originality: Integrating SM principles in facilities management for housing estates is required to eliminate the detrimental impact the built environment exerts on the well-being of individuals and organizations. Thus, the study underlines the need for an enabler that will help practitioners embed sustainable management measures in the management of housing estates in South Africa.

Keywords: facilities management, housing estates, sustainable facilities management, sustainable management, South Africa

Procedia PDF Downloads 87
700 Enhanced Arabic Semantic Information Retrieval System Based on Arabic Text Classification

Authors: A. Elsehemy, M. Abdeen, T. Nazmy

Abstract:

Since the appearance of the Semantic Web, many semantic search techniques and models have been proposed to exploit the information in ontologies to enhance traditional keyword-based search. Many advances have been made for languages such as English, German, French and Spanish. However, other languages such as Arabic are not yet fully supported. In this paper we present a framework for ontology-based information retrieval for the Arabic language. Our system consists of four main modules, namely a query parser, an indexer, a search module and a ranking module. Our approach includes building a semantic index by linking ontology concepts to documents, including an annotation weight for each link, to be used in ranking the results. We also augmented the framework with an automatic document categorizer, which enhances the overall document ranking. We have built three Arabic domain ontologies, Sports, Economics and Politics, as examples for the Arabic language, and a knowledge base that consists of 79 classes and more than 1,456 instances. The system is evaluated using the precision and recall metrics. We have run many retrieval operations on a sample of 40,316 documents with a total size of 320 MB of pure text. The results show that semantic search enhanced with text classification gives better performance than the system without classification.
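
A minimal sketch (assumed data) of the precision and recall metrics used to evaluate the retrieval system, computed for a single query.

```python
# Sketch only: precision and recall for one retrieval query.
retrieved = {"d1", "d3", "d4", "d7"}   # documents returned by the system (assumed)
relevant = {"d1", "d2", "d3"}          # ground-truth relevant documents (assumed)

true_positives = len(retrieved & relevant)
precision = true_positives / len(retrieved)
recall = true_positives / len(relevant)
print(f"precision = {precision:.2f}, recall = {recall:.2f}")
```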

Keywords: Arabic text classification, ontology based retrieval, Arabic semantic web, information retrieval, Arabic ontology

Procedia PDF Downloads 519
699 Comparing Machine Learning Estimation of Fuel Consumption of Heavy-Duty Vehicles

Authors: Victor Bodell, Lukas Ekstrom, Somayeh Aghanavesi

Abstract:

Fuel consumption (FC) is one of the key factors in determining the expense of operating a heavy-duty vehicle. A customer may therefore request an estimate of the FC of a desired vehicle. The modular design of heavy-duty vehicles allows their construction by specifying building blocks, such as gear box, engine and chassis type. If the combination of building blocks is unprecedented, it is unfeasible to measure the FC, since this would first require the construction of the vehicle. This paper proposes a machine learning approach to predict FC. The study uses data on around 40,000 vehicles, covering vehicle specifications and operational environmental conditions such as road slopes and driver profiles. All vehicles have diesel engines and a mileage of more than 20,000 km. The data is used to investigate the accuracy of the machine learning algorithms linear regression (LR), k-nearest neighbor (KNN) and artificial neural networks (ANN) in predicting fuel consumption for heavy-duty vehicles. The performance of the algorithms is evaluated by reporting the prediction error on both simulated data and operational measurements, and is compared using nested cross-validation and statistical hypothesis testing. The statistical evaluation procedure finds that ANNs have the lowest prediction error compared to LR and KNN in estimating fuel consumption on both simulated and operational data. The models have a mean relative prediction error of 0.3% on simulated data and 4.2% on operational data.
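
An illustrative sketch (not the authors' pipeline) of the nested cross-validation comparison described above: an inner loop tunes hyperparameters, an outer loop estimates the prediction error for LR, KNN and a small neural network. The synthetic data and parameter grids are assumptions.

```python
# Sketch only: nested CV comparison of LR, KNN and an MLP for fuel-consumption regression.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))  # stand-in for vehicle spec / operational features
y = 30 + X @ rng.normal(size=6) + rng.normal(scale=1.0, size=500)  # litres/100 km

inner = KFold(5, shuffle=True, random_state=0)
outer = KFold(5, shuffle=True, random_state=1)
models = {
    "LR": LinearRegression(),
    "KNN": GridSearchCV(KNeighborsRegressor(), {"n_neighbors": [3, 5, 9]}, cv=inner),
    "ANN": GridSearchCV(MLPRegressor(max_iter=3000),
                        {"hidden_layer_sizes": [(16,), (32,)]}, cv=inner),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=outer,
                             scoring="neg_mean_absolute_percentage_error")
    print(f"{name}: mean relative error = {-scores.mean():.1%}")
```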

Keywords: artificial neural networks, fuel consumption, friedman test, machine learning, statistical hypothesis testing

Procedia PDF Downloads 174
698 e-Learning Security: A Distributed Incident Response Generator

Authors: Bel G Raggad

Abstract:

An e-Learning setting is a distributed computing environment where information resources can be connected to any public network. Public networks are very insecure, which can compromise the reliability of an e-Learning environment. This study is concerned only with the intrusion detection aspect of e-Learning security and how incident responses are planned. The literature reports great advances in intrusion detection systems (IDS) but neglects an important IDS weakness: suspected events are detected, but an intrusion is not determined because it is not defined in the IDS databases. We propose a distributed incident response generator (DIRG) that produces incident responses when the working IDS suspects an event that does not correspond to a known intrusion. Data involved in intrusion detection when ample uncertainty is present is often not suitable for formal statistical models, including Bayesian ones. We instead adopt Dempster-Shafer theory to process intrusion data for the unknown event. The DIRG engine transforms data into a belief structure using incident scenarios deduced by the security administrator. Belief values associated with the various incident scenarios are then derived and evaluated to choose the most appropriate scenario, for which an automatic incident response is generated. This article provides a numerical example demonstrating the working of the DIRG system.
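
A minimal sketch (mass values assumed, not taken from the paper) of Dempster's rule of combination, the core of the belief-structure processing described above: two mass functions over candidate incident scenarios are combined and the conflict is renormalized.

```python
# Sketch only: Dempster's rule of combination over incident-scenario mass functions.
from itertools import product

def combine(m1, m2):
    """Combine two mass functions whose focal elements are frozensets of scenarios."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    return {k: v / (1 - conflict) for k, v in combined.items()}

# Evidence from two sensors about scenarios S1 (port scan) and S2 (worm) - assumed values
m_sensor1 = {frozenset({"S1"}): 0.6, frozenset({"S1", "S2"}): 0.4}
m_sensor2 = {frozenset({"S2"}): 0.3, frozenset({"S1", "S2"}): 0.7}
print(combine(m_sensor1, m_sensor2))
```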

Keywords: decision support system, distributed computing, e-Learning security, incident response, intrusion detection, security risk, stateful inspection

Procedia PDF Downloads 429
697 Quality Management in Spice Paprika Production as a Synergy of Internal and External Quality Measures

Authors: É. Kónya, E. Szabó, I. Bata-Vidács, T. Deák, M. Ottucsák, N. Adányi, A. Székács

Abstract:

Spice paprika is a major spice commodity in the European Union (EU), produced locally and imported from non-EU countries, and has been reported not only for chemical and microbiological contamination but also for fraud. The effective interaction between producers' quality management practices and government and EU activities is described using the example of spice paprika production and control in Hungary, a leading spice paprika producer and per capita consumer in Europe. To demonstrate the importance of various contamination factors in the Hungarian production and EU trade of spice paprika, several aspects concerning the food safety of this commodity are presented. Alerts in the Rapid Alert System for Food and Feed (RASFF) of the EU between 2005 and 2013, as well as Hungarian state inspection results on spice paprika in 2004, are discussed, and quality non-compliance claims regarding spice paprika among EU member states are summarized by means of network analysis. Quality assurance measures established along the spice paprika production technology chain at the leading Hungarian spice paprika manufacturer, Kalocsai Fűszerpaprika Zrt., are surveyed with the main critical control points identified. The structure and operation of the Hungarian state food safety inspection system are described. The concerted performance of the latter two quality management systems illustrates the effective interaction between internal (manufacturer) and external (state) quality control measures.

Keywords: spice paprika, quality control, reporting mechanisms, RASFF, vulnerable points, HACCP

Procedia PDF Downloads 282
696 Security as the Key Factor in Contemporary Tourism: Specificities Identified from the Analysis of Responders' Attitudes

Authors: Petar Kurecic, Josipa Penic

Abstract:

The paper is a product of mentor-graduate student cooperation, developed within the graduate study of Business Economics, major in Tourism. The analysis was made through an anonymous questionnaire filled in by respondents from Croatia. Following the latest threatening events and bearing in mind those yet to come, it can be concluded that no country can benefit from the tourism industry if it does not at the same time develop its security system as an integral part of the standard tourist offer. Analyzing trends in contemporary tourism, safety and security issues have become decisive factors in the choice of a destination. Consequently, countries must not perceive security systems and measures as an unnecessary expense but as an essential element in organizing their tourist services. All hotels and respectable tourist agencies should have crisis management in place, with detailed, thoroughly elaborated procedures for emergency situations. Tourists should be informed in a timely manner about potential dangers and risks and the measures taken to prevent them, as well as about procedures for emergency situations. Additionally, it would be useful to have mobile applications that enable tourists to make direct emergency calls, with instructions on behavior in crisis situations. It is also essential to implement and put into effect sophisticated security measures such as using surveillance cameras, controlling access to buildings, exchanging information with colleagues and neighbors, reporting suspicious occurrences to the security services, and training staff for crisis management. The security issue is definitely one of the crucial factors in the development of tourism in a country.

Keywords: security, security measures in tourism, tourism, tourist destinations

Procedia PDF Downloads 278
695 The Impact of Data Science on Geography: A Review

Authors: Roberto Machado

Abstract:

We conducted a systematic review using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses methodology, analyzing 2,996 studies and synthesizing 41 of them to explore the evolution of data science and its integration into geography. By employing optimization algorithms, we accelerated the review process, significantly enhancing the efficiency and precision of literature selection. Our findings indicate that data science has developed over five decades, facing challenges such as the diversified integration of data and the need for advanced statistical and computational skills. In geography, the integration of data science underscores the importance of interdisciplinary collaboration and methodological innovation. Techniques like large-scale spatial data analysis and predictive algorithms show promise in natural disaster management and transportation route optimization, enabling faster and more effective responses. These advancements highlight the transformative potential of data science in geography, providing tools and methodologies to address complex spatial problems. The relevance of this study lies in the use of optimization algorithms in systematic reviews and the demonstrated need for deeper integration of data science into geography. Key contributions include identifying specific challenges in combining diverse spatial data and the necessity for advanced computational skills. Examples of connections between these two fields encompass significant improvements in natural disaster management and transportation efficiency, promoting more effective and sustainable environmental solutions with a positive societal impact.

Keywords: data science, geography, systematic review, optimization algorithms, supervised learning

Procedia PDF Downloads 15
694 HEAT: A Healthy Eating Programme

Authors: Osagbai Joshua Eriki, Ngozi Agunwamba, Alice Hill, Lorna Almond, Maniya Duffy, Devashini Naidoo, David Ho, Raman Deo

Abstract:

Aims: To evaluate the baseline eating pattern in a psychiatric hospital by quantifying purchases of food and drink items at the hospital shop and to implement a traffic-light healthy eating labeling system. Method: An electronic till with reporting capabilities was purchased, and a two-week period of baseline data collection was conducted. Thereafter, a system for labeling items based on the nutritional value of the food items at the hospital shop was implemented: green labeling represented the items with the lowest calories and red the highest. Further data were collated on the number and types of items purchased by patients according to category, and the initial effectiveness of the system was evaluated. Result: Despite the implementation of the traffic-light system, the red category had the highest number of items purchased by patients, highlighting the importance of promoting healthy eating choices. However, the study also showed that the system was effective in promoting healthy options, as the number of items purchased from the green category increased during the study period. Conclusion: The implementation of a traffic-light labeling system for items sold at the hospital shop offers a promising approach to promoting healthy eating habits and choices. This is likely to contribute to a toolkit of measures when considering the multifactorial challenges that obesity and weight issues pose for long-stay psychiatric inpatients.

Keywords: mental health, nutrition, food, healthy

Procedia PDF Downloads 96
693 A New Intelligent, Dynamic and Real Time Management System of Sewerage

Authors: R. Tlili Yaakoubi, H. Nakouri, O. Blanpain, S. Lallahem

Abstract:

The current tools for real-time management of sewer systems are based on two software tools: weather forecasting software and hydraulic simulation software. The former is an important source of imprecision and uncertainty, while the latter imposes long decision time steps because of its computation time. As a result, the obtained outcomes are generally different from those expected. The major idea of this project is to change the basic paradigm by approaching the problem from the 'automatic control' side rather than from the 'hydrology' side. The objective is to make it possible to run a large number of simulations in very short times (a few seconds), allowing weather forecasts to be replaced by direct use of real-time measured rainfall data. The aim is to reach a system where decision-making is based on reliable data and where error correction is permanent. A first model of control laws was developed and tested with rainfalls of different return periods. The gains obtained in rejected volume vary from 19 to 100%. A new algorithm was then developed to optimize calculation time and thus to overcome the combinatorial problem encountered in our first approach. Finally, this new algorithm was tested with a 16-year rainfall series. The obtained gains are 40% in the total volume discharged to the natural environment and 65% in the number of discharges.

Keywords: automation, optimization, paradigm, RTC

Procedia PDF Downloads 293
692 Smartphones as a Tool of Mobile Journalism in Saudi Arabia

Authors: Ahmed Deen

Abstract:

The introduction of mobile devices equipped with internet access, a camera and messaging services has become a major driver of the growth of mobile news reporting. Mobile journalism (MOJO) is a creation of modern technology, especially the use of mobile technology for video journalism purposes. MOJO, thus, is the process by which information is collected and disseminated to society through the use of mobile technology, including tablets. This paper seeks to better understand the ethics of Saudi mobile journalists towards news coverage. This study also aims to explore the relationship between minimizing harm and truth-seeking efforts among Saudi mobile journalists. Three main ethical principles were targeted in this study: seeking truth and reporting it, minimizing harm, and being accountable. Diffusion of innovation theory was applied to reach this study's goals. A non-probability snowball sampling approach was used to recruit 124 participants for an online survey via SurveyMonkey, distributed through social media platforms as a web link. The code of ethics of the Society of Professional Journalists was applied as a scale in this study. This study found that the relationship between minimizing harm and truth-seeking efforts is moderate and significant among Saudi mobile journalists. It was also found that the level of journalistic experience and the use of smartphones to cover news are weakly and negatively related to perceptions of mobile journalism among Saudi journalists, while Saudi journalists who had used their smartphones to cover news for between one and three years formed the majority of participants (55 participants, 51.4%).

Keywords: mobile journalism, Saudi journalism, smartphone, Saudi Arabia

Procedia PDF Downloads 168
691 Optimizing Fire Suppression Time in Buildings by Forming a Fire Feedback Loop

Authors: Zhdanova A. O., Volkov R. S., Kuznetsov G. V., Strizhak P. A.

Abstract:

Fires in different types of facilities are a serious problem worldwide. It is still an unaccomplished science and technology objective to establish the minimum number and type of sensors in automatic compartment fire suppression systems which would turn the fire-extinguishing agent spraying on and off in real time depending on the state of the fire, and minimize the amount of agent applied, the delay in fire suppression and system response, and the time of combustion suppression. Based on the results of experimental studies, the conclusion was drawn that it is reasonable to use a gas analysis system and heat sensors (in the event of their prior activation) to determine the effectiveness of fire suppression (i.e., whether the fire-extinguishing composition is interacting with the fire). Thus, the concentration of CO during the interaction of the firefighting liquid with the fire increases to 0.7–1.2%, which indicates a slowdown in flame combustion, and heat sensors stop responding at a gas medium temperature below 80 ºC, which shows a gradual decrease in the heat release from the fire. The evidence from this study suggests that the information received from video recording equipment (a video camera) should be used in real time as an additional parameter confirming fire suppression. The research was supported by the Russian Science Foundation (project No. 21-19-00009, https://rscf.ru/en/project/21-19-00009/).
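
As an illustrative sketch only (thresholds taken from the abstract, the control logic itself is an assumption): a feedback loop keeps spraying while the gas sensors and camera still indicate combustion and closes the valve once CO rise, temperature drop and camera confirmation all signal suppression.

```python
# Sketch only: sensor-feedback criterion for stopping agent spraying.
def suppression_complete(co_percent, gas_temp_c, flame_visible):
    # CO of 0.7-1.2% indicates the agent is interacting with the fire; heat sensors stop
    # responding below 80 C; the camera confirms the flame is gone.
    return co_percent >= 0.7 and gas_temp_c < 80 and not flame_visible

def control_step(co_percent, gas_temp_c, flame_visible):
    return "valve_closed" if suppression_complete(co_percent, gas_temp_c, flame_visible) else "spraying"

print(control_step(co_percent=0.2, gas_temp_c=180, flame_visible=True))   # spraying
print(control_step(co_percent=0.9, gas_temp_c=65, flame_visible=False))   # valve_closed
```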

Keywords: compartment fires, fire suppression, continuous control of fire behavior, feedback systems

Procedia PDF Downloads 124
690 MhAGCN: Multi-Head Attention Graph Convolutional Network for Web Services Classification

Authors: Bing Li, Zhi Li, Yilong Yang

Abstract:

Web service classification can improve the quality of service discovery and management in a service repository, and it is widely used to locate the services developers desire. Although traditional classification methods based on supervised learning models can accomplish classification tasks, developers need to manually label web services, and the quality of these tags may not be sufficient to establish an accurate classifier for service classification. With the number of web services doubling, the manual tagging method has become unrealistic. In recent years, the attention mechanism has made remarkable progress in the field of deep learning, and its huge potential has been fully demonstrated in various fields. This paper designs a multi-head attention graph convolutional network (MHAGCN) service classification method, which can assign different weights to neighborhood nodes without complicated matrix operations or relying on understanding the entire graph structure. The framework combines the advantages of the attention mechanism and the graph convolutional neural network, and can classify web services through automatic feature extraction. Comprehensive experimental results on a real dataset not only show the superior performance of the proposed model over existing models but also demonstrate its potentially good interpretability for graph analysis.
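
An illustrative sketch (not the paper's model) of one multi-head attention aggregation step over a service graph, combining the two ideas named above: neighbourhood aggregation from graph convolution and learned neighbour weights from attention. All dimensions and data are assumptions.

```python
# Sketch only: multi-head attention aggregation over a graph adjacency matrix.
import numpy as np

def multi_head_attention_layer(X, A, W_list):
    """X: (n, d) node features, A: (n, n) adjacency with self-loops, W_list: per-head weights."""
    heads = []
    for W in W_list:
        H = X @ W                                    # project features for this head
        scores = H @ H.T / np.sqrt(H.shape[1])       # pairwise attention scores
        scores = np.where(A > 0, scores, -1e9)       # attend only to graph neighbours
        alpha = np.exp(scores - scores.max(axis=1, keepdims=True))
        alpha = alpha / alpha.sum(axis=1, keepdims=True)
        heads.append(alpha @ H)                      # weighted neighbourhood aggregation
    return np.concatenate(heads, axis=1)             # concatenate heads

rng = np.random.default_rng(0)
n, d, d_head, n_heads = 6, 8, 4, 2
X = rng.normal(size=(n, d))
A = np.eye(n) + (rng.random((n, n)) > 0.6)           # random service graph with self-loops
W_list = [rng.normal(size=(d, d_head)) for _ in range(n_heads)]
print(multi_head_attention_layer(X, A, W_list).shape)  # (6, 8)
```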

Keywords: attention mechanism, graph convolutional network, interpretability, service classification, service discovery

Procedia PDF Downloads 132
689 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6

Authors: M. Moslehpour, S. Khorsandi

Abstract:

Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its functionality, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it was not efficient due to its limitations. Therefore, the SEcure Neighbor Discovery (SEND) protocol was proposed for automatic protection of the auto-configuration process, securing neighbor discovery and address resolution. To defend against threats to NDP's integrity and identity, SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography. Besides its advantages, SEND has considerable disadvantages, such as the computational cost and the sequential nature of the CGA generation algorithm. In this paper, we parallelize this process across network resources in order to improve it. In addition, we compare the CGA generation time between self-computing and distributed-computing processes, focusing on the impact of malicious nodes on the CGA generation time in the network. According to the results, although malicious nodes participate in the generation process, the CGA generation time is less than when it is computed by a single node. With a trust management system, detecting and isolating malicious nodes is easier.
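
A minimal sketch (simplified from RFC 3972, not the authors' implementation) of the costly part of CGA generation: searching for a modifier whose SHA-1 hash has 16*Sec leading zero bits. Splitting the modifier search space across workers illustrates the distributed-computing idea; the key bytes and chunk sizes are assumptions.

```python
# Sketch only: parallel search for a CGA modifier satisfying the Hash2 condition.
import hashlib
from multiprocessing import Pool

SEC = 1                       # security parameter (assumed)
PUBLIC_KEY = b"example-key"   # placeholder for the DER-encoded public key

def search_range(args):
    start, count = args
    zero_bytes = 2 * SEC      # 16 * Sec leading zero bits
    for modifier in range(start, start + count):
        digest = hashlib.sha1(modifier.to_bytes(16, "big") + b"\x00" * 9 + PUBLIC_KEY).digest()
        if digest[:zero_bytes] == b"\x00" * zero_bytes:
            return modifier
    return None

if __name__ == "__main__":
    chunk = 200_000
    with Pool(4) as pool:     # four cooperating nodes/cores
        for result in pool.imap_unordered(search_range, [(i * chunk, chunk) for i in range(8)]):
            if result is not None:
                print("modifier found:", result)
                break
```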

Keywords: NDP, IPsec, SEND, CGA, modifier, malicious node, self-computing, distributed-computing

Procedia PDF Downloads 276
688 A Generative Pretrained Transformer-Based Question-Answer Chatbot and Phantom-Less Quantitative Computed Tomography Bone Mineral Density Measurement System for Osteoporosis

Authors: Mian Huang, Chi Ma, Junyu Lin, William Lu

Abstract:

Introduction: Bone health has attracted more attention recently, and an intelligent question and answer (QA) chatbot for osteoporosis is helpful for science popularization. With Generative Pretrained Transformer (GPT) technology developing, we build an osteoporosis corpus dataset and then fine-tune LLaMA, a well-known open-source GPT foundation large language model (LLM), on this self-constructed osteoporosis corpus. As evaluated by clinical orthopedic experts, our fine-tuned model outperforms vanilla LLaMA on the osteoporosis QA task in Chinese. Three-dimensional quantitative computed tomography (QCT) measured bone mineral density (BMD) has in recent years been considered more accurate than DXA for BMD measurement. We develop an automatic phantom-less QCT (PL-QCT) that is more efficient for BMD measurement since it needs no external phantom for calibration. Combined with the osteoporosis LLM, our PL-QCT provides efficient and accurate BMD measurement for our chatbot users. Material and Methods: We build an osteoporosis corpus containing about 30,000 Chinese publications whose titles are related to osteoporosis. The whole process is automatic, including crawling the literature in PDF format, localizing text/figure/table regions with a layout segmentation algorithm and recognizing text with an OCR algorithm. We train our model by continued pre-training with Low-rank Adaptation (LoRA, rank=10) to adapt the LLaMA-7B model to the osteoporosis domain; the basic principle is to mask the next word in the text and make the model predict that word, with the loss defined as the cross-entropy between the predicted and ground-truth word. The experiment runs on a single NVIDIA A800 GPU for 15 days. Our automatic PL-QCT BMD measurement adopts an AI-associated region-of-interest (ROI) generation algorithm for localizing a vertebra-parallel cylinder in cancellous bone. Since there is no phantom for BMD calibration, we calculate ROI BMD from the CT values of the patient's own muscle and fat. Results and Discussion: Clinical orthopaedic experts were invited to design five osteoporosis questions in Chinese to evaluate the performance of vanilla LLaMA and our fine-tuned model. Our model outperforms LLaMA on over 80% of these questions, understanding 'Expert Consensus on Osteoporosis', 'QCT for osteoporosis diagnosis' and 'Effect of age on osteoporosis'. Detailed results are shown in the appendix. Future work may involve training a larger LLM on the whole of orthopaedics with more high-quality domain data, or a multi-modal GPT that combines and understands X-ray images and medical text for orthopaedic computer-aided diagnosis. However, the GPT model sometimes gives unexpected outputs, such as repetitive text or seemingly normal but wrong answers (called 'hallucinations'). Even when the GPT model gives correct answers, they cannot be considered valid clinical diagnoses in place of clinical doctors. The PL-QCT BMD system provided by Bone's QCT (Bone's Technology (Shenzhen) Limited) achieves a mean absolute error (MAE) of 0.1448 mg/cm2 (spine) and 0.0002 mg/cm2 (hip) and linear correlation coefficients of R2=0.9970 (spine) and R2=0.9991 (hip) (compared to QCT-Pro (Mindways)) on 155 patients in a three-center clinical trial in Guangzhou, China. Conclusion: This study builds a Chinese osteoporosis corpus and develops a fine-tuned and domain-adapted LLM as well as a PL-QCT BMD measurement system. Our fine-tuned GPT model shows better capability than the LLaMA model on most testing questions on osteoporosis. Combined with our PL-QCT BMD system, we look forward to providing science popularization and early screening for potential osteoporotic patients.
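
A minimal sketch (not the study's training code) of the continued pre-training objective described above: predict the next token and score it with cross-entropy. A tiny embedding-plus-linear "model" stands in for the LoRA-adapted LLaMA-7B; all sizes are assumptions.

```python
# Sketch only: next-token prediction with cross-entropy loss.
import torch
import torch.nn as nn

vocab_size, hidden = 1000, 64
model = nn.Sequential(nn.Embedding(vocab_size, hidden), nn.Linear(hidden, vocab_size))
loss_fn = nn.CrossEntropyLoss()

tokens = torch.randint(0, vocab_size, (4, 32))   # a batch of corpus token ids (assumed)
logits = model(tokens[:, :-1])                   # predict token t+1 from token t
loss = loss_fn(logits.reshape(-1, vocab_size), tokens[:, 1:].reshape(-1))
loss.backward()                                  # in LoRA training, only adapter weights would update
print(float(loss))
```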

Keywords: GPT, phantom-less QCT, large language model, osteoporosis

Procedia PDF Downloads 63
687 Patient Support Program in Pharmacovigilance: Foster Patient Confidence and Compliance

Authors: Atul Khurana, Rajul Rastogi, Hans-Joachim Gamperl

Abstract:

Pharmaceutical companies are increasingly inclined towards patient support programs (PSPs), which assist patients and/or healthcare professionals (HCPs) in better disease management and cost-effective treatment. The utmost objective of these programs is patient care. PSPs may include financial assistance to patients, medicine compliance programs, access to HCPs via phone or online chat centers, etc. The PSP has a crucial role in customer acquisition and retention strategies. During the conduct of these programs, the Marketing Authorisation Holder (MAH) may receive information related to the concerned medicinal products, usually reported by patients or the HCPs involved. This information may include suspected adverse reaction(s) during or after administration of the medicinal products. Hence, the MAH should design PSPs to comply with regulatory reporting requirements and avoid non-compliance during PV inspections. The emergence of wireless health devices is lowering the burden on patients to manually record safety data and creating a significant opportunity for patients to observe major swings in drug safety. Therefore, to enhance the adoption of these programs, the MAH not only needs to make patients aware of the advantages of the program but also to recognize the importance of patients' time and of the commitments made, in a constructive manner. It is indispensable that strengthening public health is considered the topmost priority in such programs, and that the MAH is compliant with pharmacovigilance (PV) requirements along with regulatory obligations.

Keywords: drug safety, good pharmacovigilance practice, patient support program, pharmacovigilance

Procedia PDF Downloads 303
686 Comparison of Breast Surface Doses for Full-Field Digital Mammography and Digital Breast Tomosynthesis Using Breast Phantoms

Authors: Chia-Hui Chen, Chien-Kuo Wang

Abstract:

Background: Full-field digital mammography (FFDM) is widely used in the diagnosis of breast cancer. Digital breast tomosynthesis (DBT) has recently been introduced into the clinic and is being used for breast cancer screening in the general population. Hence, the radiation dose delivered to the patients involved in an imaging protocol is of utmost concern. Aim: To compare the entrance surface dose (ESD) of digital breast tomosynthesis (DBT) and full-field digital mammography (FFDM) by using breast phantoms. Method: We analyzed the average entrance surface dose (ESD) of FFDM and DBT using breast phantoms. Optically stimulated luminescence dosimeters (OSLDs) were placed in a tissue-equivalent breast phantom at different sites of interest, and absorbed dose measurements were obtained after digital breast tomosynthesis (DBT) and full-field digital mammography (FFDM) exposures. Results: Automatic exposure control (AEC) was used for the surface dose measurements during DBT and FFDM; the mean ESD values for DBT and FFDM were 6.37 mGy and 3.51 mGy, respectively. The OSLD-measured surface doses during DBT and FFDM were 19.87 mGy and 11.36 mGy, respectively. The surface exposure dose of DBT can thus be about twice that of FFDM. Conclusion: The radiation dose from DBT was higher than that of FFDM, and there was a difference between the AEC-derived and OSLD-measured doses at the phantom surface.

Keywords: full-field digital mammography, digital breast tomosynthesis, optically stimulated luminescent dosimeters, surface dose

Procedia PDF Downloads 414
685 Semiautomatic Calculation of Ejection Fraction Using Echocardiographic Image Processing

Authors: Diana Pombo, Maria Loaiza, Mauricio Quijano, Alberto Cadena, Juan Pablo Tello

Abstract:

In this paper, we present a semi-automatic tool for calculating the ejection fraction from an echocardiographic video signal derived from a database in DICOM format from Clinica de la Costa, Barranquilla. Described in this paper are the steps and methods used to obtain the respective calculation, including acquisition and formation of the test samples, processing and, finally, the calculation of the parameters to obtain the ejection fraction. Two image segmentation methods were compared following a methodological framework that is similar only in the initial stages of processing (filtering and image enhancement) and differs at the end, when the segmentation algorithms are implemented (active contour and region growing). The results were compared with the measurements obtained by two different medical specialists in cardiology, who calculated the ejection fraction of the study samples using the traditional method, which consists of drawing the region of interest directly on the computer using the echocardiography equipment and a simple equation to calculate the desired value. The results showed that if the quality of the video samples is good (i.e., after pre-processing there is evidence of an improvement in contrast), the values provided by the tool are substantially close to those reported by the physicians; the correlation between physicians also does not vary significantly.
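
A minimal sketch (volumes assumed) of the final step of such a tool: the ejection fraction computed from the end-diastolic and end-systolic volumes obtained after segmentation.

```python
# Sketch only: ejection fraction from segmented ventricular volumes.
edv = 120.0  # end-diastolic volume in mL (assumed)
esv = 50.0   # end-systolic volume in mL (assumed)

ejection_fraction = (edv - esv) / edv * 100
print(f"EF = {ejection_fraction:.1f}%")  # 58.3%
```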

Keywords: echocardiography, DICOM, processing, segmentation, EDV, ESV, ejection fraction

Procedia PDF Downloads 424
684 Recognizing an Individual, Their Topic of Conversation and Cultural Background from 3D Body Movement

Authors: Gheida J. Shahrour, Martin J. Russell

Abstract:

The 3D body movement signals captured during human-human conversation include clues not only to the content of people's communication but also to their culture and personality. This paper is concerned with the automatic extraction of this information from body movement signals. For the purpose of this research, we collected a novel corpus from 27 subjects and arranged them into groups according to their culture. We arranged each group into pairs, and each pair communicated with each other about different topics. A state-of-the-art recognition system is applied to the problems of person, culture, and topic recognition, borrowing modeling, classification, and normalization techniques from speech recognition. We used Gaussian Mixture Modeling (GMM) as the main technique for building our three systems, obtaining 77.78%, 55.47%, and 39.06% accuracy from the person, culture, and topic recognition systems, respectively. In addition, we combined the above GMM systems with Support Vector Machines (SVM) to obtain 85.42%, 62.50%, and 40.63% accuracy for person, culture, and topic recognition, respectively. Although a direct comparison among these three recognition systems is difficult, it seems that our person recognition system performs best for both GMM and GMM-SVM, suggesting that inter-subject differences (i.e., subjects' personality traits) are a major source of variation. When removing these traits from the culture and topic recognition systems using the Nuisance Attribute Projection (NAP) and Intersession Variability Compensation (ISVC) techniques, we obtained 73.44% and 46.09% accuracy from the culture and topic recognition systems, respectively.
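
An illustrative sketch (not the authors' system) of GMM-based person recognition: one Gaussian mixture is trained per subject, and a test sequence is assigned to the subject whose model gives the highest average log-likelihood. Feature dimensions and the synthetic data are assumptions.

```python
# Sketch only: one GMM per subject, classification by maximum log-likelihood.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n_subjects, frames, dim = 4, 300, 12   # 12-D body-movement feature frames (assumed)
train = {s: rng.normal(loc=s, size=(frames, dim)) for s in range(n_subjects)}

models = {s: GaussianMixture(n_components=4, random_state=0).fit(X) for s, X in train.items()}

test = rng.normal(loc=2, size=(100, dim))                # an unseen sequence from subject 2
scores = {s: m.score(test) for s, m in models.items()}   # mean log-likelihood per model
print("predicted subject:", max(scores, key=scores.get))
```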

Keywords: person recognition, topic recognition, culture recognition, 3D body movement signals, variability compensation

Procedia PDF Downloads 535