Search results for: SAP ASE database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1604

734 Seismic Performance of Slopes Subjected to Earthquake Mainshock Aftershock Sequences

Authors: Alisha Khanal, Gokhan Saygili

Abstract:

It is commonly observed that aftershocks follow a mainshock. Aftershocks continue over a period of time with decreasing frequency, and there is typically not sufficient time for repair and retrofit between a mainshock–aftershock sequence. Aftershocks are usually smaller in magnitude; however, aftershock ground motion characteristics such as intensity and duration can exceed those of the mainshock due to changes in the earthquake mechanism and location with respect to the site. The seismic performance of slopes is typically evaluated based on the sliding displacement predicted to occur along a critical sliding surface. Various empirical models predict sliding displacement as a function of seismic loading parameters, ground motion parameters, and site parameters, but these models do not include aftershocks. The seismic risks associated with post-mainshock slopes ('damaged slopes') subjected to aftershocks are significant. This paper extends the empirical sliding displacement models to flexible slopes subjected to earthquake mainshock-aftershock sequences (a multi-hazard approach). A dataset was developed from 144 pairs of as-recorded mainshock-aftershock sequences in the Pacific Earthquake Engineering Research Center (PEER) database. The results reveal that the combination of mainshock and aftershock increases the seismic demand on slopes relative to the mainshock alone; thus, seismic risks are underestimated if aftershocks are neglected.
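
A minimal sketch of the kind of empirical displacement model the paper extends, fitted on synthetic stand-in data (the variable names, functional form, and values are illustrative, not the study's): one regression for mainshock-only displacements and one for sequence displacements, with the gap between the two quantifying the extra demand from aftershocks.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical records: each row stands in for one of the 144 mainshock-aftershock
# pairs, with peak ground acceleration (g), significant duration (s), and the
# sliding displacement (cm) computed for the mainshock alone and for the sequence.
rng = np.random.default_rng(0)
pga = rng.uniform(0.1, 0.8, 144)
dur = rng.uniform(5.0, 40.0, 144)
disp_ms = 10 ** (1.2 * np.log10(pga) + 0.02 * dur + rng.normal(0, 0.2, 144))
disp_seq = disp_ms * rng.uniform(1.0, 2.5, 144)       # sequences demand more sliding

X = np.column_stack([np.log10(pga), dur])
model_ms = LinearRegression().fit(X, np.log10(disp_ms))
model_seq = LinearRegression().fit(X, np.log10(disp_seq))

# The gap between the two fits quantifies the extra demand from aftershocks.
extra = model_seq.predict(X) - model_ms.predict(X)
print(f"median increase in log10 displacement: {np.median(extra):.2f}")
```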

Keywords: seismic slope stability, mainshock, aftershock, landslide, earthquake, flexible slopes

Procedia PDF Downloads 136
733 Compliance with the Health and Safety Standards/Regulations in the South African Mining Industry: A Literature Review

Authors: Livhuwani Muthelo, Tebogo Maria Mothiba, Rambelani Nancy Malema

Abstract:

Background: Despite occupational legislation and standards being in place, there are many reported health and safety incidents, including both occupational injuries and illnesses, in the South African mining industry. Purpose: This systematic literature review aimed to describe and identify the existing gaps in health and safety compliance within the South African mining industry and to propose future research areas. Methodology: A systematic literature review was conducted using the key concepts of health and safety, compliance, standards, and mining. A total of 102 papers published from 1994 to April 2020 were extracted from an online database search, which included a combination of South African and international government OHS legislation documents, policies, standards, reports from the mineral departments and the International Labour Office, qualitative and quantitative journal articles, dissertations, seminars, and conference proceedings. Results: The literature review revealed that, although there are laws, regulations, and standards to guide the industry on health and safety issues in South Africa, the main challenge is compliance with the existing health and safety systems, which are not being implemented. Conclusion: Gaps between research, policy, and implementation in occupational health practice in the South African mining industry were also identified.

Keywords: circumstances, non-compliance, health and safety, standards, mining industry

Procedia PDF Downloads 257
732 Supplier Carbon Footprint Methodology Development for Automotive Original Equipment Manufacturers

Authors: Nur A. Özdemir, Sude Erkin, Hatice K. Güney, Cemre S. Atılgan, Enes Huylu, Hüseyin Y. Altıntaş, Aysemin Top, Özak Durmuş

Abstract:

Carbon emissions produced during a product's life cycle, from the extraction of raw materials up to waste disposal and market consumption activities, are major contributors to global warming. In light of the science-based targets (SBT) leading the way to a zero-carbon economy for the sustainable growth of companies, carbon footprint reporting of purchased goods has become critical for identifying hotspots and best practices for emission reduction opportunities. In line with Ford Otosan's corporate sustainability strategy, research was conducted to evaluate the carbon footprint of purchased products in accordance with Scope 3 of the Greenhouse Gas Protocol (GHG). The purpose of this paper is to develop a systematic and transparent methodology to calculate the carbon footprint of products produced by automotive OEMs (Original Equipment Manufacturers) within the context of automobile supply chain management. To begin with, primary material data were collected through IMDS (International Material Data System) for the company's three distinct vehicle types: Light Commercial Vehicle (Courier), Medium Commercial Vehicle (Transit and Transit Custom), and Heavy Commercial Vehicle (F-MAX). The obtained material data were classified as metals, plastics, liquids, electronics, and others to gain insight into the overall material distribution of the produced vehicles and were matched to the SimaPro Ecoinvent 3 database, one of the most extensive databases for modelling material data related to the product life cycle. The product life cycle analysis was calculated within the framework of the ISO 14040-14044 standards by addressing their requirements and procedures. A comprehensive literature review and cooperation with suppliers were undertaken to identify the production methods of parts used in the vehicles and to find out the amount of scrap generated during part production. Cumulative weight and material information, together with the related production processes of the components, were listed and multiplied by current sales figures. The results of the study provide a key model of the carbon footprint of products and processes based on a scientific approach to drive sustainable growth by setting straightforward, science-based emission reduction targets. Hence, this study aims to identify the hotspots and, correspondingly, provide broad ideas about how to integrate carbon footprint estimates into the company's supply chain management by defining convenient actions in line with climate science. According to the emission values arising from the production phase, including raw material extraction and material processing, for the Ford Otosan vehicles studied here, GHG emissions from the production of metals used for the HCV, MCV, and LCV account for more than half of the carbon footprint of vehicle production. Correspondingly, aluminum and steel have the largest share among all material types, and achieving carbon neutrality in the steel and aluminum industries is of great significance to the world and will also have an immense impact on the automobile industry. A strategic product sustainability plan that includes the use of secondary materials, conversion to green energy, and low-energy process design is required to reduce the emissions of steel, aluminum, and plastics, given the projected increase in total volume by 2030.
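
As a rough illustration of the aggregation step described above (component masses matched to emission factors and scaled by sales figures), the short sketch below uses made-up material masses and generic emission factors; the real study draws these from IMDS and the Ecoinvent database.

```python
# Hypothetical material inventory (kg per vehicle) and generic emission factors
# (kg CO2e per kg of material); annual sales volume is also illustrative.
bill_of_materials = {"steel": 1200.0, "aluminum": 250.0, "plastics": 180.0,
                     "electronics": 40.0, "liquids": 60.0}
emission_factors = {"steel": 2.1, "aluminum": 8.6, "plastics": 3.0,
                    "electronics": 25.0, "liquids": 1.5}
annual_sales = 50000

per_vehicle = {m: w * emission_factors[m] for m, w in bill_of_materials.items()}
vehicle_total = sum(per_vehicle.values())
fleet_total = vehicle_total * annual_sales            # kg CO2e per year

for material, co2e in sorted(per_vehicle.items(), key=lambda kv: -kv[1]):
    print(f"{material:12s} {co2e:8.0f} kg CO2e/vehicle ({co2e / vehicle_total:4.0%})")
print(f"fleet total: {fleet_total / 1e6:.1f} kt CO2e/year")
```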

Keywords: automotive, carbon footprint, IMDS, scope 3, SimaPro, sustainability

Procedia PDF Downloads 95
731 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences

Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng

Abstract:

Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during the interaction, the Kalman filter was used to retain a complete trajectory for each human object. Finally, motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved classification accuracies of 80% for interaction events and 95% for non-interaction events. In summary, we have explored a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated into an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatching, fighting, etc.).
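
A minimal sketch of the occlusion-handling step: a constant-velocity Kalman filter implemented with NumPy that keeps predicting a person's position through missed detections, followed by a crude speed-based cue of the kind the trajectory analysis uses (the matrices, noise settings, and detections here are illustrative, not the paper's).

```python
import numpy as np

# Constant-velocity Kalman filter: predict every frame, correct only when a
# detection exists, so the trajectory stays complete through occlusion.
dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
Q, R = np.eye(4) * 1e-2, np.eye(2) * 1.0

def track(detections):
    """detections: list of (x, y) pixel positions, or None when occluded."""
    x, P, path = np.zeros(4), np.eye(4) * 10.0, []
    for z in detections:
        x, P = F @ x, F @ P @ F.T + Q                 # predict
        if z is not None:                             # correct if visible
            y = np.asarray(z, float) - H @ x
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
            x, P = x + K @ y, (np.eye(4) - K @ H) @ P
        path.append(x[:2].copy())
    return np.array(path)

path = track([(10, 10), (12, 11), None, None, (18, 14), (20, 15)])
speed = np.linalg.norm(np.diff(path, axis=0), axis=1)  # trajectory derivative
print("possible interaction cue (slow-down)" if speed[-1] < speed[0] else "no slow-down cue")
```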

Keywords: motion detection, motion tracking, trajectory analysis, video surveillance

Procedia PDF Downloads 531
730 Three Tier Indoor Localization System for Digital Forensics

Authors: Dennis L. Owuor, Okuthe P. Kogeda, Johnson I. Agbinya

Abstract:

Mobile localization has attracted a great deal of attention recently due to the introduction of wireless networks. Although several localization algorithms and systems have been implemented and discussed in the literature, very few researchers have exploited the gap that exists between indoor localization, tracking, external storage of location information, and outdoor localization for the purpose of digital forensics during and after a disaster. The contribution of this paper lies in the implementation of a robust system that is capable of locating and tracking mobile device users and storing location information in the cloud for both indoor and partially outdoor environments. The system can be used during a disaster to track and locate mobile phone users. The developed system is a mobile application built with Android, Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript, and MATLAB for Android mobile users. Using the Waterfall model of software development, we have implemented a three-level system that is able to track, locate, and store mobile device information in a secure cloud database on an almost real-time basis. The outcome of the study showed that the developed system is efficient in tracking and locating mobile devices. The system is also flexible, i.e., it can be used in any building with few adjustments. Finally, the system is accurate in locating and tracking mobile devices both indoors and outdoors.
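
A toy sketch of the fingerprinting idea listed in the keywords, assuming an RSSI fingerprint database of labeled reference points and a weighted nearest-neighbour match; the access point count, RSSI values, and room names are invented for illustration.

```python
import numpy as np

# Toy Wi-Fi fingerprint database: each row is the RSSI (dBm) from three access
# points measured at a known reference point inside the building.
fingerprints = np.array([[-40, -70, -80],
                         [-55, -60, -75],
                         [-70, -50, -65],
                         [-80, -65, -45]], dtype=float)
locations = [("Room A", 0, 0), ("Corridor", 5, 0), ("Room B", 10, 0), ("Exit", 15, 5)]

def locate(rssi, k=2):
    """Weighted k-nearest-neighbour match of a live RSSI scan against the database."""
    d = np.linalg.norm(fingerprints - np.asarray(rssi, float), axis=1)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + 1e-6)
    x = sum(w[i] * locations[j][1] for i, j in enumerate(nearest)) / w.sum()
    y = sum(w[i] * locations[j][2] for i, j in enumerate(nearest)) / w.sum()
    return (x, y), locations[nearest[0]][0]

print(locate([-52, -62, -74]))   # -> estimated (x, y) and the closest labeled zone
```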

Keywords: indoor localization, digital forensics, fingerprinting, tracking and cloud

Procedia PDF Downloads 320
729 Simulation IDM for Schedule Generation of Slip-Form Operations

Authors: Hesham A. Khalek, Shafik S. Khoury, Remon F. Aziz, Mohamed A. Hakam

Abstract:

The linearity of slip-forming operations is a source of planning complications, and the operation is subject to bottlenecks at any point, so careful planning is required in order to achieve success. On the other hand, discrete-event simulation (DES) concepts can be applied to simulate and analyze construction operations and to efficiently support construction scheduling. Nevertheless, the preparation of input data for construction simulation is very challenging, time-consuming, and prone to human error. Therefore, to enhance the benefits of using DES in construction scheduling, this study proposes an integrated module that establishes a framework for automating the generation of time schedules and decision support for slip-form construction projects, particularly through the project feasibility study phase, by using data exchange between project data stored in an intermediate database, the DES engine, and scheduling software. Using the stored information, the proposed system creates construction task attributes (e.g., activity durations, material quantities, and resource amounts); the DES engine then uses all the given information to create a proposal for the construction schedule automatically. This research is considered a demonstration of a flexible slip-form project modeling, rapid scenario-based planning, and schedule generation approach that may be of interest to both practitioners and researchers.
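
A minimal discrete-event sketch of the schedule-generation idea, written with the SimPy library and an invented task table standing in for data pulled from the intermediate project database; the paper's own DES/scheduling tool chain (e.g., EZStrobe) is not reproduced here.

```python
import simpy

# Illustrative task table as it might arrive from the intermediate database:
# (activity name, duration in hours).
tasks = [("Assemble slip-form", 16), ("Pour lift 1", 8),
         ("Pour lift 2", 8), ("Strip and finish", 12)]

def slipform(env, crew, schedule):
    # Slip-forming is linear, so activities run strictly one after another on the
    # same crew resource; bottlenecks show up as late start times in the schedule.
    for name, duration in tasks:
        with crew.request() as req:
            yield req
            start = env.now
            yield env.timeout(duration)
            schedule.append((name, start, env.now))

env = simpy.Environment()
crew = simpy.Resource(env, capacity=1)
schedule = []
env.process(slipform(env, crew, schedule))
env.run()

for name, start, finish in schedule:                  # automatically generated schedule proposal
    print(f"{name:20s} {start:5.1f} -> {finish:5.1f} h")
```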

Keywords: discrete-event simulation, modeling, construction planning, data exchange, scheduling generation, EZstrobe

Procedia PDF Downloads 365
728 Correlation between Speech Emotion Recognition Deep Learning Models and Noises

Authors: Leah Lee

Abstract:

This paper examines the correlation between deep learning models and emotions under noise to see whether or not noise masks emotions. The deep learning models used are plain convolutional neural networks (CNN), an auto-encoder, long short-term memory (LSTM), and Visual Geometry Group-16 (VGG-16). The emotion datasets used are the Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS), the Crowd-sourced Emotional Multimodal Actors Dataset (CREMA-D), the Toronto Emotional Speech Set (TESS), and Surrey Audio-Visual Expressed Emotion (SAVEE). To make the dataset four times bigger, the audio files are expanded with stretch and pitch augmentations. From the augmented datasets, five different features are extracted as inputs to the models, and there are eight different emotions to be classified. The noise variations are white noise, dog barking, and cough sounds, and the signal-to-noise ratio (SNR) is varied over 0, 20, and 40. In summary, for each deep learning model, nine different sets with noise and SNR variations, plus the augmented audio files without any added noise, are used in the experiment. To compare the results of the deep learning models, the accuracy and receiver operating characteristic (ROC) are checked.
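
A small sketch of how a noise variant can be mixed at the three SNR conditions; the tone and white-noise arrays are placeholders for a real speech clip and noise recording, and the dB interpretation of the SNR values is an assumption.

```python
import numpy as np

def mix_at_snr(speech, noise, snr_db):
    """Scale 'noise' so that the mixture has the requested signal-to-noise ratio."""
    noise = np.resize(noise, speech.shape)            # repeat/trim noise to match length
    p_speech = np.mean(speech ** 2)
    p_noise = np.mean(noise ** 2) + 1e-12
    scale = np.sqrt(p_speech / (p_noise * 10 ** (snr_db / 10)))
    return speech + scale * noise

# Illustrative signals standing in for an emotional speech clip and a noise recording.
sr = 16000
speech = np.sin(2 * np.pi * 220 * np.arange(sr) / sr)
noise = np.random.default_rng(0).normal(0, 0.5, sr)   # white-noise variant

for snr in (0, 20, 40):                               # the three SNR conditions
    noisy = mix_at_snr(speech, noise, snr)
    achieved = 10 * np.log10(np.mean(speech ** 2) / np.mean((noisy - speech) ** 2))
    print(f"target {snr} dB -> achieved {achieved:.1f} dB")
```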

Keywords: auto-encoder, convolutional neural networks, long short-term memory, speech emotion recognition, visual geometry group-16

Procedia PDF Downloads 61
727 Classification of Multiple Cancer Types with Deep Convolutional Neural Network

Authors: Nan Deng, Zhenqiu Liu

Abstract:

Thousands of patients with metastatic tumors are diagnosed with cancers of unknown primary site each year. The inability to identify the primary cancer site may lead to inappropriate treatment and unexpected prognosis. Nowadays, a large amount of genomic and transcriptomic cancer data has been generated by next-generation sequencing (NGS) technologies, and The Cancer Genome Atlas (TCGA) database has accrued thousands of human cancer tumors and healthy controls, which provides an abundant resource for differentiating cancer types. Meanwhile, deep convolutional neural networks (CNNs) have shown high accuracy in classification among a large number of image object categories. Here, we utilize 25 primary cancer tumor types and 3 normal tissue types from TCGA, convert their RNA-Seq gene expression profiles to color images, and train, validate, and test a CNN classifier directly on these images. The performance results show that our CNN classifier can achieve >80% test accuracy on most of the tumor and normal tissue types. Since the gene expression pattern of distant metastases is similar to that of their primary tumors, the CNN classifier may provide a potential computational strategy for identifying the unknown primary origin of metastatic cancer in order to plan appropriate treatment for patients.
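
A compact PyTorch sketch of the expression-to-image idea: a gene expression vector is rescaled and reshaped into an image (single-channel here, rather than the paper's color images) and pushed through a deliberately small CNN; the class count of 28 (25 tumor types plus 3 normal tissues) follows the abstract, and everything else is illustrative.

```python
import numpy as np
import torch
import torch.nn as nn

def expression_to_image(expr, side=100):
    """Min-max scale a gene expression vector and reshape it into a square
    single-channel image; pad with zeros if the gene count is not a square."""
    x = np.asarray(expr, dtype=np.float32)
    x = (x - x.min()) / (x.max() - x.min() + 1e-8)
    padded = np.zeros(side * side, dtype=np.float32)
    padded[: min(x.size, side * side)] = x[: side * side]
    return padded.reshape(1, side, side)              # (channels, H, W)

class TinyCNN(nn.Module):
    """A deliberately small stand-in for a tumor-type CNN classifier."""
    def __init__(self, n_classes=28):                 # 25 tumor types + 3 normal tissues
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Example: one synthetic RNA-Seq profile of 10,000 genes.
img = torch.from_numpy(expression_to_image(np.random.rand(10000))).unsqueeze(0)
logits = TinyCNN()(img)                               # shape: (1, 28)
print(logits.shape)
```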

Keywords: bioinformatics, cancer, convolutional neural network, deep learning, gene expression pattern

Procedia PDF Downloads 290
726 Genome-Wide Functional Analysis of Phosphatase in Cryptococcus neoformans

Authors: Jae-Hyung Jin, Kyung-Tae Lee, Yee-Seul So, Eunji Jeong, Yeonseon Lee, Dongpil Lee, Dong-Gi Lee, Yong-Sun Bahn

Abstract:

Cryptococcus neoformans causes cryptococcal meningoencephalitis, mainly in immunocompromised patients but also in immunocompetent people, yet therapeutic options for treating cryptococcosis are limited. Several signaling pathways, including the cyclic AMP, MAPK, and calcineurin pathways, play a central role in the regulation of the growth, differentiation, and virulence of C. neoformans. To understand the signaling networks regulating the virulence of C. neoformans, we selected 114 putative phosphatase genes, one of the major components of signaling networks, in the genome of C. neoformans. We identified putative phosphatases based on annotations in the C. neoformans var. grubii genome database provided by the Broad Institute and the National Center for Biotechnology Information (NCBI) and performed a BLAST search of the phosphatases of Saccharomyces cerevisiae, Aspergillus nidulans, Candida albicans, and Fusarium graminearum against Cryptococcus neoformans. We classified the putative phosphatases into 14 groups based on InterPro phosphatase domain annotation. Here, we constructed 170 signature-tagged gene-deletion strains covering 91 putative phosphatases through homologous recombination. We examined their phenotypic traits under 30 different in vitro conditions, including growth, differentiation, stress response, antifungal resistance, and virulence-factor production.

Keywords: human fungal pathogen, phosphatase, deletion library, functional genomics

Procedia PDF Downloads 347
725 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning

Authors: Pooja Khanal, Huaming Zhang

Abstract:

The labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering. Some known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when reporters report a bug, they try to assign a predefined label to it. Those issues are reported for a project, and each project is a repository in GitHub/GitLab that contains multiple issues. There are many software project repositories, ranging from individual projects to commercial projects. The labels assigned in different repositories may depend on various factors such as human instinct, the generalization of labels, the label assignment policy followed by the reporter, etc. While the reporter of an issue may instinctively give that issue a label, another person reporting the same issue may label it differently. As a result, it is not known mathematically whether a label in one repository is similar to or different from the label in another repository. Hence, the primary goal of this research is to find the semantic differences between the bug labeling of different repositories via machine learning. Independent optimal classifiers for individual repositories are built first using text features from the reported issues; an optimal classifier may be a combination of multiple classifiers stacked together. Then, those classifiers are used to cross-test other repositories, which allows the similarity of labels to be deduced mathematically. The product of this ongoing research includes a formalized open-source GitHub issues database that is used to deduce the similarity of the labels pertaining to the different repositories.
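
A toy sketch of the cross-testing idea: a classifier (here just TF-IDF plus logistic regression rather than a stacked ensemble) is trained on one repository's labeled issues and scored on another's; the issue texts and labels are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy issue corpora standing in for two GitHub repositories with their own labels.
repo_a = [("Button misaligned on settings page", "User Interface"),
          ("XSS possible in comment field", "Security"),
          ("REST endpoint returns 500", "API")]
repo_b = [("Dark mode colours are wrong", "User Interface"),
          ("Token leaks in debug logs", "Security"),
          ("GraphQL query times out", "API")]

def train(corpus):
    """Fit a simple text classifier on one repository's issues."""
    texts, labels = zip(*corpus)
    return make_pipeline(TfidfVectorizer(),
                         LogisticRegression(max_iter=1000)).fit(texts, labels)

clf_a = train(repo_a)

# Cross-test: how well does repo A's notion of each label transfer to repo B?
texts_b, labels_b = zip(*repo_b)
agreement = clf_a.score(texts_b, labels_b)
print(f"label agreement of repo A's classifier on repo B: {agreement:.0%}")
```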

Keywords: bug classification, bug labels, GitHub issues, semantic differences

Procedia PDF Downloads 186
724 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights

Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan

Abstract:

The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyze huge datasets, efficient NoSQL databases are needed. The analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic are made possible by this research's integration of several datasets, which cuts down on query processing time and creates predictive visual artifacts. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Effective data retrieval and analysis are made possible by spreading the datasets across a sharded database and creating indexes on individual shards. The analysis of connections between governmental activities, poverty levels, and post-pandemic well-being is the key goal: we evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilizing advanced data analysis and visualizations. The findings provide relevant data that supports the advancement of the UN sustainable development goals, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
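
A short pymongo sketch of the recommended sharding-plus-indexing setup; the database, collection, field names, and shard key below are assumptions for illustration, and a running sharded cluster reachable through a mongos router is assumed.

```python
from pymongo import MongoClient, ASCENDING, DESCENDING

# Connect through the mongos router of a sharded cluster (illustrative URI).
client = MongoClient("mongodb://localhost:27017")
db = client["pandemic"]
cases = db["wellbeing_indicators"]

# 1. Index the fields used by the analytical queries.
cases.create_index([("country", ASCENDING), ("report_date", DESCENDING)])
cases.create_index([("poverty_level", ASCENDING)])

# 2. Distribute the collection across shards on a hashed country key so that
#    per-country aggregations can be answered by individual shards.
client.admin.command("enableSharding", "pandemic")
client.admin.command("shardCollection", "pandemic.wellbeing_indicators",
                     key={"country": "hashed"})

# A typical post-pandemic query now targets one shard and uses an index.
pipeline = [{"$match": {"country": "Canada"}},
            {"$group": {"_id": "$poverty_level",
                        "avg_wellbeing": {"$avg": "$wellbeing_score"}}}]
print(list(cases.aggregate(pipeline)))
```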

Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being

Procedia PDF Downloads 55
723 Changing Trends in the Use of Induction Agents for General Anesthesia for Cesarean Section

Authors: Mahmoud Hassanin, Amita Gupta

Abstract:

Background: In current practice, thiopentone is not cost-effective; it adds to resource wastage, the risk of drug error (confusion with antibiotics), a short shelf life, infection risk, and the risk of delay while it is prepared during a category one cesarean section, and there is no significant difference in current use that favors it over the alternative. Aims and Objectives: Patient safety, cost-effective use of trust resources, problem awareness, and consideration of improving on current practice. Methods: A local departmental survey was conducted, and many published studies support the change. Results: More than 50% (15 of 29) of respondents are already using propofol, and more than 75% of participants are willing to shift to propofol if it becomes the standard. The cost analysis also revealed that thiopentone (10 x 500) costs £60 whereas propofol (10 x 200) costs £5.20; the cost of thiopentone per year is £2190, whereas the approximately 35-40 general anaesthetics given per year could cost approximately £20 using propofol, which is a well-established practice. We could save not only money; avoiding the added carbon footprint would also be environmentally friendly. Recommendation: Thiopentone is rarely used as an induction agent for category one caesarean sections in our obstetric emergency theatres, and most obstetric anaesthetists are using propofol. Keep both propofol and thiopentone (powder not drawn up) in the category one cesarean section emergency drug tray until the department completely changes the practice protocol. A further retrospective study is required to compare the outcomes for these induction agents through the local database.

Keywords: thiopentone, propofol, category 1 caesarean, induction agents

Procedia PDF Downloads 132
722 Application of Bim Model Data to Estimate ROI for Robots and Automation in Construction Projects

Authors: Brian Romansky

Abstract:

There are many practical, commercially available robots and semi-autonomous systems that can currently be used in a wide variety of construction tasks. Adoption of these technologies has the potential to reduce the time and cost to deliver a project, reduce variability and risk in delivery time, increase quality, and improve safety on the job site. These benefits come with costs for equipment rental or contract fees, access to specialists to configure the system, and the time needed for set-up and support of the machines while in use. Calculation of the net ROI (Return on Investment) requires detailed information about the geometry of the site, the volume of work to be done, and the overall project schedule, as well as data on the capabilities and past performance of available robotic systems. Assembling the required data and comparing the ROI for several options is complex and tedious, so many project managers will only consider the use of a robot in targeted applications where the benefits are obvious, resulting in low levels of adoption of automation in the construction industry. This work demonstrates how data already resident in many BIM (Building Information Model) projects can be used to automate ROI estimation for a sample set of commercially available construction robots. The calculations account for set-up and operating time along with the scheduling support tasks required while the automated technology is in use, and configuration parameters allow for prioritization of time, cost, or safety as the primary benefit of the technology. A path toward integration and use of automatic ROI calculation with a database of available robots in a BIM platform is described.
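
A minimal sketch of the ROI arithmetic such a tool automates, using placeholder quantities of the kind that would be read from the BIM model and vendor data; the numbers and the 8-hour-day assumption are illustrative only.

```python
# Quantities that could be read from a BIM model plus vendor data (all placeholders).
wall_area_m2 = 2400            # from BIM: total area to be finished
manual_rate_m2_per_hr = 10     # crew productivity
robot_rate_m2_per_hr = 35      # vendor-quoted productivity
labor_cost_per_hr = 55.0
robot_rental_per_day = 900.0
setup_hours, support_hours_per_day = 6, 2

manual_hours = wall_area_m2 / manual_rate_m2_per_hr
robot_hours = wall_area_m2 / robot_rate_m2_per_hr + setup_hours
robot_days = robot_hours / 8                           # assumed 8-hour working day

manual_cost = manual_hours * labor_cost_per_hr
robot_cost = (robot_days * robot_rental_per_day
              + (robot_hours + robot_days * support_hours_per_day) * labor_cost_per_hr)

roi = (manual_cost - robot_cost) / robot_cost          # net return relative to robot spend
print(f"manual ${manual_cost:,.0f} vs robot ${robot_cost:,.0f} -> ROI {roi:.0%}")
```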

Keywords: automation, BIM, robot, ROI

Procedia PDF Downloads 76
721 The Determinants of Trade Flow and Potential between Ethiopia and Group of Twenty

Authors: Terefe Alemu

Abstract:

This study examines Ethiopia's trade flow determinants and trade potential with the G20 countries, asking whether trade is already saturated (overtraded) or untapped trade potential remains, using the trade gravity model. The panel data, covering the 10 consecutive years from 2010 to 2019, were sourced from the IMF, the WDI, the United Nations Population Division, The Heritage Foundation online database, an online distance calculator, and others. The empirical analysis used a random effects model (REM), which is effective in estimating time-invariant variables. The empirical results, obtained using STATA software, indicate that Ethiopia has trade potential with seven G20 countries, whereas it overtrades with 12 countries and the EU region. The statistically significant (p < 0.05) determinants of bilateral trade flow between Ethiopia and the G20 countries/region were the population of the G20 countries, the gross domestic product of the G20 countries, the gross domestic product of Ethiopia, and the geographical distance between Ethiopia and the G20 countries. The top five G20 countries/regions exporting to Ethiopia were China, the United States of America, the European Union, India, and South Africa, whereas the top five importing from Ethiopia were the EU, China, the United States of America, Saudi Arabia, and Germany, respectively. Finally, the policy implications are that Ethiopia should maintain consistent trade flows with the overtraded countries and improve trade with the undertraded countries through trade policy revision, and that focusing on the trade determinants to improve trade flow is recommended.
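
A simplified sketch of the gravity-model estimation on synthetic data: the study fits a random-effects panel model in STATA, whereas the sketch below fits a pooled log-linear OLS with statsmodels just to show the specification and the predicted-versus-actual notion of trade potential; all values and variable names are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic panel standing in for the 2010-2019 Ethiopia-G20 observations.
rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "trade": rng.lognormal(10, 1, n),          # bilateral trade flow
    "gdp_eth": rng.lognormal(24, 0.2, n),
    "gdp_partner": rng.lognormal(28, 1, n),
    "pop_partner": rng.lognormal(17, 1, n),
    "distance_km": rng.uniform(2000, 15000, n),
})

# Log-linear gravity specification: trade rises with economic mass, falls with distance.
model = smf.ols("np.log(trade) ~ np.log(gdp_eth) + np.log(gdp_partner) "
                "+ np.log(pop_partner) + np.log(distance_km)", data=df).fit()
print(model.params.round(2))

# Trade potential: predicted/actual ratio above 1 suggests untapped potential.
df["potential_ratio"] = np.exp(model.fittedvalues) / df["trade"]
print(f"share of pairs with untapped potential: {(df['potential_ratio'] > 1).mean():.0%}")
```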

Keywords: trade gravity model, trade determinants, G20, international trade, trade potential

Procedia PDF Downloads 193
720 Land Cover Classification System for the Estimation of Carbon Storage in Terrestrial Ecosystems

Authors: Lei Zhang

Abstract:

The carbon cycle greatly influences global change, and land cover changes contribute to the status and rate of the carbon budget in ecosystems. This paper proposes a land cover classification system for mapping land cover, supporting the national ecological environment assessment, and estimating carbon storage in ecosystems. The classification system consists of basic land cover classes at levels I and II and auxiliary features at level III. The 38 basic classes characterizing land cover features are derived from 19 criteria referring to composition, structure, pattern, phenology, etc., and reflect the status of carbon storage in ecosystems. The auxiliary classes at level III complement the attributes of the higher levels through 9 criteria. The 5 environmental criteria of temperature, moisture, landform, aspect, and slope mainly reflect the potential and intensity of carbon storage in ecosystems, while the disturbance of vegetation succession caused by land use type influences the vegetation carbon budget. The other 3 criteria of vegetation cover, growth period, and species characteristics further refine the vegetation types. The hierarchical structure of the land cover map (the classes of levels I and II) is independent of the level III products, which is helpful for land cover product management and applications. The classification system has been adopted in the Chinese national land cover database for the ecosystem carbon budget at a 30 m scale.

Keywords: classification system, land cover, ecosystem, carbon storage, object based

Procedia PDF Downloads 54
719 The Effect of Adding CuO Nanoparticles on Boiling Heat Transfer Enhancement in Horizontal Flattened Tubes

Authors: M. A. Akhavan-Behabadi, M. Najafi, A. Abbasi

Abstract:

An empirical investigation was performed to study the heat transfer characteristics of R600a flow boiling inside horizontal flattened tubes and the simultaneous effect of nanoparticles on boiling heat transfer in the flattened channels. Round copper tubes of 8.7 mm inside diameter were deformed into flattened shapes with inside heights of 6.9, 5.5, and 3.4 mm as test sections. The effect of parameters such as mass flux, vapor quality, and inside height on the heat transfer coefficient was studied. Flattening the tube caused a significant enhancement in heat transfer performance, with a maximum augmentation ratio of 163% obtained in the flattened channel with the lowest internal height. A new correlation was developed based on the present experimental data to predict the heat transfer coefficient in flattened tubes; this correlation predicted 90% of the entire database within ±20%. The best flat channel from the point of view of heat transfer performance was selected to study the effect of nanoparticles on heat transfer enhancement. Four homogenized R600a/oil mixtures with a 1% oil weight fraction and different CuO nanoparticle concentrations, including 0.5%, 1%, and 1.5% mass fraction of R600a/oil/CuO, were studied. Observations show that heat transfer was improved by adding nanoparticles, leading to a maximum enhancement of 79% compared to the pure refrigerant under the same test conditions.
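
A tiny sketch of the ±20% agreement statistic quoted for the new correlation, computed on made-up measured and predicted heat transfer coefficients (the paper reports 90% of its database inside the band).

```python
import numpy as np

def within_band(h_measured, h_predicted, band=0.20):
    """Fraction of data points whose predicted heat transfer coefficient falls
    within +/- band of the measured value."""
    h_measured = np.asarray(h_measured, float)
    err = (np.asarray(h_predicted, float) - h_measured) / h_measured
    return np.mean(np.abs(err) <= band)

# Illustrative values only (W/m2.K).
h_meas = np.array([2100, 2500, 3200, 4100, 5200])
h_pred = np.array([2300, 2450, 2900, 4400, 6600])
print(f"{within_band(h_meas, h_pred):.0%} of points within ±20%")
```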

Keywords: nanofluids, heat transfer, flattened tube, transport phenomena

Procedia PDF Downloads 422
718 Corn Production in the Visayas: An Industry Study from 2002-2019

Authors: Julie Ann L. Gadin, Andrearose C. Igano, Carl Joseph S. Ignacio, Christopher C. Bacungan

Abstract:

Corn production has been an important and pervasive industry in the Visayas for many years. Its role as a substitute commodity for rice heightens demand among health-conscious consumers. Unfortunately, the corn industry is confronted with several challenges, such as weak institutions. Considering these issues, the paper examined the factors that influence corn production in the three administrative regions of the Visayas, namely Western Visayas, Central Visayas, and Eastern Visayas. The data used were retrieved from a variety of publicly available sources such as the Philippine Statistics Authority, the Department of Agriculture, the Philippine Crop Insurance Corporation, and the International Disaster Database. Utilizing a dataset from 2002 to 2019, the indicators were tested using three multiple linear regression (MLR) models. Results showed that the land area harvested (p = 0.02) and the value of corn production (p = 0.00) are statistically significant variables that influence corn production in the Visayas. Given these findings, it is suggested that policies on forest conversion and sustainable land management should be effective in enabling farmworkers to obtain land to grow corn, especially in rural regions. Furthermore, the Biofuels Act of 2006, the Livestock Industry Restructuring and Rationalization Act, and the supporting policy, Senate Bill No. 225, or an Act Establishing the Philippine Corn Research Institute and Appropriating Funds, should be enforced inclusively in order to improve demand in the corn-allied industries, which may lead to an increase in the value and volume of corn production in the Visayas.
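
A minimal statsmodels sketch of one of the multiple linear regression models, with invented indicator values standing in for the 2002-2019 panel; the variable names are assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative rows mirroring the study's indicators (all values are made up).
data = pd.DataFrame({
    "production_mt": [210, 230, 250, 265, 280, 300],
    "area_harvested_ha": [90, 95, 101, 104, 110, 118],
    "value_of_production": [1.1, 1.3, 1.5, 1.6, 1.8, 2.0],
    "disasters": [2, 1, 3, 2, 1, 2],
})

# One MLR model: corn production regressed on candidate determinants.
model = smf.ols("production_mt ~ area_harvested_ha + value_of_production + disasters",
                data=data).fit()
print(model.summary().tables[1])   # coefficients and p-values, as reported per region
```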

Keywords: corn, industry, production, MLR, Visayas

Procedia PDF Downloads 177
717 Dynamics of Museum Visitors’ Experiences Studies: A Bibliometric Analysis

Authors: Tesfaye Fentaw Nigatu, Alexander Trupp, Teh Pek Yen

Abstract:

Research on museums and the experiences of visitors has flourished in recent years, especially after museums became centers of edutainment beyond preserving heritage resources. This paper aims to comprehensively understand the changes, continuities, and future development directions of research on museum visitors' experiences. To identify current research trends, the paper summarizes and analyses research article publications from 1986 to 2023 on museum visitors' experiences. The bibliometric analysis software VOSviewer and Harzing's Publish or Perish (PoP) were used to analyze 407 academic articles retrieved from the Scopus database. The study maps new insights for future scholars and academics to expand the scope of museum visitor experience studies by analyzing keywords, citation patterns, influential articles in the field, publication trends, collaborations between authors and institutions, and clusters of highly cited articles. Accessibility to museums, social media usage within museums, aesthetics in museum settings, mixed reality experiences, sustainability issues, and emotions have emerged as key research areas in the study of museum visitors' experiences. The results benefit stakeholders and researchers by keeping them informed about the latest developments and breakthroughs in the global academic landscape and in the development of visitors' experiences in museums.

Keywords: bibliometric analysis, museum, network analysis, visitors’ experiences, visual analysis

Procedia PDF Downloads 52
716 Chinese Travelers’ Outbound Intentions to Visit Short-and-Long Haul Destinations: The Impact of Cultural Distance

Authors: Lei Qin

Abstract:

Culture has long been recognized as a factor influencing travelers' decisions, which explains why travelers in different countries make distinct decisions. Cultural distance is a concept describing how much difference there is between travelers' home culture and that of the destination, but research distinguishing short-haul from long-haul travel destinations is limited. This study explored this gap by examining the impact of cultural distance on Chinese travelers' intentions to visit short-haul and long-haul destinations, respectively. Six cultural distance measurements were used: five calculated from secondary databases (the Kogut & Singh index, a developed Kogut & Singh index, the Euclidean distance index (EDI), a World Values Survey index (WVS), and a social axioms measurement (SAM)) and perceived cultural distance (PCD) collected from a primary survey. Across the six measurements, cultural distance has opposite impacts on Chinese outbound travelers' intentions for short-haul and long-haul travel. For short-haul travel, travelers' intentions can be positively influenced by cultural distance; a possible reason is that the satisfaction travelers gain from novelty-seeking outweighs the strangeness experienced in overseas regions. For long-haul travel, travelers' intentions can be negatively influenced by cultural distance; a possible explanation lies in travelers' uncertainty, perceived risk, and language concerns regarding farther destinations.
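
A short sketch of the Kogut & Singh style index named above: the mean of variance-normalized squared differences across cultural dimensions. The Hofstede-style scores are approximate published values, and the cross-country variances are placeholders.

```python
import numpy as np

def kogut_singh(home, destination, variances):
    """Kogut & Singh (1988) cultural distance: mean of squared differences on each
    cultural dimension, each normalized by that dimension's cross-country variance."""
    home, dest, var = map(np.asarray, (home, destination, variances))
    return np.mean((dest - home) ** 2 / var)

# Approximate Hofstede scores (power distance, individualism, masculinity,
# uncertainty avoidance); the variances below are illustrative placeholders.
china = [80, 20, 66, 30]
thailand = [64, 20, 34, 64]   # a short-haul destination
germany = [35, 67, 66, 65]    # a long-haul destination
variances = [470, 560, 400, 520]

print("short-haul cultural distance:", round(kogut_singh(china, thailand, variances), 3))
print("long-haul cultural distance:", round(kogut_singh(china, germany, variances), 3))
```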

Keywords: cultural distance, intention, outbound travel, short-long haul

Procedia PDF Downloads 187
715 The Developing of Knowledge-Based System for the Medical Treatment with Herbs

Authors: Rujijan Vichivanives

Abstract:

This research aims to create a knowledge-based system as a database for self-healthcare analysis, the diagnosis of simple illnesses, and the use of Thai herbs instead of modern medicine, based on the principles of Thai traditional medication theory. The system was disseminated through website network programs within Suan Sunandha Rajabhat University. The population used in this study was divided into two groups: the first group consisted of four experts in Thai traditional medication, and the second group comprised 300 website users. The methods used for collecting data were paper questionnaires and poll questionnaires on the website, and the data were analyzed using averages. The results are divided into three parts: the first part was the development of the knowledge-based system, and the second part was the applied programs on the website; both parts were fulfilled and achieved the set goal. The third part was the evaluation of the study: the experts rated the website design at a good level of 4.20, and user satisfaction was also at a good level, with an average of 4.24. It was found that young people under the age of 16 cared less about their health than older teenagers, working-age adults, and older people. The research findings should be extended in order to encourage lifestyle modifications in people of all ages based on self-healthcare principles.

Keywords: developing, herbs, knowledge-based system, medical treatment

Procedia PDF Downloads 319
714 Implementation of a Multimodal Biometrics Recognition System with Combined Palm Print and Iris Features

Authors: Rabab M. Ramadan, Elaraby A. Elgallad

Abstract:

With their extensive application, unimodal biometric systems have to face a diversity of problems such as signal and background noise, distortion, and environmental differences. Therefore, multimodal biometric systems have been proposed to solve the above-stated problems. This paper introduces a bimodal biometric recognition system based on features extracted from the human palm print and iris. Palm print biometrics is a fairly new, evolving technology used to identify people by their palm features, while the iris is a strong competitor, together with the face and fingerprints, for inclusion in multimodal recognition systems. In this research, we introduce an algorithm for combining palm print and iris features extracted using a texture-based descriptor, the Scale Invariant Feature Transform (SIFT). Since the feature sets are non-homogeneous, as features of different biometric modalities are used, these features are concatenated to form a single feature vector. Particle swarm optimization (PSO) is used as a feature selection technique to reduce the dimensionality of the feature vector. The proposed algorithm will be applied to the Indian Institute of Technology Delhi (IITD) database, and its performance will be compared with various iris recognition algorithms found in the literature.
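
A rough OpenCV sketch of the fusion step: SIFT descriptors are pooled per modality and the two vectors are concatenated into a single joint feature vector (the PSO-based selection that follows in the paper is not shown); the image paths and the mean-pooling choice are assumptions.

```python
import cv2
import numpy as np

def sift_signature(image_path, n_keypoints=64):
    """Extract SIFT descriptors from one biometric image and pool them into a
    fixed-length 128-D vector (mean of the descriptors)."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)   # path is a placeholder
    sift = cv2.SIFT_create(nfeatures=n_keypoints)
    _, desc = sift.detectAndCompute(img, None)
    if desc is None:                                     # no keypoints found
        return np.zeros(128)
    return desc.mean(axis=0)

# Fuse the two modalities by concatenating their pooled descriptors; a PSO-based
# selector would then prune this joint vector before classification.
palm_vec = sift_signature("palm_print.png")
iris_vec = sift_signature("iris.png")
fused = np.concatenate([palm_vec, iris_vec])             # 256-D joint feature vector
print(fused.shape)
```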

Keywords: iris recognition, particle swarm optimization, feature extraction, feature selection, palm print, the Scale Invariant Feature Transform (SIFT)

Procedia PDF Downloads 224
713 Iris Feature Extraction and Recognition Based on Two-Dimensional Gabor Wavelength Transform

Authors: Bamidele Samson Alobalorun, Ifedotun Roseline Idowu

Abstract:

Biometric technologies use human body parts for unique and reliable identification based on physiological traits. The iris recognition system is a biometric-based method of identification, and the human iris has discriminating characteristics that make the method efficient. In order to achieve this efficiency, distinct features must be extracted from the human iris to generate accurate authentication of persons. In this study, an approach to iris recognition using a 2D Gabor filter for feature extraction is applied to iris templates. The 2D Gabor filter formulated the patterns that were used for training and were equally sent to the Hamming distance matching technique for recognition. A comparison of results is presented using two iris image subjects with different matching indices of 1, 2, 3, 4, and 5 filters, based on the CASIA iris image database. By comparing the results for the two subjects, the actual computational time of the developed models, measured in terms of training time and average testing time for the Hamming distance classifier, is reported, with a best recognition accuracy of 96.11%. Iris localization (segmentation) was performed using Daugman's integro-differential operator, and normalization was confined to Daugman's rubber sheet model.
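
A compact sketch of the matching pipeline described above: 2D Gabor kernels filter a normalized (rubber-sheet) iris strip, the responses are binarized into an iris code, and two codes are compared with the Hamming distance; kernel parameters and array sizes are illustrative.

```python
import cv2
import numpy as np

def gabor_iris_code(normalized_iris, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Filter a normalized iris strip with 2D Gabor kernels at several orientations
    and binarize the responses into an iris code."""
    code = []
    for theta in thetas:
        kern = cv2.getGaborKernel(ksize=(9, 9), sigma=2.0, theta=theta,
                                  lambd=8.0, gamma=0.5, psi=0)
        resp = cv2.filter2D(normalized_iris.astype(np.float32), cv2.CV_32F, kern)
        code.append(resp > 0)                         # sign of the response as the bit
    return np.concatenate([c.ravel() for c in code])

def hamming_distance(code_a, code_b):
    return np.mean(code_a != code_b)                  # fraction of disagreeing bits

a = np.random.rand(64, 512)                           # stand-ins for rubber-sheet strips
b = a + np.random.normal(0, 0.05, a.shape)            # same eye, slightly noisy capture
print(hamming_distance(gabor_iris_code(a), gabor_iris_code(b)))  # small distance = match
```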

Keywords: Daugman rubber sheet, feature extraction, Hamming distance, iris recognition system, 2D Gabor wavelet transform

Procedia PDF Downloads 51
712 Peripheral Facial Nerve Palsy after Lip Augmentation

Authors: Sana Ilyas, Kishalaya Mukherjee, Suresh Shetty

Abstract:

Lip augmentation has become more common in recent years, and patients do not expect to experience facial palsy after having lip augmentation. This poster presents the findings of such a presentation and discusses the possible pathophysiology and management. (This poster has been published as a paper in Dental Update, June 2022.) Aim: The aim of the study was to explore the link between facial nerve palsy and lip fillers, to explore the literature surrounding facial nerve palsy, and to discuss the case of a patient who presented with facial nerve palsy with a seemingly unknown cause. Methodology: A thorough assessment of the current literature surrounding the topic was carried out, including published papers in journals found through PubMed database searches and printed books on the topic. A case presentation was discussed in detail of a patient presenting with peripheral facial nerve palsy, which she associated with the lip augmentation she had a day prior. Results and Conclusion: Even though the pathophysiology may not be clear for this presentation, it is important to highlight uncommon presentations or complications that may occur after treatment. This can help with understanding and managing similar cases, should they arise. It is also important to differentiate cause from association in order to make an accurate diagnosis, which may be difficult if there is little scientific literature; therefore, further research can help to improve the understanding of the pathophysiology of similar presentations.

Keywords: facial palsy, lip augmentation, causation and correlation, dental cosmetics

Procedia PDF Downloads 133
711 Analysis of Spectral Radiative Entropy Generation in a Non-Gray Participating Medium with Heat Source (Furnaces)

Authors: Asadollah Bahrami

Abstract:

In the present study, spectral radiative entropy generation is analyzed in a furnace filled with a mixture of H₂O, CO₂, and soot at radiative equilibrium. For the angular and spatial discretization of the radiative transfer equation and the radiative entropy generation equations, the discrete ordinates method and the finite volume method are used, respectively. Spectral radiative properties are obtained using the correlated-k (CK) non-gray model with updated parameters based on the high-resolution HITEMP2010 database. In order to evaluate the effects of the heat source location, boundary condition, and wall emissivity on radiative entropy generation, five cases with different conditions are considered. The spectral and total radiative entropy generation in the system are calculated for all cases, the effects of the mentioned parameters on radiative entropy generation are carefully analyzed, and the optimum condition is presented. The most important results can be stated as follows: the wall emissivity has a considerable effect on radiative entropy generation, and irreversible radiative transfer at the lower-temperature wall is the main source of radiative entropy generation in furnaces. In addition, the effect of the heat source location on total radiative entropy generation is smaller than that of the other factors. Ultimately, characterizing the effective parameters of radiative entropy generation provides an approach to minimizing entropy generation and enhancing furnace performance in practice.

Keywords: spectral radiative entropy generation, non-gray medium, correlated k(CK) model, heat source

Procedia PDF Downloads 83
710 The Comparison of Primary B-Cell and NKT-Cell Non-Hodgkin Lymphomas in Nasopharynx, Nasal Cavity, and Paranasal Sinuses

Authors: Jiajia Peng, Jianqing Qiu, Jianjun Ren, Yu Zhao

Abstract:

Background: We aimed to compare the clinical and survival differences between B-cell (B-NHL) and NKT-cell non-Hodgkin lymphomas (NKT-NHL) located in the nasal cavity, nasopharynx, and paranasal sinuses, which are usually categorized together as one sinonasal type. Methods: Patients diagnosed with primary B-NHL and NKT-NHL in the nasal cavity, nasopharynx, and paranasal sinuses were identified from the SEER database based on histological types and anatomical sites, and univariate and multivariate Cox regression and Kaplan-Meier analyses were subsequently conducted to examine cancer-specific survival (CSS) outcomes. Results: Overall, most B-NHL cases originated from the nasopharynx, while the majority of NKT-NHL cases occurred in the nasal cavity. Notably, CSS outcomes improved significantly over time in all sinonasal B-NHL cases, whereas no such improvement was observed for any sinonasal NKT-NHL type. Additionally, increasing age was linked with an elevated risk of death in B-NHL, particularly in the nasal cavity (HR: 3.37), but not in NKT-NHL. Compared with B-NHL, the adverse effect of higher stage on CSS was more evident in NKT-NHL, particularly in the nasopharynx (HR: 5.12). Furthermore, radiotherapy was beneficial for survival in patients with sinonasal B-NHL and NKT-NHL, except in those with NKT-NHL of the nasopharynx. However, chemotherapy has only been beneficial for CSS in patients with B-NHL of the paranasal sinuses (HR: 0.42) since 2010, rather than in other types of B-NHL or NKT-NHL. Conclusions: Although B-NHL and NKT-NHL of the nasal cavity, nasopharynx, and paranasal sinuses have similar anatomical locations, their clinical demographics and prognoses are largely different, and they should be treated and studied as distinct diseases.
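
A minimal lifelines sketch of the survival analyses named in the methods (Kaplan-Meier by histology group and a multivariate Cox model); the ten records and covariates below are invented stand-ins for SEER cases.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Toy records standing in for SEER cases: survival months, death indicator,
# histology (0 = B-NHL, 1 = NKT-NHL), age, and stage.
df = pd.DataFrame({
    "months": [12, 34, 60, 8, 22, 45, 70, 15, 5, 50],
    "died":   [1, 0, 0, 1, 1, 0, 0, 1, 1, 0],
    "nkt":    [0, 0, 0, 1, 1, 1, 0, 1, 1, 0],
    "age":    [55, 62, 48, 70, 66, 59, 40, 73, 68, 52],
    "stage":  [2, 1, 1, 3, 4, 2, 1, 4, 3, 2],
})

# Kaplan-Meier curve per histology group.
km = KaplanMeierFitter()
for name, grp in df.groupby("nkt"):
    km.fit(grp["months"], grp["died"], label=f"nkt={name}")
    print(f"nkt={name}: median survival {km.median_survival_time_}")

# Multivariate Cox model for cancer-specific survival.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
print(cph.summary[["coef", "exp(coef)", "p"]])
```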

Keywords: B-cell non-Hodgkin lymphomas, NKT-cell non-Hodgkin lymphomas, nasal cavity lymphomas, nasal sinuses lymphomas, nasopharynx lymphomas

Procedia PDF Downloads 92
709 Integration of Information and Communication Technology (ICT) for Effective Education of Adult Learners in Developing Communities in South-West Nigeria

Authors: Omotoke Omosalewa Owolowo

Abstract:

Mass literacy, adult, and non-formal education are part of the provisions of Nigeria's National Policy on Education. The advent of Information and Communication Technology (ICT), especially in this era of industrial revolution, calls for approaching literacy and adult education from a different perspective for community development. There is a dire need for a needs assessment for the effective training of rural dwellers in order to actualize the policy requirements and to align with the Sustainable Development Goals in South-West Nigeria. The present study is a preliminary survey designed to determine the level of awareness, use, and familiarity of community dwellers with social media. Adult dwellers from 24 communities across four states in Southern Nigeria constitute the sample, a total of 578 adults (380 females, 198 males) with an age range between 21 and 52 years. The survey shows that 68% are aware of SMS, 21% of WhatsApp, and 14% of Facebook, while the remainder could not say precisely which social medium is their favorite. However, most of them (80%) could not see how their phones could be used to boost their status, improve their vocations, or develop them within their respective communities. The study is expected to lead to a more elaborate training program assessing the knowledge acquisition, participation, and attitudes of literate and non-literate adult community members towards empowerment, and to integrate ICT techniques. The results of this study provide a database for the larger study.

Keywords: mass literacy, community development, information and communication technology, adult learners

Procedia PDF Downloads 39
708 Football Smart Coach: Analyzing Corner Kicks Using Computer Vision

Authors: Arth Bohra, Marwa Mahmoud

Abstract:

In this paper, we utilize computer vision to develop a tool for youth coaches to formulate set-piece tactics for their players. We used the SoccerNet database to extract the ResNet features and camera calibration data for over 3,000 corner kicks across 500 professional matches in the top six European competitions (English Premier League, UEFA Champions League, Ligue 1, La Liga, Serie A, and Bundesliga). Leveraging the provided homography matrix, we construct a feature vector representing the formation of players on these corner kicks. Additionally, by labeling the videos manually, we obtained the pass trajectory of each of the 3,000+ corner kicks by segmenting the field into four zones. Next, after determining the localization of the players and the ball, we used event data to give the corner kicks a rating on a 1-4 scale. Employing a convolutional neural network, our model managed to predict the success of a corner kick given the formations of players. This suggests that with the right formations, teams can optimize the way they approach corner kicks, and understanding this can help coaches formulate set-piece tactics for their own teams in order to maximize the success of their play. The proposed model can be easily extended; our method could be applied to even more game situations, from free kicks to counterattacks. This research project also gives insight into the myriad of possibilities that artificial intelligence possesses in transforming the domain of sports.
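
A short sketch of the homography step: detected player positions in pixel coordinates are projected onto pitch coordinates with a per-frame 3x3 calibration matrix of the kind SoccerNet provides; the matrix and detections below are invented.

```python
import numpy as np

def image_to_pitch(points_px, H):
    """Project detected player positions from image pixels to pitch coordinates
    using a 3x3 homography matrix."""
    pts = np.hstack([points_px, np.ones((len(points_px), 1))])   # homogeneous coords
    mapped = (H @ pts.T).T
    return mapped[:, :2] / mapped[:, 2:3]                        # divide by w

# Illustrative homography and three detected players (pixel coordinates).
H = np.array([[0.08, 0.002, -45.0],
              [0.001, 0.12, -20.0],
              [0.0, 0.0003, 1.0]])
players_px = np.array([[640.0, 360.0], [300.0, 500.0], [900.0, 420.0]])
print(image_to_pitch(players_px, H))       # approximate positions on the pitch

# Binning each projected point into one of four field zones yields the formation
# vector fed to the CNN, and the zone of the first pass gives the trajectory label.
```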

Keywords: soccer, corner kicks, AI, computer vision

Procedia PDF Downloads 162
707 Dynamic Gabor Filter Facial Features-Based Recognition of Emotion in Video Sequences

Authors: T. Hari Prasath, P. Ithaya Rani

Abstract:

In the world of visual technology, recognizing emotions from face images is a challenging task, and several related methods have not utilized dynamic facial features effectively enough to achieve high performance. This paper proposes a method for emotion recognition using dynamic facial features with high performance. Initially, local features are captured by Gabor filters with different scales and orientations in each frame to find the position and scale of the face parts against different backgrounds. The Gabor features are sent to an ensemble classifier for detecting Gabor facial features. The regions of dynamic features are captured from the Gabor facial features in consecutive frames, representing the dynamic variations of facial appearance. Each region of dynamic features is normalized using the Z-score normalization method and further encoded into binary pattern features with the help of threshold values. The binary features are passed to the multi-class AdaBoost classifier algorithm, with a well-trained database containing happiness, sadness, surprise, fear, anger, disgust, and neutral expressions, to classify the discriminative dynamic features for emotion recognition. The developed method is deployed on the Ryerson Multimedia Research Lab and Cohn-Kanade databases, where it shows significant performance improvement, owing to its dynamic features, when compared with existing methods.
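
A tiny sketch of the encoding step described above: Z-score normalization of a dynamic-feature region followed by thresholding into a binary pattern; the region values and the zero threshold are illustrative.

```python
import numpy as np

def binary_pattern(region, threshold=0.0):
    """Z-score normalize a dynamic-feature region and encode it as a binary pattern:
    1 where the normalized response exceeds the threshold, 0 elsewhere."""
    z = (region - region.mean()) / (region.std() + 1e-8)
    return (z > threshold).astype(np.uint8).ravel()

# Stand-in for a Gabor response region tracked across consecutive frames.
region = np.random.default_rng(3).normal(size=(8, 8))
bits = binary_pattern(region)
print(bits.reshape(8, 8))
# Binary vectors like this are what the multi-class AdaBoost classifier consumes.
```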

Keywords: detecting face, Gabor filter, multi-class AdaBoost classifier, Z-score normalization

Procedia PDF Downloads 262
706 Comprehensive Evaluation of COVID-19 Through Chest Images

Authors: Parisa Mansour

Abstract:

The coronavirus disease 2019 (COVID-19) was discovered at the end of 2019 and rapidly spread to various countries around the world. Computed tomography (CT) images have been used as an important alternative to the time-consuming RT-PCR test. However, manual segmentation of CT images alone is a major challenge as the number of suspected cases increases. Thus, accurate and automatic segmentation of COVID-19 infections is urgently needed. Because the imaging features of COVID-19 infection are varied and similar to the background, existing medical image segmentation methods cannot achieve satisfactory performance. In this work, we build a deep convolutional neural network adapted for the segmentation of chest CT images with COVID-19 infections. First, we maintain a large and novel chest CT image database containing 165,667 annotated chest CT images from 861 patients with confirmed COVID-19. Inspired by the observation that the boundary of an infected lung can be improved by global intensity adjustment, we introduce a feature variable block into the proposed deep CNN, which adjusts the global features to segment the COVID-19 infection; this block can effectively and adaptively improve performance in different cases. We combine features of different scales by proposing a progressive atrous spatial pyramid fusion scheme to deal with infection regions of various appearances and shapes. We conducted experiments on data collected in China and Germany and showed that the proposed deep CNN can achieve impressive performance.
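
One plausible, heavily simplified reading of the 'feature variable block' as a global feature adjustment, sketched in PyTorch as a channel reweighting driven by globally pooled statistics; this is an assumption about the mechanism, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class GlobalFeatureAdjust(nn.Module):
    """Hypothetical stand-in for the 'feature variable block': rescale each channel
    using statistics pooled over the whole slice, mimicking a global intensity
    adjustment (squeeze-and-excitation style)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                   # channel-wise reweighting

feat = torch.randn(2, 64, 128, 128)                    # CT feature maps (illustrative)
out = GlobalFeatureAdjust(64)(feat)                    # same shape, globally adjusted
print(out.shape)
```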

Keywords: chest, COVID-19, chest Image, coronavirus, CT image, chest CT

Procedia PDF Downloads 41
705 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made using continuous electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases (KDD) process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an artificial neural network classifier, trained by applying the multilayer perceptron algorithm and by using a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on the data of a single patient retrieved from a publicly available EEG dataset.
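
A minimal sketch of the sliding-window feature extraction plus multilayer perceptron pipeline on synthetic signals standing in for EEG; the window length, the four features, and the network size are assumptions, not those of the Training Builder tool.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def window_features(signal, win=256, step=128):
    """Slide a window over one EEG channel and extract a few simple descriptors
    (mean, standard deviation, line length, dominant frequency bin) per window."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        spectrum = np.abs(np.fft.rfft(w))
        feats.append([w.mean(), w.std(), np.sum(np.abs(np.diff(w))), spectrum.argmax()])
    return np.array(feats)

rng = np.random.default_rng(0)
background = rng.normal(0, 1, 5120)                        # non-seizure EEG stand-in
seizure = np.sin(np.linspace(0, 200, 5120)) * 3 + rng.normal(0, 1, 5120)

X = np.vstack([window_features(background), window_features(seizure)])
y = np.array([0] * (len(X) // 2) + [1] * (len(X) // 2))    # 0 = background, 1 = seizure

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2%}")
```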

Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing

Procedia PDF Downloads 173