Search results for: identifying coloring
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2018

1928 Visual Speech Perception of Arabic Emphatics

Authors: Maha Saliba Foster

Abstract:

Speech perception has been recognized as a bi-sensory process involving the auditory and visual channels. Compared to the auditory modality, the contribution of the visual signal to speech perception is not well understood. Studying how the visual modality affects speech recognition has pedagogical implications in second language learning, as well as clinical applications in speech therapy. The current investigation explores the potential effect of visual speech cues on the perception of Arabic emphatics (AEs). The corpus consists of 36 minimal pairs, each containing two contrasting consonants, an AE versus a non-emphatic (NE). Videos of four Lebanese speakers were edited to give perceivers a partial view of facial regions: lips only, lips-cheeks, lips-chin, lips-cheeks-chin, or lips-cheeks-chin-neck. In the absence of any auditory information, relying solely on visual speech, perceivers were above chance at correctly identifying AEs and NEs across vowel contexts. Moreover, machine learning models were able to predict the probability of perceivers’ accuracy in identifying some of the COIs produced by certain speakers, and the results showed an overlap between the measurements selected by the computer and those selected by human perceivers. The lack of a significant face effect on the perception of AEs points to the lips, present in all of the videos, as the most important and often sufficient facial feature for emphasis recognition. Future investigations will aim at refining the analysis of the visual cues used by perceivers by applying Principal Component Analysis and including the time evolution of facial feature measurements.

Keywords: Arabic emphatics, machine learning, speech perception, visual speech perception

Procedia PDF Downloads 287
1927 Using Predictive Analytics to Identify First-Year Engineering Students at Risk of Failing

Authors: Beng Yew Low, Cher Liang Cha, Cheng Yong Teoh

Abstract:

Due to a lack of continual assessment or grade-related data, identifying first-year engineering students at risk of failing in a polytechnic education is challenging. Our experience over the years tells us that there is no strong correlation between good entry grades in Mathematics and the Sciences and excelling in core engineering subjects. Hence, identifying students at risk of failure cannot rely on entry grades in Mathematics and the Sciences alone. These factors compound the difficulty of early identification and intervention. This paper describes the development of a predictive analytics model for the early detection of students at risk of failing and evaluates its effectiveness. Data from continual assessments conducted in term one, supplemented by data on student psychological profiles such as interests and study habits, were used. Three classification techniques, namely Logistic Regression, K Nearest Neighbour, and Random Forest, were used in our predictive model. Based on our findings, Random Forest was the strongest predictor, with an Area Under the Curve (AUC) value of 0.994; correspondingly, its Accuracy, Precision, Recall, and F-Score were the highest among the three classifiers. Using this Random Forest classification technique, students at risk of failure could be identified at the end of term one and assigned to a Learning Support Programme at the beginning of term two. This paper presents our findings and proposes further improvements that can be made to the model.
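The classifier comparison described in this abstract can be sketched as follows. This is a minimal illustration using scikit-learn with synthetic data standing in for the term-one assessment and psychological-profile features; the variable names and data are assumptions, not the authors' pipeline.

```python
# Sketch: comparing Logistic Regression, KNN, and Random Forest by AUC.
# Synthetic features stand in for term-one assessment and profile data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "knn": KNeighborsClassifier(),
    "random_forest": RandomForestClassifier(random_state=0),
}
aucs = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    scores = model.predict_proba(X_te)[:, 1]  # probability of "at risk"
    aucs[name] = roc_auc_score(y_te, scores)
print(aucs)
```

Students whose predicted probability exceeds a chosen threshold would then be flagged for the Learning Support Programme.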

Keywords: continual assessment, predictive analytics, random forest, student psychological profile

Procedia PDF Downloads 110
1926 New Employee On-Boarding Program: Effective Tool for Reducing the Prevalence of Workplace Injuries/Accidents

Authors: U. Ugochukwu, J. Lee, P. Conley

Abstract:

According to a recent survey by the UT Southwestern Workplace Safety Committee, the three most common on-the-job injuries reported by workers at the medical center are musculoskeletal injuries, slip-and-fall injuries, and repetitive motion injuries. Last year alone, of the 650 documented workplace injuries and accidents, 45% occurred among employees in their first two years of employment. The UT Southwestern New Employee On-Boarding program was modeled on OSHA’s training guidelines, which consist of: determining if training is needed, identifying training needs, identifying goals and objectives, developing learning activities, conducting the training, evaluating program effectiveness, and improving the program. The hospital’s management best practices were recreated to limit and control workplace injuries and accidents. Regular trainings and workshops on workplace safety and compliance were initiated for new employees. Various computer workstations were evaluated, and recommendations were made to reduce musculoskeletal disorders. Post-exposure protocols and worker protection programs were remodeled for infectious agents and chemicals used in the hospital, and medical surveillance programs were updated for every emerging threat to ensure compliance with US policy, regulatory, and standard-setting organizations. If ignorance of specific job hazards and of proper work practices is to blame for this higher injury rate, then training will help to provide a solution. Use of this program in training activities is just one of many ways UT Southwestern complied with the OSHA standards that relate to training while enhancing the safety and health of its employees.

Keywords: ergonomics, hazard, on-boarding, surveillance, workplace

Procedia PDF Downloads 312
1925 On the Convergence of the Mixed Integer Randomized Pattern Search Algorithm

Authors: Ebert Brea

Abstract:

We propose a novel direct search algorithm for identifying at least a local minimum of mixed integer nonlinear unconstrained optimization problems. The Mixed Integer Randomized Pattern Search Algorithm (MIRPSA), so called by the author, is based on a randomized pattern search, which is modified by the MIRPSA for finding at least a local minimum of our problem. The MIRPSA has two main operations over the randomized pattern search: a moving operation and a shrinking operation. Each operation is carried out by the algorithm when a set of conditions holds. The convergence properties of the MIRPSA are analyzed using a Markov chain approach, represented by a countably infinite state space λ, where each state d(q) is defined by a measure of the qth randomized pattern search Hq, for all q in N. According to the algorithm, when a moving operation is carried out on the qth randomized pattern search Hq, the MIRPSA holds its state. Meanwhile, if the MIRPSA carries out a shrinking operation over the qth randomized pattern search Hq, the algorithm visits the next state; that is, a shrinking operation at the qth state changes it into the (q+1)th state. It is worth pointing out that the MIRPSA never returns to a visited state, because it only leaves the qth state by a shrinking operation. In this article, we describe the MIRPSA for mixed integer nonlinear unconstrained optimization problems and study its convergence properties in depth from a Markov chain viewpoint. We include a low-dimension case to show the MIRPSA in more detail, when the algorithm is used for identifying the minimum of a mixed integer quadratic function. In addition, numerical examples are shown in order to measure the performance of the MIRPSA.
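The move/shrink mechanics described in this abstract can be illustrated with a toy randomized pattern search over one continuous and one integer variable. This is a hedged sketch of the general idea, not the author's exact MIRPSA; the step sizes, shrink factor, and stopping rule are illustrative assumptions.

```python
# Toy randomized pattern search with moving and shrinking operations.
# An improving candidate triggers a "move" (step kept); a failure
# triggers a "shrink" (step contracted). Not the author's exact MIRPSA.
import random

def pattern_search_sketch(f, x0, step=2.0, shrink=0.9, tol=1e-3,
                          iters=5000, seed=0):
    rng = random.Random(seed)
    x = list(x0)
    best = f(x)
    while step > tol and iters > 0:
        iters -= 1
        # Randomized pattern: perturb the continuous coordinate freely,
        # the integer coordinate by an integer amount.
        cand = [x[0] + step * rng.uniform(-1, 1),
                x[1] + rng.choice([-1, 0, 1]) * max(1, round(step))]
        val = f(cand)
        if val < best:          # moving operation: accept, keep pattern size
            x, best = cand, val
        else:                   # shrinking operation: contract the pattern
            step *= shrink
    return x, best

# Mixed integer quadratic with minimum at x = 1.5 (continuous), n = 3 (integer)
f = lambda z: (z[0] - 1.5) ** 2 + (z[1] - 3) ** 2
x, val = pattern_search_sketch(f, [10.0, -5])
print(x, val)
```

In the Markov chain view of the abstract, each shrink advances the chain to the next state, so the sketch above never revisits a pattern size once it has been contracted.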

Keywords: direct search, mixed integer optimization, random search, convergence, Markov chain

Procedia PDF Downloads 443
1924 Antibacterial and Antioxidant Capacity of Fabric Treated with Purple-Fleshed Sweet Potato Extract

Authors: Kyung Hwa Hong, Eunmi Koh

Abstract:

Wool and cotton fabrics were pretreated with a tannic acid aqueous solution to increase their dyeability and then dyed with purple-fleshed sweet potato (PSP) extract. The dyed fabrics were then investigated by various analysis techniques. The results revealed that wool and cotton fabrics can be dyed bluish red through this pretreatment and dyeing process. Both wool and cotton fabrics pretreated with tannic acid alone displayed a decreased L* value but no significant change in a* and b* values as the concentration of tannic acid increased. As expected, the pretreated fabrics were even darker and showed a richer purple color after dyeing with the PSP extract. With regard to colorfastness in cleaning circumstances, the wool fabrics achieved a grade of 4.0 for colorfastness to dry-cleaning and the cotton fabrics a grade of 4.0 for colorfastness to washing. However, both exhibited significantly inferior colorfastness to light (grade of 1.5); thus, there is still a need for improvement in colorfastness, particularly against light. The wool and cotton fabrics also showed antibacterial and antioxidant characteristics. Both showed strong antibacterial activity (>99%) against Staphylococcus aureus but somewhat insufficient activity (60.8% for wool and 94.8% for cotton) against Klebsiella pneumoniae. Their antioxidant abilities increased up to ca. 90% with an increase in the tannic acid concentration (up to 0.5%). However, after the dyeing process, the antibacterial and antioxidant abilities tended to decrease, presumably because functional moieties such as phenolic acids detached from the pretreated fabrics into the hot dyeing solution. Therefore, further study is necessary to derive the optimum treatment and dyeing conditions so as to maximize the coloring effect and functionalities of the fabrics.

Keywords: antibacterial activity, antioxidant activity, purple-fleshed sweet potato, fabrics

Procedia PDF Downloads 271
1923 Language Inequalities in the Algerian Public Space: A Semiotic Landscape Analysis

Authors: Sarah Smail

Abstract:

Algeria has been subject to countless conquests and invasions, resulting in a diverse linguistic repertoire. The sociolinguistic situation of the country makes linguistic landscape analysis pertinent. This has, in fact, led to the growth of diverse linguistic landscape studies, which have mainly focused on identifying the sociolinguistic situation of the country through the analysis of shop names. The present research adds to the existing literature by offering another perspective on the analysis of signs, combining the physical and digital semiotic landscapes. The powerful oil, gas, and agri-food industries in Algeria make it interesting to focus on the commodification of natural resources, in order to identify the language and semiotic resources deployed in the Algerian public scene, as well as the visibility of linguistic inequalities and minorities in the business domain. The study discusses the semiotic landscape of three trade cities, Bejaia, Setif, and Hassi-Messaoud, and draws on interviews conducted with business owners and graphic designers and on questionnaires completed by business employees. The study also relies on Gorter’s multilingual inequalities in public space (MIPS) model (2021) and Irvine and Gal’s language ideology and linguistic differentiation (2000). The preliminary results demonstrate the sociolinguistic injustice existing in the business domain, e.g., the exclusion of the official languages, the dominance of foreign languages, and the excessive use of the Roman script.

Keywords: semiotic landscaping, digital scapes, language commodification, linguistic inequalities, business signage

Procedia PDF Downloads 80
1922 Identifying E-Learning Components at North-West University, Mafikeng Campus

Authors: Sylvia Tumelo Nthutang, Nehemiah Mavetera

Abstract:

Educational institutions are under pressure from their competitors. Regulators and community groups need educational institutions to adopt appropriate business and organizational practices. Globally, educational institutions now use e-learning as the best teaching and learning approach. E-learning is becoming the center of attention for learning institutions, educational systems, and software developers. North-West University (NWU) currently uses eFundi, a Learning Management System (LMS). An LMS comprises the information systems and procedures that add value to students’ learning and support the learning material in text or multimedia files. With various e-learning tools, students are able to access all the materials related to a course in electronic form. This study was tasked with identifying the e-learning components at the NWU, Mafikeng campus. A quantitative research methodology was used for data collection and descriptive statistics for data analysis. Activity Theory (AT) was used to guide the study. AT outlines the interplay between e-learning at the macro-organizational level (planning, guiding principles, campus-wide solutions) and the micro-organizational level (daily functioning practice, collaborative transformation, specific adaptation). In a technological environment, AT gives people the opportunity to move beyond concentrating on computers as an area of concern and to understand that technology is part of human activities. The findings identify the university’s current IT tools and knowledge of e-learning elements. It is recommended that the university consider buying computer resources that consume less power and practice e-learning effectively.

Keywords: e-learning, information and communication technology (ICT), teaching, virtual learning environment

Procedia PDF Downloads 259
1921 Automated Video Surveillance System for Detection of Suspicious Activities during Academic Offline Examination

Authors: G. Sandhya Devi, G. Suvarna Kumar, S. Chandini

Abstract:

This research work aims to develop a system that analyzes and identifies students who indulge in malpractice or suspicious activities during the course of an academic offline examination. Automated video surveillance provides an optimal solution that helps monitor the students and identify a malpractice event immediately. This work is organized into three modules. The first module performs an impersonation check using a PCA-based face recognition method, cross-checking each examinee's profile against the database. This module also determines the presence or absence of the student by implementing an image registration technique, wherein a grid is formed by considering all the images registered by the frontal camera at the determined positions. The second module detects facial malpractice, such as a student engaging in conversation with another or trying to obtain unauthorized information, based on a threshold range evaluated from the student's mouth state (open or closed). The third module identifies unauthorized material or gadgets used in the examination hall by training positive samples of the object through various stages; here, a top-view camera feed is analyzed to detect the suspicious activities. The system automatically alerts the administration when any suspicious activity is identified, thereby reducing the error rate caused by manual monitoring. This work is an improvement over our previously published work on identifying suspicious activities of examinees in an offline examination.
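The PCA-based impersonation check of the first module can be sketched as an "eigenfaces" pipeline: project images into a low-dimensional PCA space and match a probe image against enrolled profiles by nearest neighbor. The data below is synthetic (random vectors standing in for registered frontal images); in the real system these would be the enrolled student photos.

```python
# Eigenfaces-style sketch: PCA projection + nearest-neighbor matching.
# Synthetic "faces": each student's images cluster around a mean face.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
n_students, imgs_per_student, img_pixels = 5, 8, 64 * 64
means = rng.normal(size=(n_students, img_pixels))      # one mean face each
X = np.vstack([m + 0.1 * rng.normal(size=(imgs_per_student, img_pixels))
               for m in means])
y = np.repeat(np.arange(n_students), imgs_per_student)

pca = PCA(n_components=10).fit(X)                      # learn "eigenfaces"
matcher = KNeighborsClassifier(1).fit(pca.transform(X), y)

# A fresh image of student 2 is matched against the enrolled database.
probe = means[2] + 0.1 * rng.normal(size=img_pixels)
print(matcher.predict(pca.transform(probe[None, :])))
```

If the nearest enrolled face is too far from the probe in PCA space, the system would flag a possible impersonation rather than accept the closest match.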

Keywords: impersonation, image registration, incrimination, object detection, threshold evaluation

Procedia PDF Downloads 207
1920 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets

Authors: Kothuri Sriraman, Mattupalli Komal Teja

Abstract:

In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images, which may be irregular shapes, digits, or characters. Objects and internal objects are difficult to extract when the structure of the image contains many clusters. Estimation results are easily obtained by identifying the sub-regional objects with the SASK algorithm, which focuses on recognizing the number of internal objects in a given image in a shadow-free and error-free manner. Hard clustering and density clustering of the obtained image rough set are used to recognize the differentiated internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction, and finally, applying the hull detection system. Detecting sub-regional hulls can increase machine learning capability in the detection of characters, and the approach can be extended to hull recognition even in irregularly shaped objects, such as black holes in space exploration with their intensities. Layered hulls are those having structured layers inside; they are useful in military services and traffic monitoring to identify the number of vehicles or persons. The proposed SASK algorithm thus helps in identifying such regions and can be useful in subsequent decision processes (e.g., clearing traffic, or identifying the number of persons on the opposing side in a conflict).
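The counting idea — group the foreground of a binary image into sub-regions, then compute one hull per region — can be illustrated with standard tools. This is a hedged sketch using connected-component labeling and convex hulls, not the authors' SASK algorithm; the synthetic image and thresholds are assumptions.

```python
# Minimal sketch: label connected foreground regions of a binary image,
# then compute one convex hull per region and report count and hull areas.
import numpy as np
from scipy.ndimage import label
from scipy.spatial import ConvexHull

img = np.zeros((30, 30), dtype=int)
img[2:8, 2:8] = 1        # first sub-regional object
img[15:25, 10:20] = 1    # second sub-regional object

labeled, n_objects = label(img)   # density-style grouping of connected pixels
hulls = []
for k in range(1, n_objects + 1):
    pts = np.argwhere(labeled == k)       # pixel coordinates of object k
    hulls.append(ConvexHull(pts))         # boundary hull of the object

# ConvexHull.volume is the enclosed area for 2-D point sets.
print(n_objects, [round(h.volume) for h in hulls])
```

Counting nested (layered) hulls would additionally require examining regions of the background enclosed inside each object, which the full algorithm addresses.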

Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm

Procedia PDF Downloads 317
1919 Plant Identification Using Convolution Neural Network and Vision Transformer-Based Models

Authors: Virender Singh, Mathew Rees, Simon Hampton, Sivaram Annadurai

Abstract:

Plant identification is a challenging task that aims to identify the family, genus, and species according to plant morphological features. Automated deep learning-based computer vision algorithms are widely used for identifying plants and can help users narrow down the possibilities. However, numerous morphological similarities between and within species render correct classification difficult. In this paper, we tested custom convolution neural network (CNN) and vision transformer (ViT) based models using the PyTorch framework to classify plants. We used a large dataset of 88,000 images provided by the Royal Horticultural Society (RHS) and a smaller dataset of 16,000 images from the PlantClef 2015 dataset for classifying plants at the genus and species levels, respectively. Our results show that for classifying plants at the genus level, ViT models perform better than the CNN-based models ResNet50 and ResNet-RS-420 and other state-of-the-art CNN-based models suggested in previous studies on a similar dataset. The ViT model achieved a top accuracy of 83.3% at the genus level. For classifying plants at the species level, ViT models again outperform ResNet50 and ResNet-RS-420, with a top accuracy of 92.5%. We show that the correct set of augmentation techniques plays an important role in classification success. In conclusion, these results could help end users, professionals, and the general public alike to identify plants more quickly and with improved accuracy.
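The ViT forward pass underlying this comparison can be shown in a few lines: split an image into patches, embed them as tokens, apply self-attention, and classify from a pooled token. The sketch below uses NumPy with random weights and tiny dimensions purely to make the architecture concrete; the real models are trained PyTorch networks on the RHS and PlantClef images.

```python
# Minimal numpy sketch of a ViT-style classifier forward pass.
# Random weights; dimensions are illustrative, not the paper's models.
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(size=(32, 32, 3))          # one "plant" image
patch, d, n_classes = 8, 16, 10             # 8x8 patches, embed dim, classes

# 1) Patchify: (32/8)^2 = 16 patches, each flattened to 8*8*3 = 192 values
patches = img.reshape(4, patch, 4, patch, 3).swapaxes(1, 2).reshape(16, -1)

# 2) Linear patch embedding plus (random) position embeddings
W_embed = rng.normal(size=(patches.shape[1], d)) * 0.02
tokens = patches @ W_embed + rng.normal(size=(16, d)) * 0.02

# 3) One single-head self-attention layer with a residual connection
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.02 for _ in range(3))
Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
scores = Q @ K.T / np.sqrt(d)
attn = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
tokens = tokens + attn @ V

# 4) Mean-pool the tokens and project to genus/species logits
logits = tokens.mean(axis=0) @ (rng.normal(size=(d, n_classes)) * 0.02)
print(logits.shape)
```

A production model stacks many such attention blocks with layer norms and MLPs, and is trained end-to-end with the augmentation pipeline the abstract emphasizes.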

Keywords: plant identification, CNN, image processing, vision transformer, classification

Procedia PDF Downloads 70
1918 Development of a Roadmap for Assessing the Sustainability of Buildings in Saudi Arabia Using Building Information Modeling

Authors: Ibrahim A. Al-Sulaihi, Khalid S. Al-Gahtani, Abdullah M. Al-Sugair, Aref A. Abadel

Abstract:

Achieving environmental sustainability is one of the important issues considered in many countries’ visions. Green/sustainable building is widely used terminology for describing environmentally friendly construction. Applying sustainable practices has significant importance in various fields, including construction, which consumes an enormous amount of resources and produces a considerable amount of waste. The need for sustainability is greater in regions suffering from limited natural resources and extreme weather conditions, such as Saudi Arabia. Since building designs are becoming more sophisticated, the need for tools that support decision-making on sustainability issues is increasing, especially in the design and preconstruction stages. In this context, Building Information Modeling (BIM) can aid in performing complex building performance analyses to ensure an optimized sustainable building design. Accordingly, this paper introduces a roadmap towards developing a systematic approach for assessing the sustainability of buildings using BIM. The approach includes a set of main processes: identifying the sustainability parameters that can be used for sustainability assessment in Saudi Arabia, developing a sustainability assessment method that fits the special circumstances in the Kingdom, identifying the sustainability requirements and the BIM functions that can satisfy these requirements, and integrating these requirements with the identified functions. As a result, a sustainability-BIM approach can be developed that helps designers assess sustainability and explore different design alternatives at an early stage of the construction project.

Keywords: green buildings, sustainability, BIM, rating systems, environment, Saudi Arabia

Procedia PDF Downloads 359
1917 Optimization of Beneficiation Process for Upgrading Low Grade Egyptian Kaolin

Authors: Nagui A. Abdel-Khalek, Khaled A. Selim, Ahmed Hamdy

Abstract:

Kaolin is a naturally occurring ore predominantly containing the mineral kaolinite in addition to some gangue minerals. Typical impurities present in kaolin ore are quartz, iron oxides, titanoferrous minerals, mica, feldspar, organic matter, etc. The main coloring impurity, particularly in the ultrafine size range, is titanoferrous minerals. Kaolin is used in many industrial applications, such as sanitary ware, tableware, ceramics, paint, and paper, each of which has certain specifications. For most industrial applications, kaolin should be processed to obtain refined clay that matches standard specifications; for example, kaolin used in the paper and paint industries needs high brightness and low yellowness. Egyptian kaolin is not subjected to any beneficiation process: Egyptian companies apply selective mining followed, in some localities, by crushing and size reduction only. Such low-quality kaolin can be used in refractory and pottery production but not in whiteware and paper industries. This paper studies the amenability to beneficiation of an Egyptian kaolin ore from the El-Teih locality, Sinai, to make it suitable for different industrial applications. Attrition scrubbing and classification followed by magnetic separation were applied to remove the associated impurities: attrition scrubbing and classification to separate the coarse silica and feldspars, and wet high-intensity magnetic separation to remove colored contaminants such as iron oxide and titanium oxide. Different variables affecting the magnetic separation process, such as solids percentage, magnetic field, matrix loading capacity, and retention time, were studied. The results indicated that a substantial decrease in iron oxide (from 1.69% to 0.61%) and TiO2 (from 3.1% to 0.83%) contents, as well as improved ISO brightness (from 63.76% to 75.21%) and whiteness (from 79.85% to 86.72%) of the product, can be achieved.

Keywords: kaolin, titanoferrous minerals, beneficiation, magnetic separation, attrition scrubbing, classification

Procedia PDF Downloads 333
1916 Process Optimization and Automation of Information Technology Services in a Heterogenic Digital Environment

Authors: Tasneem Halawani, Yamen Khateeb

Abstract:

With customers’ ever-increasing expectations of fast service provisioning for all their business needs, information technology (IT) organizations, as business partners, have to cope with this demanding environment and deliver their services in the most effective and efficient way. The purpose of this paper is to identify optimization and automation opportunities for the top requested IT services in a heterogenic digital environment with a widely spread customer base. In collaboration with systems, processes, and subject matter experts (SMEs), the processes in scope were approached by analyzing four years of related historical data, identifying and surveying stakeholders, modeling the as-is processes, and studying systems integration/automation capabilities. This effort resulted in identifying several pain areas, including standardization, unnecessary customer and IT involvement, manual steps, systems integration, and performance measurement. These pain areas were addressed by standardizing the top five requested IT services, eliminating or automating 43 steps, and utilizing a single platform for end-to-end process execution. In conclusion, the optimization of IT service request processes in a heterogenic digital environment with a widely spread customer base is challenging, yet achievable without compromising service quality or customers’ added value. Further studies could measure the value of the eliminated/automated process steps to quantify the enhancement impact. Moreover, a similar approach can be utilized to optimize other IT service requests, with a focus on business criticality.

Keywords: automation, customer value, heterogenic, integration, IT services, optimization, processes

Procedia PDF Downloads 90
1915 The Use of Surveys to Combat Fake News in Media Literacy Education

Authors: Jaejun Jong

Abstract:

Fake news has recently become a serious international problem. Therefore, researchers and policymakers worldwide have sought to understand fake news and develop strategies to combat it. This study consists of two primary parts: (1) a literature review of how surveys were used to understand fake news and identify problems caused by fake news, and (2) a discussion of how surveys were used to fight back against fake news in educational settings. This second section specifically analyzes surveys used to evaluate a South Korean elementary school program designed to improve students’ metacognition and critical thinking. This section seeks to identify potential problems that may occur in the elementary school setting. The literature review shows that surveys can help people to understand fake news based on its traits rather than its definition due to the lack of agreement on the definition of fake news. The literature review also shows that people are not good at identifying fake news or evaluating their own ability to identify fake news; indeed, they are more likely to share information that aligns with their previous beliefs. In addition, the elementary school survey data shows that there may be substantial errors in the program evaluation process, likely caused by processing errors or the survey procedure, though the exact cause is not specified. Such a significant error in evaluating the effects of the educational program prevents teachers from making proper decisions and accurately evaluating the program. Therefore, identifying the source of such errors would improve the overall quality of education, which would benefit both teachers and students.

Keywords: critical thinking, elementary education, program evaluation, survey

Procedia PDF Downloads 79
1914 The Effects of Nanoemulsions Based on Commercial Oils for the Quality of Vacuum-Packed Sea Bass at 2±2°C

Authors: Mustafa Durmuş, Yesim Ozogul, Esra Balıkcı, Saadet Gokdoğan, Fatih Ozogul, Ali Rıza Köşker, İlknur Yuvka

Abstract:

Food scientists and researchers have paid attention to developing new ways of improving the nutritional value of foods. The application of nanotechnology techniques to the food industry may allow the modification of food texture, taste, sensory attributes, coloring strength, processability, and stability during the shelf life of products. In this research, the effects of nanoemulsions based on commercial oils on vacuum-packed sea bass fillets stored at 2±2°C were investigated in terms of sensory, chemical (total volatile basic nitrogen (TVB-N), thiobarbituric acid (TBA), peroxide value (PV), free fatty acids (FFA), pH, and water holding capacity (WHC)), and microbiological qualities (total anaerobic bacteria and total lactic acid bacteria). Physical properties of the emulsions (viscosity, droplet particle size, thermodynamic stability, refractive index, and surface tension) were determined. The nanoemulsion preparation method was based on the high-energy principle, using an ultrasonic homogenizer. Sensory analyses of the raw fish showed that the demerit points of the control group were higher than those of the treated groups. The sensory scores (odour, taste, and texture) of the cooked fillets decreased with storage time, especially in the control. Results obtained from the chemical and microbiological analyses also showed that the nanoemulsions significantly (p<0.05) decreased the values of the biochemical parameters and the growth of bacteria during the storage period, thus improving the quality of vacuum-packed sea bass.

Keywords: quality parameters, nanoemulsion, sea bass, shelf life, vacuum packing

Procedia PDF Downloads 438
1913 Micro-Ribonucleic Acid-21 as High Potential Prostate Cancer Biomarker

Authors: Regina R. Gunawan, Indwiani Astuti, H. Raden Danarto

Abstract:

Cancer is the leading cause of death worldwide. Cancer is caused by mutations that alter the function of normal human genes and give rise to cancer genes. MicroRNA (miRNA) is a small non-coding RNA that regulates genes by binding complementarily to a target mRNA, causing its degradation. miRNA works by either promoting or suppressing cell proliferation. miRNA expression levels in cancer may therefore offer value as a diagnostic biomarker. miR-21 is believed to have a role in carcinogenesis by enhancing proliferation, anti-apoptosis, cell cycle progression, and invasion of tumor cells. The hsa-miR-21-5p marker has been identified in the urine of prostate cancer (PCa) and benign prostatic hyperplasia (BPH) patients. This research explores the diagnostic performance of miR-21 in differentiating PCa from BPH patients. In this study, urine samples were collected from 20 PCa patients and 20 BPH patients. The relative expression of miR-21 against a reference gene was analyzed and compared between the two groups, using the comparative quantification method to find the fold change. The validity of miR-21 in identifying PCa patients was assessed by quantifying sensitivity and specificity with a contingency table. The relative expression of miR-21 against miR-16 differed by a fold change of 12.98 between PCa and BPH patients. From the contingency table of Cq expression, miR-21 identified PCa patients from BPH patients with 100% sensitivity and 75% specificity. Thus, the relative expression of miR-21 in urine samples can discriminate PCa from BPH. Furthermore, miR-21 expression has a higher sensitivity than PSA (prostate specific antigen), so miR-21 has high potential for further analysis and development.
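The sensitivity and specificity reported here follow directly from contingency-table cell counts. The counts below are assumed values consistent with the reported rates (20 PCa and 20 BPH patients; 100% sensitivity, 75% specificity), not figures taken from the paper.

```python
# Sensitivity and specificity from contingency-table cell counts.
# tp/fn are PCa patients correctly/incorrectly called; tn/fp are BPH
# patients correctly/incorrectly called. Counts here are assumed to
# match the reported rates, not taken from the paper.
def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # fraction of true PCa detected
    specificity = tn / (tn + fp)   # fraction of true BPH ruled out
    return sensitivity, specificity

sensitivity, specificity = sens_spec(tp=20, fn=0, tn=15, fp=5)
print(sensitivity, specificity)  # 1.0 0.75
```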

Keywords: benign prostate hyperplasia, biomarker, miRNA-21, prostate cancer

Procedia PDF Downloads 132
1912 Automatic Detection of Traffic Stop Locations Using GPS Data

Authors: Areej Salaymeh, Loren Schwiebert, Stephen Remias, Jonathan Waddell

Abstract:

Extracting information from new data sources has emerged as a crucial task in many traffic planning processes, such as identifying traffic patterns, route planning, traffic forecasting, and locating infrastructure improvements. Given the advanced technologies used to collect Global Positioning System (GPS) data from dedicated GPS devices, GPS-equipped phones, and navigation tools, intelligent data analysis methodologies are necessary to mine this raw data. In this research, an automatic detection framework is proposed to help identify and classify the locations of stopped GPS waypoints into two main categories: signalized intersections or highway congestion. Delaunay triangulation is used to perform this assessment in the clustering phase. While most existing clustering algorithms need assumptions about the data distribution, the effectiveness of Delaunay triangulation relies on triangulating geographical data points without such assumptions. Our proposed method starts by cleaning noise from the data and normalizing it. Next, the framework identifies stoppage points by calculating the traveled distance. The last step is to use clustering to form groups of waypoints for signalized traffic and highway congestion. A binary classifier is then applied to distinguish highway congestion from signalized stop points, using the length of the cluster to detect congestion. The proposed framework identifies the stop positions and congestion points with high accuracy, in around 99.2% of trials, showing that it is possible to distinguish the two categories with high accuracy using limited GPS data.
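The Delaunay-based clustering step can be sketched as: triangulate the stop waypoints, keep only short edges, and treat connected components as clusters; a long, stretched cluster then suggests a highway queue rather than a signalized intersection. The synthetic coordinates and the edge-length threshold below are illustrative assumptions, not the paper's parameters.

```python
# Sketch: Delaunay triangulation -> prune long edges -> connected
# components as stop clusters; cluster "length" hints at congestion.
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
# Two synthetic stop groups: a compact one (intersection-like) and a
# stretched one (queue-like along a highway).
compact = rng.normal([0, 0], 0.05, size=(30, 2))
queue = np.column_stack([np.linspace(5, 8, 30), rng.normal(0, 0.05, 30)])
pts = np.vstack([compact, queue])

tri = Delaunay(pts)
edges = set()
for simplex in tri.simplices:                  # collect unique triangle edges
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))
short = [(a, b) for a, b in edges
         if np.linalg.norm(pts[a] - pts[b]) < 0.5]  # prune long edges

rows, cols = zip(*short)
n = len(pts)
graph = coo_matrix((np.ones(len(short)), (rows, cols)), shape=(n, n))
n_clusters, labels = connected_components(graph, directed=False)

# Spatial extent of each cluster, as a proxy for the "length" feature
# used by the binary congestion classifier.
extents = [np.ptp(pts[labels == k], axis=0).max() for k in range(n_clusters)]
print(n_clusters, sorted(np.round(extents, 1)))
```

Because Delaunay triangulation makes no distributional assumptions, the same pruning threshold works regardless of how the waypoints happen to be spread.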

Keywords: Delaunay triangulation, clustering, intelligent transportation systems, GPS data

Procedia PDF Downloads 254
1911 Towards Law Data Labelling Using Topic Modelling

Authors: Daniel Pinheiro Da Silva Junior, Aline Paes, Daniel De Oliveira, Christiano Lacerda Ghuerren, Marcio Duran

Abstract:

The Courts of Accounts are institutions responsible for overseeing and pointing out irregularities in Public Administration expenses. They face a high demand for processes to be analyzed, whose decisions must be grounded in the relevant laws. Despite the large number of processes, many cases report similar subjects, so previous decisions on already analyzed processes can serve as precedents for current processes that refer to similar topics. Identifying similar topics is thus an open yet essential task for relating processes to one another. Since the actual number of topics is considerably large, identifying them by a purely manual approach is tedious and error-prone. This paper presents a tool based on Machine Learning and Natural Language Processing to assist in building a labeled dataset. The tool relies on Topic Modelling with Latent Dirichlet Allocation to find the topics underlying a document, followed by the Jensen-Shannon distance metric to generate a probability of similarity between document pairs. Furthermore, in a case study with a corpus of decisions of the Rio de Janeiro State Court of Accounts, it was noted that data pre-processing plays an essential role in modeling relevant topics. The combination of topic modeling and a distance metric calculated over documents represented in the generated topic space also proved useful in constructing a labeled base of similar and non-similar document pairs.
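The similarity step can be sketched concretely: the Jensen-Shannon distance between two documents' topic distributions, as produced by LDA. The topic mixtures below are invented for illustration; the tool's actual pipeline and corpus are not reproduced.

```python
# Minimal sketch of the Jensen-Shannon distance (base 2, bounded in [0, 1])
# between two topic distributions over the same set of topics.
import math

def _kl(p, q):
    """Kullback-Leibler divergence in bits, skipping zero-probability terms."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon_distance(p, q):
    """Square root of the Jensen-Shannon divergence of p and q."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return math.sqrt((_kl(p, m) + _kl(q, m)) / 2)

doc_a = [0.7, 0.2, 0.1]   # hypothetical topic mixture of decision A
doc_b = [0.6, 0.3, 0.1]   # hypothetical topic mixture of decision B
print(jensen_shannon_distance(doc_a, doc_b))  # small: similar decisions
```

Identical mixtures give distance 0 and disjoint mixtures give 1, which makes the value convenient to threshold when labelling pairs as similar or non-similar.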

Keywords: courts of accounts, data labelling, document similarity, topic modeling

Procedia PDF Downloads 153
1910 Self-Organizing Maps for Credit Card Fraud Detection

Authors: ChunYi Peng, Wei Hsuan Cheng, Shyh Kuang Ueng

Abstract:

This study focuses on the application of self-organizing map (SOM) technology to the analysis of credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. The SOM, an artificial neural network, is particularly suited to pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with a SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, aiding financial institution personnel in quickly identifying and responding to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM technology with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers a more effective method of fraud detection for the financial industry, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.
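The core SOM mechanism can be sketched as a toy: a 1-D map of weight vectors pulled toward each input, with neighbouring nodes updated too, after which inputs far from their best-matching unit are anomaly candidates. The data, map size, and learning schedule below are invented; the paper's actual transaction features and tooling are not reproduced.

```python
# Toy SOM sketch: train a 1-D map, then score points by quantization error
# (distance to the best-matching unit); large errors suggest anomalies.
import math, random

def train_som(data, n_nodes=4, epochs=50, lr=0.5, radius=1):
    random.seed(0)
    nodes = [[random.random() for _ in data[0]] for _ in range(n_nodes)]
    for _ in range(epochs):
        for x in data:
            # best-matching unit: node closest to the input
            bmu = min(range(n_nodes),
                      key=lambda i: sum((a - b) ** 2 for a, b in zip(nodes[i], x)))
            for i in range(n_nodes):   # pull BMU and its neighbours toward x
                if abs(i - bmu) <= radius:
                    nodes[i] = [w + lr * (a - w) for w, a in zip(nodes[i], x)]
    return nodes

def quantization_error(nodes, x):
    """Distance from x to its best-matching unit."""
    return min(math.dist(n, x) for n in nodes)

data = [[0.1, 0.1], [0.12, 0.09], [0.9, 0.95], [0.88, 0.9]]  # two normal clusters
nodes = train_som(data)
print(quantization_error(nodes, [0.11, 0.1]) < quantization_error(nodes, [0.5, 0.0]))
```

A point resembling the training clusters maps close to a node; the off-cluster point scores a much larger error, which is the signal the visualization tool would surface.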

Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies

Procedia PDF Downloads 30
1909 Self-Organizing Maps for Credit Card Fraud Detection and Visualization

Authors: Peng, Chun-Yi, Chen, Wei-Hsuan, Ueng, Shyh-Kuang

Abstract:

This study focuses on the application of self-organizing map (SOM) technology to the analysis of credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. The SOM, an artificial neural network, is particularly suited to pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with a SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, aiding financial institution personnel in quickly identifying and responding to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM technology with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers a more effective method of fraud detection for the financial industry, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.

Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies

Procedia PDF Downloads 35
1908 Empowering Transformers for Evidence-Based Medicine

Authors: Jinan Fiaidhi, Hashmath Shaik

Abstract:

Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence in the body of biomedical literature. An important challenge confronting medical practitioners is the long time needed to browse, filter, summarize and compile information from different medical resources. Deep learning can help solve this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer clinical queries that can be used for evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison and Outcome). This article describes the use of deep learning techniques for Q&A, based on transformer models like BERT and GPT, to answer PICO clinical questions for evidence-based practice using sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach acceptable state-of-the-art performance based on a two-stage bootstrapping process: filtering relevant articles, then identifying articles that support the requested outcome expressed by the PICO question. Moreover, we also report experiments that empower our bootstrapping techniques with patched attention to the most important keywords in the clinical case and the PICO question. Our bootstrapping patched with attention shows the relevance of the evidence collected, based on entropy metrics.
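To make the PICO protocol concrete, here is a hedged sketch of composing a PICO-structured clinical question into the kind of query a retrieval stage could pass to a QA model. The field names follow the PICO acronym; the clinical example, class name, and template are invented, and the paper's actual BERT/GPT models and PubMed bootstrapping are not reproduced.

```python
# Illustrative only: a PICO question as a structured record, rendered into a
# natural-language query string for a downstream QA/retrieval step.
from dataclasses import dataclass

@dataclass
class PICOQuestion:
    patient: str       # Patient/Problem
    intervention: str  # Intervention
    comparison: str    # Comparison
    outcome: str       # Outcome

    def as_query(self):
        return (f"In {self.patient}, does {self.intervention} "
                f"compared with {self.comparison} improve {self.outcome}?")

q = PICOQuestion("adults with type 2 diabetes", "metformin",
                 "sulfonylureas", "glycaemic control")
print(q.as_query())
```

Keeping the four fields separate is what lets the second bootstrapping stage check retrieved articles specifically against the requested outcome rather than the whole question.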

Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers

Procedia PDF Downloads 18
1907 A Corpus-Based Study of Subtitling Religious Words into Arabic

Authors: Yousef Sahari, Eisa Asiri

Abstract:

Hollywood films are produced in an open and liberal context, and when subtitling for a more conservative and closed society, such as an Arabic one, religious words can pose a thorny challenge for subtitlers. Using a corpus of 90 Hollywood films released between 2000 and 2018 and applying insights from Descriptive Translation Studies (Toury, 1995, 2012) and the dichotomy of domestication and foreignization, this paper investigates three main research questions: (1) What are the dominant religious terms and functions in the English subtitles? (2) What are the dominant translation strategies used in the translation of religious words? (3) Do these strategies tend to be SL-oriented or TL-oriented (domesticating or foreignizing)? To answer these questions, a quantitative and qualitative analysis of the corpus is conducted, in which the researcher adopts a self-designed, parallel, aligned corpus of the ninety films and their Arabic subtitles. A quantitative analysis compares the frequencies and distribution of religious words, their functions, and the translation strategies employed by the subtitlers, with the aim of identifying similarities or differences and the impact of the functions of religious terms on the choice of subtitling strategies. Building on the quantitative analysis, a qualitative analysis identifies translational patterns in Arabic renderings of religious words and the possible reasons for subtitlers’ choices. The results show that the function of religious words strongly influences the choice of subtitling strategies. Foreignization strategies are found to be applied in about two-thirds of the total occurrences of religious words.
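The quantitative step, counting religious terms and the subtitling strategy attached to each occurrence in an aligned corpus, can be sketched as follows. The tagged pairs below are invented toy annotations; the real corpus spans ninety films.

```python
# Toy sketch: frequency analysis of (term, strategy) annotations.
from collections import Counter

# (english_term, strategy) pairs, as a subtitler-annotated sample might look
occurrences = [
    ("God", "foreignization"), ("hell", "domestication"),
    ("God", "foreignization"), ("amen", "foreignization"),
    ("hell", "omission"), ("God", "foreignization"),
]

term_freq = Counter(term for term, _ in occurrences)
strategy_freq = Counter(strategy for _, strategy in occurrences)
foreign_share = strategy_freq["foreignization"] / len(occurrences)
print(term_freq.most_common(1), round(foreign_share, 2))
```

In this toy sample, as in the paper's finding, foreignization accounts for about two-thirds of occurrences; the real analysis would additionally break counts down by term function.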

Keywords: religious terms, subtitling, audiovisual translation, Modern Standard Arabic, subtitling strategies, English-Arabic subtitling

Procedia PDF Downloads 135
1906 Role and Impact of Artificial Intelligence in Sales and Distribution Management

Authors: Kiran Nair, Jincy George, Suhaib Anagreh

Abstract:

Artificial intelligence (AI) in a marketing context is a deterministic tool designed to optimize and enhance marketing tasks, research tools, and techniques. It is on the verge of transforming marketing roles and revolutionizing the entire industry. This paper explores the current dissemination of artificial intelligence (AI) applications in the marketing mix, reviewing the scope and application of AI across sales and distribution management. The paper also aims to identify the areas where AI has the strongest impact on factors of sales and distribution management such as distribution channels, purchase automation, customer service, merchandising automation, and shopping experiences. The research examines the impact of AI on the sales and distribution management of 30 multinational brands in six industries: airline; automobile; banking and insurance; education; information technology; and retail and telecom. Primary data were collected by means of interviews and questionnaires from a sample of 100 marketing managers selected using convenience sampling. The data were then analyzed using descriptive statistics, correlation analysis and multiple regression analysis. The study reveals that AI applications are extensively used in sales and distribution management, with a strong impact on factors such as identifying new distribution channels, merchandising automation, customer service, and purchase automation, as well as sales processes. Some international brands have already integrated AI extensively into their day-to-day operations for better efficiency and improved market share, while others are investing heavily in new AI applications to gain a competitive advantage.

Keywords: artificial intelligence, sales and distribution, marketing mix, distribution channel, customer service

Procedia PDF Downloads 131
1905 A System Dynamics Model for Analyzing Customer Satisfaction in Healthcare Systems

Authors: Mahdi Bastan, Ali Mohammad Ahmadvand, Fatemeh Soltani Khamsehpour

Abstract:

The sustainable development of health organizations has become strongly affected by customer satisfaction, owing to significant changes in the business environment of the healthcare system and the emergence of the competitiveness paradigm. If hospitals and other health organizations are viewed as profit-oriented service providers, the satisfaction of employees as internal customers and of patients as external customers is of significant importance to success in the health business. Furthermore, the satisfaction rate can serve as a perceived-quality measure in the performance assessment of healthcare organizations. Several studies have identified factors affecting patients’ satisfaction with health organizations. From a systemic view, however, the complex causal relations among the many components of a healthcare system require an understanding of its dynamic complexity, an appropriate cognition of the different components, and the effective relationships among them, ultimately leading to identification of the generative structure of patients’ satisfaction. Hence, this paper applies system dynamics coherently and methodologically to represent the systemic structure of customer satisfaction in a health system, including the constituent components and the interactions among them. The results of different policies applied to the system are then simulated by developing mathematical models, identifying leverage points, and using scenario-making techniques, and the best solutions for improving customer satisfaction with the services are presented. The presented approach supports the use of decision support systems. Additionally, relying on an understanding of the system’s dynamic behavior, effective policies for improving the health system can be recognized.
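A system dynamics simulation of this kind can be sketched as a single stock-and-flow loop: satisfaction as a stock that rises with service quality and decays as expectations adjust, integrated with Euler steps the way SD tools do internally. This is an invented illustrative model, not the authors' model; the parameter names and values are assumptions.

```python
# Illustrative stock-and-flow sketch of a customer-satisfaction stock.
def simulate(quality, satisfaction0=0.5, erosion=0.3, dt=0.25, horizon=20.0):
    """Return the satisfaction trajectory under a constant service quality."""
    s, traj, t = satisfaction0, [satisfaction0], 0.0
    while t < horizon:
        inflow = quality * (1 - s)   # better quality closes the satisfaction gap
        outflow = erosion * s        # rising expectations erode satisfaction
        s += dt * (inflow - outflow)
        traj.append(s)
        t += dt
    return traj

low, high = simulate(quality=0.2), simulate(quality=0.6)
print(round(low[-1], 2), round(high[-1], 2))  # 0.4 0.67
```

Each policy scenario corresponds to a different parameter setting; comparing equilibria (here quality/(quality + erosion)) is a simple stand-in for the paper's scenario-making over a much richer causal structure.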

Keywords: customer satisfaction, healthcare, scenario, simulation, system dynamics

Procedia PDF Downloads 388
1904 Low-Cost Reusable Thermal Energy Storage Particle for Concentrating Solar Power

Authors: Kyu Bum Han, Eunjin Jeon, Kimberly Watts, Brenda Payan Medina

Abstract:

Gen3 Concentrating Solar Power (CSP) high-temperature thermal systems have the potential to lower the cost of a CSP system. Compared to the other candidate systems (chloride salt blends and supercritical fluids), the particle transport system can avoid many of the issues associated with high-temperature fluid systems because of its ability to operate at ambient pressure with limited corrosion or thermal stability risk. Furthermore, identifying and demonstrating low-cost particles that have excellent optical properties and durability can significantly reduce the levelized cost of electricity (LCOE) of particle receivers. The thermal transfer particle currently available in research and on the market oxidizes at about 700 °C, which reduces its durability, generates particle loss under high friction loads, and causes color change. To meet the CSP SunShot goal, the durability of particles must be improved by identifying particles that are less abrasive to other structural materials. Furthermore, the particles must be economically affordable, and their solar absorptance must be increased while minimizing thermal emittance. We are studying a novel thermal transfer particle that combines low cost, high durability, and high solar absorptance at high temperatures. The particle minimizes thermal emittance and is less abrasive to other structural materials. Additionally, the particle demonstrates reusability, which significantly lowers the LCOE. This study will contribute to two principal disciplines of energy science: materials synthesis and manufacturing. Developing this particle for thermal transfer will have a positive impact on ceramics research and industry, as well as on society.

Keywords: concentrating solar power, thermal energy storage, particle, reusability, economics

Procedia PDF Downloads 206
1903 Recommender Systems for Technology Enhanced Learning (TEL)

Authors: Hailah Alballaa, Azeddine Chikh

Abstract:

Several challenges impede the adoption of Recommender Systems for Technology Enhanced Learning (TEL): collecting and identifying possible datasets; selecting among different recommender approaches; and evaluating their performance. The aim of this paper is twofold: first, to survey the most significant work in this area; second, to identify possible research directions.

Keywords: datasets, content-based filtering, recommender systems, TEL

Procedia PDF Downloads 225
1902 Utility of CT Perfusion Imaging for Diagnosis and Management of Delayed Cerebral Ischaemia Following Subarachnoid Haemorrhage

Authors: Abdalla Mansour, Dan Brown, Adel Helmy, Rikin Trivedi, Mathew Guilfoyle

Abstract:

Introduction: Diagnosing delayed cerebral ischaemia (DCI) following aneurysmal subarachnoid haemorrhage (SAH) can be challenging, particularly in poor-grade patients. Objectives: This study sought to assess the value of routine CT perfusion (CTP) in identifying (or excluding) DCI and in guiding management. Methods: An eight-year retrospective neuroimaging study at a large UK neurosurgical centre. Subjects comprised a random sample of adult patients with confirmed aneurysmal SAH who had a CTP scan during their inpatient stay over an 8-year period (May 2014 - May 2022). Data were collected through the electronic patient record and PACS. Variables included age, WFNS scale, aneurysm site, treatment, timing of CTP, radiologist report, and DCI management. Results: Over eight years, 916 patients were treated for aneurysmal SAH; this study focused on 466 randomly selected patients. Of this sample, 181 (38.84%) had one or more CTP scans following brain aneurysm treatment (318 scans in total). The first CTP scan in each patient was performed 1-20 days following ictus (median 4 days). There was radiological evidence of DCI in 83 patients, and no reversible ischaemia was found in 80; findings were equivocal in the remaining 18. Of the 103 patients treated with clipping, 49 had radiological evidence of DCI, compared with 31 of 69 patients treated with endovascular embolization; the remaining 9 patients had either unsecured aneurysms or non-aneurysmal SAH. Of the patients with radiological evidence of DCI, 65 had a treatment change following CTP directed at improving cerebral perfusion. In contrast, treatment was not changed for 61 patients without radiological evidence of DCI. Conclusion: CTP is a useful adjunct to clinical assessment in the diagnosis of DCI and helps identify patients who may benefit from intensive therapy and those in whom it is unlikely to be effective.

Keywords: SAH, vasospasm, aneurysm, delayed cerebral ischemia

Procedia PDF Downloads 47
1901 Extraction and Encapsulation of Carotenoids from Carrot

Authors: Gordana Ćetković, Sanja Podunavac-Kuzmanović, Jasna Čanadanović-Brunet, Vesna Tumbas Šaponjac, Vanja Šeregelj, Jelena Vulić, Slađana Stajčić

Abstract:

The color of food is one of the decisive factors for consumers, and the potential toxicity of artificial food colorants has led consumers to prefer natural products over products with artificial colors. Natural pigments also have many bioactive functions, such as antioxidant and provitamin activity, so their acceptability to consumers is much higher. Present in all photosynthetic plant tissues, carotenoids are probably the most widespread pigments in nature. Carrot (Daucus carota) is a good source of functional food components and is especially rich in carotenoids, mainly α- and β-carotene and lutein. For this study, carrot was extracted using classical extraction with hexane and ethyl acetate, as well as supercritical CO₂ extraction. Extraction efficiency was evaluated from the carotenoid yield determined spectrophotometrically. Classical extraction using hexane (18.27 mg β-carotene/100 g DM) was the most efficient method for isolating carotenoids, compared to classical extraction with ethyl acetate (15.73 mg β-carotene/100 g DM) and supercritical CO₂ extraction (0.19 mg β-carotene/100 g DM). The three carrot extracts were also tested for antioxidant activity using DPPH and reducing power assays. Surprisingly, the ethyl acetate extract had the best antioxidant activity on DPPH radicals (AADPPH = 120.07 μmol TE/100 g), while the hexane extract showed the best reducing power (RP = 1494.97 μmol TE/100 g). The hexane extract was chosen as the most potent source of carotenoids and was encapsulated in whey protein by freeze-drying. Carotenoid encapsulation efficiency was high (89.33%). Based on these results, it can be concluded that carotenoids can be efficiently extracted from carrot using hexane and the classical extraction method. Owing to its high encapsulation efficiency and coloring capacity, this extract has the potential to be applied in encapsulated form and can therefore be used for dietary supplement development and food fortification.

Keywords: carotenoids, carrot, extraction, encapsulation

Procedia PDF Downloads 245
1900 Development and Validation of Integrated Continuous Improvement Framework for Competitiveness: Mixed Research of Ethiopian Manufacturing Industries

Authors: Haftu Hailu Berhe, Hailekiros Sibhato Gebremichael, Kinfe Tsegay Beyene, Haileselassie Mehari

Abstract:

The purpose of the study is to develop and validate a literature-based integrated JIT, TQM, TPM, SCM and LSS framework through a combination of the PDCA cycle and the DMAIC methodology. The study adopted a mixed research approach. Accordingly, the qualitative study employed to develop the framework is based on identifying the unique and common practices of the JIT, TQM, TPM, SCM and LSS initiatives, the existing practice of their integration, and the existing gaps in frameworks and practices, and on developing a new integrated JIT, TQM, TPM, SCM and LSS practice framework. Very few previous studies of the unique and common practices of the five initiatives exist. The quantitative study validating the framework is based on an empirical analysis of a self-administered questionnaire using a statistical package for the social sciences. An integrated CI framework combining the PDCA cycle and the DMAIC methodology is developed, constructed as a project-based framework with five detailed implementation phases. The empirical analysis demonstrated that the proposed framework is valuable if adopted and implemented correctly. So far, no study has proposed and validated an integrated CI framework within the scope of this study; this is therefore the earliest study to propose and validate such a framework for manufacturing industries. The proposed framework is applicable to manufacturing industries and can assist in achieving competitive advantage when manufacturing industries, institutions and government make unconditional efforts to implement its full contents.

Keywords: integrated continuous improvement framework, just in time, total quality management, total productive maintenance, supply chain management, lean six sigma

Procedia PDF Downloads 103
1899 Identifying Applicant Potential Through Admissions Testing

Authors: Belinda Brunner

Abstract:

Objectives: To communicate the common test constructs of well-known higher education admissions tests; to discuss influences on admissions test construct definition and design, and research on factors influencing success in academic study; to discuss how admissions tests can be used to identify relevant talent; and to examine how admissions tests can facilitate educational mobility and inform selection decisions when the prerequisite curricula are not standardized. Observations: Generally speaking, the constructs of admissions tests can be placed along a continuum from curriculum-related knowledge to more general reasoning abilities. For example, subject-specific achievement tests are closely aligned with a prescribed curriculum, while reasoning tests are typically not associated with a specific curriculum. This session will draw on the test constructs of well-known international higher education admissions tests, such as the UK Clinical Aptitude Test (UKCAT), which is used for medicine and dentistry admissions. Conclusions: The purpose of academic admissions testing is to identify potential students with the prerequisite skill set needed to succeed in the academic environment, but how can the test construct help achieve this goal? Determining the appropriate construct for tests used in admissions selection decisions should be influenced by a number of factors, including the preceding academic curricula, the other criteria influencing the admissions decision, and the principal purpose of testing. Attendees of this session will learn the types of aptitudes and knowledge assessed by higher education admissions tests and will gain insight into how careful and deliberate consideration of the desired test constructs can aid in identifying potential students with the greatest likelihood of success in medical school.

Keywords: admissions, measuring success, selection, identify skills

Procedia PDF Downloads 469