Search results for: automatic classification
1030 Development of Electronic Services in Georgia: Analysis of Current Situation
Authors: Dato Surmanidze, Dato Antadze, Tornike Partenadze
Abstract:
Public online services in Georgia are concentrated on the main target segments: public administration, business, the population, non-governmental and other interested organizations. Therefore, the digital Georgia strategy is focused on providing G2C, G2B/B2G, G2NGO and G2G services. Within the G2C framework, sophisticated, high-technology online services have been developed to provide passports, identity cards, documentation concerning residence and civil acts (birth, marriage, divorce, child adoption, change of name and surname, death, etc.), as well as other services. Websites like my.gov.ge and sda.gov.ge offer distance services such as electronic application, processing and decision making. In line with international standards, automatic services such as electronic tenders, product catalogues, invoices and payments have been developed. This creates a better investment climate for foreign companies in Georgia within the framework of G2B policy. The website mybusiness.gov.ge creates better conditions for local business. Among the electronic services is e-NRMS (the electronic system for national resource management), which was introduced by the Ministry of Finance of Georgia. The system was created to ensure the management of national resources by state and business organizations. It is integrated with banking services and provides G2C, G2B and B2G representatives with electronic services. A portal, meteo.gov.ge, was also created, which provides electronic services concerning air, geological, environmental and pollution issues. Worknet.gov.ge should also be mentioned, an electronic hub of information management for employers and employees. The information portal of the labor market will facilitate the receipt of information, its analysis and its delivery to interested parties such as employers and employees. However, two years after launch, only the employees' portal has been activated; therefore, awareness of the portal, its competitiveness and its success is undermined.
Keywords: electronic services, public administration, information technology, information society
Procedia PDF Downloads 268
1029 A Geographical Framework for Studying the Territorial Sustainability Based on Land Use Change
Authors: Miguel Ramirez, Ivan Lizarazo
Abstract:
The emergence of various interpretations of sustainability, including weak and strong paradigms, can be traced back to the definition of sustainable development provided in the 1987 Brundtland report and the subsequent evolution of the sustainability concept. However, limited scholarly attention has been given to clarifying the concept of sustainability within the theoretical and conceptual framework of geography. The discipline has predominantly focused on understanding the diverse conceptions of sustainability within its epistemological boundaries, resulting in tensions between sustainability paradigms and their associated dimensions, including the incorporation of political perspectives, with particular emphasis on the epistemology of environmental geography. In response to this gap, a conceptual framework for sustainability is proposed that effectively integrates spatial and territorial concepts. This framework aims to enhance geography's role in contributing to sustainability by utilizing land system theory, which is based on the dynamics of land use change. Such an integrated conceptual framework makes it possible to incorporate methodological tools such as remote sensing, encompassing various earth observations and fusion methods, and supervised classification techniques. Additionally, it seeks better integration of socioecological information, thereby capturing essential population-related features.
Keywords: geography, sustainability, land change science, territorial sustainability
Procedia PDF Downloads 81
1028 Meta-Instruction Theory in Mathematics Education and Critique of Bloom’s Theory
Authors: Abdollah Aliesmaeili
Abstract:
The purpose of this research is to present a different perspective on basic mathematics teaching called meta-instruction, which reverses the learning path. Meta-instruction is a method of teaching in which the teaching trajectory starts from brain education and moves into learning. This research focuses on the behavior of the mind during learning. In this method, students are not instructed in mathematics; they are educated. Another goal of the research is to criticize Bloom's classification in the cognitive domain and reverse it, because it cannot meet the educational and instructional needs of the new generation, and to substitute math education for math teaching. This is an indirect method of teaching. The research method is longitudinal, spanning four years. The statistical sample included students aged 6 to 11. The research focuses on improving children's mental abilities to explore mathematical rules and operations through play only, with eight measurements (two examinations per year). The results showed that there is a significant difference between groups in remembering, understanding, and applying. Moreover, educating in math is more effective than instructing in terms of overall learning abilities.
Keywords: applying, Bloom's taxonomy, brain education, mathematics teaching method, meta-instruction, remembering, starmath method, understanding
Procedia PDF Downloads 24
1027 Application of Rapidly Exploring Random Tree Star-Smart and G2 Quintic Pythagorean Hodograph Curves to the UAV Path Planning Problem
Authors: Luiz G. Véras, Felipe L. Medeiros, Lamartine F. Guimarães
Abstract:
This work approaches the automatic planning of paths for Unmanned Aerial Vehicles (UAVs) through the application of the Rapidly Exploring Random Tree Star-Smart (RRT*-Smart) algorithm. RRT*-Smart is a sampling process over positions of a navigation environment through a tree-type graph. The algorithm consists of randomly expanding a tree from an initial position (root node) until one of its branches reaches the final position of the path to be planned. The algorithm ensures the planning of the shortest path as the number of iterations tends to infinity. When a new node is inserted into the tree, each neighbor node of the new node is connected to it if and only if the extension of the path between the root node and that neighbor node, with this new connection, is less than the current extension of the path between those two nodes. RRT*-Smart uses an intelligent sampling strategy to plan less extensive routes while spending a smaller number of iterations. This strategy is based on the creation of samples/nodes near the convex vertices of the obstacles in the navigation environment. The planned paths are smoothed through the application of the method called quintic Pythagorean hodograph curves. The smoothing process converts a route into a dynamically viable one based on the kinematic constraints of the vehicle. This smoothing method models the hodograph components of a curve with polynomials that obey the Pythagorean theorem. Its advantage is that the obtained structure allows computation of the curve length in an exact way, without the need for quadrature techniques for the resolution of integrals.
Keywords: path planning, path smoothing, Pythagorean hodograph curve, RRT*-Smart
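To make the connection and rewiring rule quoted in this abstract concrete, the following is a minimal sketch of one RRT*-style iteration. It is illustrative only: the node structure, step handling, sampling bounds and cost bookkeeping are assumptions, and obstacle/collision checking and the Smart sampling bias near convex obstacle vertices are omitted.

```python
import math
import random

class Node:
    def __init__(self, x, y, parent=None, cost=0.0):
        self.x, self.y, self.parent, self.cost = x, y, parent, cost

def dist(a, b):
    return math.hypot(a.x - b.x, a.y - b.y)

def extend(tree, radius=2.0, xmax=100.0, ymax=100.0):
    """One RRT*-style iteration: sample, pick the cheapest parent, rewire neighbors."""
    sample = Node(random.uniform(0, xmax), random.uniform(0, ymax))
    nearest = min(tree, key=lambda n: dist(n, sample))
    new = Node(sample.x, sample.y, nearest, nearest.cost + dist(nearest, sample))
    neighbors = [n for n in tree if dist(n, new) <= radius]
    # choose the parent that gives the cheapest path from the root to the new node
    for n in neighbors:
        if n.cost + dist(n, new) < new.cost:
            new.parent, new.cost = n, n.cost + dist(n, new)
    tree.append(new)
    # rewiring: reconnect a neighbor through the new node only if that shortens
    # its current path from the root (the condition described in the abstract)
    for n in neighbors:
        if new.cost + dist(new, n) < n.cost:
            n.parent, n.cost = new, new.cost + dist(new, n)
    return new

tree = [Node(0.0, 0.0)]        # root node at the start position
for _ in range(500):
    extend(tree)
```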
Procedia PDF Downloads 167
1026 Comparison of the Results of a Parkinson’s Holter Monitor with Patient Diaries, in Real Conditions of Use: A Sub-Analysis of the MoMoPa-EC Clinical Trial
Authors: Alejandro Rodríguez-Molinero, Carlos Pérez-López, Jorge Hernández-Vara, Àngels Bayes-Rusiñol, Juan Carlos Martínez-Castrillo, David A. Pérez-Martínez
Abstract:
Background: Monitoring motor symptoms in Parkinson's patients is often a complex and time-consuming task for clinicians, as Hauser's diaries are often poorly completed by patients. Recently, new automatic devices (Parkinson's holter: STAT-ON®) have been developed that are capable of monitoring patients' motor fluctuations. The MoMoPa-EC clinical trial (NCT04176302) investigates which of the two methods produces better clinical results. In this sub-analysis, the concordance between both methods is analyzed. Methods: The MoMoPa-EC clinical trial will include 164 patients with moderate-severe Parkinson's disease and at least two hours a day of Off time. At the time of patient recruitment, all of them completed a seven-day motor fluctuation diary at home (Hauser's diary) while wearing the Parkinson's holter. In this sub-analysis, 71 patients with complete data for the purpose of this comparison were included. The intraclass correlation coefficient was calculated between the patient diary entries and the Parkinson's holter data in terms of time On, time Off, and time with dyskinesias. Results: The intraclass correlation coefficient of both methods was 0.57 (95% CI: 0.3-0.74) for daily time in Off (%), 0.48 (95% CI: 0.14-0.68) for daily time in On (%), and 0.37 (95% CI: -0.04-0.62) for daily time with dyskinesias (%). Conclusions: Both methods have moderate agreement with each other. We will have to wait for the results of the MoMoPa-EC project to estimate which of them has the greatest clinical benefits. Acknowledgment: This work is supported by AbbVie S.L.U, the Instituto de Salud Carlos III [DTS17/00195], and the European Fund for Regional Development, 'A way to make Europe'.
Keywords: Parkinson, sensor, motor fluctuations, dyskinesia
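For readers unfamiliar with the agreement statistic reported here, the sketch below computes a simple one-way, single-measure intraclass correlation from paired daily Off-time percentages. The abstract does not state which ICC form was used, and the numbers below are invented purely for illustration.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random, single-measure ICC(1,1) from an (n_subjects, k_methods) array."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    subject_means = ratings.mean(axis=1)
    # between-subject and within-subject mean squares
    msb = k * ((subject_means - grand_mean) ** 2).sum() / (n - 1)
    msw = ((ratings - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# hypothetical daily %Off estimates: column 0 = Hauser diary, column 1 = holter
data = [[30, 35], [10, 12], [55, 48], [22, 30], [40, 37], [15, 20]]
print(round(icc_oneway(data), 2))
```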
Procedia PDF Downloads 233
1025 MRI Quality Control Using Texture Analysis and Spatial Metrics
Authors: Kumar Kanudkuri, A. Sandhya
Abstract:
Typically, in an MRI clinical setting, several protocols are run, each indicated for a specific anatomy and disease condition. However, these protocols, or parameters within them, can change over time due to changes in the recommendations of physician groups, updates in the software, or the availability of new technologies. Most of the time, the changes are performed by the MRI technologist to account for time, coverage, physiological, or Specific Absorption Rate (SAR) reasons. However, giving proper guidelines to MRI technologists is important so that they do not change parameters that negatively impact image quality. Typically, a standard American College of Radiology (ACR) MRI phantom is used for Quality Control (QC) in order to guarantee that the primary objectives of MRI are met. The visual evaluation of quality depends on the operator/reviewer and might change amongst operators as well as for the same operator at various times. Therefore, overcoming these constraints is essential for a more impartial evaluation of quality, which makes the quantitative estimation of image quality (IQ) metrics for MRI quality control very important. To solve this problem, we propose a robust, open-source, and automated MRI image quality control tool. The designed and developed automatic analysis tool measures MRI IQ metrics such as Signal-to-Noise Ratio (SNR), SNR Uniformity (SNRU), Visual Information Fidelity (VIF), Feature Similarity (FSIM), Gray-Level Co-occurrence Matrix (GLCM) features, slice thickness accuracy, slice position accuracy, and high-contrast spatial resolution, and provided good accuracy assessment. A standardized quality report is generated that incorporates the metrics that impact diagnostic quality.
Keywords: ACR MRI phantom, MRI image quality metrics, SNRU, VIF, FSIM, GLCM, slice thickness accuracy, slice position accuracy
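As an illustration of one of the listed metrics, the sketch below measures SNR from region-of-interest statistics on a phantom slice. The ROI positions, sizes and the Rayleigh-noise correction factor are assumptions for illustration, not the tool's actual implementation.

```python
import numpy as np

def roi_mean_std(image, center, half_size):
    """Mean and standard deviation inside a square ROI around a (row, col) center."""
    r, c = center
    patch = image[r - half_size:r + half_size, c - half_size:c + half_size]
    return patch.mean(), patch.std()

def phantom_snr(image, signal_center, noise_center, half_size=10):
    """SNR = mean signal / (background std corrected for the magnitude-image noise floor)."""
    signal_mean, _ = roi_mean_std(image, signal_center, half_size)
    _, noise_std = roi_mean_std(image, noise_center, half_size)
    return signal_mean / (noise_std / 0.655)   # 0.655: Rayleigh-to-Gaussian correction

# synthetic example: a bright phantom block on a noisy background
img = np.random.normal(20, 5, (256, 256))
img[96:160, 96:160] += 500
print(round(phantom_snr(img, (128, 128), (20, 20)), 1))
```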
Procedia PDF Downloads 171
1024 Designing Cultural-Creative Products with the Six Categories of Hanzi (Chinese Character Classification)
Authors: Pei-Jun Xue, Ming-Yu Hsiao
Abstract:
Chinese characters, or hanzi, represent a process of simplifying three-dimensional signs into plane signifiers. From pictograms at the beginning to logograms today, a Han linguist thus classified them into six categories known as the six categories of Chinese characters. Design is a process of signification, and cultural-creative design is a process translating ideas into design with creativity upon culture. Aiming to investigate the process of cultural-creative design transforming cultural text into cultural signs, this study analyzed existing cultural-creative products with the six categories of Chinese characters by treating such products as representations which accurately communicate the designer’s ideas to users through the categorization, simplification, and interpretation of sign features. This is a two-phase pilot study on designing cultural-creative products with the six categories of Chinese characters. Phase I reviews the related literature on the theory of the six categories of Chinese characters investigated and concludes with the process and principles of character evolution. Phase II analyzes the design of existing cultural-creative products with the six categories of Chinese characters and explores the conceptualization of product design.
Keywords: six categories of Chinese characters, cultural-creative product design, cultural signs, cultural product
Procedia PDF Downloads 344
1023 Variation in the Morphology of Soft Palate
Authors: Hema Lattupalli
Abstract:
Introduction: The palate forms a partition between the oral cavity and the nasal cavity. The palate is made up of two parts: the hard palate and the soft palate. The hard palate forms the anterior part of the palate, while the soft palate is a movable muscular fold, covered by mucous membrane, that is suspended from the posterior border of the hard palate. Aim and Objectives: Morphological variations of the soft palate are scarcely covered in the literature. It is also believed that the soft palate has no important anatomical variations. There is a variable presentation of soft palate morphology in lateral cephalograms. The aim of this study is to identify the velar morphology. Materials and Methods: 100 normal subjects between the ages of 20 and 35 were taken for the study. Method: lateral cephalogram (radiologic study). Results: Different shapes of the soft palate were observed in the lateral cephalograms. The morphology of the soft palate was classified into six types: 1. leaf-like (50 cases), the most common type; 2. straight line (20 cases); 3. S-shaped (4 cases), very rare; 4. butt-like (10 cases); 5. rat tail (6 cases); 6. hook-shaped (10 cases). Conclusion: This classification helps us to better understand the diversity of velar morphology in the mid-sagittal plane. These findings help us to understand the etiology of OSAS.
Keywords: soft palate, cephalometric radiographs, morphology, cleft palate, obstructive sleep apnoea syndrome
Procedia PDF Downloads 363
1022 Extraction of Urban Land Features from TM Landsat Image Using the Land Features Index and Tasseled Cap Transformation
Authors: R. Bouhennache, T. Bouden, A. A. Taleb, A. Chaddad
Abstract:
In this paper, we propose a method to map urban areas. The method uses an arithmetic calculation processed from land feature indices and the Tasseled Cap transformation (TC) of a multispectral Thematic Mapper Landsat TM image. For this purpose, the index images derived from the original image, such as SAVI (the soil-adjusted vegetation index), UI (the urban index), and EBBI (the enhanced built-up and bareness index), were stacked to form a new image with uncorrelated bands. The Spectral Angle Mapper (SAM) and Spectral Information Divergence (SID) supervised classification approaches were first applied to the new image using the reference spectra of the spectral library, and subsequently the four land cover categories of urban, vegetation, water, and soil were extracted together with their accuracy assessment. The urban features were represented using a logic calculation applied to the brightness, UI-SAVI, NDBI-greenness, and EBBI-brightness data sets. The study was applied to Blida and showed that urban features can be mapped with an accuracy ranging from 92% to 95%.
Keywords: EBBI, SAVI, Tasseled Cap Transformation, UI
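A minimal sketch of the index stack described above, using the commonly published formulas for SAVI, NDBI/UI and EBBI on Landsat TM bands. The band choices, the soil adjustment factor L and the small epsilon guard are standard assumptions for illustration, not values stated by the authors.

```python
import numpy as np

def land_feature_stack(red, nir, swir1, swir2, tir, L=0.5):
    """Stack SAVI, UI, NDBI and EBBI index images from TM band arrays (reflectance)."""
    eps = 1e-6                                     # avoid division by zero
    savi = (1 + L) * (nir - red) / (nir + red + L + eps)
    ndbi = (swir1 - nir) / (swir1 + nir + eps)     # built-up index (TM band 5 vs 4)
    ui   = (swir2 - nir) / (swir2 + nir + eps)     # urban index (TM band 7 vs 4)
    ebbi = (swir1 - nir) / (10.0 * np.sqrt(swir1 + tir) + eps)
    return np.dstack([savi, ui, ndbi, ebbi])

# toy 2x2 "bands" just to show the call signature
shape = (2, 2)
bands = [np.full(shape, v) for v in (0.10, 0.30, 0.25, 0.20, 0.35)]
print(land_feature_stack(*bands).shape)            # (2, 2, 4)
```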
Procedia PDF Downloads 483
1021 Bridge Members Segmentation Algorithm of Terrestrial Laser Scanner Point Clouds Using Fuzzy Clustering Method
Authors: Donghwan Lee, Gichun Cha, Jooyoung Park, Junkyeong Kim, Seunghee Park
Abstract:
3D shape models of existing structures are required for many purposes, such as safety and operation management. The traditional 3D modeling methods are based on manual or semi-automatic reconstruction from close-range images, which incurs great expense and is time-consuming. The Terrestrial Laser Scanner (TLS) is a common survey technique for measuring a 3D shape model quickly and accurately, and it is used in construction sites and cultural heritage management. However, there are many limits to processing a TLS point cloud, because the raw point cloud is a massive volume of data, so the capability of carrying out useful analyses is also limited with unstructured 3D points. Thus, segmentation becomes an essential step whenever the grouping of points with common attributes is required. In this paper, a member segmentation algorithm is presented to separate a raw point cloud which includes only 3D coordinates. This paper presents a clustering approach based on a fuzzy method for this objective. The Fuzzy C-Means (FCM) algorithm is reviewed and used in combination with a similarity-driven cluster merging method. It is applied to the point cloud acquired with a Leica ScanStation C10/C5 at the test bed. The test bed was a bridge which connects the 1st and 2nd engineering buildings at Sungkyunkwan University in Korea. It is about 32 m long and 2 m wide and is used as a pedestrian bridge between the two buildings. The 3D point cloud of the test bed was constructed by a TLS measurement, and the data were divided by the segmentation algorithm into individual members. Experimental analyses of the results from the proposed unsupervised segmentation process are shown to be promising. Because of the segmentation of the point cloud, the result can be further processed to manage the configuration of each member.
Keywords: fuzzy c-means (FCM), point cloud, segmentation, terrestrial laser scanner (TLS)
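A minimal numpy sketch of the Fuzzy C-Means update equations used in the segmentation step follows. The cluster count, fuzzifier m and convergence tolerance are illustrative assumptions, and the paper's similarity-driven cluster merging step is not shown.

```python
import numpy as np

def fuzzy_c_means(points, n_clusters=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Return cluster centers and the fuzzy membership matrix for a point cloud."""
    rng = np.random.default_rng(seed)
    n = len(points)
    u = rng.random((n, n_clusters))
    u /= u.sum(axis=1, keepdims=True)              # memberships sum to 1 per point
    for _ in range(max_iter):
        um = u ** m
        centers = um.T @ points / um.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        inv = 1.0 / d ** (2.0 / (m - 1))
        new_u = inv / inv.sum(axis=1, keepdims=True)   # standard FCM membership update
        if np.abs(new_u - u).max() < tol:
            break
        u = new_u
    return centers, u

# toy 3D point cloud: three blobs standing in for bridge members
rng = np.random.default_rng(1)
cloud = np.vstack([rng.normal(c, 0.2, (100, 3)) for c in ([0, 0, 0], [5, 0, 0], [0, 5, 0])])
centers, memberships = fuzzy_c_means(cloud)
labels = memberships.argmax(axis=1)                # hard assignment per point
```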
Procedia PDF Downloads 234
1020 Visual Thing Recognition with Binary Scale-Invariant Feature Transform and Support Vector Machine Classifiers Using Color Information
Authors: Wei-Jong Yang, Wei-Hau Du, Pau-Choo Chang, Jar-Ferr Yang, Pi-Hsia Hung
Abstract:
The demand for smart visual thing recognition in various devices has increased rapidly for daily smart production, living, and learning systems in recent years. This paper proposes a visual thing recognition system which combines binary scale-invariant feature transform (SIFT), the bag-of-words model (BoW), and support vector machine (SVM) classifiers by using color information. Since traditional SIFT features and SVM classifiers use only the gray information, color information remains an important feature for visual thing recognition. With color-based SIFT features and SVM, we can discard unreliable matching pairs and increase the robustness of matching tasks. The experimental results show that the proposed object recognition system with the color-assisted SIFT SVM classifier achieves a higher recognition rate than that with the traditional gray SIFT and SVM classification in various situations.
Keywords: color moments, visual thing recognition system, SIFT, color SIFT
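A minimal sketch of a color-assisted SIFT + bag-of-words + SVM pipeline of the kind described, using OpenCV's standard SIFT applied per color channel and a k-means codebook. The binary SIFT variant and the exact parameters used by the authors are not reproduced; vocabulary size and SVM settings are assumptions.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

sift = cv2.SIFT_create()

def color_sift(image_bgr):
    """Concatenate SIFT descriptors extracted from each color channel."""
    descs = []
    for channel in cv2.split(image_bgr):
        _, d = sift.detectAndCompute(channel, None)
        if d is not None:
            descs.append(d)
    return np.vstack(descs) if descs else np.empty((0, 128), np.float32)

def bow_histogram(descriptors, codebook):
    """Quantize descriptors against the codebook and build a normalized histogram."""
    words = codebook.predict(descriptors)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / (hist.sum() + 1e-8)

def train(images, labels, vocab_size=200):
    """Build the visual vocabulary, encode every image, and fit the SVM classifier."""
    all_desc = np.vstack([color_sift(im) for im in images])
    codebook = KMeans(n_clusters=vocab_size, n_init=4, random_state=0).fit(all_desc)
    X = np.array([bow_histogram(color_sift(im), codebook) for im in images])
    clf = SVC(kernel="rbf", C=10.0).fit(X, labels)
    return codebook, clf
```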
Procedia PDF Downloads 469
1019 Alignment and Antagonism in Flux: A Diachronic Sentiment Analysis of Attitudes towards the Chinese Mainland in the Hong Kong Press
Authors: William Feng, Qingyu Gao
Abstract:
Despite the extensive discussions about Hong Kong’s sentiments towards the Chinese Mainland since the sovereignty transfer in 1997, there has been no large-scale empirical analysis of the changing attitudes in the mainstream media, which both reflect and shape sentiments in the society. To address this gap, the present study uses an optimised semantic-based automatic sentiment analysis method to examine a corpus of news about China from 1997 to 2020 in three main Chinese-language newspapers in Hong Kong, namely Apple Daily, Ming Pao, and Oriental Daily News. The analysis shows that although the Hong Kong press had a positive emotional tone toward China in general, the overall trend of sentiment was becoming increasingly negative. Meanwhile, the alignment and antagonism toward China have both increased, providing empirical evidence of attitudinal polarisation in the Hong Kong society. Specifically, Apple Daily’s depictions of China have become increasingly negative, though with some positive turns before 2008, whilst Oriental Daily News has consistently expressed more favourable sentiments. Ming Pao maintained an impartial stance toward China through an increased but balanced representation of positive and negative sentiments, with its subjectivity and sentiment intensity growing to an industry-standard level. The results provide new insights into the complexity of sentiments towards China in the Hong Kong press and media attitudes in general in terms of the “us” and “them” positioning by explicating the cross-newspaper and cross-period variations using an enhanced sentiment analysis method which incorporates sentiment-oriented and semantic role analysis techniques.
Keywords: media attitude, sentiment analysis, Hong Kong press, one country two systems
Procedia PDF Downloads 121
1018 Heavy Vehicle Traffic Estimation Using Automatic Traffic Recorders/Weigh-In-Motion Data: Current Practice and Proposed Methods
Authors: Muhammad Faizan Rehman Qureshi, Ahmed Al-Kaisy
Abstract:
Accurate estimation of traffic loads is critical for pavement and bridge design, among other transportation applications. Given the disproportionate impact of heavier axle loads on pavement and bridge structures, truck and heavy vehicle traffic is expected to be a major determinant of traffic load estimation. Further, heavy vehicle traffic is also a major input in transportation planning and economic studies. The traditional method for estimating heavy vehicle traffic primarily relies on AADT estimation using Monthly Day of the Week (MDOW) adjustment factors as well as the percentage of heavy vehicles observed using statewide data collection programs. The MDOW factors are developed using daily and seasonal (or monthly) variation patterns for total traffic, consisting predominantly of passenger cars and other smaller vehicles. Therefore, while using these factors may yield reasonable estimates for total traffic (AADT), such estimates may involve a great deal of approximation when applied to heavy vehicle traffic. This research aims at assessing the approximation involved in estimating heavy vehicle traffic using MDOW adjustment factors for total traffic (the conventional approach) along with three other methods: MDOW adjustment factors for total trucks (classes 5-13), for combination-unit trucks (classes 8-13), and adjustment factors for each vehicle class separately. Results clearly indicate that the conventional method was outperformed by the other three methods by a large margin. Further, using the most detailed and data-intensive method (class-specific adjustment factors) does not necessarily yield a more accurate estimation of heavy vehicle traffic.
Keywords: traffic loads, heavy vehicles, truck traffic, adjustment factors, traffic data collection
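To make the adjustment-factor arithmetic concrete, the sketch below contrasts the conventional approach (total-traffic MDOW factors applied to a short count, then a statewide heavy-vehicle percentage) with a truck-specific expansion. All counts and factors are invented for illustration and are not taken from the study.

```python
# Hypothetical short-duration count taken on a weekday in July
short_count_total = 8400          # vehicles/day, all classes
short_count_trucks = 950          # vehicles/day, classes 5-13 only

# Hypothetical monthly day-of-week (MDOW) adjustment factors (AADT / MADW)
mdow_total = 0.92                 # derived from total-traffic patterns
mdow_trucks = 1.05                # derived from truck-only patterns
pct_heavy = 0.11                  # statewide average share of heavy vehicles

# Conventional method: expand total traffic, then apply a fixed heavy share
aadt_total = short_count_total * mdow_total
heavy_conventional = aadt_total * pct_heavy

# Class-based method: expand the truck count with truck-specific factors
heavy_class_based = short_count_trucks * mdow_trucks

print(f"conventional estimate : {heavy_conventional:,.0f} trucks/day")
print(f"class-based estimate  : {heavy_class_based:,.0f} trucks/day")
```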
Procedia PDF Downloads 23
1017 Hate Speech Detection in Tunisian Dialect
Authors: Helmi Baazaoui, Mounir Zrigui
Abstract:
This study addresses the challenge of hate speech detection in Tunisian Arabic text, a critical issue for online safety and moderation. Leveraging the strengths of the AraBERT model, we fine-tuned and evaluated its performance against the Bi-LSTM model across four distinct datasets: T-HSAB, TNHS, TUNIZI-Dataset, and a newly compiled dataset with diverse labels such as Offensive Language, Racism, and Religious Intolerance. Our experimental results demonstrate that AraBERT significantly outperforms Bi-LSTM in terms of Recall, Precision, F1-Score, and Accuracy across all datasets. The findings underline the robustness of AraBERT in capturing the nuanced features of Tunisian Arabic and its superior capability in classification tasks. This research not only advances the technology for hate speech detection but also provides practical implications for social media moderation and policy-making in Tunisia. Future work will focus on expanding the datasets and exploring more sophisticated architectures to further enhance detection accuracy, thus promoting safer online interactions.
Keywords: hate speech detection, Tunisian Arabic, AraBERT, Bi-LSTM, Gemini annotation tool, social media moderation
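A minimal Hugging Face fine-tuning sketch for an AraBERT-style sequence classifier of the kind evaluated here. The checkpoint name, label set, toy examples and hyperparameters are assumptions for illustration; the authors' datasets and training setup are not reproduced.

```python
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)
from datasets import Dataset

MODEL = "aubmindlab/bert-base-arabertv2"   # assumed AraBERT checkpoint
labels = ["neutral", "offensive", "racism", "religious_intolerance"]

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=len(labels))

# toy in-memory dataset standing in for T-HSAB / TNHS / TUNIZI examples
train_ds = Dataset.from_dict({"text": ["...tunisian post...", "...another post..."],
                              "label": [0, 1]})
train_ds = train_ds.map(lambda batch: tokenizer(batch["text"], truncation=True,
                                                padding="max_length", max_length=128),
                        batched=True)

args = TrainingArguments(output_dir="arabert-hate", num_train_epochs=3,
                         per_device_train_batch_size=16, learning_rate=2e-5)
Trainer(model=model, args=args, train_dataset=train_ds).train()
```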
Procedia PDF Downloads 11
1016 Adversarial Disentanglement Using Latent Classifier for Pose-Independent Representation
Authors: Hamed Alqahtani, Manolya Kavakli-Thorne
Abstract:
The large pose discrepancy is one of the critical challenges in face recognition during video surveillance. Due to the entanglement of pose attributes with identity information, conventional approaches for pose-independent representation fall short of providing quality results when recognizing largely posed faces. In this paper, we propose a practical approach to disentangle the pose attribute from the identity information, followed by synthesis of a face using a classifier network in latent space. The proposed approach employs a modified generative adversarial network framework consisting of an encoder-decoder structure embedded with a classifier in manifold space for carrying out factorization on the latent encoding. It can be further generalized to other face and non-face attributes for real-life video frames containing faces with significant attribute variations. Experimental results and comparison with the state of the art in the field prove that the learned representation of the proposed approach synthesizes more compelling perceptual images through a combination of adversarial and classification losses.
Keywords: disentanglement, face detection, generative adversarial networks, video surveillance
Procedia PDF Downloads 129
1015 Parameters Identification and Sensitivity Study for Abrasive WaterJet Milling Model
Authors: Didier Auroux, Vladimir Groza
Abstract:
This work is part of the STEEP Marie Curie ITN project, and it focuses on the identification of unknown parameters of the proposed generic Abrasive WaterJet Milling (AWJM) PDE model, which appears as an ill-posed inverse problem. The necessity of studying this problem comes from industrial milling applications, where the possibility to predict and model the final surface with high accuracy is one of the primary tasks in the absence of any knowledge of the model parameters that should be used. In this framework, we propose the identification of model parameters by minimizing a cost function measuring the difference between experimental and numerical solutions. The adjoint approach, based on the corresponding Lagrangian, gives the opportunity to find out the unknowns of the AWJM model and their optimal values that could be used to reproduce the required trench profile. Due to the complexity of the nonlinear problem and the large number of model parameters, we use an automatic differentiation software tool (TAPENADE) for the adjoint computations. By adding noise to the artificial data, we show that the parameter identification problem is in fact highly unstable and strictly depends on input measurements. Regularization terms can be effectively used to deal with the presence of data noise and to improve the identification correctness. Based on this approach, we present results in 2D and 3D of the identification of the model parameters and of the surface prediction, both with self-generated data and with measurements obtained from real production. Considering different types of model and measurement errors allows us to obtain acceptable results for manufacturing and to expect the proper identification of unknowns. This approach also gives us the ability to extend the research to more complex cases and to consider different types of model and measurement errors as well as a 3D time-dependent model with variations of the jet feed speed.
Keywords: Abrasive Waterjet Milling, inverse problem, model parameters identification, regularization
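An illustrative sketch of the parameter-identification step: minimizing a regularized least-squares cost between a modelled and a measured trench profile. The toy forward model, parameter set and regularization weight below are placeholders, not the AWJM PDE or the TAPENADE-generated adjoint used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

x = np.linspace(-2.0, 2.0, 200)                    # position across the trench (mm)

def forward_model(params, x):
    """Toy surrogate of a trench profile: depth and width of a Gaussian footprint."""
    depth, width = params
    return -depth * np.exp(-(x / width) ** 2)

# synthetic "measurement": true parameters plus noise (stands in for experimental data)
true_params = (0.8, 0.6)
observed = forward_model(true_params, x) + np.random.normal(0, 0.02, x.size)

def cost(params, alpha=1e-3):
    """Misfit between model and data plus a Tikhonov regularization term."""
    misfit = forward_model(params, x) - observed
    return 0.5 * np.sum(misfit ** 2) + alpha * np.sum(np.asarray(params) ** 2)

result = minimize(cost, x0=[0.5, 1.0], method="L-BFGS-B",
                  bounds=[(1e-3, 5.0), (1e-3, 5.0)])
print("identified parameters:", result.x)
```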
Procedia PDF Downloads 316
1014 To Determine the Effects of Regulatory Food Safety Inspections on the Grades of Different Categories of Retail Food Establishments across the Dubai Region
Authors: Shugufta Mohammad Zubair
Abstract:
This study explores the effect of the new food inspection system, also called the new inspection color card scheme, on the reduction of critical and major food safety violations in Dubai. Data were collected from all retail food service establishments located in two zones of the city. Each establishment was visited twice, once before the launch of the new system and once after the launch of the system. In each visit, the inspection checklist was used as the evaluation tool for the observation of critical and major violations. The old format of the inspection checklist was concerned with scores based on the violations, but the new format of the checklist for the new inspection color card scheme is divided into administrative, general, major, and critical violations, which gives a better classification for the inspectors to identify the critical and major violations of concern. The study found that there has been better and clearer marking of violations after the launch of the new inspection system, wherein the inspectors are able to mark and categorize the violations effectively. There was a 10% decrease in the number of food establishments that were previously given an A grade. The B and C grades also dropped considerably, by 5%.
Keywords: food inspection, risk assessment, color card scheme, violations
Procedia PDF Downloads 324
1013 Global Differences in Job Satisfaction of Healthcare Professionals
Authors: Jonathan H. Westover, Ruthann Cunningham, Jaron Harvey
Abstract:
Purpose: Job satisfaction is one of the most critical attitudes among employees. Understanding whether employees are satisfied with their jobs and what is driving that satisfaction is important for any employer, but particularly for healthcare organizations. This study looks at the question of job satisfaction and drivers of job satisfaction among healthcare professionals at a global scale, looking for trends that generalize across 37 countries. Study: This study analyzed job satisfaction responses to the 2015 Work Orientations IV wave of the International Social Survey Programme (ISSP) to understand differences in antecedents for and levels of job satisfaction among healthcare professionals. A total of 18,716 respondents from 37 countries participated in the annual survey. Findings: Respondents self-identified their occupational category based on corresponding International Standard Classification of Occupations (ISCO-08) codes. Results suggest that mean overall job satisfaction was highest among health service managers and generalist medical practitioners and lowest among environmental hygiene professionals and nursing professionals. Originality: Many studies have addressed the issue of job satisfaction in healthcare, examining small samples of specific healthcare workers. In this study, using a large international dataset, we are able to examine questions of job satisfaction across large groups of healthcare workers in different occupations within the healthcare field.
Keywords: job satisfaction, healthcare industry, global comparisons, workplace
Procedia PDF Downloads 145
1012 Review and Classification of the Indicators and Trends Used in Bridge Performance Modeling
Authors: S. Rezaei, Z. Mirzaei, M. Khalighi, J. Bahrami
Abstract:
Bridges, as an essential part of road infrastructure, are affected by various deterioration mechanisms over time, leading to changes in their performance. As changes in performance can have many negative impacts on society, it is essential to be able to evaluate and measure the performance of bridges throughout their life. This evaluation includes the development or the choice of appropriate performance indicators, which, in turn, are measured based on the selection of appropriate models for the existing deterioration mechanisms. The purpose of this article is a statistical study of the indicators and deterioration mechanisms of bridges in order to discover further research capacities in bridge performance assessment. For this purpose, some of the most common indicators of bridge performance, including reliability, risk, vulnerability, robustness, and resilience, were selected. The research performed on each indicator, based on the deterioration mechanisms and hazards of interest, was comprehensively reviewed. In addition, the formulation of the indicators and their relationships with each other were studied. The research conducted on the mentioned indicators was classified from the point of view of deterministic or probabilistic method, the level of study (element level, object level, etc.), and the type of hazard and deterioration mechanism of interest. For each of the indicators, a number of challenges and recommendations were presented according to the review of previous studies.
Keywords: bridge, deterioration mechanism, lifecycle, performance indicator
Procedia PDF Downloads 105
1011 Learning from Small Amount of Medical Data with Noisy Labels: A Meta-Learning Approach
Authors: Gorkem Algan, Ilkay Ulusoy, Saban Gonul, Banu Turgut, Berker Bakbak
Abstract:
Computer vision systems recently made a big leap thanks to deep neural networks. However, these systems require correctly labeled large datasets in order to be trained properly, which is very difficult to obtain for medical applications. Two main reasons for label noise in medical applications are the high complexity of the data and conflicting opinions of experts. Moreover, medical imaging datasets are commonly tiny, which makes each data point very important in learning. As a result, if not handled properly, label noise significantly degrades performance. Therefore, a label-noise-robust learning algorithm that makes use of the meta-learning paradigm is proposed in this article. The proposed solution is tested on a retinopathy of prematurity (ROP) dataset with a very high label noise of 68%. Results show that the proposed algorithm significantly improves the classification algorithm's performance in the presence of noisy labels.
Keywords: deep learning, label noise, robust learning, meta-learning, retinopathy of prematurity
Procedia PDF Downloads 161
1010 Deep Learning based Image Classifiers for Detection of CSSVD in Cacao Plants
Authors: Atuhurra Jesse, N'guessan Yves-Roland Douha, Pabitra Lenka
Abstract:
The detection of diseases within plants has attracted a lot of attention from computer vision enthusiasts. Despite the progress made in detecting diseases in many plants, there remains a research gap in training image classifiers to detect the cacao swollen shoot virus disease, or CSSVD for short, pertinent to cacao plants. This gap has mainly been due to the unavailability of high-quality labeled training data. Moreover, institutions have been hesitant to share their data related to CSSVD. To fill these gaps, image classifiers to detect CSSVD-infected cacao plants are presented in this study. The classifiers are based on VGG16, ResNet50 and Vision Transformer (ViT). The image classifiers are evaluated on the recently released and publicly accessible KaraAgroAI Cocoa dataset. The best performing image classifier, based on ResNet50, achieves 95.39% precision, 93.75% recall, 94.34% F1-score and 94% accuracy in only 20 epochs. There is a +9.75% improvement in recall when compared to previous works. These results indicate that the image classifiers learn to identify cacao plants infected with CSSVD.
Keywords: CSSVD, image classification, ResNet50, vision transformer, KaraAgroAI cocoa dataset
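A minimal PyTorch sketch of a ResNet50-based classifier of the kind described above, fine-tuned on labeled cacao images. The transforms, folder layout, dataset path and training-loop details are assumptions for illustration; only the 20-epoch budget comes from the abstract.

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

tfm = transforms.Compose([transforms.Resize((224, 224)),
                          transforms.ToTensor(),
                          transforms.Normalize([0.485, 0.456, 0.406],
                                               [0.229, 0.224, 0.225])])
# assumed folder layout: karaagroai_cocoa/train/<class_name>/*.jpg
train_ds = datasets.ImageFolder("karaagroai_cocoa/train", transform=tfm)
loader = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))  # new classification head
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(20):                       # the abstract reports 20 epochs
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```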
Procedia PDF Downloads 103
1009 Decision Tree Analysis of Risk Factors for Intravenous Infiltration among Hospitalized Children: A Retrospective Study
Authors: Soon-Mi Park, Ihn Sook Jeong
Abstract:
This retrospective study aimed to identify risk factors for intravenous (IV) infiltration in hospitalized children. The participants were 1,174 children in the test set and 424 children in the validation set who were admitted to a general hospital, received peripheral intravenous injection therapy at least once, and had complete records. Data were analyzed with frequencies and percentages or means and standard deviations, and decision tree analysis was used to screen for the most important risk factors for IV infiltration in hospitalized children. The decision tree analysis showed that the most important traditional risk factors for IV infiltration were the use of ampicillin/sulbactam, the IV insertion site (lower extremities), and the medical department (internal medicine), both in the test sample and the validation sample. The correct classification rate was 92.2% in the test sample and 90.1% in the validation sample. More careful attention should be paid to patients who are administered ampicillin/sulbactam, have an IV site in the lower extremities, and have internal medical problems in order to prevent or detect the occurrence of infiltration.
Keywords: decision tree analysis, intravenous infiltration, child, validation
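A minimal scikit-learn sketch of the kind of decision tree analysis reported above. The feature encoding, column names, split sizes and rows are illustrative assumptions, not the study's actual variables or data.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

# hypothetical encoded dataset: one row per child, binary outcome = infiltration
df = pd.DataFrame({
    "ampicillin_sulbactam": [1, 0, 1, 0, 1, 0, 0, 1],
    "iv_site_lower_extremity": [1, 1, 0, 0, 1, 0, 1, 0],
    "internal_medicine_dept": [1, 0, 1, 1, 0, 0, 1, 0],
    "infiltration": [1, 0, 1, 0, 1, 0, 1, 0],
})
X, y = df.drop(columns="infiltration"), df["infiltration"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=2, random_state=0)
tree.fit(X_tr, y_tr)
print(export_text(tree, feature_names=list(X.columns)))   # inspect the split rules
print("validation accuracy:", tree.score(X_te, y_te))
```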
Procedia PDF Downloads 176
1008 Framework for Detecting External Plagiarism from Monolingual Documents: Use of Shallow NLP and N-Gram Frequency Comparison
Authors: Saugata Bose, Ritambhra Korpal
Abstract:
The internet has increased copy-paste scenarios amongst students as well as amongst researchers, leading to different levels of plagiarized documents. For this reason, much research is focused on detecting plagiarism automatically. In this paper, an initiative is discussed where Natural Language Processing (NLP) techniques as well as supervised machine learning algorithms have been combined to detect plagiarized texts. Here, the major emphasis is on constructing a framework which detects external plagiarism from monolingual texts successfully. For successfully detecting the plagiarism, an n-gram frequency comparison approach has been implemented to construct the model framework. The framework is based on 120 characteristics which have been extracted during pre-processing of the documents using the NLP approach. Afterwards, filter metrics have been applied to select the most relevant characteristics, and then a supervised classification learning algorithm has been used to classify the documents into four levels of plagiarism. A confusion matrix was built to estimate the false positives and false negatives. Our plagiarism framework achieved a very high accuracy score.
Keywords: lexical matching, shallow NLP, supervised machine learning algorithm, word n-gram
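A minimal sketch of the word n-gram frequency comparison at the core of such a framework, computing a containment-style overlap score between a suspicious and a source document. The choice of n and the interpretation of the score are illustrative assumptions, not the 120 characteristics or classifier used in the paper.

```python
import re
from collections import Counter

def word_ngrams(text, n=3):
    """Lowercase, tokenize and return the multiset of word n-grams."""
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def containment(suspicious, source, n=3):
    """Share of the suspicious document's n-grams that also occur in the source."""
    s, r = word_ngrams(suspicious, n), word_ngrams(source, n)
    if not s:
        return 0.0
    overlap = sum(min(count, r[gram]) for gram, count in s.items())
    return overlap / sum(s.values())

src = "Plagiarism detection compares overlapping n-grams between two documents."
sus = "Detection of plagiarism compares overlapping n-grams between two documents today."
print(round(containment(sus, src), 2))   # higher values suggest heavier reuse
```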
Procedia PDF Downloads 358
1007 Mineralogy and Classification of Altered Host Rocks in the Zaghia Iron Oxide Deposit, East of Bafq, Central Iran
Authors: Azat Eslamizadeh, Neda Akbarian
Abstract:
The Zaghia iron ore, 15 km east of the town of Bafq, is located in the Precambrian formation of Central Iran in the form of a small local deposit. Volcano-sedimentary rocks of Precambrian-Cambrian age, belonging to the Rizu series, are spread throughout the region. A substantial portion of the deposit is covered by alluvial deposits. The rocks hosting the Zaghia iron ore consist mainly of rhyolitic tuffs along with clastic and carbonate sediments, including sandstone, limestone, dolomite and conglomerate, and are somewhat metamorphosed, so that they appear as slate and phyllite. Moreover, carbonate rocks occur as a skarn compound of tremolite-bearing marble with magnetite-hematite mineralization. The basic igneous rocks have been dramatically altered into green rocks consisting of actinolite-tremolite and chlorite, along with some iron (magnetite + martite). The youngest units of the ore-bearing rocks in the area are found as dolerite-diabase dikes. The dikes cut the rhyolitic tuffs and carbonate rocks.
Keywords: Zaghia, iron ore deposit, mineralogy, petrography, Bafq, Iran
Procedia PDF Downloads 524
1006 An Online Adaptive Thresholding Method to Classify Google Trends Data Anomalies for Investor Sentiment Analysis
Authors: Duygu Dere, Mert Ergeneci, Kaan Gokcesu
Abstract:
Google Trends data has gained increasing popularity in applications of behavioral finance, decision science, and risk management. Because of Google's wide range of use, the Trends statistics provide significant information about investor sentiment and intention, which can be used as decisive factors in corporate and risk management. However, an anomaly, i.e., a significant increase or decrease in a certain query, cannot be detected by state-of-the-art computational applications due to the random baseline noise of the Trends data, which is modelled as additive white Gaussian noise (AWGN). Since the baseline noise power changes gradually over time, an adaptive thresholding method is required to track and learn the baseline noise for a correct classification. To this end, we introduce an online method to classify meaningful deviations in Google Trends data. Through extensive experiments, we demonstrate that our method can successfully classify various anomalies for plenty of different data.
Keywords: adaptive data processing, behavioral finance, convex optimization, online learning, soft minimum thresholding
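A minimal sketch of an online adaptive threshold of the kind described: the baseline mean and noise power are tracked with exponential forgetting, and a query value is flagged when it deviates by more than k estimated standard deviations. The forgetting factor, the warm-up length and k are illustrative assumptions, not the paper's optimized settings.

```python
import math

class OnlineAnomalyClassifier:
    """Track an AWGN-like baseline and flag significant deviations on the fly."""
    def __init__(self, forget=0.05, k=3.0):
        self.forget, self.k = forget, k
        self.mean, self.var, self.seen = 0.0, 1.0, 0

    def update(self, x):
        if self.seen < 5:                               # warm-up on the first samples
            self.seen += 1
            self.mean += (x - self.mean) / self.seen
            return False
        is_anomaly = abs(x - self.mean) > self.k * math.sqrt(self.var)
        if not is_anomaly:                              # learn only from baseline samples
            self.mean = (1 - self.forget) * self.mean + self.forget * x
            self.var = (1 - self.forget) * self.var + self.forget * (x - self.mean) ** 2
        return is_anomaly

clf = OnlineAnomalyClassifier()
series = [50, 52, 49, 51, 50, 53, 48, 95, 51, 50]       # toy weekly Trends values
print([clf.update(v) for v in series])                   # the spike at 95 is flagged
```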
Procedia PDF Downloads 167
1005 Recovery of Dredged Sediments With Lime or Cement as Platform Materials for Use in a Roadway
Authors: Abriak Yassine, Zri Abdeljalil, Benzerzour Mahfoud., Hadj Sadok Rachid, Abriak Nor-Edine
Abstract:
In this study, the capacity to reuse dredged sediments treated with lime or cement in the establishment layer and the base layer of a roadway was first studied. The mineral changes caused by the addition of lime or cement were also analyzed, as reflected in the mechanical results of the stabilised sediments. After determining the quantity of lime and cement required to stabilise the sediment, the compaction characteristics were studied using the modified Proctor method, and the evolution of the compaction parameters, that is, the optimal water content and the maximum dry density, was determined. Mechanical performance can be assessed through the resistance to compression, the elastic modulus, and the resistance under traction. The resistance of the formulations treated with the cement addition (ROLAC®645) increases with the quantity of ROLAC®645. Traction resistances and the elastic modulus were used to assess the potential of the formulations as road construction materials using a classification diagram. The results show that the various formulations with ROLAC®645 may be employed in subgrades and foundation layers for roads.
Keywords: cement, dredged sediment, foundation layer, resistance
Procedia PDF Downloads 100
1004 Decomposition of Funds Transfer Pricing Components in Islamic Bank: The Exposure Effect of Shariah Non-Compliant Event Rectification Process
Authors: Azrul Azlan Iskandar Mirza
Abstract:
The purpose of Funds Transfer Pricing (FTP) for an Islamic bank is to promote prudent liquidity risk-taking behavior by business units. The acquirer of stable deposits will be rewarded, whilst a business unit that generates long-term assets will be charged for the added liquidity funding risks. In the end, it promotes risk-adjusted pricing by incorporating profit rate risk and liquidity risk components in the product pricing. However, in the case of a Shariah non-compliant event (SNCE), the FTP components will be examined in the rectification plan, especially when Islamic banks need to purify the non-compliant income. The finding shows that the determination between actual and provision cost will defer the decision among the Shariah committees of Islamic banks. This paper reviews each of the FTP components to ensure that the classification of actual and provision costs reflects the decision on the rectification process for SNCE. This will benefit future decisions and their consistency across Islamic banks.
Keywords: fund transfer pricing, Islamic banking, Islamic finance, shariah non-compliant event
Procedia PDF Downloads 195
1003 Second-Order Complex Systems: Case Studies of Autonomy and Free Will
Authors: Eric Sanchis
Abstract:
Although there does not exist a definitive consensus on a precise definition of a complex system, it is generally considered that a system is complex by nature. The presented work illustrates a different point of view: a system becomes complex only with regard to the question posed to it, i.e., with regard to the problem which has to be solved. A complex system is a couple (question, object). Because the number of questions posed to a given object can be potentially substantial, complexity does not present a uniform face. Two types of complex systems are clearly identified: first-order complex systems and second-order complex systems. First-order complex systems physically exist. They are well-known because they have been studied by the scientific community for a long time. In second-order complex systems, complexity results from the system composition and its articulation that are partially unknown. For some of these systems, there is no evidence of their existence. Vagueness is the keyword characterizing this kind of systems. Autonomy and free will, two mental productions of the human cognitive system, can be identified as second-order complex systems. A classification based on the properties structure makes it possible to discriminate complex properties from the others and to model this kind of second order complex systems. The final outcome is an implementable synthetic property that distinguishes the solid aspects of the actual property from those that are uncertain.
Keywords: autonomy, free will, synthetic property, vaporous complex systems
Procedia PDF Downloads 205
1002 Facies, Diagenetic Analysis and Sequence Stratigraphy of Habib Rahi Formation Dwelling in the Vicinity of Jacobabad Khairpur High, Southern Indus Basin, Pakistan
Authors: Muhammad Haris, Syed Kamran Ali, Mubeen Islam, Tariq Mehmood, Faisal Shah
Abstract:
Jacobabad Khairpur High, part of the Sukkur rift zone, is the boundary separating the Central and Southern Indus Basins; it formed as a result of post-Jurassic uplift after the deposition of the Middle Jurassic Chiltan Formation. The Habib Rahi Formation of Middle to Late Eocene age outcrops in the vicinity of the Jacobabad Khairpur High, and a section at Rohri near Sukkur was measured in detail for lithofacies, microfacies, diagenetic analysis and sequence stratigraphy. The Habib Rahi Formation is richly fossiliferous and consists mostly of limestone with subordinate clays and marl. The total thickness of the formation in this section is 28.8 m. The bottom of the formation is not exposed, while the upper contact with the Sirki Shale of Middle Eocene age is unconformable in some places. The section was measured using the Jacob's staff method, and traverses were made perpendicular to the strike. Four different lithofacies were identified based on outcrop geology, which include coarse-grained limestone facies (HR-1 to HR-5), massive bedded limestone facies (HR-6 to HR-7), micritic limestone facies (HR-8 to HR-13) and algal dolomitic limestone facies (HR-14). A total of 14 rock samples were collected from the outcrop for detailed petrographic studies, and thin sections of the respective samples were prepared and analyzed under the microscope. On the basis of Dunham's (1962) classification system, after studying textures, grain size and fossil content, and Folk's (1959) classification system, after reviewing allochem types, four microfacies were identified. These microfacies include HR-MF 1: Benthonic Foraminiferal Wackestone/Biomicrite Microfacies, HR-MF 2: Foraminiferal Nummulites Wackestone-Packstone/Biomicrite Microfacies, HR-MF 3: Benthonic Foraminiferal Packstone/Biomicrite Microfacies, and HR-MF 4: Bioclasts Carbonate Mudstone/Micrite Microfacies. The abundance of larger benthic foraminifera (LBF), including Assilina sp., A. spiral abrade, A. granulosa, A. dandotica, A. laminosa, Nummulites sp., N. fabiani, N. stratus, N. globulus, Textularia, bioclasts, and red algae, indicates a shallow marine (tidal flat) environment of deposition. Based on variations in rock types, grain size, and marine fauna, the Habib Rahi Formation shows progradational stacking patterns, which indicate coarsening-upward cycles. A second-order sea-level rise is identified (spanning from the Ypresian to the Bartonian age) that represents the Transgressive Systems Tract (TST), along with a third-order Regressive Systems Tract (RST) (spanning from the Bartonian to the Priabonian age). Diagenetic processes include the replacement of fossils by mud, dolomitization, pressure-dissolution-associated stylolite features, and filling with dark organic matter. The presence of microfossils including Nummulites striatus, N. fabiani, and Assilina dandotica signifies a Bartonian to Priabonian age for the Habib Rahi Formation.
Keywords: Jacobabad Khairpur High, Habib Rahi Formation, lithofacies, microfacies, sequence stratigraphy, diagenetic history
Procedia PDF Downloads 473
1001 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data
Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone
Abstract:
The aim of this work was to evaluate how well Random Forest (RF) and Support Vector Machine (SVM) algorithms could support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we wanted to explore the ability to distinguish between controls and patients using the mean signals extracted from ICA components corresponding to 15 well-known networks. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the McDonald and Polman criteria and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired by a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsFMRI), 200 volumes. The estimated total lesion load (ml) and the number of lesions were calculated using the LST toolbox from the corrected T1 and FLAIR. All rsFMRIs were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding of the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128 s), (5) spatial smoothing with a Gaussian kernel of FWHM 8 mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidean distance and the mean Euler angle. WM and CSF signals, together with 6 motion parameters, were regressed out from the time series. We applied an independent component analysis (ICA) with the GIFT toolbox using the Infomax approach with the number of components set to 21. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in the network), with the R language. The dataset was randomly split into training (75%) and test sets, and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (rfe) for the SVM, to obtain a rank of the most predictive variables. Thus, we built two new classifiers only on the most important features, and we evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and rfe-SVM was performed, the most important variable was the sensori-motor network I in both cases. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient turned out to have the lowest lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best discriminant network between controls and early MS was the sensori-motor I. Similar importance values were obtained for the sensori-motor II, cerebellum and working memory networks. These findings, in accordance with the early manifestation of motor/sensory deficits in MS, could represent an encouraging step toward translation to clinical diagnosis and prognosis.
Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine
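A minimal scikit-learn sketch of the two feature-selection-plus-classification pipelines compared above (37 subjects × 15 network features). The synthetic data, hyperparameters and the linear-kernel SVM used for the elimination step are placeholders and simplifications, not the study's rsFMRI measurements or its R implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(37, 15))                    # mean signal of 15 ICA networks
y = np.array([0] * 19 + [1] * 18)                # 19 controls, 18 early-MS patients
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y,
                                          random_state=0)

# Random Forest: rank features by Gini importance, then refit on the top one(s)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
top = np.argsort(rf.feature_importances_)[::-1][:1]      # e.g. keep the best network
rf_top = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr[:, top], y_tr)
print("RF accuracy on selected feature:", rf_top.score(X_te[:, top], y_te))

# SVM: recursive feature elimination needs a linear kernel to expose coefficients,
# then an RBF-SVM is refit on the retained features
rfe = RFE(SVC(kernel="linear"), n_features_to_select=1).fit(X_tr, y_tr)
svm_top = SVC(kernel="rbf").fit(X_tr[:, rfe.support_], y_tr)
print("SVM accuracy on selected feature:", svm_top.score(X_te[:, rfe.support_], y_te))
```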
Procedia PDF Downloads 240