Search results for: automatic attendance
785 Analysis of Urban Rail Transit Station's Accessibility Reliability: A Case Study of Hangzhou Metro, China
Authors: Jin-Qu Chen, Jie Liu, Yong Yin, Zi-Qi Ju, Yu-Yao Wu
Abstract:
Increases in travel fare and station failures have a large impact on passengers' travel. This paper analyzes the accessibility reliability of Urban Rail Transit (URT) stations under fare increases and station failures. Firstly, passengers' travel paths are reconstructed based on stochastic user equilibrium and Automatic Fare Collection (AFC) data. Secondly, station importance is calculated by combining the LeaderRank algorithm with the Ratio of Station Affected Passenger Volume (RSAPV), and station accessibility evaluation indicators are proposed based on an analysis of passengers' travel characteristics. Thirdly, station accessibility under different scenarios is measured, and the rate of accessibility change is proposed as the indicator of station accessibility reliability. Finally, the accessibility of Hangzhou metro stations is analyzed with the formulated models. The results show that Jinjiang station and Liangzhu station are, respectively, the most important and the most convenient stations in the Hangzhou metro. Station failure, alone or combined with a fare increase, has a large impact on station accessibility, whereas a fare increase alone does not. Stations on Hangzhou metro Line 1 have relatively poor accessibility reliability, and Fengqi Road station's accessibility reliability is the weakest. For the Hangzhou metro operational department, constructing new metro lines around Line 1 and giving priority to protecting Line 1's stations can effectively improve the accessibility reliability of the Hangzhou metro.
Keywords: automatic fare collection data, AFC, station accessibility reliability, stochastic user equilibrium, urban rail transit, URT
Procedia PDF Downloads 135
784 Population Dynamics and Land Use/Land Cover Change on the Chilalo-Galama Mountain Range, Ethiopia
Authors: Yusuf Jundi Sado
Abstract:
Changes in land use are mostly attributed to human actions, which result in negative impacts on biodiversity and ecosystem functions. This study analyzes the dynamics of land use and land cover change on the Chilalo-Galama Mountain Range, Ethiopia, to support sustainable natural resource planning and management. It used Landsat 5 Thematic Mapper (TM) data for 1986 and 2001 and Landsat 8 (OLI) data for 2017. Additionally, data from the Central Statistics Agency on human population growth were analyzed. The Semi-Automatic Classification Plugin (SCP) in QGIS 3.2.3 was used for image classification; Global Positioning System readings, field observations, and focus group discussions were used for ground verification. Land Use/Land Cover (LU/LC) change analysis was performed using maximum likelihood supervised classification, and changes were calculated for the 1986-2001, 2001-2017, and 1986-2017 periods. The results show that agricultural land increased from 27.85% (1986) to 44.43% in 2001 and 51.32% in 2017, with overall accuracies of 92% (1986), 90.36% (2001), and 88% (2017). On the other hand, forest decreased from 8.51% (1986) to 7.64% (2001) and 4.46% (2017), and grassland decreased from 37.47% (1986) to 15.22% in 2001 and 15.01% in 2017. For the period 1986-2017, the largest gain in agricultural land came from grassland. The change matrix also shows that shrubland gained land from agricultural land, afro-alpine, and forest land. Population dynamics is found to be one of the major driving forces of the LU/LC changes in the study area.
Keywords: Landsat, LU/LC change, Semi-Automatic Classification Plugin, population dynamics, Ethiopia
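Statements such as "the largest gain of agricultural land came from grassland" come from a change (transition) matrix that cross-tabulates the two classified maps pixel by pixel. A minimal sketch of that cross-tabulation follows; the class labels and the tiny toy maps are illustrative, not the study's data.

```python
from collections import Counter

def change_matrix(map_t1, map_t2):
    """Cross-tabulate two pixel-aligned classified land-cover maps.

    Returns a Counter keyed by (class_at_t1, class_at_t2); off-diagonal
    entries are transitions, e.g. ('grass', 'agri') counts grassland
    pixels converted to agriculture between the two dates."""
    assert len(map_t1) == len(map_t2), "maps must be pixel-aligned"
    return Counter(zip(map_t1, map_t2))

# Toy 3x3 'maps' flattened to lists (class labels are placeholders).
t1 = ["grass", "grass", "forest", "grass", "agri", "forest", "grass", "agri", "shrub"]
t2 = ["agri",  "grass", "forest", "agri",  "agri", "agri",   "grass", "agri", "shrub"]
m = change_matrix(t1, t2)
```

Summing a column of the matrix gives a class's area at the second date, so the same table yields both the percentage trends and the gain/loss statements quoted in the abstract.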
Procedia PDF Downloads 85
783 Factors Associated with Suicidal Ideation among Undergraduate College Students
Authors: Samantha Vennice G. Sarcia
Abstract:
A person dies every 40 seconds throughout the world due to suicide-related behaviors, and suicidal ideation is a strong precursor to suicide completion, making it one of the major health challenges faced by the world today. The present study investigated the influence of personality traits and socio-demographic characteristics in predicting suicidal ideation. Using the Suicide Behaviors Questionnaire-Revised and the Big Five Inventory, the degree of suicidal ideation and the associated personality traits were identified. Among 194 students from allied health courses, the findings suggest that the college students are at risk and have passive thoughts about suicide. Using multiple regression analysis, significant relationships were identified between suicidal ideation and the number of persons in the household, living arrangement, attendance at church activities, agreeableness, conscientiousness, and neuroticism. These findings can help in the development of campus-based suicide prevention programs.
Keywords: depression, personality traits, suicidal ideation, suicide
Procedia PDF Downloads 225
782 Automatic and High Precise Modeling for System Optimization
Authors: Stephanie Chen, Mitja Echim, Christof Büskens
Abstract:
To describe and propagate the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients to the underlying laws of science. For complex systems this approach can be incomplete, hence imprecise, and moreover too slow to compute efficiently. Such models may therefore not be applicable to the numerical optimization of real systems, since optimization techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification may be available, so the model must be adapted manually. Therefore, an approach is described that generates models overcoming the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general, since it generates models for any system, detached from the scientific background; additionally, it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables, not only of single variables. This enables a far more precise representation of causal correlations. The basis and justification of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. Herewith, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has been tested successfully in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent, and the automatic identification of correlations discovered previously unknown relationships. In summary, the approach efficiently computes highly precise, real-time-adaptive, data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), complex systems can be represented by a high-precision model and optimized according to the user's wishes. The proposed methods are illustrated with different examples.
Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization
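The core idea above, a regression whose basis also includes products of variables as second-order terms of a series expansion, can be illustrated with a small least-squares sketch. Everything here (the two-variable model form, the data points) is an illustrative assumption, not the authors' implementation.

```python
def fit_product_model(xs, ys):
    """Least-squares fit of y ~ c0 + c1*x1 + c2*x2 + c3*(x1*x2).

    Including the product term x1*x2 lets the model capture interactions
    that a regression on single variables alone would miss. Solves the
    normal equations (A^T A) c = A^T y by Gauss-Jordan elimination."""
    A = [[1.0, x1, x2, x1 * x2] for x1, x2 in xs]  # design matrix
    n, m = 4, len(A)
    ata = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
           for i in range(n)]
    aty = [sum(A[k][i] * ys[k] for k in range(m)) for i in range(n)]
    M = [row + [b] for row, b in zip(ata, aty)]  # augmented system
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Noise-free data generated from y = 2 + 3*x1 - x2 + 0.5*x1*x2.
pts = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1), (1, 2), (2, 3), (3, 2)]
ys = [2 + 3 * x1 - x2 + 0.5 * x1 * x2 for x1, x2 in pts]
coeffs = fit_product_model(pts, ys)
```

Because the generating function lies in the model's basis, the fit recovers the coefficients exactly; on noisy sensor data, the same machinery would return the best least-squares approximation instead.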
Procedia PDF Downloads 409
781 The Increasing Importance of the Role of AI in Higher Education
Authors: Joshefina Bengoechea Fernandez, Alex Bell
Abstract:
In its 2021 guidance for policy makers, UNESCO proposed four areas where AI can be applied in educational settings: 1) education management and delivery; 2) learning and assessment; 3) empowering teachers and facilitating teaching; and 4) providing lifelong learning possibilities (UNESCO, 2021). As with blockchain technologies, AI will automate the management of educational institutions, including, but not limited to, admissions, timetables, attendance, and homework monitoring. Furthermore, AI will be used to select relevant learning content across learning platforms for each student, based on his or her personalized needs. A problem educators face is the "one-size-fits-all" approach that does not work with a diverse student population. The purpose of this paper is to examine whether the implementation of technology is the solution to the problems faced in higher education. The paper builds upon a constructivist approach, combining a literature review with research on key publications and academic reports.
Keywords: artificial intelligence, learning platforms, students' personalised needs, lifelong learning, privacy, ethics
Procedia PDF Downloads 104
780 A Case Study in Using Gamification in the Mobile Computing Course
Authors: Rula Al Azawi, Abobaker Shafi
Abstract:
The purpose of this paper is to use gamification technology in the mobile computing course to increase students' motivation and engagement. The students were asked to design an educational game for six-year-old children, so that they would learn the course material in a fun way. Our case study was implemented at Gulf College, which is affiliated with Staffordshire University, UK. The design assignment taught students Android Studio software through building an educational game. Our goals with gamification are to improve student attendance and to increase student engagement, problem solving, and user satisfaction. Finally, we describe the findings and results of our case study. The data analysis and evaluation are based on student feedback, staff feedback, and the students' final grades.
Keywords: gamification, educational game, Android Studio software, students' motivation and engagement
Procedia PDF Downloads 455
779 Analysis of Initial Entry-Level Technology Course Impacts on STEM Major Selection
Authors: Ethan Shafer, Timothy Graziano
Abstract:
This research seeks to answer whether first-year courses at institutions of higher learning can impact STEM major selection. Unlike at many universities, an entry-level technology course (often referred to as CS0) is required for all United States Military Academy (USMA) students, regardless of major, in their first year of attendance. Students at the academy choose their major at the end of their first year of studies. Through student responses to a multi-semester survey, this paper identifies a number of factors that potentially influence STEM major selection. Student demographic data, pre-existing exposure and access to technology, perceptions of STEM subjects, and initial desire for a STEM major are captured before and after taking a CS0 course. An analysis of the factors that contribute to student perception of STEM and major selection is presented. This work provides recommendations and suggestions for institutions currently providing or looking to provide CS0-like courses to their students.
Keywords: education, STEM, pedagogy, digital literacy
Procedia PDF Downloads 121
778 Statistical Feature Extraction Method for Wood Species Recognition System
Authors: Mohd Iz'aan Paiz Bin Zamri, Anis Salwa Mohd Khairuddin, Norrima Mokhtar, Rubiyah Yusof
Abstract:
Effective statistical feature extraction and classification are important in image-based automatic inspection and analysis. An automatic wood species recognition system is designed to perform wood inspection at customs checkpoints to avoid mislabeling of timber, which results in loss of income to the timber industry. The system focuses on analyzing the statistical properties of pores in wood images. This paper proposes a fuzzy-based feature extractor which mimics the experts' knowledge of wood texture to extract the properties of pore distribution from the wood surface texture. The proposed feature extractor consists of two steps, namely pore extraction and fuzzy pore management. The total number of statistical features extracted from each wood image is 38. A backpropagation neural network is then used to classify the wood species based on the statistical features. A comprehensive set of experiments on a database of 5,200 macroscopic images from 52 tropical wood species was used to evaluate the performance of the proposed feature extractor. The advantage of the proposed feature extraction technique is that it mimics the experts' interpretation of wood texture, which allows human involvement when analyzing the texture. Experimental results show the efficiency of the proposed method.
Keywords: classification, feature extraction, fuzzy, inspection system, image analysis, macroscopic images
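The "fuzzy pore management" step can be pictured as assigning each detected pore graded memberships in expert-style size categories, then aggregating those memberships into features. The sketch below is hypothetical: the breakpoints, category names, and aggregation are assumptions for illustration, not the paper's 38-feature design.

```python
def ramp(x, a, b):
    """Clamped linear ramp: 0 at or below a, 1 at or above b."""
    return max(0.0, min(1.0, (x - a) / (b - a)))

def pore_memberships(area):
    """Fuzzy degrees to which one pore (area in pixels) is small,
    medium, or large. Breakpoints are illustrative placeholders."""
    return {
        "small": 1.0 - ramp(area, 10, 40),
        "medium": min(ramp(area, 10, 40), 1.0 - ramp(area, 60, 100)),
        "large": ramp(area, 60, 100),
    }

def fuzzy_pore_counts(areas):
    """Aggregate memberships over all detected pores into three
    soft-count features, usable as classifier inputs."""
    totals = {"small": 0.0, "medium": 0.0, "large": 0.0}
    for a in areas:
        for k, v in pore_memberships(a).items():
            totals[k] += v
    return totals

counts = fuzzy_pore_counts([50, 5])  # one clearly medium, one clearly small pore
```

Because memberships are graded rather than hard thresholds, a pore near a boundary contributes partially to two categories, which is the sense in which the extractor mimics an expert's soft judgement.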
Procedia PDF Downloads 425
777 An Evaluation of a First Year Introductory Statistics Course at a University in Jamaica
Authors: Ayesha M. Facey
Abstract:
The evaluation sought to determine the factors associated with the high failure rate among students taking a first-year introductory statistics course. Guided by Tyler's Objectives-Based Model, the main objectives were: to assess the effectiveness of the lecturer's teaching strategies; to determine the proportion of students who attend lectures and tutorials frequently and the impact of infrequent attendance on performance; to determine how the assigned activities assisted students' understanding of the course content; to ascertain the issues students face in understanding the course material and obtain possible solutions to these challenges; and to determine whether the learning outcomes had been achieved, based on an assessment of the second in-course examination. A quantitative survey research strategy was employed, and the study population was students enrolled in semester one of the academic year 2015/2016. A convenience sampling approach yielded a sample of 98 students. Primary data were collected using self-administered questionnaires over a one-week period; secondary data were obtained from the results of the second in-course examination. Data were entered and analyzed in SPSS version 22, and both univariate and bivariate analyses were conducted on the information obtained from the questionnaires. Univariate analyses described the sample through means, standard deviations, and percentages, while bivariate analyses used Spearman's rho correlation coefficient and chi-square tests. For the secondary data, an item analysis was performed to obtain the reliability of the examination questions, the difficulty index, and the discrimination index. The examination results also provided information on the students' weak areas and highlighted the learning outcomes that were not achieved.
Findings revealed that students were more likely to participate in lectures than tutorials, and that attendance was high for both. There was a significant relationship between participation in lectures and performance on the examination. However, a high proportion of students had been absent from three or more tutorials as well as lectures. A higher proportion of students indicated that they sometimes completed the assignments given in lectures, while they rarely completed tutorial worksheets. Students who were more likely to complete their assignments were significantly more likely to perform well on the examination. Additionally, students faced a number of challenges in understanding the course content, with probability, the binomial distribution, and the normal distribution being the most challenging topics; the item analysis also highlighted these as problem areas. Difficulty with the mathematics and with application and analysis were the major challenges faced by students, and most indicated that these could be alleviated if additional examples were worked in lectures and more time were given to solve questions. Analysis of the examination results showed that a number of learning outcomes were not achieved for several topics. Based on the findings, recommendations were made suggesting adjustments to grade allocations, the delivery of lectures, and the methods of assessment.
Keywords: evaluation, item analysis, Tyler's objectives-based model, university statistics
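The item analysis mentioned above conventionally computes, per question, a difficulty index (the proportion of students answering correctly) and a discrimination index (how much better high-scoring students do on the item than low-scoring ones). A minimal sketch under that classical-test-theory convention follows; the 27% upper/lower split and the toy scores are assumptions, not the study's data.

```python
def item_analysis(responses, totals, frac=0.27):
    """Classical item analysis for one dichotomously scored exam item.

    responses: 1/0 correctness on this item, per student.
    totals: each student's total exam score (used to rank ability).
    Returns (difficulty index p, discrimination index D), where D is the
    item's correct rate in the top `frac` of students minus the rate in
    the bottom `frac`."""
    n = len(responses)
    p = sum(responses) / n  # difficulty index: proportion correct
    order = sorted(range(n), key=lambda i: totals[i])  # low to high ability
    k = max(1, int(round(frac * n)))
    low = [responses[i] for i in order[:k]]
    high = [responses[i] for i in order[-k:]]
    d = sum(high) / k - sum(low) / k
    return p, d

# Toy data: the item is answered correctly mainly by high scorers.
resp = [1, 1, 1, 0, 0, 0, 1, 0, 1, 1]
tot = [90, 85, 80, 40, 35, 30, 70, 20, 95, 88]
p, d = item_analysis(resp, tot)
```

An item with D near 1 separates strong from weak students well; a difficult item (low p) with low D is the kind the evaluation would flag as a problem area.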
Procedia PDF Downloads 190
776 Effects of Cattaneo-Christov Heat Flux on 3D Magnetohydrodynamic Viscoelastic Fluid Flow with Variable Thermal Conductivity
Authors: Muhammad Ramzan
Abstract:
A mathematical model is envisaged to discuss three-dimensional viscoelastic fluid flow under the effect of Cattaneo-Christov heat flux in the presence of magnetohydrodynamics (MHD). Variable thermal conductivity with the impact of homogeneous-heterogeneous reactions and a convective boundary condition is also taken into account. The homotopy analysis method is employed to obtain series solutions. Graphical illustrations depicting the behaviour of sundry parameters on the skin friction coefficient and all involved distributions are also given. It is observed that the velocity components are decreasing functions of the viscoelastic fluid parameter. Furthermore, the strengths of the homogeneous and heterogeneous reactions have opposite effects on the concentration distribution. A comparison with a published paper has also been made, and an excellent agreement is obtained; hence reliable results are presented.
Keywords: Cattaneo-Christov heat flux, homogeneous-heterogeneous reactions, magnetic field, variable thermal conductivity
Procedia PDF Downloads 197
775 Automatic Differentiation of Ultrasonic Images of Cystic and Solid Breast Lesions
Authors: Dmitry V. Pasynkov, Ivan A. Egoshin, Alexey A. Kolchev, Ivan V. Kliouchkin
Abstract:
In most cases, typical cysts are easily recognized at ultrasonography. The specificity of this method for typical cysts reaches 98%, and it is usually considered the gold standard for typical cyst diagnosis. However, all of the following features are necessary to conclude a typical cyst: a clear margin, the absence of internal echoes, and dorsal acoustic enhancement. At the same time, not every breast cyst is typical. This is especially characteristic of protein-containing cysts, which may have significant internal echoes. On the other hand, some solid lesions (predominantly malignant) may have a cystic appearance and may be falsely accepted as cysts. Therefore, we tried to develop an automatic method for differentiating cystic and solid breast lesions. Materials and methods. The input data were digital ultrasonography images with 256 gradations of gray (Medison SA8000SE, Siemens X150, Esaote MyLab C). Identification of the lesion on these images was performed in two steps. In the first, the region of interest (the contour of the lesion) was searched for and selected. Selection of this region is carried out using a sigmoid filter, where the threshold is calculated according to the empirical distribution function of the image brightness and, if necessary, corrected according to the average brightness of the image points with the highest brightness gradient. In the second step, the selected region was assigned to one of the lesion groups by the statistical characteristics of its brightness distribution. The following characteristics were used: entropy, coefficients of linear and polynomial regression, quantiles of different orders, the average gradient of brightness, etc. To determine the decisive criterion of belonging to one of the lesion groups (cystic or solid), a training set of these characteristics of the brightness distribution was obtained separately for benign and malignant lesions.
To test our approach, we used a set of 217 ultrasonic images of 107 cystic (including 53 atypical, difficult for bare-eye differentiation) and 110 solid lesions. All lesions were cytologically and/or histologically confirmed. Visual identification was performed by a trained specialist in breast ultrasonography. Results. Our system correctly distinguished all (107, 100%) typical cysts, 107 of 110 (97.3%) solid lesions, and 50 of 53 (94.3%) atypical cysts. By contrast, with the bare eye it was possible to correctly identify all (107, 100%) typical cysts, 96 of 110 (87.3%) solid lesions, and 32 of 53 (60.4%) atypical cysts. Conclusion. The automatic approach significantly surpasses the visual assessment performed by a trained specialist. The difference is especially large for atypical cysts and hypoechoic solid lesions with a clear margin. These data may have clinical significance.
Keywords: breast cyst, breast solid lesion, differentiation, ultrasonography
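Several of the brightness-distribution characteristics named above, such as entropy and quantiles, are simple to compute once the region of interest has been selected. The sketch below restricts itself to a few of them; the particular feature subset and the toy pixel lists are illustrative assumptions, not the authors' full feature set.

```python
import math

def brightness_features(pixels):
    """A few brightness-distribution statistics for a selected region,
    given as a flat list of 0-255 gray levels: Shannon entropy of the
    empirical gray-level histogram plus nearest-rank quantiles."""
    n = len(pixels)
    hist = {}
    for p in pixels:
        hist[p] = hist.get(p, 0) + 1
    entropy = -sum((c / n) * math.log2(c / n) for c in hist.values())
    s = sorted(pixels)
    q = lambda f: s[min(n - 1, int(f * n))]  # nearest-rank quantile
    return {"entropy": entropy, "q25": q(0.25), "median": q(0.5), "q75": q(0.75)}

# Toy regions: an anechoic (uniform) patch vs. a two-level echoic patch.
flat = brightness_features([7] * 10)
mixed = brightness_features([0, 0, 255, 255])
```

Intuitively, an echo-free cyst interior yields near-zero entropy, while internal echoes push entropy up; features like these are what the training sets for the cystic and solid groups are built from.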
Procedia PDF Downloads 269
774 Automatic Detection of Traffic Stop Locations Using GPS Data
Authors: Areej Salaymeh, Loren Schwiebert, Stephen Remias, Jonathan Waddell
Abstract:
Extracting information from new data sources has emerged as a crucial task in many traffic planning processes, such as identifying traffic patterns, route planning, traffic forecasting, and locating infrastructure improvements. Given the advanced technologies used to collect Global Positioning System (GPS) data from dedicated GPS devices, GPS-equipped phones, and navigation tools, intelligent data analysis methodologies are necessary to mine this raw data. In this research, an automatic detection framework is proposed to help identify and classify the locations of stopped GPS waypoints into two main categories: signalized intersections or highway congestion. The Delaunay triangulation is used to perform this assessment in the clustering phase. While most existing clustering algorithms need assumptions about the data distribution, the effectiveness of the Delaunay triangulation relies on triangulating geographical data points without such assumptions. Our proposed method starts by cleaning noise from the data and normalizing it. Next, the framework identifies stoppage points by calculating the traveled distance. The last step uses clustering to form groups of waypoints for signalized traffic and highway congestion, after which a binary classifier, based on the length of the cluster, is applied to distinguish highway congestion from signalized stop points. The proposed framework identified the stop positions and congestion points correctly in around 99.2% of trials, showing that it is possible, using limited GPS data, to distinguish the two categories with high accuracy.
Keywords: Delaunay triangulation, clustering, intelligent transportation systems, GPS data
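The "identify stoppage points by calculating the traveled distance" step can be sketched by flagging waypoints that barely move relative to the previous fix; the Delaunay clustering and cluster-length classification stages are not reproduced here. The 5 m threshold and the toy track are assumptions for illustration.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in meters between two (lat, lon) points."""
    R = 6371000.0
    la1, lo1, la2, lo2 = map(math.radians, (*p, *q))
    a = (math.sin((la2 - la1) / 2) ** 2
         + math.cos(la1) * math.cos(la2) * math.sin((lo2 - lo1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def stop_waypoints(track, max_move_m=5.0):
    """Flag each waypoint whose distance to the previous fix is below a
    threshold, i.e. the vehicle is effectively stationary there."""
    flags = [False]  # the first fix has no predecessor
    for prev, cur in zip(track, track[1:]):
        flags.append(haversine_m(prev, cur) < max_move_m)
    return flags

# Toy track: two stationary fixes, then a ~1.1 km jump.
track = [(42.33, -83.05), (42.33, -83.05), (42.33000001, -83.05), (42.34, -83.05)]
flags = stop_waypoints(track)
```

Runs of flagged waypoints become the candidate stop points that the later clustering stage would group by location.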
Procedia PDF Downloads 275
773 AgriInnoConnect Pro System Using IoT and Firebase Console
Authors: Amit Barde, Dipali Khatave, Vaishali Savale, Atharva Chavan, Sapna Wagaj, Aditya Jilla
Abstract:
AgriInnoConnect Pro is an advanced agricultural automation system designed to enhance irrigation efficiency and overall farm management through IoT technology. Built with MIT App Inventor, Telegram, the Arduino IDE, and the Firebase Console, it provides a user-friendly interface for farmers. Key hardware includes soil moisture sensors, DHT11 temperature and humidity sensors, a 12V motor, a solenoid valve, a step-down transformer, smart fencing, and AC switches. The system operates in automatic and manual modes. In automatic mode, the ESP32 microcontroller monitors soil moisture and autonomously controls irrigation to optimize water usage. In manual mode, users can control the irrigation motor via a mobile app. Telegram bots enable remote operation of the solenoid valve and the electric fencing, enhancing farm security. Additionally, the system upgrades conventional devices to smart ones using AC switches, broadening its automation capabilities. AgriInnoConnect Pro aims to improve farm productivity and resource management, addressing the critical need for sustainable water conservation and providing a comprehensive solution for modern farm management. The integration of smart technologies ensures precision farming practices, promoting efficient resource allocation and sustainable agricultural development.
Keywords: agricultural automation, IoT, soil moisture sensor, ESP32, MIT App Inventor, Telegram bot, smart farming, remote control, Firebase Console
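The automatic-mode behaviour described above, the ESP32 monitoring soil moisture and switching the motor, is commonly implemented with hysteresis so the motor does not chatter around a single threshold. Below is a Python simulation of that decision logic; the thresholds are assumptions, and on the real system the equivalent code would run as an Arduino sketch on the ESP32.

```python
def motor_command(moisture_pct, motor_on, low=30.0, high=60.0):
    """Hysteresis controller for the irrigation motor in automatic mode.

    Start irrigating when soil moisture drops below `low`; stop once it
    rises above `high`; in between, keep the current state so the motor
    does not rapidly cycle. Thresholds are illustrative assumptions."""
    if moisture_pct < low:
        return True
    if moisture_pct > high:
        return False
    return motor_on

# Simulate a soil profile that dries out and is then irrigated.
state = False
log = []
for m in [55, 40, 28, 35, 58, 65, 50]:
    state = motor_command(m, state)
    log.append(state)
```

Note that the motor stays on through the 35% and 58% readings (inside the hysteresis band) and only switches off once moisture clears the upper threshold.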
Procedia PDF Downloads 43
772 Automatic Detection and Filtering of Negative Emotion-Bearing Contents from Social Media in Amharic Using Sentiment Analysis and Deep Learning Methods
Authors: Derejaw Lake Melie, Alemu Kumlachew Tegegne
Abstract:
The increasing prevalence of social media in Ethiopia has exacerbated societal challenges by fostering the proliferation of negative emotional posts and comments, and illicit use of social media has deepened divisions among the population. Addressing these issues through manual identification and aggregation of emotions from millions of users for swift decision-making poses significant challenges, particularly given the rapid growth of Amharic language usage on social platforms. Consequently, there is a critical need for an intelligent system capable of automatically detecting and categorizing negative emotional content into social, religious, and political categories while also filtering out toxic online content. This paper leverages sentiment analysis techniques to automatically detect and filter negative emotional content from Amharic social media texts, through a comparative study of deep learning algorithms. The study utilized a dataset of 29,962 comments collected from social media platforms using comment exporter software. Data pre-processing techniques were applied to enhance data quality, followed by the implementation of deep learning methods for training, testing, and evaluation. The results showed that CNN, GRU, LSTM, and Bi-LSTM classification models achieved accuracies of 83%, 50%, 84%, and 86%, respectively; among these, Bi-LSTM demonstrated the highest accuracy, at 86%.
Keywords: negative emotion, emotion detection, social media filtering, sentiment analysis, deep learning
Procedia PDF Downloads 23
771 Induced Emotional Empathy and Contextual Factors like Presence of Others Reduce the Negative Stereotypes Towards Persons with Disabilities through Stronger Prosociality
Authors: Shailendra Kumar Mishra
Abstract:
In this paper, we focus on how contextual factors such as the physical presence of other perceivers, and the induced emotional empathy that develops towards a person with disabilities, may reduce automatic negative stereotypes and responses towards that person. In Study 1 we demonstrated, applying a 3 (positive, indifferent, and negative attitude levels) x 2 (physical presence of others vs. alone) factorial ANOVA design, that negative attitudes based on negative stereotypes, assessed with the ATDP test questionnaire on a five-point Likert scale, are significantly less negative when participants are tested within a group of perceivers than when tested alone. In Study 2, applying regression analysis, we demonstrate that in the presence of other perceivers, even in a small group, participants showed more induced emotional empathy, through stronger prosociality, towards a high-distress target such as a person with disabilities than towards other stigmatized persons, such as targets of racial or gender bias. The results thus show that an automatic affective response in the form of induced emotional empathy in the perceiver, together with contextual factors like the presence of other perceivers, automatically activates stronger prosocial norms and egalitarian goals towards physically challenged persons than towards other stigmatized persons. This leads to less negative attitudes and behaviour towards a person with disabilities.
Keywords: contextual factors, high distress target, induced emotional empathy, stronger prosociality
Procedia PDF Downloads 138
770 Empowering Transformers for Evidence-Based Medicine
Authors: Jinan Fiaidhi, Hashmath Shaik
Abstract:
Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence from the body of biomedical literature. An important challenge confronting medical practitioners is the long time needed to browse, filter, summarize, and compile information from different medical resources. Deep learning can help solve this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer clinical queries that can be used for evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison, and Outcome). This article describes the use of deep learning techniques for Q&A, based on transformer models like BERT and GPT, to answer PICO clinical questions for evidence-based practice, drawing on sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach an acceptable state-of-the-art performance based on a two-stage bootstrapping process: filtering relevant articles, then identifying the articles that support the outcome requested by the PICO question. Moreover, we report experiments that empower our bootstrapping techniques with patched attention to the most important keywords in the clinical case and the PICO question. Our bootstrapping with patched attention shows the relevancy of the collected evidence based on entropy metrics.
Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers
Procedia PDF Downloads 43
769 Work Experience and Employability: Results and Evaluation of a Pilot Training Course on Skills for Company Tutors
Authors: Javier Barraycoa, Olga Lasaga
Abstract:
Work experience placements are one of the main routes to employment and professional experience for recent graduates. The effectiveness of these placements is conditioned by the training in skills, especially teaching skills, of company tutors. For this reason, a manual specifically designed for training company tutors in these skills has been developed. Similarly, a pilot semi-attendance course providing the resources that enable tutors to improve their role as instructors was carried out. The course was evaluated quantitatively and qualitatively with the aim of assessing its effectiveness, detecting shortcomings and areas to be improved, and revising the manual's contents. One of the biggest achievements was raising the participating tutors' awareness of the importance of their work and of the need to develop teaching skills. As a result of this project, we have detected a need to design specific training supplements according to knowledge areas and sectors, to collate good practices, and to create easily accessible audiovisual materials.
Keywords: company tutors, employability, teaching skills, work experience
Procedia PDF Downloads 248
768 Board of Directors Characteristics and Credit Union Financial Performance
Authors: Luisa Unda, Kamran Ahmed, Paul Mather
Abstract:
We examine the effect of board characteristics on the performance and asset quality of credit unions in Australia, using a large sample covering the period 2004-2012. Credit unions are unique in that they are customer-owned financial institutions whose directors are democratically elected by members, which is distinctly different from other financial institutions such as commercial banks. We find that board remuneration, board expertise, and attendance at board meetings have significantly positive impacts on credit union performance and asset quality, while board members who hold multiple directorships (busy directors) have a significant negative impact on performance. Financial performance also improves with larger boards and longer-tenured directors. All of these relations hold after we control for alternative measures of performance, credit union characteristics, and the endogeneity problem.
Keywords: credit unions, corporate governance, board of directors, financial performance, Australia, asset quality
Procedia PDF Downloads 518767 Effects of a Simulated Power Cut in Automatic Milking Systems on Dairy Cows Heart Activity
Authors: Anja Gräff, Stefan Holzer, Manfred Höld, Jörn Stumpenhausen, Heinz Bernhardt
Abstract:
In view of the increasing quantity of 'green energy' from renewable raw materials and photovoltaic facilities, it is quite conceivable that power supply variations may occur, so that constantly working machines like automatic milking systems (AMS) may break down temporarily. The use of farm-made energy is steadily increasing in order to keep energy costs as low as possible, so power cuts are likely to happen more frequently. Current work in the framework of the project 'stable 4.0' focuses on possible stress reactions by simulating power cuts of up to four hours on dairy farms. Based on heart activity, it was to be determined whether stress on dairy cows increases under these circumstances. In order to simulate a power cut, 12 randomly selected cows from 2 herds were not admitted to the AMS for at least two hours on three consecutive days. The heart rates of the cows were measured and the collected data evaluated with the HRV program Kubios version 2.1 on the basis of eight parameters (HR, RMSSD, pNN50, SD1, SD2, LF, HF and LF/HF). Furthermore, stress reactions were examined closely via video analysis, milk yield, ruminant activity, pedometers and measurements of cortisol metabolites. In conclusion, it turned out that during the test only some animals suffered minor stress symptoms when they tried to get into the AMS at their regular milking time but could not be milked because the system was manipulated. However, the stress level during a regular "time-dependent milking rejection" was just as high. The study therefore concludes that the low psychological stress level in the case of a 2-4 hour failure of an AMS does not have any impact on animal welfare and health.Keywords: dairy cow, heart activity, power cut, stable 4.0
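Two of the time-domain heart-rate-variability parameters named in the abstract (RMSSD and pNN50) have standard definitions and can be sketched as follows; the RR-interval values used here are made-up illustration data, not measurements from the study.

```python
# RMSSD and pNN50 computed from a series of RR intervals in milliseconds.
import math

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def pnn50(rr_ms):
    """Fraction of successive RR differences exceeding 50 ms."""
    diffs = [abs(b - a) for a, b in zip(rr_ms, rr_ms[1:])]
    return sum(d > 50 for d in diffs) / len(diffs)

rr = [980, 1010, 950, 1005, 970, 1020, 960]  # hypothetical RR intervals (ms)
print(round(rmssd(rr), 1), round(pnn50(rr), 2))
```

Tools such as Kubios compute these same indicators, along with frequency-domain (LF, HF) and Poincaré (SD1, SD2) measures, from the full RR series.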
Procedia PDF Downloads 311766 Threats and Preventive Methods to Avoid Bird Strikes at the Deblin Military Airfield, Poland
Authors: J. Cwiklak, M. Grzegorzewski, M. Adamski
Abstract:
The paper presents results of a project conducted in Poland devoted to the study of bird strikes at military airfields. The main aim of this project was to develop methods of aircraft protection against threats from birds. The studies were carried out using two methods: one by transect and the other by selected sector scanning. During the research, 104 species of birds, numbering about 36,000 individuals, were observed. The most frequent were starling Sturnus vulgaris (31.0%), jackdaw Corvus monedula (18.3%), rook Corvus frugilegus (15.9%), and lapwing Vanellus vanellus (6.2%). Moreover, it was found that starlings constituted the most serious threat, which resulted from their relatively high attendance at the runway (about 300 individuals). Possible repellent techniques concerning the Deblin military airfield are discussed. The analysis of the birds' concentration depending on the altitude, part of the day, time of year, and part of the airfield constituted a basis for working out critical flight phases and appropriate procedures to prevent bird strikes.Keywords: airport, bird strikes, flight safety, preventive methods
Procedia PDF Downloads 402765 Security Risks Assessment: A Conceptualization and Extension of NFC Touch-And-Go Application
Authors: Ku Aina Afiqah Ku Adzman, Manmeet Mahinderjit Singh, Zarul Fitri Zaaba
Abstract:
NFC operates on the low-range 13.56 MHz frequency at a distance of 4 cm to 10 cm, and its applications can be categorized as touch and go, touch and confirm, touch and connect, and touch and explore. NFC applications are vulnerable to various security and privacy attacks due to the technology's physical nature, unprotected data stored in the NFC tag, and insecure communication between applications. This paper aims to determine the likelihood of security risks occurring in NFC technology and applications. We present an NFC technology taxonomy covering NFC standards, types of applications, and various security and privacy attacks. Based on observations and a survey, the risk assessment of the touch-and-go application identifies two high-risk attacks, namely data corruption and DoS attacks. After the risks are determined, risk countermeasures are adopted using AHP. The guidelines and solutions for these two high-risk attacks are later applied to a secure NFC-enabled smartphone attendance system.Keywords: Near Field Communication (NFC), risk assessment, multi-criteria decision making, Analytical Hierarchy Process (AHP)
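The AHP step mentioned above derives priority weights for the countermeasure criteria from a pairwise-comparison matrix. A minimal sketch follows, using the row geometric mean as a common approximation to the principal eigenvector; the matrix values are hypothetical judgments, not those elicited in the study.

```python
# AHP priority weights from a pairwise-comparison matrix via normalized
# row geometric means (approximates the principal eigenvector).
import math

def ahp_weights(matrix):
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical pairwise judgments for three criteria,
# e.g. attack severity vs. likelihood vs. mitigation cost.
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

The weights sum to one and rank the criteria; a full AHP analysis would also check the consistency ratio of the judgment matrix before using them.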
Procedia PDF Downloads 302764 A Deep Learning Approach to Calculate Cardiothoracic Ratio From Chest Radiographs
Authors: Pranav Ajmera, Amit Kharat, Tanveer Gupte, Richa Pant, Viraj Kulkarni, Vinay Duddalwar, Purnachandra Lamghare
Abstract:
The cardiothoracic ratio (CTR) is the ratio of the diameter of the heart to the diameter of the thorax. An abnormal CTR, that is, a value greater than 0.55, is often an indicator of an underlying pathological condition. The accurate prediction of an abnormal CTR from chest X-rays (CXRs) aids in the early diagnosis of clinical conditions. We propose a deep learning-based model for automatic CTR calculation that can assist the radiologist with the diagnosis of cardiomegaly and optimize the radiology workflow. The study population included 1012 posteroanterior (PA) CXRs from a single institution. The Attention U-Net deep learning (DL) architecture was used for the automatic calculation of CTR. A CTR of 0.55 was used as a cut-off to categorize the condition as cardiomegaly present or absent. An observer performance test was conducted to assess the radiologist's performance in diagnosing cardiomegaly with and without artificial intelligence (AI) assistance. The Attention U-Net model was highly specific in calculating the CTR. The model exhibited a sensitivity of 0.80 [95% CI: 0.75, 0.85], precision of 0.99 [95% CI: 0.98, 1], and an F1 score of 0.88 [95% CI: 0.85, 0.91]. During the analysis, we observed that 51 out of 1012 samples were misclassified by the model when compared to annotations made by the expert radiologist. We further observed that the sensitivity of the reviewing radiologist in identifying cardiomegaly increased from 40.50% to 88.4% when aided by the AI-generated CTR. Our segmentation-based AI model demonstrated high specificity and sensitivity for CTR calculation, and the performance of the radiologist on the observer performance test improved significantly with AI assistance. A DL-based segmentation model for rapid quantification of CTR therefore has significant potential for use in clinical workflows.Keywords: cardiomegaly, deep learning, chest radiograph, artificial intelligence, cardiothoracic ratio
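Given binary heart and thorax masks of the kind a segmentation model produces, the CTR reduces to a ratio of widest horizontal extents, thresholded at 0.55. The sketch below uses tiny toy masks for illustration; the paper's exact post-processing of the Attention U-Net output is not specified and may differ.

```python
# CTR from binary segmentation masks: widest horizontal run of heart pixels
# divided by widest run of thorax pixels; > 0.55 flags cardiomegaly.

def max_width(mask):
    """Widest horizontal extent of nonzero pixels over all rows."""
    best = 0
    for row in mask:
        cols = [i for i, v in enumerate(row) if v]
        if cols:
            best = max(best, cols[-1] - cols[0] + 1)
    return best

def ctr(heart_mask, thorax_mask, cutoff=0.55):
    ratio = max_width(heart_mask) / max_width(thorax_mask)
    return ratio, ratio > cutoff

thorax = [[1] * 10 for _ in range(4)]                # toy mask, 10 px wide
heart = [[0] * 2 + [1] * 6 + [0] * 2 for _ in range(4)]  # toy mask, 6 px wide
ratio, cardiomegaly = ctr(heart, thorax)
print(round(ratio, 2), cardiomegaly)
```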
Procedia PDF Downloads 98763 Geographic Information System and Dynamic Segmentation of Very High Resolution Images for the Semi-Automatic Extraction of Sandy Accumulation
Authors: A. Bensaid, T. Mostephaoui, R. Nedjai
Abstract:
A considerable area of Algerian land is threatened by the phenomenon of wind erosion. For a long time, wind erosion and its associated harmful effects on the natural environment have posed a serious threat, especially in the arid regions of the country. In recent years, as a result of increases in the irrational exploitation of natural resources (fodder) and extensive land clearing, wind erosion has become particularly accentuated. The extent of degradation in the arid region of the Algerian Mecheria department has generated a new situation characterized by the reduction of vegetation cover, the decrease of land productivity, and sand encroachment on urban development zones. In this study, we investigate the potential of remote sensing and geographic information systems for detecting the spatial dynamics of the ancient dune cordons based on the numerical processing of LANDSAT images (5, 7, and 8) of three scenes, 197/37, 198/36 and 198/37, for the year 2020. As a second step, we explore the use of geospatial techniques to monitor the progression of sand dunes onto developed (urban) land as well as the formation of sandy accumulations (dunes, dune fields, nebkhas, barchans, etc.). For this purpose, this study made use of a semi-automatic processing method for the dynamic segmentation of images with very high spatial resolution (SENTINEL-2 and Google Earth). The study demonstrates that urban land under current conditions is located in sand transit zones mobilized by winds from the northwest and southwest.Keywords: land development, GIS, segmentation, remote sensing
Procedia PDF Downloads 155762 2D Convolutional Networks for Automatic Segmentation of Knee Cartilage in 3D MRI
Authors: Ananya Ananya, Karthik Rao
Abstract:
Accurate segmentation of knee cartilage in 3-D magnetic resonance (MR) images for quantitative assessment of volume is crucial for studying and diagnosing osteoarthritis (OA) of the knee, one of the major causes of disability in elderly people. Radiologists generally perform this task slice by slice, taking 15-20 minutes per 3D image, which leads to high inter- and intra-observer variability. Hence automatic methods for knee cartilage segmentation are desirable and are an active field of research. This paper presents the design and experimental evaluation of fully automated methods for knee cartilage segmentation in 3D MRI based on 2D convolutional neural networks. The architectures are validated on 40 test images and 60 training images from the SKI10 dataset. The proposed methods segment 2D slices one by one, which are then combined to give the segmentation for the whole 3D image. The proposed methods are modified versions of U-net and dilated convolutions, consisting of a single step that segments the given image into 5 labels: background, femoral cartilage, tibial cartilage, femoral bone and tibial bone; the cartilages being the primary components of interest. U-net consists of a contracting path and an expanding path, to capture context and localization respectively. Dilated convolutions lead to an exponential expansion of the receptive field with only a linear increase in the number of parameters. A combination of modified U-net and dilated convolutions has also been explored. These architectures segment one 3D image in 8-10 seconds, giving average volumetric Dice Score Coefficients (DSC) of 0.950-0.962 for femoral cartilage and 0.951-0.966 for tibial cartilage, with manual segmentation as the reference.Keywords: convolutional neural networks, dilated convolutions, 3 dimensional, fully automated, knee cartilage, MRI, segmentation, U-net
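The Dice Score Coefficient used for evaluation has a standard definition, DSC = 2|A∩B| / (|A| + |B|), computed per label over the voxels of predicted and manual segmentations. A minimal sketch over flattened toy label arrays (not SKI10 data):

```python
# Per-label Dice Score Coefficient between a predicted and a reference
# (manual) segmentation, both given as flattened label arrays.

def dice(pred, truth, label):
    p = [v == label for v in pred]
    t = [v == label for v in truth]
    inter = sum(a and b for a, b in zip(p, t))
    denom = sum(p) + sum(t)
    return 2.0 * inter / denom if denom else 1.0

# Toy labels: 0 = background, 1 = femoral cartilage, 2 = tibial cartilage
pred = [0, 1, 1, 2, 2, 0, 1, 0]
truth = [0, 1, 1, 2, 0, 0, 1, 1]
print(round(dice(pred, truth, 1), 3))
```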
Procedia PDF Downloads 261761 Deep Learning-Based Approach to Automatic Abstractive Summarization of Patent Documents
Authors: Sakshi V. Tantak, Vishap K. Malik, Neelanjney Pilarisetty
Abstract:
A patent is an exclusive right granted for an invention. It can be a product or a process that provides an innovative method of doing something, or offers a new technical perspective or solution to a problem. A patent can be obtained by making the technical information and details about the invention publicly available. The patent owner has the exclusive right to prevent or stop anyone from using the patented invention for commercial purposes. Any commercial usage, distribution, import or export of a patented invention or product requires the patent owner's consent. It has been observed that the central and important parts of patents are written in idiosyncratic and complex linguistic structures that can be difficult to read, comprehend or interpret for the masses. The abstracts of these patents tend to obfuscate the precise nature of the patent instead of clarifying it via direct and simple linguistic constructs. This makes it necessary to provide efficient access to this knowledge via concise and transparent summaries. However, due to complex and repetitive linguistic constructs and extremely long sentences, common extraction-oriented automatic text summarization methods should not be expected to perform remarkably well when applied to patent documents. More content-oriented, abstractive summarization techniques are able to perform much better and generate more concise summaries. This paper proposes an efficient summarization system for patents using artificial intelligence, natural language processing and deep learning techniques to condense the knowledge and essential information from a patent document into a single summary that is easier to understand, without redundant formatting and difficult jargon.Keywords: abstractive summarization, deep learning, natural language processing, patent document
Procedia PDF Downloads 123760 Ensuring Safe Operation by Providing an End-To-End Field Monitoring and Incident Management Approach for Autonomous Vehicle Based on ML/Dl SW Stack
Authors: Lucas Bublitz, Michael Herdrich
Abstract:
By achieving the first commercialization approval in San Francisco, the Autonomous Driving (AD) industry has proven the technological maturity of SAE L4 AD systems and the corresponding software and hardware stack. This milestone marks the upcoming phase in the industry, where the focus is now on scaling and supervising larger autonomous vehicle (AV) fleets in different operation areas. This requires an operation framework that organizes and assigns responsibilities to the relevant AV technology and operation stakeholders, from the AV system provider, the remote intervention operator, and the MaaS provider to the regulatory and approval authority. This holistic operation framework consists of technological, processual, and organizational activities to ensure safe operation of fully automated vehicles. In supervising large autonomous vehicle fleets, a major focus is on continuous field monitoring. The field monitoring approach must reflect the safety and security criticality of incidents in the field during driving operation. This includes an automatic containment approach, with the overall goal of avoiding safety-critical incidents and reducing downtime caused by a malfunction of the AD software stack. An end-to-end (E2E) field monitoring approach detects critical faults in the field, uses a knowledge-based approach for evaluating their safety criticality, and supports the automatic containment of these E/E faults. Applying such an approach will ensure the scalability of AV fleets, which is determined by the handling of incidents in the field and the continuous regulatory compliance of the technology after enhancing the Operational Design Domain (ODD) or the function scope through Functions on Demand (FoD) over the entire digital product lifecycle.Keywords: field monitoring, incident management, multicompliance management for AI in AD, root cause analysis, database approach
Procedia PDF Downloads 75759 Automatic Near-Infrared Image Colorization Using Synthetic Images
Authors: Yoganathan Karthik, Guhanathan Poravi
Abstract:
Colorizing near-infrared (NIR) images poses unique challenges due to the absence of color information and the nuances in light absorption. In this paper, we present an approach to NIR image colorization utilizing a synthetic dataset generated from visible light images. Our method addresses two major challenges encountered in NIR image colorization: accurately colorizing objects with color variations and avoiding over/under saturation in dimly lit scenes. To tackle these challenges, we propose a Generative Adversarial Network (GAN)-based framework that learns to map NIR images to their corresponding colorized versions. The synthetic dataset ensures diverse color representations, enabling the model to effectively handle objects with varying hues and shades. Furthermore, the GAN architecture facilitates the generation of realistic colorizations while preserving the integrity of dimly lit scenes, thus mitigating issues related to over/under saturation. Experimental results on benchmark NIR image datasets demonstrate the efficacy of our approach in producing high-quality colorizations with improved color accuracy and naturalness. Quantitative evaluations and comparative studies validate the superiority of our method over existing techniques, showcasing its robustness and generalization capability across diverse NIR image scenarios. Our research not only contributes to advancing NIR image colorization but also underscores the importance of synthetic datasets and GANs in addressing domain-specific challenges in image processing tasks. The proposed framework holds promise for various applications in remote sensing, medical imaging, and surveillance where accurate color representation of NIR imagery is crucial for analysis and interpretation.Keywords: computer vision, near-infrared images, automatic image colorization, generative adversarial networks, synthetic data
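The abstract does not state the GAN objective used; a common choice for image-to-image translation of this kind is a pix2pix-style generator loss, an adversarial term plus an L1 reconstruction term weighted by a factor lambda. A minimal numerical sketch, with toy values standing in for discriminator outputs and image pixels:

```python
# Pix2pix-style generator objective (illustrative assumption, not the
# paper's confirmed loss): non-saturating adversarial term plus lambda * L1.
import math

def generator_loss(d_fake_probs, fake_img, target_img, lam=100.0):
    # Adversarial term: mean of -log D(G(x)) over discriminator outputs
    adv = -sum(math.log(max(p, 1e-12)) for p in d_fake_probs) / len(d_fake_probs)
    # L1 reconstruction term against the ground-truth color image
    l1 = sum(abs(a - b) for a, b in zip(fake_img, target_img)) / len(fake_img)
    return adv + lam * l1

# Toy discriminator probabilities and 3-pixel "images"
loss = generator_loss([0.8, 0.6], [0.2, 0.5, 0.7], [0.25, 0.5, 0.6], lam=10.0)
print(round(loss, 3))
```

The L1 term is what ties the generated colors to the synthetic ground-truth colorizations, while the adversarial term pushes outputs toward realistic-looking images.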
Procedia PDF Downloads 43758 A Workable Mechanism to Support Students Who Are at Risk
Authors: Mohamed Chabi
Abstract:
The project of helping students at risk started in the Math department of the new foundation program at Qatar University in the fall 2012 semester. The purpose was to find ways to help students who were struggling with their math courses, Elementary Algebra or Precalculus, due to many factors. The department formed the "Students at Risk" committee at the start of the 2012-13 academic year to assist struggling students in our math courses in getting their studies on track. A mechanism to support students who are at risk was developed using an e-monitoring system. The e-monitoring system was developed to automatically manage all transactions relevant to student attendance, student warnings, grading, etc. The e-monitoring system produces various statistics, such as overall course statistics, performance, and students at risk, to help the department develop a higher quality of education in the foundation program's Math department. The mechanism was studied and evaluated. Whatever the cause, the sooner we identify students who are not performing well academically, the sooner we can provide, or direct them to, the resources that are available to them. In this paper, we outline the mechanism and its effect on students' performance. The data collected from various exams show that students benefited from the mechanism.Keywords: students at risk, e-monitoring system, warning students, performance
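The automatic warning transactions such a system manages can be sketched as a simple rule on the fraction of sessions missed; the thresholds below are hypothetical, as the abstract does not state the ones used at Qatar University.

```python
# Sketch of an automatic attendance-based warning rule of the kind an
# e-monitoring system might apply. Thresholds are hypothetical.

def warning_level(absences, total_sessions, first=0.10, final=0.20):
    """Return a warning tier based on the fraction of sessions missed."""
    rate = absences / total_sessions
    if rate >= final:
        return "final warning"
    if rate >= first:
        return "first warning"
    return "ok"

print(warning_level(3, 40), warning_level(5, 40), warning_level(9, 40))
```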
Procedia PDF Downloads 488757 Beijing Xicheng District Housing Price Econometric Analysis: "Multi-School Zoning" Policy
Authors: Haoxue Cui, Sirui Zhang, Shanshan Gao, Weiyi Zhang, Lantian Wang, Xuanwen Zheng
Abstract:
The 2020 "multi-school zoning" policy makes students ineligible for direct attendance at the school in their district. To study whether the housing price trend of school districts is affected by the policy, this paper studies housing prices based on the school district division in Xicheng District, Beijing. We collected housing prices and basic information about communities from "Anjuke", divided into the two 15-month periods before and after the July 31 ("731") policy in Xicheng District, Beijing. We then used a difference-in-differences (DID) model with time fixed effects to estimate the DID statistic, that is, the overall net impact of the policy. The results show that the coefficient is negative and statistically significant, indicating that the housing prices of school districts in Xicheng District decreased after the "multi-school zoning" policy. This shows that the policy has effectively reduced school-district housing prices in Xicheng District and laid a foundation for the "double reduction" policy in 2022.Keywords: "multi-school zoning" policy, DID, time fixed effect, housing prices
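The DID statistic the paper estimates reduces, in its simplest two-group two-period form, to a difference of before/after differences between treated (school-district) and control homes. A sketch on made-up mean prices, not the Anjuke data:

```python
# Difference-in-differences estimate:
# (treated_post - treated_pre) - (control_post - control_pre).
# Prices below are hypothetical means in 10k RMB per square meter.

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treated_post) - mean(treated_pre)) - (
        mean(control_post) - mean(control_pre)
    )

effect = did_estimate(
    treated_pre=[12.0, 13.0, 12.5],   # school-district homes, before policy
    treated_post=[11.5, 12.0, 12.2],  # school-district homes, after policy
    control_pre=[8.0, 8.5, 8.1],      # other homes, before policy
    control_post=[8.2, 8.6, 8.4],     # other homes, after policy
)
print(round(effect, 3))
```

A negative estimate, as in this toy example, is the pattern the paper reports: school-district prices fell relative to the control trend. The regression form with time fixed effects additionally controls for market-wide shocks in each period.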
Procedia PDF Downloads 160756 Development of an Automatic Computational Machine Learning Pipeline to Process Confocal Fluorescence Images for Virtual Cell Generation
Authors: Miguel Contreras, David Long, Will Bachman
Abstract:
Background: Microscopy plays a central role in cell and developmental biology. In particular, fluorescence microscopy can be used to visualize specific cellular components and subsequently quantify their morphology through the development of virtual-cell models for studying the effects of mechanical forces on cells. However, these imaging experiments present challenges that can make it difficult to quantify cell morphology: inconsistent results, time-consuming and potentially costly protocols, and limitations on the number of labels due to spectral overlap. To address these challenges, the objective of this project is to develop an automatic computational machine learning pipeline that predicts cellular component morphology for virtual-cell generation from fluorescence cell membrane confocal z-stacks. Methods: Registered confocal z-stacks of the nuclei and cell membranes of endothelial cells, consisting of 20 images each, were obtained from fluorescence confocal microscopy and normalized through a software pipeline so that each image has a mean pixel intensity value of 0.5. An open source machine learning algorithm, originally developed to predict fluorescence labels on unlabeled transmitted light microscopy cell images, was trained using this set of normalized z-stacks on a single CPU machine. Through transfer learning, the algorithm used knowledge acquired from its previous training sessions to learn the new task. Once trained, the algorithm was used to predict the morphology of nuclei using normalized cell membrane fluorescence images as input. Predictions were compared to the ground-truth fluorescence nuclei images. Results: After one week of training, using one cell membrane z-stack (20 images) and the corresponding nuclei label, results showed qualitatively good predictions on the training set. The algorithm was able to accurately predict nuclei locations as well as shape when fed only fluorescence membrane images.
Similar training sessions with improved membrane image quality, in which the lining and shape of the membrane clearly showed the boundaries of each cell, proportionally improved the nuclei predictions, reducing errors relative to the ground truth. Discussion: These results show the potential of pre-trained machine learning algorithms to predict cell morphology using relatively small amounts of data and training time, eliminating the need for multiple labels in immunofluorescence experiments. With further training, the algorithm is expected to predict additional labels (e.g., focal-adhesion sites, cytoskeleton), which can be added to the automatic machine learning pipeline for direct input into Principal Component Analysis (PCA) for generation of virtual-cell mechanical models.Keywords: cell morphology prediction, computational machine learning, fluorescence microscopy, virtual-cell models
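The normalization step described in the Methods, setting each slice's mean pixel intensity to 0.5, can be sketched as below. The abstract does not say how the mean was adjusted; simple multiplicative rescaling is assumed here, and the tiny arrays are toy data standing in for a z-stack.

```python
# Normalize each slice of a z-stack to a target mean pixel intensity of 0.5
# (multiplicative rescaling assumed; the paper's exact method is unstated).

def normalize_slice(img, target_mean=0.5):
    mean = sum(sum(row) for row in img) / (len(img) * len(img[0]))
    return [[v * (target_mean / mean) for v in row] for row in img]

# Toy 2-slice z-stack of 2x2 images
stack = [[[0.2, 0.4], [0.6, 0.8]], [[0.1, 0.1], [0.3, 0.5]]]
normalized = [normalize_slice(s) for s in stack]
means = [sum(sum(r) for r in s) / 4 for s in normalized]
print([round(m, 6) for m in means])
```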
Procedia PDF Downloads 205