Search results for: data reduction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28990

24670 Modeling Average Paths Traveled by Ferry Vessels Using AIS Data

Authors: Devin Simmons

Abstract:

At the USDOT’s Bureau of Transportation Statistics, a biannual census of ferry operators in the U.S. is conducted, with results such as route mileage used to determine federal funding levels for operators. AIS data allows for the possibility of using GIS software and geographical methods to confirm operator-reported mileage for individual ferry routes. As part of the USDOT’s work on the ferry census, an algorithm was developed that uses AIS data for ferry vessels in conjunction with known ferry terminal locations to model the average route traveled, for use as both a cartographic product and confirmation of operator-reported mileage. AIS data from each vessel is first analyzed to determine individual journeys based on the vessel’s velocity and changes in velocity over time. These trips are then converted to geographic linestring objects. Using the terminal locations, the algorithm then determines whether each trip represents a known ferry route. Given a large enough dataset, routes will be represented by multiple trip linestrings, which are then filtered by DBSCAN spatial clustering to remove outliers. Finally, the remaining trips are averaged into one route. The algorithm interpolates the point on each trip linestring that represents the start point. From these start points, a centroid is calculated, and the first point of the average route is determined. Each trip is interpolated again to find the point that represents one percent of the journey’s completion, and the centroid of those points is used as the next point in the average route, and so on until 100 points have been calculated. Routes created using this algorithm have shown demonstrable improvement over previous methods, which included the implementation of a LOESS model. Additionally, the algorithm greatly reduces the amount of manual digitizing needed to visualize ferry activity.
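
To make the averaging step concrete, the following Python sketch reproduces the idea under stated assumptions: trips are already segmented into shapely LineStrings in a projected (metre-based) coordinate system, outlier trips are filtered by clustering their start points with DBSCAN (a simplification of filtering whole linestrings), and the eps, min_samples, and 100-point values are illustrative, not the USDOT implementation.

# Illustrative sketch of the route-averaging step described above.
import numpy as np
from shapely.geometry import LineString
from sklearn.cluster import DBSCAN

def average_route(trips, n_points=100, eps=500, min_samples=2):
    """Average a set of trip LineStrings (assumed to be in a projected CRS, metres)."""
    # Filter outlier trips by clustering their start points with DBSCAN.
    starts = np.array([t.interpolate(0.0, normalized=True).coords[0] for t in trips])
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(starts)
    kept = [t for t, lbl in zip(trips, labels) if lbl != -1]

    # Walk each kept trip in 1% increments and average the interpolated points.
    averaged = []
    for frac in np.linspace(0.0, 1.0, n_points):
        pts = np.array([t.interpolate(frac, normalized=True).coords[0] for t in kept])
        averaged.append(pts.mean(axis=0))           # centroid of points at this fraction
    return LineString(averaged)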

Keywords: ferry vessels, transportation, modeling, AIS data

Procedia PDF Downloads 180
24669 Power Transformer Risk-Based Maintenance by Optimization of Transformer Condition and Transformer Importance

Authors: Kitti Leangkrua

Abstract:

This paper presents a risk-based maintenance strategy for power transformers in order to optimize operating and maintenance costs. The methodology involves the study and preparation of a database for the collection of technical data and test data for each power transformer. The overall condition of each transformer is evaluated by a purpose-built program based on the measured results; in addition, the program combines the condition of the main components into the overall transformer condition (%HI) and applies criteria for evaluating the importance (%ImI) of each location where the transformer is installed. The condition assessment is performed by analyzing test data such as electrical tests, insulating oil tests, and visual inspection. The condition of each power transformer is classified from very poor to very good. Importance is evaluated from load criticality, importance of load, and failure consequence. A risk matrix is developed for evaluating the risk of each power transformer, and the highest-risk transformers are addressed first. A computerized program was developed for practical use, so that the maintenance strategy of a power transformer can be effectively managed.
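
A minimal sketch of how a health index and an importance index might be combined into a risk class is shown below; the thresholds, weights, and labels are illustrative assumptions, not the indices or criteria used in the paper.

# Minimal sketch of combining a health index (%HI) and an importance index (%ImI)
# into a risk class; thresholds are assumptions, not the paper's values.
def risk_category(health_index, importance_index):
    """Map %HI (assumed 0 = very poor, 100 = very good) and %ImI (0-100) to a risk class."""
    condition = "poor" if health_index < 50 else "good"
    importance = "high" if importance_index >= 50 else "low"
    if condition == "poor" and importance == "high":
        return "high risk - maintain first"
    if condition == "poor" or importance == "high":
        return "medium risk"
    return "low risk"

# Example: a transformer in fair condition serving a critical load.
print(risk_category(health_index=42, importance_index=80))  # -> high risk - maintain first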

Keywords: asset management, risk-based maintenance, power transformer, health index

Procedia PDF Downloads 311
24668 An Assessment of Existing Material Management Process in Building Construction Projects in Nepal

Authors: Uttam Neupane, Narendra Budha, Subash Kumar Bhattarai

Abstract:

Material management is an essential part of construction project management. There are a number of material management problems in the Nepalese construction industry, which contribute to an inefficient material management system. Ineffective material management can waste time and money, thus increasing the problem of time and cost overruns. An assessment of the material management system, together with its gaps and possible solutions, was carried out on 20 construction projects implemented by the Federal Level Project Implementation Unit (FPIU) in the Kaski district of Nepal. To improve the material management process, the respondents provided possible solutions to overcome the gaps seen in the current process. These include preparing a material schedule in line with the construction schedule for material requirement planning; verifying materials and locating their sources; purchasing the required materials in advance of the commencement of work; classifying materials and managing the inventory based on their usage value; and eliminating or reducing wastage throughout the material management process.

Keywords: material management, construction site, inventory, construction project

Procedia PDF Downloads 76
24667 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data

Authors: Michelangelo Sofo, Giuseppe Labianca

Abstract:

In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. For nutritional biologists, the curative process, based on the analysis of clinical data, is a very delicate operation, because there are multiple options for managing pathologies in the food sector (for example, intolerances and allergies, cholesterol metabolism, diabetic pathologies, arterial hypertension, and even obesity and breathing and sleep problems). In this research work, a system was created that is capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and was built around the real working needs of an expert in human nutrition using human-centred design (ISO 9241-210); it therefore keeps pace with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists and their patients of both sexes (from 18 years of age), developed with an agile methodology. Its task is to draw up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to nutritional data, together with a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. Furthermore, the system has multiple software modules based on time series and visual analytics techniques that allow the complete picture of the situation, and the evolution of the diet assigned for specific pathologies, to be evaluated.

Keywords: medical decision support, physiological data extraction, data driven diagnosis, human centered AI, symbiotic AI paradigm

Procedia PDF Downloads 31
24666 Hybrid Collaborative-Context Based Recommendations for Civil Affairs Operations

Authors: Patrick Cummings, Laura Cassani, Deirdre Kelliher

Abstract:

In this paper, we present findings from a research effort to apply a hybrid collaborative-context approach to a system focused on Marine Corps civil affairs data collection, aggregation, and analysis called the Marine Civil Information Management System (MARCIMS). The goal of this effort is to provide operators with information to make sense of the interconnectedness of entities and relationships in their area of operation and to discover existing data to support civil military operations. Our approach to building a recommendation engine was designed to overcome several technical challenges, including 1) ensuring models were robust to the relatively small amount of data collected by the Marine Corps civil affairs community; 2) finding methods to recommend novel data for which there are no interactions captured; and 3) overcoming confirmation bias by ensuring content was recommended that was relevant to the mission despite being obscure or less well known. We address these challenges by implementing a combination of collective matrix factorization (CMF) and graph-based random walks to provide recommendations to civil military operations users. We also present a method to resolve the challenge of computational complexity inherent in highly connected nodes through a precomputed process.
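
The graph-based half of the hybrid approach can be illustrated with a small random-walk-with-restart sketch in Python; the adjacency matrix, restart probability, and iteration count are illustrative assumptions, and this is not the MARCIMS implementation.

# Sketch of scoring nodes by a random walk with restart over an item graph,
# which surfaces novel but well-connected items for recommendation.
import numpy as np

def random_walk_scores(adjacency, seed_idx, restart=0.15, n_iter=50):
    """Score nodes by their steady-state visit probability from a seed node."""
    A = np.asarray(adjacency, dtype=float)
    P = A / A.sum(axis=1, keepdims=True)          # row-normalised transition matrix
    scores = np.zeros(len(A))
    scores[seed_idx] = 1.0
    seed = scores.copy()
    for _ in range(n_iter):
        scores = (1 - restart) * P.T @ scores + restart * seed
    return scores

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 0],
                [0, 1, 0, 0]])
print(random_walk_scores(adj, seed_idx=0))  # higher score = stronger recommendation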

Keywords: recommendation engine, collaborative filtering, context-based recommendation, graph analysis, coverage, civil affairs operations, Marine Corps

Procedia PDF Downloads 128
24665 An Exploratory Study on the Impact of Video-stimulated Reflection on Novice EFL Teachers’ Professional Development

Authors: Ibrahima Diallo

Abstract:

The literature on teacher education foregrounds reflection as an important aspect of professional practice. Reflection, for a teacher, consists in retrospectively and critically analysing and evaluating a lesson to see what worked, what did not work, and how to improve it in the future. Many teacher education programmes worldwide now consider the ability to reflect one of the hallmarks of an effective educator. However, in some contexts, such as Senegal, reflection has not been given due consideration in teacher education programmes. In contexts where it has been part of the education landscape for some time, reflection is mostly depicted as an individual written activity, and many teacher trainees have become disenchanted by repeated enactments of a task intended solely to satisfy course requirements. This has resulted in whitewashing weaknesses or even ‘faking’ reflection. Besides, the “one-size-fits-all” approach to reflection could not flourish because how reflection impacts practice is still unproven. Therefore, reflective practice needs to be contextualised and made more thought-provoking through dialogue and by using classroom data. There is also a need to highlight the change brought about in teachers’ practice through reflection. This study therefore introduces reflection in a new context and aims to show evidenced change in novice EFL teachers’ practice through dialogic, data-led reflection. The purpose of this study is also to contribute to the scarce literature on reflection in sub-Saharan Africa by bringing new perspectives on contextualised, teacher-led reflection. Eight novice EFL teachers participated in this qualitative longitudinal study, and data were gathered online through post-lesson reflection recordings and lesson videos over a period of four months. The data were then thematically analysed using NVivo to systematically organize and manage the large amount of data, following the six-step approach to thematic analysis. Major themes related to teachers’ classroom practice and their conception of reflection emerged from the analysis. The results showed that post-lesson reflection with a peer can help novice EFL teachers gain more awareness of their classroom practice. Dialogic reflection also helped them evaluate their lessons and seek improvement. The analysis also gave insight into teachers’ conception of reflection in an EFL context: teachers were more engaged in reflection when using their lesson video recordings, and change in teaching behaviour as a result of reflection was evidenced by the analysis of the lesson video recordings. This study has shown that video-stimulated reflection is a practical form of professional development that can be embedded in teachers’ professional lives.

Keywords: novice EFL teachers, practice, professional development, video-stimulated reflection

Procedia PDF Downloads 101
24664 Ontology-Based Approach for Temporal Semantic Modeling of Social Networks

Authors: Souâad Boudebza, Omar Nouali, Faiçal Azouaou

Abstract:

Social networks have recently gained growing interest on the web. Traditional formalisms for representing social networks are static and suffer from a lack of semantics. In this paper, we show how semantic web technologies can be used to model social data. The SemTemp ontology aligns and extends existing ontologies such as FOAF, SIOC, SKOS, and OWL-Time to provide a temporal and semantically rich description of social data. We also present a modeling scenario to illustrate how our ontology can be used to model social networks.
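
As a hedged illustration of temporally annotating a social relation with semantic web tooling, the Python/rdflib sketch below reuses FOAF and OWL-Time; the semtemp namespace, class, and property names are invented placeholders, since the SemTemp ontology itself is only described, not reproduced, here.

# Illustrative sketch: a temporally annotated social relation built with rdflib.
from rdflib import Graph, Namespace, Literal, URIRef
from rdflib.namespace import FOAF, RDF, XSD

SEMTEMP = Namespace("http://example.org/semtemp#")   # placeholder namespace, not the real ontology
TIME = Namespace("http://www.w3.org/2006/time#")     # W3C OWL-Time

g = Graph()
alice = URIRef("http://example.org/alice")
bob = URIRef("http://example.org/bob")
relation = URIRef("http://example.org/rel1")

g.add((alice, RDF.type, FOAF.Person))
g.add((bob, RDF.type, FOAF.Person))
g.add((relation, RDF.type, SEMTEMP.TemporalRelation))   # placeholder class name
g.add((relation, SEMTEMP.involves, alice))              # placeholder property name
g.add((relation, SEMTEMP.involves, bob))
g.add((relation, TIME.hasBeginning, Literal("2015-01-01", datatype=XSD.date)))

print(g.serialize(format="turtle"))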

Keywords: ontology, semantic web, social network, temporal modeling

Procedia PDF Downloads 393
24663 Customer Satisfaction and Retention Strategies in Marketing

Authors: Hassan Adedoyin Rasaq

Abstract:

The marketing efforts of present-day businesses are not just geared towards meeting consumers’ needs at a price, but towards ensuring good customer satisfaction and strategizing on how to retain such customers. Customer satisfaction and retention are achievable through the coordination of the marketing mixes (product, price, promotion, and place), relationship marketing, after-sales service, rebate/discount/price-reduction policies, and total quality management (TQM). A first-time customer, if well satisfied, will become a company’s repeat customer, proceed to become a client, and go further to become an advocate of the company by applauding the company’s products/services and encouraging others to buy from it. The objective of this paper, therefore, is to guide business organizations on how to enhance customer satisfaction and retain existing customers as a means of long-term survival in marketing. The responses of 72 randomly selected marketing personnel spread across three food and beverage companies in Nigeria were analyzed. One hypothesis was tested using a one-way analysis of variance (ANOVA), and it was found that relationship marketing contributed to organizational profitability and growth.
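
A minimal sketch of a one-way ANOVA of the kind used in the study is shown below; the response values are synthetic stand-ins, not the survey data from the three companies.

# One-way ANOVA comparing a hypothetical response across three companies.
from scipy import stats

company_a = [4.1, 3.8, 4.5, 4.0, 3.9]   # hypothetical Likert-style responses
company_b = [3.2, 3.5, 3.0, 3.6, 3.4]
company_c = [4.4, 4.6, 4.2, 4.5, 4.3]

f_stat, p_value = stats.f_oneway(company_a, company_b, company_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # a small p-value would reject the null hypothesis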

Keywords: customer satisfaction, retention strategies, marketing, marketing mixes

Procedia PDF Downloads 557
24662 Consideration of Starlight Waves Redshift as Produced by Friction of These Waves on Its Way through Space

Authors: Angel Pérez Sánchez

Abstract:

In 1929, a redshift of light was discovered in distant galaxies and was interpreted as being produced by galaxies moving away from each other at high speed. This interpretation led to the consideration of a new source of energy, which was called dark energy. Redshift is a loss of light wave frequency produced by galaxies moving away at high speed, but a loss of frequency could also be produced by the friction of light waves on their way to Earth. Such friction is impossible if outer space is empty, but if it were not empty and a medium existed in this space, it would be possible. The consequences would be extraordinary, because the acceleration of the Universe and dark energy would be put in doubt. This article presents evidence that empty space is actually a medium occupied by different particles, among which the most significant would be the graviton or the Higgs boson, because gravity also affects empty space.

Keywords: Big Bang, dark energy, doppler effect, redshift, starlight frequency reduction, universe acceleration

Procedia PDF Downloads 67
24661 Ontology-Based Backpropagation Neural Network Classification and Reasoning Strategy for NoSQL and SQL Databases

Authors: Hao-Hsiang Ku, Ching-Ho Chi

Abstract:

Big data applications have become imperative in many fields, and many researchers have devoted themselves to increasing classification accuracy and reducing time complexity. Hence, this study designs and proposes an ontology-based backpropagation neural network classification and reasoning strategy for NoSQL big data applications, called ON4NoSQL. ON4NoSQL is responsible for enhancing the performance of classification in NoSQL and SQL databases in order to build mass behavior models. Mass behavior models are built with MapReduce techniques and the Hadoop distributed file system on the Hadoop service platform. The reference engine of ON4NoSQL is the ontology-based backpropagation neural network classification and reasoning strategy. Simulation results indicate that ON4NoSQL can efficiently construct a high-performance environment for data storing, searching, and retrieving.
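
As a generic illustration of the backpropagation-network classification step (not the ON4NoSQL system or its Hadoop pipeline), the following sketch trains a small multilayer perceptron on synthetic data; the layer sizes and dataset are assumptions.

# Generic backpropagation classifier sketch on synthetic data.
from sklearn.neural_network import MLPClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_train, y_train)                      # weights learned by backpropagation
print("accuracy:", clf.score(X_test, y_test))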

Keywords: Hadoop, NoSQL, ontology, back propagation neural network, Hadoop distributed file system

Procedia PDF Downloads 264
24660 Biophysically Motivated Phylogenies

Authors: Catherine Felce, Lior Pachter

Abstract:

Current methods for building phylogenetic trees from gene expression data consider mean expression levels. With single-cell technologies, we can leverage more information about cell dynamics by considering the entire distribution of gene expression across cells. Using biophysical modeling, we propose a method for constructing phylogenetic trees from scRNA-seq data, building on Felsenstein's method of continuous characters. This method can highlight genes whose level of expression may be unchanged between species, but whose rates of transcription/decay may have evolved over time.

Keywords: phylogenetics, single-cell, biophysical modeling, transcription

Procedia PDF Downloads 64
24659 Open Educational Resource in Online Mathematics Learning

Authors: Haohao Wang

Abstract:

Technology, particularly multimedia in Open Educational Resources, can contribute positively to student performance in an online instructional environment. Student performance data from the past four years were obtained from an online course entitled Applied Calculus (MA139). This paper examined the data to determine whether multimedia (the independent variable) had any impact on student performance (the dependent variable) in online math learning, and how students felt about the value of the technology. Two groups of student data were analyzed: group 1 (control), from the online applied calculus course that did not use multimedia instructional materials, and group 2 (treatment), from the same online applied calculus course that used multimedia instructional materials. For the MA139 class, results indicate a statistically significant difference (p = .001) between the two groups, with group 1 having a final score mean of 56.36 (out of 100) and group 2 a mean of 70.68. Additionally, student testimonials are discussed in which students shared their experience of learning applied calculus online with multimedia instructional materials.
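
A hedged sketch of the two-group comparison is given below; an independent-samples t-test is one plausible way to perform such a comparison, and the score arrays are synthetic stand-ins drawn around the reported means, not the actual MA139 data.

# Two-group comparison on synthetic scores centred on the reported means.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control   = rng.normal(56.36, 15, size=60)   # group 1: no multimedia materials
treatment = rng.normal(70.68, 15, size=60)   # group 2: multimedia materials

t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # a small p-value suggests a significant difference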

Keywords: online learning, open educational resources, multimedia, technology

Procedia PDF Downloads 380
24658 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel

Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani

Abstract:

Agent-based systems technology has been put forward as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments to solve problems. Nevertheless, it is impractical to rely on a single agent to do all the computing in solving complex problems, and an increasing number of applications require multiple agents to work together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems beyond the individual capacities or knowledge of each problem solver. However, the network of agents still requires a main system to govern or oversee their operation in order to achieve a unified goal. We developed a fleet management system (FMS) to manage the fleet of agents, plan routes for the agents, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS is able to perform real-time data processing, communicate with the autonomous surface vehicle (ASV) agents, and generate a bathymetric map from the data received from each ASV unit. The first algorithm communicates with the ASVs via radio using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm takes care of path planning, formation, and pattern generation, and is tested using various sample data. Lastly, the bathymetry map generation algorithm uses the data collected by the agents to create a bathymetry map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
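
The NMEA communication step can be illustrated with a short Python sketch; the GGA field layout follows the public NMEA 0183 convention, but the sentence values are made up and this is not the FMS code itself.

# Build and parse a GGA-style NMEA sentence; the checksum is the XOR of the body.
def nmea_checksum(body: str) -> str:
    """XOR of all characters between '$' and '*', as two hex digits."""
    csum = 0
    for ch in body:
        csum ^= ord(ch)
    return f"{csum:02X}"

def parse_gga(sentence: str) -> dict:
    body, received = sentence.lstrip("$").split("*")
    fields = body.split(",")
    return {
        "valid": nmea_checksum(body) == received.strip(),
        "time_utc": fields[1],
        "latitude": fields[2] + fields[3],
        "longitude": fields[4] + fields[5],
        "num_satellites": fields[7],
    }

body = "GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,"
sentence = f"${body}*{nmea_checksum(body)}"     # well-formed by construction
print(parse_gga(sentence))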

Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry

Procedia PDF Downloads 275
24657 The Role of Big Data Analytics and Corporate Social Responsibility in Driving Green Innovation

Authors: Abdeslam Hassani

Abstract:

This study addresses the increasing environmental concerns faced by businesses due to regulatory and stakeholder pressures. It explores how big data analytics (BDA) and advanced technologies, particularly artificial intelligence, combined with corporate social responsibility (CSR), can foster green innovation and sustainable practices. The research builds on existing literature, highlighting the critical role of technologies and CSR in achieving sustainability goals. This research adopts a multidimensional approach, offering a more comprehensive understanding of the interplay between technologies, governance, and environmental policies. A qualitative methodology was chosen, involving a systematic literature review and semi-structured interviews with executives from Canadian companies. NVivo software will be used to analyze interview data, ensuring a rigorous approach to identifying key contextual factors. The cross-analysis of literature findings and interview insights will help validate theoretical constructs and develop a conceptual framework. This study contributes by providing both theoretical insights and practical recommendations. It offers executives actionable guidance on integrating CSR into strategic decision-making and aligning technological capabilities with sustainability objectives. This approach aims to improve firms’ competitiveness, ensure regulatory compliance, and enhance their role in promoting green innovation.

Keywords: big data analytics, corporate social responsibility, green innovation, advanced technology

Procedia PDF Downloads 7
24656 Investigations into the in situ Enterococcus faecalis Biofilm Removal Efficacies of Passive and Active Sodium Hypochlorite Irrigant Delivered into Lateral Canal of a Simulated Root Canal Model

Authors: Saifalarab A. Mohmmed, Morgana E. Vianna, Jonathan C. Knowles

Abstract:

The issue of apical periodontitis has received considerable critical attention. Bacteria integrate into communities, attach to surfaces, and consequently form biofilms. The biofilm structure provides bacteria with a series of protective mechanisms against antimicrobial agents and enhances pathogenicity (e.g. apical periodontitis). Sodium hypochlorite (NaOCl) has become the irrigant of choice for eliminating bacteria from the root canal system on the basis of its antimicrobial performance. The aim of the study was to investigate the effect of different agitation techniques on the efficacy of 2.5% NaOCl in eliminating biofilm from the surface of the lateral canal, using residual biofilm and the removal rate of biofilm as outcome measures. The effect of canal complexity (the lateral canal) on the efficacy of the irrigation procedure was also assessed. Forty root canal models (n = 10 per group) were manufactured using 3D printing and resin materials. Each model consisted of two halves of an 18 mm length root canal with apical size 30 and taper 0.06, and a lateral canal of 3 mm length and 0.3 mm diameter located 3 mm from the apical terminus. E. faecalis biofilms were grown on the apical 3 mm and the lateral canal of the models for 10 days in Brain Heart Infusion broth. Biofilms were stained using crystal violet for visualisation. The model halves were reassembled, attached to an apparatus, and tested under a fluorescence microscope. A syringe and needle irrigation protocol was performed using 9 mL of 2.5% NaOCl irrigant for 60 seconds. The irrigant was either left stagnant in the canal or activated for 30 seconds using manual (gutta-percha), sonic, or ultrasonic methods. Images were then captured every second using an external camera. The percentages of residual biofilm were measured using image analysis software, and the data were analysed using generalised linear mixed models. The greatest removal was associated with the ultrasonic group (66.76%), followed by the sonic (45.49%), manual (43.97%), and passive irrigation (control) (38.67%) groups, respectively. No marked reduction in the efficiency of NaOCl in removing biofilm was found between the simple and complex anatomy models (p = 0.098). The removal efficacy of NaOCl on the biofilm was limited to the 1 mm level of the lateral canal. Agitation of NaOCl results in better penetration of the irrigant into the lateral canals, and ultrasonic agitation improved the removal of bacterial biofilm.

Keywords: 3D printing, biofilm, root canal irrigation, sodium hypochlorite

Procedia PDF Downloads 233
24655 Cervical Cerclage and Neonatal Death

Authors: Zinah Jabbar Mohammed Alrubaye

Abstract:

Objective: The purpose of this study was to compare the efficacy of prophylactic and rescue cervical cerclage for pregnant patients with an incompetent cervix, and to assess the neonatal outcomes of both clinical conditions. Methods: This was a retrospective observational study of all women who had an elective or rescue cerclage between January 2008 and December 2016 in our hospital. Prophylactic cerclage was defined as a cerclage placed before 16 weeks of gestation, while rescue cerclages were performed between 16 and 23 weeks of gestation. Results: In total, we analyzed the outcomes of 212 cervical interventions; 71% of the recruited patients underwent prophylactic cerclage, while 29% underwent rescue cerclage. Most of the patients delivered vaginally (70%) and were able to leave the hospital with a healthy newborn (78%). The mean pregnancy prolongation times after cerclage in the prophylactic and rescue groups were 21 weeks and 10 weeks, respectively. Conclusion: Prophylactic cerclage is most likely to be associated with a reduction in fetal demise because of the correlation between fetal prognosis and the gestational age at which cerclage is performed. Once the diagnosis of cervical insufficiency is confirmed, cerclage should be recommended, as this will help to prolong the pregnancy.

Keywords: cervical, neonate, cerclage, cervix

Procedia PDF Downloads 58
24654 Numerical Investigation of Seismic Behaviour of Building

Authors: Tinebeb Tefera Ashene

Abstract:

Glass facade systems have gained popularity in recent times. During an earthquake, building frames suffer large inter-story drifts, causing racking of building facade systems. Facade systems are highly vulnerable and fail more frequently than the buildings themselves, with significant devastating effects. The use of metallic yield damper (added damping and stiffness) connections is proposed in this study to mitigate these problems. Results showed that, compared to the control, metallic yield damper (added damping and stiffness) connections reduced connection deformation and axial force, differential displacement between frame and facade, and facade distortion by 44.35%, 43.33%, and 51.45%, respectively. Employing the proposed energy-absorbing connections also reduced inter-story link joint drift by 71.11% and mitigated detrimental seismic effects on the entire building facade system.

Keywords: damper, energy dissipation, metallic yield, facades

Procedia PDF Downloads 57
24653 How HOXA Gene Expression Is Implicated in the Treatment Resistance and Poor Prognosis in Glioblastoma

Authors: Naomi Seidu, Edward Poluyi, Chibuikem Ikwuegbuenyi, Eghosa Morgan

Abstract:

The current poor prognosis of glioblastoma has created the need for improved treatment methods in order to increase its survival rate. Despite the different interventions currently available for this tumor, the average survival is still only a few months (12-15). The aim is to create a more favorable prognosis and to reduce the resistance to treatment currently experienced even with surgical intervention and chemotherapy. The available literature indicates a relationship between the presence of HOX (homeobox) genes and glioblastoma, which could be attributable to increasing treatment resistance; hence, silencing these genes could be key to improving the survival rate of glioblastoma. A series of studies have highlighted the role that HOX genes play in glioblastoma prognosis: promotion of human glioblastoma initiation, aggressiveness, and resistance to temozolomide has been associated with HOXA9. The role of HOX gene expression in cancer stem cells (CSCs) should be studied, as it could provide a means of designing CSC-targeted therapies, given that CSCs play a part in the initiation and progression of solid tumors.

Keywords: GBM (glioblastoma), HOXA gene (homeobox gene cluster), signaling pathways, temozolomide

Procedia PDF Downloads 109
24652 An AI-Based Dynamical Resource Allocation Calculation Algorithm for Unmanned Aerial Vehicle

Authors: Zhou Luchen, Wu Yubing, Burra Venkata Durga Kumar

Abstract:

As networks become larger and more complex, the density of user devices is also increasing. Unmanned Aerial Vehicle (UAV) networks are able to collect and transform data efficiently using software-defined networking (SDN) technology. This paper proposes a three-layer distributed and dynamic cluster architecture for managing UAVs, using an AI-based resource allocation algorithm to address the problem of network overloading. By separating the services of each UAV, the hierarchical UAV cluster system performs the main function of reducing the network load and transferring user requests, with three sub-tasks: data collection, communication channel organization, and data relaying. In each cluster, a head node and a vice head node UAV are selected by considering the devices' Central Processing Unit (CPU), operational (RAM) and permanent (ROM) memory, battery charge, and capacity. The vice head node acts as a backup that stores all the data held by the head node. The k-means clustering algorithm is used to detect high-load regions and form the layered UAV clusters. The whole process of detecting high-load areas, forming and selecting UAV clusters, and moving the selected UAV cluster to that area is proposed as the traffic offloading algorithm.
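
A small Python sketch of the clustering and head-selection idea is given below: k-means groups user positions into high-load regions, and a weighted resource score picks the head and vice head UAV. The weights, resource fields, and data are illustrative assumptions, not the paper's algorithm.

# k-means over user positions plus a weighted score for head/vice-head selection.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
user_positions = rng.uniform(0, 1000, size=(200, 2))      # ground users (metres)
regions = KMeans(n_clusters=3, n_init=10, random_state=1).fit(user_positions)

uavs = [  # per-UAV resources: CPU (GHz), RAM (GB), ROM (GB), battery (%)
    {"id": "uav-1", "cpu": 1.5, "ram": 4, "rom": 32, "battery": 80},
    {"id": "uav-2", "cpu": 2.0, "ram": 8, "rom": 64, "battery": 60},
    {"id": "uav-3", "cpu": 1.2, "ram": 2, "rom": 16, "battery": 95},
]

def score(u):  # simple weighted sum; the weights are illustrative
    return 0.3 * u["cpu"] + 0.3 * u["ram"] / 8 + 0.1 * u["rom"] / 64 + 0.3 * u["battery"] / 100

ranked = sorted(uavs, key=score, reverse=True)
head, vice_head = ranked[0], ranked[1]          # the vice head mirrors the head's data
print("cluster centres:", regions.cluster_centers_.round(1))
print("head:", head["id"], "vice head:", vice_head["id"])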

Keywords: k-means, resource allocation, SDN, UAV network, unmanned aerial vehicles

Procedia PDF Downloads 120
24651 Deep Learning with Noisy Labels: Learning True Labels as a Discrete Latent Variable

Authors: Azeddine El-Hassouny, Chandrashekhar Meshram, Geraldin Nanfack

Abstract:

In recent years, learning from data with noisy labels (label noise) has been a major concern in supervised learning. This problem has become even more worrying in deep learning, where generalization capabilities have lately been questioned. Indeed, deep learning requires a large amount of data, generally collected by search engines, which frequently return data with unreliable labels. In this paper, we investigate label noise in deep learning using variational inference. Our contributions are: (1) exploiting the label noise concept, where the true labels are learnt using reparameterization variational inference while observed labels are learnt discriminatively; (2) learning the noise transition matrix during training without any particular process, neither heuristic nor preliminary phases. The theoretical results show how the true label distribution can be learned by variational inference in any discriminative neural network, and the effectiveness of our approach is demonstrated on several target datasets, such as MNIST and CIFAR32.
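
For context, the sketch below shows the forward-correction use of a noise transition matrix, a related technique rather than the paper's variational inference procedure: class probabilities for the true label are pushed through T to obtain probabilities over the observed (noisy) labels. The matrix values here are assumed, not learned as in the paper.

# Forward correction: P(noisy label | x) = P(true label | x) @ T.
import numpy as np

T = np.array([[0.8, 0.1, 0.1],      # T[i, j] = P(observed label j | true label i)
              [0.1, 0.8, 0.1],
              [0.2, 0.1, 0.7]])

def forward_correct(clean_probs, transition):
    """Turn P(true label | x) into P(noisy label | x)."""
    return clean_probs @ transition

clean = np.array([0.7, 0.2, 0.1])   # a network's softmax output for one sample
noisy = forward_correct(clean, T)
print(noisy, noisy.sum())           # still a valid distribution (sums to 1)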

Keywords: label noise, deep learning, discrete latent variable, variational inference, MNIST, CIFAR32

Procedia PDF Downloads 132
24650 Multimodal Database of Retina Images for Africa: The First Open Access Digital Repository for Retina Images in Sub Saharan Africa

Authors: Simon Arunga, Teddy Kwaga, Rita Kageni, Michael Gichangi, Nyawira Mwangi, Fred Kagwa, Rogers Mwavu, Amos Baryashaba, Luis F. Nakayama, Katharine Morley, Michael Morley, Leo A. Celi, Jessica Haberer, Celestino Obua

Abstract:

Purpose: The main aim in creating the Multimodal Database of Retinal Images for Africa (MoDRIA) was to provide a publicly available repository of retinal images for responsible researchers to conduct algorithm development, in a bid to curb the challenges of ophthalmic artificial intelligence (AI) in Africa. Methods: Data and retina images were ethically sourced from sites in Uganda and Kenya. Data on medical history, visual acuity, ocular examination, blood pressure, and blood sugar were collected. Retina images were captured using fundus cameras (Foru3-nethra and Canon CR-Mark-1) and stored on a secure online database. Results: The database consists of 7,859 retinal images in portable network graphics format from 1,988 participants. Images from patients with human immunodeficiency virus made up 18.9% of the total, 18.2% of images were from hypertensive patients, 12.8% from diabetic patients, and the rest from ‘normal’ participants. Conclusion: Publicly available data repositories are a valuable asset in the development of AI technology. There is therefore a need to expand MoDRIA so as to provide larger datasets that are more representative of Sub-Saharan data.

Keywords: retina images, MoDRIA, image repository, African database

Procedia PDF Downloads 136
24649 Jointly Learning Python Programming and Analytic Geometry

Authors: Cristina-Maria Păcurar

Abstract:

The paper presents an original Python-based application that outlines the advantages of combining elementary notions of mathematics with the study of a programming language. The application covers some of the first lessons of analytic geometry, namely conics and quadrics and their reduction to standard form, as well as some related notions. The chosen programming language is Python, not only for its syntax, which is closer to everyday language and therefore more readable, but also for its highly reusable code, which is of utmost importance to a mathematician accustomed to exploiting already known problems to solve new ones. The purpose of this paper is, on the one hand, to support the idea that one of the most appropriate means of initiating someone into programming is through mathematics and, conversely, that one of the easiest and handiest ways to assimilate basic knowledge in mathematics is to apply it in a personal project. On the other hand, besides being a means of learning both programming and analytic geometry, the application described in this paper is itself a useful tool, as it can be seen as an independent, original Python package for analytic geometry.
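
A toy example of the kind of reduction the package performs is sketched below: diagonalising the quadratic-form matrix of a central conic with numpy. This is an illustrative fragment under that assumption, not the package described in the paper.

# Reduce a central conic a*x^2 + b*x*y + c*y^2 = 1 to standard form.
import numpy as np

def reduce_central_conic(a, b, c):
    """Return the eigenvalues (coefficients on the rotated axes) and the rotation matrix."""
    A = np.array([[a, b / 2],
                  [b / 2, c]])
    eigvals, eigvecs = np.linalg.eigh(A)        # symmetric matrix -> real eigenvalues
    return eigvals, eigvecs

lam, R = reduce_central_conic(a=5, b=4, c=5)    # 5x^2 + 4xy + 5y^2 = 1
print(lam)   # [3. 7.] -> standard form 3*u^2 + 7*v^2 = 1 (an ellipse)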

Keywords: analytic geometry, conics, python, quadrics

Procedia PDF Downloads 302
24648 Assessment of Environmental Quality of an Urban Setting

Authors: Namrata Khatri

Abstract:

The rapid growth of cities is transforming the urban environment and posing significant challenges for environmental quality. This study examines the urban environment of Belagavi in Karnataka, India, using geostatistical methods to assess the spatial pattern and land use distribution of the city and to evaluate the quality of the urban environment. The study is driven by the necessity to assess the environmental impact of urbanisation. Satellite data was utilised to derive information on land use and land cover. The investigation revealed that land use had changed significantly over time, with a drop in plant cover and an increase in built-up areas. High-resolution satellite data was also utilised to map the city's open areas and gardens. GIS-based research was used to assess public green space accessibility and to identify regions with inadequate waste management practises. The findings revealed that garbage collection and disposal techniques in specific areas of the city needed to be improved. Moreover, the study evaluated the city's thermal environment using Landsat 8 land surface temperature (LST) data. The investigation found that built-up regions had higher LST values than green areas, pointing to the city's urban heat island (UHI) impact. The study's conclusions have far-reaching ramifications for urban planners and politicians in Belgaum and other similar cities. The findings may be utilised to create sustainable urban planning strategies that address the environmental effect of urbanisation while also improving the quality of life for city dwellers. Satellite data and high-resolution satellite pictures were gathered for the study, and remote sensing and GIS tools were utilised to process and analyse the data. Ground truthing surveys were also carried out to confirm the accuracy of the remote sensing and GIS-based data. Overall, this study provides a complete assessment of Belgaum's environmental quality and emphasizes the potential of remote sensing and geographic information systems (GIS) approaches in environmental assessment and management.

Keywords: environmental quality, UEQ, remote sensing, GIS

Procedia PDF Downloads 84
24647 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton

Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani

Abstract:

Extreme data can occur in an observation due to unusual circumstances. Such data can provide important information that cannot be provided by other data, so its existence needs to be investigated further. One method for obtaining extreme data is the block maxima method, and the distribution of extreme data sets taken with the block maxima method is called the extreme value distribution. The extreme value distribution considered here is the Gumbel distribution with two parameters. Maximum likelihood (ML) estimates of the Gumbel parameters are difficult to determine exactly, so a numerical approximation is necessary. The purpose of this study was to determine the parameter estimates of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has a double exponential form. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the change in parameter values at each iteration; it is then modified with the addition of a step length to guarantee convergence when the second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is further modified by updating an approximation of the second derivative at each iteration, so that only the gradient vector and an approximate Hessian matrix are needed. The parameter estimation of the Gumbel distribution by this numerical approach is done by finding the parameter values that maximize the likelihood function. This research combines theory, drawn from several journals and textbooks, with application. The results of this study are the quasi-Newton BFGS algorithm and the resulting estimates of the Gumbel distribution parameters; the estimation method is applied to daily rainfall data for Purworejo District to estimate the distribution parameters. The results indicate that the intensity of the heavy rainfall that occurred in Purworejo District has decreased, and the range of rainfall has also decreased.
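
The estimation idea can be sketched in Python by minimising the Gumbel negative log-likelihood with SciPy's BFGS routine; the synthetic data stand in for the Purworejo rainfall series, and optimising log(beta) to keep the scale parameter positive is an implementation choice assumed here.

# Gumbel maximum likelihood estimation via the BFGS quasi-Newton method.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

data = gumbel_r.rvs(loc=30.0, scale=8.0, size=200, random_state=42)  # synthetic block maxima

def neg_log_likelihood(params, x):
    mu, log_beta = params
    beta = np.exp(log_beta)                 # keeps beta > 0 without constraints
    z = (x - mu) / beta
    return np.sum(np.log(beta) + z + np.exp(-z))

res = minimize(neg_log_likelihood,
               x0=np.array([np.mean(data), np.log(np.std(data))]),
               args=(data,), method="BFGS")
mu_hat, beta_hat = res.x[0], np.exp(res.x[1])
print(f"mu = {mu_hat:.2f}, beta = {beta_hat:.2f}")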

Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton

Procedia PDF Downloads 331
24646 U.S. Trade and Trade Balance with China: Testing for Marshall-Lerner Condition and the J-Curve Hypothesis

Authors: Anisul Islam

Abstract:

The U.S. has a very strong trade relationship with China, but with a large and persistent trade deficit. Some have argued that the undervalued Chinese yuan is to blame for the persistent trade deficit, but the empirical results are mixed at best. This paper empirically estimates the U.S. export and import functions for its trade with China, with the purpose of testing for the existence of the Marshall-Lerner (ML) condition as well as for the possible existence of the J-curve hypothesis. Annual export and import data are utilized for as long a period as the time series exist. The export and import functions are estimated using advanced econometric techniques, with appropriate diagnostic tests performed to examine the validity and reliability of the estimated results. The annual time series covers 1975 to 2022, a sample of 48 years and the longest period utilized in any previous study. The data are collected from several sources, such as the World Bank's World Development Indicators, IMF Financial Statistics, IMF Direction of Trade Statistics, and several others. The paper is expected to shed important light on the ongoing debate regarding the persistent U.S. trade deficit with China and the policies that may be useful in reducing such deficits over time. As such, the paper will be of great interest to academics, researchers, think tanks, global organizations, and policy makers in both China and the U.S.

Keywords: exports, imports, Marshall-Lerner condition, J-curve hypothesis, United States, China

Procedia PDF Downloads 70
24645 Analysis of Education Faculty Students’ Attitudes towards E-Learning According to Different Variables

Authors: Eyup Yurt, Ahmet Kurnaz, Ismail Sahin

Abstract:

The purpose of the study is to investigate education faculty students' attitudes towards e-learning according to different variables. In the current study, data were collected from 393 students of an education faculty in Turkey using the attitude towards e-learning scale and a demographic information form. The collected data were analyzed with t-tests, ANOVA, and the Pearson correlation coefficient. It was found that there is a significant gender difference in students' tendency towards e-learning and avoidance of e-learning: male students have more positive attitudes towards e-learning than female students. Also, students who used the internet less had higher levels of avoidance of e-learning. Additionally, a positive and significant relationship was found between the number of personal mobile learning devices and tendency towards e-learning, and a negative and significant relationship between the number of personal mobile learning devices and avoidance of e-learning. Finally, suggestions are presented based on the findings.

Keywords: education faculty students, attitude towards e-learning, gender, daily internet usage time, m-learning

Procedia PDF Downloads 313
24644 Physics-Informed Machine Learning for Displacement Estimation in Solid Mechanics Problem

Authors: Feng Yang

Abstract:

Machine learning (ML), especially deep learning (DL), has been extensively applied in recent years and has achieved great success in solving many problems, including scientific ones. However, conventional ML/DL methodologies are purely data-driven and have limitations, such as the need for an ample amount of labelled training data, a lack of consistency with physical principles, and a lack of generalizability to new problems and domains. Recently, there is a growing consensus that ML models need to take further advantage of prior knowledge to deal with these limitations. Physics-informed machine learning, which aims to integrate physics and domain knowledge into ML, has been recognized as an emerging area of research, especially in the last two to three years. In this work, physics-informed ML, specifically a physics-informed neural network (NN), is employed and implemented to estimate the displacements in the x, y, and z directions in a solid mechanics problem governed by equilibrium equations with boundary conditions. By incorporating the physics (i.e. the equilibrium equations) into the learning process of the NN, it is shown that the NN can be trained very efficiently with a small set of labelled training data. Experiments with different settings of the NN model and different amounts of labelled training data were conducted, and the results show that very high accuracy can be achieved both in satisfying the equilibrium equations and in predicting the displacements; e.g. with an overall displacement of 0.1, a root mean square error (RMSE) of 2.09 × 10⁻⁴ was achieved.
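
A toy one-dimensional analogue of the physics-informed training loss is sketched below: a network u(x) for a bar governed by u''(x) + b(x) = 0, trained on the equilibrium residual plus a small labelled (boundary) data term. The geometry, load, weights, and training settings are illustrative assumptions, not the paper's 3D solid mechanics setup.

# 1D physics-informed loss: data term + equilibrium residual, in PyTorch.
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x_phys = torch.linspace(0, 1, 50).reshape(-1, 1).requires_grad_(True)  # collocation points
x_data = torch.tensor([[0.0], [1.0]])            # labelled points (here: the boundaries)
u_data = torch.zeros_like(x_data)                # fixed ends: u(0) = u(1) = 0
b = lambda x: torch.ones_like(x)                 # uniform body load

for step in range(2000):
    opt.zero_grad()
    u = net(x_phys)
    du = torch.autograd.grad(u, x_phys, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x_phys, torch.ones_like(du), create_graph=True)[0]
    physics_loss = ((d2u + b(x_phys)) ** 2).mean()   # equilibrium residual
    data_loss = ((net(x_data) - u_data) ** 2).mean() # supervised (labelled) term
    loss = physics_loss + data_loss
    loss.backward()
    opt.step()

print(float(loss))  # a small value means the network fits both the physics and the data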

Keywords: deep learning, neural network, physics-informed machine learning, solid mechanics

Procedia PDF Downloads 154
24643 Effect of Freight Transport Intensity on Firm Performance: Mediating Role of Operational Capability

Authors: Bonaventure Naab Dery, Abdul Muntaka Samad

Abstract:

During the past two decades, huge population growth has been recorded in developing countries. This has led to an increase in the demand for transport services for people and merchandise. The study sought to examine the effect of freight transport intensity on firm performance; among other objectives, it examines the link between freight transport intensity and firm performance, the link between operational capability and firm performance, and the mediating role of operational capability in the relationship between freight transport intensity and firm performance. The study used a descriptive research design and a quantitative research approach, with questionnaires used for data collection through snowball and purposive sampling. SPSS and Mplus are being used to analyze the data. It is anticipated that, when the data are analyzed, they will validate the hypotheses proposed by the researchers. Based on the findings, relevant recommendations will be made for managerial implications and future studies.

Keywords: freight transport intensity, freight economy transport intensity, freight efficiency transport intensity, operational capability, firm performance

Procedia PDF Downloads 151
24642 The Impact of Artificial Intelligence on Quality Control and Quality

Authors: Mary Moner Botros Fanawel

Abstract:

Many companies use the statistical tool known as statistical quality control, which can be costly for the companies that adopt it. The evaluation of the quality of products and services is an important topic, but reducing the cost of implementing statistical quality control also has important benefits for companies. For this reason, it is important to adopt an economic design for the various steps included in statistical quality control. In this paper, we describe some relevant aspects of the economic design of a quality control chart for the proportion of defective items; these aspects are important because they can reduce the cost of implementing such a chart. Note that the main purpose of this chart is to evaluate and control the proportion of defective items in a production process.

Keywords: model predictive control, hierarchical control structure, genetic algorithm, water quality with DBPs objectives proportion, type I error, economic plan, distribution function bootstrap control limit, p-value method, out-of-control signals, p-value, quality characteristics

Procedia PDF Downloads 66
24641 A Descriptive Study of the Characteristics of Introductory Accounting Courses Offered by Community Colleges

Authors: Jonathan Nash, Allen Hartt, Catherine Plante

Abstract:

In many nations, community colleges, or similar institutions, play a crucial role in higher education. For example, in the United States more than half of all undergraduate students enroll in a community college at some point during their academic career. Similar statistics have been reported for Australia and Canada. Recognizing the important role these institutions play in educating future accountants, the American Accounting Association has called for research that contributes to a better understanding of these members of the academic community. Although previous literature has shown that community colleges and 4-year institutions differ on many levels, the extant literature has provided data on the characteristics of introductory accounting courses for four-year institutions but not for community colleges. We fill a void in the literature by providing data on the characteristics of introductory accounting courses offered by community colleges in the United States. Data are collected on several dimensions including: course size and staffing, pedagogical orientation, standardization of course elements, textbook selection, and use of technology-based course management tools. Many of these dimensions have been used in previous research examining four-year institutions thereby facilitating comparisons. The resulting data should be of interest to instructors, regulators and administrators, researchers, and the accounting profession. The data provide information on the introductory accounting courses completed by the average community college student which can help instructors identify areas where transfer students’ experiences might differ from their contemporaries at four-year colleges. Regulators and administrators may be interested in the differences between accounting courses offered by two- and four-year institutions when implementing standardized transfer programs. Researchers might use the data to motivate future research into whether differences between two- and four-year institutions affect outcomes like the probability of students choosing to major in accounting and their performance within the major. Accounting professionals may use our findings as a springboard for facilitating discussions related to the accounting labor supply.

Keywords: accounting curricula, community college, descriptive study, introductory accounting

Procedia PDF Downloads 105