Search results for: machine failures
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3305

1775 Evaluating and Reducing the Impact of Aircraft Technical Delays and Cancellations on Operational Reliability: A Case Study of an Airline Operator

Authors: Adel A. Ghobbar, Ahmad Bakkar

Abstract:

Although special care is given to maintenance, aircraft systems fail, and these failures cause delays and cancellations. The occurrence of delays and cancellations affects operators and manufacturers negatively. To reduce technical delays and cancellations, one should be able to determine the important systems causing them. The goal of this research is to find a method to identify the systems responsible for the most expensive delays and cancellations for airline operators. After carrying out research that identified the information relevant to the problems addressed in this paper, a predictive model was introduced to forecast failures and their impact. Data were obtained from the manufacturer's services reliability team database. Subsequently, evaluation methods for delays and cancellations were identified. No cost estimation methods were used due to their complexity. The model that was developed takes into account the frequency of delays and cancellations and uses weighting factors, based on customer experience, to indicate the severity of their duration. The data analysis showed that delay and cancellation events are not seasonal and do not follow any specific trend. The use of weighting factors does influence the shortlist over short (monthly) periods but not over the analyzed period of three years. Landing gear and the navigation system are among the top three systems causing delays and cancellations for all three aircraft types. The results confirmed that cooperation between certain operators and the manufacturer reduces the impact of delays and cancellations.
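
The abstract's scoring logic, frequency of events weighted by duration severity, can be sketched as follows; the system labels, duration bands, and weight values are illustrative assumptions, not the paper's figures:

```python
import pandas as pd

# Hypothetical event log: one row per technical delay/cancellation.
events = pd.DataFrame({
    "system": ["landing gear", "navigation", "landing gear", "hydraulics"],
    "duration_min": [45, 200, 15, 500],
})

# Severity weights per duration band (illustrative; the paper derives its
# weighting factors from customer experience).
def weight(minutes):
    if minutes < 60:
        return 1.0   # short delay
    if minutes < 240:
        return 3.0   # long delay
    return 5.0       # very long delay / cancellation

events["score"] = events["duration_min"].map(weight)
shortlist = events.groupby("system")["score"].sum().sort_values(ascending=False)
print(shortlist)  # systems ranked by weighted delay burden
```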

Keywords: reliability, availability, delays & cancellations, aircraft maintenance

Procedia PDF Downloads 132
1774 High Rise Building Vibration Control Using Tuned Mass Damper

Authors: T. Vikneshvaran, A. Aminudin, U. Alyaa Hashim, Waziralilah N. Fathiah, D. Shakirah Shukor

Abstract:

This paper presents an experimental study conducted on a three-storey building structure model. Most vibrations are undesirable and can damage the buildings, machines, and people around them. Vibration waves from earthquakes, construction, and wind have a high potential to damage buildings, and excessive vibrations can result in structural and machinery failures that endanger human life and the surrounding environment. The effect of vibration that causes failure and damage to high-rise buildings can be controlled in practice by implementing a tuned mass damper (TMD) in the building structure. This research aims to study the effect of, and the performance improvement achieved by, applying a TMD to the building structure. A three-degree-of-freedom (3DOF) structure model is designed to demonstrate the performance of the TMD. The model is a physical representation of an actual building structure, constructed at a reduced scale for the experiment, so that the results obtained can be compared more accurately with real-life behavior. The experimental results show that applying the TMD to the structure model reduces both the vibration forces and the displacement mode of the building. This reduction in vibration helps maintain the good condition of the building.
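
The abstract does not describe how the damper was tuned; as background, the classical Den Hartog rules give a first-pass TMD design for a single targeted mode. The sketch below is a generic illustration under that assumption, not the authors' procedure:

```python
import numpy as np

def den_hartog_tmd(modal_mass, f_structure_hz, mass_ratio=0.05):
    """Classical Den Hartog tuning of a TMD for one (undamped) primary mode.

    modal_mass: modal mass of the targeted mode (kg)
    f_structure_hz: natural frequency of that mode (Hz)
    mass_ratio: TMD mass / modal mass (typically 1-10%)
    """
    mu = mass_ratio
    f_opt = 1.0 / (1.0 + mu)                                 # optimal frequency ratio f_d/f_s
    zeta_opt = np.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))   # optimal TMD damping ratio

    m_d = mu * modal_mass                 # damper mass
    omega_d = 2.0 * np.pi * f_opt * f_structure_hz
    k_d = m_d * omega_d ** 2              # damper spring stiffness (N/m)
    c_d = 2.0 * zeta_opt * m_d * omega_d  # damper viscous coefficient (N·s/m)
    return m_d, k_d, c_d

# Example: 5% TMD on a mode with 1000 kg modal mass at 2 Hz (illustrative numbers)
print(den_hartog_tmd(1000.0, 2.0, 0.05))
```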

Keywords: degrees-of-freedom, displacement mode, natural frequency, tuned mass damper

Procedia PDF Downloads 340
1773 Clustering and Modelling Electricity Conductors from 3D Point Clouds in Complex Real-World Environments

Authors: Rahul Paul, Peter Mctaggart, Luke Skinner

Abstract:

Maintaining public safety and network reliability are the core objectives of all electricity distributors globally. For many electricity distributors, managing vegetation clearances from their above-ground assets (poles and conductors) is the most important and costly risk mitigation control employed to meet these objectives. Light Detection and Ranging (LiDAR) is widely used by utilities as a cost-effective method to inspect their spatially distributed assets at scale, often captured using high-powered LiDAR scanners attached to fixed-wing or rotary aircraft. The resulting 3D point cloud model is used by these utilities to perform engineering-grade measurements that guide the prioritisation of vegetation cutting programs. Advances in computer vision and machine learning approaches are increasingly applied to increase automation and reduce inspection costs and time; however, real-world LiDAR capture variables (e.g., aircraft speed and height) create complexity, noise, and missing data, reducing the effectiveness of these approaches. This paper proposes a method for identifying each conductor from LiDAR data via clustering methods that can precisely reconstruct conductors in complex real-world configurations in the presence of high levels of noise. It fits 3D catenary models to the captured LiDAR data points of each individual cluster using a least squares method. An iterative learning process is used to identify potential conductor models between pole pairs. The proposed method identifies the optimum parameters of the catenary function and then fits the LiDAR points to reconstruct the conductors.
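
As an illustration of the catenary-fitting step, a minimal 2D least-squares sketch is given below. The paper fits 3D catenaries iteratively between pole pairs; the projection onto the conductor's vertical plane and the synthetic data here are simplifying assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_catenary(x, z):
    """Fit z = c + a*cosh((x - x0)/a) to points projected onto the
    conductor's vertical plane. Parameters: a (catenary constant),
    x0 (abscissa of the low point), c (vertical offset)."""
    def residuals(p):
        a, x0, c = p
        return c + a * np.cosh((x - x0) / a) - z

    p0 = [max(np.ptp(x), 1.0), x.mean(), z.min()]       # crude initial guess
    sol = least_squares(residuals, p0, loss="soft_l1")  # robust loss resists LiDAR noise
    return sol.x

# Synthetic noisy span as a stand-in for one clustered conductor
x = np.linspace(0.0, 120.0, 200)
z_true = 5.0 + 300.0 * (np.cosh((x - 60.0) / 300.0) - 1.0)
z = z_true + np.random.normal(0.0, 0.05, x.size)
print(fit_catenary(x, z))  # recovered (a, x0, c)
```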

Keywords: point cloud, LiDAR data, machine learning, computer vision, catenary curve, vegetation management, utility industry

Procedia PDF Downloads 99
1772 Thermomechanical Behavior of Asphalt Modified with Thermoplastic Polymer and Nanoclay Dellite 43B

Authors: L. F. Tamele Jr., G. Buonocore, H. F. Muiambo

Abstract:

Asphalt binders play an essential role in the performance and properties of asphalt mixtures. The increase in heavy loads, greater traffic volume, and high tire pressures, combined with substantial daily and seasonal variations in pavement temperature, are the main causes of asphalt pavement failure. To avoid or mitigate these failures, the present research proposes the use of the thermoplastic polymers HDPE and LLDPE and the nanoclay Dellite 43B to modify asphalt in order to improve its thermomechanical and rheological properties. The nanocomposites were prepared by the solution intercalation method in a high-shear mixer for a mixing time of 2 h, at 180 ℃ and 5000 rpm. The addition of Dellite 43B improved the physical, rheological, and thermal properties of the asphalt, whether added alone or in polymer/bitumen blends. The results of the physical characterization showed a decrease in penetration and an increase in softening point, thermal susceptibility, viscosity, and stiffness. The thermal characterization showed that the nanocomposites are more stable at higher temperatures, exhibiting larger amounts of residue and improved initial and final decomposition temperatures. Thus, the modification of asphalt by polymers and nanoclays appears to be a suitable solution for road pavements in countries that experience high temperatures combined with long, heavy rainy seasons.

Keywords: asphalt, nanoclay dellite 43B, polymer modified asphalt, thermal and rheological properties

Procedia PDF Downloads 147
1771 A Comparative Assessment of Information Value and Fuzzy Expert System Models for Landslide Susceptibility Mapping of Dharamshala and Surroundings, Himachal Pradesh, India

Authors: Kumari Sweta, Ajanta Goswami, Abhilasha Dixit

Abstract:

Landslide is a geomorphic process that plays an essential role in the evolution of hill slopes and in long-term landscape evolution. But its abrupt nature and the associated catastrophic forces can have undesirable socio-economic impacts, like substantial economic losses, fatalities, and ecosystem, geomorphologic and infrastructure disturbances. The estimated fatality rate is approximately 1 person/100 sq. km, and the average economic loss is more than 550 crores/year in the Himalayan belt due to landslides. This study presents a comparative performance of a statistical bivariate method and a machine learning technique for landslide susceptibility mapping in and around Dharamshala, Himachal Pradesh. The final landslide susceptibility maps (LSMs), produced with better accuracy, could be used for land-use planning to prevent future losses. Dharamshala, a part of the north-western Himalaya, is one of the fastest-growing tourism hubs, with a total population of 30,764 according to the 2011 census, and is among the hundred Indian cities to be developed as smart cities under the PM's Smart Cities Mission. A total of 209 landslide locations were identified using high-resolution Linear Imaging Self-Scanning (LISS-IV) data. The thematic maps of parameters influencing landslide occurrence were generated using remote sensing and other ancillary data in the GIS environment. The landslide causative parameters used in the study are slope angle, slope aspect, elevation, curvature, topographic wetness index, relative relief, distance from lineaments, land use land cover, and geology. LSMs were prepared using the information value (Info Val) and Fuzzy Expert System (FES) models. Info Val is a statistical bivariate method in which information values are calculated as the ratio of the landslide pixels per factor class (Si/Ni) to the total landslide pixels per parameter (S/N). Using these information values, all parameters were reclassified and then summed in GIS to obtain the landslide susceptibility index (LSI) map. The FES method is a machine learning technique based on a 'mean and neighbour' strategy for the construction of the fuzzifier (input) and defuzzifier (output) membership function (MF) structure, and the frequency ratio (FR) method is used for formulating if-then rules. Two types of membership structures were utilized for the membership functions: Bell-Gaussian (BG) and Trapezoidal-Triangular (TT). LSIs for BG and TT were obtained by applying the membership functions and if-then rules in MATLAB. The final LSMs were spatially and statistically validated. The validation results showed that in terms of accuracy, Info Val (83.4%) is better than BG (83.0%) and TT (82.6%), whereas, in terms of spatial distribution, BG is best. Hence, considering both statistical and spatial accuracy, BG is the most accurate one.
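
The Info Val computation described above can be sketched directly from its definition; the column names and the log form of the density ratio are assumptions of this illustration:

```python
import numpy as np
import pandas as pd

def information_values(df, factor_col, slide_col):
    """Information value per class of one causative factor.

    df: one row per raster pixel; `factor_col` holds the class label,
    `slide_col` is 1 for landslide pixels, 0 otherwise.
    IV_i = ln( (S_i / N_i) / (S / N) ): landslide density within the class
    relative to the overall density, per the Info Val definition above.
    """
    S = df[slide_col].sum()   # total landslide pixels
    N = len(df)               # total pixels
    grouped = df.groupby(factor_col)[slide_col].agg(["sum", "size"])
    density_ratio = (grouped["sum"] / grouped["size"]) / (S / N)
    return np.log(density_ratio.replace(0, np.nan))  # undefined for slide-free classes

# The LSI map is then the per-pixel sum of the reclassified IV rasters
# over all nine causative parameters.
```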

Keywords: bivariate statistical techniques, BG and TT membership structure, fuzzy expert system, information value method, machine learning technique

Procedia PDF Downloads 127
1770 A Long Short-Term Memory Based Deep Learning Model for Corporate Bond Price Predictions

Authors: Vikrant Gupta, Amrit Goswami

Abstract:

The fixed income market forms the basis of the modern financial market; all other assets in financial markets derive their value from the bond market. Owing to its over-the-counter nature, the corporate bond market has relatively little publicly available data and is thus researched far less than equities. Bond price prediction is a complex financial time series forecasting problem and is considered very crucial in the domain of finance. Bond prices are highly volatile and noisy, which makes it very difficult for traditional statistical time series models to capture the complexity in series patterns, leading to inefficient forecasts. To overcome the inefficiencies of statistical models, various machine learning techniques were initially used in the literature for more accurate forecasting of time series. However, simple machine learning methods such as linear regression, support vector machines, and random forests fail to provide efficient results when tested on highly complex sequences such as stock and bond prices. Hence, to capture these intricate sequence patterns, various deep learning-based methodologies have been discussed in the literature. In this study, a recurrent neural network-based deep learning model using long short-term memory (LSTM) networks for the prediction of corporate bond prices is discussed. LSTM networks have been widely used in the literature for sequence learning tasks in domains such as machine translation and speech recognition. In recent years, various studies have discussed the effectiveness of LSTMs in forecasting complex time series and have shown promising results compared to other methodologies. LSTMs are a special kind of recurrent neural network capable of learning long-term dependencies, which traditional neural networks fail to capture, thanks to their memory cells. In this study, a simple LSTM, a stacked LSTM, and a masked LSTM based model are discussed with respect to varying input sequences (three, seven, and 14 days). In order to facilitate faster learning and to gradually decompose the complexity of the bond price sequence, Empirical Mode Decomposition (EMD) has been used, which improved the accuracy of the standalone LSTM model. With a variety of technical indicators and the EMD-decomposed time series, the masked LSTM outperformed the other two counterparts in terms of prediction accuracy. To benchmark the proposed model, the results have been compared with traditional time series models (ARIMA), shallow neural networks, and the three LSTM models discussed above. In summary, our results show that LSTM models provide more accurate results and should be explored further within the asset management industry.
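
A minimal sketch of the stacked-LSTM variant, assuming a seven-day lookback window and Keras; the EMD preprocessing, technical indicators, and masking are omitted, and the price series is synthetic:

```python
import numpy as np
import tensorflow as tf

def make_windows(series, lookback=7):
    """Slice a 1-D price series into (samples, lookback, 1) inputs and next-step targets."""
    X = np.stack([series[i:i + lookback] for i in range(len(series) - lookback)])
    y = series[lookback:]
    return X[..., None], y

def build_stacked_lstm(lookback=7):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(lookback, 1)),
        tf.keras.layers.LSTM(64, return_sequences=True),  # first LSTM passes the full sequence on
        tf.keras.layers.LSTM(32),                         # second LSTM summarises it
        tf.keras.layers.Dense(1),                         # next-day price
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

prices = np.cumsum(np.random.randn(500)).astype("float32")  # stand-in bond price series
X, y = make_windows(prices, lookback=7)
model = build_stacked_lstm(7)
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```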

Keywords: bond prices, long short-term memory, time series forecasting, empirical mode decomposition

Procedia PDF Downloads 136
1769 An Efficient Traceability Mechanism in the Audited Cloud Data Storage

Authors: Ramya P, Lino Abraham Varghese, S. Bose

Abstract:

Cloud storage services allow data to be stored in the cloud and shared across multiple users. Unexpected hardware/software failures and human errors can easily cause data stored in the cloud to be lost or corrupted, affecting its integrity. Some mechanisms have been designed to allow both data owners and public verifiers to efficiently audit cloud data integrity without retrieving the entire data set from the cloud server. However, public auditing of the integrity of shared data with the existing mechanisms will unavoidably reveal confidential information, such as the identity of the signer, to public verifiers. Here, a privacy-preserving mechanism is proposed to support public auditing of shared data stored in the cloud. It uses group signatures to compute the verification metadata needed to audit the correctness of shared data. The identity of the signer of each block in the shared data is kept confidential from public verifiers, who can verify shared data integrity without retrieving the entire file; on demand, the signer of each block is revealed to the owner alone. The group private key is generated once by the owner in a static group, whereas in a dynamic group the group private key changes whenever users are revoked from the group. When users leave the group, the blocks they have already signed are re-signed by the cloud service provider instead of the owner; this is handled by an efficient proxy re-signature scheme.

Keywords: data integrity, dynamic group, group signature, public auditing

Procedia PDF Downloads 392
1768 Development of Knitted Seersucker Fabric for Improved Comfort Properties

Authors: Waqas Ashraf, Yasir Nawab, Haritham Khan, Habib Awais, Shahbaz Ahmad

Abstract:

Seersucker is a popular lightweight fabric widely used in men's and women's suiting, casual wear, children's clothing, house robes, and bedspreads, and for spring and summer wear. The puckered effect generates air spaces between the body and the fabric, keeping the wearer cool in hot conditions. The aim of this work was to develop a knitted seersucker fabric on a single-cylinder weft knitting machine using a plain jersey structure. Core-spun cotton yarn and conventional spun cotton yarn of the same linear density were used; the core-spun yarn contains cotton fiber in the sheath and an elastane filament in the core. Both yarns were fed at regular intervals to the feeders on the machine. The loop length and yarn tension were kept constant at each feeder. The samples were then scoured and bleached. After wet processing, the fabric samples were washed and tumble dried. Parameters such as loop length, stitch density, and areal density were measured after conditioning the samples for 24 hours in standard atmospheric conditions. The produced sample has regular puckering stripes of uniform height along the width of the fabric. The stitch densities of the flat and puckered areas of the relaxed fabric were found to be different. Air permeability and moisture management tests were performed. The results indicated that the knitted seersucker fabric has better wicking and moisture management properties, as the flat areas contact the skin whereas the puckered areas are held away from it. The seersucker effect in the knitted fabric was achieved through the difference in contraction between the two sets of courses produced from the different yarn types. Seersucker fabric produced by the knitting technique is less expensive than woven seersucker fabric, as no yarn preparation is needed. The knitted seersucker fabric is well suited for summer dresses, skirts, blouses, shirts, trousers, and shorts.

Keywords: air permeability, knitted structure, moisture management, seersucker

Procedia PDF Downloads 325
1767 Tectono-Stratigraphic Architecture, Depositional Systems and Salt Tectonics Related to Strike-Slip Faulting in the Kribi-Campo (Cameroon) Atlantic Margin with an Unsupervised Machine Learning Approach (West African Margin)

Authors: Joseph Bertrand Iboum Kissaaka, Charles Fonyuy Ngum Tchioben, Paul Gustave Fowe Kwetche, Jeannette Ngo Elogan Ntem, Joseph Binyet Njebakal, Ribert Yvan Makosso-Tchapi, François Mvondo Owono, Marie Joseph Ntamak-Nida

Abstract:

Located in the Gulf of Guinea, the Kribi-Campo sub-basin belongs to the Aptian salt basins along the West African margin. In this paper, we investigated the tectono-stratigraphic architecture of the basin, focusing on the role of salt tectonics and strike-slip faults along the Kribi Fracture Zone, with implications for reservoir prediction. Using 2D seismic data and well data interpreted through sequence stratigraphy, with integrated seismic attribute analysis using Python programming and unsupervised machine learning, at least six second-order sequences were determined, indicating three main stages of tectono-stratigraphic evolution: pre-salt syn-rift, post-salt rift-climax and post-rift stages. The pre-salt syn-rift stage, with the KTS1 tectonosequence (Barremian-Aptian), reveals transform rifting along NE-SW transfer faults associated with N-S to NNE-SSW syn-rift longitudinal faults bounding a NW-SE half-graben filled with alluvial to lacustrine fan-delta deposits. The post-salt rift-climax stage (Lower to Upper Cretaceous) includes two second-order tectonosequences (KTS2 and KTS3) associated with salt tectonics and the Campo High uplift. During the rift-climax stage, the growth of salt diapirs developed withdrawal synclines filled by early forced-regressive, mid transgressive and late normal-regressive systems tracts. The early rift climax features some fine-grained hangingwall fan or delta deposits and coarse-grained fans from the footwalls of fault scarps. The post-rift stage (Paleogene to Neogene) contains at least three main tectonosequences: KTS4, KTS5 and KTS6-7. The first developed turbiditic lobe complexes, interpreted as mass transport complexes, and feeder channel-lobe complexes cutting the unstable shelf edge of the Campo High. The last two developed submarine channel complexes associated with lobes towards the southern part and braided-delta to tidal channels towards the northern part of the Kribi-Campo sub-basin. The reservoir distribution in the Kribi-Campo sub-basin reveals channel and fan lobe reservoirs, and stacked channels reaching up to the polygonal fault systems.
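
The unsupervised step is not specified in the abstract; the sketch below assumes K-means clustering of standardized seismic attributes, one plausible reading of the workflow (the attribute names and sizes are hypothetical):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical attribute volume flattened to (n_traces * n_samples, n_attributes),
# e.g. columns: instantaneous amplitude, instantaneous frequency, coherence.
attributes = np.random.rand(10000, 3)  # stand-in for real attribute data

X = StandardScaler().fit_transform(attributes)  # attributes live on very different scales
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)

# Reshaping `labels` back onto the seismic grid yields a facies-like map that
# can be compared against the interpreted tectonosequences.
```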

Keywords: tectono-stratigraphic architecture, Kribi-Campo sub-basin, machine learning, pre-salt sequences, post-salt sequences

Procedia PDF Downloads 56
1766 AI Features in Netflix

Authors: Dona Abdulwassi, Dhaee Dahlawi, Yara Zainy, Leen Joharji

Abstract:

This paper discusses the relationship between Netflix and artificial intelligence. Netflix applies artificial intelligence, machine learning, and data science using highly effective and efficient approaches. Netflix employs personalization for its users, recommending or suggesting shows based on what those users have already watched. The researchers conducted an experiment to learn more about how Netflix is used and how AI affects the user experience. The main conclusions of this study are that Netflix has a wide range of AI features, that most users are happy with their Netflix subscriptions, and that the majority prefer Netflix to alternative apps.

Keywords: easy accessibility, recommends, accuracy, privacy

Procedia PDF Downloads 64
1765 The Analysis of Own Signals of PM Electrical Machines – Example of Eccentricity

Authors: Marcin Baranski

Abstract:

This article presents a vibration diagnostic method designed for permanent magnet (PM) traction motors, which are commonly used in the traction drives of electric vehicles. The method exploits a specific structural property of machines excited by permanent magnets: the electromotive force (EMF) generated due to vibrations. This work presents a field-circuit model, the results of static tests, and the results of calculations and simulations.

Keywords: electrical vehicle, permanent magnet, traction drive, vibrations, electrical machine, eccentricity

Procedia PDF Downloads 628
1764 Wear Resistance of 20MnCr5 Steel Nitrided by Plasma

Authors: Okba Belahssen, Said Benramache

Abstract:

This paper presents the wear behavior of plasma-nitrided 20MnCr5 steel. Untreated and plasma-nitrided samples were tested. The morphology was observed by scanning electron microscopy (SEM). The plasma nitriding behavior of 20MnCr5 steel was assessed by evaluating tribological properties and surface hardness using a pin-on-disk wear machine and a microhardness tester. Experimental results showed that the nitrides ε-Fe2−3N and γ′-Fe4N present in the white layer improve the wear resistance.

Keywords: plasma nitriding, 20MnCr5 alloy, steel, friction, wear

Procedia PDF Downloads 557
1763 Corporate Collapses and (Legal) Ethics

Authors: Elizabeth Snyman-Van Deventer

Abstract:

Numerous corporate scandals, involving investment scams, corporate malfeasance, unethical conduct and conflicts of interest, contributed to the collapse of WorldCom, Global Crossing, Xerox, Tyco, Enron, Sprint, AbbVie and ImClone, led alarmed investors to abandon public securities markets, and sent U.S. stock markets tumbling. These companies suffered significant financial losses due to substantial and fraudulent misstatements and other illegal, corrupt or unethical practices. Executives were convicted of fraud and sentenced to prison. The corporate financial scandals, governance failures, and ensuing public outcry led to mandatory legislation, e.g. the Sarbanes-Oxley Act in the USA. In European corporate scandals such as Parmalat, Royal Dutch Ahold, Vivendi, Adecco and Elan, the boards missed financial misrepresentations. In South Africa, Steinhoff is the best-known example of corporate collapse, but now Tongaat Hulett can be added. It seems that fraud and corruption may be the major sources of these corporate collapses. In most instances, there is either the active involvement of directors and managers in these fraudulent or corrupt practices, or a negligent or even intentional failure by directors to act to prevent these activities. Besides directors and managers, however, auditors and lawyers failed to fulfil their professional duties in most of these companies. In most of these major collapses, the ethics of auditors and directors, especially, could be questioned. This paper will first provide a brief overview of corporate collapses. Secondly, the reasons for these collapses, with a focus on unethical conduct, will be discussed.

Keywords: professional duties, corporate collapses, ethical conduct, legal ethics, directors, auditors

Procedia PDF Downloads 63
1762 Enhancing Financial Security: Real-Time Anomaly Detection in Financial Transactions Using Machine Learning

Authors: Ali Kazemi

Abstract:

The digital evolution of financial services, while offering unprecedented convenience and accessibility, has also escalated vulnerability to fraudulent activities. In this study, we introduce a distinct approach to real-time anomaly detection in financial transactions, aiming to fortify the defenses of banking and financial institutions against such threats. Utilizing unsupervised machine learning algorithms, specifically autoencoders and isolation forests, our research focuses on identifying irregular patterns indicative of fraud within transactional data, thus enabling immediate action to prevent financial loss. The data used in this study included the monetary value of each transaction, a crucial feature since fraudulent transactions may follow different amount distributions than legitimate ones; timestamps indicating when transactions occurred, since analyzing temporal patterns can reveal anomalies (e.g., unusual activity in the middle of the night); the sector or category of the merchant where the transaction occurred (retail, groceries, online services, etc.), since specific categories may be more prone to fraud; and the type of payment used (e.g., credit, debit, online payment systems), since different payment methods carry different fraud risk levels. This dataset, anonymized to ensure privacy, reflects a wide array of transactions typical of a global banking institution, ranging from small-scale retail purchases to large wire transfers, embodying the diverse nature of potentially fraudulent activities. By engineering features that capture the essence of transactions, including normalized amounts and encoded categorical variables, we tailor our data to enhance model sensitivity to anomalies. The autoencoder model leverages its reconstruction error mechanism to flag transactions that deviate significantly from the learned normal pattern, while the isolation forest identifies anomalies based on their susceptibility to isolation from the majority of the dataset. Our experimental results, validated through techniques such as k-fold cross-validation, are evaluated using precision, recall, and the F1 score alongside the area under the receiver operating characteristic (ROC) curve. Our models achieved an F1 score of 0.85 and a ROC AUC of 0.93, indicating high accuracy in detecting fraudulent transactions without excessive false positives. This study contributes to the academic discourse on financial fraud detection and provides a practical framework for banking institutions seeking to implement real-time anomaly detection systems. By demonstrating the effectiveness of unsupervised learning techniques in a real-world context, our research offers a pathway to significantly reduce the incidence of financial fraud, thereby enhancing the security and trustworthiness of digital financial services.
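
A minimal isolation-forest sketch of the kind of pipeline described; the feature set and thresholds are illustrative, and the autoencoder branch (flagging by reconstruction error) is analogous:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest
from sklearn.preprocessing import StandardScaler

# Hypothetical engineered features per the abstract: normalized amount,
# hour-of-day from the timestamp, and an encoded payment-channel flag.
df = pd.DataFrame({
    "amount": np.random.lognormal(3.0, 1.0, 5000),
    "hour": np.random.randint(0, 24, 5000),
    "is_online": np.random.randint(0, 2, 5000),
})

X = StandardScaler().fit_transform(df)
iso = IsolationForest(n_estimators=200, contamination=0.01, random_state=42).fit(X)

scores = -iso.score_samples(X)                 # higher = more anomalous
flagged = scores > np.quantile(scores, 0.99)   # route the top 1% for review
print(flagged.sum(), "transactions flagged")
```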

Keywords: anomaly detection, financial fraud, machine learning, autoencoders, isolation forest, transactional data analysis

Procedia PDF Downloads 57
1761 Voltage Stability Margin-Based Approach for Placement of Distributed Generators in Power Systems

Authors: Oludamilare Bode Adewuyi, Yanxia Sun, Isaiah Gbadegesin Adebayo

Abstract:

Voltage stability analysis is crucial to the reliable and economic operation of power systems. The power systems of developing nations are more susceptible to failures due to continuously increasing load demand that is not matched by generation increases and efficient transmission infrastructure. Thus, most power systems are heavily stressed, and the planning of extra generation from distributed generation (DG) sources needs to be done efficiently to ensure the security of the power system. Some voltage stability index-based approaches for DG siting have been reported in the literature. However, most of the existing voltage stability indices, though sufficient, are found to be inaccurate, especially for overloaded power systems. In this paper, the performance of a relatively different approach using a line voltage stability margin indicator, which has proven to have better accuracy, is presented and compared with a conventional line voltage stability index for DG siting using the Nigerian 28-bus system. The critical boundary index (CBI) for voltage stability margin estimation was deployed to identify suitable locations for DG placement, and its performance was compared with DG placement using the novel line stability index (NLSI) approach. From the simulation results, CBI and NLSI agreed closely on suitable DG locations on the test system; while CBI identified bus 18 as the most suitable at system overload, NLSI identified bus 8. Considering the effect of DG placement at the selected buses on the voltage magnitude profile, the results show that the DG placed at bus 18, identified by CBI, improved the performance of the power system more.
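
The CBI and NLSI formulations are not reproduced in the abstract; as a stand-in, the widely used Fast Voltage Stability Index (FVSI) below illustrates how a line index flags stressed lines, with values approaching 1 indicating proximity to voltage collapse. The line data are illustrative:

```python
import numpy as np

def fvsi(v_s, x, z, q_r):
    """Fast Voltage Stability Index for one line (Musirin & Abdul Rahman).

    v_s: sending-end voltage magnitude (p.u.)
    x, z: line reactance and impedance magnitude (p.u.)
    q_r: reactive power at the receiving end (p.u.)
    FVSI -> 1 as the line approaches voltage collapse.
    """
    return 4.0 * z**2 * q_r / (v_s**2 * x)

# Rank lines from a solved load flow; the weakest buses (highest index on
# their incident lines) become candidate DG locations.
lines = {"2-5": fvsi(1.02, 0.12, 0.15, 0.35), "7-8": fvsi(0.98, 0.20, 0.24, 0.55)}
print(max(lines, key=lines.get), max(lines.values()))
```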

Keywords: voltage stability analysis, voltage collapse, voltage stability index, distributed generation

Procedia PDF Downloads 93
1760 Multi Attribute Failure Mode Analysis of the Catering Systems: A Case Study of Sefako Makgatho Health Sciences University in South Africa

Authors: Mokoena Oratilwe Penwell, Seeletse Solly Matshonisa

Abstract:

The demand for quality products is a vital factor determining the success of a producing company, and the reality of this demand influences customer satisfaction. At Sefako Makgatho Health Sciences University (SMU), concerns over the quality of the food being sold have been raised mostly by students and staff, who are the primary consumers of food sold by the cafeteria. Suspicions of food poisoning and the occurrence of diarrhea related to food from the cafeteria, amongst others, have been raised. However, minimal measures have been taken to resolve the issue of food quality. New service providers have been appointed, and still the same trends are observed; the quality of food seems to deteriorate continuously. This paper uses multi-attribute failure mode analysis (MAFMA) to detect and minimize failures on the machines used for food production by the SMU catering company before the food is sold to staff and students, so as to improve production plant reliability and performance. The analytic hierarchy process (AHP) is used for the severity ranking of the weight criteria and the development of the hierarchical structure for the cafeteria company. Amongst other potential issues detected, maintenance of the machines and equipment used for food preparation was of concern. The staff also lacked sufficient hospitality skills, and supervision and management in the cafeteria needed greater attention to mitigate some of the failures occurring in the food production plant.
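
The AHP severity-weighting step can be sketched as follows; the pairwise comparison values are illustrative, not the study's judgments:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise comparison matrix via the
    principal eigenvector, with Saaty's consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                 # normalized priority weights
    RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
    CR = (eigvals[k].real - n) / ((n - 1) * RI)  # consistency ratio, should be < 0.10
    return w, CR

# Illustrative 3x3 comparison of failure-severity criteria:
# (machine maintenance, staff skills, supervision).
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
print(ahp_weights(A))
```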

Keywords: MAFMA, food quality, maintenance, supervision

Procedia PDF Downloads 135
1759 Effects of AI-driven Applications on Bank Performance in West Africa

Authors: Ani Wilson Uchenna, Ogbonna Chikodi

Abstract:

This study examined the impact of artificial intelligence-driven applications on banks' performance in West Africa, using Nigeria and Ghana as case studies. Specifically, the study examined the extent to which the deployment of smart automated teller machines impacted the banks' net worth within the reference period in Nigeria and Ghana. It ascertained the impact of point-of-sale services on banks' net worth within the reference period in Nigeria and Ghana. Thirdly, it verified the extent to which web pay services can influence banks' performance in Nigeria and Ghana, and finally, it determined the impact of mobile pay services on banks' performance in Nigeria and Ghana. The study used automated teller machines (ATM), point-of-sale services (POS), mobile pay services (MOP) and web pay services (WBP) as proxies for the explanatory variables, while bank net worth (BNW) was used as the explained variable. The data for this study were sourced from the Central Bank of Nigeria (CBN) Statistical Bulletin, the Bank of Ghana Statistical Bulletin, the Ghana payment systems oversight annual report, and the World Development Indicators (WDI). Furthermore, the mixed order of integration observed from the panel unit root test justified the autoregressive distributed lag (ARDL) approach to data analysis, which the study adopted. While the cointegration test showed the existence of cointegration among the studied variables, the bounds test result confirmed the presence of a long-run relationship among the series. The ARDL error correction estimate established a satisfactory (13.92%) speed of adjustment from long-run disequilibrium back to the short-run dynamic relationship. The study found that automated teller machines (ATM) had a statistically significant impact on the bank net worth of Nigeria and Ghana; point-of-sale applications (POS) had a statistically significant impact on bank net worth within the study period; mobile pay services were statistically significant in impacting changes in the bank net worth of the countries under study; while web pay services (WBP) had no statistically significant impact on the bank net worth of the countries of reference. The study concluded that artificial intelligence-driven applications have a significant and positive impact on bank performance, with the exception of web pay, which had a negative impact on bank net worth. The study recommended that the management of banks in both Nigeria and Ghana encourage more investment in AI-powered smart ATMs aimed at delivering more secure banking services in order to increase revenue, discourage excessive queuing in the banking hall, reduce fraud and minimize errors in processing transactions. Banks within the scope of this study should leverage modern technologies to check the excesses of private POS operators in order to build more confidence among potential customers. Government should turn mobile pay services into a counter-terrorism tool by ensuring that restrictions limiting over-the-counter withdrawals to a minimum amount are maintained and by placing sanctions on withdrawals above that limit.
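
A minimal sketch of the ARDL estimation, assuming statsmodels' ARDL class and synthetic stand-in series; the lag-order selection, bounds test, and error-correction step are simplified away:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL

# Hypothetical quarterly series mirroring the study's proxies.
n = 60
df = pd.DataFrame({
    "BNW": np.random.rand(n),  # bank net worth (explained variable)
    "ATM": np.random.rand(n),
    "POS": np.random.rand(n),
    "MOP": np.random.rand(n),
    "WBP": np.random.rand(n),
})

# One lag of the dependent variable and of each regressor (illustrative orders).
model = ARDL(df["BNW"], lags=1, exog=df[["ATM", "POS", "MOP", "WBP"]], order=1)
res = model.fit()
print(res.summary())  # the long-run relationship is then checked with a bounds test
```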

Keywords: artificial intelligence (ai), bank performance, automated teller machines (atm), point of sale (pos)

Procedia PDF Downloads 7
1758 Integrative Omics-Portrayal Disentangles Molecular Heterogeneity and Progression Mechanisms of Cancer

Authors: Binder Hans

Abstract:

Cancer is no longer seen solely as a genetic disease in which genetic defects such as mutations and copy number variations affect gene regulation and eventually lead to aberrant cell functioning, which can be monitored by transcriptome analysis. It has become obvious that epigenetic alterations represent a further important layer of (de-)regulation of gene activity. For example, aberrant DNA methylation is a hallmark of many cancer types, and methylation patterns have been used successfully to subtype cancer heterogeneity. Hence, unraveling the interplay between different omics levels such as the genome, transcriptome and epigenome is inevitable for a mechanistic understanding of the molecular deregulation causing complex diseases such as cancer. This objective requires powerful downstream integrative bioinformatics methods as an essential prerequisite to discover the whole-genome mutational, transcriptomic and epigenomic landscapes of cancer specimens and to understand cancer genesis, progression and heterogeneity. Basic challenges and tasks arise 'beyond sequencing' because of the large size of the data, their complexity, and the need to search for hidden structures in the data, for knowledge mining to discover biological function, and for systems biology conceptual models to deduce developmental interrelations between different cancer states. These tasks are tightly related to cancer biology as an (epi-)genetic disease giving rise to aberrant genomic regulation under micro-environmental control and clonal evolution, which leads to heterogeneous cellular states. Machine learning algorithms such as self-organizing maps (SOM) represent one interesting option to tackle these bioinformatics tasks. The SOM method enables the recognition of complex patterns in large-scale data generated by high-throughput omics technologies. It portrays molecular phenotypes by generating individualized, easy-to-interpret images of the data landscape in combination with comprehensive analysis options. Our image-based, reductionist machine learning methods provide one interesting perspective on how to deal with massive data in the discovery of complex diseases, applied to gliomas, melanomas and colon cancer at the molecular level. As an important new challenge, we address the combined portrayal of different omics data, such as genome-wide genomic, transcriptomic and methylomic data. The integrative-omics portrayal approach is based on joint training of the data, and it provides separate personalized data portraits for each patient and data type, which can be analyzed by visual inspection as one option. The new method enables an integrative genome-wide view of the omics data types and the underlying regulatory modes. It is applied to high- and low-grade gliomas and to melanomas, where it disentangles transversal and longitudinal molecular heterogeneity in terms of distinct molecular subtypes and progression paths with prognostic impact.
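
A minimal sketch of the SOM-portrayal idea using the third-party minisom package (not the authors' implementation): training on gene profiles turns each map unit into a "metagene", and each sample's portrait is the grid of metagene values for that sample. The matrix sizes are hypothetical:

```python
import numpy as np
from minisom import MiniSom  # third-party package: pip install minisom

# Hypothetical matrix: rows = genes, columns = samples (expression values).
genes_x_samples = np.random.rand(5000, 40)

# Train on gene profiles so each map unit becomes a "metagene" profile
# across all samples.
som = MiniSom(30, 30, genes_x_samples.shape[1], sigma=1.5,
              learning_rate=0.5, random_seed=0)
som.train_random(genes_x_samples, 10000)

# The portrait of sample j is the 30x30 grid of metagene values in column j.
weights = som.get_weights()           # shape (30, 30, n_samples)
portrait_sample_0 = weights[:, :, 0]  # visualise e.g. with plt.imshow per patient
print(portrait_sample_0.shape)
```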

Keywords: integrative bioinformatics, machine learning, molecular mechanisms of cancer, gliomas and melanomas

Procedia PDF Downloads 148
1757 Knowledge, Attitude, and Practice among Medical Students Regarding Basic Life Support

Authors: Sumia Fatima, Tayyaba Idrees

Abstract:

Cardiac arrest and heart failure are important causes of mortality in developed and developing countries, and even a second spent without cardiopulmonary resuscitation (CPR) increases the risk of mortality. Young doctors are expected to partake in CPR from their first day, and if they are not taught basic life support (BLS) skills during their studies, they have next to no opportunity to learn them in clinical settings. The objectives were to determine the exact level of knowledge of basic life support among medical students and to compare the degree of knowledge between 1st and 2nd year medical students of RMU (Rawalpindi Medical University) using self-structured questionnaires. A cross-sectional, qualitative primary study was conducted in March 2020 to analyse the theoretical and practical knowledge of basic life support among medical students of 1st and 2nd year MBBS. Self-structured questionnaires were distributed among 300 students, 150 from 1st year and 150 from 2nd year. Data were analysed using SPSS v22, and the chi-square test was employed. The results showed that only 13 (4%) students had received formal BLS training. 129 (42%) students had encountered accidents in real life but had not known how to react. The majority (189 students) responded that basic life support should be made part of the medical college curriculum. 194 participants (64%) had moderate knowledge of both the theoretical and practical aspects of BLS. 75-80% of students of both 1st and 2nd year had only moderate knowledge, which must be improved for them to be better healthcare providers in future. It was also found that male students had more practical knowledge than females, but both had almost the same proficiency in theoretical knowledge. The study concluded that the level of knowledge of BLS among the students was not up to the mark and that there is a dire need to include BLS training in medical colleges' curricula.

Keywords: basic cardiac life support, cardiac arrest, awareness, medical students

Procedia PDF Downloads 93
1756 FracXpert: Ensemble Machine Learning Approach for Localization and Classification of Bone Fractures in Cricket Athletes

Authors: Madushani Rodrigo, Banuka Athuraliya

Abstract:

In today's world of medical diagnosis and prediction, machine learning stands out as a strong tool, transforming established ways of delivering healthcare. This study analyzes the use of machine learning in the specialized domain of sports medicine, with a focus on the timely and accurate detection of bone fractures in cricket athletes. Failure to identify bone fractures in real time can result in malunion or non-union conditions. To ensure proper treatment and enhance the bone healing process, accurately identifying fracture locations and types is necessary. The interpretation of X-ray images relies on the expertise and experience of medical professionals, and radiographic images are sometimes of low quality, leading to potential misidentification. Therefore, a proper approach is needed to accurately localize and classify fractures in real time. The research revealed that the optimal approach must employ appropriate radiographic image processing techniques and object detection algorithms that effectively localize and accurately classify all types of fractures with high precision and in a timely manner. In order to overcome the challenges of misidentifying fractures, a distinct model for fracture localization and classification has been implemented. The research also incorporates radiographic image enhancement and preprocessing techniques to overcome the limitations posed by low-quality images. A classification ensemble model has been implemented using ResNet18 and VGG16, and in parallel, a fracture segmentation model has been implemented using an enhanced U-Net architecture. Combining the results of these two models, the FracXpert system can accurately localize exact fracture locations along with fracture types from the 12 available fracture patterns: avulsion, comminuted, compressed, dislocation, greenstick, hairline, impacted, intraarticular, longitudinal, oblique, pathological, and spiral. The system generates a confidence score indicating the degree of confidence in the predicted result. The fracture segmentation model, based on the enhanced U-Net architecture, achieved a high accuracy level of 99.94%, demonstrating its precision in identifying fracture locations. Simultaneously, the classification ensemble model, using the ResNet18 and VGG16 architectures, achieved an accuracy of 81.0%, showcasing its ability to categorize the various fracture patterns, which is instrumental in the fracture treatment process. In conclusion, FracXpert is a promising ML application in sports medicine, demonstrating the potential to revolutionize fracture detection processes. By leveraging the power of ML algorithms, this study contributes to the advancement of diagnostic capabilities in cricket athlete healthcare, ensuring timely and accurate identification of bone fractures for the best treatment outcomes.
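
A soft-voting ensemble of ResNet18 and VGG16 heads for the 12 fracture classes might look like the sketch below (PyTorch assumed; training, preprocessing, and the U-Net segmentation branch are omitted, and the averaging scheme is an assumption):

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 12  # the 12 fracture patterns listed in the abstract

class FractureEnsemble(nn.Module):
    """Soft-voting ensemble: average the softmax outputs of both backbones."""
    def __init__(self):
        super().__init__()
        self.resnet = models.resnet18(weights=None)
        self.resnet.fc = nn.Linear(self.resnet.fc.in_features, NUM_CLASSES)
        self.vgg = models.vgg16(weights=None)
        self.vgg.classifier[6] = nn.Linear(4096, NUM_CLASSES)

    def forward(self, x):
        p1 = torch.softmax(self.resnet(x), dim=1)
        p2 = torch.softmax(self.vgg(x), dim=1)
        return (p1 + p2) / 2  # per-class confidence; argmax = predicted fracture type

model = FractureEnsemble().eval()
with torch.no_grad():
    probs = model(torch.randn(1, 3, 224, 224))  # stand-in radiograph tensor
print(probs.argmax(dim=1).item(), probs.max().item())  # predicted class + confidence
```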

Keywords: multiclass classification, object detection, ResNet18, U-Net, VGG16

Procedia PDF Downloads 120
1755 To Handle Data-Driven Software Development Projects Effectively

Authors: Shahnewaz Khan

Abstract:

Machine learning (ML) techniques are often used in projects for creating data-driven applications. These tasks typically demand additional research and analysis. The proper technique and strategy must be chosen to ensure the success of data-driven projects; otherwise, even after exerting a lot of effort, the necessary development might not be possible. This paper examines the workflow of data-driven software development projects and its implementation process in order to describe how to manage such projects successfully, which will assist in minimizing the added workload.

Keywords: data, data-driven projects, data science, NLP, software project

Procedia PDF Downloads 83
1754 Algorithm for Improved Tree Counting and Detection through Adaptive Machine Learning Approach with the Integration of Watershed Transformation and Local Maxima Analysis

Authors: Jigg Pelayo, Ricardo Villar

Abstract:

The Philippines has long been considered a valuable producer of high value crops globally. The country's employment and economy have been dependent on agriculture, increasing its demand for efficient agricultural mechanisms. Remote sensing and geographic information technology have proven to effectively provide applications for precision agriculture through image-processing techniques, particularly given the development of aerial scanning technology in the country. Accurate information concerning the spatial correlation within the field is very important, especially for precision farming of high value crops. The availability of height information and high spatial resolution images obtained from aerial scanning, together with the development of new image analysis methods, is having a relevant influence on precision agriculture techniques and applications. In this study, an algorithm was developed and implemented to detect and count high value crops simultaneously through adaptive scaling of the support vector machine (SVM) algorithm, subjected to an object-oriented approach combining watershed transformation and a local maxima filter to enhance tree counting and detection. The methodology is compared to cutting-edge template matching algorithm procedures to demonstrate its effectiveness on a demanding tree counting, recognition, and delineation problem. Since common data and image processing techniques are utilized, it can easily be implemented in production processes covering large agricultural areas. The algorithm was tested on high value crops such as palm, mango, and coconut located in Misamis Oriental, Philippines, showing good performance, in particular for young adult and adult trees, significantly above 90%. The results support inventory and database updating, allowing for the reduction of field work and manual interpretation tasks.
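
The watershed-plus-local-maxima stage can be sketched with scikit-image on a canopy height model (CHM); the adaptive SVM scaling step is not shown, and the thresholds are illustrative:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def count_trees(chm, min_height=2.0, min_distance=5):
    """Treetops = local maxima of the canopy height model; crowns =
    watershed of the inverted CHM seeded at those maxima."""
    mask = chm > min_height  # ignore ground and low vegetation
    peaks = peak_local_max(chm, min_distance=min_distance, labels=mask.astype(int))
    markers = np.zeros_like(chm, dtype=int)
    for i, (r, c) in enumerate(peaks, start=1):
        markers[r, c] = i
    crowns = watershed(-chm, markers, mask=mask)  # one label per detected tree
    return len(peaks), crowns

# Smoothed random field as a stand-in CHM (metres)
chm = ndi.gaussian_filter(np.random.rand(200, 200) * 12, sigma=3)
n_trees, crowns = count_trees(chm)
print(n_trees, "trees detected")
```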

Keywords: high value crop, LiDAR, OBIA, precision agriculture

Procedia PDF Downloads 402
1753 Landslide Susceptibility Mapping Using Soft Computing in Amhara Saint

Authors: Semachew M. Kassa, Africa M Geremew, Tezera F. Azmatch, Nandyala Darga Kumar

Abstract:

Frequency ratio (FR) and analytical hierarchy process (AHP) methods have been developed based on past landslide failure points to produce landslide susceptibility maps, because landslides can seriously harm both the environment and society. However, it is still difficult to select the most efficient method and correctly identify the main driving factors for particular regions. In this study, we used fourteen landslide conditioning factors (LCFs) and five soft computing algorithms, namely random forest (RF), support vector machine (SVM), logistic regression (LR), artificial neural network (ANN), and naïve Bayes (NB), to predict landslide susceptibility at a 12.5 m spatial scale. The performance of the RF (F1-score: 0.88, AUC: 0.94), ANN (F1-score: 0.85, AUC: 0.92), and SVM (F1-score: 0.82, AUC: 0.86) methods was significantly better than that of the LR (F1-score: 0.75, AUC: 0.76) and NB (F1-score: 0.73, AUC: 0.75) methods, according to the classification results based on inventory landslide points. The findings also showed that around 35% of the study region consists of places with high or very high landslide risk (susceptibility greater than 0.5). The very high-risk locations were primarily found in the western and southeastern regions, and all five models showed good agreement and similar geographic distribution patterns in landslide susceptibility. The areas with the highest landslide risk include the western and northern parts of Amhara Saint Town and the St. Gebreal Church villages, with mean susceptibility values greater than 0.5. Rainfall, distance to roads, and slope were typically among the top leading factors for most villages, although the primary contributing factors to landslide vulnerability varied slightly across the five models. Decision-makers and policy planners can use the information from our study to make informed decisions and establish policies. It also suggests that different places should take different safeguards to reduce or prevent serious damage from landslide events.
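
A scikit-learn sketch of the five-model comparison with the F1 and AUC metrics reported above; the feature matrix stands in for the 14 conditioning factors:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import f1_score, roc_auc_score

X = np.random.rand(1000, 14)       # stand-in for the 14 LCFs per mapping unit
y = np.random.randint(0, 2, 1000)  # 1 = landslide point, 0 = non-landslide
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC(probability=True, random_state=0)),
    "LR": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "ANN": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=0)),
    "NB": GaussianNB(),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    proba = m.predict_proba(X_te)[:, 1]  # susceptibility score in [0, 1]
    print(name, f1_score(y_te, m.predict(X_te)), roc_auc_score(y_te, proba))
```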

Keywords: artificial neural network, logistic regression, landslide susceptibility, naïve Bayes, random forest, support vector machine

Procedia PDF Downloads 82
1752 Solving Mean Field Problems: A Survey of Numerical Methods and Applications

Authors: Amal Machtalay

Abstract:

In this survey, we review the rapidly growing literature on numerical methods for solving different forms of mean field problems, namely mean field games (MFG), mean field control (MFC), potential MFGs, and master equations, as well as their corresponding recent applications. We distinguish two families of numerical methods: iterative methods based on mesh generation, and so-called mesh-free methods, normally related to neural networks and learning frameworks.

Keywords: mean-field games, numerical schemes, partial differential equations, complex systems, machine learning

Procedia PDF Downloads 113
1751 Power Control of a Doubly-Fed Induction Generator Used in Wind Turbine by RST Controller

Authors: A. Boualouch, A. Frigui, T. Nasser, A. Essadki, A.Boukhriss

Abstract:

This work deals with the vector control of the active and reactive powers of a doubly-fed induction generator (DFIG) used as a wind generator, by means of a polynomial RST controller. The control of the stator power transfer between the machine and the grid is achieved by acting on the rotor quantities, with the control provided by the polynomial RST controller. The performance and robustness of the controller are compared with a PI controller and evaluated through simulation results in MATLAB/Simulink.

Keywords: DFIG, RST, vector control, wind turbine

Procedia PDF Downloads 658
1750 Development of GIS-Based Geotechnical Guidance Maps for Prediction of Soil Bearing Capacity

Authors: Q. Toufeeq, R. Kauser, U. R. Jamil, N. Sohaib

Abstract:

Foundation design of a structure requires soil investigation to avoid failures due to settlement. Such soil investigation is expensive and time-consuming. The development of new residential societies involves extensive leveling of large sites, accompanied by heavy landfilling. Poor landfill practices at great depths cause differential settlement and consolidation of the underlying soil, which sometimes results in the collapse of structures. The extent of filling remains unknown to the individual developer unless a soil investigation is carried out, yet soil investigations cannot be performed on every available site due to the costs involved. However, a fair estimate of bearing capacity can be made if such tests have already been done in the surrounding areas. Geotechnical guidance maps can provide a fair assessment of soil properties. Previously, GIS-based approaches have used extrapolation and interpolation techniques to develop maps of bearing capacity, underground recharge, soil classification, geological hazards, landslide hazards, socio-economic factors, and soil liquefaction. Standard penetration test (SPT) data from surrounding sites were already available. Google Earth was used for the digitization of the collected data, and a few points were reserved for calibration and validation. The resulting geographic information system (GIS)-based guidance maps are helpful for anticipating bearing capacity in the real estate industry.
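
Inverse distance weighting, one of the interpolators named in the keywords, can be sketched in a few lines; the site coordinates and bearing capacities are illustrative:

```python
import numpy as np

def idw(known_xy, known_vals, query_xy, power=2.0):
    """Inverse-distance-weighted estimate at query points from scattered
    SPT-derived bearing-capacity observations."""
    d = np.linalg.norm(query_xy[:, None, :] - known_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)  # guard against division by zero at data points
    w = 1.0 / d**power
    return (w @ known_vals) / w.sum(axis=1)

# Illustrative: allowable bearing capacities (kPa) at four investigated sites
sites = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
q_allow = np.array([150.0, 180.0, 120.0, 200.0])
grid = np.array([[50.0, 50.0], [10.0, 90.0]])  # un-investigated locations
print(idw(sites, q_allow, grid))
```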

Keywords: bearing capacity, soil classification, geographical information system, inverse distance weighted, radial basis function

Procedia PDF Downloads 135
1749 Machine Learning Based Digitalization of Validated Traditional Cognitive Tests and Their Integration to Multi-User Digital Support System for Alzheimer’s Patients

Authors: Ramazan Bakir, Gizem Kayar

Abstract:

It is known that Alzheimer's disease and dementia are among the most common neurodegenerative conditions, and their visibility has accelerated over the last couple of years. As populations age all over the world, researchers expect this acceleration to become much steeper. Unfortunately, there is no known pharmacological cure for either, although some treatments help reduce the rate of cognitive decline. This is why non-pharmacological treatment and tracking methods have become more prominent over the last five years. Many researchers, including well-known associations and hospitals, lean towards using non-pharmacological methods to support cognitive function and improve patients' quality of life. As the dementia symptoms related to cognition, learning, memory, speaking, problem-solving, social abilities, and daily activities gradually worsen over the years, many researchers know that cognitive support should start from the very beginning of the symptoms in order to slow down the decline. At this point, the lives of patients and caregivers can be improved with daily activities and applications, including but not limited to basic word puzzles, daily cleaning activities, and taking notes. These activities and their results should then be observed carefully, which is currently only possible during in-person meetings between the patient or caregiver and the M.D. in hospitals. Such meetings can be time-consuming, exhausting, and financially ineffective for hospitals, medical doctors, caregivers, and especially patients. On the other hand, digital support systems are showing positive results for all stakeholders of healthcare systems, as can be observed in countries that have adopted telemedicine. The biggest potential of our system is setting up inter-user communication in the best possible way. In this project, we propose machine learning based digitalization of validated traditional cognitive tests (e.g., MoCA, Afazi, left-right hemisphere), their analysis for high-quality follow-up, and communication systems for all stakeholders. This platform has high potential not only for patient tracking but also for making all stakeholders feel safe through all stages. As registered hospitals assign corresponding medical doctors to the system, these MDs are able to register their own patients and assign special tasks to each patient. With our integrated machine learning support, MDs are able to track the failure and success rates of each patient and also see general averages among similarly progressed patients. In addition, our platform supports multi-player technology, which lets patients play with their caregivers so that they feel safer at any point they are uncomfortable. By also gamifying daily household activities, patients will be able to repeat their social tasks, and we will provide non-pharmacological reminiscence therapy (RT, life review therapy). All collected data will be mined by our data scientists and analyzed meaningfully. In addition, we will add gamification modules for caregivers based on Naomi Feil's validation therapy, since behaving positively towards the patient and staying mentally healthy are both important for caregivers; we aim to provide a gamification-based therapy system for them, too. When this project accomplishes all of the above tasks, patients will have the chance to perform many tasks at home remotely, and MDs will be able to follow them up very effectively. We propose a complete platform, and the whole project is both time- and cost-effective for supporting all stakeholders.

Keywords: alzheimer’s, dementia, cognitive functionality, cognitive tests, serious games, machine learning, artificial intelligence, digitalization, non-pharmacological, data analysis, telemedicine, e-health, health-tech, gamification

Procedia PDF Downloads 137
1748 Study of Acoustic Resonance of Model Liquid Rocket Combustion Chamber and Its Suppression

Authors: Vimal O. Kumar, C. K. Muthukumaran, P. Rakesh

Abstract:

The combustion chamber of a liquid rocket engine (LRE) is subjected to pressure oscillation during the combustion process. Combustion noise (acoustic noise) is a broadband, small-amplitude, high-frequency pressure oscillation that constitutes only a minor fraction (< 1%) of the entire combustion process. However, this high-frequency oscillation is a major concern during the design phase of an LRE combustion chamber, as it can cause catastrophic failure of the chamber. Depending on the chamber geometry, certain frequencies form standing wave patterns; these resonate with high amplitude and are known as eigenmodes. These eigenmodes can cause failures unless they are suppressed to within safe limits. The modes are categorized into radial, tangential, and azimuthal modes, and their structure inside the combustion chamber is of interest to researchers. In the present work, experimental studies as well as numerical simulations are performed to obtain the frequency-amplitude characteristics of a model combustion chamber for different baffle configurations. The main objective of this study is to find the baffle configuration that provides the best suppression of the acoustic modes. The experimental study aims at measuring the frequency-amplitude characteristics at certain points on the chamber wall, and the experimental measurements are also used to validate the scheme used in the numerical simulation. In addition to the experiments, the numerical simulation provides the detailed structure of the eigenmodes exhibited and their level of suppression with the aid of the different baffle configurations.
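
As background on why geometry sets the eigenmodes, the transverse acoustic frequencies of a cylindrical chamber follow from the zeros of Bessel-function derivatives; the sketch below is generic, not the paper's model chamber, and its dimensions are illustrative:

```python
import numpy as np
from scipy.special import jnp_zeros

def transverse_mode_freqs(radius_m, sound_speed_ms, m_max=2, n_max=2):
    """Transverse (tangential/radial) acoustic eigenfrequencies of a
    cylindrical chamber: f_mn = c * a'_mn / (2*pi*R), where a'_mn is the
    n-th zero of the derivative of the Bessel function J_m."""
    freqs = {}
    for m in range(m_max + 1):
        roots = jnp_zeros(m, n_max)  # zeros of J_m'
        for n, a in enumerate(roots, start=1):
            freqs[(m, n)] = sound_speed_ms * a / (2 * np.pi * radius_m)
    return freqs

# Illustrative: 0.15 m radius model chamber, hot-gas sound speed ~1000 m/s;
# (1,1) is the first tangential mode, (0,1) the first radial mode.
for mode, f in transverse_mode_freqs(0.15, 1000.0).items():
    print(mode, round(f, 1), "Hz")
```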

Keywords: baffle, instability, liquid rocket engine, pressure response of chamber

Procedia PDF Downloads 122
1747 Using Support Vector Machines for Measuring Democracy

Authors: Tommy Krieger, Klaus Gruendler

Abstract:

We present a novel approach for measuring democracy, which enables a very detailed and sensitive index. The method is based on support vector machines, a mathematical algorithm for pattern recognition. Our implementation evaluates 188 countries for the period between 1981 and 2011. The Support Vector Machines Democracy Index (SVMDI) is continuous on the 0-1 interval and robust to variations in the numerical process parameters. The algorithm introduced here can be used for every concept of democracy without additional adjustments, and due to its flexibility it is also a valuable tool for comparative studies.
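
The abstract does not disclose the SVMDI mapping; one plausible sketch is support vector regression from indicator vectors to scores clipped to [0, 1], as below (the indicators and training labels are hypothetical):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training set: indicator vectors (e.g., election quality,
# press freedom, ...) for reference country-years with agreed scores in [0, 1].
X_ref = np.random.rand(300, 10)
y_ref = np.random.rand(300)

svmdi = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
svmdi.fit(X_ref, y_ref)  # learn the indicator -> democracy-score mapping

X_all = np.random.rand(188, 10)                   # 188 countries, one year
scores = np.clip(svmdi.predict(X_all), 0.0, 1.0)  # continuous index on [0, 1]
print(scores[:5])
```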

Keywords: democracy, democracy index, machine learning, support vector machines

Procedia PDF Downloads 378
1746 A Data-Driven Compartmental Model for Dengue Forecasting and Covariate Inference

Authors: Yichao Liu, Peter Fransson, Julian Heidecke, Jonas Wallin, Joacim Rockloev

Abstract:

Dengue, a mosquito-borne viral disease, poses a significant public health challenge in endemic tropical and subtropical countries, including Sri Lanka. To reveal insights into the complex dynamics of this disease and to study its drivers, a comprehensive model capable of both robust forecasting and insightful inference of drivers, while capturing the co-circulation of several virus strains, is essential. However, existing studies mostly focus on only one aspect at a time and do not integrate and carry insights across these siloed approaches. While mechanistic models have been developed to capture immunity dynamics, they are often oversimplified and lack the integration of all the diverse drivers of disease transmission. On the other hand, purely data-driven methods lack the constraints imposed by immuno-epidemiological processes, making them prone to overfitting and inference bias. This research presents a hybrid model that combines machine learning techniques with mechanistic modelling to overcome the limitations of existing approaches. Leveraging eight years of newly reported dengue case data, along with socioeconomic factors such as human mobility, weekly climate data from 2011 to 2018, genetic data detecting the introduction and presence of new strains, and estimates of seropositivity for different districts in Sri Lanka, we derive a data-driven vector (SEI) to human (SEIR) model across 16 regions of Sri Lanka at the weekly time scale. By conducting ablation studies, the lag effects of the time-varying climate factors, allowing delays of up to 12 weeks, were determined. The model demonstrates superior predictive performance over a pure machine learning approach at lead times of 5 and 10 weeks on data withheld from model fitting. It further reveals several interesting, interpretable findings about the drivers while adjusting for the dynamics and influence of immunity and the introduction of a new strain. The study uncovers strong influences of socioeconomic variables: population density, mobility, household income, and rural versus urban population. It reveals substantial sensitivity to the diurnal temperature range and precipitation, while mean temperature and humidity appear less important in the study location. Additionally, the model indicates sensitivity to the vegetation index, both maximum and average. Predictions on test data reveal high model accuracy. Overall, this study advances the knowledge of dengue transmission in Sri Lanka and demonstrates the importance of hybrid modelling techniques that use biologically informed model structures with flexible data-driven estimates of model parameters. The findings show the potential both for inference of drivers in situations of complex disease dynamics and for robust forecasting.
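
A compact skeleton of the vector (SEI) to human (SEIR) structure with scipy; here the rates are constants with illustrative values, whereas the paper makes them climate- and covariate-driven:

```python
import numpy as np
from scipy.integrate import solve_ivp

def sei_seir(t, y, beta_vh, beta_hv, sigma_v, sigma_h, gamma, mu_v):
    """Vector SEI compartments coupled to human SEIR; all rates per week."""
    Sv, Ev, Iv, Sh, Eh, Ih, Rh = y
    Nv, Nh = Sv + Ev + Iv, Sh + Eh + Ih + Rh
    dSv = mu_v * Nv - beta_hv * Sv * Ih / Nh - mu_v * Sv   # mosquito birth/death
    dEv = beta_hv * Sv * Ih / Nh - (sigma_v + mu_v) * Ev   # extrinsic incubation
    dIv = sigma_v * Ev - mu_v * Iv
    dSh = -beta_vh * Sh * Iv / Nv                          # human infection
    dEh = beta_vh * Sh * Iv / Nv - sigma_h * Eh            # intrinsic incubation
    dIh = sigma_h * Eh - gamma * Ih                        # recovery
    dRh = gamma * Ih
    return [dSv, dEv, dIv, dSh, dEh, dIh, dRh]

y0 = [2e6, 100.0, 10.0, 1e6, 20.0, 5.0, 0.0]
params = (0.3, 0.3, 0.7, 1.0, 0.7, 0.1)  # illustrative weekly rates
sol = solve_ivp(sei_seir, (0, 52), y0, args=params, dense_output=True)
weekly_infectious = sol.sol(np.arange(52))[5]  # infectious humans per week
```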

Keywords: compartmental model, climate, dengue, machine learning, social-economic

Procedia PDF Downloads 84