Search results for: machine failures
1957 Using Artificial Intelligence Technology to Build the User-Oriented Platform for Integrated Archival Service
Authors: Lai Wenfang
Abstract:
This study describes how artificial intelligence (AI) technology is used to build a user-oriented platform for integrated archival services. The platform will be launched in 2020 by the National Archives Administration (NAA) in Taiwan. With the progress of information and communication technology (ICT), the NAA has built many systems to provide archival services. To cope with new challenges such as new ICT, artificial intelligence and blockchain, the NAA will apply natural language processing (NLP) and machine learning (ML) to build a training model and propose suggestions based on the data sent to the platform. The NAA expects that the platform will not only automatically inform the sending agencies' staff which records catalogues violate the transfer or destruction rules, but also use the model to find details hidden in the catalogues and suggest to NAA staff whether the records should be retained or not, shortening the auditing time. The platform keeps all users' browsing trails, so it can predict which kinds of archives a user may be interested in, recommend search terms through visualization, and inform users of newly arrived archives. In addition, according to the Archives Act, NAA staff must spend a great deal of time marking or removing personal data, classified data, etc. before archives are provided. To upgrade the archives access service process, the platform will use text recognition patterns to black out such data automatically; staff only need to correct the errors and upload the corrected version, and as the platform learns, its accuracy will keep improving. In short, the purpose of the platform is to advance the government's digital transformation and implement the vision of a service-oriented smart government.
Keywords: artificial intelligence, natural language processing, machine learning, visualization
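As an illustration of the automatic blacking-out step described above, the sketch below masks personal-data patterns in catalogue text with simple regular expressions before staff review. The patterns (Taiwan national ID, mobile number, e-mail) and the redaction labels are illustrative assumptions, not the NAA's actual rules, which would also rely on trained NLP/ML models.

```python
import re

# Illustrative patterns only; real rules would follow the Archives Act categories
# and be combined with a trained NLP model for names and addresses.
PATTERNS = {
    "national_id": re.compile(r"\b[A-Z][12]\d{8}\b"),   # Taiwan ID card format
    "mobile": re.compile(r"\b09\d{8}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def black_out(text: str) -> str:
    """Replace personal-data matches with a labelled redaction block for staff review."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED:{label}]", text)
    return text

print(black_out("Contact Mr. Chen at 0912345678 or chen@example.org, ID A123456789."))
```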
Procedia PDF Downloads 174
1956 Chemical Hazards Impact on Efficiency of Energy Storage Battery and Its Possible Mitigations
Authors: Abirham Simeneh Ayalew, Seada Hussen Adem, Frie Ayalew Yimam
Abstract:
Battery energy storage plays a major role in storing energy harnessed from alternative resources and greatly benefits the power sector by supplying energy back to the system during outages and regular operation. Most studies show an exponential increase in the quantity of lithium-ion battery energy storage systems due to their power density, economics and performance. However, lithium-ion battery failures have resulted in fires and explosions because these batteries contain flammable electrolytes (chemicals) that can create such hazards. Hazards occurring in these energy storage systems shorten battery life span and reduce efficiency. Identifying the real causes of these hazards and their mitigation techniques can improve the efficiency of battery technologies; in addition, the electrode materials should have high electrical conductivity, large surface area, stable structure and low resistance. This paper assesses the real causes of chemical hazards, their impact on efficiency, proposed solutions for mitigating those hazards together with efficiency improvement, and a summary of researchers' recent findings in the field.
Keywords: battery energy storage, battery energy storage efficiency, chemical hazards, lithium ion battery
Procedia PDF Downloads 78
1955 Examination of Public Hospital Unions Technical Efficiencies Using Data Envelopment Analysis and Machine Learning Techniques
Authors: Songul Cinaroglu
Abstract:
Regional planning in health has gained speed in developing countries in recent years. In Turkey, 89 Public Hospital Unions (PHUs) were established at the provincial level. In this study, the technical efficiencies of the 89 PHUs were examined using Data Envelopment Analysis (DEA) and machine learning techniques, after dividing them into two clusters according to similarities in input and output indicators. Numbers of beds, physicians and nurses were chosen as input variables, and numbers of outpatients, inpatients and surgical operations as output indicators. Before performing DEA, the PHUs were grouped into two clusters. The first cluster represents PHUs with higher population, demand and service density than the others. The difference between clusters was statistically significant for all study variables (p < 0.001). After clustering, DEA was performed for the whole sample and for the two clusters separately. Overall, 11% of PHUs were efficient, while 21% and 17% were efficient in the first and second clusters, respectively. PHUs representing urban parts of the country, with higher population and service density, are thus more efficient than the others. A random forest decision tree graph shows that the number of inpatients, a measure of service density, is a determinative factor of PHU efficiency. It is advisable for public health policy makers to use statistical learning methods in resource planning decisions to improve efficiency in health care.
Keywords: public hospital unions, efficiency, data envelopment analysis, random forest
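To make the efficiency step concrete, the sketch below clusters hypothetical PHU indicator data with k-means and then computes input-oriented CCR efficiency scores with a small linear program. The column meanings and random data are placeholders, and the study's clustering details and random forest step are simplified away here.

```python
import numpy as np
from scipy.optimize import linprog
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o` (multiplier form).
    X: (n_dmu, n_inputs) inputs, Y: (n_dmu, n_outputs) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])                   # maximise u'y_o
    A_ub, b_ub = np.hstack([Y, -X]), np.zeros(n)               # u'y_j - v'x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v'x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

# Placeholder data: inputs = beds, physicians, nurses; outputs = outpatients, inpatients, operations.
rng = np.random.default_rng(1)
inputs = rng.uniform(100, 2000, size=(89, 3))
outputs = rng.uniform(1000, 50000, size=(89, 3))

clusters = KMeans(n_clusters=2, n_init=10, random_state=1).fit_predict(
    StandardScaler().fit_transform(np.hstack([inputs, outputs])))

for c_id in (0, 1):
    idx = np.where(clusters == c_id)[0]
    scores = [ccr_efficiency(inputs[idx], outputs[idx], i) for i in range(len(idx))]
    share = np.mean(np.isclose(scores, 1.0, atol=1e-6)) * 100
    print(f"cluster {c_id}: {share:.0f}% of PHUs efficient")
```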
Procedia PDF Downloads 126
1954 A Physiological Approach for Early Detection of Hemorrhage
Authors: Rabie Fadil, Parshuram Aarotale, Shubha Majumder, Bijay Guargain
Abstract:
Hemorrhage is the loss of blood from the circulatory system and a leading cause of battlefield and postpartum deaths. Early detection of hemorrhage remains the most effective strategy to reduce mortality caused by traumatic injuries. In this study, we investigated physiological changes via non-invasive cardiac signals at rest and under different hemorrhage conditions simulated through graded lower-body negative pressure (LBNP). Simultaneous electrocardiogram (ECG), photoplethysmogram (PPG), blood pressure (BP), impedance cardiogram (ICG), and phonocardiogram (PCG) signals were acquired from 10 participants (age: 28 ± 6 years, weight: 73 ± 11 kg, height: 172 ± 8 cm). The LBNP protocol consisted of applying -20, -30, -40, -50, and -60 mmHg pressure to the lower half of the body. Beat-to-beat heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) were extracted from the ECG and blood pressure signals. Systolic amplitude (SA), systolic time (ST), diastolic time (DT), and left ventricular ejection time (LVET) were extracted from the PPG during each stage. Preliminary results showed that the application of -40 mmHg, i.e. the moderate simulated hemorrhage stage, resulted in significant changes in HR (85 ± 4 bpm vs 68 ± 5 bpm, p < 0.01), ST (191 ± 10 ms vs 253 ± 31 ms, p < 0.05), LVET (350 ± 14 ms vs 479 ± 47 ms, p < 0.05) and DT (551 ± 22 ms vs 683 ± 59 ms, p < 0.05) compared to rest, while no change was observed in SA (p > 0.05) as a consequence of LBNP application. These findings demonstrate the potential of cardiac signals for detecting moderate hemorrhage. In future work, we will analyze all LBNP stages and investigate the feasibility of other physiological signals to develop a predictive machine learning model for early detection of hemorrhage.
Keywords: blood pressure, hemorrhage, lower-body negative pressure, LBNP, machine learning
Procedia PDF Downloads 167
1953 Real-Time Generative Architecture for Mesh and Texture
Abstract:
In the evolving landscape of physics-based machine learning (PBML), particularly within fluid dynamics and its applications in electromechanical engineering, robot vision, and robot learning, achieving precision and alignment with researchers' specific needs presents a formidable challenge. In response, this work proposes a methodology that integrates neural transformation with a modified smoothed particle hydrodynamics model for generating transformed 3D fluid simulations. The approach is useful for nanoscale science, where the unique and complex behaviors of viscoelastic media demand accurate neurally transformed simulations for materials understanding and manipulation. In electromechanical engineering, the method enhances the design and functionality of fluid-operated systems, particularly microfluidic devices, contributing to advancements in nanomaterial design, drug delivery systems, and more. The proposed approach also aligns with the principles of PBML, offering advantages such as multi-fluid stylization and consistent particle attribute transfer. This capability is valuable in fields where the interaction of multiple fluid components is significant. Moreover, the application of neurally transformed hydrodynamical models extends to manufacturing processes, such as the production of microelectromechanical systems, enhancing efficiency and cost-effectiveness. The system's ability to perform neural transfer on 3D fluid scenes using a deep learning algorithm alongside physical models adds a further layer of flexibility, allowing researchers to tailor simulations to specific needs across scientific and engineering disciplines.
Keywords: physics-based machine learning, robot vision, robot learning, hydrodynamics
Procedia PDF Downloads 66
1952 Social Media as a Means of Participation in Democracies
Abstract:
Social media is one of the most important and effective means of social interaction, through which people create, share and exchange ideas via photos, videos or voice messages. Although there are many communication tools, social media sites are the most prominent ones, allowing users to express themselves in a matter of seconds all around the world at almost no expense; consequently, they became very popular and widespread soon after their emergence. As the usage of social media increases, it becomes an effective instrument in social matters. While it is possible to use social media to emphasize basic human rights and protest the failures of a government, as in the 'Arab Spring', it is also possible to spread propaganda and misinformation to cause long-lasting insurgency, upheaval, turmoil or disorder, as an instrument of intervention in internal affairs and state sovereignty by hostile groups or countries. Social media certainly has positive effects on participation in democracies, allowing people to express themselves freely and without limit, but its misuse is also very common, and even a five-minute video recording can topple a government or give a government a solid reason to review its policies in certain areas. As one of the most important and effective means of participation, social media thus presents opportunities as well as risks. In this study, the place of social media in democratic participation is examined in light of those opportunities and risks.
Keywords: social media, democracy, participation, risks, opportunities
Procedia PDF Downloads 422
1951 Prophet and Philosopher Mohammed: A Precursor of Feminism
Authors: Mohammad Mozammel Haque
Abstract:
That feminism is the belief that women should have the same rights as men needs no elaboration. The history of modern Western feminism is divided into three waves, each dealing with different aspects of the same feminist issues. The first wave refers to the movement of the 19th through early 20th centuries, which dealt mainly with suffrage, working conditions and educational rights for women. The second wave (1960s-1980s) dealt with the inequality of laws and the role of women in society. The third wave (late 1980s-early 2000s) is seen both as a continuation of the second wave and as a response to its perceived failures. Mary Wollstonecraft struggled for the emancipation and freedom of the women of Europe, Begum Rokeya brought about a revolution for the women of East and West Bengal, and Jeremy Bentham wrote for the independence of women in England. But if feminism refers to the movement of giving women what they deserve, it is no overstatement to say that Mohammad is the precursor of what we call feminism. This paper investigates the background of the official beginning of feminism and the situation of the women of Muhammad's time. Besides showing that this prophet and philosopher first brought about a movement for the education and rights of women and rescued them from being buried alive, the article also delineates the endeavours he made to give women what they ought to have.
Keywords: education, equality, feminism, precursor
Procedia PDF Downloads 498
1950 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication
Authors: Vedant Janapaty
Abstract:
Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the 'kidneys of our planet', they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and loss of native species are ailing our wetlands. There is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate derived metrics with in situ observations collected at five estuaries. Satellite images were processed to calculate 7 spectral index bands (SIs) using Python, and average SI values were calculated per month for 23 years. Publicly available data from 6 sites at ELK were used to obtain 10 observed parameters (OPs), whose average values were likewise calculated per month for 23 years. Linear correlations between the 7 SIs and 10 OPs proved inadequate (correlation = 1 to 64%). Fourier transform analysis was then performed on the 7 SIs; dominant frequencies and amplitudes were extracted, and a machine learning (ML) model was trained, validated, and tested for the 10 OPs. Better correlations were observed between SIs and OPs with certain time delays (0, 3, 4, and 6 months), and ML was performed again. The OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It shows that remote sensing can be used to develop correlations with the critical in situ parameters that measure eutrophication and can be used by practitioners to easily monitor wetland health.
Keywords: estuary, remote sensing, machine learning, Fourier transform
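A minimal sketch of the Fourier feature-extraction step described above: it takes a monthly spectral-index series, removes the mean, and returns the dominant frequencies and amplitudes that would then feed a regression model. The sampling rate, crude gap handling and number of components kept are assumptions for illustration.

```python
import numpy as np

def dominant_components(monthly_series, k=3, fs=12.0):
    """Top-k (frequency, amplitude) pairs of a monthly series via the real FFT.
    fs = 12 samples per year, so frequencies come out in cycles per year."""
    x = np.asarray(monthly_series, dtype=float)
    x = np.nan_to_num(x - np.nanmean(x))         # de-mean and fill gaps crudely
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    amps = np.abs(spectrum) / x.size
    top = np.argsort(amps)[::-1][:k]
    return list(zip(freqs[top].round(3), amps[top].round(4)))

# Example: 23 years of a synthetic index with an annual cycle plus noise.
t = np.arange(23 * 12)
index_series = (0.4 + 0.1 * np.sin(2 * np.pi * t / 12)
                + 0.02 * np.random.default_rng(0).standard_normal(t.size))
print(dominant_components(index_series))          # the 1 cycle/year term dominates
```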
Procedia PDF Downloads 104
1949 An IoT-Enabled Crop Recommendation System Utilizing Message Queuing Telemetry Transport (MQTT) for Efficient Data Transmission to AI/ML Models
Authors: Prashansa Singh, Rohit Bajaj, Manjot Kaur
Abstract:
In the modern agricultural landscape, precision farming has emerged as a pivotal strategy for enhancing crop yield and optimizing resource utilization. This paper introduces an innovative Crop Recommendation System (CRS) that leverages Internet of Things (IoT) technology and the Message Queuing Telemetry Transport (MQTT) protocol to collect critical environmental and soil data via sensors deployed across agricultural fields. The system is designed to address the challenges of real-time data acquisition, efficient data transmission, and dynamic crop recommendation through the application of advanced Artificial Intelligence (AI) and Machine Learning (ML) models. The CRS architecture encompasses a network of sensors that continuously monitor environmental parameters such as temperature, humidity, soil moisture, and nutrient levels. The sensor data are transmitted to a central MQTT server, ensuring reliable and low-latency communication even in the bandwidth-constrained scenarios typical of rural agricultural settings. Upon reaching the server, the data are processed and analyzed by AI/ML models trained to correlate specific environmental conditions with optimal crop choices and cultivation practices. These models consider historical crop performance data, current agricultural research, and real-time field conditions to generate tailored crop recommendations. The implementation achieves 99% accuracy.
Keywords: IoT, MQTT protocol, machine learning, sensor, publish, subscriber, agriculture, humidity
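A minimal sketch of the publish/subscribe flow described above, assuming the paho-mqtt client library and a previously trained crop model saved with joblib. The broker address, topic name, feature order and model file are placeholders, not the paper's actual configuration.

```python
import json
import joblib                                   # assumed: trained model stored with joblib
import paho.mqtt.client as mqtt                 # assumed MQTT client library

BROKER, TOPIC = "farm-gateway.local", "field/plot7/sensors"   # placeholder names

# --- field node: publish one sensor reading --------------------------------
def publish_reading(temperature, humidity, soil_moisture, n, p, k):
    payload = json.dumps({"temp": temperature, "hum": humidity,
                          "moist": soil_moisture, "N": n, "P": p, "K": k})
    client = mqtt.Client()                      # paho-mqtt 1.x constructor; 2.x needs a CallbackAPIVersion
    client.connect(BROKER, 1883, keepalive=60)
    client.publish(TOPIC, payload, qos=1)       # QoS 1: at-least-once delivery
    client.disconnect()

# --- server side: subscribe and feed readings to the crop model ------------
model = joblib.load("crop_recommender.pkl")     # hypothetical artefact

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    features = [[reading["N"], reading["P"], reading["K"],
                 reading["temp"], reading["hum"], reading["moist"]]]
    print("recommended crop:", model.predict(features)[0])

subscriber = mqtt.Client()
subscriber.on_message = on_message
subscriber.connect(BROKER, 1883)
subscriber.subscribe(TOPIC, qos=1)
subscriber.loop_forever()
```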
Procedia PDF Downloads 69
1948 Fuzzy Inference System for Risk Assessment Evaluation of Wheat Flour Product Manufacturing Systems
Authors: Atrin Barzegar, Yas Barzegar, Stefano Marrone, Francesco Bellini, Laura Verde
Abstract:
The aim of this research is to develop an intelligent system to analyze the risk level of a wheat flour product manufacturing system. The model consists of five Fuzzy Inference Systems arranged in two layers. The first layer contains four Fuzzy Inference Systems, each with three criteria, covering Physical, Chemical, Biological and Environmental failures; their outputs become the inputs of the final, manufacturing-system-level inference system. The proposed model, based on Mamdani Fuzzy Inference Systems, gives a performance ranking of wheat flour product manufacturing systems. The first step is obtaining data to identify the failure modes from experts' opinions. The second step is the fuzzification process, which converts crisp inputs to fuzzy sets; the IF-THEN fuzzy rules are then applied through the inference engine, and in the final step the defuzzification process converts the fuzzy output into real numbers.
Keywords: failure modes, fuzzy rules, fuzzy inference system, risk assessment
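The sketch below shows the fuzzification → rule evaluation → defuzzification chain for one first-layer system, assuming the scikit-fuzzy (skfuzzy) package; the universes, membership functions and rules are illustrative stand-ins for the expert-derived ones in the study.

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

universe = np.arange(0, 10.1, 0.1)
physical = ctrl.Antecedent(universe, "physical_failure")
chemical = ctrl.Antecedent(universe, "chemical_failure")
risk = ctrl.Consequent(universe, "risk")

# Fuzzification: triangular membership functions (illustrative shapes).
for var in (physical, chemical, risk):
    var["low"] = fuzz.trimf(var.universe, [0, 0, 5])
    var["medium"] = fuzz.trimf(var.universe, [2, 5, 8])
    var["high"] = fuzz.trimf(var.universe, [5, 10, 10])

# IF-THEN rules applied through the inference engine (expert rules would replace these).
rules = [
    ctrl.Rule(physical["high"] | chemical["high"], risk["high"]),
    ctrl.Rule(physical["medium"] & chemical["medium"], risk["medium"]),
    ctrl.Rule(physical["low"] & chemical["low"], risk["low"]),
]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["physical_failure"] = 7.2
sim.input["chemical_failure"] = 3.5
sim.compute()                                   # defuzzification (centroid by default)
print("risk score:", round(sim.output["risk"], 2))
```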
Procedia PDF Downloads 75
1947 Template Design Packages for Repetitive Construction Projects
Authors: Ali Youniss Aidbaiss, G. Unnikrishnan, Anoob Hakim
Abstract:
Scope changes, scope creep, and cost and time overruns have become common in projects in the oil and gas sector. Even in repetitive projects, failure to implement lessons learnt and correct past mistakes has resulted in various setbacks. This paper describes the concept of reusing successfully implemented design packages as templates for repetitive projects, thereby lowering the instances of project failure. Units or systems successfully installed in previous projects can be identified and taken up for preparing template design packages. Standardization of units and systems helps to develop templates from successful designs which can be reused repeatedly with confidence. These packages can be used with minimum modifications to develop FEED packages faster, saving cost and other valuable resources. Lessons learnt from completed projects, incorporated in the templates, avoid repeating past mistakes during detailed design, procurement and execution. With template packages, consistent quality can be maintained for similar projects, avoiding scope creep and scope changes, which ultimately results in cost and time savings.
Keywords: engineering work package, repetitive construction, template design package, time saving in projects
Procedia PDF Downloads 318
1946 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities
Authors: Anudeep Appe, Bhanu Poluparthi, Lakshmi Kasivajjula, Udai Mv, Sobha Bagadi, Punya Modi, Aditya Singh, Hemanth Gunupudi, Spenser Troiano, Jeff Paul, Justin Stovall, Justin Yamamoto
Abstract:
The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework that helps identify the factors impacting the market share of a healthcare provider facility or hospital (hereafter termed a facility) is therefore of key importance. This pilot study aims at developing a data-driven machine learning regression framework that aids strategists in formulating key decisions to improve a facility's market share, which in turn improves the quality of healthcare services. The US (United States) healthcare business is chosen for the study, covering 60 key facilities in Washington State and about 3 years of historical data. In the current analysis, market share is defined as the ratio of a facility's encounters to the total encounters among the group of potential competitor facilities. The study proposes a two-pronged approach: competitor identification, followed by a regression approach to evaluate and predict market share. The model-agnostic technique SHAP is leveraged to quantify the relative importance of the features impacting market share. Typical techniques in the literature quantify the degree of competitiveness among facilities with an empirically calculated competitive factor that interprets the severity of competition. The proposed method instead identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust because it is data-driven, which minimizes the bias of empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder to factor in various business rules (for example, quantifying patient exchanges, provider references, and sister facilities). Multiple groups of competitors among facilities are identified. Leveraging the identified competitors, a Random Forest regression model is developed and fine-tuned to predict market share. To identify key drivers of market share at an overall level, permutation feature importance of the attributes is calculated. For relative quantification of features at the facility level, SHAP (SHapley Additive exPlanations), a model-agnostic explainer, is incorporated; it helps identify and rank the attributes impacting market share at each facility. This approach amalgamates two popular and efficient modeling practices, machine learning with graphs and tree-based regression, to reduce bias and drive strategic business decisions.
Keywords: competition, DAGs, facility, healthcare, machine learning, market share, random forest, SHAP
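A minimal sketch of the prediction-and-explanation stage described above, assuming scikit-learn and the shap package; the input file, column names and hyperparameters are placeholders, and the competitor-identification/DAG stage is not reproduced here.

```python
import pandas as pd
import shap                                        # assumed available
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Hypothetical facility-level feature table with a market_share target column.
df = pd.read_csv("facility_features.csv")          # placeholder path
X, y = df.drop(columns=["market_share"]), df["market_share"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)

# Overall drivers: permutation importance on held-out data.
pi = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
overall = pd.Series(pi.importances_mean, index=X.columns).sort_values(ascending=False)
print(overall.head(10))

# Facility-level drivers: SHAP values from a tree explainer.
explainer = shap.TreeExplainer(model)
facility_drivers = pd.DataFrame(explainer.shap_values(X_te),
                                columns=X.columns, index=X_te.index)
print(facility_drivers.abs().mean().sort_values(ascending=False).head(10))
```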
Procedia PDF Downloads 91
1945 Design, Synthesis and Evaluation of 4-(Phenylsulfonamido)Benzamide Derivatives as Selective Butyrylcholinesterase Inhibitors
Authors: Sushil Kumar Singh, Ashok Kumar, Ankit Ganeshpurkar, Ravi Singh, Devendra Kumar
Abstract:
Within the spectrum of neurodegenerative diseases, Alzheimer's disease (AD) is characterized by the presence of amyloid β plaques and neurofibrillary tangles in the brain. It results in cognitive and memory impairment due to loss of cholinergic neurons, which is considered one of the contributing factors. Donepezil, an acetylcholinesterase (AChE) inhibitor that also inhibits butyrylcholinesterase (BuChE) and improves memory and the brain's cognitive functions, is the most successful and most prescribed drug for treating the symptoms of AD. The present work is based on the design of selective BuChE inhibitors using computational techniques. Machine learning models were trained using classification algorithms, followed by screening of a diverse chemical library of compounds. Various molecular modelling and simulation techniques were used to obtain the virtual hits. The amide derivatives of 4-(phenylsulfonamido)benzoic acid were synthesized and characterized using 1H & 13C NMR, FTIR and mass spectrometry. Enzyme inhibition assays were performed on equine plasma BuChE and electric eel AChE by the method developed by Ellman et al. Compounds 31, 34, 37, 42, 49, 52 and 54 were found to be active against equine BuChE. N-(2-chlorophenyl)-4-(phenylsulfonamido)benzamide and N-(2-bromophenyl)-4-(phenylsulfonamido)benzamide (compounds 34 and 37) displayed IC50 values of 61.32 ± 7.21 and 42.64 ± 2.17 nM against equine plasma BuChE. Ortho-substituted derivatives were more active against BuChE, and the ortho-halogen and ortho-alkyl substituted derivatives were the most active of all, with minimal AChE inhibition. The compounds were selective toward BuChE.
Keywords: Alzheimer disease, butyrylcholinesterase, machine learning, sulfonamides
Procedia PDF Downloads 139
1944 Prediction of Covid-19 Cases and Current Situation of Italy and Its Different Regions Using Machine Learning Algorithm
Authors: Shafait Hussain Ali
Abstract:
The Covid-19 disease is caused by the coronavirus SARS-CoV-2. Italy was the first Western country to be severely affected, and the first country to take drastic measures to control the disease. At the start of December 2019, a sudden outbreak of coronavirus disease caused by the new coronavirus (SARS-CoV-2) of acute respiratory syndrome occurred in the Chinese city of Wuhan. The World Health Organization declared the epidemic a public health emergency of international concern on January 30, 2020. By February 14, 2020, 49,053 laboratory-confirmed cases and 1,481 deaths had been reported worldwide. The threat of the disease has forced most governments to implement various control measures. It therefore becomes necessary to analyze the Italian data carefully, in particular to investigate and present the current situation and the number of infected persons, in the form of positive cases, deaths, hospitalizations and other features of infected persons, in a simple form. The model used should clearly show the real facts and figures and be understandable to every reader, so that anyone can obtain real benefit from reading it. The model must include all the features (total positive cases, current positive cases, hospitalized patients, deaths, recovery rates) needed to explain the wide range of facts in a simple form that is helpful to the administration of the country.
Keywords: machine learning tools and techniques, RapidMiner tool, Naive Bayes algorithm, predictions
Procedia PDF Downloads 107
1943 Digital Preservation: A Need of Tomorrow
Authors: Gaurav Kumar
Abstract:
Digital libraries have been established all over the world to create, maintain and preserve digital materials. This paper outlines the importance and objectives of digital preservation, the hardware and software technologies necessary to interpret digital documents, and various other aspects of digital preservation.
Keywords: preservation, digital preservation, conservation, archive, repository, document, information technology, hardware, software, organization, machine readable format
Procedia PDF Downloads 588
1942 Reinforced Concrete Foundation for Turbine Generators
Authors: Siddhartha Bhattacharya
Abstract:
Steam Turbine-Generators (STG) and Combustion Turbine-Generators (CTG) are used in almost all modern petrochemical, LNG and power plant facilities. The reinforced concrete table-top foundations required to support this heavy, high-speed rotating machinery are among the most critical and challenging structures on any industrial project. The paper illustrates, through a practical example, the step-by-step procedure adopted in designing a table-top foundation supported on piles for a steam turbine generator with an operating speed of 60 Hz. A finite element model of the table-top foundation is generated in ANSYS, with the piles modeled as spring-damper elements (COMBIN14). Basic loads are adopted in the analysis and design of the foundation based on the vendor requirements, industry standards, and relevant ASCE and ACI code provisions. Static serviceability checks are performed with the Misalignment Tolerance Matrix (MTM) method, in which the percentage of misalignment at a given bearing due to displacement at another bearing is calculated and kept within the criteria stipulated by the vendor, so that the machine rotor can sustain the stresses developed by this misalignment. Dynamic serviceability checks are performed through modal and forced vibration analysis, where the foundation is checked for resonance and allowable amplitudes as stipulated by the machine manufacturer. The reinforced concrete design of the foundation is performed by calculating the axial force, bending moment and shear at each critical section; these values are obtained through area integrals of the element stresses at the critical locations. The design is done as per ACI 318-05.
Keywords: steam turbine generator foundation, finite element, static analysis, dynamic analysis
Procedia PDF Downloads 295
1941 Fatigue Life Estimation Using N-Code for Drive Shaft of Passenger Vehicle
Authors: Tae An Kim, Hyo Lim Kang, Hye Won Han, Seung Ho Han
Abstract:
The drive shaft of a passenger vehicle transmits the engine torque from the gearbox and differential gears to the wheels. It must also compensate for all variations in angle or length resulting from manoeuvring and deflection, for perfect synchronization between joints. Torsional fatigue failures occur frequently at the connection parts of the spline joints at the end of the drive shaft. In this study, the fatigue life of a passenger-vehicle drive shaft was estimated using finite element analysis. The commercial software n-Code was applied under twisting load conditions, i.e. 0~134 kgf·m and 0~188 kgf·m, in which the shear strain range-fatigue life relationship was taken into account, considering the Signed Shear method, the Smith-Watson-Topper equation, the Neuber-Hoffman-Seeger method, the size sensitivity factor and the surface roughness effect. The estimated fatigue life was verified by a twisting load test of the real drive shaft in a test rig. (Human Resource Training Project for Industry Matched R & D, KIAT, N036200004).
Keywords: drive shaft, fatigue life estimation, passenger vehicle, shear strain range-fatigue life relationship, torsional fatigue failure
Procedia PDF Downloads 275
1940 ALEF: An Enhanced Approach to Arabic-English Bilingual Translation
Authors: Abdul Muqsit Abbasi, Ibrahim Chhipa, Asad Anwer, Saad Farooq, Hassan Berry, Sonu Kumar, Sundar Ali, Muhammad Owais Mahmood, Areeb Ur Rehman, Bahram Baloch
Abstract:
Accurate translation between structurally diverse languages, such as Arabic and English, presents a critical challenge in natural language processing due to significant linguistic and cultural differences. This paper investigates the effectiveness of Facebook's mBART model, fine-tuned specifically for sequence-to-sequence (seq2seq) translation tasks between Arabic and English, and enhanced through advanced refinement techniques. Our approach leverages the Alef Dataset, a meticulously curated parallel corpus spanning various domains to capture the linguistic richness, nuances, and contextual accuracy essential for high-quality translation. We further refine the model's output using advanced language models such as GPT-3.5 and GPT-4, which improve fluency and coherence and correct grammatical errors in the translated texts. The fine-tuned model demonstrates substantial improvements, achieving a BLEU score of 38.97, a METEOR score of 58.11, and a TER score of 56.33, surpassing widely used systems such as Google Translate. These results underscore the potential of mBART, combined with refinement strategies, to bridge the translation gap between Arabic and English, providing a reliable, context-aware machine translation solution that is robust across diverse linguistic contexts.
Keywords: natural language processing, machine translation, fine-tuning, Arabic-English translation, transformer models, seq2seq translation, translation evaluation metrics, cross-linguistic communication
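For orientation, the sketch below shows Arabic-to-English generation with a public mBART-50 checkpoint from the Hugging Face transformers library, standing in for the paper's fine-tuned model, plus a corpus BLEU computation with sacrebleu. The checkpoint name and decoding settings are assumptions, not the authors' configuration.

```python
import sacrebleu
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

# Public many-to-many mBART-50 checkpoint as a stand-in for the fine-tuned ALEF model.
name = "facebook/mbart-large-50-many-to-many-mmt"
tokenizer = MBart50TokenizerFast.from_pretrained(name)
model = MBartForConditionalGeneration.from_pretrained(name)

def translate_ar_to_en(text: str) -> str:
    tokenizer.src_lang = "ar_AR"
    inputs = tokenizer(text, return_tensors="pt")
    generated = model.generate(**inputs,
                               forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],
                               num_beams=5, max_length=128)
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

hypotheses = [translate_ar_to_en("مرحبا بالعالم")]
references = [["Hello, world"]]                 # one reference stream aligned with hypotheses
print(sacrebleu.corpus_bleu(hypotheses, references).score)
```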
Procedia PDF Downloads 9
1939 Unsupervised Echocardiogram View Detection via Autoencoder-Based Representation Learning
Authors: Andrea Treviño Gavito, Diego Klabjan, Sanjiv J. Shah
Abstract:
Echocardiograms serve as pivotal resources for clinicians in diagnosing cardiac conditions, offering non-invasive insights into a heart’s structure and function. When echocardiographic studies are conducted, no standardized labeling of the acquired views is performed. Employing machine learning algorithms for automated echocardiogram view detection has emerged as a promising solution to enhance efficiency in echocardiogram use for diagnosis. However, existing approaches predominantly rely on supervised learning, necessitating labor-intensive expert labeling. In this paper, we introduce a fully unsupervised echocardiographic view detection framework that leverages convolutional autoencoders to obtain lower dimensional representations and the K-means algorithm for clustering them into view-related groups. Our approach focuses on discriminative patches from echocardiographic frames. Additionally, we propose a trainable inverse average layer to optimize decoding of average operations. By integrating both public and proprietary datasets, we obtain a marked improvement in model performance when compared to utilizing a proprietary dataset alone. Our experiments show boosts of 15.5% in accuracy and 9.0% in the F-1 score for frame-based clustering, and 25.9% in accuracy and 19.8% in the F-1 score for view-based clustering. Our research highlights the potential of unsupervised learning methodologies and the utilization of open-sourced data in addressing the complexities of echocardiogram interpretation, paving the way for more accurate and efficient cardiac diagnoses.
Keywords: artificial intelligence, echocardiographic view detection, echocardiography, machine learning, self-supervised representation learning, unsupervised learning
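A minimal sketch of the representation-learning-plus-clustering idea described above, using PyTorch and scikit-learn: a small convolutional autoencoder compresses echo patches and k-means groups the latent codes into view clusters. The patch size, architecture and full-batch training loop are illustrative assumptions; the paper's discriminative patch extraction and trainable inverse average layer are not reproduced.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class PatchAutoencoder(nn.Module):
    """Small convolutional autoencoder for 64x64 grayscale echo patches."""
    def __init__(self, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 32
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def cluster_views(patches, n_views=8, epochs=20):
    """patches: float tensor of shape (N, 1, 64, 64), scaled to [0, 1]."""
    model = PatchAutoencoder()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):                     # full-batch training for brevity
        recon, _ = model(patches)
        loss = loss_fn(recon, patches)
        opt.zero_grad(); loss.backward(); opt.step()
    with torch.no_grad():
        _, z = model(patches)
    return KMeans(n_clusters=n_views, n_init=10).fit_predict(z.numpy())
```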
Procedia PDF Downloads 32
1938 Unraveling the Enigma of Military Coups through the Lens of State Fragility: A Qualitative Exploration of the Malian and Burkinabe Case
Authors: Deretha Bester
Abstract:
This article explores the recent military coups in Mali (August 2020) and Burkina Faso (January 2022), utilizing qualitative case study analyses to examine the pre-coup domestic conditions that precipitated the events. By framing the research through the conceptual lens of state fragility, it identifies key political, economic, and societal factors that converge to create an environment conducive to coups. From the analyses, the study discusses several patterns that emerged, all revealing the significance of the core functions of governance. Through an in-depth exploration that brings the state back into the coup debate, the study provides rich insights into the complex dynamics of military intervention in political affairs, highlighting the urgency of understanding the underlying domestic factors that can lead to radical political change. By illuminating these intricate dynamics, the article seeks to provide the detailed insights needed to fully understand the challenges moulding the region's political terrain.
Keywords: governance failures, military coups, political dynamics, Sahel region, state fragility
Procedia PDF Downloads 63
1937 1-D Convolutional Neural Network Approach for Wheel Flat Detection for Freight Wagons
Authors: Dachuan Shi, M. Hecht, Y. Ye
Abstract:
With the trend of digitalization in railway freight transport, a large number of freight wagons in Germany have been equipped with telematics devices, commonly placed on the wagon body. A telematics device contains a GPS module for tracking and a 3-axis accelerometer for shock detection. Beyond these basic functions, it is desirable to use the integrated accelerometer for condition monitoring without any additional sensors. Wheel flats, a common type of failure on the wheel tread, cause large impacts on wagons and infrastructure as well as impulsive noise; a large wheel flat may even cause safety issues such as derailment. This paper therefore proposes a machine learning approach for wheel flat detection using car body accelerations. Because of the suspension system, the impulsive signals caused by wheel flats are damped significantly and can be buried in signal noise and disturbances, which makes detecting wheel flats from car body accelerations very challenging. The proposed algorithm uses the envelope spectrum of the car body accelerations to reduce the effect of noise and disturbances. A 1-D convolutional neural network (CNN), a well-known deep learning method, is then constructed to automatically extract features in the envelope-frequency domain and perform classification. The CNN is trained and tested on field test data measured on the underframe of a tank wagon with a wheel flat of 20 mm length under operational conditions. The test results demonstrate the good performance of the proposed algorithm for real-time fault detection.
Keywords: fault detection, wheel flat, convolutional neural network, machine learning
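A minimal sketch of the processing chain described above: a Hilbert-transform envelope spectrum computed with SciPy feeds a small 1-D CNN built in PyTorch. The spectrum length, network depth and layer sizes are assumptions for illustration, not the architecture used in the paper.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import hilbert

def envelope_spectrum(accel, n_bins=512):
    """Magnitude spectrum of the signal envelope (via the Hilbert transform)."""
    envelope = np.abs(hilbert(accel - np.mean(accel)))
    spectrum = np.abs(np.fft.rfft(envelope))
    spectrum = spectrum / (spectrum.max() + 1e-12)
    return spectrum[:n_bins].astype(np.float32)

class WheelFlatCNN(nn.Module):
    """1-D CNN classifying envelope spectra as healthy (0) or wheel flat (1)."""
    def __init__(self, n_bins=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.Flatten(),
            nn.Linear(32 * (n_bins // 16), 2),
        )

    def forward(self, x):                        # x shape: (batch, 1, n_bins)
        return self.net(x)

# Usage sketch on a synthetic record (the real data are field measurements).
accel = np.random.default_rng(0).standard_normal(8192)
x = torch.from_numpy(envelope_spectrum(accel))[None, None, :]
logits = WheelFlatCNN()(x)                       # training loop omitted for brevity
print(logits.shape)                              # torch.Size([1, 2])
```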
Procedia PDF Downloads 131
1936 Milling Process of Rigid Flex Printed Circuit Board to Which Polyimide Covers the Whole Surface
Authors: Daniela Evtimovska, Ivana Srbinovska, Padraig O’Rourke
Abstract:
Kostal Macedonia faces the challenge of milling a rigid-flex printed circuit board (PCB). The PCB elaborated in this paper is made of FR4 material covered with polyimide over the whole surface on one side, including the tabs where the PCBs need to be separated. After milling only 1.44 meters, the updraft routing tool is no longer effective and leaves polyimide debris on all PCB cuts if milling continues with the same tool. The updraft routing tool is used for all other products in Kostal Macedonia and is normally changed after milling 60 meters; changing the tool adds 80 seconds to the cycle time. One solution is to use a laser-cutting machine, but buying a laser-cutting machine for cutting only one product does not make financial sense. The focus is therefore on finding an internal solution, among the options under review, to solve the polyimide debris issue. The paper describes the design of the rigid-flex panel in detail and evaluates the downdraft routing tool as a possible solution for this specific product. A comparison between updraft and downdraft routing tools is made from technical and financial points of view, taking into consideration the customer requirements for the rigid-flex PCB. The results show that using the downdraft routing tool is the best solution in this case. This tool is 0.62 euros per piece more expensive than the updraft tool, but it needs to be changed only after milling 43.44 meters, compared with only 1.44 meters for the updraft tool. The paper also analyses which actions should be taken for further improvement and for maximizing the service life of the downdraft routing tool.
Keywords: Kostal Macedonia, rigid flex PCB, polyimide, debris, milling process, up/down draft routing tool
Procedia PDF Downloads 193
1935 Problems of Innovation Development of Wireless Data Transfer Branch in the Cellular Market of Kazakhstan
Authors: Yessengeldy Kuanyshpayev
Abstract:
In some countries of the world the cellular market is now at the point of saturation, while in others positive dynamics of development continue. The reasons differ, but these markets are united by their general susceptibility to innovative change, provided the changes are genuinely innovative. The cellular market of Kazakhstan, for example, is characterized by a low percentage of smartphones among consumers, low population density, insufficient capacity of the 3G channel, and the absence of universal access to LTE technology, all of which limit the dynamic growth of this branch. These issues are aggravated by the failure of private companies' commercial start-up projects, which prevents new products from being implemented and widely adopted among consumers. The object of the research is the possible integration of wireless and software technologies, through whose introduction an idea can develop into an innovation. Analysis of existing projects in the market, and of the possible union of these technologies through the prism of the theoretical bases of innovative activity, shows that a company can develop and introduce innovations efficiently only through strict observance of all terms and conditions of the innovation process, whose main condition is profit. Although the innovativeness of companies is a very popular issue on a global scale, there is no research on the possibility of innovative breakthroughs in the field of wireless Internet access in the cellular market of Kazakhstan.
Keywords: innovation, the effectiveness of company, commercialization, cellular market
Procedia PDF Downloads 394
1934 An Efficient Machine Learning Model to Detect Metastatic Cancer in Pathology Scans Using Principal Component Analysis Algorithm, Genetic Algorithm, and Classification Algorithms
Authors: Bliss Singhal
Abstract:
Machine learning (ML) is a branch of Artificial Intelligence (AI) in which computers analyze data and find patterns in the data. This study focuses on the detection of metastatic cancer using ML. Metastatic cancer is the stage at which cancer has spread to other parts of the body and is the cause of approximately 90% of cancer-related deaths. Normally, pathologists spend hours each day manually classifying whether tumors are benign or malignant. This tedious task contributes to metastases being mislabeled over 60% of the time and emphasizes the importance of being aware of human error and other inefficiencies. ML is a good candidate to improve the correct identification of metastatic cancer, potentially saving thousands of lives, and can also improve the speed and efficiency of the process, requiring fewer resources and less time. So far, the deep learning methodology of AI has been used in research to detect cancer. This study takes a novel approach by determining the potential of combining preprocessing algorithms with classification algorithms for detecting metastatic cancer. The study used two preprocessing algorithms, principal component analysis (PCA) and a genetic algorithm, to reduce the dimensionality of the dataset, and then three classification algorithms, logistic regression, a decision tree classifier, and k-nearest neighbors, to detect metastatic cancer in the pathology scans. The highest accuracy, 71.14%, was produced by the ML pipeline comprising PCA, the genetic algorithm, and the k-nearest neighbors algorithm, suggesting that preprocessing and classification algorithms have great potential for detecting metastatic cancer.
Keywords: breast cancer, principal component analysis, genetic algorithm, k-nearest neighbors, decision tree classifier, logistic regression
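The pipeline described above can be sketched as follows: PCA reduces the feature space, a tiny genetic algorithm searches for a good subset of the PCA components, and k-nearest neighbors scores each candidate subset via cross-validation. The population size, generations, mutation rate and the placeholder data loading are illustrative assumptions, not the study's settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def ga_select(Z, y, pop_size=20, generations=15, mut_rate=0.05):
    """Tiny genetic algorithm choosing a subset of PCA components for a k-NN classifier."""
    n_feat = Z.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_feat))

    def fitness(mask):
        if mask.sum() == 0:
            return 0.0
        knn = KNeighborsClassifier(n_neighbors=5)
        return cross_val_score(knn, Z[:, mask.astype(bool)], y, cv=3).mean()

    for _ in range(generations):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = int(rng.integers(1, n_feat))
            child = np.concatenate([a[:cut], b[cut:]])             # one-point crossover
            flip = rng.random(n_feat) < mut_rate                   # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([parents] + children)
    scores = np.array([fitness(m) for m in pop])
    return pop[np.argmax(scores)].astype(bool)

# X: pathology-scan feature matrix, y: metastasis labels (placeholders for the real dataset).
# Z = PCA(n_components=30).fit_transform(X)
# mask = ga_select(Z, y)
# model = KNeighborsClassifier(n_neighbors=5).fit(Z[:, mask], y)
```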
Procedia PDF Downloads 82
1933 Optimization of Bio-Based Mixture of Canarium Luzonicum and Calcium Oxide as Coating Material for Reinforcing Steel Bars
Authors: Charizza D. Montarin, Daryl Jae S. Sigue, Gilford Estores
Abstract:
The Philippines is moderately vulnerable to corrosion, and surface coatings should be applied to prevent this problem. The main objective of this research was to develop and optimize a bio-based mixture of pili resin and lime as a coating material. Three (3) factors are considered in choosing the best coating material: chemical adhesion, friction, and bearing/shear at the steel bar-concrete interface. Both proportions of the bio-based coating material (50:50 and 65:35) showed no red rust formation, complying with ASTM B117, but failed under ASTM D 3359. Splitting failures of the concrete were observed in the unconfined reinforced concrete samples. All of the steel bars (uncoated and coated) exceeded the minimum bond strength of NSCP 2015 by about 203% to 285%. The experimental results were within about 1% to 3% of the ANSYS simulations with and without the salt spray test. With both the bio-based and epoxy coatings, the splitting strengths declined; however, there was no significant difference between the results. Thus, the bio-based coating material can be used as an alternative to epoxy coating materials, and it is recommended for low-rise buildings only.
Keywords: Canarium luzonicum, calcium oxide, corrosion, finite element simulations
Procedia PDF Downloads 323
1932 Criticality Assessment of Power Transformer by Using Entropy Weight Method
Authors: Rattanakorn Phadungthin, Juthathip Haema
Abstract:
This research presents an assessment of the criticality of substation power transformers using the Entropy Weight method, to enable more effective maintenance planning. Typically, transformers fail due to heat, electricity, chemical reactions, mechanical stress, and extreme climatic conditions, and effective monitoring of the insulating oil is critical to prevent transformer failure. However, finding appropriate weights for the dissolved gases is a major difficulty, due to the lack of a defined baseline and the requirement for subjective expert opinion. To decrease expert prejudice and subjectivity, the Entropy Weight method is used to optimise the weightings of eleven key dissolved gases. The criticality assessment algorithm operates in five steps: create a decision matrix, normalise the decision matrix, compute the entropy, calculate the weights, and calculate the criticality score. This study not only optimises the gas weighting but also greatly minimises the need for expert judgment in transformer maintenance. It is expected to improve the efficiency and reliability of power transformers so that failures and the related economic costs are minimized. Furthermore, appropriate maintenance schemes and rankings follow directly once the criticality assessment is completed.
Keywords: criticality assessment, dissolved gas, maintenance scheme, power transformer
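The five steps listed above map directly onto a short NumPy routine; the sketch below computes entropy-based weights for the gas columns and a criticality score per transformer. The example matrix is made up, and real use would pre-scale the gas concentrations to comparable ranges.

```python
import numpy as np

def entropy_weight_scores(decision_matrix):
    """Entropy-weight criticality scores.
    decision_matrix: (n_transformers, n_gases), non-negative readings (step 1)."""
    X = np.asarray(decision_matrix, dtype=float)
    n = X.shape[0]
    P = X / X.sum(axis=0, keepdims=True)              # step 2: normalise each gas column
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)   # treat 0*log(0) as 0
    E = -plogp.sum(axis=0) / np.log(n)                # step 3: entropy per gas
    d = 1.0 - E                                       # degree of divergence
    w = d / d.sum()                                   # step 4: entropy weights
    scores = P @ w                                    # step 5: criticality per transformer
    return scores, w

# Made-up readings for 4 transformers and 3 of the 11 gases (e.g. H2, CH4, C2H2).
readings = [[12.0, 3.0, 0.1], [45.0, 9.0, 0.0], [20.0, 4.0, 2.5], [15.0, 2.5, 0.2]]
scores, weights = entropy_weight_scores(readings)
print("gas weights:", weights.round(3))
print("criticality ranking (most critical first):", np.argsort(scores)[::-1])
```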
Procedia PDF Downloads 8
1931 Increasing the Capacity of Plant Bottlenecks by Improving the Ratio of Mean Time between Failures to Mean Time to Repair
Authors: Jalal Soleimannejad, Mohammad Asadizeidabadi, Mahmoud Koorki, Mojtaba Azarpira
Abstract:
A significant percentage of production costs is maintenance costs, and analysis of maintenance costs can help achieve greater productivity and competitiveness. With this in mind, the maintenance of machines and installations is considered an essential part of organizational functions, and applying effective strategies creates significant added value in manufacturing activities. Organizations are trying to achieve world-class performance levels, with emphasis on creating competitive advantage, through different methods such as RCM (Reliability-Centered Maintenance) and TPM (Total Productive Maintenance). In this study, increasing the capacity of the Concentration Plant of Golgohar Iron Ore Mining & Industrial Company (GEG) was examined using reliability and maintainability analyses. The results showed that, instead of increasing the number of machines (in order to solve the bottleneck problems), improving reliability and maintainability would solve the bottleneck problems in the best way. The data set of the Concentration Plant of GEG was used and analyzed as a case study.
Keywords: bottleneck, Golgohar Iron Ore Mining & Industrial Company, maintainability, maintenance costs, reliability
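As a small illustration of why the MTBF/MTTR ratio matters for bottleneck capacity, the sketch below computes inherent availability A = MTBF / (MTBF + MTTR) before and after a hypothetical improvement; the hour figures are invented, not GEG plant data.

```python
def inherent_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Inherent availability A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Invented figures for a bottleneck machine before and after a reliability/maintainability program.
before = inherent_availability(mtbf_hours=120.0, mttr_hours=8.0)   # ~0.938
after = inherent_availability(mtbf_hours=200.0, mttr_hours=4.0)    # ~0.980
print(f"availability: {before:.3f} -> {after:.3f}, "
      f"effective capacity gain ~ {(after / before - 1) * 100:.1f}%")
```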
Procedia PDF Downloads 363
1930 Fight against Money Laundering with Optical Character Recognition
Authors: Saikiran Subbagari, Avinash Malladhi
Abstract:
Anti-Money Laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess and mitigate the risks associated with money laundering and report any suspicious transactions to governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. Amid the rise in financial crimes, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and delve into the various OCR techniques employed in AML processes. These techniques encompass template-based, feature-based and neural-network-based approaches, natural language processing (NLP), hidden Markov models (HMMs), conditional random fields (CRFs), binarization, pattern matching and the stroke width transform (SWT). We evaluate each technique, discussing its strengths and constraints. We also emphasize how OCR can improve the accuracy of customer identity verification by comparing the extracted text with the Office of Foreign Assets Control (OFAC) watchlist, and how OCR helps to overcome language barriers in AML compliance. Finally, we address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on data from previous research studies, which illustrate the effectiveness of OCR-based AML.
Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition
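A minimal sketch of the identity-screening idea described above, assuming the pytesseract wrapper around the Tesseract OCR engine: text extracted from a document image is fuzzily compared against watchlist names with the standard-library difflib. The file name, watchlist entries and similarity threshold are placeholders; production screening against the OFAC SDN list involves far richer matching rules.

```python
import difflib
import pytesseract                      # assumes the Tesseract OCR engine is installed
from PIL import Image

def extract_text(image_path: str) -> str:
    return pytesseract.image_to_string(Image.open(image_path))

def screen_against_watchlist(text: str, watchlist, threshold=0.85):
    """Flag watchlist names whose fuzzy similarity to any token window is high."""
    hits = []
    tokens = text.upper().split()
    for name in watchlist:
        parts = name.upper().split()
        for i in range(len(tokens) - len(parts) + 1):
            window = " ".join(tokens[i:i + len(parts)])
            score = difflib.SequenceMatcher(None, window, name.upper()).ratio()
            if score >= threshold:
                hits.append((name, window, round(score, 2)))
                break
    return hits

# The watchlist would come from the published OFAC SDN list (placeholder entry below).
print(screen_against_watchlist(extract_text("kyc_document.png"), ["JOHN DOE"]))
```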
Procedia PDF Downloads 144
1929 Analysis of Moment Rotation Curve for Steel Beam Column Joint
Authors: A. J. Shah, G. R. Vesmawala
Abstract:
Connections play a fundamental role in the global behaviour of steel structures. Many experimental tests and analyses have been carried out to evaluate the real influence of the physical and geometrical parameters that control their behaviour, but a definitive answer to the problem is still lacking. Here, various configurations of bolts were tried and the resulting moment-rotation (M-θ) curves were plotted. The connection configuration is such that two bolts are located above each flange and beside each web. The model considers the combined effects of prying action, the formation of yield lines, and failures due to punching shear and beam section failure. For many types of connections, the stiffness at the service load level falls somewhere between the fully restrained and simple limits, and designers need to account for this behaviour. The M-θ curves are generally regarded as the best characterization of connection behaviour. They are usually derived from experiments on cantilever-type specimens: the moments are calculated directly from the statics of the specimen, while the rotations are measured over a distance typically equal to the point of loading. This paper thus establishes the relationship between the M-θ behaviour of the different types of connections tested and presents the relative strength of various possible arrangements of bolts.
Keywords: bolt, moment, rotation, stiffness, connections
Procedia PDF Downloads 392
1928 A Partially Accelerated Life Test Planning with Competing Risks and Linear Degradation Path under Tampered Failure Rate Model
Authors: Fariba Azizi, Firoozeh Haghighi, Viliam Makis
Abstract:
In this paper, we propose a method to model the relationship between failure time and degradation for a simple step-stress test in which the underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value; no assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten the failure time of the products, and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that, for some of the products that fail during the test, the cause of failure is only known to belong to a certain subset of all possible failures; this case is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on the estimation of the parameters is studied through a Monte Carlo simulation. Finally, a real example is analyzed to illustrate the application of the proposed methods.
Keywords: cause of failure, linear degradation path, reliability function, expectation-maximization algorithm, intensity, masked data
Procedia PDF Downloads 334