Search results for: machine readable format
1969 Creating Trauma-Sensitive Yoga Programs for University Students With Stress and Anxiety: Lessons From a Program in the United States
Authors: Jessica Gladden
Abstract:
Anxiety remains one of the most common mental health disorders in the United States. Many university students report having a high level of anxiety, with additional life stressors that might include being away from home for the first time, being around unfamiliar people, having new expectations placed on them, and often have financial struggles. Universities have the ability and opportunity to form programs that can involve students with activities that reduce stress and teach coping skills. This research includes one example of using a somatic based group format of yoga to teach these skills and assist students in applying these strategies to their daily lives. This study compared a group of 17 students participating in weekly yoga classes to 34 students who did not attend the program. The students who attended the program reported a larger reduction of anxiety on both the BAI and GAD-7 than the control group, and verbally reported additional benefits in relaxation and coping skills. This presentation will review the results of the program as well as detailing the steps taken in creating a yoga program for university students with stress and anxiety. This will include a discussion on the components of trauma-sensitive yoga and the concerns and strategies to consider when developing a program for students.Keywords: yoga, trauma-sensitive yoga, anxiety, students
Procedia PDF Downloads 116
1968 Promoting Teaching and Learning Structures Based on Innovation and Entrepreneurship in Valahia University of Targoviste
Authors: Gabriela Teodorescu, Ioana Daniela Dulama
Abstract:
In an ever-changing society, the education system needs to constantly evolve to meet market demands. During its 30 years of existence, Valahia University of Targoviste (VUT) tried to offer its students a series of teaching-learning schemes that would prepare them for a remarkable career. In VUT, the achievement of performance through innovation can be analyzed by reference to several key indicators (i.e., university climate, university resources, and innovative methods applied to classes), but it is possible to differentiate between activities in the classic format: participate to courses; interactive seminars and tutorials; laboratories, workshops, project-based learning; entrepreneurial activities, through simulated enterprises; mentoring activities. Thus, VUT has implemented over time a series of schemes and projects based on innovation and entrepreneurship, and in this paper, some of them will be briefly presented. All these schemes were implemented by facilitating an effective dialog with students and the opportunity to listen to their views at all levels of the University and in all fields of study, as well as by developing a partnership with students to set out priority areas. VUT demonstrates innovation and entrepreneurial capacity through its new activities for higher education, which will attract more partnerships and projects dedicated to students.Keywords: Romania, project-based learning, entrepreneurial activities, simulated enterprises
Procedia PDF Downloads 163
1967 Real-Time Generative Architecture for Mesh and Texture
Abstract:
In the evolving landscape of physics-based machine learning (PBML), particularly within fluid dynamics and its applications in electromechanical engineering, robot vision, and robot learning, achieving precision and alignment with researchers' specific needs presents a formidable challenge. In response, this work proposes a methodology that integrates neural transformation with a modified smoothed particle hydrodynamics model for generating transformed 3D fluid simulations. This approach is useful for nanoscale science, where the unique and complex behaviors of viscoelastic medium demand accurate neurally-transformed simulations for materials understanding and manipulation. In electromechanical engineering, the method enhances the design and functionality of fluid-operated systems, particularly microfluidic devices, contributing to advancements in nanomaterial design, drug delivery systems, and more. The proposed approach also aligns with the principles of PBML, offering advantages such as multi-fluid stylization and consistent particle attribute transfer. This capability is valuable in various fields where the interaction of multiple fluid components is significant. Moreover, the application of neurally-transformed hydrodynamical models extends to manufacturing processes, such as the production of microelectromechanical systems, enhancing efficiency and cost-effectiveness. The system's ability to perform neural transfer on 3D fluid scenes using a deep learning algorithm alongside physical models further adds a layer of flexibility, allowing researchers to tailor simulations to specific needs across scientific and engineering disciplines.Keywords: physics-based machine learning, robot vision, robot learning, hydrodynamics
Procedia PDF Downloads 66
1966 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication
Authors: Vedant Janapaty
Abstract:
Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as “kidneys of our planet”- they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and loss of native species are ailing our wetlands. There is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate metrics with in situ observations collected at five estuaries. Images for satellite data were processed to calculate 7 bands (SIs) using Python. Average SI values were calculated per month for 23 years. Publicly available data from 6 sites at ELK was used to obtain 10 parameters (OPs). Average OP values were calculated per month for 23 years. Linear correlations between the 7 SIs and 10 OPs were made and found to be inadequate (correlation = 1 to 64%). Fourier transform analysis on 7 SIs was performed. Dominant frequencies and amplitudes were extracted for 7 SIs, and a machine learning(ML) model was trained, validated, and tested for 10 OPs. Better correlations were observed between SIs and OPs, with certain time delays (0, 3, 4, 6 month delay), and ML was again performed. The OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to get periodic analyses of overall wetland health with satellite indices. It proves that remote sensing can be used to develop correlations with critical parameters that measure eutrophication in situ data and can be used by practitioners to easily monitor wetland health.Keywords: estuary, remote sensing, machine learning, Fourier transform
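A minimal sketch of the analysis pipeline described above: dominant frequencies and amplitudes are extracted from a monthly satellite-index (SI) series with an FFT, and a regression model predicts a lagged in-situ parameter (OP). The window size, 3-month lag, and random placeholder data are illustrative assumptions, not the study's actual dataset or model settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_months = 23 * 12                       # 23 years of monthly averages
si = rng.normal(size=(n_months, 7))      # 7 satellite indices (placeholder data)
op = rng.normal(size=n_months)           # one in-situ parameter (placeholder data)

def fft_features(window):
    """Dominant frequencies and amplitudes of one index over a trailing window."""
    spec = np.fft.rfft(window - window.mean())
    freqs = np.fft.rfftfreq(window.size, d=1.0)   # cycles per month
    amps = np.abs(spec)
    top = np.argsort(amps)[-3:]                   # three strongest components
    return np.concatenate([freqs[top], amps[top]])

window, lag = 24, 3                               # 2-year window, 3-month delay
X, y = [], []
for t in range(window, n_months - lag):
    feats = [fft_features(si[t - window:t, k]) for k in range(si.shape[1])]
    X.append(np.concatenate(feats))
    y.append(op[t + lag])                         # predict the delayed observation
X, y = np.asarray(X), np.asarray(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out months:", r2_score(y_te, model.predict(X_te)))
```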
Procedia PDF Downloads 104
1965 An IoT-Enabled Crop Recommendation System Utilizing Message Queuing Telemetry Transport (MQTT) for Efficient Data Transmission to AI/ML Models
Authors: Prashansa Singh, Rohit Bajaj, Manjot Kaur
Abstract:
In the modern agricultural landscape, precision farming has emerged as a pivotal strategy for enhancing crop yield and optimizing resource utilization. This paper introduces an innovative Crop Recommendation System (CRS) that leverages Internet of Things (IoT) technology and the Message Queuing Telemetry Transport (MQTT) protocol to collect critical environmental and soil data via sensors deployed across agricultural fields. The system is designed to address the challenges of real-time data acquisition, efficient data transmission, and dynamic crop recommendation through the application of advanced Artificial Intelligence (AI) and Machine Learning (ML) models. The CRS architecture encompasses a network of sensors that continuously monitor environmental parameters such as temperature, humidity, soil moisture, and nutrient levels. This sensor data is then transmitted to a central MQTT server, ensuring reliable and low-latency communication even in the bandwidth-constrained scenarios typical of rural agricultural settings. Upon reaching the server, the data is processed and analyzed by AI/ML models trained to correlate specific environmental conditions with optimal crop choices and cultivation practices. These models consider historical crop performance data, current agricultural research, and real-time field conditions to generate tailored crop recommendations. This implementation achieves 99% accuracy. Keywords: IoT, MQTT protocol, machine learning, sensor, publish, subscriber, agriculture, humidity
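A hedged sketch of the publish/subscribe flow described above, using the paho-mqtt client (v1.x API): a sensor node publishes a JSON reading and the server subscribes and hands the data to a recommendation model. The broker address, topic names, and the model call are illustrative assumptions, not the authors' deployment.

```python
import json
import paho.mqtt.client as mqtt

BROKER, TOPIC = "broker.example.farm", "field/plot7/telemetry"

# --- sensor node: publish one reading --------------------------------------
def publish_reading():
    reading = {"temp_c": 27.4, "humidity": 61.0, "soil_moisture": 0.32, "n_ppm": 41}
    node = mqtt.Client()
    node.connect(BROKER, 1883, keepalive=60)
    node.publish(TOPIC, json.dumps(reading), qos=1)   # QoS 1: at-least-once delivery
    node.disconnect()

# --- server side: subscribe and hand data to the AI/ML model ----------------
def recommend_crop(features):                         # placeholder for the trained model
    return "maize" if features["soil_moisture"] > 0.3 else "millet"

def on_message(client, userdata, msg):
    features = json.loads(msg.payload)
    print("recommendation:", recommend_crop(features))

server = mqtt.Client()
server.on_message = on_message
server.connect(BROKER, 1883, keepalive=60)
server.subscribe(TOPIC, qos=1)
server.loop_forever()     # blocks, processing readings as sensor nodes publish them
```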
Procedia PDF Downloads 69
1964 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities
Authors: Anudeep Appe, Bhanu Poluparthi, Lakshmi Kasivajjula, Udai Mv, Sobha Bagadi, Punya Modi, Aditya Singh, Hemanth Gunupudi, Spenser Troiano, Jeff Paul, Justin Stovall, Justin Yamamoto
Abstract:
The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework that helps identify the factors impacting the market share of a healthcare provider facility or hospital (from here on termed a facility) is of key importance. This pilot study aims to develop a data-driven machine learning regression framework that aids strategists in formulating key decisions to improve a facility’s market share, which in turn improves the quality of healthcare services. The US (United States) healthcare business is chosen for the study, with data spanning 60 key facilities in Washington State and about three years of history. In the current analysis, market share is defined as the ratio of the facility’s encounters to the total encounters among the group of potential competitor facilities. The study proposes a two-pronged approach: competitor identification, followed by regression to evaluate and predict market share. The model-agnostic technique SHAP is leveraged to quantify the relative importance of features impacting market share. Typical techniques in the literature for quantifying the degree of competitiveness among facilities use an empirical method to calculate a competitive factor that interprets the severity of competition. The proposed method instead identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust since it is data-driven, which minimizes the bias of empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder to factor in various business rules (for example, quantifying patient exchanges, provider references, and sister facilities). Multiple groups of competitors among facilities are identified. Leveraging the identified competitors, a Random Forest regression model is developed and fine-tuned to predict market share. To identify key drivers of market share at an overall level, permutation feature importance of the attributes is calculated. For relative quantification of features at the facility level, SHAP (SHapley Additive exPlanations), a model-agnostic explainer, is incorporated; this helps identify and rank the attributes at each facility that impact its market share. This approach amalgamates two popular and efficient modeling practices, viz., machine learning with graphs and tree-based regression techniques, to reduce bias and drive strategic business decisions. Keywords: competition, DAGs, facility, healthcare, machine learning, market share, random forest, SHAP
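A hedged sketch of the regression stage: a Random Forest predicts facility market share, permutation importance ranks the overall drivers, and SHAP explains individual facilities. The feature names and data are illustrative placeholders, not the study's variables.

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
X = pd.DataFrame({
    "bed_count": rng.integers(50, 600, 500),
    "competitor_density": rng.random(500),
    "avg_wait_days": rng.integers(1, 30, 500),
    "physician_count": rng.integers(10, 400, 500),
})
y = rng.random(500)            # market share = facility encounters / group encounters

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# Overall drivers: permutation feature importance on held-out data
pi = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
print(pd.Series(pi.importances_mean, index=X.columns).sort_values(ascending=False))

# Facility-level drivers: SHAP values for each prediction
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
print("SHAP contributions for one facility:", dict(zip(X.columns, shap_values[0])))
```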
Procedia PDF Downloads 91
1963 Design, Synthesis and Evaluation of 4-(Phenylsulfonamido)Benzamide Derivatives as Selective Butyrylcholinesterase Inhibitors
Authors: Sushil Kumar Singh, Ashok Kumar, Ankit Ganeshpurkar, Ravi Singh, Devendra Kumar
Abstract:
In spectrum of neurodegenerative diseases, Alzheimer’s disease (AD) is characterized by the presence of amyloid β plaques and neurofibrillary tangles in the brain. It results in cognitive and memory impairment due to loss of cholinergic neurons, which is considered to be one of the contributing factors. Donepezil, an acetylcholinesterase (AChE) inhibitor which also inhibits butyrylcholinesterase (BuChE) and improves the memory and brain’s cognitive functions, is the most successful and prescribed drug to treat the symptoms of AD. The present work is based on designing of the selective BuChE inhibitors using computational techniques. In this work, machine learning models were trained using classification algorithms followed by screening of diverse chemical library of compounds. The various molecular modelling and simulation techniques were used to obtain the virtual hits. The amide derivatives of 4-(phenylsulfonamido) benzoic acid were synthesized and characterized using 1H & 13C NMR, FTIR and mass spectrometry. The enzyme inhibition assays were performed on equine plasma BuChE and electric eel’s AChE by method developed by Ellman et al. Compounds 31, 34, 37, 42, 49, 52 and 54 were found to be active against equine BuChE. N-(2-chlorophenyl)-4-(phenylsulfonamido)benzamide and N-(2-bromophenyl)-4-(phenylsulfonamido)benzamide (compounds 34 and 37) displayed IC50 of 61.32 ± 7.21 and 42.64 ± 2.17 nM against equine plasma BuChE. Ortho-substituted derivatives were more active against BuChE. Further, the ortho-halogen and ortho-alkyl substituted derivatives were found to be most active among all with minimal AChE inhibition. The compounds were selective toward BuChE.Keywords: Alzheimer disease, butyrylcholinesterase, machine learning, sulfonamides
Procedia PDF Downloads 140
1962 The Attitude of Second Year Pharmacy Students towards Lectures, Exams and E-Learning
Authors: Ahmed T. Alahmar
Abstract:
There is an increasing trend toward student-centred interactive e-learning methods and students’ feedback is a valuable tool for improving learning methods. The aim of this study was to explore the attitude of second year pharmacy students at the University of Babylon, Iraq, towards lectures, exams and e-learning. Materials and methods: Ninety pharmacy students were surveyed by paper questionnaire about their preference for lecture format, use of e-files, theoretical lectures versus practical experiments, lecture and lab time. Students were also asked about their predilection for Moodle-based online exams, different types of exam questions, exam time and other extra academic activities. Results: Students prefer to read lectures on paper (73.3%), use of PowerPoint file (76.7%), short lectures of less than 10 pages (94.5%), practical experiments (66.7%), lectures and lab time of less than two hours (89.9% and 96.6 respectively) and intra-lecture discussions (68.9%). Students also like to have paper-based exam (73.3%), short essay (40%) or MCQ (34.4%) questions and also prefer to do extra activities like reports (22.2%), seminars (18.6%) and posters (10.8%). Conclusion: Second year pharmacy students have different attitudes toward traditional and electronic leaning and assessment methods. Using multimedia, e-learning and Moodle are increasingly preferred methods among some students.Keywords: pharmacy, students, lecture, exam, e-learning, Moodle
Procedia PDF Downloads 164
1961 Reinforced Concrete Foundation for Turbine Generators
Authors: Siddhartha Bhattacharya
Abstract:
Steam Turbine-Generators (STG) and Combustion Turbine-Generator (CTG) are used in almost all modern petrochemical, LNG plants and power plant facilities. The reinforced concrete table top foundations are required to support these high speed rotating heavy machineries and is one of the most critical and challenging structures on any industrial project. The paper illustrates through a practical example, the step by step procedure adopted in designing a table top foundation supported on piles for a steam turbine generator with operating speed of 60 Hz. Finite element model of a table top foundation is generated in ANSYS. Piles are modeled as springs-damper elements (COMBIN14). Basic loads are adopted in analysis and design of the foundation based on the vendor requirements, industry standards, and relevant ASCE & ACI codal provisions. Static serviceability checks are performed with the help of Misalignment Tolerance Matrix (MTM) method in which the percentage of misalignment at a given bearing due to displacement at another bearing is calculated and kept within the stipulated criteria by the vendor so that the machine rotor can sustain the stresses developed due to this misalignment. Dynamic serviceability checks are performed through modal and forced vibration analysis where the foundation is checked for resonance and allowable amplitudes, as stipulated by the machine manufacturer. Reinforced concrete design of the foundation is performed by calculating the axial force, bending moment and shear at each of the critical sections. These values are calculated through area integral of the element stresses at these critical locations. Design is done as per ACI 318-05.Keywords: steam turbine generator foundation, finite element, static analysis, dynamic analysis
Procedia PDF Downloads 297
1960 Lab-on-Chip Multiplexed qPCR Analysis Utilizing Melting Curve Analysis Detects Up to 144 Alleles with Sub-hour Turn-around Time
Authors: Jeremy Woods, Fanqing Chen
Abstract:
Rapid genome testing can provide results in at best hours to days, though there are certain clinical decisions that could be guided by genetic test results that need results in hours to minutes. As such, methods of genetic Point of Care Testing (POCT) are required if genetic data is to guide management in illnesses in a wide variety of critical and emergent medical situations such as neonatal sepsis, chemotherapy administration in endometrial cancer, and glucose-6-phosphate dehydrogenase deficiency (G6PD)-associated neonatal hyperbilirubinemia. As such, we developed a POCT “lab-on-chip” technology capable of identifying up to 144 alleles in under an hour. This test required no specialized training to utilize and is suitable to deployment in clinics and hospitals for use by non-laboratory personnel such as nurses. We developed a multiplexed qPCR-based sample-to-answer system with melting curve analysis capable of detecting up to 144 alleles utilizing the Kelliop RapidSeq126 PCR platform combined with a single-use microfluidic cartridge. The RapidSeq126 is the size of a standard desktop printer and the microfluidic cartridges are smaller than a deck of playing cards. Thus the system was deployable in the outpatient setting for clinical trials of MT-RNR1 genotyping. The sample (buccal swab from volunteers or plasmids in media) used for DNA extraction was placed in the cartridge sample inlet prior to inserting the cartridge into the RapidSeq126. The microfluidic cartridge was composed of heat resistant polymer with a sample inlet, 100um conduits, liquid and solid reagents, valves, extraction chamber, lyophilization chamber, 12 PCR reaction chambers, and a waste chamber. No human effort was required for processing the sample and performing the assay other than placing the sample in the cartridge and placing the cartridge in the RapidSeq126. The RapidSeq126 has demonstrated ex vivo detection in plasmids and in vivo detection from human volunteer samples of up to 144 alleles per microfluidic cartridge used and did not require specialized laboratory training to operate. Efficacy was proven for several applications, such as multiple microsatellite instability (MSI) sites (SULF/RYR3/MRE11/ACVR2A/DIDO1/SEC31A/BTBD7), endometrial cancer POLE exonuclease domain (EMD) mutation status, and G6PD variants such as those commonly associated with hemolysis (c.202G>A, c.376A>G, c.680G>A>T, c.968T>C, 404A>C, c.871G>A). The RapidSeq126 system was also able to identify the three MT-RNR1 variants associated with aminoglycoside-induced sensorineural hearing loss (m.1555A>G, m.1095T>C, m.1494C>T). Results were provided in under an hour in a sample-to-answer fashion requiring no processing other than inserting the cartridge with the sample into the RapidSeq126. Results were provided in a digital, HL7-compliant format suitable for interfacing with Electronic Healthcare Record (EHR). The RapidSeq126 system provides a solution for emergency and critical medical situations requiring results in a matter of minutes to hours. The HL7-compliant data format of results enables the RapidSeq126 to interface directly with EHRs to generate best practice advisories and further reduce errors and time to diagnosis by providing digital results.Keywords: genetic testing, pharmacogenomics, point of care testing, rapid genetic testing
Procedia PDF Downloads 9
1959 Large-Scale Simulations of Turbulence Using Discontinuous Spectral Element Method
Authors: A. Peyvan, D. Li, J. Komperda, F. Mashayek
Abstract:
Turbulence can be observed in a variety fluid motions in nature and industrial applications. Recent investment in high-speed aircraft and propulsion systems has revitalized fundamental research on turbulent flows. In these systems, capturing chaotic fluid structures with different length and time scales is accomplished through the Direct Numerical Simulation (DNS) approach since it accurately simulates flows down to smallest dissipative scales, i.e., Kolmogorov’s scales. The discontinuous spectral element method (DSEM) is a high-order technique that uses spectral functions for approximating the solution. The DSEM code has been developed by our research group over the course of more than two decades. Recently, the code has been improved to run large cases in the order of billions of solution points. Running big simulations requires a considerable amount of RAM. Therefore, the DSEM code must be highly parallelized and able to start on multiple computational nodes on an HPC cluster with distributed memory. However, some pre-processing procedures, such as determining global element information, creating a global face list, and assigning global partitioning and element connection information of the domain for communication, must be done sequentially with a single processing core. A separate code has been written to perform the pre-processing procedures on a local machine. It stores the minimum amount of information that is required for the DSEM code to start in parallel, extracted from the mesh file, into text files (pre-files). It packs integer type information with a Stream Binary format in pre-files that are portable between machines. The files are generated to ensure fast read performance on different file-systems, such as Lustre and General Parallel File System (GPFS). A new subroutine has been added to the DSEM code to read the startup files using parallel MPI I/O, for Lustre, in a way that each MPI rank acquires its information from the file in parallel. In case of GPFS, in each computational node, a single MPI rank reads data from the file, which is specifically generated for the computational node, and send them to other ranks on the node using point to point non-blocking MPI communication. This way, communication takes place locally on each node and signals do not cross the switches of the cluster. The read subroutine has been tested on Argonne National Laboratory’s Mira (GPFS), National Center for Supercomputing Application’s Blue Waters (Lustre), San Diego Supercomputer Center’s Comet (Lustre), and UIC’s Extreme (Lustre). The tests showed that one file per node is suited for GPFS and parallel MPI I/O is the best choice for Lustre file system. The DSEM code relies on heavily optimized linear algebra operation such as matrix-matrix and matrix-vector products for calculation of the solution in every time-step. For this, the code can either make use of its matrix math library, BLAS, Intel MKL, or ATLAS. This fact and the discontinuous nature of the method makes the DSEM code run efficiently in parallel. The results of weak scaling tests performed on Blue Waters showed a scalable and efficient performance of the code in parallel computing.Keywords: computational fluid dynamics, direct numerical simulation, spectral element, turbulent flow
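A hedged mpi4py sketch of the Lustre read strategy described above: every MPI rank opens the packed binary pre-file collectively and reads only its own slice with parallel MPI I/O. The file layout (one int32 record count per rank followed by the records) is an illustrative assumption, not the DSEM pre-file format.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

fh = MPI.File.Open(comm, "startup.pre", MPI.MODE_RDONLY)

# Header: one int32 per rank giving that rank's record count.
counts = np.empty(size, dtype=np.int32)
fh.Read_at_all(0, counts)

# Each rank skips the header plus all earlier ranks' records, then reads its
# own block collectively -- no rank ever touches another rank's data.
header_bytes = counts.nbytes
offset = header_bytes + int(counts[:rank].sum()) * 4
my_block = np.empty(int(counts[rank]), dtype=np.int32)
fh.Read_at_all(offset, my_block)
fh.Close()

print(f"rank {rank} read {my_block.size} values")
# Run with e.g.:  mpiexec -n 64 python read_prefiles.py
```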
Procedia PDF Downloads 135
1958 Unsupervised Echocardiogram View Detection via Autoencoder-Based Representation Learning
Authors: Andrea Treviño Gavito, Diego Klabjan, Sanjiv J. Shah
Abstract:
Echocardiograms serve as pivotal resources for clinicians in diagnosing cardiac conditions, offering non-invasive insights into a heart’s structure and function. When echocardiographic studies are conducted, no standardized labeling of the acquired views is performed. Employing machine learning algorithms for automated echocardiogram view detection has emerged as a promising solution to enhance efficiency in echocardiogram use for diagnosis. However, existing approaches predominantly rely on supervised learning, necessitating labor-intensive expert labeling. In this paper, we introduce a fully unsupervised echocardiographic view detection framework that leverages convolutional autoencoders to obtain lower dimensional representations and the K-means algorithm for clustering them into view-related groups. Our approach focuses on discriminative patches from echocardiographic frames. Additionally, we propose a trainable inverse average layer to optimize decoding of average operations. By integrating both public and proprietary datasets, we obtain a marked improvement in model performance when compared to utilizing a proprietary dataset alone. Our experiments show boosts of 15.5% in accuracy and 9.0% in the F-1 score for frame-based clustering, and 25.9% in accuracy and 19.8% in the F-1 score for view-based clustering. Our research highlights the potential of unsupervised learning methodologies and the utilization of open-sourced data in addressing the complexities of echocardiogram interpretation, paving the way for more accurate and efficient cardiac diagnoses.Keywords: artificial intelligence, echocardiographic view detection, echocardiography, machine learning, self-supervised representation learning, unsupervised learning
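A hedged PyTorch sketch of the pipeline described above: a convolutional autoencoder compresses echocardiogram patches into low-dimensional codes, and K-means clusters those codes into view-related groups. Patch size, layer widths, and the number of clusters are illustrative assumptions.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class ConvAutoencoder(nn.Module):
    def __init__(self, latent_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(), nn.Linear(32 * 16 * 16, latent_dim))
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 16 * 16), nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 2, stride=2), nn.Sigmoid())

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

patches = torch.rand(256, 1, 64, 64)          # placeholder echo patches in [0, 1]
model = ConvAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                            # brief reconstruction training
    recon, _ = model(patches)
    loss = nn.functional.mse_loss(recon, patches)
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():
    _, latents = model(patches)
views = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(latents.numpy())
print("cluster sizes:", torch.bincount(torch.tensor(views)))
```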
Procedia PDF Downloads 38
1957 The Development of XML Resume System in Thailand
Authors: Jarumon Nookhong, Thanakorn Uiphanit
Abstract:
This study is a research and development project which aims to develop XML Resume System to be the standard system in Thailand as well as to measure the efficiency of the XML Resume System in Thailand. This research separates into 2 stages: 1) to develop XML Document System to be the standard in Thailand, and 2) to experiment the system performance. The sample in this research is committed by 50 specialists in the field of human resources by selecting specifically. The tool that uses in this research is XML Resume System in Thailand and the performance evaluation format of system while the analysis of the data is calculated by using average and standard deviation. The result of the research found that the development of the XML Resume System that aims to be the standard system in Thailand had the result 2.67 of the average which is in a good level. The evaluation in testing the performance of the system had been done by the specialists of human resources who use the XML Resume system. When analyzing each part, it found out that the abilities according to the user’s requirement from specialists in the field of human resources, the convenience and easiness in usages, and the functional competency are respectively in a good level. The average of the ability according to the user’s need from specialists of human resources is 2.92. The average of the convenience and easiness in usages is 2.56. The average of functional competency is 2.53. These can be used as the standard effectively.Keywords: resume, XML, XML schema, computer science
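A minimal sketch of the kind of machine-readable resume document such a system exchanges, built with Python's standard xml.etree API. The element names and fields are illustrative assumptions, not the Thai standard schema proposed in the paper.

```python
import xml.etree.ElementTree as ET

resume = ET.Element("resume", attrib={"lang": "th", "version": "1.0"})
person = ET.SubElement(resume, "person")
ET.SubElement(person, "name").text = "Somchai Example"
ET.SubElement(person, "email").text = "somchai@example.ac.th"

job = ET.SubElement(ET.SubElement(resume, "experience"), "position")
ET.SubElement(job, "title").text = "HR Officer"
ET.SubElement(job, "years").text = "5"

xml_text = ET.tostring(resume, encoding="unicode")
print(xml_text)

# An HR system can then parse any conforming resume the same way:
parsed = ET.fromstring(xml_text)
print(parsed.findtext("person/name"), "-", parsed.findtext("experience/position/title"))
```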
Procedia PDF Downloads 410
1956 1-D Convolutional Neural Network Approach for Wheel Flat Detection for Freight Wagons
Authors: Dachuan Shi, M. Hecht, Y. Ye
Abstract:
With the trend of digitalization in railway freight transport, a large number of freight wagons in Germany have been equipped with telematics devices, commonly placed on the wagon body. A telematics device contains a GPS module for tracking and a 3-axis accelerometer for shock detection. Besides these basic functions, it is desired to use the integrated accelerometer for condition monitoring without any additional sensors. Wheel flats as a common type of failure on wheel tread cause large impacts on wagons and infrastructure as well as impulsive noise. A large wheel flat may even cause safety issues such as derailments. In this sense, this paper proposes a machine learning approach for wheel flat detection by using car body accelerations. Due to suspension systems, impulsive signals caused by wheel flats are damped significantly and thus could be buried in signal noise and disturbances. Therefore, it is very challenging to detect wheel flats using car body accelerations. The proposed algorithm considers the envelope spectrum of car body accelerations to eliminate the effect of noise and disturbances. Subsequently, a 1-D convolutional neural network (CNN), which is well known as a deep learning method, is constructed to automatically extract features in the envelope-frequency domain and conduct classification. The constructed CNN is trained and tested on field test data, which are measured on the underframe of a tank wagon with a wheel flat of 20 mm length in the operational condition. The test results demonstrate the good performance of the proposed algorithm for real-time fault detection.Keywords: fault detection, wheel flat, convolutional neural network, machine learning
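A hedged sketch of the two stages described above: the envelope spectrum of the car-body acceleration is computed via the Hilbert transform, and a small 1-D CNN classifies it as healthy or wheel flat. Signal length, sampling rate, and layer sizes are illustrative assumptions.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import hilbert

def envelope_spectrum(accel, fs=2000):
    """Magnitude spectrum of the signal envelope (demodulates impact periodicity)."""
    envelope = np.abs(hilbert(accel - accel.mean()))
    return np.abs(np.fft.rfft(envelope)) / len(envelope)

cnn = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=16, stride=2), nn.ReLU(),
    nn.Conv1d(8, 16, kernel_size=16, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(16, 2))                       # logits: [healthy, wheel flat]

accel = np.random.default_rng(1).normal(size=4096)     # placeholder measurement window
spec = torch.tensor(envelope_spectrum(accel), dtype=torch.float32).view(1, 1, -1)
print("class logits:", cnn(spec))
# Training would minimise cross-entropy over labelled field-test windows.
```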
Procedia PDF Downloads 131
1955 Milling Process of Rigid Flex Printed Circuit Board to Which Polyimide Covers the Whole Surface
Authors: Daniela Evtimovska, Ivana Srbinovska, Padraig O’Rourke
Abstract:
Kostal Macedonia has the challenge to mill a rigid-flex printed circuit board (PCB). The PCB elaborated in this paper is made of FR4 material covered with polyimide through the whole surface on the one side, including the tabs where PCBs need to be separated. After milling only 1.44 meters, the updraft routing tool isn’t effective and causes polyimide debris on all PCB cuts if it continues to mill with the same tool. Updraft routing tool is used for all another product in Kostal Macedonia, and it is changing after milling 60 meters. Changing the tool adds 80 seconds to the cycle time. One solution is using a laser-cut machine. Buying a laser-cut machine for cutting only one product doesn’t make financial sense. The focus is given to find an internal solution among the options under review to solve the issue with polyimide debris. In the paper, the design of the rigid-flex panel is described deeply. It is evaluated downdraft routing tool as a possible solution which could be used for the flex rigid panel as a specific product. It is done a comparison between updraft and down draft routing tools from a technical and financial aspect of view, taking into consideration the customer requirements for the rigid-flex PCB. The results show that using the downdraft routing tool is the best solution in this case. This tool is more expensive for 0.62 euros per piece than updraft. The downdraft routing tool needs to be changed after milling 43.44 meters in comparison with the updraft tool, which needs to be changed after milling only 1.44 meters. It is done analysis which actions should be taken in order further improvements and the possibility of maximum serving of downdraft routing tool.Keywords: Kostal Macedonia, rigid flex PCB, polyimide, debris, milling process, up/down draft routing tool
Procedia PDF Downloads 193
1954 An Efficient Machine Learning Model to Detect Metastatic Cancer in Pathology Scans Using Principal Component Analysis Algorithm, Genetic Algorithm, and Classification Algorithms
Authors: Bliss Singhal
Abstract:
Machine learning (ML) is a branch of Artificial Intelligence (AI) in which computers analyze data and find patterns in the data. The study focuses on the detection of metastatic cancer using ML. Metastatic cancer is the stage at which cancer has spread to other parts of the body and is the cause of approximately 90% of cancer-related deaths. Normally, pathologists spend hours each day manually classifying whether tumors are benign or malignant. This tedious task contributes to metastases being mislabeled over 60% of the time and emphasizes the importance of being aware of human error and other inefficiencies. ML is a good candidate for improving the correct identification of metastatic cancer, potentially saving thousands of lives, and it can also improve the speed and efficiency of the process, requiring fewer resources and less time. So far, the deep learning methodology of AI has been used in research to detect cancer. This study is a novel approach to determining the potential of combining preprocessing algorithms with classification algorithms in detecting metastatic cancer. The study used two preprocessing algorithms, principal component analysis (PCA) and the genetic algorithm, to reduce the dimensionality of the dataset, and then used three classification algorithms, logistic regression, decision tree classifier, and k-nearest neighbors, to detect metastatic cancer in the pathology scans. The highest accuracy of 71.14% was produced by the ML pipeline comprising PCA, the genetic algorithm, and the k-nearest neighbors algorithm, suggesting that preprocessing and classification algorithms have great potential for detecting metastatic cancer. Keywords: breast cancer, principal component analysis, genetic algorithm, k-nearest neighbors, decision tree classifier, logistic regression
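A minimal sketch of the preprocessing-plus-classification idea: PCA reduces the pathology-scan features before a k-nearest-neighbours classifier labels patches as metastatic or benign. The genetic-algorithm selection stage is omitted from this sketch for brevity, and the data and dimensions are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 2304))            # e.g. flattened 48x48 patch features
y = rng.integers(0, 2, 2000)                 # 1 = metastatic, 0 = benign (placeholder)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
pipeline = make_pipeline(PCA(n_components=50), KNeighborsClassifier(n_neighbors=7))
pipeline.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, pipeline.predict(X_te)))
```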
Procedia PDF Downloads 83
1953 Survival Struggle: To Be a Female Competitor in Survivor
Authors: Gülbuğ Erol, Gamze Beyge, Hakan Ekemen
Abstract:
In Turkey national TV channels broadcast a wide range of programs to audience attract viewers. Since the year 2000, especially the competition programs were directed towards entertainment and audience has gained. Even today, television channels have just begun to be broadcast on entertainment channels. Except from the news, the TV collects pleasure with its broadcasts aiming to meet the expectation of the Turkish people of TV 8 TV channels. Survivor, one of the TV 8 programs, draws attention with the ratings it receives and the broad target audience it addresses. Survivor, however, is one of the most exciting competitions on the Turkish television scene, which is rightly and ambitiously competitive in television contest programs. It is a format in which women and men struggle their power borders by winning the competition with their names thanks to their intelligence and endurance games. The contestants of the program, which has been running since March 22, 2005, are seen in a platform where they must present their struggle for their various awards. In Survivor, where competition is at stake, courage and strength are reduced by the reduction of sex. In this study, the critical discourse was made taking into consideration the challenges of female competitors competing to the final stage which is behind the male competitors. Secondly, the variables from the beginning to the present day of the adaptation of the judge to Turkey have been debated in a critical context.Keywords: television, meaning, discourse, contest program
Procedia PDF Downloads 224
1952 The Advancements of Transformer Models in Part-of-Speech Tagging System for Low-Resource Tigrinya Language
Authors: Shamm Kidane, Ibrahim Abdella, Fitsum Gaim, Simon Mulugeta, Sirak Asmerom, Natnael Ambasager, Yoel Ghebrihiwot
Abstract:
The call for natural language processing (NLP) systems for low-resource languages has become more apparent than ever in the past few years, with the arduous challenges still present in preparing such systems. This paper presents an improved dataset version of the Nagaoka Tigrinya Corpus for Parts-of-Speech (POS) classification system in the Tigrinya language. The size of the initial Nagaoka dataset was incremented, totaling the new tagged corpus to 118K tokens, which comprised the 12 basic POS annotations used previously. The additional content was also annotated manually in a stringent manner, followed similar rules to the former dataset and was formatted in CONLL format. The system made use of the novel approach in NLP tasks and use of the monolingually pre-trained TiELECTRA, TiBERT and TiRoBERTa transformer models. The highest achieved score is an impressive weighted F1-score of 94.2%, which surpassed the previous systems by a significant measure. The system will prove useful in the progress of NLP-related tasks for Tigrinya and similarly related low-resource languages with room for cross-referencing higher-resource languages.Keywords: Tigrinya POS corpus, TiBERT, TiRoBERTa, conditional random fields
Procedia PDF Downloads 104
1951 Fight against Money Laundering with Optical Character Recognition
Authors: Saikiran Subbagari, Avinash Malladhi
Abstract:
Anti-Money Laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess and mitigate the risks associated with money laundering and report any suspicious transactions to governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. With the rise of financial crimes, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and delve into the various OCR techniques employed in AML processes. These techniques encompass template-based, feature-based and neural-network-based approaches, natural language processing (NLP), hidden Markov models (HMMs), conditional random fields (CRFs), binarization, pattern matching and the stroke width transform (SWT). We evaluate each technique, discussing its strengths and constraints. We also emphasize how OCR can improve the accuracy of customer identity verification by comparing the extracted text with the Office of Foreign Assets Control (OFAC) watchlist, and we discuss how OCR helps to overcome language barriers in AML compliance. Finally, we address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on data from previous research studies, which illustrate the effectiveness of OCR-based AML. Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition
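A hedged sketch of the screening step described above: text is extracted from a scanned document with Tesseract OCR and compared against a sanctions watchlist using fuzzy string matching. The watchlist entries, file name, and similarity threshold are illustrative assumptions, not the OFAC data format.

```python
from difflib import SequenceMatcher
from PIL import Image
import pytesseract

WATCHLIST = ["Acme Shell Trading Ltd", "Ivan Petrov", "Global Remit FZE"]   # placeholder names

def screen_document(image_path, threshold=0.85):
    text = pytesseract.image_to_string(Image.open(image_path))
    hits = []
    for line in text.splitlines():                     # compare each extracted line
        for entry in WATCHLIST:
            score = SequenceMatcher(None, line.lower(), entry.lower()).ratio()
            if score >= threshold:
                hits.append((entry, line, round(score, 2)))
    return hits

print(screen_document("wire_transfer_form.png"))   # [] when no watchlist match is found
```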
Procedia PDF Downloads 146
1950 An Animation-Based Resource for Screening Emotional and Behavioural Distress in Children Aged 6 to 12
Authors: Zoe Lynch, Kirsty Zieschank
Abstract:
There are several factors that compromise the utility and wide-spread use of existing emotional and behavioural distress screening instruments. Some of these factors include lengthy administration times, high costs, feasibility issues, and a lack of self-report options for children under 12 years of age. This animation-based resource was developed to overcome as many of these factors as possible. Developed for educators and medical and mental health professionals, this resource offers children a self-guided mechanism for reporting any current emotional and behavioural distress. An avatar assistant, selected by the child, accompanies them through each stage of the screening process, offering further instruction if prompted. Children enter their age and gender before viewing comparative animations conveying common childhood emotional and behavioural difficulties. The child then selects the most relatable animations, along with the frequency with which they experience the depicted emotions. From a perspective of intellectual development, an engaging, animated format means that outcomes will not be constrained by children’s reading, writing, cognitive, or verbal expression abilities. Having been user-tested with children aged 6 to 12, this resource shows promising results as a self-guided screening instrument.Keywords: animation-based screening instrument, mental health, primary-aged children, self-guided
Procedia PDF Downloads 160
1949 Random Subspace Neural Classifier for Meteor Recognition in the Night Sky
Authors: Carlos Vera, Tetyana Baydyk, Ernst Kussul, Graciela Velasco, Miguel Aparicio
Abstract:
This article describes the Random Subspace Neural Classifier (RSC) for the recognition of meteors in the night sky. We used images of meteors entering the atmosphere at night between 8:00 p.m.-5: 00 a.m. The objective of this project is to classify meteor and star images (with stars as the image background). The monitoring of the sky and the classification of meteors are made for future applications by scientists. The image database was collected from different websites. We worked with RGB-type images with dimensions of 220x220 pixels stored in the BitMap Protocol (BMP) format. Subsequent window scanning and processing were carried out for each image. The scan window where the characteristics were extracted had the size of 20x20 pixels with a scanning step size of 10 pixels. Brightness, contrast and contour orientation histograms were used as inputs for the RSC. The RSC worked with two classes and classified into: 1) with meteors and 2) without meteors. Different tests were carried out by varying the number of training cycles and the number of images for training and recognition. The percentage error for the neural classifier was calculated. The results show a good RSC classifier response with 89% correct recognition. The results of these experiments are presented and discussed.Keywords: contour orientation histogram, meteors, night sky, RSC neural classifier, stars
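A hedged illustration of the random-subspace idea behind the RSC: each ensemble member trains on a random subset of the per-window features (brightness, contrast, and contour-orientation histograms). scikit-learn's BaggingClassifier stands in here for the authors' neural classifier, and the feature extraction and data are placeholders.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.random((600, 96))                 # 96 histogram features per scanned window
y = rng.integers(0, 2, 600)               # 1 = meteor, 0 = stars only (placeholder)

rsc_like = BaggingClassifier(
    estimator=LogisticRegression(max_iter=500),   # scikit-learn >= 1.2 parameter name
    n_estimators=50,
    max_features=0.3,                     # each member trains on a random 30% subspace
    bootstrap=False,
    bootstrap_features=True,
    random_state=0)

print("CV accuracy:", cross_val_score(rsc_like, X, y, cv=5).mean())
```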
Procedia PDF Downloads 140
1948 Adjunct Placement in Educated Nigerian English
Authors: Juliet Charles Udoudom
Abstract:
In nonnative language use environments, language users have been known to demonstrate marked variations both in the spoken and written productions of the target language. For instance, analyses of the written productions of Nigerian users of English have shown inappropriate sequencing of sentence elements resulting in distortions in meaning and/or other problems of syntax. This study analyses the structure of sentences in the written production of 450 educated Nigerian users of English to establish their sensitivity to adjunct placement and the extent to which it exerts on meaning interpretation. The respondents were selected by a stratified random sampling technique from six universities in south-south Nigeria using education as the main yardstick for stratification. The systemic functional grammar analytic format was used in analyzing the sentences selected from the corpus. Findings from the analyses indicate that of the 8,576 tokens of adjuncts in the entire corpus, 4,550 (53.05%) of circumstantial adjuncts were appropriately placed while 2,839 (33.11%) of modal adjuncts occurred at appropriate locations in the clauses analyzed. Conjunctive adjunct placement accounted for 1,187 occurrences, representing 13.84% of the entire corpus. Further findings revealed that prepositional phrases (PPs) were not well construed by respondents to be capable of realizing adjunct functions, and were inappropriately placed.Keywords: adjunct, adjunct placement, conjunctive adjunct, circumstantial adjunct, systemic grammar
Procedia PDF Downloads 24
1947 The Impact of Introspective Models on Software Engineering
Authors: Rajneekant Bachan, Dhanush Vijay
Abstract:
The visualization of operating systems has refined the Turing machine, and current trends suggest that the emulation of 32 bit architectures will soon emerge. After years of technical research into Web services, we demonstrate the synthesis of gigabit switches, which embodies the robust principles of theory. Loam, our new algorithm for forward-error correction, is the solution to all of these challenges.Keywords: software engineering, architectures, introspective models, operating systems
Procedia PDF Downloads 539
1946 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control
Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni
Abstract:
An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and less situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator as well as 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO’s active work in front of the human machine interface (HMI). The derived vigilance level and attention trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the two VAC’s sub-components.Keywords: automation, human factors, air traffic controller, MINIMA, OOTL (Out-Of-The-Loop), EEG (Electroencephalography), HMI (Human Machine Interface)
Procedia PDF Downloads 384
1945 An In-Situ Integrated Micromachining System for Intricate Micro-Parts Machining
Authors: Shun-Tong Chen, Wei-Ping Huang, Hong-Ye Yang, Ming-Chieh Yeh, Chih-Wei Du
Abstract:
This study presents a novel versatile high-precision integrated micromachining system that combines contact and non-contact micromachining techniques to machine intricate micro-parts precisely. Two broad methods of micro fabrication-1) volume additive (micro co-deposition), and 2) volume subtractive (nanometric flycutting, ultrafine w-EDM (wire Electrical Discharge Machining), and micro honing) - are integrated in the developed micromachining system, and their effectiveness is verified. A multidirectional headstock that supports various machining orientations is designed to evaluate the feasibility of multifunctional micromachining. An exchangeable working-tank that allows for various machining mechanisms is also incorporated into the system. Hence, the micro tool and workpiece need not be unloaded or repositioned until all the planned tasks have been completed. By using the designed servo rotary mechanism, a nanometric flycutting approach with a concentric rotary accuracy of 5-nm is constructed and utilized with the system to machine a diffraction-grating element with a nano-metric scale V-groove array. To improve the wear resistance of the micro tool, the micro co-deposition function is used to provide a micro-abrasive coating by an electrochemical method. The construction of ultrafine w-EDM facilitates the fabrication of micro slots with a width of less than 20-µm on a hardened tool. The hardened tool can thus be employed as a micro honing-tool to hone a micro hole with an internal diameter of 200 µm on SKD-11 molded steel. Experimental results prove that intricate micro-parts can be in-situ manufactured with high-precision by the developed integrated micromachining system.Keywords: integrated micromachining system, in-situ micromachining, nanometric flycutting, ultrafine w-EDM, micro honing
Procedia PDF Downloads 411
1944 Offline Signature Verification in Punjabi Based On SURF Features and Critical Point Matching Using HMM
Authors: Rajpal Kaur, Pooja Choudhary
Abstract:
Biometrics, which refers to identifying an individual based on his or her physiological or behavioral characteristics, has the capability to reliably distinguish between an authorized person and an impostor. Signature recognition systems can be categorized as offline (static) and online (dynamic). This paper presents a SURF-feature-based offline signature recognition system trained on low-resolution scanned signature images. The signature of a person is an important biometric attribute of a human being that can be used to authenticate human identity. A signature can be handled as an image and recognized using computer vision and HMM techniques. With modern computers, there is a need to develop fast algorithms for signature recognition; multiple techniques have been defined for the task, leaving considerable scope for research. In this paper, offline (static) signature recognition and verification using SURF features with an HMM is proposed, where the signature is captured and presented to the user in an image format. Signatures are verified based on parameters extracted from the signature using various image processing techniques. The offline signature verification and recognition system is implemented on the MATLAB platform. The work has been analyzed and tested and found suitable for its purpose. The proposed method performs better than other recently proposed methods. Keywords: offline signature verification, offline signature recognition, signatures, SURF features, HMM
Procedia PDF Downloads 385
1943 Impact on Underprivileged People Practising Expressive Textile Arts: An Exploratory Study Applied to Ex-Offenders in Hong Kong
Abstract:
This study aims to investigate the impact of practicing expressive textile arts on the underprivileged people namely, ex-offenders after taking a three-month textile arts and fashion creativity workshops from a service-learning subject, offered by the Hong Kong Polytechnic University in May 2016. In this service-learning subject, the subject lecturers, students and ex-offenders co-designed various expressive textile artworks together. During the creative process, the ex-offenders could enhance their self-confidence and rebuild a satisfactory identity through practicing expressive textile arts and fashion creativity. Ten textile arts prototypes in the format of fashion garments were presented in a mini fashion show and an exhibition, both at the Hong Kong Polytechnic University in July 2016. A quantitative research method was adopted and a questionnaire survey was conducted in this study. The research findings suggest that positive impacts are found on the ex-offenders’ perceptions of ‘feelings and thoughts before attending the workshops’, ‘feelings and thoughts during the workshops’, ‘attitude toward the textile arts materials’, and ‘attitude toward the expressive textile artworks’.Keywords: creativity, design, expressive textile arts, fashion, underprivileged people
Procedia PDF Downloads 389
1942 Predicting Emerging Agricultural Investment Opportunities: The Potential of Structural Evolution Index
Authors: Kwaku Damoah
Abstract:
The agricultural sector is characterized by continuous transformation, driven by factors such as demographic shifts, evolving consumer preferences, climate change, and migration trends. This dynamic environment presents complex challenges for key stakeholders including farmers, governments, and investors, who must navigate these changes to achieve optimal investment returns. To effectively predict market trends and uncover promising investment opportunities, a systematic, data-driven approach is essential. This paper introduces the Structural Evolution Index (SEI), a machine learning-based methodology. SEI is specifically designed to analyse long-term trends and forecast the potential of emerging agricultural products for investment. Versatile in application, it evaluates various agricultural metrics such as production, yield, trade, land use, and consumption, providing a comprehensive view of the evolution within agricultural markets. By harnessing data from the UN Food and Agricultural Organisation (FAOSTAT), this study demonstrates the SEI's capabilities through Comparative Exploratory Analysis and evaluation of international trade in agricultural products, focusing on Malaysia and Singapore. The SEI methodology reveals intricate patterns and transitions within the agricultural sector, enabling stakeholders to strategically identify and capitalize on emerging markets. This predictive framework is a powerful tool for decision-makers, offering crucial insights that help anticipate market shifts and align investments with anticipated returns.Keywords: agricultural investment, algorithm, comparative exploratory analytics, machine learning, market trends, predictive analytics, structural evolution index
Procedia PDF Downloads 63
1941 Early Impact Prediction and Key Factors Study of Artificial Intelligence Patents: A Method Based on LightGBM and Interpretable Machine Learning
Authors: Xingyu Gao, Qiang Wu
Abstract:
Patents play a crucial role in protecting innovation and intellectual property. Early prediction of the impact of artificial intelligence (AI) patents helps researchers and companies allocate resources and make better decisions. Understanding the key factors that influence patent impact can assist researchers in gaining a better understanding of the evolution of AI technology and innovation trends. Therefore, identifying highly impactful patents early and providing support for them holds immeasurable value in accelerating technological progress, reducing research and development costs, and mitigating market positioning risks. Despite the extensive research on AI patents, accurately predicting their early impact remains a challenge. Traditional methods often consider only single factors or simple combinations, failing to comprehensively and accurately reflect the actual impact of patents. This paper utilized the artificial intelligence patent database from the United States Patent and Trademark Office and the Len.org patent retrieval platform to obtain specific information on 35,708 AI patents. Using six machine learning models, namely Multiple Linear Regression, Random Forest Regression, XGBoost Regression, LightGBM Regression, Support Vector Machine Regression, and K-Nearest Neighbors Regression, and using early indicators of patents as features, the paper comprehensively predicted the impact of patents from three aspects: technical, social, and economic. These aspects include the technical leadership of patents, the number of citations they receive, and their shared value. The SHAP (Shapley Additive exPlanations) metric was used to explain the predictions of the best model, quantifying the contribution of each feature to the model's predictions. The experimental results on the AI patent dataset indicate that, for all three target variables, LightGBM regression shows the best predictive performance. Specifically, patent novelty has the greatest impact on predicting the technical impact of patents and has a positive effect. Additionally, the number of owners, the number of backward citations, and the number of independent claims are all crucial and have a positive influence on predicting technical impact. In predicting the social impact of patents, the number of applicants is considered the most critical input variable, but it has a negative impact on social impact. At the same time, the number of independent claims, the number of owners, and the number of backward citations are also important predictive factors, and they have a positive effect on social impact. For predicting the economic impact of patents, the number of independent claims is considered the most important factor and has a positive impact on economic impact. The number of owners, the number of sibling countries or regions, and the size of the extended patent family also have a positive influence on economic impact. The study primarily relies on data from the United States Patent and Trademark Office for artificial intelligence patents. Future research could consider more comprehensive data sources, including artificial intelligence patent data, from a global perspective. While the study takes into account various factors, there may still be other important features not considered. In the future, factors such as patent implementation and market applications may be considered as they could have an impact on the influence of patents.Keywords: patent influence, interpretable machine learning, predictive models, SHAP
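A hedged sketch of the best-performing setup reported above: a LightGBM regressor predicts a patent-impact target from early indicators, and SHAP quantifies each feature's contribution. The feature names mirror those discussed in the text, but the data itself is a placeholder.

```python
import numpy as np
import pandas as pd
import lightgbm as lgb
import shap
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = pd.DataFrame({
    "novelty_score": rng.random(5000),
    "n_owners": rng.integers(1, 6, 5000),
    "n_backward_citations": rng.integers(0, 60, 5000),
    "n_independent_claims": rng.integers(1, 12, 5000),
    "n_applicants": rng.integers(1, 8, 5000),
})
y = rng.random(5000)                          # e.g. technical-impact target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = lgb.LGBMRegressor(n_estimators=400, learning_rate=0.05, random_state=0)
model.fit(X_tr, y_tr)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
mean_abs = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(mean_abs.sort_values(ascending=False))  # global ranking of impact drivers
```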
Procedia PDF Downloads 50
1940 Prediction of Remaining Life of Industrial Cutting Tools with Deep Learning-Assisted Image Processing Techniques
Authors: Gizem Eser Erdek
Abstract:
This study is research on predicting the remaining life of industrial cutting tools used in the industrial production process with deep learning methods. When the life of cutting tools decreases, they cause destruction to the raw material they are processing. This study it is aimed to predict the remaining life of the cutting tool based on the damage caused by the cutting tools to the raw material. For this, hole photos were collected from the hole-drilling machine for 8 months. Photos were labeled in 5 classes according to hole quality. In this way, the problem was transformed into a classification problem. Using the prepared data set, a model was created with convolutional neural networks, which is a deep learning method. In addition, VGGNet and ResNet architectures, which have been successful in the literature, have been tested on the data set. A hybrid model using convolutional neural networks and support vector machines is also used for comparison. When all models are compared, it has been determined that the model in which convolutional neural networks are used gives successful results of a %74 accuracy rate. In the preliminary studies, the data set was arranged to include only the best and worst classes, and the study gave ~93% accuracy when the binary classification model was applied. The results of this study showed that the remaining life of the cutting tools could be predicted by deep learning methods based on the damage to the raw material. Experiments have proven that deep learning methods can be used as an alternative for cutting tool life estimation.Keywords: classification, convolutional neural network, deep learning, remaining life of industrial cutting tools, ResNet, support vector machine, VggNet
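A hedged sketch of one of the baselines mentioned above: a ResNet backbone from torchvision adapted to the five hole-quality classes, from which remaining tool life would be inferred. The weights, class labels, and training batch are placeholders, not the study's dataset or exact architecture.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5                                  # hole-quality grades 1 (best) .. 5 (worst)

model = models.resnet18(weights=None)            # torchvision >= 0.13; pretrained weights optional
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

images = torch.rand(8, 3, 224, 224)              # placeholder batch of hole photos
labels = torch.randint(0, NUM_CLASSES, (8,))

logits = model(images)                           # one training step
loss = criterion(logits, labels)
optimizer.zero_grad(); loss.backward(); optimizer.step()

# A predicted grade near 5 signals the cutting tool is close to end of life.
print("predicted grades:", logits.argmax(dim=1).tolist())
```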
Procedia PDF Downloads 79