Search results for: cable splicing machine
1637 Optimal Tamping for Railway Tracks, Reducing Railway Maintenance Expenditures by the Use of Integer Programming
Authors: Rui Li, Min Wen, Kim Bang Salling
Abstract:
For modern railways, maintenance is critical for ensuring safety, train punctuality and overall capacity utilization. The cost of railway maintenance in Europe is high, on average between 30,000 and 100,000 Euros per kilometer per year. In order to reduce such maintenance expenditures, this paper presents a mixed 0-1 linear mathematical model designed to optimize predictive railway tamping activities for ballast track over a planning horizon of three to four years. The objective function minimizes the actual costs of operating the tamping machine. The research uses a simple dynamic model of the condition-based tamping process together with a solution method for finding the optimal condition-based tamping schedule. Seven technical and practical aspects are taken into account when scheduling tamping: (1) track degradation, expressed as the standard deviation of the longitudinal level over time; (2) track geometrical alignment; (3) track quality thresholds based on train speed limits; (4) the dependency of track quality recovery on the track quality after the tamping operation; (5) tamping machine operation practices; (6) tamping budgets; and (7) differentiation between open track and station sections. The proposed maintenance model is applied to a 42.6 km Danish railway track between Odense and Fredericia for planning periods of three and four years. The generated tamping schedule is reasonable and robust. Based on the results for the Danish railway corridor, total costs can be reduced significantly (by 50%) compared with a previous model that optimizes the number of tamping operations. Different maintenance strategies are discussed in the paper. The analysis of the model results also shows that a longer predictive tamping planning period yields a more optimal scheduling of maintenance actions than continuous short-term preventive maintenance, namely yearly condition-based planning.
Keywords: integer programming, railway tamping, predictive maintenance model, preventive condition-based maintenance
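As a hedged illustration of the kind of 0-1 scheduling model the abstract describes, the sketch below builds a minimal binary tamping plan with PuLP. The section and period counts, degradation model, cost figures, and quality threshold are assumptions for illustration, not the authors' actual formulation.

```python
# Minimal sketch of a 0-1 tamping-scheduling model in the spirit of the abstract.
# Variable names, the linear degradation model, and the cost figures are
# illustrative assumptions, not the authors' formulation.
import pulp

sections = range(4)               # track sections (hypothetical)
periods = range(6)                # planning periods (hypothetical)
cost_per_tamping = 1.0            # normalized machine cost per intervention
sigma0 = [0.9, 1.1, 1.0, 1.2]     # initial longitudinal-level std. dev. per section
growth = [0.15, 0.2, 0.18, 0.25]  # assumed degradation per period
threshold = 1.5                   # quality threshold tied to the line speed limit

model = pulp.LpProblem("tamping_schedule", pulp.LpMinimize)
x = pulp.LpVariable.dicts("tamp", (sections, periods), cat=pulp.LpBinary)

# Objective: minimize total tamping machine cost over the horizon.
model += pulp.lpSum(cost_per_tamping * x[s][t] for s in sections for t in periods)

# Simplified quality constraint: before a section's projected degradation exceeds
# its threshold, at least one tamping must have been scheduled.
for s in sections:
    for t in periods:
        projected = sigma0[s] + growth[s] * (t + 1)
        if projected > threshold:
            model += pulp.lpSum(x[s][k] for k in periods if k <= t) >= 1

model.solve(pulp.PULP_CBC_CMD(msg=False))
for s in sections:
    plan = [t for t in periods if pulp.value(x[s][t]) > 0.5]
    print(f"section {s}: tamp in periods {plan}")
```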
Procedia PDF Downloads 443
1636 A Convolution Neural Network Approach to Predict Pes-Planus Using Plantar Pressure Mapping Images
Authors: Adel Khorramrouz, Monireh Ahmadi Bani, Ehsan Norouzi, Morvarid Lalenoor
Abstract:
Background: Plantar pressure distribution measurement has long been used to assess foot disorders. Plantar pressure is an important component affecting foot and ankle function, and changes in plantar pressure distribution can indicate various foot and ankle disorders. Morphologic and mechanical properties of the foot may be important factors affecting the plantar pressure distribution. Accurate and early measurement may help to reduce the prevalence of pes planus. With recent developments in technology, new techniques such as machine learning have been used to assist clinicians in identifying patients with foot disorders. Significance of the study: This study proposes a neural-network-based flat foot classification methodology using static foot pressure distribution. Methodologies: Data were collected from 895 patients who were referred to a foot clinic due to foot disorders. Patients with pes planus were labeled by an experienced physician based on clinical examination. All subjects (with and without pes planus) were then evaluated for static plantar pressure distribution. Patients who were diagnosed with flat foot in both feet were included in the study. In the next step, the leg length was normalized and the network was trained on the plantar pressure mapping images. Findings: Of the 895 images, 581 were labeled as pes planus. A convolutional neural network (CNN) was run to evaluate the performance of the proposed model. The prediction accuracy of the basic CNN-based model was assessed and the prediction model was derived through the proposed methodology. In the basic CNN model, the training accuracy was 79.14% and the test accuracy was 72.09%. Conclusion: This model can be easily and simply used by patients with pes planus and by doctors to predict the classification of pes planus and to prescreen for possible musculoskeletal disorders related to this condition. However, more models need to be considered and compared to achieve higher accuracy.
Keywords: foot disorder, machine learning, neural network, pes planus
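The abstract does not publish the network architecture; the following is a minimal sketch of a binary CNN classifier for plantar pressure maps. The 128x128 grayscale input size, layer widths, and training settings are assumptions, and the random arrays merely stand in for the 895 labeled images.

```python
# Minimal sketch of a binary CNN classifier for plantar pressure maps.
# Input size, architecture, and training settings are illustrative assumptions;
# the abstract does not specify the exact network.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_model(input_shape=(128, 128, 1)):
    model = keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(1, activation="sigmoid"),  # pes planus vs. normal
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# Placeholder data standing in for the 895 labeled pressure images.
X = np.random.rand(895, 128, 128, 1).astype("float32")
y = np.random.randint(0, 2, size=895)

model = build_model()
model.fit(X, y, validation_split=0.2, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))
```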
Procedia PDF Downloads 360
1635 Thinking about the Loss of Social Networking Sites May Expand the Distress of Social Exclusion
Authors: Wen-Bin Chiou, Hsiao-Chiao Weng
Abstract:
Social networking sites (SNS) such as Facebook and Twitter are low-cost tools that can promote the creation of social connections by providing a convenient platform that can be accessed at any time. In the current research, a laboratory experiment was conducted to test the hypothesis that reminders of losing SNS would alter the impact of social events, especially those involving social exclusion. Specifically, this study explored whether losing SNS would intensify perceived social distress induced by exclusionary bogus feedback. Eighty-eight Facebook users (46 females, 42 males; mean age = 22.6 years, SD = 3.1 years) were recruited via campus posters and flyers at a national university in southern Taiwan. After participants provided consent, they were randomly assigned to a 2 (SNS non-use vs. neutral) between-subjects experiment. Participants completed an ostensible survey about online social networking that included an item about the time spent on SNS per day. The last question was used to manipulate thoughts about losing SNS access. Participants in the non-use condition were asked to record three conditions that would render them unable to use SNS (e.g., a network adaptor problem, a malfunctioning cable modem, or problems with Internet service providers); participants in the neutral condition recorded three conditions that would render them unable to log onto the college website (e.g., server maintenance, local network or firewall problems). The experiment then employed a bogus-feedback paradigm to induce social exclusion. Participants then rated their social distress on a four-item scale, identical to that of Experiment 1 (α = .84). The results showed that thoughts of losing SNS intensified distress caused by social exclusion, suggesting that the loss of SNS has an effect similar to the loss of a primary source of social reconnection. Moreover, the priming effects of SNS on perceived distress were more prominent for heavy users. The demonstrated link between the idea of losing SNS use and the increased pain of social exclusion underscores the importance of SNS as a crucial gateway for acquiring and rebuilding social connections. Use of online social networking appears to be a two-edged sword for coping with social exclusion in human life in the e-society.
Keywords: online social networking, perceived distress, social exclusion, SNS
Procedia PDF Downloads 420
1634 Predicting Loss of Containment in Surface Pipeline using Computational Fluid Dynamics and Supervised Machine Learning Model to Improve Process Safety in Oil and Gas Operations
Authors: Muhammmad Riandhy Anindika Yudhy, Harry Patria, Ramadhani Santoso
Abstract:
Loss of containment is the primary hazard that process safety management is concerned with in the oil and gas industry. Escalation to more serious consequences begins with loss of containment: oil and gas released through leakage or spillage from primary containment can result in pool fires, jet fires and even explosions when it meets the various ignition sources present in operations. Therefore, the heart of process safety management is avoiding loss of containment and mitigating its impact through the implementation of safeguards. The most effective safeguard in this case is an early detection system that alerts Operations to take action before a potential loss of containment occurs. The value of such a detection system increases when it is applied to a long surface pipeline, which is naturally difficult to monitor at all times and is exposed to multiple causes of loss of containment, from natural corrosion to illegal tapping. Based on prior research and studies, detecting loss of containment accurately in a surface pipeline is difficult. The trade-off between cost-effectiveness and high accuracy has been the main issue when selecting a traditional detection method. The current best-performing method, the Real-Time Transient Model (RTTM), requires analysis of closely positioned pressure, flow and temperature (PVT) points along the pipeline to be accurate. Having multiple adjacent PVT sensors along the pipeline is expensive and hence generally not a viable alternative from an economic standpoint. A conceptual approach that combines mathematical modeling using computational fluid dynamics with a supervised machine learning model has shown promising results for predicting leakage in pipelines. Mathematical modeling is used to generate simulation data, which are then used to train the leak detection and localization models. Mathematical models and simulation software have also been shown to provide results comparable with experimental data at very high levels of accuracy. While a supervised machine learning model requires a large training dataset for the development of accurate models, mathematical modeling has been shown to be able to generate the required datasets, justifying the application of data analytics for the development of model-based leak detection systems for petroleum pipelines. This paper presents a review of key leak detection strategies for oil and gas pipelines, with a specific focus on crude oil applications, and presents the opportunities for the use of data analytics tools and mathematical modeling for the development of a robust real-time leak detection and localization system for surface pipelines. A case study is also presented.
Keywords: pipeline, leakage, detection, AI
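To make the simulation-to-classifier idea concrete, the sketch below trains a supervised leak classifier on a synthetic stand-in for CFD-generated pipeline data. The feature names, the synthetic label rule, and the random-forest choice are assumptions; the paper does not prescribe a specific classifier.

```python
# Minimal sketch: train a supervised leak classifier on CFD-style simulation data.
# Feature names, the synthetic data, and the random-forest choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 5000
# Hypothetical features: upstream/downstream pressure, flow rate, temperature.
X = rng.normal(size=(n, 4))
# Hypothetical label rule: a leak appears as a pressure drop plus a flow imbalance.
y = ((X[:, 0] - X[:, 1] > 0.8) & (X[:, 2] < -0.2)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```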
Procedia PDF Downloads 191
1633 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network
Authors: Abdulaziz Alsadhan, Naveed Khan
Abstract:
In recent years, intrusions on computer networks have become a major security threat. Hence, it is important to impede such intrusions. The hindrance of such intrusions relies entirely on their detection, which is the primary concern of any security tool such as an Intrusion Detection System (IDS). Therefore, it is imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance. The performance of an IDS can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques have the limitation of using the raw data set for classification; the classifier may get confused due to redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Local Binary Patterns (LBP) can be applied to transform the raw features into a principal feature space and to select features based on their sensitivity. Eigenvalues can be used to determine the sensitivity. For further selection of features, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power. This optimal feature subset is then used to perform classification. For classification purposes, a Support Vector Machine (SVM) and a Multilayer Perceptron (MLP) are used due to their proven ability in classification. The Knowledge Discovery and Data Mining (KDD'99) cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms the existing approaches and has the capability to minimize the number of features and maximize the detection rates.
Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Patterns (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP)
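A minimal sketch of the dimensionality-reduction-plus-classification stage described above is shown below, using synthetic data as a stand-in for KDD'99 records. The component count and SVM parameters are assumptions, and the PSO/greedy feature-selection stage is not reproduced.

```python
# Minimal sketch of the PCA -> SVM stage of the proposed pipeline on stand-in data.
# Component count and SVM parameters are assumptions; the PSO/greedy feature
# selection stage described in the abstract is not reproduced here.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for KDD'99-style records (41 features, binary attack label).
X, y = make_classification(n_samples=4000, n_features=41, n_informative=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

pipe = make_pipeline(StandardScaler(), PCA(n_components=12), SVC(kernel="rbf", C=1.0))
pipe.fit(X_tr, y_tr)
print("detection accuracy:", accuracy_score(y_te, pipe.predict(X_te)))
```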
Procedia PDF Downloads 367
1632 Experimental and Theoretical Study on Flexural Behaviors of Reinforced Concrete Cement (RCC) Beams by Using Carbon Fiber Reinforced Polymer (CFRP) Laminate as Retrofitting and Rehabilitation Method
Authors: Fils Olivier Kamanzi
Abstract:
In this research, CFRP materials were used to rehabilitate 9 beams and to retrofit 9 beams, each of size (125x250x2300) mm, cast with M50 grade concrete in which 20% of the volume of cement was replaced by GGBS as a mineral admixture. A superplasticizer (ForscoConplast SP430) was used to reduce the water-cement ratio while maintaining good workability of the fresh concrete (slump test 57 mm). The concrete mix ratio was 1:1.56:2.66 with a water-cement ratio of 0.31 (ACI codebooks). A sample of 6 cubes sized (150x150x150) mm, 6 cylinders sized (150Ф x 300H) mm and 6 prisms sized (100x100x500) mm were cast, cured, and tested at 7, 14 and 28 days by compression, tensile and flexure tests; the mix design finally reached a compressive strength of 59.84 N/mm2. 21 beams were cast and cured for up to 28 days; 3 beams were tested by a two-point loading machine as control beams. 9 beams were distressed in flexure up to the final yielding point under two-point loading conditions by applying 90% of the ultimate load. Three sets, each composed of three distressed beams, were rehabilitated using one, two and three layers of CFRP sheets, respectively, and then retested up to failure. Another three sets were freshly retrofitted, also using one, two and three layers of CFRP sheets, respectively, and tested by the two-point loading method on a compression strength testing machine. The aim of this study is to determine the flexural strength and behavior of beams repaired and retrofitted with CFRP sheets, gaining good strength while considering economic aspects. The results show that the strength of the rehabilitated beams increased by 47%, 78% and 89%, respectively, according to the number of CFRP layers, and that of the retrofitted beams by 41%, 51% and 68%, respectively. The conclusion is that three layers of CFRP sheets are the most effective for repairing and retrofitting beams with the bonding method.
Keywords: retrofitting, rehabilitation, CFRP, RCC beam, flexural strength and behaviors, GGBS, epoxy resin
Procedia PDF Downloads 108
1631 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review
Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha
Abstract:
Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is not far-fetched, but proper classification of this textual information in a given context has been very difficult. As a result, we conducted a systematic review of the previous literature on sentiment classification and the AI-based techniques that have been used, in order to gain a better understanding of how to design and develop a robust, more accurate sentiment classifier that can correctly classify social media text in a given context, distinguishing hate speech from inverted compliments with a high level of accuracy. We evaluated over 250 articles from digital sources such as ScienceDirect, ACM, Google Scholar, and IEEE Xplore and narrowed the selection down to 31 studies. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of performance accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources such as Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques such as CNN+LSTM, CNN+GRU, and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development due to its simplicity and AI-oriented library functionality. Based on some of the important findings from this study, we make recommendations for future research.
Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text
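Since the review reports hybrid CNN+LSTM models as the best performers, the following is a minimal Keras sketch of such a hybrid text classifier. The vocabulary size, sequence length, and layer widths are illustrative assumptions.

```python
# Minimal sketch of a hybrid CNN+LSTM sentiment classifier of the kind the review
# found to perform best. Vocabulary size, sequence length, and layer widths are
# illustrative assumptions.
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, seq_len = 20000, 200

model = keras.Sequential([
    layers.Input(shape=(seq_len,)),
    layers.Embedding(vocab_size, 128),
    layers.Conv1D(64, 5, activation="relu"),   # local n-gram features
    layers.MaxPooling1D(2),
    layers.LSTM(64),                            # longer-range context
    layers.Dense(1, activation="sigmoid"),      # e.g., hate speech vs. not
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```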
Procedia PDF Downloads 115
1630 E-Learning Platform for School Kids
Authors: Gihan Thilakarathna, Fernando Ishara, Rathnayake Yasith, Bandara A. M. R. Y.
Abstract:
E-learning is a crucial component of intelligent education, and even in the midst of a pandemic it is becoming increasingly important in the educational system. Several e-learning programs are available to students; here, we decided to create an e-learning framework for children. We have identified a few issues that teachers face with their online classes. When there are numerous students in an online classroom, how does a teacher recognize a student's focus on academics and below-the-surface behaviors? Some kids do not pay attention in class, and others are napping; the teacher is unable to keep track of every student. A key challenge in e-learning is online exams, because students can easily cheat during them, so exam proctoring is needed. Here we propose an automated online exam cheating detection method using a web camera. The purpose of this project is also to present an e-learning platform for math education that includes games for kids as an alternative teaching method for math students. The games are accessible via a web browser, and the imagery is drawn in a cartoonish style to help students learn math through play. Everything in this day and age is moving towards automation; however, automatic answer evaluation is only available for MCQ-based questions, so checkers have a difficult time evaluating theory solutions. The current system requires more manpower and takes a long time to evaluate responses, and it is possible for two identical responses to be marked differently and receive two different grades. Therefore, this application employs machine learning techniques to provide automatic evaluation of subjective responses based on keywords supplied to the computer alongside the student input, resulting in a fair distribution of marks. In addition, it saves time and manpower. We used deep learning, machine learning, image processing and natural language technologies to develop these research components.
Keywords: math, education games, e-learning platform, artificial intelligence
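As a hedged illustration of keyword-based evaluation of a subjective answer, the sketch below scores a student response by TF-IDF cosine similarity against a teacher-supplied model answer. The similarity-to-marks mapping and the example texts are assumptions; the abstract does not specify the exact evaluation algorithm.

```python
# Minimal sketch of keyword-based automatic scoring of a subjective answer.
# Scoring by TF-IDF cosine similarity against a model answer is an assumption.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

model_answer = "gravity pulls objects toward the earth with constant acceleration"
student_answer = "objects fall because gravity accelerates them toward the earth"

vec = TfidfVectorizer().fit([model_answer, student_answer])
sim = cosine_similarity(vec.transform([model_answer]),
                        vec.transform([student_answer]))[0, 0]

max_marks = 5
print(f"similarity = {sim:.2f}, awarded marks = {round(sim * max_marks)}")
```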
Procedia PDF Downloads 156
1629 Contextual Distribution for Textual Alignment
Authors: Yuri Bizzoni, Marianne Reboul
Abstract:
Our program compares French and Italian translations of Homer’s Odyssey, from the XVIth to the XXth century. We focus on the third point, showing how distributional semantics systems can be used both to improve alignment between different French translations and to align the Greek text with a French translation. Although we focus on French examples, the techniques we present are completely language independent.
Keywords: classical receptions, computational linguistics, distributional semantics, Homeric poems, machine translation, translation studies, text alignment
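The abstract does not describe the alignment algorithm itself; as a rough illustration of the distributional idea, the sketch below represents each sentence by the average of its word vectors and matches sentences across two translations by cosine similarity. The toy vectors and the greedy matching strategy are assumptions.

```python
# Minimal sketch of alignment via distributional semantics: each sentence is
# represented by the average of its word vectors and matched greedily by cosine
# similarity. The toy vectors and greedy strategy are illustrative assumptions.
import numpy as np

# Hypothetical pre-trained word vectors (in practice loaded from a model).
vectors = {
    "ulysse": np.array([0.9, 0.1, 0.0]), "ulisse": np.array([0.88, 0.12, 0.02]),
    "mer": np.array([0.1, 0.9, 0.1]),    "mare": np.array([0.12, 0.88, 0.08]),
    "rentre": np.array([0.2, 0.2, 0.9]), "ritorna": np.array([0.22, 0.18, 0.88]),
}

def sent_vec(sentence):
    words = [vectors[w] for w in sentence.split() if w in vectors]
    return np.mean(words, axis=0) if words else np.zeros(3)

def cosine(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

french = ["ulysse rentre", "mer"]
italian = ["mare", "ulisse ritorna"]

# Greedy alignment: each French sentence gets its most similar Italian sentence.
for f in french:
    best = max(italian, key=lambda i: cosine(sent_vec(f), sent_vec(i)))
    print(f"{f!r} <-> {best!r}")
```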
Procedia PDF Downloads 434
1628 Exploring the Use of Augmented Reality for Laboratory Lectures in Distance Learning
Authors: Michele Gattullo, Vito M. Manghisi, Alessandro Evangelista, Enricoandrea Laviola
Abstract:
In this work, we explored the use of Augmented Reality (AR) to support students in laboratory lectures in Distance Learning (DL), designing an application that proved to be ready for use in the next semester. AR can help students understand complex concepts and increase their motivation in the learning process. However, despite many prototypes in the literature, it is still little used in schools and universities. This is mainly due to its perceived limited advantages relative to the investment costs, especially regarding the changes needed in teaching modalities. However, with the spread of the epidemiological emergency due to SARS-CoV-2, schools and universities were forced into a very rapid redefinition of consolidated processes towards forms of Distance Learning. Despite its many advantages, DL suffers from the impossibility of carrying out practical activities, which are of crucial importance in STEM (Science, Technology, Engineering and Math) didactics. In this context, the perceived advantages of AR have increased considerably, since teachers are now more prepared for new teaching modalities and AR allows students to carry out practical activities on their own instead of being physically present in laboratories. In this work, we designed an AR application to support engineering students in understanding assembly drawings of complex machines. Traditionally, this skill is acquired in the first years of the bachelor's degree in industrial engineering through laboratory activities in which the teacher shows the corresponding components (e.g., bearings, screws, shafts) in a real machine and their representation in the assembly drawing. This research aims to explore the effectiveness of AR in allowing students to acquire this skill on their own without being physically present in the laboratory. In a preliminary phase, we interviewed students to understand the main issues in learning this subject. This survey revealed that students had difficulty identifying machine components in an assembly drawing, matching the 2D representation of a component with its real shape, and understanding the functionality of a component within the machine. We developed a mobile application using Unity3D, aiming to solve these issues, and designed it in collaboration with the course professors. Natural feature tracking was used to associate the 2D printed assembly drawing with the corresponding 3D virtual model. The application can be displayed on students’ tablets or smartphones. Users interact by selecting a component from a part list on the device; 3D representations of the components then appear on the printed drawing, coupled with 3D virtual labels for their location and identification. Users can also watch a 3D animation to learn how the components are assembled. Students evaluated the application through a questionnaire based on the System Usability Scale (SUS). The survey was given to 15 students selected among those who participated in the preliminary interview. The mean SUS score was 83 (SD 12.9) out of a maximum of 100, allowing teachers to use the AR application in their courses. Another important finding is that almost all the students reported that this application would significantly support their comprehension on their own.
Keywords: augmented reality, distance learning, STEM didactics, technology in education
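For reference, the reported mean of 83 comes from the standard SUS scoring scheme, in which odd items contribute (rating − 1), even items contribute (5 − rating), and the sum is multiplied by 2.5 to give a 0-100 score. The sketch below implements that standard computation; the example responses are hypothetical.

```python
# Standard System Usability Scale (SUS) scoring, as used for the reported mean
# of 83. The example responses below are hypothetical.
def sus_score(responses):
    """responses: list of 10 ratings on a 1-5 Likert scale (items in SUS order)."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)  # odd items positive, even negative
    return total * 2.5

example = [5, 2, 4, 1, 5, 2, 4, 2, 5, 1]   # one hypothetical student questionnaire
print(sus_score(example))  # 87.5
```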
Procedia PDF Downloads 128
1627 Ending Wars Over Water: Evaluating the Extent to Which Artificial Intelligence Can Be Used to Predict and Prevent Transboundary Water Conflicts
Authors: Akhila Potluru
Abstract:
Worldwide, more than 250 bodies of water are transboundary, meaning they cross the political boundaries of multiple countries. This creates a system of hydrological, economic, and social interdependence between communities reliant on these water sources, and transboundary water conflicts can occur as a result of this intense interdependence. Many factors contribute to the sparking of transboundary water conflicts, ranging from natural hydrological factors to hydro-political interactions. Previous attempts to predict transboundary water conflicts by analysing changes or trends in the contributing factors have typically failed because patterns in the data are hard to identify. However, there is potential for artificial intelligence and machine learning to fill this gap and identify future ‘hotspots’ up to a year in advance by finding patterns in data where humans cannot. This research determines the extent to which AI can be used to predict and prevent transboundary water conflicts. This is done via a critical literature review of previous case studies and datasets where AI was deployed to predict water conflict. This research not only delivered a more nuanced understanding of previously undervalued factors that contribute to transboundary water conflicts (in particular, culture and disinformation) but also showed that, by detecting conflict early, governance bodies can engage in processes to de-escalate it by providing pre-emptive solutions. Looking forward, this gives rise to significant policy implications and water-sharing agreements, which may be able to prevent water conflicts from developing into wide-scale disasters. Additionally, AI can be used to gain a fuller picture of water-based conflicts in areas where security concerns mean it is not possible to have staff on the ground. Therefore, AI enhances not only the depth of our knowledge about transboundary water conflicts but also its breadth. With demand for water constantly growing, competition between countries over shared water will increasingly lead to water conflict. There has never been a more significant time for us to be able to accurately predict and take precautions to prevent global water conflicts.
Keywords: artificial intelligence, machine learning, transboundary water conflict, water management
Procedia PDF Downloads 105
1626 Cocoon Characterization of Sericigenous Insects in North-East India and Prospects
Authors: Tarali Kalita, Karabi Dutta
Abstract:
The North Eastern Region of India, with diverse climatic conditions and a wide range of ecological habitats, makes an ideal natural abode for a good number of silk-producing insects. The cocoon is the economically important life stage from which silk is obtained. In recent years, silk-based biomaterials have gained considerable attention; their performance depends on the structure and properties of the silkworm cocoons as well as of the silk yarn. The present investigation deals with the morphological study of cocoons, including cocoon color, cocoon size, shell weight and shell ratio of eleven different species of silk insects collected from different regions of North East India. Scanning electron microscopy and X-ray photoelectron spectroscopy were performed to determine the arrangement of silk threads in the cocoons and the atomic elemental composition, respectively. Further, the collected cocoons were degummed and reeled/spun on a reeling machine or spinning wheel to determine the filament length, linear density and tensile strength using a universal testing machine. The study showed significant variation in cocoon color, cocoon shape, cocoon weight and filament packaging. XPS analysis revealed the presence of the elements (mass %) C, N, O, Si and Ca in varying amounts. The wild cocoons showed the presence of calcium oxalate crystals, which make the cocoons hard and require further treatment before reeling. In the present investigation, the highest percentage of strain (%) and toughness (g/den) were observed in Antheraea assamensis, which implies a more compact molecular packing in muga silk. It is expected that this study will be the basis for further biomimetic studies to design and manufacture artificial fiber composites with novel morphologies and associated material properties.
Keywords: cocoon characterization, north-east India, prospects, silk characterization
Procedia PDF Downloads 90
1625 Comprehensive Multilevel Practical Condition Monitoring Guidelines for Power Cables in Industries: Case Study of Mobarakeh Steel Company in Iran
Authors: S. Mani, M. Kafil, E. Asadi
Abstract:
Condition Monitoring (CM) of electrical equipment has gained remarkable importance in recent years due to huge production losses, substantial imposed costs and increases in vulnerability, risk and uncertainty levels. Power cables feed numerous pieces of electrical equipment such as transformers, motors, and electric furnaces; thus their condition assessment is of great importance. This paper investigates electrical, structural and environmental failure sources, all of which influence cables' performance and limit their uptime, and provides a comprehensive framework of practical CM guidelines for the maintenance of cables in industry. The multilevel CM framework presented in this study covers performance-indicative features of power cables, with a focus on both online and offline diagnosis and test scenarios, and covers short-term and long-term threats to the operation and longevity of power cables. After concisely overviewing the concept of CM, the study thoroughly investigates five major areas: power quality; insulation quality features of partial discharges, tan delta and voltage withstand capability; sheath faults; shield currents; and the environmental features of temperature and humidity. It elaborates the interconnections and mutual impacts between these areas using mathematical formulations and practical guidelines. Detection, location, and severity identification methods for every threat or fault source are also elaborated. Finally, the comprehensive, practical guidelines developed in the study are applied to the specific case of Electric Arc Furnace (EAF) feeder MV power cables in Mobarakeh Steel Company (MSC), the largest steel company in the MENA region, in Iran. The specific technical and industrial characteristics and limitations of a harsh industrial environment such as the MSC EAF feeder cable tunnels are imposed on the presented framework, making the suggested package more practical and tangible.
Keywords: condition monitoring, diagnostics, insulation, maintenance, partial discharge, power cables, power quality
Procedia PDF Downloads 2281624 Using MALDI-TOF MS to Detect Environmental Microplastics (Polyethylene, Polyethylene Terephthalate, and Polystyrene) within a Simulated Tissue Sample
Authors: Kara J. Coffman-Rea, Karen E. Samonds
Abstract:
Microplastic pollution is an urgent global threat to our planet and human health. Microplastic particles have been detected within our food, water, and atmosphere, and found within human stool, placenta, and lung tissue. However, most spectrometric microplastic detection methods require chemical digestion, which can alter or destroy microplastic particles and makes it impossible to acquire information about their in-situ distribution. MALDI TOF MS (matrix-assisted laser desorption ionization-time of flight mass spectrometry) is an analytical method using a soft ionization technique that can be used for polymer analysis. This method provides a valuable opportunity both to acquire information regarding the in-situ distribution of microplastics and to minimize the destructive element of chemical digestion. In addition, MALDI TOF MS allows for expanded analysis of the microplastics, including detection of specific additives that may be present within them. MALDI TOF MS is particularly sensitive to sample preparation and has not yet been used to analyze environmental microplastics within their specific location (e.g., biological tissues, sediment, water). In this study, microplastics were created using polyethylene gloves, polystyrene micro-foam, and polyethylene terephthalate cable sleeving. Plastics were frozen using liquid nitrogen and ground to obtain small fragments. An artificial tissue was created using a cellulose sponge as scaffolding coated with a MaxGel Extracellular Matrix to simulate human lung tissue. Optimal preparation techniques (e.g., matrix, cationization reagent, solvent, mixing ratio, laser intensity) were first established for each specific polymer type. The artificial tissue sample was subsequently spiked with microplastics, and specific polymers were detected using MALDI-TOF-MS. This study presents a novel method for the detection of environmental polyethylene, polyethylene terephthalate, and polystyrene microplastics within a complex sample. The results of this study provide an effective method that can be used in future microplastics research and can aid in determining the potential threats that microplastics pose to environmental and human health.
Keywords: environmental plastic pollution, MALDI-TOF MS, microplastics, polymer identification
Procedia PDF Downloads 2561623 Sound Analysis of Young Broilers Reared under Different Stocking Densities in Intensive Poultry Farming
Authors: Xiaoyang Zhao, Kaiying Wang
Abstract:
The choice of stocking density in poultry farming is a potential way of determining the welfare level of poultry. However, it is difficult to assess stocking densities in poultry farming because of the many variables, such as species, age and weight, feeding method, house structure and geographical location, that differ between broiler houses. A method is proposed in this paper to measure the differences between young broilers reared under different stocking densities by sound analysis. Vocalisations of broilers were recorded and analysed under different stocking densities to identify the relationship between sounds and stocking densities. Recordings were made continuously for three-week-old chickens in order to evaluate the variation of sounds emitted by the animals from the beginning. The experimental trial was carried out in an indoor-reared broiler farm; the audio recording procedures lasted for 5 days. Broilers were divided into 5 groups with stocking density treatments of 8/m², 10/m², 12/m² (96 birds/pen), 14/m² and 16/m²; all other conditions, including ventilation and feed, were kept the same across groups. The recordings and analysis of the chickens' sounds were made noninvasively. Sound recordings were manually analysed and labelled using sound analysis software (GoldWave Digital Audio Editor). After the sound acquisition process, Mel Frequency Cepstrum Coefficients (MFCC) were extracted from the sound data, and a Support Vector Machine (SVM) was used as an early detector and classifier. This preliminary study, conducted in an indoor-reared broiler farm, shows that this method can be used to classify the sounds of chickens under different densities economically (only a cheap microphone and recorder are needed), with a classification accuracy of 85.7%. This method can predict the optimum stocking density of broilers when complemented by animal welfare indicators, animal productivity indicators, and so on.
Keywords: broiler, stocking density, poultry farming, sound monitoring, Mel Frequency Cepstrum Coefficients (MFCC), Support Vector Machine (SVM)
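A minimal sketch of the MFCC-plus-SVM pipeline described above is shown below, assuming a set of labelled WAV recordings per stocking-density group. The file names, labels, and SVM parameters are illustrative assumptions.

```python
# Minimal sketch of the MFCC + SVM pipeline described in the abstract.
# File paths, labels, and SVM parameters are illustrative assumptions.
import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def mfcc_features(path, n_mfcc=13):
    y, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)  # average over time to get one vector per recording

# Hypothetical recordings labelled with their stocking-density group (0..4).
paths = ["pen_8.wav", "pen_10.wav", "pen_12.wav", "pen_14.wav", "pen_16.wav"] * 20
labels = [0, 1, 2, 3, 4] * 20

X = np.vstack([mfcc_features(p) for p in paths])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", C=10.0)
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```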
Procedia PDF Downloads 162
1622 An Evolutionary Approach for Automated Optimization and Design of Vivaldi Antennas
Authors: Sahithi Yarlagadda
Abstract:
Antenna design is constrained by mathematical and geometrical parameters. Although there are diverse antenna structures with a wide range of feeds, there are many geometries to be tried that cannot be fitted into predefined computational methods. Antenna design and optimization are well suited to an evolutionary algorithmic approach, since the antenna parameter weights depend directly on geometric characteristics. The evolutionary algorithm can be explained simply for a given quality function to be maximized: we randomly create a set of candidate solutions, elements of the function's domain, and apply the quality function as an abstract fitness measure. Based on this fitness, some of the better candidates are chosen to seed the next generation by applying recombination and mutation to them. In the conventional approach, the quality function is unaltered across iterations, but the antenna parameters and geometries are too wide-ranging to fit into a single function. Therefore, the weight coefficients are obtained for all possible antenna electrical parameters and geometries, and their variation is learnt by mining the data obtained for an optimized algorithm. The weight and covariance coefficients of the corresponding parameters are logged for learning and future use as datasets. This paper drafts an approach to obtain the requirements to study and methodize the evolutionary approach to automated antenna design, using our past work on the Vivaldi antenna as a test candidate. Antenna parameters such as gain and directivity are directly governed by geometries, materials, and dimensions. The design equations are noted and evaluated for all possible conditions to obtain maxima and minima for the given frequency band; the boundary conditions are thus obtained prior to implementation, easing the optimization. The implementation mainly aimed to study the practical computational, processing, and design complexities incurred during simulations. HFSS is chosen for simulations and results. MATLAB is used to generate the computations and combinations and to log data; it is also used to apply machine learning algorithms and to plot the data used to design the algorithm. The number of combinations is too large to be tested manually, so the HFSS API is used to call HFSS functions from MATLAB itself, and the MATLAB parallel processing toolbox is used to run multiple simulations in parallel. The aim is to develop an add-in to antenna design software like HFSS or CST, or a standalone application, to optimize pre-identified common parameters of the wide range of antennas available. In this paper, we have used MATLAB to calculate Vivaldi antenna parameters such as slot line characteristic impedance, stripline impedance, slot line width, flare aperture size and dielectric properties, and k-means and a Hamming window are applied to obtain the best test parameters. The HFSS API is used to calculate the radiation, bandwidth, directivity, and efficiency, and the data are logged for applying the evolutionary genetic algorithm in MATLAB. The paper demonstrates the computational weights and the machine learning approach for automated antenna optimization for the Vivaldi antenna.
Keywords: machine learning, Vivaldi, evolutionary algorithm, genetic algorithm
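The abstract runs its genetic algorithm in MATLAB against HFSS; as a language-neutral illustration of the generate-evaluate-select-recombine loop it describes, the sketch below uses Python with a placeholder fitness function standing in for an HFSS simulation. The parameter bounds and GA settings are assumptions.

```python
# Minimal sketch of the evolutionary loop described in the abstract, written in
# Python for illustration (the paper drives HFSS from MATLAB). The fitness
# function is a placeholder for an HFSS simulation; bounds and GA settings are
# assumptions.
import random

BOUNDS = {"slot_width_mm": (0.2, 2.0), "flare_aperture_mm": (20.0, 60.0)}

def random_candidate():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def fitness(c):
    # Placeholder quality function; in the paper this would be gain/bandwidth
    # returned by an HFSS simulation called through its API.
    return -((c["slot_width_mm"] - 0.8) ** 2 + (c["flare_aperture_mm"] - 40.0) ** 2)

def crossover(a, b):
    return {k: random.choice((a[k], b[k])) for k in BOUNDS}

def mutate(c, rate=0.2):
    out = dict(c)
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            out[k] = random.uniform(lo, hi)
    return out

population = [random_candidate() for _ in range(30)]
for generation in range(40):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                      # selection of better candidates
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    population = parents + children                # seed the next generation

best = max(population, key=fitness)
print("best candidate:", best)
```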
Procedia PDF Downloads 110
1621 The Effects of a Thin Liquid Layer on the Hydrodynamic Machine Rotor
Authors: Jaroslav Krutil, František Pochylý, Simona Fialová, Vladimír Habán
Abstract:
A mathematical model of the additional effects of the liquid in the hydrodynamic gap is presented in the paper. An incompressible viscous fluid is considered. The matrices of mass, stiffness and damping are determined on the basis of computational modeling. The mathematical model is experimentally verified.
Keywords: computational modeling, mathematical model, hydrodynamic gap, matrices of mass, stiffness and damping
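The abstract does not give its formulation; as an illustration of how such additional matrices typically enter the analysis, the fluid contributions are commonly added to the structural mass, damping, and stiffness matrices in the rotor equation of motion, in a form such as the sketch below (the symbols are assumed, not taken from the paper).

```latex
% Illustrative form of a rotor equation of motion with the liquid layer's
% additional matrices; symbols are assumed, not taken from the paper.
\left(\mathbf{M} + \mathbf{M}_f\right)\ddot{\mathbf{q}}
+ \left(\mathbf{B} + \mathbf{B}_f\right)\dot{\mathbf{q}}
+ \left(\mathbf{K} + \mathbf{K}_f\right)\mathbf{q}
= \mathbf{F}(t)
```

Here M, B, K are the structural mass, damping and stiffness matrices, while M_f, B_f, K_f denote the additional matrices contributed by the liquid in the hydrodynamic gap.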
Procedia PDF Downloads 557
1620 The Role of Artificial Intelligence Algorithms in Psychiatry: Advancing Diagnosis and Treatment
Authors: Netanel Stern
Abstract:
Artificial intelligence (AI) algorithms have emerged as powerful tools in the field of psychiatry, offering new possibilities for enhancing diagnosis and treatment outcomes. This article explores the utilization of AI algorithms in psychiatry, highlighting their potential to revolutionize patient care. Various AI algorithms, including machine learning, natural language processing (NLP), reinforcement learning, clustering, and Bayesian networks, are discussed in detail. Moreover, ethical considerations and future directions for research and implementation are addressed.
Keywords: AI, software engineering, psychiatry, neuroimaging
Procedia PDF Downloads 116
1619 Enhancing Plant Throughput in Mineral Processing Through Multimodal Artificial Intelligence
Authors: Muhammad Bilal Shaikh
Abstract:
Mineral processing plants play a pivotal role in extracting valuable minerals from raw ores, contributing significantly to various industries. However, the optimization of plant throughput remains a complex challenge, necessitating innovative approaches for increased efficiency and productivity. This research paper investigates the application of Multimodal Artificial Intelligence (MAI) techniques to address this challenge, aiming to improve overall plant throughput in mineral processing operations. The integration of multimodal AI leverages a combination of diverse data sources, including sensor data, images, and textual information, to provide a holistic understanding of the complex processes involved in mineral extraction. The paper explores the synergies between various AI modalities, such as machine learning, computer vision, and natural language processing, to create a comprehensive and adaptive system for optimizing mineral processing plants. The primary focus of the research is on developing advanced predictive models that can accurately forecast various parameters affecting plant throughput. Utilizing historical process data, machine learning algorithms are trained to identify patterns, correlations, and dependencies within the intricate network of mineral processing operations. This enables real-time decision-making and process optimization, ultimately leading to enhanced plant throughput. Incorporating computer vision into the multimodal AI framework allows for the analysis of visual data from sensors and cameras positioned throughout the plant. This visual input aids in monitoring equipment conditions, identifying anomalies, and optimizing the flow of raw materials. The combination of machine learning and computer vision enables the creation of predictive maintenance strategies, reducing downtime and improving the overall reliability of mineral processing plants. Furthermore, the integration of natural language processing facilitates the extraction of valuable insights from unstructured textual data, such as maintenance logs, research papers, and operator reports. By understanding and analyzing this textual information, the multimodal AI system can identify trends, potential bottlenecks, and areas for improvement in plant operations. This comprehensive approach enables a more nuanced understanding of the factors influencing throughput and allows for targeted interventions. The research also explores the challenges associated with implementing multimodal AI in mineral processing plants, including data integration, model interpretability, and scalability. Addressing these challenges is crucial for the successful deployment of AI solutions in real-world industrial settings. To validate the effectiveness of the proposed multimodal AI framework, the research conducts case studies in collaboration with mineral processing plants. The results demonstrate tangible improvements in plant throughput, efficiency, and cost-effectiveness. The paper concludes with insights into the broader implications of implementing multimodal AI in mineral processing and its potential to revolutionize the industry by providing a robust, adaptive, and data-driven approach to optimizing plant operations. In summary, this research contributes to the evolving field of mineral processing by showcasing the transformative potential of multimodal artificial intelligence in enhancing plant throughput. 
The proposed framework offers a holistic solution that integrates machine learning, computer vision, and natural language processing to address the intricacies of mineral extraction processes, paving the way for a more efficient and sustainable future in the mineral processing industry.
Keywords: multimodal AI, computer vision, NLP, mineral processing, mining
Procedia PDF Downloads 68
1618 Liver Regeneration of Small in situ Injury
Authors: Ziwei Song, Junjun Fan, Jeremy Teo, Yang Yu, Yukun Ma, Jie Yan, Shupei Mo, Lisa Tucker-Kellogg, Peter So, Hanry Yu
Abstract:
The liver is the center of detoxification and is exposed to toxic metabolites all the time. It is highly regenerative after injury, with the ability to recover even after 70% partial hepatectomy. Most previous studies used hepatectomy as the injury model for studying liver regeneration. There is limited understanding of small-scale liver injury, which can be caused by either low-dose drug consumption or routine hepatocyte metabolism. Although these small in situ injuries do not cause immediate symptoms, repeated injuries will lead to aberrant wound healing in the liver. Therefore, the cellular dynamics during liver regeneration are critical for our understanding of the liver regeneration mechanism. We aim to study the liver regeneration of small-scale in situ liver injury in transgenic mice with labeled actin (Lifeact-GFP). Previous studies have used sample sections and biopsies of the liver, which lack real-time information. In order to trace every individual hepatocyte during the regeneration process, we have developed and optimized an intravital imaging system that allows in vivo imaging of the mouse liver for 5 consecutive days, permitting real-time cellular tracking and quantification of hepatocytes. We used femtosecond-laser ablation to create a controlled and repeatable injury model, which mimics real-life small in situ liver injury. This injury model is the first of its kind for in vivo studies of the liver. We found that small-scale in situ liver injury is repaired by the coordination of hypertrophy and migration of hepatocytes. Hypertrophy is only transient in the initial phase, while migration is the main driving force that completes the regeneration process. From the cellular aspect, the Akt/mTOR pathway is activated immediately after injury, which leads to transient hepatocyte hypertrophy. From the mechano-sensing aspect, the actin cable formed at the apical surface of wound-proximal hepatocytes provides the mechanical tension for hepatocyte migration. This study provides important information on both the chemical and mechanical signals that promote liver regeneration after small in situ injury. We conclude that hypertrophy and migration play dominant roles at different stages of liver regeneration.
Keywords: hepatocyte, hypertrophy, intravital imaging, liver regeneration, migration
Procedia PDF Downloads 205
1617 Experimental Study on Friction Factor of Oscillating Flow Through a Regenerator
Authors: Mohamed Saïd Kahaleras, François Lanzetta, Mohamed Khan, Guillaume Layes, Philippe Nika
Abstract:
This paper presents experimental work to characterize the dynamic operation of a metal regenerator crossed by an alternating flow of dry compressible air. Unsteady dynamic measurements concern the pressure, velocity and temperature of the gas at the ends of and inside the channels of the regenerator. The regenerators are tested under isothermal conditions and under an axial temperature gradient.
Keywords: friction factor, oscillating flow, regenerator, stirling machine
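The abstract does not state which friction-factor definition is used; as an illustration, a common Darcy-type way to evaluate an instantaneous friction factor from the measured quantities is the relation below, where the symbols are assumed rather than taken from the paper.

```latex
% A common Darcy-type definition of the instantaneous friction factor from the
% measured pressure drop; symbols are assumed, not taken from the paper.
f(t) \;=\; \frac{2\,\Delta p(t)\, d_h}{\rho\, u(t)^2\, L}
```

Here Δp is the pressure drop across the regenerator, d_h the hydraulic diameter of the channels, L the regenerator length, ρ the gas density, and u the cross-sectional mean velocity.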
Procedia PDF Downloads 508
1616 Investigation of Shear Strength, and Dilative Behavior of Coarse-grained Samples Using Laboratory Test and Machine Learning Technique
Authors: Ehsan Mehryaar, Seyed Armin Motahari Tabari
Abstract:
Coarse-grained soils are well known and commonly used in a wide range of geotechnical projects, including high earth dams and embankments, for their high shear strength. The most important engineering property of these soils is the friction angle, which represents the interlocking between soil particles and is applied widely in the design and construction of these earth structures. The friction angle and dilative behavior of coarse-grained soils can be estimated from empirical correlations with in-situ testing and the physical properties of the soil, or measured directly in the laboratory by performing direct shear or triaxial tests. Unfortunately, large-scale testing is difficult, challenging, and expensive and is not possible in most soil mechanics laboratories. It is therefore common to remove the large particles and perform the tests, which cannot be regarded as an exact estimate of the parameters and behavior of the original soil. This paper describes a new methodology that scales the particle grading distribution of a well-graded gravel sample down to a smaller sample that can be tested in an ordinary direct shear apparatus, and then estimates the stress-strain behavior, friction angle, and dilative behavior of the original coarse-grained soil, considering its confining pressure and relative density, using a machine learning method. A total of 72 direct shear tests are performed with 6 different sizes, 3 different confining pressures, and 4 different relative densities. The Multivariate Adaptive Regression Splines (MARS) technique was used to develop an equation to predict shear strength and dilative behavior based on the size distribution of the coarse-grained soil particles. An uncertainty analysis was also performed in order to examine the reliability of the proposed equation.
Keywords: MARS, coarse-grained soil, shear strength, uncertainty analysis
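As a hedged illustration of fitting a MARS model to direct-shear results, the sketch below assumes the py-earth package (`pyearth`) is available and uses synthetic stand-in data; the feature names (scaled particle size, confining pressure, relative density) and the friction-angle response are assumptions, not the study's dataset.

```python
# Minimal sketch of fitting a MARS model to direct-shear data, assuming the
# py-earth package is available. The synthetic data and feature names are
# assumptions, not the study's dataset.
import numpy as np
from pyearth import Earth

rng = np.random.default_rng(1)
n = 72  # mirroring the number of direct shear tests in the study
X = np.column_stack([
    rng.uniform(2, 40, n),     # scaled maximum particle size (mm)
    rng.uniform(50, 400, n),   # confining pressure (kPa)
    rng.uniform(0.3, 0.9, n),  # relative density (-)
])
# Hypothetical response: peak friction angle (degrees).
y = 30 + 0.1 * X[:, 0] - 0.01 * X[:, 1] + 8 * X[:, 2] + rng.normal(0, 0.5, n)

model = Earth(max_degree=2)
model.fit(X, y)
print(model.summary())          # hinge-function terms of the fitted equation
print(model.predict(X[:3]))     # predicted friction angles for the first tests
```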
Procedia PDF Downloads 162
1615 PhotoRoom App
Authors: Nouf Nasser, Nada Alotaibi, Jazzal Kandiel
Abstract:
This research study is about the use of artificial intelligence in PhotoRoom. When an individual selects a photo, PhotoRoom automatically removes or separates the background from the other parts of the photo through the use of artificial intelligence. This allows an individual to select their desired background and edit it as they wish. The methodology used was observation, in which various reviews and parts of the app were examined. The findings from the review section showed that many people like the app, and some even rated it five stars. The conclusion was that PhotoRoom is one of the best photo editing apps due to its speed and accuracy in removing backgrounds.
Keywords: removing background, app, artificial intelligence, machine learning
Procedia PDF Downloads 199
1614 RPM-Synchronous Non-Circular Grinding: An Approach to Enhance Efficiency in Grinding of Non-Circular Workpieces
Authors: Matthias Steffan, Franz Haas
Abstract:
The production process of grinding is one of the last steps in a value-adding manufacturing chain. Within this step, workpiece geometry and surface roughness are determined. Up to this process stage, considerable costs and energy have already been spent on the components. According to the current state of the art, large safety reserves are therefore calculated in order to guarantee process capability. Especially for non-circular grinding, this leads to considerable losses of process efficiency. With present technology, the various non-circular geometries on a workpiece must be ground sequentially in an oscillating process in which the X- and Q-axes of the machine are coupled. With the approach of RPM-Synchronous Noncircular Grinding, such workpieces can be machined in an ordinary plunge grinding process, in which the rotational speeds of the workpiece and the grinding wheel are held in a fixed ratio. A non-circular grinding wheel is used to transfer its geometry onto the workpiece. The authors use a machine tool that is unique worldwide and was especially designed for this technology. Very high rotational speeds of the workpiece spindle (up to 4500 rpm) are mandatory for the success of this grinding process. The grinding is performed in a two-step process. For roughing, a highly porous vitrified-bonded grinding wheel with medium grain size is used. It ensures high specific material removal rates for efficiently producing the non-circular geometry on the workpiece. This process step is governed by a force control algorithm, which uses data acquired from a three-component force sensor located in the dead centre of the tailstock. For finishing, a grinding wheel with a fine grain size is used. Roughing and finishing are performed consecutively within the same clamping of the workpiece using two locally separated grinding spindles. The approach of RPM-Synchronous Noncircular Grinding shows great efficiency enhancement in non-circular grinding. For the first time, three-dimensional non-circular shapes can be ground, which opens up various fields of application. The automotive industry in particular shows great interest in this emerging trend in finishing machining.
Keywords: efficiency enhancement, finishing machining, non-circular grinding, rpm-synchronous grinding
Procedia PDF Downloads 283
1613 Automatic Approach for Estimating the Protection Elements of Electric Power Plants
Authors: Mahmoud Mohammad Salem Al-Suod, Ushkarenko O. Alexander, Dorogan I. Olga
Abstract:
New algorithms using microprocessor systems have been proposed for protection of the diesel-generator unit in autonomous power systems. The software structure is designed to enhance the control automata of the system, in which every protection module of the diesel-generator encapsulates a finite state machine.
Keywords: diesel-generator unit, protection, state diagram, control system, algorithm, software components
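The abstract does not publish its state diagrams; as a hedged illustration of a protection module encapsulating a finite state machine, the sketch below models an overcurrent module with NORMAL/ALARM/TRIP states driven by a measured current. The state names and thresholds are assumptions.

```python
# Minimal sketch of a protection module encapsulating a finite state machine,
# here an overcurrent example with assumed states and thresholds (the paper's
# actual state diagrams are not reproduced).
from enum import Enum, auto

class State(Enum):
    NORMAL = auto()
    ALARM = auto()
    TRIP = auto()

class OvercurrentProtection:
    def __init__(self, alarm_level=1.1, trip_level=1.5, rated_current=100.0):
        self.state = State.NORMAL
        self.alarm_level = alarm_level * rated_current
        self.trip_level = trip_level * rated_current

    def step(self, measured_current):
        """One scan cycle of the protection automaton."""
        if self.state is State.TRIP:
            return self.state                      # latched until manual reset
        if measured_current >= self.trip_level:
            self.state = State.TRIP                # open the breaker
        elif measured_current >= self.alarm_level:
            self.state = State.ALARM               # warn the operator
        else:
            self.state = State.NORMAL
        return self.state

prot = OvercurrentProtection()
for amps in [95, 108, 120, 160, 90]:
    print(amps, prot.step(amps).name)
```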
Procedia PDF Downloads 419
1612 Contribution to Improving the DFIG Control Using a Multi-Level Inverter
Authors: Imane El Karaoui, Mohammed Maaroufi, Hamid Chaikhy
Abstract:
The Doubly Fed Induction Generator (DFIG) is one of the most reliable wind generators. A major problem in wind power generation is generating a sinusoidal signal with very low THD at variable speed, which is difficult with the two-level inverters commonly used. This paper presents a multi-level inverter whose objective is to reduce the THD and the dimensions of the output filter. The work proposes a three-level NPC-type inverter; simulation results are presented, demonstrating the efficiency of the proposed inverter.
Keywords: DFIG, multilevel inverter, NPC inverter, THD, induction machine
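The figure of merit here is THD, the ratio of the RMS harmonic content to the fundamental. The sketch below computes it from a sampled waveform via FFT; the synthetic test signal (fundamental plus 5th and 7th harmonics) is illustrative, not taken from the paper's simulations.

```python
# Minimal sketch: compute total harmonic distortion (THD) of a sampled waveform
# from its FFT, THD = sqrt(sum of harmonic amplitudes^2) / fundamental amplitude.
# The synthetic test signal below is illustrative.
import numpy as np

fs, f0, cycles = 10000, 50.0, 10           # sampling rate, fundamental, window
t = np.arange(0, cycles / f0, 1 / fs)
# Synthetic inverter-like waveform: fundamental plus 5th and 7th harmonics.
v = np.sin(2 * np.pi * f0 * t) + 0.05 * np.sin(2 * np.pi * 5 * f0 * t) \
    + 0.03 * np.sin(2 * np.pi * 7 * f0 * t)

spectrum = np.abs(np.fft.rfft(v))
bin0 = int(round(f0 * len(v) / fs))        # bin of the fundamental

fund = spectrum[bin0]
harmonics = [spectrum[k * bin0] for k in range(2, 20) if k * bin0 < len(spectrum)]
thd = np.sqrt(sum(h ** 2 for h in harmonics)) / fund
print(f"THD = {100 * thd:.2f} %")          # ~5.83 % for this test signal
```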
Procedia PDF Downloads 249
1611 Factors Affecting Visual Environment in Mine Lighting
Authors: N. Lakshmipathy, Ch. S. N. Murthy, M. Aruna
Abstract:
The design of lighting systems for surface mines is not an easy task because of the unique environment and work procedures encountered in mines. The primary objective of this paper is to identify the major problems encountered in mine lighting applications and to provide guidance for the solution of these problems. In surface mining, the reflectance of surrounding surfaces is one of the important factors that improve vision during the night hours, but due to the typical working conditions in mines it is very difficult to fulfill these requirements, and the orientation of the light at the work site is also a challenging task. For this reason, machine operators and other workers in a mine need to be able to orient themselves in a difficult visual environment. The haul roads keep changing in step with the mining activity. Other critical areas such as dump yards and stack yards also change over time, and it is difficult to illuminate such areas. Mining is a hazardous occupation, with workers exposed to adverse conditions; apart from the need for hard physical labor, there is exposure to stress and environmental pollutants such as dust, noise, heat, vibration, poor illumination, radiation, etc. Visibility is restricted when operating load-haul-dump and Heavy Earth Moving Machinery (HEMM) vehicles, resulting in a number of serious accidents. One of the leading causes of these accidents is the inability of the equipment operator to clearly see people, objects or hazards around the machine. Results indicate that blind spots are caused primarily by posts, the back of the operator's cab, and lights and light brackets. Carefully designed and implemented lighting systems provide mine workers with improved visibility and contribute to improved safety, productivity and morale. Properly designed lighting systems can improve visibility and safety during work in opencast mines.
Keywords: contrast, efficacy, illuminance, illumination, light, luminaire, luminance, reflectance, visibility
Procedia PDF Downloads 358
1610 Digital Architectural Practice as a Challenge for Digital Architectural Technology Elements in the Era of Digital Design
Authors: Ling Liyun
Abstract:
In the field of contemporary architecture, complex forms of architectural works continue to emerge around the world, along with new terminology: digital architecture, parametric design, algorithmic generation, building information modeling, CNC construction and so on. Architects have gradually mastered new skills of mathematical logic in form exploration, virtual simulation, and the design and coordination of the entire construction process. Digital construction technology provides a greater degree of control over construction and ensures its accuracy, creating a series of new construction techniques. As a result, the use of digital technology is an improvement and expansion of the practice of the digital architectural design revolution. We worked by reading and analyzing information about the development process of digital architecture, a large number of cases, and architectural design and construction as a whole process. Current developments are thus introduced and discussed in our paper, such as architectural discourse, design theory, digital design models and techniques, material selection, and artificial intelligence in space design. Our paper also examines three representative cases of digital design and construction experiments at length and in detail, to expound high informatization, high-reliability intelligence, and high technique in constructing a humane space that can cope with the rapid development of urbanization. We conclude that the opportunities and challenges of the shift lie in architectural paradigms, such as the cooperation methods, theories, models, technologies and techniques currently employed in digital design research and digital praxis. We also find that the innovative use of space can gradually change the way people learn, talk, and control information. Over the past two decades, digital technology has radically broken the technological constraints of industrial products and loosened the hold of any particular architectural style (the doctrine of an era). People should not have to adapt to the machine; rather, the machine should be made to work for its users.
Keywords: artificial intelligence, collaboration, digital architecture, digital design theory, material selection, space construction
Procedia PDF Downloads 136
1609 Improving Subjective Bias Detection Using Bidirectional Encoder Representations from Transformers and Bidirectional Long Short-Term Memory
Authors: Ebipatei Victoria Tunyan, T. A. Cao, Cheol Young Ock
Abstract:
Detecting subjectively biased statements is a vital task. This is because this kind of bias, when present in text or other forms of information dissemination media such as news, social media, scientific texts, and encyclopedias, can weaken trust in the information and stir conflicts amongst consumers. Subjective bias detection is also critical for many Natural Language Processing (NLP) tasks such as sentiment analysis, opinion identification, and bias neutralization. Having a system that can adequately detect subjectivity in text will boost research in the above-mentioned areas significantly. It can also come in handy for platforms like Wikipedia, where the use of neutral language is important. The goal of this work is to identify subjectively biased language in text at the sentence level. With machine learning, we can solve complex AI problems, making it a good fit for the problem of subjective bias detection. A key step in this approach is to train a classifier based on BERT (Bidirectional Encoder Representations from Transformers) as the upstream model. BERT by itself can be used as a classifier; however, in this study, we use BERT as a data preprocessor as well as an embedding generator for a Bi-LSTM (Bidirectional Long Short-Term Memory) network incorporating an attention mechanism. This approach produces a deeper and better classifier. We evaluate the effectiveness of our model using the Wiki Neutrality Corpus (WNC), a benchmark dataset compiled from Wikipedia edits that removed various biased instances from sentences, and we also use it to compare our model to existing approaches. Experimental analysis indicates improved performance, as our model achieved state-of-the-art accuracy in detecting subjective bias. This study focuses on the English language, but the model can be fine-tuned to accommodate other languages.
Keywords: subjective bias detection, machine learning, BERT–BiLSTM–Attention, text classification, natural language processing
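The following is a minimal PyTorch sketch of the BERT -> Bi-LSTM -> attention -> classifier architecture described above. The hidden sizes, the frozen-BERT choice, the pooling details, and the example sentence are assumptions, not the authors' exact configuration.

```python
# Minimal sketch of the BERT -> Bi-LSTM -> attention -> classifier architecture
# described in the abstract. Hidden sizes, pooling details, and training setup
# are assumptions, not the authors' exact configuration.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class BertBiLSTMAttention(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", lstm_hidden=128):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)   # embedding generator
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                            batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * lstm_hidden, 1)           # attention scores
        self.classifier = nn.Linear(2 * lstm_hidden, 2)    # biased vs. neutral

    def forward(self, input_ids, attention_mask):
        with torch.no_grad():                              # BERT used as a frozen preprocessor
            emb = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask).last_hidden_state
        h, _ = self.lstm(emb)                              # (batch, seq, 2*hidden)
        scores = self.att(h).masked_fill(attention_mask.unsqueeze(-1) == 0, -1e9)
        weights = torch.softmax(scores, dim=1)             # attention over tokens
        pooled = (weights * h).sum(dim=1)                  # attended sentence vector
        return self.classifier(pooled)

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertBiLSTMAttention()
batch = tok(["The senator bravely exposed the scandal."],
            return_tensors="pt", padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.softmax(dim=-1))
```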
Procedia PDF Downloads 130
1608 An Approach on Intelligent Tolerancing of Car Body Parts Based on Historical Measurement Data
Authors: Kai Warsoenke, Maik Mackiewicz
Abstract:
To achieve a high quality of assembled car body structures, tolerancing is used to ensure the geometric accuracy of the individual car body parts. There are two main techniques to determine the required tolerances. The first is tolerance analysis, which describes the influence of individually toleranced input values on a required target value. The second is tolerance synthesis, which determines the allocation of individual tolerances needed to achieve a target value. Both techniques are based on classical statistical methods, which assume certain probability distributions. To ensure competitiveness in both saturated and dynamic markets, production processes in vehicle manufacturing must be flexible and efficient. The dimensional specifications selected for the individual body components and the resulting assemblies have a major influence on the quality of the process, for example in the manufacturing of forming tools as operating equipment or at the higher level of car body assembly. As part of the metrological process monitoring, manufactured individual parts and assemblies are recorded and the measurement results are stored in databases. They serve as information for the temporary adjustment of the production processes and are interpreted by experts in order to derive suitable adjustment measures. In the production of forming tools, this means that time-consuming and costly changes to the tool surface have to be made, while in the body shop, uncertainties that are difficult to control result in cost-intensive rework. The stored measurement results are not yet used to intelligently design tolerances in future processes or to support temporary decisions based on real-world geometric data, and they offer potential to extend the tolerancing methods through data analysis and machine learning models. The purpose of this paper is to examine real-world measurement data from individual car body components, as well as assemblies, in order to develop an approach for using the data in short-term actions and future projects. For this reason, the measurement data are first analyzed descriptively in order to characterize their behavior and to determine possible correlations. Subsequently, a database is created that is suitable for developing machine learning models. The objective is to create an intelligent way to determine the position and number of measurement points as well as the local tolerance range. For this, a number of different model types are compared and evaluated. The models with the best results are used to optimize equally distributed measuring points on unknown car body part geometries and to assign tolerance ranges to them. This investigation is still in progress. However, there are areas of the car body parts that behave more sensitively than the overall part, indicating that intelligent tolerancing is useful here in order to design and control preceding and succeeding processes more efficiently.
Keywords: automotive production, machine learning, process optimization, smart tolerancing
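The paper compares several (unpublished) model types; as a hedged illustration of the underlying idea, the sketch below learns local deviation behavior from historical measurements and derives a data-driven tolerance band per measurement point. The features, the random-forest model, and the ±3σ band are assumptions.

```python
# Minimal sketch of deriving data-driven local tolerance ranges from historical
# measurement data. The features, the random-forest model, and the +/-3-sigma
# band are assumptions; the paper compares several (unpublished) model types.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 2000
# Hypothetical features per measurement point: x/y/z position on the part and a
# process parameter (e.g., press force); target is the measured deviation (mm).
X = rng.uniform(-1, 1, size=(n, 4))
y = 0.2 * X[:, 0] ** 2 + 0.05 * X[:, 3] + rng.normal(0, 0.03, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# For a new measurement point, predict the expected deviation and set a local
# tolerance band from the residual scatter of the training data.
residual_sigma = np.std(y - model.predict(X))
point = np.array([[0.3, -0.5, 0.1, 0.2]])
expected = model.predict(point)[0]
print(f"expected deviation {expected:+.3f} mm, tolerance band "
      f"{expected - 3*residual_sigma:+.3f} .. {expected + 3*residual_sigma:+.3f} mm")
```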
Procedia PDF Downloads 116