Search results for: automated teller machines (atm)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1584

264 Machine Learning in Agriculture: A Brief Review

Authors: Aishi Kundu, Elhan Raza

Abstract:

"Necessity is the mother of invention" - Rapid increase in the global human population has directed the agricultural domain toward machine learning. The basic need of human beings is considered to be food which can be satisfied through farming. Farming is one of the major revenue generators for the Indian economy. Agriculture is not only considered a source of employment but also fulfils humans’ basic needs. So, agriculture is considered to be the source of employment and a pillar of the economy in developing countries like India. This paper provides a brief review of the progress made in implementing Machine Learning in the agricultural sector. Accurate predictions are necessary at the right time to boost production and to aid the timely and systematic distribution of agricultural commodities to make their availability in the market faster and more effective. This paper includes a thorough analysis of various machine learning algorithms applied in different aspects of agriculture (crop management, soil management, water management, yield tracking, livestock management, etc.).Due to climate changes, crop production is affected. Machine learning can analyse the changing patterns and come up with a suitable approach to minimize loss and maximize yield. Machine Learning algorithms/ models (regression, support vector machines, bayesian models, artificial neural networks, decision trees, etc.) are used in smart agriculture to analyze and predict specific outcomes which can be vital in increasing the productivity of the Agricultural Food Industry. It is to demonstrate vividly agricultural works under machine learning to sensor data. Machine Learning is the ongoing technology benefitting farmers to improve gains in agriculture and minimize losses. This paper discusses how the irrigation and farming management systems evolve in real-time efficiently. Artificial Intelligence (AI) enabled programs to emerge with rich apprehension for the support of farmers with an immense examination of data.

Keywords: machine learning, artificial intelligence, crop management, precision farming, smart farming, pre-harvesting, harvesting, post-harvesting

Procedia PDF Downloads 105
263 Role of Imaging in Alzheimer's Disease Trials: Impact on Trial Planning, Patient Recruitment and Retention

Authors: Kohkan Shamsi

Abstract:

Background: MRI and PET are now extensively utilized in Alzheimer's disease (AD) trials for patient eligibility, efficacy assessment, and safety evaluations, but including imaging in AD trials affects the site selection process, patient recruitment, and patient retention. Methods: PET/MRI is performed at baseline and at multiple follow-up timepoints. This requires prospective site imaging qualification, evaluation of phantom data, training, and continuous monitoring of machines so that standardized and consistent data are acquired. It also requires prospective patient/caregiver training, as patients must travel to multiple facilities for imaging examinations. We will share our experience from one of the largest AD programs. Lessons learned: Many neurological diseases have a presentation similar to AD or could confound the assessment of drug therapy. The inclusion of the wrong patients has ethical and legal implications, and their data could be excluded from the analysis. The centralized eligibility evaluation read process will be discussed. Amyloid-related imaging abnormalities (ARIA) were observed in amyloid-β trials, and the FDA recommended regular monitoring for ARIA. Our experience of ARIA evaluations in a large phase III study at more than 350 sites will be presented. Efficacy evaluation: MRI is used to evaluate various brain volumes, and FDG PET or amyloid PET agents have been used in AD trials; we will share our experience with site and central independent reads. Imaging logistics that need to be handled in the planning phase will also be discussed, as they can affect patient compliance, thereby increasing missing data and affecting study results. Conclusion: Imaging must be prospectively planned, including standardized imaging methodologies, the site selection process, and the selection of assessment criteria. Training should be transparently conducted and documented. Prospective patient/caregiver awareness of imaging requirements is essential for patient compliance and for reducing missing imaging data.

Keywords: Alzheimer's disease, ARIA, MRI, PET, patient recruitment, retention

Procedia PDF Downloads 115
262 Principal Component Analysis Combined Machine Learning Techniques on Pharmaceutical Samples by Laser Induced Breakdown Spectroscopy

Authors: Kemal Efe Eseller, Göktuğ Yazici

Abstract:

Laser-induced breakdown spectroscopy (LIBS) is a rapid optical atomic emission spectroscopy technique used for material identification and analysis, with the advantages of in-situ analysis, elimination of intensive sample preparation, and micro-destructive sampling of the material under test. LIBS delivers short laser pulses onto the material in order to create a plasma by exciting the material above a certain threshold. The plasma characteristics, which consist of wavelength values and intensity amplitudes, depend on the material and on the experimental environment. In the present work, the spectral profiles of medicine samples were obtained via LIBS. The datasets include two different concentrations of each of two paracetamol-based medicines, Aferin and Parafon. The spectra were preprocessed by filling outliers based on quartiles, smoothing to eliminate noise, and normalizing both the wavelength and intensity axes. Statistical information was obtained, and principal component analysis (PCA) was applied to both the preprocessed and raw datasets. Machine learning models were built on two different train-test splits, 70% training with 30% test and 80% training with 20% test. Cross-validation was used to protect the models against overfitting, because the sample count is small. The machine learning results on the preprocessed and raw datasets were compared for both splits. This is the first time that all of the supervised machine learning classification algorithms considered (decision trees, discriminant analysis, naïve Bayes, support vector machines (SVM), k-nearest neighbor (k-NN), ensemble learning, and neural networks) have been applied to LIBS data of paracetamol-based pharmaceutical samples at different concentrations, on both the preprocessed and raw datasets, in order to observe the effect of preprocessing.
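
The following is a minimal sketch, not the authors' code, of the pipeline shape the abstract describes: normalisation, PCA, and cross-validated supervised classifiers on a small sample. The synthetic spectra, class structure (two medicines at two concentrations), and parameter choices are assumptions for illustration.

```python
# Minimal sketch under stated assumptions: synthetic "spectra" stand in for
# the LIBS measurements, and the four class labels are illustrative.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n_per_class, n_wavelengths = 15, 500
X, y = [], []
for label in range(4):  # e.g. two medicines x two concentrations (assumed)
    base = rng.normal(0, 1, n_wavelengths) + label  # class-specific offset
    X.append(base + rng.normal(0, 0.8, (n_per_class, n_wavelengths)))
    y.append(np.full(n_per_class, label))
X, y = np.vstack(X), np.concatenate(y)

classifiers = {
    "SVM": SVC(),
    "naive Bayes": GaussianNB(),
    "k-NN": KNeighborsClassifier(n_neighbors=3),
    "decision tree": DecisionTreeClassifier(random_state=0),
}
for name, clf in classifiers.items():
    # Normalise, reduce with PCA, then cross-validate the classifier.
    pipe = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```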

Keywords: machine learning, laser-induced breakdown spectroscopy, medicines, principal component analysis, preprocessing

Procedia PDF Downloads 87
261 A Real Time Set Up for Retrieval of Emotional States from Human Neural Responses

Authors: Rashima Mahajan, Dipali Bansal, Shweta Singh

Abstract:

Real-time, non-invasive brain-computer interfaces have a significant and progressive role in restoring or maintaining quality of life for medically challenged people. This manuscript provides a comprehensive review of emerging research in the field of cognitive/affective computing in the context of human neural responses. The perspectives of different emotion assessment modalities, such as facial expressions, speech, text, gestures, and human physiological responses, are also discussed. Particular focus is placed on the ability of EEG (electroencephalogram) signals to portray thoughts, feelings, and unspoken words. An automated, workflow-based protocol is proposed for designing an EEG-based real-time brain-computer interface system for the analysis and classification of human emotions elicited by external audio/visual stimuli. The front-end hardware includes a cost-effective and portable Emotive EEG neuroheadset unit, a personal computer, and a set of external stimulators. Primary signal analysis and processing of the EEG acquired in real time shall be performed using the MATLAB-based advanced brain mapping toolboxes EEGLab/BCILab. This shall be followed by the development of a self-defined MATLAB algorithm to capture and characterize the temporal and spectral variations in EEG under emotional stimulation. The extracted hybrid feature set shall be used to classify emotional states using artificial intelligence tools such as artificial neural networks. The final system would result in an inexpensive, portable, and more intuitive real-time brain-computer interface for controlling prosthetic devices by translating different brain states into operative control signals.
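
The proposed workflow is MATLAB/EEGLab-based; the sketch below is only a conceptual Python illustration of the general idea of feeding spectral (band-power) features to a neural network classifier. The sampling rate, frequency bands, emotion labels, and synthetic signals are assumptions, not part of the proposed system.

```python
# Conceptual sketch (not the proposed MATLAB/EEGLab workflow): extract simple
# band-power features from synthetic "EEG" epochs and classify the elicited
# emotion with a small neural network. All parameters are assumed values.
import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

fs = 128  # assumed sampling rate (Hz)
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch):
    """Return mean power in each frequency band for one single-channel epoch."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands.values()]

rng = np.random.default_rng(2)
epochs, labels = [], []
for emotion in (0, 1):  # e.g. calm vs. excited, purely illustrative
    for _ in range(60):
        t = np.arange(2 * fs) / fs
        dominant = 10 if emotion == 0 else 20  # stronger alpha vs. beta rhythm
        sig = np.sin(2 * np.pi * dominant * t) + rng.normal(0, 1.0, t.size)
        epochs.append(band_powers(sig))
        labels.append(emotion)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(epochs), np.array(labels), test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```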

Keywords: brain computer interface, electroencephalogram, EEGLab, BCILab, emotive, emotions, interval features, spectral features, artificial neural network, control applications

Procedia PDF Downloads 317
260 The Practice of Low Flow Anesthesia to Reduce Carbon Footprints Sustainability Project

Authors: Ahmed Eid, Amita Gupta

Abstract:

Background: Medical gases are estimated to contribute 5% of the carbon footprint produced by hospitals; desflurane has the largest impact, but the impact of all volatile agents increases significantly when they are used with an N2O admixture. Under the Climate Change Act 2008, we must reduce our carbon emissions by 80% of the 1990 baseline by 2050; NHS carbon emissions were reduced by 18.5% between 2007 and 2017. The NHS Long Term Plan has outlined measures to achieve this objective, including a 2% reduction by transforming anaesthetic practices. Fresh gas flow (FGF) is an important variable that determines the consumption of inhalational agents and can be tightly controlled by the anaesthetist. Aims and objectives: Environmental safety; identification of areas of high N2O and anaesthetic agent use across the St Helier operating theatres; and improvement of current practice. Methods: Data were collected from St Helier operating theatres and retrieved daily from Care Station 650 anaesthetic machines; 60 cases were included in the sample. The collected data comprised the average flow rate, the amount and type of agent used, the type and duration of surgery, and the total amounts of air, O2 and N2O used. The AAGBI impact of anaesthesia calculator was used to determine the amount of CO2e produced and the cost per hour for every patient. Reminder emails to staff emphasized the significance of low-flow anaesthesia, departmental meeting presentations aimed to heighten awareness of LFA, and AAGBI calculator QR codes distributed in all theatres enabled the calculation of volatile anaesthetic consumption and CO2e after each case, facilitating informed environmental impact assessment. Results: A significant reduction in flow rates was observed in the second sample; flow rates between 0 and 1 L were used in 60% of cases, which represents a large reduction in the consumption of volatile anaesthetics and in CO2e. By using LFA we can save money, but most importantly we can make our practice much greener and help save the planet.

Keywords: low flow anesthesia, sustainability project, N₂O, CO₂e

Procedia PDF Downloads 68
259 Influence of Machine Resistance Training on Selected Strength Variables among Two Categories of Body Composition

Authors: Hassan Almoslim

Abstract:

Background: Machine resistance training is exercise that uses equipment as the load to strengthen and condition the musculoskeletal system and improve muscle tone. Machine resistance training is easy to use, allows the individual to train with heavier weights without assistance, and is useful for beginners, elderly populations, and targeting specific muscle groups. Purpose: The purpose of this study was to examine the impact of nine weeks of machine resistance training on maximum strength in lean and normal-weight male college students. Method: Thirty-six male college students aged between 19 and 21 years from King Fahd University of Petroleum & Minerals participated in the study. The subjects were divided into two equal groups, a lean group (LG, n = 18) and a normal-weight group (NWG, n = 18). Subjects with a body mass index (BMI) below 18.5 kg/m² were considered lean, and those between 18.5 and 24.9 kg/m² normal weight. Both groups performed machine resistance training for nine weeks, twice per week, for 40 min per training session. The strength measurements (chest press, leg press, and abdomen exercises) were performed before and after the training period, and a 1RM test was used to determine the maximum strength of all subjects. The training program consisted of several resistance machines: leg press, abdomen, chest press, pulldown, seated row, calf raises, leg extension, leg curls, and back extension. The data were analyzed using independent t-tests (to compare mean differences) and paired t-tests, with the level of significance set at 0.05. Results: No change (P > 0.05) was observed in any body composition variable between groups after training. In the chest press, the NWG recorded a significantly greater mean difference than the LG (19.33 ± 7.78 vs. 13.88 ± 5.77 kg, P < 0.023). In the leg press and abdomen exercises, both groups showed similar mean differences (P > 0.05). When the post-test was compared with the pre-test, the NWG showed significant increases in the chest press of 47% (from 41.16 ± 12.41 to 60.49 ± 11.58 kg, P < 0.001), the abdomen of 34% (from 45.46 ± 6.97 to 61.06 ± 6.45 kg, P < 0.001), and the leg press of 23.6% (from 85.27 ± 15.94 to 105.48 ± 21.59 kg, P < 0.001). The LG also showed significant increases of 42.6% in the chest press (from 32.58 ± 7.36 to 46.47 ± 8.93 kg, P < 0.001), 28.5% in the abdomen (from 38.50 ± 7.84 to 49.50 ± 7.88 kg, P < 0.001), and 30.8% in the leg press (from 70.2 ± 20.57 to 92.01 ± 22.83 kg, P < 0.001). Conclusion: Both lean and normal-weight male college students can benefit remarkably from a machine resistance training program.

Keywords: body composition, lean, machine resistance training, normal weight

Procedia PDF Downloads 356
258 Machine That Provides Mineral Fertilizer Equal to the Soil on the Slopes

Authors: Huseyn Nuraddin Qurbanov

Abstract:

A reliable food supply for the population of the republic is one of the main directions of the state's economic policy. Grain growing, which is the basis of agriculture, is important in this area. In the cultivation of cereals on slopes, the application of equal amounts of mineral fertilizer under the soil before sowing is a very important technological process. The low level of technical equipment in this area prevents producers from providing the country with the necessary quality cereals. Experience in the operation of modern technical means has shown that there is currently a need to apply an equal amount of fertilizer under the soil on slopes while fully meeting the agro-technical requirements. No fundamental changes have been made to the industrial machines that apply fertilizer under the soil, so fertilizer has been applied unevenly under the soil on slopes. This leads to the destruction of new seedlings and to reduced productivity, because plants sown in the autumn cannot tolerate frost during the winter. In particular climatic conditions, there is an optimal fertilization rate for each agricultural product, and applying fertilizer into the soil is one of the conditions that increases its efficiency in the field. It is therefore very important to develop a new technical proposal for fertilizing and ploughing slopes in equal amounts, improving the technological and design parameters and taking into account the physical and mechanical properties of fertilizers. Taking the above issues into account, a combined plough was developed in our laboratory. The combined plough carries out the pre-sowing technological operation in the cultivation of cereals, providing a smooth, equal amount of mineral fertilizer under the soil on slopes. Mathematical models of a spreader that distributes fertilizer evenly in the field have been developed, and distribution diagrams and graphs over the eight sections of the spreader are constructed for the inclination angles of the slopes. The percentage and productivity of equal distribution in the field were determined by practical and theoretical analysis.

Keywords: combined plough, mineral fertilizer, equal sowing, fertilizer norm, grain-crops, sowing fertilizer

Procedia PDF Downloads 138
257 Development and Validation of Cylindrical Linear Oscillating Generator

Authors: Sungin Jeong

Abstract:

This paper presents a cylindrical linear oscillating generator for hybrid electric vehicle applications. The focus of the study is the suggestion of an optimal model and design rules for a cylindrical linear oscillating generator with permanent magnets in the back-iron translator. The cylindrical topology is first modelled using an equivalent magnetic circuit that takes leakage elements into account. This topology with permanent magnets in the back-iron translator is described by the number of phases and the stroke displacement. For a more accurate analysis of an oscillating machine, the thrust of the single-phase and three-phase systems is compared while the translator is moved one pole pitch forward and backward. Through this analysis and comparison, a single-phase system with the cylindrical topology is selected as the optimal topology. Finally, the detailed design of the optimal topology takes magnetic saturation effects into account through finite element analysis. The losses are also examined to obtain more accurate results: copper loss in the conductors of the machine windings, eddy-current loss in the permanent magnets, and iron loss in the electrical steel. Consideration of thermal performance and mechanical robustness is essential, because the losses and the high temperatures generated in each region of the generator affect the overall efficiency and the insulation of the machine. In addition, an electric machine with linear oscillating movement requires a support system that can resist dynamic forces and mechanical masses. Accordingly, a fatigue analysis of the shaft is carried out using the kinetic equations, and the thermal characteristics are analysed as a function of the operating frequency in each region. The results of this study provide important design rules for linear oscillating machines and enable more accurate machine design and more accurate prediction of machine performance.

Keywords: equivalent magnetic circuit, finite element analysis, hybrid electric vehicle, linear oscillating generator

Procedia PDF Downloads 195
256 Dematerialized Beings in Katherine Dunn's Geek Love: A Corporeal and Ethical Study under Posthumanities

Authors: Anum Javed

Abstract:

This study identifies the dynamic image of the human body as it continues its metamorphosis in the virtual field of reality. It calls attention to the ways in which humans begin co-evolving with other life forms, technology in particular, and strive to establish a realm outside the physical framework of matter. The problem exceeds the area of technological ethics by explicably and explanatorily entering the space of literary texts and criticism. A textual analysis of Geek Love (1989) by Katherine Dunn is combined with the posthumanist perspectives of Pramod K. Nayar to trace psycho-somatic changes in man's nature of being. It uncovers the meaning people give to their experiences in this budding social and cultural phenomenon of material representation, tied up with personal practices and technological innovations. It also offers an ethical, physical, and psychological reassessment of man within the context of technological evolution. The study indicates the elements that have rendered morphological freedom and new materialism in man's consciousness. Moreover, this work asks what it means to be human in this time of accelerating change, where surgeries, implants, extensions, cloning, and robotics have shaped a new sense of being. It attempts to go beyond the individual's body image and explores how objectifying media and culture have influenced people's judgement of others on new material grounds. It further argues for a decentring of the glorified image of man as an independent entity because of his energetic partnership with intelligent machines and external agents. The history and projected future progress of technology are also discussed. The methodology adopted is a posthumanist, techno-ethical textual analysis. This work necessitates a negotiating relationship between man and technology in order to achieve a harmonic, balanced, interconnected existence. The study concludes by recommending that an ethical set of codes be cultivated for techno-human habituation. Posthumanism ushers in a strong need to adopt new ethics within the terminology of neo-materialist humanism.

Keywords: corporeality, dematerialism, human ethos, posthumanism

Procedia PDF Downloads 147
255 Species Distribution and Incidence of Inducible Clindamycin Resistance in Coagulase-Negative Staphylococci Isolated from Blood Cultures of Patients with True Bacteremia in Turkey

Authors: Fatma Koksal Cakirlar, Murat Gunaydin, Nevri̇ye Gonullu, Nuri Kiraz

Abstract:

During the last few decades, the increasing prevalence of methicillin-resistant CoNS isolates has become a common problem worldwide. Macrolide-lincosamide-streptogramin B (MLSB) antibiotics are used effectively for the treatment of CoNS infections; however, resistance to MLSB antibiotics is prevalent among staphylococci. The aim of this study is to determine the species distribution and the incidence of inducible clindamycin resistance in CoNS isolates causing nosocomial bacteremia in our hospital. Between January 2014 and October 2015, a total of 484 CoNS isolates were obtained from blood samples of patients with true bacteremia who were hospitalized in intensive care units and in other departments of Istanbul University Cerrahpasa Medical Hospital. Blood cultures were analyzed with the BACTEC 9120 system (Becton Dickinson, USA). The identification and antimicrobial resistance of isolates were determined with the Phoenix automated system (BD Diagnostic Systems, Sparks, MD). Inducible clindamycin resistance was detected using the D-test. The species distribution was as follows: Staphylococcus epidermidis 211 (43%), S. hominis 154 (32%), S. haemolyticus 69 (14%), S. capitis 28 (6%), S. saprophyticus 11 (2%), S. warnerii 7 (1%), S. schleiferi 5 (1%), and S. lugdunensis 1 (0.2%). Resistance to methicillin was detected in 74.6% of the CoNS isolates and was highest in S. haemolyticus isolates (89%). The resistance rates of the CoNS strains to the antibacterial agents were as follows: ampicillin 77%, gentamicin 20%, erythromycin 71%, clindamycin 22%, trimethoprim-sulfamethoxazole 45%, ciprofloxacin 52%, tetracycline 34%, rifampicin 20%, daptomycin 0.2%, and linezolid 0.2%. None of the strains were resistant to vancomycin or teicoplanin. Fifteen (3%) CoNS isolates were D-test positive, showing the inducible MLSB resistance type (iMLSB phenotype), 94 (19%) were constitutively resistant (cMLSB phenotype), and 237 (46.76%) were D-test negative, indicating a truly clindamycin-susceptible MS phenotype (M-phenotype resistance). The incidence of the iMLSB phenotype was higher in S. epidermidis isolates (4.7%) than in other CoNS isolates.

Keywords: bacteremia, inducible MLSB resistance phenotype, methicillin-resistant, staphylococci

Procedia PDF Downloads 239
254 A Concept for Flexible Battery Cell Manufacturing from Low to Medium Volumes

Authors: Tim Giesen, Raphael Adamietz, Pablo Mayer, Philipp Stiefel, Patrick Alle, Dirk Schlenker

Abstract:

The competitiveness and success of new electrical energy storage devices such as battery cells depend significantly on a short time-to-market. Producers who decide to supply new battery cells to the market need to be easily adaptable in manufacturing with respect to early customers' needs in terms of cell size, materials, delivery time, and quantity. In the initial phase, the required output rates allow neither a fully automated manufacturing line nor the supply of handmade battery cells, yet there has been no solution for manufacturing battery cells in low to medium volumes in a reproducible way. Thus, a concept for the flexible assembly of battery cells, in terms of cell format and output quantity, was developed by the Fraunhofer Institute for Manufacturing Engineering and Automation. Based on clustered processes, the modular system platform can be modified, enlarged, or retrofitted in a short time frame according to the ordered product. The paper presents the analysis of the production steps of a conventional battery cell assembly line. Process solutions were found by using I/O analysis, functional structures, and morphological boxes. The identified elementary functions were subsequently clustered by functional coherence into automation solutions, and thus the individual process clusters were generated. The result presented in this paper makes it possible to manufacture different cell products on the same production system using seven process clusters. The paper shows the solution for a batch-wise flexible battery cell production using advanced process control. Further, the tests performed and the benefits of using the process clusters as cyber-physical systems for an integrated production and value chain are discussed. The solution lowers the hurdles for SMEs to launch innovative cell products on the global market.

Keywords: automation, battery production, carrier, advanced process control, cyber-physical system

Procedia PDF Downloads 338
253 Three Issues for Integrating Artificial Intelligence into Legal Reasoning

Authors: Fausto Morais

Abstract:

Artificial intelligence has been widely used in law. Programs are able to classify suits, identify decision-making patterns, predict outcomes, and formalize legal arguments. In Brazil, the artificial intelligence program Victor has been classifying cases according to the Supreme Court's standards. When these programs perform such tasks, they simulate a kind of legal decision and legal argument, raising doubts about how artificial intelligence can be integrated into legal reasoning. Taking this into account, the following three issues are identified: the problem of hypernormatization, the argument of legal anthropocentrism, and artificial legal principles. Hypernormatization can be seen in the Brazilian legal context in the Supreme Court's usage of the Victor program. This program generated efficiency and consistency; on the other hand, there is a real risk of over-standardizing factual and normative legal features. Legal clerks and programmers should therefore work together to develop an adequate way to model legal language into computational code. If this is possible, intelligent programs may enact legal decisions automatically in easy cases, and in this picture the legal anthropocentrism argument takes place. This argument holds that only human beings should enact legal decisions, because human beings have a conscience, free will, and self-unity. In spite of that, it is possible to argue against the anthropocentrism argument and to show how intelligent programs may work around human limitations such as misleading cognition, emotions, and lack of memory. In this way, intelligent machines could pass legal decisions automatically by classification, as Victor does in Brazil, because they are bound by legal patterns and should not deviate from them. Notwithstanding, artificial intelligence programs can be helpful beyond easy cases. In hard cases, they are able to identify legal standards and legal arguments by using machine learning. For that, a dataset of legal decisions regarding a particular matter must be available, which is a reality in the Brazilian Judiciary. With such a procedure, artificial intelligence programs can support a human decision in hard cases, providing legal standards and arguments based on empirical evidence. These legal features claim an argumentative weight in legal reasoning and should serve as references for judges when they must decide whether to maintain or overcome a legal standard.

Keywords: artificial intelligence, artificial legal principles, hypernormatization, legal anthropocentrism argument, legal reasoning

Procedia PDF Downloads 145
252 Matrix-Based Linear Analysis of Switched Reluctance Generator with Optimum Pole Angles Determination

Authors: Walid A. M. Ghoneim, Hamdy A. Ashour, Asmaa E. Abdo

Abstract:

In this paper, a linear analysis of a Switched Reluctance Generator (SRG) model is applied to the most common configurations (4/2, 6/4 and 8/6) for both conventional short-pitched and fully-pitched designs, in order to determine the optimum stator/rotor pole angles at which the maximum output voltage is generated per unit excitation current. This study is focused on SRG analysis and design as a proposed solution for renewable energy applications such as wind energy conversion systems. The world's potential to develop renewable energy technologies through dedicated scientific research was the motive behind this study, owing to its positive impact on economy and environment. In addition, the problem with rare-earth metals (permanent magnets), caused by mining limitations, export bans by top producers, and environmental restrictions, leads to the unavailability of materials used for manufacturing rotating machines. This challenge gave the authors the opportunity to study, analyze, and determine the optimum design of the SRG, which has the benefit of being free from permanent magnets and rotor windings, offers a flexible control system, and is compatible with any application that requires variable-speed operation. In addition, the SRG has been proved to be very efficient and reliable in both low-speed and high-speed applications. Linear analysis was performed using MATLAB simulations based on the modified generalized matrix approach for the Switched Reluctance Machine (SRM). About 90 different combinations of pole angles and excitation patterns were simulated in this study, and the optimum output results for each case were recorded and presented in detail. This procedure has been proved to be applicable to any SRG configuration, dimension, and excitation pattern. The results of this study provide evidence for using the 4-phase 8/6 fully-pitched SRG as the optimum configuration for the same machine dimensions at the same angular speed.

Keywords: generalized matrix approach, linear analysis, renewable applications, switched reluctance generator

Procedia PDF Downloads 198
251 Segmenting 3D Optical Coherence Tomography Images Using a Kalman Filter

Authors: Deniz Guven, Wil Ward, Jinming Duan, Li Bai

Abstract:

Over the past two decades or so, optical coherence tomography (OCT) has been used to diagnose retina and optic nerve diseases. The retinal nerve fibre layer, for example, is a powerful diagnostic marker for detecting and staging glaucoma. With the advances in optical imaging hardware, the adoption of OCT is now commonplace in clinics. More and more OCT images are being generated, and for these OCT images to have clinical applicability, accurate automated OCT image segmentation software is needed. OCT image segmentation is still an active research area, as OCT images are inherently noisy, with multiplicative speckle noise. Simple edge detection algorithms are unsuitable for detecting retinal layer boundaries in OCT images, and intensity fluctuation, motion artefacts, and the presence of blood vessels further decrease OCT image quality. In this paper, we introduce a new method for segmenting three-dimensional (3D) OCT images. It involves the use of a Kalman filter, which is commonly used in computer vision for object tracking. The Kalman filter is applied to the 3D OCT image volume to track the retinal layer boundaries through the slices within the volume and thus segment the 3D image. Specifically, after some pre-processing of the OCT images, points on the retinal layer boundaries in the first image are identified, and curve fitting is applied to them so that the layer boundaries can be represented by the coefficients of the curve equations. These coefficients then form the state space for the Kalman filter. The filter then produces an optimal estimate of the current state of the system by updating its previous state using the available measurements, in the form of a feedback control loop. The results show that the algorithm can be used to segment the retinal layers in OCT images. One limitation of the current algorithm is that the curve representation of the retinal layer boundary does not work well when the layer boundary splits into two, e.g., at the optic nerve. This may be resolved by using a different approach to representing the boundaries, such as B-splines or level sets. The use of a Kalman filter shows promise for developing accurate and effective 3D OCT segmentation methods.
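
A minimal sketch of the tracking idea described above, assuming a quadratic curve model for the layer boundary and a constant-state transition: the fitted curve coefficients of each slice serve as the Kalman filter measurement, and the filter propagates the boundary estimate from slice to slice. The noise covariances and the synthetic boundary data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch: a retinal layer boundary in each OCT slice is
# represented by the coefficients of a fitted quadratic, and a simple Kalman
# filter with a constant-state model propagates those coefficients.
deg = 2                      # quadratic boundary model (assumption)
n_state = deg + 1
F = np.eye(n_state)          # state transition: boundary changes slowly
H = np.eye(n_state)          # we "measure" fitted coefficients directly
Q = 1e-4 * np.eye(n_state)   # process noise (tuning assumption)
R = 1e-2 * np.eye(n_state)   # measurement noise (tuning assumption)

x = np.zeros(n_state)        # state: polynomial coefficients
P = np.eye(n_state)

rng = np.random.default_rng(3)
cols = np.linspace(-1, 1, 64)
true_coeffs = np.array([0.2, 0.05, 30.0])   # synthetic ground truth

for slice_idx in range(40):                 # loop over B-scans in the volume
    # Noisy boundary points detected in this slice (synthetic stand-in for
    # the pre-processing / boundary detection step).
    rows = np.polyval(true_coeffs, cols) + rng.normal(0, 0.8, cols.size)
    z = np.polyfit(cols, rows, deg)         # measurement: fitted coefficients

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update (feedback correction with the new measurement)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(n_state) - K @ H) @ P

print("smoothed boundary coefficients:", np.round(x, 3))
```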

Keywords: optical coherence tomography, image segmentation, Kalman filter, object tracking

Procedia PDF Downloads 482
250 Transient Freshwater-Saltwater Transition-Zone Dynamics in Heterogeneous Coastal Aquifers

Authors: Antoifi Abdoulhalik, Ashraf Ahmed

Abstract:

The ever-growing threat of saltwater intrusion (SWI) has prompted the need to further advance the understanding of the underlying processes for effective water resource management. While research efforts have mainly been focused on steady-state analysis, studies of the transience of the saltwater intrusion mechanism remain very scarce, and studies considering transient SWI in heterogeneous media are, to our knowledge, nonexistent. This study provides for the first time a quantitative analysis of the effect of both inland and coastal water level changes on the transition zone under transient conditions in layered coastal aquifers. In all, two sets of experiments covering five cases were completed: a homogeneous case and four layered cases. Cases LH and HL were two bi-layered scenarios in which a low-K layer was set at the top and at the bottom, respectively; cases HLH and LHL were two stratified aquifers with high K-low K-high K and low K-high K-low K patterns, respectively. An automated image analysis technique was used to quantify the main SWI parameters at high spatial and temporal resolution. The findings of this study provide invaluable insight into the underlying processes responsible for transition-zone dynamics in coastal aquifers. The results show that, in all the investigated cases, the width of the transition zone remains almost unchanged throughout the saltwater intrusion process, regardless of where the boundary change occurs. However, the results demonstrate that the width of the transition zone increases considerably during the retreat, with the largest amplitude observed in cases LH and LHL, where a low-K layer was set at the top of the system. In all the scenarios, the amplitude of widening was slightly smaller when the retreat was prompted by an instantaneous drop of the saltwater level than when it was caused by an inland freshwater rise, despite equivalent absolute head-change magnitudes. Larger head changes caused significantly greater widening during the saltwater wedge retreat, while having no impact during the intrusion phase.

Keywords: freshwater-saltwater transition-zone dynamics, heterogeneous coastal aquifers, laboratory experiments, transient seawater intrusion

Procedia PDF Downloads 241
249 Improved Computational Efficiency of Machine Learning Algorithm Based on Evaluation Metrics to Control the Spread of Coronavirus in the UK

Authors: Swathi Ganesan, Nalinda Somasiri, Rebecca Jeyavadhanam, Gayathri Karthick

Abstract:

The COVID-19 crisis presents a substantial and critical hazard to worldwide health. Since the disease appeared in the UK in late January 2020, the number of people confirmed to have acquired the illness has increased tremendously across the country, and the number of individuals affected is undoubtedly considerably high. The purpose of this research is to develop a predictive machine learning model that can forecast COVID-19 cases within the UK. The study concentrates on statistical data collected from 31st January 2020 to 31st March 2021 in the United Kingdom. Information on total registered COVID cases, daily new cases, total registered deaths, and daily deaths due to coronavirus was collected from the World Health Organisation (WHO). Data preprocessing was carried out to identify any missing values, outliers, or anomalies in the dataset. The data were split in an 8:2 ratio for training and testing in order to forecast future new COVID cases. Support vector machine (SVM), random forest, and linear regression algorithms were chosen to study the models' performance in predicting new COVID-19 cases. The statistical performance of the models was evaluated using metrics such as the R-squared value and the mean squared error. Random forest outperformed the other two machine learning algorithms, with a training accuracy of 99.47% and a testing accuracy of 98.26% when n = 30. The mean squared error obtained for random forest is 4.05e11, which is lower than that of the other predictive models used in this study. From the experimental analysis, the random forest algorithm performs more effectively and efficiently in predicting new COVID cases, which could help the health sector take relevant control measures against the spread of the virus.
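
A small sketch of the evaluation procedure described above, using a synthetic daily case series in place of the WHO data (which is not reproduced here); the lag-feature construction and model hyperparameters are illustrative assumptions rather than the paper's exact setup.

```python
# Sketch under stated assumptions: 8:2 train/test split, three regressors,
# and R^2 / MSE evaluation, applied to a synthetic daily case series.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(4)
days = np.arange(425)                      # roughly Jan 2020 - Mar 2021
cases = 2000 + 1500 * np.sin(days / 40) + rng.normal(0, 150, days.size)

# Predict today's new cases from the previous 7 days (assumed lag features).
lags = 7
X = np.array([cases[i - lags:i] for i in range(lags, days.size)])
y = cases[lags:]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=False)    # 8:2 split as in the study

for name, model in {
    "random forest": RandomForestRegressor(n_estimators=30, random_state=0),
    "linear regression": LinearRegression(),
    "SVM": SVR(),
}.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name}: R^2 = {r2_score(y_test, pred):.3f}, "
          f"MSE = {mean_squared_error(y_test, pred):.1f}")
```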

Keywords: COVID-19, machine learning, supervised learning, unsupervised learning, linear regression, support vector machine, random forest

Procedia PDF Downloads 121
248 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016

Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi

Abstract:

This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or coming from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subjects' rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, sometimes producing ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects' rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is attached to the scope of ‘consent’ as a legal basis for the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.

Keywords: big health data, data subject rights, GDPR, pandemic

Procedia PDF Downloads 129
247 Career Guidance System Using Machine Learning

Authors: Mane Darbinyan, Lusine Hayrapetyan, Elen Matevosyan

Abstract:

Artificial Intelligence in Education (AIED) was created to help students get ready for the workforce, and over the past 25 years it has grown significantly, offering a variety of technologies to support academic, institutional, and administrative services. However, this remains challenging, especially considering the labor market's rapid change. While choosing a career, people face various obstacles because they do not take their own preferences into consideration, which can lead to many other problems such as shifting jobs, work stress, occupational infirmity, reduced productivity, and manual error. Besides preferences, people should properly evaluate their technical and non-technical skills, as well as their personalities. Professional counseling has become a difficult undertaking for counselors due to the wide range of career choices brought on by changing technological trends. It is necessary to close this gap by utilizing technology that makes sophisticated predictions about a person's career goals based on their personality. Hence, there is a need to create an automated model that helps in decision-making based on user inputs. Improving career guidance can be achieved by embedding machine learning into the career consulting ecosystem. Various career guidance systems work on the same logic: classifying applicants, matching applications with appropriate departments or jobs, making predictions, and providing suitable recommendations. Methodologies such as k-NN, neural networks, k-means clustering, decision trees, and many other advanced algorithms are applied to the collected data to predict suitable careers. Besides helping users with their career choice, these systems provide numerous opportunities that are very useful while making this hard decision. They help candidates recognize where they specifically lack sufficient skills so that they can improve them. They are also capable of offering an e-learning platform that takes the user's lack of knowledge into account. Furthermore, users can be provided with details on a particular job, such as the abilities required to excel in that industry.
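
A minimal, hypothetical illustration of the k-NN idea mentioned above: user skill vectors are matched to the nearest reference career profile, and a simple skill-gap report is produced. The skill names, career labels, and profiles are invented for demonstration and do not come from the described system.

```python
# Illustrative sketch only: skill features, careers and profiles are invented.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

skills = ["math", "programming", "communication", "design", "writing"]
profiles = np.array([
    [0.9, 0.9, 0.4, 0.3, 0.3],   # data scientist
    [0.5, 0.9, 0.5, 0.4, 0.3],   # software engineer
    [0.4, 0.2, 0.9, 0.5, 0.8],   # marketing specialist
    [0.3, 0.3, 0.6, 0.9, 0.5],   # UX designer
    [0.4, 0.1, 0.8, 0.3, 0.9],   # technical writer
])
careers = ["data scientist", "software engineer", "marketing specialist",
           "UX designer", "technical writer"]

# Nearest-profile matching with k-NN (k = 1 because each career has one
# reference profile in this toy example).
model = KNeighborsClassifier(n_neighbors=1).fit(profiles, careers)

user = np.array([[0.8, 0.7, 0.5, 0.2, 0.4]])   # self-assessed skill levels
suggestion = model.predict(user)[0]
print("suggested career:", suggestion)

# Simple skill-gap report against the suggested career's reference profile.
target = profiles[careers.index(suggestion)]
for name, have, need in zip(skills, user[0], target):
    if need - have > 0.1:
        print(f"consider improving {name}: {have:.1f} -> {need:.1f}")
```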

Keywords: career guidance system, machine learning, career prediction, predictive decision, data mining, technical and non-technical skills

Procedia PDF Downloads 80
246 150 KVA Multifunction Laboratory Test Unit Based on Power-Frequency Converter

Authors: Bartosz Kedra, Robert Malkowski

Abstract:

This paper describes and presents a laboratory test unit built around a 150 kVA power-frequency converter and the Simulink Real-Time platform. The assumptions defining which load and generator types may be simulated using the device are presented, as well as the control algorithm structure. As the laboratory setup contains a transformer with a thyristor-controlled tap changer, a wider scope of setup capabilities is presented. Information about the communication interface, the data maintenance and storage solution, and the Simulink Real-Time features used is provided, together with a list and description of all measurements. The potential for modifications of the laboratory setup is evaluated. For the purposes of Rapid Control Prototyping, a dedicated environment, Simulink Real-Time, was used; the load model Functional Unit Controller is therefore based on a PC with I/O cards and Simulink Real-Time software. Simulink Real-Time was used to create real-time applications directly from Simulink models. In the next step, the applications were loaded onto a target computer connected to physical devices, which provided the opportunity to perform Hardware-in-the-Loop (HIL) tests as well as the mentioned Rapid Control Prototyping process. With Simulink Real-Time, Simulink models were extended with I/O card driver blocks, which made possible the automatic generation of real-time applications and interactive or automated runs on a dedicated target computer equipped with a real-time kernel, multicore CPU, and I/O cards. Results of the performed laboratory tests are presented, and different load configurations are described together with the corresponding experimental results. These include simulation of under-frequency load shedding, frequency- and voltage-dependent characteristics of groups of load units, time characteristics of groups of different load units in a chosen area, and arbitrary active and reactive power regulation based on a defined schedule.

Keywords: MATLAB, power converter, Simulink Real-Time, thyristor-controlled tap changer

Procedia PDF Downloads 323
244 An Advanced Automated Brain Tumor Diagnostics Approach

Authors: Berkan Ural, Arif Eser, Sinan Apaydin

Abstract:

Medical image processing has become a challenging task in general, and processing brain MRI images is one of the more difficult parts of this area. This study proposes a well-defined hybrid approach consisting of tumor detection, extraction, and analysis steps. The approach is built around a computer-aided diagnostics system for identifying and detecting tumor formation in any region of the brain, and such systems are commonly used for early prediction of brain tumors using advanced image processing and probabilistic neural network methods. For this approach, advanced noise removal functions and image processing methods such as automatic segmentation and morphological operations are used to detect the brain tumor boundaries and to obtain the important feature parameters of the tumor region. All stages of the approach are implemented in MATLAB. First, the tumor is detected, and the tumor area is contoured with a colored circle by the computer-aided diagnostics program. The tumor is then segmented, and several morphological operations are applied to increase the visibility of the tumor area, while the tumor area and important shape-based features are calculated. Finally, using the probabilistic neural network method and advanced classification steps, the tumor area and the type of the tumor are obtained. A future aim of this study is to detect the severity of lesions across brain tumor classes through advanced multi-class classification and neural network stages and to create a user-friendly environment using a GUI in MATLAB. In the experimental part of the study, 100 images are used to train the diagnostics system, and 100 out-of-sample images are used to test and check the overall results. The preliminary results demonstrate high classification accuracy for the neural network structure. These results also motivate us to extend the framework to detect and localize tumors in other organs.
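
A toy sketch of the stage ordering described above (noise removal, automatic segmentation, morphological clean-up, shape-based features); it is not the authors' MATLAB implementation, and it operates on a synthetic image rather than real MRI data.

```python
# Toy sketch under stated assumptions: synthetic image, Otsu thresholding as
# the "automatic segmentation" step, and region properties as shape features.
import numpy as np
from scipy import ndimage
from skimage import filters, morphology, measure

rng = np.random.default_rng(5)
image = rng.normal(0.2, 0.05, (128, 128))          # synthetic "brain" background
rr, cc = np.ogrid[:128, :128]
image[(rr - 80) ** 2 + (cc - 50) ** 2 < 15 ** 2] += 0.6   # bright "tumor" blob

smoothed = filters.gaussian(image, sigma=2)         # noise removal
threshold = filters.threshold_otsu(smoothed)        # automatic segmentation
mask = smoothed > threshold
mask = morphology.binary_opening(mask, morphology.disk(3))   # morphological clean-up
mask = ndimage.binary_fill_holes(mask)

# Shape-based features of the detected region(s).
for region in measure.regionprops(measure.label(mask)):
    print(f"area={region.area} px, eccentricity={region.eccentricity:.2f}, "
          f"centroid={tuple(round(c, 1) for c in region.centroid)}")
```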

Keywords: image processing algorithms, magnetic resonance imaging, neural network, pattern recognition

Procedia PDF Downloads 418
243 The Key Role of a Bystander Improving the Effectiveness of Cardiopulmonary Resuscitation Performed in Extra-Urban Areas

Authors: Leszek Szpakowski, Daniel Celiński, Sławomir Pilip, Grzegorz Michalak

Abstract:

The aim of the study was to analyse the usefulness of the 'E-rescuer' pilot project, planned to be implemented in a chosen area of Eastern Poland, in cases of suspected sudden cardiac arrest in extra-urban areas. The crucial assumption of the pilot project is the development of an application allowing Medical Emergency Teams and the E-rescuer to be dispatched simultaneously to the place of the accident. The E-rescuer is defined as a trained person able to provide effective basic life support and to use an automated external defibrillator. Having logged in using a smartphone, the E-rescuer's readiness is reported online to provide cardiopulmonary resuscitation exactly at the given location. Because the E-rescuer's location is accurately defined, the arrival time can be precisely estimated, and substantive support can be provided through the displayed algorithms. In the analysis of medical records for the years 2015-2016, cardiopulmonary resuscitation was considered effective when early signs of circulation appeared and the patient was taken to hospital. In this period there were 2,291 cases of sudden cardiac arrest. Cardiopulmonary resuscitation was undertaken in 621 patients in total, including 205 people in the urban area and 416 in extra-urban areas. The effectiveness of cardiopulmonary resuscitation in the extra-urban areas was much lower (33.8%) than in the urban area (50.7%). The average ambulance arrival time was correspondingly longer in the extra-urban areas, at 12.3 minutes compared with 3.3 minutes in the urban area. There was no significant difference in the average age of the studied patients (62.5 and 64.8 years). However, the average ambulance arrival time was 7.6 minutes for effective resuscitations and 10.5 minutes for ineffective ones. Hence, the ambulance arrival time is a crucial factor influencing the effectiveness of cardiopulmonary resuscitation, especially in extra-urban areas, where it is much longer than in urban areas. Trained E-rescuers located nearby and providing basic life support before the ambulance arrives can therefore effectively support the Emergency Medical Services system in Poland.

Keywords: basic life support, bystander, effectiveness, resuscitation

Procedia PDF Downloads 203
242 Radiation Protection and Licensing for an Experimental Fusion Facility: The Italian and European Approaches

Authors: S. Sandri, G. M. Contessa, C. Poggi

Abstract:

An experimental nuclear fusion device can be seen as a step toward the development of the future nuclear fusion power plant. Compared with other possible solutions to the energy problem, nuclear fusion has advantages that ensure sustainability and security. In particular, considering the radioactivity and the radioactive waste produced, the component materials of a nuclear fusion plant can be selected so as to limit the decay period, making it possible to recycle them in a new reactor about 100 years after the beginning of decommissioning. To achieve this and other pertinent goals, many experimental machines have been developed and operated worldwide in recent decades, underlining that radiation protection and worker exposure are critical aspects of these facilities due to the high-flux, high-energy neutrons produced in the fusion reactions. Direct radiation, material activation, tritium diffusion, and other related issues pose a real challenge to the demonstration that these devices are safer than nuclear fission facilities. In Italy, a limited number of fusion facilities have been constructed and operated over the last 30 years, mainly at the ENEA Frascati Center, and the radiation protection approach, addressed by the national licensing requirements, shows that it is not always easy to respect the constraints on workers' exposure to ionizing radiation. In the current analysis, the main radiation protection issues encountered in the Italian fusion facilities are considered and discussed, and the technical and legal requirements are described. The licensing process for these kinds of devices is outlined and compared with that of other European countries. The following aspects are considered throughout the study: i) description of the installation, plant, and systems; ii) suitability of the area, buildings, and structures; iii) radiation protection structures and organization; iv) exposure of personnel; v) accident analysis and the relevant radiological consequences; vi) radioactive waste assessment and management. In conclusion, the analysis points out the need for special attention to the radiological exposure of workers in order to demonstrate at least the same level of safety as that reached by nuclear fission facilities.

Keywords: fusion facilities, high energy neutrons, licensing process, radiation protection

Procedia PDF Downloads 353
241 A Focused, High-Intensity Spread-Spectrum Ultrasound Solution to Prevent Biofouling

Authors: Alan T. Sassler

Abstract:

Biofouling is a significant issue for ships, especially those based in warm-water ports. Biofouling damages hull coatings, degrades platform hydrodynamics, blocks cooling water intakes and returns, reduces platform range and speed, and increases fuel consumption. Although platforms are protected to some degree by antifouling paints, these paints are much less effective on stationary platforms, and problematic biofouling can occur on paint-protected stationary platforms in some environments in as little as a few weeks. Remedial hull cleaning operations are possible, but they are very expensive, sometimes damage the vessel's paint or hull, and are generally not completely effective. Ultrasound of sufficient intensity, focused on specific frequency ranges, can be used to prevent the growth of biofouling organisms. The use of ultrasound to prevent biofouling is not new, but systems to date have focused on protecting platforms by shaking the hull using internally mounted transducers similar to those used in ultrasonic cleaning machines. While potentially effective, this methodology does not scale well to large platforms, and the costs of installing and maintaining these systems dwarf the initial purchase price. An alternative approach has been developed that uses highly directional pier-mounted transducers to project high-intensity spread-spectrum ultrasonic energy into the water column, focused near the surface. This focused energy has been shown to prevent biofouling at ranges of up to 50 meters from the source. Spreading the energy over a multi-kilohertz band makes the system both more effective and more environmentally friendly. The system has been shown to be effective and inexpensive in small-scale testing and is now being characterized on a larger scale in selected marinas. Test results collected to date in Florida marinas suggest that this approach can keep ensonified areas of thousands of square meters free from biofouling, although care must be taken to minimize shaded areas.

Keywords: biofouling, ultrasonic, environmentally friendly antifoulant, marine protection, antifouling

Procedia PDF Downloads 60
240 The Effect of Subsurface Dam on Saltwater Intrusion in Heterogeneous Coastal Aquifers

Authors: Antoifi Abdoulhalik, Ashraf Ahmed

Abstract:

Saltwater intrusion (SWI) in coastal aquifers has become a growing threat for many countries around the world. While various control measures have been suggested to mitigate SWI, the construction of subsurface physical barriers remains one of the most effective solutions to this problem. In this work, we used laboratory experiments and numerical simulations to investigate the effectiveness of subsurface dams in heterogeneous layered coastal aquifers with different layering patterns. Four different cases were investigated: a homogeneous aquifer (case H) and three heterogeneous cases in which a low-permeability (low-K) layer was set in the top part of the system (case LH), in the middle part of the system (case HLH), or in the bottom part of the system (case HL). An automated image analysis technique was implemented to quantify the main SWI parameters at high spatial and temporal resolution. The method also provides transient salt concentration maps, allowing for the first time clear visualization of the spillage of saline water over the dam (advancing-wedge condition) as well as the flushing of residual saline water from the freshwater area (receding-wedge condition). The SEAWAT code was adopted for the numerical simulations. The results show that the presence of an overlying layer of low permeability enhanced the ability of the dam to retain the saline water; in such conditions, the rate of saline-water spillage and the inland extension may be considerably reduced. Conversely, the presence of an underlying low-K layer led to a faster increase of the saltwater volume on the seaward side of the wall, thereby considerably facilitating the spillage. The results showed that a complete removal of the residual saline water eventually occurred in all the investigated scenarios, with a rate of removal strongly affected by the hydraulic conductivity of the lower part of the aquifer. The data showed that the addition of the underlying low-K layer in case HL caused the complete flushing to take almost twice as long as in the homogeneous scenario.
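As an illustration of the kind of quantity such an image-analysis step can extract, the minimal Python sketch below computes a wedge toe length from a transient concentration map; the 50% threshold, array layout, and tank dimensions are assumptions for the example, not the study's actual processing pipeline:

```python
import numpy as np

def wedge_toe_length(concentration, x_coords, threshold=0.5):
    """Horizontal extent of the saline wedge along the aquifer base.

    `concentration` is a 2-D array of relative salt concentration (0-1),
    rows ordered top-to-bottom and columns ordered from inland toward the
    sea boundary; `x_coords` gives the horizontal position of each column.
    The toe is taken as the most inland column whose bottom-row
    concentration exceeds `threshold` (the conventional 50% isoline).
    """
    bottom_row = concentration[-1, :]
    saline_cols = np.where(bottom_row >= threshold)[0]
    if saline_cols.size == 0:
        return 0.0
    most_inland = saline_cols.min()
    return x_coords[-1] - x_coords[most_inland]  # distance from the sea boundary

# Illustrative use on a synthetic 50 x 200 concentration field.
conc = np.zeros((50, 200))
conc[-5:, 150:] = 1.0               # saline water near the base, seaward side
x = np.linspace(0.0, 0.4, 200)      # a 40 cm long laboratory tank (assumed)
print(wedge_toe_length(conc, x))    # ~0.10 m
```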

Keywords: heterogeneous coastal aquifers, laboratory experiments, physical barriers, seawater intrusion control

Procedia PDF Downloads 251
239 Content-Aware Image Augmentation for Medical Imaging Applications

Authors: Filip Rusak, Yulia Arzhaeva, Dadong Wang

Abstract:

Machine-learning-based Computer-Aided Diagnosis (CAD) is gaining popularity in medical imaging and diagnostic radiology. However, it requires a large amount of high-quality, labeled training image data. The training images may come from different sources: they may be acquired on radiography machines from different manufacturers or be digital or digitized copies of film radiographs, and they vary in size and pixel intensity distribution. In this paper, a content-aware image augmentation method is presented to deal with these variations. The results of the proposed method have been validated graphically by plotting the removed and added seams of pixels on the original images. Two different chest X-ray (CXR) datasets are used in the experiments. The CXRs in the datasets differ in size; some are digital CXRs, while the others are digitized from analog CXR films. In the proposed content-aware augmentation method, the Seam Carving algorithm is employed to resize CXRs and the corresponding labels in the form of image masks, followed by histogram matching to normalize the pixel intensities of digital radiographs based on the pixel intensity values of digitized radiographs. We implemented the algorithms, resized the well-known Montgomery dataset to the size of the most frequently used Japanese Society of Radiological Technology (JSRT) dataset, and normalized our digital CXRs for testing. This work resulted in a unified, off-the-shelf CXR dataset composed of radiographs from both the Montgomery and JSRT datasets. The experimental results show that even though the amount of augmentation is large, our algorithm adequately preserves the important information in the lung fields, the local structures, and the global visual effect. The proposed method can be used to augment training and testing image datasets so that the trained machine learning model can process CXRs from various sources, and it can potentially be used broadly in any medical imaging application.
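The two processing steps described above can be sketched in Python. The seam search below is a straightforward dynamic-programming implementation of vertical seam removal applied identically to a CXR and its lung mask, and scikit-image's match_histograms is assumed to be available for the intensity normalization; it is a simplified sketch, not the authors' implementation:

```python
import numpy as np
from skimage.exposure import match_histograms

def _minimum_vertical_seam(energy):
    """Dynamic-programming search for the lowest-energy vertical seam."""
    rows, cols = energy.shape
    cost = energy.copy()
    for r in range(1, rows):
        left = np.roll(cost[r - 1], 1)
        right = np.roll(cost[r - 1], -1)
        left[0] = np.inf
        right[-1] = np.inf
        cost[r] += np.minimum(np.minimum(left, cost[r - 1]), right)
    seam = np.empty(rows, dtype=int)
    seam[-1] = int(np.argmin(cost[-1]))
    for r in range(rows - 2, -1, -1):
        prev = seam[r + 1]
        lo, hi = max(prev - 1, 0), min(prev + 2, cols)
        seam[r] = lo + int(np.argmin(cost[r, lo:hi]))
    return seam

def remove_one_seam(image, mask):
    """Remove the same minimum-energy vertical seam from a CXR and its lung mask."""
    gy, gx = np.gradient(image.astype(float))
    seam = _minimum_vertical_seam(np.abs(gx) + np.abs(gy))
    keep = np.ones(image.shape, dtype=bool)
    keep[np.arange(image.shape[0]), seam] = False
    new_shape = (image.shape[0], image.shape[1] - 1)
    return image[keep].reshape(new_shape), mask[keep].reshape(new_shape)

def narrow_and_normalize(image, mask, reference, target_width):
    """Shrink a digital CXR (and its mask) to `target_width` by seam removal,
    then match its histogram to a digitized reference radiograph."""
    while image.shape[1] > target_width:
        image, mask = remove_one_seam(image, mask)
    return match_histograms(image, reference), mask
```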

Keywords: computer-aided diagnosis, image augmentation, lung segmentation, medical imaging, seam carving

Procedia PDF Downloads 223
238 The Development of a Digitally Connected Factory Architecture to Enable Product Lifecycle Management for the Assembly of Aerostructures

Authors: Nicky Wilson, Graeme Ralph

Abstract:

Legacy aerostructure assembly is defined by large components, low build rates, and manual assembly methods. With increasing demand for commercial aircraft and emerging markets such as eVTOL (electric vertical take-off and landing) aircraft, current manufacturing methods cannot efficiently meet these higher-rate demands. This project will look at how legacy manufacturing processes can be rate-enabled by taking a holistic view of data usage, focusing on how data can be collected to enable fully integrated digital factories and supply chains. The study will focus on how data flows both up and down the supply chain to create a digital thread specific to each part and assembly, while enabling machine learning through real-time, closed-loop feedback systems. The study will also develop a bespoke architecture to enable connectivity both within the factory and with the wider PLM (product lifecycle management) system, moving away from the traditional point-to-point systems used to connect IO devices and towards a hub-and-spoke architecture that exploits report-by-exception principles. This paper outlines the key issues facing legacy aircraft manufacturers, focusing on what future manufacturing will look like when Industry 4 principles are adopted. The research also defines the data architecture of a PLM system to enable the transfer and control of a digital thread within the supply chain, and proposes a standardised communications protocol as a scalable solution for connecting IO devices within a production environment. This research comes at a critical time for aerospace manufacturers, who are seeing a shift towards the integration of digital technologies within legacy production environments while build rates continue to grow. It is vital that manufacturing processes become more efficient in order to meet these demands while also securing future work for many manufacturers.
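Report-by-exception can be sketched independently of the bespoke protocol the paper proposes. In the minimal Python sketch below, the transport is abstracted behind a `publish` callable (in practice this could wrap, for example, an MQTT client's publish call), and the deadband, topic layout, and polling interval are illustrative assumptions rather than the project's specification:

```python
import json
import time
from typing import Callable

DEADBAND = 0.05  # report only when a reading moves by more than 5% (assumed)

def report_by_exception(read_sensor: Callable[[], float],
                        publish: Callable[[str, str], None],
                        topic: str,
                        poll_interval_s: float = 1.0) -> None:
    """Poll an IO device locally but publish to the hub only on meaningful change,
    rather than streaming every sample over a point-to-point link."""
    last_reported = None
    while True:
        value = read_sensor()
        changed = (last_reported is None
                   or abs(value - last_reported) > DEADBAND * max(abs(last_reported), 1e-9))
        if changed:
            publish(topic, json.dumps({"value": value, "ts": time.time()}))
            last_reported = value
        time.sleep(poll_interval_s)
```

In a hub-and-spoke deployment, each spoke stays thin: it runs only this loop, and the hub receives exceptions rather than raw sample streams.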

Keywords: Industry 4, digital transformation, IoT, PLM, automated assembly, connected factories

Procedia PDF Downloads 79
237 An Open-Source Guidance System for an Autonomous Planter Robot in Precision Agriculture

Authors: Nardjes Hamini, Mohamed Bachir Yagoubi

Abstract:

Precision agriculture has revolutionized farming by enabling farmers to monitor their crops remotely in real time. By utilizing technologies such as sensors, farmers can detect the state of growth, hydration levels, and nutritional status, and even identify diseases affecting their crops. With this information, farmers can make informed decisions regarding irrigation, fertilization, and pesticide application. Automated agricultural tasks, such as plowing, seeding, planting, and harvesting, are carried out by autonomous robots and have helped reduce costs and increase production. Despite the advantages of precision agriculture, its high cost makes it inaccessible to small and medium-sized farms. To address this issue, this paper presents an open-source guidance system for an autonomous planter robot. The system is composed of a Raspberry Pi-type nanocomputer equipped with Wi-Fi, a GPS module, a gyroscope, and a power supply module. The accompanying application allows users to enter and calibrate maps with at least four coordinates, so that the contour of the parcel can be captured. The application comprises several modules, such as the mission entry module, which traces the planting trajectory and points, and the action-plan entry module, which creates an ordered list of pre-established tasks such as loading, following the plan, returning to the garage, and entering sleep mode. A remote-control module enables users to control the robot manually, visualize its location on the map, and use a real-time camera. Wi-Fi coverage is provided by an outdoor access point covering a 2 km circle. This open-source system offers a low-cost alternative for small and medium-sized farms, enabling them to benefit from the advantages of precision agriculture.
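The mission entry module described above traces planting points from a parcel calibrated with four coordinates. The minimal Python sketch below shows one way this could be done, assuming a quadrilateral parcel and simple bilinear interpolation between its edges; the corner coordinates, row count, and spacing are illustrative and not taken from the system:

```python
import numpy as np

def planting_grid(corners, rows, points_per_row):
    """Generate planting waypoints inside a parcel given its four corners.

    `corners` are (lat, lon) pairs ordered around the parcel:
    [top_left, top_right, bottom_right, bottom_left]. Waypoints are laid
    out by bilinear interpolation between the edges, one list per row.
    """
    tl, tr, br, bl = (np.asarray(c, dtype=float) for c in corners)
    grid = []
    for i in range(rows):
        v = i / max(rows - 1, 1)             # 0 at the top edge, 1 at the bottom
        row_start = tl + v * (bl - tl)
        row_end = tr + v * (br - tr)
        row = [tuple(row_start + (j / max(points_per_row - 1, 1)) * (row_end - row_start))
               for j in range(points_per_row)]
        grid.append(row)
    return grid

# Illustrative parcel calibrated from four corner coordinates: 5 rows of 10 plants.
parcel = [(36.2001, 3.0001), (36.2001, 3.0010), (36.1992, 3.0010), (36.1992, 3.0001)]
waypoints = planting_grid(parcel, rows=5, points_per_row=10)
```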

Keywords: autonomous robot, guidance system, low-cost, medium farms, open-source system, planter robot, precision agriculture, real-time monitoring, remote control, small farms

Procedia PDF Downloads 110
236 The Harmonious Blend of Digitalization and 3D Printing: Advancing Aerospace Jet Pump Development

Authors: Subrata Sarkar

Abstract:

The aerospace industry is experiencing a profound product development transformation driven by the powerful integration of digitalization and 3D printing technologies. This paper delves into the significant impact of this convergence on aerospace innovation, specifically focusing on developing jet pumps for fuel systems. This case study is a compelling example of the immense potential of these technologies. In response to the industry's increasing demand for lighter, more efficient, and customized components, the combined capabilities of digitalization and 3D printing are reshaping how we envision, design, and manufacture critical aircraft parts, offering a distinct paradigm in aerospace engineering. Consider the development of a jet pump for a fuel system, a task that presents unique and complex challenges. Despite its seemingly simple design, the jet pump's development is hindered by many demanding operating conditions. The qualification process for these pumps involves many analyses and tests, leading to substantial delays and increased costs in fuel system development. However, by harnessing the power of automated simulations and integrating legacy design, manufacturing, and test data through digitalization, we can optimize the jet pump's design and performance, thereby revolutionizing product development. Furthermore, 3D printing's ability to create intricate structures using various materials, from lightweight polymers to high-strength alloys, holds the promise of highly efficient and durable jet pumps. The combined impact of digitalization and 3D printing extends beyond design, as it also reduces material waste and advances sustainability goals, aligning with the industry's increasing commitment to environmental responsibility. In conclusion, the convergence of digitalization and 3D printing is not just a technological advancement but a gateway to a new era in aerospace product development, particularly in the design of jet pumps. This revolution promises to redefine how we create aerospace components, making them safer, more efficient, and environmentally responsible. As we stand at the forefront of this technological revolution, aerospace companies must embrace these technologies not merely as a choice but as a strategic imperative if they are to lead in innovation and sustainability in the 21st century.

Keywords: jet pump, digitalization, 3D printing, aircraft fuel system

Procedia PDF Downloads 56
235 SynKit: An Event-Driven and Scalable Microservices-Based Kitting System

Authors: Bruno Nascimento, Cristina Wanzeller, Jorge Silva, João A. Dias, André Barbosa, José Ribeiro

Abstract:

The increasing complexity of logistics operations stems from evolving business needs, such as the shift from mass production to mass customization, which demands greater efficiency and flexibility. In response, Industry 4.0 and 5.0 technologies provide improved solutions to enhance operational agility and better meet market demands. The management of kitting zones, combined with the use of Autonomous Mobile Robots (AMRs), faces challenges related to coordination, resource optimization, and rapid response to fluctuations in customer demand. Additionally, implementing lean manufacturing practices in this context must be carefully orchestrated by intelligent systems and human operators to maximize efficiency without sacrificing the agility required in an advanced production environment. This paper proposes and implements a microservices-based architecture integrating principles from Industry 4.0 and 5.0 with lean manufacturing practices. The architecture enhances communication and coordination between autonomous vehicles and kitting management systems, allowing more efficient resource utilization and increased scalability. The proposed architecture focuses on the modularity and flexibility of operations, enabling seamless adaptation to changing demands and the efficient allocation of resources in real time. This approach is expected to significantly improve the efficiency and scalability of logistics operations by reducing waste and optimizing resource use while improving responsiveness to demand changes. The implementation of this architecture provides a robust foundation for the continuous evolution of kitting management and process optimization. It is designed to adapt to dynamic environments marked by rapid shifts in production demands and real-time decision-making, and it ensures seamless integration with automated systems, aligning with Industry 4.0 and 5.0 needs while reinforcing lean manufacturing principles.
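The event-driven coordination between a kitting service and the AMR fleet can be sketched with a tiny in-process event bus; in the architecture described above a message broker would take its place, and the topic names and payloads below are assumptions for illustration only:

```python
from collections import defaultdict
from typing import Callable, Dict, List

class EventBus:
    """Tiny in-process stand-in for the message broker of an event-driven architecture."""
    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._handlers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._handlers[topic]:
            handler(event)

bus = EventBus()

# Kitting service: turns an incoming order event into a kit request for the AMR fleet.
def kitting_service(event: dict) -> None:
    kit = {"kit_id": f"kit-{event['order_id']}", "items": event["items"]}
    bus.publish("kit.requested", kit)

# AMR dispatcher: reacts to kit requests by assigning a mobile robot.
def amr_dispatcher(event: dict) -> None:
    print(f"dispatching AMR for {event['kit_id']} with {len(event['items'])} items")

bus.subscribe("order.received", kitting_service)
bus.subscribe("kit.requested", amr_dispatcher)
bus.publish("order.received", {"order_id": 42, "items": ["bracket", "fastener", "seal"]})
```

Because each service only subscribes to the topics it needs, new consumers (for example a waste-tracking or analytics service) can be added without modifying the existing ones, which is the modularity and scalability the architecture aims for.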

Keywords: microservices, event-driven, kitting, AMR, lean manufacturing, industry 4.0, industry 5.0

Procedia PDF Downloads 24