Search results for: automatic processing
3079 Variable Shunt Reactors for Reactive Power Compensation of HV Subsea Cables
Authors: Saeed A. AlGhamdi, Nabil Habli, Vinoj Somasanran
Abstract:
This paper presents an application of 230 kV Variable Shunt Reactors (VSR) used to compensate the reactive power of dual 90 km subsea cables. The VSR integrates an on-load tap changer (OLTC) that adjusts reactive power compensation to maintain acceptable bus voltages under variable load profiles and network configurations. The OLTC is typically controlled by an automatic voltage regulator (AVR) or a power management system (PMS), which allows the VSR rating to be changed in discrete steps. Typical regulation ranges extend from as low as 20% up to 100% of rating, and units are available for systems up to 550 kV. The regulation speed is normally in the order of seconds per step and approximately a minute from maximum to minimum rating. A VSR can be bus or line connected depending on line/cable length and compensation requirements. The flexible reactive compensation ranges achieved by recent VSR technologies have enabled newer facility designs to deploy line-connected VSRs through either disconnect switches, which saves space and cost, or through circuit breakers. Lines with VSRs are typically energized at lower taps (reduced reactive compensation) to minimize or remove the presence of delayed zero crossings.
Keywords: power management, reactive power, subsea cables, variable shunt reactors
Procedia PDF Downloads 252
3078 An Automated System for the Detection of Citrus Greening Disease Based on Visual Descriptors
Authors: Sidra Naeem, Ayesha Naeem, Sahar Rahim, Nadia Nawaz Qadri
Abstract:
Citrus greening is a bacterial disease that causes considerable damage to citrus fruits worldwide. An efficient method for detecting this disease is needed to minimize production losses. This paper presents a pattern recognition system comprising three stages for the detection of citrus greening from orange leaves: segmentation, feature extraction and classification. Image segmentation is accomplished by adaptive thresholding. The feature extraction stage uses three visual descriptors, i.e., shape, color and texture: for shape, the asymmetry index; for color, the histogram of the Cb component from the YCbCr domain; and for texture, the local binary pattern. Classification was done using support vector machines and k nearest neighbors. The best performance of the system, an accuracy of 88.02% and an AUROC of 90.1%, was achieved with automatically segmented images. Our experiments validate that (1) segmentation is an imperative preprocessing step for computer-assisted diagnosis of citrus greening, and (2) the combination of shape, color and texture features forms a complementary set for the identification of citrus greening disease.
Keywords: citrus greening, pattern recognition, feature extraction, classification
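As an illustration of how the three descriptors could be combined into a single feature vector for classification, the following sketch uses scikit-image and scikit-learn; the asymmetry-index computation, histogram bin count, and LBP parameters are assumptions for illustration, not the authors' settings.

```python
import numpy as np
from skimage.color import rgb2ycbcr, rgb2gray
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def leaf_features(rgb_image, mask):
    """Build a shape/color/texture feature vector for one segmented leaf."""
    # Shape: a simple asymmetry index comparing the two halves of the mask
    h, w = mask.shape
    left, right = mask[:, : w // 2], np.fliplr(mask[:, w - w // 2 :])
    asymmetry = np.logical_xor(left, right).sum() / max(mask.sum(), 1)

    # Color: histogram of the Cb channel (YCbCr), restricted to leaf pixels
    cb = rgb2ycbcr(rgb_image)[..., 1]
    cb_hist, _ = np.histogram(cb[mask > 0], bins=32, range=(16, 240), density=True)

    # Texture: uniform local binary pattern histogram on the grayscale leaf
    gray = rgb2gray(rgb_image)
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp[mask > 0], bins=10, range=(0, 10), density=True)

    return np.concatenate(([asymmetry], cb_hist, lbp_hist))

def train_classifier(images, masks, labels):
    """Hypothetical training loop over segmented leaf images and their labels."""
    X = np.array([leaf_features(img, m) for img, m in zip(images, masks)])
    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(X, labels)
    return clf
```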
Procedia PDF Downloads 184
3077 Smart Side View Mirror Camera for Real Time System
Authors: Nunziata Ivana Guarneri, Arcangelo Bruna, Giuseppe Spampinato, Antonio Buemi
Abstract:
In the last decade, automotive companies have invested heavily in innovation across many aspects of automatic driver assistance systems. One such innovation is the use of a smart camera placed on the car’s side mirror for monitoring the rear and lateral road situation. A common road scenario is overtaking the preceding car; in this case, a brief distraction or loss of concentration can lead the driver to undertake the maneuver even when another vehicle is already overtaking, leading to serious accidents. A valid support for secure driving is a smart camera system that is able to automatically analyze the road scenario and consequently warn the driver when another vehicle is overtaking. This paper describes a method for monitoring the side view of a vehicle by using camera optical flow motion vectors. The proposed solution detects the presence of incoming vehicles, assesses their distance from the host car, and warns the driver through different levels of alert according to the estimated distance. Due to its low complexity and computational cost, the proposed system ensures real time performance.
Keywords: camera calibration, ego-motion, Kalman filters, object tracking, real time systems
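A minimal sketch of the underlying idea, detecting an approaching vehicle from dense optical flow in a side-mirror image and mapping its apparent size to alert levels, is shown below with OpenCV; the flow thresholds, pinhole-camera distance estimate, and alert boundaries are illustrative assumptions, not the authors' calibration.

```python
import cv2
import numpy as np

ALERT_THRESHOLDS = [(5.0, "none"), (2.5, "caution"), (0.0, "warning")]  # assumed distances (m)

def estimate_overtaking(prev_gray, curr_gray, focal_px=800.0, vehicle_width_m=1.8):
    """Detect an incoming vehicle from optical flow and return (distance_m, alert level)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])

    # Pixels moving faster than the background are candidate vehicle regions
    moving = (magnitude > 2.0).astype(np.uint8)
    moving = cv2.morphologyEx(moving, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(moving, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, "none"

    # Assume the largest moving blob is the overtaking vehicle
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    distance_m = focal_px * vehicle_width_m / max(w, 1)  # pinhole-camera approximation

    for limit, level in ALERT_THRESHOLDS:
        if distance_m > limit:
            return distance_m, level
    return distance_m, "warning"
```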
Procedia PDF Downloads 228
3076 Component Interface Formalization in Robotic Systems
Authors: Anton Hristozov, Eric Matson, Eric Dietz, Marcus Rogers
Abstract:
Components are heavily used in many software systems, including robotic systems. The growing sophistication and diversity of capabilities in robotic systems present new challenges to their architectures. Their complexity is growing exponentially with the advent of AI, smart sensors, and the complex tasks they have to accomplish. Such complexity requires a more rigorous approach to the creation, use, and interoperability of software components. The issue is exacerbated because robotic systems are becoming more and more reliant on third-party components for certain functions. In order to achieve this kind of interoperability, including dynamic component replacement, we need a way to standardize their interfaces. A formal approach is desperately needed to specify what the interface of a robotic software component should contain. This study analyzes the issue and presents a universal and generic approach to standardizing component interfaces for robotic systems. Our approach is inspired by well-established robotic architectures such as ROS, PX4, and ArduPilot. The study is also applicable to other software systems that share similar characteristics with robotic systems. We consider the use of JSON or Domain-Specific Languages (DSLs) developed with tools such as ANTLR, together with automatic code and configuration file generation for frameworks such as ROS and PX4. A case study with ROS 2 is presented as a proof of concept for the proposed methodology.
Keywords: CPS, robots, software architecture, interface, ROS, autopilot
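To make the idea of a formalized component interface concrete, the sketch below shows a hypothetical JSON-style specification for a robotic component and a minimal validation check; the field names, message types, and schema layout are illustrative assumptions rather than the schema proposed in the paper.

```python
import json

# Hypothetical interface description for a ROS 2-style component
INTERFACE_SPEC = {
    "component": "obstacle_detector",
    "version": "1.2.0",
    "publishes": [
        {"topic": "/obstacles", "type": "vision_msgs/Detection3DArray", "rate_hz": 20}
    ],
    "subscribes": [
        {"topic": "/points_raw", "type": "sensor_msgs/PointCloud2", "qos": "sensor_data"}
    ],
    "parameters": [
        {"name": "min_obstacle_height", "type": "double", "default": 0.15}
    ],
}

REQUIRED_KEYS = {"component", "version", "publishes", "subscribes", "parameters"}

def validate(spec: dict) -> list:
    """Return a list of problems found in an interface specification."""
    problems = [f"missing key: {k}" for k in REQUIRED_KEYS - spec.keys()]
    for port in spec.get("publishes", []) + spec.get("subscribes", []):
        if "topic" not in port or "type" not in port:
            problems.append(f"port missing topic/type: {port}")
    return problems

if __name__ == "__main__":
    print(json.dumps(INTERFACE_SPEC, indent=2))
    print("problems:", validate(INTERFACE_SPEC))
```

From such a machine-readable description, boilerplate node code and configuration files for frameworks such as ROS 2 or PX4 could in principle be generated automatically, which is the role the paper assigns to DSL tooling.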
Procedia PDF Downloads 92
3075 COVID_ICU_BERT: A Fine-Tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, as well as other information that is essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task, as opposed to dealing with structured information such as vital physiological signs, images, and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) and machine learning techniques applied to clinician notes can assist doctors’ decision-making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful in influencing the judgement of clinical sentiment in ICU clinical notes. This paper makes two contributions: first, we introduce COVID_ICU_BERT, a fine-tuned version of clinical transformer models that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, a type of note not previously seen by ClinicalBERT or Bio_Discharge_Summary_BERT. The model, which is based on ClinicalBERT, achieves higher predictive accuracy (Acc 93.33%, AUC 0.98, and precision 0.96). Second, we perform data augmentation using clinical contextual word embeddings based on a pre-trained clinical model to balance the samples in each class of the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and precision 0.92).
Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation
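A minimal fine-tuning sketch in the spirit of the paper, using the Hugging Face transformers library, is shown below; the base checkpoint name, hyperparameters, and the binary label scheme (survived vs. deceased) are assumptions rather than the authors' exact configuration.

```python
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

BASE_MODEL = "emilyalsentzer/Bio_ClinicalBERT"  # assumed clinical BERT checkpoint

class NoteDataset(torch.utils.data.Dataset):
    """Wraps ICU note texts and binary outcome labels (0 = survived, 1 = deceased)."""
    def __init__(self, texts, labels, tokenizer):
        self.enc = tokenizer(texts, truncation=True, padding=True, max_length=512)
        self.labels = labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        item = {k: torch.tensor(v[i]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[i])
        return item

def fine_tune(train_texts, train_labels):
    tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
    model = AutoModelForSequenceClassification.from_pretrained(BASE_MODEL, num_labels=2)
    args = TrainingArguments(output_dir="covid_icu_bert", num_train_epochs=3,
                             per_device_train_batch_size=8, learning_rate=2e-5)
    trainer = Trainer(model=model, args=args,
                      train_dataset=NoteDataset(train_texts, train_labels, tokenizer))
    trainer.train()
    return model, tokenizer
```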
Procedia PDF Downloads 207
3074 Leukocyte Detection Using Image Stitching and Color Overlapping Windows
Authors: Lina, Arlends Chris, Bagus Mulyawan, Agus B. Dharmawan
Abstract:
Blood cell analysis plays a significant role in the diagnosis of human health. As an alternative to the traditional technique performed by laboratory technicians, this paper presents an automatic white blood cell (leukocyte) detection system using Image Stitching and Color Overlapping Windows. The advantage of this method is that it is robust to imperfect shapes of blood cells and to varying image quality. The input to the application is a set of images from a microscope-slide translation video. The preprocessing stage is performed by stitching the input images: first, the overlapping parts of the images are determined, then the stitching and blending of the two input images are performed. Next, the Color Overlapping Windows method is applied for white blood cell detection, consisting of color filtering, window candidate checking, window marking, finding window overlaps, and window cropping. Experimental results show that this method achieves an average detection accuracy of 82.12% on the leukocyte images.
Keywords: color overlapping windows, image stitching, leukocyte detection, white blood cell detection
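The preprocessing idea, locating the overlap between two consecutive microscope frames and blending them, can be sketched with OpenCV template matching as below; the overlap search strategy, strip width, and linear blend weights are simplifying assumptions, not the authors' exact procedure.

```python
import cv2
import numpy as np

def stitch_pair(left, right, strip_width=80):
    """Stitch two horizontally overlapping frames by matching a strip of 'right' inside 'left'."""
    left_gray = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    strip = cv2.cvtColor(right[:, :strip_width], cv2.COLOR_BGR2GRAY)

    # Find where the leading strip of the right frame appears inside the left frame
    result = cv2.matchTemplate(left_gray, strip, cv2.TM_CCOEFF_NORMED)
    _, _, _, (x_off, _) = cv2.minMaxLoc(result)

    overlap = left.shape[1] - x_off
    canvas = np.zeros((left.shape[0], x_off + right.shape[1], 3), np.uint8)
    canvas[:, :left.shape[1]] = left
    canvas[:, x_off + overlap:] = right[:, overlap:]

    # Linear blend inside the overlapping band to hide the seam
    alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]
    canvas[:, x_off:x_off + overlap] = (alpha * left[:, x_off:] +
                                        (1 - alpha) * right[:, :overlap]).astype(np.uint8)
    return canvas
```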
Procedia PDF Downloads 310
3073 Geographic Information System (GIS) for Structural Typology of Buildings
Authors: Néstor Iván Rojas, Wilson Medina Sierra
Abstract:
The management of spatial information for several neighborhoods in the city of Tunja, in relation to the structural typology of buildings, is described through a Geographic Information System (GIS). The use of GIS provides tools that facilitate the capture, processing, analysis and dissemination of cartographic information and supports quality evaluation of the building classification. It also allows the development of a method that unifies and standardizes information processes. The project aims to generate a geographic database that is useful to the entities responsible for planning, disaster prevention and care of vulnerable populations; it also seeks to serve as a basis for seismic vulnerability studies that can contribute to an urban seismic microzonation study. The methodology consists of capturing the plat, including road names, neighborhoods, blocks and buildings, to which attributes were added from the evaluation of each dwelling: the number of inhabitants and classification, year of construction, the predominant structural system, the type of mezzanine board and its state of favorability, the presence of geotechnical problems, the type of cover, the use of each building, and damage to structural and non-structural elements. These data are tabulated in a spreadsheet that includes the cadastral number, through which they are systematically linked to the respective building, which also carries that attribute. A geo-referenced database is obtained, from which graphical outputs are generated, producing thematic maps for each evaluated attribute that clearly show the spatial distribution of the information obtained. Using GIS offers important advantages for spatial information management and facilitates consultation and updating. The usefulness of the project is recognized as a basis for studies on planning and prevention issues.
Keywords: microzonation, buildings, geo-processing, cadastral number
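As a sketch of the join between the surveyed attribute spreadsheet and the building footprints via the cadastral number, followed by a thematic map, the following uses the geopandas library; the file names and column names are assumptions used only for illustration.

```python
import geopandas as gpd
import pandas as pd

# Hypothetical inputs: building footprints and the survey spreadsheet
buildings = gpd.read_file("tunja_buildings.shp")       # has a 'cadastral_number' column
survey = pd.read_excel("structural_survey.xlsx")        # same key plus surveyed attributes

# Link each surveyed record to its building footprint through the cadastral number
geo_db = buildings.merge(survey, on="cadastral_number", how="left")

# Thematic map of one evaluated attribute, e.g. the predominant structural system
ax = geo_db.plot(column="structural_system", legend=True, figsize=(10, 10),
                 missing_kwds={"color": "lightgrey"})
ax.set_title("Predominant structural system by building")

# Persist the geo-referenced database for the planning and prevention entities
geo_db.to_file("tunja_geo_database.gpkg", driver="GPKG")
```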
Procedia PDF Downloads 334
3072 Multi-Agent System for Irrigation Using Fuzzy Logic Algorithm and Open Platform Communication Data Access
Authors: T. Wanyama, B. Far
Abstract:
Automatic irrigation systems conveniently protect landscape investments. While conventional irrigation systems are known to be inefficient, automated ones have the potential to optimize water usage. In fact, there is a new generation of irrigation systems that are smart in the sense that they monitor the weather, soil conditions, evaporation and plant water use, and automatically adjust the irrigation schedule. In this paper, we present an agent-based smart irrigation system. The agents are built using a mix of commercial off-the-shelf software, including MATLAB, Microsoft Excel and the KEPServerEX 5 OPC server, and custom written code. The Irrigation Scheduler Agent uses fuzzy logic to integrate the information that affects the irrigation schedule. In addition, the multi-agent system uses Open Platform Communications (OPC) technology to share data. OPC technology enables the Irrigation Scheduler Agent to communicate over the Internet, making the system scalable to a municipal or regional agent-based water monitoring, management, and optimization system. Finally, this paper presents simulation and pilot installation test results that show the operational effectiveness of our system.
Keywords: community water usage, fuzzy logic, irrigation, multi-agent system
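A plain-Python sketch of the kind of fuzzy inference the Irrigation Scheduler Agent could perform, mapping soil moisture and forecast temperature to an irrigation duration, is given below; the membership functions and rule base are illustrative assumptions, not the rules used in the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def irrigation_minutes(soil_moisture_pct, temperature_c):
    # Fuzzify the inputs
    moisture = {"dry": tri(soil_moisture_pct, -1, 10, 30),
                "ok":  tri(soil_moisture_pct, 20, 40, 60),
                "wet": tri(soil_moisture_pct, 50, 80, 101)}
    temp = {"cool": tri(temperature_c, -1, 10, 22),
            "warm": tri(temperature_c, 18, 26, 34),
            "hot":  tri(temperature_c, 30, 40, 51)}

    # Rule base: (antecedent strength, output duration in minutes)
    rules = [(min(moisture["dry"], temp["hot"]), 45),
             (min(moisture["dry"], temp["warm"]), 30),
             (min(moisture["ok"], temp["hot"]), 20),
             (min(moisture["ok"], temp["warm"]), 10),
             (moisture["wet"], 0)]

    # Weighted-average (centroid-like) defuzzification
    total = sum(strength for strength, _ in rules)
    return sum(strength * minutes for strength, minutes in rules) / total if total else 0.0

print(irrigation_minutes(soil_moisture_pct=15, temperature_c=33))
```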
Procedia PDF Downloads 298
3071 Text Analysis to Support Structuring and Modelling a Public Policy Problem-Outline of an Algorithm to Extract Inferences from Textual Data
Authors: Claudia Ehrentraut, Osama Ibrahim, Hercules Dalianis
Abstract:
Policy making situations are real-world problems that exhibit complexity in that they are composed of many interrelated problems and issues. To be effective, policies must holistically address the complexity of the situation rather than propose solutions to single problems. Formulating and understanding the situation and its complex dynamics is therefore key to finding holistic solutions. Analysis of text-based information on the policy problem, using Natural Language Processing (NLP) and text analysis techniques, can support modelling of public policy problem situations in a more objective way, based on domain experts' knowledge and scientific evidence. The objective of this study is to support modelling of public policy problem situations using text analysis of verbal descriptions of the problem. We propose a formal methodology for the analysis of qualitative data from multiple information sources on a policy problem to construct a causal diagram of the problem. The analysis process aims at identifying key variables, linking them by cause-effect relationships and mapping that structure into a graphical representation that is adequate for designing action alternatives, i.e., policy options. This study describes the outline of an algorithm used to automate the initial step of a larger methodological approach, which has so far been done manually. In this initial step, inferences about key variables and their interrelationships are extracted from textual data to support better problem structuring. A small prototype for this step is also presented.
Keywords: public policy, problem structuring, qualitative analysis, natural language processing, algorithm, inference extraction
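A very small sketch of the kind of automated inference extraction described, pulling candidate cause-effect pairs out of sentences via causal connectives, is shown below; the connective list and the simple pattern matching stand in for the fuller NLP pipeline and are assumptions for illustration only.

```python
import re

# Causal connectives used as anchors for extracting candidate cause-effect pairs
PATTERNS = [
    re.compile(r"(?P<effect>.+?)\s+(?:because of|due to|as a result of)\s+(?P<cause>.+)", re.I),
    re.compile(r"(?P<cause>.+?)\s+(?:leads to|results in|causes|contributes to)\s+(?P<effect>.+)", re.I),
]

def extract_inferences(text):
    """Return (cause, effect) candidate pairs for each sentence in a policy document."""
    pairs = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        for pattern in PATTERNS:
            match = pattern.search(sentence)
            if match:
                pairs.append((match.group("cause").strip(" ."),
                              match.group("effect").strip(" .")))
                break
    return pairs

sample = ("Unemployment rises because of declining industrial output. "
          "Declining industrial output leads to lower tax revenue.")
print(extract_inferences(sample))
```

The extracted pairs can then be merged into the nodes and directed edges of the causal diagram that the larger methodology constructs.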
Procedia PDF Downloads 589
3070 Effect of Anisotropy on Steady Creep in a Whisker Reinforced Functionally Graded Composite Disc
Authors: V. K. Gupta, Tejeet Singh
Abstract:
In many whisker reinforced composites, anisotropy may result from material flow during processing operations such as forging, extrusion, etc. The consequence of anisotropy, introduced during processing of the disc material, has been investigated on the steady state creep deformation of a rotating disc. The disc material is assumed to undergo plastic deformation according to Hill’s anisotropic criterion. Steady state creep has been analyzed in a constant thickness rotating disc made of functionally graded 6061Al-SiCw (where the subscript ‘w’ stands for whisker) using Hill’s criterion. The content of reinforcement (SiCw) in the disc is assumed to decrease linearly from the inner to the outer radius. The stresses and strain rates in the disc are estimated by solving the force equilibrium equation along with the constitutive equations describing multi-axial creep. The results obtained for the anisotropic FGM disc have been compared with those estimated for an isotropic FGM disc having the same average whisker content. The anisotropic constants appearing in Hill’s yield criterion have been obtained from the available experimental results. The results show that the presence of anisotropy reduces the tangential stress in the middle of the disc, but near the inner and outer radii the tangential stress is higher when compared to the isotropic disc. On the other hand, the steady state creep rates in the anisotropic disc are reduced significantly over the entire disc radius, with the maximum reduction observed at the inner radius. Further, in the presence of anisotropy the distribution of strain rate becomes relatively uniform over the entire disc, which may be responsible for reducing the extent of distortion in the disc.
Keywords: anisotropy, creep, functionally graded composite, rotating disc
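For reference, the classical Hill (1948) anisotropic yield criterion, the general form on which such analyses are typically based, can be written as below; the specific anisotropy constants and the creep constitutive law actually used in the paper are taken from the cited experimental results rather than from this generic form.

```latex
% Hill's (1948) anisotropic yield criterion in the principal material axes (x, y, z)
\begin{equation}
F(\sigma_{yy}-\sigma_{zz})^{2} + G(\sigma_{zz}-\sigma_{xx})^{2} + H(\sigma_{xx}-\sigma_{yy})^{2}
+ 2L\,\tau_{yz}^{2} + 2M\,\tau_{zx}^{2} + 2N\,\tau_{xy}^{2} = 1
\end{equation}
% F, G, H, L, M, N are anisotropy constants determined from experiments;
% the isotropic (von Mises) case is recovered when F = G = H and L = M = N = 3F.
```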
Procedia PDF Downloads 392
3069 Prediction of Remaining Life of Industrial Cutting Tools with Deep Learning-Assisted Image Processing Techniques
Authors: Gizem Eser Erdek
Abstract:
This study investigates the prediction of the remaining life of industrial cutting tools used in the industrial production process with deep learning methods. As the life of a cutting tool decreases, it damages the raw material it is processing. This study aims to predict the remaining life of the cutting tool based on the damage caused by the cutting tools to the raw material. For this, hole photos were collected from the hole-drilling machine for 8 months. Photos were labeled into 5 classes according to hole quality. In this way, the problem was transformed into a classification problem. Using the prepared data set, a model was created with convolutional neural networks, which is a deep learning method. In addition, the VGGNet and ResNet architectures, which have been successful in the literature, were tested on the data set. A hybrid model using convolutional neural networks and support vector machines was also used for comparison. When all models are compared, the model in which convolutional neural networks are used gives successful results, with an accuracy rate of 74%. In preliminary studies, the data set was arranged to include only the best and worst classes, and the study gave ~93% accuracy when the binary classification model was applied. The results of this study showed that the remaining life of cutting tools can be predicted by deep learning methods based on the damage to the raw material. Experiments have proven that deep learning methods can be used as an alternative for cutting tool life estimation.
Keywords: classification, convolutional neural network, deep learning, remaining life of industrial cutting tools, ResNet, support vector machine, VGGNet
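A compact Keras sketch of the kind of convolutional network that could be used for the five-class hole-quality problem is given below; the input size, layer widths, and training settings are assumptions for illustration and not the architecture reported in the study.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5  # hole-quality classes derived from the labeled photographs

def build_model(input_shape=(128, 128, 3)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical usage with image tensors X (N, 128, 128, 3) and integer labels y (N,)
# model = build_model()
# model.fit(X_train, y_train, validation_split=0.2, epochs=30, batch_size=32)
```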
Procedia PDF Downloads 77
3068 Backward-Facing Step Measurements at Different Reynolds Numbers Using Acoustic Doppler Velocimetry
Authors: Maria Amelia V. C. Araujo, Billy J. Araujo, Brian Greenwood
Abstract:
The flow over a backward-facing step is characterized by the presence of flow separation, recirculation and reattachment, for a simple geometry. This type of fluid behaviour takes place in many practical engineering applications, hence the reason for it being investigated. Historically, fluid flows over a backward-facing step have been examined in many experiments using a variety of measuring techniques such as laser Doppler velocimetry (LDV), hot-wire anemometry, particle image velocimetry or hot-film sensors. However, some of these techniques cannot conveniently be used in separated flows or are too complicated and expensive. In this work, the applicability of the acoustic Doppler velocimetry (ADV) technique to this type of flow is investigated, at various Reynolds numbers corresponding to different flow regimes. The use of this measuring technique in separated flows is rarely reported in the literature. Besides, most of the situations where the Reynolds number effect is evaluated in separated flows are in numerical modelling. The ADV technique has the advantage of providing nearly non-invasive measurements, which is important in resolving turbulence. The ADV Nortek Vectrino+ was used to characterize the flow, in a recirculating laboratory flume, at various Reynolds numbers (Reh = 3738, 5452, 7908 and 17388) based on the step height (h), in order to capture different flow regimes, and the results were compared to those obtained using other measuring techniques. To compare results with other researchers, the step height, expansion ratio and the positions upstream and downstream of the step were reproduced. The post-processing of the ADV records was performed using a customized numerical code, which implements several filtering techniques. Subsequently, the Vectrino noise level was evaluated by computing the power spectral density for the stream-wise horizontal velocity component. The normalized mean stream-wise velocity profiles, skin-friction coefficients and reattachment lengths were obtained for each Reh. Turbulent kinetic energy, Reynolds shear stresses and normal Reynolds stresses were determined for Reh = 7908. An uncertainty analysis was carried out for the measured variables using the moving block bootstrap technique. Low noise levels were obtained after implementing the post-processing techniques, showing their effectiveness. Besides, the errors obtained in the uncertainty analysis were, in general, relatively low. For Reh = 7908, the normalized mean stream-wise velocity and turbulence profiles were compared directly with those acquired by other researchers using the LDV technique, and a good agreement was found. The ADV technique proved to be able to characterize the flow properly over a backward-facing step, although additional caution should be taken for measurements very close to the bottom. The ADV measurements showed reliable results regarding: a) the stream-wise velocity profiles; b) the turbulent shear stress; c) the reattachment length; d) the identification of the transition from transitional to turbulent flows. Despite being a relatively inexpensive technique, acoustic Doppler velocimetry can be used with confidence in separated flows and is thus very useful for numerical model validation. However, it is very important to perform adequate post-processing of the acquired data to obtain low noise levels, thus decreasing the uncertainty.
Keywords: ADV, experimental data, multiple Reynolds number, post-processing
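As an illustration of the moving block bootstrap used for the uncertainty analysis, the following NumPy sketch estimates a confidence interval for the mean stream-wise velocity from a single ADV record; the block length, number of resamples, and confidence level are illustrative assumptions.

```python
import numpy as np

def moving_block_bootstrap(series, block_len=50, n_boot=2000, ci=95, seed=0):
    """Confidence interval for the mean of a correlated time series (e.g. an ADV velocity record)."""
    rng = np.random.default_rng(seed)
    n = len(series)
    # All overlapping blocks of length block_len
    blocks = np.array([series[i:i + block_len] for i in range(n - block_len + 1)])
    n_blocks = int(np.ceil(n / block_len))

    means = np.empty(n_boot)
    for b in range(n_boot):
        picks = rng.integers(0, len(blocks), size=n_blocks)
        resample = np.concatenate(blocks[picks])[:n]   # rebuild a series of the original length
        means[b] = resample.mean()

    half = (100 - ci) / 2
    return series.mean(), np.percentile(means, [half, 100 - half])

# Hypothetical use on a de-spiked stream-wise velocity record u (m/s)
u = np.random.default_rng(1).normal(0.25, 0.03, size=5000)
mean_u, (lo, hi) = moving_block_bootstrap(u)
print(f"mean u = {mean_u:.4f} m/s, 95% CI = [{lo:.4f}, {hi:.4f}]")
```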
Procedia PDF Downloads 148
3067 How to Talk about It without Talking about It: Cognitive Processing Therapy Offers Trauma Symptom Relief without Violating Cultural Norms
Authors: Anne Giles
Abstract:
Humans naturally wish they could forget traumatic experiences. To help prevent future harm, however, the human brain has evolved to retain data about experiences of threat, alarm, or violation. When given compassionate support and assistance with thinking helpfully and realistically about traumatic events, most people can adjust to experiencing hardships, albeit with residual sad, unfortunate memories. Persistent, recurrent, intrusive memories, difficulty sleeping, emotion dysregulation, and avoidance of reminders, however, may be symptoms of Post-traumatic Stress Disorder (PTSD). Brain scans show that PTSD affects brain functioning. We currently have no physical means of restoring the system of brain structures and functions involved with PTSD. Medications may ease some symptoms but not others. However, forms of "talk therapy" with cognitive components have been found by researchers to reduce, even resolve, a broad spectrum of trauma symptoms. Many cultures have taboos against talking about hardships. Individuals may present themselves to mental health care professionals with severe, disabling trauma symptoms but, because of cultural norms, be unable to speak about them. In China, for example, relationship expectations may include the belief, "Bad things happening in the family should stay in the family (jiāchǒu bùkě wàiyán 家丑不可外扬)." The concept of "family (jiā 家)" may include partnerships, close and extended families, communities, companies, and the nation itself. In contrast to many trauma therapies, Cognitive Processing Therapy (CPT) for Post-traumatic Stress Disorder asks its participants to focus not on "what" happened but on "why" they think the trauma(s) occurred. The question "why" activates and exercises cognitive functioning. Brain scans of individuals with PTSD reveal executive functioning portions of the brain inadequately active, with emotion centers overly active. CPT conceptualizes PTSD as a network of cognitive distortions that keep an individual "stuck" in this under-functioning and over-functioning dynamic. Through asking participants forms of the question "why," plus offering a protocol for examining answers and relinquishing unhelpful beliefs, CPT assists individuals in consciously reactivating the cognitive, executive functions of their brains, thus restoring normal functioning and reducing distressing trauma symptoms. The culturally sensitive components of CPT that allow people to "talk about it without talking about it" may offer the possibility for worldwide relief from symptoms of trauma.
Keywords: cognitive processing therapy (CPT), cultural norms, post-traumatic stress disorder (PTSD), trauma recovery
Procedia PDF Downloads 213
3066 Segmentation of Arabic Handwritten Numeral Strings Based on Watershed Approach
Authors: Nidal F. Shilbayeh, Remah W. Al-Khatib, Sameer A. Nooh
Abstract:
Arabic offline handwriting recognition systems are considered one of the most challenging topics. Arabic handwritten numeral strings are used to automate systems that deal with numbers, such as postal codes, bank account numbers and numbers on car plates. Segmentation of connected numerals is the main bottleneck in a handwritten numeral recognition system; resolving it, in turn, can increase the speed and efficiency of the recognition system. In this paper, we propose algorithms for automatic segmentation and feature extraction of Arabic handwritten numeral strings based on the watershed approach. The algorithms have been designed and implemented to achieve the main goal of segmenting and extracting the string of numeral digits written by hand, especially the courtesy amount on bank checks. The segmentation algorithm partitions the string into multiple regions that can be associated with the properties of one or more criteria. The numeral extraction algorithm separates the numeral string into individual digits. Both algorithms for segmentation and feature extraction have been tested successfully and efficiently for all types of numerals.
Keywords: handwritten numerals, segmentation, courtesy amount, feature extraction, numeral recognition
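A minimal scikit-image sketch of marker-based watershed segmentation applied to a binarized numeral-string image is shown below; generating markers from distance-transform peaks is a common choice and an assumption here, not necessarily the exact variant used by the authors.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_digits(binary):
    """Split a binary image of touching handwritten digits into labeled regions."""
    # Distance transform: bright ridges correspond to the cores of individual digits
    distance = ndi.distance_transform_edt(binary)

    # One marker per local maximum of the distance map
    peaks = peak_local_max(distance, min_distance=10, labels=binary)
    markers = np.zeros_like(binary, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)

    # Watershed "floods" from the markers, cutting the strokes where digits touch
    labels = watershed(-distance, markers, mask=binary)
    return labels  # label image; each integer value is one candidate digit

# Each labeled region can then be cropped to its bounding box with
# ndi.find_objects(labels) and passed to the numeral recognizer.
```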
Procedia PDF Downloads 382
3065 Air-Coupled Ultrasonic Testing for Non-Destructive Evaluation of Various Aerospace Composite Materials by Laser Vibrometry
Authors: J. Vyas, R. Kazys, J. Sestoke
Abstract:
Air-coupled ultrasonics is a contactless ultrasonic measurement approach that has become widespread for material characterization in the aerospace industry, where the lightest possible weight is always essential without compromising durability. To achieve these requirements, composite materials are widely used. This paper presents an analysis of air-coupled ultrasonics for composite materials such as CFRP (Carbon Fibre Reinforced Polymer), GLARE (Glass Fiber Metal Laminate) and honeycombs used in the design of modern aircraft. Laser vibrometry could be a key tool for the characterization of aerospace components. The fundamentals of air-coupled ultrasonics, including the principles, working modes and transducer arrangements used for this purpose, are also recounted in brief. The emphasis of this paper is on the developed NDT techniques based on ultrasonic guided wave applications and on the possibilities of using laser vibrometry for non-contact measurement of guided waves in different materials. A 3D assessment technique is described which employs a single-point laser head with automatic scanning relocation of the material to assess the mechanical displacement, including the pros and cons of the composite materials for aerospace applications with defects and delaminations.
Keywords: air-coupled ultrasonics, contactless measurement, laser interferometry, NDT, ultrasonic guided waves
Procedia PDF Downloads 239
3064 Challenges That People with Autism and Caregivers Face in Public Environments
Authors: Andrei Pomana, Graham Brewer
Abstract:
Autism is a lifelong developmental disorder that affects verbal and non-verbal communication, behaviour and sensory processing. As a result, people on the autism spectrum have a difficult time when confronted with environments that have high levels of sensory stimulation. This is often compounded by the inability to properly communicate their wants and needs to caregivers. The capacity of people with autism to integrate depends on their ability to at least tolerate highly stimulating public environments for short periods of time. The overall challenges that people on the spectrum and their caregivers face need to be established in order to properly create and assess methods to mitigate the effects of high-stimulus public spaces. This paper aims to identify the challenges that people on the autism spectrum and their caregivers face in typical public environments. Nine experienced autism therapists participated in a semi-structured interview regarding the challenges that people with autism and their caregivers face in public environments. The qualitative data show that the unpredictability of events and the high sensory stimulation present in public environments, especially auditory stimulation, are the two biggest contributors to the difficulties that people on the spectrum face. If the stimuli are not removed in a short period of time, uncontrollable behaviours or 'meltdowns' can occur, which leave the person incapacitated and unable to respond to any outside input. Possible solutions to increase integration in public spaces for people with autism revolve around removing unwanted sensory stimuli, creating personalized barriers for certain stimuli, equipping people with autism with better tools to communicate their needs or to orient themselves to a safe location, and providing a predictable pattern of events that would prepare individuals for tasks ahead of time.
Keywords: autism, built environment, meltdown, public environment, sensory processing disorders
Procedia PDF Downloads 163
3063 Data Modeling and Calibration of In-Line Pultrusion and Laser Ablation Machine Processes
Authors: David F. Nettleton, Christian Wasiak, Jonas Dorissen, David Gillen, Alexandr Tretyak, Elodie Bugnicourt, Alejandro Rosales
Abstract:
In this work, preliminary results are given for the modeling and calibration of two in-line processes, pultrusion and laser ablation, using machine learning techniques. The end product of the processes is the core of a medical guidewire, manufactured to comply with a user specification of diameter and flexibility. An ensemble approach is followed which requires training several models. Two state-of-the-art machine learning algorithms are benchmarked: Kernel Recursive Least Squares (KRLS) and Support Vector Regression (SVR). The final objective is to build a precise digital model of the pultrusion and laser ablation processes in order to calibrate the resulting diameter and flexibility of the medical guidewire, which is the end product, while taking into account the friction on the forming die. The result is an ensemble of models whose output is within a strict required tolerance and which covers the required range of diameter and flexibility of the guidewire end product. The modeling and automatic calibration of complex in-line industrial processes is a key aspect of the Industry 4.0 movement for cyber-physical systems.
Keywords: calibration, data modeling, industrial processes, machine learning
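A simplified sketch of one member of such a model ensemble, an SVR that maps process settings to the resulting guidewire diameter and is then used to screen candidate settings, is given below with scikit-learn; the feature names, the synthetic data, and the hyperparameters are assumptions used only to illustrate the calibration loop.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

# Hypothetical process features: pull speed, die temperature, laser power, die friction
rng = np.random.default_rng(0)
X = rng.uniform([0.5, 150, 10, 0.1], [2.0, 220, 40, 0.6], size=(200, 4))
# Hypothetical measured core diameter (mm) for each setting
y = 0.30 + 0.02 * X[:, 0] - 0.0003 * X[:, 1] + 0.001 * X[:, 2] + 0.01 * X[:, 3]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.001))
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print("mean absolute error (mm):", -scores.mean())

model.fit(X, y)

def settings_within_tolerance(candidates, target_mm, tol_mm=0.005):
    """Calibration step: keep process settings predicted to hit the target diameter."""
    predictions = model.predict(candidates)
    return candidates[np.abs(predictions - target_mm) < tol_mm]
```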
Procedia PDF Downloads 299
3062 Design of Target Selection for Pedestrian Autonomous Emergency Braking System
Authors: Tao Song, Hao Cheng, Guangfeng Tian, Chuang Xu
Abstract:
An autonomous emergency braking (AEB) system is an advanced driver assistance system that enables vehicle collision avoidance and pedestrian collision avoidance to improve vehicle safety. At present, because pedestrian targets are small and highly mobile, pedestrian AEB systems face greater technical difficulties and higher functional requirements. In this paper, a method of pedestrian target selection based on a variable-width funnel is proposed. Based on the current and predicted positions of pedestrians, the relative position of vehicle and pedestrian at the time of collision is calculated, and different braking strategies are adopted according to the hazard level of the pedestrian collision. Under C-NCAP standard operating conditions, the method that considers only the current position of pedestrians is compared with the method that considers the predicted pedestrian position, and the fixed-width funnel is compared with the variable-width funnel. The results show that, with the variable-width funnel, the selection of the pedestrian target is more accurate and the timing of AEB system intervention is more reasonable, since the predicted position of the pedestrian target and the vehicle's lateral motion are taken into account.
Keywords: automatic emergency braking system, pedestrian target selection, TTC, variable width funnel
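The core of such target-selection logic can be sketched as follows: compute the time-to-collision (TTC), predict the pedestrian's lateral position at that time, and test it against a funnel whose width grows with the prediction horizon. The widening rate and the braking thresholds below are illustrative assumptions, not the paper's calibrated values.

```python
from dataclasses import dataclass

@dataclass
class Pedestrian:
    x: float    # longitudinal distance ahead of the vehicle (m)
    y: float    # lateral offset from the vehicle centerline (m)
    vx: float   # longitudinal velocity relative to the vehicle (m/s, negative = approaching)
    vy: float   # lateral velocity (m/s)

def select_target(ped, vehicle_half_width=0.9, widen_rate=0.35):
    """Return (is_target, ttc, hazard) for one pedestrian using a variable-width funnel."""
    if ped.vx >= 0:              # not closing in on the vehicle
        return False, float("inf"), "none"

    ttc = ped.x / -ped.vx                            # time to collision (s)
    y_at_impact = ped.y + ped.vy * ttc               # predicted lateral position at impact
    funnel_half_width = vehicle_half_width + widen_rate * ttc   # widens with prediction horizon

    if abs(y_at_impact) > funnel_half_width:
        return False, ttc, "none"

    # Grade the hazard level; thresholds here are assumed, not calibrated
    hazard = "full_braking" if ttc < 0.8 else "partial_braking" if ttc < 1.6 else "warning"
    return True, ttc, hazard

print(select_target(Pedestrian(x=12.0, y=2.5, vx=-10.0, vy=-1.2)))
```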
Procedia PDF Downloads 157
3061 Potassium-Phosphorus-Nitrogen Detection and Spectral Segmentation Analysis Using Polarized Hyperspectral Imagery and Machine Learning
Authors: Nicholas V. Scott, Jack McCarthy
Abstract:
Military, law enforcement, and counter-terrorism organizations are often tasked with target detection and image characterization of scenes containing explosive materials in various types of environments where light scattering intensity is high. Mitigation of this photonic noise using classical digital filtration and signal processing can be difficult. This is partially due to the lack of robust image processing methods for photonic noise removal, which strongly influences high-resolution target detection and machine learning-based pattern recognition. Such analysis is crucial to the delivery of reliable intelligence. Polarization filters are a possible method for ambient glare reduction: by allowing only certain modes of the electromagnetic field to be captured, they provide strong scene contrast. An experiment was carried out utilizing a polarization lens attached to a hyperspectral imaging camera for the purpose of exploring the degree to which an imaged polarized scene of a potassium, phosphorus, and nitrogen mixture allows for improved target detection and image segmentation. Preliminary imagery results, based on the application of machine learning algorithms, including competitive leaky learning and distance metric analysis, to polarized hyperspectral imagery, suggest that polarization filters provide a slight advantage in image segmentation. The results of this work have implications for understanding the presence of explosive material in dry, desert areas where reflective glare is a significant impediment to scene characterization.
Keywords: explosive material, hyperspectral imagery, image segmentation, machine learning, polarization
Procedia PDF Downloads 142
3060 Determination of Selected Engineering Properties of Giant Palm Seeds (Borassus Aethiopum) in Relation to Its Oil Potential
Authors: Rasheed Amao Busari, Ahmed Ibrahim
Abstract:
The engineering properties of giant palm seeds are crucial for the rational design of processing and handling systems. This research investigated some engineering properties of giant palm seeds in relation to their oil potential. Ripe giant palm fruits were sourced from parts of Zaria in Kaduna State and Ado Ekiti in Ekiti State, Nigeria. The mesocarps of the collected fruits were removed to obtain the nuts, which were then dried under ambient conditions for several days. The moisture content of the nuts at the time of the experiment was determined using a KT100S moisture meter and ranged from 17.9% to 19.15%. The physical properties determined were the axial dimensions, geometric mean diameter, arithmetic mean diameter, sphericity, true and bulk densities, porosity, angles of repose, and coefficients of friction. The nuts were measured using a vernier caliper for physical assessment of their sizes. The axial dimensions of 100 nuts were taken, and the results show that the size ranges from 7.30 to 9.32 cm for the major diameter, 7.2 to 8.9 cm for the intermediate diameter, and 4.2 to 6.33 cm for the minor diameter. The mechanical properties determined were compressive force, compressive stress, and deformation, both at peak and at break, using an Instron hydraulic universal testing machine. The work also revealed that the giant palm seed can be classified as an oil-bearing seed: the seeds gave an oil yield of 18% using the solvent extraction method. The results obtained from the study will help in solving problems of equipment design, handling, and further processing of the seeds.
Keywords: giant palm seeds, engineering properties, oil potential, moisture content, giant palm fruit
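For reference, the relations conventionally used in the seed-properties literature to compute the size and shape descriptors listed above from the axial dimensions are sketched below; treating these as the exact formulas used in the study is an assumption, although they are the standard definitions.

```python
def seed_descriptors(L, W, T):
    """Size/shape descriptors from major (L), intermediate (W), minor (T) diameters (cm)."""
    Dg = (L * W * T) ** (1 / 3)        # geometric mean diameter
    Da = (L + W + T) / 3               # arithmetic mean diameter
    sphericity = Dg / L                # ratio of geometric mean diameter to major diameter
    return {"Dg_cm": Dg, "Da_cm": Da, "sphericity": sphericity}

def porosity(bulk_density, true_density):
    """Porosity (%) from bulk and true densities."""
    return (1 - bulk_density / true_density) * 100

# Example with mid-range dimensions reported for the giant palm nuts
print(seed_descriptors(L=8.3, W=8.0, T=5.2))
```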
Procedia PDF Downloads 78
3059 Enhanced Production of Endo-β-1,4-Xylanase from a Newly Isolated Thermophile Geobacillus stearothermophilus KIBGE-IB29 for Prospective Industrial Applications
Authors: Zainab Bibi, Afsheen Aman, Shah Ali Ul Qader
Abstract:
Endo-β-1,4-xylanases [EC 3.2.1.8] are one of the major groups of enzymes involved in the degradation of xylan and have several applications in the food, textile and paper processing industries. Due to the broad utility of endo-β-1,4-xylanase, researchers are focusing on increasing the productivity of this hydrolase from various microbial species. Tolerance of harsh industrial conditions, a fast reaction rate and efficient hydrolysis of xylan with a low risk of contamination are critical requirements of industry that can be fulfilled by synthesizing the enzyme with suitable properties. In the current study, a newly isolated thermophile, Geobacillus stearothermophilus KIBGE-IB29, was used in order to attain the maximum production of endo-1,4-β-xylanase. The bacterial culture was isolated from soil collected around the blast furnace site of a steel processing mill in Karachi. Optimization of various nutritional and physical factors resulted in the maximum synthesis of endo-1,4-β-xylanase from the thermophile. A high production yield was achieved at 60°C and pH 6.0 after 24 hours of incubation. Various nitrogen sources, viz. peptone, yeast extract and meat extract, improved enzyme synthesis at optimum concentrations of 0.5%, 0.2% and 0.1%, respectively. Dipotassium hydrogen phosphate (0.25%), potassium dihydrogen phosphate (0.05%), ammonium sulfate (0.05%) and calcium chloride (0.01%) were found to be valuable salts for improving enzyme production. The thermophilic nature of the isolate, its broad pH stability profile and the reduced fermentation time indicate its importance for effective xylan saccharification and for large scale production of endo-1,4-β-xylanase.
Keywords: geobacillus, optimization, production, xylanase
Procedia PDF Downloads 308
3058 The Effect of an Abnormal Prefrontal Cortex on the Symptoms of Attention Deficit/Hyperactivity Disorder
Authors: Irene M. Arora
Abstract:
Hypothesis: Attention Deficit Hyperactivity Disorder (ADHD) is the result of an underdeveloped prefrontal cortex, which is the primary cause of the signs and symptoms seen as defining features of ADHD. Methods: Through PubMed, Wiley and Google Scholar, studies published between 2011 and 2018 were evaluated to determine whether a dysfunctional prefrontal cortex causes the characteristic symptoms associated with ADHD. The search terms "prefrontal cortex", "Attention-Deficit/Hyperactivity Disorder", "cognitive control", "frontostriatal tract", among others, were used to maximize the assortment of relevant studies. Excluded papers were systematic reviews, meta-analyses and publications published before 2010, to ensure clinical relevance. Results: Nine publications were analyzed in this review, all of which were non-randomized matched control studies. Three studies found a decrease in the functional integrity of the frontostriatal tract fibers, in conjunction with four studies finding impaired frontal cortex stimulation. Prefrontal dysfunction, specifically of medial and orbitofrontal areas, manifested as abnormal reward processing in ADHD patients when compared to their normal counterparts. A total of 807 subjects were studied in this review, of whom a little over half (54%) presented with remission of symptoms in adulthood. Conclusion: While the prefrontal cortex shows the most consistent pattern of impaired activity and thinner volumes in patients with ADHD, this is a heterogeneous disorder, and its pathophysiology also implicates the dysfunction of other neural structures. However, remission of ADHD symptomatology in adulthood was found to be attributable to increased prefrontal functional connectivity and integration, suggesting a key role for the prefrontal cortex in the development of ADHD.
Keywords: prefrontal cortex, ADHD, inattentive, impulsivity, reward processing
Procedia PDF Downloads 120
3057 Autism Spectrum Disorder Classification Algorithm Using Multimodal Data Based on Graph Convolutional Network
Authors: Yuntao Liu, Lei Wang, Haoran Xia
Abstract:
Machine learning has shown extensive application in the development of classification models for autism spectrum disorder (ASD) using neuroimaging data. This paper proposes a fused multi-modal classification network based on a graph neural network. First, the brain is segmented into 116 regions of interest using a medical segmentation template (AAL, Anatomical Automatic Labeling). The image features of sMRI and the signal features of fMRI are extracted, and these build the node and edge embedding representations of the brain map. Then, we construct a dynamically updated brain map neural network and propose a method based on a dynamic brain map adjacency matrix update mechanism and a learnable graph to further improve the accuracy of autism diagnosis and recognition results. Based on the Autism Brain Imaging Data Exchange I dataset (ABIDE I), we reached a prediction accuracy of 74% between ASD and TD subjects. In addition, to study biomarkers that can help doctors analyze the disease and to improve interpretability, we examined the features with the top five maximum and minimum ROI weights. This work provides a meaningful way for brain disorder identification.
Keywords: autism spectrum disorder, brain map, supervised machine learning, graph network, multimodal data, model interpretability
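The basic graph-convolution propagation rule underlying such a network can be sketched in a few lines of NumPy for a brain graph with 116 ROI nodes, as below; the symmetric normalization follows the common Kipf-Welling formulation and is an assumption about, not a reproduction of, the paper's dynamically updated architecture.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph convolution: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])                    # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]   # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)            # ReLU activation

rng = np.random.default_rng(0)
n_rois, n_features = 116, 32                          # 116 AAL regions, assumed feature width
A = (rng.random((n_rois, n_rois)) > 0.9).astype(float)   # toy functional-connectivity graph
A = np.maximum(A, A.T)                                # make it undirected
H = rng.normal(size=(n_rois, n_features))             # node embeddings from sMRI/fMRI features
W1 = rng.normal(scale=0.1, size=(n_features, 16))
W2 = rng.normal(scale=0.1, size=(16, 2))              # 2 classes: ASD vs. typically developing

hidden = gcn_layer(A, H, W1)                          # (116, 16) node representations
logits = (hidden @ W2).mean(axis=0)                   # mean-pool readout -> 2 class scores
print("class scores:", logits)
```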
Procedia PDF Downloads 67
3056 For Post-traumatic Stress Disorder Counselors in China, the United States, and around the Globe, Cultural Beliefs Offer Challenges and Opportunities
Authors: Anne Giles
Abstract:
Trauma is generally defined as an experience, or multiple experiences, overwhelming a person's ability to cope. Over time, many people recover from the neurobiological, physical, and emotional effects of trauma on their own. For some people, however, troubling symptoms develop over time that can result in distress and disability. This cluster of symptoms is classified as Post-traumatic Stress Disorder (PTSD). People who meet the criteria for PTSD and other trauma-related disorder diagnoses often hold a set of understandable but unfounded beliefs about traumatic events that cause undue suffering. Becoming aware of unhelpful beliefs—termed "cognitive distortions"—and challenging them is the realm of Cognitive Behavior Therapy (CBT). A form of CBT found by researchers to be especially effective for PTSD is Cognitive Processing Therapy (CPT). Through the compassionate use of CPT, people identify, examine, challenge, and relinquish unhelpful beliefs, thereby reducing symptoms and suffering. Widely-held cultural beliefs can interfere with the progress of recovery from trauma-related disorders. Although highly revered, largely unquestioned, and often stabilizing, cultural beliefs can be founded in simplistic, dichotomous thinking, i.e., things are all right, or all wrong, all good, or all bad. The reality, however, is nuanced and complex. After studying examples of cultural beliefs from China and the United States and how these might interfere with trauma recovery, trauma counselors can help clients derive criteria for preserving helpful beliefs, discover, examine, and jettison unhelpful beliefs, reduce trauma symptoms, and live their lives more freely and fully.
Keywords: cognitive processing therapy (CPT), cultural beliefs, post-traumatic stress disorder (PTSD), trauma recovery
Procedia PDF Downloads 250
3055 Competition between Verb-Based Implicit Causality and Theme Structure's Influence on Anaphora Bias in Mandarin Chinese Sentences: Evidence from Corpus
Authors: Linnan Zhang
Abstract:
Linguists, as well as psychologists, have shown great interest in implicit causality in reference processing. However, the most frequently used approaches to this issue are psychological experiments (such as eye tracking or self-paced reading). This research is corpus-based and is assisted by the statistical software R. The main focus of the present study is the competition between verb-based implicit causality and the influence of theme structure on anaphora bias in Mandarin Chinese sentences. In Accessibility Theory, it is believed that salience, also known as accessibility, and relevance are two important factors in reference processing. Theme structure, a special syntactic structure in Chinese, determines the salience of an antecedent on the syntactic level, while verb-based implicit causality is a key factor in the relevance between antecedent and anaphora. Therefore, this is a study about anaphora, combining psychology with linguistics. Based on the analysis of sentences from the corpus and statistical analysis using multinomial logistic regression in R, the major findings of the present study are as follows: 1. When the sentence is stated in a 'cause-effect' structure, the theme structure will always be the antecedent, no matter whether forward-biased or backward-biased verbs co-occur; in non-theme structures, the anaphora bias will tend to be the opposite of the verb bias. 2. When the sentence is stated in an 'effect-cause' structure, the theme structure will not always be the antecedent; the influence of verb-based implicit causality will outweigh that of theme structure, and the anaphora bias will be the same as the bias of the verbs. All the results indicate that implicit causality functions conditionally and that the noun in the theme structure is not the high-salience antecedent under all circumstances.
Keywords: accessibility theory, anaphora, theme structure, verb-based implicit causality
Procedia PDF Downloads 198
3054 Effect of Extrusion Processing Parameters on Protein in Banana Flour Extrudates: Characterisation Using Fourier-Transform Infrared Spectroscopy
Authors: Surabhi Pandey, Pavuluri Srinivasa Rao
Abstract:
Extrusion processing is a high-temperature short-time (HTST) treatment which can improve protein quality and digestibility while retaining active nutrients. The in-vitro protein digestibility of plant protein-based foods is generally enhanced by extrusion. The current study aimed to investigate the effect of extrusion cooking on in-vitro protein digestibility (IVPD) and conformational modification of protein in green banana flour extrudates. Green banana flour was extruded through a co-rotating twin-screw extruder, varying the moisture content, barrel temperature and screw speed in the ranges of 10-20%, 60-80 °C and 200-300 rpm, respectively, at a constant feed rate. Response surface methodology was used to optimise the result for IVPD. Fourier-transform infrared spectroscopy (FTIR) analysis provided a convenient and powerful means of monitoring interactions and changes in the functional and conformational properties of the extrudates. Results showed that protein digestibility was highest in the extrudate produced at 80 °C, 250 rpm and 15% feed moisture. FTIR analysis was carried out for the optimised sample with the highest IVPD. FTIR analysis showed that there were no changes in the primary structure of the protein, while the secondary protein structure changed. In order to explain this behaviour, infrared spectroscopy analysis was carried out, mainly in the amide I and II regions. Moreover, curve fitting analysis showed the conformational changes produced in the flour due to protein denaturation. The quantitative analysis of the changes in the amide I and II regions provided information about the modifications produced in banana flour extrudates.
Keywords: extrusion, FTIR, protein conformation, raw banana flour, SDS-PAGE method
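The curve-fitting step mentioned for the amide I region can be sketched as a sum-of-Gaussians decomposition with SciPy, as below; the band positions assigned to secondary-structure elements are typical literature values and an assumption here, not the exact peak set resolved in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Typical amide I component bands (cm^-1): beta-sheet, random coil, alpha-helix, beta-turn
BAND_CENTERS = [1625, 1645, 1655, 1680]

def gaussians(x, *params):
    """Sum of Gaussians; params = (amp1, width1, amp2, width2, ...) with fixed centers."""
    y = np.zeros_like(x, dtype=float)
    for (amp, width), center in zip(zip(params[0::2], params[1::2]), BAND_CENTERS):
        y += amp * np.exp(-((x - center) ** 2) / (2 * width ** 2))
    return y

def fit_amide_I(wavenumbers, absorbance):
    """Fit the amide I region and return the relative area of each component band."""
    p0 = [absorbance.max() / len(BAND_CENTERS), 10.0] * len(BAND_CENTERS)
    popt, _ = curve_fit(gaussians, wavenumbers, absorbance, p0=p0, maxfev=20000)
    areas = np.array([amp * width * np.sqrt(2 * np.pi)
                      for amp, width in zip(popt[0::2], popt[1::2])])
    return dict(zip(BAND_CENTERS, areas / areas.sum()))

# Hypothetical usage on a baseline-corrected spectrum restricted to 1600-1700 cm^-1
# fractions = fit_amide_I(wavenumbers, absorbance)
```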
Procedia PDF Downloads 162
3053 A Deep-Learning Based Prediction of Pancreatic Adenocarcinoma with Electronic Health Records from the State of Maine
Authors: Xiaodong Li, Peng Gao, Chao-Jung Huang, Shiying Hao, Xuefeng B. Ling, Yongxia Han, Yaqi Zhang, Le Zheng, Chengyin Ye, Modi Liu, Minjie Xia, Changlin Fu, Bo Jin, Karl G. Sylvester, Eric Widen
Abstract:
Predicting the risk of pancreatic adenocarcinoma (PA) in advance can benefit the quality of care and potentially reduce population mortality and morbidity. The aim of this study was to develop and prospectively validate a risk prediction model to identify patients at risk of new incident PA as early as 3 months before the onset of PA in a statewide, general population in Maine. The PA prediction model was developed using deep neural networks, a deep learning algorithm, with a 2-year electronic-health-record (EHR) cohort. Prospective results showed that our model identified 54.35% of all inpatient episodes of PA, and 91.20% of all PA cases that required subsequent chemoradiotherapy, with a lead time of up to 3 months and a true alert rate of 67.62%. The risk assessment tool has attained an improved discriminative ability. It can be immediately deployed to the health system to provide automatic early warnings to adults at risk of PA. It also has the potential to identify personalized risk factors to facilitate customized PA interventions.
Keywords: cancer prediction, deep learning, electronic health records, pancreatic adenocarcinoma
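A much-simplified sketch of a feed-forward risk model over tabular EHR features is shown below using scikit-learn; the synthetic feature set, network size, and alert threshold are placeholders, not the architecture or variables of the deployed Maine model.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical tabular EHR features (age, labs, diagnosis flags, utilization counts, ...)
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 40))
y = (X[:, :5].sum(axis=1) + rng.normal(scale=2.0, size=5000) > 3).astype(int)  # toy PA labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, stratify=y,
                                                    random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                                    random_state=0))
model.fit(X_train, y_train)

risk = model.predict_proba(X_test)[:, 1]            # predicted 3-month PA risk score
print("AUC:", round(roc_auc_score(y_test, risk), 3))
print("patients flagged for early warning:", int((risk > 0.5).sum()))
```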
Procedia PDF Downloads 155
3052 Microencapsulation of Phenobarbital by Ethyl Cellulose Matrix
Authors: S. Bouameur, S. Chirani
Abstract:
The aim of this study was to evaluate the potential use of ethylcellulose in the preparation of microspheres as a drug delivery system for the sustained release of phenobarbital. The microspheres were prepared by the solvent evaporation technique using ethylcellulose as the polymer matrix at a 1:2 ratio, dichloromethane as the solvent and 1% polyvinyl alcohol as the processing medium to solidify the microspheres. Size, shape, drug loading capacity and entrapment efficiency were studied.
Keywords: phenobarbital, microspheres, ethylcellulose, polyvinyl alcohol
Procedia PDF Downloads 361
3051 Recent Developments in the Application of Deep Learning to Stock Market Prediction
Authors: Shraddha Jain Sharma, Ratnalata Gupta
Abstract:
Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult to do since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with their prominent features, and the significant problems or issue domain that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to correctly predict based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high accuracy prediction, with a focus on the most important characteristics.
Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume
Procedia PDF Downloads 90
3050 Preserving Urban Cultural Heritage with Deep Learning: Color Planning for Japanese Merchant Towns
Authors: Dongqi Li, Yunjia Huang, Tomo Inoue, Kohei Inoue
Abstract:
With urbanization, urban cultural heritage is facing the impact and destruction of modernization. Many historical areas are losing their historical information and regional cultural characteristics, so it is necessary to carry out systematic color planning for historical areas as part of their conservation. As an early adopter of urban color planning, Japan has developed a systematic approach to it. Hence, this paper selects five merchant towns from the category of important traditional building preservation areas in Japan as the subject of this study to explore the color structure and emotion of this type of historic area. First, an image semantic segmentation method identifies the buildings, roads, and landscape environments. Their color data were extracted for color composition and emotion analysis to summarize their common features. Second, keywords were extracted from the collected Internet evaluations by natural language processing. The correlation analysis of the color structure and keywords provides a valuable reference for conservation decisions in these historic town areas. This paper also combines the color structure and Internet evaluation results with generative adversarial networks to generate predicted images of color structure improvements and color improvement schemes. The methods and conclusions of this paper can provide new ideas for the digital management of environmental colors in historic districts and a valuable reference for the inheritance of local traditional culture.
Keywords: historic districts, color planning, semantic segmentation, natural language processing
Procedia PDF Downloads 88